I really don't get Nvidia's timing this time around. Why show the 3070 a day before the RDNA2 reveal? AMD will surely have an answer for the 3070 in the RDNA2 lineup tomorrow. All it has to do is price it a bit cheaper and put more VRAM on it to make the 3070 look silly. What am I missing?
Last time, they dropped the price by $50 the day before release .... which can only be seen as raising the performance white flag. You don't lower prices when you have a better product; you raise them.
Looking forward to not being able to buy this anywhere. And when it does pop up somewhere, it's gonna be overpriced.
You could sacrifice being the 1st one on the block to have one .... and just wait till the new and improved later steppings arrive with mature BIOSes, flaws corrected, and slightly better OCs.
I mean 3070 is such an easy target for AMD to beat. ....
I think if it was so easy, AMD would have done it with 7xx, 9xx, 10xx and 20xx. Last gen the only tier winner was the 5600 XT. I hope they can do it ... but it's not like they've had a horse in the upper tiers in quite a while.
I was able to use more than 4GB the day the 1070 released because game makers released new options and new texture packs.
4 years from the 770 to the 1070 got us 4x more VRAM. 4 years from the 1070 to the 3070 got us nothing at all. That has never happened before and I think we shouldn't be so glib about it.
That wasn't driven by RAM so much as by GPUs at that point becoming capable of handling higher resolutions ... if you look at the same-card VRAM comparisons done for the 6xx, 7xx, 9xx and 10xx series ... the conclusion is always the same .... by the time you hike the settings to a point where VRAM matters, the game is unplayable. Comparing X VRAM vs Y VRAM, yes, when pushed with higher loads and max settings we do see double-digit percentage differences ... like 15 fps to 20 fps. But either way, no one is going to play a game at those settings at that resolution. The problem here is not the VRAM; don't play at 2160p with a GPU targeted at 1440p. In every era, the VRAM thing rears its head, and in every era it has come to the same conclusion.
6xx Era -
https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
So, what can we glean from all of that? For one thing, with any single monitor the 2GB video card is plenty - even on the most demanding games and benchmarks out today. When you scale up to three high-res screens the games we tested were all still fine on 2GB, though maxing out some games’ settings at that resolution really needs more than a single GPU to keep smooth frame rates.
7xx Era -
http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/
You couldn't install Max Payne on a system w/ a 2 GB 770 (the game was reported to use 2.7 GB), but it would install on the 4 GB card, and yet .... when they popped out the 4 GB and installed the 2 GB, it ran at the same fps with the same visual quality and user experience. RAM allocation and RAM usage are two different things. Resolutions used in this test went up to 5760 x 1080.
"There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600. We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference. ***If we start to add even more AA, in most cases, the frame rates will drop to unplayable on both cards***.
9xx Era -
https://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,26.html
The GTX 960 should probably have been launched with 3 GB VRAM standard, hence I can recommend these 4 GB versions very much. But yeah, the end-results remain a bit trivial with the price-premium in mind -- that I need to admit.
10xx Era -
http://www.extremetech.com/gaming/2...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x
"We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: “None of the GPU tools on the market report memory usage correctly .... They all report the amount of memory requested by the GPU, not the actual memory usage. Cards will larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available.”
"We began this article with a simple question: “Is 4GB of RAM enough for a high-end GPU?” ... Based our results, I would say that the answer is yes — but the situation is more complex than we first envisioned.
First, there’s the fact that out of the fifteen games we tested, only four could be forced to consume more than 4GB of RAM. In every case, we had to use high-end settings at 4K to accomplish this. While we do see some evidence of a 4GB barrier on AMD cards that the NV hardware does not experience, ***provoking this problem in current-generation titles required us to use settings that rendered the games unplayable on any current GPU***."
Also ... in TPU's 1060 3 GB review, you have to wonder why nvidia stripped shaders out of the 3 GB version, leaving the 6 GB card with 11% more. The reason is the 6 GB card had to show a performance difference to justify its price, and without the cut in shader count it would not. When you look at 1080p scores between the two, the 6 GB card w/ 11% more shaders is 6% faster .... It stands to reason, then, that if VRAM were in any way associated with that difference, the 6% gap would grow substantially at 1440p .... in the real world it does not; it's the same 6% performance difference.
There are certain games that show anomalies, such as poor console ports. I have seen the performance difference between the 3 GB and 6 GB get bigger at 1440p .... and one game where it got closer at 1440p. In short, 99% of the time where we see substantial differences in fps between the same card w/ different memory amounts, the GPU was stressed to a point where the game was unplayable. At any combination of resolution and settings, if VRAM is going to be a problem, the GPU is already a problem. Finding "a game" that doesn't follow this observation doesn't change the rule; there are a few rare anomalies.
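Since "allocation vs usage" keeps coming up, here's a minimal sketch (my own illustration, not from any of the linked articles) of where most monitoring tools get their numbers. It assumes an NVIDIA card and the pynvml bindings (pip install nvidia-ml-py); everything it prints comes from the driver's allocation counters, which is exactly Brandon Bell's point above: these report memory requested/reserved by applications, not what a game actually touches each frame.

```python
# Sketch: read the same allocation counters that GPU monitoring tools report.
# Assumes an NVIDIA GPU and the pynvml bindings (package: nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# Device-level counter: VRAM the driver has reserved, not per-frame working set.
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"reported 'used' VRAM: {info.used / 2**20:.0f} MiB of {info.total / 2**20:.0f} MiB")

# Per-process counters are allocations too (can be None on some driver models).
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    mib = (proc.usedGpuMemory or 0) / 2**20
    print(f"PID {proc.pid}: {mib:.0f} MiB allocated")

pynvml.nvmlShutdown()
```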
Pay attention to the article ... "With those performance numbers, RTX 3070 is the perfect choice for the huge 1440p gamer crowd". That 8 GB is just fine for 1440p. Adding 4 GB isn't going to help you; by the time you get past 8 GB (used, not allocated) at 2160p and high settings, your GPU will be delivering < 30 fps. Again, W1zzard nailed it:
"The GeForce RTX 3070 comes with 8 GB of memory, which will be the basis for a lot of discussion, just like on the RTX 3080 10 GB. I feel like 8 GB is plenty of memory for the moment, especially considering this card is targeted at 1440p gaming, which isn't as memory-intensive as 4K. Even with 4K, I feel you'll run out of shading power long before memory becomes an issue.
As to why nvidia timed it this way? .... Because they are not idiots. Back in the 7xx era, AMD was in trouble. They had nothing ... many pundits were asking why the 780 had the specifications that had been leaked for the 770. AMD spent a fortune building up the 290 / 290X over several months, and the week before its release nvidia suddenly announced that they apparently had this 780 Ti thing that was just sitting on a shelf and would ship in two weeks. It took the air out of AMD's sails and the 2xx series fell flat.
Why do they charge what they charge? ... Because no one is stopping them. Price is what the market will bear, and 4.5 times as many people are choosing with their wallets to pay the price premium. AMD doesn't have a recent-generation card in the top 20 / nvidia has 7. The top 5 cards in use are 1xxx series cards ... the 580 places 10th and the 570 places 16th. What message is sent when new cards come out and folks are buying them up at 10, 20, 30% or more over MSRP? ... it says "we can charge more". Corporate officers are bound, within legal boundaries, to maximize profit for their investors. Failure to do so is grounds for dismissal or even malfeasance charges.
The 480 had more RAM than the 1060, but what did it accomplish? ... with both cards overclocked, the 1060 was 16.6% faster in TPU's gaming test suite, quieter, and ran cooler. When folks say "it will be easy to beat nVidia" .... upon what recent past wins is this based? The 1060 is the most popular card in use today w/ 10.79% market share ... the 480 has 0.49% ... the 580 has 2.22%. Combined that's 2.71%, or 25% of the 1060. That's not a win ... the 5600 XT is a win; it's clearly better than the competition in that price range.
If we want prices to drop, one of two things has to happen, only one of which seems possible:
a) Stop buying
b) A competitive product hits the market
a) is not going to happen, leaving b) as the only possible price-impacting scenario ... so while I can still root for b) .... nothing AMD has done in the last 7-8 years (other than the 5600 XT) tells me that this is likely to happen. Having more RAM is not a feature unless it brings something to the table. The fact that nvidia is beginning to show their hand with the xx70 tends to suggest that they know something we don't, and that's not a good sign. I hope I'm wrong. Our hope for better pricing lies in AMD being able to deliver a 3070-level, "5600 XT like" card that has the power to compete with the 3070 .... slapping extra RAM on it is not going to help here.