Wednesday, April 12th 2023
AMD Plays the VRAM Card Against NVIDIA
In a blog post, AMD has pulled the VRAM card against NVIDIA, telling potential graphics card buyers that they should consider AMD over NVIDIA because current and future games will require more VRAM, especially at higher resolutions. It's no secret that at least some of the PC gaming crowd has reached a consensus that NVIDIA is being too stingy when it comes to VRAM on its graphics cards, and AMD is clearly trying to cash in on that sentiment with its latest blog post. AMD is showing the VRAM usage in games such as Resident Evil 4—with and without ray tracing at that—The Last of Us Part I and Hogwarts Legacy, all games that use 11 GB of VRAM or more.
AMD does have a point here, but as the company has yet to launch anything below the Radeon RX 7900 XT in the 7000-series, AMD is mostly comparing its 6000-series cards with NVIDIA's 3000-series cards, most of which are getting hard to purchase and are potentially less interesting for those looking to upgrade their systems. That said, AMD also compares its two 7000-series cards to the NVIDIA RTX 4070 Ti and the RTX 4080, claiming up to a 27 percent lead over NVIDIA in performance. Based on TPU's own tests of some of these games, albeit most likely using different test scenarios, the figures provided by AMD don't seem to reflect real-world performance. It's also surprising to see AMD claim its RX 7900 XTX beats NVIDIA's RTX 4080 in ray tracing performance in Resident Evil 4 by 23 percent, where our own tests show NVIDIA in front by a small margin. Make of this what you want, but one thing is fairly certain: future games will require more VRAM, though the need for a powerful GPU most likely isn't going to go away.
Source:
AMD
218 Comments on AMD Plays the VRAM Card Against NVIDIA
2. The difference between DLSS 2.0 and FSR 2.0 is placebo at best. Personally, I prefer native resolution to either of the two. I don't care about magic frames, either. The only valid argument is ray tracing, but even that isn't worth 30% extra money when it's only 16% faster.
Even for 1200 USD, the 4080 offers terrible value:
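The "30% extra money for 16% more performance" argument above is just value arithmetic. A minimal sketch, using illustrative figures only (the prices and performance deltas below are placeholders, not measured numbers):

```python
# Performance-per-dollar comparison. All figures are illustrative,
# not benchmark results.

def perf_per_dollar(perf: float, price: float) -> float:
    """Relative performance per $100 spent."""
    return perf / price * 100

# Treat the cheaper card as the 100% baseline; assume the pricier
# card is ~16% faster (in RT) at ~30% higher price.
baseline = perf_per_dollar(100, 1000)   # 10.0 per $100
premium  = perf_per_dollar(116, 1300)   # ~8.9 per $100

# The 30% price premium outweighs the 16% performance advantage.
assert baseline > premium
```

The same arithmetic applies to any pair of cards: a performance lead only translates into value if it exceeds the price premium in relative terms.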
If for a moment you start to compare the two companies:
1) AMD does have a history of releasing bad drivers... but in the last decade that has turned around.
2) Both AMD and Nvidia have released bad hardware. AMD has poorly built coolers, Nvidia has poorly specified power connectors. Neither company is free from that.
3) Nvidia and AMD don't compete at the high end. That said, if you put the halo numbers aside, Nvidia and AMD both compete across the $100 to $1000 market. This is kind of silly to have to state, but the 7900 XTX just isn't built to compete with the 4090... and that's alright for anyone looking to spend south of a grand on a GPU.
4) Nvidia favors new tech and closed-source solutions. AMD favors old-school performance and large numbers. Both are viable strategies. If Nvidia wants to be the best with edge-case tech, that's fine, because if AMD runs the 99.99% of games that don't use this new tech better, then it's a win for them.
Let me follow this up with a war chest of stupidity: linked physics accelerator cards, TressFX, SLI, Crossfire. Good lord, I'm already depressed. My point is that both AMD and Nvidia have released some absolutely bizarre stuff. AMD rightfully earned a reputation for bad drivers (this from the owner of a 5000-series card), but Nvidia has also rightfully earned a reputation for charging super-premium prices. So we're clear, this is a permutation of the Apple vs. PC debate... or, put more succinctly, the ongoing debate between absolute ease of use and features.
Now that I've said all of this, let me end with a real response. Yes, AMD is obviously going to point out the unique features which make its product more competitive when social media suddenly brings them to the front. Nvidia does the exact same with its streaming integration and absolutely silly ray tracing videos... and somehow that's not called out the same way? I'm a firm believer that both companies offer value in different ways, so seeing this level of criticism for what is standard practice is baffling. Long live both team Red and team Green; may they actually compete and get card prices back down to a place where a good medium-tier card (x070 and x800 respectively) costs less than an entire game console. That's the win I'm hoping for, and if we want that, it'll take both of them competing harder.
Let's see what would happen if AMD's DESPERATE, as you say, marketing succeeds.
- People might start buying AMD cards. That's good for competition. It could force Nvidia to lower prices or come up with more models addressing the reasons people are turning to AMD.
- People don't buy AMD, but they don't buy Nvidia either. And I don't mean they skip the 8 GB-12 GB models and turn to 16 GB-24 GB models, which is what Nvidia would want to see and probably what some posters would want to see. I mean they DON'T BUY NVIDIA EITHER. Nvidia would have to address the reasons that make consumers hesitate to buy its cards. So we could see new 4070 models coming out with 16 GB of VRAM, and the 4060 and 4060 Ti coming out with a minimum of 12 GB instead of 8 GB. Models with less VRAM could see a price reduction or even slowly get removed from the market.
So, why call out "desperate AMD" and their "pitiful attempts" instead of hoping they succeed?
As far as I'm concerned, whether it's $1,000 or $1,000,000, it's all the same to me. I'm never-ever going to pay that much for a video card.
Regarding prices in specific countries: here in Poland, the cheapest 4080 (Gainward) costs $1,287 and the cheapest 7900 XTX (Biostar) costs $1,170, a price difference of 10%. And such a "cheap" 7900 XTX is available from only one manufacturer, Biostar. Prices for this card from other brands start at $1,287.
Your "performance per dollar" graph doesn't take into account either RT or FG.
2. The hardware in consoles is entirely different from that in PCs; not to mention, consoles use a different API to access it and don't rely on drivers. With AMD's upcoming CPU and GPU architectures, they will lose what little indirect optimisation advantage they had, because those will be almost wholly different architectures from the ones currently in the consoles. It's the same story every console generation, and it's getting really dull.
The 7800X3D only matched the Intel 13900K's lows depending on the game, and there are still plenty of titles in which Intel is considerably faster, including in 1% and 0.1% lows, so I don't know what you are talking about. AMD has no particular advantage in the percentile-lows area, and especially when it comes to the dual-CCD models, the 7900X3D and 7950X3D, it is actually their weakness if a game lands on the CCD without the extra cache.
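For anyone unfamiliar with the "1% lows" being argued about here: they are usually derived from a frame-time capture by averaging the slowest fraction of frames. The exact methodology varies between reviewers; this is one common form, with made-up frame times:

```python
# One common way "1% lows" are computed from a frame-time capture:
# average the worst pct% of frame times, then convert to FPS.
# Methodologies differ between reviewers; this is an illustrative sketch.

def percentile_low_fps(frametimes_ms, pct=1.0):
    worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * pct / 100))      # how many frames count
    avg_ms = sum(worst[:n]) / n                  # mean of the worst frames
    return 1000.0 / avg_ms                       # milliseconds -> FPS

# 100 frames at a steady 60 FPS except for one 33 ms stutter:
times = [16.7] * 99 + [33.3]
print(round(percentile_low_fps(times), 1))  # -> 30.0
```

This is why a single stutter barely moves the average FPS but tanks the 1% (and especially 0.1%) lows, and why cache or CCD scheduling effects show up there first.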
I personally believe that prices for all graphics cards are unreasonably high right now. This is a very bad trend. At one time, I used to build SLI configurations with two 470/770/970 cards, and it was 50% cheaper than buying a single 4080 now. I advocate for healthy competition, but when I see AMD's latest slides, I am appalled by their lies. Do they think people don't look at real tests before buying?) If they want to compete properly, they need to do only one thing: lower prices. Right now, I see that the recommended prices for the 4080 and 7900 XTX are fair in terms of the ratio between the cards. They have roughly the same performance in regular rasterization, and Nvidia has an advantage in RT and technologies, which is why the 4080 is 20% more expensive. But this doesn't change the fact that both cards are unreasonably expensive. I would like to see the 4080 at $900 at most, and the 7900 XTX at $800.
Now let's get into your opinions.
1. Once the limit is reached? Consoles have 16 GB of VRAM, and I have not heard one owner of a 6700 or above complaining.
2. The hardware is different? Explain to me how the AM4 CPU and RDNA2 GPU in the PS5 and Xbox differ in hardware from AM4 on PC. AMD is going to abandon its partners? Will RDNA3 not work on AM4?
3. Do you think that Xbox Game Pass on PC is going to become incompatible when the next Xbox, or whatever it's called, launches on AMD systems? Did you see the post about Game Pass being in 40 countries?
Yep, you can talk about the 13900K, but what happens this time next year when I can buy whatever AMD releases, which will likely make a 13900K look like an 11900K? It doesn't matter, though, because all of that is just oxygen for the culture wars. I am of the strong opinion that if you put 4 people in front of 4 systems and don't tell them what's inside, they will not know which one has AMD or Intel without being told.
I do have to say, though, that X3D is that good, and you can't convince me, or likely anyone that owns one, not to appreciate it. The 13900K is also an absolute beast of a CPU, and we should be glad that there are real choices to make you smile, instead of preaching opinions from people that may have different reasons than you to say what they say.
I paid for a 4090.
I got 24GB.
I think the funny thing is that most of the people still using 1080p and 1440p monitors are the ones most emotionally concerned with 4K performance.
My local Microcenter has PLENTY of 4090 in stock right now (WESTBURY). There's no way I'd buy any card below the "top model". My 3090 served me extremely well during Cyberpunk's release week and continues to serve me well in my DCS WORLD Flight Sim with Oculus VR.
The 3090 and the 4090 were UNCOMPROMISING. If they'd put more VRAM in the 4080, I could say the same thing there as well.
You need to upgrade your rig to a 1000 W PSU and get a bigger tower case. I got the Strix 4090 at launch and that behemoth is as large as my PS5. I AM SADDENED that EVGA stepped out and I couldn't get a Kingpin 4090... but I may consider selling my card and getting a Liquid Suprim X, or waiting for the 4090 Ti version of it. I would wish people waiting on line for a 4070 luck, BUT THEY WON'T NEED IT. The economic downturn and the lack of stimulus welfare checks are keeping demand low.
Besides, I had a PS3. It was fun, but the games were getting old quickly and the internet browser was horrible, so it just sat there for months unused. If I could use it akin to an AiO PC, albeit impossible to upgrade, it'd make more sense. I've considered buying a console numerous times; it just doesn't seem viable for many reasons.
While you give a detailed explanation of how you see things, you don't really answer my question. I do understand that you talk about lies. Well, which company doesn't lie? Nvidia? You say that AMD is lying by promoting the importance of having more VRAM. How about saying nothing and promoting $600-$800+ cards to the general public as cards capable of maxing out every AAA game at 1440p? Isn't that a lie? And no, people don't look at real tests before buying. You think people have the time to check whether every product they buy is the best on the market? No one does that.
And let me repeat the reply I gave another member a few days ago about why AMD can't see lowering prices as a viable strategy today: only Nvidia can fix Nvidia's pricing.
Anyway, all those trolls, fanboys, or just plain posters posting their own objective opinion about how bad AMD is, how bad their drivers are, how bad an idea it is to buy an RX 6600 over an RTX 3050: all of that brought us to today's situation, where AMD can't compete in GPUs. And they can't even start a price war, because they know they will lose. And maybe they don't care, because they know that Nvidia's whole "increasing prices" strategy helps them too. GPUs will become more expensive, and even integrated GPUs will become more expensive, at least the "strong" ones in the APUs. Everything will become more expensive in the end, but rejoice: people will avoid those bad Radeon drivers and those bad AMD graphics cards.
Damn, people shooting themselves in the foot and never realizing it... Only a couple of YouTubers out there seem to realize it, and they are constantly getting attacked: Hardware Unboxed, which is of course "AMD Unboxed", even when their videos support Nvidia products and technologies, and Linus of LTT. He sees that we are heading into a monopoly and tries to push AMD as an alternative at least. Other YouTubers still drive all of their audience's focus to Nvidia. In their last video, Gamers Nexus also realized that AMD makes GPUs too; they were promoting Nvidia constantly for the last 12 months. JayzTwoCents still makes videos that Nvidia's marketing department would envy. JMO
I understand your point of view on reducing prices; it's quite logical. But there's one thing. Wouldn't it be better for AMD to put 16 GB of VRAM in the 7900 XTX, saving money and selling more cards at a lower price while maintaining their profit margin and strengthening competition with Nvidia? Instead, they decided to pay people to make useless slides about the usefulness of 24 GB in the distant future and about how, in their fantasy, the 7900 XTX outperforms the 4080 in 95% of games)
And look at FSR 2... This upscaling would never have existed if Nvidia hadn't made DLSS. The same story with FSR 3: without Nvidia's FG, AMD wouldn't have even thought about it. The ray tracing support on the RX 7000 also surprises me. Couldn't they have implemented it better in the RX 7000 after the poor results of the RX 6000? Even Intel's first generation of cards handles RT better. When attempts to compete come down to playing catch-up and publishing false slides, I logically cannot hope for any success. That is why I am calling out AMD for their pitiful attempts instead of hoping they succeed.
G-Sync? That was also a good idea from Nvidia. Again, not their invention; only using it in desktop PC monitors for smoother gaming was their idea. And even then, there was a way to implement it so it would be free for everyone, but no, let's put a hardware board in there, make money even from that board, and make it Nvidia-exclusive. Do you remember PhysX? Don't get me started on PhysX. Anyway, Nvidia innovates, but for its own profit. So even if AMD follows, you have to be grateful that there is someone following, and also giving away the results of their efforts as FREE alternatives. Would it be better if there was NO ONE to follow Nvidia? My new TV is FreeSync Premium and G-Sync compatible through HDMI 2.1. It's G-Sync compatible thanks to AMD, not Nvidia. Nvidia followed AMD here.
You ask AMD to compete when, at the same time, you will probably do anything to stop people from buying their cards. So why compete if no one will buy their cards? You ask for AMD's marketing slides to be perfect, but do you ask Nvidia NOT to lie in its marketing slides? It's not just you. There are so many people out there expecting AMD to build the best hardware, offer it at the lowest prices, and also market it in ways that show not its strengths against competing hardware, only its disadvantages. So many that I could copy this post and repost it 1000 times over the next 10 years. But you will never call out Nvidia. Only about pricing, another fault that is 100% AMD's fault. NOT Nvidia's......
As for the equipped-VRAM question, we can all be sure that RXs with 10-12 GB will age better than their 8 GB RTX counterparts; the experience should be smoother in demanding games.
And talking about FSR/DLSS, don't get on that slippery path. A few years ago (was it the GeForce 5xx or 6xx series? I honestly can't recall), there was proof that nVidia started playing with rendering quality to take the lead at Ultra settings. Nowadays, this type of cheat is considered a feature.
But yes, I can say that nVidia can innovate, RT being one nice piece of eye candy, even if it is not mature yet (I still remember the fuss about the first anti-aliasing techniques back in the late '90s :D)
And don't get me wrong, I've had a lot of cards from various vendors over the last 30 years or so, from Trident to S3, to Matrox, to 3dfx, then a relatively balanced mix between ATi/AMD and nVidia (and a Kyro 2, I must say :))
The 8GB cards could end up running lower texture settings but higher quality effects.
Textures are what you see everywhere in a game, and they are the cheapest (performance-wise) boost to visual quality, almost free as long as you have enough VRAM. Volumetric clouds and other performance-intensive effects don't make anywhere near as much of a difference visually. Since most games are made for consoles, most effects look fine between medium and high, because that's how they were conceived by the game designers. Ultra/epic is just added with the port to PC to give high-end PC gear something to chew on. But textures are meant to be used at max, because they are free performance.
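The "free performance, expensive memory" point above is easy to see with back-of-envelope arithmetic. A rough sketch assuming uncompressed RGBA8 textures (real games use block compression such as BCn, which cuts these figures by roughly 4-8x):

```python
# Back-of-envelope VRAM footprint of a texture, assuming
# uncompressed RGBA8 (4 bytes per pixel). Real games use block
# compression (BCn), so actual sizes are several times smaller.

def texture_bytes(width: int, height: int, bpp: int = 4,
                  mipmaps: bool = True) -> int:
    base = width * height * bpp
    # A full mip chain adds roughly one third on top of the base level.
    return base * 4 // 3 if mipmaps else base

MB = 1024 * 1024
print(round(texture_bytes(2048, 2048) / MB, 1))  # -> 21.3 (per 2K texture)
print(round(texture_bytes(4096, 4096) / MB, 1))  # -> 85.3 (per 4K texture)
```

Higher texture settings cost almost no GPU time (sampling a bigger texture is cheap), but a scene full of 4K textures fills an 8 GB card quickly, which is exactly why texture quality is the first thing to drop when VRAM runs out.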
Hardware Unboxed used to make videos finding the best settings for games, with a good balance between performance and visual quality. Most games look good starting from medium/high effects, except for textures, which should be at maximum.
RTGI will offer a higher visual boost than higher texture settings will.
From a common-sense perspective, more VRAM is more future-proof: you have more resources available when future games demand more. AMD sells cards with more VRAM than Nvidia does, so this advantage belongs to them, and it is a valid critique of the competition. Just as Nvidia's ray tracing prowess is a future-proof advantage that belongs to them. Once again, don't get offended; just buy the equipment that better suits your needs. Tying yourself to a single tribe is doing yourself a disservice, and defending that tribe with empty ad hominems and no data is a good way to look foolish.