Wednesday, April 12th 2023

AMD Plays the VRAM Card Against NVIDIA

In a blog post, AMD has pulled the VRAM card against NVIDIA, telling potential graphics card buyers that they should consider AMD over NVIDIA because current and future games will require more VRAM, especially at higher resolutions. It's no secret that at least some of the PC gaming crowd has reached something of a consensus that NVIDIA is being too stingy with VRAM on its graphics cards, and AMD is clearly trying to cash in on that sentiment with its latest blog post. AMD shows the VRAM usage in games such as Resident Evil 4—with and without ray tracing at that—The Last of Us Part I and Hogwarts Legacy, all games that use 11 GB of VRAM or more.

AMD does have a point here, but as the company has yet to launch anything below the Radeon RX 7900 XT in the 7000 series, AMD is mostly comparing its 6000-series cards with NVIDIA's 3000-series cards, most of which are getting hard to purchase and are potentially less interesting for those looking to upgrade their system. That said, AMD also compares its two 7000-series cards to the NVIDIA RTX 4070 Ti and the RTX 4080, claiming up to a 27 percent lead over NVIDIA in performance. Based on TPU's own tests of some of these games, albeit most likely using different test scenarios, the figures provided by AMD don't seem to reflect real-world performance. It's also surprising to see AMD claim its RX 7900 XTX beats NVIDIA's RTX 4080 in ray tracing performance in Resident Evil 4 by 23 percent, where our own tests show NVIDIA in front by a small margin. Make of this what you will, but one thing is fairly certain: future games will require more VRAM, though the need for a powerful GPU most likely isn't going away either.
Source: AMD

218 Comments on AMD Plays the VRAM Card Against NVIDIA

#101
AusWolf
Winssy1. $1000 vs $1200. 2. The 4080 has way better RT performance, better upscale quality (DLSS) and frame generation capability. So yes, in my opinion paying $200 more is absolutely worth it.
1. I quoted UK prices at Scan. The 7900 XTX is £1,000, the 4080 is £1,300. That's a 30% price increase for roughly 2% less performance. Even in RT, it's only 16% faster. You must be crazy to think it's a good deal.
2. The difference between DLSS 2.0 and FSR 2.0 is placebo at best. Personally, I prefer native resolution to either of the two. I don't care about magic frames, either. The only valid argument is ray tracing, but even that isn't worth 30% extra money when it's only 16% faster.


Even at $1,200, the 4080 offers terrible value, as the performance-per-dollar numbers show:
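Performance per dollar is just relative performance divided by price. A minimal sketch of that arithmetic, assuming the Scan prices above and TPU's roughly 2% raster gap (illustrative figures, not a benchmark):

```python
# Illustrative performance-per-pound arithmetic for the two cards above.
# Raster performance is normalized to the RTX 4080 = 100; prices are the
# Scan UK figures quoted in this post, not official MSRPs.
cards = {
    "RX 7900 XTX": {"perf": 102, "price": 1000},  # ~2% faster in raster
    "RTX 4080":    {"perf": 100, "price": 1300},
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price']:.4f} perf/GBP")

# 0.1020 vs 0.0769: at these prices the XTX returns roughly a third more
# performance per pound, which is the gap being argued here.
```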
#102
lilhasselhoffer
ymdhisTranslation: our cards suck, but we put a lot of VRAM in them because that was the only thing we could do to make them look more valuable.
You know, I love reading about all this hate for AMD, and reading statements without substance, and then trying to parse out whether this is a fanboy post or someone with an actual point.

If for a moment you start to compare the two companies:
1) AMD does have a history of releasing bad drivers... but in the last decade that has turned around.
2) Both AMD and Nvidia have released bad hardware. AMD has poorly built coolers, Nvidia has poorly specified power connectors. Neither company is free from that.
3) Nvidia and AMD don't compete at the very high end. On the other hand, if you put the halo numbers aside, Nvidia and AMD both compete across the $100 to $1,000 market. It's kind of silly to have to state this, but the 7900 XTX just isn't built to compete with the 4090... and that's alright for anyone looking to spend south of a grand on a GPU.
4) Nvidia favors tech and closed-source solutions. AMD favors old-school performance and large numbers. Both are viable strategies. If Nvidia wants to be the best with edge-case tech, that's fine, because if AMD runs the 99.99% of games that don't use that new tech better, then it's a win for them.


Let me follow this up with a war chest of stupidity. Linked physics accelerator cards. TressFX. SLI. Crossfire. Good lord, I'm already depressed. My point is that both AMD and Nvidia have released some absolutely bizarre stuff. AMD rightfully earned a reputation for bad drivers (this from the owner of a 5000-series card), but Nvidia has also rightfully earned a reputation for charging super-premium prices. So we're clear, this is a permutation of the Apple vs. PC debate... or, put more succinctly, the ongoing debate between absolute ease of use and features.



Now that I've said all of this, let me end with a real response. Yes, AMD is obviously going to point out the unique features which make their product more competitive when social media suddenly brings them to the fore. Nvidia does the exact same with their streaming integration and absolutely silly ray tracing videos... and somehow that's not called out the same way? I'm a firm believer that both companies offer value in different ways, so seeing this level of criticism for what is standard practice is baffling. Long live both team Red and team Green; may they actually compete and get card prices back down to a place where a good medium-tier card (x070 and x800, respectively) costs less than an entire game console. That's the win I'm hoping for, and if we want that, it'll take both of them competing harder.
#103
john_
WinssyAMD is desperately trying to sell their cards. According to all the tests, on average the difference between the 4080 and the 7900xtx in regular rasterization is only 2-3%, but they are trying to make it seem like their product dominates with a 10-20% lead everywhere except for five games with ray tracing) It's a pitiful attempt.
You see things from a totally wrong perspective here. Remember: Nvidia has already won. They don't need to take the rest of the market share, and they shouldn't manage to, either. Why?
Let's see what will happen if AMD's, as you say, DESPERATE marketing succeeds.

- People might start buying AMD cards. That's good for competition. It could force Nvidia to lower prices or come up with more models addressing the reasons people are turning to AMD.
- People don't buy AMD, but they don't buy Nvidia either. And I don't mean they skip the 8GB-12GB models and turn to the 16GB-24GB models, which is what Nvidia would want to see and probably what some posters would want to see. I mean they DON'T BUY NVIDIA EITHER. Nvidia would have to address the reasons that make consumers hesitate to buy their cards. So we could see new 4070 models coming out with 16GB of VRAM, and the 4060 and 4060 Ti coming out with a minimum of 12GB instead of 8GB. Models with less VRAM could see a price reduction or even slowly get removed from the market.

So, why call out "desperate AMD" and their "pitiful attempts" instead of hoping they succeed?
#104
bug
AusWolfYeah, the £1,000 7900 XTX is only 2-3% faster than the £1,300 4080. How pitiful, really! :rolleyes:
He only said it's pitiful how AMD is trying to paint it faster than it is. Pricing is a different aspect.

As far as I'm concerned, whether it's $1,000 or $1,000,000, it's all the same to me. I'm never-ever going to pay that much for a video card.
#105
Winssy
AusWolfI prefer native resolution to either of the two. I don't care about magic frames, either.
If you're not interested in upscaling or frame generation, that's your personal choice. I am interested in those features. Currently, I'm playing Cyberpunk with the new path tracing at 1440p on ultra settings, with DLSS Q and FG enabled, and I'm getting 95-115 fps with great smoothness and image quality on my 4080. The 7900xtx is not even capable of coming close to that level of performance. In my opinion, paying 20% more, or even 30% more in your case, is absolutely worth it.
Regarding prices in specific countries: here in Poland, the cheapest 4080 (Gainward) costs $1,287 and the cheapest 7900xtx (Biostar) costs $1,170. The price difference is 10%. And such a "cheap" 7900xtx is available from only one manufacturer, Biostar. Prices for this card from other brands start at $1,287.
Your "performance per dollar" graph doesn't take into account either RT or FG.
#106
Suspecto
kapone32Expect this trend to continue to grow instead of diminish. AMD is in the Xbox and PS5. That means games made for all three are already optimized for AMD. One of the things I have noticed about AMD in new games is how consistent the clocks are, translating to ultra-smooth gameplay. This is why, for me, averages mean nothing and 1% lows are everything. Once you go X3D you don't go back.
1. The trend will stop once they reach the limit on consoles.
2. The hardware in consoles is entirely different from that in PCs; not to mention, consoles use a different API to access it and do not rely on drivers. With AMD's upcoming CPU and GPU architectures, they will lose what little indirect optimisation advantage they had, because those will be almost wholly different architectures from the ones currently in the consoles. It is the same story every console generation. It is getting really dull.

The 7800X3D only matches the Intel 13900K's lows depending on the game, and there are still plenty of titles in which Intel is considerably faster, including in 1% and 0.1% lows, so I don't know what you are talking about. AMD has no particular advantage in the lows, and when it comes to the two-CCX models, the 7900X3D and 7950X3D, it is actually a weakness if games decide to use the CCX without the extra cache.
#107
Winssy
john_So we could see new 4070 models coming out with 16GB of VRAM, and the 4060 and 4060 Ti coming out with a minimum of 12GB instead of 8GB.
We will never see a 4070/4070 Ti with 16GB of VRAM. It's not as simple as just soldering on extra memory chips; the entire GPU would need to be redesigned. This would only be possible if these cards were based on the AD103 chip from the 4080, which is not practical. The 4060/4060 Ti also cannot have more than 8GB, for the same reason.
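The underlying arithmetic: each GDDR6/6X chip occupies a 32-bit channel, so the bus width fixes the chip count, and capacity is chips times chip density. A minimal sketch using the publicly known Ada bus widths; the clamshell option (two chips per channel) is a general GDDR6 board technique and purely an assumption here, not anything Nvidia has announced:

```python
# Why VRAM capacity is tied to memory bus width: one GDDR6/6X chip
# per 32-bit channel, so capacity = channel count * chip density.
def vram_gb(bus_bits: int, gb_per_chip: int, clamshell: bool = False) -> int:
    chips = bus_bits // 32          # one chip per 32-bit channel
    if clamshell:                   # two chips sharing each channel
        chips *= 2
    return chips * gb_per_chip

print(vram_gb(192, 2))              # AD104 (4070/4070 Ti): 12 GB
print(vram_gb(128, 2))              # AD106 (4060 Ti):       8 GB
print(vram_gb(256, 2))              # AD103 (4080):         16 GB
print(vram_gb(128, 2, True))        # 16 GB on a 128-bit bus would need
                                    # clamshell mode, not a new die
```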

I personally believe that prices for all graphics cards are unreasonably high right now. This is a very bad trend. At one time, I used to build SLI configurations with two 470/770/970 cards, and it was 50% cheaper than buying a single 4080 now. I advocate for healthy competition, but when I see AMD's latest slides, I am appalled by their lies. Do they think people don't look at real tests before buying?) If they want to compete properly, they need to do only one thing: lower prices. Right now, I see the recommended prices for the 4080 and 7900xtx as fair in terms of the ratio between the cards. They have roughly the same performance in regular rasterization, and Nvidia has an advantage in RT and technologies, which is why the 4080 is 20% more expensive. But this doesn't change the fact that both cards are unreasonably expensive. I would like to see the 4080 at $900 at most and the 7900xtx at $800.
#108
kapone32
Suspecto1. The trend will stop once they reach the limit on consoles.
2. The hardware in consoles is entirely different from that in PCs; not to mention, consoles use a different API to access it and do not rely on drivers. With AMD's upcoming CPU and GPU architectures, they will lose what little indirect optimisation advantage they had, because those will be almost wholly different architectures from the ones currently in the consoles. It is the same story every console generation. It is getting really dull.

The 7800X3D only matches the Intel 13900K's lows depending on the game, and there are still plenty of titles in which Intel is considerably faster, including in 1% and 0.1% lows, so I don't know what you are talking about. AMD has no particular advantage in the lows, and when it comes to the two-CCX models, the 7900X3D and 7950X3D, it is actually a weakness if games decide to use the CCX without the extra cache.
Where are your system specs? You are telling me that my 7900X3D does not always drive my GPU? And if the CCX without the cache is used, what is the all-core maximum clock speed? One of the assumptions people make is that CPUs do not improve beyond Day 1 reviews and that Microsoft updates do not bring performance improvements.

Now let's get into your opinions.

1. Once the limit is reached? When consoles have 16GB of VRAM? I have not heard one owner of a 6700 or above complaining.
2. The hardware is different? Explain to me how, in hardware, the AM4 CPU and RDNA2 GPU in the PS5 and Xbox are different from AM4 on PC. AMD is going to abandon its partners? Will RDNA3 not work on AM4?
3. Do you think that Xbox Game Pass on PC is going to become incompatible when the Xbox2 or whatever launches on AMD systems? Did you see the post about Game Pass being in 40 countries?

Yep, you can talk about the 13900K, but what happens this time next year, when I can buy whatever AMD releases that will likely make a 13900K look like an 11900K? It does not matter though, because all of that is just oxygen for the culture wars. I am of the strong opinion that if you put four people in front of four systems and don't tell them what is inside, they will not know which one has AMD or Intel without being told.

I do have to say, though, that X3D is that good, and you can't convince me, or likely anyone that owns one, not to appreciate that. The 13900K is also an absolute beast of a CPU, and we should be glad that there are real choices to make you smile, instead of preaching opinions from people that may have different reasons than you to say what they say.
#109
QUANTUMPHYSICS
You get what you pay for.

I paid for a 4090.

I got 24GB.

I think the funny thing is that most people still using 1080p and 1440p monitors are the ones so emotionally concerned with 4K performance.

My local Microcenter (WESTBURY) has PLENTY of 4090s in stock right now. There's no way I'd buy any card below the "top model". My 3090 served me extremely well during Cyberpunk's release week and continues to serve me well in my DCS WORLD flight sim with Oculus VR.

The 3090 and the 4090 were UNCOMPROMISING. If they'd put more VRAM in the 4080, I could say the same thing there as well.

You need to upgrade your rig to a 1000W PSU and get a bigger tower case. I got the Strix 4090 on launch and that behemoth is as large as my PS5. I AM SADDENED that EVGA stepped out and I couldn't get a Kingpin 4090... but I may consider selling my card and getting a Liquid Suprim X, or waiting for the 4090 Ti version of it. I would wish people waiting in line for a 4070 luck - BUT THEY WON'T NEED IT. The economic downturn and lack of stimulus welfare checks is keeping demand low.
#110
BoboOOZ
WinssyWe will never see a 4070/4070 Ti with 16GB of VRAM. It's not as simple as just soldering on extra memory chips; the entire GPU would need to be redesigned. This would only be possible if these cards were based on the AD103 chip from the 4080, which is not practical. The 4060/4060 Ti also cannot have more than 8GB, for the same reason.
Technically it is possible when higher-density memory chips become available; on the last gen there was a guy offering to make "custom" 3080 20GB cards (desoldering the 1GB chips and soldering on 2GB ones). This doesn't require any modification to the die; the bus width stays the same. The probability that we see this from Nvidia is very, very small, though.
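The arithmetic behind that mod, as a quick sketch (the 320-bit bus and ten-chip configuration are the 3080's known specs):

```python
# RTX 3080: a 320-bit bus means ten 32-bit channels, one GDDR6X chip each.
chips = 320 // 32
print(chips * 1)   # stock 1 GB chips   -> 10 GB
print(chips * 2)   # swapped 2 GB chips -> 20 GB, same bus, same die
```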
#111
tommo1982
AMD ought to stop making dumb comments and concentrate on developing their hardware. Their comment looks pathetic after the RDNA3 launch and Zen 4 pricing.
TheinsanegamerNThat's another big part; people forget that console settings are perfectly playable even on RX 5600 XTs now. You just gotta accept low/medium settings at 1080p, maybe 1440p with resolution scaling turned down.

Consoles are not running 4K ultra at native res 60+ FPS.
They forget console games must be well optimised. The hardware is fixed, and the game has to run smoothly on both the current and previous generation of consoles. The games are also limited by the controller; I don't see the Civilization series being playable on PS5 or Xbox Series.
Besides, I had a PS3. It was fun, but the games were getting old quickly and the internet browser was horrible, so it just sat there unused for months. If I could use it like an all-in-one PC, albeit impossible to upgrade, it'd make more sense. I've considered buying a console numerous times; it just doesn't seem viable for many reasons.
#112
Winssy
BoboOOZTechnically it is possible when higher-density memory chips become available; on the last gen there was a guy offering to make "custom" 3080 20GB cards (desoldering the 1GB chips and soldering on 2GB ones). This doesn't require any modification to the die; the bus width stays the same. The probability that we see this from Nvidia is very, very small, though.
No. All RTX 4000 graphics cards already use 2GB chips. And there definitely won't be any 3GB chips in the next year or two. If 3GB chips do appear, it will only be with the release of the RTX 5000.
#113
john_
WinssyWe will never see a 4070/4070 Ti with 16GB of VRAM. It's not as simple as just soldering on extra memory chips; the entire GPU would need to be redesigned. This would only be possible if these cards were based on the AD103 chip from the 4080, which is not practical. The 4060/4060 Ti also cannot have more than 8GB, for the same reason.

I personally believe that prices for all graphics cards are unreasonably high right now. This is a very bad trend. At one time, I used to build SLI configurations with two 470/770/970 cards, and it was 50% cheaper than buying a single 4080 now. I advocate for healthy competition, but when I see AMD's latest slides, I am appalled by their lies. Do they think people don't look at real tests before buying?) If they want to compete properly, they need to do only one thing: lower prices. Right now, I see the recommended prices for the 4080 and 7900xtx as fair in terms of the ratio between the cards. They have roughly the same performance in regular rasterization, and Nvidia has an advantage in RT and technologies, which is why the 4080 is 20% more expensive. But this doesn't change the fact that both cards are unreasonably expensive. I would like to see the 4080 at $900 at most and the 7900xtx at $800.
Obviously the data bus could be a reason not to see cards with more VRAM, but we could see, as you say, cut-down versions of AD103 going into 4070s and AD104 going into a 4060. If not this year, maybe next year. Nvidia does this constantly, and while we could argue those are just cases where there is a lack of chips, or plenty of problematic chips, if sales are not as expected, that's also a reason for Nvidia to choose to come out with new models with more VRAM. They have plenty of profit margin to play ball, and I doubt the cost of the silicon is prohibitive. The cost of the silicon is mostly the R&D cost associated with its development, not the cost of the silicon itself, not what TSMC charges.

While you give a detailed explanation of how you see things, you don't really answer my question.
john_So, why call out "desperate AMD" and their "pitiful attempts" instead of hoping they succeed?
I do understand that you talk about lies. Well, which company doesn't lie? Nvidia? You say that AMD is lying by promoting the importance of having more VRAM. How about saying nothing and promoting $600-$800+ cards to the general public as cards capable of maxing out every AAA game at 1440p? Isn't this a lie? And no, people don't look at real tests before buying. You think people have the time to research every product they buy to see if it is the best on the market? No one does that.

And let me repeat the reply I gave another member a few days ago on why AMD can't see lowering prices as a viable strategy today (a rough numeric sketch follows the list):
john_AMD profit margin 45%, Nvidia profit margin 65%
AMD card A at $700, Nvidia card A at $800 (same performance) - people buy Nvidia
AMD drops price at $600, Nvidia follows at $700 - people buy Nvidia
AMD drops price at $500, Nvidia follows at $600 - people buy Nvidia
AMD drops price at $400, Nvidia follows at $500 - people buy Nvidia
AMD loses interest because of low or no profit, Nvidia still makes a good profit, because of the higher profit margin they begin with.
Next, Nvidia card A comes at $1000, AMD follows with their next card A at $900. AMD NEVER drops the price, knowing from the beginning they will lose. Nvidia never drops the price, because it sees consumers buying. Consumers blame AMD for the high and stable prices, because they demand that AMD fix Nvidia's pricing. Things only get worse in the future.
Only Nvidia can fix Nvidia's pricing.
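A rough numeric version of that list, assuming each vendor's unit cost is pinned by its starting price and margin; the 45%/65% margins, starting prices, and $100 steps are the hypothetical figures above, nothing more:

```python
# Hypothetical price war from the list above: AMD starts at $700 with a
# 45% margin, Nvidia at $800 with 65%; both cut $100 per round.
amd_cost = 700 * (1 - 0.45)   # $385 unit cost implied by the margin
nv_cost  = 800 * (1 - 0.65)   # $280 unit cost

amd, nv = 700, 800
while amd > amd_cost:
    print(f"AMD ${amd}: ${amd - amd_cost:.0f}/card | "
          f"Nvidia ${nv}: ${nv - nv_cost:.0f}/card")
    amd -= 100
    nv -= 100

# Last round: AMD clears $15 per card while Nvidia still clears $220.
# The lower-margin vendor runs out of room to cut first.
```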
#114
BoboOOZ
john_And let me repeat the reply I gave another member a few days ago on why AMD can't see lowering prices as a viable strategy today


Only Nvidia can fix Nvidia's pricing.
That's basically what Jim says in this video: selling cheaper was a losing strategy for AMD in the past; smaller and smaller margins lead to less and less R&D budget. If they are to gain market share, they must do it while also keeping healthy margins. The same goes for Intel's GPU adventure, btw.
#115
john_
BoboOOZ
That's basically what Jim says in this video: selling cheaper was a losing strategy for AMD in the past; smaller and smaller margins lead to less and less R&D budget. If they are to gain market share, they must do it while also keeping healthy margins. The same goes for Intel's GPU adventure, btw.
I haven't seen the video, but it's just logic. People hope for AMD to lower prices so they can pay less for Intel and Nvidia hardware. Well, that's not sustainable. They will buy cheaper Intel and/or Nvidia hardware once, twice, three times, and that's it. With AMD having neither the income it needs nor a market share increase to at least hope to gain customers for its next products, they will lose the ability, or even the interest, to compete. Then monopoly happens, and whatever people saved thanks to AMD's price-lowering strategy, they end up paying double or more to Intel and Nvidia, because they are a monopoly now. Fun part: they blame AMD, when it's them who didn't give AMD money to use on R&D and keep making competitive products. They were expecting others, less smart, to do it, to buy AMD hardware. They were the smart ones, taking advantage of AMD's strategy to buy cheaper Intel and Nvidia hardware. And yes, the same goes for Intel's GPU adventure. I didn't pay any money to get an ARC GPU. Has anyone in here done so? Maybe one or two. Who expects Intel to save them from Nvidia's pricing? Probably many more than just one or two. Well, it will never happen if Intel sees that department losing money every quarter. Of course, Intel enjoys strong ties with OEMs, but that is probably enough only to empty its inventory, not to make money.
Anyway, all those trolls, fanboys, or just plain posters posting their own "objective" opinion about how bad AMD is, how bad their drivers are, how bad an idea it is to buy an RX 6600 over an RTX 3050, all of those brought us to a situation where AMD can't compete in GPUs. And they can't even start a price war, because they know they will lose. And maybe they don't care, because they know that the whole "increasing prices" strategy from Nvidia helps them too. GPUs will become more expensive, and even integrated GPUs will become more expensive, at least the "strong" ones in the APUs. Everything will become more expensive in the end, but rejoice: people will avoid those bad Radeon drivers and those bad AMD graphics cards.

Damn, people keep shooting themselves in the foot and never realizing it... Only a couple of YouTubers out there seem to realize it, and they are constantly getting attacked: Hardware Unboxed, which is of course "AMD Unboxed", even when their videos support Nvidia products and technologies, and Linus of LTT, who sees that we are heading into a monopoly and at least tries to push AMD as an alternative. Other YouTubers still drive all of their audience's focus to Nvidia. In their last video, Gamers Nexus also realized that AMD makes GPUs too. They were promoting Nvidia constantly for the last 12 months. JayzTwoCents still makes videos that Nvidia's marketing department would envy. JMO
#116
Winssy
john_You say that AMD is lying by promoting the importance of having more VRAM.
No. I said that AMD is lying when they show slides where their 7900XTX outperforms the 4080 by 10-20% in all games except for five games with RT. In reality, these cards are roughly equal in regular rasterization, as seen from the tests I have provided. As for VRAM, yes, when it comes to the 3070 8GB and 6700XT 12GB, AMD looks good here. Although the GPU's performance is lower, having 12GB of VRAM allows for better textures in horribly optimized console ports like TLOU. But in the case of the 4080 16GB and 7900XTX 24GB, this is already absolutely unimportant because 16GB is enough everywhere and will be enough for a long time.
I understand your point of view on reducing prices, it's quite logical. But there's one thing. Wouldn't it be better for AMD to put 16GB of VRAM in the 7900XTX, thus saving money and selling more cards at a lower price while maintaining their profit margin and strengthening competition with Nvidia? Instead, they decided to pay people who make useless slides about the usefulness of 24GB in the distant future and about how in their fantasy the 7900XTX outperforms the 4080 in 95% of games)
And look at FSR 2... There would never have been this upscaling if Nvidia hadn't made its DLSS. The same story with FSR 3. Without Nvidia's FG, AMD wouldn't have even thought about it. The support for ray tracing on RX 7000 also surprises me. Couldn't they have implemented it better in the RX 7000 after the poor results of the RX 6000? Even the first generation of cards from Intel works with RT better. When attempts to compete come down to playing catch-up and publishing false slides, I logically cannot hope for any success. That is why I am calling out AMD for their pitiful attempts instead of hoping they succeed.
#117
john_
WinssyNo. I said that AMD is lying when they show slides where their 7900XTX outperforms the 4080 by 10-20% in all games except for five games with RT. In reality, these cards are roughly equal in regular rasterization, as seen from the tests I have provided. As for VRAM, yes, when it comes to the 3070 8GB and 6700XT 12GB, AMD looks good here. Although the GPU's performance is lower, having 12GB of VRAM allows for better textures in horribly optimized console ports like TLOU. But in the case of the 4080 16GB and 7900XTX 24GB, this is already absolutely unimportant because 16GB is enough everywhere and will be enough for a long time.
OK, but every company manipulates its charts. That's why everyone says "wait for independent benchmarks". You think that when Nvidia makes charts, it doesn't test under the best-case conditions for its cards? Don't use double standards.
I understand your point of view on reducing prices, it's quite logical. But there's one thing. Wouldn't it be better for AMD to put 16GB of VRAM in the 7900XTX, thus saving money and selling more cards at a lower price while maintaining their profit margin and strengthening competition with Nvidia? Instead, they decided to pay people who make useless slides about the usefulness of 24GB in the distant future and about how in their fantasy the 7900XTX outperforms the 4080 in 95% of games)
You are still missing my point. AMD CAN NOT sell much lower. If the 7900XTX had come with 16GB at $900, the RTX 4080 could have seen a $100 price drop. And I say COULD and not WOULD, because in today's reality people would still keep paying the extra money to Nvidia. But let's say people were willing to buy an AMD card. Then the RTX 4080 WOULD have seen a price drop of $100, and you would be saying here, "Why hasn't AMD named the 7900XTX the 7800XT and sold it for $700? It's their fault they don't sell." That's how we go down in pricing, to levels where Nvidia makes money and AMD does not. Because AMD enjoys much lower profit margins, and Nvidia is the brand that sells 9 times more cards. Even if their profit margins were the same, with Nvidia selling 9 times more, AMD doesn't make money, or even loses money.
And look at FSR 2... There would never have been this upscaling if Nvidia hadn't made its DLSS. The same story with FSR 3. Without Nvidia's FG, AMD wouldn't have even thought about it. The support for ray tracing on RX 7000 also surprises me. Couldn't they have implemented it better in the RX 7000 after the poor results of the RX 6000? Even the first generation of cards from Intel works with RT better. When attempts to compete come down to playing catch-up and publishing false slides, I logically cannot hope for any success.
I was always saying that Nvidia innovates like no other and that Huang is a genius. I was also shouting about the RT performance when the RX 7000 came out, and people were calling me an Nvidia fanboy. They did have a point, in a way: RTX 3000-level performance was great when it came from Nvidia cards, yet a few months later the same RTX 3000-equivalent performance is trash when it comes from AMD cards. As for DLSS, that was cheating a few years ago. But Nvidia does have the market share and the tech press support to promote these kinds of tricks. Upscaling isn't an Nvidia invention; it was out there all those years. It was just cheating. And DLSS/FSR 3.0 are just worse tricks that are based on one simple reality: the eye can NOT spot the difference. Even if it spots it, the brain chooses to ignore it in favor of the higher framerate. You get inferior image quality, but you can NOT realize it. The only thing you understand is that you put "ultra" in the settings and 4K as the resolution. So you get, in your mind, ultra settings and 4K at the same time. Right? Well, no, but let's all agree to say yes.
GSync? That was also a good idea from Nvidia. Again, not their invention; using it in desktop PC monitors for smoother gaming, that was their idea. And even there, there was a way to implement it so it would be free for everyone, but no, let's throw a hardware board in there, make money even from that board, and make it Nvidia-exclusive. Do you remember PhysX? Don't get me started on PhysX. Anyway, Nvidia innovates, but for their own profit. So even if AMD follows, you have to be grateful that there is someone following, and also giving away the results of their efforts as FREE alternatives. Would it be better if there was NO ONE following Nvidia? My new TV is FreeSync Premium and GSync compatible through HDMI 2.1. It's GSync-compatible thanks to AMD, not Nvidia. Nvidia followed AMD here.

You ask AMD to compete when, at the same time, you will probably do anything to stop people from buying their cards. So why compete if no one will buy their cards? You ask for AMD's marketing slides to be perfect, but do you ask Nvidia NOT to lie in their marketing slides? It's not just you. There are so many people out there expecting AMD to build the best hardware, offer it at the lowest prices, and also market it in ways that show not its strengths against competing hardware, only its disadvantages. So many that I could copy this post and repost it 1,000 times over the next 10 years.
That is why I am calling out AMD for their pitiful attempts instead of hoping they succeed.
But you will never call out Nvidia. Except on pricing, another fault that is 100% AMD's fault, NOT Nvidia's...
#118
antuk15
QUANTUMPHYSICSI think the funny thing is that most people still using 1080p and 1440p monitors are the ones so emotionally concerned with 4K performance.
You hit the nail on the head!
#119
CGLBESE
AMD could also have played on their (currently) better support. I had a Vega 56, which was on par with the GTX 1070... and then I regularly checked some YouTube videos here and there comparing the two over time. I was astonished to see that the Vega was still playable (1080p V. High/Ultra on some titles) while GTX owners had to scale back on some settings. (But all good stories have to end, and I recently replaced it with an RX 6700 XT after over 5 years...)

As for the equipped-VRAM question, we can all be sure that the RXs with 10-12 GB will age better than their 8 GB-geared RTX counterparts; the experience should be smoother in demanding games.

And talking about FSR/DLSS, don't get on this slippery path. A few years ago (was it for the GeForce 5xx or 6xx series? I honestly can't recall), there was proof that nVidia started playing with rendering quality to take the lead at Ultra settings. Nowadays, this type of cheat is considered a feature.

But yes, I can say that nVidia can innovate, RT being one nice piece of eye candy, even if it is not mature yet (I still remember the fuss about the first anti-aliasing techniques back in the late '90s :D)

And don't get me wrong, I had a lot of cards from various vendors over the last 30 years or so, from Trident to S3, to Matrox, to 3DFx, then a relatively balanced mix between ATi/AMD and nVidia (and a Kyro2, I must say :))
#120
antuk15
CGLBESEAs for the equipped-VRAM question, we can all be sure that the RXs with 10-12 GB will age better than their 8 GB-geared RTX counterparts; the experience should be smoother in demanding games.
Not necessarily, having the VRAM is not the same as having the processing power required.

The 8GB cards could end up running lower texture settings but higher quality effects.
#121
BoboOOZ
antuk15Not necessarily, having the VRAM is not the same as having the processing power required.

The 8GB cards could end up running lower texture settings but higher quality effects.
Try playing a good-looking game with textures set to low. Observe the effect.
Textures are what you see everywhere in a game and are the cheapest (performance-wise) boost to visual quality, almost free as long as you have enough VRAM. Volumetric clouds and other performance-intensive effects like that don't make anywhere near as much of a difference visually. Since most games are made for consoles, most effects look fine between medium and high, because that's how they were conceived by the game designers. Ultra/epic is just added with the port to PC to give high-end PC gear something to chew on. But textures are meant to be used at max, because they are free performance.

Hardware Unboxed used to make videos finding the best settings for games, with a good balance between performance and visual quality. Most games look good starting from medium/high effects, except for textures, which should be at maximum.
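The "free performance" point is easy to quantify: sampling a resident texture costs the GPU almost nothing, while the VRAM cost grows with resolution. A back-of-the-envelope sketch for a generic uncompressed RGBA8 texture (illustrative arithmetic, not data from any particular game):

```python
# Approximate VRAM footprint of one uncompressed RGBA8 texture,
# mipmap chain included (the mip pyramid adds roughly one third).
def texture_mb(w: int, h: int, bytes_per_texel: int = 4) -> float:
    return w * h * bytes_per_texel * (4 / 3) / 2**20

print(f"2K texture: {texture_mb(2048, 2048):5.1f} MB")   # ~21 MB
print(f"4K texture: {texture_mb(4096, 4096):5.1f} MB")   # ~85 MB

# Block compression (BCn) cuts these sizes by 4-8x in real games, but
# the trade-off is the same: texture quality costs VRAM capacity,
# not shader throughput, until the pool spills over the PCIe bus.
```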
#122
antuk15
BoboOOZTry playing a good-looking game with textures set to low. Observe the effect.
Textures are what you see everywhere in a game and are the cheapest (performance-wise) boost to visual quality, almost free as long as you have enough VRAM. Volumetric clouds and other performance-intensive effects like that don't make anywhere near as much of a difference visually. Since most games are made for consoles, most effects look fine between medium and high, because that's how they were conceived by the game designers. Ultra/epic is just added with the port to PC to give high-end PC gear something to chew on. But textures are meant to be used at max, because they are free performance.

Hardware Unboxed used to make videos finding the best settings for games, with a good balance between performance and visual quality. Most games look good starting from medium/high effects, except for textures, which should be at maximum.
Well-designed, good-looking games have good textures even below maximum texture settings.

RTGI will offer a higher visual boost than higher texture settings will.
#124
AusWolf
antuk15RTGI will offer a higher visual boost than higher texture settings will.
Better lighting can make one shit one's pants in joy, but higher model quality and better textures make the overall image look more real. I prefer realism to eye candy.
BoboOOZTry playing a good-looking game with textures set to low. Observe the effect.
Textures are what you see everywhere in a game and are the cheapest (performance-wise) boost to visual quality, almost free as long as you have enough VRAM. Volumetric clouds and other performance-intensive effects like that don't make anywhere near as much of a difference visually. Since most games are made for consoles, most effects look fine between medium and high, because that's how they were conceived by the game designers. Ultra/epic is just added with the port to PC to give high-end PC gear something to chew on. But textures are meant to be used at max, because they are free performance.
Exactly! I can live with fewer light sources, or non-volumetric fog, but I can't do with mushy textures.
#125
tpa-pr
I was going to try to write a more technical post, but I don't have the knowledge to back it up (yet), so I'll just reiterate: corporate marketing, TAME corporate marketing at that, is a silly thing to get into a mud fight over. AMD and Nvidia are not your friends, and you gain nothing by it.

From a common-sense perspective: more VRAM is going to be more future-proof. You have more resources available when future games demand more. AMD sells higher-VRAM-equipped cards more than Nvidia does; ergo, this advantage belongs to them, and it is a valid critique of the competition. Just as Nvidia's ray tracing prowess is a future-proof advantage that belongs to them. Once again, don't get offended; just buy the equipment that better suits your needs. Saddling yourself to a single tribe is doing yourself a disservice, and defending that tribe with empty ad-hom and no data is a good way to look foolish.