
Are game requirements and VRAM usage a joke today?


Intel is growing, nVidia is growing... the statistics show that 90% are "idiots" who do NOT buy AMD. AMD also has problems in the desktop processor market. Coming back to video cards, I don't think "old habits" are the cause. Actually, I'm sure they aren't. AMD fights with low prices and some extra memory because in every other respect it is far behind nVidia. Anyway, it's funny to watch some AMD owners hold up one very poorly optimized title to deny the reality of the other 99.99% of titles.

So basically, buying Nvidia for DLSS is completely sensible, but buying AMD for the lower prices and a bit more VRAM is wrong. That's not a biased statement at all. I rest my case here.
 
So basically, buying Nvidia for DLSS is completely sensible, but buying AMD for the lower prices and a bit more VRAM is wrong. That's not a biased statement at all. I rest my case here.

Yeah, I've never understood why some are so obsessed with what others buy. If someone thinks a 4060 Ti is the best GPU in the world for them, awesome; it makes zero difference that I think it's garbage and possibly the worst GPU since the 960. I'm happy for people regardless of what tech they purchase; new tech is always fun.
 
Your hate-on for AMD is getting tiresome.
Hatred? Let's be serious! I'm just trying to dismantle some nonsense coming from some.

So basically, buying Nvidia for DLSS is completely sensible, but buying AMD for the lower prices and a bit more VRAM is wrong. That's not a biased statement at all. I rest my case here.
It depends on the case. There is no scenario in which I'd buy the 7600 over the 4060. The difference is 40 euros here. The 4060 offers a bit more rasterization performance, supports DLSS FG, and consumes about 40% less power. The 4070 has enough memory for its target (1440p), and the 7900 XTX with RT ON averages at the level of the 4070 Ti. Everyone chooses what they want, but you can't deny that a video card means more than just rasterization nowadays.
 
Hatred? Let's be serious! I'm just trying to dismantle some nonsense coming from some.

It sure doesn't come across that way. You've spent the last two+ weeks responding to anybody admitting to owning and/or liking a Radeon with explanations, charts and graphs about why their decision was wrong, and why they're foolish for making it. At this point, you're quoting market share numbers. That's not even relevant to the topic.

I'm basically with you on what I believe to be your original thesis on VRAM: some cards have more than they need, whether for marketing or architectural reasons; there's little use in assigning more VRAM to a chip than it can reasonably make use of; and many games could do a better job of managing the memory available to them. But you're not defending that position anymore, you're trotting out reason after reason why Nvidia > AMD. Let it go.

Speaking of letting things go, OP hasn't contributed since day one. I strongly suspect they were trolling the regulars, and seem to have done a damned fine job.
 
Hatred? Let's be serious! I'm just trying to dismantle some nonsense coming from some.


It depends on the case. There is no scenario in which I'd buy the 7600 over the 4060. The difference is 40 euros here. The 4060 offers a bit more rasterization performance, supports DLSS FG, and consumes about 40% less power. The 4070 has enough memory for its target (1440p), and the 7900 XTX with RT ON averages at the level of the 4070 Ti. Everyone chooses what they want, but you can't deny that a video card means more than just rasterization nowadays.
Now you're comparing hand-picked examples with hand-picked criteria to suit your own twisted narrative. Bravo! :rolleyes:

4060 is better than the 7600 because it's faster at rasterization (according to your claim), but the 4070 Ti is better than the 7900 XTX because of its RT performance, not even considering how much faster the 7900 is in pure raster?

Sure, you can twist every situation to make Nvidia (or AMD) look better, but what's the point? Everybody buys what they consider good value at the given price point.

With this logic, I could say that Nvidia is a bunch of liars trying to sell RT and fake frames like you couldn't live without them, and AMD is also a bunch of liars trying to sell on price and often times unnecessary amounts of VRAM because they haven't got the technical superiority. So don't buy either, go read a book. Fair? ;)
 
I'm sick and tired of the "AMD users can't have an opinion" bullshit. I used to have a 2070 (I still do on top of a shelf), and played Cyberpunk 2077 with DLSS Quality at 1080p.
From my perspective, and when it came to a head some months ago in the DLSS discussion thread, it's because you have an opinion of DLSS based on antiquated data. Cyberpunk had, what... DLSS 2.1.x maybe 2.2.x around launch? It's come a long way since then.

I don't think you can't have an opinion on it, but given the data and experience you're working with, it's not exactly the most up-to-date or informed opinion. I'd highly encourage you to play around with your RTX 2070 in 2023 games using DLSS 3.5+, just to true up your experience with it. Hey, you might feel the same afterwards and that's cool, but it has absolutely improved since the ~2.2.x days, and it would surprise me if you didn't notice the improvement, even (especially) at 1080p.

Anyone can have an opinion on anything; personally, I just apply a sort of personal weighting to how much merit opinions carry depending on the person's amount/quality of experience with the tech. At a basic level it looks like this, at least for DLSS specifically:
  • Most valuable opinions - currently owns and operates an RTX card as their daily gaming machine, and plays newly released games with latest DLSS versions
  • Somewhat valuable opinions - Has some point in time experience with RTX card and DLSS, but not current, and length of time since last use brings this higher or lower (where I see your thoughts on it)
  • Not very valuable opinions - No direct experience with the technology whatsoever, only ever read/watched online content/comparisons/discussion
  • Worthless opinion - No direct experience and airing clear anti-Nvidia sentiments and bias.
C'mon now... how old is that card? And you're playing one of the heaviest games out there on it. Of course it will be less than optimal. It doesn't run well on my 980 either. Why would it?
It runs DLSS just like a 4080 does.
To these points I'd also add that, quantifiably, FPS matters too. A DLSS image will look better the higher the framerate; IIRC there are diminishing returns beyond 60 fps, but I remember empirical testing of ~30 vs ~60 showing moiré patterns and more image breakup at ~30 fps, and those artefacts were either dramatically reduced or gone at ~60+. Outside of that you're definitely spot on: your 2070 will compute like-for-like DLSS images just as a 4080 does; the only mismatch is muscle, which lets the 4080 push higher fps and internal resolutions to derive its better overall result.
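Since "internal resolution" keeps coming up, here's a rough Python scratchpad showing roughly what each DLSS preset renders before the upscale, just to make the "muscle" point concrete. The per-axis scale factors are the commonly cited defaults and are an assumption on my part; individual games and SDK versions can use different values.

```python
# Rough scratchpad: approximate internal render resolution per DLSS preset.
# The per-axis scale factors are the commonly cited defaults (assumption);
# individual games / SDK versions may use different values.
DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w, out_h, preset):
    """Approximate resolution the GPU actually renders before DLSS upscales it."""
    s = DLSS_SCALE[preset]
    return round(out_w * s), round(out_h * s)

out_w, out_h = 2560, 1440  # example output resolution
for preset in DLSS_SCALE:
    w, h = internal_resolution(out_w, out_h, preset)
    share = w * h / (out_w * out_h) * 100
    print(f"1440p {preset}: ~{w}x{h} internal ({share:.0f}% of the output pixels)")
```

The math is the same on a 2070 and a 4080; the cards only differ in how fast they can chew through those internal pixels.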
 
Now you're comparing hand-picked examples with hand-picked criteria to suit your own twisted narrative. Bravo! :rolleyes:

4060 is better than the 7600 because it's faster at rasterization (according to your claim), but the 4070 Ti is better than the 7900 XTX because of its RT performance, not even considering how much faster the 7900 is in pure raster?

Sure, you can twist every situation to make Nvidia (or AMD) look better, but what's the point? Everybody buys what they consider good value at the given price point.

With this logic, I could say that Nvidia is a bunch of liars trying to sell RT and fake frames like you couldn't live without them, and AMD is also a bunch of liars trying to sell on price and often times unnecessary amounts of VRAM because they haven't got the technical superiority. So don't buy either, go read a book. Fair? ;)
You probably missed the remark: everyone chooses what they want.
Yes, if you want ray tracing, you should consider that the 7900 XTX is at the level of the 4070 Ti. On one hand you only have rasterization (the performance/price ratio is clearly in AMD's favor); on the other you have a palette of extra options that may interest you.
Anyway, the trick with vRAM remains pure marketing because (repeat, repeat, repeat) the GPU gets tired long before you reach the vRAM limit. Even in the above example, with Ratchet & Clank: Rift Apart at maximum settings, including RT, the 3060 will have big problems in this game. It's another one of those games where you can't afford stutter. At minute 0:48 you get exactly such a stutter, and it's only the intro. In the game you create big problems for yourself with these steep fps drops, but at least you're satisfied that you have extra vRAM.
I don't consider any AMD owner an idiot. I have often seen this label (right here on the forum) applied by AMD supporters to Intel and/or nVidia owners.
 
After The Last of Us, Ratchet & Clank: Rift Apart has been enlisted in the "nVidia killer" operation. I'll leave two recordings of this game here: 1080p and 1440p, Very High (best preset), RT High (all) and DLSS Quality. For a video card that is close to three years old, I'd say it's OK.
As a note: between Very High and maximum details, the only differences are RT High -> Very High and Anisotropic 8X -> 16X. In other words, additional GPU workload.
Using NVENC for the recordings reduces video card performance by 3-5%, so the in-game fps is slightly higher than what you see here.


 
You probably missed the remark: everyone chooses what they want.
Yes, if you want ray tracing, you should consider that the 7900 XTX is at the level of the 4070 Ti. On one hand you only have rasterization (the performance/price ratio is clearly in AMD's favor); on the other you have a palette of extra options that may interest you.
Anyway, the trick with vRAM remains pure marketing because (repeat, repeat, repeat) the GPU gets tired long before you reach the vRAM limit. Even in the above example, with Ratchet & Clank: Rift Apart at maximum settings, including RT, the 3060 will have big problems in this game. It's another one of those games where you can't afford stutter. At minute 0:48 you get exactly such a stutter, and it's only the intro. In the game you create big problems for yourself with these steep fps drops, but at least you're satisfied that you have extra vRAM.
I don't consider any AMD owner an idiot. I have often seen this label (right here on the forum) applied by AMD supporters to Intel and/or nVidia owners.
The only thing I'd add is that RT and DLSS are marketing gimmicks, just as much as the extra VRAM is. You don't need any of the RT fanciness to enjoy a game, and DLSS doesn't give you free performance. It gives you more performance at the cost of some image quality. If you're fine with that, that's cool.

On the other hand, more VRAM may or may not give your GPU a longer life span depending on where future games are heading. The 2 GB 960 reached the end of its life way before the 4 GB version did. The 3 GB 1060 showed some weakness even at release. The 3060 is the other side of the coin, with not enough GPU power to use its superior VRAM capacity compared to the 3070.

That's why I think blanket statements such as "RT is pointless" or "more VRAM is pointless" are invalid. Everything needs to be looked at in context. RT is pointless if it's pointless for you. More VRAM is pointless if future games end up requiring more GPU power than VRAM. It's an investment that may pay off. Or may not. Only time will tell.
 
That's why I think blanket statements such as "RT is pointless" or "more VRAM is pointless" are invalid. Everything needs to be looked at in context. RT is pointless if it's pointless for you. More VRAM is pointless if future games end up requiring more GPU power than VRAM. It's an investment that may pay off. Or may not. Only time will tell.

For example, I'm playing Immortals of Aveum, and that's a game running on a more recent, if not the most advanced, version of UE5 currently.
I'm not running out of my 8 GB of VRAM, but you bet I'm running out of raw performance at high settings + textures on Ultra, and it's only thanks to DLSS/upscaling that I can even have an enjoyable experience in this game. 'For my standards, that is.'
Considering that more and more games will be using this engine, most likely I will have to deal with the same situation going forward. 'Don't know why, but most games I end up playing use a version of UE, and I guess this will be the case again...'

Personally, this is why I ended up getting a 3060 Ti and not the 6700 XT for the same price in 2022, because I wanted to have all of the upscaling options available to me, and for me it did pay off. :) 'I don't plan on upgrading my resolution/monitor anytime soon, so no extra VRAM usage in that regard.'
 
For example, I'm playing Immortals of Aveum, and that's a game running on a more recent, if not the most advanced, version of UE5 currently.
I'm not running out of my 8 GB of VRAM, but you bet I'm running out of raw performance at high settings + textures on Ultra, and it's only thanks to DLSS/upscaling that I can even have an enjoyable experience in this game. 'For my standards, that is.'
Considering that more and more games will be using this engine, most likely I will have to deal with the same situation going forward. 'Don't know why, but most games I end up playing use a version of UE, and I guess this will be the case again...'

Personally, this is why I ended up getting a 3060 Ti and not the 6700 XT for the same price in 2022, because I wanted to have all of the upscaling options available to me, and for me it did pay off. :) 'I don't plan on upgrading my resolution/monitor anytime soon, so no extra VRAM usage in that regard.'
That's your own personal take, which is fine. :) My point is that you can't apply your own personal take for everyone else.
 
While everyone argues about who makes the best GPU here I am just enjoying my problem free experience. No crashes, glitches, hitches, just perfection. I have to question the system stability of some users who have nothing but problems. Luckily I only play at 4K60, so my demands are not that great.
 
While everyone argues about who makes the best GPU here I am just enjoying my problem free experience. No crashes, glitches, hitches, just perfection.
Same here. You can do that with Intel, AMD or Nvidia. Who makes "the best" is 1. subjective to personal preferences, 2. doesn't matter. :)
 
That's your own personal take, which is fine. :) My point is that you can't apply your own personal take for everyone else.
Nor does your opinion apply to others. You deny the usefulness of DLSS and the impact of RT only because you lack the former and the latter drastically reduces the performance of your video card. Your video card (I see a 7900 XT) is below the 4070 Ti in all games with RT ON.

RT: why does everyone implement this option if it's useless, as you say? I don't think there is a big title released in the last two years without RT inside. Even older ones (Doom Eternal, for example) added the feature through an update. Of course, denial comes more easily when your top AMD card behaves like a midrange nVidia card with RT ON.

DLSS: Why did AMD and Intel invest human and financial resources in a similar solution?
Watch the clip (I had little time and primitive tools, but it delivers the message) and tell me what the difference is. Of course, it's about DLSS off vs. on, but I don't see why you deny the usefulness of DLSS. Just because it's not available to you?

 
For example, I'm playing Immortals of Aveum, and that's a game running on a more recent, if not the most advanced, version of UE5 currently.
I'm not running out of my 8 GB of VRAM, but you bet I'm running out of raw performance at high settings + textures on Ultra, and it's only thanks to DLSS/upscaling that I can even have an enjoyable experience in this game. 'For my standards, that is.'
Considering that more and more games will be using this engine, most likely I will have to deal with the same situation going forward. 'Don't know why, but most games I end up playing use a version of UE, and I guess this will be the case again...'

Personally, this is why I ended up getting a 3060 Ti and not the 6700 XT for the same price in 2022, because I wanted to have all of the upscaling options available to me, and for me it did pay off. :) 'I don't plan on upgrading my resolution/monitor anytime soon, so no extra VRAM usage in that regard.'

That game isn't really representative of the VRAM usage you should expect of newer games. It uses 8.4 GB tops, whereas other games, even ones on UE5, are using 8 GB minimum and up to 18 GB tops. Often there's only a 0.5-2 GB difference between 1080p and 4K in these titles to boot, so using a lower resolution really isn't much of a saving grace. I have absolutely been leaning towards the 3060 12GB over the 3060 Ti 8GB for builds I've been doing when people prefer Nvidia. Otherwise, the 6700 XT at the same price is a great card for those who don't need the Nvidia bells and whistles. The 3060 Ti only makes sense when you aren't playing a ton of AAA games or you plan on upgrading within a few years. Everyone should be buying whatever works best for their specific situation.
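A back-of-the-envelope way to see why resolution alone barely moves total VRAM: only the per-pixel buffers (G-buffer, depth, post-processing targets) scale with resolution, while the texture/asset pool, usually the bulk of the footprint, is set by the quality preset. A minimal Python sketch, where the bytes-per-pixel and target-count figures are illustrative assumptions rather than numbers from any particular engine:

```python
# Back-of-the-envelope: the slice of VRAM that actually scales with resolution.
# bytes_per_pixel and num_targets are illustrative assumptions, not engine data.
def resolution_dependent_mb(width, height, bytes_per_pixel=8, num_targets=12):
    """Approximate memory for per-pixel buffers (G-buffer, depth, post-FX targets)."""
    return width * height * bytes_per_pixel * num_targets / 1024**2

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
for name, (w, h) in resolutions.items():
    print(f"{name}: ~{resolution_dependent_mb(w, h):.0f} MB of resolution-dependent buffers")

# Textures, geometry and the streaming pool stay roughly constant across resolutions,
# which is consistent with the ~0.5-2 GB gap between 1080p and 4K in many titles.
```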
 
That game isn't really representative of the VRAM usage you should expect of newer games. It uses 8.4 GB tops, whereas other games, even ones on UE5, are using 8 GB minimum and up to 18 GB tops. Often there's only a 0.5-2 GB difference between 1080p and 4K in these titles to boot, so using a lower resolution really isn't much of a saving grace. I have absolutely been leaning towards the 3060 12GB over the 3060 Ti 8GB for builds I've been doing when people prefer Nvidia. Otherwise, the 6700 XT at the same price is a great card for those who don't need the Nvidia bells and whistles. The 3060 Ti only makes sense when you aren't playing a ton of AAA games or you plan on upgrading within a few years. Everyone should be buying whatever works best for their specific situation.
Well, in my case the 8 GB of VRAM has yet to be an issue since I bought this card in September 2022, so that 12 GB of VRAM wouldn't do much for me, but I was using DLSS in pretty much every game I played that had it, so at least that's something I made use of. 'DLAA in some cases.'
Also, going from Ultra to High textures is all good with me in new-ish games. Like I said, it's yet to be an issue for my use case, even though I'm a variety gamer playing both older and new games. 'I don't play competitive games, though.'

I don't really upgrade that often unless there's a good deal on the second-hand market + I've saved up some extra money. 'I'm a mid-range user at most, so normally I upgrade from one mid-range card to another when it's needed.'

But yes, I agree that it's very subjective to everyone's specific use case/preferences, and as long as it works well enough for that, then that's all that matters.
 
For example, I'm playing Immortals of Aveum, and that's a game running on a more recent, if not the most advanced, version of UE5 currently.
I'm not running out of my 8 GB of VRAM, but you bet I'm running out of raw performance at high settings + textures on Ultra, and it's only thanks to DLSS/upscaling that I can even have an enjoyable experience in this game. 'For my standards, that is.'
Considering that more and more games will be using this engine, most likely I will have to deal with the same situation going forward. 'Don't know why, but most games I end up playing use a version of UE, and I guess this will be the case again...'
For most games, UE5 adds nothing graphically compared to the old engines. It is strange that (many) games switch to this engine when they could just use their old engines.

I'm sure people are going to find that the UE5 engine looks better than games that use an older engine, but in 90% of cases that's purely because they now have a more powerful GPU so they will play the games on higher settings.

Take for example the following comparison:


Watching the video above, you might think that UE5.2 looks good and modern.

The problem with this thought is that UE5.2 is not going to offer anything extra in a lot of games over the much older engines, which you can observe here.


As you can see, the nine-year-old game often looks more detailed and rich than its UE5.2 successor.
 
You deny the usefulness of DLSS and the impact of RT only because you lack the former and the latter drastically reduces the performance of your video card. Your video card (I see a 7900 XT) is below the 4070 Ti in all games with RT ON.
That's where you're wrong. I don't deny the usefulness of RT and DLSS (or any other upscaling method, including FSR) because I bought an AMD GPU. It's the other way around. I bought an AMD GPU because RT doesn't mean much to me as long as you sacrifice half of your framerate to use it, and I prefer avoiding upscaling whenever possible. This wouldn't be any different even if I had a 4090.

There is no penis envy in not wanting to use DLSS, or FSR, or any other upscaling method. Why is this so f-ing hard to understand? :banghead:

Not spending more money on features you don't need is a thing, believe it or not.

I never said my opinion applies to everyone else - all I'm saying is that it is an opinion, and I had it even before I bought my graphics card. I'm not denying the greatness of DLSS because I can't use it. I'm denying it because I've used it in the past and wasn't impressed. If DLSS is so awesome, and I'm being envious, then why do you think I bought a 7800 XT instead of a 4070 Ti? Hm? :rolleyes:

I don't want a Lamborghini! If I had one, it would be on ebay in 5 minutes.
 
For most games, UE5 adds nothing graphically compared to the old engines. It is strange that (many) games switch to this engine when they could just use their old engines.

I'm sure people are going to find that the UE5 engine looks better than games that use an older engine, but in 90% of cases that's purely because they now have a more powerful GPU so they will play the games on higher settings.

Take for example the following comparison:


Watching the video above, you might think that UE5.2 looks good and modern.

The problem with this thought is that UE5.2 is not going to offer anything extra in a lot of games over the much older engines, which you can observe here.


As you can see, the nine-year-old game often looks more detailed and rich than its UE5.2 successor.

That also depends on how well the engine is used/utilized by the game devs in a given game.
I've played a lot of UE games since the UE3 days, and I can defo tell the difference in most cases. UE4 went through improvements over the years too, and UE5 is still pretty new.

Nanite and Lumen are actually quite noticeable imo. 'Aveum can look amazing in certain ways and also kind of average in others.'

This reminds me of when ppl said that Borderlands 3 doesn't even look better than Borderlands 2 'UE3 vs UE4' only because it shares the same art style, yet BL3 has much better particle effects/reflections and all that stuff that comes with UE4.
To my eyes it was pretty clear right when I started playing the game.

While Talos 1 looks pretty decent for an old game, to say that it looks better than 2 is a stretch imo. 'I've tried the Talos 2 demo on Steam; it looks quite good and feels modern to me.'

I guess we will see what a more 'polished' UE5 can do once we get the new Witcher/Tomb Raider games made with it, cause they both switched to that engine. 'Yeah, I know it's probably years away...'
 
For most games, UE5 adds nothing graphically compared to the old engines. It is strange that (many) games switch to this engine when they could just use their old engines.

I'm sure people are going to find that the UE5 engine looks better than games that use an older engine, but in 90% of cases that's purely because they now have a more powerful GPU so they will play the games on higher settings.

Take for example the following comparison:


Watching the video above, you might think that UE5.2 looks good and modern.

The problem with this thought is that UE5.2 is not going to offer anything extra in a lot of games over the much older engines, which you can observe here.


As you can see, the nine-year-old game often looks more detailed and rich than its UE5.2 successor.

UE5 is mostly meant to make game development easier for developers; the downside for gamers is that this mostly comes at the cost of performance... If we don't start to see games using that engine look significantly better in the next couple of years, then for gamers at least it's mostly a bust. Unless we start to see more frequent high-quality releases, which I'm not holding my breath for.

The reason developers are switching is that it's easier, not for its visual benefits.
That also depends on how well the engine is used/utilized by the game devs in a given game.
I've played a lot of UE games since the UE3 days, and I can defo tell the difference in most cases. UE4 went through improvements over the years too, and UE5 is still pretty new.

Nanite and Lumen are actually quite noticeable imo. 'Aveum can look amazing in certain ways and also kind of average in others.'

This reminds me of when ppl said that Borderlands 3 doesn't even look better than Borderlands 2 'UE3 vs UE4' only because it shares the same art style, yet BL3 has much better particle effects/reflections and all that stuff that comes with UE4.
To my eyes it was pretty clear right when I started playing the game.

While Talos 1 looks pretty decent for an old game, to say that it looks better than 2 is a stretch imo. 'I've tried the Talos 2 demo on Steam; it looks quite good and feels modern to me.'

For me it's the performance cost of those visuals that is the turn-off, not the quality... UE also has other issues like shader compilation stutters and traversal stutters in the majority of shipping games, which, while better, are still an issue in UE5.

It's not that it can't look better in some aspects; it's just that needing upscaling, or turning down settings significantly, on the 3 major releases using it so far isn't very appealing.
 
UE5 is mostly meant to make game development easier for developers; the downside for gamers is that this mostly comes at the cost of performance... If we don't start to see games using that engine look significantly better in the next couple of years, then for gamers at least it's mostly a bust. Unless we start to see more frequent high-quality releases, which I'm not holding my breath for.

The reason developers are switching is that it's easier, not for its visual benefits.


For me it's the performance cost of those visuals that is the turn-off, not the quality... UE also has other issues like shader compilation stutters and traversal stutters in the majority of shipping games, which, while better, are still an issue in UE5.

It's not that it can't look better in some aspects; it's just that needing upscaling, or turning down settings significantly, on the 3 major releases using it so far isn't very appealing.

Yeah, the stutters are like a standard UE thing by now, though it's not exclusive to this engine.
It seems to plague a lot of games from the past few years regardless of the engine used; the Dead Space remake still has traversal stutter to this day, and that's on Frostbite. 'Finished the game very recently on Game Pass.'

True that UE5 is heavy to run, and lowering settings doesn't help much other than dropping global illumination from max to high, because that affects Lumen, which is heavy on max. I just hope it will get better over time. 'Supposedly 5.3 will already bring CPU thread optimizations and whatnot.'

I do like how UE5 looks in general, but it sure needs some time to 'mature' with better/new releases, and well, the upscaling part is just where 'modern' gaming is heading imo, so not much can be done about that.
 
then why do you think I bought a 7800 XT instead of a 4070 Ti? Hm? :rolleyes:
Why do you think the 7800XT has the 4070 Ti as an opponent? Hm? :rolleyes:
Maybe you tested with DLSS 1.0. Anyway, AMD graphics cards lose half their performance with RT ON; the ones from nVidia fare much better. AMD introduced RT on the principle that "weak is better than nothing", undoubted proof of how important this "gadget" is: dismissed by some right up until the moment AMD catches up with nVidia. DLSS will be blamed forever, despite the positive reviews.

Before you reply, let me ask: what did AMD supporters cheer about at the launch of the RTX 3000 and RX 6000 series? Power consumption, wasn't it? Now there is absolute silence in the red camp because the performance/watt ratio has reversed with the RTX 4000 and RX 7000 series. We fixate on one target, and beyond it there is no life.

On topic: Ratchet & Clank: Rift Apart
1080p versus 1440p
1440p: +77.8% more pixels (that's right, there is no direct relationship between pixels and polygons)
At 1440p, the RTX 4090 loses 16.9% of its performance (normal and logical here), but loads only 3.7% more into vRAM compared to 1080p (a quick sanity check of these numbers follows below).
What could it be?
1. Really bad game optimization?
2. Really bad game optimization?
3. Really bad game optimization?
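For anyone who wants to verify the arithmetic, here it is in a few lines of Python; the 16.9% and 3.7% figures are the ones quoted above, not measurements of my own:

```python
# Sanity check: pixel count scaling vs. the quoted VRAM and performance deltas.
pixels_1080p = 1920 * 1080
pixels_1440p = 2560 * 1440

pixel_increase = (pixels_1440p / pixels_1080p - 1) * 100
print(f"1440p has {pixel_increase:.1f}% more pixels than 1080p")  # ~77.8%

# Figures quoted in the post above, not my own measurements:
fps_loss_pct = 16.9       # RTX 4090 performance lost going from 1080p to 1440p
vram_increase_pct = 3.7   # extra vRAM allocated at 1440p vs 1080p

print(f"Performance drops {fps_loss_pct}% while vRAM allocation grows only {vram_increase_pct}%.")
```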

..........................
If they don't receive strong sponsorship from the rival, game producers will take the video card market share into account. That's why there are so few titles that favor AMD. No one can afford to jeopardize their sales by undermining nVidia and its 85% market share. It's like banning billionaires from staying in Dubai.
The Last of Us was bombarded with negative reviews upon release, and in just two months the vRAM usage issue was resolved.
As long as it has this huge market share, nVidia will dictate how much vRAM is needed in a specific resolution. Beyond that reality, it's just a forum discussion, nothing more.
 