
Are game requirements and VRAM usage a joke today?

Joined
Sep 17, 2014
Messages
22,840 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Well, we're reaching the end of 2023 and there are fewer than a handful of games that use too much VRAM, not to mention those games are pretty far from must-play. Needless to say, the VRAM issue has been blown out of proportion in typical fashion by Steve (AMD Unboxed channel).
Not really, it's an issue that is present in the market right now for 8GB cards. Whether you run into that issue in your regular gaming on an 8GB card is a combination of personal preferences, budget and financial/upgrade possibilities, the resolution you play at, etc.

Fact remains there are quite a few games that exceed 8GB today. What you call 'too much' is a subjective statement; fact is, they use it, so if you want to play said game, you'll need it, or you'll compromise in some way to get it playable, and that compromise might fall below your expectations of what good gaming is.
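As a rough illustration of why top texture settings drive VRAM so hard (my own back-of-the-envelope numbers, not figures from any specific game): an uncompressed 4K RGBA8 texture with a full mip chain is roughly 85 MiB, a BC7-compressed one roughly 21 MiB, so even a few hundred unique high-resolution textures resident at once will blow past an 8GB card.

```python
# Back-of-the-envelope VRAM cost of textures. Illustrative only:
# real engines compress, stream, and reuse textures in far smarter ways.

def texture_bytes(width, height, bytes_per_texel, mipmaps=True):
    """Memory for one texture; a full mip chain adds about 1/3 on top."""
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmaps else base

RGBA8 = 4  # uncompressed, 32 bits per texel
BC7 = 1    # block-compressed, 8 bits per texel

print(texture_bytes(4096, 4096, RGBA8) / 2**20)  # ~85.3 MiB
print(texture_bytes(4096, 4096, BC7) / 2**20)    # ~21.3 MiB

# Hypothetical budget: 8 GB card minus ~2 GB for render targets,
# geometry, and other allocations (an assumed split, not a measured one).
budget = (8 - 2) * 1024**3
print(budget // texture_bytes(4096, 4096, BC7))  # ~288 unique 4K BC7 textures
```

So the gap between 'high' and 'ultra' textures is often just which mip levels stay resident, which is why the setting moves the VRAM counter more than it moves the image.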

Two years ago, that problem didn't exist.

I know, it's real nice to be able to think in 'us vs. them' terms to make the world simple, but that's not how the world works. And it never will be.

Also, given Nvidia's history with VRAM gen-to-gen, I think the odd one out here is Nvidia, not AMD. AMD is just doing logical VRAM scaling gen-to-gen, and this holds true for RDNA1-3. Nvidia is cutting away hardware, and then the story is 'AMD bad' :D Do you even logic? It's very clear Nvidia is approaching the market so as to obsolete its own products every gen so you buy the latest; they have three angles now: DLSS versions, RT as an extra VRAM requirement, and cutting down on VRAM along the way. Again: you don't need to be a rocket scientist to put two and two together. Is AMD pushing their VRAM story in marketing along the way? Of course, because that's Nvidia's weakness. All I see here is consensus about what's happening ;)
 
Joined
Jun 14, 2020
Messages
3,754 (2.25/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Not really, it's an issue that is present in the market right now for 8GB cards. Whether you run into that issue in your regular gaming on an 8GB card is a combination of personal preferences, budget and financial/upgrade possibilities, the resolution you play at, etc.

Fact remains there are quite a few games that exceed 8GB today. What you call 'too much' is a subjective statement; fact is, they use it, so if you want to play said game, you'll need it, or you'll compromise in some way to get it playable, and that compromise might fall below your expectations of what good gaming is.
There are a handful of games that exceed 8GB if you play on ultra. You don't need to play on ultra, though. The same way lots of games can't be played with RT on AMD cards; that doesn't make those cards obsolete, you just don't enable RT on them.

Also, given Nvidia's history with VRAM gen-to-gen, I think the odd one out here is Nvidia, not AMD. AMD is just doing logical VRAM scaling gen-to-gen, and this holds true for RDNA1-3. Nvidia is cutting away hardware, and then the story is 'AMD bad' :D Do you even logic? It's very clear Nvidia is approaching the market so as to obsolete its own products every gen so you buy the latest; they have three angles now: DLSS versions, RT as an extra VRAM requirement, and cutting down on VRAM along the way. Again: you don't need to be a rocket scientist to put two and two together. Is AMD pushing their VRAM story in marketing along the way? Of course, because that's Nvidia's weakness. All I see here is consensus about what's happening ;)
Problem is, AMD is pushing AMD-sponsored games to hog VRAM for no reason. It cannot be a coincidence that AMD-sponsored games require huge amounts of VRAM, are horribly optimized, and don't look particularly great for the requirements either. It's like it's been done on purpose.
 
Joined
Jan 29, 2023
Messages
1,533 (2.15/day)
Location
France
System Name KLM
Processor 7800X3D
Motherboard B-650E-E Strix
Cooling Arctic Cooling III 280
Memory 16x2 Fury Renegade 6000-32
Video Card(s) 4070-ti PNY
Storage 500+512+8+8+2+1+1+2+256+8+512+2
Display(s) VA 32" 4K@60 - OLED 27" 2K@240
Case 4000D Airflow
Audio Device(s) Edifier 1280Ts
Power Supply Shift 1000
Mouse 502 Hero
Keyboard K68
Software EMDB
Benchmark Scores 0>1000
The 4000 Super cards will have more VRAM.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,320 (1.29/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Problem is, AMD is pushing AMD-sponsored games to hog VRAM for no reason. It cannot be a coincidence that AMD-sponsored games require huge amounts of VRAM, are horribly optimized, and don't look particularly great for the requirements either. It's like it's been done on purpose.
That is annoying; it's a bit tit-for-tat, with sponsored titles performing better or worse. The bigger problem for me is being hard pressed to tell the difference between the top texture setting and the high texture setting, apart from the VRAM counter increasing, while with RT games at least the difference is far more transformative imo. Even Metro Exodus, which runs great on consoles, has a lighting model more impressive to me than ultra-crisp textures up close.

IMO the biggest benefit of the bigger VRAM buffer is longevity, but even that can be a non-issue depending on use case. For instance, I had a GTX 1080 and a 3440x1440 monitor, and 8GB sufficed; then when I got a 3080, the GTX 1080 was relegated to 1080p, matching its then-available muscle to a resolution, and the VRAM requirements naturally dropped in response too. Since getting the 3080, I now game on a 42" 4K 120Hz OLED, and admittedly the 3080 is now borderline on VRAM for max textures, but three years into its life it's three quarters of the way to being replaced with something far better suited to driving 4K 120Hz, which will naturally have a larger VRAM pool. Then the 3080 will be relegated to, you guessed it, 1080p, where I anticipate getting several more years of great life out of it, since it's effectively overpowered for 1080p.

I can't sit and argue that more VRAM isn't better, but it's just another spec point to manage. It's slightly baffling to me the 3080 didn't come with at least 11GB at the time, and then there were going to be 20GB versions, which at the time I don't think I'd have bought even if I had the chance, unless it was readily available and say 10% more expensive. It was already shaping up to be extremely in demand (got mine right at launch), I'm pretty happy with what I got. Would it be better with 20GB? of course. is the 10GB still a kickass card? you bet.
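To put rough numbers on the resolution point above (my own illustration, not wolf's measurements): render targets scale linearly with pixel count, so dropping from 4K to 1080p cuts that slice of the VRAM budget to a quarter, and texture streamers pick smaller mips at lower output resolutions on top of that.

```python
# Rough full-resolution render-target memory at common resolutions.
# Illustrative: assumes a hypothetical pipeline with 8 full-res
# RGBA8-sized targets (G-buffer layers, depth, HDR color, post-process).

def render_target_mb(width, height, buffers, bytes_per_pixel=4):
    """Total size of `buffers` full-resolution targets, in MiB."""
    return buffers * width * height * bytes_per_pixel / 2**20

for name, w, h in [("1080p", 1920, 1080),
                   ("3440x1440", 3440, 1440),
                   ("4K", 3840, 2160)]:
    print(f"{name}: ~{render_target_mb(w, h, buffers=8):.0f} MiB")
```

The targets themselves are only a few hundred MiB even at 4K; the bigger saving when a card gets 'relegated to 1080p' usually comes from textures streaming in at lower mip levels, but the direction of the scaling is the same.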
 
Joined
Sep 17, 2014
Problem is, AMD is pushing AMD-sponsored games to hog VRAM for no reason. It cannot be a coincidence that AMD-sponsored games require huge amounts of VRAM, are horribly optimized, and don't look particularly great for the requirements either. It's like it's been done on purpose.
It cannot be a coincidence because it's not. Consoles simply carry more than 8GB, and devs build console-first, PC next.
And yes, AMD did build the console APUs. And then MS and Sony ran with it.

Everything else is tinfoil-hat conjecture. None of this is new since the dawn of consoles: they define where gaming plateaus. Nvidia is just trying to escape that reality by offering less, and getting their own margins up in the process. Please just take a step back and look at the bigger picture here. None of this invalidates your idea that AMD might have pushed VRAM a bit in a title here or there; I think Forspoken or Godfall is a decent example where you might consider that, but there is still no conspiracy, in much the same way Nvidia pushes devs to incorporate RT to place AMD on the bench. But there are ALSO titles that just want more than 8GB no matter what, unless you play at the absolute bottom of quality. One such example is Total War: Warhammer 3. It just has a *lot* of assets, and it just wants a *lot* of VRAM to display them all at will. So even if you discount the supposed AMD propaganda library, you still won't get away with 8GB everywhere, regardless of quality.

NONE of these per-game examples change the fact that consoles still offer upwards of 12GB of available VRAM. Devs will obviously use it, because better graphics result in higher sales. It's just that simple, and it's been the rule of thumb for anything in gaming history. More hardware means more possibilities, or more wiggle room; whichever way it works out, it's more.

It gets even better though: if you DO happen to be convinced AMD is inflating VRAM requirements for the sole reason of spiting you and other Nvidia customers, you can simply not play the game. After all, if the titles are all as lackluster as you describe, why would you? Problem solved; you can roll with 8GB for ages. Reality hits sooner or later anyway. Hogwarts was clearly AMD-sponsored too, right, seeing as Ampere fell apart in there? I frankly don't care if people can't reflect properly on gaming history and its hardware; it's their loss. I have a 20GB card in a perfectly timely fashion. It could've been 16GB. Now it's 20. Whatever. It's enough to keep me going for another half decade, whereas with 12GB I'd be damn sure I couldn't.

None of this is new. Honestly, I don't know how long you've been around, but this whole debate is as old as VRAM itself. And it was never a debate; it's just the very human response of people buying hardware and then getting the reality check that their nice purchase is old news the next day, because it's tech. Today, though, you have a bigger issue alongside it: companies actively trying to manipulate crowds and push them into ecosystems that are preferably as locked as possible. Recognize it, or you're gonna find yourself screwed.
 
Joined
Jun 14, 2020
in much the same way Nvidia pushes devs to incorporate RT to place AMD on the bench. [...] Today, though, you have a bigger issue alongside it: companies actively trying to manipulate crowds and push them into ecosystems that are preferably as locked as possible. Recognize it, or you're gonna find yourself screwed.
But see, that is the problem. It is not at all analogous. There is a difference between pushing RT, which pushes image quality, and just pushing VRAM for no benefit. Take TLOU as a prime example: at low textures, not only did it look hideous (I kid you not, look for videos online, it looks like a PS1 game), it used more VRAM than A Plague Tale: Requiem needed at 4K ultra. With RT, you can just turn it off if you don't like it or your card can't run it; I can't turn off the textures, though, can I? Nvidia-sponsored games, btw, run flawlessly on AMD cards, at least the raster part.

It cannot be a coincidence because it's not. Consoles simply carry more than 8GB, and devs build console-first, PC next.
And yes, AMD did build the console APUs. And then MS and Sony ran with it.
Do only AMD-sponsored games build console-first? Because that's where the majority of issues (like 98%) occur. The only non-AMD-sponsored games that have issues with 8GB of VRAM are games that only exceed 8GB when you enable RT (Hogwarts and Requiem).

It gets even better though: if you DO happen to be convinced AMD is inflating VRAM requirements for the sole reason of spiting you and other Nvidia customers, you can simply not play the game.
I haven't played any of the offenders besides TLOU, and that was a gift.

Hogwarts was clearly AMD sponsored too, right, seeing as Ampere fell apart in there?
"fell apart", you mean with RT open, which makes every card fall apart. You realize my 4090 was stuck to an average of 40 fps at 4k with RT on right? Sure it didnt run out of vram on the 4090, but the point is the game is insanely heavy with RT on whether you have the ram or not. Problem is amd spons games need 16 gb without RT. Can you imagine implementing RT on TLOU? That thing would require 20gigs minimum...
 
Joined
Jan 14, 2019
Messages
13,241 (6.05/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
How dare Nvidia push for RT, and how dare AMD push for more VRAM? F* the whole lot, f* progress, I'm off to play Half-Life 1 at 640x480! :rockout:

Seriously though... the GPU market is a competitive one (for now, at least). Everybody pushes with what they've got. Nvidia has superior RT, AMD has more VRAM, Intel has... um... something? There's no need to be mad at either for pushing their agenda. If you were the CEO of either company, you'd do the same.

Also, the benefits of all progress are questionable and highly depend on the person you ask. Some people love RT. Some think it's too far-reaching to be useful today. Some think it's pointless. Same with VRAM. Some like to have more for slightly more future-proofing (or just the thought of it). Some think it's pointless and are happy with 8-10 GB. You just buy what seems of more use to you. Pointing fingers achieves nothing.
 
Joined
Sep 17, 2014
But see that is the problem. It is not at all analogous. There is a difference between pushing RT = pushing image quality, and just pushing VRAM for no benefit. Take TLOU as a prime example. At low textures not only did it look hideous (i kid you not, look for videos onlines, it looks like a ps1 game) it used more VRAM than PT Requiem needed for 4k ultra. With RT, you can just turn it off if you don't like / your cardt can run it, I can't turn off the textures though can I? Nvidia sponsored games btw run flawlessly on AMD cards, at least the raster part.
RT = pushing image quality? Lol. More often than not it looks hideous, tacked on for the 'RTX ON' marketing box in videos and the marketing push for devs. It's just as pointless. Both are ways to overinflate requirements so people get nudged toward new cards. Stop fooling yourself. If you want to believe, you will believe; I can't help you anymore. We're three generations in, and there is still no need whatsoever to use RT features in games. 'But it'll get there...' Sure. Sure it will. That's why the vast majority of Nvidia users are actually more stoked about DLSS updates than RT, hm? Ada's entire and only reason to exist is the fact that it has DLSS 3, which Ampere doesn't have, plus a decent node. There's nothing else.
 

wolf

Better Than Native
Joined
May 7, 2007
RT = pushing image quality? Lol. More often than not it looks hideous, tacked on for the 'RTX ON' marketing box in videos.
I can happily agree that I've seen some examples like that, but I can't deny the others that are genuinely pushing image quality, in a big way: Control, Dying Light 2, Metro Exodus, Alan Wake 2, Cyberpunk fully path-traced... right off the top of my head.

Just like there are texture-pack examples of "really? This is all we get, for all the hype?", there have for sure been RT implementations like that, some of them in AMD-sponsored titles too, where the RT checkbox was technically ticked for no real benefit to users (hello, RT-shadows-only). But let's not pretend there aren't RT titles pushing image quality. Lighting is such an integral part of image quality, especially for realism, or even immersion in a non-photorealistic title. Super-high-quality textures are too.
 
Joined
Jun 14, 2020
RT = pushing image quality? Lol. More often than not it looks hideous, tacked on for the 'RTX ON' marketing box in videos and the marketing push for devs. It's just as pointless. Both are ways to overinflate requirements so people get nudged toward new cards. Stop fooling yourself. If you want to believe, you will believe; I can't help you anymore. We're three generations in, and there is still no need whatsoever to use RT features in games. 'But it'll get there...' Sure. Sure it will. That's why the vast majority of Nvidia users are actually more stoked about DLSS updates than RT, hm? Ada's entire and only reason to exist is the fact that it has DLSS 3, which Ampere doesn't have, plus a decent node. There's nothing else.
Well, again, it doesn't matter, because you can turn RT off. But what am I going to do with textures that need 10+ GB while looking like crap, like in TLOU for example?

How dare Nvidia push for RT, and how dare AMD push for more VRAM? F* the whole lot, f* progress, I'm off to play Half-Life 1 at 640x480! :rockout:
You are missing the point completely. As I've said already, a game can use 500GB of VRAM and I'm fine with that, but it has to look better than a game using 5. And that is not the case, which is the problem. AMD-sponsored games just push VRAM for no benefit whatsoever. Do I need to repeat TLOU again? At low textures it looked hideous and still required over 6GB, more than Plague Tale needed at 4K ultra.
 
Joined
Jan 14, 2019
You are missing the point completely. As I've said already, a game can use 500GB of VRAM and I'm fine with that, but it has to look better than a game using 5. And that is not the case, which is the problem. AMD-sponsored games just push VRAM for no benefit whatsoever. Do I need to repeat TLOU again? At low textures it looked hideous and still required over 6GB, more than Plague Tale needed at 4K ultra.
That is your opinion, and I accept that. What I don't accept is the generalisation of "AMD-sponsored = eats VRAM and looks like crap" based on one person's opinion of one single game. Honestly, this is the first time I've seen someone call the graphics in TLoU crap (I haven't played it, so I can't comment). What I also don't accept is the "AMD pays devs to make games more VRAM-hungry" conspiracy theory. Whether it's true or not doesn't change the fact that Nvidia doesn't offer enough VRAM for the targeted product class on most of the Ada lineup, the same way Nvidia pushing for RT doesn't change the fact that Nvidia is simply better at RT, so of course they're pushing it. It's not a conspiracy against AMD users, it's only natural.

Edit: If I overtake someone on the highway/motorway, it's not because I hate that person; it's because they're driving too slowly. Looking for conspiracies where there are none is pointless.

Well, again, it doesn't matter, because you can turn RT off. But what am I going to do with textures that need 10+ GB while looking like crap, like in TLOU for example?
Buy a card with adequate amount of VRAM, maybe?
 
Joined
Jun 14, 2020
That is your opinion, and I accept that. What I don't accept is the generalisation of "AMD-sponsored = eats VRAM and looks like crap" based on one person's opinion of one single game. Honestly, this is the first time I've seen someone call the graphics in TLoU crap (I haven't played it, so I can't comment).
TLOU's graphics are crap if you actually read what I said and don't take it out of context: when you choose settings that make the game use the same amount of VRAM as Plague Tale: Requiem, TLOU looked like absolute garbage. Go check some videos from before the latest patches; with textures on low, the game used 6+ GB of VRAM and looked hideous. We are talking PS1 levels of badness.

It's not a single game; the majority of AMD-sponsored games are like that. Godfall, Forspoken, Resident Evil 4... I can keep going forever, there are literally tens of these.
 
Joined
Jan 14, 2019
TLOU's graphics are crap if you actually read what I said and don't take it out of context: when you choose settings that make the game use the same amount of VRAM as Plague Tale: Requiem, TLOU looked like absolute garbage. Go check some videos from before the latest patches; with textures on low, the game used 6+ GB of VRAM and looked hideous. We are talking PS1 levels of badness.
Yep, that's your opinion, which is fine; just don't try to sell it as fact. I don't judge games based on videos, so I'll say nothing more until I've played it.

It's not a single game; the majority of AMD-sponsored games are like that. Godfall, Forspoken, Resident Evil 4... I can keep going forever, there are literally tens of these.
Yeah?
[attached: three benchmark graph screenshots]
 
Joined
Jun 14, 2020
Yep, that's your opinion, which is fine; just don't try to sell it as fact. I don't judge games based on videos, so I'll say nothing more until I've played it.


Yeah?
[attached: three benchmark graph screenshots]
It's definitely not my opinion; this is how TLOU looked before the patches. Low textures look like absolute crap and still require more VRAM than Plague Tale at 4K ultra.


The graphs you posted are irrelevant; Hardware Unboxed tested in-game and showed massive stuttering due to VRAM in these games.
 
Joined
Jan 29, 2023
It's definitely not my opinion; this is how TLOU looked before the patches. Low textures look like absolute crap and still require more VRAM than Plague Tale at 4K ultra.

The graphs you posted are irrelevant; Hardware Unboxed tested in-game and showed massive stuttering due to VRAM in these games.

The textures are buggy (not low-res); they had to patch them. A buggy texture can be ugly and big, no?
 
Joined
Jun 14, 2020
The textures are buggy (not low-res); they had to patch them. A buggy texture can be ugly and big, no?
The whole game looks like that, man.
 
Joined
Jan 29, 2023
Joined
Jun 6, 2022
Messages
622 (0.66/day)
Seriously though... the GPU market is a competitive one (for now, at least). Everybody pushes with what they've got. Nvidia has superior RT, AMD has more VRAM, Intel has... um... something? There's no need to be mad at either for pushing their agenda. If you were the CEO of either company, you'd do the same.
Nvidia can easily add VRAM. AMD can't (yet) add anything to its RT.
I remain fixated on the power of the graphics processor. The desperate attempts of some to prove to me the usefulness of surplus VRAM at 30-50 FPS are comical at best, and I find it comical to defend this nonsense while praising 120+ Hz monitors.
It's not a tragedy to reduce the details to play very badly optimized titles, and it's not a tragedy to disable RT either. Tragedy is when the game cannot be run at all due to the video card's limitations, a chapter in which only one model has excelled in the last four years: that bad joke, the 6500 XT.

Fortunately for us, the AMD-VRAM versus Nvidia-tools battle keeps prices under control, and that's all that matters.
 
Joined
Feb 24, 2023
Messages
3,261 (4.75/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
AMD can't (yet) add anything to its RT.
Which looks bizarre considering how Intel managed it. The A770 is closer to the RX 6800 than to its direct competition (the RX 6700) in terms of RT performance, despite Intel being de facto n00bs in this market. I see RT as a tech preview of a tech preview nonetheless; it works... questionably, to say the least. I've played some RT games, path-traced included, on a 4090 and wasn't impressed enough to pay for the feature now, but I see a lot of potential in this field if things keep progressing as they are. If RT performance keeps growing at today's pace, then a decade from now RT will make all the sense in the world. And AMD will definitely be screwed if they're still behind Nvidia by such a margin.

VRAM hogging is a thing for a single-digit number of games, and I don't think it's meaningful to pay much attention to it. Almost everything is comfortable, and virtually everything is playable at 1440p on $400-500 GPUs. About a decade ago, it was mandatory to cut settings to low if you wanted to play a next-gen game at 1440p60 without broken texture rendering and massive stuttering on GPUs in that price range (adjusted for inflation, we're talking the $300 GPUs of the early 2010s).

[attached: benchmark screenshots and frame-rate graphs]



Of course, average FPS isn't everything, but you get the idea. Getting a playable experience at 1440p for $400 is pleasant. If it's only a couple of titles where you need more than that, we can consider things under control. Luckily for us, game devs aren't getting worse in this department.
 
Joined
Jun 14, 2020
Besides Cyberpunk and Dying Light, where RT / PT just transforms the experience, RT in games helps a lot for the mere fact that reflections don't appear and disappear all around you as you move the camera. Traditional raster breaks immersion with that kind of crap.
 
Joined
Jan 14, 2019
Messages
13,241 (6.05/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
before the patches
Seriously now? :kookoo: Are you playing an old, pirated version or something?

nVidia can easily add vRAM.
They could. But they won't. That's a scummy move in my eyes, versus AMD's pure inability to get ahead in the RT race, which is just what it is.

I remain fixated on the power of the graphics processor. The desperate attempts of some to prove to me the usefulness of surplus vRAM at 30-50 FPS are comical at best. I find it equally comical to support this nonsense while praising 120+ Hz monitors.
It's not a tragedy to reduce the details to play very badly optimized titles, and it's not a tragedy to disable RT either. The tragedy is when a game cannot run at all due to the video card's limitations, a chapter in which only one model has excelled in the last 4 years: that bad joke, the 6500 XT.

Fortunately for us, the AMDvRAM versus nVidiaTools battle keeps prices under control and that's all that matters.
To me, personally, 30-50 FPS doesn't look too bad, especially with VRR and LFC, as long as it's stable and there are no VRAM stutters. I don't play anything remotely competitive, so I don't care about maxing out my monitor's refresh rate. But you're right: I do me, you do you. That's the beauty of PC gaming. :)
 
Joined
Feb 24, 2023
Messages
3,261 (4.75/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
RT in games helps a lot for the mere fact that reflections don't appear and disappear all around you as you move the camera. Traditional raster breaks immersion with that kind of crap
Yeah, but when the cost is that you need to scroll back to 1080p on a 20 hunnit GPU for it to be smooth and devoid of DLSS/FSR artifacts... That's why I consider it a double tech preview. A cool feature, not a single doubt. It's just that the hardware ain't ready for it yet.
 
Joined
Sep 17, 2014
Messages
22,840 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I can happily agree that I've seen some examples like that, but I can't deny the others that are genuinely pushing image quality in a big way. Like Control, Dying Light 2, Metro Exodus, Alan Wake 2, Cyberpunk fully path traced... right off the top of my head.

Just like there are texture pack examples of... really? This is all we get for all the hype? There have for sure been RT implementations like that, some of them definitely in AMD-sponsored titles too, where the RT checkbox was technically ticked for no real benefit to users (hello, RT shadows only), but let's not pretend there aren't RT titles pushing image quality. Lighting is such an integral part of image quality, especially for realism, or even for immersion in a non-photorealistic title. Super high quality textures are too.
They exist! But the vast majority isn't offering anything for its perf hit. Texture packs are similar, yes, with a key difference: they barely cost core power, so most of the time you can simply enable them at little or no perf hit. RT pushes on both core and VRAM.
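The "texture packs cost VRAM, not core" point comes down to simple arithmetic. A sketch of the back-of-envelope math, where the 4/3 mipmap factor is standard and the 4:1 compression ratio (e.g. BC7 versus uncompressed RGBA8) is an assumed typical figure:

```python
# Back-of-the-envelope VRAM cost of one texture, illustrating why
# high-res texture packs are cheap on the GPU core but not on VRAM.
# The 4/3 factor covers the full mipmap chain; the 4:1 compression
# ratio (e.g. BC7 vs. RGBA8) is a typical assumed figure.
def texture_vram_mb(width, height, bytes_per_pixel=4, mips=True, compression=4):
    size = width * height * bytes_per_pixel
    if mips:
        size = size * 4 // 3  # mip chain adds roughly one third on top
    return size / compression / (1024 * 1024)

print(f"2K texture: {texture_vram_mb(2048, 2048):.1f} MB")  # ~5.3 MB
print(f"4K texture: {texture_vram_mb(4096, 4096):.1f} MB")  # ~21.3 MB
```

At ~21 MB per 4K material layer, a scene with a few hundred unique high-res textures eats gigabytes of VRAM while costing the shader cores almost nothing, which matches the observation above.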

I have extensive experience with PT and RT 'doing well' and honestly? Sidegrades. Not upgrades. Some elements in the scene are better, most of them are just different, and some (such as clarity/playability) suffer. It works in Control, a slow-paced SP game. It also looks good there because every scene is handcrafted. A condensed experience; so there certainly is a time and place for the tech.

The TL;DR of this comparison, to me, is that ANY feature in gaming (or: commerce ;)) that tries to sell us it's a lot more than what most others get is in fact 75% bullshit, 20% perceived status (epeen material), and perhaps 5% actual value. It's overinflated, and it's a marketing tool first and foremost.
 
Last edited:
Joined
Jan 14, 2019
Messages
13,241 (6.05/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
Yeah, but when the cost is that you need to scroll back to 1080p on a 20 hunnit GPU for it to be smooth and devoid of DLSS/FSR artifacts... That's why I consider it a double tech preview. A cool feature, not a single doubt. It's just that the hardware ain't ready for it yet.
I completely agree!

There are some great things that can be done with RT, no doubt (despite some bad implementations), but the hardware power just isn't ready yet, and I'm not saying this only from an AMD perspective. The RT-to-raster hardware ratio has stayed the same on Nvidia since Turing (1st-gen RT), which shows in the performance degradation when you enable RT. They say they upgraded Ampere's RT cores, then Ada's again, but you don't see that manifest anywhere; the performance loss is the same. The RT-to-raster hardware ratio needs to change so that RT doesn't take such a large performance hit. Then I'll say it's a cool technology.
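The claim above (that the relative FPS loss from enabling RT has stayed roughly constant across generations) is easy to check against review numbers. A sketch; the FPS pairs below are made-up placeholders, not measurements:

```python
# Relative performance hit from enabling RT: if the RT-to-raster
# hardware ratio is unchanged, this fraction stays roughly constant
# across generations even as absolute FPS grows.
def rt_hit(fps_raster, fps_rt):
    """Fraction of performance lost when RT is enabled."""
    return 1 - fps_rt / fps_raster

# Hypothetical raster/RT FPS pairs for three generations of
# similar-tier cards (placeholder numbers, not benchmarks):
generations = {"Turing": (90, 54), "Ampere": (120, 73), "Ada": (160, 98)}
for gen, (raster, rt) in generations.items():
    # Each generation loses close to the same ~40% share of its raster FPS.
    print(gen, round(rt_hit(raster, rt), 2))
```

Plugging in real review data per tier would make the point rigorous; the mechanism is simply that a fixed hit fraction implies a fixed hardware ratio.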
 
Joined
Jun 14, 2020
Messages
3,754 (2.25/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Seriously now? :kookoo: Are you playing an old, pirated version or something?
No, I had it day one. Why?

Yeah, but when the cost is that you need to scroll back to 1080p on a 20 hunnit GPU for it to be smooth and devoid of DLSS/FSR artifacts... That's why I consider it a double tech preview. A cool feature, not a single doubt. It's just that the hardware ain't ready for it yet.
What is a 20 hunnit GPU?

And what about native artifacts? I'd never play a game natively whether it has RT or not. Native is outdated nowadays.
 