
12GB GPUs already obsolete, brand new game takes up to 18GB of VRAM at 1440p

Joined
Jan 8, 2017
Messages
9,661 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Not even a $2,000 GPU can get you a decent frame rate with no stutters, and the games still don't look that much better than 2-3 year old games.

Except that's not because of VRAM requirements.
 
Joined
Jun 1, 2011
Messages
4,827 (0.97/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
How exactly do you determine whether the requirements are 'ludicrous', whatever that even means? If a game uses more VRAM than any previous game, does that count as ludicrous?
Ludicrous as in requiring way more VRAM than the consoles and any card on the market short of flagship GPUs, and then a week later releasing a patch that knocks VRAM usage down by 25%.


Is your thought process simply:

low VRAM usage -> good optimization
high VRAM usage -> bad optimization
No, my thought process is: a game that uses X amount of performance at Ultra settings, around 75% of X at medium settings and around 50% of X at low settings equals good optimization.
Bad optimization would be using X amount of performance at Ultra settings, 90% of X at medium settings and 80% of X at low settings.
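
To make that rule of thumb concrete, here's a rough sketch in Python; the frame times and thresholds are invented purely for illustration, not taken from any real game or benchmark:

```python
# Hypothetical per-preset frame times (ms); "ultra" is the baseline.
def relative_cost(frame_times_ms: dict) -> dict:
    """Express each preset's GPU cost relative to Ultra (Ultra = 1.0)."""
    ultra = frame_times_ms["ultra"]
    return {preset: ms / ultra for preset, ms in frame_times_ms.items()}

def scales_well(costs: dict) -> bool:
    """'Good optimization' by this yardstick: medium ~75% and low ~50% of Ultra's cost."""
    return costs["medium"] <= 0.80 and costs["low"] <= 0.60

good = relative_cost({"ultra": 20.0, "medium": 15.0, "low": 10.0})  # 1.00 / 0.75 / 0.50
bad = relative_cost({"ultra": 20.0, "medium": 18.0, "low": 16.0})   # 1.00 / 0.90 / 0.80

print(scales_well(good))  # True  -> dropping presets buys back real performance
print(scales_well(bad))   # False -> presets barely change the GPU cost
```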

Are you a game developer? How do you know if something is poorly optimized?
Because when said game gets a patch less than three days after launch and a second patch a week later, and the game suddenly performs better after each one, I can feel confident that the people who developed the game or port were well aware there were optimization issues going into launch.
 
Joined
May 17, 2021
Messages
3,421 (2.51/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Perless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
Except that's not because of VRAM requirements.

It's lazy devs/engineers and the shitty ports, the same people that manage everything in the game, including VRAM usage.

If a 4090 can't play a game decently, how can I trust them to manage literally anything? Again, quote me a time in the past where a flagship couldn't run most AAA games decently at launch. 30 FPS, with constant stutters; games run like shit.
 
Joined
Apr 30, 2020
Messages
1,052 (0.60/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
It's lazy devs/engineers and the shitty ports, the same people that manage everything in the game, including VRAM usage.

If a 4090 can't play a game decently, how can I trust them to manage literally anything? Again, quote me a time in the past where a flagship couldn't run most AAA games decently at launch. 30 FPS, with constant stutters; games run like shit.
I can point to one right now: Cyberpunk 2077, a game released in 2020. RTX 3090 = 22 FPS with maxed settings and ray tracing enabled at maximum.
 
Joined
Sep 10, 2018
Messages
7,633 (3.26/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I can point to one right now: Cyberpunk 2077, a game released in 2020. RTX 3090 = 22 FPS with maxed settings and ray tracing enabled at maximum.

Let's be real though, a 3090 is no longer a 4K-class product regardless of how much VRAM it has. A really nice 1440p card, sure, but I wouldn't want to use it for 4K; if someone else does, good for them. It's over 2 years old at this point, and people need to have realistic expectations for how products age.

My 4090 will be struggling equally at 4K and will also become a 1440p card by the time the 50 series launches. Life goes on.
 
Joined
Jan 8, 2017
Messages
9,661 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
It's lazy devs/engineers and the shitty ports
Same "lazy developers and shitty ports boilerplate" with no real explanation.

including VRAM usage
No, you're just wrong. Show me one game that runs badly on a 4090 because of VRAM requirements.

Ludicrous as in requiring way more VRAM than the consoles and any card on the market short of flagship GPUs, and then a week later releasing a patch that knocks VRAM usage down by 25%.
Lol, the PS3/360 had 512MB, and even back when they were released most PCs had more memory than that; a couple of years into their lifecycle, most high-end PCs had something like an order of magnitude more. Not only is this nothing new, it's not even as bad as it used to be.

No, my thought process is: a game that uses X amount of performance at Ultra settings, around 75% of X at medium settings and around 50% of X at low settings equals good optimization.
Bad optimization would be using X amount of performance at Ultra settings, 90% of X at medium settings and 80% of X at low settings.
Completely arbitrary metrics that don't mean anything. Games nowadays are much more forgiving than they used to be when it comes to the minimum required for a playable experience. Even if you take a game like Cyberpunk, which is by far the worst-running game ever when maxed out, you can lower the settings and get playable framerates on GPUs that are multiple times slower than something like a 4090.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,237 (2.02/day)
Location
Swansea, Wales
System Name Silent/X1 Yoga/S25U-1TB
Processor Ryzen 9800X3D @ 5.575ghz all core 1.24 V, Thermal Grizzly AM5 High Performance Heatspreader/1185 G7
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 64 GB Dominator Titanium White 6000 MT, 130 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 White
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF1000 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White & Pulsar Supergrip tape, Razer Atlas, Razer Strider Chroma
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
Same "lazy developers and shitty ports boilerplate" with no real explanation.


No, you're just wrong. Show me one game that runs badly on a 4090 because of VRAM requirements.


Lol, the PS3/360 had 512MB, and even back when they were released most PCs had more memory than that; a couple of years into their lifecycle, most high-end PCs had something like an order of magnitude more. Not only is this nothing new, it's not even as bad as it used to be.


Completely arbitrary metrics that don't mean anything. Games nowadays are much more forgiving than they used to be when it comes to the minimum required for a playable experience. Even if you take a game like Cyberpunk, which is by far the worst-running game ever when maxed out, you can lower the settings and get playable framerates on GPUs that are multiple times slower than something like a 4090.


Here, to help you out in your educational voyage of discovery. Many such cases.

If you've been looking at the Steam ratings of Star Wars Jedi Survivor, you've been surprised by a "mostly negative" score, which has now improved to "mixed." The negative reviews are not due to the gameplay, but due to various technical issues. In their infinite wisdom, EA told everyone just hours after launch that they are aware of the technical issues that affect "a minority of players." No doubt they knew about the issues and still decided to launch like that. We're now paying $70 to beta-test an unpolished turd that they call an AAA game—not the first time this year. I'm starting to wonder if these companies aren't slowly eroding their customer base by delivering broken products over and over again.
 
Joined
Jan 8, 2017
Messages
9,661 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C

Here, to help you out in your educational voyage of discovery. Many such cases.

If you've been looking at the Steam ratings of Star Wars Jedi Survivor, you've been surprised by a "mostly negative" score, which has now improved to "mixed." The negative reviews are not due to the gameplay, but due to various technical issues. In their infinite wisdom, EA told everyone just hours after launch that they are aware of the technical issues that affect "a minority of players." No doubt they knew about the issues and still decided to launch like that. We're now paying $70 to beta-test an unpolished turd that they call an AAA game—not the first time this year. I'm starting to wonder if these companies aren't slowly eroding their customer base by delivering broken products over and over again.

What's any of this even supposed to prove, specifically, mister cringy arrogant replies?
 
Joined
Sep 10, 2018
Messages
7,633 (3.26/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Nobody is ever going to agree on this crap, at least not collectively. People just buy whatever they are going to buy; if it performs like crap in 2 years, life goes on, buy a new GPU.

My 2x 680/7970s performed like crap after 2 years; guess what, I bought a new GPU. My 2x 970 / 1x 980 Ti / 1x 290X performed like crap when the 10 series came out; guess what, dual 1080s it is. I got tired of SLI; guess what, a Titan Xp/1080 Ti did the trick. Wanted more performance, grabbed a 2080 Ti, and so on.

Sometimes it was VRAM, other times it was generally weak performance for how I wanted to use a GPU. If there's a game someone wants to play and their current setup isn't giving them what they want, upgrade; and if you can't afford it, you've got better things to worry about than how optimized games are.

As a side note, I still think a $600+ GPU with 12GB of VRAM is a joke, the same with $350+ GPUs with 8GB. At the same time, if someone buys them, I hope they get the performance they're looking for; it's their money after all.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.70/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
The biggest clowns by far are those who don't realize that despite this those games run just fine even without 24GB.

View attachment 295149

4070 Ti is still faster than a 3090 despite having half the VRAM.

Look at what mainstream gamers are using. 18-24 FPS isn't just fine. It sucks donkey balls.
 
Joined
Jan 8, 2017
Messages
9,661 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Look at what mainstream gamers are using. 18-24 FPS isn't just fine. It sucks donkey balls.

If it sucks, it's not because of VRAM; I don't know why people continue to believe something that's simply not true. Mainstream gamers can simply dial down the settings, like they've always done.
 
Joined
Sep 10, 2018
Messages
7,633 (3.26/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Look at what mainstream gamers are using. 18-24 FPS isn't just fine. It sucks donkey balls.

At every GPU launch for the last decade there were games that crushed flagship performance; that is nothing new. The only difference is that, other than SLI, there was no real option to get better performance. These are just the resolutions I targeted by generation.

[Attached benchmark charts: Metro 2033 at 1920×1200, Crysis 3 at 2560×1600, Deus Ex at 3840×2160]

The only difference is we really didn't have any options for more performance over a $500-700 tier product other than going with two cards.
 
Joined
Mar 22, 2020
Messages
27 (0.02/day)
Look at what mainstream gamers are using. 18-24 FPS isn't just fine. It sucks donkey balls.
I don't get it. I feel like I'm taking crazy pills. You all know there are settings lower than ultra, don't you? Since when is 4K ultra how mainstream gamers play?
This whole debate is surreal, I swear.
 
Joined
Sep 10, 2018
Messages
7,633 (3.26/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I don't get it. I feel like I'm taking crazy pills. You all know there are settings lower than ultra, don't you? Since when is 4K ultra how mainstream gamers play?
This whole debate is surreal, I swear.

While I try to remember that myself, I do feel like a large portion of gamers would rather there be no progress so that their hardware lasts longer. I get that not everyone has a ton of disposable income.

Thanks to a decade of 2/4-core CPUs and Nvidia stagnating the mid range with low amounts of VRAM, I feel progress hasn't been what it should be. Tons of games still use 1-2 cores, and a lot of games have terrible texture work that modders comically have to come in and fix, assuming the game isn't so locked down that it prevents it.
 
Joined
May 8, 2023
Messages
54 (0.08/day)
I don't get it. I feel like I'm taking crazy pills. You all know there are settings lower than ultra, don't you? Since when is 4K ultra how mainstream gamers play?
This whole debate is surreal, I swear.

A year or two ago there were barely any options for 4K monitors in my country. 1440p is just now becoming a little more popular here, but 1080p is still dominant. It feels like the change from 720p to 1080p, except now we have another resolution as a stepping stone (there was 900p, but I feel not a lot of people played at that res). And the "nothing but ultra / ultra+RT+PT" mentality is funny too. Spending more time looking at Afterburner than at the game...

It's sorta ridiculous but that's technology for you.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,237 (2.02/day)
Location
Swansea, Wales
System Name Silent/X1 Yoga/S25U-1TB
Processor Ryzen 9800X3D @ 5.575ghz all core 1.24 V, Thermal Grizzly AM5 High Performance Heatspreader/1185 G7
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 64 GB Dominator Titanium White 6000 MT, 130 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 White
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF1000 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White & Pulsar Supergrip tape, Razer Atlas, Razer Strider Chroma
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
I don't get it. I feel like I'm taking crazy pills. You all know there are settings lower than ultra, don't you? Since when is 4K ultra how mainstream gamers play?
This whole debate is surreal, I swear.
You have a point.

The issue is games like this one on release, where the GPU was not being fully utilized because the game was for some reason CPU-limited despite not even hitting 60 FPS. CPU-limited scenarios are fine when you're in the hundreds of FPS, or when it's an incredibly heavy scene, e.g. MOBA games or an RT-maxed night scene in Cyberpunk, but this game just didn't offer the visuals to justify the performance.

Games that push the envelope like Crysis or Cyberpunk are OK when they have reasons for the insane demands at release.

Games that look like any other UE4 game released in the last several years, but sometimes can't hit 60 FPS on a 4090 due to being rushed out the door, are not. There's no heavy or advanced RT. The textures aren't 16K or anything like that. The detail just isn't there for the cost.
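
For what it's worth, the way that "GPU not fully utilized yet still under 60 FPS" situation usually shows up in monitoring can be sketched like this; the thresholds and sample numbers below are illustrative assumptions only, not from any particular tool or game:

```python
# Rough bottleneck heuristic: a GPU sitting well below full utilization while the
# frame rate is also below target points at the CPU (or the engine), not the card.
# Thresholds and sample readings are illustrative assumptions.

def bottleneck(gpu_util_pct: float, fps: float,
               gpu_busy: float = 95.0, fps_target: float = 60.0) -> str:
    if gpu_util_pct >= gpu_busy:
        return "GPU-limited: lowering settings/resolution should help"
    if fps < fps_target:
        return "CPU/engine-limited: GPU idling yet still below target (the case described above)"
    return "Neither: frame rate is capped or the scene is light"

print(bottleneck(gpu_util_pct=70.0, fps=48.0))  # CPU/engine-limited
print(bottleneck(gpu_util_pct=99.0, fps=45.0))  # GPU-limited
```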
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.70/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
You have a point.

The issue is games like this one on release, where the GPU was not being fully utilized because the game was for some reason CPU-limited despite not even hitting 60 FPS. CPU-limited scenarios are fine when you're in the hundreds of FPS, or when it's an incredibly heavy scene, e.g. MOBA games or an RT-maxed night scene in Cyberpunk, but this game just didn't offer the visuals to justify the performance.

Games that push the envelope like Crysis or Cyberpunk are OK when they have reasons for the insane demands at release.

Games that look like any other UE4 game released in the last several years, but sometimes can't hit 60 FPS on a 4090 due to being rushed out the door, are not. There's no heavy or advanced RT. The textures aren't 16K or anything like that. The detail just isn't there for the cost.

That post sums it up very nicely. Well said sir.
 

HTC

Joined
Apr 1, 2008
Messages
4,668 (0.76/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
If you believe buying an entry level video will "future proof" you for five years

For my use case, yes.

I'd been using an RX 480 4GB until last week, when I bought this new card.

I don't need max details at 1440p or 4K, and I also don't need RT, so this card should work just fine.
 
Joined
Jan 20, 2019
Messages
1,631 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
Not so crazy after all :respect:

NVIDIA GeForce RTX 4060 Ti Available as 8 GB and 16 GB, This Month. RTX 4060 in July


I honestly expected Nvidia to serve this up at some point during the life-span of the 40 series, but I'm very impressed if the xx60-class 16GB card is announced earlier than anticipated.

This is exactly where the bottom-to-mid mainstream market should be in 2023, with SKU options spanning 8 to 16GB. Buyers who need less VRAM can save their dosh and go with 8GB, while heavier VRAM users or higher-resolution punters on a budget can opt for 16GB.

Only one more piece of the puzzle missing: "reasonable prices"

Thanks to a decade of 2/4-core CPUs and Nvidia stagnating the mid range with low amounts of VRAM, I feel progress hasn't been what it should be. Tons of games still use 1-2 cores, and a lot of games have terrible texture work that modders comically have to come in and fix, assuming the game isn't so locked down that it prevents it.

Precisely my thoughts on the matter too!! I still don't get people's fascination with encouraging limitations at the broader mainstream level. It's an unsolved mystery for the ages. Graphics evolution has often been compromised by lacklustre visuals (loosely labelled "ultra"), and lifting the VRAM cap was always envisaged as one of the primary factors in bringing visually stunning, enhanced fidelity (one step closer to photorealism, although there's a long way to go). I'm hardly impressed with some of the current top-dollar games even at their best quality settings, but it's understandable... developers are hardly gonna push the eye-candy envelope for the fewer higher-spec-capable systems over the all-ordained ~8GB global handcuff.

Anyway, it seems the GPU manufacturers are on board - about bloody time! (hope the leaky leaks stick)
 
Joined
Jun 1, 2011
Messages
4,827 (0.97/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
Lol, the PS3/360 had 512MB, and even back when they were released most PCs had more memory than that; a couple of years into their lifecycle, most high-end PCs had something like an order of magnitude more. Not only is this nothing new, it's not even as bad as it used to be.
lol, this tool thinks it's 2007 and the PS3 is still the major console; he hasn't realized the PS5 is out, which helps explain why he gets pantsed in every single conversation he attempts. PATHETIC :roll:

Completely arbitrary metrics that don't mean anything.
Numbers are hard for many people, you probably more so than others.

For my use case, yes.

I'd been using an RX 480 4GB until last week, when I bought this new card.

I don't need max details at 1440p or 4K, and I also don't need RT, so this card should work just fine.
Performance-wise, as an entry-level gaming card, the card is fine.
Priced as an entry-level card, it's overpriced.
 

HTC

Joined
Apr 1, 2008
Messages
4,668 (0.76/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
Priced as an entry-level card, it's overpriced.

It's A LOT MORE overpriced than you think: it cost me 290€.

Out of curiosity, I checked the price again where I bought it ... and it's now at 223€ ... :banghead:
 
Joined
Jun 14, 2020
Messages
4,296 (2.53/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
You have a point.

The issue is games like this one on release, where the GPU was not being fully utilized because the game was for some reason CPU-limited despite not even hitting 60 FPS. CPU-limited scenarios are fine when you're in the hundreds of FPS, or when it's an incredibly heavy scene, e.g. MOBA games or an RT-maxed night scene in Cyberpunk, but this game just didn't offer the visuals to justify the performance.

Games that push the envelope like Crysis or Cyberpunk are OK when they have reasons for the insane demands at release.

Games that look like any other UE4 game released in the last several years, but sometimes can't hit 60 FPS on a 4090 due to being rushed out the door, are not. There's no heavy or advanced RT. The textures aren't 16K or anything like that. The detail just isn't there for the cost.
The most egregious thing for me is that TLOU at 720p with low textures seems to need as much VRAM as A Plague Tale at 4K ultra (actually, TLOU uses about 1GB more, but never mind). That right there is a clear example of a game using more VRAM for much worse image quality.
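
As a back-of-the-envelope illustration of why texture quality, rather than output resolution, is what should dominate VRAM use: the numbers below are invented (the resident texture count is a pure assumption, and real engines use compression plus streaming), so treat them as order-of-magnitude only:

```python
# Rough texture-memory estimate. Real games use block-compressed formats (BC1/BC7)
# plus streaming and partial residency, so actual usage differs; the point is only
# that texture resolution, not the render resolution, drives most of the VRAM cost.

MIP_OVERHEAD = 4 / 3  # a full mip chain adds roughly one third

def texture_mib(width: int, height: int, bytes_per_texel: float) -> float:
    return width * height * bytes_per_texel * MIP_OVERHEAD / 2**20

resident_textures = 500  # hypothetical number of textures resident at once

for label, size, bpt in [("2K BC7", 2048, 1.0), ("4K BC7", 4096, 1.0), ("4K RGBA8", 4096, 4.0)]:
    total_gib = resident_textures * texture_mib(size, size, bpt) / 1024
    print(f"{label}: ~{total_gib:.1f} GiB")  # ~2.6 / ~10.4 / ~41.7 GiB
```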
 
Joined
Aug 29, 2005
Messages
7,367 (1.04/day)
Location
Stuck somewhere in the 80's Jpop era....
System Name Lynni PS \ Lenowo TwinkPad L14 G2
Processor AMD Ryzen 7 7700 Raphael (Waiting on 9800X3D) \ i5-1135G7 Tiger Lake-U
Motherboard ASRock B650M PG Riptide Bios v. 3.10 AMD AGESA 1.2.0.2a \ Lenowo BDPLANAR Bios 1.68
Cooling Noctua NH-D15 Chromax.Black (Only middle fan) \ Lenowo C-267C-2
Memory G.Skill Flare X5 2x16GB DDR5 6000MHZ CL36-36-36-96 AMD EXPO \ Willk Elektronik 2x16GB 2666MHZ CL17
Video Card(s) Sapphire Radeon RX 590 Nitro+ SE (RX 7900 XT/XTX or RX 9070 XT?) | Intel® Iris® Xe Graphics
Storage Gigabyte M30 1TB|Sabrent Rocket 2TB| HDD: 10TB|1TB \ WD RED SN700 1TB
Display(s) KTC M27T20S 1440p@165Hz | LG 48CX OLED 4K HDR | Innolux 14" 1080p
Case Asus Prime AP201 White Mesh | Lenowo L14 G2 chassis
Audio Device(s) Steelseries Arctis Pro Wireless
Power Supply Be Quiet! Pure Power 12 M 750W Goldie | 65W
Mouse Logitech G305 Lightspeedy Wireless | Lenowo TouchPad & Logitech G305
Keyboard Ducky One 3 Daybreak Fullsize | L14 G2 UK Lumi
Software Win11 IoT Enterprise 24H2 UK | Win11 IoT Enterprise LTSC 24H2 UK / Arch (Fan)
Benchmark Scores 3DMARK: https://www.3dmark.com/3dm/89434432? GPU-Z: https://www.techpowerup.com/gpuz/details/v3zbr
This is starting to be an annoying thread. Some UE developers have already stated that they compensated for 8GB and lower for years; now they've changed their focus, and it has suddenly become a problem.

If Valve also did this it would be a mess, but seriously, you can only cram so much data into 6 or 8GB, and we have Nvidia on the other end with graphics cards that won't give people 16GB because they're afraid it will mess up their Quadro business.

I'm here thinking buyers should think more about their choices, because people keep going with Nvidia, which doesn't care much, instead of trying AMD or Intel, which will give you at least 16GB for the same price or less, and then they start to complain.

If we want to change Nvidia's behaviour and get them to understand that normal consumers also need 16GB on their RTX gaming cards, not just Quadro cards, we need to stop buying their cards and drive demand down, because some companies only listen when their wallet starts to hurt.

Clearly, to me, Nvidia has shown that they only care about what they think is relevant, and it's not more memory for their RTX gaming cards. This is why I only look at AMD nowadays and why I chose to keep my RX 6800 XT, even though the RX 7900 XT is tempting.

As a side note, I had RTX 3090 and 3070 cards, but I went back to my current RX 6800 XT, which is my third one, and I'm happy with my 1440p experience; I never really bother with ray tracing.
 
Joined
May 17, 2021
Messages
3,421 (2.51/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Perless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
This is starting to be an annoying thread. Some UE developers have already stated that they compensated for 8GB and lower for years; now they've changed their focus, and it has suddenly become a problem.

If Valve also did this it would be a mess, but seriously, you can only cram so much data into 6 or 8GB, and we have Nvidia on the other end with graphics cards that won't give people 16GB because they're afraid it will mess up their Quadro business.

I'm here thinking buyers should think more about their choices, because people keep going with Nvidia, which doesn't care much, instead of trying AMD or Intel, which will give you at least 16GB for the same price or less, and then they start to complain.

If we want to change Nvidia's behaviour and get them to understand that normal consumers also need 16GB on their RTX gaming cards, not just Quadro cards, we need to stop buying their cards and drive demand down, because some companies only listen when their wallet starts to hurt.

Clearly, to me, Nvidia has shown that they only care about what they think is relevant, and it's not more memory for their RTX gaming cards. This is why I only look at AMD nowadays and why I chose to keep my RX 6800 XT, even though the RX 7900 XT is tempting.

As a side note, I had RTX 3090 and 3070 cards, but I went back to my current RX 6800 XT, which is my third one, and I'm happy with my 1440p experience; I never really bother with ray tracing.

AMD is also releasing 8GB cards in the near future
 
Joined
Aug 29, 2005
Messages
7,367 (1.04/day)
Location
Stuck somewhere in the 80's Jpop era....
System Name Lynni PS \ Lenowo TwinkPad L14 G2
Processor AMD Ryzen 7 7700 Raphael (Waiting on 9800X3D) \ i5-1135G7 Tiger Lake-U
Motherboard ASRock B650M PG Riptide Bios v. 3.10 AMD AGESA 1.2.0.2a \ Lenowo BDPLANAR Bios 1.68
Cooling Noctua NH-D15 Chromax.Black (Only middle fan) \ Lenowo C-267C-2
Memory G.Skill Flare X5 2x16GB DDR5 6000MHZ CL36-36-36-96 AMD EXPO \ Willk Elektronik 2x16GB 2666MHZ CL17
Video Card(s) Sapphire Radeon RX 590 Nitro+ SE (RX 7900 XT/XTX or RX 9070 XT?) | Intel® Iris® Xe Graphics
Storage Gigabyte M30 1TB|Sabrent Rocket 2TB| HDD: 10TB|1TB \ WD RED SN700 1TB
Display(s) KTC M27T20S 1440p@165Hz | LG 48CX OLED 4K HDR | Innolux 14" 1080p
Case Asus Prime AP201 White Mesh | Lenowo L14 G2 chassis
Audio Device(s) Steelseries Arctis Pro Wireless
Power Supply Be Quiet! Pure Power 12 M 750W Goldie | 65W
Mouse Logitech G305 Lightspeedy Wireless | Lenowo TouchPad & Logitech G305
Keyboard Ducky One 3 Daybreak Fullsize | L14 G2 UK Lumi
Software Win11 IoT Enterprise 24H2 UK | Win11 IoT Enterprise LTSC 24H2 UK / Arch (Fan)
Benchmark Scores 3DMARK: https://www.3dmark.com/3dm/89434432? GPU-Z: https://www.techpowerup.com/gpuz/details/v3zbr
AMD is also releasing 8GB cards in the near future

I am aware of this, and there is still a need for 8GB cards, but buyers also need to be aware of the requirements. When they complain that Nvidia especially doesn't give its cards enough VRAM for the performance they offer, it's time to look elsewhere.

I personally got annoyed with all the things Nvidia stuffs down their users' throats when installing their drivers, so I use NVCleanstall from @W1zzard. My laptop has an MX150 2GB GPU and I'm not really using it for gaming, so the normal Nvidia Control Panel is more than enough for me.
 