Monday, October 28th 2024

Next-Gen GPUs: Pricing and Raster 3D Performance Matter Most to TPU Readers

Our latest front-page poll sheds light on what people want from the next generation of gaming GPUs. We asked our readers what mattered most to them, with answers including raster performance, ray tracing performance, energy efficiency, upscaling or frame-gen technologies, the size of video memory, and lastly, pricing. The poll ran from September 19 and gathered close to 24,000 votes as of this writing. Pricing remains the king of our polls, with the option gathering 36.1% of the vote, or 8,620 votes. Our readers expect pricing of next-generation GPUs to remain flat, variant-for-variant, and not continue the absurd upward trend of the past few generations, which has pushed the high-end beyond the $1,000 mark and left $500 barely buying a 1440p-class GPU, even as 4K-capable game consoles exist.

Both AMD and NVIDIA know that Moore's Law is cooked, and that generational leaps in performance and transistor counts are only possible with price increases for the latest foundry nodes. AMD even experimented with disaggregated (chiplet-based) GPUs in its latest RDNA 3 generation, before calling it quits on the enthusiast segment so it could focus on the sub-$1,000 performance segment. The second most popular response was raster 3D performance (classic 3D rendering performance), which scored 27%, or 6,453 votes.
Generational gains in raster 3D rendering performance at native resolution remain eminently desirable for anyone who has followed the PC hardware industry over the decades. While Moore's Law held, we were used to near-50% generational increases in performance, which enabled new gaming APIs and upped the eye candy in games with each generation. Interestingly, ray tracing performance takes a backseat, polling not 3rd but 4th, at 10.4% or 2,475 votes; 3rd place goes to energy efficiency.

The introduction of 600 W-capable power connectors was an ominous sign of where power draw is headed with future GPU generations. With the semiconductor fabrication industry struggling to make cutting-edge sub-2 nm nodes available, GPUs haven't been built on the very latest foundry node for the past three or four generations. For example, by the time 8 nm and 7 nm GPUs came out, 5 nm EUV was already the cutting edge, and Apple was making its iPhone SoCs on it. Both AMD and NVIDIA went on to build their next generations on 5 nm, by which time the cutting edge had moved on to 4 nm and 3 nm. The upcoming RDNA 4 and GeForce Blackwell generations are expected to be built on nodes no more advanced than 3 nm, but these arrive in 2025, by which time the cutting edge will have moved on to 20A. All of this impacts power, because performance targets end up wildly misaligned with the foundry nodes actually available to GPU designers.

Our readers gave upscaling and frame-gen technologies like DLSS, FSR, and XeSS the fewest votes, with the option scoring just 2.8%, or 661 votes. They do not consider upscaling a valid excuse for missing generational performance targets at native resolution, and take any claim such as "this looks better than native resolution" with a pinch of salt.

All said and done, the GPU buyer of today has the same expectations of the next generation as they did a decade ago. This is important, as it forces NVIDIA and AMD to innovate, build their GPUs on the most advanced foundry nodes, and try not to be too greedy with pricing. NVIDIA's real competitor isn't AMD or Intel; PC gaming as a platform competes with the consoles, which offer 4K gaming experiences for half a grand with technology that "just works." The onus is then on PC hardware manufacturers to keep up.

73 Comments on Next-Gen GPUs: Pricing and Raster 3D Performance Matter Most to TPU Readers

#51
dont whant to set it"'
phints: Uh oh VRAM is low priority on this TPU poll... only 12GB VRAM on 5070 confirmed.
What, where?
If it's only 12 GB, that is some really big boy sweater the 5050 is wearing.
Posted on Reply
#52
Steevo
I don't have a single RT game in my library. Granted, life always gets in the way of playing the games I have, but as stated, the juice is not worth the squeeze.

If a game has to rely on RT to sell its shit, it's like AI on everything: no one actually wants it, yet here it is....
Posted on Reply
#53
tommo1982
phints: A lot of armchair engineering going on here. Rasterization is a dead end. Caring about more performance there instead of RT is pointless. There are too many things you can't go any further with on rasterization to care about or develop for now that we have acceptable RT performance with RTX or competing cores (including all related tech: DLSS, RR, FG, etc.). By next-gen consoles everything will have some element of RT going on for much more realistic lighting, shadows, reflections, global illumination, etc. and normal rasterization will start to look like "hacks" to simply avoid (SSR and SSAO are good examples that never looked very good).
Perhaps rasterization is a dead end, but who will drop backward compatibility with current tech for the promise of something new? Look what happened when Intel said it was going to drop support for some x86 legacy stuff. GFX needs a generational change, but I don't see it happening in the PC ecosystem; even consoles fell into the backward-compatibility trap. Nope, it will be something new, not burdened by the old.
Posted on Reply
#54
Hecate91
Rasterization is only a dead end once RT can work on midrange cards without a significant performance hit. RT has its own hacks, namely upscaling and fake frames, which make games look worse as well.
Posted on Reply
#55
Wasteland
phints: A lot of armchair engineering going on here. Rasterization is a dead end. Caring about more performance there instead of RT is pointless. There are too many things you can't go any further with on rasterization to care about or develop for now that we have acceptable RT performance with RTX or competing cores (including all related tech: DLSS, RR, FG, etc.). By next-gen consoles everything will have some element of RT going on for much more realistic lighting, shadows, reflections, global illumination, etc. and normal rasterization will start to look like "hacks" to simply avoid (SSR and SSAO are good examples that never looked very good).
> Looks at a gaming landscape in which rasterization accounts for 95% of every scene
> "Rasterization is a dead end"
Posted on Reply
#56
ThomasK
People should stop acting like RT is something the majority cares about, because it's not.
Posted on Reply
#57
EsliteMoby
loracle706: Upscaling and frame generation, lol, no one wants that bullshit. Finally people are waking up, and those thieving companies are pushing hard in this direction!!
Yes. But game companies are forcing TAA on their titles so they can promote useless gimmicks like DLSS.
Posted on Reply
#58
T_Zel
Broken Processor: I'm honestly surprised Energy Efficiency was 16%. I must be a Luddite because I DGAF about it.
I don't worry too much about energy consumption, but what I absolutely do care about is how much heat there is in my room. I currently have a 230W card and I am completely unwilling to buy one more power hungry than that purely for ambient heat reasons. And as an added bonus, typically the lower wattage cards are able to run quieter (assuming you buy a model with a halfway competent cooling solution).

That said, the raster price to performance benefit has to be there before I even consider looking at cards to upgrade to, so I would not have voted efficiency as my priority. But it and VRAM are still important factors before a purchase. I DGAF about RT or upscaling.
Posted on Reply
#59
1d10t
It's funny that someone says the current artificial feature is the future, when in fact it is only a small part of post-processing.
I humbly remind you that there are two global processes (yes, I'm oversimplifying a bit): the rendering pipeline and post-processing.
Ray tracing is just post-processing; sorry if that hurts, but that's the reality.
Those who write that rasterization is a "dead end" very clearly do not understand how the graphics card turns code into an image on screen.
Posted on Reply
#60
Prima.Vera
Agreed. These new games are so extremely poorly optimized that it's now mandatory to use upscaling tech and fake frames just to run them OK-ish, even on uber-expensive GPUs.... Ridiculous.
Posted on Reply
#61
londiste
phints: A lot of armchair engineering going on here. Rasterization is a dead end. Caring about more performance there instead of RT is pointless. There are too many things you can't go any further with on rasterization to care about or develop for now that we have acceptable RT performance with RTX or competing cores (including all related tech: DLSS, RR, FG, etc.). By next-gen consoles everything will have some element of RT going on for much more realistic lighting, shadows, reflections, global illumination, etc. and normal rasterization will start to look like "hacks" to simply avoid (SSR and SSAO are good examples that never looked very good).
No, rasterization is not a dead end. The problem is that, with manufacturing process evolution slowing down, scaling up compute unit counts will not be a viable way forward for much longer.
loracle706: Up to them; the next GPU generation (RTX 6xxx / RX 8xxx) will surely handle 4K 60 FPS+ easily, and there will be no need for that bullshit, especially if graphics don't make a huge evolution!!
No, they won't. Look at state-of-the-art games making the 4090 struggle even without RT; Unreal Engine 5 with Nanite and Lumen turned up, for example.
1d10t: It's funny that someone says the current artificial feature is the future, when in fact it is only a small part of post-processing.
I humbly remind you that there are two global processes (yes, I'm oversimplifying a bit): the rendering pipeline and post-processing.
Ray tracing is just post-processing; sorry if that hurts, but that's the reality.
Those who write that rasterization is a "dead end" very clearly do not understand how the graphics card turns code into an image on screen.
RT is straight-up not postprocessing.
DLSS/FSR/XeSS are so-so; technically they sit at the post-processing stage, but they have hooks into the pipeline, if only for motion vectors and such.
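For readers wondering what those "hooks" look like in practice, here is a minimal, hypothetical sketch of the core idea behind temporal upscalers, not any vendor's actual DLSS/FSR/XeSS code: the previous output frame is reprojected along engine-supplied per-pixel motion vectors and blended with the newly rendered low-resolution frame.

```python
import numpy as np

def temporal_upscale(prev_output, curr_lowres, motion_vectors, blend=0.9):
    """Toy temporal upscaler (illustrative only, not DLSS/FSR/XeSS).

    prev_output:    (H, W, 3) previous full-resolution output frame
    curr_lowres:    (H//2, W//2, 3) newly rendered low-resolution frame
    motion_vectors: (H, W, 2) per-pixel motion in output pixels (x, y),
                    supplied by the renderer -- the "hook" into the pipeline
    """
    h, w = prev_output.shape[:2]

    # Naive nearest-neighbour upsample of the new low-resolution frame.
    upsampled = np.repeat(np.repeat(curr_lowres, 2, axis=0), 2, axis=1)

    # Reproject history: for each output pixel, fetch where it was last frame.
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip((ys - motion_vectors[..., 1]).round().astype(int), 0, h - 1)
    src_x = np.clip((xs - motion_vectors[..., 0]).round().astype(int), 0, w - 1)
    history = prev_output[src_y, src_x]

    # Exponential blend: mostly history, refreshed by the new samples.
    # Real upscalers add jittered sampling, history clamping/rejection and,
    # in DLSS's case, a learned network in place of this fixed blend.
    return blend * history + (1.0 - blend) * upsampled
```

The point of the sketch is simply that the upscaler needs data (motion vectors, depth, jitter) fed to it from inside the renderer, which is why it is not a pure post-process filter.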
Posted on Reply
#62
LittleBro
Hecate91: Rasterization is only a dead end once RT can work on midrange cards without a significant performance hit. RT has its own hacks, namely upscaling and fake frames, which make games look worse as well.
Wasteland: > Looks at a gaming landscape in which rasterization accounts for 95% of every scene
> "Rasterization is a dead end"
1d10t: It's funny that someone says the current artificial feature is the future, when in fact it is only a small part of post-processing.
I humbly remind you that there are two global processes (yes, I'm oversimplifying a bit): the rendering pipeline and post-processing.
Ray tracing is just post-processing; sorry if that hurts, but that's the reality.
Those who write that rasterization is a "dead end" very clearly do not understand how the graphics card turns code into an image on screen.
Good points. Rasterization is crucial for rendering a scene. There is still plenty of room left to get more detailed scenes with improved rasterization.
Prima.Vera: Agreed. These new games are so extremely poorly optimized that it's now mandatory to use upscaling tech and fake frames just to run them OK-ish, even on uber-expensive GPUs.... Ridiculous.
EsliteMoby: Yes. But game companies are forcing TAA on their titles so they can promote useless gimmicks like DLSS.
DLSS-like functionality helps devs avoid properly optimizing their games. In other words, it lets them get paid quicker for less work done. Next there will be games generated by "AI" sold with the price tags of games made by humans.
Posted on Reply
#63
Tek-Check
It makes sense. For me, the price-to-raw-performance ratio was most important, and that's why I didn't buy a 4090 but bought a 7900 XTX from Sapphire's Nitro range. The NVIDIA card was only ~25% faster but ~60% more expensive. A pretty simple calculation for me, given that both cards offer 24 GB of VRAM.

The Nitro card received several tech awards for its design and quality. It can even hit 3GHz in several games, at stock settings. Happy to provide screenshots after my holiday is finished. The card is fantastic in 4K/120Hz gaming on OLED display.
Posted on Reply
#64
TumbleGeorge
londiste: Can someone explain to me what it is with this strange obsession with software when it comes to ray-tracing these days? :)
Another occasion for dUck-measuring, if you mean the audience: "my card does more RT than your card." For game studios it's an easy way to shift part of their work onto the consumer's back. For NVIDIA it's a way to bigger profit margins.
Posted on Reply
#65
photonboy
Here's some REALITY:
1) Path-tracing:
This IS the end-goal for most 3D gaming. Rasterized graphics will eventually go away. Game DEVELOPMENT will actually be far EASIER once it's only being done for path-tracing for many reasons including issues related to light baking and conflicts between raster lighting methods. It's not even debatable if you're in the industry.
(It will take a LONG time though because game devs still need to make most games for older hardware due to the install base vs profitability.)

2) AI UPSCALING:
It's not a "BS" feature like some people seem to think. It's actually the most BENEFICIAL feature that has ever come to gaming. Has it been ABUSED? Absolutely.

Also, future games will continue to optimize around the assumption that AI UPSCALING will be used. For example, deciding on the art style, or line thickness, or how many light bounces to do etc.

Summary:
The future of HARDWARE design is primarily based around moving towards more PATH TRACING and utilizing AI UPSCALING. The Sony PS5 PRO is a good example of how this is happening. The main GPU architecture is geared toward a bit more general compute to get higher resolution, a better focus on path-tracing, and some dedicated DIE SPACE for PSSR to utilize AI upscaling.

The PS6, next XBOX and PC gaming in general will be optimized around these TWO things.

By all means, be ANNOYED about upscaling and/or frame generation being abused. Or path-tracing seemingly not offering much for the performance cost. That doesn't change the fact that these are amazing tools in the toolbox when used correctly. And they are the future.
Posted on Reply
#66
londiste
photonboy: Here's some REALITY:
1) Path-tracing:
This IS the end-goal for most 3D gaming. Rasterized graphics will eventually go away. Game DEVELOPMENT will actually be far EASIER once it's only being done for path-tracing for many reasons including issues related to light baking and conflicts between raster lighting methods. It's not even debatable if you're in the industry.
(It will take a LONG time though because game devs still need to make most games for older hardware due to the install base vs profitability.)

2) AI UPSCALING:
It's not a "BS" feature like some people seem to think. It's actually the most BENEFICIAL feature that has ever come to gaming. Has it been ABUSED? Absolutely.

Also, future games will continue to optimize around the assumption that AI UPSCALING will be used. For example, deciding on the art style, or line thickness, or how many light bounces to do etc.

Summary:
The future of HARDWARE design is primarily based around moving towards more PATH TRACING and utilizing AI UPSCALING. The Sony PS5 PRO is a good example of how this is happening. The main GPU architecture is geared toward a bit more general compute to get higher resolution, a better focus on path-tracing, and some dedicated DIE SPACE for PSSR to utilize AI upscaling.

The PS6, next XBOX and PC gaming in general will be optimized around these TWO things.

By all means, be ANNOYED about upscaling and/or frame generation being abused. Or path-tracing seemingly not offering much for the performance cost. That doesn't change the fact that these are amazing tools in the toolbox when used correctly. And they are the future.
This future is still quite a long way off.

Path tracing is very, very expensive. Expanding what the RT units in GPUs do today is easy enough, but while those units do a lot of heavy lifting, there is plenty else in RT/PT that needs to be figured out and standardized. A simple example: the bloody structure of the scene. Whether it eventually ends up being a BVH, some variation of it, some other mesh structure, or something completely new, for full path tracing to take off parts of it probably need to become hardware-assisted, if not outright accelerated. And that requires some level of standardization of what everyone does with it and how. More likely, though, this will still run on either shaders or CPUs, which shifts the responsibility and performance cost but doesn't really solve anything. And AFAIK there are still some things that are difficult to do with path tracing.
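To make the "structure of the scene" point concrete, here is a toy ray-versus-bounding-box "slab" test; the names and layout are illustrative only. A BVH traversal runs a check like this at every node to decide whether the geometry beneath it could possibly be hit, and it is this kind of box and triangle testing that current RT units accelerate in hardware.

```python
import numpy as np

def ray_hits_aabb(origin, direction, box_min, box_max):
    """Slab test: does a ray intersect an axis-aligned bounding box?

    A BVH is just a tree of such boxes wrapped around the scene's triangles;
    traversal repeats this test node by node, skipping whole subtrees whose
    box the ray misses. (Assumes no zero components in `direction`.)
    """
    inv_dir = 1.0 / direction
    t1 = (box_min - origin) * inv_dir
    t2 = (box_max - origin) * inv_dir
    t_near = np.max(np.minimum(t1, t2))   # latest entry across the three slabs
    t_far = np.min(np.maximum(t1, t2))    # earliest exit across the three slabs
    return t_far >= max(t_near, 0.0)      # hit if the intervals overlap ahead of the ray

# Example: a ray from the origin toward (1, 1, 1) against a box spanning (4, 4, 4)-(6, 6, 6).
print(ray_hits_aabb(np.zeros(3), np.array([1.0, 1.0, 1.0]),
                    np.array([4.0, 4.0, 4.0]), np.array([6.0, 6.0, 6.0])))  # True
```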

Plus there is the elephant in the room: whoever wants to go for full path-tracing hardware needs to figure out what to do with all the existing rasterized games; essentially something like a DX/OGL/Vulkan wrapper onto whatever API will run path tracing. Since we do not know what the hardware approach will end up being, this may turn out to be a simple problem or an extremely difficult one :)
Posted on Reply
#67
Tomorrow
The first game using real-time RT came out in 2018 (Battlefield V).

Looking at Tim's video about RT, the situation is not that great six years later. Games where RT is properly implemented, does not cause issues, and is meaningfully distinct can be counted on two hands at best. If that has taken six years, we can only guess how long it will take for PT to start appearing in games in large numbers. I'm guessing that's still at least a decade away, and raster will not disappear even 30 years from now.
Posted on Reply
#68
csendesmark
Ray tracing is only a gimmick, for now.
It is OK at best for the most part, and you need to burn a lot of money to make it fluid at QHD or larger.
Within 3-5 years it may become standard for games on high-end configs, but nothing else.
Posted on Reply
#69
Wasteland
photonboy: Or path-tracing seemingly not offering much for the performance cost. That doesn't change the fact that these are amazing tools in the toolbox when used correctly. And they are the future.
This is a thread about next-gen GPUs. I don't think anyone here disputes that full path-tracing is a worthwhile goal, or even that it will eventually take over, but the question is when? "Sometime between now and the heat death of the universe" doesn't move the needle on purchasing decisions made now or indeed within the next decade.

At this rate, I honestly wouldn't bet good money that full path-tracing will become standard within my lifetime, which we'll call another 30 years, give or take. We've seen plenty of hype campaigns for supposedly paradigm-destroying tech innovations that stall out on the 5-yard line. Wake me when path-tracing becomes more than a tech demo or a marketing bullet point. In the meanwhile, as @londiste suggests, there are tens of thousands of excellent games out there, many of them much more fun than the focus-group-driven corporate products at the leading edge of graphics tech. They're all rasterized.
Posted on Reply
#71
DemonicRyzen666
TheDeeGee: Fact is, Ray Tracing is easier to work with for developers compared to baked lighting.
No it isn't.
Posted on Reply
#72
EsliteMoby
LittleBro: DLSS-like functionality helps devs avoid properly optimizing their games. In other words, it lets them get paid quicker for less work done. Next there will be games generated by "AI" sold with the price tags of games made by humans.
DLSS is barely AI. It's temporal upscaling. If it were AI, performance would be bad, because every frame would need to be trained and reconstructed in real time.
Posted on Reply
#73
tpa-pr
Like a few people here, I don't have any interest in ray-tracing or any of Nvidia's "special sauce". I want as many frames as possible at native 1440p, a fair bit of VRAM for future-proofing, and drivers that don't cause me issues on Linux, all at a reasonable price. AMD has ticked all of those boxes for me with the 7900 XTX :)
Posted on Reply