Monday, October 28th 2024
Next-Gen GPUs: Pricing and Raster 3D Performance Matter Most to TPU Readers
Our latest front-page poll sheds light on what people want from the next generation of gaming GPUs. We asked our readers what mattered most to them, with answers including raster performance, ray tracing performance, energy efficiency, upscaling or frame-gen technologies, video memory size, and, lastly, pricing. The poll ran from September 19 and gathered close to 24,000 votes as of this writing. Pricing remains the king of our polls, with the option gathering 36.1% of the vote, or 8,620 votes. Our readers expect pricing of next-generation GPUs to remain flat, variant-for-variant, rather than continue the absurd upward trend of the past few generations, which has pushed the high-end beyond the $1,000 mark and left $500 barely buying a 1440p-class GPU, even as 4K-capable game consoles exist.
Both AMD and NVIDIA know that Moore's Law is cooked, and that generational leaps in performance and transistor counts are only possible with increases in pricing for the latest foundry nodes. AMD even experimented with disaggregated (chiplet-based) GPUs in its latest RDNA 3 generation, before calling it quits on the enthusiast segment so it could focus on the sub-$1,000 performance segment. The second most popular response was raster 3D performance (classic 3D rendering performance), which scored 27% or 6,453 votes. Generational gains in raster 3D rendering performance at native resolutions remain eminently desirable for anyone who has followed the PC hardware industry over the decades. While Moore's Law held, we got used to near-50% generational increases in performance, which enabled new gaming APIs and upped the eye candy in games with each generation. Interestingly, ray tracing performance takes a backseat, polling not even 3rd but 4th place, at 10.4% or 2,475 votes. 3rd place goes to energy efficiency.
The introduction of 600 W-capable power connectors was an ominous sign of where power draw is headed with future GPU generations, as the semiconductor fabrication industry struggles to make cutting-edge sub-2 nm nodes widely available. For the past three or four generations, GPUs haven't been built on the very latest foundry node. For example, by the time 8 nm and 7 nm GPUs came out, 5 nm EUV was already the cutting edge, and Apple was making its iPhone SoCs on it. Both AMD and NVIDIA would go on to make their next generations on 5 nm, while the cutting edge had moved on to 4 nm and 3 nm. The upcoming RDNA 4 and GeForce Blackwell generations are expected to be built on nodes no more advanced than 3 nm, but these come out in 2025, by which time the cutting edge will have moved on to 20A. All of this impacts power, because performance targets end up wildly misaligned with the foundry nodes actually available to GPU designers.
Our readers gave upscaling and frame-gen technologies like DLSS, FSR, and XeSS the fewest votes, with the option scoring just 2.8% or 661 votes. They do not believe upscaling technology is a valid excuse for missing generational performance improvement targets at native resolution, and take claims such as "this looks better than native resolution" with a pinch of salt.
All said and done, the GPU buyer of today has the same expectations of the next generation as they did a decade ago. This is important, as it forces NVIDIA and AMD to innovate, build their GPUs on the most advanced foundry nodes, and try not to be too greedy with pricing. NVIDIA's real competitor isn't AMD or Intel; rather, PC gaming as a platform competes with the consoles, which offer 4K gaming experiences for half a grand, with technology that "just works." The onus, then, is on PC hardware manufacturers to keep up.
73 Comments on Next-Gen GPUs: Pricing and Raster 3D Performance Matter Most to TPU Readers
If only 12GB, that is some really big boy sweater the 5050 is wearing.
If a game has to rely on RT to sell its shit, that tells you something. It's like AI on everything: no one actually wants it, yet here it is....
> "Rasterization is a dead end"
That said, the raster price to performance benefit has to be there before I even consider looking at cards to upgrade to, so I would not have voted efficiency as my priority. But it and VRAM are still important factors before a purchase. I DGAF about RT or upscaling.
I humbly remind you that there are two global processes (yeah, I'm oversimplifying a bit): the rendering pipeline and post-processing.
Ray tracing is just post-processing; sorry if that hurts, but that's the reality.
To those who write that rasterization is a "dead end": it is very clear that you do not understand how a graphics card turns code into displayed images.
DLSS/FSR/XeSS are so-so; technically they sit at the post-processing stage, but they have hooks into the pipeline, if only for motion vectors and such.
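Those "hooks into the pipeline" are the key detail: temporal upscalers consume per-pixel motion vectors from the renderer to reproject the previous frame before blending it with the current one. A toy sketch of that reprojection step, using a 1-D scanline and integer vectors for clarity (real upscalers use sub-pixel vectors, bilinear history fetches, and history clamping — all function and parameter names here are illustrative):

```python
def temporal_blend(history, current, motion, alpha=0.1):
    """Blend the current frame with reprojected history.

    history, current: lists of pixel values (a 1-D scanline for simplicity)
    motion: per-pixel integer offsets saying where each pixel was
            in the previous frame (hypothetical simplification)
    alpha: weight of the new sample (lower alpha = more temporal smoothing)
    """
    out = []
    for i, cur in enumerate(current):
        src = i - motion[i]  # reproject this pixel into the last frame
        if 0 <= src < len(history):
            out.append(alpha * cur + (1 - alpha) * history[src])
        else:
            out.append(cur)  # disocclusion: no valid history, use current
    return out
```

For a bright pixel that moved one step to the right between frames, the motion vector lets the blend fetch the matching history sample instead of smearing against stale data — which is exactly why these techniques need renderer cooperation and can't be pure post-processing bolted on afterwards.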
The Nitro card received several tech awards for its design and quality. It can even hit 3 GHz in several games at stock settings. Happy to provide screenshots after my holiday is finished. The card is fantastic for 4K/120 Hz gaming on an OLED display.
1) Path-tracing:
This IS the end-goal for most 3D gaming. Rasterized graphics will eventually go away. Game DEVELOPMENT will actually be far EASIER once it's only being done for path-tracing for many reasons including issues related to light baking and conflicts between raster lighting methods. It's not even debatable if you're in the industry.
(It will take a LONG time though because game devs still need to make most games for older hardware due to the install base vs profitability.)
2) AI UPSCALING:
It's not a "BS" feature like some people seem to think. It's actually the most BENEFICIAL feature that has ever come to gaming. Has it been ABUSED? Absolutely.
Also, future games will continue to optimize around the assumption that AI UPSCALING will be used. For example, deciding on the art style, or line thickness, or how many light bounces to do etc.
Summary:
The future of HARDWARE design is primarily based around moving towards more PATH TRACING and utilizing AI UPSCALING. The Sony PS5 PRO is a good example of how this is happening: the main GPU architecture is tuned for a bit more general compute to reach higher resolutions, a stronger focus on path-tracing, and some dedicated DIE SPACE for PSSR to utilize AI upscaling.
The PS6, next XBOX and PC gaming in general will be optimized around these TWO things.
By all means, be ANNOYED about upscaling and/or frame generation being abused. Or path-tracing seemingly not offering much for the performance cost. That doesn't change the fact that these are amazing tools in the toolbox when used correctly. And they are the future.
Path tracing is very, very expensive. Expanding what the RT units in GPUs do today is easy enough, but while those units do a lot of heavy lifting, there is a lot else in RT/PT that needs to be figured out and standardized. Simple example: the bloody structure of the scene. Whether it will eventually be BVH, some variation of it, some other mesh structure, or something completely new, for full path tracing to take off, parts of it probably need to end up hardware-assisted if not straight-up accelerated. And that requires some level of standardization around what everyone does with it and how. More likely, though, this will still run on either shaders or CPUs, which shifts the responsibility and performance cost but does not really solve anything. And afaik there are still some things that are difficult to do with path tracing.
Plus there is the elephant in the room: whoever wants to go for full path-tracing hardware needs to figure out what to do with all the existing rasterized games. Essentially something like a DX/OGL/Vulkan wrapper to whatever API will run path tracing. Since we do not know what the hardware approach will end up being, this may turn out to be a simple problem or an extremely difficult one :)
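To make the scene-structure point concrete: the workhorse primitive that current RT units accelerate in hardware while walking a BVH is the ray-versus-bounding-box "slab" test — and it is exactly this kind of operation whose standardization the comment above is talking about. A minimal sketch of the slab test (assumes the ray direction has no zero components, which real code must special-case; all names are illustrative):

```python
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    """Slab test: does a forward-pointing ray hit an axis-aligned box?

    origin:  ray origin, one value per axis
    inv_dir: precomputed 1/direction per axis (assumed finite here)
    Returns True if the ray's overlap interval with all three
    slabs is non-empty at some t >= 0.
    """
    tmin, tmax = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1 = (lo - o) * inv          # entry/exit distances for this slab
        t2 = (hi - o) * inv
        tmin = max(tmin, min(t1, t2))  # latest entry across slabs
        tmax = min(tmax, max(t1, t2))  # earliest exit across slabs
    return tmin <= tmax
```

A BVH traversal just applies this test at every node to decide which subtrees a ray can skip; because it runs billions of times per frame, moving it (and the tree layout it assumes) into fixed-function hardware is where the standardization question bites.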
Looking at Tim's video about RT, the situation is not that great six years later. Games where RT is properly implemented, does not cause issues, and is meaningfully distinct can be counted on two hands at best. If this has taken six years, we can only guess how long it takes for PT to start appearing in games in large numbers. I'm guessing that's still at least a decade away, and raster will not disappear even 30 years from now.
It is OK at best for the most part, and you need to burn a lot of money to make it fluid at QHD or higher.
Within 3~5 years it maybe becomes standard for games on high-end configs, but nothing else.
At this rate, I honestly wouldn't bet good money that full path-tracing will become standard within my lifetime, which we'll call another 30 years, give or take. We've seen plenty of hype campaigns for supposedly paradigm-destroying tech innovations that stall out on the 5-yard line. Wake me when path-tracing becomes more than a tech demo or a marketing bullet point. In the meantime, as @londiste suggests, there are tens of thousands of excellent games out there, many of them much more fun than the focus-group-driven corporate products at the leading edge of graphics tech. They're all rasterized.