Rumors about RDNA4 suggest that AMD is not going to compete in the high-end graphics space and will instead focus on reducing costs, putting a heavier R&D effort into RDNA5. RDNA4 is said to have 7900 XTX performance, lower power consumption, and a $400 price. Please keep in mind this is a rumor that leads to my hypothetical questions.
If AMD releases an 8800 XT RDNA4 GPU that has
- More or less the same performance as the 7900 XTX
- ~33% less power usage
- The same feature set as RDNA3
- A $400 retail price
What price premium do you think Nvidia would be able to charge for the RTX 4080/4080 Super?
Currently the 4080 Super is ~$100, or 11%, more expensive than the 7900 XTX.
Would $40 more, i.e. $440 for an RTX 4080/4080 Super, be worth the cost for Nvidia's feature set?
The 4080 Super has ~20% more RT performance than the 7900 XTX according to TechPowerUp reviews.
Would $80 more, i.e. $480 for an RTX 4080/4080 Super, be worth the cost for Nvidia's feature set?
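To make the premiums in the questions above concrete, here's a quick sketch of the arithmetic. All prices are the hypothetical ones from this post, not real MSRPs:

```python
# Hypothetical prices from the post: an RDNA4 "8800 XT" at $400,
# and two possible Nvidia price points for a 4080/4080 Super.
def premium_pct(nvidia_price, amd_price):
    """Nvidia's price premium as a percentage of the AMD price."""
    return (nvidia_price / amd_price - 1) * 100

amd = 400
for nvidia in (440, 480):
    print(f"${nvidia} vs ${amd}: {premium_pct(nvidia, amd):.0f}% premium")
# $440 vs $400: 10% premium
# $480 vs $400: 20% premium
```

So the two scenarios amount to asking whether Nvidia's feature set justifies a 10% or a 20% premium over the hypothetical $400 card.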
EDIT: Yes, I know $400 is a very low price for an 8800 XT card. All of this is made up. I chose a low price to make it more attractive. Rumors are rumors. Hypotheticals are hypotheticals. $400 makes this question more interesting.
I'd like to point out that TPU tests with ray tracing at high settings, but not path tracing, or "Full Ray Tracing" as NVIDIA calls it.
Graphs are pulled from TechPowerUp's Cyberpunk 2077: Phantom Liberty performance review (which covers DLSS 3.5 and path tracing) and their latest GPU review, the Galax GeForce RTX 4070 Super EX, both at www.techpowerup.com.
In this metric the difference between AMD and NVIDIA is even greater than in the averaged chart, where the RTX 4080 is 31% faster than the RX 7900 XTX at native 4K ray tracing (averaged across the tested games at ray tracing ultra, but not path tracing).
This is the averaged chart of native Ray Tracing game tests, without frame generation or upscaling.
For a direct comparison of ray tracing vs path tracing/full ray tracing, here are the normal "ray tracing" native 4K results in a specific game known for a high quality, intensive ray tracing implementation that supports both ray and path tracing.
Here, the RTX 4080 is 57.6% faster than the RX 7900XTX in minimum FPS, and 59.9% faster in average FPS. The 4090 is in another league, with 101.3% faster performance than the RX 7900XTX in minimum FPS, and 117.3% faster in average FPS, more than twice as fast.
Now here's the 4K "path tracing" or "full ray tracing" results, again without upscaling, frame generation or ray reconstruction, all of which add performance.
Here the RTX 4080 is 226.3% faster in minimum FPS than the RX 7900 XTX, and 220.9% faster in average FPS. The RTX 4090 is again in another class, being 365.7% faster in minimum FPS and 353.4% faster in average FPS.
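For anyone unsure how to read these "% faster" figures: they're just FPS ratios, and converting them back to multiples makes the gaps easier to grasp. A minimal sketch, using the percentages quoted above (the FPS inputs in `pct_faster` are illustrative placeholders, not TPU's exact numbers):

```python
def pct_faster(fps_a, fps_b):
    """How much faster card A is than card B, as a percentage."""
    return (fps_a / fps_b - 1) * 100

def speedup(pct):
    """Convert a '% faster' figure back into a multiple."""
    return 1 + pct / 100

# Illustrative: 110 FPS vs 100 FPS is "10% faster"
print(f"{pct_faster(110, 100):.0f}% faster")

# Percentages quoted above for the RTX 4090 vs RX 7900 XTX:
print(f"{speedup(117.3):.3f}x")  # ray tracing avg FPS -> "more than twice as fast"
print(f"{speedup(353.4):.3f}x")  # path tracing avg FPS -> over 4.5x
```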
The point I'm trying to make here is that as you increase the load on the ray tracing hardware of these cards, the NVIDIA cards become faster relative to their AMD equivalents. This is important to understand, because some popular games include very lightweight or basic implementations of "ray tracing", such as global illumination only. This skews the data slightly, because the "ray tracing" performance penalty is much smaller than if the entire game's lighting were ray or path traced, instead of a hybrid design of rasterized lighting and ray/path traced lighting.
Obviously 4K path traced gaming isn't currently viable at native resolution without some combination of performance/quality-improving tech such as DLSS/DLAA, Frame Generation, and Ray Reconstruction. That's not the point here; I'm just trying to make it clear that most current games don't really push the envelope when it comes to "ray tracing" implementations. They're basically features tacked on for the small percentage of PCs that can actually run RT/PT games with good FPS. It's simply not a developer priority when most gamers have hardware on the level of an RTX 2060, or use consoles. This will change, though, as people naturally get faster hardware over time. So I find it interesting to compare the difference in actual performance potential between the two vendors when these cards are stressed with intensive ray/path tracing implementations.
Here's what those path tracing/"full ray tracing" numbers look like when combined with some of the performance-improving technologies currently available. Ray reconstruction is only available when used with both DLSS and path tracing/"full ray tracing"; it doesn't work with standard "ray tracing". This is because it's a new way of denoising that does the upscaling and the denoising at the same time, rather than as separate steps, improving both quality and performance. Note that while the graph is titled "DLSS 3.5 Performance", that refers to the performance comparison; the actual testing is done with the DLSS "Quality" preset.
Graph from TechPowerUp's DLSS 3.5 Ray Reconstruction review (www.techpowerup.com), which tested image quality and found that VRAM usage actually goes down and performance goes up.
I wanted to write this because I don't think people (especially those who don't own an RTX card, or who haven't tried a higher-end Ada generation card) really understand the difference in performance between the two vendors, and just how far ahead NVIDIA is.
I think that this comparison, while not perfect, indicates the kind of performance differences people can expect as RT/PT implementations become more involved, rather than the weak smattering of RT effects sprinkled on top of today's games so that consoles (which currently use AMD GPUs) can run them.
The PS5 Pro is rumoured to have significantly faster ray tracing performance, so with its release (and the eventual Xbox Series refresh), developers will probably start using heavier RT/PT implementations. But I doubt we'll see widespread path tracing until the next generation of consoles is released, e.g. the PS6.
As developers start to actually make full use of the new lighting techniques in the latest game engines, I expect this trend will start to show the performance differences more completely, and game performance testing will show numbers skewing closer and closer to what's been shown here. Path traced lighting, i.e. no rasterized lighting at all, is the obvious end game.
Something interesting to note: look at the RTX 4070 Ti compared to the RX 7900 XTX. In the averaged "ray tracing" FPS chart it's slower; in the heavy RT/PT implementation of Cyberpunk 2077: Phantom Liberty, it's much faster.
This is what I'm getting at: most games today don't come close to fully using the ray tracing hardware on current generation cards, so even with "ray tracing" turned on, the FPS is still dictated largely by classic rasterization performance. This will change as games use heavier RT, or full RT/PT implementations.