Tuesday, February 6th 2024
AMD Radeon RX 7900 XT Now $100 Cheaper Than GeForce RTX 4070 Ti SUPER
Prices of the AMD Radeon RX 7900 XT graphics card hit new lows, with a Sapphire custom-design card selling for $699 with a coupon discount on Newegg. This makes it a whole $100 (12.5%) cheaper than the recently announced NVIDIA GeForce RTX 4070 Ti SUPER. The most interesting part of the story is that the RX 7900 XT is technically from a segment above. Originally launched at $900, the RX 7900 XT is recommended by AMD for 4K Ultra HD gaming with ray tracing, while the RTX 4070 Ti SUPER is officially recommended by NVIDIA for maxed-out gaming with ray tracing at 1440p, although throughout our testing we found the card to be capable of 4K Ultra HD gaming.
The Radeon RX 7900 XT offers about the same performance as the RTX 4070 Ti SUPER, averaging 1% higher than it in our testing at the 4K Ultra HD resolution. At 1440p, the official stomping ground of the RTX 4070 Ti SUPER, the RX 7900 XT comes out 2% faster. These are, of course, pure raster 3D workloads. In our testing with ray tracing enabled, the RTX 4070 Ti SUPER storms past the RX 7900 XT, posting 23% higher performance at 4K Ultra HD, and 21% higher performance at 1440p.
Source: VideoCardz
132 Comments on AMD Radeon RX 7900 XT Now $100 Cheaper Than GeForce RTX 4070 Ti SUPER
You made your choice; you had some reasoning behind the purchase of the particular card you have. It doesn't matter anymore if someone's card is faster, sometimes within the margin of error.
Some people don't even use RTRT, even while having a high-end RTX card. That's their choice. If you don't like the card, or the company behind the card, that someone else uses... well, it's not your business. Go enjoy your <whatever GPU brand and color TM>.
If people want change, they have to make the first step instead of making every excuse in the book to purchase from the brand they are supporting. If you want the best video card on the market, it's the 4090; there is no doubt about it. The 7900 XTX is slightly better than the 4080 overall, but both are very close (the 7900 XTX can be had cheaper, however). The 7900 XT is better than the 4070 Ti SUPER, though the gap is smaller than it was against the 4070 Ti (and now it's quite a bit cheaper). The fact we have so much brand loyalty out there is what is killing the GPU market.
You are trying to make an argument that would typically apply to eSports titles, but in the case of CoD there are far, far greater issues that plague the game. I'd have to agree, there likely isn't much if any benefit to 700 FPS over 500, for example, in regard to latency. I could maybe understand a very slight benefit to the ability to predict movement thanks to the additional frames, but even still, you are talking 500 FPS vs 700, both of which are very high. Heck, going from a 144 Hz to a 240 Hz screen was a small upgrade to my eyes, and most eSports players that have been asked about the difference seemed to agree. Most said that benefits really cap out at 360 Hz. That said, there is a separate factor of image sharpness. A higher refresh rate monitor can improve sharpness of motion, but there are other technologies that address that as well (ULMB2 for example). I think these additional factors make the conversation more complicated, but at the very least we can conclude that benefits from higher FPS / refresh rates have reached extremely diminishing returns.
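The diminishing returns argument above is easy to check with frame-time arithmetic: each step up in refresh rate saves fewer milliseconds per frame than the last. A minimal sketch (the rates chosen are illustrative, and this simplification treats display refresh as the only delay, ignoring the rest of the input-latency chain):

```python
# Frame time in ms at each refresh rate; each jump saves fewer ms than the last.
rates_hz = [60, 144, 240, 360, 500, 700]
frame_times = {hz: 1000 / hz for hz in rates_hz}

prev = None
for hz in rates_hz:
    ft = frame_times[hz]
    gain = f"{prev - ft:5.2f} ms saved" if prev is not None else "baseline"
    print(f"{hz:>3} Hz -> {ft:5.2f} ms per frame ({gain})")
    prev = ft
```

Going from 60 Hz to 144 Hz saves about 9.7 ms per frame, while 500 Hz to 700 Hz saves well under 1 ms, which matches the "caps out around 360 Hz" impression from the players quoted above.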
Also, these high-end graphics cards, or the chips inside them, are also used in supercomputers, which definitely have a more important function: computing all kinds of problems, like the global warming models you mentioned, or medicine solutions, even if you think the capitalist system can't last forever because it has its own disadvantages (someone mentioned bursting bubbles, etc.). CPUs are cheap, while graphics cards are expensive. I don't see anything here but random price-setting by someone who dictates that behind the scenes. PhysX is like DLSS, a proprietary feature, while ray tracing is part of the Microsoft DXR specification which everyone agreed on, including AMD, which looks like it's not interested at all.
I mean, I am still fine with someone saying that's more important specifically to them because they want to run Cyberpunk at 4K Ultra with RT on Max, but even with that argument, the only card that runs ray tracing decently in the titles where it's actually noticeable is the 4090, which is very expensive. Even then, it's still a huge performance hit on that card.
Muscle memory is trained on stable latencies. I played pretty competitive Guild Wars stuff and raided on shitty laptops, taking down raid bosses in WoW at 10 FPS, even leading raids with another CPU hog in the background (Ventrilo or TS). Throttling? I didn't have a clue what that was, but yea, either the server ponied up high latencies/low FPS, or the hardware wouldn't handle all the assets proper. Whatever. If you do it 10 times, you know when to push that button. Overall, if you played online a good ten thousand hours, you know what latency is and how to adapt your input to still land everything at the right time.
Variable latency? You can safely forget about your in-game performance improving. But on any stable latency, you can train.
Also buying a GPU nowadays solely based on RT is a pretty bad idea, because no GPU, not even the $2000+ overpriced Nvidia cards can successfully utilize RT without tanking performance and implementing more gimmicks to gain that performance back somewhat. Lol
TLDR: Doesn't the internet have the highest latency of all "components"?
This is all nonsense, globalists have been spreading this lie for ages now. Don't buy into those lies.
This is the kind of situation most gamers wrongly assume to be in when they complain about GPU prices, but go ahead and buy one from the top shelf anyway. The only difference is, we don't need that GPU. We just want it when we could make do with any other model. We are at fault for prices. If we all refused to pay thousands for a mere toy, then it wouldn't cost thousands, it's that simple. But we don't refuse because we're sheep and we believe in stupid nonsensical slogans like "it's just the way it is" or "these are harsh times". No, it's not the way it is. We make it ourselves.
And I'm talking about gaming GPU prices. Supercomputer and datacentre GPUs are totally different, and so are their prices. If I'm wrong, then I'd gladly hear about your experiences with your 750 W Radeon Instinct super GPU in games. :)
If we put this in perspective against the % difference between GPUs... even a GPU that can output twice the FPS might net you, what, a 10% advantage in total latency over the other. If the supposed 7900 XTX scores 200 FPS vs another that has 400, you've already closed the gap for the most part. Diminishing returns.
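A rough model shows why doubling FPS barely moves total latency once network delay is in the picture. The 50 ms ping and 10 ms fixed overhead here are illustrative assumptions, not measurements, and the function name is hypothetical:

```python
# Rough end-to-end latency model: frame time is only one term
# alongside network round trip and fixed input/display overhead.
def total_latency_ms(fps, ping_ms=50.0, other_ms=10.0):
    """Frame time + network round trip + fixed overhead (all assumed values)."""
    return 1000.0 / fps + ping_ms + other_ms

slow = total_latency_ms(200)  # 5.0 ms frame time
fast = total_latency_ms(400)  # 2.5 ms frame time
advantage = (slow - fast) / slow * 100
print(f"{slow:.1f} ms vs {fast:.1f} ms -> {advantage:.1f}% total-latency advantage")
```

Under these assumptions, doubling from 200 to 400 FPS shaves total latency from 65.0 ms to 62.5 ms, roughly a 4% improvement, which is if anything smaller than the 10% figure ballparked above.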
And they will want to argue endlessly about why their overpriced product is so much better, because what else are they going to do? Something that is X% more expensive is never X% better; you're always getting ripped off. The more you pay, the worse the value, and the more you'll feel the need to justify your choice to others.
If the next X3D CPU is like $1000, you'd absolutely have tons of people arguing it's actually OK because it is the fastest gaming CPU after all, and if you think it's horrid value and a stupid choice, you're just a poor pleb or something.
There are several features that NVIDIA GPUs have and AMD doesn't that favour NVIDIA for me. I'm going to list some of the more important features for me below.
- CUDA cores (as I do video editing & encoding). CUDA support is far more widely available in video editing programs and other programs that take advantage of GPU acceleration. AMD has something similar, I think, but it's nowhere near as effective or as widely supported as CUDA.
- NVIDIA RTX Video (Super Resolution & HDR): nvidia.custhelp.com/app/answers/detail/a_id/5448/~/rtx-video-faq
- NVIDIA RTX TrueHDR in games (SDR to HDR conversion feature via AI): wccftech.com/nvidia-rtx-video-hdr-mod-reportedly-provides-better-auto-hdr-for-games-than-the-default-windows-one/
- Power usage. It's important for me as I'm using a Mini-ITX case.
- Better 'Ray Tracing' performance. It's not important to be the best at this; it's just a small added bonus to have good performance here sometimes. And the 'RTX 4070 Ti SUPER' is powerful enough to actually use 'Ray Tracing' at an acceptable level.
So, an 'RTX 4070 Ti SUPER' is the GPU I'm going to buy if I'm buying a new GPU before the next generation of GPUs comes out from NVIDIA and AMD.