Tuesday, February 6th 2024
AMD Radeon RX 7900 XT Now $100 Cheaper Than GeForce RTX 4070 Ti SUPER
Prices of the AMD Radeon RX 7900 XT graphics card hit new lows, with a Sapphire custom-design card selling for $699 after a coupon discount on Newegg. That makes it a full $100 (12.5%) cheaper than the recently announced NVIDIA GeForce RTX 4070 Ti SUPER. The most interesting part of the story is that the RX 7900 XT is technically from a segment above: originally launched at $900, it is recommended by AMD for 4K Ultra HD gaming with ray tracing, while the RTX 4070 Ti SUPER is officially recommended by NVIDIA for maxed-out gaming with ray tracing at 1440p, although throughout our testing we found the card to be capable of 4K Ultra HD gaming.
The Radeon RX 7900 XT offers about the same performance as the RTX 4070 Ti SUPER, averaging 1% higher in our testing at the 4K Ultra HD resolution. At 1440p, the official stomping ground of the RTX 4070 Ti SUPER, the RX 7900 XT comes out 2% faster. These are, of course, pure raster 3D workloads. In our testing with ray tracing enabled, the RTX 4070 Ti SUPER storms past the RX 7900 XT, posting 23% higher performance at 4K Ultra HD and 21% higher performance at 1440p.
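As a quick sanity check of the percentages above, here is a minimal Python sketch. The $699 price and the ~1% raster lead at 4K are the figures quoted in this article, the $799 comparison price is implied by the $100 gap, and the variable names are purely illustrative:

```python
# Figures quoted (or implied) in the article above.
rx_7900_xt_price = 699          # Sapphire custom card with Newegg coupon
rtx_4070_ti_super_price = 799   # implied by the $100 gap
rx_raster_4k = 1.01             # RX 7900 XT raster perf at 4K, normalized to RTX 4070 Ti SUPER = 1.00

price_gap = rtx_4070_ti_super_price - rx_7900_xt_price
price_gap_pct = 100 * price_gap / rtx_4070_ti_super_price
perf_per_dollar_ratio = (rx_raster_4k / rx_7900_xt_price) / (1.00 / rtx_4070_ti_super_price)

print(f"RX 7900 XT is ${price_gap} ({price_gap_pct:.1f}%) cheaper")              # $100 (12.5%) cheaper
print(f"Raster perf-per-dollar advantage at 4K: {perf_per_dollar_ratio:.2f}x")   # roughly 1.15x
```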
Source: VideoCardz
132 Comments on AMD Radeon RX 7900 XT Now $100 Cheaper Than GeForce RTX 4070 Ti SUPER
Warzone is an esport with many pro teams participating
escharts.com/games/warzone
OW is the least skill-based esport out there; my noob friend could solo-queue to Master by just spamming the meta hero (Ana), no need for aiming LOL.
No idea where you pull the 0.13 ms input latency improvement from; latency scales with frame time, not FPS, so doubling the FPS cuts the frame-time part of the input latency in half (see the sketch after this post).
Gamers Nexus started testing input latency improvements per GPU, and there the 4090 gets a sizeable input latency reduction in Siege.
Maybe I'm not as good an esport player as I used to be (I reached Divine bracket in Dota2 years ago), but I still play PUBG everyday, I sure as heck would like to have the competitive advantage against kids LOL
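For reference on the FPS-vs-latency math above, a minimal Python sketch; it only models the frame-time component of input latency (mouse polling, game logic, and display response add on top of it), and the FPS values are just illustrative:

```python
def frame_time_ms(fps: float) -> float:
    """Time to render one frame, in milliseconds; one component of total input latency."""
    return 1000.0 / fps

for fps in (60, 120, 240, 480):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.2f} ms per frame")

# 60 FPS -> 16.67 ms, 120 -> 8.33 ms, 240 -> 4.17 ms, 480 -> 2.08 ms:
# doubling FPS halves this component, but the absolute gain shrinks
# as the frame rate climbs.
```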
No wonder G-Sync monitors are rarer now that adaptive sync is an industry standard. Why pay a premium on a module if you can get a better panel instead? :kookoo:
I'm not saying that either choice is wrong, just that every argument has two sides (at least). The only online games I've ever been interested in are World of Tanks and World of Warships, although it's been a long while since I last played either. The quality of the general community hasn't been great in recent years (far too many angry teenagers for my liking).
AMD's RT is based on DirectX DXR, and while AMD's other features like FSR aren't based on DirectX, they are still open source, which is much better than a proprietary closed ecosystem.
I remember when reviewers insisted G-Sync monitors were the only choice; reviewers are doing the same thing now with Nvidia software features only Nvidia users can use, and with frame gen that only 40-series users can use, as Nvidia would rather people buy a new GPU than let 30-series users get more performance. Cards like the 3060 8GB and 3070 8GB need frame gen, as 8 GB of VRAM just isn't enough for newer games. Hopefully AMD can improve its RT performance so it becomes a feature everyone can use, like FreeSync, because Nvidia having a near monopoly on ray tracing isn't a good thing.
Calling Nvidia "nGreedia" and aggressively defending each and every move of their beloved FRIEND helps absolutely nobody either. It only feeds their delusion.

Nvidia's RT is also based on DXR. There's no difference, except they've got a generation's worth of a head start, as Turing was DXR capable and RDNA was not. Not only was it not, AMD also chose to take the lazy route and not implement the low-performance but, at the time, important software driver for it either, something that Nvidia went out of its way to add to Pascal. It may not have been fast enough for gamers, but it sure helped solidify GeForce as the premier RT vendor, simply because you could develop DXR software on Pascal but not on AMD's products. The result is that Nvidia's driver is far more mature, their software is more robust, and graphics programmers are actually more familiar with how Nvidia hardware works because they've had such a massive head start.
Since I play on Windows, I'm not exactly drawn to open source; in fact, I couldn't care less as long as I get the best experience. The vast majority of people share this view beyond a knee-jerk reaction of "oh yeah, FOSS is great, I love me some FOSS", seconds before pulling their iPhone out of their pocket. They like the free part; the open source part... only devs care about that.
"An ecosystem (or ecological system) is a system that environments and their organisms form through their interaction.[2]: 458 The biotic and abiotic components are linked together through nutrient cycles and energy flows." - Wikipedia
How is DLSS an ecosystem? Or RT? These are features you can use in your game if you want to. They aren't living, interacting organisms in any way. I don't even see how they're premium when you have FSR and RT on AMD as well.
I'm currently playing Alan Wake 2 at 3440x1440, RT off, and no upscaling at 70-90 FPS on my 7800 XT. If I enable RT+PT, my FPS drops to 20. Should I have paid the Nvidia premium for a "much better" RT with the 4070 at a whopping 25 FPS? Is that really such a premium experience? I don't think so.
Edit: If you're enjoying your graphics card, good on you. Just don't try to sell it as some kind of "premium experience" because it's not. It's just a graphics card like any other.
If anything, AMD offers an enhanced experience by embracing open source. That's the thing with AMD: they've always pushed for open source, whereas Nvidia sticks it to developers with its closed-source solutions.
DLSS is not an ecosystem. It's simply a feature that is part of an ecosystem. AMD's got their FidelityFX series of features, all of which form the feature set of Radeon GPUs. Which one offers more features and the least trouble? Not a hard thing to figure out.
I speak against AMD when I believe they have to be spoken against. Want to see a thread I intentionally stayed out of? This one:
www.techpowerup.com/forums/threads/how-are-amd-drivers-these-days.318664/
and it's a good example of why I don't trust them anymore. The market share speaks for itself.
And you know better: "follow the masses" isn't what I meant. It's that I'm not alone in thinking their GPUs aren't so hot, particularly once you're on the receiving end of the endless tradeoffs, drawbacks, buts, and ifs that are invariably involved. To claim the "premium experience", you can't have your customers doing that job for you.
RDNA 3 is falling on its face; with such low market penetration, it will probably go the way of Vega 64 and Radeon VII in terms of driver support in the future.
The current release, 551.23, supports GPUs as far back as GM107, a now decade-old low-end chip released in January 2014.
Funny, isn't it, how people say they're different and then they turn out to be... people. This isn't gonna go anywhere.
You guys are happy? Brilliant! That's a great sign. I hope they continue their work so that I can be happy with them again. There's nothing wrong with that. I just want the make-believe to end: pretending that issues aren't real, that arguments are overblown, and that the small indie is always being ruthlessly bullied by the greedy meanie... it has all gotten so old by now. Every time I read the word "nGreedia" I just feel bitter disappointment. I fully realize that I often get on AMD fans' case. It's generally done in good faith. OK, fine :)
Really, it's a lot less personal than it sounds. I've no real problem with you. It'd be good to see you broaden your horizons a little, though. AMD doesn't have your best interests at heart. :toast:
I know you meant that you're not alone in thinking the way you think. It's just that it's not really helping the conversation, as it's not a valid argument for or against anything. In my opinion, RDNA 3's main problem is RDNA 2's (relative) success. Most of those who wanted an AMD card in recent years have already got an RDNA 2 card, and the only tier that offers anything on top of it is the two 7900 cards. Fair enough, but... What issues?
Edit: The "RT is so much better on Nvidia" argument is equally old. "Slightly less shit" is not the same thing as "so much better".
I don't know what issues you are talking about. Please explain.
As far as "Ngreedia" goes, that moniker fits when EVGA and the other AIB partners make less than 10% margin while Nvidia says its profit margin has increased to 74%. The only thing to call that is greed. They are even making GPUs to subvert sanctions, and pumped 4090s to their Asian partners to get around them. All of that led to what people like you like to tout: 81% market share. I will ask you: do you know 10 people in your local circle that have a 4090?
Getting back to the thread: I have this specific card. I bought it at launch for $1,299, and after more than a year on the market it's now $999. That's a $300 drop. If you think people are not buying these GPUs, you would be mistaken. At least once a week on TPU, someone joins the 7000 series club. At least in there we don't get hyperbolic comments like the ones you make.
The differences have been there for a long time. Today, I'm of the opinion that AMD has positioned itself much better, in that the chiplet is likely a much more crucial piece of tech for advancing GPUs than further refinement of monolithic dies, where it is already evident we barely gain more FPS per dollar on the hardware alone. That's worrying and requires a solution, one even Nvidia is going to have to adopt.