Tuesday, February 6th 2024

AMD Radeon RX 7900 XT Now $100 Cheaper Than GeForce RTX 4070 Ti SUPER

Prices of the AMD Radeon RX 7900 XT graphics card have hit new lows, with a Sapphire custom-design card selling for $699 with a coupon discount on Newegg. This puts its price a whole $100 (12.5%) below that of the recently announced NVIDIA GeForce RTX 4070 Ti SUPER. The most interesting part of the story is that the RX 7900 XT is technically from a segment above. Originally launched at $900, the RX 7900 XT is recommended by AMD for 4K Ultra HD gaming with ray tracing, while the RTX 4070 Ti SUPER is officially recommended by NVIDIA for maxed-out gaming with ray tracing at 1440p, although throughout our testing we found the card to be capable of 4K Ultra HD gaming.

The Radeon RX 7900 XT offers about the same performance as the RTX 4070 Ti SUPER, averaging 1% higher in our testing at the 4K Ultra HD resolution. At 1440p, the official stomping ground of the RTX 4070 Ti SUPER, the RX 7900 XT comes out 2% faster. These are, of course, pure raster 3D workloads. In our testing with ray tracing enabled, the RTX 4070 Ti SUPER storms past the RX 7900 XT, posting 23% higher performance at 4K Ultra HD and 21% higher performance at 1440p.
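To put those price and performance deltas in perspective, here is a minimal performance-per-dollar sketch in Python. It uses only the figures quoted in this article (the $799 price for the RTX 4070 Ti SUPER is inferred from the stated $100 gap), so treat it as an illustration rather than additional benchmark data.

```python
# Performance-per-dollar sketch using only the deltas quoted in the article above.
# The $799 RTX 4070 Ti SUPER price is inferred from the stated $100 gap.

def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance divided by price, in arbitrary units."""
    return relative_perf / price_usd

# RTX 4070 Ti SUPER as the 1.00 performance baseline.
baseline = perf_per_dollar(1.00, 799)

# RX 7900 XT at the $699 deal price: ~1% ahead in 4K raster,
# ~23% behind (i.e. 1/1.23 of the baseline) with ray tracing at 4K.
raster = perf_per_dollar(1.01, 699)
ray_tracing = perf_per_dollar(1.00 / 1.23, 699)

print(f"Raster perf/dollar vs. RTX 4070 Ti SUPER:     {raster / baseline - 1:+.0%}")      # about +15%
print(f"Ray-traced perf/dollar vs. RTX 4070 Ti SUPER: {ray_tracing / baseline - 1:+.0%}")  # about -7%
```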
Source: VideoCardz

132 Comments on AMD Radeon RX 7900 XT Now $100 Cheaper Than GeForce RTX 4070 Ti SUPER

#26
Legacy-ZA
Garrus: yeah, the RX 7900 XT usually bests the 4070 in ray-tracing games; people exaggerate AMD's RT performance loss (my RTX 3080 is supposedly good at RT but loses to it, for example)

The RTX 4000 series sucks, no RT improvements; hopefully they move forward with RT for the RTX 5000 series and actually improve it this time
If the price, along with the performance, doesn't improve with that "technological leap," it means nothing to the masses, only to the few who can afford those exorbitant price tags.
Posted on Reply
#27
Panther_Seraphin
I can pick up a 7900 XT for £120 less than a 4070 Ti SUPER.

NVIDIA is riding the AI hype train currently and will keep doing so for the near future, looking at some of the numbers being rumoured.

NVIDIA is currently selling a 609 mm² die for ~$2k (aka the 4090), and they are also selling an 814 mm² die for nearly 20x that (aka the H100)... Guess where their focus is going to be.
Posted on Reply
#28
ratirt
If you ask me, $100 cheaper for the XT is nice, but I would still be keener on getting the XTX instead; considering I've got a 6900 XT, the 7900 XT's performance boost would not be enough to justify it.
Someone jumping from a 6700 XT to a 7900 XT would make sense.
Posted on Reply
#29
Vayra86
Beginner Micro Device: That's true, but... Here's the thing. AMD GPUs don't have access to DLSS, and FSR is implemented very badly in some games. When I use DLSS Quality (or sometimes even Balanced), I don't notice image quality degradation in most games. With FSR, image quality degradation is a given, especially below Quality mode. So in older games it's NVIDIA because AMD GPUs aren't really optimised for them (some DX10 and older titles outright refuse to be stable, or won't even launch, on AMD cards but run perfectly fine on NV counterparts), and in newer games it's also NVIDIA because we basically compare DLSS Quality on NV to native on AMD (with how poorly native TAA works, DLSSQ oftentimes doesn't look worse than that).

That said, the RX 7900 XT will only be tasty below 650 USD (but for an actual reality check, it has to drop to 550-ish dollars so NV stops overpricing their hardware).

NB: I rock an AMD GPU and occasionally use an RTX 3060 Ti. I know what I'm talking about.
Lmao. $700 for this caliber of card is a steal. You keep moving the goalposts; they're beyond sense at this point. Defaulting to DLSS being on to say FSR is worse is another pretty weird take. DLSS is certainly not free of image degradation; it's highly game-dependent. I think you're in a little bit of a bubble there. Old games not running is another such extreme edge-case notion.
Posted on Reply
#30
AusWolf
Garrus: yeah, the RX 7900 XT usually bests the 4070 in ray-tracing games; people exaggerate AMD's RT performance loss (my RTX 3080 is supposedly good at RT but loses to it, for example)

The RTX 4000 series sucks, no RT improvements; hopefully they move forward with RT for the RTX 5000 series and actually improve it this time
I couldn't agree more. Whether you lose 50% of your performance with RT or 60%, it doesn't matter. It's still crap either way.
Posted on Reply
#31
Outback Bronze
Well, it was a toss-up between the 4080, 7900 XT, and XTX for me, coming from a 6800 XT. At the time, the 4090s were over 50% more than a 4080 down under. Picked up my 4080 for $1749 AUD BNIB. 4090s started from $2749.

If these recent price drops had been around when I was looking to purchase, I'm not so sure I would have swung the way of the 4080.

Now, in saying that, the 4090 is a hard sell @ $1600 when the 4080 and now the 4080S are $999, once these price drops come to fruition.

It's really going to suck if these prices manage to stick or get worse for their appropriate tier cards in future releases.
Posted on Reply
#32
evernessince
nguyen: Actually, NVIDIA is miles faster in CoD when using competitive settings, which most people would use when playing competitive games anyway
That entirely depends on which CoD you are talking about and what settings. You literally cherry-picked the worst scenario for the 7900 XTX, whereas I can show the opposite scenario:



I'd argue the above scenario is FAR more realistic. Most people aren't gaming on a 7900 XTX or 4090 with a top-end CPU to begin with, let alone turning down the graphics to basic on their $1,000 to $1,600 graphics card. Even if we did assume CoD is an eSports title (it isn't), people who play eSports don't need to buy a 4090 or 7900 XTX to begin with; it's a waste of money. Then again, CoD isn't an eSports game, and people are not dropping $1,600 to play a game as bad-looking as CoD on low. That is just some epic level of cherry-picking.
nguyen: Not to mention NVIDIA users also have the Reflex advantage.
Anyone can get the same effect by capping their FPS just below max GPU utilization. AMD also has Anti-Lag and Anti-Lag+, but it's just as irrelevant as Reflex if you know how to cap your FPS.

It would be nice if TPU could have a single AMD GPU-related article without the NVIDIA defense force throwing shade, completely unprovoked, on repeat. Yes, we know DLSS is better and that RTX performance is better. We don't need that repeated ad nauseam.
Posted on Reply
#33
Melvis
It always has been here in Aus, and not by $100 but by $200, so this isn't news at all.
Posted on Reply
#34
gffermari
Still don't like the XT. The XT is the 6800 XT successor and should have been priced at $649-699 a year ago, not now.
Today I would get an XTX for $700-ish.
Above that, the TiS has no competition.

At this level you can't say "I don't want RT, PT, super sampling, etc." Everything matters. And the problem with the Radeons is the package, not necessarily the performance.
Posted on Reply
#35
theouto
evernessince: That entirely depends on which CoD you are talking about and what settings. You literally cherry-picked the worst scenario for the 7900 XTX, whereas I can show the opposite scenario:

I'd argue the above scenario is FAR more realistic. Most people aren't gaming on a 7900 XTX or 4090 with a top-end CPU to begin with, let alone turning down the graphics to basic on their $1,000 to $1,600 graphics card. Even if we did assume CoD is an eSports title (it isn't), people who play eSports don't need to buy a 4090 or 7900 XTX to begin with; it's a waste of money. Then again, CoD isn't an eSports game, and people are not dropping $1,600 to play a game as bad-looking as CoD on low. That is just some epic level of cherry-picking.
Maybe I'm remembering wrong, but I recall HUB saying that something was going wrong with their AMD tests. And even if they didn't say so, it's clear something is wrong when the card performs the same at both 1080p and 1440p, and nigh-identical at 4K, but the competition doesn't.
Posted on Reply
#36
Dave65
Nordic: I would choose the 7900 XT at this price if I were buying. It's not like I am playing many ray-tracing games or doing any compute work.
I jumped on this and got one for the kids' PC. NVIDIA can suck it!
Posted on Reply
#37
Denver
Beginner Micro Device: So... $1600 is double $1200..? I'm a little confused here.

30% at the very least. The 4090 is CPU-bottlenecked in almost all games below 4K and in many games at 4K.

One of the few games which can push this GPU to 100% clearly shows it's not 20% but rather 33%:


That said, $1600 is less greedy for the 4090 than $1200 is for the 4080, considering you're getting the very best all-around GPU. $1200 for the 4080 was a joke, is a joke, and will be a freaking joke. The 7900 XTX doesn't deserve to be over 700 USD either.

It's much more realistic than SSR, let alone baked lighting. If you need to pay attention to notice the difference, it means you didn't enable RT or you have some vision issues, IDK. Once seen, it can't be unseen. Much easier to implement, too. The problem is on the AMD side: they don't develop RT in their hardware, so gaming consoles don't get RT, so everyone only gets tech previews like Alan Wake 2 and Cyberpunk 2077, and games where RT isn't actually RT.

Just for your information: even Intel Arc GPUs handle RT better than AMD GPUs. This is how bad things are for AMD.
There are so many lies here that I don't even want to respond. The 4090 has been close to $2000 in most parts of the world for a while, and this is the average frame rate:



Just the fact that you have the courage to take as an example a beta game from half a decade ago, running at 30 FPS, only strengthens my argument. Is there a greater alienation than this?
Neither is realistic; look out the window.
Posted on Reply
#38
Macro Device
Denver: The 4090 has been close to $2000 in most parts of the world for a while
Just a month or two, maybe... Other than that, the 4090 stays very close to its MSRP. We're talking almost 1.5 years of lifespan at this point. You cherry-picked 10% to "prove" what's false either way. It was less than a week ago that the 4090 was doubling the 4080's price. Today, it's $1200 vs. $1800.
Denver: Just the fact that you have the courage to take as an example a beta game from half a decade ago, running at 30 FPS, only strengthens my argument.
It's just that I can count. Your average FPS figure is all over the place because more than FIFTY percent of the tested games are not GPU-intensive enough for the 4090 to show its full potential over the 4080, even at 4K. If you take a peek at the said "half-decade-old beta game," you'll also notice it runs 125% as fast on a 4090 as on a 4080 without RT (compared to a 35% advantage with RT enabled). The GPU isn't 100% loaded. It's like saying one Bugatti is virtually the same as another Bugatti because you didn't compare them in their intended usage, but rather calculated average speed on tarmac, broken tarmac, off-road, ice, snow, rubbish, Mars, Venus, etc. Of course these zeroes from invalid tests will skew the difference between the 2.0L and 3.0L models.
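To illustrate that averaging point in plain numbers, here is a toy Python sketch; the 30% GPU-bound gap and the 50/50 split are assumptions taken from the argument above, not measurements:

```python
# Toy illustration of how CPU-limited games compress an average GPU gap.
# Assumed numbers, not measurements: in GPU-bound titles the faster card
# leads by 30%; in CPU-bound titles both cards hit the same FPS ceiling.

gpu_bound_gap = 0.30    # assumed lead where the GPU is the limiting factor
cpu_bound_gap = 0.00    # both cards capped by the CPU, so no lead at all
share_cpu_bound = 0.50  # "more than fifty percent" of the tested games

average_gap = (1 - share_cpu_bound) * gpu_bound_gap + share_cpu_bound * cpu_bound_gap
print(f"Apparent average lead across the whole suite: {average_gap:.0%}")  # 15%
```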

Plus, there are people willing to buy a GPU for professional usage, and whoops, the 4080 only has 66% of the 4090's RAM. That makes its horsepower effectively pointless for such buyers. And these, mind you, are not a below-0.1% minority you can easily ignore.
Vayra86: You keep moving the goalposts; they're beyond sense at this point
I keep trashing greedy companies, namely NV and AMD, for selling dog vomit for ludicrous amounts of money. I don't want the 4C8T-vs-Bulldozer era to happen on the GPU market. That's why the second player, AMD, has to make something happen. If it weren't for Ryzen, we'd probably still be buying 4C8T i7s for 300-ish quid. RDNA is just GCN on 'roids. It could have been good enough a dozen years ago, but now..?

Like, yeah, gaming is much more accessible than it was a while ago, especially compared to 2021 and 2022, but these extra-tiny PCBs with so much simplification don't deserve to be sold for that money. The 4060 series is a joke. The 4070 series is a rip-off. The 4080 is just WHY. And I can't say I'm happy with what the competition is doing, because I know from my own experience what it's like to do a whole lot of nothing.
Posted on Reply
#39
Garrus
Legacy-ZA: If the price, along with the performance, doesn't improve with that "technological leap," it means nothing to the masses, only to the few who can afford those exorbitant price tags.
The price did improve. The 7900 XT is $700 right now, the same price as the RTX 3080, but it is 35 percent faster and has double the VRAM. Not bad.

Frankly, if it hits $650 you get that 40+ percent improvement in perf/dollar, plus the extra 10 GB of VRAM? That is quite a large leap over the RTX 3080. A must-buy. So close now.
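As a rough sanity check of those numbers, here is a minimal Python sketch; the 35%-faster figure and the prices are taken from this post itself, so treat the output as an illustration of the claim rather than independent data:

```python
# Perf-per-dollar check using the figures quoted in the post above.
# Assumptions: RTX 3080 at $700 as the 1.00 baseline, RX 7900 XT 35% faster.

def value(perf: float, price_usd: float) -> float:
    """Relative performance per dollar, in arbitrary units."""
    return perf / price_usd

rtx_3080 = value(1.00, 700)

for price in (700, 650):
    gain = value(1.35, price) / rtx_3080 - 1
    print(f"RX 7900 XT at ${price}: {gain:+.0%} perf/dollar vs. RTX 3080")
# $700 -> +35%, $650 -> about +45%
```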
Posted on Reply
#40
3valatzy
Gucky: The 7900 XT is, at the very least, what the 7800 XT should have been as a 6800 XT successor, minus the 4 GB of extra VRAM. And what price did the 6800 XT have? $649 at release.
gffermari: Still don't like the XT. The XT is the 6800 XT successor and should have been priced at $649-699 a year ago, not now.
Today I would get an XTX for $700-ish.
Above that, the TiS has no competition.

At this level you can't say "I don't want RT, PT, super sampling, etc." Everything matters. And the problem with the Radeons is the package, not necessarily the performance.
True.

RX 7900 XTX should have been RX 7900 XT for $750.
RX 7900 XT should have been RX 7800 XT for $600.
RX 7900 GRE should have been RX 7800 for $400.
RX 7800 XT should also have been RX 7700 XT for $350.
RX 7700 XT should have been RX 7700 for $279.
RX 7600 XT should have been RX 7500 XT for $199.
RX 7600 should have been RX 7400 XT for $129.
100 bucks cheaper, and yet, under some settings, it is dangerously slower...

Posted on Reply
#41
Macro Device
3valatzy: RX 7600 XT should have been RX 7500 XT for $199.
RX 7600 should have been RX 7400 XT for $129.
+55% price for +8 GB of almost useless VRAM? Kinda goofy. Just a reminder: these are essentially the same GPU, but the XT is ~10% overclocked.
3valatzy: RX 7900 GRE should have been RX 7800 for $400.
RX 7800 XT should also have been RX 7700 XT for $350.
One of them shouldn't have existed in the first place. They are margin-of-error close in performance, oftentimes showing identical frame rates.
Posted on Reply
#42
3valatzy
Beginner Micro Device: +55% price for +8 GB of almost useless VRAM? Kinda goofy. Just a reminder: these are essentially the same GPU, but the XT is ~10% overclocked.
True. They could have differentiated them by active shaders, frequencies, and VRAM.

RX 7500 XT: 16 GB, 2048 shaders, 2755 MHz boost clock
RX 7400 XT: 8 GB, 1792 shaders, 2475 MHz boost clock
Posted on Reply
#43
nguyen
evernessince: That entirely depends on which CoD you are talking about and what settings. You literally cherry-picked the worst scenario for the 7900 XTX, whereas I can show the opposite scenario:

I'd argue the above scenario is FAR more realistic. Most people aren't gaming on a 7900 XTX or 4090 with a top-end CPU to begin with, let alone turning down the graphics to basic on their $1,000 to $1,600 graphics card. Even if we did assume CoD is an eSports title (it isn't), people who play eSports don't need to buy a 4090 or 7900 XTX to begin with; it's a waste of money. Then again, CoD isn't an eSports game, and people are not dropping $1,600 to play a game as bad-looking as CoD on low. That is just some epic level of cherry-picking.



Anyone can get the same effect by capping their FPS just below max GPU utilization. AMD also has Anti-Lag and Anti-Lag+, but it's just as irrelevant as Reflex if you know how to cap your FPS.
Weird, when Warzone is the cherry-picked best-case scenario for the 7900 XTX to begin with, LMAO; all other esports games play better on NVIDIA anyway. Funny how the latest AMD drivers broke performance in Warzone, the only esports game that AMD is good at :roll:

BTW, I'm a very good esports player myself and I play with low settings all the time; always having the highest-end gaming GPU and monitor certainly gives me a competitive advantage :rolleyes:. There is something wrong with your thinking that people with high-end GPUs must only play single-player games :kookoo:

I guess you have never played esports; capping FPS only works half the time. If you cap the FPS too high and get GPU-bound mid-gunfight, you will get very noticeable, very bad input latency right when you need it the most; cap the FPS too low and you lose out on the latency reduction of higher FPS (e.g. capping at 120 FPS gives worse input latency vs. 200 FPS with Reflex). Uncapped FPS + Reflex is simply the best solution for esports.
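The frame-time arithmetic behind that 120 FPS vs. 200 FPS example can be sketched in a few lines of Python (illustrative only; real input latency also depends on the render queue, which Reflex is designed to manage):

```python
# Frame-time arithmetic for the capping example above. Illustrative only:
# actual input latency also depends on the render queue and game engine.

def frame_time_ms(fps: float) -> float:
    """Time spent rendering one frame at a given frame rate, in milliseconds."""
    return 1000.0 / fps

for fps in (120, 200):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):.1f} ms per frame")
# 120 FPS -> 8.3 ms per frame
# 200 FPS -> 5.0 ms per frame
```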
Posted on Reply
#44
Vayra86
Beginner Micro Device: I keep trashing greedy companies, namely NV and AMD, for selling dog vomit for ludicrous amounts of money. I don't want the 4C8T-vs-Bulldozer era to happen on the GPU market. That's why the second player, AMD, has to make something happen. If it weren't for Ryzen, we'd probably still be buying 4C8T i7s for 300-ish quid. RDNA is just GCN on 'roids. It could have been good enough a dozen years ago, but now..?

Like, yeah, gaming is much more accessible than it was a while ago, especially compared to 2021 and 2022, but these extra-tiny PCBs with so much simplification don't deserve to be sold for that money. The 4060 series is a joke. The 4070 series is a rip-off. The 4080 is just WHY. And I can't say I'm happy with what the competition is doing, because I know from my own experience what it's like to do a whole lot of nothing.
Greed also lies with consumers, expecting to eternally 'get more for less'. Something's gonna give eventually. We could reflect a bit more on that, too. And none of that excuses corporate greed, by the way; I fully agree with you there.

But then that's clear, thanks :) I get where you're coming from now. My stance on the $699 price tag for a 7900 XT is based more on the perspective of the current, actual market, without the principles of what I considered a good price for high-end GPUs in 2017. Because, really, that's what this is under the skin, isn't it?

Price goes up, mind says meh. I already cognitive-dissonanced my 900 EUR purchase of a 7900 XT into the dark pit of ignorance, so I'm past that :D In hindsight... would I have waited? Perhaps, yeah; 7900 XTs go for 760 EUR now over here. Shit happens.

Really, the best thing to fight corporate greed is to limit your upgrade itch. There's no need to buy a GPU every other year. There's no need to play shit at 4K ultra with RT on; games aren't better with it either. Buy something that truly lasts and isn't sensitive to the next feature creep built to expand corporate margins even further. This is how I ended up at AMD - equal or more GPU for my $$$ and a much better outlook on the future of gaming - not RT-driven, but driven by solid hardware with RT as a bonus 'if it truly matters'. Fact is, there still isn't a single game where RT truly matters, three generations in. It's still e-peen territory more so than front and center for gaming. And the supposed hype for it is rapidly being replaced with 'AI' as the next buzzword to sell chips on. Even NVIDIA still tries to sell everyone the lie that they are selling an AI upscale.

Another perspective on whether companies are lazy or actually progressing things for us: look at the chiplet. AMD drives that innovation, and it is the ONLY CONCEIVABLE WAY forward to get competitive chip pricing back, simply because we can go back to not making super-huge monolithic chips on a single expensive node.
Lazy? I beg to differ. AMD is pushing the advance of silicon on the most fundamental level, rather than pushing overmarketed software updates with a buy-now button.
Posted on Reply
#45
3valatzy
Vayra86: Greed also lies with consumers, expecting to eternally 'get more for less'.
Much less so than with the greedy corporations. Getting more for less is exactly the reason why we see progress and optical shrinks.
I guess you don't want to still be on a 90 nm GPU with a 720p screen?!
Posted on Reply
#46
Macro Device
Vayra86: Greed also lies with consumers, expecting to eternally 'get more for less'.
Not very relevant.

If NV is selling their lowest-tier GPUs for $200, it means they could likely get away with selling their higher-tier GPUs for about 500 USD and still turn a profit. Yes, I'm 99% sure they spend less than 500 bucks on manufacturing, distributing, and advertising a single 4090. The companies won't die because of our "greed." They will be fine.

The average consumer, though, is not a giga-billion "we can buy anybody" corporation. Paying 1,000+ USD for a gaming GPU is wild, and it will stay wild.
Posted on Reply
#47
Vayra86
3valatzy: Much less so than with the greedy corporations. Getting more for less is exactly the reason why we see progress and optical shrinks.
I guess you don't want to still be on a 90 nm GPU with a 720p screen?!
I did say "also". The entire marketplace is a game of supply and demand. We're all in it.

It's an illusion that we can always get more for less. There are limits to everything.
Beginner Micro Device: If NV is selling their lowest-tier GPUs for $200, it means they could likely get away with selling their higher-tier GPUs for about 500 USD and still turn a profit. Yes, I'm 99% sure they spend less than 500 bucks on manufacturing, distributing, and advertising a single 4090. The companies won't die because of our "greed." They will be fine.
Companies also wouldn't even exist if it weren't for our greed. It's getting nearly philosophical here, but really... we're the architects of our own trouble. The overwhelming majority of gaming and GPU development for gaming is pure luxury that we keep spending more money on.
Posted on Reply
#48
Denver
Beginner Micro Device: Just a month or two, maybe... Other than that, the 4090 stays very close to its MSRP. We're talking almost 1.5 years of lifespan at this point. You cherry-picked 10% to "prove" what's false either way. It was less than a week ago that the 4090 was doubling the 4080's price. Today, it's $1200 vs. $1800.

It's just that I can count. Your average FPS figure is all over the place because more than FIFTY percent of the tested games are not GPU-intensive enough for the 4090 to show its full potential over the 4080, even at 4K. If you take a peek at the said "half-decade-old beta game," you'll also notice it runs 125% as fast on a 4090 as on a 4080 without RT (compared to a 35% advantage with RT enabled). The GPU isn't 100% loaded. It's like saying one Bugatti is virtually the same as another Bugatti because you didn't compare them in their intended usage, but rather calculated average speed on tarmac, broken tarmac, off-road, ice, snow, rubbish, Mars, Venus, etc. Of course these zeroes from invalid tests will skew the difference between the 2.0L and 3.0L models.

Plus, there are people willing to buy a GPU for professional usage, and whoops, the 4080 only has 66% of the 4090's RAM. That makes its horsepower effectively pointless for such buyers. And these, mind you, are not a below-0.1% minority you can easily ignore.

I keep trashing greedy companies, namely NV and AMD, for selling dog vomit for ludicrous amounts of money. I don't want the 4C8T-vs-Bulldozer era to happen on the GPU market. That's why the second player, AMD, has to make something happen. If it weren't for Ryzen, we'd probably still be buying 4C8T i7s for 300-ish quid. RDNA is just GCN on 'roids. It could have been good enough a dozen years ago, but now..?

Like, yeah, gaming is much more accessible than it was a while ago, especially compared to 2021 and 2022, but these extra-tiny PCBs with so much simplification don't deserve to be sold for that money. The 4060 series is a joke. The 4070 series is a rip-off. The 4080 is just WHY. And I can't say I'm happy with what the competition is doing, because I know from my own experience what it's like to do a whole lot of nothing.
You show me a game running horribly as an argument, and I'm cherry-picking?

Calling it cherry-picking is a joke. This is an average of well-known AAA games; if you take out the buggy or lightweight games and include the most-played AAA games today, AMD tends to be closer to NVIDIA, not further away.

The analogy comparing normal cars to luxury cars is apt. While you can freely use a regular car on the streets, a luxury car often ends up sitting in a garage, unused and collecting dust. Plus, you can't fully utilize its potential since it's not as readily accessible.
Essentially, you end up taking it for an occasional slow stroll around the streets just for the sake of showing off. The main difference is that the luxury car has enormous potential to run faster than 300 km/h, but the 4090 does not; it will be brought to its knees by full RT.

People complain about high prices but keep throwing money at companies with 60% margins, saying it's the fault of the other company, which they continually ignored out of pure alienation over the holy RT. It is not sustainable to expect any company to sell products at a loss, with production costs escalating since TSMC's rise to the throne and the idolatry of RT.
Posted on Reply
#49
nguyen
Outback Bronze: Well, it was a toss-up between the 4080, 7900 XT, and XTX for me, coming from a 6800 XT. At the time, the 4090s were over 50% more than a 4080 down under. Picked up my 4080 for $1749 AUD BNIB. 4090s started from $2749.

If these recent price drops had been around when I was looking to purchase, I'm not so sure I would have swung the way of the 4080.

Now, in saying that, the 4090 is a hard sell @ $1600 when the 4080 and now the 4080S are $999, once these price drops come to fruition.

It's really going to suck if these prices manage to stick or get worse for their appropriate tier cards in future releases.
With TSMC as the only manufacturer making GPUs for NVIDIA/AMD/Intel, I doubt prices will get better with next-gen releases.

Maybe AMD or Intel should use Samsung's foundry and make cheap & affordable GPUs.
Posted on Reply
#50
Daven
The 4090 has been hovering around $2000 for a while. The 4080 SUPER is around $1000. So yes, the 4090 is double the price for 20% more performance.

And did that 3valatzy guy really call enabling RT "settings" that make AMD dangerously slower?
Posted on Reply