Tuesday, March 7th 2023

AMD Radeon RX 7900 XT Now Starts at $800 in Direct Clash to RTX 4070 Ti

Prices of the AMD Radeon RX 7900 XT RDNA3 graphics card are on a downward slope, with the card now starting at $800 at US computer hardware retailer Newegg. The ASRock RX 7900 XT Phantom Gaming, a custom-design graphics card, has been holding at $799.99 for roughly a week now, while the next cheapest card, an XFX co-branded AMD reference graphics card, is going for $839.99 on the site. These prices put the RX 7900 XT in a direct clash with the NVIDIA GeForce RTX 4070 Ti. The RX 7900 XT was launched with an AMD MSRP of $899.99, with a performance level that compelled NVIDIA to re-position the RTX 4070 Ti (originally announced as the $900 RTX 4080 12 GB) to $800. In our testing, the RX 7900 XT is about 5% faster than the RTX 4070 Ti in conventional raster 3D graphics, but with ray tracing performance comparable to the previous-generation RTX 3080 Ti.
Source: VideoCardz

87 Comments on AMD Radeon RX 7900 XT Now Starts at $800 in Direct Clash to RTX 4070 Ti

#51
kapone32
Avro ArrowI don't believe that the "free cheques" had anything to do with it. The demand for uber-expensive video cards arose because of the silicon shortage that was exacerbated by the Ethereum boom. People had no choice because of what was going on. Make no mistake, people have been overpaying for video cards for well over a decade. How do you think nVidia got so rich?
This narrative has gotten so old it is almost as bad as the argument that crypto mining wasn't as big a driver of Nvidia sales as people claim, even though Nvidia sold most of their cards to miners from 2019 to mid-2022.
Posted on Reply
#52
wheresmycar
kanecvrMy budget is 500 - that's all I'm willing to spend on a gpu.
Right on, brother!! I'm in a similar boat: not willing to spill over a set budget, a budget I already consider way too much for a graphics card. Not gonna settle for anything below a 30% performance upgrade... 50% might have me sold, but I want more for the money.
Posted on Reply
#53
medi01
fevgatosall the extra features
That's a generous way to refer to "a couple of weeks of frame interpolation"... :D
Posted on Reply
#54
Avro Arrow
dj-electricI really wish AMD would just do it the regular way with an announcement. Price adjustments in the greedy GPU market have become such a pain lately.
The second most disappointing thing in GPU pricing, right after the initial MSRPs, is this kind of quiet response to competition and demand. Weak.
The way that most people buy video cards, I'm actually surprised that AMD hasn't thrown in the towel and that there is any competition at all.
ratirtYou can say they did or they did not. In both cases you can justify your opinion.
Sold enough: that is why the price dropped.
Did not sell enough: that is why the price dropped.
Both answers make sense.
I'd focus rather on the fact that the price dropped, which means better value.
I agree with you but remember that "better" doesn't necessarily mean "good". This is a perfect case-in-point.
TumbleGeorgeI think the price of the RX 7900 XT will decrease more after the RTX 4070 arrives.
We can hope.
mamaGood move. This should move a few more units.
Yeah, but it's not enough to make a real difference. The card is still about $200 too expensive.
Outback BronzeHopefully the XTX, 4080 & 90 drop soon too : )
I honestly don't care about the 4080 and 4090 prices, but I completely agree with you about the XTX. Pricing has no effect on nVidia sales; people who buy nVidia do so because value isn't high on their list of priorities (although some are just ignorant and aren't familiar with Radeon).
Broken ProcessorPC gamers are giving up; Gumtree is full of gaming monitors, and it's always the same response: prices are too high, so they're selling up. Sad to see.
Not really... I mean, it's sad that we're in this situation to begin with but I actually find the fact that nobody is buying them to be quite heartening. The fact that these greedy corporations aren't succeeding in their attempt to completely fleece consumers is a definite silver lining because it means that this situation won't continue. Ultimately, the power in any marketplace belongs to the consumers as long as they're willing to dangle that money-coloured carrot far enough away from the corporations to make them actually work for it.

The key to power in the marketplace is refusing to buy something unless it's a good value. Otherwise, you're just pissing your money away (which is what the corporations want).
nguyenHow the turntables when Nvidia pricing forced AMD to drop price, instead of the other way around :rolleyes:
Yeah but that's because nVidia customers don't care much about value. If they did, they wouldn't be buying nVidia to begin with. :laugh:
AssimilatorDLSS is objectively superior quality and constantly being improved, FSR is lower quality and generally only receives an update every time AMD releases a new architecture.
Objectively, sure, I agree with you. The thing is that any gaming experience is completely subjective. Comparing DLSS with FSR requires painstaking attention to details that nobody looks at when gaming (it's kinda like ray-tracing that way). It's a case of splitting hairs at this point.

If they both look "good" then nobody's going to lament having either of them. My gaming experience has nothing to do with the tiniest graphics details in a given scene because I don't look at them and I'd be willing to bet that I'm in the majority when it comes to that.

If I'm playing a good game and my rig gives me a smooth and responsive experience, I couldn't care less about what each individual leaf on a tree looks like. This means that to actually care about whether you're using DLSS or FSR would mean that you have a pretty bad case of OCD and probably wouldn't be able to enjoy gaming for one "flaw" or another anyway.

While I would agree that DLSS is objectively "better", I wouldn't agree that it's better in any meaningful way. Comparing the two is like the old Mustang vs. Camaro argument and every bit as futile.
AssimilatorThe 4070 Ti has very little problem with 4K currently, there is zero reason to expect it will in future. But this argument is irrelevant anyway, because both these GPUs will be obsolete long before 4K becomes the de facto resolution.
At the moment, you're correct. When it comes to your assumption about the future, that's nothing more than hopeful conjecture. You could be right but you could also be wrong. To me, my R9 Fury became a cautionary tale because even though its GPU is more potent than that of an RX 580, an RX 580 with 8GB can easily outperform it in games where 4GB is a hindrance, even at 1080p.

One day, my RX 6800 XT will be no longer usable for gaming but... it won't be because it doesn't have enough VRAM. The RTX 3080 will be a completely different story however.
DavenI bought the Merc edition 7900xt at $900 about two months ago. It would have been nice to save $100 but GPU prices are so fickle you can never tell which way they are gonna go.
Well, at least you got a Merc for MSRP at the time. That's a bit of a bonus, eh? I guess XFX was feeling generous. :laugh:
DavenAn even crazier comparison is the 7900xt versus the 4080. For a 50% higher price on the 4080, you get a little over 10% more raster performance and about the same power consumption.
Yeah but you can't look at it through rational eyes. Rationally, none of these cards are worth it but people still buy them. I personally wouldn't touch any of these cards at their current prices. If I were in the market for a card, the RX 7900 XT would have to be no more than $600USD for me to even consider it. As for nVidia, they're never a good value so I don't even bother looking at GeForce cards.
AusWolfAwesome news, but I wonder where the actually affordable RDNA 3 cards are. AMD seems to be pulling an Nvidia by running the 7900 series and RDNA 2 in tandem.
I've been asking the same thing for over a month. AMD starts talking about RDNA4 and I'm like "Hey, you realise that it's almost Q2 of 2023 and you STILL only have 2 RDNA3 cards out!".
Posted on Reply
#55
Space Lynx
Astronaut
Avro ArrowI don't believe that the "free cheques" had anything to do with it. The demand for uber-expensive video cards arose because of the silicon shortage that was exacerbated by the Ethereum boom. People had no choice because of what was going on. Make no mistake, people have been overpaying for video cards for well over a decade. How do you think nVidia got so rich?


Yeah but Amazon engages in slave labour and that's an order of magnitude worse. I refuse to support that company because sooner or later, we'll all be forced to work like robots without so much as a piss-break. No thank-you, Jeff Bezos can suck it and I'll take my chances with Newegg.

Besides, after the Gamers Nexus exposé, I'm pretty sure that Newegg cleaned up their act.


I honestly don't believe that they care anymore, and I don't really blame them. Imagine you had a history of offering much better prices on video cards and the ignorant masses just bought nVidia regardless of price.

How long would it be before you decided that it wasn't worth offering ungrateful consumers better prices if they weren't going to buy your card anyway? To their credit, AMD lasted a lot longer than I would have. Their "bad press" changes nothing.
Eh, be careful what you read in the news.

I know a guy who has worked at an Amazon warehouse for 15 years; we've been friends since high school, and he loves working there. He makes damn good money these days, still doing the same warehouse floor labor job, with really good health insurance too. Heh, but yeah, believe what you will. All companies have a bit of good and bad here and there.
Posted on Reply
#56
Guwapo77
kapone32Your 6900XT is just fine for every game right now.
Already having to turn down crap in 1440p tho...
Posted on Reply
#57
AusWolf
Guwapo77Already having to turn down crap in 1440p tho...
Just out of curiosity, how many hundred FPS is your target?
Posted on Reply
#58
Zareek
80-watt HamsterThe optics are irrelevant. Your hypothetical average person will buy the highest-numbered Nvidia card they can afford and go on with their lives, not caring whether AMD even exists.
Maybe where you live, the average person knows who Nvidia or AMD are. Where I live, the average person has no clue. They ask their one geeky friend or relative and go from there. If they don't have said friend or relative, they simply buy the cheapest or whatever one has the best box, the best marketing mumbo jumbo or whatever the retailer is pushing that month. My son is a great example. He is in his early twenties and games way more than is healthy. He has zero clue about what hardware he is playing his games on, and has no interest in learning. Most of his friends are the same. They aren't all dreaming about 500FPS at 4K. Whatever the games start at with default settings, they play it like that. If the game they want to play doesn't run, maybe they'll upgrade, or maybe they'll just play it on Xbox or PlayStation instead. They just want to play games!

The same goes for the post you replied to: the model numbers mean squat to most people. They usually assume more = better. If they can get a 7900XT for less than a 4070 Ti, they think they are getting a deal because 7900 is a much bigger number than 4070. Hell, if there is an Intel video card in the same group, they might go with that since they've heard of Intel before; it used to be on TV all the time.
Posted on Reply
#59
Avro Arrow
Space LynxEh, be careful what you read in the news.

I know a guy who has worked at an Amazon warehouse for 15 years; we've been friends since high school, and he loves working there. He makes damn good money these days, still doing the same warehouse floor labor job, with really good health insurance too. Heh, but yeah, believe what you will. All companies have a bit of good and bad here and there.
When I see Bernie Sanders getting involved, I trust it. Sure, maybe the warehouse that your friend works at treats workers fairly, but it all depends on what laws the warehouse has to follow. I've seen more than enough evidence to believe what people are saying.
ZareekMaybe where you live, the average person knows who Nvidia or AMD are. Where I live, the average person has no clue. They ask their one geeky friend or relative and go from there. If they don't have said friend or relative, they simply buy the cheapest or whatever one has the best box, the best marketing mumbo jumbo or whatever the retailer is pushing that month. My son is a great example. He is in his early twenties and games way more than is healthy. He has zero clue about what hardware he is playing his games on, and has no interest in learning. Most of his friends are the same. They aren't all dreaming about 500FPS at 4K. Whatever the games start at with default settings, they play it like that. If the game they want to play doesn't run, maybe they'll upgrade, or maybe they'll just play it on Xbox or PlayStation instead. They just want to play games!
I think that most people are like that.
ZareekThe same goes for the post you replied to: the model numbers mean squat to most people. They usually assume more = better. If they can get a 7900XT for less than a 4070 Ti, they think they are getting a deal because 7900 is a much bigger number than 4070. Hell, if there is an Intel video card in the same group, they might go with that since they've heard of Intel before; it used to be on TV all the time.
Oh man, do I have a funny story for you! I was in Canada Computers and these people were trying to decide which video card to buy. As is usual, the guy behind the counter was some teenager with a completely useless amount of actual PC knowledge. These people said that they had a budget of $1000 and this kid was trying to push an RTX 4080 on them. I asked him if he could quickly grab me a 2TB Lexar NVMe while they were thinking (because it was taking forever). I offered my assistance to them and when they told me what they wanted and had $1000, I told them that the best value in that price category would be a Radeon, not a GeForce (which is objectively true).

The girl said "But my CPU is Intel."
I said "So?"
She said "Doesn't that mean I need nVidia?"
I cracked up (but not in a demeaning way) and said "No, that has nothing to do with it. You can use any PC video card with any CPU. Intel and nVidia aren't exactly allies."
She said "Ohhh, no, I didn't think that Intel and nVidia were allies, I just thought that you had to have an AMD CPU to use a Radeon card because it says AMD on the box."
(This actually made sense because I bet a lot of people think that.)
I said "Oh, no, that's not there because it can only be used with AMD, it's because AMD owns ATi, the producer of Radeon cards. A Radeon is, in effect, a video card made by AMD. They make both CPUS and video cards. You'd be just fine with it, I promise."

It's things like this that make me think AMD was insane to retire the ATi brand. The confusion that they have probably caused a lot of consumers has probably resulted in a lot of lost Radeon sales. It's like, what positives did they expect to occur from retiring one of the most well-known video card brands in existence and replacing it with a brand that is known for CPUs? It just makes no sense.
Posted on Reply
#60
Zareek
Avro ArrowIt's things like this that make me think AMD was insane to retire the ATi brand. The confusion that they have probably caused a lot of consumers has probably resulted in a lot of lost Radeon sales. It's like, what positives did they expect to occur from retiring one of the most well-known video card brands in existence and replacing it with a brand that is known for CPUs? It just makes no sense.
I agree 1000%: the ATI name was a massive asset, and they threw it down the toilet. I never understood that move; they gained nothing from it and, I agree, probably lost a ton of sales.
Posted on Reply
#61
Avro Arrow
ZareekI agree 1000%: the ATI name was a massive asset, and they threw it down the toilet. I never understood that move; they gained nothing from it and, I agree, probably lost a ton of sales.
It's almost like corporations get an ego all their own. They want THEIR name on everything but that's not always the best tactic to have. The Royal Dutch Shell Company is a good example of not rocking the boat. They own the Shell name as well as Pennzoil and Quaker-State. It's all the same oil in different bottles but they understood that consumers have their favourites for whatever reason and continued with those brand names. Sure, the names have become objectively meaningless but they're not meaningless to people who are accustomed to buying them. Then you have Chevron who bought out Texaco and re-branded "Texaco Havoline" as "Chevron Havoline". Well, that failed miserably because for whatever reason, people wanted to see that T-Star logo, not the shield on their Havoline bottles.

ATi was the same thing and I have always thought that AMD should bring the ATi name back because I'm sure that "ATi Radeon" would still draw a lot more people than "AMD Radeon" ever did.
Posted on Reply
#62
Guwapo77
AusWolfJust out of curiosity, how many hundred FPS is your target?
You think the 6900XT is more powerful than it is apparently. I'm simply looking for max settings at or above 120FPS. I'm not asking for much bro.
Posted on Reply
#63
80-watt Hamster
Guwapo77You think the 6900XT is more powerful than it is apparently. I'm simply looking for max settings at or above 120FPS. I'm not asking for much bro.
Uh, you sure about that?
Posted on Reply
#64
TheoneandonlyMrK
Yeah yeah yeah, ATi great, AMD bad.

As an owner, it's an absolute f08&£$ steal, buy three...

The more you buy, the less you save, but if you buy one you'll have to have saved first.

There are no winners, just shit choices.

Though for me, keeping this card three years then passing it on to a nephew (so not everyone), the VRAM makes this a no contest, even paying the 880 I did. And these OC :)
Posted on Reply
#65
AusWolf
Guwapo77You think the 6900XT is more powerful than it is apparently. I'm simply looking for max settings at or above 120FPS. I'm not asking for much bro.
I guess we have different ideas about what is and isn't much. With my expectations of 40-60 FPS at 1080p, even my 6750 XT is overkill.
Posted on Reply
#66
Guwapo77
80-watt HamsterUh, you sure about that?
I'm pretty got damn positive about that... it's 1440p. 120FPS should be doable across the board, but it isn't. I can deal with 1% lows in the 90fps or above. 90FPS is the minimum at which games feel smooth. I used to play at 60fps and thought that was the gaming apex. If you have a monitor that is 60hz, don't upgrade it because it will "F" your life.
Posted on Reply
#67
ratirt
Guwapo77I'm pretty got damn positive about that... it's 1440p. 120FPS should be doable across the board, but it isn't. I can deal with 1% lows in the 90fps or above. 90FPS is the minimum at which games feel smooth. I used to play at 60fps and thought that was the gaming apex. If you have a monitor that is 60hz, don't upgrade it because it will "F" your life.
I think that depends on the game and whether you want to use RT. With RT, not even a 4090 can manage 60FPS in some games.
1440p at 120 FPS is nothing special for a 6900xt. I play at 4k most of the time and I don't have an issue achieving 120FPS. Obviously, there are games out there where the 6900xt may struggle to sustain 120FPS at all times at 4k, but at 1440p? That would also depend on the game, obviously. I only hope you are not expecting too much from this card, though.
Posted on Reply
#68
Guwapo77
ratirtI think that depends on the game and whether you want to use RT. With RT, not even a 4090 can manage 60FPS in some games.
1440p at 120 FPS is nothing special for a 6900xt. I play at 4k most of the time and I don't have an issue achieving 120FPS. Obviously, there are games out there where the 6900xt may struggle to sustain 120FPS at all times at 4k, but at 1440p? That would also depend on the game, obviously. I only hope you are not expecting too much from this card, though.
I get so tired of this back and forth nonsense...when obviously the games I play can't sustain 120fps at MAX settings. I stated that from the jump. We'll just say my computer configuration sucks, I got a bad 6900XT, my damn FPS counter is broken as it should reflect twice, and my eyes are bad because they can't tell smoothness. Everyone is right and the 6900XT is the perfect gaming card for all scenarios.
Posted on Reply
#69
ratirt
Guwapo77I get so tired of this back and forth nonsense...when obviously the games I play can't sustain 120fps at MAX settings. I stated that from the jump. We'll just say my computer configuration sucks, I got a bad 6900XT, my damn FPS counter is broken as it should reflect twice, and my eyes are bad because they can't tell smoothness. Everyone is right and the 6900XT is the perfect gaming card for all scenarios.
Apparently these games are demanding, and I'm not sure what your problem is. Do you expect more than the card ought to handle? Or was it advertised as being able to sustain 120FPS in your games, but it doesn't?
Maybe you have a CPU bottleneck or something? Did you try comparing your results? Which game are we talking about here?
Posted on Reply
#70
Guwapo77
ratirtApparently these games are demanding, and I'm not sure what your problem is. Do you expect more than the card ought to handle? Or was it advertised as being able to sustain 120FPS in your games, but it doesn't?
Maybe you have a CPU bottleneck or something? Did you try comparing your results? Which game are we talking about here?
I stated facts - I must turn down settings in 1440p. For some odd reason that is too much for you to swallow. Yes, the 6900XT must turn down settings @ 1440p - Fact. Not sure what MY problem is? People like YOU that jump on here and state that on MY system MY 6900XT plays gloriously at 4K. Obviously, our standards for what video game visuals should be are different. My system specs are here for public viewing, as you can see. Regardless of what you or anyone else has to say, it won't change the fact that I have to turn down the visual settings. Simple fact.
Posted on Reply
#71
TheoneandonlyMrK
Guwapo77I stated facts - I must turn down settings in 1440p. For some odd reason that is too much for you to swallow. Yes, the 6900XT must turn down settings @ 1440p - Fact. Not sure what MY problem is? People like YOU that jump on here and state that on MY system MY 6900XT plays gloriously at 4K. Obviously, our standards for what video game visuals should be are different. My system specs are here for public viewing, as you can see. Regardless of what you or anyone else has to say, it won't change the fact that I have to turn down the visual settings. Simple fact.
Well, start a thread, because this isn't about your 6900XT.

Or duck out; you were sick of debating off-topic tat two posts ago.

On a 7900XT I also have to turn some settings down in the odd game at high FPS or high res, but few games, mind, not many.

So does a 4090 owner if he wants 120FPS in some games, unless he uses resolution scaling.

Example: Portal RTX.

Point being, that's SOME games, not all.

Can we get back on topic yet, have we been on topic yet?!?!
Posted on Reply
#72
Wasteland
Guwapo77I stated facts - I must turn down settings in 1440p. For some odd reason that is too much for you to swallow. Yes, the 6900XT must turn down settings @ 1440p - Fact. Not sure what MY problem is? People like YOU that jump on here and state that on MY system MY 6900XT plays gloriously at 4K. Obviously, our standards for what video game visuals should be are different. My system specs are here for public viewing, as you can see. Regardless of what you or anyone else has to say, it won't change the fact that I have to turn down the visual settings. Simple fact.
Everyone acknowledges (or should acknowledge) that the 6900xt can't play absolutely every game in existence, at 1440p native, with full Ultra settings, and at a locked 120 fps. It's self-evident. But that doesn't mean @kapone32 was wrong to argue that the 6900xt is an excellent 1440p card. That was the original point of contention.

In fact, nothing will play every single game at 120+ FPS under the listed conditions. Let's take Cyberpunk as an example. TPU's test results at 1440p native and ultra quality settings, without ray tracing enabled, follow:

[TPU benchmark chart: Cyberpunk 2077, 1440p native, Ultra settings, no RT]

And if you want 120 FPS with max RT, well then, lol:

[TPU benchmark chart: Cyberpunk 2077, 1440p native, Ultra settings, max RT]

These results paint a pretty bleak picture for your use case, until we remember that stock Ultra settings are almost always a bum deal. Not every game has ludicrous performance sinks like AC Odyssey's infamous volumetric clouds, but there are usually at least a couple of settings that you'd have to be mildly insane to max out. A number of outlets, notably Digital Foundry, offer visual bang-for-buck analyses of demanding titles. In the case of CP2077, we have a few options to improve performance while maintaining effectively ultra-quality visuals.

Personally, I played through CP2077 at these "tweaked Ultra" settings (without RT), at 1440p native, with my frame rate locked at 80, on a 6800 non-XT. And I do mean locked at 80: my 0.1% lows were at like 77 FPS. For me, this was a fantastic experience. I understand that you want higher than 80 FPS, but your 6900XT should be at least 20% faster than my card. FSR at its highest Quality setting should take you the rest of the way to >120. Sure, this setup won't give you the premier 2023 gaming experience, but you also can't do much better without spending wildly disproportionate amounts of money. "Diminishing returns" is an understatement.
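
A rough back-of-the-envelope for that claim (the ~30% FSR Quality uplift is an assumption on my part, not a measured figure):

80 FPS (6800) x 1.2 (6900 XT, ~20% faster) ≈ 96 FPS native
96 FPS x ~1.3 (FSR Quality at 1440p) ≈ 125 FPS, comfortably above the 120 FPS target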

So the question here isn't whether you're lying or mistaken about having to dial down settings to maintain 120 FPS. The question is, 'what's your point?' It certainly doesn't make sense to criticize the 6900xt on the basis that it can't achieve an unreasonable performance standard. You characterized this unreasonable standard as "not asking for much, bro," which leads me to wonder whether you're letting frivolous complaints ruin your experience. I'm guilty of doing this on occasion, too. As someone on this forum recently said, "Games are meant to be played, not watched."
Posted on Reply
#73
sLowEnd
If the price of the 7900XT has dropped, I haven't seen the price drop put into effect here in Canada yet.
Posted on Reply
#74
wheresmycar
sLowEndIf the price of the 7900XT has dropped, I haven't seen the price drop put into effect here in Canada yet.
In the UK, the 7900XT at launch was going for around £900-£1050 (depending on the model). Currently we're seeing units for £770-£950. Still not appealing for me, but it's nice to know the price may drop further.
Posted on Reply
#75
kapone32
wheresmycarIn the UK, the 7900XT at launch was going for around £900-£1050 (depending on the model). Currently we're seeing units for £770-£950. Still not appealing for me, but it's nice to know the price may drop further.
That price sounds nice, but like sLowEnd said, in Canada there has been no movement on pricing for any high-end card. It actually makes the 7900XT more appealing, as it can be had for a few dollars more than a 6800XT in Canada.

www.newegg.ca/gigabyte-radeon-rx-6800-xt-gv-r68xtgaming-oc-16gd/p/N82E16814932381?Description=6800XT&cm_re=6800XT-_-14-932-381-_-Product

www.newegg.ca/asrock-radeon-rx-7900-xt-rx7900xt-pg-20go/p/N82E16814930083?Description=7900XT&cm_re=7900XT-_-14-930-083-_-Product



There is 1 ASRock, 1 MSI and 1 Sapphire at that price on Newegg. The PowerColor card is about $40 cheaper, but that card has been maligned for fan noise (based on user reviews). I have said this before, but because we have a distributor network for PC parts in Canada, both the retailer and the customer pay the price. The best-priced GPU I have seen in the last 6 months in Canada was an Asus Dual (budget brand) 6600 for $269.99 at Canada Computers. But now that card has sold so well that on clearance it is $479.

www.canadacomputers.com/product_info.php?cPath=43_557_558&item_id=206490

But I did find a 7900XT for less than I paid as Canada Computers has my card for $1249.

www.canadacomputers.com/product_info.php?cPath=43_557_558&item_id=235518

On a completely different tangent, AMD CPUs are currently priced to sell. I am looking at a 7600X for $319.99.

www.canadacomputers.com/product_info.php?cPath=4_64_5443&item_id=227006

With this board it could be sweet.

www.canadacomputers.com/product_info.php?cPath=26_1832_5654&item_id=229346
Posted on Reply