Tuesday, October 18th 2022

AMD Radeon RX 6900 XT Price Cut Further, Now Starts at $669

In the run-up to the November 3 reveal of the next-generation RDNA3 architecture, and with the roughly 43% faster RTX 4090 eating away at its appeal to the enthusiast crowd, the AMD Radeon RX 6900 XT has received another round of price cuts and can now be had for as low as $669. Prices are down on both sides of the big pond, with European retailers listing it for as low as 699€. Although not technically AMD's flagship graphics card (that title goes to the RX 6950 XT, which starts at $869), the RX 6900 XT is a formidable 4K gaming graphics card, and at its new price it offers roughly 35% higher performance-per-dollar than the RTX 4090. AMD's previous round of official price cuts came around mid-September, as the company braced for the RTX 4090 "Ada."
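As a rough sketch of how a performance-per-dollar comparison like this is put together: the figures below are illustrative only (the relative-performance value echoes the "43% faster" mentioned above, while the RTX 4090 price is a placeholder assumption, not TPU's measured average or a street price), so the resulting margin depends entirely on which numbers are plugged in.

```python
# Illustrative performance-per-dollar comparison.
# rel_perf values and the RTX 4090 price are placeholder assumptions,
# not TPU's measured averages or current street prices.
cards = {
    "RX 6900 XT": {"price": 669, "rel_perf": 100},
    "RTX 4090":   {"price": 1599, "rel_perf": 143},
}

for name, card in cards.items():
    card["perf_per_dollar"] = card["rel_perf"] / card["price"]
    print(f"{name}: {card['perf_per_dollar']:.3f} performance points per dollar")
```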
Source: VideoCardz

131 Comments on AMD Radeon RX 6900 XT Price Cut Further, Now Starts at $669

#101
Figus
Chrispy_Additionally, it can't do DLAA, DLSS has better game support at the moment than FSR, and then there's NVENC which is vastly superior. If you do anything other than gaming you'll also appreciate CUDA support as so many applications support CUDA but not OpenCL.
Actually, FSR (or rather, RSR) can be enabled in all DX11 and DX12 games directly from the AMD driver panel, so DLSS doesn't have better game support; at most, DLSS 2 has better quality than RSR (which is FSR 1).
Posted on Reply
#102
Chrispy_
N3M3515I definitely would like to see a review by tpu of that gpu
Me too but it's a niche product with limited availability in only a few regions. As far as I know only Powercolor and Sapphire have models, just a single model each.
N3M3515lol i had a XFX 6800 GT XXX Edition, that's a lot of X's
Not as many as the XFX X1950 XTX XXX Edition though. I genuinely bought those for CF rigs.

Not XFX models, mind you. Just X1950 XTX cards.
Posted on Reply
#103
Super Firm Tofu
Chrispy_Me too but it's a niche product with limited availability in only a few regions. As far as I know only Powercolor and Sapphire have models, just a single model each.
XFX do as well

Posted on Reply
#104
Chrispy_
Cool, there may be others, they're just pretty rare here in Europe.
I bought the Sapphire Pulse just to see what the fuss was about.
Posted on Reply
#105
Super Firm Tofu
Chrispy_Cool, there may be others, they're just pretty rare here in Europe.
I bought the Sapphire Pulse just to see what the fuss was about.
Similar here. Only the Sapphire listed in addition to the XFX at Newegg.

How is power consumption? In that respect, I preferred the power/performance ratio of the 6800 to the 6700/6750 XT. The 67x0 XT seems pushed way past its efficiency range.
Posted on Reply
#106
DemonicRyzen666
Legacy-ZAWell, you see, just because a game has RT, doesn't mean people want to play them. :)
Why does it have to be games everyone else wants to play?
Posted on Reply
#107
Ruru
S.T.A.R.S.
The cheapest 6900 XTs in Finland are a little under 900EUR, though our prices include VAT.
Posted on Reply
#108
InVasMani
N3M3515A 6800 XT @$550 is a far better buy than any of those :D
Yeah, but that's a goddamn rip-off compared to ARF's $400 RX 6900 XT suggestion. :laugh:

Dear AMD, keep trying; we've set the bar at $400 for RX 6000 XT cards,

Sincerely, TPU vermin.

Ok, so $400 for an RX 6900 XT, so what's an RTX 3080 Ti supposed to cost, tree hunnie n fiddy!!? Ante up, Jensen!! UPSCALE this sale!!!
Posted on Reply
#109
dragontamer5788
Chrispy_Cool, there may be others, they're just pretty rare here in Europe.
I bought the Sapphire Pulse just to see what the fuss was about.
Has your group looked into the Intel Arc A770 at all?

The raytracing performance in Blender... at least the public benchmarks for Blender... have impressed me, especially for only $350.

I'd imagine that AMD GPUs are the worst for your group, because AMD GPUs have the worst raytracing performance out of all three manufacturers (even if we "magic" the software into place, the AMD 6900 XT will never raytrace as quickly as NVidia or Intel counterparts. AMD simply put less raytracing hardware in these cards...) I dunno how many graphics packages are programmed to use Intel Arc yet though.
Posted on Reply
#110
Legacy-ZA
DemonicRyzen666Why does it have to be games everyone else wants to play?
Yeah, no. I am an old guy; I gave you the answer. If you want to know how I got to it, well, logic and research will get you there.
Posted on Reply
#111
DemonicRyzen666
Legacy-ZAYeah, no. I am an old guy; I gave you the answer. If you want to know how I got to it, well, logic and research will get you there.
No you didn't.
There are much better games out there than what the mob of gamers hype up and play.
Posted on Reply
#112
Chrispy_
dragontamer5788Has your group looked into the Intel Arc A770 at all?

The raytracing performance in Blender... at least the public benchmarks for Blender... have impressed me, especially for only $350.

I'd imagine that AMD GPUs are the worst for your group, because AMD GPUs have the worst raytracing performance out of all three manufacturers (even if we "magic" the software into place, the AMD 6900 XT will never raytrace as quickly as NVidia or Intel counterparts. AMD simply put less raytracing hardware in these cards...) I dunno how many graphics packages are programmed to use Intel Arc yet though.
Nope. From reading the reviews, the performance in certain things may be there, but the driver is still way too immature to give someone one of these for production work. Until there are numerous reliable sources saying that basic functionality for vital things like display detection, multi-monitor, refresh rate, colour depth, vsync, etc. is now working, I'm not touching it with a ten-foot pole.

I want Intel to succeed, but I also haven't seen a SINGLE review by anyone anywhere that says it's actually usable as a daily driver. It's an interesting first attempt that currently requires you to spend as much time troubleshooting as you do using it. Until the general consensus says otherwise, it's not even worth my interest.

We went through a phase of almost exclusively AMD GPUs because GCN was fantastic and AMD were typically putting way more VRAM on their cards than Nvidia. Turing was the tipping point because so many plugins started leveraging RTX. Not DXR, but Nvidia's proprietary RTX.
Super Firm TofuSimilar here. Only the Sapphire listed in addition to the XFX at Newegg.

How is power consumption? In that respect, I preferred the power/performance ratio of the 6800 to the 6700/6750 XT. The 67x0 XT seems pushed way past its efficiency range.
At stock settings, it's around the same efficiency as a 6600 XT - about 10% faster for about 10% more power draw.

I don't have a 6600 XT, but this particular 6700 undervolts like a champ - I can drop the core voltage by 200 mV and still basically hold the same clocks. That's about a 30% reduction in power draw without any real loss. I guess the BIOS and driver see Navi 22 and go "oh hey, 1.15 V to reach 2.6 GHz," except the 6700 maxes out at 2.4 GHz boost and can do that just fine on 900 mV, so I have no idea why it keeps adding voltage to the boost bins when it's not clocking any higher (probably a driver limitation to stop it cannibalising 6750 XT sales).
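For what it's worth, a back-of-the-envelope sketch (assuming dynamic power scales roughly with V² × f, and using the 1.15 V and 0.9 V figures quoted above; everything else here is my own assumption) lands in the same ballpark:

```python
# Back-of-the-envelope check: dynamic power scales roughly with V^2 * f.
# Voltages taken from the post above; the model itself is an assumption.
v_stock = 1.15       # V, what the driver asks for at the top boost bin
v_undervolt = 0.90   # V, still holds the 2.4 GHz boost clock

core_power_ratio = (v_undervolt / v_stock) ** 2   # clock unchanged, so f cancels out
print(f"Estimated core dynamic power vs. stock: {core_power_ratio:.0%}")  # ~61%
```

The naive model suggests roughly a 39% cut in core dynamic power, but total board power also includes memory, fans and VRM losses that don't scale with core voltage, so a measured drop of around 30% is plausible.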
Posted on Reply
#113
AusWolf
dragontamer5788Has your group looked into the Intel Arc A770 at all?

The raytracing performance in Blender... at least the public benchmarks for Blender... have impressed me, especially for only $350.

I'd imagine that AMD GPUs are the worst for your group, because AMD GPUs have the worst raytracing performance out of all three manufacturers (even if we "magic" the software into place, the AMD 6900 XT will never raytrace as quickly as NVidia or Intel counterparts. AMD simply put less raytracing hardware in these cards...) I dunno how many graphics packages are programmed to use Intel Arc yet though.
I don't know what you mean by "your group", but I was pretty enthusiastic about getting one until I realised that it's only available at one single UK store for £400 on pre-order. I love playing with new and weird tech, but paying that much for a graphics card that is not only weird beyond measure, but also not even here yet (pre-order), is a bit too steep even for me.
Posted on Reply
#114
NinjaQuick
InVasManiYeah, but that's a goddamn rip-off compared to ARF's $400 RX 6900 XT suggestion. :laugh:

Dear AMD, keep trying; we've set the bar at $400 for RX 6000 XT cards,

Sincerely, TPU vermin.

Ok, so $400 for an RX 6900 XT, so what's an RTX 3080 Ti supposed to cost, tree hunnie n fiddy!!? Ante up, Jensen!! UPSCALE this sale!!!
I mean, I just eBay'd a 6900 XT for $500...
Posted on Reply
#115
InVasMani
That just sucks the life out of breaking the box seal... and I didn't set those terms of agreement, ARF did, but to hell with it, I agree $400 sounds better than moar dollahs.
Posted on Reply
#116
RandallFlagg
Super Firm TofuXFX do as well

I had an intense interest in that card at one point, but what little info is out there on it clearly implied that the card performs far below what its specs suggest. In some cases it was even with a 6600 XT, mostly about 10% faster, which would put it about even with a 6650 XT.

I had to conclude that it is in some way or another limited at a firmware level. Not surprising, given that internally it is using a laptop GPU.

So it's priced about right for that level of performance, given that a 6650 XT can be had for a similar amount, but it adds nothing else.

If you're looking for something near 6700 XT performance because of the name though (like I was), you're not going to find it in that card.
Posted on Reply
#117
HenrySomeone
InVasManiOk, so $400 for an RX 6900 XT, so what's an RTX 3080 Ti supposed to cost, tree hunnie n fiddy!!? Ante up, Jensen!! UPSCALE this sale!!!
In an ideal world, perhaps... I remember you could get a couple of 980 Tis (new) for under $400 after the 1070 launch. Of course, even then it wasn't the greatest deal compared to the latter, which was similarly priced, performed about the same (although with both OCed to the max, the 980 Ti usually won, at least back then) and used much less power; the main attraction was for people who already had one to use in one of the last hurrahs of SLI.
Posted on Reply
#118
Vayra86
DemonicRyzen666Why does it have to be games everyone else wants to play?
Same underlying reason VR isn't taking off properly.

No killer app, no market success. It is that simple. You can easily do without RT in the games where it is implemented well. And in most games the implementation is not 'well', but rather, quite horrible and pointless, plus a major performance hog. So you could say, 'I love seeing the technology' but when the game you see it in performs like crap and isn't quite fun, what's the point? People generally want more out of their gaming.
Posted on Reply
#119
Chrispy_
Vayra86You can easily do without RT in the games where it is implemented well. And in most games the implementation is not 'well', but rather, quite horrible and pointless, plus a major performance hog. So you could say, 'I love seeing the technology' but when the game you see it in performs like crap and isn't quite fun, what's the point? People generally want more out of their gaming.
Yup. I bought one of the first launched RTX cards 4 years ago and have had several more since then as I tend to cycle through hardware opportunistically.

To date, I have never chosen to play with raytracing enabled, the one exception being Metro Exodus EE, which is a fairly light implementation of raytraced GI that you can't disable anyway. From the 2060 FE to the 3090, I've never found the performance hit of raytracing worth it, and I dislike the soft-focus fuzziness of DLSS, which RTX cards always use as a crutch to mask the massive performance drop with raytracing.

Sure, it's an option - but my personal preference has always been to run at higher native resolution and higher framerate. The image quality benefits far more from increased resolution than it does from rendering reflections and shadows in a different way. CP 2077, for example, is a choice between 4K60 native without raytracing or DLSS Balanced with raytracing. Yes, the raytracing looks marginally better, but DLSS Balanced is nowhere close to the actual detail of native. It's 1080p vs 4K.
Posted on Reply
#120
JAB Creations
LFaWolfHow did you get 31% from either of the graphs?
On the RTX 3090 Ti page the 4090 is listed as 41% faster.
On the RTX 4090 page the 3090 Ti is listed as 29% slower.

Since Nvidia is anti-consumer I don't follow them closely so I don't know how TPU screwed that up.
Posted on Reply
#121
neatfeatguy
JAB CreationsOn the RTX 3090 Ti page the 4090 is listed as 41% faster.
On the RTX 4090 page the 3090 Ti is listed as 29% slower.

Since Nvidia is anti-consumer I don't follow them closely so I don't know how TPU screwed that up.
We'll say the 3090 Ti does 100 fps.

If the 4090 is 41% faster, then it does 141 fps.

The 4090 does 141 fps. If the 3090 Ti is 29% slower, then the 3090 Ti gets 100 fps.
(141 * 0.71 = 100.11 fps)
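For anyone who wants to check those round numbers, a quick sketch (using the same illustrative 100 fps baseline, not TPU's actual data) shows the two figures describe the same gap measured from different baselines:

```python
# Sanity check of the "41% faster" vs "29% slower" figures,
# using the illustrative 100 fps baseline from the post above.
fps_3090ti = 100.0
fps_4090 = fps_3090ti * 1.41                      # 41% faster -> 141 fps

faster_pct = (fps_4090 / fps_3090ti - 1) * 100    # 4090 measured against the 3090 Ti
slower_pct = (1 - fps_3090ti / fps_4090) * 100    # 3090 Ti measured against the 4090

print(f"4090 is {faster_pct:.0f}% faster")        # 41% faster
print(f"3090 Ti is {slower_pct:.0f}% slower")     # 29% slower
```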
Posted on Reply
#122
AsRock
TPU addict
Yeah, I nearly got one; it was on Amazon for $700, but they said that although it was in stock I had to wait 10 days for delivery, so I cancelled it. Last time I checked it had gone up to $900, and now the XFX 6900 XT isn't for sale from Amazon at all.
Posted on Reply
#123
LFaWolf
JAB CreationsOn the RTX 3090 Ti page the 4090 is listed as 41% faster.
On the RTX 4090 page the 3090 Ti is listed as 29% slower.

Since Nvidia is anti-consumer I don't follow them closely so I don't know how TPU screwed that up.
On the RTX 3090 Ti page the 4090 is listed as 41% faster.
---> This is correct.

On the RTX 4090 page the 3090 Ti is listed as 29% slower.
---> This is wrong. This is not how the math is calculated. The correct calculation is (100-71)/71 = 0.408 so round up to about 41%
Posted on Reply
#124
Chrispy_
LFaWolfOn the RTX 3090 Ti page the 4090 is listed as 41% faster.
---> This is correct.

On the RTX 4090 page the 3090 Ti is listed as 29% slower.
---> This is wrong. This is not how the math is calculated. The correct calculation is (100-71)/71 = 0.408 so round up to about 41%
Yep. People can't get their head around maths sometimes.

50% more of 100 is 150
100 is not 50% less than 150.
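The asymmetry can be written generally, too; here is a small illustrative helper (my own sketch, not anything from TPU's pages), assuming each percentage is measured against the other card as its baseline:

```python
# If card A is p% faster than card B, then B is p/(1+p) percent slower than A,
# which is always the smaller number of the two.
def slower_pct(faster_pct: float) -> float:
    p = faster_pct / 100.0
    return p / (1.0 + p) * 100.0

print(round(slower_pct(50), 1))  # 33.3 -> 150 vs 100: 50% more one way, 33% less the other
print(round(slower_pct(41), 1))  # 29.1 -> "41% faster" corresponds to ~29% slower
```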
Posted on Reply
#125
mb194dc
Some sanity restored to the market, if you OC and push the 6900 XT to its limits.

The market won't stand for the top cards costing more than $600-650, in my view. Similar to when the GTX 980 and 1080 launched. That's where we go back to, eventually...
Posted on Reply