Tuesday, October 18th 2022
AMD Radeon RX 6900 XT Price Cut Further, Now Starts at $669
In the run-up to the November 3 reveal of the next-generation RDNA3 architecture, and with the 43% faster RTX 4090 mauling away its appeal to the enthusiast crowd, the AMD Radeon RX 6900 XT has received another round of price cuts and can now be had for as low as $669. Prices are down on both sides of the big pond, with European retailers listing it for as low as 699€. Although not technically AMD's flagship graphics card (that title belongs to the RX 6950 XT, which starts at $869), the RX 6900 XT is a formidable 4K gaming graphics card with high performance-per-dollar at its new price (roughly 35% higher than that of the RTX 4090). AMD's latest round of official price cuts happened around mid-September, as the company was bracing for the RTX 4090 "Ada."
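For context, a performance-per-dollar comparison of this kind is just relative performance divided by price. The sketch below is a rough illustration only: the RTX 4090 price and the relative-performance figure are placeholder assumptions, not TPU review data, so the resulting percentage will vary with whichever performance summary and street prices are plugged in.

```python
# Rough illustration of a performance-per-dollar comparison.
# The RTX 4090 price and the performance gap are placeholder assumptions,
# not TPU review data; swap in real figures to reproduce a specific result.

def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance (RX 6900 XT = 1.00) divided by price."""
    return relative_perf / price_usd

rx_6900_xt = perf_per_dollar(1.00, 669)    # new street price from the article
rtx_4090   = perf_per_dollar(1.43, 1599)   # assumed 43% faster at an assumed $1,599

advantage = rx_6900_xt / rtx_4090 - 1      # result depends entirely on the inputs above
print(f"RX 6900 XT performance-per-dollar advantage: {advantage:.0%}")
```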
Source: VideoCardz
131 Comments on AMD Radeon RX 6900 XT Price Cut Further, Now Starts at $669
Not XFX models, mind you. Just X1950 XTX cards.
I bought the Sapphire Pulse just to see what the fuss was about.
How is the power consumption? In that respect, I preferred the power/performance ratio of the 6800 to the 6700/6750 XT. The 67x0 XT cards seem pushed way past their efficiency range.
Dear AMD, keep trying; we've set the bar at $400 for RX 6000 XT cards.
Sincerely, TPU vermin.
OK, so $400 for an RX 6900 XT, so what's an RTX 3080 Ti supposed to cost, tree hunnie n fiddy!!? Ante up, Jensen!! UPSCALE this sale!!!
The raytracing performance in Blender... at least in the public benchmarks for Blender... has impressed me, especially for only $350.
I'd imagine that AMD GPUs are the worst for your group, because AMD GPUs have the worst raytracing performance of all three manufacturers (even if we "magic" the software into place, the AMD 6900 XT will never raytrace as quickly as its NVIDIA or Intel counterparts; AMD simply put less raytracing hardware in these cards...). I dunno how many graphics packages are programmed to use Intel Arc yet, though.
There are much better games out there than what the mob of gamers hype up and play.
I want Intel to succeed, but I also haven't seen a SINGLE review by anyone, anywhere, that says it's actually usable as a daily driver. It's an interesting first attempt that currently requires you to spend as much time troubleshooting as you do using it. Until the general consensus says otherwise, it's not even worth my interest.
We went through a phase of almost exclusively AMD GPUs because GCN was fantastic and AMD was typically putting way more VRAM on its cards than Nvidia. Turing was the tipping point, because so many plugins started leveraging RTX. Not DXR, but Nvidia's proprietary RTX.

At stock settings it's around the same efficiency as a 6600 XT: about 10% faster, with about 10% more power draw.
I don't have a 6600 XT, but this particular 6700 undervolts like a champ: I can drop the core voltage by 200 mV and still basically manage the same clocks. That's about a 30% reduction in power draw without any real loss (rough math on that figure below). I guess the BIOS and driver see Navi 22 and go "oh hey, 1.15 V to reach 2.6 GHz." Except the 6700 maxes out at 2.4 GHz boost and can do that just fine on 900 mV, so I have no idea why it keeps adding voltage to the boost bins when it's not clocking any higher (probably a driver limitation to stop it cannibalising 6750 XT sales).
I had to conclude that it is in some way or another limited at a firmware level. Not surprising, given that internally it is using a laptop GPU.
So it's priced about right for that level of performance, given that a 6650 XT can be had for a similar amount, but it adds nothing else.
If you're looking for something near 6700 XT performance because of the name though (like I was), you're not going to find it in that card.
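A quick sanity check on that ~30% claim: if dynamic power scales roughly with the square of the voltage at a fixed clock, dropping from 1.15 V to about 0.95 V by itself accounts for roughly a 32% reduction. This is only a back-of-the-envelope sketch; it uses the voltages quoted above and ignores static/leakage power and the rest of the board.

```python
# Back-of-the-envelope estimate: dynamic power ~ C * V^2 * f, so at the same
# clock speed (f fixed) power scales with the square of the core voltage.

stock_v = 1.15      # V, the voltage the driver targets (quoted above)
undervolt_v = 0.95  # V, roughly a 200 mV undervolt

power_ratio = (undervolt_v / stock_v) ** 2
print(f"Power at the same clocks: {power_ratio:.0%} of stock")   # ~68%
print(f"Estimated reduction: {1 - power_ratio:.0%}")             # ~32%, near the ~30% observed
```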
No killer app, no market success. It is that simple. You can easily do without RT in the games where it is implemented well, and in most games the implementation isn't done well; it's rather quite horrible and pointless, plus a major performance hog. So you could say 'I love seeing the technology,' but when the game you see it in performs like crap and isn't much fun, what's the point? People generally want more out of their gaming.
To this day, I have never chosen to play with raytracing enabled, the one exception being Metro Exodus EE, which is a fairly light implementation of raytraced GI that you can't disable anyway. From the 2060 FE to the 3090, I've never found the performance hit of raytracing worth it, and I dislike the soft-focus fuzziness of DLSS, which RTX cards always use as a crutch to mask the massive performance drop with raytracing.
Sure, it's an option, but my personal preference has always been to run at a higher native resolution and a higher framerate. The image quality benefits far more from increased resolution than it does from rendering reflections and shadows in a different way. CP 2077, for example, is a choice between 4K60 native without raytracing or DLSS Balanced with raytracing. Yes, the raytracing looks marginally better, but DLSS Balanced is nowhere close to the actual detail of native. It's 1080p vs 4K.
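As a reference point for that comparison, here is a small sketch of the internal render resolutions behind the DLSS 2 quality presets at a 4K output, assuming the commonly cited per-axis scale factors (Quality ≈ 0.667, Balanced 0.58, Performance 0.50, Ultra Performance ≈ 0.333); those factors are an assumption taken from public DLSS 2 documentation, not something stated in this thread.

```python
# Internal render resolution of DLSS 2 presets at a 4K (3840x2160) output,
# using the commonly cited per-axis scale factors for each preset.

OUTPUT_W, OUTPUT_H = 3840, 2160

DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,        # exactly 1920x1080 at a 4K output
    "Ultra Performance": 1 / 3,
}

for preset, scale in DLSS_SCALE.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    print(f"{preset:>17}: {w}x{h}")
```

With these factors, Balanced at a 4K output renders internally at roughly 2227x1253, while Performance is the preset that drops to exactly 1080p.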
On the RTX 4090 page the 3090 Ti is listed as 29% slower.
Since Nvidia is anti-consumer I don't follow them closely, so I don't know how TPU screwed that up.
The 4090 is 41% faster, so it does 141 fps.
The 4090 does 141 fps. If the 3090 Ti is 29% slower, then the 3090 Ti gets 100 fps (141 × 0.71 = 100.11 fps).
---> This is correct.
On the RTX 4090 page the 3090 Ti is listed as 29% slower.
---> The idea that TPU screwed that up is wrong; that is just not how the math is calculated. Going the other way, the correct calculation is (100 − 71) / 71 = 0.408, so round up to about 41% faster.
50% more than 100 is 150, but 100 is not 50% less than 150 (it's about 33% less).
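To spell out the two directions of that percentage, here is a tiny sketch using the frame rates quoted above (141 fps and 100 fps); the helper functions are just for illustration.

```python
# Percentage differences are not symmetric: "41% faster" and "29% slower"
# describe the same gap, measured against different baselines.

def percent_faster(a: float, b: float) -> float:
    """How much faster A is than B, relative to B."""
    return (a - b) / b

def percent_slower(a: float, b: float) -> float:
    """How much slower B is than A, relative to A."""
    return (a - b) / a

fps_4090, fps_3090_ti = 141.0, 100.0  # figures quoted above

print(f"RTX 4090 is {percent_faster(fps_4090, fps_3090_ti):.0%} faster than the 3090 Ti")  # 41%
print(f"RTX 3090 Ti is {percent_slower(fps_4090, fps_3090_ti):.0%} slower than the 4090")  # 29%
```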
The market won't stand for the top cards costing more than $600-650, in my view, similar to when the GTX 980 and GTX 1080 launched. That's where we go back to, eventually...