Sunday, April 23rd 2023

AMD Radeon RX 6950 XT Now as Low as $600, Poses a Juicy Alternative to RTX 4070

At the height of the crypto-mining GPU shortage of 2021-22, the Radeon RX 6900 XT and its refresh, the RX 6950 XT, were scalped and resold for upward of $2,000. Today, you can get one for as low as $600 on Newegg. The ASRock Radeon RX 6950 XT Phantom Gaming OC is listed on U.S. retailer Newegg.com at $630, with a coupon code that shaves $30 off, bringing the effective price down to $600, which is the MSRP of the recently announced NVIDIA GeForce RTX 4070 "Ada."

Our testing shows that the RTX 4070 offers performance on par with the previous-generation RTX 3080 and the Radeon RX 6800 XT. The RX 6900 XT is about 6% faster than the RTX 4070 at 1440p (averaged over our test suite), and the RX 6950 XT, from our older reviews, is about 6-7% faster than the RX 6900 XT at 1440p. These figures are, however, for raster 3D graphics (which makes up the majority of gaming graphics); the ray-tracing performance of the RX 6950 XT is closer to that of the RTX 3070 Ti, or about 23% slower than the RTX 4070. The RTX 4070 is a more efficient GPU, and also offers next-gen features such as DLSS 3 Frame Generation.
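Chaining those rounded review figures gives a sense of the overall gap between the two cards. A quick sketch of the implied arithmetic (illustrative only, using the approximate percentages above):

```python
# Compound the rounded 1440p relative-performance figures:
# RX 6900 XT ~6% faster than the RTX 4070; RX 6950 XT ~6.5% faster
# than the RX 6900 XT (midpoint of the 6-7% range).
rtx_4070 = 1.000
rx_6900_xt = rtx_4070 * 1.06
rx_6950_xt = rx_6900_xt * 1.065
gap_percent = (rx_6950_xt / rtx_4070 - 1) * 100
print(f"RX 6950 XT is ~{gap_percent:.1f}% faster than the RTX 4070 at 1440p")
```

With these inputs, the implied raster gap works out to roughly 13%.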
Source: VideoCardz

46 Comments on AMD Radeon RX 6950 XT Now as Low as $600, Poses a Juicy Alternative to RTX 4070

#1
dragontamer5788
I'm seeing $650 at my local Microcenter. $600 from Newegg.

Seems like a good time to think about a new rig! Zen4, cheaper GPUs, a DDR5 upgrade. It's just the motherboards that feel overpriced (but PCIe 5.0 / DDR5 are better, so maybe it's worth the extra price on the mobo).
Posted on Reply
#2
A Computer Guy
This price seems amazing but I'm nervous about buying these high end cards because of both the power needed and the problems reported.
Posted on Reply
#3
Chaitanya
Unfortunately, when the lower-end 7000-series parts come out, they won't be priced as well.
Posted on Reply
#4
Verpal
Since energy prices are going insane everywhere, I wonder how much performance we can get out of the 6950 XT if we undervolt it with a 200 W limit?

Would be pretty annoying to make sure it's actually at 200 W though, since AMD reports wattage for the core only, and we'd need something to measure the rest of the board.
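One way to work around the core-only reading is to budget for the rest of the board explicitly. A minimal sketch, where the ~40 W non-core overhead (VRAM, VRM losses, fans) is a guess for illustration, not a measured or official figure:

```python
# RDNA2's reported "GPU power" covers the core/ASIC only, so to keep the
# whole board under a target you can subtract an assumed non-core overhead
# (VRAM, VRM losses, fans). The 40 W default is an illustrative guess.
def core_limit_for_board_target(board_target_w, non_core_overhead_w=40):
    """Core-power limit to set so total board power stays near board_target_w."""
    return board_target_w - non_core_overhead_w

# Aiming for ~200 W total board power:
print(core_limit_for_board_target(200))  # 160
```

A wall-plug wattmeter is still the only way to confirm what the board actually pulls.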
Posted on Reply
#5
mama
...and the extra vram. 12GB is a bad joke.
Posted on Reply
#6
R0H1T
A Computer Guy: This price seems amazing but I'm nervous about buying these high end cards because of both the power needed and the problems reported.
The power can be controlled to an extent by limiting the TDP/temps etc. But the biggest issue IMO with the previous gen is the ridiculous size of some of them! No way I'm getting a 3-4 slot GPU, ever.
Posted on Reply
#7
MrDweezil
Pretty appealing from a price/performance perspective, but it's too big and I don't think my PSU could handle it. If they get this aggressive with pricing on the 6800 XT, though, I'm in.
Posted on Reply
#8
Vayra86
A Computer Guy: This price seems amazing but I'm nervous about buying these high end cards because of both the power needed and the problems reported.
Power is an issue. Other problems... are you thinking of anything specific? I might be able to shed some light; as far as I know, RDNA2/3 are in a good place now. I'm living the RDNA3 dream right now, no problems whatsoever ;)
Posted on Reply
#9
Bomby569
The enormous difference in power draw makes all the difference: one is pay now and keep it forever, no more payments, versus pay now and then keep paying monthly instalments for life.

It would be a disastrous choice where I live.
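Those "monthly instalments" can be put in rough numbers. A hypothetical sketch, where the wattage gap, play time, and tariff are all assumptions, not measurements:

```python
# Estimate the yearly electricity-cost gap between two GPUs.
# All inputs are illustrative assumptions, not review data.
def yearly_cost_gap(extra_watts, hours_per_day, price_per_kwh):
    extra_kwh = extra_watts / 1000 * hours_per_day * 365
    return extra_kwh * price_per_kwh

# e.g. a 100 W gaming-load gap, 3 hours/day, at 0.30 per kWh:
print(round(yearly_cost_gap(100, 3, 0.30), 2))  # ~32.85 per year
```

How "disastrous" that is obviously depends entirely on local electricity prices and the real-world load gap.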
Posted on Reply
#10
ZoneDymo
I don't agree; the 4070 is way more energy efficient and has stuff like DLSS 3.0.
If the RX 6950 ends up being able to run FSR 3.0, and it ends up being worth a damn, then maybe.

Right now AMD just needs to DROP the prices on their cards, everything down 150-200 dollars/euros, AND release the replacements already ><
Posted on Reply
#11
Bomby569
ZoneDymo: I dont agree, the 4070 is way more energy efficient and has stuff like DLSS 3.0
If the RX6950 will end up being able to run FSR 3.0 and it ends up being worth a damn, then maybe.

Right now AMD just needs to DROP The F out of their cards, everything down 150 - 200 dollars/euro AND release the replacements already ><
People trash Nvidia, but AMD is selling an end-of-life card with similar performance and almost double the power draw for a similar price; it's insanity. :kookoo:
I guess the only reason anyone would buy this is if they went by that brainless HU video about VRAM, and you'd almost think it was tailor-made for that purpose. "We don't need to lower the price, because VRAM" o_O
Posted on Reply
#12
Vayra86
Bomby569: the enormous difference in power draw makes all the different, it's a pay now and keep it forever, no more payments, VS a pay now and then you have to pay monthly instalments for life

it would be a disastrous choice where i live
Not so much power draw as power spikes, and the resulting instability on below-spec PSUs.
You can undervolt just fine and use as much power as any other high-end GPU, 250-300 W give or take, and you can go lower too, at the cost of a single-digit performance loss.

This is no different than with any other high end GPU from the last gen or even the current one

^ Gaming typical draw (chart omitted).

So please... let's keep talking about reality, and take note of the Ampere cards, and even the mighty efficient Ada-based 4090 at 508 W.
Similarly, the efficiency gap between Ampere and RDNA2, or between Ada and RDNA3, is negligible.
Posted on Reply
#13
Bomby569
Vayra86: Not so much power draw, but power spikes. And resulting instability on below spec PSUs.
You can undervolt just fine and use as much power as any other high end GPU - 250-300W give or take, and you can go lower too. At the cost of single digit performance %.

This is no different than with any other high end GPU from the last gen or even the current one

^ Gaming typical draw (chart omitted).
So please... let's keep talking about reality.
I haven't followed the 4070, because it's an idiotic product, so I don't know its undervolt capabilities; but in general all cards can be undervolted for a single-digit performance loss, that's true. If you're going to do it (and 99% don't), wouldn't you prefer to start with the one that draws almost half the power at similar performance?
Posted on Reply
#14
Vayra86
Bomby569: i haven't followed the 4070, because it's an idiotic product, don't know it's undervolt capabilities, but in general all cards can be undervolted for a single digit performance, that's true, but if you are going to do it (and 99% don't) won't you prefer to start with the one that draws almost half at similar performance?
It doesn't draw almost half; the gap is 100 W in a worst-case scenario only, and under 50 W if you undervolt RDNA2, while the 4070 doesn't gain nearly as much efficiency from UV.
Puts a rather different perspective on your supposed 'insanity' of buying a 16 GB GPU at 600 bucks, doesn't it?

You really need to check what you're saying against the actual numbers. The 4070 FE does 4.5 W per frame; it's not even the most efficient Ada.
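The W/frame figure cited here is just average board power divided by average frame rate. A toy version, with made-up inputs rather than review data:

```python
# Watts-per-frame: average board power divided by average frame rate.
# Lower is more efficient. Inputs below are illustrative, not measured.
def watts_per_frame(avg_board_power_w, avg_fps):
    return avg_board_power_w / avg_fps

# e.g. a card averaging 200 W at 44 FPS:
print(round(watts_per_frame(200, 44), 2))  # 4.55
```

Averaging this across a whole benchmark suite is what makes the metric meaningful; a single game can swing it a lot.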

Posted on Reply
#15
Bomby569
Vayra86: It doesn't draw almost half, the gap is 100W in a worst case scenario only, and <50W if you undervolt RDNA2, while the 4070 doesn't gain nearly as much efficiency from UV.
Puts a rather different perspective on your supposed 'insanity' of buying a 16GB GPU at 600 bucks doesn't it.

You really need to reflect what you're saying against the actual numbers. The 4070 FE does 4.5W/frame; its not even the most efficient Ada.

Sure, there are ifs and buts, but those are the numbers: almost double, or almost half, depending on your perspective.

Posted on Reply
#16
Vayra86
Bomby569: sure there are ifs and buts, but the numbers those are almost double or almost half depending on your perspective

Furmark, dude.

Please, just admit your earlier comment was blowing things out of proportion; don't make a further fool of yourself, seriously.
Posted on Reply
#17
Bomby569
Vayra86: Furmark, dude.

Please. Just admit your earlier comment was pulling things out of proportion, don't make a further fool of yourself, seriously.
That's the card's power draw, period; that's a fact. Whether you undervolt or not (and 99% don't), whether you play game X or Y with feature Z enabled, whether you do competitive gaming or run with an FPS cap, whether you fold, mine, or do whatever, whether you get better or worse luck in the silicon lottery and your card undervolts better or worse, etc., that's another story.

Not to mention needing a bigger PSU.
Posted on Reply
#18
Vayra86
Bomby569: that's the cards power draw, period, that's a fact. If you undervolt or not and 99% don't, if you play game x or y with feature z or x enabled, competitive gaming or with a fps cap, or if you fold mine or do whatever it's another story
Ah, so the 4090's 'power draw' is 666.6 W?

You need to get your head examined, fast. Stop grasping at straws, count to 10, and admit you're spouting BS and got corrected. All is well. I have a persistent allergy to nonsense, and this is nonsense. Furmark hasn't been good for anything in years; it's called a power virus for a reason.
Posted on Reply
#19
Bomby569
Vayra86: Ah, so the 4090's 'power draw' is 666.6 W?

You need to get your head examined, fast. Stop grasping at straws, count to 10, and admit you're spouting BS and got corrected. All is well. I have a persistent allergy for nonsense and this is nonsense. Furmark hasn't been good for anything since many years, its called a power virus for a reason.
Similar situation: one draws 200, the other draws 377. If your point is that in other scenarios one won't draw 377, well, the other won't draw 200 either.
Posted on Reply
#20
Vayra86
Bomby569: similar situation, one draws 200, the other draws 377. If your point is in other scenarios one won't draw 377, the other won't draw 200
Not sure if you're joking or actually serious now :D
I hope you're joking. Again: Furmark is no basis for indicating ANYTHING. All you get to see is maximum board power, not what GPUs actually do under any typical load. Furmark doesn't even get boost clocks going.

Just stick to the gaming power draw; it's much safer for you, since interpreting numbers clearly isn't your strong suit. TPU has nice bar charts that represent the different types of power draw clearly. There is also a Furmark one. Do a little comparison, and you might figure it out.
Posted on Reply
#21
Bomby569
Vayra86: Not sure if you're joking or actually serious now :D
I hope you're joking. Again: Furmark is no basis to indicate ANYTHING. All you get to see is maximum board power; not what GPUs actually do in any typical load. Furmark doesn't even get boost clocks going.

Just stick to the gaming power draw, its much safer for you, interpreting numbers clearly isn't your strong suit. TPU has nice bar charts that represent the different types of power draw clearly. There is also a Furmark one. Do a little compare, and you might figure it out.
What's "gaming power draw"? Which games? What settings? Do you cap FPS, or max it out? Is the game optimized or not? Does the game prefer team red or blue? Etc., etc.!
Posted on Reply
#22
Vayra86
Bomby569: what's "gaming power draw"? what games? what settings? do you cap fps? is the optimized or not? do you max fps or cap fps? does the game prefer team red or blue? what etc etc .... ???!!!
Those metrics are a good six posts above this one. You take averages across a large benchmark suite, and you get those nice bar charts.
Or you can cherry-pick games and pick out the biggest offenders in peak power draw.

NEITHER will show you anything close to Furmark numbers, and all GPUs end up much closer together than their peak power draw suggests.

But you know this already; you're just trying to make a case where there isn't one. Just stahp
Posted on Reply
#24
Ferrum Master
Great actually.

Regarding the discussion about RX 6000 and RTX 30-series power spikes: W1z noted that the current gen of cards softens things in this regard and is less critical.

The thing about picking the correct PSU for your power consumption: do PSU reviews answer the question "will my PC work with this particular PSU, with, say, a 6900 XT + 5800X3D?" As PSU reviews are full of self-proclaimed certifications and other BS, no, they are weak at giving you that practical info. You usually won't get the answer unless you try it yourself. OCP triggering is a sketchy topic, really.
Posted on Reply
#25
ToxicTears
Power draw? I have a water-cooled 6900 XT with a Ryzen 5700X; the total consumption of the whole build, measured with a wattmeter, is 300 W, because I undervolt the 6900 XT Red Devil: only 2300 MHz at 0.975 V = 180-200 W for my GPU, at 50°C temps. And the undervolt costs only about 5% performance overall.
Posted on Reply