Sunday, April 23rd 2023
AMD Radeon RX 6950 XT Now as Low as $600, Poses a Juicy Alternative to RTX 4070
At the height of the crypto-mining GPU shortage of 2021-22, the Radeon RX 6900 XT and its refresh, the RX 6950 XT, were scalped and resold for upward of $2,000. Today, you can get one for as low as $600 on Newegg. The ASRock Radeon RX 6950 XT Phantom Gaming OC is listed on U.S. retailer Newegg.com at $630, with a coupon code that shaves $30 off, bringing the effective price down to $600, the MSRP of the recently announced NVIDIA GeForce RTX 4070 "Ada."
Our testing shows that the RTX 4070 offers performance on par with the previous-generation RTX 3080 and the Radeon RX 6800 XT. The RX 6900 XT is about 6% faster than the RTX 4070 at 1440p (averaged over our test suite), and the RX 6950 XT, from our older reviews, is about 6-7% faster than the RX 6900 XT at 1440p. This is, however, raster 3D performance (which still makes up the majority of gaming graphics); in ray tracing, the RX 6950 XT is closer to the RTX 3070 Ti, or about 23% slower than the RTX 4070. The RTX 4070 is a more efficient GPU, and also offers next-gen features such as DLSS 3 Frame Generation.
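The relative standings above compound, so here is a rough back-of-the-envelope sketch of where the RX 6950 XT lands versus the RTX 4070 (the percentages are the approximate averages quoted in this article, not new measurements):

```python
# Illustrative math only; real averages vary by test suite and drivers.
rtx_4070 = 1.00                  # baseline
rx_6900_xt = rtx_4070 * 1.06     # ~6% faster than the RTX 4070 at 1440p
rx_6950_xt = rx_6900_xt * 1.065  # ~6-7% faster again than the RX 6900 XT

print(f"RX 6950 XT vs RTX 4070 (raster): {(rx_6950_xt - 1) * 100:.0f}% faster")
# → RX 6950 XT vs RTX 4070 (raster): 13% faster

# Ray tracing goes the other way: ~23% slower than the RTX 4070.
rt_6950_xt = rtx_4070 * (1 - 0.23)
print(f"RX 6950 XT vs RTX 4070 (RT): {(1 - rt_6950_xt) * 100:.0f}% slower")
# → RX 6950 XT vs RTX 4070 (RT): 23% slower
```

In other words, the raster lead works out to roughly 13% in the RX 6950 XT's favor, while ray tracing flips that into a clear deficit.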
Source: VideoCardz
46 Comments on AMD Radeon RX 6950 XT Now as Low as $600, Poses a Juicy Alternative to RTX 4070
Seems like a good time to think about a new rig! Zen 4, cheaper GPUs, a DDR5 upgrade. It's just the motherboards that feel overpriced (but PCIe 5.0 / DDR5 are better, so maybe it's worth the extra price on the mobo).
Would be pretty annoying to make sure it is actually 200 W though, since AMD reports core-only wattage, and we'd need something to measure the rest of the board.
It would be a disastrous choice where I live.
If the RX 6950 XT ends up being able to run FSR 3.0, and FSR 3.0 ends up being worth a damn, then maybe.
Right now AMD just needs to DROP the prices on their cards, everything down 150-200 dollars/euros, AND release the replacements already ><
I guess the only reason anyone would buy this is if they went by that brainless HU video about VRAM, and you'd almost think it was tailor-made for exactly that purpose. "We don't need to lower the price, because VRAM" o_O
You can undervolt just fine and use as much power as any other high-end GPU, 250-300 W give or take, and you can go lower too, at the cost of single-digit performance percentages.
This is no different than with any other high-end GPU from the last gen, or even the current one.
^ Typical gaming power draw.
So please... let's keep talking about reality, and take note of the Ampere cards, and even the mightily efficient Ada-based 4090 at 508 W.
Similarly, the efficiency gap between Ampere and RDNA2, or between Ada and RDNA3, is negligible.
Puts a rather different perspective on your supposed "insanity" of buying a 16 GB GPU at 600 bucks, doesn't it?
You really need to reflect on what you're saying against the actual numbers. The 4070 FE does 4.5 W/frame; it's not even the most efficient Ada card.
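The watts-per-frame metric mentioned here is just average board power divided by average frame rate. A minimal sketch (the 200 W / 100 FPS figures below are illustrative placeholders, not measured data; only the 4070's ~4.5 W/frame figure comes from the discussion):

```python
def watts_per_frame(avg_power_w: float, avg_fps: float) -> float:
    """Efficiency metric: average board power divided by average frame rate.

    Lower is better; it normalizes power draw by the performance delivered.
    """
    return avg_power_w / avg_fps

# e.g. a hypothetical card drawing 200 W while averaging 100 FPS:
print(watts_per_frame(200.0, 100.0))  # → 2.0 (W per frame)
```

The same arithmetic is why undervolted high-end cards can look competitive: cutting power by 20% for a single-digit FPS loss improves W/frame noticeably.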
Please, just admit your earlier comment was blowing things out of proportion; don't make a further fool of yourself, seriously.
Not to mention needing a bigger PSU.
You need to get your head examined, fast. Stop grasping at straws, count to 10, and admit you're spouting BS and got corrected. All is well. I have a persistent allergy to nonsense, and this is nonsense. Furmark hasn't been good for anything for many years; it's called a power virus for a reason.
I hope you're joking. Again: Furmark is no basis to indicate ANYTHING. All you get to see is maximum board power, not what GPUs actually do under any typical load. Furmark doesn't even get boost clocks going.
Just stick to the gaming power draw, it's much safer for you; interpreting numbers clearly isn't your strong suit. TPU has nice bar charts that represent the different types of power draw clearly. There is also a Furmark one. Do a little comparison, and you might figure it out.
Or you can cherry-pick games and pick out the biggest offenders in peak power draw.
NEITHER will show you anything close to Furmark numbers, and all GPUs land much closer together than their peak power draw suggests.
But you know this already, you're just trying to make a case where there isn't one. Just stahp
www.newegg.com/msi-rx-6950-xt-gaming-x-trio-16g/p/N82E16814137733?Item=N82E16814137733
Regarding the discussion about RX 6000 and RTX 3000 series power spikes: W1z stated that one of the improvements in the current gen of cards is that they soften things in this regard and are less critical.
On the subject of picking a correct PSU for your power consumption:
Do PSU reviews answer the question "will my PC work with this particular PSU?", say, with a 6900 XT + 5800X3D? Since PSU reviews are full of self-proclaimed certifications and other BS, no; they are weak at giving you that kind of practical info. You usually won't get the answer unless you try it yourself. OCP triggering is a sketchy topic, really.