Thursday, April 13th 2023

AMD Radeon RX 6800 Discounted to $469.99 as RTX 4070 Hits the Market

AMD is not only publishing marketing slides and playing the VRAM card against NVIDIA ahead of the GeForce RTX 4070 launch, it is also apparently discounting its Radeon RX 6000 series graphics cards, with the Radeon RX 6800 now selling for as low as $469.99. As spotted by Tom's Hardware, both the Radeon RX 6800 XT and the Radeon RX 6800 had already seen discounts of $30 to $50 from various AIB partners, and are now selling at $110 below MSRP, making them a decent buy at $539 and $469. To make things even more interesting, these are pretty good custom versions, including the ASRock Radeon RX 6800 XT Phantom Gaming and the Gigabyte Radeon RX 6800 Gaming OC WindForce 3X graphics cards.
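The quoted street prices line up with AMD's reference launch MSRPs; a minimal arithmetic sketch (the $649 and $579 figures are AMD's original launch MSRPs, stated here as background, not taken from the article):

# Discount arithmetic behind the "$110 below MSRP" claim.
launch_msrp = {"RX 6800 XT": 649, "RX 6800": 579}
for card, price in launch_msrp.items():
    print(f"{card}: ${price} - $110 = ${price - 110}")
# -> RX 6800 XT: $649 - $110 = $539
# -> RX 6800:    $579 - $110 = $469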

Yesterday, the AMD Radeon RX 6950 XT, the former RDNA 2 flagship, was spotted by Kyle Bennett discounted down to $609.99, in the form of the ASRock Radeon RX 6950 XT Phantom Gaming. The deal is still available over at Newegg.com, and it is the lowest price yet for the Radeon RX 6950 XT, which is still a great card. The newly launched NVIDIA GeForce RTX 4070 is also widely available, with plenty of SKUs to choose from at the $599.99 MSRP.
Sources: Tom's Hardware, Kyle Bennett (Twitter), Gigabyte RX 6800 (Newegg), ASRock RX 6800 XT (Newegg), ASRock RX 6950 XT (Newegg)

72 Comments on AMD Radeon RX 6800 Discounted to $469.99 as RTX 4070 Hits the Market

#51
btk2k2
oxrufiioxoAlthough I agree with you, going by TechSpot numbers that would make the 7900XT only 10% faster at 1440p, making it DOA at $800+.


Maybe 6900XT-like performance for $500 with 16GB of VRAM; that would give the 7900XT just enough breathing room.
The 7900XT is more of a 4K card with 20GB of VRAM, though. Sure, a 7800XT with 6950XT performance will also do well at 4K, but I can see the legs on the 7900XT being far better long term.
Posted on Reply
#52
Bomby569
It's still an old-gen card, and with an enormous power draw in comparison.
Posted on Reply
#53
nguyen
btk2k27900XT is more of a 4K card with 20GB of ram though. Sure a 7800XT with 6950XT performance will also do well at 4K I can see the legs on the 7900XT being far better long term.
The 7900XT can't reach a 4K60 average in some recent titles already (The Last of Us, Forspoken, Dead Space Remake), and that is without RT.

There are no legs on the 7900XT; it's already obsolete now...
Posted on Reply
#54
john_
mäggäriIf the game settings page were done by professionals, it wouldn't allow too-high settings in the first place, but would alert you if the VRAM limit has been exceeded. :)
Have you seen the video from Hardware Unboxed with Hogwarts Legacy? You get an almost smooth framerate, but with visual problems, with textures a mess of loading and unloading. This is how they will "fix" the VRAM capacity problems. And guess what: professionals decided to limit the VRAM capacity on $400-$800 cards. Are you shocked?
mäggäriWill have Intel again: a 13700K at 80W when undervolted and overclocked for +10% performance, way ahead of AMD. The sad thing is even the 6-core 13600K is better in productivity as well, compared to the majority of AMD products. Buying Intel gets you a gaming CPU and a content-creator CPU, two products in one.
"Neanderthal 45°-angle-forehead ppl" can't handle the truth, even after Der8auer pointed out the efficiency of Intel CPUs in his undervolting review.

If the 7800X3D used 50W and the 13700K even 100W after adjusting, then over 1,500h of full gaming load per year at 20cnt/kWh that means 15eur yearly. A bit over 1eur per month, and still under 3eur/month at 40snt/kWh. I would suggest the phrase "it's time to prioritize" if such amounts cause the financial ship to sink. ;D

Or another scenario, with the 7800X3D at 50W and the Intel at 150W, is the same as above: a 150kWh difference.

Now, 1,500h yearly is 125h of gaming monthly, with no skipped or busy months, haha. Or the energy price difference narrows down a lot. :lovetpu:
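A minimal sketch of that back-of-envelope math, using only the wattage deltas, hours, and electricity rates quoted above (all assumptions from the post, not measurements):

# Yearly cost of a CPU power delta while gaming.
# Inputs: 50 W or 100 W difference, 1,500 h of full load per year,
# EUR 0.20 or 0.40 per kWh.

def yearly_cost_eur(delta_watts, hours_per_year, eur_per_kwh):
    kwh = delta_watts * hours_per_year / 1000  # Wh -> kWh
    return kwh * eur_per_kwh

for delta in (50, 100):
    for rate in (0.20, 0.40):
        cost = yearly_cost_eur(delta, 1500, rate)
        print(f"{delta} W delta @ {rate:.2f} EUR/kWh: "
              f"{cost:.0f} EUR/year ({cost / 12:.2f} EUR/month)")
# -> the 50 W / 0.20 EUR case gives the 15 EUR/year figure above,
#    and the 100 W case gives the 150 kWh difference.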

At the same time, ppl are ready to use a 350W 6800-6900XT instead of a 180W 4070, which was on par in Gamers Nexus' testing. :)

I remember the days when everybody was soooooo productive and shouting about how important that was to have, when the 1700X etc. were released. Now that Intel has taken the lead, silence has landed. :) :roll:
Damn, I had to recheck the URL in my browser to see if it was saying WCCFTECH.
Love your double standards and excuses.
Posted on Reply
#55
TheoneandonlyMrK
nguyenThe 7900XT can't reach a 4K60 average in some recent titles already (The Last of Us, Forspoken, Dead Space Remake), and that is without RT.

There are no legs on the 7900XT; it's already obsolete now...
Spoken like a guy who hasn't played Portal 2 RT without DLSS on a 4090.

Truly idiotic. ALL cards are obsolete by your nonsense.

Meanwhile, on topic: this should clear the channel for AMD.
Posted on Reply
#56
mäggäri
john_Have you seen the video from Hardware Unboxed with Hogwarts Legacy? [...] Love your double standards and excuses.
I love the 1440p results, the resolution this card is aimed at: constant handshaking with the 3080 and often even the 3090, while running a 2700MHz GPU frequency. I would also say 1440p is kind of the "new FHD", the modern standard for gaming. Steve is my lighthouse in the dark seas of GPUs. If we look at the 4070's success at 180W, those AMD 6xxx cards should also bring more FPS; for example, 270W should mean a 50% increase, but I can't see it happening. I don't accept this kind of behaviour. :)
AMD is a great new kitchen appliance, keeping my coffee warm. :toast:


On my 12700K I was able to boost all-core performance by around 12% while cutting Cinebench R23 power from ~245W to ~175W. :) Since the 13th-gen CPUs scale even better, the results are better. Just by undervolting 0.1V at otherwise stock values, it ends up at 5,500MHz all-core and gains 10% in tests. On my previous 12700K I went from ~1.36V to 1.19V while carrying a +300MHz OC plus other adjustments on the ring. There is so much to crank out of Intel CPUs. On the 13700K I am expecting an even greater drop in wattage.
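A minimal sketch of why an undervolt cuts wattage so sharply, using the first-order CMOS dynamic-power relation P ∝ f·V² (an idealized model; real chips also change leakage and boost behavior, so treat this as an assumption, not a measurement):

# Idealized dynamic-power scaling: P ~ f * V^2.
# Voltages are the ones quoted above (1.36 V stock -> 1.19 V undervolted),
# held at the same frequency for illustration only.

v_stock, v_undervolt = 1.36, 1.19
ratio = (v_undervolt / v_stock) ** 2
print(f"Estimated dynamic power at {v_undervolt} V: {ratio:.0%} of stock")
# -> about 77% of stock, before leakage or clock changes are considered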


Quite interesting results here (reliable source): 180-190W versus 330-350W for the 6950 XT. The timing is set nicely; happy watching. :) Generally speaking, the 4070 is faster than the 3080 at 1440p, but not significantly, which is exactly what I was expecting 1-2 months ago when doing some maths on the GPU: OCing gives it the edge over the 3080 because the 4070 scales past 3000MHz on the GPU, while still having much lower power consumption.

Posted on Reply
#57
apoklyps3
mäggäriIf the game settings page were done by professionals, it wouldn't allow too-high settings in the first place, but would alert you if the VRAM limit has been exceeded. :) [...]
this guy cracks me up =)))
Posted on Reply
#58
john_
mäggäriI love the 1440p results, the resolution this card is aimed at: constant handshaking with the 3080 and often even the 3090 [...] Still having much lower power consumption.

Me me me me me. Well, while I don't really care about your utopian experiences with your system, where obviously the first thing you rushed to do was undervolt, I bet you would be happy even with a Celeron and a GTX 1630, as long as they had the correct stickers on them.

Just wanted to say that your lighthouse, Steve, at about the 27-minute mark, insists that people go and look for the RX 6950 XT, because the prices are great, and even have a look at second-hand (obviously much cheaper) RTX 3080s before going out and buying an RTX 4070.
Why? Because even with that efficiency and Frame Generation, the 12GB VRAM limit on a $600 card is just too obvious to ignore.
mäggäriMy dream is to buy without brands, only to pay for the performance, not for ideology.
I think you are a long way from achieving your dream. But I could be wrong.
mäggäriI understand the nvidia&intel hate, it's been coded in cave ppl genetics.
Intel and Nvidia killed the dinosaurs. That's why we hate them.
Posted on Reply
#59
TheoneandonlyMrK
apoklyps3this guy cracks me up =)))
Yeah, comedy gold.
mäggäriMy dream is to buy without brands, only to pay for the performance, not for ideology.

I understand the nvidia & intel hate; it's been coded into cave ppl genetics. Some ppl like to be fanboys of certain semiconductor brands, lmao. I prefer the facts and can't even see brand names. Like when buying a fresh Audi: I see only blank pages in the catalogues if no quattro is mentioned, then I get in contact with the car retailers and ask, "Why are there empty pages in this leaflet?" :)
Well, with your amazing input, avatar, and vague air of well-to-do e-peen, you really are starting out well. Not sure what your plan is (you know why I'm saying it (heart)), but you're one biased post away from ignore.

In fact, I'm done reading this page. Bye now.
Posted on Reply
#60
apoklyps3
Modern Intels?
What's that? Some kind of disease?
Posted on Reply
#61
TheoneandonlyMrK
OP, I think you could have started this thread better, personally. :)

PS: if this needs deleting, can the mods start a convo with me, please?
Posted on Reply
#62
oxrufiioxo
TheoneandonlyMrKWhen a thread about AMD dropping prices on one card starts with:


"AMD is not only publishing marketing slides and playing the VRAM card against NVIDIA ahead of the GeForce RTX 4070 launch, it is also apparently discounting its Radeon RX 6000 series graphics cards, with the Radeon RX 6800 now selling for as low as $469.99."

How the actual fffff did you expect this to go? Has the OP a horse in the game? I would hope not, but they should consider not starting threads in such flame-baity ways. It's now an Nvidia-fanboi-shit-on-AMD thread; fine, there are many, many of those.

Again: troll me via PMs, the convo would at least not be one-sided. The mods love my public posts as is... not.
It's honestly a smart move; they know fanboys flock to these kinds of articles like flies to $h!+. A large part of their revenue is based on traffic, you would think, so it makes sense.

As much as I think the whole fanboy thing should die, that will never happen; people root for these companies like they are their local football teams, after all. It's human nature, I guess, and beyond that, people have a deep-rooted hatred towards one or more of these companies for some usually ridiculous reason.

I think anyone with half a brain knows that it always comes down to who's on top, behavior-wise.

As far as the topic of this thread: AMD really just needs to release a current-gen competitor in the $550-650 price range. They are seemingly always late lately, at least compared to Nvidia. I get that they need to sell their previous-generation hardware, but really the only people buying it are those who can't afford, or choose not to afford, any of the current-gen offerings.

I'm a little worried about the 7800XT in general, though. If it performs like a 6900XT and costs around $600, that is a pretty big fail, because the 6800XT would not be that much slower while being two years older; it would be almost worse than the 4070, or kinda just the same. And they really can't make it that much better, because at that point it makes the 7900XT look even worse, although that at least is creeping down to $800.

I don't think the 6800 is all that appealing at $470. It's still a pretty great card, though, and probably the most efficient RDNA2 card without tweaking, while still having more than adequate VRAM. I just think anyone who wanted one has already bought it.


Like I said before, the ball's in AMD's court; hopefully they don't disappoint us. I know a lot of people who buy in the $500-ish price range who aren't impressed with Nvidia's $600 offering; hopefully they thrash that card.
TheoneandonlyMrKOP, I think you could have started this thread better, personally. :) PS: if this needs deleting, can the mods start a convo with me, please?
We can all do better; that's just part of life. Even the best of us aren't perfect, which isn't really possible to begin with.

If the mods delete my comments for not fitting with the original post, I trust that they are making a judgment in the best interest of the website; I don't take it personally.

I know the mods are busy, but I do wish there was some feedback when it happens; sometimes it's obvious why, but other times it's not. They are just human as well, trying to do their best, I'm sure.
Posted on Reply
#63
TheoneandonlyMrK
oxrufiioxoIt's honestly a smart move; they know fanboys flock to these kinds of articles like flies to $h!+. [...] I know a lot of people who buy in the $500-ish price range who aren't impressed with Nvidia's $600 offering; hopefully they thrash that card.
Reasonable, love it.

And fair points; I agree with most.

No, I don't need to discuss that post with the mods. Sometimes you can't recall what you said, though, at least right away; you're like, "What post? Which one?"
Posted on Reply
#64
Unregistered
Why_MeThe 4070 12GB has an MSRP of $600 ... the 3080 10GB has an MSRP of $700.
NVIDIA has different MSRPs for different regions.
#65
VTK85
I wish some of these prices would come to the UK; the GPU market here is screwed. Even low-end used cards on eBay go for ridiculous money.
Posted on Reply
#66
apoklyps3
Why make GPUs that perform well anymore?
Let's all make frame-generator cards.
RT is still very niche.

Personally, I prefer real performance, not generated fake frames.
Games are poorly ported/optimised/made for PC as it is; we don't need any extra glitches generated by GPU tech (DLSS or FSR).
If you don't have the top of the line from Nvidia, maybe you shouldn't care about RT anyway.
The performance cost is too big for pretty shadows/lights anyway. I'd rather have better FPS at this point.
Posted on Reply
#67
InVasMani
john_Have you seen the video from Hardware Unboxed with Hogwarts Legacy? [...] This is how they will "fix" the VRAM capacity problems.
I wasn't feeling that video at all. The texture quality goes in and out, looking like you're toggling LOD bias between 0 and +3, or like looking at a 1999 game's textures next to 2023 textures. It makes the worst examples I've ever seen of DLSS or FSR look exceptionally good by comparison.
Posted on Reply
#68
Patriot
john_I think you are a long way from achieving your dream. But I could be wrong.
That is putting it mildly. Dreams of buying without brand loyalty; gets an Nvidia-Wintel tattoo.
Posted on Reply
#69
ValenOne
nguyenThe 7900XT can't reach a 4K60 average in some recent titles already (The Last of Us, Forspoken, Dead Space Remake), and that is without RT.

There are no legs on the 7900XT; it's already obsolete now...
The 7900 XT (20 GB) is priced against the RTX 4070 Ti (12 GB).
mäggäriI love the 1440p results, the resolution this card is aimed at [...] On my 12700K I was able to boost all-core performance by around 12% while cutting Cinebench R23 power from ~245W to ~175W. [...]

Cinebench R23 results are useless when amateurs like me use Blender instead; Cinebench R23 doesn't use AVX-512.



Posted on Reply
#70
apoklyps3
and nothing else uses AVX-512 :roll:
Posted on Reply
#71
ValenOne
apoklyps3and nothing else uses AVX-512 :roll:
Does Blender use AVX-512?

Blender uses AVX, AVX2 and AVX-512 in the Cycles render engine.
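For anyone who wants to verify whether their own CPU actually exposes AVX-512 before assuming Blender will use it, here is a minimal sketch for Linux (it reads /proc/cpuinfo; avx512f is the foundation flag, and the other AVX-512 subsets appear as separate flags):

# Linux-only check for AVX-512 support via /proc/cpuinfo.
def has_avx512():
    try:
        with open("/proc/cpuinfo") as f:
            return any("avx512f" in line for line in f if line.startswith("flags"))
    except OSError:
        return False  # not Linux, or cpuinfo unavailable

print("AVX-512 supported:", has_avx512())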
Posted on Reply