Thursday, April 13th 2023

AMD Radeon RX 6800 Discounted to $469.99 as RTX 4070 Hits the Market

AMD is not only doing marketing slides and pulling the VRAM card against NVIDIA ahead of the GeForce RTX 4070 launch, it is also apparently discounting its Radeon RX 6000 series graphics cards, with the Radeon RX 6800 now selling for as low as $469.99. As spotted by Tom's Hardware, both the Radeon RX 6800 XT and the Radeon RX 6800 had already seen discounts of $30 to $50 from various AIB partners, and are now selling at $110 below MSRP, making them a decent buy at $539 and $469, respectively. To make things even more interesting, these are pretty good custom versions, including the ASRock Radeon RX 6800 XT Phantom Gaming and the Gigabyte Radeon RX 6800 Gaming OC WindForce 3X graphics cards.

Yesterday, the AMD Radeon RX 6950 XT, the former RDNA 2 flagship, was spotted by Kyle Bennett discounted to $609.99; the card in question is the ASRock Radeon RX 6950 XT Phantom Gaming. The deal is still available over at Newegg.com, and it is the lowest price yet for the Radeon RX 6950 XT, which is still a great card. The newly launched NVIDIA GeForce RTX 4070 is also widely available, with plenty of SKUs to choose from at the $599.99 MSRP.
Sources: Tom's Hardware, Kyle Bennett (Twitter), Gigabyte RX 6800 (Newegg), ASRock RX 6800 XT (Newegg), ASRock RX 6950 XT (Newegg)

72 Comments on AMD Radeon RX 6800 Discounted to $469.99 as RTX 4070 Hits the Market

#26
Chry
Minus InfinityThe sad thing is AMD won't have mid-tier cards out until July at earliest.
June.
Posted on Reply
#27
oxrufiioxo
Minus InfinityThe sad thing is AMD won't have mid-tier cards out until July at earliest. Even worse 7700XT which is the 6600XT replacement is only targeting 4060 Ti not even 4070. 7800XT is not even as fast as 4070 Ti according to internal leaks.

Anyway, the 6800 XT is the best bang for the buck and what I got to replace a 1080 Ti. I'm skipping this gen entirely and waiting to see RDNA4, Blackwell, and Battlemage. If Intel can deliver on the promise of hitting 4080 performance at, say, 40% less, it would be worth a look. If rumors are to be believed, the 5090 is getting a massive price rise, as Nvidia will be paying 50% more per wafer for TSMC 3 nm vs the current 4 nm they are using. Lovelace is just getting us ready for more sticker shock.
For Nvidia to hit a 2x performance increase, at least in RT, that die is going to have to be a massive 800 mm² ish, so it isn't going to be cheap, unless it's a 30-40% increase instead on a similar or slightly smaller die.

I still think whatever the 5070 ends up being, it should be ballpark 4080 performance. Will they ask $699 for it? I wouldn't bet against it, or it even being more expensive lol.

Battlemage I wouldn't hold my breath....

I do have hope that RDNA4 will be really good, especially if AMD figures out multiple GCDs per GPU; you would think that AMD is tired of losing.
I feel like RDNA2 was a big step up from their Vega/RDNA 1 days, and so far I feel like they have regressed with RDNA3, mostly because the 6800 XT/6900 XT competed better with the 3080/3090 than their current offerings do. I still have hope though.
ChryThanks, good to hear amd prices tend to lower a bit. Would like to get a 7700xt/7800xt this summer.
You are probably looking at October-November to get decent deals on those, assuming a June/July launch. That's just me speculating, going by how the prices on their other cards behaved, though; for all we know, another major chip shortage could happen for god knows what reason and GPU prices could double again.... Knocking on wood that doesn't happen.
Posted on Reply
#28
ValenOne
The major differences between the RX 7800 16 GB and RX 6800 16 GB would be improved ray tracing, AV1 encoding support, and lower power consumption. There's less need for AMD to release another x800 SKU when the RX 6800 already has 16 GB of VRAM.
Posted on Reply
#29
oxrufiioxo
rvalenciaThe major differences between the RX 7800 16 GB and RX 6800 16 GB would be improved ray tracing, AV1 encoding support, and lower power consumption. There's less need for AMD to release another x800 SKU when the RX 6800 already has 16 GB of VRAM.
I think AMD kinda put themselves in a bad spot with the 7900 XT by not making it a $700-ish GPU and the clear better option over the 4070 Ti. At this point it is just debatable what to go with between those two; I really couldn't fault someone for going either way, but due to Nvidia's mindshare most are likely leaning Nvidia.

Whatever the 7800 XT becomes, it can't be so good that it makes the $800 7900 XT irrelevant, but it also can't be so bad that it doesn't properly compete with the new 4070. Even at $500 with 4070-like performance I'm not sure most will care about it, even with 16 GB of VRAM, but the problem with making it any better than that is the 7900 XT.

I guess we will see.

Hopefully the midrange is way more interesting than the high end. So far not so much....
Posted on Reply
#30
mäggäri
MrDweezilIs that going to bring anything to the table besides better energy efficiency? The 40xx cards haven't done anything to move the existing price/performance curve and the 6xxx cards already occupy appropriate positions along it.
DLSS 3.0 keeps the 4000 series usable for a long time; it brings longevity for the card.

Likewise, 3000 series laptops can enjoy DLSS 2.0, which helps casual gaming over a longer period.

The 4070 non-Ti has a cute tiny PCB and 180 W usage in games, an interesting and compact option for small builds too. Didn't check yet if the board power limit can be reached with OC. The other way of optimizing is to aim for around 150 W but with 100% performance. :)
Posted on Reply
#31
InVasMani
Imagine selling an RX 580 in 2021 for $500 and in 2023 buying an RX 6800 for $470, with $30 left over for a crystal ball.
Posted on Reply
#32
JAB Creations
While people bitch and complain on here, my RX 6800 has given me over 60 FPS at 1440p with maxed settings in everything, so I have zero complaints. I got it when I finally had the money and it was ~$600 at the time. I don't regret it in the least. Since things are getting better for me this year, I plan on getting a 7900 XTX for the 24 GB when I upgrade to a proper 4K 144 Hz screen.
Posted on Reply
#33
ZoneDymo
Boring, it should be at most 300 bucks by now
Posted on Reply
#34
chowow
ha ha 4070 the new 3070 in a year guaranteed
Posted on Reply
#35
evernessince
mäggäriDLSS 3.0 keeps the 4000 series usable for a long time; it brings longevity for the card.

Likewise, 3000 series laptops can enjoy DLSS 2.0, which helps casual gaming over a longer period.

The 4070 non-Ti has a cute tiny PCB and 180 W usage in games, an interesting and compact option for small builds too. Didn't check yet if the board power limit can be reached with OC. The other way of optimizing is to aim for around 150 W but with 100% performance. :)
If your card struggles to even play a game, frame generation is the last thing you want to enable, as that's where the latency hit is the largest and where the visual artifacts are most visible. As HWUB pointed out, the ideal range for frame generation is 70-120 base FPS. Outside that it doesn't make much sense, and by extension it doesn't make sense that it'd extend the life of a card outside of niche scenarios. In addition, it will do nothing if your card runs out of VRAM, a scenario that's likely given the 4070 and 4070 Ti are already skirting the line in that regard.

If you are planning on keeping your card for a longer period of time, it's a far better idea to get one with 16 GB+ instead of relying on fake frames with serious drawbacks.
Posted on Reply
#36
oxrufiioxo
evernessinceIf your card struggles to even play a game, frame generation is the last thing you want to enable, as that's where the latency hit is the largest and where the visual artifacts are most visible. As HWUB pointed out, the ideal range for frame generation is 70-120 base FPS. Outside that it doesn't make much sense, and by extension it doesn't make sense that it'd extend the life of a card outside of niche scenarios. In addition, it will do nothing if your card runs out of VRAM, a scenario that's likely given the 4070 and 4070 Ti are already skirting the line in that regard.

If you are planning on keeping your card for a longer period of time, it's a far better idea to get one with 16 GB+ instead of relying on fake frames with serious drawbacks.
My biggest issue with DLSS3 isn't artifacts, it's the latency impact when running at less than 70 FPS. As long as you are in the 100-120 FPS range with frame gen it's fine, but any lower and the latency becomes an issue before the artifacts do.

Except in Spider-Man, which is the poorest implementation I've tried.

People really need to use it themselves before drawing conclusions though. When Nvidia announced it I thought it was pretty stupid, but after hands-on time I like it quite a bit.

At the very least I'm much less bothered by the artifacts DLSS3 produces compared to the shit SSR that is in most games.
Posted on Reply
#37
MarsM4N
oxrufiioxoThey should actually just release a real current generation competitor.... Not like the bar is super high.
Apparently they plan to release the rest of the 7000 series cards for the Chinese 618 Shopping Festival. ;) Which, guess what, is on June 18th.


As they're starting to sell off the 6000 series, it's most likely true. PCGH also has some specs posted. Don't know how accurate they are, but they look plausible.
Posted on Reply
#38
Unregistered
The 4070 is rubbish; it launched at the price of the 3080, ridiculous for such a tiny GPU. Maybe if it was a proper 70-class card beating the 3090 Ti it would be less insulting.
#39
oxrufiioxo
Xex360The 4070 is rubbish; it launched at the price of the 3080, ridiculous for such a tiny GPU. Maybe if it was a proper 70-class card beating the 3090 Ti it would be less insulting.
While I don't necessarily disagree, at least out here in the States it's 100 USD cheaper...

The one nice thing is at least it won't be sold out and then scalped into oblivion; keep in mind most spent well over $1k for their 3080s, same with the 6800 XT.
Posted on Reply
#40
Broken Processor
Great card, but pricey as expected, and only 12 GB of VRAM; I'm sorry, in Hogwarts Legacy I used nearly 14 GB of VRAM at one point, so I couldn't recommend this card to anyone.

I know people will argue "but ray tracing and DLSS 3", but imho when paying this amount for a card I don't expect to have to use artificial frame generation to make a game playable, and considering the cost of RAM this move comes across as artificial product segmentation and early obsolescence.
Posted on Reply
#41
BoboOOZ
oxrufiioxoDepends, amd gpu prices seem to drop faster than Nvidia. We usually don't see an Nvidia price drop unless the current model is getting replaced by a higher tier sku or a new generation is coming soon.
It's because there are higher margins on the AMD cards, so sellers can drop the prices a bit and still sell at a profit. For Nvidia, sellers do not have the same margin, so they can't drop prices unless they want to sell at a loss. Given that Nvidia cards make up most of the volume, nobody can really afford that. Hence the prices not budging.
Posted on Reply
#42
Unregistered
oxrufiioxoWhile I don't necessarily disagree, at least out here in the States it's 100 USD cheaper...

The one nice thing is at least it won't be sold out and then scalped into oblivion; keep in mind most spent well over $1k for their 3080s, same with the 6800 XT.
I would agree, but the issue with the 4070 is that it's already at scalped prices; it costs 30% more than what I paid for the 3070 in 2021 (keep in mind I paid a bit more than the regular price).
#43
john_
Here it is. An RX 6800 16 GB card drops to $469 and all you read in the comments is negativity.

I hope AMD brings nothing new for the next few years and never releases the other 7000 series cards for desktop, so everyone here nagging can go and pay $469 to Nvidia, instead of AMD, for an 8GB ultra-butchered RTX 4060.
I am pretty sure they will be super excited with Frame Generation and RT performance at mid settings at 1080p.
Posted on Reply
#44
mäggäri
evernessinceIf your card struggles to even play a game, frame generation is the last thing you want to enable, as that's where the latency hit is the largest and where the visual artifacts are most visible. As HWUB pointed out, the ideal range for frame generation is 70-120 base FPS. Outside that it doesn't make much sense, and by extension it doesn't make sense that it'd extend the life of a card outside of niche scenarios. In addition, it will do nothing if your card runs out of VRAM, a scenario that's likely given the 4070 and 4070 Ti are already skirting the line in that regard.

If you are planning on keeping your card for a longer period of time, it's a far better idea to get one with 16 GB+ instead of relying on fake frames with serious drawbacks.
I didn't say a word about FG, which is a new feature for the 4000 series.

DLSS is basically dynamic resolution; it is done differently on PS5, but the result is the same.


If VRAM were such an issue, all games would have become unplayable overnight. Lots of sheep have lost common sense and do not seem to understand what they read, nor what is presented in reviews. :)
Posted on Reply
#45
BoboOOZ
Xex360I would agree, but the issue with the 4070 is that it's already at scalped prices; it costs 30% more than what I paid for the 3070 in 2021 (keep in mind I paid a bit more than the regular price).
And it's also more cut down compared to the higher tiers; it's really a 3060 Ti-class die.
Posted on Reply
#46
Why_Me
Xex360The 4070 is rubbish; it launched at the price of the 3080, ridiculous for such a tiny GPU. Maybe if it was a proper 70-class card beating the 3090 Ti it would be less insulting.
The 4070 12 GB has an MSRP of $600 ... the 3080 10 GB has an MSRP of $700.
Posted on Reply
#47
mäggäri
john_Here it is. An RX 6800 16 GB card drops to $469 and all you read in the comments is negativity.

I hope AMD brings nothing new for the next few years and never releases the other 7000 series cards for desktop, so everyone here nagging can go and pay $469 to Nvidia, instead of AMD, for an 8GB ultra-butchered RTX 4060.
I am pretty sure they will be super excited with Frame Generation and RT performance at mid settings at 1080p.
No need to enable Frame Generation, enabling DLSS 3.0 will do fine. :)

Pro-AMD bois screaming about DLSS and twisting facts, when AMD was only able to produce FSR, which gives around 5 FPS and makes people stare at porridge-covered graphics.

I have been testing a 3070M at 140 W with DLSS in Warzone only, so it could be that I am very wrong with my thoughts. :)

According to the Gamers Nexus 3070 review, a nice card gets even nicer when tweaking or optimizing the perf/W. In their review, it was often shaking hands with the 3090, and more often with the 3080. Now imagine a 150 W 6800 XT or 3080; the laughter begins.

I just can't un-see the perf/W possibilities of this release. The 4070 FE cooler doesn't even have a vapor chamber and still cools easily enough. Good direction.
Posted on Reply
#48
john_
mäggäriwhen AMD was only able to produce FSR, which gives around 5 FPS and makes people stare at porridge-covered graphics.
Thank God we have objective posters like you.

Everyone says that the best thing about the 4070 is efficiency. Thank you, Captain Obvious.
I bet you have an Intel CPU,... for efficiency.
Posted on Reply
#49
Hecate91
mäggäriIf VRAM were such an issue, all games would have become unplayable overnight. Lots of sheep have lost common sense and do not seem to understand what they read, nor what is presented in reviews. :)
If you paid attention to what is being said in reviews, you would understand why VRAM is a problem. Running out of VRAM causes frame rate drops and worse 1% low frame rates.
Posted on Reply
#50
pavle
The RX 6800 XT is the sensible choice with its VRAM and raster ability, if one wishes not to climb aboard Nvidia's train to oblivion (if we skip the Fake_Frames™ feature entirely).
Posted on Reply