
AMD Radeon RX 6800

Bad value compared to the RTX 3070? Sorry, what?

The RX 6800 is 16% more expensive than the 3070 while being 6-9% faster. I also suggest you check the newest game benchmarks: out of the 4 games (Godfall, WD: Legion, AC: Valhalla and Dirt 5), AMD cards are crushing their NV counterparts in 3, so overall, the difference is even bigger. You can expect a 10-12% difference on average. It offers TWICE THE VRAM, which is really important at 4K, as there are already games like Doom Eternal which run out of 8 GB, and the 3070 takes a hit in performance there if you don't lower settings. The 2080 Ti was considered a 4K GPU. The 3070 is on the same performance level, so it is also a 4K GPU. However, the flagship Turing card had 11 GB of VRAM while the 3070 only has 8. That will be a problem in the long run at 4K. With the 6800, you won't have that problem. You also get significantly lower power consumption, so the 6800 crushes the 3070 in efficiency. The only difference is RT performance, but TBH, how many games support it? How many games have proper RT support that is really visible? Also, the 6800 can be OC'd better than the 3070... :D I can't even remember the last time an AMD card could OC better than its NV rival.


Same to you: 16% more expensive, 10-12% faster (taking the newest benchmarks into account), double the VRAM, significantly lower power consumption, lower temperatures, better OC. Don't the things mentioned besides performance justify the remaining 4-6% price difference?
If and only if raster gaming performance is what you’re looking for.

Set aside productivity performance and RT+DLSS, which Nvidia dominates anyway; there are also Nvidia's extra features like RTX Voice, RTX Broadcast, whatever other shit they come up with for their tensor cores in the future, hell, even an in-house game streaming solution (idk how to articulate this feature, just think of it as PC game streaming on mobile/laptop, separate from GeForce Now). AMD is pretty cocky asking 80 dollars more for fewer features. I'm not saying they should price it $50 less than the 3070, like the 6800 XT is doing to the 3080. But still, looking at the performance, I'd happily pay $20 more (at $520) or, if I'm fanboy enough, $50 more (at $550). But $80 just sounds too close to $100, and I'd rather buy a 3070 and Cyberpunk 2077.
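Fwiw, the 4-6% figure from a few posts up is just back-of-envelope math on the numbers quoted in this thread ($579 vs $499 MSRP, 10-12% faster); a quick sketch of it (these are the thread's numbers, not fresh benchmark data):

```python
# Back-of-envelope value math for the RX 6800 vs RTX 3070 debate.
# The MSRPs and the 10-12% performance gap are the figures quoted in
# this thread, not measured data.
msrp_6800 = 579
msrp_3070 = 499

price_ratio = msrp_6800 / msrp_3070          # ~1.16 -> "16% more expensive"
print(f"price premium: {price_ratio - 1:.1%}")

for perf_ratio in (1.10, 1.12):              # claimed 10-12% performance lead
    # premium left over once performance is accounted for
    residual = price_ratio / perf_ratio - 1
    print(f"perf +{perf_ratio - 1:.0%}: residual premium {residual:.1%}")
# comes out to roughly 3.5-5.5%, in the ballpark of the 4-6% quoted above
```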
 
Well, if you say that a 10% difference in performance is "the same", ok. However, I can't understand how you can say nothing about the 3070 having half the VRAM, which will definitely not be enough for 4K in the long run.
It's actually 8% in the best-case scenario (4K, RTRT disabled). That's how much that 8 GB of VRAM hinders the 3070.
Unfortunately, my crystal ball is in for repairs; I can't predict the future like you do.
 
The 6800 seems more worth it than the 3070 for the price. But the problem is they're nowhere to be found at MSRP. As for arguing that $80 is a lot for the jump from the 3070 to the 6800 because it's a lot for some people - that's a non-point. If you're struggling with money, you're not buying a fking $500 GPU in the first place. You sit happy with a budget build, or a gaming console as your only system, at best. Period.
 
Well, if you say that a 10% difference in performance is "the same", ok. However, I can't understand how you can say nothing about the 3070 having half the VRAM, which will definitely not be enough for 4K in the long run.

I wouldn't recommend an RTX 3070 or RX 6800 (even though the latter has 16 GB of VRAM) for 4K usage. Maybe 4K for slightly older games (or games modded with high-res texture packs) on the RX 6800. But for 1080p and 1440p? Definitely.
 
The 6800 is their high-end card that is going up against the 3080, not the 3070. If you don't want to spend the 80 bucks, then wait for the 6700. I hate to say this in a sentence, but $580 isn't bad for a high-end card with that amount of VRAM.
Oh yeah, forgot about the RX 6700 series. According to rumors, the 40 CU RDNA 2 cards are gonna clock even higher than the 6800 series ones. Guess AMD's gonna bracket the 3070 from below and above rather than compete against it directly (pricing-wise).
 
I wouldn't recommend an RTX 3070 or RX 6800 (even though the latter has 16 GB of VRAM) for 4K usage. Maybe 4K for slightly older games (or games modded with high-res texture packs) on the RX 6800. But for 1080p and 1440p? Definitely.

Why not? Most 4K gamers only want to hit 60 FPS, and once OC'd, I think it will hit that in every game across the board; it's only 5 FPS short in most games I see at 4K. You might have to lower settings just a smidge in some games, but it won't be noticeable, and you'll maintain 4K 60 FPS. Not bad IMO. AMD is kicking major butt lately, it's awesome.
 
I've been a long-time follower, and I really appreciate the quality of the GPU reviews here! :)
This time I am a bit confused about the power consumption values.
Average gaming power consumption for both AMD cards seems to be quite low compared to other reviews.
I suspect that this time the cards may be underutilized in Metro: Last Light at 1920x1080.
 
Why not? Most 4K gamers only want to hit 60 FPS, and once OC'd, I think it will hit that in every game across the board; it's only 5 FPS short in most games I see at 4K. You might have to lower settings just a smidge in some games, but it won't be noticeable, and you'll maintain 4K 60 FPS. Not bad IMO. AMD is kicking major butt lately, it's awesome.

If you're aiming at 4K at 60 FPS (without heavily dropping in-game graphics settings), then both should be fine, although the RTX 3070 would likely suffer sooner once all the VRAM is used up. I can see this being a good fit for HTPC setups.

But for 2160p75 or higher? Best to get a 6800 XT or RTX 3080.
 
The 6800 seems more worth it than the 3070 for the price. But the problem is they're nowhere to be found at MSRP. As for arguing that $80 is a lot for the jump from the 3070 to the 6800 because it's a lot for some people - that's a non-point. If you're struggling with money, you're not buying a fking $500 GPU in the first place. You sit happy with a budget build, or a gaming console as your only system, at best. Period.
Of course it's a point. You don't have to be struggling with money to see that you can put $80 to better use. For example, if you were building a whole system, that $80 could buy you a high-end CPU cooler or almost one TB of SSD storage.
 
Average gaming power consumption for both AMD cards seems to be quite low compared to other reviews.
I suspect that this time the cards may be underutilized in Metro: Last Light at 1920x1080.
Because Metro: Last Light isn't a permanent 100% load. It is very dynamic, with scenes that don't always use the GPU at 100%. This gives clever GPU algorithms a chance to make a difference. I found this a much more realistic test than just Furmark, or standing still in Battlefield V at highest settings.

Edit: added this on the power page
Our Metro Last Light power measurements seem to suit the Radeon RX 6800 series extremely well. It looks like AMD has mastered dynamically reacting to situations where GPU load isn't at 100% all the time, with sub-millisecond accuracy. This will benefit your normal gameplay sessions, too. That's the reason I picked Metro LL in the first place: it is a more realistic test scenario than power measurement with Furmark or standing still in Battlefield.
 
Because Metro: Last Light isn't a permanent 100% load. It is very dynamic, with scenes that don't always use the GPU at 100%. This gives clever GPU algorithms a chance to make a difference. I found this a much more realistic test than just Furmark, or standing still in Battlefield V at highest settings.
I do not contest this, but there could still be a CPU bottleneck at play here. If you attain the same values at higher resolutions, then your point is valid.
Power consumption tests at 1440p or 4k would be nice anyway, as an added nuance! ;)
 
I do not contest this, but there could still be a CPU bottleneck at play here. If you attain the same values at higher resolutions, then your point is valid.
Power consumption tests at 1440p or 4k would be nice anyway, as an added nuance! ;)
Already running :) Just for you
 
You are too kind! :D
bah .. time to retest at 1440p... will first check what's happening with nvidia in the same test

[attached graph]


i always use "warm" results btw
 
3070 gives you virtually the same performance level (and better RTRT) for $80 less. That's a lot of money for many people.
Of course there's the matter of stock, but that will sort itself out eventually.

Fwiw, I'm one of those that always advocates getting the more efficient card for a few $ more, but this is too steep for me.

The frametime variance seems much lower on the RX 6800 series vs the Nvidia lineup, which should result in much smoother gameplay. If this holds for most of the tested games, then the 6800 will give you a much better experience over the 3070 than the percentage difference suggests.
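To make the frametime-variance point concrete (the traces below are invented for illustration, not taken from the review): two cards can post the same average FPS while one paces frames far more evenly.

```python
import statistics

# Hypothetical frametime traces in milliseconds. These are invented
# numbers, not measured data. Both average ~16.7 ms, i.e. ~60 FPS.
steady = [16.7] * 8            # even pacing, feels smooth
uneven = [10.0, 23.4] * 4      # same average, alternating fast/slow frames

for name, trace in (("steady", steady), ("uneven", uneven)):
    avg = statistics.mean(trace)
    dev = statistics.pstdev(trace)   # lower deviation = smoother gameplay
    print(f"{name}: avg {avg:.1f} ms, frametime stdev {dev:.2f} ms")
```

Same average FPS, very different standard deviation, which is why averages alone can hide stutter.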
 
bah .. time to retest at 1440p... will first check what's happening with nvidia in the same test

[attached graph]
I really have to tip my hat to you for this reaction! :respect:
 
Every retailer on the planet sells the RTX 3070 at a much higher price than MSRP, yet most of you keep quoting it. The prices will not go down even when the market is saturated, as Gamers Nexus presented in their RTX 3000 MSRP video. Nvidia's profit margin is over 9000!!! :)) ... I just managed to buy an RX 6800 at launch for £530.00 and I'm happy :p
 
I really have to tip my hat to you for this reaction! :respect:
that's the 3080
[attached graph]


not sure what to do now? if i had to start from scratch i would probably retest at 1440p nowadays, but testing the 6800 XT at 1440p and the 3080 at 1080p would be unfair

edit: now that I'm thinking about this, doesn't it look like AMD simply deals better with the situations where load isn't 100% at 1080p?
 
that's the 3080
[attached graph]


not sure what to do now? if i had to start from scratch i would probably retest at 1440p nowadays, but testing the 6800 XT at 1440p and the 3080 at 1080p would be unfair
That's a tough one, but I suppose, for the sake of completeness, an added graph with average gaming power consumption values at 1440p (or even 4k) would clarify things a bit.
It seems that this new generation of graphics cards cannot be fully saturated by the current CPUs at 1080p.
It is your call, of course, if you are willing to add these retroactively to the nvidia reviews. It seems like a lot of work...
 
That's a tough one, but I suppose, for the sake of completeness, an added graph with average gaming power consumption values at 1440p (or even 4k) would clarify things a bit.
It seems that this new generation of graphics cards cannot be fully saturated by the current CPUs at 1080p.
It is your call, of course, if you are willing to add these retroactively to the nvidia reviews. It seems like a lot of work...
i guess this thing is evolving into "what do we want to test for power?". i would definitely throw out "cold card", but the other 3 results are all valid in their own way. opinions?

[attached graph]
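For anyone wondering how a "cold card" number drags the average down, a toy sketch (the one-reading-per-second power trace and the 10-second warm-up cutoff are invented assumptions for illustration, not the review's actual method):

```python
# Toy "cold vs warm" power averaging. The trace (one reading per second)
# and the 10 s warm-up cutoff are invented for illustration only.
trace_w = [150, 165, 180, 195] + [210] * 26   # ramps up, then settles warm

cold_avg = sum(trace_w) / len(trace_w)        # includes the ramp-up
warm = trace_w[10:]                           # drop the first 10 s
warm_avg = sum(warm) / len(warm)
peak = max(trace_w)

print(f"cold-start avg: {cold_avg:.1f} W")    # pulled down by the ramp
print(f"warm avg: {warm_avg:.1f} W")
print(f"max: {peak} W")
```

The cold-start average lands below the warm steady-state number, which is the argument for throwing the "cold card" result out.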
 
3070 gives you virtually the same performance level (and better RTRT) for $80 less. That's a lot of money for many people.
Of course there's the matter of stock, but that will sort itself out eventually.

Fwiw, I'm one of those that always advocates getting the more efficient card for a few $ more, but this is too steep for me.
If an extra $80 is a lot of money for someone willing to spend $500 on a video card, then they really can't afford a $500 video card either.

AMD/Radeon AIBs should be able to sell every 6800 series 16 GB card available for the next several weeks, at least. There is no pressure to price it lower, or offer a cheaper 8 GB version, at this time.

According to this review, it appears the 6800 has better RT performance than a 2080 Super at resolutions with playable framerates. So it appears that AMD's slowest 6800 offering delivers better RT performance than ~99% of NVIDIA cards with dedicated RT hardware sold to date. Could the 6800 series' RT performance have been better? Sure. Is the 6800 series a terrible value because it only outperforms ~99% of all NVIDIA RT cards sold to date? Probably not.
 
i guess this thing is evolving into "what do we want to test for power?". i would definitely throw out "cold card", but the other 3 results are all valid in their own way. opinions?
I think nobody would be bothered by 2 extra graphs at 1440p and 4k for gaming power consumption, as 1080p, 1440p and 4k are all valid use case scenarios. I would find this approach (3 graphs) quite useful, as I nowadays mostly game at 1440p high refresh rate, with the occasional 4k/60. No more 1080p for me. :D
It would present a more nuanced picture of gaming power consumption.
In summary: average power for warm cards at 1080p, 1440p, 4k + max gaming power at 4k.
 
In summary: average power for warm cards at 1080p, 1440p, 4k + max gaming power at 4k.
i think that would be too much data? because we also have idle, multi-monitor, media playback and furmark
 
i think that would be too much data? because we also have idle, multi-monitor, media playback and furmark
If I had to choose one, I would choose the 4K data, as it is the most GPU-bound result and frankly still valid gaming power consumption, even if it pushes the power load.

i guess this thing is evolving into "what do we want to test for power?". i would definitely throw out "cold card", but the other 3 results are all valid in their own way. opinions?

[attached graph]
6800XT is still more power efficient at higher resolutions, even if not by that much. ;)

i guess this thing is evolving into "what do we want to test for power?". i would definitely throw out "cold card", but the other 3 results are all valid in their own way. opinions?

[attached graph]


edit: now that I'm thinking about this, doesn't it look like AMD simply deals better with the situations where load isn't 100% at 1080p?
Looking at the overlapped graphs, it does seem as if that is the case (AMD dealing better with lower loads), but averages would be helpful here.
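On the efficiency angle from the posts above: perf-per-watt is just average FPS divided by average board power. A sketch with placeholder numbers (card names and figures are generic, not the review's measurements):

```python
# Perf-per-watt sketch. The FPS and wattage values are placeholders
# chosen for illustration, not figures from this review.
cards = {
    "card A": {"fps": 100.0, "watts": 210.0},
    "card B": {"fps":  92.0, "watts": 220.0},
}

for name, d in cards.items():
    eff = d["fps"] / d["watts"]      # FPS per watt = frames per joule
    print(f"{name}: {eff:.3f} FPS/W")

eff_a = cards["card A"]["fps"] / cards["card A"]["watts"]
eff_b = cards["card B"]["fps"] / cards["card B"]["watts"]
print(f"efficiency lead of card A: {eff_a / eff_b - 1:.1%}")
```

With these placeholder numbers, a card that is slightly faster while drawing slightly less power ends up with a double-digit efficiency lead, which is why the ratio matters more than either number alone.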
 
Damn, the 6800 looks like a sweet upgrade for me, even though I think that the card is priced $50 higher than where it should be.

Anyone want to buy my 5700 XT with Bykski water block???
 