
AMD Radeon RX 6800

Looks like both cards are out of stock. Hopefully availability will be better next week when the board partner cards release.
 
I'm curious how AMD's take on DLSS will end up. Will it be developer-dependent on a game-by-game basis, or game-agnostic and working retroactively on all games, more like how mCable works but obviously much more sophisticated and advanced? I'd rather have the latter, even if the performance gain were half that of DLSS at similar image quality. Having it just work on all titles is a hell of a lot more useful in the grand scheme. Both are useful, but in the big picture the way DLSS works is less optimal than the way mCable handles upscaling from an approach standpoint, since mCable just works, ahem, like Jensen touted RTX would, ironically. Let's hope AMD's approach is closer to mCable's, as that would be more ideal. Where DLSS has its strength and merit is closer to AMD's Mantle: squeezing and eking out that extra bit of closer-to-the-metal performance, though it requires developer attention at the same time, which is a big drawback.
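To make the distinction concrete, here's a minimal sketch of the two models as I understand them; every function and field name below is hypothetical, just to contrast a game-agnostic post-process with engine integration:

```python
# Hypothetical contrast between driver-level (game-agnostic) upscaling and
# DLSS-style engine integration. All names and structures are made up.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Frame:
    color: bytes                            # final rendered image
    motion_vectors: Optional[bytes] = None  # only present with engine integration
    depth: Optional[bytes] = None

def spatial_upscale(color: bytes) -> bytes:
    return color  # stand-in for a simple post-process filter (mCable-like)

def temporal_upscale(color: bytes, motion: bytes, depth: bytes) -> bytes:
    return color  # stand-in for a motion-vector-aware reconstruction (DLSS-like)

def driver_level_upscale(frame: Frame) -> bytes:
    """Game-agnostic path: only the final color buffer is needed,
    so it works on any title with zero developer effort."""
    return spatial_upscale(frame.color)

def engine_integrated_upscale(frame: Frame) -> bytes:
    """Per-game path: the engine must supply motion vectors and depth,
    enabling higher quality but requiring developer integration."""
    assert frame.motion_vectors is not None and frame.depth is not None
    return temporal_upscale(frame.color, frame.motion_vectors, frame.depth)
```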

I wonder if AMD's upscaling can be combined with Radeon Boost; it could be used either before or after it. Two options seem possible: upscale from native resolution to a target, which by extension gets applied to Radeon Boost as well, or downsample with Radeon Boost first and then upscale only the in-motion resolution. The latter option would give a cleaner, more pure stationary image, but a smaller combined performance gain. Additionally, either option could support custom, arbitrary frame-rate targets, giving more optimal results when and where needed for image quality in relation to motion and input lag (see the sketch below). Combine that with variable rate shading able to use similar frame-rate-triggered scopes and it could have a really big impact on general performance and hardware utilization.
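Here's a minimal sketch of the frame-rate-triggered logic I have in mind; the target, thresholds, and function name are all made up for illustration, not how Radeon Boost actually works internally:

```python
# Hypothetical frame-rate-targeted dynamic resolution. The stationary case
# stays at native; in-motion frames that miss the frame-time target render
# at reduced resolution and would then be upscaled back to native.
TARGET_FPS = 90
NATIVE_RES = (2560, 1440)
MIN_SCALE = 0.5  # never drop below 50% of native per axis

def choose_render_resolution(frame_ms: float, in_motion: bool) -> tuple:
    target_ms = 1000.0 / TARGET_FPS
    if not in_motion or frame_ms <= target_ms:
        return NATIVE_RES
    scale = max(MIN_SCALE, target_ms / frame_ms)
    return (int(NATIVE_RES[0] * scale), int(NATIVE_RES[1] * scale))

# A 16 ms frame with the camera moving gets scaled down; a stationary one doesn't.
print(choose_render_resolution(16.0, in_motion=True))   # roughly (1777, 1000)
print(choose_render_resolution(16.0, in_motion=False))  # (2560, 1440)
```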
 
Really excited, as this offering is great.
 
I guess this thing is evolving into "what do we want to test for power?" I would definitely throw out "cold card", but the other three results are all valid in their own way. Opinions?

[attached chart: power consumption results for the different test scenarios]

I came on with very similar questions to Bobmeix - I'd been thinking the power consumption figures looked suspiciously good. I'll also reiterate his comments about how great your reviews and work are - the best GPU reviews on the web in my view.

I think it's worth noting that performance per watt is also impacted by this, and that it's calculated separately for 1080p, 1440p, and 4K. Taking power consumption figures at 1080p and using them for the 1440p and 4K performance-per-watt charts seems a bit arbitrary now that we know there are significant differences in power consumption at different resolutions. Depending on the amount of work you are willing to do, the options look to me to be:
  1. Most work, most accurate: run power consumption at all three resolutions, publish all three, and use resolution-specific figures in performance per watt (rough calculation sketch below)
  2. Middle ground: add either 1440p or 4K to the power consumption test and take the average of the two resolutions. 4K seems the obvious choice, as you get both ends of the spectrum that way
  3. Least work, least accurate: move from 1080p to 1440p for the core benchmark, on the assumption that 1440p is likely to sit in the middle of any performance gradient like we're seeing here
I think any of these would be an improvement - how many people are realistically going to buy a 240-360 Hz monitor to justify a 3080 or 6800 XT at 1080p? 1440p and 4K are the more obvious use cases...
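For what it's worth, here's a rough sketch of what option 1 looks like on the calculation side; the FPS and power numbers are placeholders, not the review's data:

```python
# Resolution-specific performance per watt (option 1 above).
# All figures are placeholder values, purely for illustration.
avg_fps = {          # average FPS across the test suite, per resolution
    "1080p": 150.0,
    "1440p": 110.0,
    "4K":     65.0,
}
avg_power_w = {      # average gaming power draw measured at that same resolution
    "1080p": 190.0,
    "1440p": 205.0,
    "4K":    215.0,
}

for res in avg_fps:
    print(f"{res}: {avg_fps[res] / avg_power_w[res]:.2f} FPS per watt")
```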
 
Damn, look at that efficiency! That's an insane performance-per-watt showing from AMD. I thought I was reading it wrong at first. Also, the overclocking gains are not bad at all at 9%+. Seems like a better buy than the 6800 XT imo, especially since with RT on they perform somewhat similarly...
 
Yeah, more VRAM is nice, but it shouldn't be $70 more. As for Smart Access Memory (or Resizable BAR), Nvidia has already assured people that they will support the feature soon.

Ridiculous. Before AMD did it, people were paying $70 more to go from 4 GB to 8 GB; now you get 16 GB total and you are complaining. I can't even...
 
The frametime variance seems much lower on the RX 6800 series vs. the Nvidia lineup, which should result in much smoother gameplay. If this holds for most of the tested games, then the 6800 will give you a much better experience than the 3070, beyond what the raw percentage difference suggests.
Only in some titles, according to this: https://www.techpowerup.com/review/amd-radeon-rx-6800/39.html
And in some titles Nvidia still gets ahead.
Not worth an extra $80 to me.
 
The 6800 is meh.
In most games it's inferior to the 3070.
If no 6800 Nano is released, then a modded ZOTAC 3070 will currently be the best option for ITX builds.

But there's good news about the 6800: it has low power spikes, only 423 W within 130 µs.
That means we can use a Corsair SF450 on it with no risk!
 
I had no problem getting one
Look at this guy ^ He's so happy it's unbearable. I've lost count of how many threads he's visited to reply that he managed to snag a card. :kookoo::laugh:
 
The performance-per-watt calculation is really off when you base the power consumption on a much older game at 1080p and divide it by the average FPS of all games tested.
Does the 6800 use 210 W in new games like Metro Exodus, AC Valhalla, etc.? If it doesn't, then the power consumption figure is off, and so is the performance per watt.

Maybe we could use a watt-per-FPS figure instead: lock a game to 120 FPS and measure the average power consumption. That's still a valid comparison, since many people play with a locked frame rate.
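Something like this, as a rough sketch; the power figures are placeholders, not measurements:

```python
# Hypothetical watt-per-FPS comparison at a locked frame rate.
# The power numbers are made up for illustration only.
LOCKED_FPS = 120  # every card renders the same frame rate

avg_power_at_locked_fps_w = {
    "RX 6800":  170.0,
    "RTX 3070": 200.0,
}

for card, watts in avg_power_at_locked_fps_w.items():
    print(f"{card}: {watts / LOCKED_FPS:.2f} W per FPS")
```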
 
I guess this thing is evolving into "what do we want to test for power?" I would definitely throw out "cold card", but the other three results are all valid in their own way. Opinions?

[attached chart: power consumption results for the different test scenarios]

Purely IMHO: power consumption at the resolution that makes the most sense for that particular card? E.g. a 4K card at 4K, and a cheap 1080p card at 1080p. Or would that make the comparison unfair?
 
I like the 6800, but it should have cost $499, not $579. The value is not good vs. the 6800 XT and 3070, unless the exceptional power consumption is important to you.

The 6800 should be a great chip for notebooks: make a slightly downclocked 110-120 W variant and it should beat any existing notebook GPU by far.
 
Purely IMHO: power consumption at the resolution that makes the most sense for that particular card? E.g. a 4K card at 4K, and a cheap 1080p card at 1080p. Or would that make the comparison unfair?
That is the big question, and where to draw the line. Random example: 2080 Ti, 4K or 1440p?
 
Like I feared. Bad value compared to the RTX 3070 across the board. But I think AMD is going to drop the price when the RTX 3070 Ti launches, because unlike Nvidia, AMD can't squeeze in another SKU.

They are going to have to move on price, but a big factor this time around is availability, and that impacts price too.

Still, 8-10 GB is not a great amount of VRAM for this performance level; we're already seeing games allocate those numbers today.

That is the big question, and where to draw the line. Random example: 2080 Ti, 4K or 1440p?

The answer is people need to be educated on what numbers they're really looking at.

CPU performance is not going up as fast as GPU performance. Resolutions have made a quadruple jump in pixel count from 1080p to 4K over the course of a few generations. Yes, this is going to shift the balance around in strange ways... it's like reading the news. Some people just read headlines and misinterpret the better half of them, others read the article to figure out what's really happening.

If you don't draw the line there, any news outlet is in a race to the bottom. You can't cater to stupid - we have social media for that.
 
Over 2.5 GHz overclocked, almost 25% higher clocks compared to the previous generation; that's crazy.

I wouldn't recommend an RTX 3070 or RX 6800 (even if the latter has 16 GB of VRAM) for 4K usage. Maybe 4K in slightly older games (or games modded with high-res texture packs) for the RX 6800. But for 1080p and 1440p? Definitely.

The 6800 is totally capable of 4K.

The performance-per-watt calculation is really off when you base the power consumption on a much older game at 1080p and divide it by the average FPS of all games tested.
Does the 6800 use 210 W in new games like Metro Exodus, AC Valhalla, etc.? If it doesn't, then the power consumption figure is off, and so is the performance per watt.

Maybe we could use a watt-per-FPS figure instead: lock a game to 120 FPS and measure the average power consumption. That's still a valid comparison, since many people play with a locked frame rate.

What a bizarre and contrived way to try and invalidate AMD's far superior performance/watt, you gave me a good laugh.
 
The 6800 is totally capable of 4K.
I kinda agree with him. Looking at the graphs, the 6000 series seems bandwidth-starved at 4K.
While the 3000 series is no better, with its stingy VRAM sizes.
 
The 6800 is their high-end card that is going up against the 3080, not the 3070.
The benchmarks don't show that. The 6800 seems to be firmly a 3070 competitor, as it's wedged in between the 3070 and 3080 performance-wise. AMD would be wise to adjust their prices quickly. And let's face reality, AMD has to be planning the 6900 & 6900 XT. Those will be the models that compete with (or even beat out) the 3080, 3080 Ti, and 3090.
I hate to say this in a sentence, but $580 isn't bad for a high-end card with that amount of VRAM.
Why would you hate to say that? It's an excellent point! The 6800 & 6800 XT are going to be a godsend for anyone wanting to do rendering and other tasks that require high amounts of VRAM. In that context, the price is excellent and the 6800 is a very well-balanced card for the money.

I said this over in the 6800 XT review, but it bears repeating: welcome back to the GPU performance party, AMD! Very nice, this!
 
I kinda agree with him. Looking at the graphs, the 6000 series seems bandwidth-starved at 4K.

I see no real evidence for that. Even in Red Dead Redemption 2, one of the worst games to run at 4K, the performance scales the same way it does for something like a 3090, which has a lot more DRAM bandwidth. Had bandwidth starvation been the case, performance at 4K should be much worse.
 
That is the big question, and where to draw the line. Random example: 2080 Ti, 4K or 1440p?
I still believe that three graphs for average gaming power would clarify that. It's better to have more info than less.
It would also make the performance-per-watt graphs more relevant!
 
[attached chart: power draw across all games and resolutions on the 6800 XT]

This is all my games at all resolutions on the 6800 XT. I think for the next round of retesting I'll revise my power testing to be a bit more demanding.
 
I see no real evidence for that. Even in Red Dead Redemption 2, one of the worst games to run at 4K, the performance scales the same way it does for something like a 3090, which has a lot more DRAM bandwidth. Had bandwidth starvation been the case, performance at 4K should be much worse.
I'm speaking from the overall benchmarks I've seen so far. Let's treat the 3080 and 6800 XT as being in the same performance ballpark, and 1440p as our control environment.
1) In games where the 6800 XT leads the 3080 at 1440p, at 4K the 3080 leads or the difference is minuscule.
2) In games where the 6800 XT is a hair slower than the 3080, at 4K the 3080 leads by more.

The 6800, on the other hand, is faster than the 3070 across the board, but at 4K the gap narrows.
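Just to put rough numbers on what I mean by the gap narrowing (these FPS values are made up, purely to illustrate the pattern):

```python
# Made-up FPS values showing how a relative lead can shrink at 4K.
fps = {
    "1440p": {"RX 6800": 110.0, "RTX 3070": 100.0},
    "4K":    {"RX 6800":  62.0, "RTX 3070":  59.0},
}

for res, cards in fps.items():
    lead = (cards["RX 6800"] / cards["RTX 3070"] - 1) * 100
    print(f"{res}: RX 6800 ahead by {lead:.1f}%")  # ~10% at 1440p vs ~5% at 4K
```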
 