
AMD Radeon RX 6800 XT

please elaborate
The 6800 bar uses a different color, and going by color alone it's not possible to say whether it is with or without RT.
But it's just one line that needs reading, so it's still quite easy to perceive.

Compare it to say this color scheme, where you see, hey, more saturated vs less saturated bars:

1605772639573.png
 
The 6800 bar uses a different color, and going by color alone it's not possible to say whether it is with or without RT.
But it's just one line that needs reading, so it's still quite easy to perceive.
Oh I see. Our standard graph colors are green for primary tested, light blue for secondary tested, so I wanted to keep that highlighting scheme to guide the eye

I think this is worse, especially if you take it in the context of the other charts in the review
co5gblv9ja.jpg
 
TLDR:
  • Truly stellar rasterization performance per watt thanks to the 7 nm node.
  • Finally solved multi-monitor idle power consumption!
  • Quite poor RTRT performance (I expected something close to the RTX 2080 Ti; nope, far from it).
  • No tensor cores (they are not just for DLSS but for various handy AI features like background noise removal, background image removal, and others).
  • Horrible H.264 hardware encoder (x264 is a software encoder; it's the hardware block that's weak here).
  • Almost no attempt at creating decent competition. I remember when AMD used to fiercely compete with NVIDIA and Intel. AMD 2020: profits and brand value (fueled by Ryzen 5000) first.
Overall: kudos to the RTG marketing machine, which never fails to overhype and underdeliver. In terms of future-proofing, people should probably wait for RDNA 3.0 or buy ... NVIDIA. Even 3-5 years from now you'll be able to play the most demanding titles thanks to DLSS, simply by lowering the render resolution.

The 4 titles they want you to play, yes. Enjoy that ray-traced, DLSSed, Nvidia-chooses-what-games-you-play GPU, with power consumption worse than anything AMD ever managed to produce.

I'll take the one that's just going to give the best perf/dollar. They can stick that RT nonsense where the sun doesn't shine for now. There's still barely any content, and what's there is completed in a few hours anyway. And even so, the card still runs it. With regard to RT, little has changed since Turing, and console ports will do RT just fine on either GPU regardless, because they do on the console as well.

I think AMD picked a fine balance in that regard. Low on RT perf, stellar where the performance really matters. No tensor cores :oops:... they only serve the vastly underused DLSS. Stop fooling yourself with things that are really irrelevant to the segment the product's for. The feature-set difference is just not there anymore unless you really want to bitch about it. It's a bit sad, even. The hardware decoder, okay. But the rest? Please...

Nah... after a long time in the green zone, it's switching time for me... I was never a fan of hot, power-hungry parts... this hasn't changed.

Well, it looks like FidelityFX Super Resolution, just like DLSS, works on a game-by-game basis.

At least I hope they invest all those Ryzen profits in game development.

It's the usual "AMD has to have it too" movement, and this alone signifies it'll die off eventually. Perhaps we should view it as an additional tool to make high-profile games look better, or to counteract their shitty optimization / the RT perf hit.

Neither company is going to happily crunch down everything forever.

You are right about the stock, but I still see no reason to buy 6800XT over 3080.

For just $50 more, you will get:

1) Better RT
2) DLSS
3) More stable drivers.

And +30% power consumption, so it's quite a bit more than 50 bucks, even if your power is cheap. You can safely double that over the lifetime of the GPU.

But let's face it, if you're a value king you just go for the 6800 or further down the stack. The 3070 is also much better value.
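For what it's worth, the lifetime-cost claim above is easy to sanity-check with rough numbers (every figure below is an illustrative assumption, not a measurement):

```python
# Rough sketch of the extra electricity cost of a hungrier card.
# All figures here are assumptions chosen for illustration only.
extra_watts = 70        # assumed extra average draw under load
hours_per_day = 4       # assumed gaming time per day
years = 3               # assumed ownership period
price_per_kwh = 0.30    # assumed EU-ish electricity price

extra_kwh = extra_watts / 1000 * hours_per_day * 365 * years
extra_cost = extra_kwh * price_per_kwh
print(round(extra_cost))  # about 92 over three years
```

Under these assumptions the extra draw does indeed eat up roughly the $50 price gap a second time, though cheaper electricity or lighter use shrinks the figure quickly.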
 
Oh I see. Our standard graph colors are green for primary tested, light blue for secondary tested, so I wanted to keep that highlighting scheme to guide the eye

I think this is worse, especially if you take it in the context of the other charts in the review
co5gblv9ja.jpg
Agreed, that one is worse.
 
@W1zzard
The frametime graphs are a little confusing at best (sorry, I have to say that).
At least try to match color with brand: AMD = orange/red, NVIDIA = green (or gray).
It would also be useful to put all the graphs of one comparison next to each other, varying only the game titles. Actually you have:
BF5 6800xt vs 3080
Borderland 3 6800xt vs 6800
Borderland 3 6800xt vs 3070
Borderland 3 6800xt vs 3080
Civ 6 6800xt vs 3080
DMC 5 6800xt vs 3080
Doom E 6800xt vs 3080
FC 5 6800xt vs 3080
G5 6800xt vs 6800
G5 6800xt vs 3080
G5 6800xt vs 2080S
G5 6800xt vs 2080Ti
ME 6800xt vs 3080
SB 6800xt vs 3080
TR 6800xt vs 3080
W3 6800xt vs 6800
W3 6800xt vs 3070
W3 6800xt vs 3080
W3 6800xt vs 2080Ti

What's the logic of this order? I think it would be best to have 6800 XT vs 3080 (all games), then 6800 XT vs 2080 Ti (all games?), and then 6800 XT vs 6800 (all games?).
The 3070/2080S is better suited against the 6800.
 
1) Better RT in a handful of NVIDIA-sponsored games, but likely not in newer games like Dirt 5 that use DXR 1.1
2) DLSS, a TAA-derivative upscaler that blurs stuff, but it will be countered by AMD anyhow
3) More stable drivers: FUD, and a pro-green post would not feel complete without it

FTFY
 
"Not great not terrible"

So AMD is back in business at the high end, which is good for the end consumer.

NV is a little bit cornered; they knew before launching their new cards that competition exists, which is why we saw those "low" prices which killed their previous pricing. I don't think they will cut the prices of the 3070 & 3080, as those are positioned well perf/price-wise (not to mention they are quite expensive hardware-wise); only the 6900 will be an issue for them, as it seems it will be really close to the 3090 considering current results...

Overall good news, and when cards are plentiful, older ones will get cheaper... I'll wait for a cheap 5700 XT or 2080 when they go down to 300-350 €... I'm patient..
 
Dirt 5 is interesting to me. While I don't know whether it's the DXR 1.1 implementation or the relatively light ray tracing effects that lead to better performance on 6000-series cards, I would like reviewers to do an in-depth test on it.

Now, I don't usually get between pointless banter among emotionally charged fans, but I gotta stop you there, bud. DLSS (or at least the 2.0 revision) is awesome. I don't give a rat's arse about ray tracing, but DLSS is mind-blowing. And what converted me?
This video:
 
I don't usually get between pointless banter among emotionally charged fans
You are so classy.

mind-blowing
It is in the eye of the beholder, nothing to argue about.
It adds blur, it wipes fine details, it is mostly TAA, not NN (DLSS 1.0 was pure NN), but as all TAA derivatives, it has its uses.

Referencing it by upscaled resolution is spreading FUD.

The Ars Technica review is also a good one; it shows it beating the 3090 in 5 titles. Meh, I'm gonna have fun gaming, enough of this nonsense debating.

In the ComputerBase review they had a separate section for newer games, and it looks pretty sunny for AMD:

1605780865616.png
 
Well, slower than the 3080 while similarly unobtainable, and with fewer features. I don't care about ray tracing, but I have a 3840x2160 120 Hz screen to feed with frames, and the 6800 XT just doesn't cut it. Also, DLSS will probably future-proof the 3080 much better as far as achieving playable framerates goes. I have to say this launch is heavily overhyped. It's a good GPU, which couldn't be said about AMD products for years, but not better than the competition. So, I won't cancel my 3080 order.
 
You are so classy.


It is in the eye of the beholder, nothing to argue about.
It adds blur, it wipes fine details, it is mostly TAA, not NN (DLSS 1.0 was pure NN), but as all TAA derivatives, it has its uses.

Referencing it by upscaled resolution is spreading FUD.



In the ComputerBase review they had a separate section for newer games, and it looks pretty sunny for AMD:

View attachment 176215
Well, I won't argue if DLSS is TAA or NN since I'm not knowledgeable about it but like 2kilksphilip said, I think it's awesome as a pure anti-aliasing option. After killing MSAA, modern AA has never been the same. DLSS as AA replacement changes that for me.
 
I won't argue if DLSS is TAA or NN since I'm not knowledgeable about it
Let's not pretend it is a secret, shall we:

So for their second stab at AI upscaling, NVIDIA is taking a different tack. Instead of relying on individual, per-game neural networks, NVIDIA has built a single generic neural network that they are optimizing the hell out of. And to make up for the lack of information that comes from per-game networks, the company is making up for it by integrating real-time motion vector information from the game itself, a fundamental aspect of temporal anti-aliasing (TAA) and similar techniques. The net result is that DLSS 2.0 behaves a lot more like a temporal upscaling solution, which makes it dumber in some ways, but also smarter in others.
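The temporal mechanism that quote describes can be illustrated with a toy 1-D reprojection (a conceptual sketch only; the actual DLSS 2.0 network and heuristics are proprietary, and every name below is made up):

```python
# Toy temporal reprojection: use per-pixel motion vectors to fetch
# last frame's color, then blend it with the current sample. This is
# the TAA-style scaffolding the quote refers to; DLSS 2.0 replaces a
# fixed blend like this with a learned one.
def reproject_blend(prev_frame, curr_frame, motion, alpha=0.1):
    out = []
    for x, curr in enumerate(curr_frame):
        src = x - motion[x]          # where this pixel was last frame
        if 0 <= src < len(prev_frame):
            history = prev_frame[src]
            out.append(alpha * curr + (1 - alpha) * history)
        else:                        # disocclusion: no valid history
            out.append(curr)
    return out
```

The interesting part is the fallback branch: where motion vectors point outside the previous frame (disocclusions), there is no history to reuse, which is exactly where TAA-style techniques, DLSS included, tend to show artifacts.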



I think it's awesome as a pure anti-aliasing option.
And there is nothing wrong with that, especially if you judge it first-hand, and not because "some youtuber said so".
 
Amazing performance per watt, and overall performance itself. It is great to see AMD near the top.

But raytracing performance is pretty bad, and right now there is no "DLSS" to help with that. A good first step, though.

I hope to see some price wars with NVIDIA and Intel next year.
 
@W1zzard I have a question about ROPs.
In the article it is stated that there are 128 of them, yet GPU-Z only states half of that.
And I can't see AMD stating anywhere that it has 128.
Thanks for your time.
 
@W1zzard I have a question about ROPs.
In the article it is stated that there are 128 of them, yet GPU-Z only states half of that.
And I can't see AMD stating anywhere that it has 128.
Thanks for your time.
GPU-Z is wrong, this will be fixed in next version.

Reason is that AMD's driver reports the wrong number to GPU-Z. AMD has doubled the pixel output per ROP unit with RDNA2, but the AMD driver doesn't take this doubling into account yet. They'll certainly fix it at some point, for now I added code to GPU-Z: "If Navi21 & (RopCount == 48 or RopCount == 64) , Then RopCount = RopCount * 2".

hf7ooepnqm.jpg
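The workaround described above can be sketched roughly like this (a hypothetical simplification; GPU-Z's actual internals are not public):

```python
# Sketch of the described workaround: on Navi 21 the AMD driver
# reports the pre-doubling ROP count, so double it to get the
# effective value (RDNA2 doubles pixel output per ROP unit).
def fix_rop_count(is_navi21: bool, reported_rops: int) -> int:
    if is_navi21 and reported_rops in (48, 64):
        return reported_rops * 2   # 96 or 128 effective ROPs
    return reported_rops
```

Gating on the two known driver-reported values (48 and 64) means the patch becomes a no-op once AMD fixes the driver to report the doubled count itself.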
 
Can't actually buy a 6800 XT right now, so comparing regional prices is a moot point.
I don't see anything wrong with buying either the 6800 XT or the 3080 if they are available where you live; these two are so close, and they represent the best GPUs from Nvidia and AMD.

AMD has done the impossible here. I was thinking that the 128 MB Infinity Cache might have some downfall in the frametimes department, but it is rock solid there.

Now the question is: can AMD produce enough cards to satiate the starving gaming crowds :D Seems like AMD and Nvidia are both laughing all the way to the bank this season.
It isn't, if a $700 card is listed at this MSRP in the States (as out of stock at online retailers) and in the EU becomes a $1000 MSRP card (also listed as out of stock). It's still a valid point, and we'll be able to verify this once the 6800's AIB designs are listed at retailers next week.

The argument about $50 price difference is only valid in the US right now. FE vs Ref it's 1:1 in EU, with odds swayed toward AMD because they're pushing reference 6800's through regular retail channels instead of a broken website which only shows "Coming soon". At least with AMD you can place an order and get in the queue.
 
All PC-building companies do this as far as I'm aware; Nvidia did as well.
It's usually lifted a day or so before release.
How so? If they had released a product that was available for preorder, promised something they couldn't deliver, and held back reviews to keep people from backing out, maybe. But it's a great product; the only issue is card availability.
So people can make an informed buying decision, but that's not how scams work. My Micro Center has been flooded with the horde. RTX 3000, Ryzen 5000, PS5/XBSX and now RDNA2 shortages have caused the stores to still hand out vouchers every morning.
 
GPU-Z is wrong, this will be fixed in next version.

Reason is that AMD's driver reports the wrong number to GPU-Z. AMD has doubled the pixel output per ROP unit with RDNA2, but the AMD driver doesn't take this doubling into account yet. They'll certainly fix it at some point, for now I added code to GPU-Z: "If Navi21 & (RopCount == 48 or RopCount == 64) , Then RopCount = RopCount * 2".

hf7ooepnqm.jpg
Ah ok. I see now. Thanks again boss.
 
Irrespective of what the benchmarks say (about RT), and given that both Nvidia and AMD will see benefits from future driver updates (I assume) or game advancements, I'd like to find some theoretical numbers for the respective hardware implementations. Forget TFLOPS etc.

This might turn out to be hard to determine and I don't have time to figure it out right now, but there are numbers out there suggesting Big Navi can do 1 ray/triangle intersection per CU per clock - so 80 CUs at 2200 MHz would be 176 billion per second. Is this good? How would this compare to Ampere? Is it even comparable, given their different implementations?

It's got me curious...
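The back-of-envelope arithmetic works out as follows (the 1-intersection-per-CU-per-clock rate is the poster's assumption, not an official spec):

```python
# Peak ray/triangle intersection rate under the quoted assumption.
cus = 80                          # Navi 21 compute units
clock_hz = 2_200_000_000          # 2200 MHz clock
rate_per_cu_per_clock = 1         # assumed intersection throughput
peak_per_sec = cus * clock_hz * rate_per_cu_per_clock
print(f"{peak_per_sec / 1e9:.0f} billion intersections/s")  # 176
```

So the quoted figure works out to 176 billion intersections per second. Comparing it directly to Ampere is tricky, since NVIDIA doesn't publish an equivalent per-RT-core intersection rate in the same terms.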
 
Irrespective of what the benchmarks say (about RT), and given that both Nvidia and AMD will see benefits from future driver updates (I assume) or game advancements, I'd like to find some theoretical numbers for the respective hardware implementations. Forget TFLOPS etc.

This might turn out to be hard to determine and I don't have time to figure it out right now, but there are numbers out there suggesting Big Navi can do 1 ray/triangle intersection per CU per clock - so 80 CUs at 2200 MHz would be 176 billion per second. Is this good? How would this compare to Ampere? Is it even comparable, given their different implementations?

It's got me curious...


Be careful... curiosity killed the cat :laugh:

Back to business: it's an interesting question, but I'm not sure it can be compared 1:1 due to the different architectures...
 
Confirmed: this thread has been bombarded by trolls.
Clearly, DLSS is an exclusive NV trademark, they compare RT performance (20 games? hello...), and they somehow ignore the points of pride: performance/power consumption/heat.
hello...

Oh yes, winter is coming.

Yup, the amount of NV shilling going on in this thread is ridiculous. It's making the thread almost unreadable!

Why are they so angry? Competition is good for the market.
 
Almost 100 W less on average, even with that insane chunk of power-hungry cache; confirmation that Samsung's node is utter crap. My God, how the tables have turned. And all of that just so Nvidia could save probably a couple of bucks per chip.

Nah, you misunderstand, Nvidia did it for all that production capacity that Samsung offers! And there is STILL a demand problem, go figure!

But yeah, this is leaps and bounds ahead. Apparently Nvidia's Ampere happened at AMD, or something. It pales in comparison - on a technical level, that is, because the stack is competitive now in both camps. AMD has some pricing to fix for the 6800, if you ask me. The 6800 XT is in an okay place, but it still raises the ray tracing dilemma for some. A slightly lower price point would have eliminated that outright - or at least more so.
 