
AMD Radeon RX 6800 XT

I am happy to see AMD coming back with their GPU division. I am seriously considering the 6800XT, my first AMD card since my Fury X.

I enjoy raytracing when I am playing single-player games, but when I play online, I just want high, stable FPS. I don't have time to look at all the shiny reflections, shadows, or other flair that comes along. Honestly, playing Call of Duty Modern Warfare and now Cold War, I really can't tell the difference with raytracing on vs off. I will say that DLSS has been a nice addition in the handful of games I own that support it. Most don't, though.

I used to be one for jumping all in on Nvidia's new tech: 3D Vision, PhysX (I owned the BFG Tech standalone card before Nvidia folded PhysX into CUDA with their acquisition), G-Sync, SLI, Pixel Shader 3.0 vs 2.0 (the original Far Cry), DirectX 10 on the 8 series, etc.... It is all just fluff, and you used to pay through the nose for it. There are always a few games used to demonstrate the new shiny features, but you really don't see them catch on until years later. 3D Vision sorta died on its own (the support was awful), SLI got the axe, and with G-Sync you don't need the proprietary module anymore. I have owned several G-Sync panels and now have ASUS TUF panels that support both G-Sync and FreeSync; they work wonderfully and don't carry the $300-$400 markup. Even those PhysX-exclusive features... looking at you Alice, the Batman Arkham series, Ghost Recon Advanced Warfighter, Mafia, and Metro... are mostly fluff and didn't add all that much to the immersion of those games. At the time it was nice, but not for the premium you had to pay to get it. All of these proprietary features fall by the wayside... and are dropped. Raytracing is nice when I take the time to notice it in single-player games, but I learned a long time ago, and with quite a bit of money, that the new shiny just isn't worth it initially. Kudos to both companies for picking it up and running with it, but it just isn't there yet.

I have never really had issues with either company. I have had Radeons whose fans quit working, and XFX just sent me the cooler (my 4890 comes to mind, and the R9 290X). I can't speak to the HDMI issues on the newer cards, but I never had a problem using them with DisplayPort. I have had Nvidia GPUs and memory die on me.... MSI 9600GT, MSI and Zotac 570.... those wonderful purple and green speckles scattered across the screen.

I know I am preaching to the choir.... but it really is a shame that people are so polarized by brand names..... companies that don't care about you and just want to turn a profit. Some of these people don't even buy enthusiast-level hardware; they're just keyboard cowboys who have to thump their chest and wave the banner of their fanboyism..... it's really sad.

Just look at what features matter to you, do your research, and buy that. Everyone's needs change over time; I know mine have. I used to be all AMD/ATi, then AMD/Nvidia, Intel/AMD, Intel/Nvidia, and now I'm back to AMD/Nvidia. For 4 years I used my laptop as a desktop with the graphics amplifier, and people told me I was nuts. I liked the flexibility of gaming at home and then unhooking the graphics amplifier when I went to a LAN. I still use my Alienware for LANs.

I know it is hard to keep up with, but I really wish moderators would just clean up some of these threads, and not just here on TPU, but Guru3D, YouTube, etc.... it's really asinine to see.

I don't mind objective discussion and helping people understand things before making a purchase (I used to work at Micro Center), but the immediate flaming back and forth sucks.... I'd say almost half the comments on a given thread about GPUs or CPUs could be deleted.

I feel old, but I have been into this hobby for over 20 years now.... maybe I am just going senile in my old age.
 
Oh boy, another former Fury X owner. You should know better than to get back on that red hype train.
 
Thank you, AMD, for not following Nvidia off the deep end into 350W TDPs. I fully expected this Radeon to be a 350W gaming card.
 
Exactly, if the government says you should pay a tax on your tea, you should, and be grateful, for it's the people in power we should all trust.

But back on topic: AMD has a product for less that does as well or better, and this is the best you can muster? Political BS? Tribalism? No defensible position, just attempts to smear and troll? Give me a break.

Can you point me to a site to buy a 3080 Ti? Or a 3080? (3080s are $1400 each on Amazon, and the Ti isn't available for months.)
 
Somehow, I get the feeling that AMD fanboys are Trump supporters.
Attention everybody, we have an Antifa member in the house.
 
It's confirmed: this thread has been bombarded by trolls.
Clearly, DLSS is an exclusive NV trademark; they compare RT performance (20 games? hello...) and somehow ignore their own pride about performance/power consumption/heat.
hello...

oh yes. winter is coming.
 
Exactly, if the government says you should pay a tax on your tea, you should, and be grateful, for it's the people in power we should all trust; after all, Bernie, I mean Biden, is going to cure cancer.....


And why shouldn't he? After 48 years in power, it's the least he can do.

But back on topic: AMD has a product for less that does as well or better, and this is the best you can muster? Political BS? Tribalism? No defensible position, just attempts to smear and troll? Give me a break.

Can you point me to a site to buy a 3080 Ti? Or a 3080? (3080s are $1400 each on Amazon, and the Ti isn't available for months.)

Cure cancer right after he defunds the police, takes away our right to defend ourselves, and lowers the age of consent.
 
Almost 100W less on average with that insane chunk of power-hungry cache; confirmation that Samsung's node is utter crap. My God, how the tables have turned. And all of that just so Nvidia could save probably a couple of bucks per chip.
Why do you think the cache is power-hungry? One of the slides shows a cache hit costs about 1/6 the power of going out to memory, and it has lower latency and higher bandwidth.

Samsung's 8nm node is a decent bit worse than TSMC's 7nm, but it's not that bad; it's still better than TSMC's 12nm. Ampere just seems to be worse at current rasterized games; those FP units aren't fully utilized. And Nvidia is pushing the clocks to their max; maybe they caught wind of AMD's performance.
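To put that 1/6 figure in perspective, here's a quick back-of-the-envelope sketch in Python. The energy units and hit rates are made-up assumptions for illustration; only the roughly-1/6 ratio comes from the slide.

DRAM_ENERGY = 6.0   # arbitrary units per trip out to GDDR6 (assumed scale)
CACHE_ENERGY = 1.0  # ~1/6 of a memory access, per the slide cited above

def effective_energy(hit_rate):
    """Average energy per access for a given cache hit rate in [0, 1]."""
    return hit_rate * CACHE_ENERGY + (1.0 - hit_rate) * DRAM_ENERGY

for hr in (0.0, 0.5, 0.8):
    print(f"hit rate {hr:.0%}: {effective_energy(hr):.2f} units per access")
# 0% -> 6.00, 50% -> 3.50, 80% -> 2.00: the higher the hit rate,
# the more the big cache saves power instead of burning it.

So as long as the hit rate is decent, the cache is a net power win, not a power hog.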
 
It's confirmed: this thread has been bombarded by trolls.
Clearly, DLSS is an exclusive NV trademark; they compare RT performance (20 games? hello...) and somehow ignore their own pride about performance/power consumption/heat.
hello...

oh yes. winter is coming.

Performance (17 REVIEW SITES AVERAGE)

Power consumption/Heat? Go buy a GTX 1650 for reduced power consumption and heat.
 
Doesn't look like Nvidia is in trouble

"NVIDIA (NASDAQ: NVDA) today reported record revenue for the third quarter ended October 25, 2020, of $4.73 billion, up 57 percent from $3.01 billion a year earlier, and up 22 percent from $3.87 billion in the previous quarter. "
 
You are right about the stock, but I still see no reason to buy a 6800XT over a 3080.

For just $50 more, you will get :

1) Better RT
2) DLSS
3) More stable drivers.
For 10 games, and if you're lucky you play 3 or 4 of them......
What an argument...
If Nvidia fans only have RTX and DLSS and driver stability, that means they have....nothing.

The big question: why do NV fans take their precious time to come talk on an.... AMD thread????

PRICELESS:clap::clap::clap::clap::clap:
 
I purchased an ASUS TUF 3080, but I'm super happy to see how competitive AMD is. Now I see why Nvidia had to drop their prices so much this generation. Competition is great and was sorely needed.
 
Performance (17 REVIEW SITES AVERAGE)

Sorry, I'm lost. I don't understand why you gave me that link.
Care to elaborate?

:oops: :(
Power consumption/Heat? Go buy a GTX 1650 for reduced power consumption and heat.

????
:nutkick:
 
The big question: why do NV fans take their precious time to come talk on an.... AMD thread????

PRICELESS:clap::clap::clap::clap::clap:
An idle mind is the devil's workshop.
I purchased an ASUS TUF 3080, but I'm super happy to see how competitive AMD is. Now I see why Nvidia had to drop their prices so much this generation. Competition is great and was sorely needed.
WHAT?!! Nvidia increased prices with the 2000 series and kept the same prices for the 3000 series. And before they increased pricing, they obfuscated the market with the 1000 series. How?
Nvidia said:
Well, the MSRP of our 1000-series card is X dollars. But that's just reference. We'll sell you our premium FE card, which is actually X+50 dollars. If you want an MSRP card, check out our AIB partners, who will release almost no cards at MSRP, or will actually sell at our FE pricing, which is not MSRP.

Seriously, Nvidia are masters of consumer perception manipulation. And objectively speaking, that's fucking impressive. Look at what people parrot now: "$500 3070 is bringing $1200 2080 Ti performance", forgetting the fact that if the 2000-series price hike hadn't happened, the 2080 Ti would've cost $800-900.
 
This thread has gone to shit thanks to a couple of posters. Emotionally charged fanboying is one thing, but I'm seeing outright shilling. You guys are way too aggressive and obvious.
People will be buying these GPUs whether you like it or not, so chill out.
 
Forgetting the fact that if the 2000-series price hike hadn't happened, the 2080 Ti would've cost $800-900.

$750 MAX

Seriously, Nvidia are masters of consumer perception manipulation. And objectively speaking, that's fucking impressive. Look at what people parrot now: "$500 3070 is bringing $1200 2080 Ti performance".

That made me laugh so hard...... people talking about the $500 3070 like it's the deal of the century, when the GTX 1070 started at $380 and was equal to the $700 GTX 980 Ti.
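For fun, here's the arithmetic behind that, using the prices quoted in this thread (the $800 counterfactual 2080 Ti price is the earlier poster's guess, not an official figure):

# Price-performance arithmetic on the "deal of the century" framing,
# assuming rough performance parity within each pair (as claimed above).
comparisons = [
    ("GTX 1070 ($380) vs GTX 980 Ti ($700)", 380, 700),
    ("RTX 3070 ($500) vs RTX 2080 Ti ($1200)", 500, 1200),
    ("RTX 3070 ($500) vs un-hiked 2080 Ti ($800)", 500, 800),
]
for label, new, old in comparisons:
    print(f"{label}: about the same performance at {new / old:.0%} of the price")
# 54%, 42%, 63%: against a sanely priced 2080 Ti, the 3070 "deal"
# looks worse than what the 1070 pulled off back in 2016.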
 
I don't know how politics got inserted into the discussion, and I don't see how it has anything to do with it. If you can't keep your political bias out, please do not post; otherwise you will be thread-banned and assessed points.
 
power consumption
Check the discussion in the forum comments of the non-XT review, you'll understand

Something must have gone wrong; in some instances even the 6800 non-XT beats the 6800 XT..
I thought I replied to you already? What you are seeing is CPU limiting in very specific games due to driver overhead, which bunches results up against an invisible wall, and randomly one will be better than the other. They really are equal in that test; look at the numbers, not at the placement of the bar.
 
Cyberpunk 2077 looks awesome in trailers. I can't wait to see it in full ray tracing glory.
Same here, brother.
Cyberpunk 2077 is the only thing that has pushed me to upgrade this year. Sure, my trusty 1080 still does fine for most games I play at my resolution, but I've been waiting almost 8 whole years for this game, and you betcha I wanna be able to play it at its highest visual fidelity when it comes out... if I can grab a 3070/80 by then, which looks unlikely :peace:

On the topic of the new RX cards, I gotta say it's much better than I was expecting from AMD, due to, well... past disappointments with their high-end GPUs relative to NVIDIA's offerings. What they've brought forth this year looks a lot better, even though the ray tracing performance shown here and in other reviews leaves something to be desired. But I feel that, just like they have done with Zen over the years, the 2nd generation of their RT accelerators is when they'll really start bringing the heat to NVIDIA's overall performance lead in both rasterized and ray-traced gaming alike, just like they did to Intel with their CPUs. I'm just glad we're seeing so much competition nowadays :clap::clap:
 
Check the discussion in the forum comments of the non-XT review, you'll understand


I thought I replied to you already? What you are seeing is CPU limiting in very specific games due to driver overhead, which bunches results up against an invisible wall, and randomly one will be better than the other. They really are equal in that test; look at the numbers, not at the placement of the bar.
That makes no sense. If it is CPU limited, then again the 6800 XT should be faster than or as fast as the 6800 non-XT... same driver, same "CPU limitation" or bottleneck, same game, same resolution, just more CUs and a higher frequency...

On top of that, other reputable channels that benchmarked the GPU have different results. E.g., in Assassin's Creed Odyssey at 1080p, you depict the 6800 as faster than the 6800 XT, and both of them slower than even the RTX 3070, if memory serves me right; yet at Hardware Unboxed, in that particular game, the RX 6800 XT beats even the RTX 3090!!!

And in general, as I said before, I think it's clear that the RTX 3070 trades blows with the 2080 Ti, and that's not a biased opinion; most big channels showcase that (and I can tell from personal experience).

And you have the RTX 3070 being faster than even the 6800 XT.... yeah, sure, there is something wrong. I am not sure what that something is (methodology, specific bugs with specific games, your setup, or maybe you simply made some typos when writing down the results; dunno, but there is something off here).

If I had to guess, I would say that, due to whatever issue, your metrics in some games are off, which leads to the "relative performance" score being negatively affected for the 6800 XT.


Case in point: all these reputable channels showcase the RX 6800 XT trading blows even with the RTX 3090, while you show it as slower than even the RTX 3070, with a 2 FPS difference from the RX 6800 non-XT.

Nothing personal, Mr. Admin, but something IS off... maybe you got a defective sample?
 
In January, a 3080 20GB with more cores will come out, and the 6800XT will be really underpowered in 2 months. Also, I am concerned that the 6800XT can only match the 3070 when RT is enabled... what's your takeaway from that?
My takeaway from it is this: is it really a problem? Whether you buy anything from a 3060 Ti to a 3090, or a 6800 to a 6900 XT, they're ALL pretty damn fast.

I won't be losing any sleep over $50 here or a 20% FPS difference there... Next year, then the year after, there will be something faster again, and I'll buy that.

If you think you're future-proofing by buying a 3080 right now, you're probably mistaken; there are probably going to be some big leaps over the next few years, and you can't be worrying about that. If you need an upgrade now, buy one. If you don't, and you just want to splash some cash, then buy whatever you want.
 
That makes no sense. If it is CPU limited, then again the 6800 XT should be faster than or as fast as the 6800 non-XT... same driver, same "CPU limitation" or bottleneck, same game, same resolution, just more CUs and a higher frequency...
I don't think you quite understand what CPU limited means. It means the CPU is holding the cards back, so they'll reach a certain FPS and can't go any higher. 1080p will be CPU-bottlenecked with most modern GPUs, so you need to look at 1440p and higher to see how the cards really perform.
I thought of a good analogy: it's like lining up a 700-horsepower Ferrari against a 150-horsepower Honda, but limiting both cars to 50 horsepower. Both are going to perform about the same, with some variation from the driver (pun intended).
Hardware Unboxed doesn't even test at 1080p because all these modern GPUs are CPU-bottlenecked there. Look at 1440p+ if you want to see a GPU's true performance.
And don't even bother comparing benchmarks across different sites; there are so many variables influencing performance that it's pointless.
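If it helps, the whole thing boils down to delivered FPS being roughly min(CPU ceiling, GPU throughput). A toy Python sketch; every number here is invented for illustration, none are real benchmark figures:

CPU_CEILING = 200  # frames/s the CPU can prepare; roughly resolution-independent

# Hypothetical GPU-bound throughput for two cards (made-up numbers)
gpu_bound = {
    "faster card": {"1080p": 340, "1440p": 230, "4K": 125},
    "slower card": {"1080p": 310, "1440p": 200, "4K": 105},
}

for card, rates in gpu_bound.items():
    for res, fps in rates.items():
        delivered = min(CPU_CEILING, fps)  # whichever side is the bottleneck wins
        print(f"{card} @ {res}: {delivered} FPS")
# At 1080p both cards flatten to 200 FPS (the "invisible wall");
# the real gap between them only shows up at 1440p and 4K.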
 
Last edited:
Lol, $50 does not matter much at this price point. You might as well pay $50 more for superior RT performance.
Yeah, sorry, but that $50 turns into $200 anywhere outside the US.
 
Yeah, sorry, but that $50 turns into $200 anywhere outside the US.

You can't actually buy a 6800XT right now, so comparing regional prices is a moot point.
I don't see anything wrong with buying either a 6800XT or a 3080 if they are available where you live; these two are so close, and they represent the best GPUs from Nvidia and AMD.

AMD has done the impossible here. I was thinking the 128MB Infinity Cache might have some downfall in the frametimes department, but it is rock solid there.

Now the question is: can AMD produce enough cards to satiate the starving gaming crowds :D. Seems like both AMD and Nvidia are laughing all the way to the bank this season.
 
That makes no sense. If it is CPU limited, then again the 6800 XT should be faster than or as fast as the 6800 non-XT... same driver, same "CPU limitation" or bottleneck, same game, same resolution, just more CUs and a higher frequency...
Yes, what you say is true. There is a little bit of randomness in all measurements; it's impossible to reproduce any measurement 100%. My AC test does have relatively high variability: if I do 10 runs back to back, on the same card and same system, there will be a few FPS of difference every time. This is normal and just how things work. Of course, I could roll the dice as often as I need to get the desired placement. Is that what you want?

I recommend you play a game and look at the FPS counter. Now move away, and move back to the same place. Do you see the exact same number?
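To illustrate the run-to-run noise point, here's a quick simulation sketch in Python. Both cards are given the exact same true average FPS; the noise magnitude is an assumption, not a measured value:

import random

random.seed(1)

def bench(true_fps, noise=3.0, runs=10):
    """Simulate repeated benchmark runs with random measurement noise."""
    return [true_fps + random.uniform(-noise, noise) for _ in range(runs)]

a = bench(144.0)  # e.g. the 6800 XT in a CPU-limited test
b = bench(144.0)  # e.g. the 6800 in the same test
wins = sum(x > y for x, y in zip(a, b))
print(f"card A comes out 'ahead' in {wins}/10 runs despite identical true performance")
# Rerun with a different seed and the "winner" flips: bar placement
# inside the noise band is a coin toss, exactly as described above.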
 