
AMD Radeon RX 6800 XT

Please educate yourself about game development a tiny bit. It's perfectly possible to make both games require a lot less VRAM - it's just that their developers went overboard when they had access to the 2080 Ti.

Of course you can make something use less VRAM. You just use inferior (smaller) textures.

You can also make the geometry bottleneck smaller if you use lower-quality models and lighter tessellation settings, and you can lessen the impact on the L2 caches (for Ampere) if the game's very design compresses easily under the earlier DCC parameters from Pascal.
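To put rough numbers on the texture part (a back-of-the-napkin sketch assuming uncompressed RGBA8 with full mip chains; real games use block compression, which cuts these figures by 4-8x):

```python
# Rough VRAM cost of one square RGBA8 texture including its full mip chain.
# Illustrative only: real assets are block-compressed (BC1/BC7, 4-8x smaller).

def texture_vram_bytes(size, bytes_per_pixel=4):
    total = 0
    while size >= 1:
        total += size * size * bytes_per_pixel
        if size == 1:
            break
        size //= 2
    return total

for size in (4096, 2048, 1024):
    mb = texture_vram_bytes(size) / (1024 ** 2)
    print(f"{size}x{size}: {mb:.1f} MB")
# 4096x4096: 85.3 MB, 2048x2048: 21.3 MB, 1024x1024: 5.3 MB
# Each halving of texture resolution roughly quarters the VRAM cost.
```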

But why lower the settings? Isn't Ultra supposed to be decadent? If anything, we need games to have more demanding Ultra settings like we had in 2004 or 2007. A return to games being completely unplayable on Ultra settings like in the past would be so welcome to real tech enthusiasts. For the rest there is the "High" preset.
 
4K performance in Microsoft Flight Simulator is extremely disappointing :( I was hoping RDNA2's higher frequency and 16 GB of VRAM would make a difference, but the game obviously prefers more shaders and faster memory. Well, 3080 for me after all, I guess.
WAIT FOR TI if you can!
 
WAIT FOR TI if you can!
The 3080 Ti will for sure be slightly slower than the 3090. Less bandwidth. So the dude, if he really cares about MSFS so much... well, that is his choice.
 
Of course you can make something use less VRAM. You just use inferior (smaller) textures.

And this is patently false. At the very minimum you can use texture streaming or make levels smaller. There are other methods as well.
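To make "texture streaming" concrete, here is a minimal sketch of the idea (hypothetical names, not any engine's actual API): only the mip levels the camera can currently resolve stay resident, and a fixed budget forces the least-recently-used ones out.

```python
# Minimal texture-streaming sketch: keep each texture resident only at the
# mip level its current screen coverage justifies, and evict LRU entries
# once a fixed VRAM budget is exceeded. All names are hypothetical.
from collections import OrderedDict

class MipStreamer:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.resident = OrderedDict()  # (texture_id, mip) -> bytes, LRU order
        self.used = 0

    def desired_mip(self, texels, screen_pixels):
        # Drop one mip level for every 4x the texture is oversampled.
        mip = 0
        while texels > screen_pixels and texels > 1:
            texels //= 4  # each mip level has a quarter of the texels
            mip += 1
        return mip

    def request(self, texture_id, mip, size_bytes):
        key = (texture_id, mip)
        if key in self.resident:
            self.resident.move_to_end(key)  # mark as recently used
            return
        while self.used + size_bytes > self.budget and self.resident:
            _, evicted_size = self.resident.popitem(last=False)  # evict LRU
            self.used -= evicted_size
        self.resident[key] = size_bytes
        self.used += size_bytes

streamer = MipStreamer(budget_bytes=256 * 1024 * 1024)
mip = streamer.desired_mip(texels=4096 * 4096, screen_pixels=500 * 500)
streamer.request("brick_wall", mip, size_bytes=4 * 1024 * 1024)
print(mip, streamer.used)  # distant object -> coarse mip, small footprint
```

The point being: VRAM usage then tracks what's on screen, not the sum of every asset in the level.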

Speaking of the RT performance in Watch Dogs: Legion for AMD cards:

[attached screenshot: ray tracing output comparison]


I'm not sure they are comparable yet. Something is definitely missing. :cool:
 
Gamers need this GPU; here are the reviews.


Better than the RTX 3080 with lower power consumption, and it's cheaper.
At 1080p it's better than the RTX 3090.

Good job AMD for beating NVIDIA. :clap:

(Some NVIDIA fans are not happy, lol.)

Imagine buying any of these cards for 1080p....
 
And this is patently false. At the very minimum you can use texture streaming or make levels smaller. There are other methods as well.

id Tech 7 already does that. The only optimization it is missing (with regard to textures) is sampler feedback, or to be precise, a Vulkan equivalent to it.
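For anyone wondering what sampler feedback buys you, here is the data flow sketched conceptually (this is not the real D3D12 API, just an illustration): the GPU records the finest mip each texture was actually sampled at, so the streamer reacts to ground truth instead of a CPU-side distance guess.

```python
# Conceptual model of sampler feedback (NOT the actual D3D12 API): while
# shading, the GPU notes the finest mip level actually sampled per texture;
# after the frame, the engine reads that back and streams exactly what was
# proven necessary, instead of estimating need on the CPU.

feedback = {}  # texture_id -> finest mip level sampled this frame

def record_sample(texture_id, mip_sampled):
    # Stand-in for what the hardware does during shading.
    prev = feedback.get(texture_id)
    if prev is None or mip_sampled < prev:
        feedback[texture_id] = mip_sampled

def resolve_streaming_requests():
    # End of frame: turn the feedback into load requests; any texture not
    # touched this frame becomes an eviction candidate.
    requests = sorted(feedback.items())
    feedback.clear()
    return requests

record_sample("rock_albedo", 3)
record_sample("rock_albedo", 1)  # sampled closer up: a finer mip is needed
record_sample("sky_cubemap", 0)
print(resolve_streaming_requests())  # [('rock_albedo', 1), ('sky_cubemap', 0)]
```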
 
Imagine buying any of these cards for 1080p....


So what if some people do? If they intend to upgrade their monitor later, and most do, most can't buy everything at once.

That being said, some people do enjoy 240 Hz 1080p maxed out in AAA games, and even the RTX 3090 can't manage that yet for games like Valhalla, etc. I don't care about frames that high personally... 144-165 is the sweet spot... and 60 at 4K.
 
You think RDNA2-based cards will not improve their performance over time? I know the "future potential uplift" dank memes about AMD and GCN on consoles, but do you really think the massive gains for games we saw on Zen 3 chips were just a coincidence?

When it comes to longevity of cards, history shows AMD has NV beat. There are numerous examples of this in the past. NV drops driver support for previous-gen cards much faster than AMD does.
 
So what if some people do? If they intend to upgrade their monitor later, and most do, most can't buy everything at once.

That being said, some people do enjoy 240 Hz 1080p maxed out in AAA games, and even the RTX 3090 can't manage that yet for games like Valhalla, etc. I don't care about frames that high personally... 144-165 is the sweet spot... and 60 at 4K.

Once you witness 4K 100-120 fps on an OLED HDR TV, you will never again want to go back to LCDs. Ever.
This is a bigger upgrade than upgrading a CPU or GPU. A far bigger one. I would honestly say that (were it to have HDMI 2.1) a 5700 XT with such a TV/screen at medium settings is a noticeably superior experience to an RTX 3080/6800 XT at Ultra settings on an inferior display.
 
Then you really have to wonder what AMD is gonna do when Nvidia moves to 5nm TSMC. I don't think AMD is looking forward to that.

You mean the 5nm TSMC node that Apple is using for the M1 processor? Guess who is going to buy up all the fab capacity for that. Both AMD and NV will be fighting for scraps there. And why do you assume AMD is going to sit on 7nm on the GPU side while NV goes to 5nm?

You do know there is an RDNA 3 already on the roadmap? And guess what node it's going to be using?
 
Hardware Unboxed and Gamers Nexus came to a similar conclusion as TechPowerUp (1080p being the only real difference):
- RDNA2 has better performance per watt
- offers more VRAM
- the 6800 XT is on par with the 3080 in standard rasterization (better at 1080p, roughly on par at 1440p, worse at 4K)
- worse in RT (roughly on par with the 2080 Ti)
- lacks AI supersampling ("DLSS")

It all comes down to what features you want. The price/performance ratio is about the same. The 6800 is faster with more VRAM than the 3070, but costs 16% more. The 6800 XT is on par with the 3080 and has more VRAM, BUT worse RT and no AI SS, and costs 7% less. All AMD brings to the table is more options to choose from, but it isn't necessarily better value. Now, a 6800 costing $499 and a 6800 XT at $599, those would be true Ampere killers. AMD has clearly chosen profit margins over market share gain. That's their decision to make. I personally still hate Ngreedia because of Turing, but I've fallen out of love with Team Red too, since they've decided to raise profit margins. I'll just buy what suits my needs best for as cheap as I can get it.
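The 16% and 7% figures check out if you plug in the launch MSRPs (assuming $499 for the 3070, $579 for the 6800, $699 for the 3080 and $649 for the 6800 XT):

```python
# Price deltas behind the 16% / 7% figures above, using launch MSRPs.
pairs = {
    "RX 6800 vs RTX 3070":    (579, 499),
    "RX 6800 XT vs RTX 3080": (649, 699),
}

for name, (amd, nvidia) in pairs.items():
    delta = (amd - nvidia) / nvidia * 100
    print(f"{name}: {delta:+.0f}%")
# RX 6800 vs RTX 3070:    +16%
# RX 6800 XT vs RTX 3080: -7%
```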

It's sad, but they just spent 30 billion acquiring a company; they quite literally can't afford to give too much value, especially considering supply. Let's hope they'll throw something special at the mainstream segment when they reveal Navi 22...
 
Once you witness 4K 100-120 fps on an OLED HDR TV, you will never again want to go back to LCDs. Ever.
This is a bigger upgrade than upgrading a CPU or GPU. A far bigger one. I would honestly say that (were it to have HDMI 2.1) a 5700 XT with such a TV/screen at medium settings is a noticeably superior experience to an RTX 3080/6800 XT at Ultra settings on an inferior display.

I will never be that rich, so more power to you lol
 
So what if some people do? If they intend to upgrade their monitor later, and most do, most can't buy everything at once.

That being said, some people do enjoy 240 Hz 1080p maxed out in AAA games, and even the RTX 3090 can't manage that yet for games like Valhalla, etc. I don't care about frames that high personally... 144-165 is the sweet spot... and 60 at 4K.

I find pushing AAA games to higher frame rates like 144 Hz futile, because you often run into frame pacing (stutter) issues on these mostly unoptimized, console-port type games. They are first and foremost optimized for 30 fps, then for 60 fps, and anything higher is just luxury. But you are right, some people enjoy higher frames at lower resolutions.
 
Speaking of the RT performance in Watch Dogs: Legion for AMD cards:

[attached screenshot: ray tracing output comparison]


I'm not sure they are comparable yet. Something is definitely missing. :cool:
This is exactly why I asked for a review of the actual raytracing output on both Nvidia and AMD cards.
 
The 6800 XT's average gaming power consumption is around 218 W, vs 300 W and above for the 3080. GG AMD!
 
Well, with all due respect, it is you who seems to have the "special" eyes, because the numbers you mention are "modified" in AMD's favor.

It's mostly a 5% difference, and the power consumption difference is closer to 50 watts.

Also the "other tweaks" part is a lot better for the Green team nowadays... bios flashing on 3000 series is easy as pie and brings great performance jumps with it especially combined with watercooling.

Not to mention the software side of things with DLSS, ray tracing, etc.

If you think 5% is insignificant, go tell that to high-end NVMe drives lol

210 W vs 303 W is a 93 W difference.

The 3080 is 6% faster, but the 6800 XT can overclock 10% out of the box, vs 4% for the 3080. So a net equal.
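Taking those percentages at face value, the "net equal" conclusion holds up:

```python
# Quick sanity check of the "net equal" claim, using the figures as stated:
# the 3080 starts 6% ahead, then each card overclocks by its stated headroom.
stock_gap = 1.06   # 3080 is 6% faster at stock
amd_oc    = 1.10   # 6800 XT gains 10% from overclocking
nv_oc     = 1.04   # 3080 gains 4% from overclocking

amd_final = 1.00 * amd_oc       # 1.100
nv_final  = stock_gap * nv_oc   # 1.102

print(f"6800 XT OC: {amd_final:.3f} vs 3080 OC: {nv_final:.3f}")
# 1.100 vs 1.102 - within a fraction of a percent, effectively a tie.
```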

Ray tracing is 10 games currently. If those 10 games are a make-or-break deal, go for the green, but for 90% of gamers, we know ray tracing will be like tessellation: it will take a few generations to implement and match up performance and quality, and by that time these cards will be obsolete and the difference will be 15 FPS at 4K in those new games vs 21 FPS.

DLSS, I'm not sold on; it takes special profiles from Nvidia. Do they have support for every game? Does it matter if AMD does the same?

And about overclocking and flashing the BIOS: if 10% out of the box on a whopping 1 V is any indication for AMD, then with the node they are on, watercooling and 1.2-1.3 V should give a 6800 XT 25% more clock speed, meaning it will be 25% faster than the 3080 while still being cheaper. So still the better buy for 90% of gamers, games, and those who want to play the silicon lottery and tweak. Samsung's node is crap and a poor choice by Nvidia, using them to save a few $$$.
 
When it comes to longevity of cards, history shows AMD has NV beat. There are numerous examples of this in the past. NV drops driver support for previous-gen cards much faster than AMD does.

This myth has been debunked by many reputable websites, including TPU. NVIDIA does not drop driver support for previous-gen cards [faster]. If anything, they support their cards a lot longer than AMD does. For instance, NVIDIA still fully supports Kepler generation cards, which were released 8 years ago.

NVIDIA does, however, stop tweaking drivers for previous-generation cards, because it's just not worth it from a financial standpoint - the performance is not there anyway: any extracted gains would not bring older cards to the level where they could run new, heavy games. Imagine optimizing drivers for the GTX 680. Why would you do that? The card is absolutely insufficient for modern games.
 
DLSS, I'm not sold on; it takes special profiles from Nvidia. Do they have support for every game? Does it matter if AMD does the same?
That is no longer the case (it was in DLSS 1.0). Since DLSS 2.0 there are no per-game profiles anymore; no game-specific training is required. The experience is now streamlined, meaning that game engines like Unreal Engine and Unity can support it out of the box. That in turn means a lot more adoption for DLSS going forward.
 
This myth has been debunked by many reputable websites, including TPU. NVIDIA does not drop driver support for previous-gen cards [faster]. If anything, they support their cards a lot longer than AMD does. For instance, NVIDIA still fully supports Kepler generation cards, which were released 8 years ago.

NVIDIA does, however, stop tweaking drivers for previous-generation cards, because it's just not worth it from a financial standpoint - the performance is not there anyway: any extracted gains would not bring older cards to the level where they could run new, heavy games. Imagine optimizing drivers for the GTX 680. Why would you do that? The card is absolutely insufficient for modern games.

Adding to that, NVIDIA usually offers almost full performance upfront, while AMD hones their drivers over a longer period, making it appear as if AMD cards' performance improves as they age.
 
Adding to that, NVIDIA usually offers almost full performance upfront, while AMD hones their drivers over a longer period, making it appear as if AMD cards' performance improves as they age.

This isn't fully fair either. Nvidia does tweak performance over time too. They are engineers, not posthuman genetically engineered entities with future-seeing capabilities. They still need time.

As for AMD - they have a smaller team, so things like their long-term performance... it does need more work, for sure. This is where the Fine Wine thing came from.
It is good, though. As long as AMD prices products on performance at the product's release, it is A-OK to have drivers improve it further. It means you paid a fair price once and get better performance long term. That is my thought process, and I used it for Turing too (which really did improve nicely over time).
 
I'm not sure where I stand right now, frankly. AMD's cards really have been a good thing for sure. Nvidia now has serious competition.

But here comes my concern: while I have made the choice to move to Zen 3, I'm not so sure on Big Navi yet. First of all, I have only had Nvidia since, like, ever. I have never owned an AMD card in my entire life.

Big Navi offers more VRAM and is a little bit cheaper. That is great. But will drivers be the Radeon 5000 series all over again, with bugs and problems that were still an issue long after launch? The last thing I want is a driver with issues and problems. So far, AMD will have to convince me away from Nvidia; they will have to show that not only the hardware is good, but also the software side. Also, it can clearly be seen that ray tracing on Big Navi is in its infancy and not on par with Nvidia just yet.

So I guess my next GPU choice will be between the 6900 XT and the rumored RTX 3080 Ti, also depending on performance and driver experience and optimization. Nvidia also has some other features, like streaming and AI software. In short, I am not totally convinced to go Big Navi yet. Drivers first of all have to be a good experience; bugs and errors will only piss me off. Then ray tracing will have to mature as well. I might end up going Nvidia again, so AMD, convince me to go Radeon.
 
Adding to that, NVIDIA usually offers almost full performance upfront, while AMD hones their drivers over a longer period, making it appear as if AMD cards' performance improves as they age.

The dilemma of the Nvidia fanboy.

It used to be that they denied the existence of "Fine Wine"; now apparently their drivers do improve performance over time. Man, this is so strange.
 