
AMD Radeon RX 6800 XT

Joined
Feb 18, 2017
Messages
688 (0.24/day)
So what we have learned from the 5000 series CPU and the 6000 series GPU launch is that

-FHD results in CPU benchmarks are not relevant any more (in fact they weren't really relevant before either, but for some reason they were important for the blue team)
-NV cards can be UVd to get better power consumption
-NV fans hold on to the last point they can (RT, with only a few games available), even as a rival generation supporting RT for the first time is able to match the last-gen (first-gen RT) flagship green card

Love it, really. :D
 
Joined
Oct 26, 2019
Messages
168 (0.09/day)
Could the software scheduler be affecting Nvidia performance in DX12 and Vulkan at 1080p? Is it the software or the hardware being tested?
 
Joined
Dec 22, 2011
Messages
3,890 (0.82/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Certainly an impressive effort from AMD; shame stock levels for these seem to be non-existent too.
 
Joined
Feb 3, 2012
Messages
202 (0.04/day)
Location
Tottenham ON
System Name Current
Processor i7 12700k
Motherboard Asus Prime Z690-A
Cooling Noctua NHD15s
Memory 32GB G.Skill
Video Card(s) GTX 1070Ti
Storage WD SN-850 2TB
Display(s) LG Ultragear 27GL850-B
Case Fractal Meshify 2 Compact
Audio Device(s) Onboard
Power Supply Seasonic 1000W Titanium
Actually impressed with what AMD has managed to pull off with these cards. Performance is where I expected them to land. I fully expect nvidia to respond with their Ti cards on a better node.
 
Joined
Jun 11, 2020
Messages
573 (0.35/day)
Location
Florida
Processor 5800x3d
Motherboard MSI Tomahawk x570
Cooling Thermalright
Memory 32 gb 3200mhz E die
Video Card(s) 3080
Storage 2tb nvme
Display(s) 165hz 1440p
Case Fractal Define R5
Power Supply Toughpower 850 Platinum
Mouse HyperX Hyperfire Pulse
Keyboard EVGA Z15
Do you?

AMD is managing this with a narrower bus and slower memory; they can always bump those up and gain a considerable chunk of performance just by doing that and nothing more. Also, caches scale really well with newer nodes in terms of power, so there is a very good chance AMD will trash Nvidia in performance/watt even harder when both get to the next node.

Also, you do realize AMD is probably already working on a future 5nm design by now, while Nvidia, I'm led to believe, is still figuring out how to bring Ampere to 7nm? Do I have to explain how things aren't exactly going well for them if that's the case?

Seems like Nvidia has just done an Intel-style upsy-daisy by screwing up their node situation.

Intel-style is a bit harsh, but yeah, not paying up for TSMC 7nm might have been a mistake... We'll see if Big Navi stock improves, which has been predicted to happen once AIB cards start rolling in.
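
For a sense of scale on the "narrower bus and slower memory" point in the quote above, here's a quick back-of-the-envelope bandwidth comparison using the public launch specs. This is just my own arithmetic sketch, nothing from AMD's or Nvidia's slides:

#include <cstdio>

int main()
{
    // Peak memory bandwidth in GB/s = bus width (bits) * data rate (Gbps) / 8
    const double navi21 = 256.0 * 16.0 / 8.0;  // RX 6800 XT: 256-bit, 16 Gbps GDDR6  -> 512 GB/s
    const double ga102  = 320.0 * 19.0 / 8.0;  // RTX 3080:   320-bit, 19 Gbps GDDR6X -> 760 GB/s
    std::printf("RX 6800 XT raw bandwidth: %.0f GB/s\n", navi21);
    std::printf("RTX 3080 raw bandwidth:   %.0f GB/s\n", ga102);
    // Navi 21 leans on its 128 MB Infinity Cache to make up the raw deficit, which is
    // why a wider bus or faster GDDR6 later on would be straightforward extra headroom.
    return 0;
}

So AMD is matching the 3080 with roughly two thirds of the raw bandwidth, which is what makes the "just bump the bus and memory speed" argument plausible.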
 
Joined
Jul 23, 2019
Messages
71 (0.04/day)
Location
France
System Name Computer
Processor Intel Core i9-9900kf
Motherboard MSI MPG Z390 Gaming Pro Carbon
Cooling MSI MPG Coreliquid K360
Memory 32GB G.Skill Trident Z DDR4-3600 CL16-19-19-39
Video Card(s) Asus GeForce RTX 4070 DUAL OC
Storage A Bunch of Sata SSD and some HD for storage
Display(s) Asus MG278q (main screen)
So what we have learned from the 5000 series CPU and the 6000 series GPU launch is that

-FHD results in CPU benchmarks are not relevant any more (in fact they weren't really relevant before either, but for some reason they were important for the blue team)
-NV cards can be UVd to get better power consumption
-NV fans hold on to the last point they can (RT, with only a few games available), even as a rival generation supporting RT for the first time is able to match the last-gen (first-gen RT) flagship green card

Love it, really. :D
Or you know, regarding the last part, some people just like RT for what it is. A lot of people in this thread have praised AMD for the achievement with this card even if they prefer an Nvidia card because of RT. I can't understand why it always ends in an AMD vs Nvidia war. I was disappointed with Nvidia's choice to go with 8 GB of RAM on the 3070 just as much as I'm disappointed with the RT performance of the new AMD card.

Edit: for the record, I will quite probably recommend an AMD card to people around me who are not interested in RT.
 
Joined
Jun 11, 2020
Messages
573 (0.35/day)
Location
Florida
Processor 5800x3d
Motherboard MSI Tomahawk x570
Cooling Thermalright
Memory 32 gb 3200mhz E die
Video Card(s) 3080
Storage 2tb nvme
Display(s) 165hz 1440p
Case Fractal Define R5
Power Supply Toughpower 850 Platinum
Mouse HyperX Hyperfire Pulse
Keyboard EVGA Z15
Or you know, regarding the last part, some people just like RT for what it is. A lot of people in this thread have praised AMD for the achievement with this card even if they prefer an Nvidia card because of RT. I can't understand why it always ends in an AMD vs Nvidia war. I was disappointed with Nvidia's choice to go with 8 GB of RAM on the 3070 just as much as I'm disappointed with the RT performance of the new AMD card.

The difference is, 8 GB will always be 8 GB, while RT performance will likely improve, and AMD does have their DLSS-like solution in the works, which will also help.
 
Joined
Jun 29, 2015
Messages
25 (0.01/day)
System Name Computer
Processor Intel Core i7 2600
Motherboard Intel
Cooling Deepcool Gammaxx 200T
Memory 8GB DDR3
Video Card(s) Zotac GTX 760
Storage SSDs\HDDs
Display(s) Acer G7 G227HQL @75Hz
Case Deepcool Frame
Power Supply EVGA 430w
Software Windows 10 x64
Even if it's called "RTX" in games in reality it's D3D12 DXR - there's no such thing as "nVidia standards right now on PC though". It's like saying that there are two different DirectX'es for AMD and NVIDIA.
Good lord.
Even if it's called "RTX" in games in reality it's D3D12 DXR - there's no such thing as "nVidia standards right now on PC though". It's like saying that there are two different DirectX'es for AMD and NVIDIA.
Slow down.

You're too focused on a single word. It was called RTX in games because nVidia paid the developers money and helped to implement it so they could market their hardware for this purpose. Like they do with all of their tech, right?

What my initial post was saying is that nVidia themselves rushed RT onto the 20 series so they could pretty much become synonymous with ray tracing (which they have), even though they themselves are the only ones implementing it, usually with incentives. Still with me?

As of right now, RTX is what people associate as a standard on PC. Not THE standard, but how nVidia thinks it should work.

A lower setting for cheaper hardware, a medium setting for higher end stuff and finally the ultra setting for their enthusiast market.

In the future, now that RDNA2 is here and the new consoles have arrived, we will likely just have a simple setting to enable or disable, BUT there will be an RTX setting in specific games (just like PhysX) where buying an nVidia card gets you "better" effects.

Do you understand? The scope of RT we will see on consoles is what PC is going to get since no one makes huge, graphics heavy PC games any more.

RTX will still be a thing. Just a premium thing. I can totally see nVidia releasing GTX cards that do the simpler RT that consoles will target and selling RTX cards that handle a heavier load in nVidia-sponsored games (again, like PhysX).
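
For what it's worth, the API side of the quote above is easy to demonstrate: a game or engine queries ray tracing support through the same D3D12 call on any vendor's hardware, and "RTX" is only Nvidia's branding on top of that. A minimal sketch (my own illustrative snippet, requires the Windows 10 SDK and d3d12.lib, not code from any shipping game):

#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter at a DXR-capable feature level.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device))))
    {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &options5, sizeof(options5));

    // Tier 1.0 or higher means the driver exposes DXR, regardless of whether the
    // vendor's marketing name for it is "RTX".
    if (options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        std::printf("DXR supported (tier %d).\n", static_cast<int>(options5.RaytracingTier));
    else
        std::printf("DXR not supported on this adapter.\n");
    return 0;
}

How heavy the effects end up being on each vendor's hardware is then down to how a given game tunes its DXR workload, which is really what the sponsorship argument here is about.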
 
Joined
Jul 3, 2019
Messages
322 (0.16/day)
Location
Bulgaria
Processor 6700K
Motherboard M8G
Cooling D15S
Memory 16GB 3k15
Video Card(s) 2070S
Storage 850 Pro
Display(s) U2410
Case Core X2
Audio Device(s) ALC1150
Power Supply Seasonic
Mouse Razer
Keyboard Logitech
Software 22H2
I fully expect nvidia to respond with their Ti cards on a better node.
That's not how that works. Maybe a new stepping with slightly better yields and perf/W, but it won't be anything major.
 
Joined
Jan 8, 2017
Messages
9,436 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I can't understand why it always ends in an AMD vs Nvidia war. I was disappointed with Nvidia's choice to go with 8 GB of RAM on the 3070 just as much as I'm disappointed with the RT performance of the new AMD card.

I'll tell you why: because some insist that a 50% performance hit is so much better than a 60% hit.

No, they're both crap. RT is still not ready for prime time.
 
Joined
Jul 23, 2019
Messages
71 (0.04/day)
Location
France
System Name Computer
Processor Intel Core i9-9900kf
Motherboard MSI MPG Z390 Gaming Pro Carbon
Cooling MSI MPG Coreliquid K360
Memory 32GB G.Skill Trident Z DDR4-3600 CL16-19-19-39
Video Card(s) Asus GeForce RTX 4070 DUAL OC
Storage A Bunch of Sata SSD and some HD for storage
Display(s) Asus MG278q (main screen)
The difference is, 8 GB will always be 8 GB, while RT performance will likely improve, and AMD does have their DLSS-like solution in the works, which will also help.
Indeed. I do not intend to change my graphics card right now, so if RT performance does improve and the AMD equivalent of DLSS does prove to be worth it by the time I do, I would most certainly be interested in an AMD graphics card.
 
Joined
Jun 28, 2018
Messages
299 (0.13/day)
Performance is as expected: in rasterization slightly below the 3080 (on average), in RT it loses a lot, but in return it's a little cheaper, has more VRAM and consumes significantly less power.

Overall, it's a very solid product from AMD, the likes of which we have not seen in years. This was long overdue after several disappointments.

On another note, Nvidia definitely messed up going with Samsung. It is not surprising that Nvidia is already placing orders for TSMC 5nm.
 
Joined
Oct 15, 2010
Messages
951 (0.18/day)
System Name Little Boy / New Guy
Processor AMD Ryzen 9 5900X / Intel Core I5 10400F
Motherboard Asrock X470 Taichi Ultimate / Asus H410M Prime
Cooling ARCTIC Liquid Freezer II 280 A-RGB / ARCTIC Freezer 34 eSports DUO
Memory TeamGroup Zeus 2x16GB 3200Mhz CL16 / Teamgroup 1x16GB 3000Mhz CL18
Video Card(s) Asrock Phantom RX 6800 XT 16GB / Asus RTX 3060 Ti 8GB DUAL Mini V2
Storage Patriot Viper VPN100 Nvme 1TB / OCZ Vertex 4 256GB Sata / Ultrastar 2TB / IronWolf 4TB / WD Red 8TB
Display(s) Compumax MF32C 144Hz QHD / ViewSonic OMNI 27 144Hz QHD
Case Phanteks Eclipse P400A / Montech X3 Mesh
Power Supply Aresgame 850W 80+ Gold / Aerocool 850W Plus bronze
Mouse Gigabyte Force M7 Thor
Keyboard Gigabyte Aivia K8100
Software Windows 10 Pro 64 Bits
Joined
Jan 23, 2016
Messages
96 (0.03/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 5600X I Core i7 6700K
Motherboard B550 Phantom Gaming 4 I Asus Z170-A ATX
Video Card(s) RX 6900 XT PowerColor Red Devil I RTX 3080 Palit GamingPro
Storage Intel 665P 2TB I Intel 660p 2TB
Case NZXT S340 Elite I Some noname case lmao
Mouse Logitech G Pro Wired
Keyboard Wooting Two Lekker Edition
"Much has been talked about VRAM size during NVIDIA's Ampere launch. The GeForce RTX 3080 offers 10 GB VRAM, which I still think is plenty for today and the near future. Planning many years ahead with hardware purchases makes no sense, especially when it comes to VRAM—you'll run out of shading power long before memory becomes a problem. GTX 1060 will not drive 4K, no matter if it has 3 GB or 6 GB. Game developers will certainly make use of the added memory capacity on the new consoles, we're talking 8 GB here, as a large portion of the console's total memory is used for the OS, game and game data. I think I'd definitely be interested in a RX 6700 Series with just 8 GB VRAM, at more affordable pricing. On the other hand, AMD's card is cheaper than NVIDIA's product, and has twice the VRAM at the same time, so really no reason to complain."

Wizzard, no offence, but this part is wrong.
There is already a difference between 8 GB and 10/11/12 GB in DOOM Eternal at 4K, and in Wolfenstein 2 with manually enabled maximum settings (above Mein Leben) at 1440p.

For people who mod, the actual PCMR users, not just the ones who think opening an ini file every 5 years makes them hardcore, this does matter too.

More VRAM is always good. Personally, 8 GB is obsolete for me at 1440p and 4K. 10 GB is not, but it's close, since I can break it if I want to. But not obsolete. Still, just because you never used something does not mean that it isn't important.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,842 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Joined
Feb 18, 2017
Messages
688 (0.24/day)
Or you know, regarding the last part, some people just like RT for what it is. A lot of people in this thread have praised AMD for the achievement with this card even if they prefer an Nvidia card because of RT. I can't understand why it always ends in an AMD vs Nvidia war. I was disappointed with Nvidia's choice to go with 8 GB of RAM on the 3070 just as much as I'm disappointed with the RT performance of the new AMD card.

Edit: for the record, I will quite probably recommend an AMD card to people around me who are not interested in RT.
There is no problem when someone says they are disappointed with the RT performance (in my opinion, matching the rival's first-gen flagship with your own first-gen non-flagship card is not disappointing, or at least not in the way it is being framed) but otherwise calls it a great card, and of course those people are not the ones my comment was addressed to.
 
Joined
May 15, 2014
Messages
235 (0.06/day)
If you need the best RT performance RIGHT NOW, then Nvidia is clearly the way to go. That being said, I won't be surprised if after a year or two, RT performance of the 6800 and 6800xt is noticeably better than 2080ti.
A lot will depend on which arch is better suited to inline RT in future titles. I'd hazard most titles are optimised for NV ATM (obviously), and run-time optimisation will be trickier for AMD. I guess if AMD have a stable software stack they can build on performance from there.

G6X is NVIDIA exclusive from what I understand, also AMD's board partners cannot change memory technologies
Exclusive insofar as implementing the tech in their memory controllers. I think AMD is unlikely to use GDDR6X even for their next gen.

Good performance, however it seems the infinity cache approach to accelerating memory performance can't keep up with 4k performance.
It's a bit early to tell & may still require driver tuning. It may be a case of frametime being more ALU limited @ 4K, giving Ampere the advantage.

I hope a board partner attempts a GDDR6X version just to see if my theory is correct.
Not possible.
 
Joined
Jun 24, 2020
Messages
93 (0.06/day)
While being 9% cheaper, I consider that a win!
Certainly, they might just end up being the same (and ridiculously high) price.

Wait, if AMD keeps fixing the drivers, will the 6800 XT finally be as good as the 3080, like a 0% margin?
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
Do you?

AMD is managing this with a narrower bus and slower memory; they can always bump those up and gain a considerable chunk of performance just by doing that and nothing more. Also, caches scale really well with newer nodes in terms of power, so there is a very good chance AMD will trash Nvidia in performance/watt even harder when both get to the next node.

Also, you do realize AMD is probably already working on a future 5nm design by now, while Nvidia, I'm led to believe, is still figuring out how to bring Ampere to 7nm? Do I have to explain how things aren't exactly going well for them if that's the case?

Seems like Nvidia has just done an Intel-style upsy-daisy by screwing up their node situation.
Actually, you don't. AMD in fact HAD to go for a feature like Infinity Cache. It was not an option, it was a requirement, because of their ray tracing solution. If you look at the RDNA2 presentation slides, there is a clear line that says:
"4 texture OR ray ops/clock per CU"


Now what do you think that means? I'll enlighten you. Remember there is such a thing as the bounding volume hierarchy (BVH). That is a big chunk of data, as it holds the bounding boxes for all objects in the scene to help optimize ray intersections. Unfortunately for AMD, as you see in the slide, their cores cannot perform ray tracing operations at the same time as texturing operations, unlike in Nvidia's design. Even worse, they are using the same memory (as AMD repeatedly stated) as they use for texture data. If AMD GPUs did not have the Infinity Cache, they would be in huge trouble, as their per-core cache would keep being invalidated all the time, having to dump the BVH data and replace it with texture data (and vice versa). And you can see Big Navi paying the price for that at 4K and in ray tracing.
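
To put a very rough number on how big that BVH working set is next to the on-chip caches, here's a back-of-the-envelope sketch. The node layout and scene size are made up for illustration, not AMD's actual driver structures:

#include <cstdio>
#include <cstdint>
#include <cstddef>

// A typical compact BVH node: an axis-aligned bounding box plus child/leaf indices.
// Real GPU drivers use their own packed formats; this is only for a size estimate.
struct BvhNode {
    float    boundsMin[3];
    float    boundsMax[3];
    uint32_t firstChildOrPrim;  // child index for interior nodes, primitive index for leaves
    uint32_t primCount;         // 0 for interior nodes
};

int main()
{
    const std::size_t triangles = 2000000;            // an assumed, plausible scene size
    const std::size_t nodes     = 2 * triangles - 1;  // binary BVH with one triangle per leaf
    const double bvhMB = static_cast<double>(nodes) * sizeof(BvhNode) / (1024.0 * 1024.0);

    std::printf("BVH node size: %zu bytes\n", sizeof(BvhNode));  // 32 bytes here
    std::printf("Estimated BVH: ~%.0f MB\n", bvhMB);             // ~122 MB for this scene
    std::printf("Per-CU caches: tens of KB; Infinity Cache: 128 MB shared\n");
    // The point: the BVH alone can dwarf the small per-CU caches that also have to hold
    // texture data, so without a large last-level cache, traversal and texturing would
    // constantly evict each other and spill out to GDDR6.
    return 0;
}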
 
Joined
Apr 18, 2013
Messages
1,260 (0.30/day)
Location
Artem S. Tashkinov
More VRAM is always good.

More VRAM is always more expensive. Especially for NVIDIA, which uses GDDR6X that is exclusive to it.

FTFY.

And these two games are outliers, and I presume they could have been fixed had they been released more recently. Lastly, good luck finding any visual differences between Uber and Ultra textures on your 4K monitor.
 
Joined
Feb 18, 2017
Messages
688 (0.24/day)
More VRAM is always more expensive. Specially for NVIDIA which uses exclusive to it GDDR6X.
And why is it our problem? NV chose this way, AMD chose another. Anyway, the 3080 probably won't have a problem with its 10 GB of VRAM, but the 3070 will.
 
Joined
Oct 26, 2019
Messages
168 (0.09/day)
"Much has been talked about VRAM size during NVIDIA's Ampere launch. The GeForce RTX 3080 offers 10 GB VRAM, which I still think is plenty for today and the near future. Planning many years ahead with hardware purchases makes no sense, especially when it comes to VRAM—you'll run out of shading power long before memory becomes a problem. GTX 1060 will not drive 4K, no matter if it has 3 GB or 6 GB. Game developers will certainly make use of the added memory capacity on the new consoles, we're talking 8 GB here, as a large portion of the console's total memory is used for the OS, game and game data. I think I'd definitely be interested in a RX 6700 Series with just 8 GB VRAM, at more affordable pricing. On the other hand, AMD's card is cheaper than NVIDIA's product, and has twice the VRAM at the same time, so really no reason to complain."

Wizzard, no offence, but this part is wrong.
There is already a difference between 8 GB and 10/11/12 GB in DOOM Eternal at 4K, and in Wolfenstein 2 with manually enabled maximum settings (above Mein Leben) at 1440p.

For people who mod, the actual PCMR users, not just the ones who think opening an ini file every 5 years makes them hardcore, this does matter too.

More VRAM is always good. Personally, 8 GB is obsolete for me at 1440p and 4K. 10 GB is not, but it's close, since I can break it if I want to. But not obsolete. Still, just because you never used something does not mean that it isn't important.
10 GB does affect DOOM Eternal performance a bit, but at 8K. At 4K, 8 GB is more than sufficient for the upcoming years.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,997 (0.34/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA) / NVIDIA RTX 4090 Founder's Edition
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case HYTE Hakos Baelz Y60
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000L
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Wooting 60HE+ / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Oculus Quest 2 128GB
Software Windows 11 Pro 64-bit 23H2 Build 22631.4317
-NV cards can be UVd to get better power consumption

To add clarity to this, they can be undervolted (I would assume 0.850 V, as I'm at 0.825 V for a 1% loss) to just slightly below the peak gaming power usage of the 6800 XT (284 W) and still retain stock performance. Samsung 8nm is definitely inferior to TSMC 7nm, but not by much.

I'd like to see someone try undervolting (not overclocking) the 6800 XT. I would be seriously impressed if it can get below 240W.
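
For anyone curious why undervolting moves the power number so much, a first-order estimate of dynamic power scaling gives a feel for it. The stock voltage and board power below are assumptions on my part, not measurements:

#include <cstdio>

int main()
{
    const double stockVoltage = 1.000;  // assumed typical Ampere gaming voltage
    const double underVoltage = 0.850;  // the undervolt discussed above
    const double boardPower   = 320.0;  // RTX 3080 rated board power in watts

    // Dynamic power roughly follows P ~ C * f * V^2; holding clocks constant,
    // only the voltage term changes.
    const double scaled = boardPower * (underVoltage * underVoltage)
                                     / (stockVoltage * stockVoltage);
    std::printf("First-order estimate at %.3f V: ~%.0f W (down from %.0f W)\n",
                underVoltage, scaled, boardPower);
    // Real-world savings are smaller because memory, VRM losses and fans don't scale
    // with core voltage, but it shows why an undervolted 3080 can land near the
    // 6800 XT's ~300 W class.
    return 0;
}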
 
Joined
Feb 23, 2019
Messages
6,069 (2.88/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3800 CL16
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
[Attached screenshot: 1605715584202.png]

The 6800 is surprisingly in stock. The 6800 XT, not so much, but its price matches the 3080 FE.
 