
AMD Radeon RX 6800 XT

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,843 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
"Much has been said about VRAM size during NVIDIA's Ampere launch. The GeForce RTX 3080 offers 10 GB of VRAM, which I still think is plenty for today and the near future. Planning many years ahead with hardware purchases makes no sense, especially when it comes to VRAM: you'll run out of shading power long before memory becomes a problem. A GTX 1060 will not drive 4K, no matter whether it has 3 GB or 6 GB. Game developers will certainly make use of the added memory capacity on the new consoles, but we're talking about 8 GB here, as a large portion of the console's total memory is used for the OS, the game, and game data. I'd definitely be interested in an RX 6700 Series with just 8 GB of VRAM at more affordable pricing. On the other hand, AMD's card is cheaper than NVIDIA's product and has twice the VRAM at the same time, so there's really no reason to complain."

W1zzard, no offence, but this part is wrong.
There is already a difference between 8 GB and 10/11/12 GB in DOOM Eternal at 4K, and in Wolfenstein II with manually enabled maximum settings (above Mein Leben) at 1440p.

For people who mod (the actual PCMR users, not just the ones who think opening an .ini file every five years makes them hardcore), this matters too.

More VRAM is always good. Personally, 8 GB is obsolete for me at 1440p and 4K. 10 GB is not, but it's close, since I can break it if I want to. Still, just because you never used something does not mean it isn't important.
Yeah, no doubt, you can always make games run badly by changing settings or replacing textures. But these are very much edge cases: maybe 1,000 gamers out of 100 million? (making up random numbers)

More = good, but more = $, so more != good ;)
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,997 (0.34/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA) / NVIDIA RTX 4090 Founder's Edition
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case HYTE Hakos Baelz Y60
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000L
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Wooting 60HE+ / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Oculus Quest 2 128GB
Software Windows 11 Pro 64-bit 23H2 Build 22631.4317
More VRAM is always more expensive. Especially for NVIDIA, which uses GDDR6X that is exclusive to it.

This is probably the reason why they stuck with a 320-bit bus and limited the RTX 3080 to just 10 GB. Having more exclusive/proprietary/expensive GDDR6X chips would've pushed up the price.
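For context on how the bus width pins down capacity: each GDDR6X chip has a 32-bit interface, so a 320-bit bus means ten chips, and at the 8 Gb (1 GB) density NVIDIA used, that is 10 GB total. A quick sketch of that arithmetic (the per-chip figures are standard GDDR6X specs, not numbers from this thread):

```python
# VRAM capacity follows from bus width: one 32-bit channel per GDDR6X chip.

def vram_config(bus_width_bits: int, gb_per_chip: int = 1) -> tuple[int, int]:
    """Return (chip_count, capacity_gb) for a given memory bus width,
    assuming 32-bit chips at the given density (1 GB = 8 Gb chips here)."""
    chips = bus_width_bits // 32
    return chips, chips * gb_per_chip

print(vram_config(320))     # RTX 3080: 10 chips, 10 GB
print(vram_config(384, 2))  # a full 384-bit bus with 2 GB chips: 12 chips, 24 GB
```

Higher chip density (or a clamshell layout) is the only way past that, and every extra chip is exactly the added cost being discussed.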
 
Joined
Jul 23, 2019
Messages
71 (0.04/day)
Location
France
System Name Computer
Processor Intel Core i9-9900kf
Motherboard MSI MPG Z390 Gaming Pro Carbon
Cooling MSI MPG Coreliquid K360
Memory 32GB G.Skill Trident Z DDR4-3600 CL16-19-19-39
Video Card(s) Asus GeForce RTX 4070 DUAL OC
Storage A Bunch of Sata SSD and some HD for storage
Display(s) Asus MG278q (main screen)
More VRAM is always more expensive. Especially for NVIDIA, which uses GDDR6X that is exclusive to it.

FTFY.

And these two games are outliers, and I presume they could have been fixed if they had been released more recently. Lastly, good luck finding any visual difference between Uber and Ultra textures on your 4K monitor.
The thing is, benchmarks are usually run with nothing else on the test bench. I ran into the 8 GB VRAM limit when using the high-res pack in WD:L while W1zz did not, because I usually have a bunch of things running on my other monitor, and some of them (a YouTube video playing, for example) do add to VRAM usage. So, while I could do without the high-res pack, it does not seem very future-friendly if I'm already potentially hitting the limit.
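On the background-apps point, you can watch this yourself on NVIDIA cards with `nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits`, which prints one `used, total` line per GPU in MiB. A small sketch that parses that output (the sample reading below is made up for illustration, not a real measurement):

```python
# Parse the CSV output of:
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits
# One "used, total" line per GPU, values in MiB.

def parse_vram_usage(output: str) -> list[dict]:
    gpus = []
    for line in output.strip().splitlines():
        used, total = (int(x.strip()) for x in line.split(","))
        gpus.append({"used_mib": used, "total_mib": total,
                     "used_pct": round(100 * used / total, 1)})
    return gpus

# Hypothetical reading: a browser playing video has already pushed an
# 8 GiB card close to its limit before the game allocates anything.
print(parse_vram_usage("8081, 8192"))
```

Running the real command in a loop while gaming makes it easy to see whether desktop apps are eating into the budget the reviewers measured.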
 
Joined
Oct 25, 2020
Messages
6 (0.00/day)
Fanboyism at its finest. There's almost zero difference between 10900K and 5950X at resolutions above 1080p and 10900K is as fast or faster than 5950X at 1080p.
That might not be as true as you think, as SAM will benefit AMD CPU owners. Here is one example (starts at 16:57):

[attached screenshot]
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
Edit: for the record, I will quite probably recommend an AMD card to people around me who are not interested in RT.
That would be a mistake. AMD cards do not have anything similar to DLSS. DLSS adds a significant boost to the longevity of a card: just as you can play 4K games natively today on an RTX 3080 (DLSS off), you will be able to play new games four years from now with DLSS on. You will not be able to do that on AMD's RX 6000 series.
 
Joined
Feb 23, 2019
Messages
6,069 (2.88/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3800 CL16
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
Joined
Jan 21, 2020
Messages
109 (0.06/day)
No, they're both crap. RT is still not ready for prime time.
That's interesting :D Then why are all the biggest titles of today implementing raytracing? Why is AMD rushing out its own raytracing implementation (even though it's inferior to NVIDIA's)? Why do the consoles now support raytracing? :D
 
Joined
Dec 28, 2012
Messages
3,884 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
yeah, as much as I love competition, and as fast as the card might be, I'm not comfy with the price hike to over $500 over the last couple of years
$500 was a long time ago, when we didn't need expensive GDDR6/6X memory buses, pricier trace routing, or more complicated coolers to deal with hotspots from tiny nodes, and when new nodes were regularly coming out. And of course you can't forget inflation: with two massive stimulus packages (in the US) in the last 10 years and tanking interest rates for most of that time, inflation is going to occur.

Also, remember that the 8800 Ultra was over $800 in 2007.


These high prices are nothing new.
 
Joined
Jul 23, 2019
Messages
71 (0.04/day)
Location
France
System Name Computer
Processor Intel Core i9-9900kf
Motherboard MSI MPG Z390 Gaming Pro Carbon
Cooling MSI MPG Coreliquid K360
Memory 32GB G.Skill Trident Z DDR4-3600 CL16-19-19-39
Video Card(s) Asus GeForce RTX 4070 DUAL OC
Storage A Bunch of Sata SSD and some HD for storage
Display(s) Asus MG278q (main screen)
That would be a mistake. AMD cards do not have anything similar to DLSS. DLSS adds a significant boost to the longevity of a card: just as you can play 4K games natively today on an RTX 3080 (DLSS off), you will be able to play new games four years from now with DLSS on. You will not be able to do that on AMD's RX 6000 series.
It will only really be true if all new games support DLSS (which is currently not the case; look at Valhalla, for example) and if the equivalent solution AMD announced isn't up to the task. It's not really an easy choice, as it depends on a lot of things that you can't really predict.
 
Joined
Dec 28, 2012
Messages
3,884 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
"Much has been said about VRAM size during NVIDIA's Ampere launch. The GeForce RTX 3080 offers 10 GB of VRAM, which I still think is plenty for today and the near future. Planning many years ahead with hardware purchases makes no sense, especially when it comes to VRAM: you'll run out of shading power long before memory becomes a problem. A GTX 1060 will not drive 4K, no matter whether it has 3 GB or 6 GB. Game developers will certainly make use of the added memory capacity on the new consoles, but we're talking about 8 GB here, as a large portion of the console's total memory is used for the OS, the game, and game data. I'd definitely be interested in an RX 6700 Series with just 8 GB of VRAM at more affordable pricing. On the other hand, AMD's card is cheaper than NVIDIA's product and has twice the VRAM at the same time, so there's really no reason to complain."

W1zzard, no offence, but this part is wrong.
There is already a difference between 8 GB and 10/11/12 GB in DOOM Eternal at 4K, and in Wolfenstein II with manually enabled maximum settings (above Mein Leben) at 1440p.

For people who mod (the actual PCMR users, not just the ones who think opening an .ini file every five years makes them hardcore), this matters too.

More VRAM is always good. Personally, 8 GB is obsolete for me at 1440p and 4K. 10 GB is not, but it's close, since I can break it if I want to. Still, just because you never used something does not mean it isn't important.

Wolfenstein: The New Order was one of the two games to expose the limitations of the 2 GB framebuffer on the 680/770 cards way back when, the other being Forza 4. But most other games ran perfectly fine, and by the time the 2 GB limit actually became significant, the 680's performance class was in the range of the 4 GB 560s and 770 usage was nearly nonexistent.

Not saying VRAM limits can't happen, but the doom-and-gloom over NVIDIA's 10 GB buffer is way overhyped. A handful of games with manual settings that obliterate VRAM usage != 10 GB not being enough for 99% of PC gamers, even at the high end.
 
Joined
Jul 8, 2019
Messages
169 (0.09/day)
It will only really be true if all new games support DLSS (which is currently not the case; look at Valhalla, for example) and if the equivalent solution AMD announced isn't up to the task. It's not really an easy choice, as it depends on a lot of things that you can't really predict.

It will only matter if demanding games, i.e. most AAA titles, get DLSS. For the rest (indie or AA games) you don't need it, as they will run at 4K native.
 
Joined
Apr 6, 2015
Messages
250 (0.07/day)
Location
Japan
System Name ChronicleScienceWorkStation
Processor AMD Threadripper 1950X
Motherboard Asrock X399 Taichi
Cooling Noctua U14S-TR4
Memory G.Skill DDR4 3200 C14 16GB*4
Video Card(s) AMD Radeon VII
Storage Samsung 970 Pro*1, Kingston A2000 1TB*2 RAID 0, HGST 8TB*5 RAID 6
Case Lian Li PC-A75X
Power Supply Corsair AX1600i
Software Proxmox 6.2
$500 was a long time ago, when we didnt need expensive DDR6/X buses, more expensive tracing, more complicated coolers to deal with hotspots from tiny nodes, and new nodes were regularly coming out. And of course you cant forget inflation, with two massive stimulus packages (in the US) in the last 10 years and tanking interest rates for most of that time inflation is goign to occur.

Also, remember that the 8800 ultra was over $800 in 2007.


These high prices are nothing new.
LOL
I was a proud 8800 GTS 320 MB owner back then, until a month later they dropped the 8800 GT with hardware HD decoding.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,997 (0.34/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA) / NVIDIA RTX 4090 Founder's Edition
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case HYTE Hakos Baelz Y60
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000L
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Wooting 60HE+ / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Oculus Quest 2 128GB
Software Windows 11 Pro 64-bit 23H2 Build 22631.4317
That's interesting :D Then why are all the biggest titles of today implementing raytracing? Why is AMD rushing out its own raytracing implementation (even though it's inferior to NVIDIA's)? Why do the consoles now support raytracing? :D

While RT is being implemented now, it is still not at an acceptable performance level in currently released games. If enabling RT only cost 10% to 20% of performance (like 8x AA did in previous years) while giving a noticeable uplift in graphics fidelity, it would be acceptable. But a drop of 50% or more? No.
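To put those percentages in frame-rate terms (illustrative numbers only, not measurements from any particular game):

```python
# Frame rate left after an RT performance hit, and whether it still clears 60 FPS.

def fps_after_hit(base_fps: float, hit_pct: float) -> float:
    return base_fps * (1 - hit_pct / 100)

for hit in (10, 20, 50):
    fps = fps_after_hit(100, hit)
    print(f"{hit}% hit: {fps:.0f} FPS ({'fine' if fps >= 60 else 'below 60'})")
```

From a 100 FPS baseline, a 20% hit still leaves a comfortable 80 FPS, while a 50% hit lands you right at 50, under the 60 FPS bar many people consider the minimum.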

Also, you can safely assume that console games will not be using RT at its max, but will only implement subtle visual improvements (light mirror effects and such).
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
That might not be as true as you think, as SAM will benefit AMD CPU owners. Here is one example (starts at 16:57):

View attachment 176136
You responded to a post about how "fanboyism is bad" with the biggest AMD fanboy page on the internet, HardwareUnboxed? Oh come on. You can't trust their results at all. Their results have been significantly tilted in favor of AMD, compared to the rest of the internet, for their entire existence.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,006 (2.50/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
Thanks for the review.


RT is the antithesis of DLSS: you increase image quality with one, then decrease resolution (and IQ) with the other for a performance boost.


I imagine AMD will make the same move as soon as the node is available. Do you have insider information that TSMC plans on barring AMD from using the node to make Nvidia look better :eek:?

So far TSMC is barring everyone from using 5nm except for Apple...
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,261 (4.67/day)
Location
Kepler-186f
Processor 7800X3D -25 all core ($196)
Motherboard B650 Steel Legend ($179)
Cooling Frost Commander 140 ($42)
Memory 32gb ddr5 (2x16) cl 30 6000 ($80)
Video Card(s) Merc 310 7900 XT @3100 core $(705)
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p ($399)
Case NZXT H710 (Red/Black) ($60)
honestly I am glad I got the 6800 non-XT. I imagine that with my super-tuned RAM at 3600 CL14-14-14, which I already have stable on my 5600X, plus Rage Mode, a medium OC, and Smart Access Memory enabled, I will be nipping at the heels of a 3080 even with the non-XT.

but mainly, since I game at 1080p 165 Hz or 1440p 144 Hz, the 6800 with all that stuff mentioned above maxes out the frame rate anyway... so yeah, I'm set, and I saved $80 on top of that. Would have liked a 6800 XT for sure, but I am just thankful I got what I got.

also love the title... "nvidia is in trouble" haha indeed, glorious times.
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
It will only really be true if all new games support DLSS (which is currently not the case; look at Valhalla, for example) and if the equivalent solution AMD announced isn't up to the task. It's not really an easy choice, as it depends on a lot of things that you can't really predict.
You can't expect an AMD-sponsored game to support DLSS. That is a very bad (and minority) example.
 
Joined
Dec 28, 2012
Messages
3,884 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
RDNA2 is shaping up to be the next Evergreen series. Wouldn't surprise me to see AMD ripping a significant chunk of market share back from NVIDIA.

The AIB 6800 XTs are going to be awesome with beefier power delivery and higher limits. And given how slowly the fans spin at stock, there is plenty of thermal headroom as well.

Now I really want to see what the 6900 XT is capable of, with the 6800 XT OC already tickling the 3090 in NVIDIA's golden samples.
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
While RT is being implemented now, it is still not at an acceptable performance level in currently released games. If enabling RT only cost 10% to 20% of performance (like 8x AA did in previous years) while giving a noticeable uplift in graphics fidelity, it would be acceptable. But a drop of 50% or more? No.

Also, you can safely assume that console games will not be using RT at its max, but will only implement subtle visual improvements (light mirror effects and such).
I can play Control in 4K in full raytracing right now and get massive quality increase out of that. Now add Minecraft and Cyberpunk 2077 also with massive quality improvements gained from raytracing.
 
Joined
Jul 23, 2019
Messages
71 (0.04/day)
Location
France
System Name Computer
Processor Intel Core i9-9900kf
Motherboard MSI MPG Z390 Gaming Pro Carbon
Cooling MSI MPG Coreliquid K360
Memory 32GB G.Skill Trident Z DDR4-3600 CL16-19-19-39
Video Card(s) Asus GeForce RTX 4070 DUAL OC
Storage A Bunch of Sata SSD and some HD for storage
Display(s) Asus MG278q (main screen)
You can't expect an AMD-sponsored game to support DLSS. That is a very bad (and minority) example.
There will always be AAA games sponsored by AMD; my point was just that you can't count on DLSS for every demanding game that comes out in the future. So while I agree with you that it is a nice edge for NVIDIA to take into account, it's not a very reliable one.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,261 (4.67/day)
Location
Kepler-186f
Processor 7800X3D -25 all core ($196)
Motherboard B650 Steel Legend ($179)
Cooling Frost Commander 140 ($42)
Memory 32gb ddr5 (2x16) cl 30 6000 ($80)
Video Card(s) Merc 310 7900 XT @3100 core $(705)
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p ($399)
Case NZXT H710 (Red/Black) ($60)
Your own review shows the 6800 XT as 5% slower in raster vs the 3080 and getting stomped in DXR.

How is NVIDIA in trouble over a $50 price difference?

NVIDIA cards don't OC, for one thing (the new ones don't), while AMD's OC very well, surpassing the 3080 really across the board even with both OC'd.

also competition is just great for the PC gaming industry... so just be happy and move on with life?
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
There will always be AAA games sponsored by AMD; my point was just that you can't count on DLSS for every demanding game that comes out in the future. So while I agree with you that it is a nice edge for NVIDIA to take into account, it's not a very reliable one.
AMD-sponsored games are a small percentage of all the games coming out. So recommending Nvidia is actually very reliable for GPU longevity.
 
Joined
Dec 28, 2012
Messages
3,884 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Your own review shows the 6800 XT as 5% slower in raster vs the 3080 and getting stomped in DXR.

How is NVIDIA in trouble over a $50 price difference?
NVIDIA has practically zero OC headroom. AMD has decent headroom and is restricted by power limits; AIB products with higher limits and more memory OCing will expand that further. With OC factored in, the 6800 XT totally closes the gap with NVIDIA.

Raytracing continues to be a selling point only for the tiny minority of users who love the game Control; outside of that game, raytracing is practically vaporware. A $50 difference means AMD gets more attention, and that's enough to convince people with the cards being so close, while NVIDIA doesn't have a lot of headroom to cut prices on a GA die they can't seem to make in any large numbers.
 