
AMD Radeon RX 6800

Joined
Jul 26, 2019
Messages
419 (0.21/day)
Processor R5 5600X
Motherboard Asus TUF Gaming X570-Plus
Memory 32 GB 3600 MT/s CL16
Video Card(s) Sapphire Vega 64
Storage 2x 500 GB SSD, 2x 3 TB HDD
Case Phanteks P300A
Software Manjaro Linux, W10 if I have to
Looks like both cards are out of stock. Hopefully availability will be better next week when the board partner cards launch.
 
Joined
Mar 21, 2016
Messages
2,508 (0.79/day)
I'm curious how AMD's take on DLSS will end up. Will it be developer-dependent on a game-by-game basis, or game-agnostic and working retroactively on all games, more like how mCable works but obviously much more sophisticated? I'd rather have the latter, even if the performance uplift were half as good as DLSS with similar image quality. Having it just work across all titles is a hell of a lot more useful in the grand scheme. Both are useful, but in the big picture the way DLSS works is less optimal than mCable's approach to upscaling, since the latter "just works", ahem, like Jensen ironically touted RTX. Let's hope AMD's approach is closer to mCable, as that would be more ideal. Where DLSS has its strengths and merit is closer to AMD's Mantle: eking out that extra bit of close-to-the-metal performance, though it requires developer attention, which is a big drawback.

I wonder if AMD's upscaling can be combined with Radeon Boost, either before or after it. Two options seem possible: upscale the native resolution to a target, which by extension gets applied to Radeon Boost as well, or downsample with Radeon Boost and then upscale only the in-motion resolution. The latter would give a cleaner, purer stationary image, but less of a combined performance boost. Additionally, either option could support custom frame-rate targets, giving more optimal results where needed for image quality relative to motion and input lag. Combine that with variable rate shading triggered by similar frame-rate thresholds, and it could have a really big impact on overall performance.
 

Durvelle27

Moderator
Staff member
Joined
Jul 10, 2012
Messages
6,796 (1.50/day)
Location
Memphis, TN
System Name Black Prometheus
Processor |AMD Ryzen 7 1700
Motherboard ASRock B550M Pro4|MSI X370 Gaming PLUS
Cooling Thermalright PA120 SE | AMD Stock Cooler
Memory G.Skill 64GB(2x32GB) 3200MHz | 32GB(4x8GB) DDR4
Video Card(s) ASUS DirectCU II R9 290 4GB
Storage Sandisk X300 512GB + WD Black 6TB+WD Black 6TB
Display(s) LG Nanocell85 49" 4K 120Hz + ACER AOPEN 34" 3440x1440 144Hz
Case DeepCool Matrexx 55 V3 w/ 6x120mm Intake + 3x120mm Exhaust
Audio Device(s) LG Dolby Atmos 5.1
Power Supply Corsair RMX850 Fully Modular| EVGA 750W G2
Mouse Logitech Trackman
Keyboard Logitech K350
Software Windows 10 EDU x64
Really excited as this offering is great
 
Joined
May 12, 2016
Messages
259 (0.08/day)
Processor Intel Core i7 11700
Motherboard Asus b560-i ROG
Cooling Thermalright Assassin King Mini
Memory G.Skill Trident Z 3600
Video Card(s) RTX 3080 FE
Display(s) Dell S2721DGF
Case Ncase M1
Power Supply Corsair SF750
Mouse HyperX
Keyboard HyperX

JonCo

New Member
Joined
Nov 18, 2020
Messages
1 (0.00/day)
i guess this thing is evolving to "what do we want to test for power?". i would definitely throw out "cold card". but the other 3 results are all valid in their own way. opinions?


I came on with very similar questions to Bobmeix - I'd been thinking the power consumption figures looked suspiciously good. I'd definitely reiterate his comments about how great your reviews and work are - the best GPU reviews on the web, in my view.

I think it's worth noting that performance per watt is also impacted by this, and that it is calculated separately for 1080p, 1440p and 4K. Taking power consumption figures at 1080p and using them for the 1440p and 4K performance-per-watt charts seems a bit arbitrary now that we know there are significant differences in power consumption at different resolutions. Depending on the amount of work you are willing to do, the options look to me to be:
  1. Most work, most accurate: Run power consumption at all three resolutions, publish all three and use resolution-specific figures in performance-per-watt
  2. Middle ground: Add either 1440p or 4K to the power consumption test and take an average of the two resolutions. 4K seems the obvious choice, as you get both ends of the spectrum that way
  3. Least work, least accurate: Move from 1080p to 1440p for the core benchmark, on the assumption that 1440p is likely to be in the middle of any performance gradients like we're seeing here
I think any of these would be an improvement - how many people are realistically going to buy a 240-360 Hz monitor to justify a 3080 or 6800 XT at 1080p? 1440p and 4K are the more obvious use cases...
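The resolution-specific option (1) is easy to express in code. A minimal sketch of the arithmetic, with all FPS and power numbers invented purely for illustration (not review data):

```python
# Hypothetical numbers for illustration only -- not measured data.
avg_fps = {"1080p": 150.0, "1440p": 110.0, "4k": 62.0}   # avg FPS across a test suite
power_w = {"1080p": 180.0, "1440p": 205.0, "4k": 225.0}  # board power at each resolution

def perf_per_watt(fps, watts):
    """FPS delivered per watt of board power."""
    return fps / watts

# Resolution-specific calculation (option 1):
for res in avg_fps:
    print(f"{res}: {perf_per_watt(avg_fps[res], power_w[res]):.3f} FPS/W")

# Reusing the 1080p power figure for every chart (the current method)
# overstates efficiency at 4K, where the card draws more power:
naive_4k = perf_per_watt(avg_fps["4k"], power_w["1080p"])
true_4k = perf_per_watt(avg_fps["4k"], power_w["4k"])
print(f"4K perf/W overstated by {100 * (naive_4k / true_4k - 1):.0f}%")
```

With these made-up numbers the 1080p-based figure overstates 4K efficiency by 25%, which is the kind of gap the proposal above would eliminate.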
 
Joined
Jul 18, 2016
Messages
354 (0.12/day)
Location
Indonesia
System Name Nero Mini
Processor AMD Ryzen 7 5800X 4.7GHz-4.9GHz
Motherboard Gigabyte X570i Aorus Pro Wifi
Cooling Noctua NH-D15S+3x Noctua IPPC 3K
Memory Team Dark 3800MHz CL16 2x16GB 55ns
Video Card(s) Palit RTX 2060 Super JS Shunt Mod 2130MHz/1925MHz + 2x Noctua 120mm IPPC 3K
Storage Adata XPG Gammix S50 1TB
Display(s) LG 27UD68W
Case Lian-Li TU-150
Power Supply Corsair SF750 Platinum
Software Windows 10 Pro
Damn, look at that efficiency! That's an insane performance/watt showing from AMD. I thought I was reading it wrong at first. Also, the overclocking gains are not bad at all at 9%+. Seems like a better buy than the 6800 XT imo, especially since with RT on they perform sort of similarly...
 
Joined
Nov 4, 2019
Messages
234 (0.13/day)
Yeah, more VRAM is nice, but it shouldn't be $70 more. As for Smart Access Memory (or Resizable BAR), Nvidia has already assured people that they will support the feature soon.

Ridiculous. Before AMD did it, people were paying $70 more to go from 4GB to 8GB; now you get 16GB total and you are complaining. I can't even...
 


bug

Joined
May 22, 2015
Messages
13,836 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
The frametime variance seems much lower on the RX 6800 series vs the Nvidia lineup, which should result in much smoother gameplay. If this holds for most of the tested games, then the 6800 will give you a much better experience than the 3070, beyond what the percentage difference suggests.
Only in some titles, according to this: https://www.techpowerup.com/review/amd-radeon-rx-6800/39.html
And in some titles Nvidia still gets ahead.
Not worth an extra $80 to me.
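For what it's worth, the "smoother gameplay" claim can be quantified from a frametime log. A rough sketch with made-up traces (not TPU data), showing how two cards with similar average FPS can pace frames very differently:

```python
import statistics

# Illustrative frametime traces in milliseconds (invented numbers).
# Lower variance = smoother-feeling gameplay even at a similar average FPS.
card_a = [16.1, 16.4, 16.2, 16.5, 16.3, 16.2, 16.4, 16.1]   # steady pacing
card_b = [12.0, 22.5, 13.1, 21.0, 12.4, 23.2, 12.8, 20.9]   # spiky pacing

def pacing_stats(frametimes_ms):
    """Average FPS plus the spread of individual frametimes."""
    avg_ms = statistics.mean(frametimes_ms)
    return {
        "avg_fps": 1000.0 / avg_ms,
        "stdev_ms": statistics.stdev(frametimes_ms),
        "worst_ms": max(frametimes_ms),
    }

for name, trace in (("steady", card_a), ("spiky", card_b)):
    s = pacing_stats(trace)
    print(f"{name}: {s['avg_fps']:.0f} FPS avg, "
          f"stdev {s['stdev_ms']:.2f} ms, worst {s['worst_ms']:.1f} ms")
```

Both traces land near 60 FPS on average, but the spiky one has a far higher frametime standard deviation and worst case, which is exactly what an average-FPS bar chart hides.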
 
Joined
Jan 15, 2018
Messages
55 (0.02/day)
The 6800 is meh.
In most games it's inferior to the 3070.
If no 6800 Nano is released, then a modded ZOTAC 3070 will currently be the best one for ITX builds.

But there's good news about the 6800: it has low spikes, only 423 W within 130 μs.
That means we can use a Corsair SF450 on it with no risk!
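A back-of-envelope check shows why such a short spike is usually harmless. This uses the 423 W / 130 μs figure from the post plus an assumed ~210 W baseline draw; real PSU OCP/OPP behavior is more complex than this sketch:

```python
# Energy carried by one brief power excursion above the baseline draw.
spike_w = 423.0        # peak power during the excursion (from the post)
duration_s = 130e-6    # spike duration: 130 microseconds
baseline_w = 210.0     # assumed typical gaming draw (illustrative)

extra_energy_j = (spike_w - baseline_w) * duration_s
print(f"Extra energy in one spike: {extra_energy_j * 1000:.2f} mJ")
# Only tens of millijoules -- easily buffered by the PSU's output
# capacitors, which is why microsecond-scale spikes rarely trip a
# quality 450 W unit even when they exceed its rated wattage.
```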
 
Joined
Dec 10, 2014
Messages
1,335 (0.36/day)
Location
Nowy Warsaw
System Name SYBARIS
Processor AMD Ryzen 5 3600
Motherboard MSI Arsenal Gaming B450 Tomahawk
Cooling Cryorig H7 Quad Lumi
Memory Team T-Force Delta RGB 2x8GB 3200CL16
Video Card(s) Colorful GeForce RTX 2060 6GV2
Storage Crucial MX500 500GB | WD Black WD1003FZEX 1TB | Seagate ST1000LM024 1TB | WD My Passport Slim 1TB
Display(s) AOC 24G2 24" 144hz IPS
Case Montech Air ARGB
Audio Device(s) Massdrop + Sennheiser PC37X | Koss KSC75
Power Supply Corsair CX650-F
Mouse Razer Viper Mini | Cooler Master MM711 | Logitech G102 | Logitech G402
Keyboard Drop + The Lord of the Rings Dwarvish
Software Tiny11 Windows 11 Education 24H2 x64
I had no problem getting one
Look at this guy^ He's so happy it's unbearable. I lost count of how many threads he's visited to reply that he managed to snag a card. :kookoo::laugh:
 
Joined
Nov 11, 2016
Messages
3,454 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
The performance-per-watt calculation is really off when you base the power consumption on a much older game at 1080p and divide it by the average FPS of all games tested.
Does the 6800 use 210 W in new games like Metro Exodus, AC Valhalla, etc.? If it doesn't, then the power consumption figure is wrong, and so is the performance per watt.

Maybe we can use a watt-per-FPS figure instead? Lock a game to 120 FPS and measure the average power consumption. This is still a valid comparison, since many people play games with a locked FPS.
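The locked-FPS idea could look something like this. The power numbers below are hypothetical, purely to show the watt-per-FPS arithmetic, not measurements of either card:

```python
# Sketch of the locked-FPS method: cap every card at the same frame
# rate and compare average board power. All numbers are invented.
fps_cap = 120.0
avg_power_at_cap = {       # hypothetical board power at the cap, watts
    "RX 6800": 165.0,
    "RTX 3070": 195.0,
}

for card, watts in avg_power_at_cap.items():
    print(f"{card}: {watts / fps_cap:.2f} W per FPS at {fps_cap:.0f} FPS cap")
```

The appeal of this metric is that both cards do identical work per second, so the comparison isolates efficiency rather than mixing it with performance.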
 
Joined
Jul 20, 2016
Messages
58 (0.02/day)
i guess this thing is evolving to "what do we want to test for power?". i would definitely throw out "cold card". but the other 3 results are all valid in their own way. opinions?


Purely IMHO: power consumption at the resolution that makes the most sense for that particular card? E.g. at 4K for a 4K card, and at 1080p for a cheap 1080p card. Or would that make the comparison unfair?
 
Joined
Aug 9, 2019
Messages
1,717 (0.88/day)
Processor 7800X3D 2x16GB CO
Motherboard Asrock B650m HDV
Cooling Peerless Assassin SE
Memory 2x16GB DR A-die@6000c30 tuned
Video Card(s) Asus 4070 dual OC 2610@915mv
Storage WD blue 1TB nvme
Display(s) Lenovo G24-10 144Hz
Case Corsair D4000 Airflow
Power Supply EVGA GQ 650W
Software Windows 10 home 64
Benchmark Scores Superposition 8k 5267 Aida64 58.5ns
I like the 6800, but it should have cost $499, not $579. The value is not good vs. the 6800 XT and 3070 unless the exceptional power consumption is important to you.

The 6800 should be a great chip for notebooks: downclock it slightly and make a 110-120 W variant. It should beat any existing notebook GPU by far.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,932 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Purely IMHO: power consumption at a resolution that makes the most sense for that particular card? E.g. for 4k card at 4k, while for a cheap 1080p card at 1080p. Or would that make the comparison unfair?
That is the big question, and where to draw the line. Random example: 2080 Ti, 4K or 1440p?
 
Joined
Sep 17, 2014
Messages
22,642 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Like I feared: bad value compared to the RTX 3070 across the board. But I think AMD is going to drop the price when the RTX 3070 Ti launches, because unlike Nvidia, AMD can't squeeze in another SKU.

They are going to have to move on price, but a big factor this time around is availability, and that impacts price too.

Still, 8~10 GB is not a great thing for this performance level, we're already seeing those numbers allocated today.

That is the big question, and where to draw the line. Random example: 2080 Ti, 4K or 1440p?

The answer is people need to be educated on what numbers they're really looking at.

CPU performance is not going up as fast as GPU performance. Resolutions have made a quadruple jump in pixel count from 1080p to 4K over the course of a few generations. Yes, this is going to shift the balance around in strange ways... it's like reading the news. Some people just read headlines and misinterpret the better half of them; others read the article to figure out what's really happening.

If you don't draw the line there, any news outlet is on a race to the bottom. You can't cater to stupid - we have social media for that.
 
Joined
Jan 8, 2017
Messages
9,499 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Over 2.5 GHz overclocked, almost 25% higher clocks compared to the previous generation; that's crazy.

I wouldn't be recommending an RTX 3070 or RX 6800 (even if the latter has 16 GB of VRAM) for 4K usage. Maybe 4K in slightly older games (or games modded with high-res texture packs) for the RX 6800. But for 1080p and 1440p? Definitely.

The 6800 is totally capable of 4K.

The performance-per-watt calculation is really off when you base the power consumption on a much older game at 1080p and divide it by the average FPS of all games tested.
Does the 6800 use 210 W in new games like Metro Exodus, AC Valhalla, etc.? If it doesn't, then the power consumption figure is wrong, and so is the performance per watt.

Maybe we can use a watt-per-FPS figure instead? Lock a game to 120 FPS and measure the average power consumption. This is still a valid comparison, since many people play games with a locked FPS.

What a bizarre and contrived way to try to invalidate AMD's far superior performance/watt; you gave me a good laugh.
 
Joined
Dec 10, 2014
Messages
1,335 (0.36/day)
Location
Nowy Warsaw
System Name SYBARIS
Processor AMD Ryzen 5 3600
Motherboard MSI Arsenal Gaming B450 Tomahawk
Cooling Cryorig H7 Quad Lumi
Memory Team T-Force Delta RGB 2x8GB 3200CL16
Video Card(s) Colorful GeForce RTX 2060 6GV2
Storage Crucial MX500 500GB | WD Black WD1003FZEX 1TB | Seagate ST1000LM024 1TB | WD My Passport Slim 1TB
Display(s) AOC 24G2 24" 144hz IPS
Case Montech Air ARGB
Audio Device(s) Massdrop + Sennheiser PC37X | Koss KSC75
Power Supply Corsair CX650-F
Mouse Razer Viper Mini | Cooler Master MM711 | Logitech G102 | Logitech G402
Keyboard Drop + The Lord of the Rings Dwarvish
Software Tiny11 Windows 11 Education 24H2 x64
The 6800 is totally capable of 4K.
I kinda agree with him. Looking at the graphs, the 6000 series seems bandwidth-starved at 4K.
While the 3000 series is no better with its cheapskate VRAM sizes.
 
Joined
Jul 5, 2013
Messages
28,208 (6.74/day)
The 6800 is their highend card that is going up against the 3080 not the 3070.
The benchmarks don't show that. The 6800 seems to be firmly a 3070 competitor, as it's wedged in between the 3070 and 3080 performance-wise. AMD would be wise to adjust their prices quickly. And let's face reality: AMD has to be planning the 6900 & 6900 XT. Those will be the models that compete with (or even beat) the 3080, 3080 Ti and 3090.
I hate to say this in a sentence, but $580 isn't bad for a high-end card with that amount of VRAM.
Why would you hate to say that? It's an excellent point! The 6800 & 6800 XT are going to be a godsend for anyone wanting to do rendering and other tasks that require high amounts of VRAM. In that context, the price is excellent and the 6800 is a very well balanced card for the money.

I said this over in the 6800 XT review, but it bears repeating: welcome back to the GPU performance party, AMD!
 
Joined
Jan 8, 2017
Messages
9,499 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I kinda agree with him. Looking at the graphs 6000 series is bandwidth starved at 4k it seems.

I see no real evidence for that. Even in Red Dead Redemption 2, one of the worst games to run at 4K, performance scales the same way it does for something like a 3090, which has a lot more DRAM bandwidth. Had that been the case, performance at 4K would be much worse.
 
Joined
Nov 18, 2020
Messages
39 (0.03/day)
Location
Arad, Romania
Processor i9-10850K @ 125W Power Limit
Motherboard ASUS TUF Gaming Z590-PLUS
Cooling Noctua NH-D15S
Memory Kingston KF432C16RBK2/64
Video Card(s) ASUS RTX 3070 TUF GAMING O8G @ 950mV / 2010MHz
Storage Samsung 970 EVO Plus 2TB + Kingston KC3000 2TB + Samsung 860 EVO 2TB + Samsung 870 EVO 4TB
Display(s) ASUS PB287Q + DELL S2719DGF
Case FRACTAL Define 7 Dark TG
Audio Device(s) integrated + Microlab FC330 / Audio-Technica ATH-M50s/LE
Power Supply Seasonic PRIME TX-650, 80+ Titanium, 650W
Mouse SteelSeries Rival 600
Keyboard Corsair K70 RGB TKL – CHERRY MX SPEED
That is the big question, and where to draw the line. Random example: 2080 Ti, 4K or 1440p?
I still believe that three graphs for average gaming power would clarify that. It's better to have more info than less.
This would also make the performance/watt graphs more relevant!
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,932 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit

This is all my games at all resolutions on the 6800 XT. I think for the next round of retesting I'll revise my power testing to be a bit more demanding.
 
Joined
Dec 10, 2014
Messages
1,335 (0.36/day)
Location
Nowy Warsaw
System Name SYBARIS
Processor AMD Ryzen 5 3600
Motherboard MSI Arsenal Gaming B450 Tomahawk
Cooling Cryorig H7 Quad Lumi
Memory Team T-Force Delta RGB 2x8GB 3200CL16
Video Card(s) Colorful GeForce RTX 2060 6GV2
Storage Crucial MX500 500GB | WD Black WD1003FZEX 1TB | Seagate ST1000LM024 1TB | WD My Passport Slim 1TB
Display(s) AOC 24G2 24" 144hz IPS
Case Montech Air ARGB
Audio Device(s) Massdrop + Sennheiser PC37X | Koss KSC75
Power Supply Corsair CX650-F
Mouse Razer Viper Mini | Cooler Master MM711 | Logitech G102 | Logitech G402
Keyboard Drop + The Lord of the Rings Dwarvish
Software Tiny11 Windows 11 Education 24H2 x64
I see no real evidence for that. Even in Red Dead Redemption 2, one of the worst games to run at 4K, performance scales the same way it does for something like a 3090, which has a lot more DRAM bandwidth. Had that been the case, performance at 4K would be much worse.
I'm speaking from the overall benchmarks I've seen so far. Let's treat the 3080 and 6800 XT as being in the same performance ballpark, with 1440p as our control environment.
1) In games where the 6800 XT leads the 3080, at 4K the 3080 leads or the difference is minuscule.
2) In games where the 6800 XT is a hair slower than the 3080, at 4K the 3080 leads by more.

The 6800, on the other hand, is faster than the 3070 across the board, but at 4K the gap narrows.
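That pattern is easy to check once per-resolution FPS numbers are in hand. A sketch with placeholder figures (not actual benchmark results), showing how the lead can flip between 1440p and 4K:

```python
# Placeholder FPS figures for illustration -- not review data.
fps = {
    "6800 XT": {"1440p": 144.0, "4k": 82.0},
    "RTX 3080": {"1440p": 140.0, "4k": 85.0},
}

for res in ("1440p", "4k"):
    amd, nv = fps["6800 XT"][res], fps["RTX 3080"][res]
    gap = 100.0 * (amd / nv - 1.0)          # positive = AMD ahead
    leader = "6800 XT" if gap > 0 else "RTX 3080"
    print(f"{res}: {leader} ahead by {abs(gap):.1f}%")
```

With these invented numbers the 6800 XT leads at 1440p but trails at 4K, which matches the bandwidth-scaling observation in the posts above.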
 