
AMD Radeon RX 6800 XT

Joined
Aug 27, 2013
Messages
55 (0.01/day)
System Name Redemption(Fractal DD XL R2 mod)
Processor Intel Core i7 4770K
Motherboard Gigabyte Z97X-G1 Gaming BLK WIFI
Cooling Water Cooled
Memory G. Skill Ripjaws Z 16 GB 1866 MHz
Video Card(s) 2 x Gigabyte R9 290X with EK blocks
Storage 256 GB Samsung 830 SSD
Display(s) Dell U2713HM
Case Fractal Design Define XL R2 (Modified)
Audio Device(s) Creative SB Z
Power Supply Silverstone Strider Evolution 1000 Watt Gold with individual MDPC sleeves
Software Windows 7 Ultimate
Great work W1z

AMD delivered what they promised. The performance compared to the RTX 3080 is impressive despite the noisy cooler. The price is also better in Canadian pesos. It makes my decision a bit harder since I don't need RT but do like the idea and what it does for the games that support it.

What I don't get is all the people complaining about RT performance. If I remember correctly, the same people were the first to dismiss RT as a gimmicky feature when it came out. Now these same people are dismissing AMD for not having strong performance with RT enabled. Appreciate the competition and the improvements that have been made; RDNA3 will likely have better RT performance, and that will help the competition on pricing.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,569 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Why didn't you guys use the world's fastest CPU for gaming?
That would also allow you to test SAM + Rage Mode?
You saw the SAM review with the 5900X? There's a big link in the conclusion.
Rage Mode has nothing to do with the CPU or platform; it's explained in the review, too.

Seriously
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.56/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x 256GB SSD, 2x 2TB|256GB SATA|512GB NVMe
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
You saw the SAM review with the 5900X? There's a big link in the conclusion.
Rage Mode has nothing to do with the CPU or platform; it's explained in the review, too.

Seriously

Still your fault ... darn link is not big enough ... make it flash.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
12,991 (2.53/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Razer Huntsman Tournament Edition
Software Windows 11 Pro 64-Bit
Even though the ray tracing performance seems pretty bad (these cards seem to lose a bit more performance with ray tracing than Nvidia Turing did), I may still end up getting one early next year. This 1070 is piss slow, I plan to put my next card under water, and I feel the 6800/6800 XT will be a lot more fun to have on water than a 3080, with how much overclocking headroom these things have.

That is, unless Nvidia releases the 3080 Ti on TSMC 7 nm, but that remains to be seen, as does whether a different node really helps Nvidia's arch all that much.

Nah, you misunderstand, Nvidia did it for all that production capacity that Samsung offers! And there is STILL a demand problem, go figure!

But yeah, this is leaps and bounds ahead. Apparently Nvidia's Ampere happened at AMD, or something. It pales in comparison - on a technical level, that is, because the stack is competitive now in both camps. AMD has some pricing to fix for the 6800, if you ask me. The 6800 XT is in an okay place but still brings up the dilemma wrt ray tracing for some. A bit of a lower price point would have eliminated that outright - or at least more so.

Yeah, my dilemma depends on how things shake out starting early next year and whether Nvidia releases a 3080 Ti that isn't using Samsung anymore (the rumor mill says it could be on TSMC 7 nm), but I wanted my next card to be fairly good at both ray tracing and rasterization. Ampere is right in that ballpark imo, but then its power consumption and heat are something else. Then we have the 6800 XT, which has the rasterization, efficiency, and overclocking potential to be really fun to have on water cooling, but is pretty booty for ray tracing and has no DLSS equivalent as of yet...

Ampere = adequate ray tracing performance for all the games I'd actually want to run ray tracing on (non-competitive multiplayer titles), but power and heat, very little tweaking headroom, and bad scaling for overclocking

Big Navi = really good in everything but ray tracing as it stands right now... and no DLSS

@W1zzard What are your initial thoughts on the drivers for these cards? Seem okay, or need work? Notice anything outright?
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,569 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
What are your initial thoughts on the drivers for these cards? Seem okay, or need work? Notice anything outright?
Some cosmetic issues in the control panel, nothing worth mentioning. No stability issues, no crashes
 
Joined
Sep 1, 2009
Messages
1,224 (0.22/day)
Location
CO
System Name 4k
Processor AMD 5800x3D
Motherboard MSI MAG b550m Mortar Wifi
Cooling ARCTIC Liquid Freezer II 240
Memory 4x8Gb Crucial Ballistix 3600 CL16 bl8g36c16u4b.m8fe1
Video Card(s) Nvidia Reference 3080Ti
Storage ADATA XPG SX8200 Pro 1TB
Display(s) LG 48" C1
Case CORSAIR Carbide AIR 240 Micro-ATX
Audio Device(s) Asus Xonar STX
Power Supply EVGA SuperNOVA 650W
Software Microsoft Windows10 Pro x64
What I would like to see is performance on DXR 1.0 titles vs. performance on DXR 1.1 titles. I wonder if the new style of DXR, which the RX 6800 XT was made for, performs significantly better. Overall, whatever card I can actually get my hands on will replace my 1080 Ti.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,569 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
What I would like to see is performance on DXR 1.0 titles vs. performance on DXR 1.1 titles. I wonder if the new style of DXR, which the RX 6800 XT was made for, performs significantly better. Overall, whatever card I can actually get my hands on will replace my 1080 Ti.
I don't think microbenchmarks are useful to the vast majority of readers. If you want to code some test and send it to me, I'd be happy to run it.
I need the source for security, but I'm happy to sign an NDA. This offer is open to anyone reading this post.
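For anyone tempted to take him up on that, here is a minimal sketch, in Python rather than an actual GPU test, of how a DXR 1.0 vs. DXR 1.1 comparison could be summarized once per-title numbers exist. The titles, tier labels, and FPS figures below are placeholders, not measured data, and this is not TechPowerUp's methodology.

```python
# A minimal sketch of grouping per-title ray tracing results into DXR 1.0 vs
# DXR 1.1 buckets. All titles and FPS figures are hypothetical placeholders.
from statistics import mean

# (title, DXR tier, FPS with RT off, FPS with RT on) -- made-up numbers
results = [
    ("Title A", "1.0", 120.0, 70.0),
    ("Title B", "1.0", 95.0, 55.0),
    ("Title C", "1.1", 110.0, 78.0),
    ("Title D", "1.1", 88.0, 60.0),
]

def rt_cost(rt_off: float, rt_on: float) -> float:
    """Fraction of performance lost when enabling ray tracing."""
    return 1.0 - rt_on / rt_off

by_tier: dict[str, list[float]] = {}
for _title, tier, off, on in results:
    by_tier.setdefault(tier, []).append(rt_cost(off, on))

for tier, costs in sorted(by_tier.items()):
    print(f"DXR {tier}: average RT performance hit {mean(costs):.1%}")
```

An actual microbenchmark would still have to render comparable workloads through the DXR 1.0 DispatchRays path and the DXR 1.1 inline RayQuery path; the script only shows how the resulting numbers could be grouped and compared.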
 
Last edited:
Joined
Apr 30, 2011
Messages
2,692 (0.55/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Another aspect of Big Navi's value is that the drivers seem to be great. No reviewer mentioned any serious problems. Let's hope this is the start of consistent stability there.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,569 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
This review has been updated with new gaming power consumption numbers for the RTX 2080 Ti, RTX 3070, RTX 3080, RTX 3090, RX 6800 and RX 6800 XT. For these cards I'm now running Metro at 1440p instead of 1080p, to ensure proper loading.

The perf/W charts have been updated, too, as have the relevant texts.
 
Joined
Sep 17, 2014
Messages
22,009 (6.01/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Great work W1z

AMD delivered what they promised. The performance compared to the RTX 3080 is impressive despite the noisy cooler. The price is also better in Canadian pesos. It makes my decision a bit harder since I don't need RT but do like the idea and what it does for the games that support it.

What I don't get is all the people complaining about RT performance. If I remember correctly, the same people were the first to dismiss RT as a gimmicky feature when it came out. Now these same people are dismissing AMD for not having strong performance with RT enabled. Appreciate the competition and the improvements that have been made; RDNA3 will likely have better RT performance, and that will help the competition on pricing.

It's just the last straw, along with DLSS, that people who can't let Nvidia go really have. They hold on to it because in their mind it brings an advantage, even if not a single person can truly quantify that advantage, because both RT and DLSS support are rare occurrences.

They bank on the fact that going forward, many more games will be getting that support.

With 8-10GB cards to carry them forward.

:laugh:

Sorry, but I can't suppress the irony here. RT performance can go all over the place with the coming console gen; the idea that today's results are any solid measure or comparison of it is ridiculous. We have no real RT metric yet, and Nvidia has a generational advantage they can only use once.

Knowing the technical details and the actual performance, heat and TDP now, it's crystal clear AMD has a much stronger offering. Even if they don't top the 3080 in all situations, they have a much better piece of silicon on offer, and more VRAM to keep it going forever. These cards will hold their value, whereas the 3080 is just the 2080 Ti all over again - eclipsed within a single gen.

A few weeks back one could think 'hm, 3080 at 700, that's a great deal!'. Today, not so much. It's a new norm and the 3080 is really on the worst end of it, especially if you consider that Nvidia is deploying a largely enabled GA102 for it. It means that without a shrink, Ampere is a dead end, and even on 7 nm it's worth no more than a refresh. This was the major redesign? Back to the drawing board then.

It's going to be interesting to see how important people think RT performance really is, because it truly is the one differentiator Nvidia can hold on to.
 
Last edited:
Joined
Jul 7, 2019
Messages
890 (0.47/day)
Still your fault ... darn link is not big enough ... make it flash.

Like the internet of the 90s with giant flashing "CLICK HERE!" links?

It's just the last straw, along with DLSS, that people who can't let Nvidia go really have. They hold on to it because in their mind it brings an advantage, even if not a single person can truly quantify that advantage, because both RT and DLSS support are rare occurrences.

They bank on the fact that going forward, many more games will be getting that support.

Sorry, but I can't suppress the irony here. RT performance can go all over the place with the coming console gen; the idea that today's results are any solid measure or comparison of it is ridiculous. We have no real RT metric yet, and Nvidia has a generational advantage they can only use once.

It'll be interesting to see what happens moving forward; now that AMD has their foot in most gaming companies' doors simply due to optimizing for consoles, it's more than likely that we'll see AMD-optimized features first and Nvidia optimizations following, just due to the nature of going for multi-platform releases and the similar core hardware between consoles and AMD CPUs/GPUs. It's just a matter of AMD finalizing their version of DLSS and getting developers to utilize it, plus optimizations for RT.
 
Joined
Jul 23, 2019
Messages
71 (0.04/day)
Location
France
System Name Computer
Processor Intel Core i9-9900kf
Motherboard MSI MPG Z390 Gaming Pro Carbon
Cooling MSI MPG Coreliquid K360
Memory 32GB G.Skill Trident Z DDR4-3600 CL16-19-19-39
Video Card(s) Asus GeForce RTX 4070 DUAL OC
Storage A Bunch of Sata SSD and some HD for storage
Display(s) Asus MG278q (main screen)
It's just the last straw, along with DLSS, that people who can't let Nvidia go really have. They hold on to it because in their mind it brings an advantage, even if not a single person can truly quantify that advantage, because both RT and DLSS support are rare occurrences.

They bank on the fact that going forward, many more games will be getting that support.

With 8-10GB cards to carry them forward.

:laugh:

Sorry, but I can't suppress the irony here. RT performance can go all over the place with the coming console gen; the idea that today's results are any solid measure or comparison of it is ridiculous. We have no real RT metric yet, and Nvidia has a generational advantage they can only use once.

Knowing the technical details and the actual performance, heat and TDP now, it's crystal clear AMD has a much stronger offering. Even if they don't top the 3080 in all situations, they have a much better piece of silicon on offer, and more VRAM to keep it going forever. These cards will hold their value, whereas the 3080 is just the 2080 Ti all over again - eclipsed within a single gen.

A few weeks back one could think 'hm, 3080 at 700, that's a great deal!'. Today, not so much. It's a new norm and the 3080 is really on the worst end of it, especially if you consider that Nvidia is deploying a largely enabled GA102 for it. It means that without a shrink, Ampere is a dead end, and even on 7 nm it's worth no more than a refresh. This was the major redesign? Back to the drawing board then.

It's going to be interesting to see how important people think RT performance really is, because it truly is the one differentiator Nvidia can hold on to.
I think you are dismissing valid arguments about RT and DLSS just as much as some people dismiss valid arguments in favor of AMD. But in the end, the important thing is that there are now choices for people depending on what they think is important to them, be it RT performance, consumption, price or whatever floats their boat when it comes to playing on their computer.
 
Joined
Feb 26, 2016
Messages
551 (0.18/day)
Location
Texas
System Name O-Clock
Processor Intel Core i9-9900K @ 52x/49x 8c8t
Motherboard ASUS Maximus XI Gene
Cooling EK Quantum Velocity C+A, EK Quantum Vector C+A, CE 280, Monsta 280, GTS 280 all w/ A14 IP67
Memory 2x16GB G.Skill TridentZ @3900 MHz CL16
Video Card(s) EVGA RTX 2080 Ti XC Black
Storage Samsung 983 ZET 960GB, 2x WD SN850X 4TB
Display(s) Asus VG259QM
Case Corsair 900D
Audio Device(s) beyerdynamic DT 990 600Ω, Asus SupremeFX Hi-Fi 5.25", Elgato Wave 3
Power Supply EVGA 1600 T2 w/ A14 IP67
Mouse Logitech G403 Wireless (PMW3366)
Keyboard Monsgeek M5W w/ Cherry MX Silent Black RGBs
Software Windows 10 Pro 64 bit
Benchmark Scores https://hwbot.org/search/submissions/permalink?userId=92615&cpuId=5773
Check the discussion in the forum comments of the non-XT review, you'll understand
So what you are essentially telling me is that a 6800 takes more power than a 6800 XT. That doesn't seem realistic...
 
Joined
Feb 13, 2017
Messages
138 (0.05/day)
Well, slower than the 3080 while similarly unobtainable, and with fewer features. I don't care about ray tracing, but I have a 3840x2160 120 Hz screen to feed with frames and the 6800 XT just doesn't cut it. Also, DLSS will probably future-proof the 3080 much better as far as achieving playable framerates. I have to say this launch is heavily overhyped. It's a good GPU, which couldn't be said about AMD products for years, but not better than the competition. So, I won't cancel my 3080 order.
Yeah, no: if you want 4K at ultra, 10 GB won't be enough. DLSS, much like PhysX, will probably only be available on a handful of titles. And it isn't slower, it's about the same while overclocking much more and using much less energy. I also have a Sony X900H 4K@120 Hz to feed, and a 16 GB, highly overclockable 6800 XT is a much better deal, now and even more so in the future.

Even though the ray tracing performance seems pretty bad (these cards seem to lose a bit more performance with ray tracing than Nvidia Turing did), I may still end up getting one early next year. This 1070 is piss slow, I plan to put my next card under water, and I feel the 6800/6800 XT will be a lot more fun to have on water than a 3080, with how much overclocking headroom these things have.

That is, unless Nvidia releases the 3080 Ti on TSMC 7 nm, but that remains to be seen, as does whether a different node really helps Nvidia's arch all that much.



Yeah, my dilemma depends on how things shake out starting early next year and whether Nvidia releases a 3080 Ti that isn't using Samsung anymore (the rumor mill says it could be on TSMC 7 nm), but I wanted my next card to be fairly good at both ray tracing and rasterization. Ampere is right in that ballpark imo, but then its power consumption and heat are something else. Then we have the 6800 XT, which has the rasterization, efficiency, and overclocking potential to be really fun to have on water cooling, but is pretty booty for ray tracing and has no DLSS equivalent as of yet...

Ampere = adequate ray tracing performance for all the games I'd actually want to run ray tracing on (non-competitive multiplayer titles), but power and heat, very little tweaking headroom, and bad scaling for overclocking

Big Navi = really good in everything but ray tracing as it stands right now... and no DLSS

@W1zzard What are your initial thoughts on the drivers for these cards? Seem okay, or need work? Notice anything outright?

Most new games, being based on consoles, will have very light RT effects, and will all be using the "AMD standard" as used on consoles. I wouldn't be worried about the 6800's RT performance; it's good enough for some effects here and there, same as the consoles. Both Ampere and the new 6000 series are too weak for full-blown RT effects anyway, and that's coming only in the PS6 era imho.

I think you are dismissing valid arguments about RT and DLSS just as much as some people dismiss valid arguments in favor of AMD. But in the end, the important things is that there is now choices for people depending on what they think is important to them, be it RT performances, consumption, price or whatever float their boat when it come to playing on their computer.
8-10 GB is a hard limit people experience today at 1440p / 4K at ultra.
RT / DLSS games are just a handful, with arguable quality uplift (many people say they can't see a difference). People are more hyped about these features because of Nvidia's marketing and reviewers than because of real usage scenarios (much like PhysX). Now if you tell me you value CUDA or NVENC, then yes, those are valid scenarios where Radeons don't have a say.
 
Last edited:
Joined
Dec 10, 2014
Messages
1,327 (0.37/day)
Location
Nowy Warsaw
System Name SYBARIS
Processor AMD Ryzen 5 3600
Motherboard MSI Arsenal Gaming B450 Tomahawk
Cooling Cryorig H7 Quad Lumi
Memory Team T-Force Delta RGB 2x8GB 3200CL16
Video Card(s) Colorful GeForce RTX 2060 6GV2
Storage Crucial MX500 500GB | WD Black WD1003FZEX 1TB | Seagate ST1000LM024 1TB | WD My Passport Slim 1TB
Display(s) AOC 24G2 24" 144hz IPS
Case Montech Air ARGB
Audio Device(s) Massdrop + Sennheiser PC37X | QKZ x HBB
Power Supply Corsair CX650-F
Mouse Razer Viper Mini | Cooler Master MM711 | Logitech G102 | Logitech G402
Keyboard Drop + The Lord of the Rings Dwarvish
Software Windows 10 Education 22H2 x64
Another aspect of Big Navi's value is that the drivers seem to be great. No reviewer mentioned any serious problems. Let's hope this is the start of consistent stability there.
I don't remember any reviewer having driver issues at the RX 5000 series release either, but a few weeks later the general public started posting online about various issues. Ultimately, time will tell.
 
Joined
Jul 21, 2020
Messages
9 (0.01/day)
Location
Hong Kong
Processor Ryzen 3900X
Motherboard MSI X570 Tomahawk
Memory 32GB G Skill Trident Z NEO 3600MHz G16
Video Card(s) Gigabyte Vision 3080 10GB
Power Supply Corsair AX850
I think the RX 5000's drivers were good to begin with, but there are some functions in the software stack that just don't work.
 

gxv

New Member
Joined
Oct 17, 2020
Messages
2 (0.00/day)
'With SAM enabled, we see the averages change "dramatically" (in the context of competition), with the RX 6800 XT now being 2% faster across all three resolutions. This helps the RX 6800 XT match the RTX 3080 at 1080p, while beating it by 1% at 1440p and being just 4% slower at 4K UHD—imagine these gains without even touching other features, such as Radeon Boost or Rage Mode! '

Now imagine TechPowerUp doing a serious review with Rage Mode + SAM enabled together.
 
Joined
Sep 17, 2014
Messages
22,009 (6.01/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
I think you are dismissing valid arguments about RT and DLSS just as much as some people dismiss valid arguments in favor of AMD. But in the end, the important thing is that there are now choices for people depending on what they think is important to them, be it RT performance, consumption, price or whatever floats their boat when it comes to playing on their computer.

You're absolutely right. I'm not dismissing them entirely though - it's just that a choice for 'more RT performance and DLSS' is a choice for a variable, completely abstract advantage. You just don't know how it will develop going forward, and the baseline performance on both cards is decent enough for 'playable' - except perhaps at 4K, where Nvidia might be able to keep over 45-50 FPS more readily with RT on. That is the extent of the validity of those arguments, really. FWIW, apparently AMD is also launching a DLSS equivalent. It'll likely not be as useful, but more readily available to a broad range of games. But in much the same way, I wouldn't count on any of it.

In much the same vein, we can't really predict how VRAM requirements will develop going forward, but there is a similar sort of risk of lacking performance there on Nvidia's end, and it kind of extends to the 3070 too with its rather low 8 GB. And then you're talking not just about RT perf, but everything that takes a hit. At the same time, there is one guarantee: none of these cards can really do more RT than a few fancy effects here or there, and they still lose a lot of frames doing it. And since the GPUs in the consoles won't be getting faster this gen, that's what you've got for the next three to five years going forward.

There is indeed something to choose now, and that's great.
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,569 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
So what you are essentially telling me is that a 6800 takes more power than a 6800 XT. That doesn't seem realistic...
Ah, you misunderstood. Look at the charts from the thread and note how the 6800 XT runs closer to 300 W at the higher resolution. That's why I switched all cards from the 2080 Ti up to power testing at 1440p instead of 1080p. Both reviews have been updated accordingly; check whether the numbers listed in the review make more sense now.
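To illustrate the "proper loading" point, here is a minimal sketch of averaging board power only over samples where the GPU is near full load, so a partially CPU-limited run (e.g. a very fast card at 1080p) can't drag the gaming-power figure down. The CSV layout, column names, and file name are hypothetical, not TechPowerUp's actual tooling.

```python
# A minimal sketch: average board power only while the GPU is actually loaded.
# Log format and column names below are assumptions for illustration.
import csv
from statistics import mean

def gaming_power(log_path: str, min_gpu_load_pct: float = 95.0) -> float:
    """Average board power (W) over samples at or above the given GPU load."""
    watts = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            if float(row["gpu_load_pct"]) >= min_gpu_load_pct:
                watts.append(float(row["board_power_w"]))
    if not watts:
        raise ValueError("no fully loaded samples; raise the resolution or lower the threshold")
    return mean(watts)

# Usage with a hypothetical log file:
# print(gaming_power("metro_1440p_rx6800xt.csv"))
```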
 
Joined
Jul 23, 2019
Messages
71 (0.04/day)
Location
France
System Name Computer
Processor Intel Core i9-9900kf
Motherboard MSI MPG Z390 Gaming Pro Carbon
Cooling MSI MPG Coreliquid K360
Memory 32GB G.Skill Trident Z DDR4-3600 CL16-19-19-39
Video Card(s) Asus GeForce RTX 4070 DUAL OC
Storage A Bunch of Sata SSD and some HD for storage
Display(s) Asus MG278q (main screen)
8-10 GB is a hard limit people experience today at 1440p / 4K at ultra.
RT / DLSS games are just a handful, with arguable quality uplift (many people say they can't see a difference). People are more hyped about these features because of Nvidia's marketing and reviewers than because of real usage scenarios (much like PhysX). Now if you tell me you value CUDA or NVENC, then yes, those are valid scenarios where Radeons don't have a say.
I already posted about it, but here it is again.

I personally liked RT in Control and more recently in WD:L, and I did find the difference noticeable when I had to deactivate it for some tests after some hours of playing. I also look forward to playing Minecraft RTX with some friends, as the results are really amazing.

Now, I have a 2070 Super OC, and all the games I play usually run above 60 fps at 1440p, with very few exceptions. The most important one is WD:L, which hits the 8 GB VRAM limit when RT is on with the HD texture pack and the way I'm playing. So I'm aware of the low VRAM issue you mentioned.

The reason I have been following this review is that I was hoping for a card with 16 GB of VRAM that could justify replacing my current GPU. But even with the better efficiency and rasterization performance, why would I buy a 6800 XT to get worse performance in the games that would make me want to upgrade my current GPU in the first place?

So now you can tell me to drop RT in Legion and enjoy my 85 fps with a 6800 XT (according to the Guru3D bench). But then I could also just deactivate it on my 2070 Super while keeping DLSS, which both solves the memory issue and gives me 80 fps with a small drop in image quality, for €0 and even less power consumption than the 6800 XT.
On the other hand, if I choose to buy the 3080, I will benefit from the FPS boost in every game, like with the 6800 XT; it will solve my issue with Legion and allow me to play Minecraft RTX in comfortable conditions. I may run into issues with the 10 GB of VRAM at some point, and it may cost me a bit more in electricity, but it still seems like a better deal for me right now.

@W1zzard: I'm curious about something regarding media playback. You mentioned that it puts the memory frequency at max; does it affect gaming performance in a meaningful way if you have a video playing while running a demanding game? Asking because I usually listen to YouTube videos while playing.
 
Last edited:
  • Like
Reactions: SLK

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,569 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I'm curious about something regarding media playback. You mentioned that it puts the memory frequency at max; does it affect gaming performance in a meaningful way if you have a video playing while running a demanding game? Asking because I usually listen to YouTube videos while playing.
I've never tested that. I doubt it; decoding happens on separate hardware in the GPU, and clocks are already at max due to gaming.
 
Joined
Apr 21, 2010
Messages
574 (0.11/day)
System Name Home PC
Processor Ryzen 5900X
Motherboard Asus Prime X370 Pro
Cooling Thermaltake Contac Silent 12
Memory 2x8gb F4-3200C16-8GVKB - 2x16gb F4-3200C16-16GVK
Video Card(s) XFX RX480 GTR
Storage Samsung SSD Evo 120GB -WD SN580 1TB - Toshiba 2TB HDWT720 - 1TB GIGABYTE GP-GSTFS31100TNTD
Display(s) Cooler Master GA271 and AoC 931wx (19in, 1680x1050)
Case Green Magnum Evo
Power Supply Green 650UK Plus
Mouse Green GM602-RGB ( copy of Aula F810 )
Keyboard Old 12 years FOCUS FK-8100
I think the reason multi-monitor consumes less power is the Infinity Cache, not the memory.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,569 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I think the reason multi-monitor consumes less power is the Infinity Cache, not the memory.
No, the reason is that the memory clock is now running very low compared to previous AMD cards; check my reviews, the data is there.

It is possible that they leverage the L3 cache to reduce the number of memory accesses, so that the low memory clock is OK, but then why is this not a problem on NVIDIA, who don't have an L3 cache?
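As a rough, back-of-the-envelope illustration of that "fewer memory accesses" idea: if a large on-die cache services a fraction of requests, only the misses go out to GDDR6, so the bandwidth the GPU effectively sees scales roughly with 1 / (1 - hit rate). The hit rates below are illustrative assumptions, not measured values; only the 512 GB/s raw figure matches the RX 6800 XT's 256-bit, 16 Gbps GDDR6 spec.

```python
# Simplified model: only cache misses generate GDDR6 traffic, and the cache is
# assumed fast enough to serve hits "for free". Hit rates are illustrative only.
def effective_bandwidth(dram_bw_gbs: float, hit_rate: float) -> float:
    """Bandwidth the GPU effectively sees when hit_rate of requests hit the cache."""
    return dram_bw_gbs / (1.0 - hit_rate)

dram_bw = 512.0  # GB/s raw: 256-bit GDDR6 at 16 Gbps (RX 6800 XT)
for hit_rate in (0.0, 0.25, 0.5):  # 0.0 = no cache hits at all
    print(f"hit rate {hit_rate:.0%}: effective bandwidth ~{effective_bandwidth(dram_bw, hit_rate):.0f} GB/s")
```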
 
Joined
Apr 21, 2010
Messages
574 (0.11/day)
System Name Home PC
Processor Ryzen 5900X
Motherboard Asus Prime X370 Pro
Cooling Thermaltake Contac Silent 12
Memory 2x8gb F4-3200C16-8GVKB - 2x16gb F4-3200C16-16GVK
Video Card(s) XFX RX480 GTR
Storage Samsung SSD Evo 120GB -WD SN580 1TB - Toshiba 2TB HDWT720 - 1TB GIGABYTE GP-GSTFS31100TNTD
Display(s) Cooler Master GA271 and AoC 931wx (19in, 1680x1050)
Case Green Magnum Evo
Power Supply Green 650UK Plus
Mouse Green GM602-RGB ( copy of Aula F810 )
Keyboard Old 12 years FOCUS FK-8100
I noticed the card doesn't expose the memory temperature, right?
 
Joined
Apr 12, 2013
Messages
7,424 (1.77/day)
Is it just me, or does the Infinity Cache(?) seem to help with a more consistent frame rate and a lot fewer spikes vs. the 3080?
 