
AMD Radeon RX 6900 XT

Joined
May 3, 2018
Messages
2,881 (1.20/day)
So how does TechSpot show the 6900XT (sans SAM) being the fastest card of all at 1080p and 1440p, and a bit slower than the 3090 at 4K, but this shows it trailing at all resolutions?

This is a huge win for the 6900XT at the price. I guess we'll now get a 3080 Ti 20GB on TSMC 7nm very shortly as a panic response.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,006 (2.50/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
So how does TechSpot show the 6900XT (sans SAM) being the fastest card of all at 1080p and 1440p, and a bit slower than the 3090 at 4K, but this shows it trailing at all resolutions?

This is a huge win for the 6900XT at the price. I guess we'll now get a 3080 Ti 20GB on TSMC 7nm very shortly as a panic response.

Panic response for something that isn't really widely available yet? And it's probably because TechSpot has fewer DX11 games in their review suite and different setups.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,175 (1.27/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
There's no stock, but here in AU the 6900XT is listed at $1400 and the 3090s are going for $2800-$3000.
That sheer dollar value down under makes this a clear winner if you don't want ray tracing (or a 3080, if you do).
Cheapest I can see is $1599; where have you seen $1400? And I can see 3090s for a hair over $2600, not that it really matters.

To my eyes the clear choices are the 6800XT or 3080. The price hike for a 6900XT over a 6800XT makes exceptionally little sense; it basically exists to sell people the 6800XT, which appears to be, and is, much better value.
There's no clear winner this gen, what we have is even better... COMPETITION
I'll cheers to that! :toast:
 
Last edited:
Joined
Jul 5, 2013
Messages
27,806 (6.68/day)
There's no clear winner this gen, what we have is even better... COMPETITION
Yes, this exactly! And this is most excellent!

Once again late to the party in this thread, but I have to say it's refreshing to see AMD standing on more or less equal ground with NVidia! The 6900XT trades blows with the 3090 and it's interesting to see the results. NVidia still has the advantage in RTRT performance and VRAM (which will only really matter in 8K gaming and professional applications), but Radeon is standing side by side with the best GeForce and not flinching.

AMD, Welcome back to the premium GPU space! Well done indeed!
 

afw

Joined
Mar 4, 2009
Messages
646 (0.11/day)
System Name StealthX
Processor Intel i7 2700k
Motherboard Asus SABERTOOTH z77
Cooling Prolimatech Genesis
Memory CORSAIR Vengeance 8GB DDR3
Video Card(s) ASUS GTX 960 2GB black edition
Storage CORSAIR FORCE GT 120GB SSD + Samsung Spinpoint F3 (1TB + 2 x 500GB)
Display(s) Acer G235H 23" (1920 x 1080)
Case Silverstone Raven RV02
Audio Device(s) OB
Power Supply Seasonic x650 GOLD
Software Windows 7 Ultimate 64-bit
The pricing is the problem here I guess ... but what's the point ... no stock from either nVidia or AMD, and whatever is out there is sold at almost double the price ... thank you COVID :mad::banghead:
 
Joined
Jun 18, 2015
Messages
341 (0.10/day)
Location
Perth , West Australia
System Name schweinestalle
Processor AMD Ryzen 7 3700 X
Motherboard Asus Prime - Pro X 570 + Asus PCI -E AC68 Dual Band Wi-Fi Adapter
Cooling Standard Air
Memory Kingston HyperX 2 x 16 gb DDR 4 3200mhz
Video Card(s) AMD Radeon 5700 XT 8 GB Strix
Storage Intel SSD 240 gb Speed Demon & WD 240 SSD Blue & WD 250 SSD & WD Green 500gb SSD & Seagate 1 TB Sata
Display(s) Asus XG 32 V ROG
Case Corsair AIR ATX
Audio Device(s) Realtech standard
Power Supply Corsair 850 Modular
Mouse CM Havoc
Keyboard Corsair Cherry Mechanical
Software Win 10
Benchmark Scores Unigine_Superposition 4K ultra 7582
Way overpriced. I will be looking at the 6800 myself; 1440p 144 Hz is my monitor's max.
 
Joined
Apr 30, 2020
Messages
985 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
@W1zzard

How come there isn't a comparison in the ray tracing charts when overclocked?
I'd like to know if it has any effect on the RT performance at all.
There is only stock, with ray tracing enabled compared to disabled.

These cards are hard to compare because you can't really compare them to Nvidia's second generation of RTX cards. The rasterization performance is much higher than Nvidia's first-generation RTX cards; it's a tough thing to compare with anything, really.
 
Joined
Aug 20, 2007
Messages
21,469 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
I don't know about you guys, but as someone who plays a lot of indie games that are and will be predominantly DX11 for some time... that DX11 overhead is a dealbreaker for me.

It is good to see AMD have a competitive product though, for most situations.
 
Joined
Apr 12, 2013
Messages
7,529 (1.77/day)
These cards are hard to compare because you can't really compare them to Nvidia's second generation of RTX cards. The rasterization performance is much higher than Nvidia's first-generation RTX cards; it's a tough thing to compare with anything, really.
If you prefer RT, Nvidia is a better choice, at least till RDNA3 cards debut. For traditional rasterization-based games, AMD is the way to go IMO.
 
Joined
Sep 24, 2020
Messages
145 (0.10/day)
System Name Room Heater Pro
Processor i9-13900KF
Motherboard ASUS ROG STRIX Z790-F GAMING WIFI
Cooling Corsair iCUE H170i ELITE CAPELLIX 420mm
Memory Corsair Vengeance Std PMIC, XMP 3.0 Black Heat spreader, 64GB (2x32GB), DDR5, 6600MT/s, CL 32, RGB
Video Card(s) Palit GeForce RTX 4090 GameRock OC 24GB
Storage Kingston FURY Renegade Gen.4, 4TB, NVMe, M.2.
Display(s) ASUS ROG Swift OLED PG48UQ, 47.5", 4K, OLED, 138Hz, 0.1 ms, G-SYNC
Case Thermaltake View 51 TG ARGB
Power Supply Asus ROG Thor, 1200W Platinum
Mouse Logitech Pro X Superlight 2
Keyboard Logitech G213 RGB
VR HMD Oculus Quest 2
Software Windows 11 23H2
From the article conclusion:

Zen 3 is sold out everywhere

That's really not the case. Here in Romania, there is no shortage of 5800X. I bought one yesterday at MSRP + 2%, and in 30 minutes it will be delivered, which is why I can't sleep right now :). I could have also bought a 5900X at MSRP + 10%, but I wanted a CPU with only one CCD. I had much more trouble finding a good motherboard for it, since I'm switching from Intel. I didn't find the motherboard I wanted at my usual retailer, so I had to order one from a more obscure shop, and it will only be delivered towards the end of the week, or even next week :(.

But the 5600X is indeed out of stock, and so is the 5950X. Anyway, if that statement were changed to "Zen 3 is sold out almost everywhere", I wouldn't necessarily disagree. Maybe Romania is special. Although we seem to be affected by GPU shortages just as much as the rest of the world, so I have a suspicion that even worldwide the Zen 3 shortages are not as bad as the GPU shortages.

Now back to the topic, as others, I'm a bit underwhelmed by the 6900 XT performance. I expected more. I'm only interested in 4K performance, and this is how it looks in the TPU 4K benchmarks, even with SAM and DDR4-3800:

Code:
in  6 games - 6900 XT is slower than the 3080 at 4K
in 13 games - 6900 XT is somewhere between the 3080 and 3090 at 4K
in  4 games - 6900 XT is faster than the 3090 at 4K
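For reference, a minimal sketch of how a tally like the one above can be produced from per-game results; the FPS numbers below are made up for illustration, not TPU's data:

Code:
# Hypothetical per-game 4K averages: (6900 XT, 3080, 3090). Not TPU data.
games = {
    "Game A": (55.0, 58.0, 62.0),
    "Game B": (64.0, 61.0, 66.0),
    "Game C": (71.0, 65.0, 69.0),
}

slower, between, faster = 0, 0, 0
for fps_6900xt, fps_3080, fps_3090 in games.values():
    if fps_6900xt < fps_3080:
        slower += 1          # slower than the 3080
    elif fps_6900xt > fps_3090:
        faster += 1          # faster than the 3090
    else:
        between += 1         # between the 3080 and 3090

print(f"slower than 3080: {slower}, between: {between}, faster than 3090: {faster}")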

So, even with SAM and faster memory, it's closer to a 3080 than a 3090 at 4K. Taking ray tracing into account, it's even worse. And I'm interested in doing some machine learning on my GPU, and the AMD cards are not ideal for that, to say the least, for various reasons.

The only saving grace is the increased efficiency of the 6900 XT. I received a Gigabyte 3080 Vision OC as a birthday present from my colleagues this weekend, and the damn thing made me open my window to cool my room. In the winter. While idle at the desktop. I shudder thinking what it will be like in the summer if I can't find a solution. I'm still troubleshooting why it's so power hungry even when idle.

So, maybe there are some use cases where the 6900 XT makes sense over the NVidia cards after all. But personally, I think the 6900 XT is at least $200 more expensive than it should be.
 
Joined
Apr 12, 2013
Messages
7,529 (1.77/day)
AMD will probably sell at least 10x as many Zen 3 chiplets (or 5x as many chips?) by the end of the year compared to Ampere & RDNA2 cards combined. The margins are much higher, yields much better, & capacity isn't as constrained.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,175 (1.27/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
I received a Gigabyte 3080 Vision OC as a birthday present from my colleagues this weekend, and the damn thing made me open my window to cool my room. In the winter. While idle at the desktop.
What on earth are you on about? At idle you'll be lucky if it draws 50 W. I have a 3080, summer has just started in Australia, and mine doesn't appreciably warm my room when gaming.

Most of the rest of what you're saying makes some sense, but I just can't get behind that quote. If the 3080 did that to you, there's a high likelihood that, in an apples-to-apples scenario, the last card/rest of the system did too.
 
Joined
Mar 21, 2016
Messages
2,508 (0.79/day)
Now I wonder what happens going forward with RDNA3?

I'd be keen to see if this 4-to-1 relationship gets revised to something like a 6-to-2 relationship. That might allow for more brute force as well as alternate-frame, half-resolution temporal ray-traced acceleration effects.
[attached image: 1607487101551.png]


I'm not sure what they'll do with Infinity Cache. I could see a minor bump to the size, especially after a node shrink, or maybe on 7nm EUV potentially as well. The other aspect is that it's split into two 64MB slabs, similar to CCXs, so I wonder if a monolithic 128MB slab with shared access is bound to happen eventually.

As for this bit on the CU scalars and schedulers, where is this going?
[attached image: 1607487978593.png]


I think maybe they'll double down and increase its overall design layout granularity and scheduling relationship by another 50%, potentially. With that in mind, if they bump up the Infinity Cache by another 64MB and make it all shared, a 50% increase in this area makes a lot more sense.

I want to know more about Radeon Boost: how configurable is it? Can you pick a custom downscale resolution target to adhere to? It seems like it would work well in practice; I'm just curious how adjustable it is. I think there are definitely people who might prefer to downscale the resolution from 4K to 1440p rather than 1080p, or even more custom targets in between, like LOD mipmap scaling, just with more granular options for how much image fidelity versus performance is traded while in motion. I really like the idea of it a lot; I've just only seen that one slide on it, which isn't very detailed unfortunately.

I really think W1zzard should consider a chart for 4K with Radeon Boost enabled, with SAM on and off. The way Smart Access Memory works, that is an interesting combination to look at, because they play into each other well, with Radeon Boost making SAM more appealing for people playing at high resolutions. You get a 7% SAM advantage at 1080p and 2% at 4K, so with Radeon Boost plus SAM you should land somewhere in the 2% to 7% ballpark!? I don't know, maybe 5% on average roughly, but it could lean more towards 7% or 2% depending on how much scene activity is going on; when it matters, though, it should be closer to the 7% mark. If for no other reason, it would be interesting to see how Radeon Boost and SAM interact with the mixed rendering.
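A rough back-of-the-envelope sketch of that 2%-to-7% reasoning; the SAM uplift figures are the ones cited above (≈7% at 1080p, ≈2% at 4K), while the fractions of frames that Radeon Boost actually drops to the lower resolution are made-up assumptions:

Code:
# Blend the SAM uplift at native 4K with the uplift at the boosted (lower)
# resolution, weighted by how often Boost actually kicks in during motion.
SAM_UPLIFT_4K = 0.02       # ~2% at 4K (figure cited above)
SAM_UPLIFT_BOOSTED = 0.07  # ~7% at 1080p (figure cited above)

def blended_uplift(frac_frames_boosted):
    return (1 - frac_frames_boosted) * SAM_UPLIFT_4K + frac_frames_boosted * SAM_UPLIFT_BOOSTED

for frac in (0.25, 0.50, 0.75):  # assumed shares of boosted frames
    print(f"{frac:.0%} boosted -> ~{blended_uplift(frac):.1%} average SAM uplift")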
 
Last edited:
Joined
Sep 24, 2020
Messages
145 (0.10/day)
System Name Room Heater Pro
Processor i9-13900KF
Motherboard ASUS ROG STRIX Z790-F GAMING WIFI
Cooling Corsair iCUE H170i ELITE CAPELLIX 420mm
Memory Corsair Vengeance Std PMIC, XMP 3.0 Black Heat spreader, 64GB (2x32GB), DDR5, 6600MT/s, CL 32, RGB
Video Card(s) Palit GeForce RTX 4090 GameRock OC 24GB
Storage Kingston FURY Renegade Gen.4, 4TB, NVMe, M.2.
Display(s) ASUS ROG Swift OLED PG48UQ, 47.5", 4K, OLED, 138Hz, 0.1 ms, G-SYNC
Case Thermaltake View 51 TG ARGB
Power Supply Asus ROG Thor, 1200W Platinum
Mouse Logitech Pro X Superlight 2
Keyboard Logitech G213 RGB
VR HMD Oculus Quest 2
Software Windows 11 23H2
What on earth are you on about? At idle you'll be lucky if it draws 50 W. I have a 3080, summer has just started in Australia, and mine doesn't appreciably warm my room when gaming.

Most of the rest of what you're saying makes some sense, but I just can't get behind that quote. If the 3080 did that to you, there's a high likelihood that, in an apples-to-apples scenario, the last card/rest of the system did too.

That's why I said I'm troubleshooting the issue; I don't think this is normal, and I'm trying to determine if it's a problem with my system or with the card. Apparently, the GPU memory remains at full speed even when idle, and as a result the card uses 21% of its power target when idle. It's a 320 W card, so 21% would be something like 70 W, which it blows towards me constantly if I keep the side of my case open. The fan is almost never idle.
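A trivial sketch of that arithmetic, just to show where the ~70 W estimate comes from (320 W is the card's rated board power, 21% is what the monitoring tool reports at idle):

Code:
board_power_w = 320        # rated board power of the RTX 3080
idle_power_target = 0.21   # fraction of power target reported at idle
print(f"Estimated idle draw: {board_power_w * idle_power_target:.0f} W")  # ~67 W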

My previous card was a 2080, and it never did this, on the same system.
 
Last edited:
Joined
Mar 21, 2016
Messages
2,508 (0.79/day)
Has BAR size been benchmarked at different aperture size settings for power consumption yet? I wonder what kind of impact it has on that; surely TDP goes up a bit, but maybe not too badly, and likely mostly in line with the GPU uplift in any case. Still, it's something to look at and consider, and to wonder whether it played any role in why the size got set at 256MB and then forgotten or set aside until now.
 
Joined
Nov 18, 2010
Messages
7,534 (1.47/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX. Water block. Crossflashed.
Storage Optane 900P[Fedora] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO+SN560 1TB(W11)
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) SMSL RAW-MDA1 DAC
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 41
AMD doesn't use shunts, the power draw is estimated internally in the GPU afaik

Indeed. They have spoiled any fun of doing hard OC.

Hoping the AIB versions will have a BIOS with decent limits.
 
Joined
Mar 21, 2016
Messages
2,508 (0.79/day)
AMD should probably consider a special form of Radeon Boost that applies just to the RTRT elements and can be adjusted between 480p/720p/1080p for the time being, then revised and scaled upward later on newer hardware beyond RDNA2. It might not be a gigantic reduction to RTRT image quality relative to the performance gains, since most of the scene is still ultimately rasterized. If they could add that as a software option to RDNA2, it would change the RTRT battlefield quite a bit, at least until Nvidia follows suit. Though is there even a way for the end user to check what resolution the RTRT adheres to? I know you can adjust the quality, but does it specify the resolution or simply the quality, which could be determined by several factors like the number of light rays and bounces?
 
Joined
Sep 24, 2020
Messages
145 (0.10/day)
System Name Room Heater Pro
Processor i9-13900KF
Motherboard ASUS ROG STRIX Z790-F GAMING WIFI
Cooling Corsair iCUE H170i ELITE CAPELLIX 420mm
Memory Corsair Vengeance Std PMIC, XMP 3.0 Black Heat spreader, 64GB (2x32GB), DDR5, 6600MT/s, CL 32, RGB
Video Card(s) Palit GeForce RTX 4090 GameRock OC 24GB
Storage Kingston FURY Renegade Gen.4, 4TB, NVMe, M.2.
Display(s) ASUS ROG Swift OLED PG48UQ, 47.5", 4K, OLED, 138Hz, 0.1 ms, G-SYNC
Case Thermaltake View 51 TG ARGB
Power Supply Asus ROG Thor, 1200W Platinum
Mouse Logitech Pro X Superlight 2
Keyboard Logitech G213 RGB
VR HMD Oculus Quest 2
Software Windows 11 23H2
What on earth are you on about? At idle you'll be lucky if it draws 50 W.

I think I got to the bottom of it. Using multiple monitors triggers the problem with my Gigabyte RTX 3080 Vision OC. I have 2 or 3 displays connected at all times: a 4K @ 60 Hz monitor over DisplayPort, a 3440x1440 monitor at 100 Hz, also over DisplayPort, and a 4K TV at 60 Hz HDR over HDMI, which I usually keep turned off.

After closing all applications, it still refused to reduce the GPU memory speed. But I noticed that when Windows turns off my displays, the GPU memory frequency and power usage finally go down. So, I disconnected my 4K monitor. The power usage went down to 7%, and the memory frequency dropped from 1188 MHz to 51 MHz. I turned on the 4K TV instead; the power usage and memory frequency remained low. I turned off the 4K TV again and reconnected the 4K monitor; the power usage and memory frequency went up again. I disconnected the 3440x1440 display; the frequency and power usage dropped. I turned on the 4K TV; the power usage and memory frequency remained low.
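For anyone who wants to log this themselves, a minimal sketch, assuming an Nvidia card with nvidia-smi on the PATH (the screenshot further down appears to be from the AORUS utility; this is just a generic alternative):

Code:
# Log GPU memory clock and board power every 2 seconds while plugging
# monitors in and out; stop with Ctrl+C.
import subprocess, time

QUERY = ["nvidia-smi", "--query-gpu=clocks.mem,power.draw", "--format=csv,noheader"]

while True:
    reading = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
    print(time.strftime("%H:%M:%S"), reading)  # e.g. "07:36:03 1188 MHz, 70.25 W"
    time.sleep(2)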

So, in short, if I connect both my monitors over DisplayPort, the memory frequency never goes down. As a final experiment, I connected the 3440x1440 display over HDMI, at 50 Hz. There were some oscillations, depending on which apps were open, but the GPU power usage and memory frequency remained low, for the most part.

So, I'm guessing it really doesn't like having multiple monitors at high refresh rates and resolutions connected, especially over DisplayPort. This is how the power and frequency usage looked while I was disconnecting/connecting various monitors:

[attached image: AORUS_2020-12-09_07-36-03.png]


The thing is, I looked at all the 3080 TPU reviews, and none of them mentioned the GPU memory frequency being higher when idle and using multiple monitors, unless I missed something.

@W1zzard, have you seen anything like this on any of the 3080s in your tests, the GPU memory frequency never going down while using multiple monitors? You have a table with clock profiles in each GPU review, and for all your 3080 reviews you listed the multi-monitor GPU memory frequency as 51 MHz. How exactly did you test that? How many monitors, at which resolutions/refresh rates, and how were they connected? DisplayPort, or HDMI? If there were just a couple of monitors at low resolutions, then that might explain the difference from my experience with the Gigabyte RTX 3080 Vision OC.
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,842 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
In negatives it lists:

"Overclocking requires power limit increase"

Is there a piece of computer hardware that when overclocked DOESN'T require a power limit increase?
All the custom-design RX 6800 XT cards overclock just fine without a power limit increase. "Power limit increase" = you must increase the power limit slider in Radeon Settings or the OC will not do anything.

Obviously overclocking always increases power consumption; that's not what I meant

How come there isn't a comparison in the ray tracing charts when overclocked?
RT is simply not important enough at this time. I test SO many things, and reviews need to be finished in a reasonable timeframe, so I have to make compromises.

That's really not the case. Here in Romania, there is no shortage of 5800X
Congrats on your new processor. The supply situation is definitely not normal, i.e. not one where anyone can get any CPU at reasonable prices.

Has BAR size been benchmarked at different aperture size settings for power consumption yet
You can't adjust the BAR size; the size == VRAM size, and that's the whole point of mapping all GPU memory into CPU address space. Obviously it does not "use" the whole VRAM. I also suspect some secret sauce here, i.e. per-game optimizations in how data is transferred; AMD hinted at that in the press briefings
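As an aside, not part of the review methodology: on Linux you can read the exposed BAR sizes straight from sysfs to see whether the full VRAM is mapped. A minimal sketch; the PCI address below is a made-up example, substitute your GPU's from lspci:

Code:
from pathlib import Path

GPU_BDF = "0000:0b:00.0"  # hypothetical PCI address; check `lspci` for your GPU

# Each of the first six lines of the sysfs "resource" file is a standard BAR:
# "<start> <end> <flags>" in hex. Unused BARs are all zeros.
lines = Path(f"/sys/bus/pci/devices/{GPU_BDF}/resource").read_text().splitlines()
for i, line in enumerate(lines[:6]):
    start, end, _flags = (int(x, 16) for x in line.split())
    if end > start:
        print(f"BAR{i}: {(end - start + 1) / 2**30:.2f} GiB")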

@W1zzard, have you seen anything like this on any of the 3080s in your tests, the GPU memory frequency never going down while using multiple monitors? You have a table with clock profiles in each GPU review, and for all your 3080 reviews you listed the multi-monitor GPU memory frequency as 51 MHz. How exactly did you test that? How many monitors, at which resolutions/refresh rates, and how were they connected? DisplayPort, or HDMI? If there were just a couple of monitors at low resolutions, then that might explain the difference from my experience with the Gigabyte RTX 3080 Vision OC.
It's detailed on the power page in the expandable spoiler. Two monitors: 1920x1080 and 1280x1024, one DVI, one HDMI, both intentionally mismatched.

I think you are seeing increased clocks due to the refresh rate? Try going to 75 Hz or even 60 Hz.

Would love to hear more about this, could be good input so I can adjust my testing, in a separate thread please
 
Joined
May 31, 2016
Messages
4,437 (1.43/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Not a bad result for the 6900 XT considering the price difference with the 3090. But then again, considering the price of the 6900 XT, that is definitely not my card.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
It's detailed on the power page in the expandable spoiler. Two monitors: 1920x1080 and 1280x1024, one DVI, one HDMI, both intentionally mismatched.

I think you are seeing increased clocks due to the refresh rate? Try going to 75 Hz or even 60 Hz.

Would love to hear more about this, could be good input so I can adjust my testing, in a separate thread please

Power consumption tests may need higher bandwidth monitors to be relevant. I've done some quick testing here and there does seem to be a threshold at which the GPUs ramp up the multi-monitor consumption... gah, it'd be a shitty expense to add a high refresh display (or two) to a benchmarking system.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,842 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Wait, you can get those fancy little dongles for fake monitors (dummy display plugs) - they'd be perfect for simulating extra screens without actually needing them.
(random Amazon image for example)

 