
AMD Radeon RX 6900 XT

Joined
Jan 13, 2020
Messages
28 (0.02/day)
It is barely faster than the RTX 3080 at 4K but costs 400 dollars more.

Not worth buying over the RTX 3080, especially when it lacks DLSS and has poor RT performance.

Both the RTX 3090 and RX 6900 XT are bad value for gamers.


Also, Nvidia will get something similar to SAM in the future. It has been confirmed.
 
Joined
Sep 24, 2020
Messages
145 (0.10/day)
System Name Room Heater Pro
Processor i9-13900KF
Motherboard ASUS ROG STRIX Z790-F GAMING WIFI
Cooling Corsair iCUE H170i ELITE CAPELLIX 420mm
Memory Corsair Vengeance Std PMIC, XMP 3.0 Black Heat spreader, 64GB (2x32GB), DDR5, 6600MT/s, CL 32, RGB
Video Card(s) Palit GeForce RTX 4090 GameRock OC 24GB
Storage Kingston FURY Renegade Gen.4, 4TB, NVMe, M.2.
Display(s) ASUS ROG Swift OLED PG48UQ, 47.5", 4K, OLED, 138Hz, 0.1 ms, G-SYNC
Case Thermaltake View 51 TG ARGB
Power Supply Asus ROG Thor, 1200W Platinum
Mouse Logitech Pro X Superlight 2
Keyboard Logitech G213 RGB
VR HMD Oculus Quest 2
Software Windows 11 23H2
You need to look at those numbers a little closer.

Are you sure?

mspaint_2020-12-09_15-52-47.png
 
Joined
Mar 21, 2016
Messages
2,508 (0.79/day)
Fixed that for you.

You need to look at those numbers a little closer.


That is an opinion not everyone will agree with.
Indeed, they can't be too bad a value if they're selling out, for starters. That said, quantities are low, but plenty of people are complaining about that at the same time, so the demand is there.

Alright, but does that take into account "Radeon Boost" being actively enabled? SAM works better at reduced resolutions as well, so "Radeon Boost" pairs very well with it. I'd argue that's a more significant perk than DLSS, since it isn't some cherry-picked, AAA-developer-enabled feature on a handful of games. If I'm not mistaken, "Radeon Boost" will work across all games, which is a significant difference between the two. Regardless, I'd say it's priced fairly appropriately, in line with performance, given it's got higher average frame rates than the RTX 3080.

Sure, you can argue the RTX 3080 is stronger at RTRT, but average frame rates aren't about RTRT in the first place; in the context of game development today, RTRT titles are more like the 0.1% of games. If you took the entire Steam catalog, the RX 6900 XT should end up ahead, assuming its performance average more or less holds across more titles. The fact is RTRT won't skew results much in the big picture right now because there are so few titles with it at this point in time, and it will take years for that to even begin to change substantially.
 
Last edited:
Joined
May 24, 2007
Messages
5,429 (0.85/day)
Location
Tennessee
System Name AM5
Processor AMD Ryzen R9 7950X
Motherboard Asrock X670E Taichi
Cooling EK AIO Basic 360
Memory Corsair Vengeance DDR5 5600 64 Gb - XMP1 Profile
Video Card(s) AMD Reference 7900 XTX 24 Gb
Storage Crucial Gen 5 1 TB, Samsung Gen 4 980 1 TB / Samsung 8TB SSD
Display(s) Samsung 34" 240hz 4K
Case Fractal Define R7
Power Supply Seasonic PRIME PX-1300, 1300W 80+ Platinum, Full Modular
Thank you for the review, W1zzard. Great graphics card at this price point versus the 3090. Though I still think 4K hardware is not *quite* there.

I don't agree with the recent review downgrades caused by "Not for the average gamer." I would rather you provide a review on the hardware, not a review on what percentage of the market will want the hardware. I wish we could do away with this in future reviews.
 
Joined
Mar 23, 2005
Messages
4,086 (0.57/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 180Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
Something is wrong with the RTX 3090. It's not meant for PC gaming, seeing how badly it performs in games relative to the price Nvidia is asking. Had it not been released, AMD's RX 6900 XT would have been the fastest GPU on the planet, at least for a short period until Nvidia musters up the RTX 3080 Ti. So now we all know why Nvidia needed to launch an overpriced, power-sucking server GPU called the RTX 3090: to keep AMD's RDNA2 from earning the label of fastest GPU, at least for a short while. o_O:eek::cry::peace::clap::kookoo::toast::roll:
 
Joined
Sep 24, 2020
Messages
145 (0.10/day)
System Name Room Heater Pro
Processor i9-13900KF
Motherboard ASUS ROG STRIX Z790-F GAMING WIFI
Cooling Corsair iCUE H170i ELITE CAPELLIX 420mm
Memory Corsair Vengeance Std PMIC, XMP 3.0 Black Heat spreader, 64GB (2x32GB), DDR5, 6600MT/s, CL 32, RGB
Video Card(s) Palit GeForce RTX 4090 GameRock OC 24GB
Storage Kingston FURY Renegade Gen.4, 4TB, NVMe, M.2.
Display(s) ASUS ROG Swift OLED PG48UQ, 47.5", 4K, OLED, 138Hz, 0.1 ms, G-SYNC
Case Thermaltake View 51 TG ARGB
Power Supply Asus ROG Thor, 1200W Platinum
Mouse Logitech Pro X Superlight 2
Keyboard Logitech G213 RGB
VR HMD Oculus Quest 2
Software Windows 11 23H2
Are you blind or do you not understand your own example?

Your initial modified quote said it's "barely faster than RTX 3090", instead of the 3080 from the actual quote. That was factually wrong, and my post demonstrated it.

Then apparently you realized your mistake, probably after this reply to me, and edited the quote again in your old post, to say it is "barely slower than RTX 3090". Which is debatable, but OK, at least it's not completely wrong.

The point is the 6900 XT is closer to a 3080 than a 3090 at 4K, so the claim of the original poster of that quote, that it is barely faster than the 3080 at 4K, is still more appropriate than your second edit to it, in my opinion.

Alright, but does that take into account "Radeon Boost" being actively enabled? SAM works better at reduced resolutions as well, so "Radeon Boost" pairs very well with it. I'd argue that's a more significant perk than DLSS, since it isn't some cherry-picked, AAA-developer-enabled feature on a handful of games. If I'm not mistaken, "Radeon Boost" will work across all games, which is a significant difference between the two. Regardless, I'd say it's priced fairly appropriately, in line with performance, given it's got higher average frame rates than the RTX 3080.

Honestly, at the moment I don't care about either DLSS or Radeon Boost. I don't intend to use DLSS, but I may change my mind in the future. And Radeon Boost just reduces the actual rendering resolution when you make fast movements, if I understand correctly. It's a nice feature, but in a title like Microsoft Flight Simulator, which is what I played the most recently, you rarely make fast movements. Depending on the game and your play style, it might not help much, if at all, even if in theory it could work for any game.
 
Joined
Jul 5, 2013
Messages
27,806 (6.68/day)
So now we all know why Nvidia needed to launch an overpriced, power-sucking server GPU called the RTX 3090: to keep AMD's RDNA2 from earning the label of fastest GPU, at least for a short while.
Wrong. NVidia launched the 3090 as a premium product and it shines in that capacity. It is currently the only card shown so far to do 8k gaming. The 6900XT is likely to be able to do it as well, but no one has shown those results yet.

Then apparently you realized your mistake, probably after this reply to me, and edited the quote again in your old post, to say it is "barely slower than RTX 3090".
Yup, true. I did see my error and corrected it. Still doesn't matter. Their original statement was blatantly and deliberately incorrect and that is what I was pointing out. Perhaps you failed to understand that context.
Which is debatable, but OK, at least it's not completely wrong.
Not debatable. The data is clearly displayed and while the average shows the 6900XT leaning slightly toward the 3080, there are many instances where the performance of 6900XT tops the 3090. As @Mussels rightly said earlier, there is no clear winner. Depending on the game title, each card is within 2% or 3% of each other.

Yes, NVidia technically has the performance crown with the 3090, but only by the slimmest of margins and only through collating averages over a limited number of gaming titles. As I said earlier, the 3090's advantages are its RTRT performance and the extra 8GB of VRAM. Otherwise AMD has matched NVidia this round of GPUs. 3090-like performance for 2/3 the price? 6900XT. 3080-like performance for $110 less? 6800XT.

Trying to minimize AMD's progress and offerings in the way many users have been doing in this thread is the same kind of narrow-minded nonsense that people were spewing about RTRT with the release of the RTX 2000 series cards a few years ago. It's pathetic.
 
Joined
Jul 23, 2019
Messages
71 (0.04/day)
Location
France
System Name Computer
Processor Intel Core i9-9900kf
Motherboard MSI MPG Z390 Gaming Pro Carbon
Cooling MSI MPG Coreliquid K360
Memory 32GB G.Skill Trident Z DDR4-3600 CL16-19-19-39
Video Card(s) Asus GeForce RTX 4070 DUAL OC
Storage A Bunch of Sata SSD and some HD for storage
Display(s) Asus MG278q (main screen)
I think I got to the bottom of it. Using multiple monitors triggers the problem with my Gigabyte RTX 3080 Vision OC. I have 2 or 3 displays connected at all times: a 4K @ 60 Hz monitor over DisplayPort, a 3440x1440 monitor at 100Hz, also over DisplayPort, and a 4K TV at 60Hz HDR, over HDMI, which I usually keep turned off.

After closing all applications, it still refused to reduce the GPU memory speed. But I noticed that when Windows turns off my displays, the GPU memory frequency and power usage finally go down. So, I disconnected my 4K monitor. The power usage went down to 7%, and the memory frequency dropped to 51MHz from 1188MHz. I turned on the 4K TV instead; the power usage and memory frequency remained low. I turned off the 4K TV again and reconnected the 4K monitor. The power usage and memory frequency went up again. I disconnected the 3440x1440 display; the frequency and power usage dropped. I turned on the 4K TV; the power usage and memory frequency remained low.

So, in short, if I connect both my monitors, over DisplayPort, the memory frequency never goes down. As a final experiment, I connected the 3440x1440 display over HDMI, at 50Hz. There were some oscillations, depending on which apps were open, but the GPU power usage and memory frequency remained low, for the most part.

So, I'm guessing it really doesn't like having multiple monitors at high refresh rates and resolutions connected, especially over DisplayPort. This is how the power and frequency usage looked while I was disconnecting/connecting various monitors:

View attachment 178801

The thing is, I looked at all the 3080 TPU reviews, and none of them mentioned the GPU memory frequency being higher when idle and using multiple monitors, unless I missed something.

@W1zzard have you seen anything like this on any of the 3080s in your tests, the GPU memory frequency never going down while using multiple monitors? You have a table with clock profiles in each GPU review, and for all your 3080 reviews you listed the multi-monitor GPU memory frequency as 51MHz. How exactly did you test that? How many monitors, at which resolutions/refresh rates, and how were they connected? DisplayPort, or HDMI? If there were just a couple of monitors at low resolutions, then that might explain the difference from my experience with the Gigabyte RTX 3080 Vision OC.

It's a bit off topic, but anyway, I had this issue for a few months with my 2070 Super (1440p 144Hz monitor on DP + 1080p 60Hz monitor on HDMI), but it solved itself at some point. It seems to be quite a common issue with Nvidia cards. If a clean driver reinstall doesn't solve it, you can use the Multi Display Power Saver module from Nvidia Inspector to force the reduced frequency when idling.
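If you want to watch it happen in real time while plugging displays in and out, here's a rough sketch of how I'd log it (just my own approach; it assumes nvidia-smi is installed and on the PATH):

```python
# Rough monitoring sketch: poll nvidia-smi once per second and print the
# GPU memory clock and power draw, so you can see exactly when the card
# drops back to its idle memory state.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=clocks.mem,power.draw",
    "--format=csv,noheader,nounits",
]

while True:
    # One output line per GPU: "<memory clock MHz>, <power draw W>"
    result = subprocess.run(QUERY, capture_output=True, text=True)
    print(time.strftime("%H:%M:%S"), result.stdout.strip())
    time.sleep(1)
```

Leave it running in a terminal while you toggle HDR or reconnect a display, and you can see straight away whether the memory clock falls back to its idle value (the ~51MHz mentioned above) or stays pinned.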
 
Joined
Sep 24, 2020
Messages
145 (0.10/day)
System Name Room Heater Pro
Processor i9-13900KF
Motherboard ASUS ROG STRIX Z790-F GAMING WIFI
Cooling Corsair iCUE H170i ELITE CAPELLIX 420mm
Memory Corsair Vengeance Std PMIC, XMP 3.0 Black Heat spreader, 64GB (2x32GB), DDR5, 6600MT/s, CL 32, RGB
Video Card(s) Palit GeForce RTX 4090 GameRock OC 24GB
Storage Kingston FURY Renegade Gen.4, 4TB, NVMe, M.2.
Display(s) ASUS ROG Swift OLED PG48UQ, 47.5", 4K, OLED, 138Hz, 0.1 ms, G-SYNC
Case Thermaltake View 51 TG ARGB
Power Supply Asus ROG Thor, 1200W Platinum
Mouse Logitech Pro X Superlight 2
Keyboard Logitech G213 RGB
VR HMD Oculus Quest 2
Software Windows 11 23H2
It's a bit off topic, but anyway, I had this issue for a few months with my 2070 Super (1440p 144Hz monitor on DP + 1080p 60Hz monitor on HDMI), but it solved itself at some point. It seems to be quite a common issue with Nvidia cards. If a clean driver reinstall doesn't solve it, you can use the Multi Display Power Saver module from Nvidia Inspector to force the reduced frequency when idling.

Thanks for trying to help, but I already found the problem: it was having HDR enabled on one of the 4K displays at 60Hz. If I disable it, the power draw comes down significantly. My old card didn't support 4K HDR at 60Hz with the HDMI to DisplayPort adapter I was using, which is probably why I didn't have this problem with the old card. So, in a way, it's not a bug, it's a feature :)

Still, according to the TPU reviews the new Radeons are much more efficient in multi-monitor setups, so it's something to consider if you don't use your PC just for gaming.

Anyway, as @W1zzard requested, I added more details about my troubleshooting in a dedicated thread:

 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Thanks for trying to help, but I already found the problem: it was having HDR enabled on one of the 4K displays at 60Hz. If I disable it, the power draw comes down significantly. My old card didn't support 4K HDR at 60Hz with the HDMI to DisplayPort adapter I was using, which is probably why I didn't have this problem with the old card. So, in a way, it's not a bug, it's a feature :)

Still, according to the TPU reviews the new Radeons are much more efficient in multi-monitor setups, so it's something to consider if you don't use your PC just for gaming.

Anyway, as @W1zzard requested, I added more details about my troubleshooting in a dedicated thread:



Ahah! Fantastic catch. I'd just been fiddling with HDR on my new screen as well and could have made the same mistake (honestly, HDR looks so bad on monitors).
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,006 (2.50/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
Wrong. NVidia launched the 3090 as a premium product and it shines in that capacity. It is currently the only card shown so far to do 8k gaming. The 6900XT is likely to be able to do it as well, but no one has shown those results yet.


Yup, true. I did see my error and corrected it. Still doesn't matter. Their original statement was blatantly and deliberately incorrect and that is what I was pointing out. Perhaps you failed to understand that context.

Not debatable. The data is clearly displayed and while the average shows the 6900XT leaning slightly toward the 3080, there are many instances where the performance of 6900XT tops the 3090. As @Mussels rightly said earlier, there is no clear winner. Depending on the game title, each card is within 2% or 3% of each other.

Yes, NVidia technically has the performance crown with the 3090, but only by the slimmest of margins and only through collating averages over a limited number of gaming titles. As I said earlier, the 3090's advantages are its RTRT performance and the extra 8GB of VRAM. Otherwise AMD has matched NVidia this round of GPUs. 3090-like performance for 2/3 the price? 6900XT. 3080-like performance for $110 less? 6800XT.

Trying to minimize AMD's progress and offerings in the way many users have been doing in this thread is the same kind of narrow-minded nonsense that people were spewing about RTRT with the release of the RTX 2000 series cards a few years ago. It's pathetic.

A 6800XT is $110 less than a 3080? News to me...
 
Joined
Jul 5, 2013
Messages
27,806 (6.68/day)
A 6800XT is $110 less than a 3080? News to me...
It's as easy as looking up the prices.
For example;
$699
$649
$589
Unless my math is off, that's a $50 difference in favor of the 6900XT. The $110 difference is for the 6800. Seems I looked at a 6800 when I looked up prices earlier. Even still, AMD has the value add.
 
Last edited:
Joined
Jul 9, 2016
Messages
1,078 (0.35/day)
System Name Main System
Processor i9-10940x
Motherboard MSI X299 Xpower Gaming AC
Cooling Noctua NH-D15S + Second Fan
Memory G.Skill 64GB @3200MHz XMP
Video Card(s) ASUS Strix RTX 3090 24GB
Storage 2TB Samsung 970 EVO Plus; 2TB Corsair Force MP600; 2TB Samsung PM981a
Display(s) Dell U4320Q; LG 43MU79-B
Case Corsair A540
Audio Device(s) Creative Lab SoundBlaster ZX-R
Power Supply EVGA G2 1300
Mouse Logitech MK550
Keyboard Corsair K95 Platinum XT Brown Switches
Software Windows 10 Pro
Benchmark Scores Cinebench R20 - 6910; FireStrike Ultra - 13241; TimeSpy Extreme - 10067; Port Royal - 13855
It's as easy as looking up the prices.
For example;
$699
$649
$589
Unless my math is off, that's a $50 difference in favor of the 6900XT. The $110 difference is for the 6800. Seems I looked at a 6800 when I looked up prices earlier.

That is a 6800 XT, not a 6900 XT. The MSRP of the 6900 XT is $999.
 
Joined
Jul 23, 2019
Messages
71 (0.04/day)
Location
France
System Name Computer
Processor Intel Core i9-9900kf
Motherboard MSI MPG Z390 Gaming Pro Carbon
Cooling MSI MPG Coreliquid K360
Memory 32GB G.Skill Trident Z DDR4-3600 CL16-19-19-39
Video Card(s) Asus GeForce RTX 4070 DUAL OC
Storage A Bunch of Sata SSD and some HD for storage
Display(s) Asus MG278q (main screen)
The price differences may depend on the country, but here (in France) there is almost no price difference between a 6800 XT (from ~770€) and an RTX 3080 (starting at ~800€). The 6900 XT is listed starting at 1250€, which seems quite high compared to the difference in performance versus a 3080, especially if you don't have a Ryzen 5000.
 
Joined
Sep 24, 2020
Messages
145 (0.10/day)
System Name Room Heater Pro
Processor i9-13900KF
Motherboard ASUS ROG STRIX Z790-F GAMING WIFI
Cooling Corsair iCUE H170i ELITE CAPELLIX 420mm
Memory Corsair Vengeance Std PMIC, XMP 3.0 Black Heat spreader, 64GB (2x32GB), DDR5, 6600MT/s, CL 32, RGB
Video Card(s) Palit GeForce RTX 4090 GameRock OC 24GB
Storage Kingston FURY Renegade Gen.4, 4TB, NVMe, M.2.
Display(s) ASUS ROG Swift OLED PG48UQ, 47.5", 4K, OLED, 138Hz, 0.1 ms, G-SYNC
Case Thermaltake View 51 TG ARGB
Power Supply Asus ROG Thor, 1200W Platinum
Mouse Logitech Pro X Superlight 2
Keyboard Logitech G213 RGB
VR HMD Oculus Quest 2
Software Windows 11 23H2
Not debatable. The data is clearly displayed and while the average shows the 6900XT leaning slightly toward the 3080, there are many instances where the performance of 6900XT tops the 3090. As @Mussels rightly said earlier, there is no clear winner. Depending on the game title, each card is within 2% or 3% of each other.

OK, it's not debatable. I'm just going to post numbers then, without debating. This is how the 6900XT looks at 4K compared to the 3090, even if you give it every possible advantage, including enabling SAM:

- 17% slower in Jedi: Fallen Order
- 15% slower in Control
- 15% slower in Anno 1800
- 15% slower in The Witcher 3
- 14% slower in Civilization VI
- 14% slower in Metro Exodus
- 12% slower in Devil May Cry 5
- 8% slower in Divinity Original Sin II
- 8% slower in Borderlands 3
- 7% slower in DOOM Eternal
- 6% slower in Red Dead Redemption 2
- 4% slower in F1 2020
- 4% slower in Gears 5
- 3% slower in Assassin's Creed Odyssey
- 3% slower in Death Stranding
- 3% slower in Sekiro: Shadows Die Twice
- 3% slower in Shadow of the Tomb Raider
- 2% slower in Project Cars 3
- 1% slower in Strange Brigade

- 3% faster in Far Cry 5
- 6% faster in Battlefield V
- 6% faster in Detroit Become Human
- 8% faster in Hitman 2
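Just to put a single number on that list (a crude arithmetic mean of the deltas above, which is not how TPU weights its own summary), here's a quick sketch:

```python
# Back-of-the-envelope average of the 4K deltas listed above.
# Negative numbers = 6900 XT slower than the RTX 3090 in that title.
slower = [17, 15, 15, 15, 14, 14, 12, 8, 8, 7, 6, 4, 4, 3, 3, 3, 3, 2, 1]
faster = [3, 6, 6, 8]

deltas = [-d for d in slower] + faster
average = sum(deltas) / len(deltas)
print(f"{len(deltas)} games, mean delta vs. RTX 3090: {average:+.1f}%")
# -> 23 games, mean delta vs. RTX 3090: -5.7%
```

Call it roughly a 6% average deficit at 4K, which fits what I said earlier about it being closer to a 3080 than a 3090 at that resolution.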

Look, don't get me wrong, the 6900XT is a nice card, and I have no problem buying AMD products when they are better than the competing ones and the price makes sense. For example I have a 5800X in a box on my desk right now, and I'm waiting for the motherboard to be delivered.

AMD has to be congratulated for closing the gap to NVidia, and I can't wait to see the next generation of AMD GPUs. All I'm saying is that at this price point the 6900XT is not something that makes me go "wow". Neither is the 3090, considering its huge price. I'm not saying not to buy them. If you need that additional performance and are willing to pay the price, go for it. I don't.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Now do the same comparison for 1440p - not everyone is focused on 4k
 
Joined
Sep 24, 2020
Messages
145 (0.10/day)
System Name Room Heater Pro
Processor i9-13900KF
Motherboard ASUS ROG STRIX Z790-F GAMING WIFI
Cooling Corsair iCUE H170i ELITE CAPELLIX 420mm
Memory Corsair Vengeance Std PMIC, XMP 3.0 Black Heat spreader, 64GB (2x32GB), DDR5, 6600MT/s, CL 32, RGB
Video Card(s) Palit GeForce RTX 4090 GameRock OC 24GB
Storage Kingston FURY Renegade Gen.4, 4TB, NVMe, M.2.
Display(s) ASUS ROG Swift OLED PG48UQ, 47.5", 4K, OLED, 138Hz, 0.1 ms, G-SYNC
Case Thermaltake View 51 TG ARGB
Power Supply Asus ROG Thor, 1200W Platinum
Mouse Logitech Pro X Superlight 2
Keyboard Logitech G213 RGB
VR HMD Oculus Quest 2
Software Windows 11 23H2
OK, but the question is, do you really need either the 3090 or the 6900XT at 1440p? I agree that some people do, but they are probably a minority.
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Great to see Zen3 + SAM combo tested.

Even the RX 6900 XT with SAM on and picked mem can't beat the Nvidia 3090 Founders Edition model.

It depends on which games you test.
Pick the newest, hottest ones and uh oh, doh.

1607594690317.png


The games in question:

1607594709433.png


OK, but the question is, do you really need either the 3090 or the 6900XT at 1440p? I agree that some people do, but they are probably a minority.
There are a number of people with decent 1440p monitors looking for high framerates.

This is how the 6900XT looks at 4K
Most of the tested games (understandably) are old crap.
And this is likely why AMD has better results at 1440p and below: you get into CPU-limited scenarios with that old crap like Civ VI.
 
Joined
Jul 5, 2013
Messages
27,806 (6.68/day)
All I'm saying is that at this price point the 6900XT is not something that makes me go "wow". Neither is the 3090, considering its huge price. I'm not saying not to buy them.
You're forgetting the prices of Vega, the Radeon VII and the RTX 2000 series cards. The 3090 is effectively the RTX Titan replacement, offering approximately 50% greater performance at $1,000 less. The RX 5000 and RX 6000 series are likewise less expensive than previous-gen GPUs and offer amazing performance jumps. Maybe I find everything exciting and amazing because I don't have a short memory and can keep perspective and context clearly in view. That wasn't a jab at you personally; it just seems a lot of people are simply forgetting the recent past.

The reality is this: GPU offerings from both companies are exceptional this generation and are a serious value compared to past generations of GPUs. Logic is lost on anyone who does not see and understand the context of that perspective.
 
Joined
Mar 21, 2016
Messages
2,508 (0.79/day)
Now do the same comparison for 1440p - not everyone is focused on 4k
AMD just needs to make a single-card, dual-GPU solution with two RX 6900 XTs and problem solved: $500 more expensive and, best case, 17% more performance than an RTX 3090... no need to worry about TDP, noise, or heat output; those figures aren't considerations for Nvidia RTX 30 series users at that end of the spectrum.

OK, but the question is, do you really need either the 3090 or the 6900XT at 1440p? I agree that some people do, but they are probably a minority.
Alright, but 4K isn't a minority? DLSS/RTRT games aren't a minority compared to the number of games without those features!? People who can just burn $500 for, best case, 17% more performance and overlook heat, noise, and power usage aren't minorities!? Who are you really, Tom Cruise!?
 
Last edited: