
MSI GeForce RTX 3090 Ti Suprim X

Joined
May 31, 2017
Messages
432 (0.16/day)
Processor Ryzen 5700X
Motherboard Gigabyte B550 Arous Elite V2
Cooling Thermalright PA120
Memory Kingston FURY Renegade 3600Mhz @ 3733 tight timings
Video Card(s) Sapphire Pulse RX 6800
Storage 36TB
Display(s) Samsung QN90A
Case be quiet! Dark Base Pro 900
Audio Device(s) Khadas Tone Pro 2, HD660s, KSC75, JBL 305 MK1
Power Supply Coolermaster V850 Gold V2
Mouse Roccat Burst Pro
Keyboard Dogshit with Otemu Brown
Software W10 LTSC 2021
Why not list the MSRP price?
 
D

Deleted member 202104

Guest
Why not list the MSRP price?

It's on the Value and Conclusion page:

[screenshot: pricing table from the Value and Conclusion page]
 
Joined
Nov 11, 2016
Messages
3,479 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Awesome review with lots of new games added. Sadly, the 3090 Ti is just a 3090 with a higher default power limit, so nothing exciting here.
 
Joined
Jun 5, 2021
Messages
284 (0.22/day)
Shows the 3090 was bandwidth-starved; in some games the 3090 Ti has a massive lead at 1440p.
 
Joined
May 31, 2016
Messages
4,446 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
So preparation for Ada has started, with NVIDIA putting an emphasis on the power consumption of the 3090 Ti, which Ada will be compared to. That move will surely soften the blow of the power consumption at the Ada launch. People already say silly things like 'if you don't like the power consumption, don't buy it'. Whatever anyone says, the power draw of the current cards is atrocious and it will get worse over time. It is hard to say it, but we are definitely going backwards. These companies should start working for their money instead of looking for an excuse to turn a graphics card into a furnace to get some more performance. Fundamental changes are the only way to go.

@W1zzard out of curiosity. The 3070 and 3070 Ti are so damn slow at 4k with FarCry6 and Doom. Is that the memory capacity problem or something else?
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,974 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
The 3070 and 3070 Ti are so damn slow at 4k with FarCry6 and Doom. Is that the memory capacity problem or something else?
In 4K? Yeah, that's because they are running out of VRAM
 
Joined
Nov 11, 2016
Messages
3,479 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
So preparation for Ada has started, with NVIDIA putting an emphasis on the power consumption of the 3090 Ti, which Ada will be compared to. That move will surely soften the blow of the power consumption at the Ada launch. People already say silly things like 'if you don't like the power consumption, don't buy it'. Whatever anyone says, the power draw of the current cards is atrocious and it will get worse over time. It is hard to say it, but we are definitely going backwards. These companies should start working for their money instead of looking for an excuse to turn a graphics card into a furnace to get some more performance. Fundamental changes are the only way to go.

@W1zzard out of curiosity. The 3070 and 3070 Ti are so damn slow at 4k with FarCry6 and Doom. Is that the memory capacity problem or something else?

Just limit the max FPS to reduce power consumption; it looks like the 3090 Ti is using fewer watts than the 6900 XT in this scenario.
[chart: power consumption with a 60 FPS V-Sync cap]
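As a back-of-the-envelope illustration of what a frame cap can save - a sketch with placeholder wattages and a made-up electricity price, not the review's measured figures:

```python
# Placeholder figures only, not the review's measurements.
GAMING_W = 445        # hypothetical uncapped power draw (W)
VSYNC_60_W = 120      # hypothetical draw with a 60 FPS cap (W)
HOURS_PER_DAY = 3     # assumed gaming time
PRICE_PER_KWH = 0.40  # hypothetical electricity price per kWh

saved_kwh_per_day = (GAMING_W - VSYNC_60_W) * HOURS_PER_DAY / 1000
print(f"Saved per day:   {saved_kwh_per_day:.2f} kWh")
print(f"Saved per month: {saved_kwh_per_day * 30 * PRICE_PER_KWH:.2f} (currency units)")
```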
 
Joined
May 31, 2016
Messages
4,446 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
In 4K? Yeah, that's because they are running out of VRAM
Bummer. These would have been capable of running those games at 4K with RT on.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,974 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Bummer. These would have been capable of running those games at 4K with RT on.
Exactly, yet there's often drama about "only x GB"
 
Joined
May 31, 2016
Messages
4,446 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Just limit the max FPS to reduce power consumption; it looks like the 3090 Ti is using fewer watts than the 6900 XT in this scenario.
[chart: power consumption with a 60 FPS V-Sync cap]
Sure. Or you can just go 1080p with V-Sync; I'm sure it will use even less. Very efficient card.
:laugh:

Exactly, yet there's often drama about "only x GB"
That is not a good thing, considering NV is pushing RT so hard and yet constrains the cards with insufficient memory capacity to run it, even if they are otherwise capable of 4K. All the DLSS and RT become irrelevant for those cards.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,974 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
That is not a good thing, considering NV is pushing RT so hard and yet constrains the cards with insufficient memory capacity to run it, even if they are otherwise capable of 4K. All the DLSS and RT become irrelevant for those cards.
DLSS lowers the resolution, which lowers the memory requirement, 4K+DLSS will run perfectly fine
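To put rough numbers on that, here is a sketch using the published DLSS 2.x render-scale presets; the bytes-per-pixel figure is an assumption, and only render targets shrink - the texture pool still depends on the texture quality setting:

```python
# Published DLSS 2.x render-scale presets; bytes-per-pixel is a rough assumption
# (HDR colour target + depth), not any particular engine's real budget.
MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.333}
OUT_W, OUT_H = 3840, 2160      # output (display) resolution
BYTES_PER_PIXEL = 12           # assumed: 8 B HDR colour + 4 B depth/stencil

for mode, scale in MODES.items():
    w, h = int(OUT_W * scale), int(OUT_H * scale)
    mb = w * h * BYTES_PER_PIXEL / 1024**2
    print(f"{mode:17s} renders at {w}x{h} (~{mb:.0f} MB per set of render targets)")
```

So the savings come mainly from render targets and RT-related buffers rather than from textures.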
 
Joined
May 31, 2016
Messages
4,446 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
DLSS lowers the resolution, which lowers the memory requirement, 4K+DLSS will run perfectly fine
So were you able to run Doom at 4K with RT on, with a 3070 Ti for instance, with DLSS enabled, and did it give you decent FPS? Or was that out of the picture as well?
If it did run, then DLSS has another function.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,974 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
So were you able to run Doom at 4K with RT on, with a 3070 Ti for instance, with DLSS enabled, and did it give you decent FPS? Or was that out of the picture as well?
If it did run, then DLSS has another function.
I haven't tested it, but I'm quite positive that it will run with good FPS in that scenario
 
Joined
May 31, 2016
Messages
4,446 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
I haven't tested it, but I'm quite positive that it will run with good FPS in that scenario
How about that. DLSS is a software workaround for insufficient memory capacity. :roll:
Do you have any memory requirements for 4K Doom and FarCry when RT is on? I'm guessing the minimum is 10 GB, since the 3080 with 10 GB is fine. Assuming memory requirements keep growing, I wonder how the other tested games stack up in VRAM requirements for 4K RT gameplay. Is 8 GB right at the edge, or is there some headroom left?
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,974 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Do you have any memory requirements for 4K Doom and FarCry when RT is on? I'm guessing the minimum is 10 GB, since the 3080 with 10 GB is fine.
Yup, maybe it's 9 GB, but same thing really. It also depends on the map, your location in it, and the settings of course (I'm using highest)
 
Joined
May 31, 2016
Messages
4,446 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Yup, maybe it's 9 GB, but same thing really. It also depends on the map, your location in it, and the settings of course (I'm using highest)
Well, apparently the 3070 could have had decent FPS in Doom or FarCry even without DLSS, but the memory capacity does not allow it. It is really starting to be an issue. A handicapped card.
 
Joined
May 2, 2017
Messages
7,762 (2.77/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
The 3070 and 3070 Ti are so damn slow at 4k with FarCry6 and Doom. Is that the memory capacity problem or something else?
I assume you're talking about RT performance? 'Cause in rasterization this does not apply at all.

As for "decent FPS" in FarCry6 with RT on? In lower resolutions these cards roughly match or slightly beat the 2080Ti, which delivers 48.8fps - hardly ground breaking performance. Playable? Absolutely. But hardly good. Calling the cards handicapped because of poor performance in an extreme edge case scenario (RT performance at the highest reasonably available resolution in two games out of nine), for cards arguably not designed for gaming at that resolution in the first place? Yeah, that's a stretch. Sure, the 3070 and Ti are perfectly capable 2160p60 cards in rasterization. But not in RT. And that is fine - it's an extreme requirement.

Just limit the max FPS to reduce power consumption; it looks like the 3090 Ti is using fewer watts than the 6900 XT in this scenario.
[chart: power consumption with a 60 FPS V-Sync cap]
I'm genuinely curious as to why this is. Looking at the CP2077 performance results, sadly these cards seem CPU limited at 1080p, so there isn't much to gather there - though at 1440p the 3090 Ti clearly pulls ahead of the 3080 Ti, 6900 XT and 3090, which are tied. Yet in this power consumption graph the Nvidia cards are closely grouped while the 6900 XT is ~40% higher. That strikes me as odd, given that the 6900 XT uses less power than a 3090, 3080 Ti, and even the 3080. I understand that the test scenarios for these measurements aren't the same, but the difference seems very strange to me. The Gaming power consumption numbers are also CP2077, though at 1440p, not 1080p - but in that scenario the 6900 XT delivers essentially identical performance at ~40 W less. So how come the situation is so dramatically reversed at 1080p60? @W1zzard, got any thoughts on this?
 
Joined
May 31, 2016
Messages
4,446 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
I assume you're talking about RT performance? 'Cause in rasterization this does not apply at all.
If you want to know, you need to read the post; cutting into a conversation may lead to misunderstanding.
Yes, it is about RT.
As for "decent FPS" in FarCry6 with RT on? In lower resolutions these cards roughly match or slightly beat the 2080Ti, which delivers 48.8fps - hardly ground breaking performance. Playable? Absolutely. But hardly good. Calling the cards handicapped because of poor performance in an extreme edge case scenario (RT performance at the highest reasonably available resolution in two games out of nine), for cards arguably not designed for gaming at that resolution in the first place? Yeah, that's a stretch. Sure, the 3070 and Ti are perfectly capable 2160p60 cards in rasterization. But not in RT. And that is fine - it's an extreme requirement.
As you mentioned, it is still playable, and Doom maxed out would be around 70 FPS, which is perfect. With FarCry6 you can always drop some detail and play at 60 no problem, if mid-50s is not what you'd expect. Memory constraints prevent that. That is why I said handicapped card. Same goes for the 3070 Ti. You could use both of those cards to play at 4K with RT on, no problem, in both games mentioned; due to the lack of memory, they can't. A new feature for DLSS: enabling handicapped GPUs to play at 4K despite low memory. I hope that is the case. Something tells me that since RT is booming, more VRAM will be required as time goes by, so we will see more of these situations where a card can't run 4K even though it has enough core performance.
I disagree with you. If it had 10 GB of RAM, it would have been capable of 4K no problem. So by design they are not capable of that, due to memory. It is like you pay cash and you have to play at what the card was designed for, even though you could have played at a higher resolution. For me that's a handicap, not a feature.

I'm genuinely curious as to why this is. Looking at CP2077 performance results, sadly these cards seem CPU limited at 1080p, so there isn't much to gather there - though at 1440p the 3090Ti clearly pulls ahead of the 3080Ti, 6900XT and 3090 which are tied. Yet in this power consumption graph the Nvidia cards are closely grouped while the 6900XT is ~40% higher. That strikes me as odd, given that the 6900XT uses less power than a 3090, 3080ti, and even the 3080. I understand that the test scenarios for these measurements aren't the same, but the difference seems very strange to me. The Gaming power consumption numbers are also CP2077, though at 1440p, not 1080p - but in this scenario, the 6900XT delivers the essentially identical performance at ~40W less. So how come the situation is so dramatiaclly reversed at 1080p60? @W1zzard, got any thoughts on this?
You know how a GPU utilizes the resources it's given? Power consumption and performance do not scale linearly: at some point you need to give disproportionately more power to reach a certain performance level. That is what you see here. This GPU has more resources than the 6900 XT you mentioned, and it clocks lower as well, so power drops significantly for the 3090 Ti, while the 6900 XT has to use more of its resources, and that comes with higher power usage. Also, the 6900 XT has a power limit, as you know, preventing a situation like the 3090 Ti going above 450 W.
I hope that is what you have been wondering about. At least that is how I see it.
You can see similar behavior with CPUs.
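For anyone wondering why a modest clock drop can cut power so sharply, here is a minimal sketch assuming the classic dynamic-power approximation (power roughly proportional to frequency times voltage squared) and a made-up voltage/frequency curve:

```python
# Classic dynamic-power approximation: P ~ f * V^2.
# The voltage/frequency points below are made up for illustration.
points = [(1900, 1.05), (1700, 0.93), (1500, 0.83), (1300, 0.75)]  # (MHz, V)
base_clk, base_v = points[0]
base_power = base_clk * base_v ** 2

for clk, volt in points:
    rel_perf = clk / base_clk                     # roughly linear with clock in this range
    rel_power = (clk * volt ** 2) / base_power    # drops much faster than the clock
    print(f"{clk} MHz @ {volt:.2f} V -> ~{rel_perf:.0%} perf, ~{rel_power:.0%} power")
```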
 
Last edited:
Joined
Sep 17, 2014
Messages
22,736 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
With energy prices on the high... performance for the asking price on the low... IT'S MADNESS!!!

Honestly, a few years back, when purchasing a 1080 Ti for an odd £600/700, I thought I was losing the plot. I was under the impression that prices would eventually become more reasonable with later-generation product stacks. How on earth did we end up going above this sort of price range? Forget the pandemic, shortages or whatnot... the trend was already set; it just got pushed a little ahead. I can't see myself paying more than £600 for a decent gaming card, and even then while feeling I'm being ripped off.

So I have to ask... (forget relative pricing) are these manufacturers pulling more profit with each generational upgrade, or is it in line with costs? If it's the latter, I get it; otherwise I'm pulling a finger (whilst buying their cards lol) at these manufacturers and retailers.

The price of RT and 4K.

Both questionable moves 'forward' that require substantial increases in supporting hardware (cache size, VRAM, and changes in cores/specialized cores). We had very efficient GPUs for rasterized content, and even just 4K wasn't a massive issue on its own. 28nm was in a pretty good place at the end of that node, as was TSMC 16nm.

Still I say, it's going to be very interesting to see where RT will go in the future. Widespread adoption, sure, but in what magnitude, and how worthwhile it remains to dedicate hardware and die space to it... AMD might be on to something with much smaller dies that do RT but aren't great at it. A bigger die will still do the whole gaming operation faster, but you can use all of it for all content.
 
Joined
Dec 22, 2011
Messages
3,890 (0.82/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
I'm just depressed a card I would like to eventually upgrade to is already over 60% slower than this.
 
Joined
May 2, 2017
Messages
7,762 (2.77/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
If you want to know, you need to read the post; cutting into a conversation may lead to misunderstanding.
Yes, it is about RT.
Ahem. Your first post bringing this up said the following:
@W1zzard out of curiosity. The 3070 and 3070 Ti are so damn slow at 4k with FarCry6 and Doom. Is that the memory capacity problem or something else?
No mention of RT there. Hence my question. This isn't because I'm "cutting into a conversation", it's because I was curious about the premise of said conversation, as it was unclear. You literally didn't say.
As you mentioned, it is still playable, and Doom maxed out would be around 70 FPS, which is perfect. With FarCry6 you can always drop some detail and play at 60 no problem, if mid-50s is not what you'd expect. Memory constraints prevent that. That is why I said handicapped card. Same goes for the 3070 Ti. You could use both of those cards to play at 4K with RT on, no problem, in both games mentioned; due to the lack of memory, they can't. A new feature for DLSS: enabling handicapped GPUs to play at 4K despite low memory. I hope that is the case. Something tells me that since RT is booming, more VRAM will be required as time goes by, so we will see more of these situations where a card can't run 4K even though it has enough core performance.
I disagree with you. If it had 10 GB of RAM, it would have been capable of 4K no problem. So by design they are not capable of that, due to memory. It is like you pay cash and you have to play at what the card was designed for, even though you could have played at a higher resolution. For me that's a handicap, not a feature.
As for this, we'll have to disagree on that. While this "problem" will no doubt become more noticeable in the future, at the same time its absolute compute performance (whether rasterization or RT) will simultaneously decrease relative to the demands put on it by games, meaning that by the point where this is a dominating issue (rather than an extreme niche case, like today), those GPUs likely wouldn't produce playable framerates even if they had infinite VRAM. Remember, Doom Eternal is just about the easiest-to-run AAA shooter out there in terms of its compute requirements (and it can likely run more than fine at 2160p RT on a 3070 if you lower the texture quality or some other memory-heavy setting to the second highest setting). And it's not like these two games are even remotely representative of RT loads today - heck, nothing is, given that performance for the 3090 Ti at 2160p varies from ~137fps to ~24fps. The span is too wide. So, using these two edge cases as a predictor for the future is nit-picking and statistically insignificant. So again, calling the cards "handicapped" here is ... well, you're picking out an extreme edge case and using it in a way that I think is overblown. You can't expect universal 2160p60 RT from any GPU today, so why would you do so with an upper mid-range/lower high end GPU? That just doesn't make sense. Every GPU has its limitations, and these ones clearly have their limitations most specifically in memory-intensive RT at 2160p - the most extreme use case possible. That is a really small limitation. Calling that a "handicap" is making a mountain out of a molehill.

You know how a GPU utilizes the resources it's given? Power consumption and performance do not scale linearly: at some point you need to give disproportionately more power to reach a certain performance level. That is what you see here. This GPU has more resources than the 6900 XT you mentioned, and it clocks lower as well, so power drops significantly for the 3090 Ti, while the 6900 XT has to use more of its resources, and that comes with higher power usage. Also, the 6900 XT has a power limit, as you know, preventing a situation like the 3090 Ti going above 450 W.
I hope that is what you have been wondering about. At least that is how I see it.
That is a way too simplistic solution to this conundrum. As a 6900XT owner using it on a 1440p60 display, I know just how low that GPU will clock and how efficiently it will run if it doesn't need the power (that 75W figure I gave for Elden Ring isn't too exceptional). I've also run an undervolted, underclocked profile at ~2100MHz which never exceeded 190W no matter what I threw at it. The point being: RDNA2 has no problem clocking down and reducing power if needed. And, to remind you, in the game used for power testing here, the 6900XT matches the performance of the 3080Ti and 3090 at 1440p while consuming less power. Despite its higher clocks, even at peak. And, of course, all of these GPUs will reduce their clocks roughly equally, given an equal reduction in the workload. Yet what we're seemingly seeing here is a dramatic difference in said reductions, to the tune of a massive reversal of power efficiency.

So, while you're right that power consumption and performance scaling are not linear, and that a wide-and-slow GPU will generally be more efficient than a fast-and-narrow one, your application of these principles here ignores a massive variable: architectural and node differences. We know that RDNA2 on TSMC 7nm is more efficient than Ampere on Samsung 8nm, even at ~500MHz higher clocks. This holds pretty much true across the AMD-Nvidia product stacks, though with some fluctuations. And it's not like the 3090Ti is meaningfully wider than a 3090 (the increase in compute resources is tiny), and by extension not a 6900XT either. You could argue that the 3080Ti and 3090 are wider than the 6900 XT, and they certainly clock lower - but that runs counter to your argument, as they then ought to be more efficient at peak performance, not less. This tells us that AMD simply has the architecture and node advantage to clock higher yet still win out in terms of efficiency. Thus, there doesn't seem to be any reason why these GPUs wouldn't also clock down and reduce their power to similar degrees, despite their differing starting points. Now, performance scaling per frequency for any single GPU or architecture isn't entirely linear either, but it is close to linear within the reasonable operating frequency ranges of most GPUs. Meaning that if two GPUs produce ~X performance, one at 2GHz and one at 2.5GHz, the drop in clock speeds needed to reach X/2 performance should be similar, not in MHz but in relative % to their starting frequencies. Not the same, but sufficiently similar for the difference not to matter much. And as power and clock speeds follow each other, even if non-linear, the power drop across the two GPUs should also be similar. Yet here we're seeing one GPU drop drastically more than the other - if we're comparing 3090 to 6900 XT, we're talking a 66% drop vs. a 46% drop. That's a rather dramatic difference considering that they started out at the same level of absolute performance.

One possible explanation: That the Ampere cards are actually really CPU limited at 1080p in CP2077, and would dramatically outperform the 6900XT there if not held back. This would require the same to not be true at 1440p, as the Ampere GPUs run at peak power there, indicating no significant bottleneck elsewhere. This would then require power measurements of the Ampere cards at 1080p without Vsync to check. Another possible explanation is that Nvidia is drastically pushing these cards beyond their efficiency sweet spot in a way AMD isn't - but given the massive clock speeds of RDNA2, that is also unlikely - both architectures seem to be pushed roughly equally (outside of the 3090 Ti, which is ridiculous in this regard). It could also just be some weird architectural quirk, where Ampere is suddenly drastically more efficient below a certain, quite low clock threshold (significantly lower than any of its GPUs clock in regular use). This would require power testing at ever-decreasing clocks to test.

Either way, these measurements are sufficiently weird to have me curious.
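For reference, this is the kind of calculation behind those percentages - the wattages below are placeholders, not the chart's values:

```python
# Placeholder wattages for illustration, not the numbers from TPU's charts.
cards = {
    "RTX 3090":  {"gaming_w": 350, "vsync60_w": 120},
    "RX 6900XT": {"gaming_w": 300, "vsync60_w": 165},
}
for name, w in cards.items():
    drop = 1 - w["vsync60_w"] / w["gaming_w"]   # relative power reduction under the cap
    print(f"{name}: {w['gaming_w']} W -> {w['vsync60_w']} W ({drop:.0%} reduction)")
```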

I'm just depressed a card I would like to eventually upgrade to is already over 60% slower than this.
60% slower? You're looking at a 3050 as an upgrade to a 980 Ti?
 
Joined
May 31, 2017
Messages
432 (0.16/day)
Processor Ryzen 5700X
Motherboard Gigabyte B550 Arous Elite V2
Cooling Thermalright PA120
Memory Kingston FURY Renegade 3600Mhz @ 3733 tight timings
Video Card(s) Sapphire Pulse RX 6800
Storage 36TB
Display(s) Samsung QN90A
Case be quiet! Dark Base Pro 900
Audio Device(s) Khadas Tone Pro 2, HD660s, KSC75, JBL 305 MK1
Power Supply Coolermaster V850 Gold V2
Mouse Roccat Burst Pro
Keyboard Dogshit with Otemu Brown
Software W10 LTSC 2021
Joined
Nov 16, 2020
Messages
43 (0.03/day)
Processor i7-11700F, undervolt 3.6GHz 0.96V
Motherboard ASUS TUF GAMING B560-PLUS WIFI
Cooling Cooler Master Hyper 212 Black Edition, 1x12cm case FAN
Memory 2x16GB DDR4 3200MHz CL16, Kingston FURY (KF432C16BBK2/32)
Video Card(s) GeForce RTX 2060 SUPER, ASUS DUAL O8G EVO V2, 70%, +120MHz core
Storage Crucial MX500 250GB, Crucial MX500 500GB, Seagate Barracuda 2.5" 2TB
Display(s) DELL P2417H
Case Fractal Design Focus G Black
Power Supply 550W, 80+ Gold, SilentiumPC Supremo M2 SPC140 rev 1.2
Mouse E-BLUE Silenz
Keyboard Genius KB-110X
@Valantar
I don't see that power consumption question as so complicated. Each GPU simply has some predefined performance states at certain frequencies, and the current GPU core utilization percentage determines when the GPU decides to change from one state to another. AMD and NVIDIA both have different utilization/frequency ranges and different numbers of performance states, so at default conditions it is very hard to tell whether a card could or could not be more power efficient.

I would advise taking a static camera angle in a game at 1080p60, manually setting a lower performance state, then progressively locking the GPU frequency to lower values and stopping once the GPU reaches almost 100% core utilization. Then do undervolting :D and after that check the power consumption. Without this procedure we are all totally just guessing.
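A rough outline of that procedure in script form - a sketch assuming an NVIDIA card and the standard nvidia-smi clock-lock/query options (on a 6900 XT the clock locking would instead go through the Radeon tuning UI); locking clocks usually needs admin rights:

```python
# Outline only: step the clock lock down while a static 1080p60 scene runs,
# watch utilisation and power, then restore defaults.
import subprocess
import time

def lock_clocks(mhz: int) -> None:
    # Pin both min and max graphics clock to the same value.
    subprocess.run(["nvidia-smi", f"--lock-gpu-clocks={mhz},{mhz}"], check=True)

def sample() -> str:
    # Read current graphics clock, GPU utilisation and board power draw.
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=clocks.gr,utilization.gpu,power.draw",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True)
    return result.stdout.strip()

for clk in (1800, 1500, 1200, 900):   # example clock steps, not tuned values
    lock_clocks(clk)
    time.sleep(10)                    # let the scene settle at the new clock
    print(f"locked at {clk} MHz -> {sample()}")

subprocess.run(["nvidia-smi", "--reset-gpu-clocks"], check=True)  # restore defaults
```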

Could you post the frequencies (core, VRAM) of each performance state on your 6900 XT?
 