
MSI GeForce RTX 3090 Ti Suprim X

Finally there is a graphics card (no matter what the price is) on which we can play a few years old games (like RDR2) at 1080p@60Hz, full details, at "stupidly" low consumption :)
EDIT: Can I ask what game was used in that V-Sync 60Hz power consumption summary? ...I just read it in the testing details :oops: @W1zzard
Considering how little power my 6900 XT consumes at 1440p60 in most games (75W-ish in Elden Ring, though that's hardly very demanding, just buggy AF), it could be pretty much anything - though I kind of expect it to be at 4k given the seeming advantage of Ampere over RDNA2 in that graph. Care to share some details, @W1zzard ?
 
These are so bad that the Founders Editions in France were posted for sale 3.5 hours ago and they are still in stock, lol.
 
Considering how little power my 6900 XT consumes at 1440p60 in most games (75W-ish in Elden Ring, though that's hardly very demanding, just buggy AF), it could be pretty much anything - though I kind of expect it to be at 4k given the seeming advantage of Ampere over RDNA2 in that graph. Care to share some details, @W1zzard ?

From the Power consumption page (click the Power Consumption Testing Details button near the top):

V-Sync: If you don't need the highest framerate and want to conserve power, running at 60 FPS is a good option. In this test, we run Cyberpunk 2077 at 1920x1080, capped to 60 FPS. This test is also useful for testing a graphics card's ability to react to situations with only low power requirements. For graphics cards that can't reach 60 FPS at 1080p, we report the power draw at the highest achievable frame rate.
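A minimal sketch (not the review's actual method, which relies on the game's own 60 FPS cap / V-Sync) of why capping to 60 FPS saves power: the GPU finishes each frame early and then idles for the rest of its ~16.7 ms slot, dropping to lower clocks and power states.
Code:
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def run_capped(render_frame, duration_s=10.0):
    """Call render_frame() at most TARGET_FPS times per second.
    Whatever part of the 16.7 ms budget is not spent rendering is spent
    sleeping, during which the GPU can drop to lower power states."""
    end = time.perf_counter() + duration_s
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()                               # GPU work happens here
        spare = FRAME_BUDGET - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)                        # GPU idles; power drops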
 
You just know they wanted to push these to 550w+ but the design just couldn't handle it.

Got to wait for the 4090 till we see the full 600 W monster!

In terms of performance, I'd be interested to see this up against the XFX 6900 XT Zero WB with the power limit also pushed up to 450 W odd.
 
3 reasons for this release.

1. To ensure the 6950 XT does not get the performance crown.
2. To inflate the value of the next-gen RTX 4070/4080. It also pumps up their performance-per-watt improvements.
3. To milk the more-money-than-sense crowd who believe that this card being the fastest on the market is worth 2k, while not realizing this privilege will only last 5 months or so. This is the least important factor simply due to the low volume of this product. The marketing purposes of the first two points are far more valuable.

If this had been an AMD product with similar performance differences, we would be mostly praising AMD for finally getting the performance crown back. But with this launch and the 20% gap in performance, Nvidia has likely succeeded in staving off AMD from taking the performance crown for now.
 
I'm a bit confused by the comments. I'm an owner of an aftermarket 3090 that can reach 470 W with the stock BIOS. Currently running a 550 W BIOS. What is new about this? Most 3090s with 3x 8-pin could reach roundabout the same consumption. Why are people going crazy all of a sudden? Did they expect the 3090 Ti to consume less than the 3090? I'm deeply confused...
 
Obviously fake review, because it doesn't pull the 600+W that the REALLY REALLY SMART people have been claiming for months.

/s, for those who aren't REALLY REALLY SMART.

Yeah, but that is two GPUs.

We're supposed to be moving forward, not backwards.
GA102 has 28.3 billion transistors in 628 mm², or ~45 million transistors per mm².
2x Vesuvius have 12.4 billion transistors in 876 mm², or ~14 million transistors per mm².
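For anyone who wants to check the arithmetic, here is the same density comparison worked out in a short Python snippet (the transistor counts and die areas are the figures quoted above):
Code:
# Transistor density = transistor count / die area, using the figures above.
chips = {
    "GA102":       (28.3e9, 628),  # transistors, die area in mm²
    "2x Vesuvius": (12.4e9, 876),
}

for name, (transistors, area_mm2) in chips.items():
    density = transistors / 1e6 / area_mm2  # million transistors per mm²
    print(f"{name}: ~{density:.0f} million transistors per mm²")

# GA102: ~45 million transistors per mm²
# 2x Vesuvius: ~14 million transistors per mm²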

Apparently, fundamental physics escapes you.

These are so bad that the Founders Editions in France were posted for sale 3.5 hours ago and they are still in stock, lol.
I'm sure that has nothing at all to do with a price that very few can afford.

I'm a bit confused by the comments. I'm an owner of an aftermarket 3090 that can reach 470 W with the stock BIOS. Currently running a 550 W BIOS. What is new about this? Most 3090s with 3x 8-pin could reach roundabout the same consumption. Why are people going crazy all of a sudden? Did they expect the 3090 Ti to consume less than the 3090? I'm deeply confused...
Your confusion will abate once you realise that most of the people pretending they're horrified are just AMD fanboys.
 
I'm preparing to quit this hobby looking at the power consumption of these cards. I'm just done.
You don't need to throw the whole PC gaming hobby away just to keep up with someone else's high-end epeen / industry "FOMO hype". Personally I'm perfectly fine with 1080p-1440p + 100-160 W GPUs and have zero interest in 4K gaming (so framerates never plummeted for me in the first place the way they did for the 4K crowd, who need 4-digit (price & wattage) GPUs to 'brute-force' them back up again). Likewise, having gone through my whole collection of almost 2,000 PC games last weekend, I can't find a single modern AAA performance turd among the 500 or so games with the highest gameplay hours / most fun or memorable moments that form the bulk of my playtime. So "I need a 500 W GPU or I can't have fun" is definitely not true.
 
People do have a very short memory, and before you say "these are two cards", well, according to rumors, both NVIDIA and AMD will soon release MCM GPUs which are basically dual-die products.

[Attached charts: maximum, peak, and average GPU power consumption (historical comparison)]
You do make a point when comparing these two halo products, but...

Power consumption has already got too damn high across the board compared to this historical chart. If I take the current leaks into account, it will stay the same or get even higher.

This is bad for PC gaming overall and it will just push people like myself (200 W GPU and 200 W CPU is the max I can take) to consoles: prices will probably be too high, with high power consumption and high heat output.
 
And people in the 4090 topic over yonder are saying 'muh muh 600W, of course it won't'... but this one already hits 480 W. And yes, 'you don't have to buy it'... but sooner rather than later we've set the norm for much higher TDP GPUs. Turing was up from half a decade of stable top-end TDPs. Ampere was up and away. What's next? Mars? Those nodes aren't getting a whole lot smaller, so perhaps GPUs need some fundamental changes to make their generational jump worthwhile.

You do make a point when comparing these two halo products, but...

Power consumption has already got too damn high across the board compared to this historical chart. If I take the current leaks into account, it will stay the same or get even higher.

This is bad for PC gaming overall and it will just push people like myself (200 W GPU and 200 W CPU is the max I can take) to consoles: prices will probably be too high, with high power consumption and high heat output.
Important takeaway from that chart: top-end SKUs were circling 200-225 W, with 240 W the upper end. Where are we now? :) 240 W is x70-x80 territory. This 3090 Ti doubles it.
 
Double power use, yet SLI is "gone". Makes sense to me.
A big culprit is the limitation of die size. Those dies are big already. The clocks need to be high. Where are those chiplet GPUs...

You don't need to throw the whole PC gaming hobby away just to keep up with someone else's high-end epeen / industry "FOMO hype". Personally I'm perfectly fine with 1080p-1440p + 100-160 W GPUs and have zero interest in 4K gaming (so framerates never plummeted for me in the first place the way they did for the 4K crowd, who need 4-digit (price & wattage) GPUs to 'brute-force' them back up again). Likewise, having gone through my whole collection of almost 2,000 PC games last weekend, I can't find a single modern AAA performance turd among the 500 or so games with the highest gameplay hours / most fun or memorable moments that form the bulk of my playtime. So "I need a 500 W GPU or I can't have fun" is definitely not true.
This is absolutely true as well... the price of entry into gaming isn't increasing a whole lot, to be fair, and the baseline of 'quality' is in a good place even at the sub-mid-range. That is, now that GPU prices are going down again... just a little more pls...
 
You don't need to throw the whole PC gaming hobby away just to keep up with someone else's high-end epeen / industry "FOMO hype". Personally I'm perfectly fine with 1080p-1440p + 100-160 W GPUs and have zero interest in 4K gaming (so framerates never plummeted for me in the first place the way they did for the 4K crowd, who need 4-digit (price & wattage) GPUs to 'brute-force' them back up again). Likewise, having gone through my whole collection of almost 2,000 PC games last weekend, I can't find a single modern AAA performance turd among the 500 or so games with the highest gameplay hours / most fun or memorable moments that form the bulk of my playtime. So "I need a 500 W GPU or I can't have fun" is definitely not true.
I'm not putting any GPU above 250 W in my machine. They're going to become a rarity at this point. My 3070 has been undervolted to 160 W from 240 W stock, and it has much better performance than stock and than a 2080 Ti. That's what I want.

Latest and greatest is not what I'm after - but good performance at a reasonable wattage is. And that too is becoming a rarity.

Why are we ditching efficiency for balls-to-the-wall wattage just to get a mere 5-10% increase (if even that) in performance? Just because it's a desktop doesn't mean that you should crank up the wattage, or that it doesn't matter since desktops have good cooling. Efficiency still matters. I've heard that 4060 = 3090 and 7600 XT = 6900 XT. Of course they have comparable performance when they also probably have comparable power draw...

Wake me up when we go back to innovating, performance, and efficiency - and not simply turning power sliders up until the GPU is at its limit and selling it as a new model.
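For what it's worth, a minimal sketch of capping a card's board power from a script; this uses nvidia-smi's power-limit switch, which is a power cap rather than a true voltage-curve undervolt, and the 160 W value is just the figure from the post above used as a hypothetical example:
Code:
import subprocess

def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    """Cap the board power limit of one GPU via nvidia-smi.
    Requires administrator/root rights, and the value must be inside
    the range the vBIOS allows (check with 'nvidia-smi -q -d POWER')."""
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

# Hypothetical example matching the post above:
# set_power_limit(160)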
 
Latest and greatest is not what I'm after - but good performance at a reasonable wattage is. And that too is becoming a rarity.
Why are we ditching efficiency for balls-to-the-wall wattage just to get a mere 5-10% increase (if even that) in performance? Just because it's a desktop doesn't mean that you should crank up the wattage, or that it doesn't matter since desktops have good cooling. Efficiency still matters. I've heard that 4060 = 3090 and 7600 XT = 6900 XT. Of course they have comparable performance when they also probably have comparable power draw...

Wake me up when we go back to innovating, performance, and efficiency - and not simply turning power sliders up until the GPU is at its limit and selling it as a new model.
Oh I agree. In one of my rigs I have a GTX 1660 (120 W, but I even undervolted that to 88 W), which runs 99% of what I want to play these days. But I think they've simply hit the wall. 4K and ray tracing drove up demand (as do ever less-optimised games) just after all the easy per-generation efficiency leaps we had with Maxwell, Pascal, etc. ended. So the only way of meeting "I need triple the horsepower for my 4K ray tracing" now is to triple the wattage. Personally, I find the whole rat race ridiculous and wouldn't touch a 250 W+ GPU with a barge pole either (it's made easier for me by losing a lot of interest in many "must have" AAA + multiplayer games), but I can see why a lot of people are considering switching to console if the PC industry doesn't get its act together over the next couple of years (and start making games more efficient if the hardware's architectural efficiency has genuinely hit a hard wall).

Edit: The "canary in the coal-mine" as to 'the party is over' for massive efficiency gains has been the low-end, ie, when you ignore GPU's of different wattage (and nVidia branding-drift) and just compare "same wattage across generations", the GTX 1060 (120w, 2016) was a huge jump over the GTX 960 (120w, 2015) after just 1 year, the GTX 1660S (120w, 2019) was much less even after 3 years, and the RTX 3050 (120w, 2022) is hardly any improvement at all after another 3 years. The only reason the RTX 2060 was faster than the GTX 1660 was to up the wattage to 160w. If you were to take the RTX 2060 and RTX 3060 and benchmark both capped to 120-130w, that would highlight just how "like for like" efficiency gains have slowed to a crawl since Turing...
 
"Significantly faster than RTX 3090 non-Ti"

It's less than 10% when you compare an aftermarket 3090 vs aftermarket 3090Ti, how on earth can less than 10% be deemed significant?
 
Do. Not. Buy. What a drama. Not.

I still don't understand the need to come and shit on products you don't need / can't afford / find inappropriate. Why?? People normally don't get riled up about luxury cars, houses which cost tens of millions of dollars, etc., etc. Why go crazy about this particular card, which is basically a status item and not much more?
The issue here is that they don't correlate. A luxurious mansion is not driving up normal house prices, and an expensive Bentley won't affect the price of a VW Up in the market.
Nvidia, on the other hand, is using halo products like the Titan and now the xx90 (Ti) branding to establish higher prices throughout the whole line-up.

Sure, you don't have to buy them. However, not pointing out that they are charging more and more for less just normalizes the process.
 
11% better performance at 4K vs the 3090 isn't really that awful, but it isn't really impressive either, especially because the 3090 Ti is going to be way more expensive at retail.

The ray tracing surprised me a lot; it was good to see a decent jump in ray tracing performance vs the 3090.

Hopefully the retail price isn't too far from the 3090's average price atm.
 
You just know they wanted to push these to 550w+ but the design just couldn't handle it.

Got to wait for the 4090 till we see the full 600 W monster!

In terms of performance, I'd be interested to see this up against the XFX 6900 XT Zero WB with the power limit also pushed up to 450 W odd.

There might be some aftermarket 4090s that push closer to 700 W+, which would be staggeringly stupid to use anywhere outside Siberia or the North Pole.

I'm not putting any GPU above 250 W in my machine. They're going to become a rarity at this point. My 3070 has been undervolted to 160 W from 240 W stock, and it has much better performance than stock and than a 2080 Ti. That's what I want.

Latest and greatest is not what I'm after - but good performance at a reasonable wattage is. And that too is becoming a rarity.

Why are we ditching efficiency for balls-to-the-wall wattage just to get a mere 5-10% increase (if even that) in performance? Just because it's a desktop doesn't mean that you should crank up the wattage, or that it doesn't matter since desktops have good cooling. Efficiency still matters. I've heard that 4060 = 3090 and 7600 XT = 6900 XT. Of course they have comparable performance when they also probably have comparable power draw...

Wake me up when we go back to innovating, performance, and efficiency - and not simply turning power sliders up until the GPU is at its limit and selling it as a new model.

I agree with you; 250 W is the absolute ceiling for me because of the heat output and noise such a card would spit out. Ideally sub-200 W for high end, which is where I thought we'd be by now, but the reality is going to be 3x that for high end instead.
 
With energy prices on the high... performance for the asking price on the low... IT'S MADNESS!!!

Honestly, a few years back when purchasing a 1080 Ti for an odd £600-700 I thought I was losing the plot. I was under the impression that prices would eventually be more reasonable with later-generation product stacks. How on earth did we end up going above this sort of price range? Forget the pandemic, shortages or whatnot... the trend was already set; it just got pushed a little ahead. I can't see myself paying more than £600 for a decent gaming card, and even that while feeling I'm being ripped off.

So I have to ask... (forget relative pricing) are these manufacturers pulling more profit with each generational upgrade, or is it in line with costs? If it's the latter, I get it; otherwise I'm pulling a finger (whilst buying their cards lol) at these manufacturers and retailers.
 
With energy prices on the high... performance for the asking price on the low... IT'S MADNESS!!!

Honestly, a few years back when purchasing a 1080 Ti for an odd £600-700 I thought I was losing the plot. I was under the impression that prices would eventually be more reasonable with later-generation product stacks. How on earth did we end up going above this sort of price range? Forget the pandemic, shortages or whatnot... the trend was already set; it just got pushed a little ahead. I can't see myself paying more than £600 for a decent gaming card, and even that while feeling I'm being ripped off.

So I have to ask... (forget relative pricing) are these manufacturers pulling more profit with each generational upgrade, or is it in line with costs? If it's the latter, I get it; otherwise I'm pulling a finger (whilst buying their cards lol) at these manufacturers and retailers.
Profit margins.
 
People do have a very short memory, and before you say "these are two cards", well, according to rumors, both NVIDIA and AMD will soon release MCM GPUs which are basically dual-die products.

[Attached charts: maximum, peak, and average GPU power consumption (historical comparison)]
Actually, ONLY AMD will have an MCM GPU next generation, and from all the leaks, the Nvidia 4000 series is slated to be even more power-hungry.

3 reasons for this release.

1. To ensure the 6950 XT does not get the performance crown.
2. To inflate the value of the next-gen RTX 4070/4080. It also pumps up their performance-per-watt improvements.
3. To milk the more-money-than-sense crowd who believe that this card being the fastest on the market is worth 2k, while not realizing this privilege will only last 5 months or so. This is the least important factor simply due to the low volume of this product. The marketing purposes of the first two points are far more valuable.

If this had been an AMD product with similar performance differences, we would be mostly praising AMD for finally getting the performance crown back. But with this launch and the 20% gap in performance, Nvidia has likely succeeded in staving off AMD from taking the performance crown for now.
I don't know... check HWBOT, and all the single-card GPU world records are for the 6900 XT.

Obviously fake review, because it doesn't pull the 600+W that the REALLY REALLY SMART people have been claiming for months.

/s, for those who aren't REALLY REALLY SMART.


GA102 has 28.3 billion transistors in 628 mm², or ~45 million transistors per mm².
2x Vesuvius have 12.4 billion transistors in 876 mm², or ~14 million transistors per mm².

Apparently, fundamental physics escapes you.


I'm sure that has nothing at all to do with a price that very few can afford.


Your confusion will abate once you realise that most of the people pretending they're horrified are just AMD fanboys.
If I remember correctly, it was all the Nvidia fanboys who couldn't stop talking about efficiency when Maxwell was around, and after that... they never brought it up again.
 
Profit margins.

That sucks! Honestly, if I had crazy amounts of cash to splurge I still wouldn't buy these top-end cards. I'm just happy to buy anything that gives me around 100-120 fps in the games I play at high settings on 1440p... gonna stick with that performance target! I recently purchased a build from a trusted family friend with a used 2080 Ti at a decent price and I'm over the moon. Speaking of "trusted" sellers, even the used market is a difficult place when you don't know how deep into the crypto crunch these cards have been running.
 
The issue here is that they don't correlate. A luxurious mansion is not driving up normal house prices, and an expensive Bentley won't affect the price of a VW Up in the market.
Nvidia, on the other hand, is using halo products like the Titan and now the xx90 (Ti) branding to establish higher prices throughout the whole line-up.

Sure, you don't have to buy them. However, not pointing out that they are charging more and more for less just normalizes the process.

And there's... zero reason for the 3090 Ti to drive prices up. If other GPUs are released at prices people cannot afford, those cards will not be sold and the company will go out of business. That's called logic. You're exactly right about the halo status, which also means a halo price point no one cares about except some people here in the comments who never wanted to buy this GPU anyway.

If not for the miners and weird logistics issues which I cannot really explain (nor have I read anything satisfactory as to why we have a semiconductor crisis - we didn't have one and then it suddenly emerged, WTF?), we wouldn't have had these insane prices over the past year and a half. The old law of supply and demand at work.
 
That sucks! Honestly, if I had crazy amounts of cash to splurge I still wouldn't buy these top-end cards. I'm just happy to buy anything that gives me around 100-120 fps in the games I play at high settings on 1440p... gonna stick with that performance target! I recently purchased a build from a trusted family friend with a used 2080 Ti at a decent price and I'm over the moon. Speaking of "trusted" sellers, even the used market is a difficult place when you don't know how deep into the crypto crunch these cards have been running.
It sounds like you got a great deal. To be fair, at MSRP, all cards below the $700 mark, i.e. the RTX 3080/6800 XT and lower, were decently priced. Unfortunately, MSRP turned out to be a mirage.
 
This really isn't a new thing; we've had late-cycle refreshes before, we've had big power consumption before, we've had gunning for the crown before... Oh well, to each their own outrage, it seems.

The RTX 40 series will not have its lineup's price/performance ratio based on the 3090 Ti, nor will the entire lineup have 450 W+ power consumption - maybe just the silly halo product that we always knew was stupid for gamers, like the 3090 is and was. Samsung 8 nm was sub-optimal compared to TSMC's 7 nm, everyone already seems to agree with that, and we know the 40 series is back on TSMC.

There will be <250 W cards that likely offer 3080+ performance, and with hot competition from AMD they have every chance of reasonable MSRPs too. Beyond the 3080 10/12 GB things just get silly, and that's nothing new, except in name and outright performance...
 