# MSI GeForce RTX 3090 Ti Suprim X



## W1zzard (Mar 29, 2022)

The MSI GeForce RTX 3090 Ti is an impressive quad-slot graphics card. In our review, we found that it runs quieter than the other RTX 3090 Ti cards tested today. MSI has also given the card a large factory overclock, and it handles 4K60 with ease.

*Show full review*


----------



## birdie (Mar 29, 2022)

A Porsche 911-class card for bragging rights and for those who need ultimate performance no matter what (overclockers, people with a ton of money, etc.). A last hurrah of the Ampere architecture.

There's nothing to discuss really.

I'm looking forward to Ada Lovelace. Hopefully we'll get decent performance below 200 W. This trend of increased power consumption doesn't look healthy.


----------



## Shatun_Bear (Mar 29, 2022)

Interesting: *"power consumption is very high, but when taking the achieved performance into account, it roughly matches RTX 3090. Compared to other Ampere card this means efficiency is 10% reduced, 25% worse than AMD's RDNA2 offerings."*

That 469 W whilst gaming is earth-shattering.


----------



## trsttte (Mar 29, 2022)

Wow, I almost feel like the $2,200 price should be mentioned as a positive for not being anywhere near as bad as expected.


----------



## birdie (Mar 29, 2022)

Shatun_Bear said:


> Interesting: *"power consumption is very high, but when taking the achieved performance into account, it roughly matches RTX 3090. Compared to other Ampere card this means efficiency is 10% reduced, 25% worse than AMD's RDNA2 offerings."*
> 
> That 469W whilst gaming is earth shattering.



People do have a very short memory, and before you say "these are two cards", well, according to rumors, both NVIDIA and AMD will soon release MCM GPUs which are basically dual-die products.


----------



## Cutechri (Mar 29, 2022)

I'm preparing to quit this hobby looking at the power consumption of these cards. I'm just done.


----------



## the54thvoid (Mar 29, 2022)

birdie said:


> People do have a very short memory, and before you say "these are two cards", well, according to rumors, both NVIDIA and AMD will soon release MCM GPUs which are basically dual-die products.



The 7990 was a dual-chip card too, and it only pulled 277 W typical. The R9 290 was rubbished at the time for its power consumption.

I'd not criticise people who wish to buy the card; that's down to personal liberty. But Nvidia producing it is a backwards step, IMO.


----------



## ppn (Mar 29, 2022)

Can it handle 4K120? No, and 60 Hz is only playable on OLED. You'd only get there with the newest 4090 Ti, a six-slot 600 W monster truck that will only last a year until the new games hit.


----------



## Testsubject01 (Mar 29, 2022)

birdie said:


> People do have a very short memory, and before you say "these are two cards", well, according to rumors, both NVIDIA and AMD will soon release MCM GPUs which are basically dual-die products.
> 
> 
> Spoiler: Graphs



Do they? “The Radeon R9 295X2 combines two graphics processors to increase performance.” The 3090 Ti is not 2x the 3090 on one PCB.



birdie said:


> I'm looking forward to Ada Lovelace. Hopefully we'll get decent performance below 200W. This trend of an increased power consumption doesn't look healthy.


I hope the next gen in the second half of the year (Ada Lovelace / RDNA3 / Alchemist), or at least the gen after, starts to really innovate again.
The past couple of years have delivered their performance gains at a massive tax on price, heat output, and energy consumption, effectively stagnating or in some cases regressing as a net result.


----------



## birdie (Mar 29, 2022)

ppn said:


> Can it handle 4K120, No, 60Hz is only playable on OLED.. only if you get the newest 4090Ti 6 slot 600W monster truck, that only lasts for a year until the new games hit.



This is blatantly false. The 1080 Ti, released 5 years ago, still lets you play all modern games (sans RTX) at 1080p/1440p, and most modern triple-A titles at 4K, albeit at slightly reduced visual quality. It roughly matches the performance of the RTX 3060.

Considering that GPUs have been getting more and more expensive recently, game developers are in no rush to obsolete older cards, because that would mean lost sales.

Again, enthusiasts on tech websites continue to prove how little they care about the world outside. The vast majority of people have GPUs that cost way less than $300.







Testsubject01 said:


> Do they? “The Radeon R9 295X2 combines two graphics processors to increase performance.” The 3090 Ti is not 2x the 3090 on one PCB.



Check the second comment to the article. Also reread my comment about dual-die upcoming GPUs.



Cutechri said:


> I'm preparing to quit this hobby looking at the power consumption of these cards. I'm just done.



Do. Not. Buy. What a drama. Not.


----------



## N3M3515 (Mar 29, 2022)

If the 3090 was absurd at $1500, then the 3090 Ti.........WTF?


----------



## Zubasa (Mar 29, 2022)

the54thvoid said:


> The 7990 was a dual chip card too. Only pulled in 277w typical. The R290 was rubbished at the time for its power consumption.
> 
> I'd not criticise people who wish to buy the card, it's down to personal liberty, but the production of it by Nvidia is a backwards step, IMO.


He had to pull out some eight-year-old dual 28 nm GPU card, and conveniently forgets about the Titan Z.


----------



## Meanhx (Mar 29, 2022)

My 3080 (375 W Aorus Master) puts out too much heat in my room to be viable without an undervolt and/or a 60 fps cap when gaming April to September. The next GPU I buy won't have a TBP above 250 W. I could probably game for 20-30 minutes with a 3090 Ti in my PC before giving up.


----------



## Shatun_Bear (Mar 29, 2022)

birdie said:


> People do have a very short memory, and before you say "these are two cards", well, according to rumors, both NVIDIA and AMD will soon release MCM GPUs which are basically dual-die products.



Yeah, but that is two GPUs.

We're supposed to be moving forward, not backwards.


----------



## NuttinMuch (Mar 29, 2022)

I'm trying to understand how the efficiency is comparable to the 3090 given the power and performance numbers from your charts. The 3090 Ti consumes 32% more power (469w/355w) but the performance uplift is only 12% (92fps/82fps). Am I missing something?
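A quick sketch of that arithmetic in Python, using the figures quoted above (469 W / 355 W and 92 / 82 fps from the charts):

```python
# Ratios from the review charts quoted above
power_ti, power_3090 = 469, 355  # watts while gaming
fps_ti, fps_3090 = 92, 82        # fps in the 1440p efficiency chart

power_increase = power_ti / power_3090 - 1  # extra power drawn by the Ti
perf_increase = fps_ti / fps_3090 - 1       # extra performance delivered

print(f"power: +{power_increase:.0%}, performance: +{perf_increase:.0%}")
# -> power: +32%, performance: +12%
```

On those numbers the Ti would indeed look notably less efficient, which is the apparent contradiction being asked about.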


----------



## Hyderz (Mar 29, 2022)

man that power recommendation.....


----------



## usiname (Mar 29, 2022)

NuttinMuch said:


> I'm trying to understand how the efficiency is comparable to the 3090 given the power and performance numbers from your charts. The 3090 Ti consumes 32% more power (469w/355w) but the performance uplift is only 12% (92fps/82fps). Am I missing something?


This is normal. If you limit the power consumption of the 3090 Ti, you will get the same or better performance per watt. If you OC a 3090, you will hit the same power consumption as the 3090 Ti and get worse perf/watt.


----------



## ppn (Mar 29, 2022)

Efficiency-wise it's 50% more power for 25% more performance, so the 3080 10 GB should be 120% as efficient at 4K, but at V-Sync 60 it's 110% (110 W vs. 100 W).


birdie said:


> This is blatantly false. 1080 Ti released 5 years ago still allows to play all modern games (sans RTX) at 1080p/1440p and most modern triple-A titles at 4K albeit at a slightly decreased visual quality. It roughly matches the performance of RTX 3060.


Yes, but I expect no compromises from this beefy GPU for years to come in all AAA titles with all the eye candy enabled. And to think that the RTX 4060 Ti could dethrone it in less than a year, with only the 24 GB buffer to show for itself. So much for future-proofing.


----------



## birdie (Mar 29, 2022)

Shatun_Bear said:


> Yeah, but that is two GPUs.
> 
> We're supposed to be moving forward, not backwards.



Physics is standing in the way, sorry. It's only gonna get worse unless we find a way to send signals without losing a ton of energy in the process. Optical computing is unfortunately limited to data transfer and not much more nowadays.

I still don't understand the need to come and shit on products you don't need/can't afford/find inappropriate. Why? People normally don't get riled up about luxury cars, houses that cost tens of millions of dollars, and so on. Why go crazy about this particular card, which is basically a status item and not much more?



ppn said:


> efficiency is 50% more power for 25% performance, so 3080/10 should be 120% efficient in 4K, but in Vsync60 is 110%. 110W vs 100W
> 
> Yes, but I expect no compromise with this beefy GPU for years to come in all quad AA titles with all eye candy enabled. And To think that RTX 4060Ti can dethrone it in less than 1 year with only 24 gb buffer to show for itself, so much for future proof.



You don't know that. We have nothing but rumors about Ada. It may indeed have similar performance at a slightly lower power consumption.


----------



## AnotherReader (Mar 29, 2022)

It seems like a pointless release when there are signs of Ada coming later this year. A regular 3090 would be very close to this after some overclocking. The power consumption is also obscene.


----------



## W1zzard (Mar 29, 2022)

NuttinMuch said:


> I'm trying to understand how the efficiency is comparable to the 3090 given the power and performance numbers from your charts. The 3090 Ti consumes 32% more power (469w/355w) but the performance uplift is only 12% (92fps/82fps). Am I missing something?


MSI 3090 Ti: 149.9 FPS @ 469 W
RTX 3090 FE: 112.3 FPS @ 355 W
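Dividing those power-run numbers out shows why the review calls the efficiency a rough match (a quick sketch, using the figures just above):

```python
# Performance per watt from the dedicated power-testing run
fps_per_watt_ti = 149.9 / 469    # MSI 3090 Ti Suprim X
fps_per_watt_3090 = 112.3 / 355  # RTX 3090 Founders Edition

# ~0.320 vs ~0.316 fps/W: only about a 1% difference
print(f"{fps_per_watt_ti:.3f} vs {fps_per_watt_3090:.3f} fps/W")
# -> 0.320 vs 0.316 fps/W
```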


----------



## AnotherReader (Mar 29, 2022)

W1zzard said:


> MSI 3090 Ti: 149.9 FPS @ 469 W
> RTX 3090: 112.3 FPS @ 355 W


I thought you used Cyberpunk 2077 at 1440p to calculate the efficiency metric. If that is the case, then it's:

MSI 3090 Ti: 92 fps @ 469 W
RTX 3090: 82 fps @ 355 W


----------



## W1zzard (Mar 29, 2022)

AnotherReader said:


> I thought you used CyberPunk at 1440p to calculate the efficiency metric. If that is the case, then it's:
> 
> MSI 3090 Ti: 92 fps @ 469 W
> RTX 3090: 82 fps @ 355 W


I'm doing a separate run for power on a different machine (the one with the power-testing capability).


----------



## Valantar (Mar 29, 2022)

At least it's good to see that none of these space heaters are getting any kind of recommendation stamp - though IMO the criticism for excessive power draws ought to be amped up a bit.

As I see it, the only reason Nvidia is launching this is to ease the transition to the rumored power draws of the next generation. Nothing else makes sense. Sure, they're squeezing a few hundred dollars more out of a few thousand people, but that pales in comparison to the R&D costs for making these SKUs even if they are essentially identical to higher clocked 3090s. But the main point has to be the normalization of cards nearing 500W at the high end.


----------



## Bzuco (Mar 29, 2022)

Finally there is a graphics card (no matter the price) on which we can play a few-years-old game like RDR2 at 1080p@60Hz, full details, at "stupidly" low consumption.
_EDIT: Can I ask what game was used in that V-Sync 60Hz power consumption summary?_ ...I just read it in the testing details @W1zzard


----------



## Valantar (Mar 29, 2022)

Bzuco said:


> Finally there is graphics card(no matter what is the price) on which we can play few years old games(like RDR2) at 1080p@60Hz full details at "stupidly" low consumption
> _EDIT: Can I ask what game was used in that V-Sync 60Hz power consumption summary?_ ...I just read it in testing details  @W1zzard


Considering how little power my 6900 XT consumes at 1440p60 in most games (75W-ish in Elden Ring, though that's hardly very demanding, just buggy AF), it could be pretty much anything - though I kind of expect it to be at 4k given the seeming advantage of Ampere over RDNA2 in that graph. Care to share some details, @W1zzard ?


----------



## Pumper (Mar 29, 2022)

These are so bad that the Founders Editions in France went on sale 3.5 hours ago and they are still in stock, lol.


----------



## Deleted member 202104 (Mar 29, 2022)

Valantar said:


> Considering how little power my 6900 XT consumes at 1440p60 in most games (75W-ish in Elden Ring, though that's hardly very demanding, just buggy AF), it could be pretty much anything - though I kind of expect it to be at 4k given the seeming advantage of Ampere over RDNA2 in that graph. Care to share some details, @W1zzard ?



From the Power consumption page (click the Power Consumption Testing Details button near the top):

_V-Sync: If you don't need the highest framerate and want to conserve power, running at 60 FPS is a good option. In this test, we run Cyberpunk 2077 at 1920x1080, capped to 60 FPS. This test is also useful in testing a graphic card's ability to react to situations with only low power requirements. For graphics card that can't reach 60 FPS at 1080p, we report the power draw at the highest achievable frame rate._


----------



## mb194dc (Mar 29, 2022)

You just know they wanted to push these to 550 W+, but the design just couldn't handle it.

We'll have to wait for the 4090 to see the full 600 W monster!

In terms of performance, I'd be interested to see this up against the XFX 6900 XT Zero WB with its power limit also pushed up to 450 W odd.


----------



## tajoh111 (Mar 29, 2022)

Three reasons for this release.

1. To ensure the 6950 XT does not get the performance crown.
2. To inflate the value of the next-gen RTX 4070/4080. It also pumps up their performance-per-watt improvements.
3. To milk the more-money-than-sense crowd who believe that this card being the fastest on the market is worth $2K, while not realizing this privilege will only last 5 months or so. This is the least important factor, simply due to the low volume of this product. The marketing value of the first two points is far greater.

If this had been an AMD product with similar performance differences, we would mostly be praising AMD for finally getting the performance crown back. But with this launch and the 20% gap in performance, Nvidia has likely succeeded in staving off AMD from taking the performance crown for now.


----------



## fevgatos (Mar 29, 2022)

I'm a bit confused by the comments. I own an aftermarket 3090 that can reach 470 W on the stock BIOS, and I'm currently running a 550 W BIOS. What is new about this? Most 3090s with 3x8-pin could reach roughly the same consumption. Why are people going crazy all of a sudden? Did they expect the 3090 Ti to consume less than the 3090? I'm deeply confused...


----------



## Assimilator (Mar 29, 2022)

Obviously fake review, because it doesn't pull the 600+W that the REALLY REALLY SMART people have been claiming for months.

/s, for those who aren't REALLY REALLY SMART.



Shatun_Bear said:


> Yeah, but that is two GPUs.
> 
> We're supposed to be moving forward, not backwards.


GA102 has 28.3 billion transistors in 628 mm², or ~45 million transistors per mm².
2x Vesuvius have 12.4 billion transistors in 876 mm², or ~14 million transistors per mm².

Apparently, fundamental physics escapes you.
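Those density figures check out; a quick sketch using the transistor counts and die areas above:

```python
# Transistor density: one GA102 die vs. the combined two Vesuvius dies
ga102_density = 28.3e9 / 628     # transistors per mm^2 (~45M)
vesuvius_density = 12.4e9 / 876  # transistors per mm^2 (~14M)

print(f"GA102: {ga102_density / 1e6:.0f}M/mm^2, "
      f"2x Vesuvius: {vesuvius_density / 1e6:.0f}M/mm^2")
# -> GA102: 45M/mm^2, 2x Vesuvius: 14M/mm^2
```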



Pumper said:


> These are so bad, that the Founders Editions in France have been posted for sale 3.5 hours ago and they are still in stock, lol.


I'm sure that has _nothing at all_ to do with a price that very few can afford.



fevgatos said:


> Im a bit confused with the comments. Im an owner of an aftermarket 3090 that can reach 470w with stock bios. Currently running a 550w bios. What is new about this? Most 3090's with 3x8pin could reach roundabout the same consumption. Why are people going crazy all of a sudden? Did they expect the 3090ti to consumes less than the 3090? Im deeply confused...


Your confusion will abate once you realise that most of the people pretending they're horrified, are just AMD fanboys.


----------



## BSim500 (Mar 29, 2022)

Cutechri said:


> I'm preparing to quit this hobby looking at the power consumption of these cards. I'm just done.


You don't need to throw the whole PC-gaming hobby away just to keep up with someone else's high-end e-peen / industry FOMO hype. Personally I'm perfectly fine with 1080p-1440p and 100-160 W GPUs, and I have zero interest in 4K gaming (so frame rates never plummeted for me in the first place the way they did for the 4K crowd, who need four-digit (price and wattage) GPUs to brute-force them back up again). Likewise, having gone through my whole collection of almost 2,000 PC games last weekend, I can't find a single modern AAA performance turd among the 500 or so games with the highest playtime / most fun or memorable gameplay. So _"I need a 500 W GPU or I can't have fun"_ is definitely not true.


----------



## redzo (Mar 29, 2022)

birdie said:


> People do have a very short memory, and before you say "these are two cards", well, according to rumors, both NVIDIA and AMD will soon release MCM GPUs which are basically dual-die products.


You do make a point when comparing these two halo products, but:

Power consumption has already gotten far too high across the board compared to this historical chart. If I take the current leaks into account, it will stay the same or get even higher.

This is bad for PC gaming overall, and it will just push people like myself (200 W GPU and 200 W CPU is the max I can take) to consoles: prices will probably be too high, with high power consumption and high heat output.


----------



## Vayra86 (Mar 29, 2022)

And people in the 4090 topic over yonder are saying 'muh muh 600 W, of course it won't'... but this one already hits 480. And yes, 'you don't have to buy it'... but sooner rather than later we've set the norm for much higher TDP GPUs. Turing was a step up from half a decade of stable top-end TDPs. Ampere was up and away. What's next? Mars? Those nodes aren't getting a whole lot smaller, so perhaps GPUs need some fundamental changes to make their generational jump worthwhile.



redzo said:


> You do make a point when comparing this two halo products but,
> 
> Power consumption has already got too damn high across the board compared to this historical chart. If I take into account the current leaks - > it will stay the same or get even higher.
> 
> This is bad for pc gaming overall and it will just push people like myself(200w gpu and 200w cpu max that I can take) to consoles: prices will probably be too high, high power consumption and high heat output.


The important takeaway from that chart: top-end SKUs used to circle 200-225 W, with 240 W the upper end. Where are we now? 240 W is x70-x80 territory. This 3090 Ti doubles it.


----------



## cadaveca (Mar 29, 2022)

Vayra86 said:


> Important take away from that chart: top end SKUs are circling 200-225W, with 240 the upper end. Where are we now?  240W is x70-x80 territory. This 3090ti doubles it.


Double power use, yet SLI is "gone". Makes sense to me.


----------



## Vayra86 (Mar 29, 2022)

cadaveca said:


> Double power use, yet SLI is "gone". Makes sense to me.


A big culprit is the limitation of die size. Those dies are big already. The clocks need to be high. Where are those chiplet GPUs...



BSim500 said:


> You don't need to throw the whole PC gaming hobby away just to keep up someone else's high-end epeen / industry "FOMO hype". Personally I'm perfectly fine with 1080p-1440p + 100-160w GPU's and have zero interest in 4k gaming (so fps's never plummeted for me in the first place that the 4k crowd need 4-digit (price & wattage) GPU's to 'brute-force' back up again). Likewise, having gone through my whole collection of almost 2,000 PC games last weekend, I can't find a single modern AAA performance turd amongst what forms the bulk of 500x highest gameplay / hours played / most fun or memorable games. So _"I need a 500w GPU or I can't have fun"_ is definitely not true.


This is absolutely true as well... the price of entry into gaming isn't increasing a whole lot, to be fair, and the baseline of 'quality' is in a good place even at sub-midrange. That is, now that GPU prices are going down again... just a little more, pls...


----------



## Cutechri (Mar 29, 2022)

BSim500 said:


> You don't need to throw the whole PC gaming hobby away just to keep up someone else's high-end epeen / industry "FOMO hype". Personally I'm perfectly fine with 1080p-1440p + 100-160w GPU's and have zero interest in 4k gaming (so fps's never plummeted for me in the first place that the 4k crowd need 4-digit (price & wattage) GPU's to 'brute-force' back up again). Likewise, having gone through my whole collection of almost 2,000 PC games last weekend, I can't find a single modern AAA performance turd amongst what forms the bulk of 500x highest gameplay / hours played / most fun or memorable games. So _"I need a 500w GPU or I can't have fun"_ is definitely not true.


I'm not putting any GPU above 250 W in my machine, and such GPUs are going to become a rarity at this point. My 3070 has been undervolted to 160 W from 240 W stock, and it still performs better than stock and better than a 2080 Ti. That's what I want.

The latest and greatest is not what I'm after, but good performance at a reasonable wattage is. And that too is becoming a rarity.

Why are we ditching efficiency for balls-to-the-wall wattage just to get a mere 5-10% increase (if even that) in performance? Just because it's a desktop doesn't mean you should crank up the wattage, or that it doesn't matter since desktops have good cooling. Efficiency still matters. I've heard that 4060 = 3090 and 7600 XT = 6900 XT. Of course they'll have comparable performance when they also probably have comparable power draw...

Wake me up when we go back to innovating on performance and efficiency, and not simply turning the power slider up until the GPU is at its limit and selling it as a new model.


----------



## BSim500 (Mar 29, 2022)

Cutechri said:


> Latest and greatest is not what I'm after - but good performance at a reasonable wattage is. And that too is becoming a rarity.
> Why are we ditching efficiency for balls to the wall wattage just to get a mere 5-10% increase (if even that) in performance? Just because it's a desktop it doesn't mean that you should crank up the wattage and that it doesn't matter since desktops have good cooling. Efficiency still matters. I've heard that 4060 = 3090 and 7600 XT = 6900 XT. Of course they have comparable performance when they also probably have comparable power draw...
> 
> Wake me up when we go back to innovating, performance, and efficiency - and not simply turning power sliders up until the GPU is at its limit and sell it as a new model.


Oh I agree. In one of my rigs I have a GTX 1660 (120 W, though I even undervolted that to 88 W) that runs 99% of what I want to play these days. But I think they've simply hit the wall. 4K and ray tracing drove up demand (as do ever less optimised games) just after the easy per-generation efficiency leaps of Maxwell, Pascal, etc. ended. So the only way of meeting _"I need triple the horsepower for my 4K ray tracing"_ now is to triple the wattage. Personally, I find the whole rat race ridiculous and wouldn't touch a 250 W+ GPU with a barge pole either (made easier for me by losing interest in many "must have" AAA and multiplayer games), but I can see why a lot of people are considering switching to console if the PC industry doesn't get its act together over the next couple of years (and start making games more efficient, if the hardware's architectural efficiency has genuinely hit a hard wall).

Edit: The "canary in the coal mine" for 'the party is over' on massive efficiency gains has been the low end. If you ignore GPUs of different wattages (and Nvidia branding drift) and just compare the same wattage across generations: the GTX 1060 (120 W, 2016) was a huge jump over the GTX 960 (120 W, 2015) after just one year; the GTX 1660S (120 W, 2019) was a much smaller one even after three years; and the RTX 3050 (120 W, 2022) is hardly any improvement at all after another three years. The only reason the RTX 2060 was faster than the GTX 1660 was that its wattage was upped to 160 W. If you benchmarked the RTX 2060 and RTX 3060 both capped to 120-130 W, it would highlight just how much like-for-like efficiency gains have slowed to a crawl since Turing...


----------



## HCT3000 (Mar 29, 2022)

"Significantly faster than RTX 3090 non-Ti"

It's less than 10% when you compare an aftermarket 3090 vs. an aftermarket 3090 Ti; how on earth can less than 10% be deemed significant?


----------



## Testsubject01 (Mar 29, 2022)

birdie said:


> Do. Not. Buy. What a drama. Not.





birdie said:


> I still don't understand the need to come and shit on products you don't need/can't afford/find inappropriate. Why?? People normally don't get riled up about luxury cars, houses which cost tens of millions dollars, etc. etc. etc. Why go crazy about this particular card which is basically a status item and not much more?


The issue here is that they don't correlate. A luxurious mansion does not drive up normal house prices; an expensive Bentley won't affect the price of a VW Up.
Nvidia, on the other hand, is using halo products like the Titan and now the xx90 (Ti) branding to establish higher prices throughout the whole line-up.

Sure, you don't have to buy them. However, not pointing out that they are charging more and more for less just normalizes the process.


----------



## Youlocalbox (Mar 29, 2022)

11% better performance at 4K vs. the 3090 isn't really that awful, but it isn't really impressive either, especially because the 3090 Ti is going to be way more expensive at retail.

The ray tracing surprised me a lot; it was good to see a solid jump in ray-tracing performance vs. the 3090.

Hopefully the retail price isn't too far above the current 3090 average price.


----------



## Shatun_Bear (Mar 29, 2022)

mb194dc said:


> You just know they wanted to push these to 550w+ but the design just couldn't handle it.
> 
> Got to wait for 4090 till we see the full 600w monster!
> 
> In terms of performance, be interested to see this up against the xfx 6900 xt zero wb with power limit also pushed up to 450w odd.



There might be some aftermarket 4090s that push closer to 700 W+, which would be staggeringly stupid to use anywhere outside Siberia or the North Pole.



Cutechri said:


> I'm not putting any GPU above 250W in my machine. They're going to become a rarity at this point. My 3070 has been undervolted to 160W from 240W stock, and it has much better performance over stock & over a 2080 Ti. That's what I want.
> 
> Latest and greatest is not what I'm after - but good performance at a reasonable wattage is. And that too is becoming a rarity.
> 
> ...



I agree with you; 250 W is the absolute ceiling for me because of the heat output and noise such a card would spit out. Ideally sub-200 W for high end, which is where I thought we'd be by now, but the reality is going to be 3x that for high end instead.


----------



## wheresmycar (Mar 29, 2022)

With energy prices on the high and performance for the asking price on the low... IT'S MADNESS!!!

Honestly, a few years back, when purchasing a 1080 Ti for an odd £600-700, I thought I was losing the plot. I was under the impression that prices would eventually be more reasonable with later-generation product stacks. How on earth did we end up going above this sort of price range? Forget the pandemic, shortages or whatnot; the trend was already set, it just got pushed a little ahead. I can't see myself paying more than £600 for a decent gaming card, and even that while feeling I'm being ripped off.

So I have to ask (forget relative pricing): are these manufacturers pulling more profit with each generational upgrade, or is it in line with costs? If it's the latter, I get it; otherwise I'm pulling a finger (whilst buying their cards, lol) at these manufacturers and retailers.


----------



## AnotherReader (Mar 29, 2022)

wheresmycar said:


> With energy prices on the high... performance for the asking price on the low... ITS MADNESS!!!
> 
> Honestly a few years back when purchasing a 1080 TI for an odd £600/700 i thought i was losing the plot. I was under the impression eventually prices will be more reasonable with later generation product stacks. How on earth did we end up going above this sort of price range? Forget the pandemic, shortages or whatnot... the trend was already set it just got pushed a little ahead. I can't see myself paying more than £600 for a decent gaming card and that too while feeling i'm being ripped off.
> 
> So i have to ask... (forget relative pricing) are these manufacturers pulling more profit with each generational upgrade or is it in line with costs? If its the latter, i get it otherwise i'm pulling a finger (whilst buying their cards lol) at these manufacturers and retailers .


Profit margins.


----------



## AnarchoPrimitiv (Mar 29, 2022)

birdie said:


> People do have a very short memory, and before you say "these are two cards", well, according to rumors, both NVIDIA and AMD will soon release MCM GPUs which are basically dual-die products.


Actually, ONLY AMD will have an MCM GPU next generation, and from all the leaks, Nvidia's 4000 series is slated to be even more power hungry.



tajoh111 said:


> 3 reason for this release.
> 
> 1. To ensure the 6950XT does not get performance crown.
> 2. Inflate value of next gen RTX 4070/4080. Also pumps up their performance per watt improvements.
> ...


I don't know... check HWBOT: all the single-card GPU world records are held by the 6900 XT.



Assimilator said:


> Obviously fake review, because it doesn't pull the 600+W that the REALLY REALLY SMART people have been claiming for months.
> 
> /s, for those who aren't REALLY REALLY SMART.
> 
> ...


If I remember correctly, it was the Nvidia fanboys who couldn't stop talking about efficiency when Maxwell was around, and after that... they never brought it up again.


----------



## wheresmycar (Mar 30, 2022)

AnotherReader said:


> Profit margins.



That sucks! Honestly, if I had crazy amounts of cash to splurge, I still wouldn't buy these top-end cards. I'm just happy to buy anything that gives me around 100-120 fps at high settings at 1440p in the games I play, and I'm going to stick with that performance target! I recently purchased a build from a trusted family friend with a used 2080 Ti at a decent price, and I'm over the moon. Speaking of "trusted" sellers, even the used market is a difficult place, not knowing how deep into the crypto crunch these cards have been run.


----------



## birdie (Mar 30, 2022)

Testsubject01 said:


> The issue here is, that they don't correlate. A luxurious Mansion is not driving up normal house prices, an expensive Bentley won't affect the prices of a VW Up in the market.
> Nvidia on the other hand is using Halo products like the Titan and now the xx90 (Ti) branding to establish higher prices throughout the whole line-up.
> 
> Sure, you don't have to buy them. However, not pointing out that they are charging more and more for less just normalizes the process.



And there's zero reason for the 3090 Ti to drive prices up. If other GPUs are released at prices people cannot afford, those cards will not sell and the company will go out of business. That's called logic. You're exactly right about the halo status, which also means a halo price point no one cares about, except some people here in the comments who never wanted to buy this GPU anyway.

If not for the miners and weird logistics issues which I cannot really explain (nor have I read anything satisfactory as to why we have a semiconductor crisis: we didn't have one, and then it suddenly emerged, WTF?), we wouldn't have had these insane prices over the past year and a half. The old law of supply and demand at work.


----------



## AnotherReader (Mar 30, 2022)

wheresmycar said:


> That sucks! Honestly, if I had crazy amounts of cash to splurge I still wouldn't buy these top end cards. I'm just happy to buy anything that gives me around 100-120fps in the games I play at high settings on 1440p... gonna stick with that performance target! I recently purchased a build from a trusted family friend with a used 2080 TI at a decent price and i'm over the moon. Speaking of "trusted" sellers even the used market is a difficult place not knowing what depth of the cryto crunch these cards have been running on.


It sounds like you got a great deal. To be fair, at MSRP, all cards below the $700 mark, i.e. the RTX 3080/6800 XT and lower, were decently priced. Unfortunately, MSRP turned out to be a mirage.


----------



## wolf (Mar 30, 2022)

This really isn't a new thing; we've had late-cycle refreshes before, we've had big power consumption before, we've had gunning for the crown before... Oh well, to each their own outrage, it seems.

The RTX 40 series will not have its lineup's price/performance ratio based off the 3090 Ti, nor will the entire lineup have 450W+ power consumption - maybe just the silly halo product that we always knew was stupid for gamers, like the 3090 is and was. Samsung 8nm was sub-optimal compared to TSMC's 7nm, everyone already seems to agree on that, and we know the 40 series is back on TSMC.

There will be <250W cards that likely offer 3080+ performance, and with hot competition from AMD they have every chance of reasonable MSRPs too. Beyond the 3080 10/12GB things just get silly, and it's _nothing new_, except in name and outright performance.


----------



## noel_fs (Mar 30, 2022)

why not list msrp price


----------



## Deleted member 202104 (Mar 30, 2022)

noel_fs said:


> why not list msrp price



It's on the Value and Conclusion page:


----------



## onemanhitsquad (Mar 30, 2022)

hmmm...never have given "power consumption" a second thought...I just pay my bill


----------



## gmn 17 (Mar 30, 2022)

$3,569 in Oz for a Suprim X card

https://prod.scorptec.com.au/35/461/96475/266827_large.jpg

https://www.scorptec.com.au/product/graphics-cards/nvidia/96475-geforce-rtx-3090-ti-suprim-x-24g


----------



## nguyen (Mar 30, 2022)

Awesome review with lots of new games added. Sadly, the 3090 Ti is just a 3090 with a higher default power limit, so nothing exciting here.


----------



## Richards (Mar 30, 2022)

Shows the 3090 was bandwidth starved... in some games the 3090 Ti has a massive lead at 1440p.


----------



## ratirt (Mar 30, 2022)

So preparation for Ada started with NVIDIA putting an emphasis on the power consumption of its 3090 Ti, which Ada will be compared to. That move will surely ease the blow of the power consumption figures at the Ada release. People already say stupid stuff like 'you don't like the power consumption, don't buy it'. Whatever anyone says here, the power consumption of the current cards is atrocious and it will get worse over time. It is hard to say, but we are definitely going backwards. These companies should start working for their money, not look for an excuse to make a furnace out of a graphics card to get some more performance. Fundamental changes are the only way to go.

@W1zzard out of curiosity: the 3070 and 3070 Ti are so damn slow at 4K with Far Cry 6 and Doom. Is that a memory capacity problem or something else?


----------



## W1zzard (Mar 30, 2022)

ratirt said:


> The 3070 and 3070 Ti are so damn slow at 4k with FarCry6 and Doom. Is that the memory capacity problem or something else?


In 4K? Yeah, that's because they are running out of VRAM


----------



## nguyen (Mar 30, 2022)

ratirt said:


> So preparation for ADA started with putting an emphasis on the power consumption from NVIDIA with its 3090 ti which ADA will be compared to. That move will surely ease he blow and power consumption for the ADA release. People already say stupid stuff like 'you dont like the power consumption don't buy it'. Whatever anyone says here the power consumption rating for the current cards is atrocious and it will get worse overtime. It is hard to say but we are definitely going backwards. These companies should start working for their money not look for an excuse to make a furnace out of a graphics card to get some more performance. Fundamental changes are the only way to go.
> 
> @W1zzard out of curiosity. The 3070 and 3070 Ti are so damn slow at 4k with FarCry6 and Doom. Is that the memory capacity problem or something else?



Just limit the max FPS to reduce power consumption; looks like the 3090 Ti uses fewer watts than the 6900 XT in this scenario.


----------



## ratirt (Mar 30, 2022)

W1zzard said:


> In 4K? Yeah, that's because they are running out of VRAM


Bummer. These would have been capable of running those games at 4K with RT on.


----------



## W1zzard (Mar 30, 2022)

ratirt said:


> Bummer. These would have been capable of running those games at 4K with RT on.


Exactly, yet there's often drama about "only x GB"


----------



## ratirt (Mar 30, 2022)

nguyen said:


> Just limit the max FPS to reduce power consumption, looks like 3090Ti is using less watts than 6900XT in this scenario.
> View attachment 241765


Sure. Or you can just go 1080p with Vsync. I'm sure it will use even less. Very efficient card.




W1zzard said:


> Exactly, yet there's often drama about "only x GB"


That is not a good thing, considering NV is pushing so much for RT and yet constrains the cards with insufficient memory capacity to run it, even if they are capable of 4K. All the DLSS and RT becomes irrelevant for those cards.


----------



## W1zzard (Mar 30, 2022)

ratirt said:


> That is not a good thing. Considering NV is pushing so much for RT and yet constrain the cards with insufficient memory capacity to run them. Even if they are capable for 4k. All the DLSS and RT becomes irrelevant for those cards.


DLSS lowers the resolution, which lowers the memory requirement, 4K+DLSS will run perfectly fine
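As a back-of-the-envelope sketch of why that helps: resolution-dependent buffers scale with the internal render resolution, which DLSS Quality drops to 1440p while the output stays 4K. The render-target count and bytes per pixel below are made-up illustrative values, not measured figures:

```python
# Toy estimate of per-frame render-target footprint at a given internal
# resolution. The targets/bytes_per_pixel defaults are illustrative
# assumptions, not measured values from any real engine.
def rt_mem_mb(width, height, targets=8, bytes_per_pixel=8):
    return width * height * targets * bytes_per_pixel / 1024**2

native = rt_mem_mb(3840, 2160)   # native 2160p
dlss_q = rt_mem_mb(2560, 1440)   # DLSS Quality internal resolution

print(f"native 4K: {native:.0f} MB, DLSS Quality: {dlss_q:.0f} MB "
      f"({dlss_q / native:.0%} of native)")
```

Texture memory is unaffected, so the saving only applies to resolution-dependent buffers - but those are exactly what pushes an 8 GB card over the edge at 4K.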


----------



## ratirt (Mar 30, 2022)

W1zzard said:


> DLSS lowers the resolution, which lowers the memory requirement, 4K+DLSS will run perfectly fine


So were you able to run Doom at 4K with RT on, with a 3070 Ti for instance with DLSS on, and did it give you decent FPS? Or was that out of the picture as well?
If it did run, then DLSS has another functionality.


----------



## W1zzard (Mar 30, 2022)

ratirt said:


> So were you able to run Doom at 4K with RT on, with a 3070 Ti for instance with DLSS on, and did it give you decent FPS? Or was that out of the picture as well?
> If it did run, then DLSS has another functionality.


I haven't tested it, but I'm quite positive that it will run with good FPS in that scenario


----------



## ratirt (Mar 30, 2022)

W1zzard said:


> I haven't tested it, but I'm quite positive that it will run with good FPS in that scenario


How about that. DLSS is a software workaround for insufficient memory capacity.
Do you have any memory requirements for 4K Doom and Far Cry when RT is on? I'm guessing the minimum is 10GB, since the 3080 with 10GB is fine. Assuming memory requirements keep growing, I wonder how the other games tested stack up in VRAM requirements for 4K RT gameplay - is 8GB at the edge, or is there some headroom left?


----------



## W1zzard (Mar 30, 2022)

ratirt said:


> Do you have any memory requirements for the 4K doom and FarCry when RT is on? I'm guessing the minimum is 10GB since 3080 with 10GB is fine.


Yup, maybe it's 9 GB, but same thing really. It also depends on the map, your location in it, and the settings of course (I'm using highest)


----------



## ratirt (Mar 30, 2022)

W1zzard said:


> Yup, maybe it's 9 GB, but same thing really. It also depends on the map, your location in it, and the settings of course (I'm using highest)


Well, apparently the 3070 could have had decent FPS in Doom or Far Cry even without DLSS on, but the memory capacity does not allow it. It is really starting to be an issue. A handicapped card.


----------



## Valantar (Mar 30, 2022)

ratirt said:


> The 3070 and 3070 Ti are so damn slow at 4k with FarCry6 and Doom. Is that the memory capacity problem or something else?


I assume you're talking about RT performance? 'Cause in rasterization this does not apply at all.

As for "decent FPS" in Far Cry 6 with RT on? At lower resolutions these cards roughly match or slightly beat the 2080 Ti, which delivers 48.8fps - hardly groundbreaking performance. Playable? Absolutely. But hardly _good_. Calling the cards handicapped because of poor performance in an extreme edge case scenario (RT performance at the highest reasonably available resolution in two games out of nine), for cards arguably not designed for gaming at that resolution in the first place? Yeah, that's a stretch. Sure, the 3070 and Ti are perfectly capable 2160p60 cards in rasterization. But not in RT. And that is fine - it's an extreme requirement.



nguyen said:


> Just limit the max FPS to reduce power consumption, looks like 3090Ti is using less watts than 6900XT in this scenario.
> View attachment 241765


I'm genuinely curious as to why this is. Looking at the CP2077 performance results, sadly these cards seem CPU limited at 1080p, so there isn't much to gather there - though at 1440p the 3090 Ti clearly pulls ahead of the 3080 Ti, 6900 XT and 3090, which are tied. Yet in this power consumption graph the Nvidia cards are closely grouped while the 6900 XT is ~40% higher. That strikes me as odd, given that the 6900 XT uses less power than a 3090, 3080 Ti, and even the 3080. I understand that the test scenarios for these measurements aren't the same, but the difference seems very strange to me. The gaming power consumption numbers are also CP2077, though at 1440p, not 1080p - but in this scenario, the 6900 XT delivers essentially identical performance at ~40W less. So how come the situation is so dramatically reversed at 1080p60? @W1zzard, got any thoughts on this?


----------



## ratirt (Mar 30, 2022)

Valantar said:


> I assume you're talking about RT performance? 'Cause in rasterization this does not apply at all.


If you want to know what it's about, you need to read the post; cutting into a conversation may lead to misunderstanding.
Yes, it is about RT.


Valantar said:


> As for "decent FPS" in FarCry6 with RT on? In lower resolutions these cards roughly match or slightly beat the 2080Ti, which delivers 48.8fps - hardly ground breaking performance. Playable? Absolutely. But hardly _good_. Calling the cards handicapped because of poor performance in an extreme edge case scenario (RT performance at the highest reasonably available resolution in two games out of nine), for cards arguably not designed for gaming at that resolution in the first place? Yeah, that's a stretch. Sure, the 3070 and Ti are perfectly capable 2160p60 cards in rasterization. But not in RT. And that is fine - it's an extreme requirement.


As you mentioned, still playable, and Doom maxed out would be around 70, which is perfect. With Far Cry 6 you can always drop some detail and play at 60 no problem, if mid-50s is not what you'd expect. Memory constraints prevent that. That is why I said handicapped card. Same goes for the 3070 Ti. You could use both those cards to play 4K with RT on no problem in both games mentioned; due to lack of memory, they can't. New feature for DLSS: enabling handicapped GPUs to play at 4K despite low memory. I hope that is the case. Something tells me, since RT is booming, more VRAM will be required as time goes by, so we will see more of these situations - cards that can't run 4K even though they have enough core performance.
I disagree with you. If it had 10GB of RAM it would have been capable of 4K no problem. So by design they are not capable of that due to memory. It is like you pay cash and you have to play what it is designed for, even though you could have played at a higher res. For me, a handicap, not a feature.



Valantar said:


> I'm genuinely curious as to why this is. Looking at CP2077 performance results, sadly these cards seem CPU limited at 1080p, so there isn't much to gather there - though at 1440p the 3090Ti clearly pulls ahead of the 3080Ti, 6900XT and 3090 which are tied. Yet in this power consumption graph the Nvidia cards are closely grouped while the 6900XT is ~40% higher. That strikes me as odd, given that the 6900XT uses less power than a 3090, 3080ti, and even the 3080. I understand that the test scenarios for these measurements aren't the same, but the difference seems very strange to me. The Gaming power consumption numbers are also CP2077, though at 1440p, not 1080p - but in this scenario, the 6900XT delivers the essentially identical performance at ~40W less. So how come the situation is so dramatiaclly reversed at 1080p60? @W1zzard, got any thoughts on this?


You know how a GPU utilizes the given resources? Power consumption and performance are not linear; at some point you need to give more power to achieve a certain performance level. That is what you see here. The GPU has more resources than the 6900 XT you mentioned. It clocks lower as well, so power drops significantly for the 3090 Ti, while the 6900 XT has to use more of its resources, and that comes with power usage. Also, the 6900 XT has a power limit, as you know, preventing a situation like the 3090 Ti going above 450W.
I hope that is what you have been wondering about. At least that is how I see it. 
You can see similar behavior with CPUs.


----------



## Vayra86 (Mar 30, 2022)

wheresmycar said:


> With energy prices on the high... performance for the asking price on the low... IT'S MADNESS!!!
> 
> Honestly, a few years back when purchasing a 1080 Ti for an odd £600/700 I thought I was losing the plot. I was under the impression that eventually prices would be more reasonable with later-generation product stacks. How on earth did we end up going above this sort of price range? Forget the pandemic, shortages or whatnot... the trend was already set, it just got pushed a little ahead. I can't see myself paying more than £600 for a decent gaming card, and that too while feeling I'm being ripped off.
> 
> So I have to ask... (forget relative pricing) are these manufacturers pulling more profit with each generational upgrade or is it in line with costs? If it's the latter, I get it; otherwise I'm pulling a finger (whilst buying their cards lol) at these manufacturers and retailers.



The price of RT and 4K.

Both questionable moves 'forward' that require substantial increases in supporting hardware (cache size, VRAM, and changes in cores/specialized cores). We had very efficient GPUs for rasterized content, and even just 4K wasn't a massive issue on its own. 28nm was in a pretty good place at the end of that node, as was TSMC 16nm.

Still, I say it's going to be very interesting to see where RT will go in the future. Widespread adoption, sure, but in what magnitude, and how worthwhile it remains to dedicate hardware and die space to it... AMD might be on to something with much smaller dies that do RT but aren't great at it. A bigger die will still do the whole gaming operation faster, but you can use all of it for all content.


----------



## Fluffmeister (Mar 30, 2022)

I'm just depressed a card I would like to eventually upgrade to is already over 60% slower than this.


----------



## Valantar (Mar 30, 2022)

ratirt said:


> If you want to know about then you need to read the post. Cutting in a conversation may lead to misunderstanding.
> Yes it is about RT.


Ahem. Your first post bringing this up said the following:


ratirt said:


> @W1zzard out of curiosity. The 3070 and 3070 Ti are so damn slow at 4k with FarCry6 and Doom. Is that the memory capacity problem or something else?


No mention of RT there. Hence my question. This isn't because I'm "cutting into a conversation", it's because I was curious about the premise of said conversation, as it was unclear. You literally didn't say.


ratirt said:


> as you mentioned still playable and doom maxed out would be around 70 which is perfect. With FarCry6 you can always drop some detail and play 60 no problem if mid 50 is not what you'd expect. Memory constraints prevent that. That is why I said handicapped card. Same goes for the 3070 Ti as well. You could use both those cards to play 4k with RT on no problem with both games mentioned. Due to lack of memory they can't. New feature for DLSS. Enabling handicapped GPUs to play at 4K due to low memory. I hope that is the case. Something tells me, since RT is booming, more RAM will be required as time goes by so we will see more of those situations, card cant run 4k even though they have enough core performance.
> I disagree with you. If it had the 10GB RAM it would have been capable of 4K no problem. So by design they are not capable off that due to memory. It is like you pay cash and you have to play what it is design for even though you could have played higher res. For me a handicap not a feature.


As for this, we'll have to disagree on that. While this "problem" will no doubt become more noticeable in the future, at the same time its absolute compute performance (whether rasterization or RT) will simultaneously decrease relative to the demands put on it by games, meaning that by the point where this is a dominating issue (rather than an extreme niche case, like today), those GPUs likely wouldn't produce playable framerates even if they had infinite VRAM. Remember, Doom Eternal is just about the easiest-to-run AAA shooter out there in terms of its compute requirements (and it can likely run more than fine at 2160p RT on a 3070 if you lower the texture quality or some other memory-heavy setting to the second highest setting). And it's not like these two games are even remotely representative of RT loads today - heck, nothing is, given that performance for the 3090 Ti at 2160p varies from ~137fps to ~24fps. The span is too wide. So, using these two edge cases as a predictor for the future is nit-picking and statistically insignificant. So again, calling the cards "handicapped" here is ... well, you're picking out an extreme edge case and using it in a way that I think is overblown. You can't expect universal 2160p60 RT from _any_ GPU today, so why would you do so with an upper mid-range/lower high end GPU? That just doesn't make sense. Every GPU has its limitations, and these ones clearly have their limitations most specifically in memory-intensive RT at 2160p  - the most extreme use case possible. That is a _really_ small limitation. Calling that a "handicap" is making a mountain out of a molehill.



ratirt said:


> You know how one GPU utilize the given resources? The power consumption and performance are not linear? At some point you need to give more power to achieve certain performance level. That is what you see here. The GPU has more resources than 6900xt you have mentioned. It clocks lower as well so power drops significantly for 3090Ti while 6900xt has to use more resources and that comes with a power usage. Also, 6900 XT has a power limit as you know preventing the situation like 3090 Ti going above 450W.
> I hope that is what you have been wondering about. At least that is how I see it.


That is a way too simplistic solution to this conundrum. As a 6900XT owner using it on a 1440p60 display, I know just how low that GPU will clock and how efficiently it will run if it doesn't need the power (that 75W figure I gave for Elden Ring isn't too exceptional). I've also run an undervolted, underclocked profile at ~2100MHz which never exceeded 190W no matter what I threw at it. The point being: RDNA2 has no problem clocking down and reducing power if needed. And, to remind you, in the game used for power testing here, the 6900XT matches the performance of the 3080Ti and 3090 at 1440p _while consuming less power_. Despite its higher clocks, even at peak. And, of course, all of these GPUs will reduce their clocks roughly equally, given an equal reduction in the workload. Yet what we're seemingly seeing here is a dramatic difference in said reductions, to the tune of a massive reversal of power efficiency.

So, while you're right that power consumption and performance scaling are not linear, and that a wide-and-slow GPU will generally be more efficient than a fast-and-narrow one, your application of these principles here ignores a massive variable: architectural and node differences. We know that RDNA2 on TSMC 7nm is more efficient than Ampere on Samsung 8nm, even at ~500MHz higher clocks. This holds pretty much true across the AMD-Nvidia product stacks, though with some fluctuations. And it's not like the 3090 Ti is meaningfully wider than a 3090 (the increase in compute resources is _tiny_), and by extension not a 6900 XT either. You could argue that the 3080 Ti and 3090 are wider than the 6900 XT, and they certainly clock lower - but that runs counter to your argument, as they then ought to be _more_ efficient at peak performance, not less. This tells us that AMD simply has the architecture and node advantage to clock higher yet still win out in terms of efficiency. Thus, there doesn't seem to be any reason why these GPUs wouldn't also clock down and reduce their power to similar degrees, despite their differing starting points. Now, performance scaling per frequency for any single GPU or architecture isn't entirely linear either, but it is close to linear within the reasonable operating frequency ranges of most GPUs. Meaning that if two GPUs produce ~X performance, one at 2GHz and one at 2.5GHz, the drop in clock speeds needed to reach X/2 performance should be similar - not in MHz, but relative to their starting frequencies. Not the same, but sufficiently similar for the difference not to matter much. And as power and clock speeds follow each other, even if non-linearly, the power drop across the two GPUs should also be similar. Yet here we're seeing one GPU drop _drastically_ more than the other - if we're comparing the 3090 to the 6900 XT, we're talking a 66% drop vs. a 46% drop.
That's a rather dramatic difference considering that they started out at the same level of absolute performance.

One possible explanation: That the Ampere cards are actually _really_ CPU limited at 1080p in CP2077, and would dramatically outperform the 6900XT there if not held back. This would require the same to _not_ be true at 1440p, as the Ampere GPUs run at peak power there, indicating no significant bottleneck elsewhere. This would then require power measurements of the Ampere cards at 1080p without Vsync to check. Another possible explanation is that Nvidia is _drastically_ pushing these cards beyond their efficiency sweet spot in a way AMD isn't - but given the massive clock speeds of RDNA2, that is also unlikely - both architectures seem to be pushed roughly equally (outside of the 3090 Ti, which is ridiculous in this regard). It could also just be some weird architectural quirk, where Ampere is suddenly _drastically_ more efficient below a certain, quite low clock threshold (significantly lower than any of its GPUs clock in regular use). This would require power testing at ever-decreasing clocks to test.
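The DVFS intuition behind that comparison can be sketched numerically: dynamic power goes roughly as C·f·V², and since voltage tracks frequency inside the normal operating range, power falls roughly with the cube of clock. This is a toy model, not a measurement of either card:

```python
# Toy DVFS model: P ~ f * V^2, and V scales roughly linearly with f in
# the normal operating range, so relative power ~ (clock ratio)^3.
# Purely illustrative; real GPUs deviate from this at the extremes.
def relative_power(clock_ratio):
    return clock_ratio ** 3

for ratio in (0.9, 0.8, 0.7):
    print(f"{1 - ratio:.0%} clock drop -> ~{1 - relative_power(ratio):.0%} power drop")
```

On this model a modest relative downclock buys a disproportionate power saving on either architecture, which is why two GPUs making similar relative clock reductions should show similar relative power drops - and why a 66% vs. 46% split between cards of equal performance looks odd.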

Either way, these measurements are sufficiently weird to have me curious.



Fluffmeister said:


> I'm just depressed a card I would like to eventually upgrade to is already over 60% slower than this.


60% slower? You're looking at a 3050 as an upgrade to a 980 Ti?


----------



## noel_fs (Mar 30, 2022)

weekendgeek said:


> It's on the Value and Conclusion page:
> 
> View attachment 241752


Yep, but I'm referring to the intro page. I know they wouldn't be realistic, but at least it would tell people what they should be looking for and possibly help prices drop.


----------



## Bzuco (Mar 30, 2022)

@Valantar 
I don't see that power consumption as so complicated. Each GPU simply has some predefined performance states at certain frequencies, and it depends on the current GPU core usage percentage when the GPU decides to change from one state to another. AMD and Nvidia have different percentage/frequency ranges and different numbers of performance states, so at default conditions it is very hard to tell if a card could or could not be more power efficient.

I would advise taking some static camera angle in a game at 1080p60, manually setting a lower performance state, then manually locking the GPU frequencies to lower values, and stopping this process when the GPU reaches almost 100% core utilization. Then do undervolting, and after that check what the power consumption is. Without this procedure we are all totally just guessing.
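That lock-and-measure loop can be scripted on the Nvidia side. A sketch that parses `nvidia-smi`'s CSV query output - the sampling itself (e.g. `nvidia-smi -lgc <MHz>` to lock the core clock, then `nvidia-smi --query-gpu=clocks.gr,power.draw --format=csv,noheader` to read a sample) needs an NVIDIA GPU and suitable privileges, so only the parsing step is shown runnable here:

```python
# Parse one "clocks.gr, power.draw" CSV sample in the format emitted by
#   nvidia-smi --query-gpu=clocks.gr,power.draw --format=csv,noheader
# e.g. "1200 MHz, 187.45 W" -> (1200, 187.45)
def parse_sample(line):
    clk_field, pwr_field = (field.strip() for field in line.split(","))
    return int(clk_field.split()[0]), float(pwr_field.split()[0])

clk_mhz, power_w = parse_sample("1200 MHz, 187.45 W")
print(clk_mhz, power_w)
```

Collecting one such sample per locked clock step gives exactly the frequency/power curve Bzuco describes, without eyeballing an overlay.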

Could you post what are the frequencies (core, vram) of each performance state on your 6900XT?


----------



## ratirt (Mar 30, 2022)

Valantar said:


> As for this, we'll have to disagree on that. While this "problem" will no doubt become more noticeable in the future, at the same time its absolute compute performance (whether rasterization or RT) will simultaneously decrease relative to the demands put on it by games, meaning that by the point where this is a dominating issue (rather than an extreme niche case, like today), those GPUs likely wouldn't produce playable framerates even if they had infinite VRAM. Remember, Doom Eternal is just about the easiest-to-run AAA shooter out there in terms of its compute requirements (and it can likely run more than fine at 2160p RT on a 3070 if you lower the texture quality or some other memory-heavy setting to the second highest setting). And it's not like these two games are even remotely representative of RT loads today - heck, nothing is, given that performance for the 3090 Ti at 2160p varies from ~137fps to ~24fps. The span is too wide. So, using these two edge cases as a predictor for the future is nit-picking and statistically insignificant. So again, calling the cards "handicapped" here is ... well, you're picking out an extreme edge case and using it in a way that I think is overblown. You can't expect universal 2160p60 RT from _any_ GPU today, so why would you do so with an upper mid-range/lower high end GPU? That just doesn't make sense. Every GPU has its limitations, and these ones clearly have their limitations most specifically in memory-intensive RT at 2160p - the most extreme use case possible. That is a _really_ small limitation. Calling that a "handicap" is making a mountain out of a molehill.


I'm not talking about to what degree this is a problem, but rather that it is here already, and obviously in the future it will be more noticeable. These cards are RT capable, and if you pay cash for something, you don't want to be constrained. Saying this card is for 1440p doesn't mean you must play at that resolution and can't go up and crank down the details to play comfortably. Some people might see that as a huge disadvantage, you know. Infinite VRAM? I'm talking about the bare minimum to play a game, which apparently these GPUs (3070 and 3070 Ti) would have been capable of in the games I mentioned. I really don't understand what you are trying to prove - that the RT implementation sucks because RT is more demanding in some games? So is rasterization in those games, obviously, and since RT hits performance when enabled, that's naturally what happens. Listen, I asked about the RAM because I was curious. Nonetheless, these cards could play 4K RT in those games but already can't. Not tomorrow, but now. So let's just leave it at that. If you are OK with it, that's perfectly fine.


Valantar said:


> That is a way too simplistic solution to this conundrum. As a 6900XT owner using it on a 1440p60 display, I know just how low that GPU will clock and how efficiently it will run if it doesn't need the power (that 75W figure I gave for Elden Ring isn't too exceptional). I've also run an undervolted, underclocked profile at ~2100MHz which never exceeded 190W no matter what I threw at it. The point being: RDNA2 has no problem clocking down and reducing power if needed. And, to remind you, in the game used for power testing here, the 6900XT matches the performance of the 3080Ti and 3090 at 1440p _while consuming less power_. Despite its higher clocks, even at peak. And, of course, all of these GPUs will reduce their clocks roughly equally, given an equal reduction in the workload. Yet what we're seemingly seeing here is a dramatic difference in said reductions, to the tune of a massive reversal of power efficiency.
> 
> So, while you're right that power consumption and performance scaling are not linear, and that a wide-and-slow GPU will generally be more efficient than a fast-and-narrow one, your application of these principles here ignores a massive variable: architectural and node differences. We know that RDNA2 on TSMC 7nm is more efficient than Ampere on Samsung 8nm, even at ~500MHz higher clocks. This is true pretty much true across the AMD-Nvidia product stacks, though with some fluctuations. And it's not like the 3090Ti is meaningfully wider than a 3090 (the increase in compute resources is _tiny_), and by extension not a 6900XT either. You could argue that the 3080Ti and 3090 are wider than the 6900 XT, and they certainly clock lower - but that runs counter to your argument, as they then ought to be _more_ efficient at peak performance, not less. This tells us that AMD simply has the architecture and node advantage to clock higher yet still win out in terms of efficiency. Thus, there doesn't seem to be any reason why these GPUs wouldn't also clock down and reduce their power to similar degrees, despite their differing starting points. Now, performance scaling per frequency for any single GPU or architecutre isn't entirely linear either, but it is close to linear within the reasonable operating frequency ranges of most GPUs. Meaning that if two GPUs produce ~X performance, one at 2GHz and one at 2.5GHz, the drop in clock speeds needed to reach X/2 performance should be similar, not in MHz but in relative % to their starting frequencies. Not the same, but sufficiently similar for the difference not to matter much. And as power and clock speeds follow each other, even if non-linear, the power drop across the two GPUs should also be similar. Yet here we're seeing one GPU drop _drastically_ more than the other - if we're comparing 3090 to 6900 XT, we're talking a 66% drop vs. a 46% drop. 
That's a rather dramatic difference considering that they started out at the same level of absolute performance.


OK, "simplistic" - you ask a question and you answer it yourself. These are two different architectures; if you want to compare them just by the results, you can. I didn't miss anything. I don't see how you can compare two different architectures, saying one uses more or less power than the other at the same or similar performance. Different nodes, different architectures - obviously that's the case, so the explanation of the difference lies exactly there. There's no point dwelling on it. Obviously they are different, and the difference will show in the results. I focused on the results themselves and performance/consumption. Here is your answer: node difference and architecture difference, since these are completely different products that just have the same goal.


Valantar said:


> One possible explanation: That the Ampere cards are actually _really_ CPU limited at 1080p in CP2077, and would dramatically outperform the 6900XT there if not held back. This would require the same to _not_ be true at 1440p, as the Ampere GPUs run at peak power there, indicating no significant bottleneck elsewhere. This would then require power measurements of the Ampere cards at 1080p without Vsync to check. Another possible explanation is that Nvidia is _drastically_ pushing these cards beyond their efficiency sweet spot in a way AMD isn't - but given the massive clock speeds of RDNA2, that is also unlikely - both architectures seem to be pushed roughly equally (outside of the 3090 Ti, which is ridiculous in this regard). It could also just be some weird architectural quirk, where Ampere is suddenly _drastically_ more efficient below a certain, quite low clock threshold (significantly lower than any of its GPUs clock in regular use). This would require power testing at ever-decreasing clocks to test.
> 
> Either way, these measurements are sufficiently weird to have me curious.


I remember HWUB talking about Nvidia driver overhead or something: the CPU is utilized more at lower resolutions than with AMD's counterpart, and thus the lower results, since the resources are taken. I don't know if that issue has been fixed by Nvidia or not; I know it was brought up by HWUB. Bump the resolution up and you have no, or far less, driver overhead. Also, if they are limited at low res (they obviously are), just as the AMD counterparts are, maybe it is due to the game itself. There was also a mention of the Ampere architecture - mainly the FP32 processing, which some have pointed out has an impact on low-res, high-FPS performance. If I remember correctly.

I was curious about the RT and VRAM insufficiency, hence my question to Wiz. Any conclusions I leave to you, but I did share mine, which you don't need to agree with.


----------



## nguyen (Mar 30, 2022)

Valantar said:


> I'm genuinely curious as to why this is. Looking at CP2077 performance results, sadly these cards seem CPU limited at 1080p, so there isn't much to gather there - though at 1440p the 3090Ti clearly pulls ahead of the 3080Ti, 6900XT and 3090 which are tied. Yet in this power consumption graph the Nvidia cards are closely grouped while the 6900XT is ~40% higher. That strikes me as odd, given that the 6900XT uses less power than a 3090, 3080ti, and even the 3080. I understand that the test scenarios for these measurements aren't the same, but the difference seems very strange to me. The Gaming power consumption numbers are also CP2077, though at 1440p, not 1080p - but in this scenario, the 6900XT delivers essentially identical performance at ~40W less. So how come the situation is so dramatically reversed at 1080p60? @W1zzard, got any thoughts on this?



I guess the Nvidia driver allows the memory clocks to drop when a high VRAM clock is unnecessary.
Here I try to downclock the core, which also drops the VRAM clock to 10 Gbps, compared to 19.5 Gbps stock.






I think the worst offender for Ampere's efficiency is the GDDR6X, not the Samsung 8N fab. The 21 Gbps GDDR6X on the 3090 Ti uses 100-110 W in-game, while the GDDR6 on my 2080 Ti only uses around 30-40 W (probably the same for the 6900 XT).
GPU-Z screenshot from Tweaktown review


----------



## Fluffmeister (Mar 30, 2022)

Valantar said:


> 60% slower? You're looking at a 3050 as an upgrade to a 980 Ti?



The 3070 based on the 4K graph. It's 63% slower.


----------



## Valantar (Mar 30, 2022)

Bzuco said:


> @Valantar
> I don't see that power consumption as so complicated. Each GPU simply has some predefined performance states at certain frequencies, and it depends on the current GPU core usage percentage when the GPU decides to change that state to another one. Both AMD and Nvidia have different percentage/frequency/... ranges for different performance state counts, ...so it is very hard to tell at default conditions whether the card could or could not be more power efficient.
> 
> I would advise taking some static camera angle in a game at 1080p60fps, manually setting some lower performance state, manually locking GPU frequencies to lower values, and stopping this process when the GPU reaches almost 100% core utilization. Then do undervolting and after that check what the power consumption is. Without this procedure we are all totally just guessing.
> ...


Current GPUs don't have anything practically understandable as "predetermined performance states", they have voltage/frequency curves with heaps and heaps of points, and they scale their clock speeds dynamically based on load, thermals, voltages, and a bunch of other factors - and they will clock quite low if the load is low. This is precisely why I'm curious as to these results: from the widely documented characteristics of Ampere and RDNA2 as well as my own experiences, I don't quite understand why Ampere cards would drop their power consumption so much more drastically than equally performing RDNA2 cards in the same scenario. This indicates an unevenness of scaling that seems outside the bounds of normal variations to me. Hence my curiosity.
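For what it's worth, the non-linearity both camps share can be sketched with the standard first-order dynamic power relation P ∝ f·V². This is a toy model with made-up numbers, not a claim about either card; real GPUs add static/leakage power and far more curve points on top:

```python
def rel_dynamic_power(f_rel, v_rel):
    """First-order CMOS dynamic power model: P ~ f * V^2.

    Both arguments are relative to stock (1.0 = stock clock/voltage).
    Illustrative only; real cards add static/leakage power on top.
    """
    return f_rel * v_rel ** 2

# A point lower on the V/F curve: 20% lower clock at 10% lower voltage
# cuts modeled dynamic power by ~35% for an (at most) 20% performance loss.
print(round(rel_dynamic_power(0.80, 0.90), 2))  # -> 0.65
```

The point of the model is exactly the one under discussion: because voltage has to rise with frequency, power along the curve grows much faster than performance, so two architectures with similar-shaped curves should still scale down in broadly similar ways.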


nguyen said:


> I guess the Nvidia driver allows the memory clocks to drop when a high VRAM clock is unnecessary.
> Here I try to downclock the core, which also drops the VRAM clock to 10 Gbps, compared to 19.5 Gbps stock.
> 
> View attachment 241805
> ...


That's really interesting! An intermediate memory speed state could indeed go some way towards explaining that. Also, holy crap, 107W on the memory rail? That's crazy. Though it is 24GB of the stuff, it's still _a lot_ of power. Memory speed scaling is also something where AMD has a long history of struggling - there have been widely reported issues across many different GPUs where memory speed for various reasons gets stuck at peak clocks even on the desktop, leading to very high idle power. It would stand to reason that this could cause behaviour like this. Would love to have someone look into this in that specific test scenario.


ratirt said:


> I'm not talking about the degree to which this is a problem, but rather that it is here already, and obviously in the future it will be more noticeable. These cards are RT capable, and if you pay cash for something, you don't want to be constrained. Saying this card is a 1440p card doesn't mean you must play at that resolution and can't go up and crank down the details to play comfortably. Some people might see that as a huge disadvantage, you know.


No, but I never said that either. But using a "1440p card" to play at 2160p with lowered details also handily avoids these VRAM issues, rendering that point moot. What we're looking at here is at Ultra, after all.


ratirt said:


> Infinite VRAM? I'm talking about the bare minimum to play a game, which apparently these GPUs (3070 and 3070 Ti) would have been capable of in the games I have mentioned. I really don't understand what you are trying to prove.


That what you're describing is a very niche issue, and one likely quite easily overcome by - as you yourself brought up above - lowering settings a bit.


ratirt said:


> That the RT implementation sucks because in different games RT is more demanding?


Nope, I just said that RT performance varies wildly across titles and that extrapolating from a couple of worst-case scenarios is thus a poor predictor of future performance.


ratirt said:


> Ok, simplistic: you ask a question, and you answer it yourself.


Have I answered anything? Not as far as I've seen, at least. I've just explained why your simplistic "scaling isn't linear" statement is insufficient to explain the dramatic differences in scaling between these GPUs.


ratirt said:


> These are two different architectures; if you want to compare them just by the results, you can. I didn't miss anything. I don't see how you can compare two different architectures by saying one uses more or less power than the other at the same or similar performance. Different nodes, different architectures; obviously, that's the case. So the comparison of these two to understand the difference lies there: different node and different architectures. There's no point in dwelling on it. Obviously, they are different, and the difference will be there in the results. I focused on the results themselves and performance/consumption. Here is your answer: node difference and architecture difference, since these are completely different products that just have the same goal.


But that's the thing: even different nodes and architectures _broadly_ follow predictable voltage/frequency curves. They don't match or align, but the curves tend to look similar, have similar shapes. Yet the scaling here seems to indicate quite dramatic differences between overall similarly performing products in relatively similar workloads. This indicates that _something more_ than just simple DVFS variance is happening here. Something like what @nguyen indicated above, for example.


ratirt said:


> I remember HWUB talking about the NV driver overhead or something: the CPU is utilized more at lower resolutions than with AMD's counterpart, and thus the lower results, since the resources are taken. I don't know if that issue has been fixed by Nvidia or not; I know there was an instance brought up by HWUB. Bump the resolution up and you have no, or way less, driver overhead. Also, if they are limited at low res (obviously they are), just as the AMD counterparts are, maybe it is due to the game itself. There was also a mention of the architecture of the Ampere GPUs, mainly the FP32 processing, which some have pointed out has an impact on low-res, high-FPS performance. If I remember correctly.


This is exactly why I was curious - there are tons of possible explanations here. Isn't that an interesting thing to find out? Saying "duh, power and clock speed don't scale linearly" is ... well, not that.


Fluffmeister said:


> The 3070 based on the 4K graph. It's 63% slower.


Ah, you mean the 3090 Ti is 63% _faster_? Relative percentages don't work in both directions, remember. The 3070 is 39% slower than the 3090 Ti (61% of its 100%); the 3090 Ti is 63% faster than the 3070 (163% of its 100%) - it depends what number you start from, and the wording used needs to match the reference point. And comparing upwards from a smaller number always seems more dramatic than comparing downwards from a larger one (if I have $25 and you have $100, you have 300% more money than me, while I have 75% less money than you, etc.), which is why mixing up these two directions of comparison gets troublesome - it effectively overstates (or understates!) the difference. It's still absolutely a significant difference, don't get me wrong. But it's not quite _that_ bad. Saying it's 63% slower implies that if the faster card gets 100fps, the slower will only get 37fps, when the reality is that it will get 61fps. And that's quite a difference. (And yes, I'm well aware I'm being a bit of a pedantic a** right now, sorry!)
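The asymmetry is easy to check with a couple of lines (using the rounded 100 fps vs 61 fps figures as illustrative numbers):

```python
def pct_slower(slow, fast):
    """How much slower `slow` is, taking `fast` as the reference (100%)."""
    return (fast - slow) / fast * 100

def pct_faster(fast, slow):
    """How much faster `fast` is, taking `slow` as the reference (100%)."""
    return (fast - slow) / slow * 100

# 3090 Ti at 100 fps vs 3070 at 61 fps (rounded, illustrative numbers)
print(round(pct_slower(61, 100), 1))  # -> 39.0 (the 3070 is 39% slower)
print(round(pct_faster(100, 61), 1))  # -> 63.9 (the 3090 Ti is ~64% faster)
```

Same pair of numbers, two different denominators, two different percentages.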


----------



## Fluffmeister (Mar 30, 2022)

Yes, that is indeed what I meant. I could afford to buy even this 3090 Ti with cash in the bank, but I, like many, balk at the current prices. Sorry, I got my wires crossed with my initial post too.


----------



## Valantar (Mar 30, 2022)

Fluffmeister said:


> Yes, that is indeed what I meant. I could afford to buy even this 3090 Ti with cash in the bank, but I, like many, balk at the current prices. Sorry, I got my wires crossed with my initial post too.


Understandable  I keep catching myself doing the same type of thing (or just getting confused AF when trying to compare percentages), hence why I noticed it here I guess. And yeah, it's still a travesty even if prices are coming down. I'm rather pessimistic about future outlooks as well - given that most AIB partners have been operating on razor-thin margins for years, I can't avoid thinking they kind of like this new normal, and want to keep it that way. Which says more about the inherent structural flaws of the computer hardware industry than anything else - when Intel, Nvidia and AMD operate on ~40% gross margins while AIB partners struggle to break even on retail GPUs even above MSRP (even if that's a gross v. net margin comparison and thus not equal), that's a problem. But the solution to that problem can't be "GPUs start at $400 and go up way past $1500 now, tough luck". That will erode the entire customer base for PC gaming as a whole. Something has to give, somewhere.


----------



## wheresmycar (Mar 30, 2022)

Valantar said:


> Relative percentages don't work in both directions, remember. The 3070 is 39% slower than the 3090 Ti (61% of its 100%); the 3090 Ti is 63% faster than the 3070 (163% of its 100%) - it depends what number you start from, and the wording used needs to match the reference point. And comparing upwards from a smaller number always seems more dramatic than comparing downwards from a larger one (if I have $25 and you have $100, you have 300% more money than me, while I have 75% less money than you, etc.), which is why mixing up these two directions of comparison gets troublesome - it effectively overstates (or understates!) the difference. It's still absolutely a significant difference, don't get me wrong. But it's not quite _that_ bad. Saying it's 63% slower implies that if the faster card gets 100fps, the slower will only get 37fps, when the reality is that it will get 61fps. And that's quite a difference.



lol I've been (and am an ongoing) victim of this too. Some time back I saved a snapshot of a relative performance chart from TPU and based a bunch of decisions off it... luckily I didn't pull the trigger on anything at its behest. I tend to eventually look at actual FPS returns in specific games at a given resolution before pulling the trigger. I'm glad you pointed out the above, makes so much sense now. 



Valantar said:


> (And yes, I'm well aware I'm being a bit of a pedantic a** right now, sorry!)



A useful pedantic a** lol with a happy and better informed customer here!!


----------



## the54thvoid (Mar 30, 2022)

So, after a full day, you can still buy the FE 3090 Ti from Scan in the UK. £1879.

It states the power draw of the FE card is 350 Watts, with a recommended PSU of 750W. Wish there was a review of the FE version; need to go Google it. 350W isn't as bad as 450W.


----------



## Prima.Vera (Mar 31, 2022)

2500 Euros in Europe and Japan, without VAT?? Seriously, this is getting out of control. We need Intel to break these callous prices with something significant...


----------



## ratirt (Mar 31, 2022)

nguyen said:


> I guess the Nvidia driver allows the memory clocks to drop when a high VRAM clock is unnecessary.
> Here I try to downclock the core, which also drops the VRAM clock to 10 Gbps, compared to 19.5 Gbps stock.


So you are saying that you downclock the core frequency and somehow the memory frequency drops as well?


----------



## Bzuco (Mar 31, 2022)

@Valantar
There is also another reasonable explanation: CP2077 could be designed to utilize shader units more than to sample textures or do rasterization tasks. They have a custom in-house developed real-time global illumination (non-RT), which is maybe half precomputed or fully computed in real time on CUDA cores, I don't know. CDPR was certainly informed that the upcoming generation of Nvidia cards would have doubled shader counts, so why not use the potential for their game (or, better said, adapt their game to the new core-unit ratio)? With double the shader units there is no need to use higher core frequencies at 60Hz.
The scenario for AMD would be: do fewer shader calculations and more texture sampling, thanks to Infinity Cache.

Predefined performance levels in the BIOS of an Asus 2060S (Nvidia):
P0 - 1470MHz + boost, vram max.1750MHz
P2 - max. 825MHz clock, vram max.1700MHz
P3 - max. 825MHz clock, vram max.1250MHz
P5 - max. 645MHz clock, vram max.202MHz
P8 - max. 645MHz clock, vram max.101MHz

I have only ever had Nvidia cards, and the number of those P-states has always been different, same as locked/unlocked voltages/clocks for a certain P-state. Too bad I can't set the VRAM to e.g. 600MHz; forcing the VRAM frequency is somehow ignored when using the official nvidia-smi command-line utility.
Those P-states can be forced, which is good for e.g. video playback, video encoding, minimizing power consumption, etc.

Therefore I am curious whether AMD cards have something similar, and also what the P-states are on 3090/Ti cards.
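For anyone wanting to watch those P-state transitions on an Nvidia card, the current state and clocks can be polled through nvidia-smi's query interface. A rough sketch (requires an Nvidia GPU and driver; the `pstate`, `clocks.sm` and `clocks.mem` query fields are per the nvidia-smi documentation, though availability varies by driver version):

```python
import subprocess

def parse_pstate_line(line):
    """Parse one nvidia-smi CSV line like 'P8, 645, 101' into a dict."""
    pstate, sm, mem = (field.strip() for field in line.split(","))
    return {"pstate": pstate, "sm_mhz": int(sm), "mem_mhz": int(mem)}

def query_pstate(gpu_index=0):
    """Ask nvidia-smi for the current performance state and clocks."""
    out = subprocess.run(
        ["nvidia-smi", f"--id={gpu_index}",
         "--query-gpu=pstate,clocks.sm,clocks.mem",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_pstate_line(out.strip().splitlines()[0])
```

Newer drivers also expose `nvidia-smi -lgc`/`-lmc` to lock core/memory clocks, though, as noted above, forced memory clocks are ignored on some cards.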


----------



## ratirt (Mar 31, 2022)

Valantar said:


> No, but I never said that either. But using a "1440p card" to play at 2160p with lowered details also handily avoids these VRAM issues, rendering that point moot. What we're looking at here is at Ultra, after all.


You suggested that the card is designed to play at 1440p as a counter to my point that it can't play 4K with RT on.


Valantar said:


> But that's the thing: even different nodes and architectures _broadly_ follow predictable voltage/frequency curves. They don't match or align, but the curves tend to look similar, have similar shapes. Yet the scaling here seems to indicate quite dramatic differences between overall similarly performing products in relatively similar workloads. This indicates that _something more _than just simple DVFS variance is happening here. Something like what @nguyen indicated above, for example.


Well, these are two different architectures. Maybe I don't understand exactly. I tried, as you said, the 'simple solution', the architecture and node differences, but you were not satisfied with it.


Valantar said:


> This is exactly why I was curious - there are tons of possible explanations here. Isn't that an interesting thing to find out? Saying "duh, power and clock speed don't scale linearly" is ... well, not that.


Well, they don't scale linearly. There can be other issues, but obviously there is a huge correlation between power and clock. If you want to eliminate, for instance, the driver overhead while testing, simply switch to 4K and the problem is gone. Then start tweaking the card and you will find out the power-to-clock relations for sure. There might be tons of factors, and all of them may affect the performance. As usual, these are only valid in certain scenarios. Power to clock: no matter what you do, you will have that relation.


----------



## nguyen (Mar 31, 2022)

ratirt said:


> So you are saying that, you downclock the Core frequency and somehow the Memory frequency drops as well?



The driver will decide whether the VRAM runs at full speed or 10 Gbps depending on graphical demand (unless the "Prefer Maximum Performance" option is on). The Max FPS option can save a lot of unnecessary power, since the driver will downclock as appropriate.

For example running Dota2 1440p 120hz




vs 4K 120hz




VRAM power consumption goes from 69W at full speed down to 47W at 10 Gbps.
So yeah, when you don't need maximum performance, these Ampere GPUs can be quite efficient.


----------



## W1zzard (Mar 31, 2022)

nguyen said:


> 1440p 120hz





nguyen said:


> 4K 120hz


If you set your desktop to these resolutions, you're getting the same memory clocks as in dota 2?

The card thinks it's idle, and will clock down to its lowest performance state


----------



## nguyen (Mar 31, 2022)

W1zzard said:


> If you set your desktop to these resolutions, you're getting the same memory clocks as in dota 2?
> 
> The card thinks it's idle, and will clock down to its lowest performance state



this is desktop idle 4K 120hz




Dota2 4k120hz vs 1440p120hz


----------



## ratirt (Mar 31, 2022)

nguyen said:


> The driver will decide whether the VRAM runs at full speed or 10 Gbps depending on graphical demand (unless the "Prefer Maximum Performance" option is on). The Max FPS option can save a lot of unnecessary power, since the driver will downclock as appropriate.
> 
> For example running Dota2 1440p 120hz
> View attachment 241895
> ...


Well, how I see it: if you limit FPS, the driver will lock the resources to save power. The card will use the resources necessary to sustain the 120 FPS limit you have set. So no wonder the power drops when the frequency of the memory drops as well. It does make sense.


----------



## wolf (Apr 3, 2022)

nguyen said:


> I think the worst offender for Ampere's efficiency is the GDDR6X, not the Samsung 8N fab


Very much so, even on my 3080 I see 80+w to the memory often. When I'm undervolted pulling roughly 225w board power, the GPU is 75-100w, it's really not that bad, but the memory and circuitry make up the rest.

Honestly, if people are worried about the RTX 40 series being 600w monsters, they have a very short memory. Halo cards from many past generations have pulled crazy power, and yeah, the halo cards next gen probably will too, but it doesn't mean an RTX 4060 will want 450w. Get your expectations in check.

GTX 295 - 2009 - 289w
GTX 590 - 2011 - 365w
R9 295X2 - 2014 - 500w

Silly halo products for people with more money than sense have been creeping power consumption for bad gains for many years. The only difference now is dual GPU is dead. If you find yourself put off by the price or power consumption of the 3090Ti, this product was never going to be for you in the first place.


----------



## Valantar (Apr 3, 2022)

wolf said:


> Very much so, even on my 2080 I see 80+w to the memory often. When I'm undervolted pulling roughly 225w board power, the GPU is 75-100w, it's really not that bad, but the memory and circuitry make up the rest.


But the 2080 has GDDR6, same as AMD's RDNA2 cards, not GDDR6X - if anything, your GDDR6 pulling that much power is an argument _against_ the GDDR6X being the main reason for Ampere's relatively poor efficiency. That would after all indicate similar power draws for AMD's VRAM.


wolf said:


> Honestly if people are worried about RTX 40 series being 600w monsters they have a very short memory. Halo cards from many past generations have pulled crazy power, and yeah, the halo cards next gen probably will too, but it doesn't mean a RTX 4060 will want 450w. Get your expectations in check.
> 
> GTX 295 - 2009 - 289w
> GTX 590 - 2011 - 365w
> R9 295X2 - 2014- 500w


Well, that latter card is a dual-GPU card, so effectively two GPUs in one, and the next GPU down pulled ~300W. Craziness is kind of expected in that regard. And please note that the two others you listed are _lower_ power than _current_ flagships, let alone the rumored 4000 series. Yes, there have always been stupidly power-hungry halo cards, even if not every generation. But we've never before been in a situation where it's rumored that even upper-midrange cards will exceed 300W. That is completely unprecedented.


wolf said:


> Silly halo products for people with more money than sense have been creeping power consumption for bad gains for many years. The only difference now is dual GPU is dead. If you find yourself put off by the price or power consumption of the 3090Ti, this product was never going to be for you in the first place.


That still isn't an argument against noting that these ballooning power figures are worrying. Dual GPU is inherently inefficient, after all, yet now we're seeing single GPUs reach the power levels of dual GPUs previously. Of course the GPU market is much, much larger and has a lot more wealthy whale customers than previously, as well as the ever growing enterprise markets that gobble up as much performance as they can get, but it's still worrying when the design trend moves from "peak performance in X power envelope" (which has typically been ~300W) to "peak performance, power be damned".

IMO, what we're seeing is the effects of a highly competitive market under massive shareholder pressure to maintain an unrealistic level of growth now that silicon manufacturing gains are slowing and architectures are ever more advanced. If shareholders have seen X% growth once, they expect - nay, demand - that to continue, and as companies are legally obligated to appease shareholders (or risk lawsuits), we see designs increasingly move towards the ridiculous. Of course enthusiasts expecting continued massive performance increases isn't helping either.


----------



## wolf (Apr 3, 2022)

Valantar said:


> But the 2080 has GDDR6


Typo, I meant 3080, in agreement with nguyen and as my sig shows.


Valantar said:


> Craziness is kind of expected in that regard.





Valantar said:


> That still isn't an argument against noting that these ballooning power figures are worrying


Yeah, crazy is expected for halo products. And if Nvidia and AMD alike have learned anything, it's that there are people who will buy these increasingly mental products.

Performance per watt will still go up, and you will still be able to pick an upgrade that fits a power budget that is reasonable to you. Silly halo products may well push beyond 450w, but that doesn't mean there won't be a whole stack of more-efficient-than-previous-gen parts too. This outrage is such a nothingburger.


----------



## nguyen (Apr 3, 2022)

wolf said:


> Typo, I meant 3080, in agreement with nguyen and as my sig shows.
> 
> 
> Yeah, crazy is expected for halo products. And if Nvidia and AMD alike have learned anything, it's that there are people who will buy these increasingly mental products.
> ...



The 3090 Ti has a major problem: the people who don't care about prices and only want the best GPU already got the 3090 some 18 months ago and put it in a WC loop, and a watercooled + OCed 3090 is gonna perform the same as a stock 3090 Ti anyway.

Besides serving as a prototype for the 4090, I don't see the point of the 3090 Ti either.

I could see the point of a 450W+ next-gen GPU if the uarch efficiency is maintained; for example, if the 4090 had the same FPS/watt as the 4070 (GPU power consumption only), a PC with a 4090 would actually have better perf/watt than one with a 4070 once whole-system power consumption is taken into account. Yes, the PC with the 4090 will use more power, but seriously, it's just chump change.

I bought a watt-meter for fun, and so far my PC (which uses ~550W at the wall when gaming; I average 4h/day of gaming) barely costs me 9 USD in electricity a month (0.13 USD per kWh here). If I swapped out the 3090 for a 3090 Ti it would cost me ~11 USD a month in electricity, so yeah, the power consumption outrage is just a nothingburger.
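The math above checks out. A quick sketch, assuming a 30-day month, the figures quoted in the post, and (an assumption on my part) that a 3090 Ti would add roughly 100W to the wall draw:

```python
def monthly_cost_usd(watts, hours_per_day, usd_per_kwh, days=30):
    """Monthly electricity cost of a constant load while in use."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * usd_per_kwh

# ~550 W at the wall, 4 h/day gaming, $0.13/kWh
print(round(monthly_cost_usd(550, 4, 0.13), 2))  # -> 8.58
# Assuming a 3090 Ti adds roughly 100 W to the wall draw
print(round(monthly_cost_usd(650, 4, 0.13), 2))  # -> 10.14
```

Both results line up with the ~$9 and ~$11 figures quoted.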


----------

