# ASUS GeForce RTX 3090 STRIX OC



## W1zzard (Sep 24, 2020)

The ASUS GeForce RTX 3090 STRIX OC is the fastest RTX 3090 we have tested today, by quite a big margin. It also has a huge power limit adjustment range that maxes out at 480 W! We hence added a whole test run at 480 W to our review to see how much extra headroom RTX 3090 Ampere has left and whether the card would blow up.
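The diminishing returns of that extra headroom are easy to put in rough numbers. A minimal sketch, assuming a 390 W default board power limit (an assumption for illustration; only the 480 W cap and the ~2% 4K gain come from the review):

```python
# Rough sketch of the power-vs-performance trade-off described above.
# base_power_w is an assumed default board power limit, not a figure
# from the review; the 480 W cap and ~2% 4K gain are from the text.
base_power_w = 390.0
max_power_w = 480.0
perf_gain = 0.02

extra_power = (max_power_w - base_power_w) / base_power_w
print(f"+{extra_power:.0%} power for +{perf_gain:.0%} performance at 4K")
```

Under those assumptions, roughly +23% power buys about +2% performance, which is why the 480 W run is more of a curiosity than a daily setting.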

*Show full review*


----------



## SIGSEGV (Sep 24, 2020)

One really short word: *unbelievable*.

Thanks for the review.


----------



## boulard83 (Sep 24, 2020)

Nothing about the fact that it's the worst $/perf in the "Pros/Cons"?

I'd buy a 3080 and another 3080 for a friend before buying a single 3090. 

My $0.02. (Or my ~$2,000 Canadian.)


----------



## W1zzard (Sep 24, 2020)

boulard83 said:


> that it's the worst $/perf in the "pros/cons"


"Very high price" ?


----------



## boulard83 (Sep 24, 2020)

W1zzard said:


> "Very high price" ?



well, that's something


----------



## gridracedriver (Sep 24, 2020)

With the Turing GPUs there were the 500 W BIOS mods; with Ampere there's a 480 W stock BIOS from NVIDIA... loool


----------



## ZoneDymo (Sep 24, 2020)

"Highly Recommended", now that is just hilarious, you guys.
Power consumption is insane, barely any better performance than the 3080, which is also ridiculously power hungry... I honestly fail to see the tech progress; maybe this is just Nvidia's equivalent to Intel post-6700K... well, apart from that oh-so-important ray tracing.

And then that price tag... good lord.


----------



## TheLostSwede (Sep 24, 2020)

I guess the rumours of $1,800-2,000 cards were true after all...


----------



## birdie (Sep 24, 2020)

> At a starting price of $1500, it is more than twice as expensive as the RTX 3080, but not nearly twice as fast.



Let me disagree with you here, @W1zzard

This is a Titan class card which used to cost over $2500. Now you're getting a massively faster GPU and even more VRAM at a much lower price.

This is not a gaming card unless you have more money than common sense. It's a productivity card which can be used for gaming.


----------



## okbuddy (Sep 24, 2020)

no side blower FE review? the new 12 pin power!


----------



## ratirt (Sep 24, 2020)

birdie said:


> Let me disagree with you here, @W1zzard
> 
> This is a Titan class card which used to cost over $2500. Now you're getting a massively faster GPU and even more VRAM at a much lower price.
> 
> This is not a gaming card unless you have more money than common sense. It's a productivity card which can be used for gaming.


What makes you think 3090 is Titan class GPU? Just curious.


----------



## TheoneandonlyMrK (Sep 24, 2020)

Aha ,the switches are out.


----------



## P4-630 (Sep 24, 2020)

@W1zzard, when will we see a review of the ASUS GeForce RTX *3080* STRIX OC ?


----------



## deu (Sep 24, 2020)

I was never going to buy one, but 19% at double the price; damn! The 2080 Ti looks like a bargain if you compare the 2080->2080 Ti difference with 3080->3090. The whales left must be superwhales not to choose a 3080 instead :0


----------



## birdie (Sep 24, 2020)

ZoneDymo said:


> Highly Recommended, now that is just hilarious you guys.
> Power consumption is insane, barely any better performance then the 3080 which is also rediculously power hungry...I honestly fail to see the tech progress, maybe this is just Nvidia's equivalent to Intel post 6700k....well apart from that oh so important ray tracing.
> 
> and then that price tag...good lord.



This card (just like I predicted) is a lot more power efficient than the RTX 3080:







So, your criticism is not totally sincere as the RTX 3080 must be an even worse card in your opinion. Also, NVIDIA has a higher performance per watt ratio for this gen vs the previous gen cards, so your comparison to Intel isn't valid.

As for me personally, I never buy GPUs with a TDP above 150 W (or CPUs above 95 W) because I simply don't want that much heat under my desk, so I guess I will skip the entire GeForce 30 series, because the RTX 3060 has been rumored to have a TDP around 180-200 W.

"Good lord" is misplaced. Again, this is a Titan class card. Either use it appropriately or forget about it - it does not justify its cost purely from a gaming perspective.


----------



## SystemMechanic (Sep 24, 2020)

So no Strix TOP model? Also, what is the BIOS power limit? Sorry if I missed it.


----------



## deu (Sep 24, 2020)

birdie said:


> Let me disagree with you here, @W1zzard
> 
> This is a Titan class card which used to cost over $2500. Now you're getting a massively faster GPU and even more VRAM at a much lower price.
> 
> This is not a gaming card unless you have more money than common sense. It's a productivity card which can be used for gaming.


The naming says otherwise to the customers, IMO: 3090 (and not Titan), in line with their older x90 naming scheme. Your analysis of use is correct (I agree the card only makes sense for content creators), but it is positioned more as a gaming card than a content card, at least in the press I have seen. Granted, I have not seen everything, but I assume I have seen more than the average Joe.


----------



## SIGSEGV (Sep 24, 2020)

birdie said:


> Let me disagree with you here, @W1zzard
> 
> This is a Titan class card which used to cost over $2500. Now you're getting a massively faster GPU and even more VRAM at a much lower price.
> 
> This is not a gaming card unless you have more money than common sense. It's a productivity card which can be used for gaming.




oh shit, here we go again.


----------



## asdkj1740 (Sep 24, 2020)

A bit disappointed in the cooling performance at stock.


----------



## Deleted member 24505 (Sep 24, 2020)

certainly *NOT* worth twice a 3080


----------



## phill (Sep 24, 2020)

So no one will be buying two for SLI then??


----------



## kapone32 (Sep 24, 2020)

phill said:


> So no one will be buying two for SLI then??


You know we will be getting Youtube videos with 3090 SLI performance. So you are probably right.


----------



## scooze (Sep 24, 2020)

ratirt said:


> What makes you think 3090 is Titan class GPU? Just curious.


The Titan RTX name is already occupied. What should it have been called? TITAN RTX SUPER?


----------



## Frick (Sep 24, 2020)

birdie said:


> Let me disagree with you here, @W1zzard
> 
> This is a Titan class card which used to cost over $2500. Now you're getting a massively faster GPU and even more VRAM at a much lower price.
> 
> This is not a gaming card unless you have more money than common sense. It's a productivity card which can be used for gaming.



This card would have been much better if it had carried the Titan name. It's not strange people think of it as a gaming card, because it's named like a gaming card.


----------



## kapone32 (Sep 24, 2020)

Frick said:


> This card would have been much better if it had carried the Titan name. It's not strange people think of it as a gaming card, because it's named like a gaming card.


And marketed as such by Jensen himself as well.


----------



## TheEmptyCrazyHead (Sep 24, 2020)

@W1zzard


> which comes at more affordable *prizing *


 
everything as expected with the 3090...continues eating popcorn


----------



## ZoneDymo (Sep 24, 2020)

birdie said:


> This card (just like I predicted) is a lot more power efficient than the RTX 3080:
> 
> So, your criticism is not totally sincere as the RTX 3080 must be an even worse card in your opinion. Also, NVIDIA has a higher performance per watt ratio for this gen vs the previous gen cards, so your comparison to Intel isn't valid.
> 
> ...



Well, I do not agree with your assessment:
First, you cherry-pick 4K, while at 1080p it's slap-bang in the middle of the pack.
Secondly, the 3080, if we are being nice, does 150 fps where the 2080 does 100 fps, while drawing 300-ish watts vs the 2080's 200-ish watts.
So to me it does do better, but requires more power to accomplish this feat, so to me it's not worth much more than a power-unlocked, overclocked 2080, and that to me is not really technological progress.

Personally I am not at all impressed with these cards; the progress is minimal and the prices are still bonkers. All cards should drop at least 200 bucks to bring things back to somewhat normal...

But apart from all of that, the criticism was more aimed at TPU's eternally annoying conclusion remarks: "Highly Recommended" for an 1800-dollar video card... and I can guarantee nobody on the TPU staff will actually buy one of these.


----------



## Animalpak (Sep 24, 2020)

There have always been Ti cards, so this may not be the end of Nvidia's high-end lineup.

Till that moment I'm keeping my 2080 Ti.


----------



## dj-electric (Sep 24, 2020)

I'm just sitting here "holy fu$#balls, W1zz was able to get 4 of those... damn..."


----------



## Solid State Soul ( SSS ) (Sep 24, 2020)

I noticed the VRAM cooling isn't as complex as on the RTX 3080 TUF.


----------



## birdie (Sep 24, 2020)

ZoneDymo said:


> Secondly the 3080 if we are being nice does 150 fps where the 2080 does 100 fps while drawing 300ish watts vs the 2080's 200ish watts.



Sorry, this is not a 1080p card, not by a long shot. If you're here to argue about this card's merits at such a low resolution, I don't know what else to say, because I don't have any polite words in my lexicon.

What's next, you're going to argue that an excavator is not a good way of planting seeds? Or maybe you need to play your favourite game at 600fps? Why?


----------



## ReallyBro (Sep 24, 2020)

birdie said:


> Let me disagree with you here, @W1zzard
> 
> This is a Titan class card which used to cost over $2500. Now you're getting a massively faster GPU and even more VRAM at a much lower price.
> 
> This is not a gaming card unless you have more money than common sense. It's a productivity card which can be used for gaming.


Right.... only that Nvidia named it like a gaming card, and the entire marketing around it was as a gaming card, the "8K GPU" (Nvidia marketing at its best yet).







The only reason you should buy this card is for its VRAM


----------



## mouacyk (Sep 24, 2020)

tigger said:


> certainly *NOT* worth twice a 3080


Certainly not for general gaming or even compute usage.  However, it returns more than double the performance in specific workloads if you need it now and cannot wait for RTX 3080 20GB or RTX 3070 16GB. Most demo reels for marketing are likely to fall into the large scene categories.


----------



## howiec (Sep 24, 2020)

The *3090 is a 1080p *card for those like me who want to *sustain high FPS on a 360Hz monitor*. 

It's definitely expensive though considering you can get _two 3080s for the same price_...

The 3080 at ~$760 is actually decent pricing because it's basically the *2080 Ti high-end (non-Titan) replacement*, which cost around $1,300 when it first debuted.
Sux that we can't do 3080 SLI though.


----------



## mouacyk (Sep 24, 2020)

howiec said:


> The *3090 is a 1080p *card for those like me who want to *sustain high FPS on a 360Hz monitor*.
> 
> It's definitely expensive though considering you can get _two 3080s for the same price_...
> 
> ...


In case you don't know yet, Ampere (and future NVidia GPUs) will no longer do implicit multi-GPU SLI (which is through driver profiles.)


----------



## ReallyBro (Sep 24, 2020)

howiec said:


> The *3090 is a 1080p *card for those like me who want to *sustain high FPS on a 360Hz monitor*.
> 
> It's definitely expensive though considering you can get _two 3080s for the same price_...
> 
> ...


Not really.... 360 Hz makes sense for competitive games that are built for extremely high FPS; the 3080 can easily max all of these types of games.
AAA games at 360 Hz make no sense. These games are not built around such framerates; many have inherent bottlenecks within the engine and other hardware bottlenecks.

The 3090 is only 7% faster than the 3080 on average at 1080p:







The only reason to get a 3090 is if you actually need 24GBs.


----------



## DuxCro (Sep 24, 2020)

Now to patiently wait and see what AMD comes up with.


----------



## r9 (Sep 24, 2020)

Too expensive ? Go buy Radeon ... Oh they don't have one ? ... Too bad soo sad. -NVIDIA


----------



## howiec (Sep 24, 2020)

ReallyBro said:


> Not really.... 360Hz makes sense for competitive games that are built for extremely high FPS. the 3080 can easily max all these types of games.
> AAA games at 360Hz makes no sense. these games are not built around these framerates, many have inherent bottlenecks within the engine and other hardware bottlenecks.
> 
> The 3090 is only 7% avg faster than 3080 in 1080P:
> ...


Sure, I'm only expecting around 10% more performance over a 3080, but it's not just the average FPS that I care about. I care about the *minimum FPS* just as much.

A 3080 should easily maintain above 360 fps in graphically simple games like *CSGO or Valorant*, but in other games like *Apex Legends* I want to ensure that I can *maximize the minimum FPS*, which a 3090 will do better at.
In Apex Legends, even with all graphics settings on low @ 1680x1050, my 2080 Ti will drop from the 180 fps cap to around 130 in different environments, which leads to fluctuating mouse sensitivity/feel. Clearly this can be a huge problem.

Again, for the average gamer, of course a 3080 is better value. But for those who want better average *and minimum *FPS/performance even at 1080p @ 360Hz, a 3090 will still be beneficial.... for a price.


----------



## EarthDog (Sep 24, 2020)

ratirt said:


> What makes you think 3090 is Titan class GPU? Just curious.


Dude... this is AT LEAST the third time I've said this to YOU.............. and in this forum more times than that. How this is still unclear to you bud is beyond me. 

1. In the video, Jensen talks Titan, then in the same breath whips out the RTX 3090 from his oven.
2. In the same video, slides state the 3080 is the 'flagship'.


Frick said:


> This card would have been much better if it had carried the Titan name. It's not strange people think of it as a gaming card, because it's named like a gaming card.


this..This...THIS!!!! OMG! Why they went this route I'll never know, but it sure does wad some panties.... lol



ZoneDymo said:


> First you cherry pick 4k while at 1080p its slap dead in the middle of the pack.


What mook buys this card for 1080p? Seriously. While you're at it, make sure Granny doesn't go buy a Lambo...


----------



## howiec (Sep 24, 2020)

EarthDog said:


> What mook buys this card for 1080p? Seriously. While you're at it, make sure Granny doesn't go buy a Lambo...


Those who know they can benefit from it in their specific cases or who have $, etc.



mouacyk said:


> In case you don't know yet, Ampere (and future NVidia GPUs) will no longer do implicit multi-GPU SLI (which is through driver profiles.)


Yeah, that's why I said it sux. I doubt many games will go through the hassle of inherently supporting dual 3090s either.


----------



## EarthDog (Sep 24, 2020)

howiec said:


> Those who know they can benefit from it in their specific cases or who have $, etc.


lol, ya don't say?   

Come on man, zoom out a bit and think about that.


----------



## howiec (Sep 24, 2020)

EarthDog said:


> lol, ya don't say?
> 
> Come on man, zoom out a bit and think about that.


In my case both apply. I have the $ AND for reasons listed above as just 1 example, it will make a noticeable difference. Hell, I'd venture to say even a 3090 may not be sufficient.

For example, assuming the 3090 offers ~36% better performance over the 2080 Ti @ 1080p for *both minimum* and avg fps (likely untrue), and my 2080 Ti drops to 130 fps @ 1680x1050 at times, then 1.36 × 130 = *177 fps* (yes, I know, different res). Surely the 3090 should do better than that, but that *won't be nearly enough to utilize most of my upcoming 360 Hz monitor*.

The 3090 would have to offer *177% more performance over the 2080 Ti (vs. just 36%)* to keep a minimum of 360 fps in Apex Legends @ 1680x1050 (again, we're using these numbers as examples that won't correlate exactly to minimum-FPS scaling).

So yeah, for those of us who know what they want and why they need it, a 3090 makes sense.

There are plenty out there who don't know and just have $ to burn to get max performance. Nothing wrong with that either.


----------



## mouacyk (Sep 24, 2020)

If you trick out RTX at native 1080p with shadows, reflections, and global illumination, I wouldn't be surprised if this card could only handle it at 60 fps. Mind you, that's still just 1 bounce per pixel, with de-noising to smooth out light gaps. DLSS isn't free -- it at least costs time for training to ensure no artifacts for a given game, and I'm sure NVidia charges for this training. Full-fidelity path-traced graphics isn't as cheap as it's been made out to be.

For those who want 1080p at 360 Hz, isn't it better to just lower the details on a mid-range high-frequency GPU, since they only care about the competitive advantage? Having RTX and all sorts of other details in the scene just makes it harder to discern the opponent anyway.


----------



## EarthDog (Sep 24, 2020)

Ahh yes.. a one off. Point taken, @howiec.


----------



## c2DDragon (Sep 24, 2020)

There is no TITAN name on these so nVidia can sell more cards to gamers, I guess.
Now I'm sad; I was hoping it would perform a lot better than the 3080.
Time to wait for an eventual 3080 Ti now.
Thanks for the reviews!

PS : In case you need a Titan :


----------



## TheoneandonlyMrK (Sep 24, 2020)

EarthDog said:


> Dude... this is AT LEAST the third time I've said this to YOU.............. and in this forum more times than that. How this is still unclear to you bud is beyond me.
> 
> 1. In the video, Jensen talks Titan, then in the same breath whips out the RTX 3090 from his oven.
> 2. In the same video, slides state the 3080 is the 'flagship'.
> ...


Nvidia recommends a Titan over the 3090 for pro use cases; the 3090 is driver-gimped to be no better than a Titan in the likes of SPECviewperf and other pro use cases.
It's a gamer card, not a pro-user card.


----------



## EarthDog (Sep 24, 2020)

theoneandonlymrk said:


> Nvidia recommend a Titan over the 3090 for pro use cases, the 3090 is driver gimped to be no better than a Titan in the likes of specview and pro use cases.
> It's a Gamer card, not a pro user card.


It really depends on the use case where a Titan would benefit.


----------



## Traladingdong (Sep 24, 2020)

Preordering the Strix 3080 after seeing how fricking loud the Strix is on the 3090 now feels like a huge mistake. The MSI X TRIO has the same fan load noise on both the 3080 and 3090.
Damn, guess I'll cancel the order and go with a TUF/X Trio.


----------



## howiec (Sep 24, 2020)

EarthDog said:


> Ahh yes.. a one off. Point taken, @howiec.


I doubt Apex Legends is the only MP game that has large FPS fluctuations @1080p even at medium to low settings, and upcoming games will generally be more demanding.

Not to mention 360Hz will only become more common.


----------



## umano (Sep 24, 2020)

480 W, it is insane


----------



## howiec (Sep 24, 2020)

mouacyk said:


> If you trick out RTX at native 1080p with shadows, reflections, and global illumination, I wouldn't surprised if this card could only handle it at 60fps.  Mind you, that's still just 1 bounce per pixel, with de-noising to smooth out light gaps.  DLSS isn't free -- it at least costs time for training to ensure no artifacts for a given game, but I'm sure NVidia charges for this training. Full fidelity path-tracing graphics isn't as cheap as it's been made out to be.
> 
> For those who want 1080p at 360Hz, isn't it better to just lower the details on a mid-range high-frequency GPU, since they only care about the competitive advantage?  Having RTX and all sorts of other details in the scene just make it hard to discern the opponent anyway.


That's the problem. In modern games, especially BRs with large open environments with a lot of structures, even at minimum graphics settings at 1080p, the FPS drops (*fluctuates*) well below 240fps. Forget 360fps and fully utilizing 360Hz even on a 3090.

However, obviously 360Hz offers better input lag reduction even at lower FPS due to faster scanout which is why I'm still buying one.


----------



## CrAsHnBuRnXp (Sep 24, 2020)

I would absolutely love to get my hands on this card, but with it being $1800, it's going to be really hard for me to nab it up. Might have to wait for the possible 3080 20GB (Ti?) to come out and see the performance. 

Also, 





> We also did a test run with the power limit maximized to the 480 W setting that ASUS provides. 480 W is much higher than anything else availalable on any other RTX 3090, so I wondered how much more performance can we get. At 4K resolution it's another 2%, which isn't that much, but it depends on the game, too. Only games that hit the power limit very early due to their rendering design can benefit from the *addded *power headroom.



Got a typo there in the conclusion.

Also, I'd love to see WoW added back into reviews.


----------



## AddSub (Sep 24, 2020)

Ouuufffff! Might be worth getting just to see what it does in legacy benchmarks like 3DMark 01 and 03, combined with say a 10900K in the 5.3-5.4 GHz range on AIO. A non-exotic way to grab some HWBot records.

Oh yeah, new games, and all that other stuff in the review. Awesome tech! I guess... Not gonna lie, I skipped all those game tests since I literally do NOT own any of the new stuff. (Does No Man's Sky count, circa 2016? )

But yeah, crazy fillrate! A definitive HWBot record taker as far as the benchmarking and overclocking community is concerned. If you are buying this to play Fortnite/PUBG or something, a $60 eBay RX 580 8GB can do that, and has been able to for about 2-3 years now.

...
..
.


----------



## EarthDog (Sep 24, 2020)

howiec said:


> I doubt Apex Legends is the only MP game that has large FPS fluctuations @1080p even at medium to low settings, and upcoming games will generally be more demanding.
> 
> Not to mention 360Hz will only become more common.


All games have significant fluctuations. MP can be worse... but that isn't dependent on the GPU really. That's the nature of MP gaming man.

360Hz will become more common, indeed. By that time, the next gen cards will be out. Like 4K, any decent 360 Hz monitor is quite pricey and overkill for anyone who doesn't have F U money or plays competitively where that stuff matters.

You're fighting for the <1%.


----------



## TheoneandonlyMrK (Sep 24, 2020)

EarthDog said:


> It really depends on the use case where a Titan would benefit.



Everyone including Nvidia is saying it's an 8K gaming card; I agree it will be useful to some, but it isn't a Titan.
Surely you agree with the below; I don't mind greedy cards, but that's a lot of heat thrown into any room.


umano said:


> 480w it is insane



I can get close to that with effort and such for benching (not doing this anymore, though), but I think it's a tad high for 24/7.


----------



## chodaboy19 (Sep 24, 2020)

I like that Nvidia built this BFGPU and put it out there.  It just showcases their top tier product and the potential of their technology.

Kind of like cars, M3 vs. 3-series. You pay for those infinitesimal improvements.


----------



## dalekdukesboy (Sep 24, 2020)

birdie said:


> This card (just like I predicted) is a lot more power efficient than the RTX 3080:
> 
> 
> 
> ...



Are you kidding me? It's a "whole" 7% more efficient than the 3080, which is VERY little, and even if it were 20% more efficient, with the ridiculous price difference it'd take a lifetime to make back the initial cost, which is huge no matter how you slice it. Even if it is a Titan, I'd be interested in seeing how much better it performs in business applications, etc.; I doubt it's even worth it in that scenario.
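The "take a lifetime" claim holds up under even generous numbers. A minimal sketch where every figure (price premium, watts saved, electricity rate) is an illustrative assumption, not a measurement:

```python
# Back-of-the-envelope payback period for a small efficiency edge.
# Every figure here is an illustrative assumption, not a measurement.
price_premium_usd = 800.0        # assumed 3090 premium over a 3080
watts_saved = 25.0               # ~7% of a ~350 W gaming draw
usd_per_kwh = 0.13               # assumed electricity rate

savings_per_hour = (watts_saved / 1000.0) * usd_per_kwh
hours_to_break_even = price_premium_usd / savings_per_hour
years_at_4h_per_day = hours_to_break_even / (4 * 365)
print(f"~{hours_to_break_even:,.0f} h of gaming "
      f"(~{years_at_4h_per_day:.0f} years at 4 h/day)")
```

With those assumptions it comes out to well over a century of gaming before the electricity savings cover the price gap, so efficiency alone can't justify the premium.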


----------



## howiec (Sep 24, 2020)

EarthDog said:


> All games have significant fluctuations. MP can be worse... but that isn't dependent on the GPU really. That's the nature of MP gaming man.
> 
> 360Hz will become more common, indeed. By that time, the next gen cards will be out. Like 4K, any decent 360 Hz monitor is quite pricey and overkill for anyone who doesn't have F U money or plays competitively where that stuff matters.
> 
> You're fighting for the <1%.


Nah, I'm just saying people shouldn't make blanket statements without acknowledging the benefits that most people don't realize can actually be very noticeable. We all acknowledge the price/performance value statement of a 3080 over a 3090. However, I do not agree with anyone saying the 3090 makes no noticeable difference at 1080p.
360Hz is supposed to be available starting end of this month. Any serious competitive gamer will want one.


----------



## ZoneDymo (Sep 24, 2020)

birdie said:


> Sorry, this is not a 1080p card, not by a long shot. If you're here to argue about this card merits for this low resolution I don't know what else to say because I don't have any polite words in my lexicon.
> 
> What's next, you're going to argue that an excavator is not a good way of planting seeds? Or maybe you need to play your favourite game at 600fps? Why?



Well, no need to get upset, it's just a video card; let's look at the facts and not made-up stats, shall we?
Nvidia themselves pushed for the new 360 Hz screens, so high refresh-rate/framerate gaming is definitely something they are going for. Now let's look at this review, shall we?

Average FPS at 1080p with this card is about 200 fps... so we are not even close to touching that newly pushed 360 Hz standard, and not even remotely close to your made-up 600 fps...

sooo yeah.


----------



## laszlo (Sep 24, 2020)

Solid card; too bad it's a "scalper" edition also.


----------



## birdie (Sep 24, 2020)

ZoneDymo said:


> Well no need to get upset, its just a videocard, lets look at the facts and not made up stats shall we?
> Nvidia themselves pushed for new 360hz screens, nr1 so high refreshrate/framerate gaming is definitly something they are going for, now lets look at this review shall we?
> 
> Average FPS at 1080p with this card is about 200fps.....so yeah we are not even close to touching that new pushed 360hz standard and not even remotely close to your made up 600fps...
> ...



So, NVIDIA is to be blamed for extremely poorly optimized games, which are often limited by CPU performance? AMD fans never fail to disappoint with the utmost disrespect towards intelligence and logic.

And according to you games under RDNA 2.0 will magically scale better. LMAO.

sooo yeah.

Here's some harsh reality for you:












Tell me exactly how NVIDIA is supposed to fix this suckfest.


----------



## ShurikN (Sep 24, 2020)

There are diminishing returns and then there's this


Truly the champ of wasted money.


----------



## Tisumi (Sep 24, 2020)

I've ordered one and hopefully will receive it before the Cyberpunk 2077 release (I am from Italy).
What to say; it will probably end up being the fastest 3090 together with the Aorus Xtreme and Zotac AMP Extreme. The 3080's $/perf is for sure better, but you're paying for the fact that this card is the fastest GPU on Earth.


----------



## EarthDog (Sep 24, 2020)

howiec said:


> Nah, I'm just saying people shouldn't make blanket statements without acknowledging the benefits that most people don't realize can actually be very noticeable. We all acknowledge the price/performance value statement of a 3080 over a 3090. However, I do not agree with anyone saying the 3090 makes no noticeable difference at 1080p.


I get you... I don't like blanket statements either, but when everything is covered except for the toes.... 

Think of it this way too... you're going to be heavily CPU limited at low res trying to feed in all of those frames. In TPU reviews we see much larger, significant gains at the higher resolutions. To use this on anything less than 4K does it an injustice IMO. You're so CPU limited, especially at 1080p; the bottleneck shifts significantly to the CPU with this monster (and the 3080).

Watch the FPS go up with CPU clock increases at that low of a res. Look how poorly AMD does with its lack of clock speed. I'd think of starting off at a static 4.5, then going 4.7, 4.9, 5.1, 5.3, etc... hell, I'd love to see some single-stage results at 5.5+ just to see if it keeps going, so we know how high it scales. The lows aren't due to the GPU at that point (1080p).

These overclocked versions like the Strix are awesome, but, unless you are at 4K+, the 'value' (and I use that term loosely, like hot dog down a hallway loosely) of this card for gaming is garbage. Similar to the Titan it 'replaces'. You bought the Titan because it was a bridge between gaming and prosumer offering perks for the latter. This is the same thing going on here, sans the Titan name (so far). Everything about this card screams Titan........except for the Geforce naming.......thanks Nvidia. 



ZoneDymo said:


> Average FPS at 1080p with this card is about 200fps.....so yeah we are not even close to touching that new pushed 360hz standard and not even remotely close to your made up 600fps...


CPU limited at such a low resolution man. It may be the game devs, it may be too slow CPUs for the beast of a GPU at 1080p. Look at how little the card gains over a 3080 at 1080p compared to 4K.



Birdito said:


> Only 5 years ago flagship graphics cards were only 500 dollars; today you need 2000 dollars. What a rip off


The flagship is the 3080. This is a Titan replacement.

Also, the first Titan was released in 2013... IIRC, it was $1000. So, seven years later, it costs 50% more (reference to FE). And the 780Ti was $700... 780 was $500.


----------



## Birdito (Sep 24, 2020)

Only 5 years ago flagship graphics cards were only 500 dollars; today you need 2000 dollars. What a rip off


----------



## chodaboy19 (Sep 24, 2020)

Birdito said:


> Only 5 years ago flagship graphics cards were only 500 dollars; today you need 2000 dollars. What a rip off



The BOM has steadily increased too.


----------



## TheoneandonlyMrK (Sep 24, 2020)

chodaboy19 said:


> The BOM has steadily increased too.


According to one guy on here, these coolers cost £20 (can't recall who, luckily for him, or his name would be here). The silicon cost has gone up, so has Nvidia's die size, and so has the cost of all the other parts, making the 3080 and 3070 good buys. Personally I think the markup on the 3090 is a bit much; it's worth more than a 3080, but not this much.

@EarthDog , I think they (Nvidia) gimped the pro use of the 3090 specifically because pros were just buying Titans; their 6000 Quadros are due and I think they want their money for them.


----------



## howiec (Sep 24, 2020)

EarthDog said:


> I get you... I don't like blanket statements either, but when everything is covered except for the toes....
> 
> Think of this way too... you're going to be heavily CPU limited at low res trying to feed all of those frames in. I think in TPU reviews we see much larger, significant gains at the higher resolution. To use this on anything less than 4K does it an injustice IMO. You're so CPU limited, especially at 1080p, the bottleneck shifts significantly to the CPU with this monster (and the 3080).
> 
> ...


Yet the 3090 is geared primarily towards the "toes" or 1%.

Here are my PC's main specs and real-world results:
8700k 5.03GHz, 0 AVX offset
2080 Ti - Asus Strix OC - Constant 1890core, 1900mem
32GB DDR4 3216, 13-14-14-28-350
240Hz - PG258Q

When fps drops to ~125 fps in heavy scenes in *Apex Legends at 1680x1050* (even lower than 1080p), GPU load is at 80%+ as read using GPU-Z. As most of us should know, 80% GPU load does not mean it's not occasionally hitting full or near-full capacity; it's just a rough guideline, but it's quite high.
When FPS jumps to around 187fps in typical scenes, GPU load is around 50%  (lower).
CPU usage stays relatively constant with the total/average CPU usage at ~25%, and highest *total *usage on any *physical *core at ~40%.

Clearly the CPU is *not *the bottleneck in this case, whereas the GPU is.

Of course we'll see better gains for high-end GPUs at higher resolutions *on average*, because the GPU will be the bottleneck *more often* for obvious reasons throughout the scenes, but that does *not mean you can't be GPU-bound at lower resolutions*, as scenes vary drastically. _It simply depends on the complexity/load of the scene vs the capacity of the graphics card._
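To sketch that logic as a toy rule of thumb (the threshold and the loads below are hypothetical illustration, not output from any real tool):

```python
# Toy illustration: whether a frame sample is GPU-bound depends on the scene,
# not the resolution alone. All numbers here are hypothetical.

def bottleneck(gpu_load: float, cpu_core_load: float, threshold: float = 0.75) -> str:
    """Crude per-sample classification: whichever side is near its limit."""
    if gpu_load >= threshold and gpu_load > cpu_core_load:
        return "GPU-bound"
    if cpu_core_load >= threshold and cpu_core_load > gpu_load:
        return "CPU-bound"
    return "neither saturated"

# Heavy scene: GPU load 80%+, busiest physical core ~40%
print(bottleneck(0.80, 0.40))  # GPU-bound
# Typical scene: GPU load ~50%, busiest physical core ~40%
print(bottleneck(0.50, 0.40))  # neither saturated
```

Obviously real profiling is more involved than one threshold, but it captures why the heavy-scene samples point at the GPU here.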

Again, you keep arguing value while I am not. I'm pointing out the performance differences for those that want it and are willing to pay for it.



Tisumi said:


> I've ordered one and hopefully going to receive it before Cyberpunk 2077 release (I am from Italy).
> What to say, will probably end up being the fastest 3090 together with Aorus Xtreme and Zotac AMP extreme. $/perf 3080 is for sure better but you're paying the fact this card is the fastest GPU on Earth.


Lucky bastard. I'm assuming you're saying you were able to order the *Asus 3090 Strix O24G*.

Where did you find it?

I've only seen it listed as unavailable on BestBuy and Newegg.


----------



## EarthDog (Sep 24, 2020)

howiec said:


> Snip


80% GPU load is a problem. It should be at 99% when the card isn't being held back.


----------



## B-Real (Sep 24, 2020)

Shame on NV for marketing it as an 8K gaming GPU. From that perspective, it is a pathetically performing product. How can it even be counted as a positive that it is "19% faster" than the 3080? That only holds against the reference 3080. When you compare an AIB 3090 to an AIB 3080, the difference is more like 15% or below, and comparing the same AIB models, the difference is closer to 10%. This is a shame. A big shame. As for efficiency, so far both the 3080 and the 3090 sit in a position not much better than the 2080 and 2080 Ti. The only positive thing so far in the 3000 series is the performance uplift of the 3080, but it's still short of the 1080's performance gain, not to speak of its efficiency gain, which is only a third of Pascal's. And from what we have seen of the 3070 so far (Galax slides show it well below the 2080 Ti, while NV slides claimed it was faster than the 2080 Ti), it isn't supposed to be a bomb either. From NV, my last hope is the 3060.


----------



## ZoneDymo (Sep 24, 2020)

birdie said:


> So, NVIDIA is to be blamed for extremely poorly optimized games which are often limited by the CPU performance. AMD fans never fail to disappoint with the utmost disrespect towards intelligence and logic.
> 
> And according to you games under RDNA 2.0 will magically scale better. LMAO.
> 
> ...



Literally did not mention a thing about AMD... the fact that you felt the need to bring that company up says a lot about your mindset in this little discussion, I'm afraid.

Let's also not forget it's the Nvidia users who did not jump from the 10 series to the 20 series, and that the presentation was very much about convincing them "that it's time to upgrade"...

And are you really saying the stack of games TPU uses for their reviews are bad choices? That they somehow don't represent reality out there? Honestly, I'm kinda taken aback by your viewpoint here.

And intelligence and logic: as I already said, man, 50% more performance for 50% more power consumption is not some impressive leap in tech. 50% more performance while using no more power would be; 50% more performance for 20% less power, now that would be something to cheer for.


----------



## howiec (Sep 24, 2020)

EarthDog said:


> 80% gpu load is a problem. They should be at 99% when they arent held back.


That's a common misconception. GPU load or utilization is only a certain kind of metric over a *sample period of time*. Specifically:


> utilization.gpu: Percent of time over the past sample period during which one or more kernels was executing on the GPU. The sample period may be between 1 second and 1/6 second depending on the product.

It does not indicate *exactly* the "*capacity*" used. There are many other parts of the entire graphics system/pipeline that can be a bottleneck yet result in <99% "utilization".
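To make the sampling point concrete, here's a toy Python sketch (a hypothetical 10-sample window, not real telemetry): a GPU that intermittently stalls waiting on the rest of the pipeline reports mid-range utilization even though every busy sample ran flat out.

```python
# "Utilization" is the fraction of the sample window in which any kernel was
# executing, averaged over up to ~1 second. A GPU alternating between full
# saturation and idle waits therefore reports a number well under 99%.

def utilization(busy_flags):
    """Percent of samples during which a kernel was executing."""
    return 100.0 * sum(busy_flags) / len(busy_flags)

# Hypothetical window: 8 busy samples, 2 idle while waiting on the pipeline.
window = [1, 1, 1, 1, 0, 1, 1, 1, 0, 1]
print(utilization(window))  # 80.0 -- yet every busy sample was fully loaded
```

Which is why an 80% reading on its own can't tell you the GPU isn't the limiter.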


----------



## B-Real (Sep 24, 2020)

chodaboy19 said:


> I like that Nvidia built this BFGPU and put it out there.  It just showcases their top tier product and the potential of their technology.
> 
> Kind of like cars, M3 vs. 3-series. You pay for those infinitesimal improvements.


You are funny, really. 



birdie said:


> This card (just like I predicted) is a lot more power efficient than the RTX 3080:
> 
> 
> 
> ...


LOL, a lot more power efficient than the 3080? For one thing, this chart compares an AIB 3090 to a reference 3080. If you take the ASUS TUF 3080, for example, it's 2% better in perf/W than the reference 3080. And if you check back-to-back generations, *the 2080 Ti was 22% more efficient than the 1080 Ti* (which is not a big leap at all), *while the 3090 is also around a 22% uplift from the 2080 Ti* (assuming a 96-97% reference in the chart). *When you check the 980 Ti to 1080 Ti switch, there is a whopping 65% efficiency increase*, three times the 1080 Ti to 2080 Ti or the 2080 Ti to 3090 jump.

However, if you check the results of the other three 3090s, all are much worse in terms of efficiency than the ASUS Strix, as they all consume more power and perform worse. Something there doesn't add up for me.

If you check the power consumption charts, the ASUS consumes 20 W less in games while performing 9% better in games. However, in Furmark, the ASUS card consumes 50 W more. For me, this points to some kind of measurement anomaly in the gaming section.
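For what it's worth, the perf/W arithmetic here is just this; the figures below are illustrative stand-ins for the "9% faster, 20 W less" case, not exact review data:

```python
# Sketch of the perf/W comparison: efficiency is performance divided by power
# draw, so a card can draw fewer watts AND deliver more frames at once.
# All numbers below are hypothetical, for illustration only.

def perf_per_watt(perf_index: float, watts: float) -> float:
    return perf_index / watts

def uplift(new: float, old: float) -> float:
    """Relative efficiency gain, in percent."""
    return (new / old - 1.0) * 100.0

card_a = perf_per_watt(109.0, 350.0)  # e.g. 9% faster at 20 W less
card_b = perf_per_watt(100.0, 370.0)
print(round(uplift(card_a, card_b), 1))  # 15.2 -- percent better perf/W
```

So a 9% performance lead combined with a 20 W lower draw compounds into a roughly 15% efficiency gap, which is why the chart looks so lopsided.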



birdie said:


> So, NVIDIA is to be blamed for extremely poorly optimized games which are often limited by the CPU performance.



Please don't cry.


----------



## xorbe (Sep 24, 2020)

> no-holes-barred

no holds barred


----------



## EarthDog (Sep 24, 2020)

howiec said:


> That's a common misconception. GPU load or utilization is only a certain kind of metric over a *sample period of time. *Specifically:
> 
> 
> utilization.gpuPercent of time over the past sample period during which one or more kernels was executing on the GPU.
> ...


I get it.

GPU use is typically maxed when there aren't other external bottlenecks.


----------



## Tisumi (Sep 24, 2020)

howiec said:


> Lucky bastard. I'm assuming you're saying you were able to order the *Asus 3090 Strix O24G*.
> 
> Where did you find it?
> 
> I've only seen it listed as unavailable on BestBuy and Newegg.


Managed to get one from an official ASUS retail store in Italy, Drako.


----------



## birdie (Sep 24, 2020)

ZoneDymo said:


> Literally did not mention a thing about AMD....the fact that you felt the need to bring that company up says a lot about your mindset in this little discussion I'm afraid.
> 
> Lets also not forget its the Nvidia users who did not jump from the 10 series to the 20 series and that the presentation was very much about convincing them "thats its time to upgrade"....
> 
> ...



I've shown you three games which almost do not scale no matter how much GPU horsepower you throw at them. Is there anything else you'd like to say against NVIDIA and how "poorly" the RTX 3090 performs? In games which are properly coded, we see gains in line with the increase in CUDA cores/power consumption. Actually, the RTX 3090 performs better in terms of performance/power than the RTX 3080, which could indicate that your criticism is misplaced.

Sorry for mentioning AMD fans and their way of thinking. AMD has become a sort of cult in the tech world, and I've gotten used to seeing the harshest, and in most cases either unwarranted or outright false, NVIDIA criticism from the cult's followers.

In the video in which Jensen Huang announced the RTX 3090, he directly said the card was to become the Titan of this generation, yet in this thread people keep being mad at NVIDIA for advertising the card as a gaming product, as if NVIDIA were standing behind their backs with a gun pointed at their heads, forcing them to buy it. Are they?

Do not buy it. OK? Vote with your wallet! Let NVIDIA know how insane they are with their pricing strategy. It's expensive as <no swearing here>. Its power consumption is insane. We get it. Now let's discuss something relevant, for instance the fact that it contains 24GB of VRAM and is great for content creators while being *40% cheaper* than previous-generation Titans.

On the other hand, why do luxury/sports cars exist? Who on Earth buys that overpriced junk? They cost up to 200 times more than average cars yet are just 2-3 times faster. Pure insanity!


----------



## MikeSnow (Sep 24, 2020)

This doesn't seem correct:

Setting the limit to 480W should give you better performance, not worse.

Also, in the power consumption analysis, were the results obtained with the power limit increased to 480W, or the default? Why not include both settings in the power analysis as well? If the analysis was done at the default setting, it would be very interesting to know how much power it can draw in Furmark, with the limit increased to 480W, if it can go over 449W or not.


----------



## howiec (Sep 24, 2020)

EarthDog said:


> I get it.
> 
> GPU use is typically maxed when there aren't other external bottlenecks.


Easier if you read this rather than me explaining:
https://developer.nvidia.com/blog/t...lysis-method-for-optimizing-any-gpu-workload/

Also, you completely ignored the fact that I provided real-world numbers on increased GPU load with lower FPS and decreased GPU load with higher FPS (which makes sense) while CPU load stayed constantly low throughout.



Tisumi said:


> Managed to get one on an official ASUS retail store in Italy, Drako.


Nice!


----------



## B-Real (Sep 24, 2020)

MikeSnow said:


> This doesn't seem correct:
> 
> View attachment 169696
> 
> ...


I also don't get the anomalies in the efficiency charts for the 4 tested AIB models. The ASUS consumes 20 W less in gaming than the Zotac while performing 9% better? These are not different GPUs but the same 3090. Then in Furmark it consumes 50 W more. There have to be some inaccuracies.


----------



## EarthDog (Sep 24, 2020)

howiec said:


> Easier if you read this rather than me explaining:
> https://developer.nvidia.com/blog/t...lysis-method-for-optimizing-any-gpu-workload/
> 
> Also, you completely ignored the fact that I provided real-world numbers on increased GPU load with lower FPS and decreased GPU load with higher FPS (which makes sense) while CPU load stayed constantly low throughout.


Thanks again. There are always exceptions.


----------



## Vayra86 (Sep 24, 2020)

birdie said:


> This card (just like I predicted) is a lot more power efficient that the RTX 3080:
> 
> 
> 
> ...



The cost simply isn't justifiable. If you buy this.... you buy this. Don't attribute too much sensibility to it.



birdie said:


> Pure insanity!



Nail > Head


----------



## AddSub (Sep 24, 2020)

So, grab these at $150 a pop in 2025 off of ebay, when RTX 5090 64GB GDDR7 model rolls out???  

...
..
.


----------



## Imouto (Sep 24, 2020)

How is this a Titan-class GPU when it doesn't even have workstation-certified drivers? There's plenty of professional software that runs like a dog on these just because of that.


----------



## EatingDirt (Sep 25, 2020)

birdie said:


> Let me disagree with you here, @W1zzard
> 
> This is a Titan class card which used to cost over $2500. Now you're getting a massively faster GPU and even more VRAM at a much lower price.
> 
> This is not a gaming card unless you have more money than common sense. It's a productivity card which can be used for gaming.



Titan-class cards before the Titan RTX & Titan V used to cost $1,200 MSRP. Nvidia doubled the price with the RTX (& V) because they doubled the price of their flagship gaming line.

There's only one reason to buy an RTX 3090: productivity work that needs more than 10GB while you also need the PC to play games. However, if the rumors of a 20GB 3080 are true, then this card is only useful in the immediate short term.

This is just a card for rich people, basically.


----------



## SystemMechanic (Sep 25, 2020)

@W1zzard Can you show us the actual BIOS power limit from GPU-Z, please?

This is NOT a Titan card. It's just a Ti in disguise with more VRAM; if anything, it's closer to the dual GPUs like the GTX 590 and 690 Nvidia used to produce. Linus confirmed that many Titan features are disabled on this card and probably won't be unlocked... so yeah, stop listening to Jensen's words.


----------



## GhostRyder (Sep 25, 2020)

I'll be honest, the review almost turned me towards the 3080, but I am still firm on getting a 3090.

The only thing I am royally mad about is trying to buy one of these at 8:00 AM this morning. Nvidia's site refused to load at 8:00, and at 8:02 it was sold out... Newegg loaded, but every time I checked out it would say it's now sold out right at the end; Best Buy I checked too late, and Micro Center had people around the block waiting. I cannot believe how many people must have used bots to buy this again, even though they upped their "protection". There is no way people checked out that fast without a bot's help (I can't even get the pages to load that fast on 1 Gb internet).


----------



## swirl09 (Sep 25, 2020)

Tisumi said:


> Managed to get one on an official ASUS retail store in Italy, Drako.


Oh nice. I ordered it from a few places; who knows how long it will take, though.

Seeing the noise figures here, though, it's not what I hoped. I'm surprised the FE might actually have been just fine. Would like to see it here for comparison.


----------



## pandemonium (Sep 25, 2020)

Any notes on the memory used in any of the games?

Given the modest +18% performance over the 3080 at twice the cost, that memory pool is going to be the only saving grace for this beast.


----------



## swirl09 (Sep 25, 2020)

There's been a bunch of gaming performance analyses showing you shouldn't see a VRAM problem at 4K with 10GB in the vast majority of games. So for 199 games out of 200, you will be fine. It's only if you plan to go beyond 4K, or pump up supersampling in VR, that you will see a benefit from more VRAM (which I do).


----------



## turbogear (Sep 25, 2020)

I wonder why this gets the Highly Recommended award. 

It is a power-hungry card that delivers only 20% more performance for double the price of the 3080. 

I would rather buy two 3080s for the price of this monster, one for my PC and one for my son, but I will also wait until the end of the year to see what RDNA2 will offer.


----------



## AvrageGamr (Sep 25, 2020)

phill said:


> So no one will be buying two for SLI then??


I read somewhere that Nvidia won't be making game drivers for SLI after Jan 2021.


----------



## Vayra86 (Sep 25, 2020)

turbogear said:


> I wonder why this gets Highly Recommended award.
> 
> It is power hungry card that delivers only 20% more performance for double the price of 3080.
> 
> I will rather buy two 3080 for the price of this monster; one for my pc and one for my son but I will wait until end of the year also to see what RDNA2 will offer.



Those stickers just go out randomly. Don't worry too much about them. Stickers look good; the reviews here are about the contents 

It used to be ridiculous numbers. Let it be. I bet companies love them.


----------



## Bill'O (Sep 25, 2020)

OK, I understand we're crossing the same lines as smartphone market prices...
That said, as my work tool, I'm still considering buying a 3090, but I wonder if my Corsair RM750i will hold up?


----------



## ratirt (Sep 25, 2020)

EarthDog said:


> Dude... this is AT LEAST the third time I've said this to YOU.............. and in this forum more times than that. How this is still unclear to you bud is beyond me.
> 
> 1. In the video, Jensen talks Titan, then in the same breath whips out the RTX 3090 from his oven.
> 2. In the same video, slides state the 3080 is the 'flagship'.


Oh, so if NV's CEO says it is a Titan, then it must be one? Well, I disagree.
NV's people have said a lot of things that were never true. Don't fall for the marketing, dude. Think a bit.


----------



## Bill'O (Sep 25, 2020)

ratirt said:


> Oh, so if NV's CEO says it is a Titan, then it must be one? Well, I disagree.
> NV's people have said a lot of things that were never true. Don't fall for the marketing, dude. Think a bit.


I actually never understood why everybody was bashing 2080 Ti owners after seeing two PowerPoint slides from Nvidia... It's like, guys, those GPUs are more than two years old and a new gen is coming up, so what do you expect? Of course it performs better, but wait for real benchmarks with current drivers. And guess what now...


----------



## ratirt (Sep 25, 2020)

Bill'O said:


> I actually never understood why everybody was bashing 2080 Ti owners after seeing two PowerPoint slides from Nvidia... It's like, guys, those GPUs are more than two years old and a new gen is coming up, so what do you expect? Of course it performs better, but wait for real benchmarks with current drivers. And guess what now...


Well, there are people who take things as they are presented to them, and people who stop and think and reach different conclusions.


----------



## W1zzard (Sep 25, 2020)

xorbe said:


> no holds barred


Fixed



MikeSnow said:


> This doesn't seem correct:
> 
> View attachment 169696
> 
> Setting the limit to 480W should give you better performance, not worse.


Fixed



> Also, in the power consumption analysis, were the results obtained with the power limit increased to 480W, or the default? Why not include both settings in the power analysis as well? If the analysis was done at the default setting, it would be very interesting to know how much power it can draw in Furmark, with the limit increased to 480W, if it can go over 449W or not.


It was done at default. I added all the 480 W testing only yesterday, with super limited time. I'm moving to a new house now, so not sure when I can update.



B-Real said:


> I also don't get the anomalies regarding the efficiency charts in the 4 tested AIB models. ASUS consumes 20W less in gaming than the Zotac while performing 9% better? These are not different GPUs but the same 3090. Then at Furmark it consumes 50W more. There has to be some inaccuracies.


It seems the variance in leakage between Ampere GPUs is quite big. The power limit settings play a big role here, too. Also silicon operating temperature in the case of ASUS



SystemMechanic said:


> @W1zzard Can you show us the actual BIOS Powerlimit in from GPU Z please.


BIOSes have been posted to our database now


----------



## EarthDog (Sep 25, 2020)

ratirt said:


> Oh, so if NV's CEO says it is a Titan, then it must be one? Well, I disagree.
> NV's people have said a lot of things that were never true. Don't fall for the marketing, dude. Think a bit.


Nice to see you reply and acknowledge it. 

I'm just sharing what we know, which is better than playing ignorant when you were told the same thing. Think, dude.


----------



## ratirt (Sep 25, 2020)

EarthDog said:


> Nice to see you reply and acknowledge it.
> 
> I'm just sharing what we know, which is better than playing ignorant when you were told the same thing. Think, dude.


I guess you don't understand marketing, nor what a Titan is. That's OK. 
I'm sharing something as well, and you're apparently as ignorant as you want to be. Fine with me. 
The 3090 is not a Titan, the same way the 2080 Ti or 1080 Ti is not a Titan.


----------



## swirl09 (Sep 26, 2020)

There is actually a really clever way of telling whether or not a 3090 is a Titan. If it was a Titan, it would be called a Titan.

And now for my next joke, here's a picture of a Strix with just 2 MLCCs: https://benchlife.info/asus-rog-strix-rtx-3090-tear-down/


----------



## Auer (Sep 26, 2020)

swirl09 said:


> There is actually a really clever way of telling whether or not a 3090 is a Titan. If it was a Titan, it would be called a Titan.
> 
> And now for my next joke, here's a picture of a Strix with just 2 MLCCs: https://benchlife.info/asus-rog-strix-rtx-3090-tear-down/


Nice, totally up to spec. 4+2.


----------



## swirl09 (Sep 26, 2020)

You don't pay hundreds over MSRP because you want the bog standard. And the issue is consistency: if these parts do have any connection with the crashing (which appears to be the case), then it's even more important that they don't cut corners. On Asus's site they show 6 POSCAPs (presumably renders), TPU shows all MLCC, and elsewhere we see a mix. So which is it?


----------



## Sora (Sep 26, 2020)

The memory is described incorrectly; the chips on this are D8BGX, not D8BGW.


----------



## ratirt (Sep 27, 2020)

swirl09 said:


> You don't pay hundreds over MSRP because you want the bog standard. And the issue is consistency: if these parts do have any connection with the crashing (which appears to be the case), then it's even more important that they don't cut corners. On Asus's site they show 6 POSCAPs (presumably renders), TPU shows all MLCC, and elsewhere we see a mix. So which is it?


Yes, I agree with you. A Titan is something that has to be earned; "Titan-like performance" doesn't count. But that's just me.


----------



## jabbadap (Sep 27, 2020)

Titans have Titan-specific drivers. E.g. even the Pascal Titan Xp beats the RTX 2080 Ti in software like SolidWorks or Catia, just because of some enabled pro features in the driver. The only SPECviewperf numbers I have seen are on Notebookcheck, and the RTX 3090 seems to be slower than the Titan RTX in some of those tests. So it probably does not have Titan drivers.


----------



## Shatun_Bear (Sep 27, 2020)

Any word in the review on whether these have the faulty hardware or the CTD issues caused by it? It would be a nightmare to have to RMA a card that has very little stock.


----------



## John Naylor (Sep 28, 2020)

This is supposed to be a tech forum, not a fashion site, so why the focus on what it is called and not on what it does? People who had reason to buy the Titan will certainly be buying the 3090.

Historically, the only time the Titan came into the discussion here was when a user had contrasting needs. For example, the user who needed a Quadro for rendering but hated sacrificing gaming (and perhaps 2D/3D CAD) performance. Often that meant a high-core-count CPU / Quadro rendering box and a second box with an Intel i5/i7 and a GTX/RTX card. The cost-effective alternative was the Titan series... damn, it was expensive, *but nowhere near as expensive as building two separate systems*, one with a Quadro and one with the top gaming card. The Titan couldn't perform quite as well as the two-box option, but it was close on both fronts and a very cost-effective compromise. And yes, that didn't stop gamers from buying Titans even when they didn't perform as well as cheaper options.

I hope that TPU goes back and looks at the 3090 in this scenario. We have seen on other sites that the 3090 is 3-13 times as fast as the 3080 in rendering... I don't care what the %$@#%&% you call it, *it is still doing exactly what the old Titan was good at*... AND... with no compromise on the gaming front. So yes, anyone who had cause to own a Titan in the past has the same reason to get a 3090. And yes, again, there's always going to be a subset of folks who will buy it for gaming. If naming it the 3090 instead of the Titan leads to more gamers buying this card than bought Titans in the past, whoever came up with that idea is a marketing genius and should certainly get a fat raise, a bonus, and a bigger office. Let's face it, there will always be folks who calculate return on investment by factors other than cost, where being first to have something in their circle, or having the best that money can buy, means more than the extra money spent.

If we do see a second look at this usage scenario, I would also like to know about the case-fan headers... are they PWM? Can they be used to feed a Phanteks/Swiftech (powered) fan hub?


----------



## happy medium (Sep 30, 2020)

Is the 3090 2x faster than the 5700 XT?
It looks to be 2x faster than a 2080.

I don't think Big Navi will come close. 
Thoughts?


----------



## lemoncarbonate (Sep 30, 2020)

Where I live, this thing costs a freaking US$2,500-2,600 (converted). What a joke.


----------



## purecain (Oct 2, 2020)

I bought one of these; I'm looking forward to 2025, when it arrives.


----------



## ARF (Oct 4, 2020)

happy medium said:


> Is the 3090 2x faster than the 5700 XT?
> It looks to be 2x faster than a 2080.
> 
> I don't think Big Navi will come close.
> Thoughts?



In some titles it's actually 3x faster 

Indeed unbelievable. AMD are done.


----------



## 5150Joker (Oct 5, 2020)

ARF said:


> In some titles it's actually 3x faster
> 
> View attachment 170820
> 
> ...



Gotta love selective benchmarks.


----------



## furfix (Oct 6, 2020)

It's cute how people keep saying "it's not worth it; for half the price, you get almost the same performance." It's like wondering why people buy a Ferrari instead of a Chevrolet Sonic. Both are just cars... and can take you to the same place, right? If you don't have the money, just let it go... and stop complaining. In case you don't know, that's the way capitalism works.


----------



## Asryan (Oct 6, 2020)

I have pre-ordered a Strix 3090. I don't know when I will get it, of course, but I'm kinda wondering if I did well.

Will there be a noticeable difference from a Strix 3080?

I have a 3840x1600 monitor with a refresh rate of 160 Hz. Will a 3090 give me more than a 3080 with that screen?
I can afford the 3090 for now, but I'm still wondering if I did well...

Thanks


----------



## Caring1 (Oct 7, 2020)

furfix said:


> It's cute how people keep saying "it's not worth it; for half the price, you get almost the same performance." It's like wondering why people buy a Ferrari instead of a Chevrolet Sonic. Both are just cars... and can take you to the same place, right? If you don't have the money, just let it go... and stop complaining. In case you don't know, that's the way capitalism works.


More like instead of buying a sensible family vehicle, you spend twice the amount on a jacked up truck because it boosts your ego looking down on all the little people and it makes you feel superior.


----------



## 5150Joker (Oct 15, 2020)

furfix said:


> It's cute how people keep saying "it's not worth it; for half the price, you get almost the same performance." It's like wondering why people buy a Ferrari instead of a Chevrolet Sonic. Both are just cars... and can take you to the same place, right? If you don't have the money, just let it go... and stop complaining. In case you don't know, that's the way capitalism works.



Terrible analogy, since the 3090 is hardly the only Ferrari in this scenario. The two are nearly identical in performance, except the 3090 has extra VRAM most don't need. It's more like one Ferrari that's 10% slower than another at half the price.


----------



## cueman (Oct 24, 2020)

Weird, real weird, that the RTX 3090 can't pull further ahead of the RTX 3080...

It looks like it, just like the RTX 3080, needs a very powerful CPU, so I see the real performance of the top two RTX 3000 GPUs coming out a little later, I guess when the AMD Ryzen 5900/5950 are used with them,
but I think also, finally, when the incoming Intel Rocket Lake-S CPUs arrive.

But truly the best CPU for the RTX 3000 series will be the incoming Intel Alder Lake-S, Intel's first hybrid CPU and also its first made on a 10nm process.

Yes, by my calculation, the RTX 3090 should be at least 20% faster than the RTX 3080.


It looks like GPUs these days are so powerful that CPU performance can't keep up anymore, but on the other hand, the CPU is not such a big bottleneck at high resolutions, from 4K upward.

The RTX 3080 is simply a 4K gaming GPU; below that, I think the incoming RTX 3070 variants are better, and they might even be faster there... let's see soon.

But if we're talking 4K gaming, I can't see any GPU beating the RTX 3080 at 4K, not even the incoming Big Navi. It might be a little faster at somewhat lower resolutions, but it won't win 4K gaming.

It's sad also that Nvidia has, it looks like, cancelled the 20GB versions of the RTX 3000 series for now... let's see, but from what I've read and heard, Super and Ti versions are coming, made on TSMC's 7nm process...
Q1 2021 I heard... again, let's see.


----------



## Asryan (Nov 3, 2020)

Hi, 

I got this GPU. Using the OC mode, with a fan curve set at 70% for 70°C and 100% for 80°C,
my temps are around 72°C in Red Dead and Horizon Zero Dawn on ultra, 3840x1600 / 38-inch monitor.

Is this OK?


----------



## EarthDog (Nov 3, 2020)

Asryan said:


> Is this ok?


Yes.

Why not look at the review to see and then compare?


----------



## Asryan (Nov 3, 2020)

Great, I saw some reviews and YouTube videos with temps around 62, but maybe it wasn't with the same screen specs.


----------



## EarthDog (Nov 3, 2020)

Asryan said:


> great, i saw some reviews and youtube videos with temps around 62 kinda but maybe it wasn't on the same screen specs


It could be anything. You're fine though.


----------



## tomfuegue (Nov 19, 2020)

@*W1zzard*

Hello, is there some kind of issue with the OC BIOS of the Asus RTX 3090 Strix? I don't understand those strangely high noise values. Do you know anything?


----------



## Undertoker (Nov 24, 2020)

What's missing for me is a 500 W+ BIOS; EVGA has released one for a weaker SKU than the Strix.
Please, Asus, release a BIOS equal to what this GPU is capable of, because not doing so denies your customers the chance to realise its true potential, and let's face it, at £1700 they deserve to be allowed to get the best out of this Strix 3090 OC.



tomfuegue said:


> @*W1zzard*
> 
> Hello, is there some kind of issue with the OC BIOS of the Asus RTX 3090 Strix? I don't understand those strangely high noise values. Do you know anything?


The BIOS didn't turn the fans off at release; there is a BIOS update that fixes that, is all, and with the new BIOS the fans now stay off for another 50 W before they kick in.


----------



## Mtorrent (Dec 12, 2020)

Hi, I'm just about to buy the non-OC version of the ROG Strix 3090, and I wonder if I could install the OC vBIOS. Indeed, I'm interested in the BIOS that reaches 450 W, like the EVGA Ultra. Could the non-OC version of the ROG Strix 3090 handle that OC/450 W BIOS? Thanks


----------



## Undertoker (Dec 12, 2020)

No one buying a 3090 is going to run it at 1080p.
It's for 3440x1440 or 4K+.
The 3090 isn't for value, it isn't for performance per watt, it isn't for a vast improvement over the 3080; it's simply the best card money can buy.

It's simply a card for those for whom money is no object, who want the very best GPU money can buy, and who don't give a hoot about value; they are just PC enthusiasts.
That's why I bought an Asus ROG Strix 3090 OC.
I just wanted the very best, and I got it.

I can honestly say if it had been £2500 I'd still have bought it and I would not have flinched, because money isn't an issue for me.
If you're passionate about PCs, and have the money spare and the desire for the very best, why shouldn't you get one?

I'm now waiting for Intel's 11th gen and I'll be buying that as well, so roll on March, and I will get whatever the best on offer is then as well.
I love PCs: building them, working on them, and PC gaming. It's my only vice, really; I'm 52 years old and have been gaming on PC for 35 years now.
I want the best and I'll get it, and I'm not remotely bothered about value or cost or power consumption. I just want the best, frankly.


----------



## AzFullySleeved (Nov 13, 2021)

Has anyone tried to update their ASUS GeForce RTX 3090 STRIX OC with this BIOS https://www.techpowerup.com/vgabios/235907/asus-rtx3090-24576-201021 and succeeded? 
My current BIOS for this card is "94.02.42.00.A9" and the link above has version "94.02.38.00.26". Wondering if I flash to that BIOS, with its 1000.0 W limit, will it work on mine without bricking my card?


----------



## critofur (Feb 21, 2022)

birdie said:


> As for me personally I never buy GPUs with a TDP above 150W (and CPUs above 95W) because I simply don't want that much heat under my desk, so I guess I will skip the entire GeForce 30 series because RTX 3060 has been rumored to have a TDP around 180-200W.



I own a 3090 (and a 3080), and the power consumption/heat/efficiency (or lack thereof) is precisely why my "main"/gaming PC has a 6900 XT in it rather than either of those Nvidias.  

When I play Apex at 1440p "locked" to 120 Hz refresh (I have a 144 Hz 1440p gaming monitor), I smile every time I see the reported GPU power consumption is about 90 watts. 

Yes, the 3090 is _stupidly_ expensive; there's simply no possibility I would have bought one if it didn't pay for itself through mining.


----------



## purecain (Feb 27, 2022)

@critofur - what settings do you use for the 3090 when mining???


Undertoker said:


> No one buying a 3090 is going to run it in 1080p
> It’s for 3440 x 1440 or 4k+
> The 3090 isn’t for value, it isn’t for performance per watt, it isn’t for a vast improvement over the 3080 - it’s simply the best card money can buy.
> 
> ...


Exactly my sentiment....

As far as efficiency goes, I'll be buying AMD if Nvidia doesn't sort out a better node with less leakage. 450 W+ GPUs are too power hungry, especially with the price of electricity doubling in April.


----------

