# MSI GeForce RTX 3090 Gaming X Trio



## W1zzard (Sep 24, 2020)

The MSI GeForce RTX 3090 Gaming X is the quietest RTX 3090 we've tested today. Acoustics are pretty impressive actually, considering the performance and heat output. MSI has also overclocked their card and bumped its power limit, which yields a very decent performance improvement.

*Show full review*


----------



## bug (Sep 24, 2020)

Not much to say about the card (fast, expensive, power hungry), but I don't recall ever seeing a review including 9 GPUs where AMD had only one entry. And at the bottom of the charts, at that. That's so sad...


----------



## dirtyferret (Sep 24, 2020)

Awesome review!  I was in the market for a space heater for my basement office!


----------



## bug (Sep 24, 2020)

dirtyferret said:


> Awesome review!  I was in the market for a space heater for my basement office!


A 400 W heater can be had for a tad less: https://www.amazon.com/Givebest-Wattage-Personal-Tip-Over-Protection/dp/B07Y37VTKR


----------



## theonek (Sep 24, 2020)

Well, I have noticed that FPS figures differ between reviews; on some game titles the differences were minor and on others they were major. I wonder why, given similar test platforms and the same cards? The review in question was on Guru3D. Just asking, not criticism of any sort; it is a fantastic review of yours for this card, as always!


----------



## the54thvoid (Sep 24, 2020)

theonek said:


> Well, I have noticed that FPS figures differ between reviews; on some game titles the differences were minor and on others they were major. I wonder why, given similar test platforms and the same cards? The review in question was on Guru3D. Just asking, not criticism of any sort; it is a fantastic review of yours for this card, as always!



Different reviewers use different gaming 'benchmarks'. W1zzard uses an in-game test of actual play, timed to ensure the same scene is rendered. It's not a pre-determined 'flyby' benchmark of the kind other sites often use. That explains much of the site-by-site variation.
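Another source of variation is how the average is computed. A small sketch (illustrative only, not any reviewer's actual tooling): the correct average FPS for a run is frames rendered divided by total time, which is not the same as naively averaging instantaneous FPS readings.

```python
def average_fps(frame_times_ms):
    """Average FPS over a run: frames rendered / total elapsed time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def naive_fps_mean(frame_times_ms):
    """Mean of instantaneous FPS values -- overweights the fast frames."""
    return sum(1000.0 / t for t in frame_times_ms) / len(frame_times_ms)

# A run with two 10 ms frames and one 20 ms stutter frame:
times = [10.0, 10.0, 20.0]
print(average_fps(times))     # 75.0
print(naive_fps_mean(times))  # ~83.3, misleadingly high
```

Two sites can capture the very same run and still publish different numbers if one averages frame times and the other averages FPS samples.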


----------



## Xaled (Sep 24, 2020)

OK. So now, who will pay 100% more for only 15-20% more performance?
Nobody, because it won't be just 100% more expensive: the $700 price of the 3080 won't hold, and the 3090 will be much more expensive.
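For a sense of scale, here's a quick perf-per-dollar comparison. The figures are assumptions taken from the thread (3090 at roughly $1500 and +15% performance versus a $700 3080), not measured data:

```python
def value_ratio(perf_a, price_a, perf_b, price_b):
    """Performance-per-dollar of card A relative to card B."""
    return (perf_a / price_a) / (perf_b / price_b)

# Assumed: 3090 = 1.15x the 3080's performance at ~2.14x the price.
print(round(value_ratio(1.15, 1500, 1.00, 700), 2))  # 0.54
```

On those assumed numbers, the 3090 returns roughly half the performance per dollar of the 3080.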


----------



## mahirzukic2 (Sep 24, 2020)

Xaled said:


> OK. So now, who will pay 100% more for only 15-20% more performance?
> Nobody, because it won't be just 100% more expensive: the $700 price of the 3080 won't hold, and the 3090 will be much more expensive.


You'd be surprised at the money some people have to burn, or their lack of sense about what to do with it.


----------



## rtwjunkie (Sep 24, 2020)

Great review!  50-70 FPS more on average than a 2080 Ti, and over two generations, more than 100 FPS on average over the 1080 Ti.  Obviously totally impressive performance!

It's just out of reach for nearly everyone.   Hard pass, thank you.


----------



## the54thvoid (Sep 24, 2020)

rtwjunkie said:


> Great review!  50-70 FPS more on average than a 2080 Ti, and over two generations, more than 100 FPS on average over the 1080 Ti.  Obviously totally impressive performance!
> 
> It's just out of reach for nearly everyone.   Hard pass, thank you.



Yeah, I'm sticking with my 2080ti but if I had to upgrade, it'd be the 3080. I mean, the price difference between it and the 3090 could buy me a new CPU and mobo.


----------



## wheresmycar (Sep 24, 2020)

I'm just going to say it: although it's unlikely, I hope AMD whoops Nvidia's behind with something of equal or marginally lesser measure at a far more pleasant asking price. The green team needs to be taught a lesson. An average 15% performance advantage over a 3080 at 4K for a $1500-$1700 asking price is bonkers. No one should suggest ONLY "workstation use", as Nvidia is marketing the card to GAMERS too. If I had a million I wouldn't buy one, PERIOD!

Hehe... or would I?


----------



## Hatrix (Sep 24, 2020)

I have a 2080 Ti and I want more frames, as I play at 5120x1440. But if I ever bought a new graphics card it would have to be better than a 3080, and given that the 3090 is double the price but only gives about 10-15 more FPS (over the 3080), whatever comes in between, like a 3080 Ti or Super at $1000-1300, won't be worth it. So I'm skipping this generation for now.


----------



## tomstone123 (Sep 24, 2020)

Xaled said:


> OK. So now, who will pay 100% more for only 15-20% more performance?
> Nobody, because it won't be just 100% more expensive: the $700 price of the 3080 won't hold, and the 3090 will be much more expensive.



I'm planning on getting the Strix OC version with an EK block and backplate. lol So almost three times the price of the 3080. lol


----------



## Xaled (Sep 24, 2020)

tomstone123 said:


> I'm planning on getting the Strix OC version with an EK block and backplate. lol So almost three times the price of the 3080. lol


Good job sir! Jensen Huang would be very pleased with that.


----------



## Tom Sunday (Sep 24, 2020)

Fantastic review. Indeed, one has to have very deep pockets, but many will not hesitate to snap up that price-to-performance ratio (3080 versus 3090) in a heartbeat. Not to forget the persistent call of "bragging rights" for many in our world of tech; the ongoing, very profitable RGB and AIO craze only supports the showing-off madness even further. Although remember, NVIDIA needs all the money it can muster after just grabbing ARM for a cool $40 billion. With the RTX 3080 currently costing around $700 and being touted as a 'go-to' GPU for enthusiasts, I may be able to get my hands on a used GTX 1080 for $125 or less in early 2021. With that, my wife can finally get her new refrigerator and I can play Fallout 4 on max settings for the very first time. Being relegated by my spouse to basement-office dweller, there might just be enough $$$ left for a space heater.


----------



## John Naylor (Sep 24, 2020)

The Titans were often the solution for gamers who were also doing work typically performed by workstation cards... I haven't seen much written about what Nvidia intends for the 3090 in this respect. Will there be any follow-up testing with video editing, rendering and such, or will the 3090 just be for gamers who don't have to care about cost, or are willing to take it on because, for an extra $4800, they get to say "mine's bigger"?

Also wondering why... with the 3080, MSI was the only AIB card to outperform the FE, but to do so they dropped their usual spot at the top of the leaderboard for sound and temps, leaving Asus to take the top spot for lowest temps. And here, MSI finishes second to Asus on OC performance, and Asus has to run way up at 42 dBA to grab that lead:

Asus Normal Mode = 68°C / 42 dBA (2175 RPM)
Asus Quiet Mode = 75°C / 34 dBA (1615 RPM)

MSI Normal Mode = 77°C / 33 dBA (1543 RPM)
MSI Low Temp Mode = 67°C / 37 dBA (2016 RPM)

Odd that the fan curves on the 3080 versus 3090 models had different priorities.


----------



## bug (Sep 24, 2020)

John Naylor said:


> Also wondering why... with the 3080, MSI was the only AIB card to outperform the FE, but to do so they dropped their usual spot at the top of the leaderboard for sound and temps, leaving Asus to take the top spot for lowest temps. And here, MSI finishes second to Asus on OC performance, and Asus has to run way up at 42 dBA to grab that lead:


Keep in mind TPU reviewed the _TUF_ version of the 3080, not the _Strix_.
It would seem FE has raised the bar this time around, but that's of little consequence if AIBs don't plan to top that.


----------



## Vayra86 (Sep 24, 2020)

wheresmycar said:


> I'm just going to say it: although it's unlikely, I hope AMD whoops Nvidia's behind with something of equal or marginally lesser measure at a far more pleasant asking price. The green team needs to be taught a lesson. An average 15% performance advantage over a 3080 at 4K for a $1500-$1700 asking price is bonkers. No one should suggest ONLY "workstation use", as Nvidia is marketing the card to GAMERS too. If I had a million I wouldn't buy one, PERIOD!
> 
> Hehe... or would I?



You'd buy two to figure out how glorious SLI support is going to be going forward!


----------



## bug (Sep 24, 2020)

wheresmycar said:


> I'm just going to say it: although it's unlikely, I hope AMD whoops Nvidia's behind with something of equal or marginally lesser measure at a far more pleasant asking price. The green team needs to be taught a lesson. An average 15% performance advantage over a 3080 at 4K for a $1500-$1700 asking price is bonkers. No one should suggest ONLY "workstation use", as Nvidia is marketing the card to GAMERS too. If I had a million I wouldn't buy one, PERIOD!
> 
> Hehe... or would I?


Who's suggesting "workstation use"? Sure you can run AutoCAD on these, but they don't run on the same drivers as Quadro cards. There's no mention of the term on the official product page either.


----------



## John Naylor (Sep 28, 2020)

Xaled said:


> OK. So now, who will pay 100% more for only 15-20% more performance?
> Nobody, because it won't be just 100% more expensive: the $700 price of the 3080 won't hold, and the 3090 will be much more expensive.



a)  People who do rendering or other workstation apps and don't want to build two systems.
b)  Peeps with Wheaties Box Syndrome... in the Olympics, the difference between gold and silver is often fractions of a second. The gold winner gets their pic on the Wheaties box and millions in endorsements; the silver winner goes home and takes a job selling cars or insurance.
c)  The same peeps who bought a 24-core Threadripper for a gaming box because someone told them more cores is better.




wheresmycar said:


> I'm just going to say it: although it's unlikely, I hope AMD whoops Nvidia's behind with something of equal or marginally lesser measure at a far more pleasant asking price. The green team needs to be taught a lesson. An average 15% performance advantage over a 3080 at 4K for a $1500-$1700 asking price is bonkers. No one should suggest ONLY "workstation use", as Nvidia is marketing the card to GAMERS too. If I had a million I wouldn't buy one, PERIOD!



The fact remains... it is a very capable workstation card regardless of the label one puts on it. People bought a Titan when they needed a box to do rendering that would otherwise call for a Quadro, but they also wanted to enjoy gaming, for which the Quadro was ill suited. The Titan offered a great compromise for far less cost than building two systems. People who bought the Titan for this reason will still buy the 3090, which is 3 to 13 times faster in rendering.

But there's always the subset of folks who weigh return on investment by factors other than performance gained versus cash invested. For some, the return is the perceived increase in social standing that comes with having the fastest, or being the first to have it. These are the folks standing in line to buy the new phone... because some review said it's faster... I don't talk that fast.

The dude in marketing who came up with the idea of dropping the Titan moniker and using "3090" should get a fat bonus... the subset of gamers with money to burn will jump at this.




bug said:


> Who's suggesting "workstation use"? Sure you can run AutoCAD on these, but they don't run on the same drivers as Quadro cards. There's no mention of the term on the official product page either.



Jensen, for one, in the release presentation. And anyone who has had a Titan will want a 3090... I own an engineering consulting firm, build all of our PCs, and have been doing so for this office and others for almost 30 years. The fact is AutoCAD runs great on GTX/RTX and has always been the better option: 2D/3D AutoCAD has consistently performed equal to or better than the most expensive Quadro cards, which cost up to $4,000, allowing huge cost savings. The area where Quadro excels is rendering, which makes it well suited to packages like SolidWorks; some Autodesk applications will not even run on GTX/RTX. An architectural/engineering office will rarely be called upon to create a rendering of their 2D/3D CAD drawings... most send it out when needed. Those that do might have one rendering box for every 10 or 20 AutoCAD boxes, and those that need to render will want the 3090 because it is up to 13 times faster than the 3080.

And one for W1zzard...

... "the card introduces the Tri Frozr 2 cooling solution with many segment-first features." Any info on what these segment-first features might be?

Yes, MSI introduced "Zero Frozr" (passive cooling) in 2008 and then "Hybrid Frozr" (independent fan control) in 2014 with the 9xx series... but looking at the MSI site, I didn't see anything I could call a segment-first feature:

Torx Fan 4.0 - the 4.0 kinda suggests there's nothing new here... but closer inspection reveals that each pair of fan blades has an outer ring which, theoretically at least, should reduce blow-by. Does it work? Who's gonna test it? Anyone wanna cut off the ring and do a before-and-after?

Core Pipe - details heat pipe design improvements... again nothing to be excited about, but, again theoretically, the full contact of rectangular pipes should be more efficient than circular/ovoid ones with one side flattened. And as much as I prefer MSI's approach to the heat/noise balance (MSI is 9 dBA quieter / Asus is 9°C cooler)... I would like to see both at max load with the fan curves tuned to the same dBA.

Airflow Control - yeah, good idea to optimize, but that's hardly new. From the review pics and even the MSI site, it's hard to see or understand why curvy fin edges improve cooling.

Love the included support bracket, but I would rather see it attached to the shroud frame or metal backplate.


----------



## phill (Sep 29, 2020)

I think the 3090 is a big disappointment to me, and likewise, with SLI now being completely pointless, why have they even bothered to include the connector at all?

I'm sad to say that Nvidia's 30-series cards are such a disappointment for me that I think I might end up buying AMD's just so I don't have to bother with Nvidia's offering. I could quite happily not upgrade at all and buy some other hardware I'd like instead... heck, I suppose I could even buy a 20-series card if the price was crazily low enough.


----------



## Hatrix (Sep 29, 2020)

I bet that, despite its incompatibilities and driver troubles with some games, if the 3080 had SLI, two cards would be a lot better than a 3090. I also bet that internally they must have tested this and thought, _"we need to make some money with these restrictions (including 10 GB on the 3080 at release)"_. Or I might be wrong; I don't know why Nvidia stopped supporting SLI yet still added it to the *3090 only*. Also, won't motherboard vendors miss selling an extra PCI Express slot in their systems?
What are your thoughts, guys?

Also adding a benchmark of 2080 Ti SLI here, with some games gaining 10% and others 90%, like Sniper Elite 4:








NVLink Titan RTX Benchmarks vs. 2080 Ti SLI: Gaming & Power Consumption (www.gamersnexus.net)


----------



## John Naylor (Sep 29, 2020)

phill said:


> I think the 3090 is a big disappointment to me, and likewise, with SLI now being completely pointless, why have they even bothered to include the connector at all?
> 
> I'm sad to say that Nvidia's 30-series cards are such a disappointment for me that I think I might end up buying AMD's just so I don't have to bother with Nvidia's offering. I could quite happily not upgrade at all and buy some other hardware I'd like instead... heck, I suppose I could even buy a 20-series card if the price was crazily low enough.



Because, like the Titan, SLI has uses that don't involve gaming.

What are you looking for? And why a disappointment? When I walk into a restaurant, I have to make a decision between what's on the menu and what's on the specials list... aren't we better off making that decision after we have seen both?



Hatrix said:


> I bet that, despite its incompatibilities and driver troubles with some games, if the 3080 had SLI, two cards would be a lot better than a 3090, and...



This was always the case... twin 970s were the same price as a 980 and averaged 40% faster in TPU's test suite. It was the proverbial no-brainer from a consumer standpoint: you might have lost 8% in games that didn't support SLI, but you gained as much as 95% in the big AAA titles that did. The problem is, Nvidia's margins on a single 980 were greater than on two 970s. SLI at that point had Nvidia competing with itself at the high end: AMD wasn't wooing away 980 sales, the 970 was. For a while, Nvidia left SLI performance at 4K with a high scaling percentage... but they nerfed it at lower resolutions to stifle the competition with their own lower-tier cards. Now that they can do 4K with a single card, SLI is no longer compelling.
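The trade-off above can be sketched as a weighted average across a game library. The shares and scaling figures below are hypothetical, chosen only to echo the -8%/+95% numbers in this post:

```python
def expected_sli_gain(supported_share, gain_supported, loss_unsupported):
    """Expected average uplift from a second card across a game library."""
    return (supported_share * gain_supported
            + (1.0 - supported_share) * loss_unsupported)

# Hypothetical library: half the titles scale +95%, the rest lose 8%.
print(round(expected_sli_gain(0.5, 0.95, -0.08), 3))  # 0.435
```

That lands in the same ballpark as the ~40% average uplift twin 970s showed back then; the expected gain falls quickly as the share of SLI-supporting titles shrinks.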


----------



## phill (Oct 1, 2020)

John Naylor said:


> Because, like the Titan, SLI has uses that don't involve gaming.
> 
> What are you looking for? And why a disappointment? When I walk into a restaurant, I have to make a decision between what's on the menu and what's on the specials list... aren't we better off making that decision after we have seen both?


I'm posting this a little late, so apologies, but I'm looking for something with a buttload of power. I'm after two cards working in tandem to game and do other things; compute loads are always a nice touch, or even simply converting films, since that can use their CUDA cores.

I miss SLI and CrossFire; I've used both for many years, and yes, it's always a shame not to get a 100% improvement from the second card, but quite honestly I never expected to. It used to be that we'd have big cases and things to fit in them, one of which would be two big GPUs... One GPU in my case, or even two for that matter, makes the hardware look small. It's a real shame about this progression, I feel, but as always, that's just my personal preference and point of view.


----------



## wheresmycar (Oct 1, 2020)

John Naylor said:


> Because, like the Titan, SLI has uses that don't involve gaming.
> 
> What are you looking for? And why a disappointment? When I walk into a restaurant, I have to make a decision between what's on the menu and what's on the specials list... aren't we better off making that decision after we have seen both?
> 
> ...



IMO, credit to the protester. As much as NVIDIA earns the right to "overcharge" and make a larger profit (after all, it's a business), the consumer should likewise always expect loftier performance for an extraordinary asking price. Immensely overcharging for a ridiculous return on investment deserves criticism, and rightly so, to build buyer awareness. If I took my Mrs to the restaurant and offered to go "special", I'd be more than delighted if she responded with "no no no, it costs twice as much and then some, and tastes like pants". Money saved, stomach full, happy days!

I'm not even going to touch on SLI. They ripped that one out right from under our feet... a perfect example of technological advancements, and the possibilities that come with them, being locked down for corporate gain. I get it, it's business as usual, but what goes hand in hand with that is consumer denunciation and community awareness: a small stand making a small, meaningful difference for some!


----------



## lexluthermiester (Oct 4, 2020)

bug said:


> Not much to say about the card (fast, expensive, power hungry), but I don't recall ever seeing a review including 9 GPUs where AMD had only one entry. And at the bottom of the charts, at that. That's so sad...


Wait a few weeks. AMD's new hotness is incoming. Could get very interesting. AMD is giving Intel hell in the CPU arena. There is the possibility that they will give NVidia some grief as well.


----------



## bug (Oct 4, 2020)

lexluthermiester said:


> Wait a few weeks. AMD's new hotness is incoming. Could get very interesting. AMD is giving Intel hell in the CPU arena. There is the possibility that they will give NVidia some grief as well.


I've lost count of how many generations I've been hearing that song. This time I don't expect anything but a tweaked RDNA with RTRT slapped on (probably Frankenstein-style).
Wishing for AMD to prove me wrong, of course.


----------



## lexluthermiester (Oct 4, 2020)

bug said:


> I've lost count of how many generations I've been hearing that song. This time I don't expect anything but a tweaked RDNA with RTRT slapped on (probably Frankenstein-style).
> Wishing for AMD to prove me wrong, of course.


What are you talking about? It was only a few years ago that Radeons were cleaning Nvidia's clock. It's bound to happen again.


----------



## ARF (Oct 4, 2020)

Sorry, but why do we have to wait?
The consoles were unveiled many months ago, and still no word about new GPUs?


----------



## bug (Oct 4, 2020)

lexluthermiester said:


> What are you talking about? It was only a few years ago that Radeons were cleaning Nvidia's clock. It's bound to happen again.


Idk, the last time I remember AMD dominating the high end was back in the Fermi days. They've only had a couple of decent mid-rangers since.


----------



## ARF (Oct 4, 2020)

lexluthermiester said:


> What are you talking about? It was only a few years ago that Radeons were cleaning Nvidia's clock. It's bound to happen again.



This time AMD really needs to release a groundbreaking, innovative architecture, because the performance deficit at the moment is gigantic, and there are no conventional, evolutionary steps over GCN or RDNA 1 that AMD could take to, at the very least, catch up and reach performance parity.



bug said:


> Idk, last I remember AMD dominating the high end was back in Fermi days. They've only had like a couple decent mid-rangers since.



The R9 Fury X was on par with the GTX 980 Ti but was under-engineered, with only 4 GB of VRAM.


----------



## bug (Oct 4, 2020)

ARF said:


> R9 Fury X was on par with GTX 980 Ti but was under engineered with 4 GB of VRAM.


It was slightly slower. And it ate a lot more power. Hence the undervolting and other tricks users used to resort to.
I would also argue it was _over_engineered to use HBM. That probably killed it.


----------



## brown66 (Nov 20, 2020)

John Naylor said:


> a)  People who do rendering or other workstation apps and don't want to build two systems.
> b)  Peeps with Wheaties Box Syndrome... in the Olympics, the difference between gold and silver is often fractions of a second. The gold winner gets their pic on the Wheaties box and millions in endorsements; the silver winner goes home and takes a job selling cars or insurance.
> c)  The same peeps who bought a 24-core Threadripper for a gaming box because someone told them more cores is better.
> 
> ...




Sorry for my English; it's not my native language.

a) I don't render and I don't use workstation apps.
b) I don't think I have Wheaties Box Syndrome.
c) And I'm not someone who bought a 24-core Threadripper for a gaming box because someone told them more cores is better. A month ago I had a Ryzen 5 1600X, which I bought more than two years ago, and now I have an i7-10700K.

So why did I buy this MSI RTX 3090 Gaming X Trio yesterday?

Because I am a fanatical Skyrim player, and to play the game at 4K resolution loaded with 4K textures, not even the 11 GB of VRAM on the GTX 1080 Ti I own is enough, not to mention the measly 10 GB of the RTX 3080.

Theoretically the minimum would be 16 GB, but I have already tried 8K textures, which are simply fabulous, so I ended up buying a graphics card that cost €1600 (a little more than $1600) but has 24 GB of VRAM.

As you can see, in the gaming world it's not just maximum FPS at 1080p or 1440p that matters; VRAM capacity and performance at 4K also count for a lot with some players.
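To put rough numbers on that: an uncompressed texture's VRAM footprint is width x height x bytes per pixel, plus about a third more for the mipmap chain. A quick sketch (illustrative only; real games ship block-compressed textures such as BC7, which cut these figures by 4-8x, and Skyrim's actual usage depends on the mod list):

```python
def texture_vram_mib(side_px, bytes_per_pixel=4, mipmaps=True):
    """Approximate VRAM footprint of one square, uncompressed RGBA texture, in MiB."""
    size_bytes = side_px * side_px * bytes_per_pixel
    if mipmaps:
        size_bytes = size_bytes * 4 // 3  # full mip chain adds ~1/3
    return size_bytes / (1024 ** 2)

print(texture_vram_mib(4096, mipmaps=False))  # 64.0 MiB per 4K texture
print(texture_vram_mib(8192, mipmaps=False))  # 256.0 MiB per 8K texture
```

Each step from 4K to 8K quadruples the footprint, which is why a handful of uncompressed 8K textures can overwhelm a 10-11 GB card.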


----------



## lexluthermiester (Nov 20, 2020)

brown66 said:


> So why did I buy this MSI RTX 3090 Gaming X Trio yesterday?
> 
> Because I am a fanatical Skyrim player, and to play the game at 4K resolution loaded with 4K textures, not even the 11 GB of VRAM on the GTX 1080 Ti I own is enough, not to mention the measly 10 GB of the RTX 3080.
> 
> Theoretically the minimum would be 16 GB, but I have already tried 8K textures, which are simply fabulous, so I ended up buying a graphics card that cost €1600 (a little more than $1600) but has 24 GB of VRAM.


That's as good a reason as any and the only one you need to justify the purchase for yourself. You will enjoy your new gaming experience with that card.


----------



## W1zzard (Nov 20, 2020)

lexluthermiester said:


> That's as good a reason as any and the only one you need to justify the purchase for yourself. You will enjoy your new gaming experience with that card.


There's so much discussion and drama on the forums recently about hardware, and this is really the only thing that matters.


----------



## owen10578 (Dec 22, 2020)

Hey W1zzard, I forgot to mention this back when the review went live, but the DrMOS you listed is wrong. It uses OnSemi NCP302145 for vCore, not OnSemi NCP302150.


----------



## LeylaBay (Dec 23, 2020)

brown66 said:


> Sorry for my English; it's not my native language.
> 
> a) I don't render and I don't use workstation apps.
> b) I don't think I have Wheaties Box Syndrome.
> ...


Thank you! Because I just spent $2000 buying this beast from eBay as my 12-year-old son's Christmas present, just so I don't have to hear about graphics cards anymore! At least now I know he might use over 10 GB of memory one day and will appreciate the sacrifice of my new YSL boots.


----------

