
MSI GeForce RTX 3090 Gaming X Trio

W1zzard

The MSI GeForce RTX 3090 Gaming X is the quietest RTX 3090 we've tested today. Acoustics are pretty impressive actually, considering the performance and heat output. MSI has also overclocked their card and bumped its power limit, which yields a very decent performance improvement.

 
Not much to say about the card (fast, expensive, power hungry), but I don't recall ever seeing a review including 9 GPUs where AMD only had one entry. And at the bottom of the charts, no less. That's so sad...
 
Awesome review! I was in the market for a space heater for my basement office!
 
Well, I have encountered some differences in FPS between reviews on some game titles, and on others there were major differences in FPS. I wonder why the differences when using similar test platforms and the same cards? For example, the other review was on Guru3D. Just asking a question, not any form of criticism; it is a fantastic review of yours for this card, as always!
 

Different reviewers use different gaming 'benchmarks'. W1zzard uses an in-game test of actual play, timed to ensure the same scene is rendered. It's not a pre-determined 'flyby' benchmark, as other sites often use. That explains much of the site-by-site variation.
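For anyone curious about the arithmetic behind those headline numbers, here is a minimal, purely illustrative sketch (it assumes a PresentMon-style CSV with a MsBetweenPresents column and a hypothetical capture.csv file; it is not TPU's actual tooling). The point it makes: the average and the lows are entirely a function of which frames, i.e. which scene, got captured.

```python
# Illustrative only: turn a per-frame capture into the usual headline numbers.
# Assumes a PresentMon-style CSV with a "MsBetweenPresents" column (frame time in ms).
import csv
import statistics

def summarize(path, column="MsBetweenPresents"):
    with open(path, newline="") as f:
        frametimes_ms = [float(row[column]) for row in csv.DictReader(f)]
    # Average FPS = frames rendered / total time, not the average of per-frame FPS values.
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    # "1% low" FPS = FPS implied by the slowest 1% of frames.
    worst = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99):]
    low_1pct_fps = 1000.0 / statistics.mean(worst)
    return avg_fps, low_1pct_fps

avg, low = summarize("capture.csv")  # hypothetical capture of one timed gameplay run
print(f"avg: {avg:.1f} fps  |  1% low: {low:.1f} fps")
```

Run the same script over a flyby capture and a timed gameplay capture of the same game and you will usually get different numbers, which accounts for much of the site-to-site gap.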
 
OK. So now who will pay 100% more for only 15-20% more performance?
Nobody, because it won't be 100% more expensive; the $700 price of the 3080 won't hold, and it will be much more expensive.
 
You'd be surprised at the money people have to burn, or their lack of sense about what to do with it.
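To put rough numbers on that "100% more for 15-20% more" point, here is a back-of-the-envelope sketch. The launch MSRPs ($699 / $1,499) and the roughly 15% 4K average from the review are assumptions; street prices will obviously differ.

```python
# Rough cost-per-performance math, purely illustrative.
cards = {
    "RTX 3080": {"price_usd": 699,  "relative_4k_perf": 1.00},  # baseline
    "RTX 3090": {"price_usd": 1499, "relative_4k_perf": 1.15},  # ~15% faster at 4K per the review
}
for name, c in cards.items():
    cost_per_perf = c["price_usd"] / c["relative_4k_perf"]
    print(f"{name}: ${cost_per_perf:.0f} per unit of 3080-level performance")
# -> ~$699 vs ~$1303: roughly 1.9x the price for each unit of performance.
```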
 
Great review! 50-70 FPS more on average than a 2080 Ti, and over two generations, more than 100 FPS more on average than the 1080 Ti. Obviously totally impressive performance! :clap:

It's just out of reach for nearly everyone. Hard pass, thank you.
 

Yeah, I'm sticking with my 2080 Ti, but if I had to upgrade, it'd be the 3080. I mean, the price difference between it and the 3090 could buy me a new CPU and mobo.
 
I'm just going to say it. Although unlikely, I hope AMD whoops Nvidia's behind with something of equal or marginally lesser measure at a far more pleasant asking price. The Green team needs to be taught a lesson. An average 15% performance advantage at 4K over a 3080 and a $1,500-$1,700 asking price is bonkers. No one should suggest ONLY "workstation use", as Nvidia is also marketing the card to GAMERS too. If I had a million I wouldn't buy one - PERIOD!

Hehe... or would I?
 
I have a 2080 Ti and I want more frames, as I play at 5120x1440, but if I ever bought a new graphics card it would have to be better than a 3080. Knowing the 3090 is double the price but only gives about 10-15 more FPS (over the 3080), whatever comes in between, like a 3080 Ti or Super at $1,000-1,300, won't be worth it, so I'm skipping this gen for now.
 
OK. So now who will pay 100% more for only 15-20% more performance?
Nobody, because it won't be 100% more expensive; the $700 price of the 3080 won't hold, and it will be much more expensive.

I'm planning on getting the Strix OC version with an EK block and backplate, lol. So almost 3 times the price of the 3080. lol
 
Fantastic review. Indeed, one has to have very deep pockets, but many will not hesitate to snap up that price-to-performance ratio (3080 versus 3090) in a heartbeat. Not to forget the persistent call of "bragging rights" for many in our world of tech; the ongoing, very profitable RGB and AIO craze only supports the showing-off madness even further. Although, remember, NVIDIA needs all the money it can muster after just grabbing ARM for a cool $40 billion. With the RTX 3080 currently costing around $700 and being touted as a 'go-to' kind of GPU for enthusiasts, I may be able to get my hands on a used GTX 1080 for $125 or less in early 2021. With that, my wife can finally get her new refrigerator and I can play Fallout 4 on max settings for the very first time. Being relegated by my spouse to be a basement office dweller, there might just be enough $$$ left for a space heater.
 
The Titans were often the solution for gamers who were also doing work typically performed by workstation cards .... haven't seen much written about what Nvidia intended for the 3090 in this respect. Will there be any follow-up testing with video editing, rendering and such, or will the 3090 just be for gamers who don't have to care about cost, or are willing to take on the cost because, for an extra $4800, they get to say "mine's bigger"?

Also wondering why .... With the 3080, MSI was the only AIB card to outperform the FE, but to do so, they dropped their usual spot at the top of the leaderboard for sound and temps, leaving Asus to take the top spot for lowest temps. And here .... MSI finishes 2nd to Asus on OC performance, and Asus has to run way up at 42 dBA to grab that lead.

Asus Normal Mode = 68°C / 42 dBA (2175 RPM)
Asus Quiet Mode = 75°C / 34 dBA (1615 RPM)

MSI Normal Mode = 77°C / 33 dBA (1543 RPM)
MSI Low Temp Mode = 67°C / 37 dBA (2016 RPM)

Odd that the fan curves on the 3080 versus 3090 models had different priorities.
 
Also wondering why .... With the 3080, MSI was the only AIB card to outperform the FE, but to do so, they dropped their usual spot at the top of the leaderboard for sound and temps, leaving Asus to take the top spot for lowest temps. And here .... MSI finishes 2nd to Asus on OC performance, and Asus has to run way up at 42 dBA to grab that lead.
Keep in mind TPU reviewed the TUF version of the 3080, not the Strix.
It would seem FE has raised the bar this time around, but that's of little consequence if AIBs don't plan to top that.
 
I'm just going to say it. Although unlikely, I hope AMD whoops Nvidia's behind with something of equal or marginally lesser measure at a far more pleasant asking price. The Green team needs to be taught a lesson. An average 15% performance advantage at 4K over a 3080 and a $1,500-$1,700 asking price is bonkers. No one should suggest ONLY "workstation use", as Nvidia is also marketing the card to GAMERS too. If I had a million I wouldn't buy one - PERIOD!

Hehe... or would I?

You'd buy two to figure out how glorious SLI support is going to be going forward!
 
I'm just going to say it. Although unlikely, I hope AMD whoops Nvidia's behind with something of equal or marginally lesser measure at a far more pleasant asking price. The Green team needs to be taught a lesson. An average 15% performance advantage at 4K over a 3080 and a $1,500-$1,700 asking price is bonkers. No one should suggest ONLY "workstation use", as Nvidia is also marketing the card to GAMERS too. If I had a million I wouldn't buy one - PERIOD!

Hehe... or would I?
Who's suggesting "workstation use"? Sure you can run AutoCAD on these, but they don't run on the same drivers as Quadro cards. There's no mention of the term on the official product page either.
 
OK. So now who will pay 100% more for only 15-20% more performance?
Nobody, because it won't be 100% more expensive; the $700 price of the 3080 won't hold, and it will be much more expensive.

a) People who do rendering or other workstation apps and don't want to build two systems.
b) Peeps with Wheaties Box Syndrome .... in the Olympics, the difference between gold and silver is often fractions of a second. The gold winner gets their pic on the Wheaties box and millions in endorsements. The silver winner goes home and takes a job selling cars or insurance.
c) The same peeps who bought a 24-core Threadripper for a gaming box because someone told them more cores is better.


I'm just going to say it. Although unlikely, I hope AMD whoops Nvidia's behind with something of equal or marginally lesser measure at a far more pleasant asking price. The Green team needs to be taught a lesson. An average 15% performance advantage at 4K over a 3080 and a $1,500-$1,700 asking price is bonkers. No one should suggest ONLY "workstation use", as Nvidia is also marketing the card to GAMERS too. If I had a million I wouldn't buy one - PERIOD!

The fact remains ... it is a very capable workstation card, regardless of the label one puts on it. People bought a Titan when they needed a box to do rendering that would otherwise necessitate a Quadro ... but they also wanted to enjoy gaming, for which the Quadro was ill suited. The Titan offered a great compromise for far less cost than building two systems. People who bought a Titan for this reason will still buy the 3090, which is 3 to 13 times faster in rendering.

But there's always the subset of folks who weigh return on investment by factors other than performance gained versus cash invested. For some, the return is the perceived increase in social standing that comes with having the fastest or being the first to have it. These are the folks standing in line to buy the new phone ... because some review said it's faster ... I don't talk that fast.

The dude in marketing who came up with the idea of dropping the Titan moniker and using 3090 should get a fat bonus .... the subset of gamers, with money to burn, will jump at this.


Who's suggesting "workstation use"? Sure you can run AutoCAD on these, but they don't run on the same drivers as Quadro cards. There's no mention of the term on the official product page either.

Jensen, for one, in the release presentation. But anyone who has had a Titan will want a 3090 ... I own an engineering consulting firm, build all of our PCs, and have been doing so for this office and others for almost 30 years. The fact is AutoCAD runs great on GTX / RTX ... it has always been the better option .... in 2D / 3D AutoCAD, GeForce cards have consistently performed better than or equal to the most expensive Quadro cards costing up to $4,000, allowing the realization of huge cost savings. The area where Quadro excels is rendering, which makes it well suited for other professional products like SolidWorks, and some applications will not even run on GTX / RTX. An architectural / engineering office will rarely ever be called upon to create a rendering of their 2D / 3D CAD drawings .... most, when needed, send it out. Those that do might have one (1) rendering box for every 10 or 20 AutoCAD boxes. Those that need to render will want the 3090 because it is up to 13 times faster than the 3080.

And one for W1zzard .....

... " the card introduces the Tri Frozr 2 cooling solution with many segment-first features." ... any info on what these segment first features might be ?

Yes, MSI introduced "Zero Frozr" (passive cooling) in 2008 and then "Hybrid Frozr" (independent fan control) in 2014 with the 9xx series .... but looking at the MSI site, I didn't see anything I could call a "segment-first" feature:

Torx Fan 4.0 - the 4.0 kinda suggests there's nothing new here ... but closer inspection reveals that each pair of fan blades has an outer ring which, theoretically at least, should reduce blow-by. Does it work? Who's gonna test it? No one wanna cut off the ring and do a before-and-after?

Core Pipe - details heat-pipe design improvements ... again, nothing to be excited about but, again theoretically, the full contact of rectangular pipes should be more efficient than circular / ovoid-shaped ones with one side flattened. And as much as I prefer MSI's approach to the heat / noise balance (MSI is 9 dBA quieter / Asus is 9°C cooler) ... I would like to see both at max load with the fan curves tuned to the same dBA.

Airflow Control - Yeah, good idea to optimize, but that's hardly new. From the review pics, and even the MSI site, it's hard to see or understand why curvy fin edges improve cooling.

Love the included support bracket, but would rather see it attached to the shroud frame or metal backplate.
 
I think the 3090 is a big disappointment to me, and likewise, with SLI now being completely pointless, why have they even bothered to include the connector at all?

I'm sad to say that Nvidia's 30 series of cards is such a disappointment for me that I think I might end up buying AMD just so I don't have to bother with Nvidia's offering.. I could quite happily not upgrade at all and buy some other hardware I'd like instead... Heck, I suppose I could even buy a 20-series card if the price was crazily low enough.... :(
 
I bet that, despite its incompatibilities and the driver troubles with some games, if the 3080 had SLI, two cards would be a lot better than a 3090, and I also bet that internally they must have tested this and thought, "we need to make some money with these restrictions (including 10 GB on the 3080 at release)". Or I might be wrong, and I don't know why Nvidia stopped with SLI while still adding it to the 3090 only. Also, won't motherboard vendors care about the extra PCI Express slots in their systems going unused?
What are your thoughts, guys?

Also adding a benchmark for 2080 Ti SLI here, with some games gaining 10% performance from the second card and others 90%, like Sniper Elite 4.
 
I think the 3090 is a big disappointment to me, and likewise, with SLI now being completely pointless, why have they even bothered to include the connector at all?

I'm sad to say that Nvidia's 30 series of cards is such a disappointment for me that I think I might end up buying AMD just so I don't have to bother with Nvidia's offering.. I could quite happily not upgrade at all and buy some other hardware I'd like instead... Heck, I suppose I could even buy a 20-series card if the price was crazily low enough.... :(

Because, like the Titan, SLI has uses that don't involve gaming.

What are you looking for? .. and why a disappointment? When I walk into a restaurant, I have to make a decision between what's on the menu and what's on the Specials list ... aren't we better off making that decision after we have seen both?

I bet that, despite its incompatibilities and the driver troubles with some games, if the 3080 had SLI, two cards would be a lot better than a 3090, and .....

This was always the case .... Twin 970s were the same price as a 980 and averaged 40% faster in TPU's test suite. It was the proverbial no-brainer from a consumer standpoint. You may have lost 8% in games that didn't support SLI, but you gained as much as 95% in the big AAA titles that did. The problem is, Nvidia's margins on a single 980 were greater than on two 970s. SLI at this point has Nvidia competing with itself at the high end ... AMD wasn't wooing away 980 sales, the 970 was. For a while, Nvidia left SLI performance at 4K with high scaling percentages ... but they nerfed it at lower resolutions to stifle the competition with their own lower-tier cards. Now, at this point, they can do 4K with a single card, so having SLI is not compelling.
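To illustrate how a per-game mix like that turns into a single "average uplift" figure, here is a tiny sketch. The scaling values are hypothetical placeholders in the spirit of the numbers above (a small regression where SLI doesn't work, near-doubling in well-supported titles), not actual TPU data.

```python
# Illustrative only: averaging per-game dual-GPU scaling into one headline uplift.
# These scaling factors are made-up placeholders, not measured results.
scaling_vs_single_card = {
    "AAA title, good SLI profile":   1.95,
    "AAA title, decent SLI profile": 1.70,
    "title with partial scaling":    1.35,
    "title with no SLI support":     0.92,  # slight regression
}
avg_uplift = sum(scaling_vs_single_card.values()) / len(scaling_vs_single_card)
print(f"average uplift of two cards over one: {avg_uplift:.2f}x")
# A few strong scalers can carry the average even when some games regress,
# which is how "40% faster on average" and "lost 8% in some games" can both be true.
```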
 
Because, like the Titan, SLI has uses that don't involve gaming.

What are you looking for? .. and why a disappointment? When I walk into a restaurant, I have to make a decision between what's on the menu and what's on the Specials list ... aren't we better off making that decision after we have seen both?
I'm posting this a little late, so apologies there, but I'm looking for something that has a buttload of power. I'm after two cards working in tandem with each other to game and do other things; compute loads are always a nice touch, or even simply converting films, as they can use the CUDA cores in them...

I miss SLI and CrossFire; I've used both for many years, and yes, it's always a shame to not get a 100% improvement from the second card, but quite honestly I never expected to.. It used to be that we'd have big cases and we'd need to fit things in them, one of those things being two big GPUs... One GPU in my case, or even two for that matter, makes the hardware look small.. It's a real shame, the way things have progressed, I feel, but as always, just my personal preference and point of view :)
 
Because, like the Titan, SLI has uses that don't involve gaming.

What are you looking for? .. and why a disappointment? When I walk into a restaurant, I have to make a decision between what's on the menu and what's on the Specials list ... aren't we better off making that decision after we have seen both?


This was always the case .... Twin 970s were the same price as a 980 and averaged 40% faster in TPU's test suite. It was the proverbial no-brainer from a consumer standpoint. You may have lost 8% in games that didn't support SLI, but you gained as much as 95% in the big AAA titles that did. The problem is, Nvidia's margins on a single 980 were greater than on two 970s. SLI at this point has Nvidia competing with itself at the high end ... AMD wasn't wooing away 980 sales, the 970 was. For a while, Nvidia left SLI performance at 4K with high scaling percentages ... but they nerfed it at lower resolutions to stifle the competition with their own lower-tier cards. Now, at this point, they can do 4K with a single card, so having SLI is not compelling.

IMO, credit to the protestor. As much as NVIDIA earns the right to "overcharge" and make a larger profit (after all, it's a business), likewise the consumer should always expect loftier performance achievements for an extraordinary asking price. Immensely overcharging for a ridiculous return on investment deserves criticism, and rightly so, to build buyer awareness. If I took my Mrs to the restaurant and offered to go "special", I'd be more than delighted if she responded with "no no no, it costs twice as much and then some on top, and tastes like pants". Money saved, stomach full, happy days!

I'm not even going to touch on SLI. They ripped that one out right from under our feet.. a perfect example of technological advancements, and the possibilities that come with them, being locked down for corporate gain. I get it, it's business as usual, but what goes hand in hand with that is consumer denunciation and community awareness - a small stand for a small, meaningful difference for some!
 