
MSI GeForce RTX 3080 Gaming X Trio

It seems the MSI Trio has a better chip and a higher boost clock than the ASUS TUF, but its cooling and power delivery are worse than the TUF's. So which one should I go for? The card with better overclocking potential, or the one with better power delivery and cooling?
 
So sad to see that actual RT performance has not improved much. It's just the sheer grunt of the card carrying over to better performance with RT enabled; the RT cores themselves aren't doing much better. They have to rely on DLSS to get good performance with RT, and that limits it to a handful of games. One can only hope Hopper does a hell of a lot better on RT and doesn't focus too much on rasterization. 120 fps at 4K ultra is a good target, but at least a 100% increase in RT performance is needed.
 
I didn't see this coming: TUF is just killing the MSI Gaming X for $30 less. This MSI model is a massive disappointment. :/

It seems the MSI Trio has a better chip and a higher boost clock than the ASUS TUF, but its cooling and power delivery are worse than the TUF's. So which one should I go for? The card with better overclocking potential, or the one with better power delivery and cooling?
Well, the MSI is about 1% faster than the TUF, which is nothing. In return, the TUF runs 13 degrees cooler while being only 3 dBA louder, and there is the quiet BIOS mode, which is a bit quieter than the MSI. I don't know if that means a performance loss. Anyway, I would definitely go with the TUF, as it is $30 less.
 
Okay, can someone explain this to me?

Power consumption of the Gaming X might look scary at first: triple 8-pin power inputs, up to 425 W in Furmark, and 315 W in typical gaming. Actually, if we take the performance gain over FE into account, there's no loss in power efficiency—4% faster, 4% more power. Good. Custom designs usually compromise on some efficiency to achieve higher performance. While three 8-pins are a little bit more complicated to use than dual 8-pins, it's a reasonable choice, as it gave MSI enough power headroom to achieve meaningful performance gains. That's why I find it surprising that MSI has set the manual power limit adjustment range for overclocking to just 350 W, which is lower than the 370 W on the Founders Edition. Also, considering 3x 8-pin + slot power = 525 W power capability, a 350 W limit seems like a waste of that third power connector.
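As a quick sanity check on the numbers quoted above, here's a minimal sketch (assuming the standard PCIe ratings of 150 W per 8-pin connector and 75 W from the x16 slot; the 4% / 350 W / 525 W figures come straight from the review text):

```python
# Back-of-the-envelope check of the review's power-budget numbers.
# Assumes the standard PCIe ratings: 150 W per 8-pin connector, 75 W from the x16 slot.

EIGHT_PIN_W = 150
SLOT_W = 75

def board_power_budget(num_8pin: int) -> int:
    """Theoretical board power available from the connectors plus the slot."""
    return num_8pin * EIGHT_PIN_W + SLOT_W

budget = board_power_budget(3)
print(budget)          # 525 W, matching the "3x 8-pin + slot power" figure
print(budget - 350)    # 175 W of that capability left unused by the 350 W limit

# Perf-per-watt vs. the Founders Edition: "4% faster, 4% more power"
# means efficiency is unchanged (ratio of 1.0).
print(1.04 / 1.04)
```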

So can you no longer adjust the TDP limit? On my 980 Ti I can set 130% TDP. I was hoping triple 8-pin would let you just maintain max clocks with the added juice. I get that overclocking is fairly dead now, but with these cards being so badly power limited, I was really hoping the 3x 8-pin would at least let you hold the maximum clock.

Am I just SOL now? Is that something a BIOS flash could fix, or might it get patched down the road?

This looked like a huge feature to me, and I am confused.
 
Well, there are always hard mods for those who want the last 0.001% of performance at the expense of 10% (or more) extra power.
 

How are the R7 and i7 pulling the same wattage? These look like cookie-cutter charts; Intel consumes much more power at full multi-threaded tilt. Zen 2 can easily get away with 100 W less.

Lack of sleep :toast: Keep up the good work, though.

These cards are quite nice, but if the 3080 consumes almost 380 watts when gaming, how much will the 3090 draw? And will it have double the performance to justify the more-than-double MSRP?

These are peak draws; sustained draws are in the low 300s, and every quality PSU will handle slight peaks like these. The 3090 will supposedly draw about 50 W more on average than the 3080. And no, it will not have double the performance, because that's not how things work in life. You pay a premium for premium goods.

You can probably use a high-quality 550 W Gold/Platinum PSU with the proper mindfulness and BIOS settings if you had to (don't buy a 10900K, or just lock it to 75 W; it won't affect gaming performance). 600 W+ is preferred.
My results are with no power limits. I'm giving the worst-case scenario, just in case you want to render on the side while gaming... or do a PSU test.
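For anyone trying to size a PSU around these figures, here's a rough sketch; the GPU and CPU numbers are taken from the posts above, while the ~50 W allowance for the rest of the system is an assumption, not a measurement:

```python
# Rough PSU headroom estimate from the numbers in this thread.
# gpu_w and cpu_w come from the posts above; rest_w (drives, fans, board) is an assumed ~50 W.

def system_draw(gpu_w: float, cpu_w: float, rest_w: float = 50.0) -> float:
    """Estimated total DC load on the PSU."""
    return gpu_w + cpu_w + rest_w

PSU_W = 550  # the "high-quality 550 W Gold/Platinum" case discussed above

sustained = system_draw(gpu_w=320, cpu_w=75)   # low-300s GPU draw, CPU locked to 75 W
peak = system_draw(gpu_w=425, cpu_w=75)        # Furmark-style GPU peak

print(sustained, PSU_W - sustained)  # ~445 W load, ~105 W of headroom
print(peak, PSU_W - peak)            # ~550 W load, essentially no headroom -> why 600 W+ is preferred
```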

Okay, can someone explain this to me?

So can you no longer adjust the TDP limit? On my 980 Ti I can set 130% TDP. I was hoping triple 8-pin would let you just maintain max clocks with the added juice. I get that overclocking is fairly dead now, but with these cards being so badly power limited, I was really hoping the 3x 8-pin would at least let you hold the maximum clock.

Am I just SOL now? Is that something a BIOS flash could fix, or might it get patched down the road?

This looked like a huge feature to me, and I am confused.
The power limits section should answer your question. The ASUS board does 109% TDP.
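For anyone wondering what a percentage power limit works out to in watts, here's a minimal sketch; the 320 W base is the RTX 3080 reference TGP, and an individual board's default limit may differ:

```python
# Mapping a vendor's power-limit slider (in percent) to watts.
# 320 W is the RTX 3080 reference TGP; a given board's default limit may differ.

def power_limit_watts(base_w: float, slider_pct: float) -> float:
    """Board power allowed at a given power-limit slider setting."""
    return base_w * slider_pct / 100.0

print(power_limit_watts(320, 100))  # 320 W at stock
print(power_limit_watts(320, 109))  # ~349 W at a 109% limit
print(power_limit_watts(320, 130))  # ~416 W -- the kind of range the 980 Ti era allowed
```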
 
The power limits section should answer your question. The ASUS board does 109% TDP.
My question was: why is it so low? Is it moddable, or will it get a patch so you can increase it? Or did NVIDIA force AIBs to cap it? It is pointlessly low for a 3x 8-pin card. It is obvious this was rushed to market; most brands don't even have boards out yet. EVGA is supposedly going to have a 420 W limit.
 
This deliberate supply shortage is very disappointing... I hope AMD will not do the same.
 
My question was: why is it so low? Is it moddable, or will it get a patch so you can increase it? Or did NVIDIA force AIBs to cap it? It is pointlessly low for a 3x 8-pin card. It is obvious this was rushed to market; most brands don't even have boards out yet. EVGA is supposedly going to have a 420 W limit.
I really don't know. My guess is that 350 W is a lot of heat to dissipate. Also, many of these cards are peaking around 400 W, so they need all of those power connectors already.
 
Question: the 3080 is obviously positioned as a 4K gaming card. Is 10 GB of memory enough for 4K gaming at ultra settings?
We are moving to 4K gaming the way we moved to 1080p. Any card somewhat faster than a 2080S (roughly next-gen console level) is a 4K card.
 
The cooler on this seems rather cheap and nasty compared to the Turing Gaming X Trio models. Very unimpressive results relative to the TUF.
 
Appreciate the review. I have this one on "pre-order", which is another way of saying "backorder, but we have your order", lol. It gives me some insight into what I'll eventually be receiving. My last Gaming X was the 1080, and it was good. Though 76°C under load doesn't seem much improved over the FE. Finally, 1440p 144 Hz gaming is a reality without turning down settings.
 
Nice review. Personally, I'm waiting for the MSI Ventus review; where I live it's significantly cheaper, and if it's reasonably quiet it's probably a much better buy. I'll also wait for all this overhyped rush stupidity to end. I mean, who is thick enough to wait up at 3 in the morning, hitting F5 like a madman, trying to buy a consumer product, and raging like a hyperactive child if he can't get what he wants? Have some dignity; don't be like "fruit company" fanboys.

Hype and availability aside, this seems to be a good GPU, even if it's obviously stretched to the max from the factory. The tiny overclocking headroom makes OC an exercise in futility, and the power draw is what I personally find annoying. I really wanted to use the Corsair SF600 Platinum, but now I think it might be too close for comfort, even with a 9700K running at 4.8 GHz.
 
This power draw is so horrendous it's approaching the wattage of my little office eco heater, which is 500 W. o_O

I advise people to wait for the 3080 20 GB, but an extra 10 GB will add another 30-50 W. Maybe the 16 GB 3070S/Ti will be the first good Ampere card.
 
I am baffled by the power limit too!
The EVGA FTW has a 400 W max but doesn't really OC better, according to Gamers Nexus.
 
It will be interesting to see how the power draw is handled in the future.
In the past, a smaller node used to solve this, but we don't have many smaller nodes left, and definitely none on the horizon. 5 nm is ramping up, but it will be gobbled up by mobile SoCs and the like for at least the next couple of years.
 
With volume production planned before the end of 2020, TSMC's N6 technology provides customers with additional cost-effective benefits while extending the industry-leading power and performance of the 7 nm family to a broad array of applications, ranging from high- to mid-end mobile, consumer applications, AI, networking, 5G infrastructure, GPUs, and high-performance computing.

Give it another 10-12 months.
 
How would a 3090, with only one or two more GPCs than the 3080, ever get double the performance? What fairy tale is this? And when did a card with double the performance ever cost just double at the top end?

The 3090 will likely draw around 400-420 W on the AIB cards, I reckon. The TDP gap is only 30 W. Still a lot, I do agree on that.

Looking at these cards, it seems like this node was pushed to, and maybe a little bit over, the limit. Not a pretty sight, IMO. Pascal and Turing already gained some substantial watts when overclocked, but this... man.
I know what you mean, but the way Jensen introduced and marketed them in NVIDIA's launch video makes it seem like this card will be the cat's meow. It is funny, though, that I remember Jensen making fun of Vega needing 2x 8-pin. I fully agree with the 400 to 420 W, but I can see a recommended 1000 W PSU for this just to ensure it remains in the right price bracket.
 
I know what you mean, but the way Jensen introduced and marketed them in NVIDIA's launch video makes it seem like this card will be the cat's meow. It is funny, though, that I remember Jensen making fun of Vega needing 2x 8-pin. I fully agree with the 400 to 420 W, but I can see a recommended 1000 W PSU for this just to ensure it remains in the right price bracket.

Yeah, it's funny how the same crap about power draw applies to different companies at different points in time :D
If we have to believe NVIDIA, everything is the cat's meow though... Huang has that same tone of voice with every product he presents, going all 'Jobs' on it.
 
Yeah, it's funny how the same crap about power draw applies to different companies at different points in time :D
If we have to believe NVIDIA, everything is the cat's meow though... Huang has that same tone of voice with every product he presents, going all 'Jobs' on it.
I think it applies just the same: 350 W+ is a crapload for a video card. Vega also had the privilege of offering pretty poor perf/W on top of that.

3090 will be something else though :shudder:
 