
MSI R9 390X Gaming 8 GB

As W1zzard said: 8 GB of VRAM provides no benefit; the card isn't powerful enough to make use of the full 8 GB (just like the 8 GB version of the R9 290X).

And it's the same story with the 12 GB Titan X.

I think they do this just to sell the product a little better (for no real reason).
What about multi-GPU setups? What about memory-intensive hobby renders?

The only downside is cost, and it's the same with DRAM: why not more?

My 64 GB is way overkill, even for my needs, but I'm glad it's there; I never have to worry about a potential bottleneck.
 

It's not really the same. Unless the GPU has enough power to make use of the extra memory, it isn't really doing much. If there is more than one GPU in a configuration, however, the extra memory can help. But most of the time, GPUs run out of raw performance before memory capacity becomes an issue.
 
It seems like AMD has just made a repackaged 7970 yet again. Most people would say I'm an AMD fanboy, but now it looks like they closed the development department to save money so they can squeeze the last bit of cash out of a sinking ship. I don't think AMD has done any real new development for several years. On the CPU front, I believe every CPU is built from old (pre-2012) development work; the A-series and the low-power variants are derivatives of old CPU and GPU designs put on the same chip, and there hasn't been a "high-end" CPU from AMD since 2012 (the 8370 and 95** are just clocked higher).

There is a small chance that the heterogeneous computing "platform" is something the development department (if one exists) has been working on. There may be 3-4 guys working in a shed who come up with new AMD products; that's what it feels like to me. For the past 15 years I have only used AMD/ATI CPUs and GPUs in the belief that AMD would keep the flame up against Intel, but no, now they're just a cash cow for the management team (CEO etc.) and the owners; at least that's what it seems like to me.

Unless they come up with something new, AMD has lost me as a customer. That doesn't mean anything on its own, but I seriously doubt I'm the only one thinking like this. These are harsh words, and I hope I'm wrong.
 
I feel like they just shouldn't have released it. This isn't the card people have been waiting for, and it just makes a bad impression for the brand. They should just focus on Fury.
 
It seems the only R9 300-series cards not gulping juice are the HIS models (haven't seen a Gigabyte review yet).

TECHSPOT - HIS IceQ X2 OC Radeon R9 390X, R9 390 & R9 380 Review

(Power consumption charts from the TechSpot review: Power_01.png, Power_02.png, Power_03.png)
 
No, otherwise temps would be high but not power draw.

I know that Tom Logan at OC3D got the same card and found out that other reviewers before him had taken the card apart to take pictures, then put it back together without applying new thermal paste. That meant around a 13-14 °C temperature difference (adjusted for room temp); his card ran around the temps you're reporting beforehand, then around 72 °C after he applied thermal paste himself.

If you still have your sample, could you please check for that? ^^
 
Cool review, but I find this a bit odd and counter to some other reviews. I almost wonder if there really is something wrong with this card, as those power consumption figures seem extremely high. If that were the case for the 290X/390X, my machine shouldn't be able to function with a 1125 MHz overclock on my three 290Xs... (FYI, not doubting the review, just wondering about the card itself.)

Guru3D reported lower power consumption on their MSI 390X, and TechSpot's HIS 390X (as shown by @Xzibit), while not overclocked as far, consumed less than (or right around) a typical 290X. This is quite odd, and I wonder if samples really vary that much or if this card just has some problems...
 
I'm not saying that W1zzard's sample can't be faulty, but that Tom Logan post above...... Allow me to translate it for myself:
1. Tom Logan at OC3D can properly assemble a disassembled card (on the second try!), but the other reviewers (who have also assembled thousands of cards in their lives) can't.
2. He also has some magic powers and found that out remotely, without being next to those reviewers...

Fascinating stuff.
/me heads to OC3D to read that review.

Also, we are testing real card-only power consumption while many other sites test system power; this could also be a factor.
Sorry if the question is stupid, but isn't it somehow possible that your setup gets less accurate at higher power draws? For example, I could not achieve the insane numbers you got in your 980 G1 review, and I pushed the card much further. I understand it could just come down to different samples, of course; just asking :toast:
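For a feel of how much the card-only vs. at-the-wall methodology difference alone can move the numbers, here's a rough sketch (Python; the 120 W rest-of-system load and 87% PSU efficiency are assumptions for illustration, not from any review):

```python
# Rough sketch: card-only vs. at-the-wall power measurement.
# The 120 W rest-of-system load and 87% PSU efficiency are assumed.

def wall_power(card_w, rest_of_system_w=120.0, psu_efficiency=0.87):
    """AC power a wall meter would report for a given card-only draw."""
    return (card_w + rest_of_system_w) / psu_efficiency

idle, load = 20.0, 350.0                          # hypothetical card-only draws
delta_card = load - idle                          # card-only method: 330 W
delta_wall = wall_power(load) - wall_power(idle)  # at-the-wall method

print(f"card-only delta:   {delta_card:.0f} W")   # 330 W
print(f"at-the-wall delta: {delta_wall:.0f} W")   # ~379 W
```

The same 330 W card-only delta shows up as roughly 380 W at the wall, and the gap widens further if the PSU's efficiency sags at high load.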
 

Watch the OC3D video, that's where he talks about it :)
 
Yeah, I did, and I understand now: his sample came from another reviewer who did not reapply the paste after he was done.
I still doubt that W1zzard or any other reviewer would fail to notice such an obvious thing, though.
 
It's not faulty. Each and every GPU core gets tested to determine what it will become; some have hardware failures and become a 980 Ti instead of a Titan, or a 390 instead of a 390X. Sometimes that failure is core voltage leakage, and to understand what this is and why it happens, you have to understand how a die is manufactured.

Core voltage isn't just applied at one magic spot on the silicon; instead it gets pushed through multiple traces so the voltage is stable across ALL the circuits on the chip. Why, you ask? Due to the size of the manufacturing process, the traces (copper wires) in the die are TINY, and each may only be capable of carrying a tenth of the amperage at the rated voltage.

Next, you have to understand that silicon is unlike copper in that it becomes MORE conductive as it heats up. This compounds the problem: below "the bend of the knee", voltage input correlates closely with achievable frequency, but past the bend the scaling becomes exponentially less efficient. Power is lost as heat, which causes more leakage, which in turn causes more heat, and the voltage droops; without any limit, the voltage controller would kill the chip.

So W1zz probably has a bottom-of-the-barrel sample, luck of the draw, that needs more voltage at rated speeds, which correlates directly with watts drawn and higher heat output. Overclocking makes the problem worse, as it creates more heat and requires more voltage... which... well, you get the picture.
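To make that feedback loop concrete, here's a toy model (Python; every coefficient is invented purely for illustration, real silicon is far messier):

```python
# Toy model of the leakage/heat feedback described above. All numbers
# are made up for illustration only.

def settle(dynamic_w, leak_base_w, ambient=25.0, degc_per_watt=0.15,
           leak_growth_per_degc=0.012, iters=100):
    """Iterate the loop: power heats the die, heat raises leakage,
    leakage adds power, until the total settles."""
    total = dynamic_w + leak_base_w
    for _ in range(iters):
        temp = ambient + degc_per_watt * total
        leakage = leak_base_w * (1.0 + leak_growth_per_degc * (temp - ambient))
        total = dynamic_w + leakage
    return total, temp

# Same 250 W of useful (dynamic) work on a good die vs. a leaky one:
for label, leak in (("good sample", 20.0), ("leaky sample", 50.0)):
    total, temp = settle(250.0, leak)
    print(f"{label}: {total:.0f} W total, {temp:.0f} degC")
```

Even with these made-up coefficients, the leaky die ends up drawing on the order of 50 W more and running hotter for the exact same work, which is the "bottom of the barrel sample" scenario in a nutshell.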
 
I got an idea... This card must be ideal for nitrogen clocking... W1zz? :D
 
Yeah, I'd definitely notice a card that arrived with disturbed paste; I've done nearly 500 VGA reviews. When I disassemble a card for pictures, I reassemble it with thermal paste properly applied, then check the reassembled card's temps and compare them against what I saw on the untouched card. Big difference -> the card has been messed with before.
This very rarely happens to me, because we usually get the first round of samples.
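For anyone curious, the check boils down to something like this (a minimal sketch in Python; the 8 °C threshold is an assumed cutoff, not W1zzard's actual number):

```python
# Minimal sketch of the tamper check described above: compare load temps
# on the untouched card vs. after a fresh repaste. The 8 degC threshold
# is an assumption, not TPU's real cutoff.

def looks_tampered(temp_as_received_c, temp_after_repaste_c, threshold_c=8.0):
    """If a fresh repaste drops load temps far more than normal mounting
    variance, the card likely arrived with disturbed paste."""
    return (temp_as_received_c - temp_after_repaste_c) > threshold_c

print(looks_tampered(86.0, 72.0))  # True: like the ~14 degC drop OC3D reported
print(looks_tampered(75.0, 73.0))  # False: within normal remount variance
```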
 

Fair enough, not questioning your professionalism; I just figured that since he noticed the issue, it might have been the cause of your problem too. It's just odd that your temps are so much higher in an open environment, whereas OC3D, IIRC, tests in a closed case.

Do you record your room temperature anywhere? I didn't see anything about it in the review.
 
Wait... what? An overclocked Hawaii with an extra 4 GB of VRAM and higher clocks draws 100 more watts at a $100 higher price for a 2% performance increase.
AMD, what have you been smoking? I want some.
 
With this power consumption, this VGA should cost $280.

Fail :(

Not an NVIDIA fanboy here; I had really high hopes for this new VGA... #disappointed
Look around and check other sites' reviews of this same GPU to be sure about its consumption. And furthermore, wait for the other manufacturers to send their cards in for review. I am absolutely sure this is a bad sample. There's no way 4-5 other reviewers measure the same consumption for the 390X as they did for the 290X while the one W1z tried is 100 W above a 290X.
 
(All-rails torture test chart: 24-MSI-R9-390X-Gaming-8GB-All-Rails-Torture_r_600x450.png)

                  Minimum     Maximum     Average
PCI-E Total       58.56 W     421.20 W    324.78 W
Mainboard 3.3V     1.65 W       3.30 W      2.53 W
Mainboard 12V     30.24 W      52.00 W     41.00 W
VGA Card Total    93.76 W     468.04 W    368.32 W

(Further power consumption charts: 1434612549l1GBQzJE5q_10_1.gif, 7202_777_msi-radeon-r9-390x-gaming-8g-video-card-review.png)

Don't look at maximum or peak measurements; look at averages. It's around 60 W to 80 W extra depending on the stress test used.
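To see why peaks mislead, here's a quick sketch with made-up 1 ms power samples (Python; the numbers are invented, they just mimic the shape of the chart above):

```python
# Made-up 1 ms power samples: a steady ~310 W load with one brief spike.
import random

random.seed(1)
samples = [310 + random.uniform(-20, 20) for _ in range(10_000)]
samples[5000] = 468.0                                     # a single transient spike

print(f"peak:    {max(samples):.0f} W")                   # 468 W
print(f"average: {sum(samples) / len(samples):.0f} W")    # ~310 W
```

A single millisecond transient pushes the Maximum column up to 468 W while leaving the average almost untouched, which is why the Average column is the one to compare against TDP and against other reviews.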
 
I was never that interested in the 390X, as I already have 290X and GTX 970 systems, but on an interesting note, the new revision of the Twin Frozr V cooler maxed out at 82 °C at 40 dB, so assuming the GTX 980 Ti Gaming's Twin Frozr V is just as solid, it should be able to run even quieter on a card that uses 140 W at stock.

*EDIT*

Could the discrepancies in power draw between sites be down to the card having three performance settings?

1100 MHz / 6100 MHz (OC Mode)
1080 MHz / 6000 MHz (Gaming Mode)
1050 MHz / 6000 MHz (Silent Mode)
 
Probably, and mostly because of the overclocked memory with that many memory ICs.
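For a feel of how much the core-clock side alone could move things, here's a back-of-the-envelope sketch (Python; only the clocks come from the mode list above, the voltages are pure placeholder assumptions):

```python
# Rough rule of thumb: dynamic power scales with f * V^2.
# Clocks are from the mode list above; the voltages are assumed.

modes = {
    "Silent": (1050, 1.15),   # core MHz, assumed core volts
    "Gaming": (1080, 1.17),
    "OC":     (1100, 1.19),
}

base_f, base_v = modes["Silent"]
for name, (f, v) in modes.items():
    scale = (f / base_f) * (v / base_v) ** 2
    print(f"{name} mode: ~{scale:.2f}x Silent-mode dynamic power")
```

That works out to roughly a 10-12% swing in dynamic power between Silent and OC mode, call it a few tens of watts on a card in this class, so which mode a site tested in can narrow the gap between reviews but probably not close a 100 W one on its own.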
 
People need to understand... although this thing costs a fortune to just play games, it doesn't just play games... it'll also heat your house.
So really, it might end up being quite cost-effective. There could be a future for these GPU/central-heating systems.
So kudos to AMD.
 
That's funny, I was thinking of putting a grill rack in the top of an NVIDIA system to make toast while gaming, but then it crossed my mind that I might attract trolls.
 