# MSI R9 390X Gaming 8 GB



## W1zzard (Jun 18, 2015)

MSI's Radeon R9 390X GAMING comes with a massive triple-slot, dual-fan cooler that turns its fans off completely when the card idles. Thanks to the overclock out of the box, the card matches GTX 980 performance at 1440p and beyond. Like all R9 390X cards, the MSI R9 390X GAMING is equipped with 8 GB of video memory.

*Show full review*


----------



## Nokiron (Jun 19, 2015)

Oh god, that power consumption!

80-100W _*more*_ than a 290X?


----------



## The Quim Reaper (Jun 19, 2015)

Nokiron said:


> Oh god, that power consumption!
> 
> 80-100W _*more*_ than a 290X?



Boy, you said it... *looks smugly at my two 970s running at 30°C idle (65-70°C gaming), with a 4690K @ 4.5 GHz, pulling no more than 470 W at peak load*.

You'd have to be nuts to buy one of these over a 970/980.


----------



## Sempron Guy (Jun 19, 2015)

Guru3D reports 28 W less vs. the 290X, 76°C load temp:

http://www.guru3d.com/articles_pages/msi_radeon_r9_390x_gaming_8g_oc_review,8.html

Overclock3D reports 6 W more vs. the 290X, 72°C load temp:

http://www.overclock3d.net/reviews/gpu_displays/msi_r9_390x_gaming_8g_review/4

The results are all over the place.


----------



## DarkOCean (Jun 19, 2015)

424 W, $430, 8 GB of VRAM... does this rebrand make any sense to you?


----------



## Luka KLLP (Jun 19, 2015)

The power consumption is horrible... I don't see any reason to buy this... When I buy a video card, I want to have the feeling I'm getting a high quality product.
I'm not getting that feeling here. Hopefully the Fury series lives up to the expectations!


----------



## Over_Lord (Jun 19, 2015)

Power consumption of that thing is too damn high.


----------



## Lionheart (Jun 19, 2015)

I gotta admit I'm surprised it caught up to the GTX 980 & beat it out in some games but man oh man, that power consumption is too damn high.


----------



## Rowsol (Jun 19, 2015)

So, the 970 has 30% better perf/$ and 100% better perf/W. I don't see the purpose of this card.

Bring on the Nano? I think that's the name. Anxious to see the claimed 50% better power efficiency.


----------



## LAN_deRf_HA (Jun 19, 2015)

How did they come up with the pricing for this? Doubling the memory is usually $50; a factory overclock maybe $10-30, depending on brand.


----------



## moproblems99 (Jun 19, 2015)

I'm a little curious how the card ended up getting so close to a 980 considering that the 290X is only hovering around the 970.  Was it really only the small bump in clocks?  Is the overclocked 290X that close to a 980 (EDIT: when overclocked)?


----------



## horik (Jun 19, 2015)

499€ for this card?
They can keep it.
http://www.pccomponentes.com/msi_r9_390x_gaming_8gb_gddr5.html


----------



## newtekie1 (Jun 19, 2015)

Well...this is a disaster...

Turning the fans off when the card is under 60°C is nice, but if the thing runs so hot that it will never be below 60°C, it's pointless.

On the open test bench it idled at 57°C; put it in a closed case and it will never get below 60°C.


----------



## Sihastru (Jun 19, 2015)

*Dat power consumption...* This is what you get when you increase voltage from 1.14V to 1.21V I guess. This is all this card is. An overclocked 290X. So why would anyone buy this? 8GB? There are 8GB 290X cards.


----------



## newtekie1 (Jun 19, 2015)

Sihastru said:


> *Dat power consumption...* This is what you get when you increase voltage from 1.14V to 1.21V I guess. This is all this card is. An overclocked 290X. So why would anyone buy this? 8GB? There are 8GB 290X cards.



And the 8GB is shown to not help, even at 4k, except in a few rare situations.


----------



## mirakul (Jun 19, 2015)

LAN_deRf_HA said:


> How did they come up with the pricing for this. Double mem is $50 usually, factory overclock maybe $10-30 depending on brand.


4 GB of GDDR5 costs way more than $50.


----------



## dados8756 (Jun 19, 2015)

This is an awful rebrand with shit performance. More power... higher price... same performance. Not worth buying; I think I'll go with the green side this year.


----------



## W1zzard (Jun 19, 2015)

mirakul said:


> 4 GB of GDDR5 costs way more than 50$.


$50 for 4 GB GDDR5 is actually roughly correct


----------



## btarunr (Jun 19, 2015)

This chip, with these same clocks, but 4 GB of RAM, at $350, coulda worked out.


----------



## Fluffmeister (Jun 19, 2015)

On average it uses double the power of a 970/980. 

/hugs 970


----------



## Ikaruga (Jun 19, 2015)

The funny thing is that somewhere at AMD, in some room, some very well-paid staff looked at it one last time and said: yes, this is fine, we are ready to release the card like this; who wouldn't buy it?


----------



## CrazyBass (Jun 19, 2015)

There's no reason for anyone to buy one of these cards here in Brazil with electricity bills the way they are these days.

The same 1080p/1440p performance as a 970/980, at more than double the power consumption?

Let's see the Fiji thing...


----------



## techy1 (Jun 19, 2015)

AMD releases "new" card after 2 years of "hard work"... and the result is: price/performance = DOWN; Power consumption = UP... I am not an expert, but shouldn't it be the other way around? (just sayin)


----------



## buildzoid (Jun 19, 2015)

Well, that power-draw increase was predictable. Also, why do AMD cards all come with bad VRAM ICs? The H5GC4H24AJR-T2C are specced for 5 Gbps at 1.5 V, not the 6.1 Gbps that this card ships with.


----------



## Octopuss (Jun 19, 2015)

Holy shit what did they do with the power consumption?


----------



## mirakul (Jun 19, 2015)

W1zzard said:


> $50 for 4 GB GDDR5 is actually roughly correct


Not really.


techy1 said:


> AMD releases "new" card after 2 years of "hard work"... and the result is: price/performance = DOWN; Power consumption = UP... I am not an expert, but shouldn't it be the other way around? (just sayin)


The 290X was $550 at launch, and this time you get 4 GB more and a factory-overclocked card. The price is stupid, but this is how they clear 290/290X stock, as they did with the 280X/7970 rebranding two years ago.
AMD spent two years to give us the Fury line-up with the sparkling new HBM tech, which will be the future of the GPU world. I don't see why people are making a big fuss about the rebranded cards instead of giving the new tech a warm welcome.


----------



## Cataclysm_ZA (Jun 19, 2015)

@W1zzard, isn't the VRAM running at 6.0GHz effective instead of 5.0GHz? 

http://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/4.html


----------



## the54thvoid (Jun 19, 2015)

mirakul said:


> Not really..



It's good you know more than @W1zzard. I'll turn to you for all my info now. Tell me, how far up your ass is your head? I'd like to see how it's done.

In other news: the 390X is a Failiosaurus. Even if it matches a 980, I'd still take the 980, based on the other metrics.

However, as I and others keep thinking, Fury X will be an Awesomausaurus.

Then we wait and see if Nvidia has kept anything back.....


----------



## newtekie1 (Jun 19, 2015)

Cataclysm_ZA said:


> @W1zzard, isn't the VRAM running at 6.0GHz effective instead of 5.0GHz?
> 
> http://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/4.html




Yes, the RAM is running at 6.0GHz(6.1GHz on this card actually), but it is only specced to run at 5.0GHz.  MSI is overclocking the RAM.


----------



## Mindweaver (Jun 19, 2015)

btarunr said:


> This chip, with this same clocks, but 4 GB of RAM, at $350, coulda worked out.


I agree; this wouldn't be a bad card for that price range. This is exactly how I expected a rebrand to perform. It should be priced around the same as the SAPPHIRE Tri-X OC Radeon R9 290X 8GB... I have a buddy who just picked that card up, and he loves it; but he is a big AMD fan, and coming from a 270 he should love it. I tried to talk him out of it and have him wait for the new 3XX series, but when the itch hits.. lol

Plus, AMD should see a nice boost whenever DX12 hits, but power is another story... If you're worried about power, I don't see Hawaii-based cards saving any trees.. hehe Plus, these are not the cards we are looking for.. (_I had to do the Star Wars reference after E3.. lol_)


----------



## Mindweaver (Jun 19, 2015)

Mindweaver said:


> I agree this wouldn't be a bad card for that range of cards. This is exactly how I expected this card to perform for a rebrand. This should be priced around the same as the
> SAPPHIRE Tri-X OC Radeon R9 290X 8GB... I have a buddy that just picked that card up and he loves it, but he is a big AMD fan, and coming from a 270 he should love it. I tried to talk him out of it and wait for the new 3XX series but when the itch hits.. lol
> 
> Plus, amd should see a nice boost when ever DX12 hits, but power is another story... If your worried about power I don't see Hawaii based cards saving any trees.. hehe Plus, these are not the cards we are looking for.. (_I had to do the Star Wars reference after E3.. lol_)



*EDIT: Reference meaning we want FURY! lol*


----------



## Cataclysm_ZA (Jun 19, 2015)

newtekie1 said:


> Yes, the RAM is running at 6.0GHz(6.1GHz on this card actually), but it is only specced to run at 5.0GHz.  MSI is overclocking the RAM.



Ah, that agrees with how I'm reading it now; as in, those are the actual reference clocks for the RAM. I wonder how much headroom is left there.


----------



## the54thvoid (Jun 19, 2015)

Having checked back on the custom 980 reviews: they run 5-10% faster than a stock 980, so the custom-to-custom comparisons still have the 980 ahead of the 390X. Stock versus stock and best OC versus best OC: these are what matter.


----------



## SASBehrooz (Jun 19, 2015)

Too-high power consumption
4 GB of useless memory
2% more performance than the 3.5 GB GTX 970 at 1080p
High temperatures, even with a custom design
*Barely* matches the GTX 980 in overall performance

After 1.5 years........ gg wp, AMD.


----------



## Nokiron (Jun 19, 2015)

SASBehrooz said:


> Too high power consumption
> 4GB Useless Memory
> 2% more Perormance than 3.5 Gb GTX 970 in 1080p
> High temperature even in custom design
> ...


There is no real reason to complain about the amount of memory. Better to have it there than to lack it.


----------



## fullinfusion (Jun 19, 2015)

So they still can't cool that chip down; what a burn.


----------



## SASBehrooz (Jun 19, 2015)

Nokiron said:


> There is no real reason to complain about the amount of memory. It's better it's there instead of a lack of it.



Actually, we should say this, because *they* said it after the GTX 970 memory allocation issue. So *why* don't we?


----------



## Nokiron (Jun 19, 2015)

SASBehrooz said:


> actually we should say this . because *they* did say that after GTX 970 memory allocation issue. so *why* dont we.


There isn't a lack of memory on the GTX 970; it is still 4 GB. But that's not the point. Hawaii has the bandwidth to actually make use of it.

So again, why shouldn't it be 8 GB?

It is exactly the same with the Titan X: why shouldn't it have 12 GB?


----------



## SASBehrooz (Jun 19, 2015)

Nokiron said:


> So again, why shouldn't it be 8GB?



As W1zzard said: the 8 GB of VRAM provides no benefit; the card is not powerful enough to make use of the full 8 GB (just like the 8 GB version of the R9 290X).



Nokiron said:


> It is exactly the same with Titan X, why shoudn't it have 12GB?



And it's the same story with the 12 GB Titan X.

I think they do this just to sell a little more product (for no real reason).


----------



## cokker (Jun 19, 2015)

Sempron Guy said:


> guru3d reports 28w less vs the 290x, 76c temp load
> 
> http://www.guru3d.com/articles_pages/msi_radeon_r9_390x_gaming_8g_oc_review,8.html
> 
> ...



+1. Even:

http://www.legionhardware.com/articles_pages/his_iceq_xsup2_oc_radeon_r9_390xr9_390_r9_380,10.html

and

http://www.techspot.com/review/1019-radeon-r9-390x-390-380/page7.html

got similar results to the 290/290X.

The biggest difference I've seen is ~50 W, not nearly 100 W over the 290X; either a duff card or borked readings...


----------



## Cataclysm_ZA (Jun 19, 2015)

cokker said:


> The biggest difference I've seen is ~50w not the near 100w over the 290x, duff card or borked the readings...



It's possible that drivers are also a component of the higher power use. The Catalyst 15.15 drivers handed out for the reviews are only compatible with R-300 cards, so there may be a few tweaks AMD can still make to get things more in line with their initial expectations.


----------



## jchambers2586 (Jun 19, 2015)

Two of these cards would eat up the power supply I just bought (EVGA SuperNOVA 850 G2). Glad I just bought a 970.


----------



## thebluebumblebee (Jun 19, 2015)

> Power delivery requires one 8-pin and one 6-pin PCI-Express power connector. This configuration is specified for up to *300 W* power draw.


Isn't this a HUGE problem?


----------



## W1zzard (Jun 19, 2015)

thebluebumblebee said:


> Isn't this a HUGE problem?


Not really; you can easily draw more power from the PCIe connectors. It's not like there is a magical device that shuts them down when they exceed their rated power.
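As a quick sanity check, the 300 W figure is just the sum of the per-connector limits in the PCI-Express CEM spec (75 W from the slot, 75 W from a 6-pin, 150 W from an 8-pin). A minimal sketch, using the 370 W peak-gaming figure quoted later in this thread for the MSI card:

```python
# Per-connector power ratings from the PCI-Express CEM specification.
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin PEG connector
EIGHT_PIN_W = 150  # 8-pin PEG connector

budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300 W for this card's 8+6-pin layout
measured_peak = 370                        # W, peak-gaming number quoted in this thread

overdraw = measured_peak - budget
print(f"budget: {budget} W, peak: {measured_peak} W, "
      f"over spec: {overdraw} W ({overdraw / budget:.0%})")
# As noted above, the connectors tolerate this in practice: nothing shuts
# the card down for exceeding the per-connector ratings.
```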


----------



## W1zzard (Jun 19, 2015)

So I've benched power consumption of the PowerColor R9 390X PCS+ and I get:
Typical Gaming: 231 W
Peak Gaming: 253 W

Which seems to fall in line with what's expected.

Next, I shut down the system, removed the PowerColor card, installed the MSI card, booted the system back up, and ran power testing again; the numbers in my review are confirmed.

MSI is running higher clocks at higher voltage, and the card gets hotter. GPU temperature is a huge factor in power consumption: the hotter the GPU, the higher the power draw for doing the same work.

Also, I don't start my power consumption test from a cold card. I first do one run of about two minutes, then do another for which power is measured. This ensures realistic long-term gaming conditions, not just some magical numbers that don't apply to gaming.

Also, we are testing real card-only power consumption, while many other sites test system power; this could also be a factor.

It might also be possible that the variation between Hawaii GPUs is very large and I got an unlucky sample. Let's just hope that reviewers don't get cherry-picked low-power cards while the high-power cards end up with customers.

GPU-Z ASIC Quality: MSI 79.3%, PowerColor 73.1%


----------



## fullinfusion (Jun 19, 2015)

@W1zzard you test on an open bench, right?

Any CrossFire action in the near future with this card?

One card at 85°C is hot; with a second installed, that card would be headed for a meltdown.


----------



## W1zzard (Jun 19, 2015)

fullinfusion said:


> you test on an open bench right?


half-open case



fullinfusion said:


> any crossfire action in the near future with this card?


no plans for CF, look up 290X CF articles


----------



## HD64G (Jun 19, 2015)

So, most likely this MSI 390X that W1z tested is a problematic GPU (a core or BIOS problem), which causes the excessive power consumption. None of the other reviewers saw any of this, so...

In W1z's place, I would RMA the card (if TPU bought it) and redo the test once the new one arrives.


----------



## Bjorn_Of_Iceland (Jun 19, 2015)

Lionheart said:


> I gotta admit I'm surprised it caught up to the GTX 980 & beat it out in some games but man oh man, that power consumption is too damn high.


Same as a 290X when you overclock it: it can reach a GTX 980's performance. But then a GTX 980 can also be overclocked even further, so it is back to square one or even worse, plus the heat and power consumption of the 290X/390X.


----------



## illli (Jun 19, 2015)

It would have been decent if it had half-ish the power consumption. I wish AMD would succeed, but why do their chips use so much more power compared to NVIDIA's?


----------



## _BARON_ (Jun 19, 2015)

I am confused by such high power consumption compared to other reviews. Anyhow,

I noticed other reviews of the non-X version, the R9 390, which is the same price as the GTX 970. Any chance we see that review soon? The reviews of that card so far suck, but it seems it doesn't run as hot as the 390X and isn't as power hungry, while it's just $330.


----------



## W1zzard (Jun 19, 2015)

_BARON_ said:


> I am confused with such high power consumption comparing to other reviews, anyhow
> 
> I noticed other reviews who reviewed non-X version, R9 390, it's the same price as GTX970, any chance we see that review soon? Reviews so far of that card suck, but it seems it's not running that HOT as 390x and it's not that power hungry while it's just 330$


I don't have a 390 non-X yet from anyone. MSI sent a 380 and 370 too, but these are lower priority.

In the queue: EVGA 980 Ti SC+, PowerColor 390X PCS+, Zotac 980 Ti AMP!, and Fury X when I get one


----------



## SetsunaFZero (Jun 19, 2015)

Those GTX 970 vs. GTX 780 Ti benches don't look legit to me. W1zz, could you compare the "NV 353.06 WHQL Kepler performance-crippling driver" to some older one?


----------



## thebluebumblebee (Jun 19, 2015)

Could this be a cooler failure? That one (what looks like a) 10 mm pipe that appears to be responsible for the majority of the heat doesn't seem to have much contact with the fins.
Edit: Would the 290X cooler fit........?


----------



## Casecutter (Jun 19, 2015)

W1zzard said:


> $50 for 4 GB GDDR5 is actually roughly correct


That's true if this card had memory as good as the reference 290X got: SK Hynix H5GQ2H24AFR-R0C, rated for *1500 MHz* (6000 MHz GDDR5 effective). But W1zzard indicates this MSI has Hynix model number H5GC4H24AJR-T2C, specified to run at *1250 MHz* (5000 MHz GDDR5 effective), so they stuffed it with cheaper chips and OC'd the s#it out of them (MSI runs *1525 MHz*); is that what's increasing temps/power? Then W1zzard says there's a wall (1780 MHz), as if it's being held back by BIOS timings (is this an MSI thing?).

This has NO merits. It just doesn't deserve any consideration! How could AMD claim a lower TDP, from 290 W for the 290X down to 275 W now (reference)? This MSI lends no support to that assertion, and even reference vs. reference I'm not seeing it!

When they first said the 390X was this... that was a letdown; when they said $430... I thought that was stupid; but now there's this MSI, with its crappy memory and lousy performance/dBA/thermals/power. God, I hope this is an MSI issue.


----------



## W1zzard (Jun 19, 2015)

thebluebumblebee said:


> Could this be a cooler failure?


No, otherwise temps would be high but not power draw


----------



## _BARON_ (Jun 19, 2015)

W1zzard said:


> I don't have a 390 non-X yet from anyone. MSI sent a 380 and 370 too, but these are lower priority.
> 
> In the queue: EVGA 980 Ti SC+, PowerColor 390X PCS+, Zotac 980 Ti AMP!, and Fury X when I get one


Thank you for the quick response. I just graduated from high school and have college (university) in the near future. I plan to work during the summer, and the GTX 970 seems like the perfect card in my price range. I'm now a little in doubt because of the new R9 390, but I can't find a trusted source or a proper review of it, and it's also $330.

About the higher power draw: I've watched Tom Logan's review, and *he mentioned that MSI changed the circuitry, the PCB and more, and that it also runs at higher clock speeds,* so I guess that is the answer to the enormous power draw. In his review the test system pulls 510 W from the wall, so it kind of fits, right?


----------



## Xzibit (Jun 19, 2015)

@W1zzard

In your conclusion:

> Lack of HDMI 2.0

MSI's website lists these 390X specifications:

> HDMI output: 1 (version 1.4a/2.0)
> Max resolution: 4096x2160 @ 24 Hz (1.4a), 3840x2160 @ 60 Hz (2.0)


----------



## thebluebumblebee (Jun 19, 2015)

W1zzard said:


> No, otherwise temps would be high but not power draw


Maybe "failure" is too strong a word to use. Idle power usage is actually lower than on the MSI 290X, but temps are *17°C higher*. I forgot about the fans turning off at idle.
And you said:





W1zzard said:


> GPU temperature is a huge factor for power consumption, the hotter the GPU, the higher power draw, for doing the same thing.


----------



## revin (Jun 19, 2015)

thebluebumblebee said:


> Idle power usage is actually lower than the MSI 290X, but temps are *17°C higher*


I think it's from the MSI option of the fans not running under 60°C.

EDIT:  I kinda wish I woulda got a $270 290X  



Xzibit said:


> @W1zzard
> In your conclusion
> 
> Lack of HDMI 2.0
> ...


WTH is 3840 ???? @60Hz


----------



## W1zzard (Jun 19, 2015)

Xzibit said:


> Lack of HDMI 2.0


I think MSI has posted the wrong specs on their page. Hawaii does not support HDMI 2.0, neither does Fiji. Maybe copy&paste from NVIDIA, email sent to MSI for clarification


----------



## MxPhenom 216 (Jun 19, 2015)

buildzoid said:


> Well that power draw increase was predictable. Also why do AMD cards all come with bad VRAM ICs. The H5GC4H24AJR-T2C are specced for 5Gbps at 1.5V not the 6.1Gbps that this card ships with.



Probably a lackluster memory controller.


----------



## CounterSpell (Jun 19, 2015)

With this power consumption, this card should cost $280.

Fail.

Not an NVIDIA fanboy here; I had really high hopes for this new card... #disappointed


----------



## CrazyBass (Jun 19, 2015)

Besides, this isn't necessary at all: I've just finished reading another big hardware-press player's review of the same GPU model, and...

W1zzard isn't alone in his power consumption and temperature results, so that's not the point anymore.

The point is that AMD appears to be playing with the market, not giving a sh** about "saving the trees" or about consumers who look at their electricity bills and cooling/PSU budgets before purchasing anything (or at overclocking headroom...).


----------



## GorbazTheDragon (Jun 19, 2015)

I just find the overclocking on these rather disappointing. NV seems to have really tweaked Maxwell to be able to run fast.


----------



## NC37 (Jun 19, 2015)

8 GB of VRAM will mean more going into the following year. Consoles all run with 8 GB stock, and devs are lazy brats who will likely deliver ports that use that much. Right now only one or two games even go over 4 GB; well, plus Star Citizen. So saying there is no point now is true, but there is a point down the line. As for power draw: everyone is crazy about the max but isn't looking at the low end. Sure, the max means more, but those low-end numbers are under the 290X. So it both uses less and uses more power than a 290X... something's not right with that picture. If MSI did some custom stuff to the card, that would make sense. Either way, the MSI variant will be the one to stay away from.


----------



## newtekie1 (Jun 19, 2015)

W1zzard said:


> I think MSI has posted the wrong specs on their page. Hawaii does not support HDMI 2.0, neither does Fiji. Maybe copy&paste from NVIDIA, email sent to MSI for clarification



Why do I have the feeling they are going to say "the port is 1.4a, but you can plug it into a 2.0 port and use the resolution/refresh rate".


----------



## MxPhenom 216 (Jun 19, 2015)

CounterSpell said:


> with this power consumption, this vga should cost $ 280
> 
> fail
> 
> not a nvidia fanboy here, i had really high hopes for this new vga.. #dissapointed



How can you have high hopes for a rebrand? Just going to be more of the same shit.


----------



## Casecutter (Jun 19, 2015)

W1zzard said:


> So I've benched power consumption of the
> PowerColor R9 390X PCS+ _(1060MHz/1500mhz)_
> and I get:
> Typical Gaming: 231 W
> Peak Gaming: 253 W


Vs. this MSI 390X (1100 MHz/1525 MHz):
Typical Gaming: 344 W
Peak Gaming: 370 W
So a 4% increase in clock means the power jumps by about 50%?


W1zzard said:


> Which seems to fall in line with what to expect.


Not to bust your..., but that's whack.
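First-order dynamic-power scaling suggests the clock and voltage bumps alone can't explain that gap. A rough sketch: the PowerColor's actual voltage isn't given in the thread, so the 1.14 V / 1.21 V stock-vs-MSI figures mentioned earlier are used as a stand-in, making this an upper-bound estimate:

```python
# First-order CMOS dynamic power scales as P ~ f * V^2.
# Assumed inputs: PowerColor at 1060 MHz / 1.14 V, MSI at 1100 MHz / 1.21 V.
# The voltages are the stock-vs-MSI figures mentioned earlier in the thread,
# not measured values for these two specific cards.
f1, v1, p1 = 1060, 1.14, 231   # PowerColor typical-gaming draw (W)
f2, v2 = 1100, 1.21            # MSI clock and voltage

predicted = p1 * (f2 / f1) * (v2 / v1) ** 2
measured = 344                  # MSI typical-gaming draw (W)

print(f"predicted from f*V^2 scaling: {predicted:.0f} W, measured: {measured} W")
# Dynamic scaling predicts only ~270 W; the remaining ~70 W would have to come
# from leakage (which rises steeply with temperature) or sample variation.
```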


----------



## newtekie1 (Jun 19, 2015)

Casecutter said:


> So 4% increase in clock means the power jumps like 50%?
> Not bust your..., but that's whack.



If MSI upped the core voltage, the higher power draw is to be expected.  But the MSI should overclock better too.


----------



## W1zzard (Jun 19, 2015)

newtekie1 said:


> But the MSI should overclock better too.


It does; the PowerColor topped out at 1090 MHz on the GPU.


----------



## dwade (Jun 19, 2015)

They just overclocked the 290X and called it the 390X. You can do the same with a 970 or 980, and it'll undoubtedly smoke this with ease.


----------



## Ferrum Master (Jun 19, 2015)

Oh, a great card for Nordic people. The middle of summer and +10°C outside... now

Well, I would like to see some Witcher 3 benchmarks, at least in beta stage, even if only against the 980 Ti. I know it's a rebrand, but for Fiji... I hope to see them.

I hope AMD just had a shedload of those dies lying around and respun them for the sake of not writing them off... otherwise... MEH.


----------



## Basard (Jun 19, 2015)

This card sucks, yeah... but it's funny how many people complain about the 'useless' extra 4 GB, yet when they hear the Fury only has 4 GB, they complain that it's not 8!


----------



## SmokingCrop (Jun 19, 2015)

Basard said:


> This card sucks, yeah... but it's funny how many people complain about the 'useless' extra four GB but when they hear the Fury only has four then they complain that it's not eight!


Because the R9 390X isn't strong enough: by the time you need 6+ GB, you're already at way too low an FPS.
It's a different case with the Fury X.


----------



## Nokiron (Jun 19, 2015)

SASBehrooz said:


> as w1zzard said  : 8 GB VRAM provides no benefit. the card is not powerful to address full 8gb (like r9 290x 8gb version).
> 
> 
> 
> ...


What about multi-GPU solutions? What about memory-intensive hobby renders?

The only downside is cost, but it's the same with DRAM: why not more?

My 64 GB is way overkill, even for my needs, but I'm glad it's there; I never have to worry about a potential bottleneck.


----------



## MxPhenom 216 (Jun 19, 2015)

Nokiron said:


> What about multi-GPU solutions? What about memory intensive hobby-renders?
> 
> The only downside is cost, but it's the same with DRAM, why not more?
> 
> My 64GB is way overkill, even for my needs but i'm glad its there, I never have to worry about a potential bottleneck.



It's not really the same. Unless the GPU has enough power to make use of the extra memory, it's not really doing much. However, if there is more than one GPU in a configuration, the extra memory can help. But most of the time GPUs run out of raw performance before memory capacity becomes an issue.


----------



## Ivaroeines (Jun 20, 2015)

It seems like AMD has just made a repackaged 7970 yet again. Most people would say I'm an AMD fanboy, but now it seems like they closed the development department to save money, so they can squeeze the last bit of money out of a sinking ship. I don't think AMD has done any real new development for several years. On the CPU front, I believe every CPU is made from old (pre-2012) development work; the A-series and the low-power variants are derivatives of old CPU and GPU designs put in the same chip, and there hasn't been a "high-end" CPU from AMD since 2012 (the 8370 and 95** are just clocked higher).

There is a small chance that the heterogeneous computing "platform" is something the development department (if one exists) has been working on; it feels to me like there are 3-4 guys working in a shed who come up with new AMD products. For the past 15 years I have only used AMD/ATI CPUs and GPUs, in the belief that AMD would keep the flame burning against Intel. But no: now they're just a cash cow for the management team (CEO, etc.) and the owners, at least that's how it seems to me.

Unless they come up with something new, AMD has lost me as a customer. That doesn't mean much on its own, but I seriously doubt I'm the only one thinking this way. These are harsh words, and I hope I'm wrong.


----------



## LAN_deRf_HA (Jun 20, 2015)

I feel like they just shouldn't have released it. This isn't the card people have been waiting for and it just makes a bad impression for the brand. They should just focus on Fury.


----------



## Frick (Jun 20, 2015)

LOL


----------



## Xzibit (Jun 20, 2015)

It seems the only R9 300 series cards not gulping juice are the HIS models (I haven't seen a Gigabyte review yet).

*TECHSPOT - HIS IceQ X2 OC Radeon R9 390X, R9 390 & R9 380 Review*


----------



## Khorngor (Jun 20, 2015)

W1zzard said:


> No, otherwise temps would be high but not power draw



I know that Tom Logan at OC3D got a card where he found out that other reviewers before him had taken it apart for pictures, then put it back together without applying new thermal paste. That meant around a 13-14°C temperature difference (adjusted for room temp): his card ran at around the temps you're reporting beforehand, then around 72°C after he applied thermal paste himself.

If you still have your sample, could you please check for that? ^^


----------



## GhostRyder (Jun 20, 2015)

Cool review, but I find this a bit odd, and it counters some other reviews. I almost wonder if there really is something wrong with this card, as those power consumption figures seem extremely high. If that were the case for the 290X/390X in general, my machine shouldn't be able to function with a 1125 MHz overclock on my three 290Xs... (FYI, not doubting the review, just wondering about the card itself.)

Guru3D reported lower power consumption on their MSI 390X, and TechSpot's HIS 390X (as shown by @Xzibit), while not as far overclocked, consumed less than a typical 290X (or right around one). This is quite odd, and I wonder if samples really vary that much or if this card just has some problems...


----------



## Ikaruga (Jun 20, 2015)

Khorngor said:


> I know that Tom Logan at OC3D got the same card where he found out that other reviewers before him had taken the card apart to take pictures, then put it back together without applying new thermal paste, that meant around 13-14c temp diffrence (adjustet for room temp), and his card ran around the temps your reporting before hand, then around 72c after applying thermal paste himself.
> 
> If you still have your sample, could you please check for that? ^^


I'm not saying that W1zzard's sample can't be faulty, but this post...... Please allow me to translate it for myself:
*1.* Tom Logan at OC3D can properly reassemble a disassembled card (on his second try!), but the other reviewers (who have also assembled thousands of cards in their lives) can't.
*2.* He also has some magic powers and found that out remotely, without being next to those reviewers...

Fascinating stuff.
/me heads to OC3D to read that review.



W1zzard said:


> Also, we are testing real card-only power consumption while many other sites test system power, this could also be a factor.


Sorry if the question is stupid, but isn't it somehow possible that your setup gets more inaccurate at higher power draws? For example, I could not achieve the insane numbers you got in your 980 G1 review, and I pushed the card much further. I understand it could just come down to different samples, of course; just asking.


----------



## Khorngor (Jun 20, 2015)

Ikaruga said:


> I'm not saying that W1zzard's sample can't be faulty, but this post...... Allow me please to translate this to myself:
> *1,* Tom Logan at OC3D can properly assemble a disassembled card (for the second try!) but the other reviewers (who also assembled thousands of cards in their life) can't.
> *2*, He also has some magic powers and found that out remotely without being next to those reviewers...
> 
> ...



Watch the video; that's where he talks about it.


----------



## Ikaruga (Jun 20, 2015)

Khorngor said:


> Watch the video, thats where he talks about it


Yeah, I did, and I understand now: his sample came from another reviewer who did not reapply the paste after he was done.
I still doubt that W1zzard or any other reviewer would miss such an obvious thing, though.


----------



## Steevo (Jun 20, 2015)

It's not faulty. Each and every GPU core gets tested to determine what it will become; some have hardware defects and become a 980 Ti instead of a Titan, or a 390 instead of a 390X. Sometimes that defect is core voltage leakage, and to understand what that is and why it happens, you have to understand how a die is manufactured.


Core voltage isn't applied at just one magic spot on the silicon; it gets pushed through multiple traces so the voltage is stable to ALL the circuits on the chip. Why? Due to the size of the manufacturing process, the traces (copper wires) in the die are TINY, and each may only be capable of carrying a tenth of the amperage at the rated voltage.
Next, silicon is unlike copper in that it becomes MORE conductive as it heats up. This compounds the problem: below "the bend of the knee," voltage input correlates closely with achievable frequency, but past the bend the scaling becomes exponentially less efficient. Voltage is lost as heat, which in turn causes more leakage, which causes more heat and further voltage droop; without any limit, the voltage controller would kill any chip.


So W1zz probably has a very bottom-of-the-barrel sample, luck of the draw, that needs more voltage at rated speeds, which translates directly into more watts drawn and higher heat output. Overclocking makes the problem worse, since it creates more heat and requires more voltage... which... well, you get the picture.
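The feedback loop described above (more voltage means more heat, more heat means more leakage, which means more heat) can be sketched as a toy model. All constants below are made up for illustration; they are not measured values for Hawaii or any real card.

```python
# Toy model of the heat/leakage feedback loop. Constants are illustrative.

def steady_state_power(p_dynamic, t_ambient, theta, leak_coeff, t_ref=25.0):
    """Iterate the temperature/leakage feedback to a steady state.

    p_dynamic  - switching power in watts (set by clocks and voltage)
    t_ambient  - ambient temperature in deg C
    theta      - cooler thermal resistance in deg C per watt
    leak_coeff - leakage power per deg C above t_ref (watts per deg C)
    """
    temp = t_ambient
    p_leak = 0.0
    for _ in range(1000):  # converges quickly for sane inputs
        p_leak = leak_coeff * max(temp - t_ref, 0.0)
        new_temp = t_ambient + theta * (p_dynamic + p_leak)
        if abs(new_temp - temp) < 1e-6:
            break
        temp = new_temp
    return p_dynamic + p_leak, temp

# A "good" die vs. a leaky one at the same dynamic power:
good = steady_state_power(250.0, 25.0, 0.18, 0.5)
leaky = steady_state_power(250.0, 25.0, 0.18, 1.5)
print(good)   # lower total watts and temperature
print(leaky)  # the feedback pushes both power and temperature higher
```

With identical dynamic power, the leakier die settles at both a higher temperature and noticeably more total watts, which is the "luck of the draw" effect described above.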


----------



## Ferrum Master (Jun 20, 2015)

I got an idea... This card must be ideal for nitrogen overclocking... W1zz?


----------



## W1zzard (Jun 20, 2015)

Ikaruga said:


> Yeah, I did, and I understand now: his sample came from another reviewer who did not reapply the paste after he was done.
> I still doubt that W1zzard or any other reviewer wouldn't notice such an obvious thing, though.


Yeah, I'd definitely notice such a thing; I've done nearly 500 VGA reviews. When I disassemble a card for pictures, I reassemble it with thermal paste properly applied, then check the reassembled card's temps and compare them to what I saw on the untouched card. If there's a big difference, the card has been messed with before.
This very rarely happens to me, because we usually get the first round of samples.


----------



## Khorngor (Jun 20, 2015)

W1zzard said:


> Yeah, I'd definitely notice such a thing; I've done nearly 500 VGA reviews. When I disassemble a card for pictures, I reassemble it with thermal paste properly applied, then check the reassembled card's temps and compare them to what I saw on the untouched card. If there's a big difference, the card has been messed with before.
> This very rarely happens to me, because we usually get the first round of samples.



Fair enough, I'm not questioning your professionalism. I just figured that since he noticed the issue, it might have been the cause of your problem too. It's just odd that your temps are so much higher in an open environment, whereas OC3D, iirc, tests in a closed case.

Do you record your room temperature anywhere? I did not see anything about it in the review.


----------



## BiggieShady (Jun 20, 2015)

Wait... what? An overclocked Hawaii with an extra 4 GB of VRAM and higher clocks draws 100 more watts, at a $100 higher price, for a 2% performance increase.
AMD, what have you been smoking? I want some.


----------



## HD64G (Jun 20, 2015)

CounterSpell said:


> With this power consumption, this VGA should cost $280.
> 
> Fail.
> 
> Not an Nvidia fanboy here; I had really high hopes for this new VGA... #disappointed


Look around and check other sites' reviews of this same GPU to be sure about its consumption. And furthermore, wait for other board partners to send their cards for review. I am absolutely sure this is a bad sample. There's no way for 4-5 other reviewers to measure the same consumption for the 390X as they did for the 290X while the one W1z tried is 100W above the 290X.


----------



## BiggieShady (Jun 20, 2015)

HD64G said:


> Look around and check other sites' reviews of this same GPU to be sure about its consumption. And furthermore, wait for other board partners to send their cards for review. I am absolutely sure this is a bad sample. There's no way for 4-5 other reviewers to measure the same consumption for the 390X as they did for the 290X while the one W1z tried is 100W above the 290X.





| | Minimum | Maximum | Average |
|---|---|---|---|
| **PCI-E Total** | 58.56 W | 421.20 W | 324.78 W |
| **Mainboard 3.3V** | 1.65 W | 3.30 W | 2.53 W |
| **Mainboard 12V** | 30.24 W | 52.00 W | 41.00 W |
| **VGA Card Total** | 93.76 W | 468.04 W | **368.32 W** |




Don't look at maximum or peak measurements; look at the averages. It's around 60 W to 80 W extra depending on the stress test used.
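A quick sketch of why averages, not peaks, are the number to compare: a momentary spike inflates the maximum while barely moving the average. The sample log below is hypothetical, not real measurements.

```python
# Hypothetical one-second power log for a gaming run, in watts.
# One transient spike dominates the maximum but not the average.
samples = [310, 325, 318, 468, 322, 330, 315, 328]

peak = max(samples)
average = sum(samples) / len(samples)

print(f"peak:    {peak} W")        # dominated by a single transient
print(f"average: {average:.1f} W")  # close to the typical draw
```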


----------



## Ubersonic (Jun 20, 2015)

I was never that interested in the 390X as I already have 290X and GTX 970 systems, but on an interesting note, the new revision of the Twin Frozr V cooler maxed out at 82°C at 40 dB, so assuming the GTX 980 Ti Gaming's Twin Frozr V is just as solid, it should be able to run even quieter with a card that uses 140 W at stock.

*EDIT*

Could the discrepancies in power draw between sites be down to the card having three settings for performance?

1100 MHz / 6100 MHz (OC Mode)
1080 MHz / 6000 MHz (Gaming Mode)
1050 MHz / 6000 MHz (Silent Mode)
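As a rough sanity check of how much the BIOS mode alone could matter: dynamic power scales roughly with clock at a fixed voltage (P ≈ C·V²·f). Per-mode voltages aren't published, so equal voltage is an assumption here; under it, the three modes differ by only a few percent.

```python
# Relative dynamic-power estimate for the three BIOS modes, assuming
# equal core voltage across modes (an assumption - not published).
modes = {"OC": 1100, "Gaming": 1080, "Silent": 1050}  # core clock, MHz

base = modes["Silent"]
ratios = {name: mhz / base for name, mhz in modes.items()}
for name, r in ratios.items():
    print(f"{name}: {r:.3f}x Silent-mode dynamic power")
```

At most about 4.8% between Silent and OC mode, so the mode setting alone is unlikely to explain a 60-100 W gap between reviews; memory clocks and sample variance are better suspects.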


----------



## BiggieShady (Jun 20, 2015)

Ubersonic said:


> Could the discrepancies in power draw between sites be down to the card having three settings for performance?
> 
> 1100 MHz / 6100 MHz (OC Mode)
> 1080 MHz / 6000 MHz (Gaming Mode)
> 1050 MHz / 6000 MHz (Silent Mode)


Probably, and mostly because of the overclocked memory with that many memory ICs.


----------



## BoutTime (Jun 21, 2015)

People need to understand... although this thing costs a fortune to just play games, it doesn't just play games... it'll also heat your house.
So really it might end up being quite cost effective. There could be a future for these GPU/central-heating systems.
So kudos to AMD.


----------



## Caring1 (Jun 21, 2015)

BoutTime said:


> People need to understand... although this thing costs a fortune to just play games, it doesn't just play games... it'll also heat your house.
> So really it might end up being quite cost effective. There could be a future for these GPU/central-heating systems.
> So kudos to AMD.


That's funny, I was thinking of putting a grill rack in the top of an Nvidia system to make toast while I was gaming, but then it crossed my mind I might attract trolls.


----------



## Sakurai (Jun 21, 2015)

Finally, the new nuclear reactor near my city has found a use.


----------



## SetsunaFZero (Jun 21, 2015)

This reminds me of the GTX 480.


----------



## Fluffmeister (Jun 21, 2015)

Also!


----------



## SetsunaFZero (Jun 21, 2015)

OK, last one.


----------



## Tsukiyomi91 (Jun 21, 2015)

Here I was thinking AMD could bring the fight to Nvidia with this... after all that hype, it hit me hard like a hammer once I read a few bench results. As a former AMD user, it seems this new card isn't going to make the cut. With a $430 price tag, I don't think I would recommend a pixel pusher that's still a few percent behind a GTX 970 despite having 4 GB of extra buffer. Power consumption is horrible compared with the 290X. With the weather here getting too warm for gaming, I think temps on this custom card would hit near 90°C unless I turn on the air-con to keep it cool, which is going to give me nightmares on my electricity bills. Heck, even my 2+ year old GTX 760 I've reused for the test rig barely uses 200 W on load. For a gamer who wants the best card money can buy, one that pushes all current and new games at 1080p or higher without issues and doesn't break the wallet, I would recommend a custom GTX 980 Ti like the one Gigabyte has just released.


----------



## mirakul (Jun 21, 2015)

Tsukiyomi91 said:


> Here I was thinking AMD could bring the fight to Nvidia with this... after all that hype, it hit me hard like a hammer once I read a few bench results. As a former AMD user, it seems this new card isn't going to make the cut. With a $430 price tag, I don't think I would recommend a pixel pusher that's still a few percent behind a GTX 970 despite having 4 GB of extra buffer. Power consumption is horrible compared with the 290X. With the weather here getting too warm for gaming, I think temps on this custom card would hit near 90°C unless I turn on the air-con to keep it cool, which is going to give me nightmares on my electricity bills. Heck, even my 2+ year old GTX 760 I've reused for the test rig barely uses 200 W on load. For a gamer who wants the best card money can buy, one that pushes all current and new games at 1080p or higher without issues and doesn't break the wallet, I would recommend a custom GTX 980 Ti like the one Gigabyte has just released.


I think you should have put your hopes on the Fury lineup. The sole purpose of the 300 series is to clear out 200 series stock, and once that purpose is reached, the price will cool down.
Note that this card has the potential to shine in DX12, which its rivals from nVidia don't have. You can read more here: http://www.techpowerup.com/forums/t...-amd-and-foe-to-nv.213658/page-2#post-3301837


----------



## Tsukiyomi91 (Jun 22, 2015)

Well, only time will tell... I might keep an eye on Fury once official benches are up. For DX12 performance, I think it will make up for its lackluster showing and, hopefully, there will be some owners who will get it. If AMD can deliver like the Fury did, then it's a good sign that they can finally show some confidence in the pixel-pushing department.


----------



## Kyuuba (Jun 22, 2015)

A step back from the previous generation, unless the card is intended for DX12...


----------



## Performer81 (Jun 24, 2015)

W1zzard, can you please upload the 390X PCS+ BIOS to the database?


----------



## W1zzard (Jun 24, 2015)

Performer81 said:


> w1zzard can you please upload the 390X PCS+ Bios to the database?


I don't have the 390X PCS+, I have the 390 non-X PCS+.


----------



## Performer81 (Jun 24, 2015)

OK, you wrote you have it in the queue, so I thought you had it already.


----------



## eodeo (Jun 24, 2015)

I was looking into getting one of these cards. I love the 290's performance, but I was hoping the 390 would fix the multi-monitor power draw.

Being limited to a single monitor is neither fun nor advantageous. I just don't get AMD; they should at least have tried to fix this glaring problem. What did they do except slightly OC the card... why bother releasing the 3xx series at all....

And now even the Fury has the same problem. Sure, it has lower multi-monitor consumption, but only because the memory is locked to 500 MHz and is so efficient. Video playback still kicks the GPU to 3D clocks and messes everything up. If it were up to AMD, they would happily push those memory clocks up to a quadrillion GHz while idle with a second monitor connected...


----------



## W1zzard (Jun 24, 2015)

Performer81 said:


> OK, you wrote you have it in the queue, so I thought you had it already.


My mistake, I confused the 390 with the 390X. I do have the 390 non-X.


----------



## Tibor Hazafi (Jun 30, 2015)

Are the FPS numbers in TechPowerUp game benchmarks generally minimum or average values?


----------



## eidairaman1 (Aug 17, 2015)

fullinfusion said:


> So they still can't cool that chip down. What a burn.


I'm wondering if I should go full WC on my rig. What do you think?


----------

