
Radeon RX Vega Needs a "Damn Lot of Power:" AIB Partner Rep

So the source is HW Battle with regard to "AMD dragging their feet," and the high-power claim is quoting the specs that we can all read, as well as the MSI rep.

Are these the actual facts without any editorial embellishment?
 
So just how much damn power is this thing going to draw from the PSU?
 
I only really care about performance. 100 extra watts is no issue for me, and hey! Maybe it'll keep the miners away so people can actually buy them.

The rep also implied that the performance was worth the power draw.
 
https://www.techpowerup.com/reviews/EVGA/GTX_1080_Ti_SC2/34.html

Every 1080 Ti has a TDP adjustment limit of 330-350-384 W if you don't use the stock FE. Why are we panicking over "RX Vega is gonna use a lot of power" again?

We're talking about graphics cards that will make our eyes bleed rainbows. Of course they need a shit ton of power to do that.

All of the very high-end cards eat power: 980 Ti, 1080 Ti, Fury X. It is foolish to think that either of the two companies will put a graphics card on the table that does flawless, buttery-smooth 4K gaming at ~150 W power consumption.
In 2017/2018 it's just not gonna happen.

P.S.: We all saw the MSI 1080 Ti Lightning card with its 3×8-pin power connectors, meaning the card can draw up to 525 W. I can't see why a 300 W TDP card would be an issue on the current market.
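The 525 W figure follows from the standard PCIe power budget: 75 W from the slot plus 150 W per 8-pin connector. A quick sanity check:

```python
# PCIe power budget behind the 525 W figure for a 3x8-pin card.
# Per-source limits from the PCIe CEM spec: 75 W slot, 150 W per 8-pin.
SLOT_W = 75
EIGHT_PIN_W = 150

max_draw = SLOT_W + 3 * EIGHT_PIN_W
print(max_draw)  # 525
```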
 
Guys, this could be the required power spec; if I recall correctly, the RX 480 had similar specifications.
 
I'm seeing TDP numbers but I don't see power requirements. Unless I'm overlooking it.

The power requirements are not listed officially, but the TDP gives you a good idea.

It's going to be interesting to see how AIB partners try to dissipate that 300+ W on the non-pro (or whatever) non-reference cards.
 
What's wrong with reference cards?
 
Why is everyone surprised? Currently Pascal is ~80% more energy efficient than Polaris. Even though Vega will be better, is anyone seriously thinking that it will reduce this gap to less than 10%?

Even if they cut the efficiency gap in half, a Vega clocked to "compete" with GP102 will still easily pass 300W TDP.
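The "easily pass 300 W" claim can be back-of-the-enveloped from the poster's own assumptions (the ~80% efficiency lead, the halved gap, and the 1080 Ti's 250 W reference TDP); none of these numbers are measurements:

```python
# Rough check of the efficiency-gap argument, using the poster's
# assumed numbers, not real benchmark data.
pascal_vs_polaris = 1.80   # Pascal ~80% better perf/W than Polaris
vega_vs_polaris = 1.40     # Vega "cuts the efficiency gap in half"
gp102_tdp_w = 250          # GTX 1080 Ti reference TDP

# Power Vega would need to match GP102 performance at that efficiency:
vega_tdp_w = gp102_tdp_w * pascal_vs_polaris / vega_vs_polaris
print(round(vega_tdp_w))   # ~321, i.e. past a 300 W TDP
```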
 
Let's just hope they didn't round down like Polaris.
 
So the source is HW Battle with regard to "AMD dragging their feet," and the high-power claim is quoting the specs that we can all read, as well as the MSI rep.

Are these the actual facts without any editorial embellishment?

That's the sad state of PC tech journalism on the internet. Anybody can post any junk and it gets published for page clicks. Anyway, we have to endure such crap till early August, when RX Vega arrives.
 
I only really care about performance. 100 extra watts is no issue for me, and hey! Maybe it'll keep the miners away so people can actually buy them.

I actually almost pre-ordered one last night...
 
I think that the rep also implied that the performance was worth the power draw.
 
TDP is no longer a good indicator of power requirements, as we have seen with Skylake-X. :laugh:
It was never really an indicator of power consumption; however, if Vega runs as hot as or hotter than Fury, then it'll definitely be a problem, 350 W TDP or not!
 
You should not care about power draw on cards like these.
You only care if you have at least a couple dozen of them running at full load in some sort of supercomputer. That is where it matters.
But for us mostly single-GPU users, it won't matter if it's 250 or 300 or 350 W. For what, for dissipated heat? For extra power costs?
What is the advantage of 250 W versus 350 W?
You are not going to hold your Vega at 100% full load 24/7, and even if you did, the difference from 250 W to 350 W is 2.4 kWh per day, which (at the ~0.30 EUR/kWh that implies) translates to 0.72 EUR per day, or about 21 EUR per month. And that's only if you keep it at full load, which as a gamer you never will.
Chances are you are getting more performance for those 21 EUR per month in a full-load scenario. So if you had the couple of hundred euros to pay for the card, you sure as hell won't mind paying a bit more for power usage.
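The cost math above can be sketched quickly; the 0.30 EUR/kWh rate is the one implied by the poster's own figures (0.72 EUR / 2.4 kWh), not an official tariff:

```python
# Monthly cost of an extra 100 W of GPU power draw at full load, 24/7.
# Assumes 0.30 EUR/kWh, the rate implied by the post's numbers.
def monthly_cost_eur(extra_watts, hours_per_day=24.0, days=30, eur_per_kwh=0.30):
    kwh_per_day = extra_watts / 1000.0 * hours_per_day
    return kwh_per_day * eur_per_kwh * days

delta = 350 - 250  # extra watts
print(round(monthly_cost_eur(delta), 2))  # 2.4 kWh/day -> 0.72 EUR/day -> 21.6 EUR/month
```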

That being said, I'm waiting for the dual-GPU version of this, and I'm going to get myself two of them for 4 GPUs in total; then we'll see how much the power draw is at full load.
I personally won't mind at all if the dual Vega card has 4×8-pin power plugs. I can easily accommodate 8×8-pin power plugs from my two Antec High Current Pro 1300 W units.
So bring it on, AMD, I'm waiting.
 
Not a big deal.
I would never go under a 1 kW PSU because I want to stay far away from any power limitation.
Now you can find cheap quality 1 kW PSUs, and they don't need to be big.
There are models with a very small case, such as the EVGA 1000 GX.
mATX motherboard + Intel 6-10 cores.

This could only be a problem for guys who thought 700 W was enough for everything, kept an Intel X-chipset build, and are now due for a GPU upgrade.
Even then, 750 W should be enough, but the fan will work like crazy.
 
Just finished reading the TPU Facebook page. I find it quite fun matching up TPU forum users to their actual humans on Facebook. I think I've linked up to 6 now. lol
You have way too much time on your hands. :laugh:
 
It's so funny looking at peasants raving about the power consumption of a top-of-the-line card. It's about as stupid as raving about the MPG of a Ferrari F12...
 
Let's recap this article for a moment:
Hi everyone, here are the power specs of a couple of cards that are not the ones we are referencing in this article!
On another story, hear this:
A guy told another guy that somewhere on a German forum, someone the first guy thinks is an MSI rep wrote a long post of which exactly 24 words were
"Specs van Vega RX gezien. Tering wat power heeft die nodig. Wij zijn er aan bezig, dat is een start dus launch komt dichterbij," (Dutch: "Seen the RX Vega specs. Damn, it needs a lot of power. We're working on it, that's a start, so the launch is getting closer,")
with a comma at the end, because no one quoted anyone else out of context.
We imply Vega is bullcrap because the rumors say it draws 25 more watts than a 1080 Ti, so let's all go out and buy Nvidia.
 
Oh dear, it's not like AIB 1080s consume well beyond 200 W.
Also, by the same partner:

[image attachment]


Another comment (Bits&Chips relying on unnamed sources) on aggressive price/perf to expect from Vega (and Navi being the game changer in enterprise):
https://twitter.com/BitsAndChipsEng/status/877972194755837952

So, while AMD definitely won't beat nVidia in perf/watt, let alone as badly as back in Fermi times, great performance/$ is expected.

And then there is undervolting for environmentally friendly users.
 