Thursday, June 22nd 2017

Radeon RX Vega Needs a "Damn Lot of Power:" AIB Partner Rep

AMD is dragging its feet with the launch of its next performance/enthusiast-segment graphics card based on the cutting-edge "Vega 10" silicon, the Radeon RX Vega. The last we heard, the company will announce the product in late July or early August, on the sidelines of SIGGRAPH 2017. The company has already put out specifications of the first consumer product based on this silicon, the Radeon Pro Vega Frontier Edition; and according to listings by online retailers, its power figures aren't looking good. The air-cooled version has its TDP rated at 300W, and the faster liquid-cooled variant at 375W. This is well above the 275W TDP of the TITAN Xp, NVIDIA's fastest client-segment graphics card.

An MSI company representative posting on Dutch tech forums confirmed our worst fears: the RX Vega will have a very high power draw. "Specs van Vega RX gezien. Tering wat power heeft die nodig. Wij zijn er aan bezig, dat is een start dus launch komt dichterbij," said the representative, who goes by "The Source" on the Dutch tech forum Tweakers.net. Going by Google Translate, and by VideoCardz citing a native Dutch speaker, the statement translates as: "I've seen the specs of Vega RX. It needs a damn lot of power. We're working on it, which is a start, so launch is coming closer."
Sources: VideoCardz, Tweakers.net (forums)

95 Comments on Radeon RX Vega Needs a "Damn Lot of Power:" AIB Partner Rep

#51
_larry
I just upgraded to a Corsair CX850 (850W). This should be enough to handle this beast, correct? Just one of them xD
#52
rtwjunkie
PC Gaming Enthusiast
_larryI just upgraded to a Corsair CX850 (850W). This should be enough to handle this beast, correct? Just one of them xD
Really? I know you are kidding us. :)

Yes, it will be just fine.
#53
_larry
rtwjunkieReally? I know you are kidding us. :)

Yes, it will be just fine.
lol Yes, a tad bit of trolling. I thought about getting an awesome water loop setup and overclocking the hell out of my R9 290 and Ryzen 1600X, but I think I will wait for Vega.
#54
DeOdView
To all the fanboys and girls of NV and AMD, Intel included:

POWER USAGE and POWER AVAILABLE are 2 (TWO) different things!

If the performance/watt is good based on whatever reference we use, then 250W and 300W are awesomeee! The more the better.

Power available is only a part of the graphics card design specification!

And I'm commenting on a marketing guy!? NVM!!

Cheers!

{@}
^(^
Peace!
#55
Prince Valiant
RejZoRIt's so funny looking at peasants raving about the power consumption of a top-of-the-line card. It's about as stupid as raving about the MPG of a Ferrari F12...
One more thing to claim dominance over while they OC the 1080 Ti and push up power usage o_O.
#56
RejZoR
DeOdViewTo all the fanboys and girls of NV and AMD, Intel included:

POWER USAGE and POWER AVAILABLE are 2 (TWO) different things!

If the performance/watt is good based on whatever reference we use, then 250W and 300W are awesomeee! The more the better.

Power available is only a part of the graphics card design specification!

And I'm commenting on a marketing guy!? NVM!!

Cheers!

{@}
^(^
Peace!
Exactly. If a card has a 500W TDP and consumes just 200W during actual operation, it just means it's far over-engineered (a highly exaggerated example, but you get the picture). It's also important to distinguish core TDP from board TDP, which is a bit tricky with Vega since it has HBM2 on the same interposer as the chip. Is this counted within the core design or the board design?
#57
Casecutter
WOW, HOW QUICK WE LOOK AWAY...

An MSI marketing director said this... he's already out the door. Or MSI wants miners not to hold out for Vega and instead buy MSI's stock of 1080s and 1080 Tis at their now rapidly inflating prices. The idea that you might not find currency as fast, but do it with less energy, is the NVIDIA mantra toward mining, and it's a premise that can work. So here's "some marketing shill" at MSI who has done both MSI and NVIDIA a favor to help line their pockets...

I'm not saying Vega isn't going to want power, but it *might* deliver better perf/W, so it's relative. NVIDIA is working that same angle with mining, saying you don't unearth as fast, but you do it with less energy; it just takes longer to see a return on the investment. Here's the thing: you want to find as fast as you can, as who knows when it will again "go bust" and you'll need to relieve yourself of the currency and either company's GPUs. It comes down to ROI, and the guys with more coin who sold at high prices will almost always come out ahead, not the ones counting on used-equipment resale.
#58
dozenfury
RejZoRExactly. If a card has a 500W TDP and consumes just 200W during actual operation, it just means it's far over-engineered (a highly exaggerated example, but you get the picture). It's also important to distinguish core TDP from board TDP, which is a bit tricky with Vega since it has HBM2 on the same interposer as the chip. Is this counted within the core design or the board design?
The power requirements are usually very close to what they rate them at, within about 25W plus or minus. They aren't going to say it requires 300W when it only pulls 100W. We're talking wattage under load, also; pretty much any card will have a much lower power draw when it's in 2D, surfing the net.
#59
DeOdView
dozenfuryThe power requirements are usually very close to what they rate them at, within about 25W plus or minus. They aren't going to say it requires 300W when it only pulls 100W. We're talking wattage under load, also; pretty much any card will have a much lower power draw when it's in 2D, surfing the net.
YES, but what is the PERFORMANCE PER WATT?! That's the most critical information missing. Without it, how do you compare?

The article should have titled it: >>Vega available with 250W and 300W of power!<< THAT'S IT!
#60
efikkan
RejZoRExactly. If a card has a 500W TDP and consumes just 200W during actual operation, it just means it's far over-engineered (a highly exaggerated example, but you get the picture).
TDPs always include a small margin, and as we can see with CPUs grouped into TDP brackets like 65W, 91W, 140W, etc., some CPUs are going to land at the lower end of each bracket and some at the higher end.

When you see GPUs with TDPs of 120W, 150W, 180W, 250W, 300W, etc., a higher number generally means more energy consumption. GPU TDPs are also much closer to typical gaming consumption than CPU TDPs are to typical CPU loads (CPUs usually run hotter during encoding), so there is no denying that a graphics card with a TDP of 300W is going to consume a lot.
RejZoRIt's also important to distinguish core TDP from board TDP, which is a bit tricky with Vega since it has HBM2 on the same interposer as the chip. Is this counted within the core design or the board design?
All graphics cards use board TDP.
#61
KainXS
BytalesYou should not care about power draw on such cards.
You only care if you have at least a couple dozen of them working at full load in some sort of supercomputer. That is where it matters.
But for us mostly single-GPU users, it won't matter if it's 250 or 300 or 350W. For what, for dissipated heat? For extra power costs?
What is the advantage of 250W versus 350W?
You are not going to hold your VEGA at 100% full load 24/7, and even if you did, the difference from 250W to 350W is 2.4 kWh per day, which translates to 0.72 EUR per day, or about 21 EUR per month. That's if you keep it at FULL load, which as a gamer you never will.
Chances are you are getting more performance for those 21 EUR per month in a full-load scenario. So if you had the couple hundred euros to pay for the card, you sure as hell won't mind paying a bit more for power usage.

That being said, I'm waiting for the DUAL-GPU version of this, and I'm going to get myself two of them for 4 GPUs in total, and see then how much the power draw is at full load.
I personally won't mind at all if the DUAL VEGA card has 4x 8-pin power plugs. I can easily accommodate 8x 8-pin power plugs from my two Antec High Current Pro 1300W units.
So bring it on, AMD, I'm waiting.
It's high, but as long as they can cool it, I agree it shouldn't be an issue. I haven't heard anything about a dual consumer-level Vega card. It looks like there's a FirePro/Radeon Pro dual Vega, but no consumer one so far. Has anyone heard about a consumer dual Vega in development?
#62
Octopuss
Assuming the rumor is true, how is it possible? I thought power consumption continually goes down as CPUs/graphics cards are built on smaller processes.
#63
RejZoR
Not when you try to make them ridiculously fast. More transistors mean more power consumption. With smaller nodes, you're just shifting the inevitable. When you hit the limit, like it was with 28nm, you have to go radical like NVIDIA did with Maxwell 2.

In a way, I'm ok with RX Vega having higher consumption. At least god damn miners won't take all of them immediately.
#64
Octopuss
What did Nvidia do?
Also, what's Maxwell 2? Do you mean Pascal, or?...
#65
Basard
OctopussAssuming the rumor is true, how is it possible? I thought power consumption continually goes down as CPUs/graphics cards are built on smaller processes.
Yeah, sure it does.... Until they increase the core count and clock speed.
#66
efikkan
RejZoRNot when you try to make them ridiculously fast. More transistors mean more power consumption. With smaller nodes, you're just shifting the inevitable. When you hit the limit, like it was with 28nm, you have to go radical like NVIDIA did with Maxwell 2.

In a way, I'm ok with RX Vega having higher consumption. At least god damn miners won't take all of them immediately.
Heat is becoming a really serious issue when approaching 300W. Most of us want a card that can remain stable for ~5 years. I wouldn't complain if AMD managed to get within 5-10% of NVIDIA's efficiency, but even a 20-30% gap really limits their competitiveness in the upper range, and now that it's more like ~80% they even struggle in the mid-range. Energy efficiency is now the main limiting factor for performance scaling.

The only advantage Vega has right now is cheaper FP16 performance, which won't matter for gaming anytime soon.
#67
jabbadap
OctopussWhat did Nvidia do?
Also, what's Maxwell 2? Do you mean Pascal, or?...
Maxwell 1 is GM108 and GM107, aka the GTX 750/Ti. Maxwell 2 is GM206, GM204, and GM200 (the GTX 900 series), all on TSMC 28nm like Kepler was. NVIDIA stripped out more FP64 with Maxwell than they had with Kepler, and by doing that could add more FP32 CUDA cores while staying on the same 28nm node.
#68
EarthDog
DeOdViewTo all the fanboys and girls of NV and AMD, Intel included:

POWER USAGE and POWER AVAILABLE are 2 (TWO) different things!

If the performance/watt is good based on whatever reference we use, then 250W and 300W are awesomeee! The more the better.

Power available is only a part of the graphics card design specification!

And I'm commenting on a marketing guy!? NVM!!
Power use and power available are two different things... true... but how is that relevant here? Power available is just figured by adding up the PCIe slot and the two connectors... assuming that's what you meant (it's what was said, anyway...)
KainXSIt's high, but as long as they can cool it, I agree it shouldn't be an issue. I haven't heard anything about a dual consumer-level Vega card. It looks like there's a FirePro/Radeon Pro dual Vega, but no consumer one so far. Has anyone heard about a consumer dual Vega in development?
They are cooling it with a 120mm AIO. Most would agree those are good for ~100W... and we have up to 375W, stock, to cool with it. Hell, they used a 120mm AIO on the 295X2 at 500W... and it worked... but man, that thing got hot! Sorry... 375W on water, 300W on air...

@RejZoR - I've never heard of a GPU using core power for its power rating... you got a link to that being used? In fact, I've never heard of the term...

For all - board power includes everything, AFAIK. Doesn't matter if it's HBM or not. Board power is the power use of the card, not a 'part' of it. And while TDP isn't power used, it is very close, and most would consider it a good ballpark for actual power used.
DeOdViewYES, but what is the PERFORMANCE PER WATT?! That's the most critical information missing. Without it, how do you compare?

The article should have titled it: >>Vega available with 250W and 300W of power!<< THAT'S IT!
Well, it needs to be 20% faster than a 1080 Ti to take that away from it... considering we know the 1080 Ti's board power is 250W... this one is 300W, 20% more. Problem is, it's predicted to be as fast, not 20% faster... and that's not even talking about the 375W part...
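A rough sketch of that break-even math, assuming the rumored board-power figures from the article and performance normalized to the 1080 Ti (all numbers illustrative, not measured):

```python
# Perf/W break-even: how much faster must RX Vega be, at a given board power,
# to merely match the GTX 1080 Ti's performance per watt?
# Relative performance is in arbitrary units (1080 Ti = 1.0); 250W/300W/375W
# are the board-power figures discussed above.

def perf_per_watt(relative_perf: float, board_power_w: float) -> float:
    return relative_perf / board_power_w

ti_ppw = perf_per_watt(1.0, 250)  # GTX 1080 Ti baseline

for vega_power in (300, 375):
    breakeven_perf = ti_ppw * vega_power  # relative perf needed to tie on perf/W
    print(f"At {vega_power}W, Vega needs to be {(breakeven_perf - 1) * 100:.0f}% "
          f"faster than a 1080 Ti just to match its perf/W")
```

At 300W that works out to 20% faster, and at 375W to 50% faster, which is exactly the gap being described.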
Vlada011Not a big deal.
I would never go under a 1kW PSU because I want to be far away from any power limitation.
Now you can find cheap, quality 1kW PSUs, and they don't need to be big.
There are models with very small PSU cases, such as the EVGA 1000 GX.
mATX motherboard + Intel 6-10 cores.

This could only be a problem for guys who thought 700W was enough for everything, kept an Intel X-chipset board, and now it's time for a GPU upgrade.
Even then, 750W should be enough, but the fan will work like crazy.
No. Wow...no.
trparkySo just how much damn power is this thing....
From the first para... yes, it's TDP, but again, it's plenty close enough!
btarunrThe air-cooled version has its TDP rated at 300W, and the faster liquid-cooled variant at 375W. This is well above the 275W TDP of the TITAN Xp, NVIDIA's fastest client-segment graphics card.
#69
Vayra86
I won't say 'told you so'

But... told you so. This is just Fury X v2, pointless HBM that is way too fast for the core it is coupled with in a desperate attempt to leave more TDP budget for a crapload of GCN cores that still date back to 2012.
#70
Aquinus
Resident Wat-man
Vayra86But... told you so. This is just Fury X v2, pointless HBM that is way too fast for the core it is coupled with in a desperate attempt to leave more TDP budget for a crapload of GCN cores that still date back to 2012.
The HBM memory controller is a lot smaller than the GDDR5 one, even more so if you're considering the 512-bit-wide bus on the 290(X) and 390(X) cards. 4096 shaders all together is a lot of compute, which is going to create a lot of heat, which is going to self-limit how fast it can go. I don't think HBM is the problem. I think the problem is that AMD has to find a better way to spread those GCN cores out so all the heat being generated isn't so concentrated, because otherwise you have to resort to more innovative ways to cool the chip. The problem isn't so much that it makes a lot of heat, but rather that the circuit gets hot faster than the heat can move through the GPU material between it and the heat sink. Even with the same temperature gradient, it takes more time for heat to move a larger distance, and as circuits get smaller, heat gets concentrated in a smaller area, which impacts the temperature of the circuit but not necessarily the rate at which heat can be removed from the system.

Think of it this way: if you have a square, 1 meter by 1 meter, producing some amount of heat evenly across its surface, and you halve each side (0.5m x 0.5m, a quarter of the area) while the heat produced stays the same, that smaller surface will be hotter even though it's producing the same amount of heat. The rate of heat dissipation does not stay constant, because the area available to conduct heat has been reduced; the end result is that a higher temperature difference is required for the heat to flow at the same rate. As a result, circuits get hotter the smaller you make them, and when you try to cool them the same way, you end up with higher circuit temperatures. Everything comes back to equilibrium, but this ultimately becomes the limiting factor in how fast they can operate.

tl;dr: Too much heat in too small of an area is to blame, not HBM.
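That relationship falls straight out of Fourier's law of conduction; a minimal sketch, with made-up die sizes and a generic silicon conductivity purely for illustration:

```python
# Steady-state conduction (Fourier's law): dT = P * d / (k * A).
# Push the same heat through a quarter of the area and you need
# four times the temperature difference. Numbers are illustrative only.

def delta_t(power_w: float, thickness_m: float, k: float, area_m2: float) -> float:
    """Temperature rise (K) across a slab conducting `power_w` of heat."""
    return power_w * thickness_m / (k * area_m2)

P = 300.0     # watts of heat, identical in both cases
d = 0.5e-3    # 0.5 mm of material between circuit and heat sink (made up)
k = 150.0     # W/(m*K), roughly silicon's thermal conductivity

big = delta_t(P, d, k, area_m2=4e-4)    # 20mm x 20mm die
small = delta_t(P, d, k, area_m2=1e-4)  # 10mm x 10mm die: half the edge, a quarter the area

print(f"20x20mm die: dT = {big:.1f} K; 10x10mm die: dT = {small:.1f} K")
```

Same 300W, but the smaller die needs a 4x larger temperature drop to move it: "too much heat in too small of an area" in a nutshell.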
#71
DeOdView
EarthDogPower use and power available are two different things... true... but how is that relevant here? Power available is just figured by adding up the PCIe slot and the two connectors... assuming that's what you meant (it's what was said, anyway...)

They are cooling it with a 120mm AIO. Most would agree those are good for ~100W... and we have up to 375W, stock, to cool with it. Hell, they used a 120mm AIO on the 295X2 at 500W... and it worked... but man, that thing got hot! Sorry... 375W on water, 300W on air...

@RejZoR - I've never heard of a GPU using core power for its power rating... you got a link to that being used? In fact, I've never heard of the term...

For all - board power includes everything, AFAIK. Doesn't matter if it's HBM or not. Board power is the power use of the card, not a 'part' of it. And while TDP isn't power used, it is very close, and most would consider it a good ballpark for actual power used.

Well, it needs to be 20% faster than a 1080 Ti to take that away from it... considering we know the 1080 Ti's board power is 250W... this one is 300W, 20% more. Problem is, it's predicted to be as fast, not 20% faster... and that's not even talking about the 375W part...

No. Wow...no.

From the first para... yes, it's TDP, but again, it's plenty close enough!
We have no information regarding Vega's performance as of yet. It's all rumored...

As for me, the article just indicated that Vega will be available at up to 250W and 300W of power usage. Nothing more.
#72
EarthDog
DeOdViewWe have no information regarding Vega's performance as of yet. It's all rumored...

As for me, the article just indicated that Vega will be available at up to 250W and 300W of power usage. Nothing more.
Correct, it's all rumor. My reply to you was based on a 1080 Ti, which we know how it performs. I simply stated that performance/W would need to beat something KNOWN, that 1080 Ti I mentioned.

The first post follows another rumor from The Tech Report showing 300/375.
btarunrThe company has already put out specifications of the first consumer product based on this silicon, the Radeon Pro Vega Frontier Edition; and according to listings by online retailers, its power figures aren't looking good. The air-cooled version has its TDP rated at 300W, and the faster liquid-cooled variant at 375W.
techreport.com/news/32112/rumor-vega-frontier-edition-board-power-ratings-surface
#73
R-T-B
RejZoRAt least god damn miners won't take all of them immediately.
They will if they perform well at mining for said wattage...
#74
crazyeyesreaper
Not a Moderator
No one gives a fuck about power usage in reality. It's a BS metric we use to see who can piss farther. What will matter is end-all, be-all performance. If the card is affordable and performs on par with a 1080 Ti, people will buy it. Power usage does not even form a blip on my radar. With a high-end GPU, expect high-end power consumption. All that matters is performance per $ for a GPU.

Example: AMD Vega comes out at the same price as a 1080 Ti; that's fine if performance is similar, and if it uses more wattage, that is also fine. What I would like to see is minimum frame rate. If the Vega card can sustain a higher frame rate with fewer performance dips thanks to its HBC setup, then it's a worthwhile offering even with the higher consumption. My gut tells me it's too little, too late.

But power usage does not compute, because most systems sit idle or off the majority of the time. The higher power usage means diddly squat. For the sake of argument:

375W vs 250W at 10 cents per kWh = $27 vs $18 per month to run said GPUs at maximum performance 24/7. Cut that in half, so 12 hours a day, and that's $13.50 vs $9 per month. None of us play games for 12 hours a damn day. So let's say 6 hours; that's still a bit much, but it's a more believable amount. That's $6.75 vs $4.50 per month. 3 hours per day of gaming on a 375W Vega vs a 250W 1080 Ti would end up being $3.37 vs $2.25.

Efficiency aside, that's a $1-a-damn-month difference: that's what 125W between two GPUs will realistically work out to. So unless you live in a third-world country, where you probably can't get a Vega or a 1080 Ti anyway, the wattage gap between GPUs is basically just BS unless you're running the cards 24/7 full bore, something 99% of gamers will not do.

Keep in mind I used $0.10 per kWh; my local rate is $0.07, so even cheaper, lol. I am pretty sure you can save more money by driving a more efficient car, buying cheaper shit tickets, or taking a shorter shower than by picking a GPU based on power usage.
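Those figures are straightforward to reproduce; a quick sketch of the same arithmetic, assuming the 375W/250W board powers and the $0.10/kWh rate used above:

```python
# Monthly electricity cost of running a GPU at a given board power.
# 375W and 250W are the board powers compared above; $0.10/kWh is the
# assumed rate (kWh = watts / 1000 * hours).

def monthly_cost(watts: float, hours_per_day: float,
                 usd_per_kwh: float = 0.10, days: int = 30) -> float:
    """Cost in USD to run a `watts` load for `hours_per_day` over `days` days."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * usd_per_kwh

for hours in (24, 12, 6, 3):
    vega, ti = monthly_cost(375, hours), monthly_cost(250, hours)
    print(f"{hours:>2}h/day: 375W = ${vega:.2f}/mo, 250W = ${ti:.2f}/mo, "
          f"difference = ${vega - ti:.2f}")
```

At three hours of gaming a day the gap is about $1.12 a month, which is the dollar-a-month point being made.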
#75
trparky
crazyeyesreaperPower usage does not even form a blip on my radar.
I don't know about you, but I care. Not only do I care about the power usage, since electricity isn't cheap, but the heat output isn't great either, since it means I'm going to have to spend more money running my AC more often during the summer months.