Thursday, June 22nd 2017
Radeon RX Vega Needs a "Damn Lot of Power:" AIB Partner Rep
AMD is dragging its feet with the launch of its next performance/enthusiast-segment graphics card based on the cutting-edge "Vega 10" silicon, the Radeon RX Vega. The last we heard, the company will announce the product in late July or early August, along the sidelines of SIGGRAPH 2017. The company already put out specifications of the first consumer product based on this silicon, the Radeon Pro Vega Frontier Edition; and according to listings by online retailers, its power figures aren't looking good. The air-cooled version has its TDP rated at 300W, and the faster liquid-cooled variant at 375W. This is way above the 275W TDP of the TITAN Xp, NVIDIA's fastest client-segment graphics card.
An MSI company representative posting on Dutch tech forums confirmed our worst fears: that the RX Vega will have a very high power draw. "Specs van Vega RX gezien. Tering wat power heeft die nodig. Wij zijn er aan bezig, dat is een start dus launch komt dichterbij," said the representative, who goes by "The Source" on the Dutch tech forum Tweakers.net. Relying on Google Translate, and citing VideoCardz (which cited a native Dutch speaker), the MSI rep's statement translates as "I've seen the specs of Vega RX. It needs a damn lot of power. We're working on it, which is a start, so launch is coming closer."
Sources:
VideoCardz, Tweakers.net (forums)
95 Comments on Radeon RX Vega Needs a "Damn Lot of Power:" AIB Partner Rep
Yes, it will be just fine.
POWER USAGE and POWER AVAILABLE are 2 (TWO) different things!
If the performance per watt is good relative to whatever reference we use, then 250W and 300W are awesomeee! The more the better.
Power available is only one part of a graphics card's design specification!
And I'm commenting on a marketing guy!? NVM!!
Cheers!
{@}
^(^
Peace!
An MSI marketing director said this... he's already out the door. Or MSI wants miners not to hold out for Vega and instead buy MSI's stock of 1080s and 1080 Tis at their now rapidly inflating prices. The idea that you might not mine currency as fast, but do it with less energy, is NVIDIA's mantra toward mining, and it's a premise that can work. So here's some marketing shill at MSI who has done both MSI and NVIDIA a favor to help line their pockets...
I'm not saying Vega isn't going to want power, but it *might* deliver better perf/W, so it's relative. NVIDIA is working that same angle with mining: you don't unearth coins as fast, but you do it with less energy; it just takes longer to see a return on the investment. Here's the thing: you want to find coins as fast as you can, since who knows when mining will again "go bust" and you'll need to offload both the currency and either company's GPUs. It comes down to ROI, and the guys with more coin, sold at high prices, will almost always come out ahead, not the ones counting on used-equipment resale.
The article should have titled it as: >>Vega available with 250W and 300W of power!<< THAT'S IT!
When you see GPUs with TDPs of 120W, 150W, 180W, 250W, 300W, etc., a higher number generally means more energy consumption. GPU TDPs are also much closer to typical gaming consumption than CPU TDPs (CPUs usually run hotter during encoding), so there is no denying that a graphics card with a TDP of 300W is going to consume a lot. All graphics cards use board TDP.
In a way, I'm ok with RX Vega having higher consumption. At least god damn miners won't take all of them immediately.
Also, what's Maxwell 2? Do you mean Pascal, or?...
The only advantage Vega has right now is cheaper FP16 performance, which won't matter for gaming anytime soon.
@RejZoR - I've never heard of a GPU using core power for its power rating... do you have a link to that being used? In fact, I've never heard of the term.
For all: board power includes everything, AFAIK. It doesn't matter if it's HBM or not; board power is the power use of the whole card, not a "part" of it. And while TDP isn't power used, it is very close, and most would consider it a good ballpark for actual power use.
Well, it needs to be 20% faster than a 1080 Ti to take that crown from it. Considering we know the 1080 Ti's board power is 250W and this one's is 300W, that's 20% more. The problem is, it's predicted to be as fast, not 20% faster... and that's not even talking about the 375W part. No. Wow... no, from the first para: yes, it's TDP, but again, it's plenty close enough!
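That break-even arithmetic (rumored 300W board power against the 1080 Ti's 250W) can be sketched in a few lines of Python. The relative-performance figures below are placeholders, since no real Vega benchmarks existed yet; only the wattages come from the thread:

```python
# Rough perf-per-watt comparison; performance numbers are placeholders
# (the rumor was that RX Vega roughly matches a GTX 1080 Ti).
def perf_per_watt(relative_perf, board_power_w):
    """Relative performance delivered per watt of board power."""
    return relative_perf / board_power_w

gtx_1080_ti = perf_per_watt(1.0, 250)  # baseline card at 250 W
rx_vega = perf_per_watt(1.0, 300)      # rumored 300 W, equal performance

# At 300 W, Vega needs 300/250 = 1.2x the 1080 Ti's performance
# just to match its perf/W -- the "20% faster" figure in the post.
breakeven = 300 / 250

print(f"1080 Ti: {gtx_1080_ti:.5f} perf/W")
print(f"RX Vega: {rx_vega:.5f} perf/W")
print(f"break-even performance ratio: {breakeven:.2f}x")
```

If the real card lands below that 1.2x ratio, the 1080 Ti keeps the perf/W edge.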
But... told you so. This is just Fury X v2: pointless HBM that is way too fast for the core it's coupled with, in a desperate attempt to leave more TDP budget for a crapload of GCN cores that still date back to 2012.
Think of it this way: if you have a square, 1 meter by 1 meter, producing some amount of heat consistently across its surface, and you reduce its size by half on each side (0.5m x 0.5m) while the heat produced stays the same, that smaller surface will be hotter, even though it's producing the same amount of heat. What changed is that the rate of heat dissipation does not remain constant, because the available area to conduct heat has been reduced; the end result is that a higher temperature difference is required for the heat to flow fast enough. As a result, circuits themselves get hotter the smaller you make them, and when you cool them the same way, you end up with higher circuit temperatures that bring everything back to equilibrium but ultimately become the limiting factor in how fast they can operate.
tl;dr: Too much heat in too small of an area is to blame, not HBM.
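The square-shrinking argument above can be checked with a trivial heat-flux calculation. This is a sketch with the post's own made-up geometry (a 1m and a 0.5m square), not real GPU data:

```python
# Heat-flux sketch for the "smaller die runs hotter" argument:
# pushing the same power through a quarter of the area quadruples
# the heat flux density (W per square meter).
def heat_flux(power_w, side_m):
    """Heat flux density in W/m^2 for a square source of the given side."""
    return power_w / (side_m ** 2)

full = heat_flux(100.0, 1.0)    # 100 W over a 1 m x 1 m square
small = heat_flux(100.0, 0.5)   # same 100 W over 0.5 m x 0.5 m

print(full, small, small / full)  # 100.0 400.0 4.0
```

Same power, quarter the area, four times the flux density: that is why the smaller source needs a larger temperature difference to shed the same heat.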
As for me, the article just indicates that Vega will be available at up to 250W and 300W of power usage. Nothing more.
The first post follows another rumor from The Tech Report showing 300W/375W: techreport.com/news/32112/rumor-vega-frontier-edition-board-power-ratings-surface
Example: if AMD's Vega comes out at the same price as a 1080 Ti, that's fine; if performance is similar, that's fine; if it uses more wattage, that's also fine. What I would like to see is minimum frame rate. If the Vega card can sustain a higher minimum frame rate with fewer performance dips thanks to its HBC setup, then it's a worthwhile offering even with the higher consumption. My gut tells me it's too little, too late.
But power usage does not compute, because most systems sit idle or off the majority of the time. The higher power usage means diddly squat. For the sake of argument:
375W vs. 250W at 10 cents per kWh works out to $27 vs. $18 per month to run said GPUs at maximum performance 24/7. Cut that in half, to 12 hours a day, and that's $13.50 vs. $9 per month. None of us play games for 12 hours a damn day, so let's say 6 hours (still a bit much, but a more believable amount): that's $6.75 vs. $4.50 per month. At 3 hours per day of gaming, a 375W Vega vs. a 250W 1080 Ti would end up being $3.37 vs. $2.25.
Efficiency aside, that's a $1 a damn month difference; that's what 125W between two GPUs will realistically work out to. So unless you live in a third-world country (where you probably can't get a Vega or a 1080 Ti anyway), the wattage difference between GPUs is basically BS unless you run the cards full bore 24/7, something 99% of gamers will not do.
Keep in mind I used $0.10 per kWh. My local rate is $0.07, so even cheaper, lol. I'm pretty sure you can save more money by driving a more efficient car, buying cheaper shit tickets, or taking shorter showers than by buying a GPU based on power usage.
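The cost figures above all follow from one kWh formula; here is a quick sketch of it. The $0.10/kWh rate and the 30-day month are the post's own assumptions:

```python
# Monthly electricity cost for the 375 W vs. 250 W comparison above.
def monthly_cost(watts, hours_per_day, usd_per_kwh=0.10, days=30):
    """Cost in USD of running a constant load for the given hours each day."""
    kwh = watts / 1000.0 * hours_per_day * days
    return kwh * usd_per_kwh

# The post's scenarios: 24, 12, 6 and 3 hours of full-load gaming per day.
for hours in (24, 12, 6, 3):
    vega = monthly_cost(375, hours)
    ti = monthly_cost(250, hours)
    print(f"{hours:2d} h/day: ${vega:.2f} vs ${ti:.2f}")
```

At 24 h/day this reproduces the $27 vs. $18 figures, and each halving of the hours halves both costs.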