# AMD Radeon Pro Vega Frontier Edition TDP and Pricing Revealed



## btarunr (Jun 16, 2017)

AMD Radeon Pro Vega Frontier Edition goes on sale later this month (26 June). It is designed to provide a "gateway" to the "Vega" GPU architecture for graphics professionals and game developers alike, with the consumer graphics product, the Radeon RX Vega, bound for late-July/early-August. The Radeon Pro Vega Frontier Edition, being a somewhat "enterprise-segment" product, was expected to have a slightly lower TDP than its consumer-graphics sibling, since enterprise-segment implementations of popular GPUs tend to have slightly restrained clock speeds. Apparently, either AMD didn't clock the Radeon Pro Vega Frontier Edition low, or the chip has an extremely high TDP. 

According to specifications put out by EXXACT, a retailer that deals in enterprise hardware, the air-cooled variant of the Radeon Pro Vega Frontier Edition has its TDP rated at 300W, while the liquid-cooled variant has its TDP rated as high as 375W. To put this in perspective, the consumer-segment TITAN Xp by NVIDIA has its TDP rated at 275W. EXXACT is claiming big performance advantages in certain enterprise benchmarks such as SPECviewperf and Cinebench. In other news, the air-cooled Radeon Pro Vega Frontier Edition is reportedly priced at USD $1,199, while the liquid-cooled variant is priced at $1,799. Based on the 14 nm "Vega 10" silicon, the Pro Vega Frontier Edition features 4,096 stream processors and 16 GB of HBM2 memory across a 2048-bit memory interface.






*View at TechPowerUp Main Site*


----------



## uuuaaaaaa (Jun 16, 2017)

If those TDP numbers are real, this is looking like the R600 (HD 2900 XTX) all over again... I hope the performance is at least monstrous...


----------



## Frick (Jun 16, 2017)

Whyyyyyy not use that colour scheme for your consumer cards, AMD? That blue they use is so very nice and the gold one is rad.


----------



## _Flare (Jun 16, 2017)

If the RX Vega is much slower than the 1080 Ti, I wouldn't buy one with more than a single 8-pin or 2x 6-pin,
because a TDP over 225 watts would be highly inefficient given the performance the 1080 non-Ti delivers at a much lower TDP.


----------



## Liviu Cojocaru (Jun 16, 2017)

Yep, this is a power-hungry monster, let's hope it delivers. We have to wait for the consumer edition and see what TDP that has. Let the Vega hype resume :)


----------



## Aldain (Jun 16, 2017)

This is not the power consumption of the card itself, it is the MAXIMUM BOARD POWER


----------



## P4-630 (Jun 16, 2017)

300 Watts....

Performance must be really great, otherwise:


----------



## champsilva (Jun 16, 2017)

Aldain said:


> This is not the power consumption of the card itself , it is the MAXIMUM BOARD POWER



Meaning it's freaking hot, even hotter than the Titan Xp; that's why they use WC on their cards.


----------



## HTC (Jun 16, 2017)

Just a question: why are they comparing an "enterprise segment" card (Vega Frontier edition) to a gaming card (Titan Xp)?


----------



## Fluffmeister (Jun 16, 2017)

HTC said:


> Just a question: why are they comparing an "enterprise segment" card (Vega Frontier edition) to a gaming card (Titan Xp)?



Because the AMD Advantage column would also say "--".


----------



## Aldain (Jun 16, 2017)

champsilva said:


> Meaning is freaking hot, even hotter then TTXP, thats why they use WC in their cards.



Nope, it means 1x 8-pin + 1x 6-pin + MB = 300W and 2x 8-pin + MB = 375W. Simple logic; the card itself won't be sucking up 375 watts of power, that would be ludicrous


----------



## I No (Jun 16, 2017)

$1,200 for air cooling, $1,800 for water. That's one hell of a water cooler for $600 extra. And to think I thought Nvidia AIBs were charging extra for gimmicks...


----------



## bogami (Jun 16, 2017)

So the results of these tests come with worse consumption and TDP, via the simple solution of increasing frequency. With some genuinely new solutions and the time invested, advantages of 100% over the current generation were expected. The price is again a serious jump!
With 100% higher prices AMD showed its true face, and the matter will be settled by the real tests! It will again end up selling for 700€ max. The whole thing's price is usurious! Volta is already knocking at the door.
With such delays and these prices they will earn little.


----------



## Deleted member 172152 (Jun 16, 2017)

P4-630 said:


> 300 Watts....
> 
> Performance must be really great, otherwise:


The performance numbers in that chart are from the air-cooled version, which can't cool itself very well. Even the Titan XP/1080 Ti FE should run cooler, so a decent increase in performance can be expected from a) the gaming version and b) AIBs, and it already apparently outperforms a 1080 Ti/Titan XP in synthetic benchmarks! If games are well-optimized and it runs great in DX12/Vulkan, it's gonna be even better than I expected!



I No said:


> $1200 USD for air cooling $1800 for water. That's one hell of a water cooler for $600 extra. And to think I thought nVidia AIBs were charging extra for gimmicks...


Also higher speeds and less throttling.


----------



## Nokiron (Jun 16, 2017)

Hugh Mungus said:


> The performance numbers in that chart are from the air-cooled version, which can't cool itself very well. Even the Titan XP/1080 Ti FE should run cooler, so a decent increase in performance can be expected from a) the gaming version and b) AIBs, and it already apparently outperforms a 1080 Ti/Titan XP in synthetic benchmarks! If games are well-optimized and it runs great in DX12/Vulkan, it's gonna be even better than I expected!


Just an FYI, a cut-down GP104-based Quadro P4000 performs similarly to this Vega FE in Specviewperf.

The Titan XP also has gimped performance in these applications due to the lack of workstation drivers.

So the benchmarks themselves make little to no sense.


----------



## jabbadap (Jun 16, 2017)

Well, it has lots of FP16 power, which Nvidia offers only with the P100 Teslas or the Quadro GP100, and those are not cheap.



> To put this in perspective, the consumer-segment TITAN Xp by NVIDIA has its TDP rated at 275W.



Nvidia quotes graphics card power, which is 250W for the Titan Xp. Where did you get that 275W TDP figure?


----------



## bug (Jun 16, 2017)

Aldain said:


> This is not the power consumption of the card itself , it is the MAXIMUM BOARD POWER


It sure doesn't mean it will be a 150W card either.


----------



## Deleted member 172152 (Jun 16, 2017)

Nokiron said:


> Just an FYI, a cut down GP104 based Quadro P4000 performs similiar to this Vega FE in Specviewperf.
> 
> Titan XP also does have gimped perfomance in these application with the lack of workstation drivers.
> 
> So the benchmarks themselves makes little to no sense.


What resolution though and what resolution is used here?


----------



## I No (Jun 16, 2017)

Hugh Mungus said:


> Also higher speeds and less throttling.




Last time I checked, getting a GPU under water would cost you WAY less than $600... still, something doesn't add up. Also, if Apple wants to stick these into their nice, efficient Macs... well... you get the picture. Furthermore, as the article says, if this is aimed at the prosumer market, the gaming ones should be more aggressive with clock speeds and whatnot. Let's just hope these things deliver; if they prove to be slower than Nvidia's offerings, RTG is in some serious trouble.


----------



## _Flare (Jun 16, 2017)

The exact spec of the mainboard slot says 5.5A on 12V, which means 66 watts; the remaining 9 watts are for 3.3V etc.
The PCIe 12V connectors rely mainly on what your PSU and its cables can deliver.
Specs: 6-pin 6.25A/75W, 8-pin 12.5A/150W.
A good PSU can handle +10% on amps.

spec-conform combinations 12V-only [Amperes] are:
66W   - Slot only [5.5A]
75W   - 1x 6-pin [6.25A]
141W - Slot + 6-pin [11.75A]
150W - 2x 6-pin or 1x 8-pin [12.5A]
216W - Slot + 2x 6-pin or 1x 8-pin [18A]
225W - 1x 6-pin + 1x 8-pin [18.75A]
291W - Slot + 1x 6-pin + 1x 8-pin [24.25A]
300W - 2x 8-pin [25A]
366W - Slot + 2x 8-pin [30.5A]
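The combinations above can be sketched in code. A minimal Python sketch, assuming the per-source 12V current limits quoted in the post (slot 5.5A, 6-pin 6.25A, 8-pin 12.5A); the function name is mine, not any official API:

```python
# Sketch: compute spec-conform 12V power budgets for PCIe power sources,
# using the per-source current limits quoted above (assumed values:
# slot = 5.5 A, 6-pin = 6.25 A, 8-pin = 12.5 A, all on the 12 V rail).
V = 12.0
SOURCES = {"slot": 5.5, "6-pin": 6.25, "8-pin": 12.5}

def budget_watts(sources):
    """Total 12 V power for a list of source names, e.g. ['slot', '8-pin']."""
    amps = sum(SOURCES[s] for s in sources)
    return amps * V

print(budget_watts(["slot"]))                    # 66.0
print(budget_watts(["slot", "6-pin", "8-pin"]))  # 291.0
print(budget_watts(["slot", "8-pin", "8-pin"]))  # 366.0
```

Running it reproduces the 66W, 291W, and 366W entries in the list.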


----------



## Deleted member 172152 (Jun 16, 2017)

I No said:


> Last time I checked getting a GPU under water would cost you WAY less than $600 ... still something doesn't add up. Also if Apple wants to stick these into their nice efficient Macs... well .... you get the picture. Furthermore it's like the article says if this is aimed at the prosumer market the gaming ones should be more aggressive with clockspeeds and whatnot, let's just hope these things deliver, if they prove to be slower than nVidia's offerings RTG has some serious trouble.


So sceptical. Apple gets "special" GPUs and has a huge bulge on its iMac Pro. It's a pro card, so 600 dollars for a certain overclock and lower temps is normal, and Vega will deliver a lot; we just don't know quite how much.


----------



## Nokiron (Jun 16, 2017)

Hugh Mungus said:


> What resolution though and what resolution is used here?


Specviewperf is a standardized benchmark, there are no settings to change.


----------



## Deleted member 172152 (Jun 16, 2017)

Then that's odd.


----------



## rainzor (Jun 16, 2017)

Aldain said:


> Nope , it means 1x8 pin + 1x6pin + MB = 300W and 2x8pin + MB = 375W simple logic, the card itself wont be sucking up 375 watts of power that would be ludicrous



Where do you see a six-pin though? Both cards seem to have 2x 8-pin according to those pictures. There was a 6+8-pin card that Raja had during some presentation, but AFAIK he said it was only an eng. sample and the production cards will have 2x 8-pin.


----------



## I No (Jun 16, 2017)

Hugh Mungus said:


> So sceptical. Apple gets "special" gpu's and has a huge bulge on its imac pro, it's a pro card, so 600 dollars for a certain overclock and lower temps is normal and vega will deliver a lot, we just don't know quite how much.




After the major cock-up they pulled @ Computex? How can one not be a sceptic regarding this? Also, charging $600 for cooling, I'm sorry to say it, justifies jack-all. Furthermore, how do we know that this will deliver? So far everything is in limbo with a paper launch on the way.


----------



## bug (Jun 16, 2017)

I No said:


> Last time I checked getting a GPU under water would cost you WAY less than $600 ... still something doesn't add up. Also if Apple wants to stick these into their nice efficient Macs... well .... you get the picture. Furthermore it's like the article says if this is aimed at the prosumer market the gaming ones should be more aggressive with clockspeeds and whatnot, let's just hope these things deliver, if they prove to be slower than nVidia's offerings RTG has some serious trouble.


AMD is probably binning higher clocking chips for water cooling.


----------



## Duality92 (Jun 16, 2017)

Am I the only one who doesn't care about the power used?

I mean, a 100-watt difference only means like $8.76 a year more if it's working 4 hours a day at 100%, which realistically it isn't near. (That's at what I pay for electricity.)

For someone who pays $0.25 per kWh it will only be like $36 per year for 100 watts ($3 per month).

If you can afford the card, you can afford that increase (IMO).

For rendering, at 20/7 usage, it would be $180 per year; if you're using it for income, $180 per year is only $15 per month. Easily justifiable.
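The arithmetic above can be checked with a one-liner. A minimal sketch (the function and rates are just the post's own example numbers, roughly $0.06/kWh for the poster and $0.25/kWh for the expensive case):

```python
# Sketch of the cost arithmetic above: annual cost of an extra 100 W of draw.
def annual_cost_usd(extra_watts, hours_per_day, usd_per_kwh, days=365):
    """Extra energy in kWh per year times the electricity rate."""
    kwh = extra_watts / 1000.0 * hours_per_day * days
    return kwh * usd_per_kwh

print(round(annual_cost_usd(100, 4, 0.06), 2))   # 8.76  (4 h/day, cheap power)
print(round(annual_cost_usd(100, 4, 0.25), 2))   # 36.5  (4 h/day at $0.25/kWh)
print(round(annual_cost_usd(100, 20, 0.25), 2))  # 182.5 (~20/7 rendering use)
```

The three results line up with the $8.76, ~$36, and ~$180 figures in the post.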


----------



## EarthDog (Jun 16, 2017)

It's really the heat and noise that come with that power that are most people's concern. I'd like to see a 120mm rad cool 300W+ loads...

Oh wait... AMD did that already with the 295X2... and that rad got HOT!!!!


----------



## Duality92 (Jun 16, 2017)

EarthDog said:


> Its really the heat and noise with that power that is most peoples concern. Id like to see a 120mm cool 300W+ loads...
> 
> Oh wait..amd did that already with 295x2...and that rad got HOT!!!!



Fury X too. Don't forget, though, this has a blower too.

But the cooling method is really irrelevant; at 100% the card will put out the same heat air-cooled or water-cooled. It's just the efficiency that changes.


----------



## Nokiron (Jun 16, 2017)

Duality92 said:


> am I the only one who doesn't care about power used?
> 
> I mean, a 100 watt difference only means like $8.76 a year more if it's working 4 hours a day at 100%, which realistically isn't near that. (that's for what I pay for electricity).
> 
> ...


The power bill is not the problem. It's the heat, noise and overclockability.


----------



## Duality92 (Jun 16, 2017)

Nokiron said:


> The power bill is not the problem. It's the heat, noise and overclockability.



So extra heat equivalent to three 60-watt lightbulbs is a concern? I just have a hard time seeing it. I mean, I understand it does make a difference in a room, but there are small changes you can make to reduce the total heat produced in a room to offset this.

(I have a 290X Lightning that pulls 450W ALONE from the wall (680W total draw from the wall with a 7700K); I know all about heat xD.)


----------



## bug (Jun 16, 2017)

Duality92 said:


> am I the only one who doesn't care about power used?
> 
> I mean, a 100 watt difference only means like $8.76 a year more if it's working 4 hours a day at 100%, which realistically isn't near that. (that's for what I pay for electricity).
> 
> ...


What about noise? Dissipating 100W more will always be noisier.


----------



## EarthDog (Jun 16, 2017)

Mentioned at least twice already. 



And derp, the heat load doesn't change with air or water... you really think that was the point there, Duality????


----------



## Duality92 (Jun 16, 2017)

bug said:


> What about noise? Dissipating 100W more, will always be noisier.



Get a better fan. A full-cover waterblock on a 360mm radiator for this card will not be noisier than stock, even with 100W more.

Get a lower ambient; that will improve cooling for the card without being noisier.

Allow your card to run hotter than your target temp; this will increase the delta, thus enabling it to dissipate more heat at the same fan speed, so it won't be noisier either.



EarthDog said:


> Mentioned at least twice already.
> 
> And derp the heatloadload doesnt change with air or water... you really think that was the point there duality????



Nope, and I hope you didn't think that's what I was saying. That's why I said efficiency. Watercooling dissipates the same heat as air cooling, but more efficiently, meaning faster and better.


----------



## EarthDog (Jun 16, 2017)

Oh, so spending $250 more is the answer?? Gotcha.

Get a lower ambient???!!! Yeah, I want to lower my AC 2°C to make a 2°C difference... the costs keep adding up!!!


----------



## Duality92 (Jun 16, 2017)

EarthDog said:


> Oh, so spend $250 more is the answer?? Gotcha.



He said "always"; I was just confirming that it isn't the case, since he applied a blanket statement. No need to be patronizing, we're having a discussion.


----------



## EarthDog (Jun 16, 2017)

Sorry... the ideas need grounding.

You say it doesn't cost much more (power-wise, and that small nugget is true), yet here we are 'discussing' $250+ solutions and lowering ambient to mitigate the noise...


----------



## bug (Jun 16, 2017)

Duality92 said:


> He said "Always" I was just confirming that it wasn't the case applying a blanket statement as he did. No need to be patrionizing, we're having a discussion.


And I was right, too. Because I can apply any of the solutions you mentioned to a card that uses 100W less power, and the end result will be quieter.


----------



## Duality92 (Jun 16, 2017)

bug said:


> Dissipating 100W more, will *always* be noisier.





EarthDog said:


> Sorry... the ideas need grounding. You say it doesn't cost much more, yet here we are 'discussing' $250+ solutions and lowering ambient...



I was answering that specifically to this.


----------



## EarthDog (Jun 16, 2017)

Live in the minutiae...


----------



## xkm1948 (Jun 16, 2017)

The fail is strong with VEGA. Do some homework yourself and compare the FE VEGA to Quadro cards; you will understand why AMD put it against the Titan Xp in professional benchmarks.

Too expensive to produce for such pathetic performance. VEGA is DOA. Move along, people. Hype train next stop: VEGA 20


----------



## bug (Jun 16, 2017)

Duality92 said:


> I was answering that specifically to this.


You can answer whatever you want; physics dictates that when there's less power to dissipate, you can do it with less noise. It's really simple, I'm not even sure why we're discussing this.

Edit: Is power usage the primary factor when deciding to buy such a card? Undoubtedly no. But using more power than another product to deliver roughly the same results is definitely not an advantage. And it's also not something to be ignored, like you think it is.


----------



## notb (Jun 16, 2017)

Duality92 said:


> So extra heat equivalent to three 60 watt lightbulbs is a concern?


I haven't seen a 60W lightbulb for years. :-D


Duality92 said:


> I just have a hard  time seeing it. I mean I understand it does make a difference in a room, but there are small changes you can make to reduce total heat produced in a room to offset this.


Like what? Install air-conditioning? ;-)

It's not just about the cost of electricity. It's also about the actual current needed for the building, certificates and so on.

Moreover, I can understand this is not such a great issue in the sparsely populated US and Canada, but in dense European and Asian countries it's a lot more important.

BTW: here you can find some figures on electricity usage per employee. Not many countries are listed, but Germany vs the US is already pretty instructive.
http://g20-energy-efficiency.enerda...ity-consumption-of-services-per-employee.html


----------



## Deleted member 172152 (Jun 16, 2017)

I No said:


> After the major cock-up they pulled @ Computex? How can one not be a sceptic regarding this. Also charging $600 on cooling I'm sorry to say it, justifies jack-all. Furthermore how do we know that this will deliver? So far everything is in Limbo with a paper lunch on the way.


What went wrong at Computex? 600 dollars is fine for people wanting a single workstation GPU, and the others will likely get multiple GPUs for a CrossFire setup in an ultra-high-end workstation. If you only need one workstation GPU, paying 600 dollars extra for improved cooling and performance may not seem too bad, especially considering even Threadripper will likely be $1,000+ extra including mobo for the 16-core anyway, so you end up paying relatively little extra.


----------



## Deleted member 172152 (Jun 16, 2017)

xkm1948 said:


> The fail is strong with VEGA. Do some homework youself and compare FE VEGA to Quadro cards, you will understand why AMD put it against TitanXp in professional benchmarks.
> 
> Too expensive to produce for some pathetic performance. VEGA is DOA. Move along people. Hype train next stop: VEGA20


These aren't technically full-on pro cards yet; more like general powerhouses aimed at basic professional use. There is a difference, apparently, according to Nvidia and AMD. 

The Vega pro cards for the iMac Pro should be more efficient and may have a few features making them better for certain professionals. Pro GPUs have niches as well.

Also, where is the MI25?


----------



## Aldain (Jun 16, 2017)

bug said:


> It sure doesn't mean it will be a 150W card either.


Of course not. I think it will be around 225-275W, in that range


----------



## Boosnie (Jun 16, 2017)

xkm1948 said:


> The fail is strong with VEGA. Do some homework youself and compare FE VEGA to Quadro cards, you will understand why AMD put it against TitanXp in professional benchmarks.
> 
> Too expensive to produce for some pathetic performance. VEGA is DOA. Move along people. Hype train next stop: VEGA20



I'm fairly sure you also know the prices of those Quadros you mention.


----------



## Aldain (Jun 16, 2017)

xkm1948 said:


> The fail is strong with VEGA. Do some homework youself and compare FE VEGA to Quadro cards, you will understand why AMD put it against TitanXp in professional benchmarks.
> 
> Too expensive to produce for some pathetic performance. VEGA is DOA. Move along people. Hype train next stop: VEGA20



The fail is strong in your comment, m8. Vega FE is not a FirePro card; FirePro cards are the ones pitted against the Nvidia Quadro cards... This card is for workstations...


----------



## Aldain (Jun 16, 2017)

rainzor said:


> Where do you see a six pin tho? Both cards seem to have 2x8pin according to those pictures. There was a a 6+8 pin card that Raja had during some presentation but afaik he said it was only eng. sample and the production cards will have 2x8pin.



I was referring to the gaming Vega cards; those cards will not have 16 GB of HBM 2.0, and sure as hell not all variants of it will be 2x 8-pin..


----------



## Deleted member 172152 (Jun 16, 2017)

Aldain said:


> The fail is strong in your comment m8, vega FE is not a firepro card, firepro cards are the ones that are pitted against the nvidia Quadro cards... This card is for workstations...


Forgot FirePro was a thing. Vega will be positioned above the WX 7100. 8100 and 9100 maybe?


----------



## Aldain (Jun 16, 2017)

Hugh Mungus said:


> Forgot firepro was a thing. Vega will be positioned above the wx7100. 8100 and 9100 maybe?




It seems so. I would wager that this FE will be the fastest WS card RTG will offer; that is why it is pitted against the TXP. BUT there is a small bit of PR bull here: the TXP can do WS stuff BUT it can't use the Quadro-line drivers, I think, and as for gaming, this card can use gaming drivers BUT it will not have any of the gaming performance optimization the gaming RX Vega gets; Raja said so himself..

P.S. For the 600 bucks difference between the AC and WC, the WC edition had better have some miracle of an AIO..


----------



## Nokiron (Jun 16, 2017)

Aldain said:


> The fail is strong in your comment m8, vega FE is not a firepro card, firepro cards are the ones that are pitted against the nvidia Quadro cards... This card is for workstations...


So are quadros. They are workstation cards.
https://www.nvidia.com/object/quadro-desktop-gpus.html



Boosnie said:


> I'm fairly sure you also know the prices of those quadro you mention.


If we are comparing the Specviewperf results, a comparable Quadro (P4000) costs less than $900. The Polaris-based WX 7100 comes pretty close as well, and that's below $700.


----------



## Aldain (Jun 16, 2017)

Nokiron said:


> So are quadros. They are workstation cards.
> https://www.nvidia.com/object/quadro-desktop-gpus.html

Nvidia and AMD make a clear distinction between the FirePro and Quadro lines vs the others.


----------



## Deleted member 172152 (Jun 16, 2017)

Aldain said:


> It seems so. I would wager that this FE will be the fastest WS card RTG will offer; that is why it is pitted against the TXP. BUT there is a small bit of PR bull here: the TXP can do WS stuff BUT it can't use the Quadro-line drivers, I think, and as for gaming, this card can use gaming drivers BUT it will not have any of the gaming performance optimization the gaming RX Vega gets; Raja said so himself..
> 
> P.S. For the 600 bucks difference between the AC and WC, the WC edition had better have some miracle of an AIO..


A 100-150 dollar AIO, some higher clocks, and that's it. 200 dollars' worth in gaming terms, but gaming Vega will be something like 600 dollars, not 1,200, making it 400 bucks' worth of stuff relatively, and you have the added bonus of a safe OC to roughly AIB gaming speeds, which is apparently worth another 200 dollars.


----------



## Prima.Vera (Jun 16, 2017)

Dual GPU...


----------



## Deleted member 172152 (Jun 16, 2017)

Nokiron said:


> So are quadros. They are workstation cards.
> https://www.nvidia.com/object/quadro-desktop-gpus.html
> 
> If we are comparing the Specviewperf-results a comparable Quadro (P4000) costs less then $900. The Polaris-based WX 7100 comes pretty close aswell and that's below $700.


If the WX 7100 gets close, it's not a full-on pro card but a prosumer or pro-prosumer card. I'm running out of terms at this point, but Pro and FirePro are two different lineups for different users.


----------



## Nokiron (Jun 16, 2017)

Aldain said:


> Nvidia and AMD make a clear distinction between the fire pro and quadro line vs others


That's the problem. AMD does not; they have a really messy line-up at the moment that has to be fixed.




Hugh Mungus said:


> If the wx7100 gets close, it's not a full-on pro card, but a prosumer or pro-prosumer card. I'm running out of terms at this point, but pro and firepro are two different lineups for different users.


No. FirePro is slated for replacement by Radeon Pro.



> Watters said FirePro will continue to be sold and will get support, but AMD will transition to the Radeon Pro and Radeon Pro WX cards for professional users.



http://www.pcworld.com/article/3099...-radeon-pro-wx-series-to-replace-firepro.html


----------



## X800 (Jun 16, 2017)

Now that I've seen the pricing on this Vega thing, I'm glad I bought the 1080.


----------



## bonomork (Jun 16, 2017)

It's possible to run Vega FE with 8-pin and 6-pin PCIe power connectors, rather than the two 8-pin connectors that made it onto production boards for "extra headroom."


----------



## Aldain (Jun 16, 2017)

X800 said:


> Now when seen the pricing on this vega thing I'm glad that I bought the 1080.




This is not a gaming Vega


----------



## Estaric (Jun 16, 2017)

HTC said:


> Just a question: why are they comparing an "enterprise segment" card (Vega Frontier edition) to a gaming card (Titan Xp)?


Nvidia did a lot of marketing for the Pascal Titans as deep-learning cards. A few reviewers like JayzTwoCents made videos about this, saying Nvidia wouldn't give them a review sample because the cards were meant for deep learning


----------



## Boosnie (Jun 16, 2017)

Nokiron said:


> If we are comparing the Specviewperf-results a comparable Quadro (P4000) costs less then $900. The Polaris-based WX 7100 comes pretty close aswell and that's below $700.



Can I see this VEGA Specviewperf result you compare against?


----------



## Nokiron (Jun 16, 2017)

Boosnie said:


> Can I see this VEGA Specviewperf result you compare against?


https://www.spec.org/gwpg/gpc.data/vp12.1/Fujitsu/CELSIUS_M740_E5-2699v4_P4000/resultHTML.html


----------



## the54thvoid (Jun 16, 2017)

Well, someone's going to buy it soon and it'll get benched too. Despite what will be said, it IS a Vega card, but it will be treated differently once consumer Vega is out. So as soon as it's benched, we'll get a good idea about consumer Vega (as it's the same card).


----------



## cdawall (Jun 16, 2017)

You know what's funny? They still haven't released vega.


----------



## Captain_Tom (Jun 16, 2017)

Liviu Cojocaru said:


> Yep, this is a power hungry monster, let's hope it delivers. We have to wait for the consumer edition and see what TDP that has. Let the vega hype resume )



I have a suspicion that their "TDP" rating here is only for guiding overclocking. TDP =/= power usage, and additionally there is NO WAY the liquid-cooled version uses 75W more than the air-cooled version while running at the same clocks.

The only way it would use 20% more energy is if they clocked it 10%+ faster.
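That scaling argument can be sketched numerically. A rough Python sketch of the standard dynamic-power model P ∝ f·V² (a simplification I'm assuming here; real DVFS curves are messier), which shows why a ~10% clock bump can plausibly land in the 20-33% power range:

```python
# Rough sketch: dynamic power scales roughly as P ∝ f * V^2, and voltage
# usually rises along with frequency. So +10% clocks with a proportional
# voltage bump lands near +33%, while +10% clocks at fixed voltage is only
# +10%; the ~25% gap between the quoted 300W and 375W TDPs sits in between.
def power_scale(freq_ratio, volt_ratio):
    """Relative dynamic power for given frequency and voltage ratios."""
    return freq_ratio * volt_ratio ** 2

print(round(power_scale(1.10, 1.10), 3))  # 1.331 -> +33% with voltage scaling
print(round(power_scale(1.10, 1.00), 3))  # 1.1   -> +10% at fixed voltage
print(round(375 / 300, 3))                # 1.25  -> the quoted TDP gap
```

This is only a back-of-the-envelope model, but it supports the idea that the liquid-cooled card's higher TDP implies higher sustained clocks rather than the same clocks at higher draw.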


----------



## Captain_Tom (Jun 16, 2017)

cdawall said:


> You know what's funny? They still haven't released vega.


----------



## Boosnie (Jun 16, 2017)

Nokiron said:


> https://www.spec.org/gwpg/gpc.data/vp12.1/Fujitsu/CELSIUS_M740_E5-2699v4_P4000/resultHTML.html



... I completely missed the image in the article.

I browsed the Specviewperf DB and I'm on your side with this. Something doesn't add up, as they are selling a card that underperforms the competition at a much higher price.
Maybe it's marketed for realtime workloads.
IDK.


----------



## Deleted member 172152 (Jun 16, 2017)

X800 said:


> Now when seen the pricing on this vega thing I'm glad that I bought the 1080.


RX Vega will probably be roughly half the price or less, depending on the version, with some cheaper and some more expensive AIBs; plus prices will drop a bit after a few weeks, or even a single week, since AMD has now had time to build up HBM2 supply and deliver to retailers.



Boosnie said:


> ... I completely missed the image in the article.
> 
> I browsed the specviewperc db and I'm on your side with this. Something doesn't add up as they are selling a card that underperforms the competition to a much higher price.
> Maybe is marketed for realtime workload.
> IDK.


Quadro and Radeon Pro have different purposes. WX 8100 and 9100 (?!) Vega cards will perform better, but for general prosumers the Titan XP comparison is useful.



the54thvoid said:


> Well someone's going to buy it soon and it'll get benched too.  Despite what will be said - it IS a Vega card but it will be treated differently once consumer Vega is out.  So as soon as it's benched - we'll get agood idea about consumer Vega (as it's the same card).


But RX Vega will be even less workstation-optimized, so probably less memory, higher clocks, less efficiency (?!) and maybe even RGB.


----------



## xkm1948 (Jun 16, 2017)

Inside RTG:

Engineer: Hey Raja, RX VEGA won't match 1080 Ti performance unless we give it a 240mm AIO and 400 watts of juice for a 2GHz core clock.

Accountant: We can't sell them for less than XXX dollars or it will not cover the cost of HBM2.

Marketing: Shit, we are swarmed with forum posts demanding VEGA.


Raja: Alright, let's slap the Pro logo on this and call it a "professional card." Then we can hit three birds with one stone! 1) We can avoid talking about gaming performance because, duh, it is a PROFESSIONAL card! 2) We can sell it for a very high price to make the bean counters happy. 3) The community finally gets their VEGA. Besides, our loyal fanboys will shred any doubters to pieces on forums.

Lisa Su (in her mind): You idiot pulled a Polaris on me, and now another shit show like this. You are so fired.


----------



## Nokiron (Jun 16, 2017)

Hugh Mungus said:


> Quadro and radeon pro have different purposes. Wx 8100 and 9100 (?!) vega cards will perform better, but for general prosumers the titan XP comparison is useful.


No, they have the same purpose. That's why they are retiring FirePro. The Titan XP is strictly for gaming and specialized compute (extreme hashcat, anyone?); the comparison is seriously flawed.

Again, the WX 7100 performs almost as well as this while being a much weaker GPU.


----------



## Deleted member 172152 (Jun 16, 2017)

Nokiron said:


> No, they have the same purpose. That's why they are retiring FirePro. The Titan XP is strictly for gaming and specialized compute (extreme hashcat, anyone?); the comparison is seriously flawed.
> 
> Again, the WX 7100 performs almost as well as this while being a much weaker GPU.


So it's a different type of GPU. Essentially it's kinda Titan competition with some pro stuff thrown in, not actually an über-pro card or a gaming card. There has to be some other strange pro card at some point, for a higher price, that will have less compute power but perform better in specific pro stuff, competing with Quadro.


----------



## I No (Jun 16, 2017)

Captain_Tom said:


> I have a suspicion that their "TDP" rating here is only for guiding overclocking.   TDP =/= Power usage, and additionally there is NO WAY the Liquid-Cooled version uses 75w more than the aircooled version while running at the same clocks.
> 
> 
> The only way it would use 20% more energy is if they clocked it 10%+ faster.




Wait, what? Posting a TDP rating for "guiding overclocking", really? Whatever the case may be, this will suck more juice than anything Nvidia has. Why do you think they insisted on HBM2 in the first place? Power draw isn't the main concern here, however; people are waiting on tests of this thing just to get a glimpse of the gaming version, since that will have the clocks all bumped up (if they can be bumped up at all). Power draw aside, nobody wants a jet-engine/furnace combo.


----------



## xkm1948 (Jun 16, 2017)

Overclockers' dream #2

Called it.


----------



## EarthDog (Jun 16, 2017)

xkm1948 said:


> Overclockers' dream #2
> 
> Called it.


What?


----------



## the54thvoid (Jun 16, 2017)

I'm sure Vega will be a big hit, but with HBM2 it's going to have limited production and be costly, or at least low-profit for AMD. What's more important is that so many upgraders who were waiting on Vega (myself included) went for a 1080 Ti instead.
Even if Vega is the same or a bit faster, the card I now own is more than fast enough for anything I need.

Shame really.


----------



## xkm1948 (Jun 16, 2017)

EarthDog said:


> What?




During the Fury X release, it was called an overclockers' dream.

As someone who's been using the Fury X right from launch, I gotta say the overclocking sucks ass. It is hot, it is slow.

Vega is shaping up to be another Fury: bad efficiency and probably overclocked to the max from the factory.


----------



## B-Real (Jun 16, 2017)

HTC said:


> Just a question: why are they comparing an "enterprise segment" card (Vega Frontier edition) to a gaming card (Titan Xp)?


Very easy to answer: because the Titan Xp is NOT a gaming card either.


----------



## HTC (Jun 16, 2017)

B-Real said:


> Very easy to answer: because the Titan Xp is NOT a gaming card either.



But it's sold as one.


----------



## B-Real (Jun 16, 2017)

HTC said:


> But it's sold as one.


I don't see it advertised as a gaming card. Moreover, who would buy a card for an extra $500 instead of the 1080 Ti for gaming, when it delivers the same performance?


----------



## R-T-B (Jun 16, 2017)

B-Real said:


> I don't see it advertised as a gaming card. Moreover, who would buy a card for an extra $500 instead of the 1080 Ti for gaming, when it delivers the same performance?



It doesn't even have DP FP.

It can pretty much only BE a gaming card.

It's marketed as an extreme gaming option.


----------



## HTC (Jun 16, 2017)

R-T-B said:


> It doesn't even have DP FP.
> 
> *It can pretty much only BE a gaming card.
> 
> It's marketed as an extreme gaming option.*



Exactly.


----------



## Deleted member 172152 (Jun 16, 2017)

the54thvoid said:


> I'm sure Vega will be a big hit, but with HBM2 it's going to have limited production and be costly, or at least low-profit for AMD. What's more important is that so many upgraders who were waiting on Vega (myself included) went for a 1080 Ti instead.
> Even if Vega is the same or a bit faster, the card I now own is more than fast enough for anything I need.
> 
> Shame really.


The 1080 Ti is fast enough, but what did you have before that you had to upgrade anyway? I have a GTX 760 because I used to be a mid-range guy, and now I want to improve my video/stream quality, so I have to massively upgrade soon. I might as well get the best of the best, and generally speaking I like FreeSync monitors more.

Nobody cares if AMD makes little profit either, as long as they get back in the high-end game first. I think they nailed their return, but it really depends on drivers and game optimization. We shall see. Ten bucks says RX Vega is going to beat the 1080 Ti and will be competitive with Volta!


----------



## efikkan (Jun 16, 2017)

Currently, Polaris needs quite a bit more computational power to compete with Pascal. So if that's still somewhat the case with Vega, consuming 300W to get 13 TFlop/s is not good news.
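A rough performance-per-watt sketch of that point, using the 13 TFlop/s and 300W figures above; the Titan Xp numbers (~12.1 TFlop/s FP32 at a 250W TDP) are assumptions taken from its public spec sheet, not from this thread:

```python
# Crude FP32 efficiency yardstick: GFLOPS per watt of TDP.
# Vega FE figures (13 TFlop/s, 300W) are the ones discussed here;
# the Titan Xp figures (~12.1 TFlop/s, 250W) are assumed from
# NVIDIA's published specs.

def gflops_per_watt(tflops: float, tdp_watts: float) -> float:
    """Convert a TFLOPS rating and a TDP into GFLOPS per watt."""
    return tflops * 1000 / tdp_watts

vega_fe = gflops_per_watt(13.0, 300)   # ≈ 43.3 GFLOPS/W
titan_xp = gflops_per_watt(12.1, 250)  # ≈ 48.4 GFLOPS/W

print(f"Vega FE:  {vega_fe:.1f} GFLOPS/W")
print(f"Titan Xp: {titan_xp:.1f} GFLOPS/W")
```

By this crude measure Vega FE would deliver roughly 10% less FP32 throughput per watt of board power, which is the gist of the complaint above.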


----------



## Deleted member 172152 (Jun 16, 2017)

efikkan said:


> Currently, Polaris needs quite a bit more computational power to compete with Pascal. So if that's still somewhat the case with Vega, consuming 300W to get 13 TFlop/s is not good news.


Not 300W consumption, but 300W max consumption, if it doesn't get too hot first, which it will. 250W or less realistically, and 275W max. Always check whether it's system power draw or GPU power draw in reviews, btw.

I expect the water-cooled version to have 14 TFlop/s and much higher clocks, with a 300W max power draw if the AIO is rubbish.

Efficiency still won't be great, but definitely no Polaris-type situation.


----------



## EarthDog (Jun 16, 2017)

xkm1948 said:


> During FuryX release, it was called an overclockers' dream


ahh, another who misunderstood what they were talking about (perfectly understandable). Gotcha.


----------



## cdawall (Jun 16, 2017)

EarthDog said:


> ahh, another who misunderstood what they were talking about (perfectly understandable). Gotcha.



To be fair, no one really understands what on earth AMD is talking about during press releases. Remember, they also said two 480s were better than a 1080 during a press release.


----------



## efikkan (Jun 16, 2017)

Hugh Mungus said:


> Not 300W consumption, but 300W max consumption, if it doesn't get too hot first, which it will. 250W or less realistically, and 275W max. Always check whether it's system power draw or GPU power draw in reviews, btw.


For comparison, both GTX 1080 Ti and Titan Xp have a TDP of 250W. There is no way a TDP of ~300W is good news.


----------



## EarthDog (Jun 17, 2017)

cdawall said:


> To be fair, no one really understands what on earth AMD is talking about during press releases. Remember, they also said two 480s were better than a 1080 during a press release.


Truth. Though I was sitting there during the presentation we all saw, and I understood it in the context in which they were speaking: cooling headroom.


----------



## FordGT90Concept (Jun 17, 2017)

Vega Frontier Edition is a premium card for corporate customers.  I suspect they went balls to the wall with the thermals/power because they could.  This card is a really poor metric to measure the consumer cards by.


----------



## Blueberries (Jun 17, 2017)

Hugh Mungus said:


> So it's a different type of GPU. Essentially it's kind of a Titan competitor with some pro stuff thrown in, not actually an über-pro card or a gaming card. There will have to be some other pro card at some point, at a higher price, with less compute power but better performance in specific pro workloads, to compete with Quadro.



At some point when you have to fumble to come up with a half-ass explanation like this it's probably best to just call a spade a spade.


----------



## toilet pepper (Jun 17, 2017)

I No said:


> After the major cock-up they pulled @ Computex? How can one not be a sceptic regarding this? Also, charging $600 for cooling justifies jack-all, I'm sorry to say. Furthermore, how do we know that this will deliver? So far everything is in limbo, with a paper launch on the way.



Well, paying $100 more for a $hi++y blower cooler compared to AIBs makes a lot more sense then.


----------



## Deleted member 172152 (Jun 17, 2017)

Blueberries said:


> At some point when you have to fumble to come up with a half-ass explanation like this it's probably best to just call a spade a spade.


Read the comment above yours if you want someone else explaining it a bit better.


----------



## HD64G (Jun 17, 2017)

At least someone will provide us with a review of the FE edition, and then we will know about efficiency, thermals and power consumption. Not so much about gaming performance, though, until RX Vega launches.


----------



## S@LEM! (Jun 17, 2017)

oh my, this is gonna be a hell of a card, and with our weather here reaching 45C+ that's no good news at all.
all those years of development and they just came up with another Fury X.

you know what's funny, though: ATI has never stuck to their TDP. it's not a MAX TDP for them; ATI is quite known for EXCEEDING that TDP figure. so you can imagine what hell it's gonna be

----> GTX 1070, here I come


----------



## the54thvoid (Jun 17, 2017)

Hugh Mungus said:


> The 1080 Ti is fast enough, but what did you have before that you had to upgrade anyway? I have a GTX 760 because I used to be a mid-range guy, and now I want to improve my video/stream quality, so I have to massively upgrade soon. I might as well get the best of the best, and generally speaking I like FreeSync monitors more.
> 
> Nobody cares if AMD makes little profit either, as long as they get back in the high-end game first. I think they nailed their return, but it really depends on drivers and game optimization. We shall see. Ten bucks says RX Vega is going to beat the 1080 Ti and will be competitive with Volta!



980 Ti. The 1080 Ti is a far superior card, more than I thought it would be (and my 980 Ti was fast). I game at 1440p and don't worry about 144fps; even then, no single card will hit that across the board, not even Vega.
Playing BF1 with settings maxed, getting around 130-140 fps average, is pretty sweet. For the first time, I have a card that doesn't have ANY issues. The 980 Ti wasn't perfect, but this one is, for me.

Even if Vega manages to beat my card, I have neither the desire nor the funds to get it. And that's still kind of my point: due to how long Vega has taken to come out, AMD have lost a lot of the high-end consumers.
As for Vega matching Volta, choose a Volta. If you think it will match GV102, I really, really don't think so. GV104, maybe, as the mid-range xx80 cards tend to match the last-gen xx80 Ti.


----------



## Frick (Jun 17, 2017)

I just fail to see why we should discuss performance when we can discuss that gold shroud instead. Seriously, am I the only one loving the looks of that? The blue is nice too, as it has been for a while now.


----------



## Italia1 (Jun 17, 2017)

Not a good card. I hoped for the same performance but a lower TDP (250W max). This time I think I must switch from AMD to Nvidia for my GPU...
... or maybe they simply need stability headroom, so the "TDP" is rated for 300/375W but the power actually used is much lower.


----------



## Caring1 (Jun 17, 2017)

Frick said:


> I just fail to see why we should discuss performance when we can discuss that gold shroud instead. Seriously, am I the only one loving the looks of that? The blue is nice too, as it has been for a while now.


While I love the monoblock look of the gold shroud, it makes me question how the AIO cooler effectively cools the entire card without the use of a supplemental fan like most LC cards have.
I would have thought they would use a modified version of the blue shroud with a fan.


----------



## Deleted member 172152 (Jun 17, 2017)

the54thvoid said:


> 980 Ti. The 1080 Ti is a far superior card, more than I thought it would be (and my 980 Ti was fast). I game at 1440p and don't worry about 144fps; even then, no single card will hit that across the board, not even Vega.
> Playing BF1 with settings maxed, getting around 130-140 fps average, is pretty sweet. For the first time, I have a card that doesn't have ANY issues. The 980 Ti wasn't perfect, but this one is, for me.
> 
> Even if Vega manages to beat my card, I have neither the desire nor the funds to get it. And that's still kind of my point: due to how long Vega has taken to come out, AMD have lost a lot of the high-end consumers.
> As for Vega matching Volta, choose a Volta. If you think it will match GV102, I really, really don't think so. GV104, maybe, as the mid-range xx80 cards tend to match the last-gen xx80 Ti.


Should've waited if you don't have a g-sync monitor.


----------



## HD64G (Jun 17, 2017)

Italia1 said:


> Not a good card. I hoped for the same performance but a lower TDP (250W max). This time I think I must switch from AMD to Nvidia for my GPU...
> ... or maybe they simply need stability headroom, so the "TDP" is rated for 300/375W but the power actually used is much lower.


As you said, the max power of the board is 300/375W so that the owner is able to OC it; it's not the actual power consumption figure. That should be very clear to anyone who has been into computer hardware for a few years, even as a hobby.


----------



## efikkan (Jun 17, 2017)

Professional cards might have a few percent of headroom in TDP for stability, but they certainly don't incorporate OC headroom. That would be a first…


----------



## Deleted member 172152 (Jun 17, 2017)

efikkan said:


> Professional cards might have a few percent of headroom in TDP for stability, but they certainly don't incorporate OC headroom. That would be a first…


The air-cooled one could oc a bit if you don't mind listening to your miniature jet engine all the time.


----------



## EarthDog (Jun 17, 2017)

efikkan said:


> Professional cards might have a few percent of headroom in TDP for stability, but they certainly don't incorporate OC headroom. That would be a first…


Have you tried? Seen it? Are we able to? Why would you think that?

My thought is that they are binned like anything else: a chip just needs to hit X clocks at X voltage and power use and it fits a bin. Some have headroom, some don't. I don't know; I've honestly never seen it, which is why I am asking.


----------



## notb (Jun 17, 2017)

FordGT90Concept said:


> Vega Frontier Edition is a premium card for corporate customers.  I suspect they went balls to the wall with the thermals/power because they could.  This card is a really poor metric to measure the consumer cards by.


You assume that corporate clients care less about high power consumption, while most of the time it's the other way around.


Hugh Mungus said:


> Should've waited if you don't have a g-sync monitor.


But why? As he said, a 1080 Ti is fast enough for pretty much all traditional gaming these days (<=4K).
Anyway, as AMD seems to understand, the real money is made in the mainstream category, not in top-end gaming. That's why they don't upgrade their flagship cards very often and the top RX card sits somewhere in the middle of NVIDIA's lineup.


----------



## efikkan (Jun 17, 2017)

AMD is barely participating in the midrange, where the real money in the consumer market is made. There might be volume in the low end, but the margins there are also very small. At the moment they only have the RX 580 to "compete" with the GTX 1060, even though it has the power consumption of a GTX 1080.

It's been over a year since the launch of Pascal; Pascal is now past the midway point of its life cycle, and AMD is still struggling to get its competitor ready. GV104 will arrive early next year, while Navi is pushed to the end of 2018…

AMD should have dropped HBM so they could at least have mass-produced their chips. They need to compete with the GTX 1080, GTX 1070 and GTX 1060; that's where the revenue is. Perhaps even put a model between the GTX 1060 and GTX 1070, since there is a huge performance and price gap there.


----------



## FordGT90Concept (Jun 17, 2017)

notb said:


> You assume that corporate clients care less about high power consumption, while most of the time it's the other way round.


They generally don't. Most of the customers buying these cards are going to use them for that 26 TFLOP FP16 performance. They're really not going to care how much energy it uses, because to get the same level of performance from NVIDIA (at lower power consumption) they'd have to spend a lot more money on the hardware itself. Convincing a board to spend $1 million on a supercomputer versus $3 million is easy. The 30% higher energy cost on the back end is just details.


----------



## notb (Jun 17, 2017)

FordGT90Concept said:


> They generally don't.  Most of the customers buying these cards are going to use them for that 26 TFLOP FP16 performance.  They're really not going to care how much energy it uses because to get the same level of performance from NVIDIA (lower power consumption), they'd have to spend a lot more money on the hardware itself.  Convincing a board to spend $1 million on a super computer versus $3 million is easy.  The 30% higher energy costs on the backend is just details.


But now you're concentrating on heavy computational usage for large servers, datacenters etc. In that case you're right, because there power consumption is just about the cost.

However, in general, power consumption is not just money.
These cards will land not only in rack servers, but also in desktop workstations designed for office infrastructure. As such, it matters how much energy they draw (because there's a limit) and how much heat/noise they produce (obviously).
A TDP of 375W is beyond anything we've seen lately and the FP16 performance is just slightly better than Pascal's (TDP=250W).


----------



## kruk (Jun 17, 2017)

efikkan said:


> AMD is barely participating in the midrange, where the real money in the consumer market is made. There might be volume in the low end, but the margins there are also very small. At the moment they only have the RX 580 to "compete" with the GTX 1060, even though it has the power consumption of a GTX 1080.



The difference in power consumption is important only if you game 24/7; otherwise it's negligible.

Here, enjoy some nice numbers from computerbase.de:
The difference between the 1060 FE and RX 580 at *idle* is *4 Watts*. The difference while *gaming* is *77 Watts*. The difference playing a *YouTube video* is *2 Watts*. If you leave the RX 580 idling with the monitor off, it can drop into *ZeroCore Power* mode, which consumes virtually no power, but I'm putting *1 W* there just to be fair. Both cards perform virtually identically in games.

If you leave the computer running 24/7 and it *idles 12 hours*, you *game 6 hours*, watch *YouTube for 2 hours* and do other non-graphically-intensive things for *4 hours*, the 1060 FE averages (12h*7W + 6h*111W + 2h*29W + 4h*7W)/24h ≈ *35W* and the RX 580 averages (12h*1W + 6h*188W + 2h*31W + 4h*11W)/24h ≈ *52W*.

A difference of about 17 W on average. Better have a fire extinguisher near your RX 580 system or it might catch fire! /s
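That weighted-average arithmetic can be checked in a few lines of Python; the per-state draws are the computerbase.de figures quoted above, and the 12/6/2/4-hour split is the same hypothetical usage profile from the post:

```python
# Time-weighted average power draw over a 24-hour day.
# Per-state draws (watts) are the computerbase.de figures quoted in
# the post; the 12h idle / 6h gaming / 2h video / 4h light-work split
# is the post's hypothetical usage profile.

def average_draw(idle_w, gaming_w, video_w, light_w,
                 idle_h=12, gaming_h=6, video_h=2, light_h=4):
    """Return the time-weighted average power draw in watts."""
    total_wh = (idle_h * idle_w + gaming_h * gaming_w
                + video_h * video_w + light_h * light_w)
    return total_wh / (idle_h + gaming_h + video_h + light_h)

gtx_1060 = average_draw(idle_w=7, gaming_w=111, video_w=29, light_w=7)
rx_580 = average_draw(idle_w=1, gaming_w=188, video_w=31, light_w=11)

print(f"GTX 1060 FE: {gtx_1060:.1f} W average")   # ≈ 34.8 W
print(f"RX 580:      {rx_580:.1f} W average")     # ≈ 51.9 W
print(f"Difference:  {rx_580 - gtx_1060:.1f} W")  # ≈ 17.1 W
```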


----------



## FordGT90Concept (Jun 17, 2017)

notb said:


> A TDP of 375W is beyond anything we've seen lately and the FP16 FP32 performance is just slightly better than Pascal's (TDP=250W).


FTFY

I can't see anyone buying Vega over Pascal for FP32.


----------



## efikkan (Jun 17, 2017)

kruk said:


> The difference in power consumption is important only if you game 24/7, otherwise it's negligible…


You know very well that the cost of the energy consumed is the least of the concerns.
AMD can't really compete when their contender consumes 185W and the competition consumes 120W; the competition becomes the superior choice. Energy consumption matters because of heat and noise. AMD needs comparable efficiency to become relevant again.

With Vega FE at a (claimed) TDP of 300W, and the comparable Quadro GP100 at a TDP of *235W*, it becomes clear that Vega will struggle to compete. In the professional market they can probably move a few units by pricing them a bit lower, but in the consumer market the margins are a lot smaller.


----------



## kruk (Jun 17, 2017)

efikkan said:


> You know very well that the cost of the energy consumed is the least of the concerns.
> AMD can't really compete when their contender consumes 185W and the competition consumes 120W; the competition becomes the superior choice. Energy consumption matters because of heat and noise. AMD needs comparable efficiency to become relevant again.



Oh, come on. You didn't even click the link and read the full test. Here, noise and temps: again, negligible differences. You are blowing things out of proportion ...

As for RX Vega: when the card comes out, if it's noisier, hotter and consumes ~17 W more on average per my calculation vs. the 1080 Ti FE, then you've won and AMD messed up. Happy now?


----------



## notb (Jun 17, 2017)

FordGT90Concept said:


> FTFY
> 
> I can't see anyone buying Vega over Pascal for FP32.


Nope. I meant GP100 with 20.7 TFLOPS in FP16. But I made a mistake nevertheless: Pascal is rated at 235W, not 250W.


----------



## FordGT90Concept (Jun 17, 2017)

GP100 is the primary competitor of the Vega Frontier Edition.


----------



## EarthDog (Jun 18, 2017)

EarthDog said:


> Have you tried? Seen it? Are we able to? Why would you think that?
> 
> My thought is that they are binned like anything else: a chip just needs to hit X clocks at X voltage and power use and it fits a bin. Some have headroom, some don't. I don't know; I've honestly never seen it, which is why I am asking.


@efikkan


----------



## Prince Valiant (Jun 18, 2017)

notb said:


> Nope. I meant GP100 with 20.7 TFLOPS in FP16. But I made a mistake nevertheless: Pascal is rated at 235W, not 250W.


I'm well out of my depth here, please correct me if I'm wrong.

Isn't $1,700 for 20%(ish) more flops a much more attractive prospect than $6,000 for GP100 regardless of heat or noise unless those are absolutely critical issues?
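For what it's worth, the back-of-the-envelope dollars-per-TFLOP math behind that question looks like this; the ~$1,700 and ~$6,000 street prices and the 26 vs. 20.7 FP16 TFLOPS figures are the rough numbers floated in this thread, so treat the results accordingly:

```python
# Rough price-per-FP16-TFLOP comparison using figures quoted in this
# thread (street prices are approximate).
cards = {
    "Radeon Vega Frontier Edition": (1700, 26.0),  # (price USD, FP16 TFLOPS)
    "Quadro GP100": (6000, 20.7),
}

dollars_per_tflop = {
    name: price / tflops for name, (price, tflops) in cards.items()
}

for name, dpt in dollars_per_tflop.items():
    print(f"{name}: ${dpt:.0f} per FP16 TFLOP")  # ≈ $65 vs ≈ $290
```

On raw throughput per dollar the Vega FE comes out far ahead; whether that matters depends on the driver-certification and software-support points raised elsewhere in the thread.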


----------



## Deleted member 172152 (Jun 18, 2017)

Prince Valiant said:


> I'm well out of my depth here, please correct me if I'm wrong.
> 
> Isn't $1,700 for 20%(ish) more flops a much more attractive prospect than $6,000 for GP100 regardless of heat or noise unless those are absolutely critical issues?


Depends, really. The WX 7100 and P4000 get close to or beat the air-cooled Vega Frontier in SPECviewperf, so depending on your applications it can be better or worse than $6,000 NVIDIA GPUs. How pro are you?

Don't worry if you are an über-pro: AMD will probably make a WX Vega or something as well, more suited to Quadro-style applications, at some point, and for servers there will be the MI25 accelerator.





----------



## Prince Valiant (Jun 18, 2017)

I'm just a regular dude, but I was curious about it.

I hope they manage to get Vega out the door sometime soon. I don't think they can afford more delays in the high end GPU segment and they need something new to trickle down across the range.


----------



## Deleted member 172152 (Jun 18, 2017)

Prince Valiant said:


> I'm just a regular dude but I was curious about it .
> 
> I hope they manage to get Vega out the door sometime soon. I don't think they can afford more delays in the high end GPU segment and they need something new to trickle down across the range.


The FE arrives near the end of the month, and the RX cards will arrive in late July, apparently. Probably because of drivers and, mainly, getting stock to retailers.


----------



## notb (Jun 18, 2017)

Prince Valiant said:


> I'm well out of my depth here, please correct me if I'm wrong.
> 
> Isn't $1,700 for 20%(ish) more flops a much more attractive prospect than $6,000 for GP100 regardless of heat or noise unless those are absolutely critical issues?



Well... the GP100 is supposed to be enterprise-grade hardware prepared for heavy 24/7 load (while still offering normal GPU functionality).
It's a bit too early to say how well built and spec'd this new Radeon Pro Vega FE is going to be, but the pricing suggests it'll just be an expensive consumer/workstation model, not a computation workhorse. Yet just by the raw numbers in the specification, it is going to outperform AMD's own Pro alternatives costing over twice as much.

Plus, NVIDIA can always ask more because of CUDA...


----------



## efikkan (Jun 18, 2017)

EarthDog said:


> Have you tried? Seen it? Are we able to? Why would you think that?
> 
> My thought is that they are binned like anything else: a chip just needs to hit X clocks at X voltage and power use and it fits a bin. Some have headroom, some don't. I don't know; I've honestly never seen it, which is why I am asking.


Please read the post before mine, and you'll see what I'm talking about.
I've never seen a product incorporate OC headroom inside its TDP, and certainly not a professional product. If the final TDP really is 300/375 W, then the actual max consumption plus a small safety margin will equal this number, as with every previous product.


----------



## djemergenc (Jun 18, 2017)

What is NVIDIA doing over there? The only job they had to do is work on their GPUs and mobile processors, but here comes AMD, killing the GPU game while manufacturing CPUs and chipsets on the side.


----------



## notb (Jun 19, 2017)

djemergenc said:


> What is NVIDIA doing over there? The only job they had to do is work on their GPUs and mobile processors, but here comes AMD, killing the GPU game while manufacturing CPUs and chipsets on the side.


Here comes AMD doing WHAT? :-D
BTW: NVIDIA is doing way more things than AMD. And they're making a profit. A lot of it.


----------



## evernessince (Jun 19, 2017)

uuuaaaaaa said:


> If those TDP numbers are real this is looking like R600 (HD 2900 XTX) all over again... I hope that the performance is at least monstrous...



Every time, without fail, there is someone who thinks TDP = power consumption. It is not, and companies calculate TDP differently.


----------

