# AMD Radeon RX Vega Put Through 3DMark



## btarunr (Jul 25, 2017)

Ahead of its July 27 unveiling at AMD's grand media event on the sidelines of SIGGRAPH, performance benchmarks of the elusive Radeon RX Vega consumer graphics card surfaced once again. Someone with access to an RX Vega sample, with its GPU clocked at 1630 MHz and memory at 945 MHz, put it through 3DMark. One can tell it's the RX Vega and not the Radeon Pro Vega Frontier Edition by its 8 GB of video memory. 

In three test runs, the RX Vega-powered machine yielded graphics scores of 22,330 points, 22,291 points, and 20,949 points. This puts its performance on par with or below that of the GeForce GTX 1080, but comfortably above the GTX 1070. The test bench consisted of a Core i7-5960X processor, and graphics driver version 22.19.640.2. 



 

 



*View at TechPowerUp Main Site*


----------



## R00kie (Jul 25, 2017)

Hmmmm....

No thanks


----------



## erek (Jul 25, 2017)

R00kie said:


> Hmmmm....
> 
> No thanks




:'(


----------



## EarthDog (Jul 25, 2017)

Well, if this is the XTX air/water... that isn't good. If it is something lower in the product stack... 

300/375 W vs. 180 W (GTX 1080) doesn't look good in performance/W. Looks like they will slide in on price-to-performance ratio and undercut.
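Running the thread's numbers gives a rough sense of that gap. A back-of-envelope sketch in Python, using only figures quoted in this thread: the leaked Vega graphics score of 22,330, the rumored 300/375 W board powers, and a ~23k stock GTX 1080 graphics score at its 180 W TDP (the 1080 score is taken from schrank251's post below and is an assumption, not a measured pairing):

```python
# Back-of-envelope points-per-watt from figures quoted in this thread.
# Scores are 3DMark Fire Strike graphics scores; the wattages are rumored
# board powers / TDPs, not measured draw, so treat this as a rough sketch.
cards = {
    "RX Vega leak, rumored 300 W (air)":   (22330, 300),
    "RX Vega leak, rumored 375 W (water)": (22330, 375),
    "GTX 1080, ~23k stock, 180 W TDP":     (23000, 180),
}

for name, (score, watts) in cards.items():
    print(f"{name}: {score / watts:.1f} points/W")
```

Even against the kinder 300 W figure, that works out to roughly a 40% deficit in points per rated watt.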


----------



## S@LEM! (Jul 25, 2017)

poor Vega


----------



## DRDNA (Jul 25, 2017)

Right now I'm disappointed, but I may not be in a month or two of Cat updates. AMD GPUs normally gain performance over their first few months, if not their first year, thanks to the Cat updates....


----------



## Prima.Vera (Jul 25, 2017)

If the price is lower than or the same as a 1070.... well 
Anyway, seems like AMD completely lost the battle for top cards.


----------



## ERazer (Jul 25, 2017)

that gotta hurt for ppl that waited this long


----------



## ratirt (Jul 25, 2017)

ERazer said:


> that gotta hurt for ppl that waited this long


Well, it does. I'm still waiting for it, and nothing breathtaking has been shown so far. Bummer, or is there still hope for something slightly better? I hope there is


----------



## ixi (Jul 25, 2017)

Shhjjjiieeet..... Oh well. Sad for those who were waiting.


----------



## P4-630 (Jul 25, 2017)

Hmm, using an i7-5960X to get a somewhat respectable overall score...
Looking at the graphics score it's not much faster than my 1070 OC'd.

My 1070 OC'd does a respectable 21,169 graphics score.


----------



## erek (Jul 25, 2017)




----------



## TheinsanegamerN (Jul 25, 2017)

Not really unexpected, based on the performance of Polaris. Most educated guesses put the Vega chip at 1080 level at best. The fact that it is consistently faster than a 1070 is a bit better than expected. Unfortunately, it also confirmed that AMD completely conceded this GPU generation to NVIDIA for anything other than low-end cards, and that does not bode well for Volta vs. Navi.

Of course, that assumes this is the full 4096-core part. Based on FE's performance, I have little reason to believe otherwise, but still. 





DRDNA said:


> Right now I'm disappointed but I may not be in a month or two of Cat updates. AMD GPU's normally raise in their performance for the first few months if not for a year due to the Cat updates....


At which point Volta will be out and dominating what is left of Vega. 

AMD needs to deliver at launch, not a year later.


----------



## GhostRyder (Jul 25, 2017)

Well that's not very good, depending on which variant it was... Pretty much rules them out of being competitive on the high end though, as even if it's the lowest card in that stack, it's not going to magically be up to 1080 Ti levels (unless there is something I am missing).

Realistically, I think the only way for AMD to be completely competitive again is a completely new architecture, on a new node, and built from the ground up


----------



## cryohellinc (Jul 25, 2017)

If it's cheaper compared to the 1080, this will sell very well.


----------



## fynxer (Jul 25, 2017)

Now just watch when NVIDIA demolishes Vega and sends it into the lower end of the midrange segment with the release of the 2070 and 2080 Volta in September.

NVIDIA will do it just because they can right now, and to keep AMD's profits down from the expensive-to-produce Vega.

They want AMD to continue being an underdog and let them chew on that dry bone, barely covering their expenses in the graphics card division.


----------



## B-Real (Jul 25, 2017)

P4-630 said:


> Hmm, using an i7-5960X to get a somewhat respectable overall score...
> Looking at the graphics score it's not much faster than my 1070 OC'd.
> 
> My 1070 OC'd does a respectable 21,169 graphics score.



How interesting that is:






An OC 1070 nearly gets to ref 1080 level...


----------



## cdawall (Jul 25, 2017)

I see we are still a generation behind nvidia assuming this is the higher sku.

http://www.3dmark.com/fs/13034600



cryohellinc said:


> If it's cheaper compared to the 1080, this will sell very well.



Only if it is priced well. If it is priced like a 1070 and performs like a 1080, some people will buy it. The issue is that it is still half the performance per watt, and in the quiet-computing scene that now exists, having an extra GPU's worth of heat dumped into your case is bad all around.


----------



## Liviu Cojocaru (Jul 25, 2017)

If the price is right it could be OK; waiting for the launch benches so I can judge this better


----------



## B-Real (Jul 25, 2017)

cdawall said:


> I see we are still a generation behind nvidia assuming this is the higher sku.
> 
> http://www.3dmark.com/fs/13034600
> 
> ...


Gamers will not buy GPUs that are better in performance per watt. They will buy GPUs that are cheaper with the same performance, or at the same price as the rival's cheaper card while performing like its bigger card. However, with the HBM, I doubt 1080 performance will be sold at 1070 prices. We will see.


----------



## Daven (Jul 25, 2017)

"This puts its performance either on-par or below that of the GeForce GTX 1080..."

No.

The Geforce GTX 1080 was overclocked to 1924 MHz which is well above stock/turbo/max clocks of 1607/1733/1800 MHz. This is pretty much the best possible score a GTX 1080 can have overclocked almost to the max.

If the RX Vega at 1630 MHz is stock or even turbo, they will be in a good position. I calculate the card will be halfway between a stock GTX 1080 and a stock GTX 1080 Ti.

Rumored power consumption of 375W still suuuuuuuuuuuuuuuuuuuccccckkkkkkkssss.


----------



## B-Real (Jul 25, 2017)

Daven said:


> "This puts its performance either on-par or below that of the GeForce GTX 1080..."
> 
> No.
> 
> ...



That is for the water-cooled version. Air-cooled is 300 W. The 1080 is 180 W, the 1080 Ti 250 W. Who cares about power consumption when we speak about a $400-500-700 card?


----------



## P4-630 (Jul 25, 2017)

Aaaand a sudden stream of new TPU members coming in....LOL


----------



## cdawall (Jul 25, 2017)

B-Real said:


> Gamers will not buy GPUs that are better in performance per watt. They will buy GPUs that are cheaper with the same performance, or at the same price as the rival's cheaper card while performing like its bigger card. However, with the HBM, I doubt 1080 performance will be sold at 1070 prices. We will see.



Gamers buy quiet; quiet cards don't double the heat output of their competitor. You know how I know this? Sales figures for the last 10 years.

If performance per watt didn't matter, AMD wouldn't have been at only 28% of the market share when the 290/390 was selling.


----------



## notb (Jul 25, 2017)

In line with everything we've seen until now.
RX Vega == Vega FE Gaming mode.
As it should be.


----------



## B-Real (Jul 25, 2017)

cdawall said:


> Gamers buy quiet; quiet cards don't double the heat output of their competitor. You know how I know this? Sales figures for the last 10 years.
> 
> If performance per watt didn't matter, AMD wouldn't have been at only 28% of the market share when the 290/390 was selling.



What I saw was that the AIB models of the 290 and 390 produced not much more heat than the 970/980 models. Same with the 480/580 vs. the 1060. But OK, you say that people are buying cards that have better performance/watt. I just don't accept it.


----------



## R00kie (Jul 25, 2017)

B-Real said:


> Who cares about power consumption when we speak about a 400-500-700$ card?



People who care about having a cool and quiet card in their PC. Nobody wants a card that either heats your loins up to a crisp, or takes half of your slots in a case. 



B-Real said:


> What I saw was that the AIB models of the 290 and 390 produced not much more heat than the 970/980 models. Same with the 480/580 vs. the 1060. But OK, you say that people are buying cards that have better performance/watt. I just don't accept it.



Sorry, but that was not the case at all, and I am living proof of this. I bought an MSI R9 290 with a TFIV cooler, and that thing was still hot as the sun. Then I finally decided to get a GTX 970, just because it was a lot cooler and ate a lot less power.


----------



## B-Real (Jul 25, 2017)

cdawall said:


> Gamers buy quiet; quiet cards don't double the heat output of their competitor. You know how I know this? Sales figures for the last 10 years.
> 
> If performance per watt didn't matter, AMD wouldn't have been at only 28% of the market share when the 290/390 was selling.



Well, you say that gamers choose based on performance/watt. I say that in the 390 days, the 970 was at the same price and delivered the same performance, but consumed less power (without tuning the AMD cards). I say if one manufacturer delivers a GPU that is significantly cheaper than the other's while performing nearly the same, although consuming more power, people will buy the former.


----------



## Daven (Jul 25, 2017)

notb said:


> In line with everything we've seen until now.
> RX Vega == Vega FE Gaming mode.
> As it should be.



Not quite. Again the 1080 was overclocked to a whopping 1924 MHz. I don't believe the Vega FE Gaming Mode was compared to the 1080 clocked this high.


----------



## B-Real (Jul 25, 2017)

R00kie said:


> People who care about having a cool and quiet card in their PC. Nobody wants a card that either heats your loins up to a crisp, or takes half of your slots in a case.
> 
> 
> 
> Sorry, but that was not the case at all, and I am living proof of this. I bought an MSI R9 290 with a TFIV cooler, and that thing was still hot as the sun. Then I finally decided to get a GTX 970, just because it was a lot cooler and ate a lot less power.


Yep, it was true for the 290: ~+10 °C compared to the 970. Then check the 390: https://www.techpowerup.com/reviews/Sapphire/R9_390_Nitro/27.html Nearly the same temps as the 970, with nearly the same fan noise. Still, the AIB versions of the 290 running a bit above 70 °C (about 20 °C cooler) while the reference hits 94 °C and is about 15 dB louder is absolutely not a bad result, given their TDP.


----------



## R00kie (Jul 25, 2017)

B-Real said:


> Yep, it was true for the 290: ~+10 °C compared to the 970. Then check the 390: https://www.techpowerup.com/reviews/Sapphire/R9_390_Nitro/27.html Nearly the same temps as the 970, with nearly the same fan noise.


Yet with far greater power consumption, same performance and slightly higher price tag.


----------



## Frick (Jul 25, 2017)

B-Real said:


> Well, you say that gamers choose based on performance/watt. I say that in the 390 days, the 970 was at the same price and delivered the same performance, but consumed less power (without tuning the AMD cards). I say if one manufacturer delivers a GPU that is significantly cheaper than the other's while performing nearly the same, although consuming more power, people will buy the former.



It wasn't as bad a difference as this. People started caring when Fermi came out. As cdawall said, some do not care, but in general people care more now (which is a good thing).


----------



## B-Real (Jul 25, 2017)

R00kie said:


> Yet with far greater power consumption, same performance and slightly higher price tag.


That's why I said earlier: if two cards come out with the same price tag and the same performance, what can people consider? Performance/watt ratio and drivers.


----------



## EarthDog (Jul 25, 2017)

Daven said:


> The Geforce GTX 1080 was overclocked to 1924 MHz which is well above stock/turbo/max clocks of 1607/1733/1800 MHz. This is pretty much the best possible score a GTX 1080 can have overclocked almost to the max.
> 
> If the RX Vega at 1630 MHz is stock or even turbo, they will be in a good position. I calculate the card will be halfway between a stock GTX 1080 and a stock GTX 1080 Ti.
> 
> Rumored power consumption of 375W still suuuuuuuuuuuuuuuuuuuccccckkkkkkkssss.


Hey Mark... actual boost clocks go around 100 MHz over whatever is listed. It's likely a simple factory-overclocked card. Most 1080s will do 2000 MHz overclocked manually. Not to mention, 3DMark isn't good at reading clock speeds in the first place... so it's a tough call.


----------



## R00kie (Jul 25, 2017)

B-Real said:


> That's why I said earlier: if two cards come out with the same price tag and the same performance, what can people consider? Performance/watt ratio and drivers.


Which is why RX Vega is not going to be a go-to card, this late to the party.


----------



## r9 (Jul 25, 2017)

cryohellinc said:


> If it's cheaper compared to the 1080, this will sell very well.


I'm sure they will ask more and play the cheaper-than-G-Sync monitors card.


----------



## Daven (Jul 25, 2017)

EarthDog said:


> Hey Mark... actual boost clocks go around 100 MHz over whatever is listed. It's likely a simple factory-overclocked card. Most 1080s will do 2000 MHz overclocked manually. Not to mention, 3DMark isn't good at reading clock speeds in the first place... so it's a tough call.



To put things into perspective, here is the TP review of the Gigabyte GTX 1080 Aorus Xtreme Edition:
https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/

The card is clocked at 1759+ MHz. Even with another 100 MHz of turbo, the card would not beat Vega in the 3DMark benchmark (not that it means much, as we need real-world gaming benchmarks). Also, the peak power consumption of the Gigabyte card was 243 W. I'm sure the power would be even higher at 1924 MHz.

Vega is equivalent to a heavily, manually overclocked GTX 1080. That's all I'm saying.


----------



## cdawall (Jul 25, 2017)

B-Real said:


> What I saw was that the AIB models of 290, 390 produced not much more heat than the 970 980 models. Same with the 480/580-1060. But ok, you tell that people are buying cards that have better performance/watt. I just don't accept it.









Accept what you want. Name another reason why anything released post-290 had garbage sales figures. Mining is the only reason the Q2 2016 numbers moved.


----------



## HD64G (Jul 25, 2017)

B-Real said:


> That's why I said earlier that if 2 cards come out with the same price tag and same performance, what people can consider? Performance/Watt ratio and drivers.


FreeSync vs. G-Sync cost is a good factor in a gamer's decision also. And new features for current- and next-gen games, alongside better performance in the latest graphics APIs like DX12 and Vulkan, should be considered as well. If Vega RX manages to be on par with factory-OC'd 1080s for the same price, it will sell easily. Some request 1080 performance at a 1070 price, which is suicidal stupidity for any enterprise to inflict on itself. Let's hope now (not very likely, judging by AMD's past new-arch GPU releases) for reviews on mature drivers with all the features enabled, to get a clear picture of where RX Vega will stand.


----------



## chief-gunney (Jul 25, 2017)

Sounds OK; it will give me the performance I'm looking for as an upgrade from the RX 480. I'll be putting a water block on the one with the best VRMs. I hope the pricing is good.
I've got a 1440p FreeSync monitor, so no Nvidia ever again for me.

From what I can surmise, Vega is a stepping stone to Navi, which will see a dual Vega on Infinity Fabric that the OS sees as a single GPU. Nvidia are worried about that, and I believe they're working on doing something similar.


----------



## Imsochobo (Jul 25, 2017)

TheinsanegamerN said:


> Not really unexpected, based on the performance of Polaris. Most educated guesses put the Vega chip at 1080 level at best. The fact that it is consistently faster than a 1070 is a bit better than expected. Unfortunately, it also confirmed that AMD completely conceded this GPU generation to NVIDIA for anything other than low-end cards, and that does not bode well for Volta vs. Navi.
> 
> Of course, that assumes this is the full 4096 core part. Based on FE's performance, I have little reason to believe otherwise, but still.
> At which point Volta will be out and dominating what is left of Vega.
> ...



True, but it may have an additional 10% left in it by actual launch, unless the above really is the whole story.
If it has 10% more performance than we expect, then it may have an actual chance to sell.

I really, really want to get away from crapvidia Linux drivers, but I still want to game at 4K, so I'm crossing my fingers; I won't buy it unless it surpasses the GTX 1080 by a good margin


----------



## Dimi (Jul 25, 2017)

Daven said:


> Not quite. Again the 1080 was overclocked to a whopping 1924 MHz. I don't believe the Vega FE Gaming Mode was compared to the 1080 clocked this high.



You couldn't be more wrong. First off, the Vega was probably clocked to the absolute max. Secondly, a 1080 can go a lot higher than a measly 1924 MHz. That's probably just a weak boost without ANY OC at all. Most 1080s will go to around 2025-2050 comfortably.


----------



## schrank251 (Jul 25, 2017)

Daven said:


> "This puts its performance either on-par or below that of the GeForce GTX 1080..."
> 
> No.
> 
> ...


 
No, I have the same GTX 1080, and even at stock it boosts higher than that in the table. Fire Strike shows the peak GPU clock; at stock mine boosts up to about 2 GHz but averages about 1950 MHz, resulting in a 23k graphics score.
Here is my worst stock result.
http://www.3dmark.com/fs/11885095
Overclocked, I get a 25k graphics score.
http://www.3dmark.com/fs/12177709


----------



## EarthDog (Jul 25, 2017)

Daven said:


> To put things into perspective, here is the TP review of the Gigabyte GTX 1080 Aorus Xtreme Edition:
> https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/
> 
> The card is clocked at 1759+ MHz. Even with another 100 MHz of turbo, the card would not beat Vega in the 3DMark benchmark (not that it means much, as we need real-world gaming benchmarks). Also, the peak power consumption of the Gigabyte card was 243 W. I'm sure the power would be even higher at 1924 MHz.
> ...


We haven't seen a manually overclocked 1080 compared to it yet. As I said, that is likely a factory-overclocked card.

Lesson on GPU-Z and boost: there is the base clock, the boost clock, and the actual clock. The 1759 MHz is the BASE clock for that card. The BOOST clock is typically around 100 MHz above that, and it is the MINIMUM boost, not the ACTUAL clock. You can take what GPU-Z reads and generally add 100 MHz or so for ACTUAL clocks.

As for the power rating, that is with the clocks stock, around 2000 MHz for the AORUS. Here is a snip from the article you linked. Manual overclocking had it going over 2100 MHz; default was 2038. You can bet your arse Vega will show the same increase in power when AIBs have their factory versions out too, raising the already high values of 300/375 W.


> Maximum overclock of our sample is 1470 MHz on the memory (15% overclock) and +88 MHz to the GPU's base clock, *which increases maximum Boost from 2038 MHz to 2126 MHz* (4% overclock).



You have to know how to read GPU-Z and how boost works, bud. I agree with your end point mostly (it performs like a factory-OC 1080), but not how you got there, as that was just plain off.


----------



## PowerPC (Jul 25, 2017)

Just release it already! Nobody will care if it's priced right. Only a handful of people are ever interested in $800 cards, so who cares if it doesn't beat the 1080 Ti? It's such a petty argument. The only factor that ever matters with AMD is price/performance, as always. It's great that there is a $500,000 Lamborghini on the market, but most people will still go out and buy the more affordable Subaru.


----------



## Dimi (Jul 25, 2017)

Daven said:


> To put things into perspective, here is the TP review of the Gigabyte GTX 1080 Aorus Xtreme Edition:
> https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/
> 
> The card is clocked at 1759+ MHz. Even with another 100 MHz of turbo, the card would not beat Vega in the 3DMark benchmark (not that it means much, as we need real-world gaming benchmarks). Also, the peak power consumption of the Gigabyte card was 243 W. I'm sure the power would be even higher at 1924 MHz.
> ...



Did you even read the review? You link a review of a card that OCs to 2126 MHz according to the review, yet you think this card clocks at a max of 1859 MHz..........


----------



## cdawall (Jul 25, 2017)

Daven said:


> To put things into perspective, here is the TP review of the Gigabyte GTX 1080 Aorus Xtreme Edition:
> https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/
> 
> The card is clocked at 1759+ MHz. Even with another 100 MHz of turbo, the card would not beat Vega in the 3DMark benchmark (not that it means much, as we need real-world gaming benchmarks). Also, the peak power consumption of the Gigabyte card was 243 W. I'm sure the power would be even higher at 1924 MHz.
> ...



Did you even read that review?



			
w1z said:
			
		

> Maximum overclock of our sample is 1470 MHz on the memory (15% overclock) and +88 MHz to the GPU's base clock, which increases maximum Boost from 2038 MHz to 2126 MHz (4% overclock).



Stock boost was 2038


----------



## dozenfury (Jul 25, 2017)

Wattage I'm not as concerned about; I have dozens of appliances that use more than the 120 W difference between the Vega and the 1080. The monthly power-cost difference is relatively minimal unless you are mining at 100%, 24x7. What I'm more concerned with is heat, since that heats up all of the other components in my PC, and even, to a small degree, the room temp. Heat is also likely to hurt OC headroom, which has been an issue on recent AMD cards.

If the benches are accurate, I'd put a fair price on an RX Vega at $399. That would beat the 1070 on pure performance/price (I paid $399 for my 1070 new), but it also offsets the higher power draw and other minor AMD cons. The NV platform is much more mature now, having been out a year: it OCs well, gets quicker driver updates than AMD, and will probably run cooler. Any more than $399 and I'd happily buy a 1070 for the same price, a 1080 in the low $500s, or go $700 for a much faster 1080 Ti. That's all excluding the mining price shenanigans, which should eventually wind down.


----------



## DRDNA (Jul 25, 2017)

I was really looking forward to this release and I thought it would be a pinch better.


----------



## Durvelle27 (Jul 25, 2017)

This could possibly be the lower-end Vega, considering there will be three different SKUs


----------



## Crap Daddy (Jul 25, 2017)

I think the big issue with this card is that it comes so late; the predicted disastrous power efficiency compared to the competition is secondary. For fanboys and fantasyland dreamers, Vega is shaping up to be a disappointment. The unrealistic expectations were fueled by AMD themselves, but that shouldn't come as a surprise; "Fury being an overclocker's dream" was golden. For those who have been paying attention, Vega is more or less in line with what RTG is capable of doing compared to NVIDIA.


----------



## bug (Jul 25, 2017)

My question is, if it won't beat the GTX 1080, why does it need HBM?


----------



## Supercrit (Jul 25, 2017)

This is not a gaming card, wait for the gaming version for 20% extra performance! /s


----------



## chaosmassive (Jul 25, 2017)

Yo, it's still 4096 cores, same as the Fiji silicon.
The improvement we see here is from IPC gains and uarch changes...
Honestly, I don't see the point of this card:
costly HBM2 memory and a delayed product, yet still outclassed by a one-year-old GPU.

This is worse than Fury X; people used to compare the Fury X with the 980 Ti,
but this, this is just so bad....

We need another Jim Keller-type guy to research/design a new GPU uarch


----------



## EarthDog (Jul 25, 2017)

Supercrit said:


> This is not a gaming card, wait for the gaming version for 20% extra performance! /s


It is... Take a second to read the title, the link, etc.. RX Vega (the gaming card).. not Vega Frontier Edition.


----------



## Supercrit (Jul 25, 2017)

EarthDog said:


> It is... Take a second to read the title, the link, etc.. RX Vega (the gaming card).. not Vega Frontier Edition.


Poe's law so powerful /s didn't even work.


----------



## cdawall (Jul 25, 2017)

So this brought up some curiosity from me. Let's see what it takes to get there with a 1080 Ti, which is a similarly sized GPU with similar performance numbers in bandwidth, etc.

This testing is done with the rig in my specs

5960X@4.8, Asus X99m WS, 4x8GB DDR4 @2666CL14, 512GB NVMe, 2 EVGA SC BLACK 1080Ti's (SLi disabled OBVIOUSLY)

50% TDP rest of card settings at stock, this yielded a graphics score of 20343 and consumed 125W at the GPU (under mining stress load)







55% TDP rest of card settings at stock, this yielded a graphics score of 23158 and consumed 138W at the GPU (under mining stress load)






58% TDP rest of card settings at stock, this yielded a graphics score of 24016 and consumed 144W at the GPU (under mining stress load)






So with a similarly sized GPU, I can basically run my cards silent for the same performance. The fan curve never broke 20%; my ceiling fan is louder.
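The three runs above can also be reduced to points-per-watt, which shows how flat the efficiency curve stays once the card is power-limited. A quick sketch over the posted numbers (the wattages are GPU power under the mining stress load, not whole-board draw):

```python
# cdawall's power-limited 1080 Ti runs: (TDP limit %, graphics score, GPU watts).
runs = [
    (50, 20343, 125),
    (55, 23158, 138),
    (58, 24016, 144),
]

for tdp_pct, score, watts in runs:
    print(f"{tdp_pct}% TDP: {score} pts at {watts} W -> {score / watts:.1f} pts/W")
```

All three land in the 160-170 points/W range, more than double the leaked Vega score divided by its rumored 300/375 W board power.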


----------



## RejZoR (Jul 25, 2017)

Still don't quite understand why they haven't just slammed two Polaris GPUs on a single card with internal CrossFireX. Or just shrunk the Fury X, clocked it higher, and called it a day. It would probably be nothing in terms of R&D compared to RX Vega as a whole new GPU design, and they'd essentially get this kind of performance anyway. Unless AMD is trolling us all on purpose with such scores and all the mystery around RX Vega, to first generate interest (drama) via underwhelming results and then slam the market with crazy performance, taking NVIDIA by surprise and generating even more drama, because it would shake everything we've known about Vega to date. Not really sure which it is anymore...


----------



## Captain_Tom (Jul 25, 2017)

Hope this is a cut-down RX Vega, or it is going to be a pathetic launch. Even if it's still just immature drivers, it should never be released in this state. A 300 W card competing with a 1070. WTF?!


They could have just made a 4608-SP Polaris card with HBM and it would have done FAR better...


----------



## cdawall (Jul 25, 2017)

Captain_Tom said:


> Hope this is cut down RX Vega or it is going to be a pathetic launch.   Even if it's still just immature drivers, it should never be released in this state.  A 300w card competing with a 1070.  WTF?!
> 
> 
> They could have just made a 4608-SP Polaris card with HBM and it would have done FAR better...



HBM is a waste of money. A fat Polaris card with GDDR5X would have happily competed with the 1080.


----------



## EarthDog (Jul 25, 2017)

Supercrit said:


> Poe's law so powerful /s didn't even work.


There is a reason most forums have emoticons.. to clear up how the written word's tone should be read... I'd consider using them for sarcasm.


----------



## Captain_Tom (Jul 25, 2017)

RejZoR said:


> Still don't quite understand why they haven't just slammed two Polaris GPUs on a single card with internal CrossFireX. Or just shrunk the Fury X, clocked it higher, and called it a day. It would probably be nothing in terms of R&D compared to RX Vega as a whole new GPU design, and they'd essentially get this kind of performance anyway. Unless AMD is trolling us all on purpose with such scores and all the mystery around RX Vega, to first generate interest (drama) via underwhelming results and then slam the market with crazy performance, taking NVIDIA by surprise and generating even more drama, because it would shake everything we've known about Vega to date. Not really sure which it is anymore...



A lot of parallels to Bulldozer going on here:

-Massive hype
-Tons of delays
-Completely new arch with high power usage

I am sure it will (and it does) dominate in certain professional workloads, just like Bulldozer did. However, gamers will say, "Why didn't they just die-shrink Phenom II/Polaris?!"



cdawall said:


> HBM is a waste of money. A fat Polaris card with GDDR5X would have happily competed with the 1080.



Polaris is actually more memory-starved than compute-starved. Tests (that I have replicated, BTW) have shown that the RX 480 would have scaled its performance almost linearly with 30% more bandwidth.


----------



## ratirt (Jul 25, 2017)

RejZoR said:


> Still don't quite understand why they haven't just slammed two Polaris GPUs on a single card with internal CrossFireX. Or just shrunk the Fury X, clocked it higher, and called it a day. It would probably be nothing in terms of R&D compared to RX Vega as a whole new GPU design, and they'd essentially get this kind of performance anyway. Unless AMD is trolling us all on purpose with such scores and all the mystery around RX Vega, to first generate interest (drama) via underwhelming results and then slam the market with crazy performance, taking NVIDIA by surprise and generating even more drama, because it would shake everything we've known about Vega to date. Not really sure which it is anymore...


Wait for Navi and Volta. Both will be dual GPUs


----------



## Vayra86 (Jul 25, 2017)

I'll not say I predicted this already when the 1080 prices were slashed and I bought one.

But that's what it is

To those who predicted a +20/30% performance win (because drivers!) from the "not meant for gaming" Vega... yeah, I hope you learned something here

Fury X v2 is reality


----------



## cdawall (Jul 25, 2017)

Captain_Tom said:


> Polaris is actually more memory-starved than compute-starved. Tests (that I have replicated, BTW) have shown that the RX 480 would have scaled its performance almost linearly with 30% more bandwidth.



Two Polaris dies' worth of BS would have allowed a 512-bit GDDR5X bus, which would in turn fix that issue. Remember, the 352-bit GDDR5X bus on the 1080 Ti is about even with the HBM bus on Vega.


----------



## Vayra86 (Jul 25, 2017)

cdawall said:


> Two Polaris dies' worth of BS would have allowed a 512-bit GDDR5X bus, which would in turn fix that issue. Remember, the 352-bit GDDR5X bus on the 1080 Ti is about even with the HBM bus on Vega.



They would also be stuck with a board TDP of over 320 W that way. A 512-bit GDDR5X setup is quite costly, I reckon, and a single RX 480 happily goes towards 170 W typical draw.

Let's face it, GCN is like Intel Core right now: as DX12 has landed, it's been scaled up to the max. Not the best timing for AMD, and I really hope they have something up their sleeve for Navi besides gluing GCN together.


----------



## ratirt (Jul 25, 2017)

Vayra86 said:


> I'll not say I predicted this already when the 1080 prices were slashed and I bought one.
> 
> But that's what it is
> 
> ...


So what did you predict for Vega? Leaked benchmarks? The hype train? The card is not out yet, so chillax.


----------



## cdawall (Jul 25, 2017)

Vayra86 said:


> They would also be stuck with a board TDP of over 320 W that way. A 512-bit GDDR5X setup is quite costly, I reckon, and a single RX 480 happily goes towards 170 W typical draw



So that would be different than the Vega stuff in what way? Board power is 350w and they can't even get enough hbm to sell the cards.


----------



## Vayra86 (Jul 25, 2017)

cdawall said:


> So that would be different than the Vega stuff in what way? Board power is 350w and they can't even get enough hbm to sell the cards.



Yeah, you're right, they have no feasible place to go; it's horrifying and to me shows a glaring lack of forward planning. Given how well they've set up their CPU side right now, you'd expect different. They should have pulled Navi forward to 2H 2017 and skipped Vega altogether.

Perhaps the only advantage of dual RX 480s would have been a much, much cheaper-to-produce high end that could have been on the market a year ago.

@ratirt not sure how much more you need here to know what's what. Even if it still gains 10% from drivers, which I find reasonably plausible (and still a feat on its own, tbh; go look at pre- and post-launch driver benches of the past), it would make 24k points and still be much too hungry for too little performance, at a high cost to make. Keep in mind that a high board TDP also severely limits OC headroom.


----------



## Captain_Tom (Jul 25, 2017)

Vayra86 said:


> I'll not say I predicted this already when the 1080 prices were slashed and I bought one.
> 
> But that's what it is
> 
> ...



RX Vega is a 13.1+ TFLOP card with HBM2, RPM, HBC, and finally tiled rasterization. It is a monster card, period.

Have you read/watched reviews of Frontier? The Radeon menus don't even work yet lol. There are clearly some massive driver issues.


*Will AMD ever get their drivers to work?   Nobody knows.   But at the very least the potential is there, and Frontier clearly wasn't using all of its features.*



cdawall said:


> So that would be different than the Vega stuff in what way? Board power is 350w and they can't even get enough hbm to sell the cards.



Exactly.


If Vega really turns out this bad, it will be because Vega was never built for gaming. GCN = gaming, Vega = professional work.

However, if that is the case, it was insanely bone-headed for AMD to even bother making a gaming version of Vega, and I will expect a GDDR6 large-Polaris card by early 2018.


----------



## cdawall (Jul 25, 2017)

Captain_Tom said:


> RX Vega is a 13.1+ TF card with HBM2, RPM, HBC, and finally Tiled Rasterization.    It is a monster card, period.
> 
> Have you read/watched reviews of Frontier?   The Radeon Menu's don't even work yet lol.   There are clearly some massive driver issues.
> 
> ...



TBR doesn't work at all. So unless something changes that is just another missing cog from the hype train.


----------



## Patriot (Jul 25, 2017)

ERazer said:


> that gotta hurt for ppl that waited this long



I bought a 1080ti while waiting, and it still hurts lol.


----------



## Captain_Tom (Jul 25, 2017)

cdawall said:


> TBR doesn't work at all. So unless something changes that is just another missing cog from the hype train.



And I 100% agree with you on that.   But like I said - the potential is there.  If Vega doesn't pan out, it's because the arch proved just WAY too complicated for AMD's driver team to come to terms with.

This would actually explain why AMD was so cocky at first and then now they are acting super cagey and unsure of themselves.


----------



## Vayra86 (Jul 25, 2017)

Captain_Tom said:


> And I 100% agree with you on that.   But like I said - the potential is there.  If Vega doesn't pan out, it's because the arch proved just WAY too complicated for AMD's driver team to come to terms with.
> 
> This would actually explain why AMD was so cocky at first and then now they are acting super cagey and unsure of themselves.



You gotta admit progress has been quite stagnant over the past six months driver wise. We can get all hung up on TBR but core count + clocks already tell me more than enough.

In other news you're double posting again


----------



## cdawall (Jul 25, 2017)

Captain_Tom said:


> And I 100% agree with you on that.   But like I said - the potential is there.  If Vega doesn't pan out, it's because the arch proved just WAY too complicated for AMD's driver team to come to terms with.
> 
> This would actually explain why AMD was so cocky at first and then now they are acting super cagey and unsure of themselves.



I don't think even a combo of Intel's and NVIDIA's driver teams could fix a card that only competes with a 1080 Ti downclocked to 55% of its TDP... that is a lot of ground to make up.


----------



## Daven (Jul 25, 2017)

cdawall said:


> Did you even read that review?
> 
> 
> 
> Stock boost was 2038



Clocks are definitely difficult to decipher nowadays. I like to learn but I'm not sure the picture is very clear yet. I have a question and a comment:

First, are clocks given in the 3dmark benchmark leak fixed? Are all turbos, boosts, etc. disabled? If so, then a 1630 MHz Vega is performing almost the same as a 1924 MHz GTX 1080.

Second, from the Gigabyte article I linked, all the game benchmark results are performed with the default settings of the Gigabyte card not the overclocked results (we don't know anything about overclocking a Vega).

As shown on the clock profiles page, the clocks range from 1696 to 2038 MHz. I assume they change based on the load and card temperature. The average is around 1982 MHz.
https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/34.html

It might seem that 1924 MHz is very different from the max 2038 MHz because there is a 2 instead of a 1 but the difference is actually quite small (~5.5%). Plus, it is hard to compare a moving clock speed to a fixed clock speed benchmark result. But if we take the average of 1982 MHz on the Gigabyte card (3% above 1924 MHz) and adjust the 3dmark score, you get ~23260. The 1630 MHz Vega received a score of 22330 which is 4% lower. Yes you can probably overclock a GTX 1080 an additional 4% to get above 2 GHz. You also might be able to overclock a Vega 4%. Time will tell.

So again, all I'm saying is that Vega is equivalent to an overclocked (factory or manual) GTX 1080 according to these leaked 3dmark scores.

If we look at power consumption, TPU measured 243 W peak on the factory-overclocked Gigabyte card. Rumors peg the liquid-cooled Vega (I'm assuming that's the highest score) at 375 W. So AMD is getting about the same performance as Nvidia at roughly 54% higher power (put the other way, Nvidia draws about 35% less). That suuuuuuuuuuuucccccccccccccckkkkkkkkkkkkksss in my book. Others may not care.
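To make the back-of-envelope math above reproducible, here's a quick sketch using the leaked/rumored numbers quoted in this thread. Every input is unofficial (leaked scores, rumored board power), so treat the output as a rough estimate, not a measurement:

```python
# Back-of-envelope comparison from the thread's leaked/rumored numbers.
vega_score = 22330        # best leaked RX Vega Fire Strike graphics score
gtx1080_score = 23260     # estimated score for the factory-OC Gigabyte GTX 1080

vega_power = 375          # rumored board power of liquid-cooled RX Vega (W)
gtx1080_power = 243       # TPU-measured peak of the Gigabyte GTX 1080 (W)

# Raw performance gap between the two scores
perf_gap = gtx1080_score / vega_score - 1
print(f"GTX 1080 OC leads by {perf_gap:.1%}")          # ~4.2%

# Performance per watt (points per W)
print(f"Vega:     {vega_score / vega_power:.1f} pts/W")
print(f"GTX 1080: {gtx1080_score / gtx1080_power:.1f} pts/W")

# The same power delta expressed against both baselines
print(f"Vega draws {vega_power / gtx1080_power - 1:.0%} more power")  # ~54%
print(f"GTX 1080 draws {1 - gtx1080_power / vega_power:.0%} less")    # ~35%
```

Under these assumptions Vega lands around 60 pts/W versus roughly 96 pts/W for the overclocked 1080, which is the efficiency gap being argued over in this thread.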


----------



## cdawall (Jul 25, 2017)

Mark Little said:


> Clocks are definitely difficult to decipher nowadays. I like to learn but I'm not sure the picture is very clear yet. I have a question and a comment:
> 
> First, are clocks given in the 3dmark benchmark leak fixed? Are all turbos, boosts, etc. disabled? If so, then a 1630 MHz Vega is performing almost the same as a 1924 MHz GTX 1080.
> 
> ...



And if you look at the post I made earlier: as far as more efficient NVIDIA cards go, the 1080 Ti massively underclocked still beats it.


----------



## Captain_Tom (Jul 25, 2017)

cdawall said:


> I don't think even a combo of Intel's and nvidias driver team could fix a card that only competes with a 1080ti downclocked on a 55% tdp...that is a lot of ground to make up for.



It's not a matter of having a better driver team, it's a matter of the card just being too complicated period.

But don't underestimate that this is the first arch to use Rapid Packed Math, High Bandwidth Cache, and (supposedly) tiled rasterization all at once.   Then throw in the fact that AMD has been on GCN for 6 years straight.


----------



## Daven (Jul 25, 2017)

cdawall said:


> And if you look at the post I made earlier as far as more efficient nvidia cards go the 1080ti massively underclocked still beats it.



Oh yes, I quite agree. The Ti version puts AMD even more to shame. This is why I hope they price Vega around $500 like the GTX 1080.


----------



## cdawall (Jul 25, 2017)

Captain_Tom said:


> It's not a matter of having a better driver team, it's a matter of the card just being too complicated period.
> 
> But don't underestimate that this is the first arch to use Rapid Packed Math, High Bandwidth Cache, and (supposedly) tiled rasterization all at once.   Then throw in the fact that AMD has been on GCN for 6 years straight.



2 years since the first benchmarks started making the rounds. In that time AMD has done what? I strongly doubt their card is any more advanced than the HPC products on the market.


----------



## Vayra86 (Jul 25, 2017)

Captain_Tom said:


> Then throw in the fact that AMD has been on GCN for 6 years straight.



That's six years to optimize it incrementally, but AMD was happier adding shaders and going to 512-bit. They started considering incremental refinements with Tonga, which was a dud, and we all know the rebrand hell that followed. AMD could have seen this coming for at least 3-4 years; you can't seriously argue TBR shouldn't have been here long ago already.

Sorry, but 'card too difficult' I just don't buy. GCN is a stubborn fucker that wanted to be a jack of all trades, and AMD has been paying the price for that ever since DX11. They didn't choose HBM because they figured it'd make a nice, cost-effective GPU; they did it because they were desperate for board TDP budget and out of good ideas for GCN itself. Hawaii was the writing on the wall, really. The only thing that saved Polaris from being horrible was delta compression > smaller bus, plus a smaller node.


----------



## Eric3988 (Jul 25, 2017)

The timing of this card really couldn't be much worse but the fact is that if AMD can provide nearly 1080 performance for less than the MSRP, it can do well. Doubtless Nvidia won't sit still, so keep on waiting if that's your thing. I for one need a new GPU to replace my super long in the tooth hd6970 in my wife's PC. She'll get my 970 and I'll take Vega with the Freesync monitor I purchased recently.


----------



## Captain_Tom (Jul 25, 2017)

Vayra86 said:


> That's six years to optimize it incrementally, but AMD was happier adding shaders and going to 512 bit. They started considering incremental refinements with Tonga which was a dud, and we all know the rebrand hell that followed. AMD could have seen this coming for at least 3-4 years, you can't defend that TBR shouldn't have been here long ago already.
> 
> Sorry but 'card too difficult' I just don't buy. GCN is a stubborn fucker that wanted to be jack of all trades and AMD has been paying the price for that ever since DX11. They didn't choose HBM because they figured it'd be a nice, cost effective GPU they did it because desperate for board TDP budget and out of good ideas for GCN itself. Hawaii was a writing on the wall, really. The only thing that saved Polaris from being horrible was delta compression > smaller bus and a smaller node.



What about Hawaii?

Hawaii wiped the floor with Kepler and Grenada even managed to stay competitive with Maxwell.



Eric3988 said:


> The timing of this card really couldn't be much worse but the fact is that if AMD can provide nearly 1080 performance for less than the MSRP, it can do well. Doubtless Nvidia won't sit still, so keep on waiting if that's your thing. I for one need a new GPU to replace my super long in the tooth hd6970 in my wife's PC. She'll get my 970 and I'll take Vega with the Freesync monitor I purchased recently.




In my opinion AMD needs to just give up on making a profit on gaming Vega.  At this point they need to keep marketshare, and Zen should make them enough money to be fine until Navi.

If Vega is a 300w 1080, it should cost $400 at most.


----------



## DeathtoGnomes (Jul 25, 2017)

so when will we see a real review here?


----------



## Eric3988 (Jul 25, 2017)

Captain_Tom said:


> In my opinion AMD needs to just give up on making a profit on gaming Vega.  At this point they need to keep marketshare, and Zen should make them enough money to be fine until Navi.
> 
> If Vega is a 300w 1080, it should cost $400 at most.



Agreed. Pricing can make or break the product. I think AMD can appreciate gaining market share on the competitors (look at Ryzen and Threadripper). $400-500 would make the card tempting in that performance bracket, even if it heats up your room and adds to the electric bill.


----------



## Vayra86 (Jul 25, 2017)

Eric3988 said:


> Agreed. Pricing can make or break the product. I think AMD can appreciate gaining market share on the competitors (look at Ryzen and Threadripper). $400-500 would make the card tempting in that performance bracket, even if it heats up your room and adds to the electric bill.



The bulk of market share is within the RX480 segment, and Polaris was targeting that. Not Vega. The only way you gain market share with enthusiast/high end is by having the halo card actually be the fastest and allow for trickle down, which is everything Vega's not. Trickle down HBM won't ever happen in the foreseeable future.

@Captain_Tom about Hawaii, read my post. It was GCN with 512 bit and more shaders to counter a major architecture inefficiency issue, one that still exists today and is now hitting its absolute ceiling.


----------



## Tomgang (Jul 25, 2017)

I came, I saw, and I do not want it.

If this is how RX Vega performs, and if the rated TDP of 300 or 375 W is true, I do not want it.

Then I am just even more happy I did not wait for Vega and got a 1080 Ti.

Besides the high power use that will affect your electricity bill, a high-TDP card has other downsides:

It heats up the room it is in faster.
Potentially more noise from the fans.
It needs to be bigger, to have space for a cooler that can handle a 300 W+ TDP (if that TDP turns out to be true, of course).
Can you really live with a card that uses close to double the power for the same performance another card can deliver at half the draw? I cannot.

Nope, RX Vega doesn't impress me much so far.


----------



## refillable (Jul 25, 2017)

I know it's disappointing, but how did you deduce that it's slower than the GTX 1080? Most reference 1080s score around 20,000, and since this one is reference it should be the other way around.


----------



## Eric3988 (Jul 25, 2017)

Vayra86 said:


> The bulk of market share is within the RX480 segment, and Polaris was targeting that. Not Vega. The only way you gain market share with enthusiast/high end is by having the halo card actually be the fastest and allow for trickle down, which is everything Vega's not. Trickle down HBM won't ever happen in the foreseeable future.
> 
> @Captain_Tom about Hawaii, read my post. It was GCN with 512 bit and more shaders to counter a major architecture inefficiency issue, one that still exists today and is now hitting its absolute ceiling.



There's more than one way to skin a cat, as they say. Anything at or above 1070 is enthusiast/high end and with what we know of Vega it will be at that level at least. At a competitive price point it can gain some share. I'm not saying it's going to flip the market share around in a cycle, but at this point AMD is looking for any gains it can muster right? What's important is that they don't screw this up. I know we have our team red and green, but we should all be rooting for both to do well to keep competition healthy and prices low for all.


----------



## Nkd (Jul 25, 2017)

EarthDog said:


> Well, if this is the XTX air/water... that isn't good. If it is something lower in the product stack...
> 
> 300/375W vs 180 (gtx 1080) doesn't look good in performance /W. Looks like they will slide in on price: performance ratio and undercut.



Gaming Vega has a TDP of 275 W for the entire board before overclocking. Videocardz had the numbers earlier this month, I think.


----------



## Basard (Jul 25, 2017)

Captain_Tom said:


> I am sure it will (And it does) dominate at certain professional workloads just like Bulldozer did. However gamers will say "Why didn't they just die shrink Phenom II/Polaris?!"


Bingo! We have a winner! I would have jizzed my pants over an 8 core Phenom @ 4ghz....


----------



## EarthDog (Jul 25, 2017)

Nkd said:


> gaming vega have a TDP of 275w for the entire board before overclocking. Videocardz had the numbers earlier this month I think.


There were two values: one for the air-cooled card and one for water. The water-cooled card shows 375 W there, while air is 285 W. Most other sites report 300/375. A cheesy pump or two sure as hell isn't 90 W.

https://videocardz.com/amd/radeon-500


----------



## Slizzo (Jul 25, 2017)

Mark Little said:


> Clocks are definitely difficult to decipher nowadays. I like to learn but I'm not sure the picture is very clear yet. I have a question and a comment:
> 
> First, are clocks given in the 3dmark benchmark leak fixed? Are all turbos, boosts, etc. disabled? If so, then a 1630 MHz Vega is performing almost the same as a 1924 MHz GTX 1080.
> 
> ...




The definition of an overclocked card has become something of a misnomer since Pascal launched. Our understanding of overclocking changed drastically with the advent of GPU Boost 3.0 (which is on all Pascal-based GPUs). ALL 1080s, whether they're Founders Edition or not, will generally boost up to at least ~1900 MHz; that is really what you should treat as the normalized frequency of these cards. It's rare you'll see one that won't boost to that level or higher unless it's starved for air.

What you should be looking at is STOCK-for-STOCK performance, no matter if the card is "overclocking" itself or not. Compare a Founders Edition card to a stock RX Vega, which I'm sure is what you will see all the news outlets test once they are able to publish their reviews of RX Vega.


----------



## OneMoar (Jul 25, 2017)

Wow, slower than a 1080 and nearly double the power consumption.
Sign me up.

An overclocked 1080 draws about 200 W.

Vega draws >350 W at the clocks needed to match a 1080.

If you run it at ~1300 to 1400 MHz it _only_ draws 290 W and gets its ass beat by the 1070.
Pretty simple math.


----------



## EarthDog (Jul 25, 2017)

Mark Little said:


> Clocks are definitely difficult to decipher nowadays. I like to learn but I'm not sure the picture is very clear yet. I have a question and a comment:
> 
> First, are clocks given in the 3dmark benchmark leak fixed? Are all turbos, boosts, etc. disabled? If so, then a 1630 MHz Vega is performing almost the same as a 1924 MHz GTX 1080.
> 
> ...


They aren't, if you know how to read them... as I explained in post #44 to ya. It's a clear picture.

1. Can't compare clock speeds between the two.
2. Correct. It was stock for that specific card.
3. The BOOST clocks vary by temperature, correct. Not so much by load, unless it's a light load and it drops to a set of clocks lower than the base clock.
4. We get it... just sharing with you the way to properly read GPU-Z and how Boost works with NVIDIA cards.
5. Again, we get what you are saying, but how you got there wasn't correct. Looks settled now.
6. Yes, and that is a factory-overclocked 1080. When AIBs get their hands on Vega, its power use will go up, making that power gap even larger.


----------



## OneMoar (Jul 25, 2017)

Either way you are looking at a 400 W card losing to a 200 W card
that costs less, btw.

I don't understand AMD. You would think they would give up on GCN; it's obviously shit at scaling to more compute units.

Time and time again they throw more CUs at the problem, and all we get is more power consumption and meh performance.


----------



## cdawall (Jul 25, 2017)

EarthDog said:


> They aren't if you know... as I explained in post #44 to ya. Its a clear picture.
> 
> 1. Cant compare clockspeeds between the two..
> 2. Correct. It was stock for that specific card.
> ...



We actually don't know number 6 for sure. Remember how the Polaris and Fermi cards pull equal or sometimes even less power than their OEM equivalents due to the temperature drop. Fermi was notorious for pulling less than stock power when heavily overclocked on water. Now I am not saying it will be a miracle 50% decrease, but I could see a drop.


----------



## Slizzo (Jul 25, 2017)

OneMoar said:


> either way you are looking at a 400W card loosing to a 200 watt card
> that costs less btw
> 
> I don't understand AMD you would think they would give up on GCN its obviously shit at scaling more compute units
> ...



Vega and the previous Fury have the same number of shaders (4096). From what we've seen of Frontier Edition, the resulting increase in performance is directly attributable to the increase in clock speed. (Tests have been done with a Frontier Edition at 1050 MHz, and its performance matched that of the R9 Fury X.)


----------



## EarthDog (Jul 25, 2017)

cdawall said:


> We actually don't know for sure number 6. Remember how the polaris and Fermi cards pull equal or even sometimes less power than their OEM equal due to temperature drop. Fermi was notorious for pulling less than stock power when heavily overclocked on water. Now I am not saying it will be a miracle 50% decrease, but I could see a drop.


Yes and no... clearly a 90 W difference isn't down to a single pump or two on the card. So either that water cooler isn't doing as good a job as other water-cooled implementations, or the water-cooled version is using an arseload more power.


----------



## OneMoar (Jul 25, 2017)

EarthDog said:


> Yes and No... clearly a 90W difference isn't in a single/dual pumps on the card. So either that watercooler isn't doing as good a job as other water cooled implementations, or the water cooled version is using an arseload more power.


We know from various benches of the FE what happens when the card goes beyond 1500 MHz: the power consumption skyrockets.
We also know that AMD is using some ultra-aggressive clock gating to keep the card from drawing 400 fking watts.
I leave the rest to your imagination.


----------



## EarthDog (Jul 25, 2017)

I'm just trying to figure out how a 120mm AIO does the job well enough... I mean they strapped one on the 295x2 (500W), and it 'worked'... but sweet l0rd baby jebus, that radiator was HOT to the touch!!


----------



## cdawall (Jul 25, 2017)

EarthDog said:


> Yes and No... clearly a 90W difference isn't in a single/dual pumps on the card. So either that watercooler isn't doing as good a job as other water cooled implementations, or the water cooled version is using an arseload more power.



It is a single 120mm aluminum radiator; traditional thinking puts the maximum thermal load through those at 150 W, and that's assuming a good radiator of brass/copper construction with a pump that has actual flow. We already saw this same nonsense garbage cooling with the 295X2: the radiator was well past its efficiency curve, and temps/noise suffered substantially for it. A 375 W board TDP through a 120mm radiator is a bloody joke.


----------



## EarthDog (Jul 25, 2017)

JINX!!!!!!!!!!!!!!


----------



## OneMoar (Jul 25, 2017)

The fact that it needs water cooling to beat a 1070 is a joke -.- and an unfunny one.


----------



## cadaveca (Jul 25, 2017)

Mark Little said:


> Clocks are definitely difficult to decipher nowadays. I like to learn but I'm not sure the picture is very clear yet.




Want it to be more confusing? 3DMark does not list the actual clocks the card ran at in the benchmark; it merely lists the maximum reported. My 1080, which is a stock-clocked card (but with a better cooler), regularly reports 1924 MHz as its clock in 3DMark, but it usually runs @ 1872.

I know that's not that big of a difference, but it's still nearly 50 MHz.

Here's my 1080 with a 7900X (both at stock, but memory bus @ 3600 MHz, because 100 GB/s memory bandwidth). I ran this just now, just for you (note the 1911 MHz reported clock):

http://www.3dmark.com/fs/13200948







cdawall said:


> It is a single 120mm aluminum radiator, traditional thought puts maximum thermal load through those at 150w assuming a good radiator and brass/copper construction using a pump with actual flow. We already saw this same nonsense garbage cooling with the 295X2. Radiators were well past their efficiency curve and temps/noise suffered substantially for it. 375w board TDP through a 120mm radiator is a bloody joke.



Most companies rate 120 mm as good for 250W+. You can thank Koolance for that. 


http://koolance.com/radiator-1-fan-120mm-30-fpi-copper

That one is rated for 400w. ROFL.


----------



## efikkan (Jul 25, 2017)

Captain_Tom said:


> RX Vega is a 13.1+ TF card with HBM2, RPM, HBC, and finally Tiled Rasterization.    It is a monster card, period.


A hot monster, sure.



Captain_Tom said:


> Will AMD ever get their drivers to work?   Nobody knows.   But at the very least the potential is there, and Frontier clearly wasn't using all of its features.


So, AMD can't live up to the hype and you conclude it's due to their drivers? Perhaps your expectations are unrealistic?
What evidence is there for "potential"?

Do you remember Fiji? ~53% more GFlop/s and still it was beaten by GTX 980 Ti. So much for "potential".



Captain_Tom said:


> If Vega really turns out this bad, it will be because Vega was never built for gaming.  GCN=gaming, Vega=professional work.


So, in full damage control mode already?
I knew we were going to see people in denial about Vega…


----------



## Captain_Tom (Jul 25, 2017)

Eric3988 said:


> *Agreed. Pricing can make or break the product.* I think AMD can appreciate gaining market share on the competitors (look at Ryzen and Threadripper). $400-500 would make the card tempting in that performance bracket, even if it heats up your room and adds to the electric bill.



There is no such thing as a bad product, just bad pricing.


Let's assume that Vega is indeed a worst-case scenario, and its performance will fall short of what its specs suggested. If so, I would say this should be the line-up:


$600 = Fully unlocked and water-cooled Vega.   This would be the 1080 Ti competitor (still a little weaker) that trades efficiency for more features and future proofing.  
In general this is just for the AMD fanboys.
*Even if it uses 400w, I think it is imperative AMD has something comparable to the 1080 Ti* so that CPU benchmarks sometimes use an AMD card.
Over time this will likely catch up to the 1080 Ti or higher, but it will take many driver revisions.


$450 = Fully unlocked (or not?), lower clocked air-cooled Vega.  This would narrowly beat the 1080 (Or trade blows) while using 250 - 300w.
This is the card AMD needs to nail perfectly.  Most people don't buy uber graphics cards, and something at this caliber is sorely needed at a lower price point.
Paired with a Freesync monitor this would be a very attractive alternative to Nvidia's high-end cards.
The fact that you could crossfire 3 of these for around the same price as a Titan Xp would be a big selling point.


$350= Cut down, lower clocked, 4GB Vega.   This is the card that beats the 1070 for a tad less money.
A very important gap to fill between the RX 580 and RX Vega cards.
I would expect 8GB and Nano variants.
It would be sweet if they could make a GDDR5 version to further reduce price, but I know that won't happen.



^^^I know this is a long-shot, but I feel this is what's required to even have a chance of competing.   Nvidia can surely drop prices by $50 across their entire line-up, and a minor Pascal refresh could be out by December  (Or a 12nm refresh by Spring 2018).   Plus even at these aggressive prices AMD would still be profiting some on each card.


----------



## OneMoar (Jul 25, 2017)

Captain_Tom said:


> There is no such thing as a bad product, just bad pricing .
> 
> 
> Let's assume that indeed Vega is a worst-case-scenario, and it's performance will fall short of what it's spec suggested.  If so, I would say this should be the line-up:
> ...


NONE of that is going to happen
No way in hell; under water with a 400 W TDP it's barely matching a vanilla 1080.


----------



## Captain_Tom (Jul 25, 2017)

efikkan said:


> A hot monster, sure.
> 
> 
> So, AMD can't live up to the hype and you conclude it's due to their drivers? Perhaps your expectations are unrealistic?
> ...




What denial?  LMAO!    

Can you read?   Most of my posts here center around Vega being a big disappointment so far (And likely overall).  But everything I said still stands:


Vega Frontier's drivers ARE terrible. Learn how to read some reviews; there are bugs everywhere. *That the drivers are bad is a fact; whether better drivers will improve performance is opinion.*

Although none of us are fortune tellers, it would be idiotic to think performance won't increase by a decent margin considering how bad the drivers are now. In fact, GCN 1.0 gained so much performance from its 12.11 drivers that TechPowerUp said "The 7870 felt like an entirely different card", and GCN had nowhere near the issues Vega clearly has, buddy:
https://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/


If GCN 1.0 could gain 10-20% in its first year, it is not insane to think Vega could gain the same or even more performance considering how big a departure this architecture is. *Again, I am not saying Vega will become substantially stronger, but it is not at all crazy to think it could.*

The 980 Ti vs Fury debate is a dead horse. Stop beating it. The only thing I will say is that *the Fury X is currently trading blows with the 1070 while the 980 Ti is treading water above the 390X.* If you call that a victory, congratulations.


----------



## Captain_Tom (Jul 25, 2017)

OneMoar said:


> NONE of that is going to happen
> no way in hell underwater with a 400WTDP its barely matching a vanilla 1080



Call me crazy, but I don't think that's how it will shake out in the reviews.   We will see though, and if it is that bad...It's bad lol


----------



## Basard (Jul 25, 2017)

A 120mm rad will have a 150 W cooling capacity on one chip and 250 W on another... it depends on how much heat each chip can tolerate.


----------



## EarthDog (Jul 25, 2017)

Basard said:


> A 120mm rad will have a 150w cooling capacity on one chip and 250w on another.... depends how much heat each chip can tolerate.


Que?


----------



## KainXS (Jul 25, 2017)

Just bringing my popcorn to see how the local AMD Defense Force will defend this.

This entire Vega launch is like a train wreck in slow motion at this point.


----------



## Kronauer (Jul 25, 2017)

Isn't Fire Strike 1.1 a fairly old DirectX 11 benchmark, btw?
I mean, even the year-old Polaris is designed for DirectX 12/Vulkan gaming, yet everybody loses their mind over a DirectX 11 benchmark score.
I'm confident that Vega will show very good fps numbers and prove that these older benchmarks do not represent the real-world performance of a newer graphics card.


----------



## efikkan (Jul 25, 2017)

Captain_Tom said:


> Vega Frontier's drivers ARE terrible.  Learn how to read some reviews, there are bugs everywhere.    This is a fact that the drivers are bad, what is opinion is if better drivers will improve performance.


And what is the evidence that Vega's drivers are any worse than anything else?
This is just the same old excuse; "the drivers are immature, buy it and the performance will come".



Captain_Tom said:


> If GCN 1.0 could gain 10-20% in the first year, it is not insane to think Vega could gain the same or even more performance considering how big a departure this architecture is.


GCN 1.0 was a brand new architecture, while Vega is a slight refinement in comparison.

Don't forget that the competition also improves their drivers, so even if Vega improves 10% over the next year, Nvidia will also improve.



Captain_Tom said:


> The 980 Ti vs Fury debate is a dead horse.  Stop beating it.


You don't like the tough facts, do you?
Fury X (Fiji) offered significantly more computational performance, more memory bandwidth, etc., and yet it was beaten by a much "weaker"-on-paper GTX 980 Ti. As I always say: theoretical figures are irrelevant, only real performance matters. That's why I bring it up, because the same history keeps repeating itself; AMD falls short, but fans claim there is potential to be unleashed (which of course never happens).


----------



## evernessince (Jul 25, 2017)

gdallsk said:


> Yet with far greater power consumption, same performance and slightly higher price tag.



If you ask me people who bought the R9 390 won, simply because they could have cashed out for $500 or more during the height of the ether craze.  The GTX 970 only goes for around $200.


----------



## Slizzo (Jul 25, 2017)

Captain_Tom said:


> Over time this will likely catch up to the 1080 Ti or higher, but it will take many driver revisions.



If you think that driver updates will allow this card to make up a 30% or greater performance deficit you have your blinders on.

Of course my statement assumes that RX Vega will perform similarly to Frontier Edition; which I also assume is a safe bet.


----------



## the54thvoid (Jul 25, 2017)

Kronauer said:


> Isn't FireStirke 1.1 is a fairly old DirectX 11 benchmark btw?
> I mean even the 2nd year old Polaris is designed for DirectX 12/Vullkan gaming, yet everybody loses their mind over a directx 11 benchmark score.
> Im confident that Vega will show very good fps numbers, and prove that these older benchmarks does not represent the real-world performance of a newer graphics card.



DX11 AAA games are still being made.  DX12 has not been the Holy Grail some hoped for.  Besides, Vega cannot improve much over Fiji in DX12; it's already well designed for it.


----------



## cdawall (Jul 25, 2017)

cadaveca said:


> Most companies rate 120 mm as good for 250W+. You can thank Koolance for that.
> 
> 
> http://koolance.com/radiator-1-fan-120mm-30-fpi-copper
> ...



At what noise level? If this releases at a $500 price point to compete with the 1080, I expect the noise level to match, not the 50 dB that the 290X launched with. That gives them the 36 dB and 37 dB of the reference 1070/1080 to at least equal. The 295X2, a reference design with a single 120 mm radiator and a similar power profile (430 W average consumption), was able to maintain 40-41 dB (I don't know how long W1zzard runs this test, for reference...).

I stand by 150 W being a nice, safe, silent number. There is a reason I can passively cool my 5960X if notched down to stock clocks.



Captain_Tom said:


> What denial?  LMAO!
> 
> Can you read?   Most of my posts here center around Vega being a big disappointment so far (And likely overall).  But everything I said still stands:
> 
> ...



I am going to take this point by point.

How do you rate the drivers as bad? Are they crashing? Do you have proof that we will see improvement, or are you just assuming?

Vega is just GCN with an elongated pipeline to allow higher clock speeds, plus the addition of an HBC. Neither of those should require a ground-up driver rewrite, which is further suggested by the use of a Fury driver from the get-go. Consider it GCN 2.1.

A 10-20% gain, which is wishful thinking, puts this even with an AIB 1080, which still consumes less than half the power.

The 980 Ti vs. Fury debate is awful, more so when you try to make the Fury sound better than it is. The reference 980 Ti is barely edged out by the Fury, which sees no improvement in performance with AIB cards in normal situations.







AIB-model 980 Tis still consistently compete with, and often best, 1070s.



Captain_Tom said:


> There is no such thing as a bad product, just bad pricing .
> 
> 
> Let's assume that indeed Vega is a worst-case-scenario, and it's performance will fall short of what it's spec suggested.  If so, I would say this should be the line-up:
> ...



Long shot is a nice way to put "not going to bloody happen"


----------



## EarthDog (Jul 25, 2017)

Maybe I'm myopic, but I'm not betting on drivers improving things enough to reach a 1080 Ti, no way. 5-10%, sure. I'm also not betting on Vulkan being the savior either. 

That is simply too much risk for a consumer: waiting for market saturation of DX12/Vulkan games and improved drivers.


----------



## cdawall (Jul 25, 2017)

EarthDog said:


> Maybe im myopic but, im not hedging my bets on drivers improving things to reach a 1080ti, no way. 5-10% sure. I'm also not hedging my bets on vulkan being the savior either.
> 
> That is simply too much risk for a consumer to wait for market saturation of dx12/vulkan games and improved drivers.



Considering the Vega FE card is kicking out worse numbers in DX12 than in DX11, that is a good bet.


----------



## efikkan (Jul 25, 2017)

And the myth of AMD's superiority in Direct3D 12 and Vulkan lives on…
There is nothing inherent in these APIs giving GCN an edge; rather, many of the initial games were developed as AMD exclusives. Nvidia also chose to bring the driver-side improvements of Direct3D 12 to all APIs, giving them a lower "relative gain". Over time Nvidia has prioritized Direct3D 12 more, reducing AMD's initial advantage. Besides, nearly all games so far use an abstraction layer instead of the new APIs directly, so we have no evidence to claim AMD has an advantage. The claims that AMD is superior in these APIs are approaching superstition at this point.


----------



## cadaveca (Jul 25, 2017)

cdawall said:


> I stand by 150w being a nice safe, silent number. There is a reason I can passively cool my 5960x if notched down to stock clocks.



I wasn't disagreeing with your sentiment; the fact remains that a 120mm rad can remove a lot of heat, given the right... uh... conditions. Yeah. 



EarthDog said:


> That is simply too much risk for a consumer to wait for market saturation of dx12/vulkan games and improved drivers.



These GPUs should have two focuses, I think: HPC and 4K. What you mention isn't even in the picture. The question is, how many GPUs does AMD expect you to buy to get there? If you look at the entire PC ecosystem right now, it's obvious that AMD wants to sell high-end users more than a single GPU. The average consumer gets one GPU, maybe capable of 2560x1080 at more than reasonable framerates. Aiming for any other performance target is unrealistic. You have to position yourself properly in the market, both in performance/price and in matching the consumer base you intend to sell to. Understanding who that audience is really explains everything, but I see very few people who truly understand where AMD is headed here. I'd like to blame AMD for this, but unfortunately, I know better.


I see a lot of reverse hype going on lately, and it annoys the crap out of me, but only because people don't see things for what they obviously are.


----------



## cdawall (Jul 25, 2017)

cadaveca said:


> I wasn't disagreeing with your sentiment; the fact remains that a 120mm rad can remove a lot of heat, given the right... uh... conditions. Yeah.



Hehe you mean like the 120x38mm delta that is spec'd for the rad you posted lol


----------



## xkm1948 (Jul 25, 2017)

Well, during heavy load the Fury X radiator can become too hot to touch as well, and that is at just stock speed.

Anyway, the Vega train is DOA. Move on. People should be gearing up for the Volta/Navi train now.


----------



## Fluffmeister (Jul 25, 2017)

cdawall said:


> 980Ti vs Fury debate is awful, more so when you try and make the fury sound better than it is. The reference 980Ti is barely edged out by the fury, which sees no improvement in performance with AIB cards in normal situations.
> 
> 
> 
> ...



It's funny, isn't it. I enjoy revisiting the comments section of the GTX 1080 review (among others); the main argument then was that it was barely faster than AIB 980 Tis.

Short memories, I guess.


----------



## cdawall (Jul 25, 2017)

Fluffmeister said:


> It's funny isn't it, I enjoy revisiting the comments section of the GTX 1080 review (among others), their main argument then was that was barely faster than AIB 980 Ti's.
> 
> Short memories I guess.



Very short. I mean, what do people expect when you further neuter Kepler... it's also why I have stood behind the 1070/1080 being midrange.


----------



## Basard (Jul 25, 2017)

EarthDog said:


> Que?


Well, that's how I understand it to be, anyway. An Intel chip can run up to 95°C, and AMD's FX chips have something like a 72°C max. If both have a TDP of 125 W, you will have a hell of a time cooling your FX chip with a "125 W" cooler, because when they give a cooler a "wattage TDP" they are usually talking about "when used with Intel chips" (because those can handle more heat)... It's just marketing...

Or am I just unaware of some scientific way they use to describe a cooler's cooling capacity?


----------



## EarthDog (Jul 25, 2017)

Basard said:


> Well, that's how I understand it to be anyways.  An Intel chip can run up to 95C and  AMD's FX chips have something like 72C max.  If both have a TDP of 125w you will have a hell of a time cooling your FX chip with a "125w" cooler because when they give a cooler a "wattage TDP" they are usually talking about "when used with Intel chips (because they can handle more heat)".... It's just marketing....
> 
> Or am I just unaware of some scientific way they use describe a cooler's cooling capacity?


Now I see what you were trying to get at! 

Capacity of a rad/heatsink is its capacity, period.

Let me ask you this... which is hotter, a yellow flame from a lighter or the yellow flames of a bonfire? (A: both are the same temperature.)

Now let me ask... which has more energy, that lighter or the bonfire? (A: the bonfire.)

See where I'm going with this? 

The temps are a product of that heatsink as well as many other variables: things like the core material, what's on the die making the heat, the size of it, the thermal paste under the IHS, the IHS itself, the thermal paste above the IHS... and then we get to the heatsink, then the fans.

You can't really compare temps between AMD and Intel.


----------



## Basard (Jul 25, 2017)

@EarthDog So, basically, if you put a 125-watt cooler on two different chips, then set up both systems to feed exactly 125 W to each chip (FX vs. Intel), the temps at load will be TjMax on both chips?  How do they measure the capacity of any given cooler?


----------



## EarthDog (Jul 25, 2017)

A heat plate with a specific load is how they are tested, I'd assume. Not sure, honestly.

As far as CPU temperature goes, that would vary by CPU and its TDP. In theory, a processor with a lower TDP than the heatsink's rating shouldn't reach TjMax, and vice versa with a higher one. But again, soooo many other variables...

Think of it the other way around: a rad/heatsink's given capacity doesn't change (assuming no other variables change), yet temperatures between the exact same model of CPU vary, let alone between completely different brands like AMD and Intel with two different processes/parameters for their substrate and for getting the heat out to the IHS.


----------



## N3M3515 (Jul 26, 2017)

This is going to release over a year later than the 1080, at the same performance, but with much worse perf/watt??
Damn, AMD... I expected over 1080 Ti perf... 

Price should be $400 max.


----------



## Footman (Jul 26, 2017)

Like Ryzen, AMD will have to come in at an attractive price point to sell Vega.

Ryzen has not proven to be any faster than equivalently clocked and cored Intel CPUs; however, it is hugely cheaper. I finally upgraded my own personal PC to a Ryzen R5 1600X.

I don't expect Vega to set any speed records, but it needs to be at least as fast as the 1080 with a price somewhere between the 1070 and 1080 for success.

Firestrike scores show a similarity to the 1080, so now we need to hope for a price at or around the $450-$499 mark.

I am planning on buying Vega myself, I need better performance than the RX 580 to drive my 2560x1440 IPS 144hz Freesync monitor. I will not be happy if performance is at 1080 levels with a higher cost of entry along with higher power requirements.

Just saying...


----------



## Aenra (Jul 26, 2017)

ERazer said:


> that gotta hurt for ppl that waited this long



Not at all.
While decidedly part of an ever-shrinking minority, I don't have the mindset of a juvenile, nor their customs or habits. "Above X FPS" performance is more than good enough for me, leaving it to brand preference, price vs. performance, and/or voicing my arguments as a customer in the only way I have available to me: by paying. And I can assure you I _will_ be paying for one of these.

People are free to stick to their pixelated pew pew, RGB lightz, and 'does it come from Asus' mentalities; they are entitled to them. 
They should, however, refrain from generalising. Not all of us are, and think like, "gamerz", thank God.


----------



## Brusfantomet (Jul 26, 2017)

And here I was hoping that the RX vega would be really good, guess it is down to pricing now.




EarthDog said:


> Capacity of a rad/heatsink is its capacity, period.



At a given ΔT for the system. The same cooler rated at 125 W for a chip with a ΔT of 80 K will deliver less on a chip with a ΔT of 60 K: 93.75 W, actually. How, you ask?
Since this is not LaTeX, I will simplify the Q-dot and m-dot signs to Q and m.

The equation then becomes: Q = m*Cp*ΔT
where:
Q = cooling capacity [kW]
m = mass flow rate [kg/s]
Cp = specific heat capacity [kJ/(kg·K)]
ΔT = the temperature difference [K] from the cooled thing to the ambient air.

m and Cp are given by the cooler. In a normal tower cooler m can be increased with a fan blowing more air over it, but for our experiment we keep the same fan at the same speed, which means m*Cp is a constant.
ΔT for the first chip is 80 K (100 °C - 20 °C), and Q at this ΔT is 0.125 kW, giving us:

0.125 = m*Cp*80

hence m*Cp = 0.125/80 = 0.0015625

Now, using the lower ΔT of 60 K (80 °C - 20 °C), we get the following:

60 * 0.0015625 = 0.09375, and 0.09375 kW is 93.75 W.

This means that a cooler rated at 125 W for a ΔT of 80 K will only remove 93.75 W when the ΔT is 60 K.
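A quick sketch of the scaling argument above, with the same numbers (the helper name is mine, purely for illustration): with the fan fixed, m*Cp is constant, so capacity scales linearly with ΔT.

```python
def capacity_at_delta_t(rated_w: float, rated_dt_k: float, new_dt_k: float) -> float:
    """Scale a cooler's rated capacity (W) from its rated delta-T (K) to a new delta-T (K).

    Assumes the same fan at the same speed, so m*Cp stays constant.
    """
    m_cp = rated_w / rated_dt_k  # W/K, constant for a fixed fan/flow
    return m_cp * new_dt_k

# 125 W cooler rated at dT = 80 K (100 C chip, 20 C ambient),
# re-evaluated for a chip limited to 80 C (dT = 60 K):
print(capacity_at_delta_t(125.0, 80.0, 60.0))  # 93.75
```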

So no, 





> Capacity of a rad/heatsink is its capacity, period.


 unless you have a fixed ΔT it is not.




Basard said:


> Or am I just unaware of some scientific way they use describe a cooler's cooling capacity?



From Wikipedia: Cooling capacity


----------



## EarthDog (Jul 26, 2017)

Yes, when you change variables its rating would change. That's inferred, no? Oops, said... #131.

In other words, a heatsink rated at xxx W with a xx °C delta will yield different temps on different CPUs, but it's the CPU (and all the other variables) causing the difference in temperature between different CPUs with the same heat load.

Heatsinks aren't tested initially with chips; outside of computer modeling, it's hot plates with specific heat loads. It's essentially saying: this chunk of metal can dissipate xxx W at a delta-T of xx °C. When you put something under it at the same heat load with different substrate materials, die sizes, paste, IHS, etc., it's going to hit a different temperature due to those OTHER variables. I think what you are trying to say is that the amount of wattage a heatsink can handle will vary based on the max temp you want out of it, which is of course true, but not what I'm saying. 

Am I missing something still?


----------



## Basard (Jul 26, 2017)

@EarthDog  I dunno... you started it lol.... 
@Brusfantomet  Yeah!  What you said, with the math numbers and stuff!


----------



## EarthDog (Jul 26, 2017)

I wasnt asking you for this answer...lol!


----------



## Th3pwn3r (Jul 26, 2017)

Brusfantomet said:


> And here I was hoping that the RX vega would be really good, guess it is down to pricing now.
> 
> 
> 
> ...



What about constant temps versus peak? CPUs don't run at the same temperature forever is what I'm getting at. The ability to cool xxx watts of heat for eternity, or for an hour? My wording isn't great, but maybe you understand what I mean: being able to sustain the cooling capacity at its maximum ability, basically.


----------



## Prima.Vera (Jul 26, 2017)

Price this at 1060 levels and they will sell like hotcakes.


----------



## cadaveca (Jul 26, 2017)

Brusfantomet said:


> The equation then becomes this: Q =m*Cp*ΔT
> where:
> Q = cooling capacity [kW]
> m = mass rate [kg/s]
> ...



Woah, buddy.

Your calculations are incorrect for this scenario, as they are for a closed-loop _refrigerant-based_ cooling system. I went to school for this stuff, so when I saw this I nearly spat my tea all over my desk: refrigerant calculations in this thread. You need a far more complicated equation. Mass in your equation refers to the refrigerant flow, not airflow; you cannot increase "m" with a bigger fan, because that's not the purpose of this equation. "m" is the flow rate of the refrigerant, in this case the water, not the air across the rad (which is why it is rated in kg/s, and not CFM). Cp is actually the specific heat of the refrigerant, i.e. the water in the loop, not the cooler. We use various refrigerants, which is why it works this way. Delta-T is the change _across the evaporator (i.e., from the inlet to the outlet)_, not the cooler's difference from ambient. Delta-T refers to how the refrigerant changes, and is not TD, which is what you are referring to. Delta-T always refers to temperature changes within the same medium, not differences between two media. Many people get this bit wrong.


It's not often the stuff I learned in school I get to put to practice... but thanks.
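For what it's worth, a minimal sketch of the coolant-side reading of the same Q = m*Cp*ΔT equation, as described above: m is the water flow through the loop and ΔT is the rise from inlet to outlet. The heat load and flow rate numbers are made up purely for illustration.

```python
CP_WATER = 4186.0  # J/(kg*K), specific heat of liquid water

def loop_delta_t(heat_w: float, flow_lpm: float) -> float:
    """Temperature rise (K) of the coolant across the block for a given
    heat load (W) and water flow rate (L/min); 1 L of water ~ 1 kg."""
    m_dot = flow_lpm / 60.0  # L/min -> kg/s
    return heat_w / (m_dot * CP_WATER)

# A hypothetical 250 W GPU on a loop flowing 1 L/min:
print(round(loop_delta_t(250.0, 1.0), 2))  # ~3.58 K rise, inlet to outlet
```

Note how small the coolant-side ΔT is compared with the cooler-vs-ambient temperature difference the earlier post used; that is the distinction being made.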


----------



## the54thvoid (Jul 26, 2017)

cadaveca said:


> Woah, buddy.
> 
> Your calculations are incorrect to this scenario, as they are for a closed-loop _refrigerant-based_ cooling system. I did go to school for this stuff, so I saw this, I nearly spit my tea all over my desk seeing refrigerant calculations in this thread. You need a far more complicated equation; mass in your equation refers to the refrigerant flow, not airflow. You cannot increase "m" with a higher fan; that's not the purpose of this equation. "m" is the flow rate of the refrigerant, in this case, water, not the air across the rad (which is why it is rated in Kg/s, and not CFM). The Cp is actually the specific heat of the refrigerant, or the water in the loop, not the cooler. We use various refrigerants, which is why it works this way. Delta-T is the change _across the evaporator (ie, from the inlet to the outlet)_, not the cooler's difference from ambient. Delta-T refers to how the refrigerant changes, and is not TD, which is what you at referring to. Delta-T always refers to temperature changes within the same media, not differences between two media. Many people get this bit wrong,
> 
> ...



Thanks, teach!  The glaring thing to the layperson would be the use of delta-T. I hadn't read the initial post (saw equations and bugged out), but I read your critique of it, and it was that misuse of delta-T that struck a chord. I think I've known since I bought my first cooler that it meant the change/difference in temp of the chip (or point of measurement on the chip) from idle to full load.


----------



## ratirt (Jul 26, 2017)

I see not much has changed in regard to posting. Always trying to prove who's right, who said it first, and who predicted what. Kinda boring, I'd say. I was hoping for a knowledge ride from all the people here who consider themselves experts in every sense (some know more about a graphics chip than the company producing it does). Amusing predictions.
It will be out soon, then we can talk about it. The 1080 was Vega's performance point; it always has been. Some even say they predicted that, which is funny. How can you predict something that was stated and announced by the company producing the chip?
In my opinion, the confusion is at its highest now.
Vega was aiming at the 1080, and it hit it (well, from what we know so far). Maybe the power consumption isn't satisfying, but AMD was never great at that.
THE DELAY.
Well, it is delayed, and there have been so many assumptions.
First, HBM is expensive, and that might have delayed delivery of the memory, or simply the demand was big and a lack of supply caused it.
Second, driver issues. Might be true, since AMD is hiring people for driver development, but keep in mind the console market is in AMD's hands now; that's almost twice as many customers as the PC market.
And third, they are tweaking Vega to boost performance to match or come close to the 1080 Ti. I'm sure AMD is at least trying to accomplish that. Whether they succeed I don't know, but I think Vega has the potential, and they can squeeze more out of it.

As for me, all of the above can be true, but also none of it. There are so many things you people may not be aware of about what's going on at AMD, and that's the one fact here that's correct. Soon it will be out, and it will all become clear.


----------



## las (Jul 26, 2017)

Captain_Tom said:


> What denial?  LMAO!
> 
> Can you read?   Most of my posts here center around Vega being a big disappointment so far (And likely overall).  But everything I said still stands:
> 
> ...



A custom 980 Ti beats the Fury X with ease, out of the box. A custom 980 Ti is ~20% faster than reference, which is what the Fury X is compared to in 99% of tests. The Fury X can't OC; you might gain 5% if lucky, with a huge increase in power.

A fully clocked custom 980 Ti is pretty much on par with a 1080 FE, 30-40% faster than reference here. Go see TPU's 980 Ti reviews for proof.

I had a 1080 briefly but returned it because it felt like a side-grade coming from a 980 Ti @ 1.5+ GHz.


----------



## Frick (Jul 26, 2017)

KainXS said:


> Just bringing my popcorn to see how the local AMD Defense Force will defend this.



Fisticuffs, obviously.


----------



## I No (Jul 26, 2017)

Prima.Vera said:


> Price this to 1060 levels and they will sale like hotcakes.




Hell, let's hope they price it @ 1030 levels. Why not? That's 1080 perf for $75. /s All that HBM and cooling is going to be a tad steep from a production point of view.
Hard to see AMD making any money off these. After all this time, they can't even match Maxwell's efficiency. Gotta hand it to them: PR got everyone hyped up, and they delivered a lemon, the same performance you could have gotten a year ago. Fun stuff.


----------



## nemesis.ie (Jul 26, 2017)

RejZoR said:


> Still don't quite understand why haven't they just slammed two Polaris GPU's on a single card with internal CrossfireX. Or just shrunk the Fury X and clock it higher and call it a day.





cdawall said:


> HBM is a waste of money a fat polaris card with GDDR5X would have happily competed with the 1080.





Captain_Tom said:


> What about Hawaii?
> 
> Hawaii wiped the floor with Kepler and Grenada even managed to stay competitive with Maxwell.



Something is really strange here. If the performance doesn't get an uplift (or the intent of the card (chip) is really to compete in another market (AI etc.)), it's pointless (edit: for gaming, other than Freesync/if you put in more than one, but then the potential heat issue gets worse).

My 2 x 290X get roughly the same Fire Strike score: two almost-three-year-old cards at 600 MHz less (each; that's a 60% uplift for Vega's clock) and only 25% more power consumption for the same result, and Vega doesn't have to use CrossFire? Weird stuff. Although the die area is not much larger than Hawaii's, the compute performance is supposed to be > 2x... 

As mentioned, 2 x RX 580 would use the same or less power and produce a similar score for ~€500-600 at non-gouged RRP.

I can't believe all this money and R&D time/effort would be poured into a new architecture only to effectively end up with 2 x 290X/RX 580 in one slot.

There must be more to this, or gaming is just an afterthought: "we can do it, and it'll be good with Freesync and no CrossFire issues, but the main focus of Vega is pro/AI/compute".

Anyway, we should know more over the next few days.

I'm still a little hopeful the numbers posted (if real) are for the lowest-tier card/older API/old drivers.


----------



## Brusfantomet (Jul 26, 2017)

EarthDog said:


> Yes, when you change variables its rating would change. Thats inferred, no? Oops, said... #131.
> 
> In other words, a heatsink rated at xxxW with a xxC delta will yield different temps on different cpus...but its the cpu (and all other variables) which is causing the difference in temperature between different cpus with the same heatload.
> 
> ...



I think we are almost in agreement. My point was that if chip A is designed for a Tj of 95 °C while another is designed for 70 °C, and the power consumed by both chips is the same, say 125 W, then a cooler capable of cooling chip A at 125 W will exceed the Tj of chip B. Therefore, your line in post #129:


EarthDog said:


> Capacity of a rad/heatsink is its capacity, period.


becomes a bit wrong.

Then again, as cadaveca points out, I am not completely correct myself; the closest I get to cooling is on a hobby basis with computers. 

Also, so that not all of this post is OT: if the new RX Vega can handle a higher Tj, a single 120 mm rad could be enough.


----------



## Slizzo (Jul 26, 2017)

ratirt said:


> And third. They are tweaking Vega to boost performance to match or come close to 1080 TI. I'm sure AMD is at least trying to accomplish that. if they succeed I don't know but I think Vega has the potential and they can squeeze more outta it.



Again, from what we've seen of Vega Frontier Edition, if you expect drivers to make up a 30%+ difference in performance you guys need to adjust your thinking.


----------



## EarthDog (Jul 26, 2017)

Brusfantomet said:


> I think we are almost in agreement, my point was that if chip A i designed for a Tj of 95 °C while another is designed for 70 °C, if the power consumed by both chips are the same, say 125 W a cooler that is capable of cooling chip A at 125 W will exceed the Tj of chip B, therefore, your line in post #129:
> 
> becomes a bit wrong.
> 
> ...


Buuuut the heatsink's ability doesn't change. It's the product underneath it that has the limits, ARTIFICIALLY lowering its effectiveness. Again, the HS properties do not change, all other variables remaining the same. Heatsinks aren't measured by what temps they can achieve on CPUs, but by the amount of heat they can dissipate at a given delta over ambient. 

I digress as well... OT.


----------



## efikkan (Jul 26, 2017)

Prima.Vera said:


> Price this to 1060 levels and they will sale like hotcakes.


Then how will AMD make a profit?


----------



## DRDNA (Jul 26, 2017)

I have been meditating on this whole thing, and I think just maybe this is the introduction to what AMD is going to do NEXT with this technology. I have a feeling they are running this release as a huge mass recruitment of people who buy into this new GPU to be their de facto testers for this whole strange mess of a release... I can't see any other reason for what is happening, unless this card is their entry-level card! AMD and this GPU release are very confusing as to what they are up to.


----------



## RejZoR (Jul 26, 2017)

Slizzo said:


> Again, from what we've seen of Vega Frontier Edition, if you expect drivers to make up a 30%+ difference in performance you guys need to adjust your thinking.



Let me ask you something: how fast do you think a GTX 1080 Ti would be if you slammed GTX 980 drivers on it? We're not talking game optimizations here; we never were. It's about drivers actually working with an all-new architecture. This is the other thing people don't seem to realize. It's always GCN this, GCN that, apparently thinking that if it's GCN they don't even have to write new drivers at all. C'mon, people, that's not how things work. Vega FE was never meant to be a gaming card. Period. Look at the pro benchmarks they did on Vega FE: it works spectacularly well. But for gaming, a total dud. Now, what does that tell you? That something is very wrong with the chip, or with drivers that (as far as games are concerned) don't even work, let alone work well? And yet people go on and on about how drivers just can't deliver any changes anymore. That's like slamming R9 290X drivers on a GTX 1080 and expecting them to do anything. What makes you think Vega doesn't require the same level of driver attention as a jump between NV and AMD would? Vega is a huge leap from the Hawaii or Fiji core. You can't just slam a driver on it and, voila, magically get insane performance out of thin air.


----------



## cdawall (Jul 26, 2017)

RejZoR said:


> Let me ask you something, how fast do you think GTX 1080Ti would be if you slam GTX 980 drivers on it? We're not talking game optimizations here, never had. It's drivers actually working with all new architecture. This is the other thing people don't seem to be realizing. It's always GCN this, GCN that, apparently thinking that if it's GCN, they don't even have to write new drivers at all. C'mon people, are you dumb? It's not how things work. Vega FE was never meant to be a gaming card. Period. Look at pro benchmarks they did on Vega FE. It works spectacularly well. But for gaming, a total dud. Now, what does that tell you? That something is very wrong with the chip or with the drivers that (as far as games are concerned) don't even work, let alone work well? And yet people go on and on about same thing that drivers just can't deliver any changes anymore. That's like slamming R9 290X drivers on GTX 1080 and just expect it to do anything. What makes you think Vega doesn't require same level of attention from driver side as it would jumps between NV and AMD? Vega is a huge leap from Hawaii or Fiji core. You can't just slam a driver on it and voila, it'll magically make insane performance out of freaking thin air.



Considering Pascal is just an efficient Maxwell it would likely be just fine.

Two years ago AMD showed us Vega actively running. In two years AMD can't make a driver?

In four years, will they finally have it nailed down as they move away from slightly modified GCN? I mean, how long do they need? This market isn't for lagging behind.


----------



## EarthDog (Jul 26, 2017)

RejZoR said:


> Let me ask you something, how fast do you think GTX 1080Ti would be if you slam GTX 980 drivers on it? We're not talking game optimizations here, never had. It's drivers actually working with all new architecture. This is the other thing people don't seem to be realizing. It's always GCN this, GCN that, apparently thinking that if it's GCN, they don't even have to write new drivers at all. C'mon people, are you dumb? It's not how things work. Vega FE was never meant to be a gaming card. Period. Look at pro benchmarks they did on Vega FE. It works spectacularly well. But for gaming, a total dud. Now, what does that tell you? That something is very wrong with the chip or with the drivers that (as far as games are concerned) don't even work, let alone work well? And yet people go on and on about same thing that drivers just can't deliver any changes anymore. That's like slamming R9 290X drivers on GTX 1080 and just expect it to do anything. What makes you think Vega doesn't require same level of attention from driver side as it would jumps between NV and AMD? Vega is a huge leap from Hawaii or Fiji core. You can't just slam a driver on it and voila, it'll magically make insane performance out of freaking thin air.


Buuuuuuuuuuuuuuut a 1080 never had 980 Ti drivers on it. Nor will Vega have Polaris drivers...

Certainly optimizations are in store down the road, but a 30% change after release is unheard of. Several %... I'm with you.

What the Vega Pro results tell me is that THOSE drivers are optimized for the Pro card and not gaming. Those are also NOT the drivers that RX Vega will be using on release day. Only time will tell, but I can't make that leap with you... not logical.


----------



## jabbadap (Jul 26, 2017)

Nordichardware has some rumors about pricing. 7000 SEK(~$850) without VAT and 9000 SEK(~$1098) inc. VAT.


----------



## SPLWF (Jul 26, 2017)

I'm still getting it, but an AIB version.  Remember, AMD drivers mature really well.  The R9 290(X)/390(X)/Fury still perform to this day.


----------



## I No (Jul 26, 2017)

SPLWF said:


> I'm still getting it, but the AIB versions. Remember, AMD drivers mature really well. The R9 290(X)/390(X)/Fury still perform to this day.




Gee, and I wonder why that's the case. Could it be that half of the line-up was actually a rebrand, and they all share the same architecture? This whole "FineWine" BS will have to stop at some point. AMD does that because they cannot sustain (budget-wise) a whole new arch every generation. If they decide to streamline Vega as their next "backbone" arch, you can kiss the aggressive support for older cards buh-bye.


----------



## ratirt (Jul 26, 2017)

Slizzo said:


> Again, from what we've seen of Vega Frontier Edition, if you expect drivers to make up a 30%+ difference in performance you guys need to adjust your thinking.


I didn't say it will happen, but I'm sure AMD is at least giving it a shot since the 1080 Ti showed up. I'm sure this is what others think too. Whether AMD can pull this one off, I don't know. We'll find out soon.
I don't need to adjust my thinking, because it is my thinking: getting all the information into one picture, and it's reasonable from my perspective at least. That may be what the delay is about, but not necessarily, and I know a lot of people on this forum have been saying this. I don't know if that's what it is. Nobody knows, not even you; it's all speculation. So please don't say nothing is possible. Not all of it is drivers, you know.



RejZoR said:


> Let me ask you something: how fast do you think a GTX 1080 Ti would be if you slammed GTX 980 drivers on it? We're not talking game optimizations here; we never were. It's about drivers actually working with an all-new architecture. This is the other thing people don't seem to realize. It's always GCN this, GCN that, apparently thinking that if it's GCN, they don't even have to write new drivers at all. C'mon people, are you dumb? That's not how things work. Vega FE was never meant to be a gaming card. Period. Look at the pro benchmarks they did on Vega FE. It works spectacularly well. But for gaming, it's a total dud. Now, what does that tell you? That something is very wrong with the chip, or that the drivers (as far as games are concerned) don't even work, let alone work well? And yet people go on and on about the same thing, that drivers just can't deliver any changes anymore. That's like slamming R9 290X drivers on a GTX 1080 and just expecting it to do anything. What makes you think Vega doesn't require the same level of attention on the driver side as a jump between NV and AMD would? Vega is a huge leap from the Hawaii or Fiji core. You can't just slam a driver on it and, voila, it'll magically make insane performance out of freaking thin air.



It's like Intel and its 7700K vs. 7800X: similar cores, and look at the performance difference; the cache arrangement changed, and look at how it affected IPC. I agree with you. Nothing has been decided yet. RX Vega may be faster, but not necessarily. We'll find out. I honestly hope it will be faster than the 1080 by a noticeable margin. Let's hope.


----------



## Parn (Jul 27, 2017)

A 375 W TDP compared to the ~180 W of a GTX 1080? Hmm, no thanks. 

Even if the price is £50 lower than that of a GTX 1080, I think I'll still pass. Having a 375 W TDP GPU in the box requires a better PSU and better cooling for the other components, and these translate to an overall higher price.


----------



## ratirt (Jul 27, 2017)

Parn said:


> A 375 W TDP compared to the ~180 W of a GTX 1080? Hmm, no thanks.
> 
> Even if the price is £50 lower than that of a GTX 1080, I think I'll still pass. Having a 375 W TDP GPU in the box requires a better PSU and better cooling for the other components, and these translate to an overall higher price.


I don't exactly know what you are after, but when putting a rig together even two years ago, if you wanted a decent PC experience you had to go for a PSU of at least 650 W, which is enough for Vega for sure. All I need to do is swap the card, with no other changes to my rig, so I can't see the higher price. Unless you're talking about power consumption costs.


----------



## EarthDog (Jul 27, 2017)

The problem there... it isn't 2 years ago... and that isn't quite true either. The 980 was a great 1080p/1440p card 2 years ago at 165 W. A quality 500 W PSU was plenty for that, an Intel i7, and overclocking, while still having headroom. A 375 W GPU and an Intel chip can mean a different PSU if you are in the 600 W or less range... especially when overclocking both CPU and GPU.


----------



## ratirt (Jul 27, 2017)

EarthDog said:


> The problem there... it isn't 2 years ago... and that isn't quite true either. The 980 was a great 1080p/1440p card 2 years ago at 165 W. A quality 500 W PSU was plenty for that, an Intel i7, and overclocking, while still having headroom. A 375 W GPU and an Intel chip can mean a different PSU if you are in the 600 W or less range... especially when overclocking both CPU and GPU.


I know it's not; it's just that I don't understand the argument. Look at your PSU and his: 1000 W and 660/750 W. What are you guys complaining about, more power draw meaning more money spent? If it's not about power consumption costs, then this is pointless. 375 W, and "OMG, what PSU would I buy for that?" Both of you have way more power than needed. If Intel is stressing your PSU too much, go with Ryzen, which doesn't, but with your PSUs I wouldn't worry.

BTW, the 2-year mark I mentioned? I don't recall anyone buying a PSU lower than 650 W. Anyone. It's like 650 W and up is the standard. I'm not even sure whether the power demands of a PC are getting lower over time; I guess they should be, but more performance equals more power.


----------



## EarthDog (Jul 27, 2017)

Look... your assertion that you "must" have "at least" a 650 W PSU two years ago for a "decent" PC is patently false; that is my point.

As I went on to say, anyone with a 600 W PSU or less will likely want an upgrade if they decide to use the 375 W (stock) monster.

As for me... I'm a reviewer; the review rig with its 1 kW PSU is of ZERO matter in this case. My daily driver has 750 W... I used to run a 500 W card (295X2) and a 5820K, both overclocked, a couple of years back. But this, I thought clearly, wasn't about me or him specifically but about general users/use. Come on, get your head out of the minutiae and look from a broader perspective...


----------



## Slizzo (Jul 27, 2017)

ratirt said:


> I know it's not; it's just that I don't understand the argument. Look at your PSU and his: 1000 W and 660/750 W. What are you guys complaining about, more power draw meaning more money spent? If it's not about power consumption costs, then this is pointless. 375 W, and "OMG, what PSU would I buy for that?" Both of you have way more power than needed. If Intel is stressing your PSU too much, go with Ryzen, which doesn't, but with your PSUs I wouldn't worry.
> 
> BTW, the 2-year mark I mentioned? I don't recall anyone buying a PSU lower than 650 W. Anyone. It's like 650 W and up is the standard. I'm not even sure whether the power demands of a PC are getting lower over time; I guess they should be, but more performance equals more power.



Be aware that a PSU running at 50% capacity is usually running at its most efficient. I have an 850 W PSU; do I need it? Hell no. But I know the PSU will last a long while, as it's not being stressed at all, and it is providing its cleanest power at its current load.


----------



## EarthDog (Jul 27, 2017)

Slizzo said:


> Be aware that a PSU running at 50% capacity is usually running at its most efficient.


This is true, but the efficiency difference between 20% and 90% load is VERY small. On your PSU, it's less than 1%. A 650 W unit is plenty for 99% of people running a single GPU and CPU with (ambient) overclocking. It would have saved you $40 and you wouldn't have noticed a thing. 

I prefer to save money on the initial outlay. It's why I ran (run) a 750 W PSU with a 500 W card and a 140 W CPU... and I overclocked both. I did see the fan spin up!!! That's it. Most people would have run out and bought AT LEAST a 900 W PSU (AMD recommends 1 kW). I don't believe in overbuying on the PSU unless you plan on adding another card in the future for SLI/CFX...
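To put numbers on the "bigger PSU is barely more efficient" point above, here is a minimal sketch. The efficiency figures and the 400 W load are illustrative assumptions (roughly 80 Plus Gold territory), not measurements of any specific unit; real curves come from the PSU's spec sheet or reviews.

```python
# Sketch: wall-power difference between two PSU loading points.
# Efficiency values below are assumptions, not measured data.

def wall_draw(dc_load_w, efficiency):
    """AC power pulled from the wall for a given DC load."""
    return dc_load_w / efficiency

# Same 400 W DC load on an 850 W unit (~47% load, near the curve's peak)
# vs. a 650 W unit (~62% load, a touch off the peak).
draw_850 = wall_draw(400, 0.90)
draw_650 = wall_draw(400, 0.89)

print(round(draw_850, 1), round(draw_650, 1))  # → 444.4 449.4
```

A one-point efficiency gap works out to roughly 5 W at the wall, which is why oversizing the PSU purely for efficiency saves almost nothing.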


----------



## N3M3515 (Jul 27, 2017)

I've had a 750 W PSU for at least 6 years, and I'm no reviewer or power user; I just like to be prepared. Anyone who has bought a high-end GPU since 2010 likely has at least a 650 W PSU...

That being said, I don't like those 375 W of Vega... too much. Now that I pay the power bill, I can feel the extra bucks the GPU packs.
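The power-bill complaint above is easy to put a rough figure on. The hours per day and price per kWh here are assumptions for illustration; plug in your own numbers.

```python
# Rough sketch of the yearly power-bill delta between a 375 W and a 180 W card.
# 3 hours/day of load and $0.15/kWh are assumed values, not real tariffs.

def yearly_cost(card_watts, hours_per_day=3, price_per_kwh=0.15):
    kwh_per_year = card_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

delta = yearly_cost(375) - yearly_cost(180)
print(f"~${delta:.0f} extra per year")  # → ~$32 extra per year
```

Noticeable on a bill, but small next to the cards' purchase-price gap, which is why the performance-per-watt argument usually matters more for heat and noise than for money.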


----------



## ratirt (Jul 27, 2017)

EarthDog said:


> Look.. your assertion you "must" have "at least" a 650w psu two years ago for a "decent" pc is patently false is my point.
> 
> As i went on to say, anyone with a 600w psu or less, will likely want an upgrade if they decide to use the 375w(stock) monster.
> 
> As for me... im a reviewer..the review rig with 1kw psu is of ZERO matter in this case. My daily driver with 750w.... i used to run a 500w card (295x2) and 5820k, both overclocked a couple years back. But this, i thought clearly, wasn't about me or him specifically but about general users/use. Come on, get your head out of the minutia and look from a more broad perspective...


OK, I'm not, and that is why I got 650 W. Why is it that whatever you guys say must be correct? I haven't seen a PSU lower than 650 W for ages, and that is what I'm sharing. And of course I get these answers.
For example, I've got a 780 Ti. Stock, it needs at least a 600 W PSU, and its release date was 2013; that's for stock, and an OC'd one like mine would require more. Please stop dismissing everything people say just because you think differently.



N3M3515 said:


> I've had a 750 W PSU for at least 6 years, and I'm no reviewer or power user; I just like to be prepared. Anyone who has bought a high-end GPU since 2010 likely has at least a 650 W PSU...
> 
> That being said, I don't like those 375 W of Vega... too much. Now that I pay the power bill, I can feel the extra bucks the GPU packs.


I totally agree. I don't like it either, but it's not surprising that people buy stronger PSUs even if they are not power users or anything. If you bought less, it's probably because whoever put your computer together didn't want to strain your wallet excessively. The fact remains that 650 W has been the minimum for god knows how long now.




Slizzo said:


> Be aware that a PSU running at 50% capacity is usually running at it's most efficient.  I have an 850W PSU, do I need it? Hell no. But I know the PSU will last a long while as it's not being stressed at all, and it is providing its' cleanest power at it's load, and it is running its' best at current load.


That's the other side of the coin. The minimum doesn't mean 50% of the PSU's capacity; it's still just a minimum. You don't need it, yet you got it, so I guess you need it for some reason.


----------



## EarthDog (Jul 27, 2017)

Rat... lol... man, get over it. I disagreed with your point and supported my stance on why. You can choose to agree or disagree. What you see is what you see and what I see is what I see. That doesn't mean either of us is right in this case (opinions can of course be wrong!). 

I'm glad you haven't seen a PSU lower than 650 W and that you shared it. All I very clearly stated was "...anyone with a 600 W PSU or less..." in TWO posts...


EarthDog said:


> ...can mean a different PSU if you are in the 600 W or less range... especially when overclocking CPU and GPU





EarthDog said:


> ...anyone with a 600 W PSU or less will likely want an upgrade if they decide to use the 375 W (stock) monster.



Those people, the lot of them you see with 650 W+, won't have to upgrade, as you said. Those that have less will need to consider it or do it. It's really that simple; that was my point.



ratirt said:


> The fact remains that 650 W has been the minimum for god knows how long now.


It depends, man... the 980 "requires" a 500 W PSU. It can be said, as a fact, that more people buy the second-tier and midrange cards than the flagship, and those require less power and PSU capacity as well... Two years ago the 900 series was almost a year old (980). The 700 series you are referencing lists 600 W down to the 770 as well; below that, it's 500 W or less again.

Many are prepared with what they have, but many may not be.


----------



## ratirt (Jul 27, 2017)

EarthDog said:


> Rat... lol... man, get over it. I disagreed with your point and supported my stance on why. You can choose to agree or disagree. What you see is what you see and what I see is what I see. That doesn't mean either of us is right in this case (opinions can of course be wrong!).
> 
> I'm glad you haven't seen a PSU lower than 650 W and that you shared it. All I very clearly stated was "...anyone with a 600 W PSU or less..." in TWO posts...
> 
> ...


And I supported mine, saying 650 W is the minimum and has been for quite a while. So I can say the same thing: get over it, Dog. What do you want me to say, that you are right? Well, you are not.
Yeah, it does need that much, but that doesn't change a thing. The PSU is one of the important parts of a computer.


----------



## EarthDog (Jul 27, 2017)

You really don't need to say anything. I just have a differing opinion and supported my assertion. My apologies if that struck a nerve.



ratirt said:


> You are right? Well you are not.


I am spot on with my main talking point... 


EarthDog said:


> As I went on to say, anyone with a 600 W PSU or less will likely want an upgrade if they decide to use the 375 W (stock) monster.



I'm not going to keep debating how many 650 W+ PSUs are in the wild versus fewer than that... it's an impossible thing to settle. So I simply explained why I thought otherwise... you did the same... except you appear to have taken offense at the discussion...


----------



## efikkan (Jul 28, 2017)

RejZoR said:


> Let me ask you something, how fast do you think GTX 1080Ti would be if you slam GTX 980 drivers on it? We're not talking game optimizations here, never had. It's drivers actually working with all new architecture. This is the other thing people don't seem to be realizing. It's always GCN this, GCN that, apparently thinking that if it's GCN, they don't even have to write new drivers at all. C'mon people, are you dumb? It's not how things work.


Irrelevant.
Drivers translate API calls into ISA operations. A driver from a previous generation would either work or not work: if part of the ISA changed, the old driver wouldn't merely work badly, it would fail outright.



RejZoR said:


> Vega FE was never meant to be a gaming card. Period.


You know very well it was never intended for gamers, but for game developers. Vega FE and RX Vega will perform similarly in gaming.



RejZoR said:


> Look at pro benchmarks they did on Vega FE. It works spectacularly well. But for gaming, a total dud. Now, what does that tell you? That something is very wrong with the chip or with the drivers that (as far as games are concerned) don't even work, let alone work well? And yet people go on and on about same thing that drivers just can't deliver any changes anymore.


That's nothing new. There have always been some compute benchmarks where GCN does well. The mistake is your claim that Vega only underperforms because the drivers are immature, which is ridiculous.


----------



## ratirt (Jul 29, 2017)

Found this; I wonder if you guys have had a chance to read it.
http://wccftech.com/amds-raja-kodur...e-rx-vega-gaming-cards-july-launch-confirmed/
In the article, AMD's Jason Evangelho said that gamers should not draw conclusions about RX Vega gaming performance based on Vega FE benchmarks. That tells a lot, in my opinion.


----------



## Th3pwn3r (Jul 29, 2017)

efikkan said:


> Then how will AMD make a profit?


AMD will be lucky to break even, considering how much crap Vega is getting (and is in).


----------



## mandelore (Aug 2, 2017)

Hopefully we will see what Vega 64 can do in CrossFire, even if AMD is backing away from it, because at this moment my 2014 CrossFire setup on _stock_ settings is pulling around 24k in Fire Strike 1.1.

Vega does not look like a viable upgrade so far, but if the CrossFire performance is there, for the stated price I think I'd grab two of them, slap some custom waterblocks on, and enjoy a bit more breathing room with the expanded memory usually enjoyed by the green side. Which is also a bit ridiculous, since technically between a 295X2 and an R9 290X I have 12 GB of GDDR5.


----------



## EarthDog (Aug 2, 2017)

There isn't anything technical about it. While you can add up the VRAM and get 12 GB, you only have 4 GB usable, as the VRAM buffer on each card holds mirrored data.
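The mirroring point above can be sketched in a few lines. This is an illustrative model of alternate-frame rendering (the typical CrossFire/SLI mode of that era), where every GPU keeps a full copy of the frame's assets; `usable_vram_gb` is a made-up helper, not any real API.

```python
# Sketch: why multi-GPU VRAM doesn't add up under alternate-frame rendering.
# Each GPU holds a full, mirrored copy of textures and buffers, so the
# usable pool is the smallest single buffer, not the sum.

def usable_vram_gb(cards_gb):
    marketing_total = sum(cards_gb)  # what the box math suggests
    effective = min(cards_gb)        # what a game can actually address
    return marketing_total, effective

# A 295X2 (two 4 GB GPUs) plus an R9 290X (4 GB):
total, usable = usable_vram_gb([4, 4, 4])
print(total, usable)  # → 12 4
```

So the "12 GB" setup behaves like a single 4 GB card as far as texture budgets go.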


----------

