# AMD Radeon Pro Vega Frontier Edition Unboxed, Benchmarked



## Raevenlord (Jun 28, 2017)

A lucky customer has already gotten his hands on one of these coveted, sky-powered AMD graphics cards, and is currently in the process of setting up his system. Given the absence of review samples from AMD to any outlet - a short Vega Frontier Edition supply ensured so - there isn't any other real way to get impressions on this graphics card. As such, we'll be borrowing Disqus' user #define posts as a way to cover live pics and performance measurements of this card. Expect this post to be updated as new developments arise.

After some glamour shots of the card were taken (which really are justified by its unique color scheme), #define mentioned the card's build quality. After having installed the driver package (which, as we've covered today, includes both a developer and gaming path inside the drivers, granting increased performance in both workloads depending on the enabled driver profile, he is now about to conduct some testing on SPECViewperf and 3DMark, with both gaming and non gaming profiles. 



 

 

 

 



Specs of the system include an Intel Core i7 4790K (apparently at stock 4GHz), an ASUS Maximus VII Impact motherboard, and 16 GB (2x8) of Corsair Vengeance Pro Black DDR3 modules, running at 2133 MHZ, and a 550 W PSU.



 

 





*Update 1:* #define has made an update with a screenshot of the card's score in 3DMark's FireStrike graphics test. The user reported that the Pro drivers' score "didn't make sense", which we assume means are uncooperative with actual gaming workloads. On the Game Mode driver side, though, #define reports GPU frequencies that are "all over the place". This is probably a result of AMD's announced typical/base clock of 1382 MHz and an up to 1600 MHz peak/boost clock. It is as of yet unknown whether these frequencies scale as much with GPU temperature and power constraints as NVIDIA's pascal architecture does, but the fact that #define is using a small case along with the Frontier Edition's blower-style cooler could mean the graphics card is heavily throttling. That would also go some way towards explaining the actual 3DMark score of AMD's latest (non-gaming geared, I must stress) graphics card: a 17,313 point score isn't especially convincing. Other test runs resulted in comparable scores, with 21,202; 21,421; and 22,986 scores. However, do keep in mind these are the launch drivers we're talking about, on a graphics card that isn't officially meant for gaming (at least, not in the sense we are all used to.) It is also unclear whether there are some configuration hoops that #define failed to go through.





 





*Update 2:* # After fiddling around with Wattman settings, #Define managed to do some more benchmarks. Operating frequency should be more stable now, but alas, there still isn't much information regarding frequency stability or throttling amount, if any. He reports he had to set Wattman's Power Limit to 30% however; #define also fiddled with the last three power states in a bid to decrease frequency variability on the card, setting all to the 1602 MHz frequency that AMD rated as the peak/boost frequency. Temperature limits were set to their maximum value.



 

Latest results post this non-gaming Vega card around the same ballpark as a GTX 1080:



 

For those in the comments going about the Vega Frontier Edition professional performance, I believe the following results will come in as a shock. #define tested the card in Specviewperf with the PRO drivers enabled, and the results... well, speak for themselves.

#define posted some Specviewperf 12.1 results from NVIDIA's Quadro P5000 and P6000 on Xeon machines, below (in source as well):



 

 

And then proceeded to test the Vega Frontier Edition, which gave us the following results:





So, this is a Frontier Edition Vega, which isn't neither a professional nor a consumer video card, straddling the line in a prosumer posture of sorts. And as you know, being a jack of all trades usually means that you can't be a master at any of them. So let's look at the value proposition: here we have a prosumer video card which costs $999, battling a $2000 P5000 graphics card. Some of its losses are deep, but it still ekes out some wins. But let's look at the value proposition: averaging results between the Vega Frontier Edition (1014,56 total points) and the Quadro P5000 (1192.23 points), we see the Vega card delivering around 80% of the P5000's performance... for 50% of its price. So if you go with NVIDIA's Quadro P5000, you're trading around a 100% increase in purchase cost, for a 20% performance increase. You tell me if it's worth it. Comparisons to the P6000 are even more ridiculous (though that's usual considering the increase in pricing.) The P6000 averages 1338.49 points versus Vega's 1014,56. So a 25% performance increase from the Quadro P6000 comes with a price tag increased to... wait for it... $4800, which means that a 25% increase in performance will cost you a 380% increase in dollars.

*Update 3:*

Next up, #define did some quick testing on the Vega Frontier Edition's actual gaming chops, with the gaming fork of the drivers enabled, on The Witcher 3. Refer to the system specs posted above. he ran the game in 1080p, Über mode with Hairworks off. At those settings, the Vega Frontier Edition was posting around 115 frames per second when in open field, and around 100 FPS in city environments. Setting the last three power states to 1602 MHz seems to have stabilized clock speeds.



 

 

 

 

 



*Update 4:*

#define has now run 3D Mark's Time Spy benchmark, which uses a DX12 render path. Even though frequency stability has improved on the Vega Frontier Edition due to the change of the three last power states, the frequency still varies somewhat, though we can't the how much due to the way data is presented in Wattman. That said, the Frontier Edition Vega manages to achieve 7,126 points in the graphics section of the Time Spy benchmark. This is somewhat on the ballpark of stock GTX 1080's, though it still scores a tad lower than most.



 



*View at TechPowerUp Main Site*


----------



## Supercrit (Jun 28, 2017)

That acrylic "R" made me want to get it...


----------



## MilesMetal (Jun 28, 2017)

Do you have a link to the Disqus article? 

Thanks, 
Miles.


----------



## Liviu Cojocaru (Jun 28, 2017)

Looks good  , can't wait to see some bench results


----------



## robert3892 (Jun 28, 2017)

Looking forward to the benchmarks


----------



## Raevenlord (Jun 28, 2017)

MilesMetal said:


> Do you have a link to the Disqus article?
> 
> Thanks,
> Miles.



It's in the source =)


----------



## renz496 (Jun 28, 2017)

17k on fire strike? that is 1070 territory?


----------



## P4-630 (Jun 28, 2017)

renz496 said:


> 17k on fire strike? that is 1070 territory?



Graphics score you should look at, it's 22.916 of this vega card, my 1070 OC'd scores 21.169 so yeah....


----------



## robert3892 (Jun 28, 2017)

Looks like his PSU is too low in wattage for a card like this


----------



## fullinfusion (Jun 28, 2017)

Let's wait to see if its in developer or gaming path mode.


----------



## Nokiron (Jun 28, 2017)

fullinfusion said:


> Let's wait to see if its in developer or gaming path mode.


He explicitly said it was in game mode since "pro mode made no sense".

This is initially pretty darn disappointing...


----------



## XiGMAKiD (Jun 28, 2017)

Looks much better compared to images from AMD's own marketing slide. I'm curious about the thermal and acoustic performance, hope it's not a hot screamer


----------



## robert3892 (Jun 28, 2017)

Nokiron said:


> He explicitly said it was in game mode since "pro mode made no sense".
> 
> This is initially pretty darn disappointing...



If I understood him correctly he's using a 550 watt PSU which isn't enough to drive a VEGA card with that TDP. That might be why his scores aren't so good.


----------



## fullinfusion (Jun 28, 2017)

Nokiron said:


> He explicitly said it was in game mode since "pro mode made no sense".
> 
> This is initially pretty darn disappointing...


Thanks, well looking at that system I think that's to much of a card for what's he's got..


----------



## Raevenlord (Jun 28, 2017)

Nokiron said:


> He explicitly said it was in game mode since "pro mode made no sense".
> 
> This is initially pretty darn disappointing...



Remember that this graphics card isn't geared for gaming. It's geared for professional workloads, while allowing developers to use a special code path in the drivers to run in gaming environments with a performance acceptable enough for them to test the result of their development.

Since the change is made by a driver toggle, I think it's safe to assume there is a limit to the number of "gaming" flags you can enable, software-wise, inside the driver. These drivers weren't optimized for gaming performance into oblivion, like Radeon drivers are. That is probably the reason for the score, and probably part of the reason why AMD didn't send review samples. These aren't meant to be representative of Vega's gaming performance, but are interesting to look at.


----------



## Nokiron (Jun 28, 2017)

robert3892 said:


> If I understood him correctly he's using a 550 watt PSU which isn't enough to drive a VEGA card with that TDP. That might be why his scores aren't so good.





fullinfusion said:


> Thanks, well looking at that system I think that's to much of a card for what's he's got..


Yeah, sure. A 550W Corsair PSU can't handle a stock 4790K with a Vega FE. Please.


----------



## Nokiron (Jun 28, 2017)

Raevenlord said:


> Remember that this graphics card isn't geared for gaming. It's geared for professional workloads, while allowing developers to use a special code path in the drivers to run in gaming environments with a performance acceptable enough for them to test the result of their development.
> 
> Since the change is made by a driver toggle, I think it's safe to assume there is a limit to the number of "gaming" flags you can enable, software-wise, inside the driver. These drivers weren't optimized for gaming performance into oblivion, like Radeon drivers are. That is probably the reason for the score, and probably part of the reason why AMD didn't send review samples. These aren't meant to be representative of Vega's gaming performance, but are interesting to look at.


It's still initially disappointing in comparison to the competition. Even in comparsion to Quadros.


----------



## P4-630 (Jun 28, 2017)

robert3892 said:


> If I understood him correctly he's using a 550 watt PSU which isn't enough to drive a VEGA card with that TDP. That might be why his scores aren't so good.



If the PSU wouldn't be adequate he wouldn't be able to finish a benchmark....


----------



## bug (Jun 28, 2017)

Two eight pin connectors. More power hungry than Titan Xp...


----------



## fullinfusion (Jun 28, 2017)

Nokiron said:


> Yeah, sure. A 550W Corsair PSU can't handle a stock 4790K with a Vega FE. Please.


Lol let me guess, ah nevermind...


----------



## KainXS (Jun 28, 2017)

well he only has a 550W PSU when AMD recommends 850W for this card.(bet 750 would be fine though)


----------



## Raevenlord (Jun 28, 2017)

Nokiron said:


> It's still initially disappointing in comparison to the competition. Even in comparsion to Quadros.



Won't argue with that. Just wanted to make it clearer that these probably aren't representative of RX Vega performance.  No need to have broken dreams yet =)


----------



## fullinfusion (Jun 28, 2017)

P4-630 said:


> If the PSU wouldn't be adequate he wouldn't be able to finish a benchmark....


Don't kid yourself, I had a Corsair 850 psu and a pair of 7970's and it ran the benches.. well it completed them but...The psu was old


----------



## hojnikb (Jun 28, 2017)

LOL, x2 8 pin and a single blower fan. I'm sure this isn't going to be loud at all


----------



## Nokiron (Jun 28, 2017)

KainXS said:


> well he only has a 550W PSU when AMD recommends 850W for this card.(bet 750 would be fine though)


Are you really serious? If that card pulls the maximum supposed "300W", he still has 250(!!)W for the CPU and rest of the components.

A system with a 4790K would pull 100W at most.


----------



## P4-630 (Jun 28, 2017)

fullinfusion said:


> Don't kid yourself, I had a Corsair 850 psu and a pair of 7970's and it ran the benches.. well it completed them but the numbers were low. The psu was old and after installing a new psu the numbers went up



Hmm... I thought you would see a black screen while benchmarking.


----------



## RejZoR (Jun 28, 2017)

hojnikb said:


> LOL, x2 8 pin and a single blower fan. I'm sure this isn't going to be loud at all



Well, it depends. AMD is known to overengineer power delivery. Has been like this and still is. The card may not actually need so much power, but they've given it anyway. I mean, maybe they found out that it's cheaper to use same board layout for both, water cooled and air cooled model and just limit it with BIOS than design 2 separate boards. Just an idea. You can than stick water cooling block on Vega FE and have a beefy power delivery to really push it.

Btw, anyone realized what they did with the name? Radeon FE (Frontier Edition), GeForce FE (Founders Edition)...


----------



## bug (Jun 28, 2017)

RejZoR said:


> Well, it depends. AMD is known to overengineer power delivery. Has been like this and still is. The card may not actually need so much power, but they've given it anyway. I mean, maybe they found out that it's cheaper to use same board layout for both, water cooled and air cooled model and just limit it with BIOS than design 2 separate boards. Just an idea. You can than stick water cooling block on Vega FE and have a beefy power delivery to really push it.
> 
> Btw, anyone realized what they did with the name? Radeon FE (Frontier Edition), GeForce FE (Founders Edition)...


I'm afraid that's not the care here. You can't get a PCI-SIG certification with two eight pin connectors, so if anyone uses them, they're needed.
Professional cards don't need overclocking headroom. They're built for stability above all else. And no, you don't achieve stability through more amps.


----------



## KainXS (Jun 28, 2017)

Nokiron said:


> Are you really serious? If that card pulls the maximum supposed "300W", he still has 250(!!)W for the CPU and rest of the components.
> 
> A system with a 4790K would pull 100W at most.



I would pick up an 750W PSU If I were getting Frontier Edition card, but if I had a quality 650W PSU I would probably keep it, 550W is not enough though to me.


----------



## Frick (Jun 28, 2017)

I still think AMD should give that design to their reference gaming cards.


----------



## Solid State Brain (Jun 28, 2017)

KainXS said:


> I would pick up an 750W PSU If I were getting Frontier Edition card, but if I had a quality 650W PSU I would probably keep it, 550W is not enough though to me.



If people actually measured their system's power consumption during stressing loads and paid attention to the specifications of the 12v rails they would quickly realize that very powerful PSUs aren't really a strict requirement.

GPU manufacturers have to account for cheap PSUs that aren't capable of delivering what they promise.


----------



## P4-630 (Jun 28, 2017)

Solid State Brain said:


> GPU manufacturers have to account for cheap PSUs that aren't capable of delivering what they promise.



I don't think that's how it works....


----------



## OSdevr (Jun 28, 2017)

Solid State Brain said:


> If people actually measured their system's power consumption during stressing loads and paid attention to the specifications of the 12v rails they would quickly realize that very powerful PSUs aren't really a strict requirement.



If people actually put an oscilloscope probe to their GPUs power supply they would quickly realize that they can pull far more than their TDP for brief periods.


----------



## XiGMAKiD (Jun 28, 2017)

RejZoR said:


> Btw, anyone realized what they did with the name? Radeon FE (Frontier Edition), GeForce FE (Founders Edition)...



Just realize that now


----------



## bug (Jun 28, 2017)

I think this an interesting review idea: take a high-end video card and test it with increasingly smaller PSUs. Do it with quality and no name units.


----------



## RejZoR (Jun 28, 2017)

I don't get it how can PSU power be an issue when you can get freaking 850W LC Power Arkangel unit which is pretty darn decent for 100 freaking €. Had one as test unit for when my old mobo died and it's pretty nice looking unit, single rail, modular, really quiet and I heard CWT makes its guts. Making excuses about being "unable" to buy better PSU is just lazy. Besides, if you really care about you rpower delivery, you buy Corsair, Seasonic, BeQuiet or the likes...


----------



## DarkHill (Jun 28, 2017)

P4-630 said:


> I don't think that's how it works....



that is EXACTLY how it works.

For many years the PSU watt-athon was in full force with even the crappiest producers "making" 1000watt psus it was howver only peak power while sustained wattage might be as low as half.

in the west this is fortunately amost gone now, but in east? probably still in full force.

Stick a watt-meter on your PC and see how the powerdraw actually is, then go check the TDP of your current card and look again at the watt-meter. Ill await your surprised comment.


----------



## Nokiron (Jun 28, 2017)

OSdevr said:


> If people actually put an oscilloscope probe to their GPUs power supply they would quickly realize that they can pull far more than their TDP for brief periods.


And that's is absolutely no issue whatsoever for a decent power supply. The power supply in question is more than capable to deal with a Vega FE.


----------



## EarthDog (Jun 28, 2017)

P4-630 said:


> Graphics score you should look at, it's 22.916 of this vega card, my 1070 OC'd scores 21.169 so yeah....


But wait!!!!!!!!! Drivers aren't mature and this dude didn't potentially didn't set up configurations for it......... Ugh, the speculation to play both sides to the middle........... bleh.



robert3892 said:


> Looks like his PSU is too low in wattage for a card like this





robert3892 said:


> If I understood him correctly he's using a 550 watt PSU which isn't enough to drive a VEGA card with that TDP. That might be why his scores aren't so good.


Guys, GPUs don't 'brown out' and get "slower" because of an inadequate PSU...


----------



## Prima.Vera (Jun 28, 2017)

Frontier Edition....Founders Edition.... WTH is with this BS from both manufacturers??


----------



## OSdevr (Jun 28, 2017)

Nokiron said:


> And that's is absolutely no issue whatsoever for a decent power supply. The power supply in question is more than capable to deal with a Vega FE.


We don't know if the power supply in question IS a decent power supply! It is only described as a "550W PSU".


----------



## ZoneDymo (Jun 28, 2017)

Prima.Vera said:


> Frontier Edition....Founders Edition.... WTH is with this BS from both manufacturers??



Well with Nvidia its a bullshit premium
With this card its just... well its just the card,, just a name, there is no non frontier edition, pretty sure its only used as marketing, claiming this reaches a new frontier or something.


----------



## EarthDog (Jun 28, 2017)

OSdevr said:


> We don't know if the power supply in question IS a decent power supply! It is only described as a "550W PSU".


I dont care if it is filled with cement... GPUs do NOT brown out and slow down if it isn't being fed enough power...


----------



## Kissamies (Jun 28, 2017)

Supercrit said:


> That acrylic "R" made me want to get it...


IMO that looks damn stupid! 



hojnikb said:


> LOL, x2 8 pin and a single blower fan. I'm sure this isn't going to be loud at all


Aaaaand..? Even if it would have 10 plugs, it doesn't mean that they're neccessary. Reminds me of my friend about ~10 years ago, "I don't want a high wattage PSU since it takes more power from outlet".


----------



## Nokiron (Jun 28, 2017)

OSdevr said:


> We don't know if the power supply in question IS a decent power supply! It is only described as a "550W PSU".


It's Corsair. It's visible in the images.

https://www.techpowerup.com/img/WEgEUWXLX6RCvEPU.jpg


----------



## Aenra (Jun 28, 2017)

EarthDog said:


> GPUs do NOT brown out and slow down if it isn't being fed enough power...



He's right guys 

If the PSU wasn't enough, he/we'd know because the PC would have shut down/potentially damaged itself. It didn't shut down, ergo it sufficed.

(this isn't to say i agree with everything anyone's ever said about today's market and having a 500W PSU [don't], am just commenting on this specific scenario)


----------



## Kissamies (Jun 28, 2017)

Nokiron said:


> It's Corsair. It's visible in the images.
> 
> https://www.techpowerup.com/img/WEgEUWXLX6RCvEPU.jpg


But what model exactly, that's what is the most important thing. There's more than a little difference in quality between older CX's and Vengeances for example.


----------



## Nokiron (Jun 28, 2017)

9700 Pro said:


> But what model exactly, that's what is the most important thing. There's more than a little difference in quality between older CX's and Vengeances for example.


That's grasping for straws that does not matter.


----------



## TheinsanegamerN (Jun 28, 2017)

EarthDog said:


> I dont care if it is filled with cement... GPUs do NOT brown out and slow down if it isn't being fed enough power...


Pascall GPUs will power throttle (IE run lower/disabled boost) if not fed enough juice. It's entirely possible that vega has the same ability. 

That being said, a decent 550 watt PSU should be able to drive a 300 watt card. I used to run dual 150 watt 550tis on a 550 watt no problem.


----------



## bug (Jun 28, 2017)

Nokiron said:


> And that's is absolutely no issue whatsoever for a decent power supply. The power supply in question is more than capable to deal with a Vega FE.


No, no, no, you got it all wrong.
Based on the evidence at hand we must conclude the consumer Vega will easily beat a 1080Ti while using 150W or less. This card right here was obviously hamstrung. /sarcasm


----------



## P4-630 (Jun 28, 2017)

fullinfusion said:


> Don't kid yourself, I had a Corsair 850 psu and a pair of 7970's and it ran the benches.. well it completed them but the numbers were low. The psu was old and after installing a new psu the numbers went up





EarthDog said:


> Guys, GPUs don't 'brown out' and get "slower" because of an inadequate PSU...



So what is it now!??
LOL!!

Should I buy a PSU with moar watts to get a better 3D Mark score?? 
Or no....


----------



## PerfectWave (Jun 28, 2017)

At least we know that the air gpu is clocked at 1600 mhz. maybe it is the cpu that bottleneck the gpu.


----------



## Nokiron (Jun 28, 2017)

PerfectWave said:


> At least we know that the gpu is clocked at 1600 mhz. maybe it is the cpu that bottleneck the gpu.


Not in 3Dmark, and not a 4790K.

What we do know is that the card probably throttles immensly because of thermals (Powerlimit as well?).


----------



## NGreediaOrAMSlow (Jun 28, 2017)

Nokiron said:


> Are you really serious? If that card pulls the maximum supposed "300W", he still has 250(!!)W for the CPU and rest of the components.
> 
> A system with a 4790K would pull 100W at most.



And I guess you forgot the power supply rating.  Which is a two digit number representing the guarantee load.  If branded PSU, then should be at least 80.  Which means guarantee up to 80% of it's full capacity.

Non branded (generic) PSU are lower than that.

550*.80=440

He has a guarantee load up to 440 Watts.  While the power may reach 550,  pushing it may have side effects.

Also a K processor.  Do you think overclocking it will still consume the same?


----------



## OSdevr (Jun 28, 2017)

Aenra said:


> He's right guys
> 
> If the PSU wasn't enough, he/we'd know because the PC would have shut down/potentially damaged itself. It didn't shut down, ergo it sufficed.
> 
> (this isn't to say i agree with everything anyone's ever said about today's market and having a 500W PSU [don't], am just commenting on this specific scenario)



I've had personal experience tell me otherwise:

I had a i5-3570K (77w TDP, not overclocked) machine with a 600w Thermaltake PSU that worked well until I added a GTX 660 TI (150w). Everything was fine if a game wasn't graphically demanding but as soon as I played one that was the video would look really... odd and my sound would become scratchy and quickly cut out. I tried reinstalling drivers and such but no luck until I tried a larger power supply. To my great surprise that fixed everything.

EDIT: My system froze shortly after the 'symptoms' started and needed to be rebooted.


----------



## Nokiron (Jun 28, 2017)

NGreediaOrAMSlow said:


> And I guess you forgot the power supply rating.  Which is a two digit number representing the guarantee load.  If branded PSU, then should be at least 80.  Which means guarantee up to 80% of it's full capacity (550W).
> 
> Non branded (generic) PSU are lower than that.


No I didnt. Because you are wrong.

The efficiency is calculated from the wall, not internally. A 550W-rated powersupply with 80% efficiency pulls 660W from the wall but still delivers 550W to the components.



OSdevr said:


> I've had personal experience tell me otherwise:
> 
> I had a i5-3570K (77w TDP, not overclocked) machine with a 600w Thermaltake PSU that worked well until I added a GTX 660 TI (150w). Everything was fine if a game wasn't graphically demanding but as soon as I played one that was the video would look really... odd and my sound would become scratchy and quickly cut out. I tried reinstalling drivers and such but no luck until I tried a larger power supply. To my great surprise that fixed everything.


That's a problem with the individual unit. Not the model.

Stop with the non-issue, seriously. It's not the power supply.


----------



## Captain_Tom (Jun 28, 2017)

renz496 said:


> 17k on fire strike? that is 1070 territory?



They showed it off gaming next to a Titan XP, and PCgamer said they were performing almost identically.

That would put it at least 60% stronger than the 1070 even just to be near the 1080 Ti.


Something fishy is going on here...


----------



## fullinfusion (Jun 28, 2017)

P4-630 said:


> So what is it now!??
> LOL!!
> 
> Should I buy a PSU with moar watts to get a better 3D Mark score??
> Or no....


No your missing the point.. think of your mod, how well does it perform at 4.2v vs 3.4v? Understand 

My 850 I had was hitting over 1100w at the wall.. pictures are here on the site somewhere.. it didn't enjoy that pull and after awhile it just ran like it wasn't sure what to do..


----------



## pat-roner (Jun 28, 2017)

KainXS said:


> well he only has a 550W PSU when AMD recommends 850W for this card.(bet 750 would be fine though)



750w are you bonkers?


----------



## KainXS (Jun 28, 2017)

PerfectWave said:


> At least we know that the air gpu is clocked at 1600 mhz. maybe it is the cpu that bottleneck the gpu.



AMD flat out says the peak clock is 1600MHz and the typical is 1382MHz but they never state the base clock. The reference RX 480 was clocked at 1266MHz but you had had to change the power limit so it would not power throttle and change the temp so it would not thermal throttle. So we don't know what clock its at really.



OSdevr said:


> I've had personal experience tell me otherwise:
> 
> I had a i5-3570K (77w TDP, not overclocked) machine with a 600w Thermaltake PSU that worked well until I added a GTX 660 TI (150w). Everything was fine if a game wasn't graphically demanding but as soon as I played one that was the video would look really... odd and my sound would become scratchy and quickly cut out. I tried reinstalling drivers and such but no luck until I tried a larger power supply. To my great surprise that fixed everything.
> 
> EDIT: My system froze shortly after the 'symptoms' started and needed to be rebooted.



That does not sound right, I used my 3770k overclocked with an RX 480@ 1.3V and an 750Ti. Maybe that PSU you had was bad or a cheap PSU.


----------



## Deleted member 172152 (Jun 28, 2017)

Wasn't even running 1600mhz like it should. If it was running at something like 1400mhz that would mean a 1600mhz score of about 19.5k at least. Besides, we don't know what support's like, we don't know how well rx vega will do and most importantly, firestrike isn't an actual game. Lots of if's, dunno's and a guy doing the benchmarks that doesn't understand wattman. Great, now we still don't know how good rx vega is.


----------



## Captain_Tom (Jun 28, 2017)

Hugh Mungus said:


> Wasn't even running 1600mhz like it should. If it was running at something like 1400mhz that would mean a 1600mhz score of about 19.5k at least. Besides, we don't know what support's like, we don't know how well rx vega will do and most importantly, firestrike isn't an actual game. Lots of if's, dunno's and a guy doing the benchmarks that doesn't understand wattman. Great, now we still don't know how good rx vega is.



Why aren't they using the Ultra 4K Firestrike bench?   1080p is almost completely irrelevant in GPU's this expensive.


----------



## NGreediaOrAMSlow (Jun 28, 2017)

Nokiron said:


> Are you really serious? If that card pulls the maximum supposed "300W", he still has 250(!!)W for the CPU and rest of the components.
> 
> A system with a 4790K would pull 100W at most.



100 Watts? More in the 120, 140oc according to this.
http://www.tomshardware.com/reviews/core-i7-4790k-devils-canyon-overclock-performance,3845-9.html


----------



## Nokiron (Jun 28, 2017)

NGreediaOrAMSlow said:


> 100 Watts? More in the 120, 140oc according to this.
> http://www.tomshardware.com/reviews/core-i7-4790k-devils-canyon-overclock-performance,3845-9.html


Does not matter, even with that rating.

You are completely wrong with the power-supply rating by the way. You have misunderstood it completely.


----------



## okidna (Jun 28, 2017)

Looking good for AMD If the promises that "*RX Vega will actually be faster than Frontier version*" is turned out to be true.



NGreediaOrAMSlow said:


> And I guess you forgot the power supply rating.  *Which is a two digit number representing the guarantee load.  If branded PSU, then should be at least 80.  Which means guarantee up to 80% of it's full capacity.*
> 
> Non branded (generic) PSU are lower than that.
> 
> 550*.80=440



Oh wow....

That's NOT what efficiency rating mean. To put it simple : Efficiency = (output power / input power)

More explanation : efficiency is a ratio between the output power (DC power generated by PSU) divided by input power (AC power needed by the PSU from electricity socket). The number will never be 1 (or 100%) because there's always certain amount of power lost during the AC to DC conversion. The higher the efficiency rating = less AC power consumed from the socket, and vice versa.

Simple example : you have a 550W, 85% rated efficiency (all load level for easy example) PSU, and your PC uses 350W of DC power. So, you actually use = (350W / 85%) = 411.76W of AC power from the socket. And when your PC actually uses 550W of DC power, you will pull 647W of AC power from the socket.


----------



## Supercrit (Jun 28, 2017)

9700 Pro said:


> IMO that looks damn stupid!


I think I'm like a bird or something, I love shiny transparent things like gems etc.



NGreediaOrAMSlow said:


> And I guess you forgot the power supply rating.  Which is a two digit number representing the guarantee load.  If branded PSU, then should be at least 80.  Which means guarantee up to 80% of it's full capacity.
> 
> Non branded (generic) PSU are lower than that.
> 
> ...




It's the other way around, if the PSU manufacturer is not a dirty liar. A 550 W PSU should be able to output that amount to the components while pulling 550/0.8 = 687.5 W from the wall (at 80% efficiency).


----------



## uuuaaaaaa (Jun 28, 2017)

MilesMetal said:


> Do you have a link to the Disqus article?
> 
> Thanks,
> Miles.



It's the wccftech comment section on an article.


----------



## Kissamies (Jun 28, 2017)

Well, a new product and drivers are probably far from "ready" yet..


----------



## xkm1948 (Jun 28, 2017)

In case people forgot, this is the leaked version of VEGA running at 1200MHz core








Pay attention to the graphics score, 17801. I plotted a little chart using all the data points available. It looks like a pretty good linear regression line to me. 


When I have time tonight I will do a series of frequency vs GPU score plot for my FuryX.
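A clock-vs-score fit like the one described can be sketched as an ordinary least-squares regression; only the 1200 MHz / 17801 point is from the leak above, and the other clock/score pairs are hypothetical placeholders for illustration:

```python
# Hypothetical (core clock MHz, Fire Strike graphics score) pairs.
# 17801 @ 1200 MHz is the leaked point; the other clocks are assumed.
points = [(1200, 17801), (1382, 21200), (1440, 22986)]

# Ordinary least-squares fit of: score = slope * clock + intercept
n = len(points)
sx = sum(x for x, _ in points)
sy = sum(y for _, y in points)
sxx = sum(x * x for x, _ in points)
sxy = sum(x * y for x, y in points)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

# Extrapolate to the 1600 MHz peak/boost clock
print(round(slope * 1600 + intercept))
```

This is exactly the kind of extrapolation EarthDog pokes at below: it only holds if the score really is linear in clock and nothing else (memory bandwidth, throttling) gets in the way.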


----------



## EarthDog (Jun 28, 2017)

If one extrapolates data from extrapolated data to form another extrapolated data point, how can we go wrong????????????????


----------



## OSdevr (Jun 28, 2017)

KainXS said:


> That does not sound right, I used my 3770k overclocked with an RX 480@ 1.3V and an 750Ti. Maybe that PSU you had was bad or a cheap PSU.


I just cracked it open to see if anything was amiss. Hardly a thorough examination, but no signs of damage. I suppose if there was it wouldn't be working at all though.

Either way, it looks like a GPU can act strangely without sufficient power and not immediately crash the system.


----------



## EarthDog (Jun 28, 2017)

NGreediaOrAMSlow said:


> And I guess you forgot the power supply rating.  Which is a two digit number representing the guarantee load.  If branded PSU, then should be at least 80.  Which means guarantee up to 80% of it's full capacity.
> 
> Non branded (generic) PSU are lower than that.
> 
> ...


Sir... that is NOT how it works... the efficiency rating describes how much power the PSU pulls FROM THE WALL. It DOES NOT take away from the label rating!!!

So, if I was pulling an actual 500 W load on a 90% efficient PSU, I would be pulling about 556 W (500 / 0.9) FROM THE WALL.

Does that help clear things up????



TheinsanegamerN said:


> Pascall GPUs will power throttle (IE run lower/disabled boost) if not fed enough juice. It's entirely possible that vega has the same ability.


They do (NVIDIA GPUs... or Vega... or are we hoping something sticks??). Have any links to this? It is something I have never heard of before... 



OSdevr said:


> Either way, it looks like a GPU can act strangely without sufficient power and not immediately crash the system.


Or, just keep on thinking that is going to happen... 


Let's get our heads out of our...err, the sand, can we?


----------



## KainXS (Jun 28, 2017)

OSdevr said:


> I just cracked it open to see if anything was amiss. Hardly a thorough examination, but no signs of damage. I suppose if there was it wouldn't be working at all though.
> 
> Either way, it looks like a GPU can act strangely without sufficient power and not immediately crash the system.



Normally when a GPU does not get enough power from the PSU, it crashes the system or cuts off via its protection circuitry, but with that system and a 600 W PSU, the PSU must have been bad or cheaply made. Even with this Vega card, a 550 W PSU is not what I would recommend, but if his PSU is a quality one (an RM550 or something) it should run, and the PSU should not be the limiting factor here. Thermaltake years ago made some really shoddy PSUs at their low end; maybe that was your problem.


----------



## R-T-B (Jun 28, 2017)

fullinfusion said:


> Don't kid yourself, I had a Corsair 850 psu and a pair of 7970's and it ran the benches.. well it completed them but the numbers were low. The psu was old and after installing a new psu the numbers went up



I'm sorry, but I'm pretty skeptical that that's what's going on here, or even was in your case.  Low voltages / under-delivered power don't cause low benchmarks; they cause instability, crashes, or even hardware damage.  The most likely outcome of overloading a PSU is actually a hard shutdown, as it won't just "not get enough juice."  PSUs are a pull technology: they will attempt to deliver the requested wattage, or, if built well, shut down when unable.  Whether they do it with acceptable ripple or without an electrical fire is another matter.


----------



## Deleted member 172152 (Jun 28, 2017)

xkm1948 said:


> In case people forgot, this is the leaked version of VEGA running at 1200MHz core
> 
> 
> 
> ...


Still a 1080 score (just look at the graphics score) in a DX11 synthetic bench with a DX12-optimized card that has workstation optimizations as well! Not exactly representative of RX Vega, and the drivers will get some improvements, but it's still a decent score and a pretty good card for a gaming prosumer. New benches soon, apparently, and we should get a few more professional reviews once reviewers get their cards, which they had to buy with their own money.


----------



## dwade (Jun 28, 2017)

So Vega is looking to compete against the upcoming GTX 2060... maybe even GTX 2050 if Nvidia is generous.


----------



## EarthDog (Jun 28, 2017)

Hugh Mungus said:


> Still a 1080 score (just look at the graphics) in a dx11 synthetic bench with a dx12 optimized card with workstation optimizations as well! Not exactly representative of rx vega and the drivers will get some improvements, but it's still a decent score and a pretty good card for a gaming prosumer. New benches soon apparently and we should get a few more professional reviews once reviewers get their cards, which they had to buy with their own money.


I just keep wondering why so many people care about a prosumer card...


----------



## R-T-B (Jun 28, 2017)

EarthDog said:


> I just keep wondering why so many people care about a prosumer card...



Because it's VEGAGAGAGAGAGAGA!

Sorry, I had to.


----------



## rtwjunkie (Jun 28, 2017)

EarthDog said:


> I just keep wondering why so many people care about a prosumer card...


Look at the obsession with UFO, bigfoot, Champ, Loch Ness Monster, Abominable Snowman, etc.  People grab at anything that is in the neighborhood of related.


----------



## TheGuruStud (Jun 28, 2017)

Man, why didn't AMD come up with this brilliant plan faster? Release a new GPU with 40% less IPC, woohoo.

Gimme a break. The retardation of commenters (have you seen vidcardz and wcctardtech) is only rivaled by the people posting this shit show for clicks.


----------



## fullinfusion (Jun 28, 2017)

R-T-B said:


> I'm sorry, but I'm pretty skeptical that that's what's going on here, or even was in your case.  Low voltages / under delivered power don't cause low benchmarks, they cause instabilities, crashes, or even hardware damage.  Most likely outcome of overloading a PSU is actually a hard shutdown, as it won't "not get enough juice."  PSU's are a pull technology, they will attempt to deliver the requested wattage, or if built well, shutdown when unable.  Whether they do it with acceptable ripple or without an electrical fire is another matter.


Don't be sorry, I don't think so either in this case. But my case was an oddity to say the least... I was able to replicate the issue in two different rigs, and when I said it caused lower scores I didn't say how much it lowered them. Dirty and erratic power can cause a bunch of different issues, and in my instance it was just a PSU that was at the end of its life... mind you, this is going back a number of years, and tech has changed a lot since. I'm not going to say a better PSU will raise numbers, but a clean line of power will definitely hold the numbers.


----------



## EarthDog (Jun 28, 2017)

R-T-B said:


> Because it's VEGAGAGAGAGAGAGA!
> 
> Sorry, I had to.





rtwjunkie said:


> Look at the obsession with UFO, bigfoot, Champ, Loch Ness Monster, Abominable Snowman, etc.  People grab at anything that is in the neighborhood of related.


I get that. You can, sort of, maybe, assume that performance out of a pure gaming card should be faster. If this is 1070 speeds (not saying it is) and at 300 W, that isn't great. But again, it's a flagship part, who really cares... just manage the heat in the case and noise, right? I still have to imagine Vega ( or VEGAVEGAVEGAVEGAVEGAVEGAVEGA!! as some call it ), the actual gaming card, to be around 1080 Ti performance or slightly better. But if all of that is true, it has a lot of ground to make up.


----------



## HD64G (Jun 28, 2017)

dwade said:


> So Vega is looking to compete against the upcoming GTX 2060... maybe even GTX 2050 if Nvidia is generous.


There's no chance we are going to see 50% more performance from Pascal to the next gen, even if it's Volta-based and not a refresh of the 10x0 series as some sources suggest. So, keep trolling...


----------



## xkm1948 (Jun 28, 2017)

People try to bench VEGA FE against other gaming cards:

Fanboys: No no no, this is a PROSUMER card. It is NOT for gaming.

People try to bench VEGA FE against other workstation cards:

Fanboys: No no no, this is a PROSUMER card, you cannot compare it to a Quadro, UNFAIR blah blah blah.


People: So can we bench this AGAINST ANYTHING?

Fanboys: I guess NO?

Raja@RTG: YES, this is EXACTLY why we push VEGA as a "prosumer" card. The fanboys will defend us like mad men.


In reality, Vega FE is an abomination. It is neither for gaming nor for content creation. Or it is both for gaming and content creation.

This is some excellent (shitty) marketing right there. Well played, Raja; well played, RTG.




EarthDog said:


> I get that, you can, sort of, maybe, assume that performance out of a pure gaming card should be faster. If this is 1070 speeds (not saying it is) and at 300W, that isn't great. But again, its a flagship part, who really cares... just manage the heat in the case and noise, right? I still have to image Vega ( or VEGAVEGAVEGAVEGAVEGAVEGAVEGA!! as some call it ),* the actual gaming card, to be around 1080Ti performance or slightly better.* But if all of that is true, it has a lot of ground to make up.



In Vulkan and well-optimized DX12 applications, I imagine that will become reality.



Also new sauce:


----------



## Steevo (Jun 28, 2017)

So a card sold not as a gaming card performs poorly in gaming, likely due to:

1) Forced application paths, you know, forcing true color or HDR on everything, forcing redundancy checks on everything to make sure rendered calculations are correct.
2) Immature drivers, and drivers specifically not tuned for gaming performance, which probably have hooks for other applications to read and/or dump data about render stages.
3) Looking at the card, they have probably tuned the profile to keep temps low and fan speed low, as it is essentially a workstation card.
4) Not being a gaming card.

I wonder if people who buy Prius cars are surprised they don't go as well as a Tesla, cause, they are both electric, have four wheels and a steering wheel, have windows that roll up and down, seat belts, and rear seats....




xkm1948 said:


> People try to bench VEGA FE against other gaming cards:
> 
> Fanboys: No no no, this is a PROSUMER card. It is NOT for gaming.
> 
> ...



What a great and helpful post. Did you write it all by yourself? Do you understand what the card is for? Do you understand Vega FE is meant for mid-sized businesses to do things like oil exploration and other modeling? Game companies to use when rendering scenes and dumping data to see where and if any bottlenecks occur? 

Probably not; now run along while the adults talk.


----------



## Deleted member 172152 (Jun 28, 2017)

xkm1948 said:


> People try to bench VEGA FE against other gaming cards:
> 
> Fanboys: No no no, this is a PROSUMER card. It is NOT for gaming.
> 
> ...


Gaming cards can be great for prosumers (basic stuff) but could use a few optimizations. Vega FE has those optimizations to make it a great basic high-end workstation card. Sounds odd, but some programs just don't need all the Quadro/WX drivers or optimizations. Vega FE isn't trying to compete with Quadros or gaming cards. It's meant for people who could use something like a Titan Xp for professional use, only Vega FE is geared more towards the "pro" in prosumer and less towards "rich idiots", who apparently are in the same category as normal prosumers.


----------



## xkm1948 (Jun 28, 2017)

Hugh Mungus said:


> Gaming cards can be great for prosumers (basic stuff) but could use a few optimizations. Vega FE has those optimizations to make it a great basic high-end workstation card. Sounds odd, but some programs just don't need all the quadro/wx drivers or optimizations.



Actual workstation users WILL go for Quadro, as nVidia's driver and tech support are way better than AMD's. Also, this price is not anywhere near cheap.



Steevo said:


> So a card sold not as a gaming card, performs poorly in gaming, likely due to.
> 
> 1) Forced application paths, you know, forcing true or HDR color on everything, forcing redundancy checks on everything to make sure rendered calculations are correct.
> 2) Immature drivers, and drivers specifically not for performance gaming that probably have hooks for other applications to read data and or dump data about render stages.
> ...



Ah, I was waiting for some good old car analogy here. It's kind of impossible for a thread to go on without someone using one.

VEGA FE and RX VEGA use the exact same GPU. Do your lovely Prius and Tesla use the exact same engine? I think not.

And yeah, if reasoning fails, you resort to another good old tactic: personal attack. How mature of you.


----------



## Deleted member 172152 (Jun 28, 2017)

xkm1948 said:


> Actual work station users WILL go for Quandro as nVidia driver and tech support is way better than AMD's.  Also this price is not anywhere cheap.


Surely there is a market for Titans and Vega FE? Haven't seen a single Vega FE in stock yet.


----------



## jabbadap (Jun 28, 2017)

Well, let's see. Fury X FP32 throughput is 2 × 64 CUs × 64 shaders × 1.05 GHz = 8601.6 GFLOPS. Vega FE, at AMD's stated typical core clock, is 2 × 64 × 64 × 1.382 = 11321.3 GFLOPS. Fury X gets a Fire Strike GPU score of 16,081 in Guru3D's test, so from that Vega FE should get at least 16,081 × 11,321.3 / 8,601.6 ≈ 21,166 as a GPU score. From the peak core clock of 1.6 GHz, the same estimate gives a GPU score of about 24,504. To put things in perspective with nVidia's figures from Guru3D: a vanilla GTX 1080 FE gets a GPU score of 21,905 and a vanilla GTX 1080 Ti FE gets 28,340.
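jabbadap's estimate boils down to assuming the Fire Strike GPU score scales linearly with FP32 throughput. A small Python sketch of that assumption (the CU counts, clocks, and the 16,081 Fury X score are the post's own figures):

```python
def gflops(cus: int, shaders_per_cu: int, clock_ghz: float) -> float:
    # 2 FLOPs per shader per clock (fused multiply-add)
    return 2 * cus * shaders_per_cu * clock_ghz

fury_x  = gflops(64, 64, 1.05)    # 8601.6 GFLOPS
vega_fe = gflops(64, 64, 1.382)   # 11321.344 GFLOPS (AMD's typical clock)

fury_x_score = 16081  # Guru3D Fire Strike GPU score for Fury X

print(round(fury_x_score * vega_fe / fury_x))              # 21166 (typical clock)
print(round(fury_x_score * gflops(64, 64, 1.6) / fury_x))  # 24504 (peak clock)
```

Since both chips have the same 64 CU × 64 shader layout, the whole estimate reduces to scaling by the clock ratio; real scores rarely scale perfectly linearly with FLOPS, so treat these as upper-bound ballpark figures.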


----------



## Captain_Tom (Jun 28, 2017)

Before we really pass judgement, I just want to see a FULL review of this thing.


Things like 16 GB of HBM2, HBC (which allows for a 512 TB frame buffer!), Rapid Packed Math, and the rest of the bells and whistles could make this incredible at certain workloads.

I wouldn't be surprised if there were some professional apps that got massive boosts from Vega, and like everyone says we need real game benchmarks (and a stable boost clock) to see how this does in gaming.


----------



## Steevo (Jun 28, 2017)

xkm1948 said:


> Actual work station users WILL go for Quandro as nVidia driver and tech support is way better than AMD's.  Also this price is not anywhere cheap.
> 
> 
> 
> ...



Funny enough, the Prius has an engine; the Tesla has electric motors. 

Where and how do you know the Vega FE and RX have the same GPU? 

The only test to come out of this is from someone who said "I don't know WattMan or these settings, I'm sorry", and the last test was at 1080 levels, on a non-gaming card. 

The only personal part was a response to calling anyone who says to wait, and that this is a prosumer card, a fanboi; so pot and kettle, mate. Based solely on one person, with no knowledge of the product, in a system that is unknown and highly unlikely to be reproducible... it's a bit unfair to call the game at this point.


----------



## Kissamies (Jun 28, 2017)

I don't believe that "the PSU is underrated so the card throttles" theory. When I had a Pentium D 935 @ 4 GHz with a Radeon HD 4850 on a 300 W PSU, the system just shut down if the power draw was too high. I had to undervolt the card to keep the system from shutting down.


----------



## Captain_Tom (Jun 28, 2017)

9700 Pro said:


> I don't believe that "PSU is too underrated so it the card speed throttles" theory, when I had Pentium D 935 @ 4GHz with Radeon HD4850 on 300W PSU, the system just shut down if the power draw was too high. I had to undervolt the card to keep the system from shutting down.



It can happen, buddy, so don't act like this isn't a possible thing (I have seen it in person, like many other people here). Nonetheless, I also somewhat doubt that's what it is.

Doesn't matter though: this guy clearly is a noob, and 3DMark is a waste of energy. Real games and professional apps are what matter.


----------



## Kyuuba (Jun 28, 2017)

It makes my eyes hurt when I read that PSU wattage affects the score. Please.


----------



## Gasaraki (Jun 28, 2017)

LOL, I laugh when people say the PSU is the "bottleneck". That's not how it works...

If the PSU is not supplying enough power, the computer will crash, not get low scores in firestrike. 

OMG.


----------



## Kissamies (Jun 28, 2017)

Captain_Tom said:


> It can happen buddy, so don't act like this isn't a possible thing (I have seen it in person like many other people here).   Nonetheless I also somewhat doubt that's what it is.
> 
> Doesn't matter though:  This guy clearly is a noob, and 3Dmark is a wast of energy.   Real games and professional apps are what matters.


I don't doubt it, but I never had that problem myself. Several years ago I built a Sims PC for my ex-girlfriend with an i3-530 & GTX 470, and it ran without problems on a Delta 350 W PSU. Just saying that I haven't ever experienced anything like that.


----------



## xkm1948 (Jun 28, 2017)

Also, can you game on nVidia Quadro / AMD Pro cards?

Yes, yes you can. And performance is about the same as their gaming counterparts.

So I see no reason not to use VEGA FE as an indicator of RX VEGA gaming performance.




Kyuuba said:


> It makes my eyes hurt when i read PSU wattage affects score. Please.



The denial is strong and the end is nigh


----------



## Italia1 (Jun 28, 2017)

Remember AMD's driver team? It was working to have drivers ready for the Vega launch... Did something go wrong?


----------



## rtwjunkie (Jun 28, 2017)

Italia1 said:


> Remember Amd driver team ? Was working for create ready drivers at Vega launch.... Something gone wrong ?


Which Vega launch is that? This isn't the gaming card release, therefore no gaming drivers yet.


----------



## Deleted member 172152 (Jun 28, 2017)

xkm1948 said:


> Also, Can you game on Nvidia Quadro / AMD Pro cards?
> 
> yes, yes you can. And performance is about the same as their gaming counterpart.
> 
> ...


RX Vega will a) have higher clockspeeds, b) be less like a WX card, and c) have better gaming drivers at launch, since AMD likely left those for last. Dunno why, but Vega FE doesn't seem to know what the hell it's supposed to be, other than a generic prosumer card.

Firestrike is always going to favour Pascal, simply because it's DX11. Vega is designed for DX12 and mainly Vulkan. Then again, DiRT 4, which I believe is a DX11 game, performs really well with an RX 580, so with the right optimizations Vega should run well in any game. Just one problem: not many game devs can be bothered to optimize their games, especially for AMD! At least it won't be as bad as Ryzen 7 at launch. Some games have nearly doubled their 1080p framerates!

Can we at least wait until we get proper reviews and at least one gaming driver update? That way we know AMD has spent time on the gaming drivers and the reviewer actually has a clue what on earth he's doing. 1080 performance in Firestrike is great, but I expect at least 10% better from Vega FE, and much closer to the 1080 Ti with RX Vega, with driver optimizations, increased clockspeeds, etc.


----------



## EarthDog (Jun 28, 2017)

Hugh Mungus said:


> Rx vega will a) have higher clockspeeds and b) be less like a wx card and c) have better gaming drivers at launch since AMD likely left those for last. Dunno why, but vega FE doesn't seem to know what the hell it's supposed to be other than a generic prosumer card.
> 
> Firestrike is always going to favour pascal, simply because it's dx11.


So... higher clockspeeds mean more power use... we are already at 300 and 375 W... getting to be a lot for a single GPU, for sure.

Yes, this is not a pure gaming card. 

Most games are DX11, really. DX12 is making headway, but I wouldn't call it saturated with ~20 games...
https://en.m.wikipedia.org/wiki/List_of_games_with_DirectX_12_support


----------



## Deleted member 172152 (Jun 28, 2017)

EarthDog said:


> so... higher clockspeeds means more power use..we are already at 300 and 375...getting to be a lot for a single gpu for sure.
> 
> Yes, this is not a pure gaming card.
> 
> ...


Wayyy more games have dx12 in one form or another.


----------



## P4-630 (Jun 28, 2017)

If you play mainly Vulkan based games, you're golden!!! 
(not that there are many though.....)


----------



## Ferrum Master (Jun 28, 2017)

So this is the Sweden edition... just as expensive as usual


----------



## GC_PaNzerFIN (Jun 28, 2017)

Why do I have the feeling this is not only outsourced but also a fully paid opportunity to beta test their drivers? Brilliant business move, AMD.


----------



## Tomgang (Jun 28, 2017)

If I am correct, Vega is not meant for gaming, but when I compare numbers from my two GTX 970s in SLI and the GTX 1080 Ti I am planning to upgrade to, those Vega 3DMark scores do not impress me.
Vega scores between 21,000 and 23,000 in GPU score. My two GTX 970s in SLI score 21,539 stock and 25,274 at maximum overclock in GPU score, and the GTX 1080 Ti scores between 26,000 and 29,000. I am so far not impressed by Vega. It might be missing driver optimisation.

And if we go to the numbers, that isn't gonna help: a TDP of 300 or 375 W and a price of $999 or $1,500 US, compared to the GTX 1080 Ti's 250 W TDP and a price from $700 US. The only advantage I see so far for Vega is that it has more VRAM. There may be other features I am not aware of; if yes, please tell me.

In other words: Vega is not gonna change my mind about my GPU upgrade. I will still choose the GTX 1080 Ti.

By the way, here are my system's Fire Strike scores to compare.

Everything runs stock here:

http://www.3dmark.com/3dm/17979523?

This is the OC I run the CPU and GPU at for everyday use:

http://www.3dmark.com/3dm/17980205?

And then I torture my system to its knees:

http://www.3dmark.com/3dm/18112484?

Let's just say I expected more from Vega than this so far, considering the price and the rated TDP.


----------



## rtwjunkie (Jun 28, 2017)

Hugh Mungus said:


> Wayyy more games have dx12 in one form or another.


Since you are "in the know" provide what is not on that list (currently, not planned or future promises).


----------



## EarthDog (Jun 28, 2017)

Hugh Mungus said:


> Wayyy more games have dx12 in one form or another.


It's wiki... it can be off.

Links would be awesome... I'm not going through another thread of 'take Hugh at his word when others posted supporting links' again...

...let's see it, bud.


----------



## Aenra (Jun 28, 2017)

Personally, I don't care if it's gonna be 5% behind the 1080 Ti, or a mere 5% ahead of the 1080. I know the pricing will be extremely competitive, and I know I feel bad for having yet to support them again, as in tangibly. 
Will buy the uber-super-GTi-turbo-injection-PowPow edition, whenever it's out.

(And I'll still feel bad, because Ryzen deserved my money too. Show it with your wallet, or soon enough there won't be anything for you to show anymore and we'll all be buying 4-core Intels for $500 a pop. That's my very simple philosophy.)


----------



## I No (Jun 28, 2017)

My point of view :

1. No driver update in this world would boost it by 30-35% (where the hype train showed it would land). 
2. Prosumer card or not, it's marketed with a "GAMING MODE". Last time I checked, the Titan didn't perform worse than the "gaming" version, although the Titan is clearly aimed at gaming (no pro driver support); it also has some deep-learning usage where it is 20-25% faster than a 1080 Ti.
3. The price is outrageous for what it can do (don't get me wrong, the Titan is a rip-off as well) in both gaming and/or pro use.
4. *AMD did state they won't be supplying the card to reviewers.* Yet they were the ones that made a comparison vs a Titan in the first place (mind you, in pro usage, with the Titan on regular drivers; thanks, nVidia, for that).
5. No sane person would pick this vs a Quadro, or hell, even a WX variant, for pro usage.
6. AMD dropped the ball when they showed a benchmark vs a Titan (it doesn't even matter what the object of the bench itself was). nVidia, being them, *never* showed a slide where the competition is mentioned; they only compare the new gen vs the current gen (which tones down the hype).
7. Apart from the 700 series, every Titan delivered what was promised (and that's because they didn't use the full chip in the Titan).

Long story short: a rushed, unpolished release. Jack-all support. Denying the reviewers a sample. AMD is trying to do what nVidia does with the Titan, but it's clearly not in a position to do so. In other words, AMD just pulled another classic "AMD cock-up". Again, this is my POV; I had high hopes for this... but then again, I should've known better...


----------



## mrthanhnguyen (Jun 28, 2017)

Is this another hype train like last year with the RX 480? The $200 RX 480 would be as fast as a GTX 980 Ti/1070, and ALL FUTURE GAMES would be on DX12 or Vulkan.


----------



## Aenra (Jun 28, 2017)

@I No So basically this is "only your POV" and you don't really care, and anyway Titans/prosumer cards are rip-offs... but simultaneously, you're disappointed, lol?

+1

@mrthanhnguyen I was never aboard any hype train; that's for children and dysfunctionals. I value performance in one kind of rig, price/performance in another, usage depending. I spend my money accordingly and feel good while at it (supporting the "better".. other.. smaller team). The rest is you people exaggerating and/or putting way too much weight on a simple product that, at the end of the day?
Won't change your life either way, sorry


----------



## GC_PaNzerFIN (Jun 28, 2017)

The latest update shows a *4375* rpm fan speed and *81°C* temps in Witcher 3 @ 1600 MHz GPU clock.


----------



## Deleted member 172152 (Jun 28, 2017)

EarthDog said:


> its wiki..it can be off.
> 
> Links would be awesome...I'm not going through another thread of 'take hugh at his word when others posted supporting links' again...
> 
> ...lets see it bud.


Quite a few more large games have DX12 options. There are thousands of games released every year, so good luck getting through all of them, but there definitely are more games with DX12 (options) than BF1 and Forza. DX12 is mainstream and Vulkan is niche, and yet there are roughly as many games of each according to Wikipedia. The Vulkan list may be right, but the DX12 list sure as hell isn't. Just about every new game tested seems to have optional DX12 nowadays, and barely any of them made the list.

Many of the games without an API stated can run in DX12 mode, it seems.


----------



## Steevo (Jun 28, 2017)

xkm1948 said:


> Also, Can you game on Nvidia Quadro / AMD Pro cards?
> 
> yes, yes you can. And performance is about the same as their gaming counterpart.
> 
> ...




Did he turn on ECC? 

http://www.tomsitpro.com/articles/nvidia-quadro-m6000,2-898-2.html

There are a LOT of hardware features that AMD may be giving away now to take the compute market back from Nvidia, so again, gaming means shit on a Pro card. Giving medium-sized businesses better performance for their money is where AMD can make up the ground they lost in compute. 

Also, what end is nigh?


----------



## Agony (Jun 28, 2017)

I was expecting a graphics score of something like 30,000, like an aftermarket 1080 Ti. 22,000 is way too low.


----------



## xkm1948 (Jun 28, 2017)

GC_PaNzerFIN said:


> Latest update shows *4375*rpm fan speed and *81*c temps in Witcher 3 @1600MHz gpu.




Geez, that is hot and probably loud.


----------



## I No (Jun 28, 2017)

Aenra said:


> So basically this is "only your POV" and you don't really care and anyway Titans/prosumer cards are rip offs.. but simultaneously, you're disappointed, lol?
> 
> +1
> 
> I was never aboard any hype train, that's for children and dysfunctionals. I value performance in one kind of rig, price/performance in another. I spend my money accordingly and feel good while at it, supporting the "better" guys. The rest is you people exaggerating and/or putting way too much weight in a simple product that end of the day? Won't change your life either way, sorry




Thing is, I'm kind of tired of paying "extra" because AMD kept fueling the hype and under-delivering. This is bad for business; hell, I, like everyone else around these parts, would like not to add a kidney to the price. Everyone wants the best performance/price ratio. But look at it this way: if this is AMD's "top dog", would you be shocked if the next gen nVidia puts out were even more expensive? Are you OK with paying through your teeth because *there is no alternative*? Dunno how you got the idea that I don't care about the pricing of the Titan. My POV still stands regarding this "launch" or whatever the heck this is. nVidia gets their fair share of bashing on this as well (no pro drivers for the Titan, when they denied reviewers a sample stating that "it's not for gaming", which is BS). To support the "better guys" you would actually need at least *two* sides, and at this point AMD is offering last year's performance ($500) for $1000; how is this not shooting yourself in the foot? I did not expect them to beat nVidia to the punch, but at least put up a decent fight...


----------



## Deleted member 172152 (Jun 28, 2017)

Agony said:


> I was waiting something like 30.000 graphics score like 1080ti aftermarket . 22.000 is way to low


Optimized DX11, though, only useful for Prey. Every nVidia-biased DX11 game should perform worse, and newer DX11/DX12 games should do better than a 1080, even in Vega FE's current state. It's still not RX Vega, and there will always be at least 5-10% from driver updates with any GPU, from either nVidia or AMD.

Expect RX Vega to be around half of Vega FE's price for both the air and water versions. All the improvements over the next month or so will make it a 1080-priced, just-about 1080 Ti competitor, with a lot of potential in future optimized DX11, DX12 and Vulkan games. DiRT 4 will love Vega, hopefully!


----------



## Steevo (Jun 28, 2017)

xkm1948 said:


> Geez, that is hot and probably loud.




"
You were asking for pro tests results as follows ((SPECPerfview12_1_1):

Now proper shock. Finished SPECPerfview12_1_1 test on VEGA FE and is almost on par with Quadro P6000 and P5000 with the same settings. Biggest difference is that VEGA FE price was £977 and is running in my PC and both Quadro benchmark results are taken from spec.org as follows:

P6000 
https://www.spec.org/gwpg/g...

P5000
https://www.spec.org/gwpg/g...

...and of course both are in workstations with Xeon CPU's, then no compare to my PC.

VEGA is perfectly prepared for PRO, it behaves completely different no issues at all, it is quiet.. and you get so MUCH performance per dollar is just amazing (compare to Quadro cards - P5000 -£2000, P6000 - £4000 to £5000).

Please look at the screen shots as follows:

1. Vega results - all benchmark frame rates and my platform
2. Vega results - rest of the settings from my platform
3. SPECPerfView12_1_1 - initial settings, as per spec.org I've run it with defaults to make sure that it will match uploaded results on their websites
4. All the numbers on the same image and as you can see most of the time VEGA is on par with those two (again not in workstation but standard PC)."


A $1000 card performing as well as a $5000 green card, and it's quiet........ not throttling.... hmmm. Anything else you wanna bitch about?


----------



## GC_PaNzerFIN (Jun 28, 2017)

Steevo said:


> "
> 
> 
> $1000 card performing as well as $5000 green card, and its quiet........ not throttling.... hmmm. Anything else you wanna bitch about?



In all fairness, I can pretty much guarantee a radial fan pushing air through a fin array at over 4000 rpm is anything but quiet. Laws of physics still apply, even to AMD. No throttling though, and I wouldn't really expect any at that fan speed.


----------



## Ferrum Master (Jun 28, 2017)

xkm1948 said:


> Geez, that is hot and probably loud.



Still better than Hawaii... 14C man.


----------



## Aenra (Jun 28, 2017)

I No said:


> i'm kind of tired to pay "extra" because AMD kept fueling the hype and under-deliver  ...  Everyone wants the best performance/price ratio



You're not forced to do anything, ever; that's one.
On top of that, you have zero, absolutely no reason to even think that the gaming-oriented version will cost more than Nvidia's; don't know where you saw that 'extra', but it goes with what i was saying before. We need to approach this more.. reasonably. Feels like you've added some hyperbole to the equation, then used it to conclude you're right to be disappointed.
Now in terms of the hype.. why did you allow any hype to influence you in the first place? Are we not in the 21st century? Who is it that has yet to learn how marketing is done or, better yet, how much better it is to let the customers do the marketing for the company? You've only yourself to blame if you fell for that, and that's the truth.

As to the latter part i quote above; NO. That's you thinking like that; skewed perceptions, a market gone haywire because that's how they prefer you to think. Do all of you here drive Porsches? Live in $1000000 mansions? No. If you get a second toddler and are in need of a new house, what you gonna do? Buy the mansion or go under the river?
Whom is it that said we must want the 'bestest'? 
And by the way, because 99% i know the reply to this last bit.. if let's say it's 5% slower than the 1080Ti.. is that 'worstest'? 'Fail'? How exactly could you measure anything when all you go by is 'bestest'? Again, rather.. skewed criteria there.

*same goes for price-performance ratio.. who told you that's what everyone wants? Some have money to burn and do so just because, some purchase due to an emotional drive, others due to past impressions/familiarity, etc etc.. this "everyone" is just dead wrong. Keep it limited to what -you- want. And why it is you allowed yourself to form an opinion _before_ it was even out (hype).

Take it or leave it, am not defending them. Am only showing you how this is very, very problematic; and most folks don't even see it no more.


----------



## Steevo (Jun 28, 2017)

GC_PaNzerFIN said:


> In all fairness, I can pretty much guarantee over 4000rpm radial fan pushing air through fin array is anything but quiet. Laws of physics still apply, even to AMD. No throttling though, and I wouldn't really expect at that fan speed.




Not to insult you, but I will take the word of someone with an actual card over your ideas.


----------



## GC_PaNzerFIN (Jun 28, 2017)

Steevo said:


> not to insult you, but I will take the word of someone with an actual card over your ideas.


After 10 years of similar fan and cooling designs, you expect THIS time it changes everything about how radial coolers have worked? Having reviewed and/or owned probably in excess of 100 graphics cards, very many with radial fan designs, I do find your ignorance of reality a bit insulting.


----------



## rtwjunkie (Jun 28, 2017)

Hugh Mungus said:


> Quite a few more large games have DX12 options. There are thousands of games released every year, so good luck getting through all of them, but there definitely are more games with DX12 (options) than BF1 and Forza. DX12 is mainstream, Vulkan is niche, and yet there are roughly as many games of each according to Wikipedia. The Vulkan list may be right, but the DX12 list sure as hell isn't. Just about every new game tested seems to have optional DX12 nowadays and barely any of them made it onto the list.
> 
> Many of the games without an API stated can run in DX12 mode, it seems.


LOL, where to start. DX12 is certainly not mainstream yet, and every new game does NOT seemingly have a DX12 mode.

In any case I haven't yet seen a need for DX12 in any game I have that has it optional. DX11 is working just fine for me...nothing has taxed my system unrealistically yet where DX12 might even be necessary to aid performance.

@I No AMD hasn't hyped this. The legion of rabid AMD fanatic fanboys has.  I don't know why you sound so disappointed. Wait til RX Vega releases before being happy or disappointed.


----------



## Steevo (Jun 28, 2017)

GC_PaNzerFIN said:


> After 10 years of similar fan and cooling designs you expect THIS time it changes everything how radial coolers have worked? Having reviewed and/or owned probably in excess of 100 graphics card, very many with radial fan designs, I do find your ignorance to reality a bit insulting.




I too have owned, worked on, installed and heard many radial blower design cards. Nvidia has managed to keep many quiet, and between the fan blade design, shroud, fins, tolerances and much else, two seemingly identical coolers can be significantly different in how much noise, and what pitch or timbre of noise, is produced. Thanks for calling me ignorant.


----------



## GC_PaNzerFIN (Jun 28, 2017)

Steevo said:


> I too have owned, worked on, installed and heard many radial blower design cards, Nvidia has managed to keep many quiet and between the fan blade design, shroud, fins, tolerance and much else two seemingly identical coolers can be significantly different in noise, and what pitch or timbre of noise is produced. Thanks for calling me ignorant.


How many of your NVIDIA cards have had a fan at over 4000 rpm?

If you want to check the latest similar fan design from AMD, here is TPU's RX 480 review.
Now add 100 W more heat to be moved off an even beefier fin array.


----------



## EarthDog (Jun 28, 2017)

Hugh Mungus said:


> Quite a few more large games have DX12 options. There are thousands of games released every year, so good luck getting through all of them, but there definitely are more games with DX12 (options) than BF1 and Forza. DX12 is mainstream, Vulkan is niche, and yet there are roughly as many games of each according to Wikipedia. The Vulkan list may be right, but the DX12 list sure as hell isn't. Just about every new game tested seems to have optional DX12 nowadays and barely any of them made it onto the list.
> 
> Many of the games without an API stated can run in DX12 mode, it seems.


yawn.. links...


----------



## rtwjunkie (Jun 28, 2017)

GC_PaNzerFIN said:


> How many of your NVIDIA cards have had fan at over 4000rpm?
> 
> Want to check latest similar fan design from AMD, here is TPU RX 480 review.
> Now add 100W more heat to be transferred off even beefier fin array.


I've always taken W1zzard's noise chart with a grain of salt, and you should too. According to that chart my 780 I used to have should have been very audible, but I never even heard it, even at full load.


----------



## GC_PaNzerFIN (Jun 28, 2017)

rtwjunkie said:


> I've always taken W1zzard's noise chart with a grain of salt, and you should too. According to that chart my 780 I used to have should have been very audible, but I never even heard it, even at full load.


No need to trust W1zzard; YouTube has other RX 480 fan speed noise tests.










5000 rpm, 2600 rpm, etc.

The RX 480 doesn't need to push fan speed that far, but Vega FE seems to.


----------



## Steevo (Jun 28, 2017)

GC_PaNzerFIN said:


> How many of your NVIDIA cards have had fan at over 4000rpm?
> 
> Want to check latest similar fan design from AMD, here is TPU RX 480 review.
> Now add 100W more heat to be transferred off even beefier fin array.














The fin array looks longer in the pictures, but the fan itself is what is creating the noise for the most part, and the fan looks different from the RX 480 stock fan in small ways. I'm by no means saying it is silent, but the guy with one said clearly that it's not noisy, and he has a watercooled CPU, so I am guessing if it were noisy he would know the difference.



rtwjunkie said:


> I've always taken W1zzard's noise chart with a grain of salt, and you should too. According to that chart my 780 I used to have should have been very audible, but I never even heard it, even at full load.



5870s were quiet for the day, but I hated the way mine sounded before the water block went on.


----------



## I No (Jun 28, 2017)

Aenra said:


> You're not forced to do anything, ever; that's one
> On top of that, you have zero, absolute no reason to even think that the gaming-oriented version will cost more than Nvidia's; don't know where you saw that 'extra', but it goes with what i was saying before. We need to approach this more.. reasonably. Feels like you've added some hyperbole to the equation, then used it to conclude you're right to be disappointed.
> Now in terms of the hype.. why did you allow any hype to influence you in the first place? Are we not in the 21st century? Who is it that has yet to learn how marketing is done or, even better, how better it is to let the customers do the marketing for the company? You've only yourself to blame if you fell for that, and that's the truth.
> 
> ...



Ok, so basically what you're saying is that I'm not forced to get the product that's best suited for my needs because I'm not forced to do so? Wait, what? What does this have to do with a segment of the market that's running completely unopposed at the time of writing? (This would be where those "high hopes" I mentioned would fit in.)
Hype is "this will blow X product out of the water". Realism is "they had a whole year to get this to at least OK levels", and this isn't OK from where I'm sitting. Most users/people/whatever do actually care about price/performance, more so when we're talking about PC parts. But please permit me to doubt that the same chip in a "gaming" version will do a far better job.
Theory crafting: if it turns out to be 5% below or above the Ti, then good; that means the market will have competition and it will force a reaction, thus pushing the industry further (note Intel's reaction to Ryzen).
People buy whatever is suited to their needs. If they feel the need for a $2000 GPU/CPU/whatever, they will throw money at it.
I don't really care if it's red, blue, green, purple or has polka-dots on it, as long as it is an incentive for innovation.
My bad on the usage of "everyone"; should've stuck with "the majority".


----------



## DeathtoGnomes (Jun 28, 2017)

**put the waders on**

The amount of ...less than spectacular displays of knowledge... in these posts shows how many are in need of doing research before guessing-posting. 

In the meantime, please don't forget this card is not a gaming card and the drivers could be considered beta level.


----------



## efikkan (Jun 28, 2017)

Hugh Mungus said:


> Wasn't even running 1600mhz like it should. If it was running at something like 1400mhz that would mean a 1600mhz score of about 19.5k at least. Besides, we don't know what support's like, we don't know how well rx vega will do and most importantly, firestrike isn't an actual game. Lots of if's, dunno's and a guy doing the benchmarks that doesn't understand wattman. Great, now we still don't know how good rx vega is.


You don't know how boost works.
Makers usually specify the base clock and the typical/average boost clock. But this time AMD has chosen to specify the typical clock and the peak boost clock, probably to get a higher peak throughput number to "beat" the competition. However, if we applied the same method to Nvidia, the rated "performance" of the Titan Xp would become 14.2 Tflop/s (assuming an 1850 MHz peak clock).
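For reference, peak FP32 throughput is just shader count × 2 FLOPs per clock (one fused multiply-add) × clock speed. A quick sketch of that arithmetic; note the 1850 MHz Titan Xp figure is the poster's assumption, not NVIDIA's official boost rating:

```python
def peak_tflops(shaders, clock_mhz):
    # FP32 throughput: each shader retires 2 FLOPs per cycle via fused multiply-add
    return shaders * 2 * clock_mhz * 1e6 / 1e12

# Titan Xp: 3840 shaders; 1850 MHz is the assumed peak boost from the post
print(round(peak_tflops(3840, 1850), 1))  # 14.2
# Vega FE: 4096 shaders at AMD's quoted 1600 MHz peak matches the advertised ~13.1 Tflop/s
print(round(peak_tflops(4096, 1600), 1))  # 13.1
```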


----------



## Xzibit (Jun 28, 2017)




----------



## Steevo (Jun 28, 2017)

efikkan said:


> You don't know how boost works.
> Makers usually specify the base clock and typical/average boost clock. But this time AMD have chosen to specify typical clock and peak boost clock, probably to get a higher peak throughput number to "beat" the competition. However, if we employ the same method to Nvidia, the rated "performance" of Titan Xp would become 14.2 Tflop/s (assuming 1850 MHz peak clock).




The guy with the card posted that he had to change the WattMan settings to get it to stay at 1600 MHz, and he only found that out today when playing The Witcher.


----------



## Fluffmeister (Jun 29, 2017)

When are the consumer gaming-oriented versions due?


----------



## Basard (Jun 29, 2017)

So.... It's a FuryX with higher clocks and "better" memory....


Edit:


I No said:


> My point of view :
> 
> 1. No driver update in this world would boost it by 30-35% (where the hype-train showed it would land).
> 2. Prosumer card or not it's marketed with a "GAMING MODE". Last time I checked the Titan didn't perform worse than the "gaming" version although the Titan is clearly aimed at gaming (no pro driver support) but it also has some deep-learning usage where it is 20-25% faster than a 1080 Ti.
> ...



Seems pretty much spot on.  The Vega will also deliver what was promised.  The first "prosumer" card, lol.  Ever since ATI became AMD, that's all they've seemed to be able to sell--prosumer cards.  The architecture always seems so generic and too future-proof.  Too open and too reliant on people devoting their free time to help "the cause"...


----------



## ShurikN (Jun 29, 2017)

I'm actually wondering why this card exists.
It's not a gaming card, and at $1000 it can't compete with similar offerings.
It's not a workstation card, as it gets beaten by much smaller (die) ws gpus...

The only reason I can think of for AMD wasting Vega chips on this abomination (and not on RX) is that the early chips are probably utter crap. Hence the delay between FE and RX Vega.

A jack of all trades, but pretty mediocre at all of them.


----------



## Basard (Jun 29, 2017)

ShurikN said:


> I'm actually wondering why this card exists.
> It's not a gaming card, and at $1000 it can't compete with similar offerings.
> It's not a workstation card, as it gets beaten by much smaller (die) ws gpus...
> 
> ...



It's the best they could do with a shrunken Fiji.... 
Now, if they could hit 2 GHz.... Nvidia seems to have whooped AMD's ass on the way to 2 GHz.  Oh well, you can't win ALL of the GHz battles, AMD.  
The slogan should have been "Vega--The Bulldozer of GPUs," the RX Vegas will be "RX Vega--the Piledriver of GPUs," and hopefully by the time Navi 2.0 comes out we will have something better than Ryzen to compare it to. 
I hate to hate AMD, but I hope the next revision of Ryzen will offer something other than more cores and mediocre IPC.  
Done ranting for the night.  Later guys!


----------



## Th3pwn3r (Jun 29, 2017)

Fluffmeister said:


> When are the consumer gaming orientated versions due?



Are you new here, or just to the Vega 'situation'?

And my lord, there are so many terrible, stupid comments filled with BS that I wish I had more faces and palms.


----------



## toilet pepper (Jun 29, 2017)

EarthDog said:


> But wait!!!!!!!!! Drivers aren't mature and this dude potentially didn't set up configurations for it......... Ugh, the speculation to play both sides to the middle........... bleh.
> 
> 
> Guys, GPUs don't 'brown out' and get "slower" because of an inadequate PSU...



Uhmmm. Jayztwocents made a video about that today. Even the cables you use from the PSU mattered in benchmarking. The difference was insignificant, but there was a change.


----------



## Fluffmeister (Jun 29, 2017)

Th3pwn3r said:


> Are you new here or just to the Vega 'situation'?
> 
> And my lord, there are so many terrible, stupid, comments filled with BS that I wish I had more faces and palms.



I'm new to the V3ga 'situation'!

But I hear ya, it's facepalm galore.


----------



## Th3pwn3r (Jun 29, 2017)

Fluffmeister said:


> I'm new to the V3ga 'situation'!
> 
> But i hear ya, it's facepalm galore.



Vega has basically been AMD teasing everyone with the cards for what seems like an eternity. A lot of people don't even think they'll come out with the consumer versions at this point; others said F it and bought 1080s/1080 Tis. I bought a 1080 but am still somewhat thinking about buying Vega, though it's getting to the point where Nvidia will have something that stomps all over it by the time AMD ever decides to release it.


----------



## xkm1948 (Jun 29, 2017)

High Bandwidth Cache may be amazing on paper. The truth is Vega's memory bus is 2048-bit versus the Fury X's 4096-bit, and Vega has less total VRAM bandwidth as well. I wonder whether this has anything to do with the not-so-promising performance of Vega. After all, no applications or games are optimized for HBC yet.
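The bandwidth gap is simple arithmetic: total bandwidth is bus width times per-pin data rate. A rough sketch; the per-pin rates below are the commonly quoted HBM1/HBM2 figures, not numbers from the post:

```python
def mem_bandwidth_gbs(bus_bits, gbps_per_pin):
    # Total bandwidth (GB/s) = bus width in bits * per-pin rate (Gbit/s) / 8 bits per byte
    return bus_bits * gbps_per_pin / 8

# Fury X: HBM1, 4096-bit bus at 1.0 Gbps per pin
print(mem_bandwidth_gbs(4096, 1.0))   # 512.0 GB/s
# Vega FE: HBM2, 2048-bit bus at ~1.89 Gbps per pin (AMD quotes ~483 GB/s)
print(mem_bandwidth_gbs(2048, 1.89))  # ~483.8 GB/s
```

So despite the newer memory, the halved bus width leaves Vega FE slightly behind the Fury X on raw bandwidth.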


----------



## qubit (Jun 29, 2017)

So AMD's super-duper overpriced flagship card is only as fast as NVIDIA's second-tier card and consumes lots of power. Sounds about right.  Kinda the same thing as Ryzen versus Intel, though at least there the gap isn't always quite so big.

Maybe in another couple of generations AMD will catch up, but I'm not holding my breath.

Perhaps the upcoming gaming oriented RX Vega will be better, but I doubt it.


----------



## Divide Overflow (Jun 29, 2017)

AMD releases a $1000 workstation card that's comparable to a $5000 card from the competition.  Nice work, AMD!
Looking forward to RX Vega in a little over a month and some real reviews.  Reference AMD cards tend to be hot and loud, so tack on another month for custom cooling solutions to appear.


----------



## TheGuruStud (Jun 29, 2017)

Basard said:


> So.... It's a FuryX with higher clocks and "better" memory....



No. If you believe this ridiculousness, then Fiji is much faster (about same scoring as Fury X at 1.25 GHz).

I guess you gotta rile up fanboys for attention, though.

Might as well ignore all posts for a couple months.


----------



## xkm1948 (Jun 29, 2017)

TheGuruStud said:


> No. If you believe this ridiculousness, then Fiji is much faster (about same scoring as Fury X at 1.25 GHz).
> 
> I guess you gotta rile up fanboys for attention, though.
> 
> Might as well ignore all posts for a couple months.




I agree. If they keep the 4096-bit width, just switch to HBM2, and use 14 nm to boost up the clock speed, then it might be better.

And yeah I know I know next to squat about GPU electronic engineering yadayada.


----------



## eidairaman1 (Jun 29, 2017)

Sorry, 300 is my cap on any GPU.


----------



## sweet (Jun 29, 2017)

xkm1948 said:


> I agree. If the keep 4096bit width just switch to HBM2, use 14nm to boost up clock speed then it might be better.
> 
> And yeah I know I know next to squat about GPU electronic engineering yadayada.



HBM2 is not cheap, and 4096-bit means 4 stacks, doubling the cost of the VRAM, unfortunately.


----------



## xkm1948 (Jun 29, 2017)

sweet said:


> HBM2 is not cheap, and 4096 bit means 4 stack and doubling the cost for VRAM unfortunately.



Instead of 2 stacks of 8 GB, how about 4 stacks of 4 GB? Or better, 4 stacks of 2 GB? More bandwidth is always good.
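Each HBM stack contributes a 1024-bit channel, so capacity and bus width trade off directly. A toy comparison of the configurations suggested above:

```python
# Each HBM stack adds a 1024-bit channel; more, smaller stacks widen the bus.
configs = [(2, 8), (4, 4), (4, 2)]  # (stacks, GB per stack)
for stacks, gb in configs:
    print(f"{stacks} stacks x {gb} GB -> {stacks * gb} GB total, {stacks * 1024}-bit bus")
```

The 4-stack options restore the Fury X's 4096-bit width, at the cost of more interposer area and more HBM2 parts.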


----------



## laszlo (Jun 29, 2017)

Too expensive for what it can do...

another review:  








Quite a fiasco for AMD...


----------



## sweet (Jun 29, 2017)

laszlo said:


> too expensive for what can do...
> 
> another review:
> 
> ...



For what it can do, it is quite a bargain. We are talking about a $1000 card that is equal to a $5000 one in pro tasks.

It's just wrong if this card is judged solely on its gaming performance.


----------



## Xzibit (Jun 29, 2017)

laszlo said:


> too expensive for what can do...
> 
> another review:
> 
> ...



Unfortunately I saw most of the video.  At 1hr 20min he realizes he's not in gaming mode (I think he gets told by the viewers).  Then he has issues with Afterburner and doesn't want to use WattMan because he doesn't really know how.


----------



## TheGuruStud (Jun 29, 2017)

Xzibit said:


> Unfortunately i saw most of the video.  At 1hr 20min he realizes hes not in gaming mode (I think he gets told by the viewers).  Then he has issues with Afterburner and doesn't want to use WattMan cause he doesn't really know how.



And gaming mode doesn't do anything for games, lol. It's just Chill, WattMan and whatever the recording thing is called...?


----------



## laszlo (Jun 29, 2017)

sweet said:


> For what it can do, it is quite a bargain. We are talking about a $1000 card that is equal to a $5000 one in pro tasks.
> 
> It's just wrong if this card is judged solely on its gaming performance.



At some pro tasks it may deliver; consumer versions will not be better in terms of gaming, and they can't ask more than $500 for the top product... wonder if they manage to cover costs...


----------



## sweet (Jun 29, 2017)

laszlo said:


> at some pro task it may deliver; *consumer versions will not be better in terms of gaming* and they can't ask more than 500 $ for top product... wonder if they manage to cover costs...


Why are you so sure about that bold statement? Note that AMD themselves said that RX Vega will be *much* faster than Vega FE in games. They also said that whoever wants to game should wait for RX Vega.

Also, we still don't know if RX Vega comes with HBM or not. If not, the production cost would be significantly reduced, and a much cheaper price is quite feasible while maintaining some margin.


----------



## Xzibit (Jun 29, 2017)

TheGuruStud said:


> And gaming mode doesn't do anything for games lol. It's just chill, wattman and w/e the record shit is called...?



He didn't run tests after he installed the updated drivers.  *Once he was told, he installed them, just loaded W3, ran around in the grass for 10 seconds, and made his comparison from that.*  That's a summary of the last 30 minutes of the video.  At 1hr 35min+ he says he was getting 32-35 FPS on the previous drivers, and with the new ones he gets 36-43 in both game mode and pro mode.

The video should be called "I don't know how to work my machine", because he keeps saying "It's gonna crash, I think it's gonna crash!!!" 

That was a real pro benchmarker's move: while using the drivers that came with the card, there is a part where he is running a test, looks at the info display, and says, "I might have something running in the background? Yeah, I have something running in the background."


----------



## Deleted member 172152 (Jun 29, 2017)

GC_PaNzerFIN said:


> After 10 years of similar fan and cooling designs you expect THIS time it changes everything how radial coolers have worked? Having reviewed and/or owned probably in excess of 100 graphics card, very many with radial fan designs, I do find your ignorance to reality a bit insulting.





EarthDog said:


> yawn.. links...


You go look through thousands of games. I have a life.


ShurikN said:


> I'm actually wondering why this card exists.
> It's not a gaming card, and at $1000 it can't compete with similar offerings.
> It's not a workstation card, as it gets beaten by much smaller (die) ws gpus...
> Disproven.
> ...


Again, disproven.


----------



## Boosnie (Jun 29, 2017)

laszlo said:


> too expensive for what can do...
> 
> another review:
> 
> ...



My eyes and ears are bleeding.
My brain is bleeding too from the utter lack of competence of this guy.


----------



## Recus (Jun 29, 2017)

Steevo said:


> not throttling


----------



## rtwjunkie (Jun 29, 2017)

Hugh Mungus said:


> You go look through thousands of games. I have a life.


YOU made the claim.  Those of us who apparently, by your insinuation, don't have a life  already know the number of DX12 games is minuscule.  DX12 has so far turned out to be the most irrelevant DX yet.

When you make a claim like that in a forum, you need to be able to back it up, which you seem to be unable/unwilling to do.


----------



## Xzibit (Jun 29, 2017)

Recus said:


> Throttling



That's another funny part.  He's not using 2 fans to cool it. He has a mickey-mouse setup: a Noctua fan (92 mm, I think) wrapped in Amazon Prime cardboard pushing air in, when he has a Corsair 120 mm fan right next to it, plus a similar Noctua with Amazon Prime cardboard sucking air out of the exhaust.

@1:42:00+









It's funny how he says it's thermal throttling 3 seconds after loading W3, when the Afterburner overlay on screen says the temp is 51°C.


----------



## qubit (Jun 29, 2017)

I No said:


> My point of view :
> 
> 1. No driver update in this world would boost it by 30-35% (where the hype-train showed it would land).
> 2. Prosumer card or not it's marketed with a "GAMING MODE". Last time I checked the Titan didn't perform worse than the "gaming" version although the Titan is clearly aimed at gaming (no pro driver support) but it also has some deep-learning usage where it is 20-25% faster than a 1080 Ti.
> ...


Don't worry, none of this will stop the AMD fanboy apologists from defending AMD to the hilt, attacking the likes of you and me.  Weird, as we're not the enemy here, just pointing out the shortcomings of some for-profit company selling substandard stuff, so they should be thanking us instead. Oh and the fanboys don't even get paid for it.


----------



## Liviu Cojocaru (Jun 29, 2017)

I am not sure why people are so bothered about the gaming side of this card and compare it with the Titan Xp in gaming; it's not made for that. It's like comparing a Ferrari that happens to have a comfort mode with an S-Class on the comfort of normal roads...


----------



## Imsochobo (Jun 29, 2017)

KainXS said:


> well he only has a 550W PSU when AMD recommends 850W for this card.(bet 750 would be fine though)



'Tis no issue on a single-rail 550 W PSU without OC.
End of discussion.


----------



## KainXS (Jun 29, 2017)

Imsochobo said:


> tis no issue on a singlerail psu 550W without OC.
> end of discussion.



The power discussion is over, yes, I know.



KainXS said:


> Normally when a GPU does not get enough power from the PSU it crashes the system or cuts off from its protection but with that system and 600W PSU, the PSU must have been bad or cheaply made. Even with this Vega card, a 550W PSU is not what I would recommend but if his PSU is a quality one(RM550 or something) it should run and the PSU should not be the limiting factor here. Thermaltake years ago made some really shifty PSU's on their low end, maybe that was your problem.


----------



## Basard (Jun 29, 2017)

TheGuruStud said:


> No. If you believe this ridiculousness, then Fiji is much faster (about same scoring as Fury X at 1.25 GHz).
> 
> I guess you gotta rile up fanboys for attention, though.
> 
> Might as well ignore all posts for a couple months.



Yeah... that's why I come here, to rile up fanboys and to get attention...

So it's a Fury X at 1600Mhz with shittier memory then?

I just call em like I see em.  Based on all of the posts I've seen in the last couple months, I see this as a 1600Mhz Fury X, which isn't bad... just not great.


----------



## TheGuruStud (Jun 29, 2017)

Basard said:


> Yeah... that's why I come here, to rile up fanboys and to get attention...
> 
> So it's a Fury X at 1600Mhz with shittier memory then?
> 
> I just call em like I see em.  Based on all of the posts I've seen in the last couple months, I see this as a 1600Mhz Fury X, which isn't bad... just not great.



I meant "you" in a vague sense, aimed at everyone reposting this idiot from Disqus.

A Fury X with the Polaris uarch would be a lot better than this... that's why this is just a joke.


----------



## Th3pwn3r (Jun 29, 2017)

qubit said:


> Don't worry, none of this will stop the AMD fanboy apologists from defending AMD to the hilt, attacking the likes of you and me.  Weird, as we're not the enemy here, just pointing out the shortcomings of some for-profit company selling substandard stuff, so they should be thanking us instead. Oh and the fanboys don't even get paid for it.



Do Nvidia fanboys get paid for their smear campaign? This card is actually really good for its intended use.

Before you try to say I'm a fanboy: my best card is a 1080, but I have two AMD cards and two Nvidia cards in my four running machines. When the consumer/gaming version of Vega drops I'm either getting one or another 1080 Ti. I'm not going to sit here bashing without knowing what I'm talking about, though, like 50% of the people here.


----------



## KainXS (Jun 29, 2017)

The Frontier Edition bios looks like it hit TPU's bios database

https://www.techpowerup.com/vgabios/193092/193092

Edit:
58 KB is pretty small for an AMD bios though, it must be incomplete.


----------



## qubit (Jun 29, 2017)

Th3pwn3r said:


> Do Nvidia fanboys get paid for their smear campaign?


I've no idea, you'll have to ask one.



Th3pwn3r said:


> This card is actually really good for what its intended use is.


I think not, and I've explained why, so there's no point repeating myself. If you don't agree, that's fine by me. You certainly do sound like an AMD apologist/fanboy, and owning an NVIDIA card doesn't change that.


----------



## EarthDog (Jun 29, 2017)

Hugh Mungus said:


> You go look through thousands of games. I have a life.


Like he said, you made the claim. I went out and provided a link (since I have no life, and therefore the time), and here we are again with you blowing smoke up our collective rears. Your words hold ZERO merit without support. 

Support your claims or continue to be grouped with the other clueless muppets blowing smoke.


----------



## PerfectWave (Jun 29, 2017)

Why do some people test this card at 1080p? And on non-gaming apps? Really, really...


----------



## EarthDog (Jun 29, 2017)

The longer the thread, the less clue seems to populate it...


----------



## xkm1948 (Jun 29, 2017)

EarthDog said:


> The longer the thread, the less clue seems to populate it...



I have given up on discussion in this thread already. Most people already have their own version of Vega established in their heads. We are not here looking at different arguments for the sake of facts; most of us are here looking for echo chambers to justify our own beliefs.


----------



## Prima.Vera (Jun 29, 2017)

This is starting to look more and more like the 3dfx Voodoo5 5500 story, with AMD now in the same position.
It's funny that people still aren't learning from past mistakes...


----------



## phanbuey (Jun 29, 2017)

Looks close to stock 1080 performance in DX12 - that Time Spy score is good.


----------



## iO (Jun 29, 2017)

The driver might be running in some kind of fallback mode, as it doesn't look like any of the architectural advancements work at all, and it performs 30-40% faster than a Fury, which conspicuously lines up with the higher clocks...


----------



## ironcerealbox (Jun 29, 2017)

Between this article and the one from late summer/early autumn of 2015 (https://www.techpowerup.com/215776/amd-radeon-r9-nano-review-by-tpu-not):


----------



## rtwjunkie (Jun 29, 2017)

Unsubbed.  I'll wait for RX release. Until then it's just more arguments based on speculation.


----------



## mat9v (Jun 29, 2017)

How is ~7120 points in the ballpark of a stock 1080?
Look up the 4790K (his CPU) and a 1080 in the Time Spy results. His score corresponds most closely to a 4790K @ 4 GHz and a 1080 @ 1850-2000 MHz.


----------



## efikkan (Jun 29, 2017)

Aaah, the AMD product cycle strikes again:
_"It's going to be the best thing ever"_ -> denial -> _"it's actually better, you just don't see it because of missing optimizations."_
So we are moving into phase two…



qubit said:


> Maybe in another couple of generations AMD will catch up, but I'm not holding my breath.


They are not catching up, they are falling behind, so until Nvidia slows down they won't.
Vega was supposed to compete with Pascal, but Pascal is nearing the end of its life cycle before Vega even arrives (so AMD is >~0.6 cycles behind). Next year Nvidia will launch consumer Volta, while AMD's counterpart, Navi, is slipping into 2019…


----------



## qubit (Jun 29, 2017)

efikkan said:


> Aaah, the AMD product cycle strikes again:
> _"It's going to be the best thing ever"_ -> denial -> _"it's actually better, you just don't see it because of missing optimizations."_
> So we are moving into phase two…
> 
> ...


Yeah, that sounds about right; I was just trying to be optimistic. 

I don't get why some people end up defending a big, for-profit company that's letting everyone down with substandard products and dodgy hype (e.g. Vega FE vs. Titan Xp) by flaming us with incoherent rants instead of flaming AMD. It's just weird. 

Some seem to think it's up to us to buy their products regardless, to help fund their R&D for better products. That's pretty flawed thinking, because if AMD made lots of money by selling worse products, they would have no motivation to spend their profits on R&D for better products. Worse, the competition would see this and might wonder why they're spending so much money, time, and effort on better products when they don't need to in order to make a decent profit. This could potentially start a race to the bottom. So no, don't buy AMD's products to "help fund R&D", because it won't work and you'll have wasted your money on inferior products.


----------



## xkm1948 (Jun 29, 2017)

qubit said:


> Yeah, that sounds about right; I was just trying to be optimistic.
> 
> I don't get why some people end up defending a big, for-profit company that's letting everyone down with substandard products and dodgy hype (e.g. Vega FE vs. Titan Xp) by flaming us with incoherent rants instead of flaming AMD. It's just weird.
> 
> Some seem to think it's up to us to buy their products regardless, to help fund their R&D for better products. That's pretty flawed thinking, because if AMD made lots of money by selling worse products, they would have no motivation to spend their profits on R&D for better products. Worse, the competition would see this and might wonder why they're spending so much money, time, and effort on better products when they don't need to in order to make a decent profit. This could potentially start a race to the bottom. So no, don't buy AMD's products to "help fund R&D", because it won't work and you'll have wasted your money on inferior products.




The denial from emotionally invested people is strong. I fully understand. Once someone is emotionally invested in something, they feel obliged to defend it no matter what that thing actually is. The discussion here is already useless. People have resorted to personal attacks, one of the lowest forms of arguing, starting from the first several pages. 

We are all fully grown adults, so the chance of one persuading another to abandon his/her stance on the matter is quite slim. 

AMD is charging $999 for their card; someone bought it, tested it, and posted the results. End of story. Annnd unsubscribed as well. Nothing useful is coming out of this thread now (for me).


----------



## Th3pwn3r (Jun 29, 2017)

Good


xkm1948 said:


> The denial from emotionally invested people is strong. I fully understand. Once someone is emotionally invested in something, they feel obliged to defend it no matter what that thing actually is. The discussion here is already useless. People have resorted to personal attacks, one of the lowest forms of arguing, starting from the first several pages.
> 
> We are all fully grown adults, so the chance of one persuading another to abandon his/her stance on the matter is quite slim.
> 
> AMD is charging $999 for their card; someone bought it, tested it, and posted the results. End of story. Annnd unsubscribed as well. Nothing useful is coming out of this thread now (for me).


 
Good riddance; you are biased like most others anyway. No need for your input anyhow.


----------



## erocker (Jun 29, 2017)

Prima.Vera said:


> This starts to look more and more like the 3DFx Voodoo5 5500 story. Now AMD on the same position.
> Is funny that people are still not learning from past mistakes...


I don't see Nvidia buying AMD and making them disappear, like what happened to 3DFx after that card came out.


----------



## S@LEM! (Jun 29, 2017)

You had only one job...

**** you, Raja Koduri!


----------



## KainXS (Jun 29, 2017)

Thread went down the toilet


----------



## eidairaman1 (Jun 29, 2017)

Yup, all the hatred and bias.





KainXS said:


> Thread went down the toilet


----------



## DeathtoGnomes (Jun 30, 2017)

EarthDog said:


> The longer the thread, the less clue seems to populate it...


Yep, called it...


----------



## fullinfusion (Jun 30, 2017)

R-T-B said:


> I'm sorry, but I'm pretty skeptical that that's what's going on here, or even was in your case. Low voltages / under-delivered power don't cause low benchmarks; they cause instabilities, crashes, or even hardware damage. The most likely outcome of overloading a PSU is actually a hard shutdown, as it won't "not get enough juice." PSUs are a pull technology: they will attempt to deliver the requested wattage or, if built well, shut down when unable. Whether they do it with acceptable ripple or without an electrical fire is another matter.


Here, check this out; it kinda validates my problem to a point...


----------



## eidairaman1 (Jun 30, 2017)

fullinfusion said:


> Here, check this out; it kinda validates my problem to a point...



I've seen others swap them to resolve issues, especially when they're defective brand new.


----------



## Arjai (Jun 30, 2017)

I don't have a clue about any of this. AND, I just read through 8 pages of this thread.

Somehow, I learned nothing about these cards. But, I did learn some things about the cards in this thread.

Not real happy about either.


----------



## Th3pwn3r (Jun 30, 2017)

eidairaman1 said:


> I've seen others swap them to resolve issues, especially when they're defective brand new.



I really don't think that's the 'problem' here.

Expectations are the problem. Some just don't understand what this card is supposed to do, for one...


----------



## TheGuruStud (Jun 30, 2017)

It is using barely modified Fiji drivers from January, lol (the package is new, but it's the same ol' drivers). The novideo fans are squirming.


----------



## uuuaaaaaa (Jul 26, 2017)

TheGuruStud said:


> It is using barely modified Fiji drivers from January, lol (the package is new, but it's the same ol' drivers). The novideo fans are squirming.



_*Terry Makedon*‏ @CatalystMaker_
_I sense a BIG day for Radeon Software tomorrow. Cant wait to let the cat out of the bag......_

https://twitter.com/CatalystMaker/status/889946633575780352


----------



## EarthDog (Jul 26, 2017)

fullinfusion said:


> Here, check this out; it kinda validates my problem to a point...


So, a difference of 1% (by score; less than 1% in FPS) in one test? That could have been just from rebooting. I'd like to see a lot more testing, but from what I know, it's simply off or on: either it gets enough voltage and works, or it doesn't and craps out.


----------



## OSdevr (Jul 26, 2017)

Why did this thread get resurrected?


----------

