# AMD's RX Vega Low Key Budapest Event: Vega Pitted Against GTX 1080



## Raevenlord (Jul 19, 2017)

On the first stop of AMD's two-continent-spanning RX Vega tour (which, in truth, counts only three locations), the company pitted its upcoming RX Vega graphics card (we expect this to be its flagship offering) against NVIDIA's GTX 1080. The event itself was pretty subdued, and there was not much to see when it comes to the RX Vega graphics card - literally. Both it and the GTX 1080 were enclosed inside PC towers, with event-goers not allowed to catch even a glimpse of the piece of AMD hardware that has most closely approximated a unicorn in recent times.

The Vega-powered system also made use of a Ryzen 7 processor, and the cards were running Battlefield 1 (or Sniper Elite 4; there's plenty of discussion going on about that, but the first image below does show a first-person view) on nondescript monitors, one supporting FreeSync, the other G-Sync. The monitors' model names were covered by cloth so that users couldn't tell which system was running which graphics card, though due to ASUS' partnership in the event, both were (probably) of ASUS make. The resolution used was 3440 x 1440, which should mean over 60 FPS on the GTX 1080 on Ultra. Users who attended the event have reported that one of the systems lagged slightly in one portion of the demo, though we can't confirm which one (and I'd say that was AMD's intention).






All in all, I have to say, this tour doesn't inspire confidence in me. This isn't the kind of "in your face" comparison we're used to seeing from companies that know they have a winning product; were the comparison largely in AMD's favor, I posit the company would be taking full advantage of that by showcasing its performance leadership. There did seem to be an inordinate amount of smoke and mirrors here, with AMD going out of its way to prevent attendees from being able to tell its offering apart from its competitor's.




AMD reportedly told attendees that the AMD and NVIDIA systems had a $300 price difference in AMD's favor. With all other hardware being equal, and accounting for AMD's stance that a FreeSync monitor tends to cost around $200 less than a comparable NVIDIA G-Sync-enabled one, that leaves around $100 in savings attributable solely to the RX Vega part of the equation. This means the RX Vega could sell in the $459-$500 bracket, if current GTX 1080 pricing is what AMD considered.
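The arithmetic above can be sketched out as follows. This is a back-of-the-envelope sketch only; the GTX 1080 prices plugged in are illustrative assumptions based on mid-2017 street pricing, not figures AMD provided.

```python
def implied_vega_price(gtx_1080_price, total_system_savings=300, freesync_savings=200):
    """Infer an RX Vega price from AMD's claimed system-price difference.

    The claimed $300 total system savings, minus the ~$200 FreeSync
    monitor delta, leaves ~$100 attributable to the GPU itself.
    """
    gpu_savings = total_system_savings - freesync_savings  # $100
    return gtx_1080_price - gpu_savings

# With GTX 1080 prices of roughly $559-$600 (assumed, not confirmed):
print(implied_vega_price(559))  # 459
print(implied_vega_price(600))  # 500
```

This reproduces the article's $459-$500 bracket, and also shows how sensitive the conclusion is to the assumed GTX 1080 price.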






*View at TechPowerUp Main Site*


----------



## Chaitanya (Jul 19, 2017)

Interesting.


----------



## The Quim Reaper (Jul 19, 2017)

If Vega costs more than $399, they may as well not bother.

Performance isn't what will sell the card, price is.

Offering similar levels of performance for Nvidia levels of money will accomplish nothing and will only sell to the AMD die-hards, which is a tiny percentage of the GPU market.

Price is the only trump card they can play, being so late to market, and if they don't play it, well....go home AMD, you're done.


----------



## csatahajos (Jul 19, 2017)

Well, according to friends who went, the best thing there were the two hostess ladies; so little was shown of the STUFF. As the article states, this was mostly a fake event; with the secrecy there, it could have been an RX 580 inside the AMD machine. Nobody could tell, as most games run were easy on the graphics card, I was told.

A major disappointment. Also, why delay the release so much if this card is really only GTX 1080 material for 100 USD less? It is nice, but not the kind of game changer that Ryzen or TR was. I'm really hoping this was the XT version only and that the XTX, with better drivers, will still be 1080 Ti level for a 600 USD price tag.

AFAIK, HBM2 is fairly expensive, like 150 USD-ish, so if the MSRP is really 450 on the high-end card, AMD won't make any money on these....


----------



## the54thvoid (Jul 19, 2017)

BOM and R&D costs would be interesting to see. If it has to price-compete with a year-old card, no longer NVIDIA's best, AMD might be selling at very low, investor-depressing margins.

Still, announcement end of July, reviews after that?


----------



## kunyicajsz (Jul 19, 2017)

I was there for about an hour and closely checked both rigs. BF1 was running, and I thought both configs ran the game really smoothly. The speaker guy told me which one was Vega (the rig on the left), and I had to watch for about a minute to see any difference (which, we all know, can easily be a trick of the mind).
I did win an ASUS RX 570, though


----------



## bistrocrat (Jul 19, 2017)

It is just sad... sad that we must wait 1.5 years (since the GTX 10xx series) to get this maybe-$100-cheaper product from a competitor. NVIDIA will just knock those $99 off their 1.5-year-old GPUs and won't hasten to release the next gen, because there is nothing on the market that challenges NVIDIA's market share. And the fact that these two-year cycles, with barely any price cuts in between, fall in a time frame when 4K and 120 FPS+ gaming monitor prices are really low and affordable is really sad.


----------



## Prima.Vera (Jul 19, 2017)

Naturally, this has to be cheaper and perform slightly better than the GTX 1080 to even be considered by buyers...


----------



## okidna (Jul 19, 2017)

kunyicajsz said:


> I was there for about an hour and closely checked both rigs. BF1 was running, and I thought both configs ran the game really smoothly. The speaker guy told me which one was Vega (the rig on the left), and I had to watch for about a minute to see any difference (which, we all know, can easily be a trick of the mind).
> *I did win an ASUS RX 570, though*



Congrats!!


----------



## buggalugs (Jul 19, 2017)

This is the new normal. It seems most companies are releasing products ahead of reviews these days, and they're trying hard to control the message. Not just AMD; Intel is doing it too, and NVIDIA did it.

The old days of companies giving review sites products to review before launch are over.

It's going to be hard for AMD and Vega. I'm sure NVIDIA is sitting on new cards, ready as an answer to the Vega launch. I hope AMD planned ahead.


----------



## Deleted member 172152 (Jul 19, 2017)

Suppose AMD doesn't want the actual performance to leak. Guess I'm going back into hibernation.


----------



## csatahajos (Jul 19, 2017)

Hugh Mungus said:


> Suppose AMD doesn't want the actual performance to leak. Guess I'm going back into hibernation.



It would be fine, but why do a demo like this then, which has no meaningful value (well, other than as a chance to win some hardware)?


----------



## dicktracy (Jul 19, 2017)

Picture looks like a pornography set


----------



## bug (Jul 19, 2017)

bistrocrat said:


> It is just sad... sad that we must wait 1.5 years (since the GTX 10xx series) to get this maybe-$100-cheaper product from a competitor. NVIDIA will just knock those $99 off their 1.5-year-old GPUs and won't hasten to release the next gen, because there is nothing on the market that challenges NVIDIA's market share. And the fact that these two-year cycles, with barely any price cuts in between, fall in a time frame when 4K and 120 FPS+ gaming monitor prices are really low and affordable is really sad.


Harsh words.
Remember, AMD doesn't have infinite resources (neither does anyone else), and they covered a lot of ground with their CPUs in the past year. The GPU division not keeping up isn't totally unexpected.


----------



## chaosmassive (Jul 19, 2017)

AMD should never have used HBM on their Vega (or any consumer) card; they didn't learn from Fury X.
Be it HBM or GDDR5(X), as long as the GPU is fed sufficient bandwidth, a graphics card's performance ceiling always comes down to the GPU core's capability.
With GDDR5X's bandwidth rivaling or even exceeding HBM2's, it's not a wise option to put HBM on a consumer card at all.
Furthermore, HBM2 was only recently developed (with questionable yields) and is costly to make.

Now, if only AMD had used GDDR5 (or at least GDDR5X), AMD might have been able to save on its BOM, which would translate to a lower cost per card.
AMD is trying to force itself into the luxury space; note that even NVIDIA doesn't release HBM-based consumer graphics cards at all.

A card equipped with HBM2 memory that is still only on par with the GTX 1080 is not funny at all.


----------



## Vayra86 (Jul 19, 2017)

My god these AMD events always look like some weird scene out of a cyberpunk B-movie or Total Recall or something.


----------



## Crap Daddy (Jul 19, 2017)

To showcase a high-end product that took years in the making and tons of hype at such a low-key event, with only one Vega machine running one game, and with the emphasis on FreeSync as a cheaper alternative to technology that has been available for more than a year, is depressing. RTG and AMD are in deep ****.


----------



## chaosmassive (Jul 19, 2017)

Crap Daddy said:


> To showcase a high-end product that took years in the making and tons of hype at such a low-key event, with only one Vega machine running one game, and with the emphasis on FreeSync as a cheaper alternative to technology that has been available for more than a year, is depressing. RTG and AMD are in deep ****.


Their CPU department has started clawing back its slice of the market pie;
its GPU side is very concerning...


----------



## Deleted member 172152 (Jul 19, 2017)

chaosmassive said:


> Their CPU department has started clawing back its slice of the market pie;
> its GPU side is very concerning...


RTG is doing okay, not great. Just about competing is still more than most expected, so that's something. With AMD's CPU money, RTG can hopefully increase its R&D budget, and if RX Vega is about as good as a 1080 now, it should at least outperform it in the long run once more optimized games are released and it gets better drivers, which would still make it a good long-term option. Maybe I'll get Vega 2.0 at a later date, but I only play with at most high settings anyway (except in small and/or old games, of course), so the framerate should be fine no matter what for the next few years. If RX Vega doesn't disappoint, I'm buying it. If it's complete rubbish, I'm not. Simple. I'm buying in September, so that leaves some time for driver optimizations that will give a more complete picture, so I can at least make an informed decision without rushing to conclusions (like NVIDIA fanboys ). That also means prices will have come down a bit from launch, especially since Vega is supposed to be a rubbish mining GPU.


----------



## londiste (Jul 19, 2017)

System price difference: $300 less for the AMD one?
The monitors are apparently the MX34VQ (MSRP $799) and the PG348Q (MSRP $1299).
Wait...
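The arithmetic behind that "wait..." can be made explicit. Using the MSRPs as reported in the post (not independently verified), the monitors alone differ by more than the claimed total system difference:

```python
# Monitor MSRPs as reported in the post above (assumed, not verified)
freesync_monitor = 799   # FreeSync ultrawide
gsync_monitor = 1299     # G-Sync ultrawide

monitor_delta = gsync_monitor - freesync_monitor  # 500
claimed_system_delta = 300                        # AMD's claimed total savings

# Savings left over for the GPU side of the comparison; a negative
# number means the AMD system's remaining hardware (Vega included)
# would actually have to cost MORE than the NVIDIA side's.
gpu_side_delta = claimed_system_delta - monitor_delta
print(gpu_side_delta)  # -200
```

In other words, if those MSRPs are right, the $300 claim would imply the Vega card is priced above the GTX 1080, not below it.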


----------



## Dimi (Jul 19, 2017)

$200 difference between G-Sync & FreeSync? Yeah, if you buy shitty monitors from ASUS & BenQ, maybe.


----------



## Deleted member 172152 (Jul 19, 2017)

Dimi said:


> $200 difference between G-Sync & FreeSync? Yeah, if you buy shitty monitors from ASUS & BenQ, maybe.


Depends where you live mostly. In the Netherlands 200 euros difference is fairly common and it can go up wayyy more with only a few cheap g-sync monitors around. Only advantage of g-sync is that if you pay even more you can get higher refresh rates, but anything above 100hz on a UWQHD panel is a bit pointless, so you would be better off with a 2560x1080 monitor anyway and then you probably wouldn't use any sync to get the best response times possible, making a cheaper freesync monitor a better option.


----------



## Tomgang (Jul 19, 2017)

If RX Vega is only comparable to the GTX 1080, I'm glad I did not wait for Vega and got myself a GTX 1080 Ti for some future 4K fun.

Any news about what TDP RX Vega comes with?

I mean, if the TDP is like the Vega FE's or the water-cooled version's, with TDPs of 300 and 375 watts, and it still only has GTX 1080 performance levels, I am not impressed at all. The GTX 1080's TDP is 180 watts for the reference card, by the way.


----------



## Bytales (Jul 19, 2017)

csatahajos said:


> Well, according to friends who went, the best thing there were the two hostess ladies; so little was shown of the STUFF. As the article states, this was mostly a fake event; with the secrecy there, it could have been an RX 580 inside the AMD machine. Nobody could tell, as most games run were easy on the graphics card, I was told.
> 
> A major disappointment. Also, why delay the release so much if this card is really only GTX 1080 material for 100 USD less? It is nice, but not the kind of game changer that Ryzen or TR was. I'm really hoping this was the XT version only and that the XTX, with better drivers, will still be 1080 Ti level for a 600 USD price tag.
> 
> AFAIK, HBM2 is fairly expensive, like 150 USD-ish, so if the MSRP is really 450 on the high-end card, AMD won't make any money on these....



Yah, like they had to bring some b.es, otherwise it would have been for nothing.


----------



## JB_Gamer (Jul 19, 2017)

The Quim Reaper said:


> If Vega costs more than $399, they may as well not bother. Performance isn't what will sell the card, price is. Offering similar levels of performance for NVIDIA levels of money will accomplish nothing and will only sell to the AMD die-hards, which is a tiny percentage of the GPU market. Price is the only trump card they can play, being so late to market, and if they don't play it, well....go home AMD, you're done.



Well, I truly believe that no one wants that scenario - "go home AMD, you're done" - not even the most green-faced NVIDIA fanboys/die-hards. What then, leaving NVIDIA without competition?!


----------



## EarthDog (Jul 19, 2017)

440w and 1080 performance...


----------



## bug (Jul 19, 2017)

EarthDog said:


> 440w and 1080 performance...


Not each and every design is a winner. But in this instance it looks like we'd get better performance if we allowed Polaris to draw that much power. Oh well, the wait is almost over, we should have our answers soon.


----------



## Fluffmeister (Jul 19, 2017)

It does increasingly look like Vega is going to be a turkey.

Look forward to the reviews, when are they due?


----------



## Crap Daddy (Jul 19, 2017)

JB_Gamer said:


> Well, I truly believe that no one wants that scenario - "go home AMD, you're done" - not even the most green-faced NVIDIA fanboys/die-hards. What then, leaving NVIDIA without competition?!



Of course not. The correct phrase would be "Go home AMD, you're drunk." Sober up and come back with something good. Like Ryzen. As for competition, there's been none at the high end for a good few years now, and it's just getting worse.


----------



## rtwjunkie (Jul 19, 2017)

buggalugs said:


> This is the new normal. It seems most companies are releasing products ahead of reviews these days, and they're trying hard to control the message. Not just AMD; Intel is doing it too, and NVIDIA did it.
> 
> The old days of companies giving review sites products to review before launch are over.
> 
> It's going to be hard for AMD and Vega. I'm sure NVIDIA is sitting on new cards, ready as an answer to the Vega launch. I hope AMD planned ahead.


It's not just the hardware manufacturers either.  Many games are going the same route: release, then review.


----------



## RejZoR (Jul 19, 2017)

EarthDog said:


> 440w and 1080 performance...



GeForce FX was hot and power hungry, but it generally had ok framerate in things that weren't DX9. It sold well anyway. Fermi was hot and power hungry. Still had somewhat decent framerate as well. Still sold. But when AMD pulls the same thing, OH MAH GOD, AMD IS GARBAGE OMG OMG OMG. Like, c'mon, are people all 5 years old?


----------



## EarthDog (Jul 19, 2017)

RejZoR said:


> GeForce FX was hot and power hungry, but it generally had ok framerate in things that weren't DX9. It sold well anyway. Fermi was hot and power hungry. Still had somewhat decent framerate as well. Still sold. But when AMD pulls the same thing, OH MAH GOD, AMD IS GARBAGE OMG OMG OMG. Like, c'mon, are people all 5 years old?


Lol, people complained back then too about it... wth are you talking about?? To that end, it actually beat out whatever it was competing against from ATI at the time, didn't it? It also wasn't 440 W worth of single GPU either: 250 W TDP vs 375 W. This is a record AFAIK for a single-GPU TDP, a full 50% more power for less performance (1080 Ti vs RX Vega XTX). Put it up against the 1080, which it appears to compete with, and that becomes 180 W vs 375 W.

Nobody is saying it's garbage; it's going to have to compete on price again, since performance right now has it on par with a 1080 and power-to-performance is abhorrent.


----------



## TheLostSwede (Jul 19, 2017)

bug said:


> Harsh words.
> Remember, AMD doesn't have infinite resources (neither does anyone else), and they covered a lot of ground with their CPUs in the past year. The GPU division not keeping up isn't totally unexpected.



You're aware that the CPUs and GPUs are developed by two entirely different teams, spread across different locations, right? And if, as you say, AMD has such limited resources, then maybe they should've spent them more wisely and not made a bunch of marketing noise about a product that looks like it'll be a dud at best. The worst thing you can do in the tech industry is overpromise and underdeliver.


----------



## Alejandrodg82 (Jul 19, 2017)

The Quim Reaper said:


> If Vega costs more than $399, they may as well not bother.
> 
> Performance isn't what will sell the card, price is.
> 
> ...




Couldn't agree more. Who is going to buy this allegedly "power hungry" card, even when Volta is just around the corner? If the price isn't right, this is going to be a major flop for them. It's like they always have some caveat that pushes me away from their products.

I was about to buy a FreeSync ultrawide (CF791) and was hoping to pair it with a Vega GPU. Between the flickering issues that have surfaced with FreeSync and this... it just makes me want to stay with NVIDIA and their damn expensive ecosystem, just to play it safe.


----------



## bug (Jul 19, 2017)

TheLostSwede said:


> You're aware that the CPUs and GPUs are developed by two entirely different teams, spread across different locations, right? And if, as you say, AMD has such limited resources, then maybe they should've spent them more wisely and not made a bunch of marketing noise about a product that looks like it'll be a dud at best. The worst thing you can do in the tech industry is overpromise and underdeliver.


I don't think they _planned_ to underdeliver. But sometimes crap happens and without extra resources, you can't turn things around (i.e. run another silicon revision).


----------



## the54thvoid (Jul 19, 2017)

RejZoR said:


> GeForce FX was hot and power hungry, but it generally had ok framerate in things that weren't DX9. It sold well anyway. Fermi was hot and power hungry. Still had somewhat decent framerate as well. Still sold. But when AMD pulls the same thing, OH MAH GOD, AMD IS GARBAGE OMG OMG OMG. Like, c'mon, are people all 5 years old?



Like @EarthDog has said, people did mention it back then. AMD even created PR about how hot Fermi was. Everyone mocked the GTX480 for being hot and hungry. AMD people lapped it up. Hell, that's why I went with HD5850’s.

http://www.guru3d.com/news-story/ati-ad-that-mocks-nvidia-fermi-spotted-on-youtube.html


----------



## Anymal (Jul 19, 2017)

RejZoR said:


> GeForce FX was hot and power hungry, but it generally had ok framerate in things that weren't DX9. It sold well anyway. Fermi was hot and power hungry. Still had somewhat decent framerate as well. Still sold. But when AMD pulls the same thing, OH MAH GOD, AMD IS GARBAGE OMG OMG OMG. Like, c'mon, are people all 5 years old?


Dang, you are the AMD Fanboy.


----------



## Dimi (Jul 19, 2017)

Alejandrodg82 said:


> Couldn't agree more. Who is going to buy this allegedly "power hungry" card, even when Volta is just around the corner? If the price isn't right, this is going to be a major flop for them. It's like they always have some caveat that pushes me away from their products.
> 
> I was about to buy a FreeSync ultrawide (CF791) and was hoping to pair it with a Vega GPU. Between the flickering issues that have surfaced with FreeSync and this... it just makes me want to stay with NVIDIA and their damn expensive ecosystem, just to play it safe.



Buy the Dell S2417DG; you can get it on Amazon for $399 at the moment, and I've seen it as low as $350. A *165 Hz 1440p G-Sync* monitor.

It's incredible and worth every penny. Coming from an IPS panel, I can't even tell this is a TN panel.


----------



## okidna (Jul 19, 2017)

the54thvoid said:


> Like @EarthDog has said, people did mention it back then. AMD even created PR about how hot Fermi was. Everyone mocked the GTX480 for being hot and hungry. AMD people lapped it up. Hell, that's why I went with HD5850’s.
> 
> http://www.guru3d.com/news-story/ati-ad-that-mocks-nvidia-fermi-spotted-on-youtube.html



And don't forget about the dozens of videos of people cooking eggs on a GTX 480


----------



## bug (Jul 19, 2017)

the54thvoid said:


> Like @EarthDog has said, people did mention it back then. AMD even created PR about how hot Fermi was. Everyone mocked the GTX480 for being hot and hungry. AMD people lapped it up. Hell, that's why I went with HD5850’s.
> 
> http://www.guru3d.com/news-story/ati-ad-that-mocks-nvidia-fermi-spotted-on-youtube.html


And even so, those parts weren't offering the performance of the previous generation's ATI parts. Though the FX series tended to suck across the board.


Dimi said:


> Buy the Dell S2417DG; you can get it on Amazon for $399 at the moment, and I've seen it as low as $350. A *165 Hz 1440p G-Sync* monitor.
> 
> It's incredible and worth every penny. Coming from an IPS panel, I can't even tell this is a TN panel.



You have to be blind to not be able to tell TN Film from IPS. TN Film changes contrast as soon as you move up or down in your chair. Also, IPS glows.


----------



## EarthDog (Jul 19, 2017)

MEH, he won't respond. Too busy trying to get that foot out of his mouth he swallowed whole.


----------



## Dimi (Jul 19, 2017)

bug said:


> You have to be blind to not be able to tell TN Film from IPS. TN Film changes contrast as soon as you move up or down in your chair. Also, IPS glows.



Look up some reviews, you'll find that more people share my opinion. I have another TN panel next to the S2417DG and it looks miles worse than my new one.

But sure, if you wanna be an ips elitist, you pay the price plus all the other major issues it brings with it. No thanks.


----------



## Gasaraki (Jul 19, 2017)

bistrocrat said:


> It is just sad... sad that we must wait 1.5 years (since the GTX 10xx series) to get this maybe-$100-cheaper product from a competitor. NVIDIA will just knock those $99 off their 1.5-year-old GPUs and won't hasten to release the next gen, because there is nothing on the market that challenges NVIDIA's market share. And the fact that these two-year cycles, with barely any price cuts in between, fall in a time frame when 4K and 120 FPS+ gaming monitor prices are really low and affordable is really sad.



Not really 1.5 years yet but I get your point. This is why nVidia stocks are through the roof and AMD's are in the poopers.


----------



## r9 (Jul 19, 2017)

I really, really wanted AMD to do well, but RX Vega is the worst thing they or anyone else has ever released, period. A huge chip that's expensive as f#&$ to produce, not to mention expensive-as-f$#_ memory, and, the cherry on top, that ridiculous power draw. If I were AMD, I would save myself a lot of embarrassment and never release Vega.


----------



## Gasaraki (Jul 19, 2017)

bug said:


> Harsh words.
> Remember, AMD doesn't have infinite resources (neither does anyone else), and they covered a lot of ground with their CPUs in the past year. The GPU division not keeping up isn't totally unexpected.



That is just an excuse people make every time. Remember, before AMD, ATI was its own company. The AMD graphics department (ATI) is separate from the CPU department. They are not sharing resources or fabs or anything.


----------



## bug (Jul 19, 2017)

Dimi said:


> Look up some reviews, you'll find that more people share my opinion. I have another TN panel next to the S2417DG and it looks miles worse than my new one.
> 
> But sure, if you wanna be an ips elitist, you pay the price plus all the other major issues it brings with it. No thanks.


Ok, made me look.

Tom's Hardware: 


> A 24-inch TN panel of this quality can almost fool you into thinking you're looking at an IPS panel, until you move past 45° off-axis. There you’ll see the expected green color shift and 50-60% light falloff. From the top, detail is reduced significantly as well. However, at normal viewing distances and angles, the S2417DG is one of the better TN monitors we’ve seen. When gaming, we didn’t notice a problem, and we didn’t pine for an IPS screen.



PC Monitors:


> Due to the viewing angle limitations, we will not be providing analysis of colour temperature variation using the colorimeter. The perceived variations here due to these viewing-angle related shifts can largely counteract measured deviations, even when you’re simply observing different sections of the screen from a normal viewing angle.



Neither seems to have thought they were looking at an IPS, and I couldn't find other reviews yet.


----------



## noname00 (Jul 19, 2017)

Maybe AMD hoped the HBM2 price would be lower by now, and the high price of HBM2 is one of the reasons the card was delayed so much.

Between the high price of HBM2 (actually, the high price of RAM in general) and the high power consumption, I don't see how AMD will make a decent profit from Vega if it isn't at least as fast as a 1080 Ti.


----------



## dozenfury (Jul 19, 2017)

Based on AMD's recent history and what has dribbled out, I'm assuming the Vega ends up slightly faster than the 1070 but a bit slower than the 1080.  That would be fair price/performance for $399 especially if there is a little headroom for improvement with driver optimizations.  That price is also assuming the mining price gouging starts to wind down.


----------



## TheLostSwede (Jul 19, 2017)

bug said:


> You have to be blind to not be able to tell TN Film from IPS. TN Film changes contrast as soon as you move up or down in your chair. Also, IPS glows.



The latest generation of TN panels are actually not as bad as they used to be. See this review for example http://www.tomshardware.com/reviews/dell-s2417dg-24-inch-165hz-g-sync-gaming-monitor,4788-5.html 
Also, not all IPS panels suffer badly from the glow, but some have a weird coating that makes it worse.


----------



## Deleted member 172152 (Jul 19, 2017)

dozenfury said:


> Based on AMD's recent history and what has dribbled out, I'm assuming the Vega ends up slightly faster than the 1070 but a bit slower than the 1080.  That would be fair price/performance for $399 especially if there is a little headroom for improvement with driver optimizations.  That price is also assuming the mining price gouging starts to wind down.


"1075 Ti" performance with driver updates, or probably even a "GTX 990" with its high power draw and heat output. Better performance than a 1080, but also higher power draw, and probably about the same FPS/watt as a 980 Ti. If games are well optimized for Vega and the drivers work properly, Vega may end up being a bit of a dark horse in the long run, but that remains to be seen. $499 would be a reasonable price tag, IMO, especially if you factor in the money saved versus a G-Sync monitor. 300 dollars/euros less (depending on region) for the same performance seems like a good deal to me, and even with a normal monitor, a $100 saving does make 1080-level performance a bit more accessible. More than $500 could be a tough sell.


----------



## Prince Valiant (Jul 19, 2017)

Dimi said:


> 200$ difference between G-sync & Freesync? Yeah if you buy shitty monitors from Asus & Benq maybe.


It's not like there are tons of panel options for game-oriented monitors. Any IPS at 144/165 Hz is still that accursed AUO panel, as far as I know.

Edit: Waaait, nevermind.


----------



## Footman (Jul 19, 2017)

If all the rumors are actually true and Vega performs at or below GTX 1080 speeds, then I will admit to being a little disappointed, especially when you factor in the amount of power these cards need to run at a 1600 MHz core clock reliably, according to tests performed on the already released Vega FE.

I will likely buy one, though, as I am still looking for a faster video card to replace my RX 580 for FreeSync gaming on my 2560x1440 144 Hz monitor....


----------



## bug (Jul 19, 2017)

TheLostSwede said:


> The latest generation of TN panels are actually not as bad as they used to be. See this review for example http://www.tomshardware.com/reviews/dell-s2417dg-24-inch-165hz-g-sync-gaming-monitor,4788-5.html
> Also, not all IPS panels suffer badly from the glow, but some have a weird coating that makes it worse.


Oh, I did not mean they're as bad as they used to be (fwiw I used them for gaming for years and they did just fine). But there's no way to not see any difference from IPS.

That review is the one I've quoted from above.


----------



## TheGuruStud (Jul 19, 2017)

No one knows system specs. One turd burglar on reddit does not a story make.

Cmon, don't wcctardtech this crap.


----------



## Darmok N Jalad (Jul 19, 2017)

I'm more interested to see what sort of APU solution they can come up with using Ryzen and Vega. While AMD hasn't had much desktop success, they have managed to secure several console wins, and I suspect that won't change since they hold the keys to the strongest APU combination. I would think that a Ryzen-based APU will wind up in PS5 and whatever comes out after X1X. I am curious if they will use HBM by that time.


----------



## Footman (Jul 19, 2017)

bug said:


> Oh, I did not mean they're as bad as they used to be (fwiw I used them for gaming for years and they did just fine). But there's no way to not see any difference from IPS.
> 
> That review is the one I've quoted from above



If you are looking for a 2560x1440 144 Hz IPS FreeSync monitor, I just purchased the new Nixeus 27 EDG and I'm very impressed. Amazon has the basic version at $449 USD, but there is a more expensive version with a better stand for $499. Newegg also has these listed, although I believe Amazon is the only reseller with the basic-stand version so far.

Just saying.


----------



## HD64G (Jul 19, 2017)

Darmok N Jalad said:


> I'm more interested to see what sort of APU solution they can come up with using Ryzen and Vega. While AMD hasn't had much desktop success, they have managed to secure several console wins, and I suspect that won't change since they hold the keys to the strongest APU combination. I would think that a Ryzen-based APU will wind up in PS5 and whatever comes out after X1X. I am curious if they will use HBM by that time.


Vega and Ryzen will be combined through IF (as they are both compatible) to create the perfect APU (CPU & GPU on the same die, maybe?). Especially if they manage to get 2 x 1 GB of HBM into the package, we will see tremendous performance for an APU, IMHO. For notebooks and next-gen consoles, an AMD APU will become a must-have if executed well.


----------



## Capitan Harlock (Jul 19, 2017)

My question is: which card was used in the public presentation with Doom in 4K?
This RX Vega, or the FE edition?


----------



## efikkan (Jul 19, 2017)

chaosmassive said:


> their CPU department  start clawing back its market pie
> its GPU part is very concerning,,,,


The big problem for AMD is that their competitors are developing tirelessly.
In the CPU department, Ryzen is a huge step up from Bulldozer, but still >30% behind Skylake in IPC. AMD needs to _keep_ developing, and I'm not talking about small tweaks. Intel will release its next architecture next year, so AMD needs to keep investing to keep up; relying on Zen for five years is not good news.
Regarding GPUs, AMD has been pretty stagnant since the launch of GCN, with just minor tweaks while NVIDIA keeps innovating.
Meanwhile, AMD has spent billions on projects like "Skybridge" and K12, and on APUs, which are not profitable. AMD's research budget might be tight, but if they focused on two things instead of five, they could at least make a profit.



Hugh Mungus said:


> With AMD's cpu money RTG can hopefully increase its R&D budget and if rx vega is about as good as a 1080 now, it should at least outperform it in the long run when more optimized games are released and it gets better drivers, which still would make it a good long-term option.


The same story as always: default to waiting for "optimized" software. The final phase of the AMD product cycle.



bug said:


> I don't think they planned to underdeliver. But sometimes crap happens and without extra resources, you can't turn things around (i.e. run another silicon revision).


Well, they might not have estimated the performance exactly, but they did know the consumption for a given clock frequency. Nvidia manages to get 200-300 MHz more while consuming less energy. This is due to the chip design, which is no accident.


----------



## bug (Jul 19, 2017)

efikkan said:


> Well, they might not estimate the performance exactly, but they did know the consumption for a given clock frequency. Nvidia manages to get 200-300 MHz more, while consuming less energy. This is due to the chip design, which is no accident.



Design is one factor, but fab process and maturity are just as important.
I'm just saying, when designing a GPU it's not like you can set TDP and processing power in stone and then go on to achieve both. There are margins, and sometimes you end up overstepping them because of various factors.


----------



## efikkan (Jul 19, 2017)

bug said:


> Design is one factor, but fab process and maturity is just as important.


Sure, but the process is just fine. Polaris was taped out on both TSMC and Samsung, and AMD chose Samsung. The process is stable and mature by now. The process is not responsible for Vega consuming ~300W to compete with a GTX 1080 at 180W; that's down to the chip design. A less efficient design (longer critical path) requires a higher voltage to sustain a given clock, which results in higher energy consumption. With the attributes of the process known, the consumption is very predictable before tapeout.
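As a rough first-order sketch of that argument (textbook CMOS dynamic-power scaling with invented numbers, not Vega's actual capacitance, voltages, or clocks):

```python
# First-order CMOS dynamic power: P = C_eff * V^2 * f.
# A longer critical path forces a higher voltage to close timing at the
# same clock, and power grows with the square of that voltage.

def dynamic_power(cap_eff, voltage, freq_hz):
    """Dynamic switching power in watts, given effective capacitance (F),
    supply voltage (V) and clock frequency (Hz)."""
    return cap_eff * voltage ** 2 * freq_hz

# Two hypothetical chips targeting the same 1.6 GHz clock:
tight = dynamic_power(cap_eff=3e-8, voltage=1.00, freq_hz=1.6e9)  # short critical path
loose = dynamic_power(cap_eff=3e-8, voltage=1.20, freq_hz=1.6e9)  # needs +20% voltage

print(f"{loose / tight:.2f}x the power at the same clock")
```

A 20% voltage bump alone costs roughly 44% more dynamic power, which is why a design that needs extra voltage to hold a clock shows up so directly on the power meter.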


----------



## Alduin (Jul 19, 2017)

I hope the top Vega model is faster than
the GTX 1080 Ti; if not,
we will have to see what Volta brings.


----------



## cdawall (Jul 19, 2017)

RejZoR said:


> GeForce FX was hot and power hungry, but it generally had ok framerate in things that weren't DX9. It sold well anyway. Fermi was hot and power hungry. Still had somewhat decent framerate as well. Still sold. But when AMD pulls the same thing, OH MAH GOD, AMD IS GARBAGE OMG OMG OMG. Like, c'mon, are people all 5 years old?



Um, the GeForce FX was openly regarded at the time as a complete failure.

The Fermi parts also just demolished AMD in performance. The only AMD card capable of keeping up with the GTX 480 was the dual-GPU 5970. The GTX 590 came out and swept the field. Fermi also still receives updates for DX12: driver updates on cards that are 7 years old. Something we are not seeing from AMD.

Now let's look at this AMD card. 440W for performance equal to a 180W GTX 1080. Even Fermi didn't have a disparity like that. In fact, Fermi consumed half that power and competed within 30 watts of the 5970.


----------



## S@LEM! (Jul 19, 2017)

They could have saved themselves money and time by releasing a Fury X 2, without all the hype b$hit and the "poor Volta" thing.
This kind of improvement, at a 50%+ power increase over the competition, is nothing but an overclocked Fury X with better manufacturing yields and fancy HBM2.0: not for the sake of advanced technology, but to cover up the monstrous power hunger of an aging architecture and keep the clocks stable with that many shading units at 375W.

Poor Vega, you got idiotic PR hype
Poor Raja, you had only one job


----------



## GhostRyder (Jul 19, 2017)

efikkan said:


> The big problem for AMD is their competitors are developing tirelessly.
> In the CPU department, Ryzen is a huge step up from Bulldozer, but still >30% behind Skylake IPC. AMD need to _keep_ developing,


I don't know why you keep saying that; they are not 30% behind. But whatever...



cdawall said:


> The Fermi parts also just demolished amd in performance. The only and card capable of keeping up the gtx480 was the dual gpu 5970. The gtx590 came out and swept the field. It also just receives updates for dx12, driver updates on cards that are 7 years old. Something we are not seeing from AMD.



Don't know if I would count that as a selling point; it's kinda late to the party after promising it and then not delivering for years. If anything, I think the only reason they did that is that mobile Fermi chips were still around, more so than the desktop counterparts (I can't really find people who use the desktop parts, but some still have laptops with a Fermi-based chip inside).



cdawall said:


> Now lets look at this AMD card. 440w for performance equal to a 180w gtx1080. Even Fermi didn't have a disparity like that. In fact Fermi consumed half that power and compete within 30 watts of the 5970.


Yea, I had a feeling... When things stay quiet, you can guess what's going to happen. The only saving grace will be price, unless this card lands literally in the middle between a GTX 1080 and a 1080 Ti, in which case it can at least be argued for (though the price still has to be good).


----------



## ZoneDymo (Jul 19, 2017)

cdawall said:


> Um the geforce fx was openly relegated at the time as a complete failure.
> 
> The Fermi parts also just demolished amd in performance. The only and card capable of keeping up the gtx480 was the dual gpu 5970. The gtx590 came out and swept the field. It also just receives updates for dx12, driver updates on cards that are 7 years old. Something we are not seeing from AMD.
> 
> Now lets look at this AMD card. 440w for performance equal to a 180w gtx1080. Even Fermi didn't have a disparity like that. In fact Fermi consumed half that power and compete within 30 watts of the 5970.



Ermmm, I think you might have to look up some benchmarks, man; seems your memory is tinted by rose-coloured glasses.
http://media.bestofmicro.com/1/2/242390/original/Crysis-1920.png
http://1.bp.blogspot.com/_9vgJ1nwu_...xcd0E/s1600/GTX480+benchmark+tests+5cn7g6.png


----------



## Vayra86 (Jul 19, 2017)

Dimi said:


> Buy the Dell S2417DG you can get it on amazon for 399 atm, i've seen it as low as 350. *165hz 1440p G-Sync* monitor.
> 
> Its incredible and worth every penny. Coming from an IPS panel, i can't even tell this is a TN panel.



While totally offtopic, I gotta set this straight here.

399 for a TN is not great value; it's selling an overpriced el-cheapo TN.
http://www.tomshardware.com/reviews/dell-s2417dg-24-inch-165hz-g-sync-gaming-monitor,4788-5.html
Look at the first picture there with the gradient bars. The top middle of the screen is blue-ish; the rest of the bars are not. Backlight bleed and bad color uniformity, plus the contrast shift of TN.
The color and black uniformity charts for that panel are among the worst of all panels in the comparison. White uniformity is good, but how often do you look at a 100% white canvas? That's right, never, and if you do, it's extremely unpleasant to the eye. The gamma of this monitor also does not stick to 2.2, which means that whatever you do, you'll crush blacks or lose bright tones.
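To put a number on that gamma complaint (illustrative values only; the 2.5 stands in for a panel tracking too steep, not a measurement of this Dell):

```python
# Content is mastered assuming the display decodes with gamma 2.2
# (luminance = signal ** 2.2). If the panel's actual curve is steeper,
# dark tones land even closer to black: "crushed" shadows.

def display_luminance(signal, gamma):
    """Relative luminance (0..1) for a normalized 0..1 input signal."""
    return signal ** gamma

shadow = 25 / 255  # a dark grey, 8-bit level 25

intended = display_luminance(shadow, 2.2)  # what the content expects
actual = display_luminance(shadow, 2.5)    # hypothetical too-steep panel

print(f"intended {intended:.4f}, shown {actual:.4f}")
```

That dark grey is shown at roughly half its intended luminance, so nearby shadow levels merge into black; a gamma below 2.2 does the opposite and washes out the bright end.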

So it may look good to Tom's (which I find odd; the review reads like an advertorial), but in reality it's crap. That line about being unable to distinguish it from IPS is straight from the review as well. Hell, you can't even calibrate this panel to not show visible DeltaE errors.

If you then look at the comments from owners of this monitor, you can also see that it suffers from everything that belongs to budget-segment TN: bad QC, sharp edges on the plastic bezels, bad OSD buttons, etc. etc. etc.

Credibility - 1


----------



## cdawall (Jul 19, 2017)

ZoneDymo said:


> ermmm I think you might have to look up some benchmarks man, seems your memory is tainted by rose coloured glasses.
> http://media.bestofmicro.com/1/2/242390/original/Crysis-1920.png
> http://1.bp.blogspot.com/_9vgJ1nwu_...xcd0E/s1600/GTX480+benchmark+tests+5cn7g6.png



That supports what I said, at least the second one, and really, that's a single-game argument...

https://www.techpowerup.com/reviews/Gigabyte/GeForce_GTX_480_SOC/10.html

That same 5870 doesn't look as hot once the 480 gets a factory OC.


----------



## RejZoR (Jul 19, 2017)

cdawall said:


> Um the geforce fx was openly relegated at the time as a complete failure.
> 
> The Fermi parts also just demolished amd in performance. The only and card capable of keeping up the gtx480 was the dual gpu 5970. The gtx590 came out and swept the field. It also just receives updates for dx12, driver updates on cards that are 7 years old. Something we are not seeing from AMD.
> 
> Now lets look at this AMD card. 440w for performance equal to a 180w gtx1080. Even Fermi didn't have a disparity like that. In fact Fermi consumed half that power and compete within 30 watts of the 5970.



There was such a disparity with the GeForce FX in terms of heat and consumption compared to the Radeon 9000 series, and yet they were selling despite everyone saying it was crap. You'll NEVER EVER see that with Radeon cards. As presented with endless exhibits.


----------



## Vayra86 (Jul 19, 2017)

RejZoR said:


> There was such disparity with GeForce FX in terms of heat and consumption compared to Radeon 9000 series and yet they were selling despite everyone saying its crap. You'll NEVER EVER see that with Radeon cards. As presented with endless exhibits.



Let's face it then, that is also entirely down to AMD's marketing compared to Nvidia's. And to this day, that difference is still visible. You'd think that after a couple of decades of experience they would wise up a bit, no?

Instead, even with Vega we just get 'leaks' and 'events' that look silly, are outright misleading, or don't give any new information, further fueling either a hype train so the product underdelivers (Fury X) or, like Vega right now, massive disappointment before the release is even here.

And don't even get me started on the advertorials, even here on TPU; they're quite possibly as low as you can go. Again, like I've said before, it's completely bizarre.


----------



## Kyuuba (Jul 19, 2017)

Looks like a home party.


----------



## Eric3988 (Jul 19, 2017)

I don't like the secrecy here; it doesn't exactly inspire confidence. Say the card will compete with Nvidia's best and show us actual benchmarks, or just undercut them like they've been doing in recent years. Doesn't matter to me; just keep Nvidia honest at this point, because they have been on top for too long and can get away with charging whatever they want for their higher-tier cards.


----------



## Hood (Jul 19, 2017)

When a mouse challenges the cat to a fight, he must play a game of hide and seek, hoping to wear kitty out.  It's a long shot; in the end kitty almost always wins.  The mouse has huge balls, though, and that's why half the spectators are rooting for him.  Too bad he has to die, the little guy has a good heart, just not enough resources.


----------



## Basard (Jul 19, 2017)

S@LEM! said:


> they could have safe themselves, money and time. by releasing Fury X 2 without all hype b$hit poor Volta thing
> This kind of improvements over 50%+ power increase from competition is nothing but an overclocked Fury X with better yield manufacturing and a fancy HBM2.0 not for the sake of advanced technology but to cover up the monster power hunger of an aging architecture to let the clocks stable with that many shading units for 375W
> 
> Poor Vega, you got an idiotic hype of PR
> Poor Raji, you had only one job



They did release an x2 Fury... named Pro Duo. It was a big fail.


----------



## ZoneDymo (Jul 19, 2017)

cdawall said:


> That supports what I said. At least the second one and really a single game argument...
> 
> https://www.techpowerup.com/reviews/Gigabyte/GeForce_GTX_480_SOC/10.html
> 
> That same 5870 doesn't look at hot once the 480 gets a factory oc



No, it does not... a stock HD 5870 does only 4 fps less than a stock GTX 480...
And you claimed you needed an HD 5970 to get the same performance... in fact, from there you can see an HD 5970 does a lot better.


And yeah, nothing looks as hot as the GTX 480, because the GTX 480 was extremely hot.


----------



## EarthDog (Jul 19, 2017)

I'll bet they will both run hot, as do a lighter and a bonfire with a yellow flame... but which has more energy behind it?


----------



## efikkan (Jul 19, 2017)

Did AMD actually showcase anything new here (besides the hired help)? It would be interesting to see some performance figures.


----------



## warup89 (Jul 19, 2017)

Has there ever been this much hype for an AMD GPU before? I've been following AMD ever since the ATI 9800 Pro days because I personally like them, but dang, everyone seems to be blowing this coming GPU out of the water; the 295X2 debut was not even like this, even though that was a milestone for them in terms of raw performance.

I find it funny how "naked eye FPS comparisons" are even making tech news. Like, c'mon, haha, what's next? Footage of a PC running YouTube on an RX?


----------



## silentbogo (Jul 19, 2017)

warup89 said:


> Has there been as much hype like this for an AMD GPU before? I've been following AMD ever since the ATI 9800pro days because I personally like them but dang everyone seems to be blowing this coming GPU out of the water, the 295x2 debut was not even like this even though that was an milestone for them in terms of raw performance.
> 
> I find it funny how "naked eye FPS comparisons" are even making tech news, like cmon hahah, whats next? footage of a pc running YouTube while using an RX ?


Well, the RX 480 was kind of the same way, except eventually we got the actual numbers. Same with Pascal.
The only difference is that Vega is taking so long from initial announcement to actual release that even pseudo-tech-enthusiast sites have completely lost interest in making bogus benchmark screenshots and fake photos for the sake of ad revenue.
Everyone is tired... Radeon Group, OEMs, fans, potential buyers... even boobs don't help in this depressing situation.


----------



## cdawall (Jul 19, 2017)

ZoneDymo said:


> no it does not.... a stock HD5870 does only 4 fps less then a stock GTX480...
> and you claimed you needed a HD5970 to get the same performance... in fact from there you see a HD5970 does a lot better.
> 
> 
> and yeah nothing looks as hot as the GTX480 as the GTX480 was extremely hot


ONE GAME DOES NOT DESCRIBE A VIDEO CARD. I don't know how else that needs to be phrased for people, but it is a thing. Overall, the only same-generation AMD card that is competitive is the 5970, and yes, Captain Obvious, it beats the 480. It also consumes 30 watts less power, which I *already* said.

The reference 480 was hot, correct, and it also consumed quite a bit of power (quite a bit less than Vega), but that design was not bad; the Fermi 2.0 cards were absolutely beautiful.



RejZoR said:


> There was such disparity with GeForce FX in terms of heat and consumption compared to Radeon 9000 series and yet they were selling despite everyone saying its crap. You'll NEVER EVER see that with Radeon cards. As presented with endless exhibits.


2003 was the release period for the 9000 series, and AMD saw a 13% market share increase. They saw a 21% market share increase with the X800 series. The market follows just fine when someone releases a good product. They saw a slight market share bump with the Radeon 4000/5000 series as well. After that, the next "good" AMD product was the Tahiti-based 79x0 cards, which they again saw a bump from. Looks like the market works fine.


----------



## warup89 (Jul 19, 2017)

silentbogo said:


> Well, RX480 was kind of the same way, except eventually we got the actual numbers. Same with Pascal.
> The only difference is that Vega takes so long from an initial announcement to the actual release, that even pseudo-tech-enthusiast sites completely lost interest in making bogus benchmark screenshots and fake photos for the sake of ad revenue
> Everyone is tired... Radeon Group, OEMs, fans, potential buyers... even boobs don't help in this depressing situation.



Right; usually reviews would be out by now in order to entice customers to buy the cards as soon as they're out. I hope this marketing strategy stops in the future, because I don't see it doing them any good, unless they are trying to hide something. Time will tell.


----------



## Footman (Jul 19, 2017)

cdawall said:


> Now lets look at this AMD card. 440w for performance equal to a 180w gtx1080. Even Fermi didn't have a disparity like that. In fact Fermi consumed half that power and compete within 30 watts of the 5970.



That's kind of how I feel, to be honest. A shame really, as I am still waiting for a decent AMD VGA card to run my FreeSync monitor at 144 Hz... Gamers Nexus was able to successfully undervolt and run their Vega FE at 1600 MHz, so it gives me hope that we will see 1080 performance in the gaming range of cards at less than 440W.


----------



## FordGT90Concept (Jul 19, 2017)

Look on the bright side: it's going to be faster than RX 580.


----------



## notb (Jul 20, 2017)

TheLostSwede said:


> The latest generation of TN panels are actually not as bad as they used to be. See this review for example http://www.tomshardware.com/reviews/dell-s2417dg-24-inch-165hz-g-sync-gaming-monitor,4788-5.html


Yes, they are.
Poor viewing angles are a built-in property of TN. This really IS an issue.
This is a 24" LCD, which helps a lot. If you go for a larger TN LCD (30"+) and use it at a typical desktop viewing distance (50-70cm), you'll already notice the difference near the edges.

From a different point of view: imagine a situation when 2 people at the office are looking at the same TN LCD while discussing e.g. a PowerPoint presentation. They will see different colours on the screen. Good luck.


----------



## Anymal (Jul 20, 2017)

Well, for gaming, a modern TN is the best.
Any price speculation for the big and AIB Vega? HBM2 is pricey and the Vega chip is bigger than GP102; what can AMD do at all?


----------



## NGreediaOrAMSlow (Jul 20, 2017)

Hugh Mungus said:


> Suppose AMD doesn't want the actual performance to leak. Guess I'm going back into hibernation.


That would make sense if the product were available on launch day. But a month or two later... nah!

nVidia could announce a product the day after and have it on the market within a month.


----------



## Bytales (Jul 20, 2017)

I got the new Samsung 32-inch 2560x1440 FreeSync 2 monitor, and am currently using a watercooled R9 Nano.
Waiting for Vega. It'll definitely be an improvement over the Nano, that's for sure, and it will help with the monitor's 144 Hz refresh rate.
By the way, this monitor, while being VA, does a bit of HDR as well (600 nits), and it is supposedly as fast as a TN, with a 1 ms response time.
I believe THIS would be the best monitor for gaming.
If Vega performs at or above the 1080 while costing less, it's fine by me. I naturally wished for more, but hey, you can't have it all. Besides, more performance will probably be squeezed out of it as the drivers mature.

I'm already gaming decently with the R9 Nano, which is basically a stand-by card (I had two 1080s before, and sold them), so it can't be that bad with Vega.
Besides, what I'm really hoping for is a dual-GPU card. Something with like four 8-pin PCIe power connectors. I would get two of those just to benchmark.


----------



## ZoneDymo (Jul 20, 2017)

FordGT90Concept said:


> Look on the bright side: it's going to be faster than RX 580.



Well, that is kinda where I am surprised by AMD, in a negative sense.
Their old R9 390 is pretty much the same in performance as the RX 580...

Like, is Vega just a terrible architecture or what? Can't they just fit an RX 580 with a lot more of... well, everything, and have that be the more high-end card?
I mean, unless you tell me it would not offer any more performance,
why not take an RX 580, put 16 GB of RAM, 64 ROPs and 4608 shaders or something on it, and sell it as an RX 590?


----------



## r.h.p (Jul 20, 2017)

Raevenlord said:


> On the first stop in AMD's two-continent spanning RX Vega tour (which really only counts with three locations), the company pitted their upcoming RX Vega graphics card (we expect this to be their flagship offering) against NVIDIA's GTX 1080 graphics card. The event itself was pretty subdued, and there was not much to see when it comes to the RX Vega graphics card - literally. Both it and the GTX 1080 were enclosed inside PC towers, with the event-goers not being allowed to even catch a glimpse of the piece of AMD hardware that has most approximated a unicorn in recent times.
> 
> The Vega-powered system also made use of a Ryzen 7 processor, and the cards were running Battlefield 1 (or Sniper Elite 4; there's lots of discussion going on about that, but the first image below does show a first-person view) with non-descript monitors, one supporting FreeSync, the other G-Sync. The monitor's models were covered by cloth so that users weren't able to tell which system was running which graphics card, though due to ASUS' partnership in the event, both were (probably) of ASUS make. The resolution used was 3440 x 1440, which should mean over 60 FPS on the GTX 1080 on Ultra. It has been reported by users that attended the event that one of the systems lagged slightly in one portion of the demo, though we can't confirm which one (and I'd say that was AMD's intention.)
> 
> ...



Geezus I'm gonna kill myself before Vega makes it out ...... Hurry Up


----------



## Pap1er (Jul 20, 2017)

I'm just wondering why people grumble about 440 watts of power draw from a Vega FE @ liquid @ overclocked...
It does seem a little stupid, doesn't it?
An air-cooled card wouldn't draw that much power anyway; the air cooler won't allow it...


----------



## the54thvoid (Jul 20, 2017)

Pap1er said:


> I'm just wondering why ppl rumble about 440 watts of power draw of Vega FE @ liquid @ overclocked...
> It does seem a little stupid, doesn't it?
> Air cooled card wouldn't draw so much power anyway, air cooler won't allow that...



The problem is that higher draw means higher clocks, which means higher performance. Lower clocks and temperature/power limiting mean lower performance. The water-cooled FE is about 10% faster than the air-cooled one. Assuming RX performs 10-20% better on gaming drivers (figures plucked out of the air), air cooling will limit performance.
If you want max 24/7 performance, you want water cooling on any card (or a very good custom fan design).


----------



## Gasaraki (Jul 20, 2017)

RejZoR said:


> GeForce FX was hot and power hungry, but it generally had ok framerate in things that weren't DX9. It sold well anyway. Fermi was hot and power hungry. Still had somewhat decent framerate as well. Still sold. But when AMD pulls the same thing, OH MAH GOD, AMD IS GARBAGE OMG OMG OMG. Like, c'mon, are people all 5 years old?



It doesn't matter if it's hot and power hungry AS LONG AS IT PERFORMS. The GTX 280, GTX 285 and GTX 295 were all furnaces, but at least they performed at the top of their class. We already have a mid-range card from AMD, the 580. We don't need another card that sucks power and produces tons of heat yet only performs on the level of the 1080. The 1080 Ti is already 30% faster than the regular 1080.


----------



## RejZoR (Jul 20, 2017)

You're going against your own "we don't care if it performs". Performs like what? If they pit it against the GTX 1080, price it around there, and it performs, how is that not "it performs"? The RX 480 wasn't a top-of-the-line product, but they positioned it against the GTX 1060. And it performed.


----------



## Gasaraki (Jul 20, 2017)

RejZoR said:


> You're going against the "we don't care if it performs". Performs like what? If they pit it against GTX 1080 and price it around there and performs, how is that not "it performs"? RX 480 wasn't top of the line product, but they positioned it against GTX 1060 offering. And it performed.



If you have 2 cards and they cost around the same, performance is around the same but one sucks 100W more power and can heat up a small room, which one do you buy?


----------



## Anymal (Jul 20, 2017)

100W more and 1080 performance, one year later. Yes, it "performs". Big chip, HBM2... price is also a big question. I think they need better-than-1080 Ti GP102 performance; otherwise we will party like it's 2016.


----------



## cdawall (Jul 20, 2017)

Pap1er said:


> I'm just wondering why ppl rumble about 440 watts of power draw of Vega FE @ liquid @ overclocked...
> It does seem a little stupid, doesn't it?
> Air cooled card wouldn't draw so much power anyway, air cooler won't allow that...



Well, issue one would be that it vastly overdraws the ATX spec.
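For context, the slot and connector ratings below are the usual PCIe CEM limits; the 2x 8-pin layout is an assumption about the board, and 440W is the overclocked, liquid-cooled Vega FE figure from this thread:

```python
# Nominal power budget for a 2x 8-pin graphics card vs. a 440 W draw.
PCIE_SLOT_W = 75    # PCIe CEM slot limit
EIGHT_PIN_W = 150   # per 8-pin PEG connector

budget = PCIE_SLOT_W + 2 * EIGHT_PIN_W  # nominal ceiling for this layout
draw = 440                              # reported overclocked draw

print(f"budget {budget} W, draw {draw} W, over by {draw - budget} W")
```

Connectors and slots tolerate some excess in practice, but a 65 W overshoot of the nominal 375 W budget is what "overdraws spec" refers to here.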


----------



## RejZoR (Jul 20, 2017)

Gasaraki said:


> If you have 2 cards and they cost around the same, performance is around the same but one sucks 100W more power and can heat up a small room, which one do you buy?



Well, that didn't stop people from opting for the GeForce FX and Fermi, which contradicts your own words. But now that AMD is mentioned, no no no, you can't have higher power draw, even if you deliver performance. Why?


----------



## Anymal (Jul 20, 2017)

Gasaraki said:


> If you have 2 cards and they cost around the same, performance is around the same but one sucks 100W more power and can heat up a small room, which one do you buy?





RejZoR said:


> Well, that didn't stop people opting for GeForce FX and Fermi. Which is contradicting your own words. But now that AMD is mentioned, no no no, you can't have higher power draw, even if you deliver performance. Why?


How much did Fermi draw vs. Vega?


----------



## cdawall (Jul 20, 2017)

RejZoR said:


> Well, that didn't stop people opting for GeForce FX and Fermi. Which is contradicting your own words. But now that AMD is mentioned, no no no, you can't have higher power draw, even if you deliver performance. Why?



Fermi actually performed well? Power draw also went down with most AIB cards; temps really hurt power consumption on those.

I also posted that people did buy during those times, with a nice neat graph that had a breakdown showing market share increases for AMD. You have chosen to ignore it, however, because it doesn't fit your plight against the green.


----------



## RejZoR (Jul 20, 2017)

"Plight against the green." Are you seriously still grasping at those straws?


----------



## Anymal (Jul 20, 2017)

RejZoR said:


> "Plight against the green." Are you seriously still grasping at those straws?


All readers see your bullshit and you go on and on and on...


----------



## cdawall (Jul 20, 2017)

RejZoR said:


> "Plight against the green." Are you seriously still grasping at those straws?



Are you seriously still blaming everyone who purchased Fermi or FX cards as the reason why AMD can't release a GPU?

AMD released a 1080 competitor two years ago. It was called dual RX 480s. You know what it did? It consumed roughly half the power of a Vega FE, cost about the same as a 1070, and if the game supported xfire it worked just like a 1080. Two years later, where are we? Oh, we are trying to push the FX-8150 of graphics cards.


----------



## RejZoR (Jul 20, 2017)

Anymal said:


> All readers see your bullshit and you go on and on and on...



Bullshit? What bullshit? That I'm somehow a massive AMD fanboy while not having a single AMD component in my system? Yeah, that argument's gonna fly...



cdawall said:


> Are you seriously still accusing everyone who purchased Fermi or fx cards as the reason why amd can't release a gpu?
> 
> Amd released a 1080 competitor two years ago. It was called dual rx480's. You know what it did? Consumed roughly half the power of a Vega fe, cost similar to a 1070 and if the game supported xfire it worked just like a 1080. 2 years later where are we? Oh we are trying to push the fx8150 of graphics cards.



For the first line, that's what you are implying. I actually haven't said anything other than stating the fact that people still bought it because it had performance. Fast forward to now and everyone is fucking freaking out when something (not something, AMD) has higher consumption even if it delivers performance. Hence, comparison with GeForce FX and Fermi. But ya all are too busy calling me an AMD fanboy to realize that.

As for the second part, if AMD already released dual RX480 (which it didn't on single PCB) to counter GTX 1080, then why all this whining ever since R9 Fury X how AMD doesn't have anything to go against GTX 1080? Make up your god damn minds, will ya? AMD fanboy... F**k me.


----------



## S@LEM! (Jul 20, 2017)

cdawall said:


> Are you seriously still accusing everyone who purchased Fermi or fx cards as the reason why amd can't release a gpu?
> 
> Amd released a 1080 competitor two years ago. It was called dual rx480's. You know what it did? Consumed roughly half the power of a Vega fe, cost similar to a 1070 and if the game supported xfire it worked just like a 1080. 2 years later where are we? Oh we are trying to push the fx8150 of graphics cards.



"The FX-8150 of graphics cards"... that sums it up.
Well, let's hope that's the bottom, and that it wakes them up to come up with something truly innovative like Ryzen.


----------



## cdawall (Jul 20, 2017)

RejZoR said:


> Bullshit? What bullshit? That I'm somehow a massive AMD fanboy while not having a single AMD component in my system? Yeah, that argument's gonna fly...
> 
> 
> 
> ...



So you know Raja announced dual 480s as a competitor to the 1080, right? That also doesn't change the fact that they don't have a single card that competes, even now, 2 years after the 1080 released. Hell, they don't even have a 980 Ti competitor.

You have been swinging off the vaporware that is Vega for 6-8 months now. You have zero facts to back up any argument, repeat the same line of b.s., and every time someone throws sand up to show how wrong that line is, you stonewall and ignore it. AGAIN, NVIDIA LOST MARKET SHARE AND ATI GAINED IT IN ALL OF THE TIMES YOU HAVE MENTIONED.


----------



## FordGT90Concept (Jul 20, 2017)

RejZoR said:


> As for the second part, if AMD already released dual RX480 (which it didn't on single PCB) to counter GTX 1080, then why all this whining ever since R9 Fury X how AMD doesn't have anything to go against GTX 1080? Make up your god damn minds, will ya? AMD fanboy... F**k me.


That's pretty much an R9 295X2, just with lower power consumption.


----------



## Anymal (Jul 20, 2017)

RejZoR said:


> Bullshit? What bullshit? That I'm somehow a massive AMD fanboy while not having a single AMD component in my system? Yeah, that argument's gonna fly...
> 
> 
> 
> ...


How you punish yourself is not the point. Why the negativity against NVIDIA when they delivered such a good product a year ago? Why even defend AMD when their first 300+ W TDP card is announced with maybe only 1080 performance? Shit, you are a fanboy at his finest, buying a 1050 Ti when nothing better from AMD is in stock. Gtfo.


----------



## RejZoR (Jul 20, 2017)

cdawall said:


> So you know raja announced dual 480's as a competitor to the 1080 right? That also doesn't change that they don't have a single card that competes. Even now, 2 years after the 1080 released. Hell they dont even have a 980ti competitor.
> 
> You have been swinging off of the vaporware that is vega for 6-8 months now. You have zero facts to back up any argument repeat the same line of b.s. and every time someone throws sand up to show how wrong that line is you stone wall and ignore it. AGAIN NVIDIA LOST MARKET SHARE AND ATI GAINED IT IN ALL OF THE TIMES YPU HAVE MENTIONED.



Soooo, where is this dual RX480 then? All I'm LITERALLY saying the entire time is all the data on Vega is just off as far as gaming goes. I'm not defending it, I'm stating that it doesn't make any sense. It's a big fucking difference. And all YOU people are saying is VEGA IS GARBAGE OMG OMG FANBOY AHAHAHAHA, STOP DEFENDING IT (now, that is objectively bashing of a product you can't even try because Vega FE ain't a gaming card no matter what you say). That's literally what's happening. But sure, keep on convincing yourself otherwise.



Anymal said:


> How you punish yourself is not the matter. Why the negativity against NVIDIA when they delivered such a good product a year ago? Why even defend AMD when their first 300+ W TDP card is announced with maybe only GTX 1080 performance? Shit, you are a fanboy at its finest, buying a 1050 Ti when nothing better from AMD is in stock. Gtfo.



Negativity? Where? Please, do show me that, with full context, so you won't pull shit out of your rear by cherry-picking individual words instead of whole paragraphs. The only thing you'll find is that nowhere do I say that the GTX 980 I do own (no, I don't have a Radeon, if you haven't freaking noticed yet) is crap. Also, good luck finding ANY post ANYWHERE where I state that NVIDIA doesn't currently hold performance supremacy. Wait, I'll do it for you:

https://www.techpowerup.com/forums/...rovement-per-month.234924/page-4#post-3689493



> GTX 1080Ti drivers are as good as they can get imo. Besides, why would NVIDIA hold it back with subpar drivers if they can reign absolute supremacy with excellent drivers? It's not around 40% faster than anything else by pure chance...



Look at this bashing of NVIDIA! God I'm so nasty by stating that NVIDIA has excellent drivers and holds supreme performance crown in a SINGLE SENTENCE. OH MAH GOD.


----------



## EarthDog (Jul 20, 2017)

It's like our very own court jester...


----------



## cdawall (Jul 20, 2017)

RejZoR said:


> Soooo, where is this dual RX480 then? All I'm LITERALLY saying the entire time is all the data on Vega is just off as far as gaming goes. I'm not defending it, I'm stating that it doesn't make any sense. It's a big fucking difference. And all YOU people are saying is VEGA IS GARBAGE OMG OMG FANBOY AHAHAHAHA, STOP DEFENDING IT (now, that is objectively bashing of a product you can't even try because Vega FE ain't a gaming card no matter what you say). That's literally what's happening. But sure, keep on convincing yourself otherwise.



Have you ever been shopping and selected two things instead of one? That would be dual RX 480s. Tons of things with AMD don't make sense; one thing that does is terrible drivers. The card also implements a multitude of things that have to be enabled at both a hardware and software level, which AMD lacks the R&D to actually implement. I mean, for god's sake, they released a card that they bragged up and down about TBR, and then the driver doesn't even implement it. Broken, half-finished garbage. I was actually excited for this card 8 months ago.


----------



## RejZoR (Jul 20, 2017)

Hold your horses. If we can just pair shit up, why RX480 and not Fury X in CrossfireX then? Because I do know for a fact that dual Fury X pisses even on GTX 1080Ti. And now is the moment you'll say "BUT"...

As for drivers and "terrible". I've had generations of Radeon cards and never had issues. I've also had several generations of GeForce cards. Also never had any serious issues. There were issues with both, mostly minor glitches that got fixed in driver or two.


----------



## the54thvoid (Jul 20, 2017)

RejZoR said:


> GeForce FX was hot and power hungry, but it generally had ok framerate in things that weren't DX9. It sold well anyway. Fermi was hot and power hungry. Still had somewhat decent framerate as well. Still sold. But when AMD pulls the same thing, OH MAH GOD, AMD IS GARBAGE OMG OMG OMG. Like, c'mon, are people all 5 years old?



I'm quoting this again because it's pertinent to the current ping-pong discourse going back and forth.
People are reacting to your invalid point, quoted above. People DID call Nvidia out for the hot chip. People actually did switch to Radeon, as cdawall's sales graphs show.
If you simply say you understand this, it's all good. You're currently being piled on because you've not accepted that what you wrote (in quotes) is wrong. The history doesn't support you, nor do the sales figures.
Also, of real merit: after Fermi, Nvidia made efforts to streamline and make the design more efficient. We've seen it as they've reduced the compute component over time. This turned the tables on the former ATI, who had made some nice efficient designs, especially the 5870/5850 cards of the time.
Nvidia were laughed at and ATI got better sales. Nvidia nonetheless managed to get a full-core 580 chip out to succeed the 480, and this caught folk off guard (notably Charlie at SemiAccurate, who said "trust me, Nvidia cannot make a full core Fermi chip, it's not possible").
Since then, dropping compute has worked for gaming for the most part. Except now, GP100 and GV100 are not the same chips used for gaming cards. Nvidia, unlike AMD, is bifurcating their stack for HPC and general consumers.

Anyway. Most folk know you're not a fanboy, but failing to address your own erroneous statements doesn't help.


----------



## cdawall (Jul 20, 2017)

RejZoR said:


> Hold your horses. If we can just pair shit up, why RX480 and not Fury X in CrossfireX then? Because I do know for a fact that dual Fury X pisses even on GTX 1080Ti. And now is the moment you'll say "BUT"...
> 
> As for drivers and "terrible". I've had generations of Radeon cards and never had issues. I've also had several generations of GeForce cards. Also never had any serious issues. There were issues with both, mostly minor glitches that got fixed in driver or two.



As someone with two Furys: they do not "piss" on a 1080 Ti in anything but producing heat.

And if you mean minor glitches like the overclock section shitting itself every single time you restart the computer normally, then sure, "minor". Or black-screen crashes in multiple games, or CrossFire just not working in games. The list goes on.


----------



## RejZoR (Jul 20, 2017)

Benchmarks say otherwise. But since you own them, I guess everyone else is wrong, right?


----------



## cdawall (Jul 20, 2017)

RejZoR said:


> Benchmarks say otherwise. But since you own them, I guess everyone else is wrong, right?



Care to post benchmarks, overclocked vs. overclocked? My 1080 Ti in my HTPC rig is at 2063/1600; the Furys are at 1075/500.


----------



## Prince Valiant (Jul 20, 2017)

cdawall said:


> As someone with two Furys: they do not "piss" on a 1080 Ti in anything but producing heat.
> 
> And if you mean minor glitches like the overclock section shitting itself every single time you restart the computer normally, then sure, "minor". Or black-screen crashes in multiple games, or CrossFire just not working in games. The list goes on.


I have random driver crashes, Gsync has to be toggled off and back on anytime I set fixed refresh for a program in the control panel, various performance drops after driver updates. The list goes on just the same for Nvidia when it comes to drivers.


----------



## Fluffmeister (Jul 21, 2017)

I wonder if an Ashes of the Singularity Vega patch will drop soon?


----------



## notb (Jul 21, 2017)

RejZoR said:


> Bullshit? What bullshit? That I'm somehow a massive AMD fanboy while not having a single AMD component in my system? Yeah, that argument's gonna fly...


I think this argument is getting pretty old and over-exploited by now.
Everyone can see you're very supportive of AMD - yet, whenever someone points this out, you use the same story about not having AMD gear. It's been going on for months!

So here's the question: if you value recent AMD products so much, why don't you buy anything?


----------



## cdawall (Jul 21, 2017)

notb said:


> I think this argument is getting pretty old and over-exploited by now.
> Everyone can see you're very supportive of AMD - yet, whenever someone points this out, you use the same story about not having AMD gear. It's been going on for months!
> 
> So here's the question: if you value recent AMD products so much, why don't you buy anything?



Because everything they actually release is just a box of hype



Prince Valiant said:


> I have random driver crashes, Gsync has to be toggled off and back on anytime I set fixed refresh for a program in the control panel, various performance drops after driver updates. The list goes on just the same for Nvidia when it comes to drivers.



My mining rigs run 24/7/365. The NVIDIA ones need a restart weekly; with the AMD ones, every other day can be pushing it.


----------



## purecain (Jul 21, 2017)

I heard that the RX version won't be much different from the Frontier Edition.


----------



## Capitan Harlock (Jul 21, 2017)

Before judging, I want to see real performance with new drivers too.
About the dual R9 Fury X argument: not long ago I was wondering whether it would be better to add another Fury X or buy a 1080 Ti, and the thing is, if we're going to talk about performance we have to use the latest drivers, not refer to old tests with old drivers, or none of this makes any sense.
I want to see how RX Vega goes, but I was also wondering which graphics card was pushing Doom at 4K at 60 FPS the first time it was shown.
Was it the FE, or this RX Vega in early stages?
We don't know, and I hope it was not just a dev concept that didn't work out.
At 4K, what is the difference between 2 Fury Xs and a single 1080 Ti with the latest drivers?
Because we have to push cards to the limit to test them, not use 1080p, which doesn't stress anything high-end.


----------



## cdawall (Jul 21, 2017)

Capitan Harlock said:


> I want to see how RX Vega goes, but I was also wondering which graphics card was pushing Doom at 4K at 60 FPS the first time it was shown.



A single 1080ti gets well over that in doom


----------



## Capitan Harlock (Jul 21, 2017)

cdawall said:


> A single 1080ti gets well over that in doom


If I remember right, it was like 60 to 65, and with unoptimized drivers, because it's not a fully released card.
We have to wait and see how it goes.
4K 60 is good, so I don't see the point in going over the refresh rate.


----------



## RejZoR (Jul 21, 2017)

notb said:


> I think this argument is getting pretty old and over-exploited by now.
> Everyone can see you're very supportive of AMD - yet, whenever someone points this out, you use the same story about not having AMD gear. It's been going on for months!
> 
> So here's the question: if you value recent AMD products so much, why don't you buy anything?



Yeah, well, it's kinda very hard being a f**king AMD fanboy if you don't own a single fucking thing from them, don't you think? Jesus... use some common sense. You can't call an argument you people created a tiring one and throw it at me. That's like beating a strawman into a flat thing. And setting it on fire. But if not rushing to conclusions without actually having an RX Vega at hand makes me a fanboy, then so be it. Idiots making conclusions based on Vega Frontier Edition are just that, a bunch of clueless idiots. And we're talking about major reputable publications doing that. That's just beyond baffling. It's like they don't get that while the core is the same, Vega FE was released with different tasks in mind, meaning drivers could in fact be far more primitive and "half baked" and still work for what it was meant to do. As is evident from tests where they actually tested things that aren't games. The features that matter for games to function required extra time, which is why there is no RX Vega out yet. Is that really so hard to grasp? AMD also did release Vega within the scheduled window, which was 1H 2017. Granted, they didn't say it would be a card for developers first, but they released it. Again, it was our expectations (yours and mine) that made us believe they'd release their entire lineup. Which sucks, but it's not like it's the end of the world because of that.



> So here's the question: if you value recent AMD products so much, why don't you buy anything?



Good f**king question, don't you think? It's you people who accuse me of being a massive AMD fanboy. Hard to hold that narrative against someone with Intel CPU and NVIDIA GPU, isn't it?


----------



## FordGT90Concept (Jul 21, 2017)

I'm still giving AMD the benefit of the doubt on Vega.  I'm not expecting a Titan Xp slayer but, like I said, it will be faster than the RX 480 and priced to sell for the performance.  What's not to love?  Sure, sure, it could always be faster but, reality check.


----------



## cdawall (Jul 21, 2017)

Capitan Harlock said:


> If i remember was like 60 to 65 and with not optimized drivers because is not a full released card.
> We have to wait and see how it goes .
> 4K 60 is good so i don't see the point in going over the refresh rate .



Name a launch where the "not optimized" driver argument wasn't made. The drivers are never optimized, they are never this, never that, etc. Stop making excuses for bad PR hyping up half-finished products.



FordGT90Concept said:


> I'm still giving AMD the benefit of the doubt on Vega.  I'm not expecting a Titan Xp slayer but, like I said, it will be faster than the RX 480 and priced to sell for the performance.  What's not to love?  Sure, sure, it could always be faster but, reality check.



440 W. I'm sorry, but as someone who deals with that kind of heat from graphics cards inside a case: people don't realize the other issues it causes.


----------



## justimber (Jul 21, 2017)

Yep, same here. Will reserve judgment until the whole product is released.


----------



## FordGT90Concept (Jul 21, 2017)

cdawall said:


> 440 W. I'm sorry, but as someone who deals with that kind of heat from graphics cards inside a case: people don't realize the other issues it causes.


Overclocked on what is effectively pre-production silicon.


----------



## RejZoR (Jul 21, 2017)

cdawall said:


> Name a launch where the "not optimized" driver argument wasn't made. The drivers are never optimized, they are never this, never that, etc. Stop making excuses for bad PR hyping up half-finished products.
> 
> 
> 
> 440 W. I'm sorry, but as someone who deals with that kind of heat from graphics cards inside a case: people don't realize the other issues it causes.



Oh god, you're still not getting what "optimizations" we're talking about. These aren't the kind of "5% here and 3% there" optimizations that come down to individual games. We were talking about optimizations in terms of giving a GAMING RX Vega a FULLY WORKING driver. There was no need to give Vega FE such features, as it performs OK as it is even without any of the fancy features, whereas RX Vega explicitly depends on them. Releasing it when not ready would be straight-up foolish, as everyone would piss on it like they have on Vega FE. That's what we've been saying the entire bloody time. If AMD knew they couldn't get anything out of it, what would be the point in postponing it for a whole month if the end result would be exactly the same?

You're bitching about bad PR, and then in the same breath you expect AMD to say out loud that they don't have the drivers finished yet. That wouldn't be bad press then, somehow... It's no secret that AMD doesn't have the resources to pull the "magic" shit NVIDIA does, so give them some bloody slack, geez. Everyone's pissing on AMD like everyone's life depends on it. Tired of waiting? Buy a goddamn GTX 1080. Many have and many will. I've decided to wait, even though I was this <> close to just hitting the BUY button for an AORUS GTX 1080 Ti. I didn't wait last time with the Fury X, but I will now. RX Vega may turn out to be a good card despite the power consumption, but if it doesn't, it'll still create some sort of competition on the market, making NVIDIA cards potentially cheaper. If that means just 50€ less, then so be it. Right now is actually the worst time to be jumping on anything, with RX Vega literally around the corner. But whatever man, gotta go have a beer with the fanboys now...

Also, those 440W. Yeah, it's a lot. But that's water cooled AND overclocked version where you can place the radiator on the case exhaust. Meaning it won't really affect case internal temperature. For normal stock state it's 350W and for air cooled, 300W. Still not ideal, but for the right price and maybe special features, who cares? Sure, it'll heat up the room, but so does my GTX 980. I have to run AC anyway. So, what difference does it make? For winter, you'll save up on heating. This is no joke, I've had just PC heating up my place for 2 winters now. The central heating radiator was closed except for the coldest days.

People make way too much drama about power consumption. If it's great, excellent. If it's not, then you check other benefits  or tradeoffs and decide. But just universally taking a piss at products that have a certain power draw or thermals is becoming a really annoying habit of the actual fanboys. Particularly from the green camp. My HD7950 at 1.2 GHz was also a freaking furnace. But it was stupid fast. I didn't care. That was my decision. And I have the same with GTX 980. I could run it ultra cool at stock. But I've decided to max it all out. It's also a furnace, I could fry eggs on the backplate, it's that hot. But that's what I wanted and willingly decided for it. Who are you to say what I want or don't want? And same applies to all potential buyers of RX Vega. They are adults for the most part, we don't need your parroting how Vega's power draw is shit and horrible. We'll decide about that when it's actually released.


----------



## EarthDog (Jul 21, 2017)

Court. Jester.


----------



## cdawall (Jul 21, 2017)

RejZoR said:


> Oh god, you're still not getting what "optimizations" we're talking about. These aren't the kind of "5% here and 3% there" optimizations that come down to individual games. We were talking about optimizations in terms of giving a GAMING RX Vega a FULLY WORKING driver. There was no need to give Vega FE such features, as it performs OK as it is even without any of the fancy features, whereas RX Vega explicitly depends on them. Releasing it when not ready would be straight-up foolish, as everyone would piss on it like they have on Vega FE. That's what we've been saying the entire bloody time. If AMD knew they couldn't get anything out of it, what would be the point in postponing it for a whole month if the end result would be exactly the same?



It took AMD over two years to get the R9 Fury competitive with its 980 Ti competitor. In that time NVIDIA released an entire series of cards and AMD released half of one. So I guess if your plan is to own a card that finally has a complete driver two years from now, buy Vega on release. We also don't know by what percentage it could improve. No one knows if the HBCC will actually help in games like AMD has hyped. No one knows if AMD can actually get TBR working like they hyped.



RejZoR said:


> You're bitching about bad PR, and then in the same breath you expect AMD to say out loud that they don't have the drivers finished yet. That wouldn't be bad press then, somehow... It's no secret that AMD doesn't have the resources to pull the "magic" shit NVIDIA does, so give them some bloody slack, geez. Everyone's pissing on AMD like everyone's life depends on it. Tired of waiting? Buy a goddamn GTX 1080. Many have and many will. I've decided to wait, even though I was this <> close to just hitting the BUY button for an AORUS GTX 1080 Ti. I didn't wait last time with the Fury X, but I will now. RX Vega may turn out to be a good card despite the power consumption, but if it doesn't, it'll still create some sort of competition on the market, making NVIDIA cards potentially cheaper. If that means just 50€ less, then so be it. Right now is actually the worst time to be jumping on anything, with RX Vega literally around the corner. But whatever man, gotta go have a beer with the fanboys now...



Oh, so you mean you will just buy NVIDIA because they have the better product? Is this before or after you complain about how their market share is unfair?



RejZoR said:


> Also, those 440W. Yeah, it's a lot. But that's water cooled AND overclocked version where you can place the radiator on the case exhaust. Meaning it won't really affect case internal temperature. For normal stock state it's 350W and for air cooled, 300W. Still not ideal, but for the right price and maybe special features, who cares? Sure, it'll heat up the room, but so does my GTX 980. I have to run AC anyway. So, what difference does it make? For winter, you'll save up on heating. This is no joke, I've had just PC heating up my place for 2 winters now. The central heating radiator was closed except for the coldest days.



A couple of things: water cooling a GPU decreases power consumption at the same clocks/volts, since more heat turns into more leakage, which turns into more power consumption. We have a card sucking down 440 W, and not even close to all of that is being exhausted out of the case by that tiny 120 mm radiator, which means it's building up in the case. The rule of thumb is 150 W for every 120 mm of radiator in a custom loop, and we have already seen that garbage AIOs can't touch that. Even at stock, that leaves 200 W going somewhere.
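That 150 W-per-120 mm rule of thumb is easy to sanity-check with a few lines of arithmetic. A minimal sketch, using the thread's own wattage figures (440 W overclocked/water-cooled, 350 W stock) rather than any measured values:

```python
import math

# Rule of thumb quoted above: a loop handles ~150 W per 120 mm of radiator.
WATTS_PER_120MM = 150

def radiator_sections_needed(card_watts, watts_per_120mm=WATTS_PER_120MM):
    """How many 120 mm radiator sections the rule of thumb calls for."""
    return math.ceil(card_watts / watts_per_120mm)

def leftover_watts(card_watts, sections, watts_per_120mm=WATTS_PER_120MM):
    """Heat the radiator can't absorb, i.e. what builds up in the case."""
    return max(0, card_watts - sections * watts_per_120mm)

print(radiator_sections_needed(440))  # 3 -> a 360 mm radiator by this rule
print(leftover_watts(440, 1))         # 290 W left in the case at 440 W draw
print(leftover_watts(350, 1))         # 200 W even at the stock 350 W figure
```

The last line reproduces the post's "even stock that leaves 200 W" figure.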



RejZoR said:


> People make way too much drama about power consumption. If it's great, excellent. If it's not, then you check other benefits  or tradeoffs and decide. But just universally taking a piss at products that have a certain power draw or thermals is becoming a really annoying habit of the actual fanboys. Particularly from the green camp. My HD7950 at 1.2 GHz was also a freaking furnace. But it was stupid fast. I didn't care. That was my decision. And I have the same with GTX 980. I could run it ultra cool at stock. But I've decided to max it all out. It's also a furnace, I could fry eggs on the backplate, it's that hot. But that's what I wanted and willingly decided for it. Who are you to say what I want or don't want? And same applies to all potential buyers of RX Vega. They are adults for the most part



7950s with a heavy overclock pull half the wattage we are talking about. I don't think you realise how much heat 440 W is. The good RX 480s are drawing 95 W under load. So that's 4 RX 480s' worth of heat, 5 or 6 GTX 1060s, nearly 3 GTX 1070s, 2 GTX 1080s, nearly 2 GTX 1080 Tis. Hell, that's two FX-9590s' worth of power.

You could have an entire custom-loop-cooled 1080 Ti/7700K rig for less wattage than the Vega card shown.
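The card-count comparison above is plain division of the 440 W figure by each card's typical board power; a sketch using the thread's own per-card estimates (not official TDPs):

```python
# Divide the thread's 440 W Vega figure by each card's claimed typical draw.
vega_watts = 440
typical_draw_watts = {       # the post's estimates, not datasheet numbers
    "RX 480 (good sample)": 95,
    "GTX 1080": 220,
    "GTX 1080 Ti": 250,
}
equivalents = {card: vega_watts / w for card, w in typical_draw_watts.items()}
for card, n in equivalents.items():
    print(f"{card}: {n:.1f}x")   # e.g. "RX 480 (good sample): 4.6x"
```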



RejZoR said:


> we don't need your parroting how Vega's power draw is shit and horrible. We'll decide about that when it's actually released.



Pot, meet kettle. You have been doing the exact same thing as a pro-Vega voice for months. The VRM design should have been the dead giveaway that this card would be eating power. AMD didn't invest money into the best reference VRM section for fun. These cards suck power like it's going out of style.



FordGT90Concept said:


> Overclocked on what is effectively pre-production silicon.



Can you name a time when any manufacturer put the worst silicon on the most expensive cards it sells? Want to know something scary? There is a chance that is the best silicon AMD has right now.


----------



## Prince Valiant (Jul 21, 2017)

EarthDog said:


> Court. Jester.


And the people that keep aggressively replying?


----------



## efikkan (Jul 21, 2017)

RejZoR said:


> Oh god, you're still not getting what "optimizations" we're talking about. These aren't the kind of "5% here and 3% there" optimizations that come down to individual games. We were talking about optimizations in terms of giving a GAMING RX Vega a FULLY WORKING driver. There was no need to give Vega FE such features, as it performs OK as it is even without any of the fancy features, whereas RX Vega explicitly depends on them.


I'm just going to remind you what AMD says themselves:
_"The Radeon™ Vega Frontier Edition graphics card is designed to simplify and accelerate game creation by providing a single GPU that is optimized for every stage of this workflow, from asset production, to playtesting, to performance optimization."_

So which game features have AMD intentionally disabled on the card they call optimized for gaming?



RejZoR said:


> you expect AMD to say out loud that they don't have the drivers finished yet. That wouldn't be bad press then, somehow... It's no secret that AMD doesn't have the resources to pull the "magic" shit NVIDIA does…


So, how much time are they going to need? Another year? They've had more time than usual to polish the driver.



cdawall said:


> Can you name a time when any manufacturer put the worst silicon on the most expensive cards it sells? Want to know something scary? There is a chance that is the best silicon AMD has right now.


Exactly. It would be unusual if they picked the lower binning and sold it for more…


----------



## RejZoR (Jul 21, 2017)

@efikkan 
It's not what AMD has intentionally disabled, it's what AMD hasn't implemented yet... Vega FE was released without that because it was assumed it's not a game changer if it lacks that stuff for now. A pure gaming card like RX Vega however, entirely depends on those capabilities. If you don't have them yet, releasing it is somehow pointless. It's why I keep on bitching over idiots testing Vega FE as if it was a pure gaming card.

How much they are gonna need? RX Vega is getting released for real at the end of this month. That much.


----------



## cdawall (Jul 21, 2017)

Bullshit


----------



## FordGT90Concept (Jul 21, 2017)

cdawall said:


> Can you name a time when any manufacturer put the worst silicon on the most expensive cards it sells? Want to know something scary? There is a chance that is the best silicon AMD has right now.


Nope, but I also can't name a time where a company released their top of the line chip twice, several months apart, announcing that the chip is still coming.  And yes, that's a distinct possibility too.

AMD might be starting a new trend of giving developers access to new silicon before anyone else so they have an opportunity to optimize for it before the main product launches.


----------



## cdawall (Jul 21, 2017)

FordGT90Concept said:


> Nope, but I also can't name a time where a company released their top of the line chip twice, several months apart, announcing that the chip is still coming.  And yes, that's a distinct possibility too.
> 
> AMD might be starting a new trend of giving developers access to new silicon before anyone else so they have an opportunity to optimize for it before the main product launches.



Or they could have literally zero stock of HBM, or this giant, expensive, hard-to-manufacture GPU could have garbage yields.


----------



## FordGT90Concept (Jul 21, 2017)

A likely contributing factor which hopefully they're rectifying by taping out a GDDR5X version of the chip for late 2017/early 2018.


----------



## cdawall (Jul 21, 2017)

FordGT90Concept said:


> A likely contributing factor which hopefully they're rectifying by taping out a GDDR5X version of the chip for late 2017/early 2018.



That isn't going to change how huge this gpu is.


----------



## FordGT90Concept (Jul 21, 2017)

The GPU is not huge compared to, say, Fiji. Fiji was not only a larger die on an older process, it also had a 4096-bit bus instead of 2048-bit. Fiji was literally at the limit of what the interposer tech could handle; Vega is not.


----------



## cdawall (Jul 21, 2017)

FordGT90Concept said:


> The GPU is not huge compared to, say, Fiji. Fiji was not only a larger die on an older process, it also had a 4096-bit bus instead of 2048-bit. Fiji was literally at the limit of what the interposer tech could handle; Vega is not.



Still has an interposer, 4096 shaders, a 2048-bit bus, the HBCC, etc.


----------



## Gasaraki (Jul 21, 2017)

RejZoR said:


> Benchmarks say otherwise. But since you own them, I guess everyone else is wrong, right?



LOL, first time hearing that Fury X beats 1080Ti in games.


----------



## FordGT90Concept (Jul 21, 2017)

cdawall said:


> Still has an interposer, 4096 shaders, a 2048-bit bus, the HBCC, etc.


Vega is only slightly larger (484 mm²) than GP102 (471 mm²).  By comparison, Fiji is 596 mm².


----------



## cdawall (Jul 21, 2017)

FordGT90Concept said:


> Vega is only slightly larger (484 mm²) than GP102 (471 mm²).  By comparison, Fiji is 596 mm².



GP102 also has at most a 384-bit bus, no interposer, and is mostly sold to consumers as neutered/repurposed partial dies.


----------



## FordGT90Concept (Jul 21, 2017)

Does that imply their GP102 yields are pretty crappy?  Vega has more compute cores than GP102, but not by much.  On paper, Vega is the faster GPU of the two.  Vega can reasonably be expected to have fairly poor yields too.  Do they really have enough inventory of binned chips for the Frontier Edition, or is it an older silicon revision (water-cooled got binned chips while air-cooled got the rest)?

We don't really know until the consumer RX Vega card launches.  Frontier Edition was just weird.


----------



## efikkan (Jul 21, 2017)

RejZoR said:


> It's not what AMD has intentionally disabled, it's what AMD hasn't implemented yet... Vega FE was released without that because it was assumed it's not a game changer if it lacks that stuff for now. A pure gaming card like RX Vega however, entirely depends on those capabilities. If you don't have them yet, releasing it is somehow pointless. It's why I keep on bitching over idiots testing Vega FE as if it was a pure gaming card.


What precisely are you talking about here? Is it the tiled rasterization again? Do you even know what it is? No such feature is implemented in a driver; it's a hardware scheduling feature.



RejZoR said:


> How much they are gonna need? RX Vega is getting released for real at the end of this month. That much.


They've had working hardware to test since last November, even demonstrated it working in December. That's about nine months of polishing the driver, which is more than they usually need. And it's not even a new architecture.



FordGT90Concept said:


> A likely contributing factor which hopefully they're rectifying by taping out a GDDR5X version of the chip for late 2017/early 2018.


Hopefully they will, because going with HBM has been their greatest mistake with Vega. It would help with supply and cost, though it wouldn't help with power consumption, performance, etc.


----------



## FordGT90Concept (Jul 21, 2017)

HBM2 should be lower power and higher performance than GDDR5X.

Thing is, we're just speculating.  AMD has never said why they released a Frontier Edition ahead of the main product.  They also never said why Vega keeps getting kicked down the road.  I would love to be a fly on the wall in RTG meetings on Vega to hear their reasoning for these things.


----------



## cdawall (Jul 21, 2017)

FordGT90Concept said:


> Does that imply their GP102 yields are pretty crappy?  Vega has more compute cores than GP102, but not by much.  On paper, Vega is the faster GPU of the two.  Vega can reasonably be expected to have fairly poor yields too.  Do they really have enough inventory of binned chips for the Frontier Edition, or is it an older silicon revision (water-cooled got binned chips while air-cooled got the rest)?
> 
> We don't really know until the consumer RX Vega card launches.  Frontier Edition was just weird.



GP102 has been shipping since 2016. I would imagine that in the year-plus it has been publicly available, more than enough dies have existed that did not meet QC. The fact that they have GP104 dies going onto GP106 cards hints that yields might be less than perfect (or that GP106 cannot meet demand).


----------



## Kenneth Waycaster (Jul 21, 2017)

Looks like a lame event.


----------



## efikkan (Jul 21, 2017)

FordGT90Concept said:


> HBM2 should be lower power and higher performance than GDDR5X.


HBM2 is a little more energy efficient than GDDR, but next to a hot GPU it doesn't matter much.
384-bit GDDR5X is faster than 2048-bit HBM2 BTW…



FordGT90Concept said:


> Thing is, we're just speculating.  AMD has never said why they released a Frontier Edition ahead of the main product.  They also never said why Vega keeps getting kicked down the road.  I would love to be a fly on the wall in RTG meetings on Vega to hear their reasoning for these things.


We do know HBM(2) supplies are a problem, even for Nvidia.


----------



## S@LEM! (Jul 21, 2017)

FordGT90Concept said:


> HBM2 should be lower power and higher performance than GDDR5X.
> 
> Thing is, we're just speculating.  AMD has never said why they released a Frontier Edition ahead of the main product.  They also never said why Vega keeps getting kicked down the road.  I would love to be a fly on the wall in RTG meetings on Vega to hear their reasoning for these things.



1080ti messed them up


----------



## FordGT90Concept (Jul 21, 2017)

efikkan said:


> HBM2 is a little more energy efficient than GDDR, but compared to the hot GPU it wouldn't matter much.
> 384-bit GDDR5X is faster than 2048-bit HBM2 BTW…


http://www.hotchips.org/wp-content/uploads/hc_archives/hc26/HC26-11-day1-epub/HC26.11-3-Technology-epub/HC26.11.310-HBM-Bandwidth-Kim-Hynix-Hot Chips HBM 2014 v7.pdf
http://www.anandtech.com/show/9883/gddr5x-standard-jedec-new-gpu-memory-14-gbps
Vega has 2 stacks of HBM2 at 512 GB/s (~7.3 W).  GTX 1080 Ti has 11 chips of GDDR5X at 484 GB/s (~27.5 W).  Titan Xp has 12 chips of GDDR5X at 547.7 GB/s (~30 W).

Even though Titan Xp has a slight edge over Vega in bandwidth, Vega has significantly lower latency and more flexibility in making memory requests.  HBCC spawned from lessons learned using HBM in Fiji.
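The bandwidth figures quoted above fall out of a simple pins × per-pin-rate calculation. Here is a rough sketch of that arithmetic in Python; the per-pin rates and pin counts (32-bit GDDR5X chips at 11 Gb/s, 1024-bit HBM2 stacks at 2 Gb/s) are my assumptions based on the linked slides, not official figures from this thread.

```python
# Back-of-envelope bandwidth check for the configurations discussed above.
# Pin counts and per-pin rates are assumed typical values, not measurements.

def gddr5x_bw(chips, gbps_per_pin, pins_per_chip=32):
    """Aggregate bandwidth in GB/s for a GDDR5X configuration."""
    return chips * pins_per_chip * gbps_per_pin / 8  # bits -> bytes

def hbm2_bw(stacks, gbps_per_pin, pins_per_stack=1024):
    """Aggregate bandwidth in GB/s for an HBM2 configuration."""
    return stacks * pins_per_stack * gbps_per_pin / 8

print(gddr5x_bw(11, 11))      # 484.0  (GTX 1080 Ti: 11 chips @ 11 Gb/s/pin)
print(gddr5x_bw(12, 11.4))    # 547.2  (Titan Xp: 12 chips @ 11.4 Gb/s/pin)
print(hbm2_bw(2, 2.0))        # 512.0  (Vega: 2 stacks @ 2 Gb/s/pin)
```

Note how HBM2 reaches comparable bandwidth with a tenth of the per-pin rate by going massively wide, which is also where its power advantage comes from.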



S@LEM! said:


> 1080ti messed them up


I wouldn't bet on that.


----------



## cdawall (Jul 21, 2017)

FordGT90Concept said:


> http://www.hotchips.org/wp-content/uploads/hc_archives/hc26/HC26-11-day1-epub/HC26.11-3-Technology-epub/HC26.11.310-HBM-Bandwidth-Kim-Hynix-Hot Chips HBM 2014 v7.pdf
> http://www.anandtech.com/show/9883/gddr5x-standard-jedec-new-gpu-memory-14-gbps
> Vega has 2 stacks of HBM2 at 512 GB/s (~7.3w).  GTX 1080 Ti has 11 chips of GDDR5X at 484 GB/s (~27.5w).  Titan Xp has 12 chips of GDDR5X at 547.7 GB/s (~30w).
> 
> ...



HBCC has shown no improvements with Vega as it sits.


----------



## efikkan (Jul 22, 2017)

cdawall said:


> HBCC has shown no improvements with Vega as it sits.


HBC works kind of similar to the prefetcher in a CPU, it's only able to detect linear access patterns. It will probably work flawlessly for some compute workloads, but it wouldn't do anything for random access patterns, which are typical for games.

And BTW, caching is about hiding latency, not increasing performance.


----------



## cdawall (Jul 22, 2017)

efikkan said:


> HBC works kind of similar to the prefetcher in a CPU, it's only able to detect linear access patterns. It will probably work flawlessly for some compute workloads, but it wouldn't do anything for random access patterns, which are typical for games.
> 
> And BTW, caching is about hiding latency, not increasing performance.



By decreasing latency it should increase performance. I mean that's how every single thing works. Maybe AMD just has that cool magic they keep hyping us about.

The latency in compute performance is also proving to be inconsistent as hell.


----------



## FordGT90Concept (Jul 22, 2017)

And that could be the root of it.  Perhaps there was a major hardware flaw in the HBCC that triggered erroneous cache misses.  They got a pile of cards ready to ship so they sell them under a limited edition moniker.  Fix the bug in another silicon revision and performance could improve immensely.  We just don't know until the final product is available.

Decreasing latency mitigates how long it takes for the GPU to move forward when a cache miss occurs.  This doesn't increase framerate (unless it is happening regularly) but it increases minimum FPS.

My point is that 13 TFLOPS of compute power shouldn't result in as low of a framerate as it gets.  There's clearly something wrong in the Frontier Edition and AMD knows it.


----------



## efikkan (Jul 22, 2017)

cdawall said:


> By decreasing latency it should increase performance.


You didn't get it. The point of caching is to make a small, fast storage pool act as if it were a larger one, by using it as a buffer in front of a bigger, slower pool.

So with Vega having 8 GB of HBM plus caching, it will only act as if the memory pool were larger; caching will never give you more performance than actually having the larger pool.



FordGT90Concept said:


> And that could be the root of it.  Perhaps there was a major hardware flaw in the HBCC that triggered erroneous cache misses.  They got a pile of cards ready to ship so they sell them under a limited edition moniker.  Fix the bug in another silicon revision and performance could improve immensely.  We just don't know until the final product is available.
> 
> Decreasing latency mitigates how long it takes for the GPU to move forward when a cache miss occurs.  This doesn't increase framerate (unless it is happening regularly) but it increases minimum FPS.


I seriously doubt it. A cache miss from GPU memory over the PCIe bus to system memory will get close to a millisecond, while a cache miss for CPU to its memory is a little over 50 ns (~200-250 clocks wasted for Kaby Lake). Such cache misses for HBC would not only result in stutter, but a completely unplayable game.
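The scale of the numbers claimed above is worth spelling out. Assuming the figures as stated (a ~1 ms miss cost over PCIe, ~50 ns for a CPU DRAM access), a handful of such misses per frame would consume the entire frame budget:

```python
# Rough arithmetic behind the point above. All numbers are the ones
# claimed in the discussion, used illustratively.

frame_budget_ms = 1000 / 60          # ~16.7 ms per frame at 60 FPS
miss_cost_ms    = 1.0                # claimed PCIe round-trip miss cost
cpu_miss_ns     = 50                 # typical CPU DRAM access latency

misses_to_blow_budget = frame_budget_ms / miss_cost_ms
print(round(misses_to_blow_budget))       # 17 misses eat the whole frame

print(miss_cost_ms * 1e6 / cpu_miss_ns)   # 20000.0 -> ~4 orders of magnitude
```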



FordGT90Concept said:


> My point is that 13 TFLOPS of compute power shouldn't result in as low of a framerate as it gets.  There's clearly something wrong in the Frontier Edition and AMD knows it.


That's nothing new. Fury X had like ~53% more Flop/s than GTX 980 Ti, so things have been "wrong" for a while.

And BTW, back in the day we measured Flop/s at base clock, then at typical boost, and now AMD quotes max boost. So you wouldn't even hit 13.1 TFlop/s unless you increase the power limit to make it stay at 1600 MHz. We should really call it an 11.3 TFlop/s card, based on the rated typical boost clock.
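The Flop/s figures in this exchange follow from shader count × clock × 2 (one fused multiply-add counts as two floating-point operations). A quick check, assuming Vega 10's 4096 shaders with a 1600 MHz max boost and ~1382 MHz typical clock:

```python
# Peak single-precision throughput: shaders x clock x 2 ops (FMA).
# Shader count and clocks are assumed Vega 10 figures.

def tflops(shaders, clock_mhz):
    return shaders * clock_mhz * 1e6 * 2 / 1e12

print(round(tflops(4096, 1600), 1))  # 13.1  (quoted max-boost figure)
print(round(tflops(4096, 1382), 1))  # 11.3  (typical clock figure)
```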


----------



## FordGT90Concept (Jul 22, 2017)

efikkan said:


> That's nothing new. Fury X had like ~53% more Flop/s than GTX 980 Ti, so things have been "wrong" for a while.


Sad but true.  Fiji had serious underutilization problems, even at 4K.


----------



## cdawall (Jul 22, 2017)

efikkan said:


> You didn't get it. The point of caching is to utilize a small fast storage pool as if it was a larger fast storage pool, by using a small fast pool as a buffer for the large one.
> 
> So by Vega having 8 GB HBM with caching, it will only act as if the memory pool was larger, caching will never give you more performance than having the larger pool



Bullshit. Lower-latency memory access will always give better performance; that is the entire point.


----------



## RejZoR (Jul 22, 2017)

efikkan said:


> What precisely are you talking about here? Is it the tiled rasterization again? Do you even know what it is? No such feature is implemented in a driver; it's a hardware scheduling feature.
> 
> 
> They've had working hardware to test since last November, even demonstrated it working in December. That's about nine months of polishing the driver, which is more than they usually need. And it's not even a new architecture.
> ...



Just because something is done in hardware doesn't mean it just magically works. The driver needs to be aware of the thing. And the "it's not even a new architecture" line is getting very old. Everyone keeps parroting that it's not a new architecture, yet they changed nearly everything in the core. Just because the fundamentals haven't changed much doesn't mean you can slap a Fury X driver on it and have it magically work. AMD has never used tiled rasterizing before, and like I already said, the feature, while implemented in hardware, needs driver awareness. Pixel shaders have always been a hardware feature too, but they wouldn't work if they weren't exposed correctly in drivers. And there are probably many things working underneath that AMD hasn't even mentioned.

With the R9 200 and R9 300 series they could literally slap a renamed driver on top, because they had already developed drivers for that hardware in the past. RX Vega, with all its changes, doesn't have that luxury. And 9 months is really not a lot of time. It may sound like a lot, but it isn't when RTG is most likely understaffed and underfunded. That's another AMD problem, but it's a reality we're all aware of and is no secret. It's something neither you nor I can change; that's entirely up to AMD.


----------



## notb (Jul 22, 2017)

RejZoR said:


> Yeah, well, it's kinda very hard being a f**king AMD fanboy if you don't own a single fucking thing from them, don't you think?


There's no contradiction.

It seems a lot of your brain is in "AMD rulez" mode.
But then you go to a shop, you reach for your wallet... and you buy NVIDIA/Intel. That's where the (usually shouted-down) sensible part of your brain makes the financial decisions. And this is a good sign.

I'm a Mazda MX-5 fanboy and I don't own one. I might never do.
I bought a new Toyota in January, and for more or less the same money I could have bought a used MX-5 from a few years back.
And I have to admit: every time I see an MX-5 I wish I had one. But I made a sensible choice. And I'm really glad I did.


RejZoR said:


> It's like they don't get it that while core is the same, Vega FE was released with different tasks in mind, meaning drivers could in fact be far more primitive and "half baked" and still work for what it was meant. As it's evident from tests where they actually tested shit that aren't games.


Problem is: the "different tasks in mind" is a theory you are popularizing. AMD said this card is - among other things - designed for creating, testing and optimizing games.
If this card is aimed at game developers, shouldn't it be the fastest Vega available? How will a Vega FE user be able to test a game that maxes RX Vega out, if he can't run it?


RejZoR said:


> Good f**king question, don't you think? It's you people who accuse me of being a massive AMD fanboy. Hard to hold that narrative against someone with Intel CPU and NVIDIA GPU, isn't it?


Not at all. Still, this isn't an answer to my question.
Why don't you buy some AMD gear if you like it so much?


----------



## efikkan (Jul 22, 2017)

cdawall said:


> Bullshit lower latency memory access will always give better performance that is the entire point


No, you still don't get what caching does.
If you have two comparable GPUs, where GPU A has 12 GB and GPU B has 8 GB plus caching, the caching will try to make up for the missing memory in GPU B. Whenever you need less than 8 GB, there will be no difference, and when you need more, GPU B will perform up to the level of GPU A, never above it. Your confusion is about what to compare it to: HBC will not have lower latency than other GPU memory, only lower latency than falling back to system memory.



RejZoR said:


> Just because something is done in hardware, it doesn't mean it just magically works. The driver needs to be aware of the thing.


The driver is aware of the hardware capabilities, but it does not micro-manage low-level scheduling inside the GPU, that is controlled on the GPU side. Tiled rasterization is not a new unit with a new feature set to expose through an API, it's a reordering of operations inside the GPU.



RejZoR said:


> And the "it's not even a new architecture" is getting very old. Everyone parroting how it's not a new architecture and yet they changed nearly everything in the core. … And 9 months is really not a lot of time.


It has been enough in the past, and it's not like they start from scratch when the working chips arrive. Remember, they did demo it working in late December. Well at least this time with all the delays, the driver should be ~2.5 months more mature than the drivers of Polaris and Fiji at their respective releases.

But this boils down to what we've heard for every single generation from AMD over the last five years: at release, AMD fans say we can't judge it because the drivers are immature. Yet they somehow "know" it will improve, we only need to give it more time, but no substantial improvement ever materializes.


----------



## rtwjunkie (Jul 22, 2017)

efikkan said:


> So which game features have AMD intensionally disabled on the card they call optimized for gaming?


Ok, now you are becoming as much Court Jester as he is. They never said Frontier Edition is optimized for gaming. You are reading into it. They say it is optimized for every stage of the game production process. 

As I've pointed out before, and as you have ignored, game designers don't have to be able to play a game at top-level quality to give you that to play.


----------



## RejZoR (Jul 22, 2017)

@effikan
Sure, then explain to me what they are doing this whole extra month? Drinking booze and laughing? If they knew this is what they have, they'd just release it. It would be of less of an embarrassment than delaying it for whole month and then releasing the exact same thing as we've already seen with Vega FE, just with half the memory. AMD made some questionable decisions in the past, but they aren't that dumb, you can be assured of that. If I'm aware of those things, someone paid several times as much as I am sure as hell knows that. But entire computer world seems to be entirely oblivious to those tiny facts. If everything was where it should have been, they'd release entire Vega range back then and call it a day. Even if availability would actually come later if HBM2 production is the real issue. But whatever, apparently thinking logical isn't what people are expected to do over here anymore... You can do all the math and power draw and whatever, but tell me, this aspect doesn't strike you as very odd?

People were wondering what was up when the RX 480 was the fastest thing they offered even though it was really just mid-range. It didn't really bother people that there wasn't any top end; the user base simply adapted to the offerings. If AMD pulled the same thing with RX Vega, releasing it as a GTX 1080 competitor with an aggressive pricing scheme, people would be all over it even if it wasn't king of the hill. And yet they aren't doing that either. So clearly something is going on, because otherwise they could've done all of this long ago.


----------



## efikkan (Jul 22, 2017)

rtwjunkie said:


> Ok, now you are becoming as much Court Jester as he is. They never said Frontier Edition is optimized for gaming. You are reading into it. They say it is optimized for every stage of the game production process.


I'm just going to refer to AMD once again, it's even designed for "playtesting" and "performance optimization".



It can't get any clearer than that. Anyone failing to understand that is having trouble with fundamental logic.



RejZoR said:


> Sure, then explain to me what they are doing this whole extra month? Drinking booze and laughing?


Please try to stay serious.
They are stockpiling cards for the launch.


----------



## RejZoR (Jul 22, 2017)

Because paper launches haven't been invented yet apparently...


----------



## rtwjunkie (Jul 22, 2017)

efikkan said:


> I'm just going to refer to AMD once again, it's even designed for "playtesting" and "performance optimization".


As I said, logical thought is something you need to strive for.  Playtesting by design team members is NOT gaming.  Playtesting is checking bugs and the implementation of new things in the program as they go along in production.

It is not actual gaming as either us as consumers will do, or quality control testers will do when production is near finished.


----------



## cdawall (Jul 22, 2017)

efikkan said:


> No, you still don't get what caching does.
> If you have two comparable GPUs, GPU A have 12 GB, and GPU B have 8 GB + caching, the caching will try to weigh up for the missing memory in GPU B. Whenever you need less than 8 GB, there will be no difference, and when you need more GPU B will perform up to the level of GPU A, never above it. Your confusion is what to compare it to. HBC will not have lower latency than other GPU memory, only lower latency than falling back to system memory.



So it performs up to the same level as a 12 GB GPU while only having 8 GB? That would be a performance increase over a standard 8 GB card, no? Hence the whole reason AMD has done it.


----------



## efikkan (Jul 22, 2017)

rtwjunkie said:


> As I said, logical thought is something you need to strive for.  Design team members play testing is NOT gaming.  Playtesting is checking bugs and implementation of new things into the program as they go along in production.
> 
> It is not actual gaming as either us as consumers will do, or quality control testers will do when production is near finished.


It's amazing when people fail to understand plain text. This has got to be the most ridiculous thing I've heard this year.

Anyone familiar with development knows you do performance optimization on hardware representative of what the end user will run. If Vega FE lacks gaming features of RX Vega, then it's completely useless for performance optimization, which AMD _claims_ is its intended use.


----------



## londiste (Jul 22, 2017)

hbcc seems utterly useless in gaming use cases. caching is all nice and dandy but in case of vega, that cache is vram. it will not help performance if there is enough and there will still be latency issues if there is not enough. optimizing for cleaning up allocated but not used swaths of memory will undoubtedly cause its own fair share of issues in addition to this needing to be written for on software side (not very likely with this being a feature only in high end even on amd side of things).

however, there are excellent use cases for hbcc when it comes to compute, especially with the memory access/addressing improvements of last few generations that will be able to make use of system (and other bits of) memory in a single pool.


----------



## R0H1T (Jul 22, 2017)

londiste said:


> hbcc seems utterly useless in gaming use cases. caching is all nice and dandy but in case of vega, that cache is vram. it will not help performance if there is enough and there will still be latency issues if there is not enough. optimizing for cleaning up allocated but not used swaths of memory will undoubtedly cause its own fair share of issues in addition to this needing to be written for on software side (not very likely with this being a feature only in high end even on amd side of things).
> 
> however, there are excellent use cases for hbcc when it comes to compute, especially with the memory access/addressing improvements of last few generations that will be able to make use of system (and other bits of) memory in a single pool.


HBCC is mostly an enterprise feature, I would've thought that AMD might've included SLC cache or something for Vega to make use of HBCC.
It's that or I'm reading HBCC wrong /:


----------



## notb (Jul 22, 2017)

RejZoR said:


> @effikan
> Sure, then explain to me what they are doing this whole extra month? Drinking booze and laughing?


Let's hope they are designing something good. 
As for the launch... in the optimistic variant: building inventory, shipping, preparing benchmarks for the launch even. In pessimistic one: waiting for HBM2 supply...


> If they knew this is what they have, they'd just release it.


Maybe they're hoping for a miracle?
They must have known already, since they've continued developing this card. They must have noticed months ago how the power draw would look for the performance they were aiming at.
As has been said already: a dual RX 480 could be better. AMD's board or shareholders wanted a Vega release on time (to show the architecture actually works) and a launch of the gaming model with solid inventory for preorders (like they did with Ryzen).


> If everything was where it should have been, they'd release entire Vega range back then and call it a day. Even if availability would actually come later if HBM2 production is the real issue. But whatever, apparently thinking logical isn't what people are expected to do over here anymore... You can do all the math and power draw and whatever, but tell me, this aspect doesn't strike you as very odd?


That could be just about accomplishing targets.

> People were wondering what's up when RX480 was the fastest thing they offered and was really just a mid range. It didn't really bother people that there isn't any top end. The user base simply adapted to the offerings. If AMD pulled the same thing with RX Vega, release it as GTX 1080 competitor with engaging pricing scheme, people would be all over it even if it wasn't king of the hill. And yet they aren't doing that either. So, clearly something is going on. because otherwise, they could've done all of it long ago.



R0H1T said:


> HBCC is mostly an enterprise feature, I would've thought that AMD might've included SLC cache or something for Vega to make use of HBCC.
> It's that or I'm reading HBCC wrong /:


Vega FE is not exactly an enterprise-grade product. Sure, it'll be used in some workstations, but the general corporate audience would prefer something more FirePro-ish.
This would mean that Vega FE is in fact not a product; it's just a showcase of technologies that AMD has and can use in future products. I wouldn't be shocked - this kind of launch happens quite often.


----------



## londiste (Jul 23, 2017)

R0H1T said:


> HBCC is mostly an enterprise feature, I would've thought that AMD might've included SLC cache or something for Vega to make use of HBCC.
> It's that or I'm reading HBCC wrong /:


slc cache would not be helpful. in this case where vram itself acts as cache, next level of memory is system ram over pci-e. this is much faster than slc cache.


----------



## R0H1T (Jul 23, 2017)

londiste said:


> slc cache would not be helpful. in this case where vram itself acts as cache, *next level of memory is system ram* over pci-e. this is much faster than slc cache.


Yes it is, but unless Vega reserves a part of system RAM for cache, like PrimoCache does, I don't see how HBCC would work more efficiently with it than, say, a dedicated pool of memory or storage for caching the program (or game) and the various chunks of data it'll work on, like content creation or editing. Also, I haven't seen HBCC in action, or a thorough review, so I'm waiting to see how it actually works.


----------



## londiste (Jul 24, 2017)

R0H1T said:


> Yes it is but unless Vega reserves a part of System RAM for cache, like primocache, I don;t see how HBCC would work more efficiently with it than say a dedicated pool of memory or storage for caching the program (or games) & the various chunks of data it'll work on, like content creation or editing. Also I haven't seen HBCC in action, or a thorough review, so I;m waiting to see how it actually works.


gpus have been able to access system ram for a while now. as far as i understand the idea behind hbcc is not that system ram is used as a cache but vram is used as cache for everything further away - system ram, vram on other vega cards (either of them could be over pci-e or infinity fabric), storage of any kind etc. that has definite benefits for compute purposes - for example storage arrays being used for data and only necessary bits taken into vram at a time. data streaming has been doable for a while but this should make it much more seamless.


----------



## gamerman (Aug 23, 2017)

hurraayyy AMD support team...

how about that 126w


----------

