# AMD RX Vega 56 Benchmarks Leaked - An (Unverified) GTX 1070 Killer



## Raevenlord (Aug 3, 2017)

TweakTown has put forth an article wherein they claim to have received info from industry insiders regarding the upcoming Vega 56's performance. Remember that Vega 56 is the slightly cut-down version of the flagship Vega 64, packing 56 next-generation compute units (NGCUs) instead of Vega 64's, well, 64. This means that while the Vega 64 has the full complement of 4,096 Stream processors, 256 TMUs, 64 ROPs, and a 2048-bit wide 8 GB HBM2 memory pool offering 484 GB/s of bandwidth, Vega 56 makes do with 3,584 Stream processors, 224 TMUs, 64 ROPs, the same 8 GB of HBM2 memory, and slightly lower memory bandwidth at 410 GB/s.
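The shader and texture-unit totals follow directly from the CU counts, since GCN-style designs pack 64 stream processors and 4 TMUs per compute unit. A quick sanity check (a sketch; `gpu_resources` is a name of my own choosing, not an AMD API):

```python
# Per-CU resources for GCN/NGCU-based GPUs: 64 stream processors and
# 4 texture mapping units (TMUs) per compute unit.
def gpu_resources(compute_units: int) -> dict:
    return {
        "stream_processors": compute_units * 64,
        "tmus": compute_units * 4,
    }

for name, cus in (("Vega 64", 64), ("Vega 56", 56)):
    r = gpu_resources(cus)
    print(f"{name}: {r['stream_processors']} SPs, {r['tmus']} TMUs")
# Vega 64: 4096 SPs, 256 TMUs
# Vega 56: 3584 SPs, 224 TMUs
```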

The Vega 56 has been announced to retail for about $399, or $499 with one of AMD's new (famous or infamous, depending on your mileage) Radeon Packs. The RX Vega 56 card was running on a system configured with an Intel Core i7-7700K @ 4.2 GHz, 16 GB of DDR4-3000 RAM, and Windows 10, at a 2560 x 1440 resolution.






The results in a number of popular games were as follows:

- Battlefield 1 (Ultra settings): 95.4 FPS (GTX 1070: 72.2 FPS; 32% in favor of Vega 56)
- Civilization 6 (Ultra settings, 4x MSAA): 85.1 FPS (GTX 1070: 72.2 FPS; 18% in favor of Vega 56)
- DOOM (Ultra settings, 8x TSAA): 101.2 FPS (GTX 1070: 84.6 FPS; 20% in favor of Vega 56)
- Call of Duty: Infinite Warfare (High preset): 99.9 FPS (GTX 1070: 92.1 FPS; 8% in favor of Vega 56)
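The quoted leads are simple ratios of the leaked FPS figures, and can be reproduced directly (the numbers are the leak's, not independently verified):

```python
# Percentage lead of Vega 56 over the GTX 1070, computed from the leaked FPS.
results = {
    "Battlefield 1": (95.4, 72.2),
    "Civilization 6": (85.1, 72.2),
    "DOOM": (101.2, 84.6),
    "CoD: Infinite Warfare": (99.9, 92.1),
}

for game, (vega56_fps, gtx1070_fps) in results.items():
    lead = (vega56_fps / gtx1070_fps - 1) * 100
    print(f"{game}: +{lead:.1f}% for Vega 56")
# Battlefield 1: +32.1% for Vega 56
# Civilization 6: +17.9% for Vega 56
# DOOM: +19.6% for Vega 56
# CoD: Infinite Warfare: +8.5% for Vega 56
```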

If these numbers ring true, NVIDIA's GTX 1070, whose average pricing stands at around $460, will have a much-reduced value proposition compared to the RX Vega 56. The AMD contender (which did arrive a year after NVIDIA's Pascal-based cards) delivers around 20% better performance (at least in the admittedly sparse games line-up), while costing around 15% less in greenbacks. Coupled with the lower cost of entry for a FreeSync monitor, and the possibility for users to get even more value out of a particular Radeon Pack they're eyeing, this could potentially be a killer deal. However, I'd recommend you wait for independent, confirmed benchmarks and reviews in controlled environments. I dare say you won't need to look much further than your favorite tech site on the internet for that, when the time comes.

*View at TechPowerUp Main Site*


----------



## RejZoR (Aug 3, 2017)

Just posted this in the VEGA discussion thread a few seconds before this one 

Sure, it's still a 210W TDP card, but people don't realize this is the max. If you fire up Radeon Chill, you'll drop consumption dramatically; it usually halves it. And if you use Enhanced Sync, it'll be locked to the screen refresh rate, which means consumption will again drop, maybe not as significantly as with Chill, but still worth mentioning considering how everyone scares buyers with the max TDP numbers...

Battlefield and Doom numbers are pretty significant.


----------



## Raevenlord (Aug 3, 2017)

RejZoR said:


> Just posted this in VEGA discussion thread few seconds before this one
> 
> Sure it's still a 210W TDP card, but people don't realize this is max. If you fire up Radeon Chill, you'll drop consumption dramatically. It usually halves the consumption. And if you use Enhanced Sync, it'll be locked to screen refresh. Which means consumption will again drop, maybe not as significantly as with Chill, but still worth mentioning considering how everyone scares buyers with the max TDP numbers...
> 
> Battlefield and Doom numbers are pretty significant.



The community never sleeps, but us editors need to


----------



## londiste (Aug 3, 2017)

RejZoR said:


> Just posted this in VEGA discussion thread few seconds before this one
> 
> Sure it's still a 210W TDP card, but people don't realize this is max. If you fire up Radeon Chill, you'll drop consumption dramatically. It usually halves the consumption. And if you use Enhanced Sync, it'll be locked to screen refresh. Which means consumption will again drop, maybe not as significantly as with Chill, but still worth mentioning considering how everyone scares buyers with the max TDP numbers...
> 
> Battlefield and Doom numbers are pretty significant.


enhanced sync will use as much tdp as it can, always.
chill is not a perfect solution and won't really drop consumption that much for active gaming. plus, it still works off a whitelist of games.

the numbers sound about right but should fit into 'trading blows with the 1070' well enough. all the listed games have a tendency to run noticeably better on amd hardware.


----------



## ViperXTR (Aug 3, 2017)

those are some pretty nice numbers


----------



## HD64G (Aug 3, 2017)

When the NDA ends on Vegas?


----------



## uuuaaaaaa (Aug 3, 2017)

Could this mean that the DSBR is working in the review driver?


----------



## Assimilator (Aug 3, 2017)

Raevenlord said:


> If these numbers ring true, this means NVIDIA's GTX 1070, whose *medium* pricing stands at around $460, will have a much reduced value proposition compared to the RX Vega 56.



Do you even proofread? The word you're looking for is "median".

And you're also completely ignoring that GTX 1070's MSRP is $349, i.e. $50 less than Vega 56, which is extremely fair considering the (supposed) relative performance of these cards - in fact I'd say the 1070 still wins on price/performance if these Vega 56 numbers are truthful. So calling Vega 56 a "GTX 1070 killer" is laughable.
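The price/performance argument above is just arithmetic, and it hinges entirely on which price and which lead you plug in. A quick sketch (the 1.2x figure is the leak's ~20% average lead and the prices are the ones quoted in this thread; all of these are assumptions, not measurements):

```python
# Relative performance per dollar, normalizing the GTX 1070 to 1.0 performance
# and Vega 56 to 1.2 (the ~20% average lead claimed by the leak).
def perf_per_dollar(perf: float, price: float) -> float:
    return perf / price

gtx1070_msrp   = perf_per_dollar(1.0, 349)  # launch MSRP
gtx1070_street = perf_per_dollar(1.0, 460)  # inflated street price
vega56         = perf_per_dollar(1.2, 399)  # leaked perf at announced price

print(f"GTX 1070 @ MSRP:   {gtx1070_msrp:.5f} perf/$")    # 0.00287
print(f"GTX 1070 @ street: {gtx1070_street:.5f} perf/$")  # 0.00217
print(f"Vega 56 @ $399:    {vega56:.5f} perf/$")          # 0.00301
```

Notably, at the leaked ~20% lead Vega 56 edges out even the MSRP-priced 1070 on this metric, while at a ~10% lead the MSRP 1070 comes out ahead, so the disagreement in this thread really turns on which numbers you believe.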

The price of GTX 1070 cards has only been pushed up because of the cryptomining BS. Vega 56 is unlikely to offer a better hashrate-per-watt than GTX 1070, which means GTX 1070 prices will stay high and they will continue to be bought in volume by miners, whereas Vega 56 will be bought in much smaller quantities by gamers. So NVIDIA still wins in terms of the pure numbers game, and therefore in revenue.

You can't blame NVIDIA that they made a great card that is in such high demand, regardless of the reason, that it commands a nearly 25% price premium over its MSRP.


----------



## Liviu Cojocaru (Aug 3, 2017)

HD64G said:


> When the NDA ends on Vegas?


I think it's the 14th of August

If this information is true then I like the Vega 56, we'll see


----------



## Mr.Newss (Aug 3, 2017)

londiste said:


> all the listed games have a tendency of running noticeably better on amd hardware.



how about CoD IW? I'm pretty sure this game doesn't lean toward either the Green or the Red team.

since right now I have a sexy Sapphire HD 7950 and my system is a good fit for it, I really don't care about the Vega 56's consumption.


----------



## RejZoR (Aug 3, 2017)

londiste said:


> enhanced sync will use as much tdp as it can, always.
> chill is not a perfect solution and won't really drop consumption that much for active gaming. plus, it is still working with a whitelist of games.
> 
> the numbers sound about right but should fit into 'trading blows with 1070' well enough. all the listed games have a tendency of running noticeably better on amd hardware.



Actually, that's not true, if Enhanced Sync works anything like Fast V-Sync. I thought it ran at unlimited framerate, but it doesn't. At 144 Hz, it stops at 144 fps. It's just dropping frames that are out of sync and beyond the refresh rate.


----------



## RejZoR (Aug 3, 2017)

Assimilator said:


> Do you even proofread? The word you're looking for is "median".
> 
> And you're also completely ignoring that GTX 1070's MSRP is $349, i.e. $50 less than Vega 56, which is extremely fair considering the (supposed) relative performance of these cards - in fact I'd say the 1070 still wins on price/performance if these Vega 56 numbers are truthful. So calling Vega 56 a "GTX 1070 killer" is laughable.
> 
> ...



GTX 1070 for 350 ahaha. Good luck finding one. Especially in Europe. Cheapest I could find was 413€. Zotac Mini. So, no, it's not cheaper...


----------



## londiste (Aug 3, 2017)

RejZoR said:


> Actually, that's not true. If Enhanced sync works anything like Fast V-Sync. I thought it's unlimited framerate, but it's not. At 144Hz, it stops at 144fps. It's just dropping frames that are out of sync and beyond refresh rate.


the premise should be exactly the same as fastsync. it shows frames at the screen's refresh rate but keeps rendering frames as fast as the card possibly can.


----------



## GreiverBlade (Aug 3, 2017)

oh that's great, and if true, just as i expected ... 

as i have a GTX 1070 .... (more 526$ for me than 460 .... when i bought it ) i suspected that a Vega 56 would be a slight upgrade tho, as i can sell my 1070 close to the initial price .... thanks Nvidia, i can probably plan on a Vega 64 ... 

ok it eats some more but ... well not that much important to me 


too bad Vega pack will probably not be available where i live


----------



## renz496 (Aug 3, 2017)

Mr.Newss said:


> how about CoD IW? I'm pretty sure this game doesn't have any tendency towards both Green and Red teams.
> 
> since right now I have a sexy Sapphire HD 7950 and my system became fit with that, I really don't care about the Vega 56's consumption.



that game is clearly favoring AMD hardware more, by a large margin too compared to the respective nvidia card. just look at how the RX580 is competitive with nvidia's reference GTX 980 Ti at all resolutions:

https://www.techpowerup.com/reviews/Sapphire/RX_580_Nitro_Plus/10.html


----------



## Crap Daddy (Aug 3, 2017)

One just has to check TPU's reviews and other trusted reviews to see that these numbers for the 1070 are all over the place. Until a proper review of Vega 56 is out, these are meaningless. What we do know is that this Vega is competing with the 1070 at an MSRP of $400, so it had better be better.


----------



## renz496 (Aug 3, 2017)

RejZoR said:


> Actually, that's not true. If Enhanced sync works anything like Fast V-Sync. I thought it's unlimited framerate, but it's not. At 144Hz, it stops at 144fps. It's just dropping frames that are out of sync and beyond refresh rate.



the frame rate will be capped to the maximum monitor refresh rate, but the GPU will still work as hard as it can, similar to v-sync being turned off. at least that's how i remember it.


----------



## uuuaaaaaa (Aug 3, 2017)

renz496 said:


> that game is clearly favoring AMD hardware more. by a large margin too compared to respective nvidia card. just look how RX580 is competitive to nvidia reference GTX980ti in all resolution:
> 
> https://www.techpowerup.com/reviews/Sapphire/RX_580_Nitro_Plus/10.html



The RX 580 is competitive with the Fury X too, heck it beats it at 1080p.


----------



## Liviu Cojocaru (Aug 3, 2017)

GreiverBlade said:


> oh that's great, and if true, just as i expected ...
> 
> as i have a GTX 1070 .... (more 526$ for me than 460 .... when i bought it ) i suspected that a Vega 56 would be a slight upgrade tho, as i can sell my 1070 close to the initial price .... thanks Nvidia, i can probably plan on a Vega 64 ...
> 
> ...


I think that when this is released, the price of the 1070 will decrease as well


----------



## renz496 (Aug 3, 2017)

RejZoR said:


> GTX 1070 for 350 ahaha. Good luck finding one. Especially in Europe. Cheapest I could find was 413€. Zotac Mini. So, no, it's not cheaper...



the 1070 price has been inflated due to miners flocking to the 1060 and 1070 now that there are no AMD Polaris GPUs left on the market for them to buy. but at one point (i think it was a week or two before prices started going up) the 1070 could be had for as low as $330 at newegg. i was very surprised to see that back then. if the mining craze had not happened, maybe we would be seeing a lot of deals on the 1070 below the $400 mark right now.


----------



## renz496 (Aug 3, 2017)

uuuaaaaaa said:


> The RX 580 is competitive with the Fury X too, heck it beats it at 1080p.


nah. that's more because of a Fury X design flaw, i think. just look at how fast it was at 4k, even ahead of the 1070. to be honest, it is because of stuff like this that i have difficulty recommending the Fury X over the RX480 to people who have no intention of playing above 1080p. it is no joke when you see a 1060 beating the Fury X at 1080p:


----------



## bug (Aug 3, 2017)

Those are good numbers, but Civ VI and Doom already benchmark favourably even on Polaris, so those were expected wins anyway.
But yes, lower prices for a bit more performance is rarely a bad deal.


----------



## the54thvoid (Aug 3, 2017)

Wait for a respectable review site to do the formal reviews. I can't even use TweakTown on mobile; the ad spamming makes me nauseous.


----------



## uuuaaaaaa (Aug 3, 2017)

renz496 said:


> nah. that's more because of Fury X design flaw i think. just look how fast it was at 4k. even ahead of 1070. to be honest it is because stuff like this i have difficulty to recommend people buying Fury X over RX480 if they have no intention to play above 1080p. it is no joke when you see 1060 beating Fury X at 1080p res:


We were discussing COD: IW, not Anno 2205. Since the RX580 beats both the 980ti and the Fury X at 1080p, wouldn't that make the 980ti flawed too? At 4k the Fury X easily beats the 980ti and the 1070 in COD IW though, and the RX580 is still ahead of the 980ti. These kinds of discussions are almost pointless; in the end it comes down to software and driver optimization. Look at DOOM Vulkan on AMD cards, especially the Fury X and how close it is to the GTX 1080.


----------



## renz496 (Aug 3, 2017)

uuuaaaaaa said:


> We were discussing COD: IW not Anno 2205. Since the RX580 beats the both the 980ti and the Fury X at 1080p, wouldn't that make the 980ti flawed too? At 4k the Fury X easily beats the 980ti and the 1070 in COD IW tho and the RX580 is still ahead of the 980ti. These kind of discussions are almost pointless, in the end it comes to software and driver optimization, look at DOOM vulkan on AMD cards, specially the Fury X and how close it is to the GTX 1080.



i was pointing out that such behavior also exists in other games for the Fury X, since you said the RX580 beats the Fury X in COD: IW at *1080p*. the RX580 beating the 980ti at all resolutions is because the said game favors AMD's architecture more. but realistically, if you look at raw performance alone, the RX580 should not beat the Fury X even at 1080p. if you look at the anno bench at 4k, the Fury X performance is close to the 980ti, but at 1080p the 980ti is significantly ahead, to the point that even a 1060 is capable of beating the fury x at that res. that result is from the RX580 review, and the Fury X has been on the market since 2015. anno 2205 was released by the end of 2015, so i doubt AMD still has no proper optimization for the game in 2017. if anything, that should point to something not being right with Fiji that holds back its performance at lower resolutions.


----------



## RejZoR (Aug 3, 2017)

> AMD has tiny market share...
> 
> ...a lot of games favor Radeon architecture.



Ok, how in bloody hell does that even work? If AMD has a small market share, why would anyone bother specializing their engines to favor AMD? Just pointing out the obvious. You know, maybe AMD is just good at it? Why can't that be a possibility? Why does that only apply when NVIDIA is good at it?


----------



## AndreiD (Aug 3, 2017)

Isn't this pretty bad? Considering all of those games tend to favor AMD?       
It's probably going to end up slightly slower or on par with a 1070 in TPU's summary, while using more power and seemingly costing more based on MSRP.         
Custom 1070s will probably eat Vega 56 alive, this just doesn't look good at all.


----------



## RejZoR (Aug 3, 2017)

Again, how do games favor graphics cards from a vendor with hardly any market share? Or more importantly, why? Which makes me think AMD cards are simply... I don't know... better balanced for workloads that matter?

When NVIDIA is better at something, everyone is raving about NVIDIA's "performance supremacy", but when AMD does it, it's because games favor them. C'mon, and I'm being called an AMD fanboy for pointing out shit like this...


----------



## Vya Domus (Aug 3, 2017)

Just as expected the cut down Vega looks to be the best value.


----------



## cowie (Aug 3, 2017)

xbox PlayStation?

20% more than a 1070 makes it a 1080 beater too


----------



## Fouquin (Aug 3, 2017)

Assimilator said:


> You can't blame NVIDIA that they made a great card that is in such high demand, regardless of the reason, that it commands a nearly 25% price premium over its MSRP.



25% that goes directly to the retailer, not nVidia. I'm sure they're ecstatic that their excellent design is lining the pockets of the companies stocking their shelves with it.


----------



## B-Real (Aug 3, 2017)

AndreiD said:


> Isn't this pretty bad? Considering all of those games tend to favor AMD?
> It's probably going to end up slightly slower or on par with a 1070 in TPU's summary, while using more power and seemingly costing more based on MSRP.
> Custom 1070s will probably eat Vega 56 alive, this just doesn't look good at all.


Actually, you are the first one complaining about these results. Yes, AMD is faster in DX12 BF1, but it isn't 30%+ faster... It's faster in CoD IW, but AMD had nearly a 20% advantage there, and here it has less than 10%. It's faster in Doom under Vulkan. However, Civilization in DX12 is head-to-head (1060 6GB vs RX580 8GB shows a 10% difference in AMD's favor, but compared to the 480 there is a 2 fps difference at 1440p).

So overall, IF these results are true, there won't be a 20% overall difference between the two, more like 10% or at maximum 15%. But given the HBM2 and the promising features (that will be used by AMD-supported titles like Far Cry 5), the Vega 56 looks charming.

I was wondering whether AMD was trolling us and using Vega 56 at the comparison events with the 1080, emphasizing Sync... We will see.


----------



## bug (Aug 3, 2017)

RejZoR said:


> Again, how do games favor graphic cards from a vendor with hardly any market share. Or more importantly, why? Which makes me think AMD cards are simply... I don't know... better balanced for workloads that matter?
> 
> When NVIDIA is better at something, everyone is raving about NVIDIA's "performance supremacy", but when AMD does it, it's because games favor them. C'mon, and I'm being called an AMD fanboy for pointing out shit like this...


Are we pretending sponsored titles do not exist now?


----------



## oxidized (Aug 3, 2017)

londiste said:


> enhanced sync will use as much tdp as it can, always.
> chill is not a perfect solution and won't really drop consumption that much for active gaming. plus, it is still working with a whitelist of games.
> 
> the numbers sound about right but should fit into 'trading blows with 1070' well enough. all the listed games have a tendency of running noticeably better on amd hardware.



Don't even bother trying to discredit something coming from AMD with this guy


----------



## Vya Domus (Aug 3, 2017)

Fast Sync and Enhanced Sync work in pretty much the same way. All frames are rendered but only a portion of them are shown, which means TDP isn't reduced. The only feature that can reduce TDP is Radeon Chill.
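In pseudocode terms, the behaviour described above looks something like this (an illustrative simplification, not actual driver code; `frames_displayed` is a made-up name):

```python
# Fast Sync / Enhanced Sync in a nutshell: the GPU renders into a pool of
# buffers as fast as it can; at each display refresh the most recently
# completed frame is scanned out and older pending frames are discarded.
def frames_displayed(rendered_fps: int, refresh_hz: int) -> tuple:
    """Return (shown, dropped) frames per second when rendering is uncapped."""
    shown = min(rendered_fps, refresh_hz)
    dropped = max(rendered_fps - refresh_hz, 0)
    return shown, dropped

# Rendering 240 fps on a 144 Hz panel: 144 frames shown, 96 discarded -
# but the GPU still did all 240 frames' worth of work (hence full power draw).
print(frames_displayed(240, 144))
```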


----------



## Recus (Aug 3, 2017)

But it's catching GTX 1080.

Plot twist: it's Vega 64 but TweakTown can't stop hype.


----------



## Vya Domus (Aug 3, 2017)

Fury was literally a stone's throw away from Fury X; it wouldn't surprise me at all if the story is the same with Vega.


----------



## Sempron Guy (Aug 3, 2017)

where in the world can I get a $350 GTX 1070 these days?



> Fast Sync and Enhanced Sync work in pretty much the same way. All frames are rendered but only a portion of them are shown, which means TDP isn't reduced. The only feature that can reduce TDP is Radeon Chill.



frame limiter works for me


----------



## londiste (Aug 3, 2017)

Sempron Guy said:


> frame limiter works for me


with a frame limiter you are negating the entire point of enhanced sync.


----------



## Sempron Guy (Aug 3, 2017)

londiste said:


> with frame limiter you are negating the entire point on enhanced sync.



exactly, been using frame limiter + freesync ever since, no reason to use enhanced sync. As I said that combo works for me and I had no complaints.


----------



## Vya Domus (Aug 3, 2017)

Sempron Guy said:


> exactly, been using frame limiter + freesync ever since, no reason to use enhanced sync. As I said that combo works for me and I had no complaints.



Isn't it unnecessary to use a frame limiter with freesync? I mean, framerates won't go over the maximum refresh rate anyway. Unless you want a lower framerate for some reason.


----------



## Darmok N Jalad (Aug 3, 2017)

HD64G said:


> When the NDA ends on Vegas?



I thought what happens in Vegas is always under NDA. 

Sorry, could help it.


----------



## Liviu Cojocaru (Aug 3, 2017)

Darmok N Jalad said:


> I thought what happens in Vegas is always under NDA.
> 
> Sorry, could help it.


*couldn't*


----------



## RejZoR (Aug 3, 2017)

oxidized said:


> Don't even bother trying to discredit something coming from AMD with this guy



Discredit what exactly? Do you think I'm never wrong and that I never admit it? Go F off. Yeah, I simply forgot that dropped frames are still rendered, just not displayed. Shit, look, I've admitted my mistake. It's not the first time nor the last.

Will I be less of an AMD fanboy if I keep on shitting on AMD relentlessly day after day like all of you are doing, which will literally not change ANYTHING, or is simply voting with your wallet not enough? If you hate AMD, then buy NVIDIA and stfu already. If I want to buy RX Vega literally out of curiosity, then call me an AMD fanboy. I'm literally starting to not give a s**t anymore with every passing day of listening to all of you whining little children.

People kept pissing over the R9 Fury X and yet it turned out to be a graphics card that aged just the same as NVIDIA's offerings. In fact it performs better now than NVIDIA's stuff for the most part, justifying the late arrival. If it came later, that's AMD's release schedule. If you don't like it, then buy NVIDIA again. You know, it's not that difficult a concept... but it gets really annoying listening to all of you whining about the same thing day after day and calling me an AMD fanboy day after day just because I'm not shitting all over AMD like all of you are.


----------



## Vya Domus (Aug 3, 2017)

Everyone is biased to a degree; if you can't acknowledge that and think everyone besides you is a fanboy, you are either delusional or simply a troll.

I also find it hilarious when someone gets called a fanboy yet they own products from the opposing camp.

People need to distinguish between being a *fan* and a *fanboy*. Because believe it or not, you can like one company and not be a mindless moron.


----------



## bug (Aug 3, 2017)

RejZoR said:


> Discredit what exactly? Do you think I'm never wrong and that I never admit it? Go F off. Yeah, I automatically forgot that dropped frames are still rendered, just not displayed. Shit, look, I've admitted my mistake. It's not the first time neither the last.
> 
> Will I be less of an AMD fanboy if I keep on shitting on AMD relentlessly day after day like all of you are doing and which will literally not change ANYTHING or is simple voting with wallet not enough? If you hate AMD, then buy NVIDIA and stfu already. If I want to buy RX Vega literally out of curiosity, then call me an AMD fanboy. I'm literally starting to not give a s**t anymore with every passing day of listening to all of you whining little children.
> 
> People kept pissing over R9 Fury X and yet it turned out to be a graphic card that aged just the same as NVIDIA offerings. *In fact it performs better now than does NVIDIA's stuff for the most part*, still jusitfying the late arrival. If it came later, that's AMD's release schedule. If you don't like it, then buy NVIDIA again. You know, it's not that difficult concept... but it gets really annoying listening to all of you whining about the same thing day after day and calling me an AMD fanboy day after day just because I'm not shitting all over AMD like all of you are.



Not really, no: https://www.hardocp.com/article/2017/01/30/amd_video_card_driver_performance_review_fine_wine/3
It gained significantly in a few titles, but that's not the norm.


----------



## Raevenlord (Aug 3, 2017)

Assimilator said:


> Do you even proofread? The word you're looking for is "median".
> 
> And you're also completely ignoring that GTX 1070's MSRP is $349, i.e. $50 less than Vega 56, which is extremely fair considering the (supposed) relative performance of these cards - in fact I'd say the 1070 still wins on price/performance if these Vega 56 numbers are truthful. So calling Vega 56 a "GTX 1070 killer" is laughable.
> 
> ...



I do. Thank you for your correction. And the word I'm looking for isn't median either. It's average. You'll pardon me for not being 100% correct at all times in a non-native language. I'm sure you aren't either - even in your native language.

I don't blame NVIDIA. Where did I blame NVIDIA? What I did was mention current GTX 1070 pricing, which is what users look at. No one will approach this from the point of view of "Man, I'll be buying a GTX 1070 at $460 and it will be offering me much better bang for buck than the $399 Vega 56. I mean, its MSRP is much lower, and that's what matters, right?"

So yeah, I completely ignored that fact. On purpose. Because it doesn't make sense to me.


----------



## the54thvoid (Aug 3, 2017)

Vya Domus said:


> Fury was* literally *a stone's throw away from Fury X , it wouldn't surprise me at all if the story is the same with Vega.



On the shop shelf perhaps.


----------



## Sempron Guy (Aug 3, 2017)

Vya Domus said:


> Isn't it unnecessary to use a frame limiter with freesync , I mean framerates wont go over the maximum refresh rate anyway. Unless you want a lower framerate for some reason.



I couldn't explain it right but I see tearing at 75fps in all the games I tested. Capping it at 74fps does the trick. Though I'm not sure about the explanation behind it.


----------



## Vya Domus (Aug 3, 2017)

the54thvoid said:


> On the shop shelf perhaps.


Care to elaborate ? Don't know what you mean.



Sempron Guy said:


> I couldn't explain it right but I see tearing at 75fps in all the games I tested. Capping it at 74fps does the trick. Though I'm not sure about the explanation behind it.



That's odd.


----------



## oxidized (Aug 3, 2017)

RejZoR said:


> Discredit what exactly? Do you think I'm never wrong and that I never admit it? Go F off. Yeah, I automatically forgot that dropped frames are still rendered, just not displayed. Shit, look, I've admitted my mistake. It's not the first time neither the last.
> 
> Will I be less of an AMD fanboy if I keep on shitting on AMD relentlessly day after day like all of you are doing and which will literally not change ANYTHING or is simple voting with wallet not enough? If you hate AMD, then buy NVIDIA and stfu already. If I want to buy RX Vega literally out of curiosity, then call me an AMD fanboy. I'm literally starting to not give a s**t anymore with every passing day of listening to all of you whining little children.
> 
> People kept pissing over R9 Fury X and yet it turned out to be a graphic card that aged just the same as NVIDIA offerings. In fact it performs better now than does NVIDIA's stuff for the most part, still jusitfying the late arrival. If it came later, that's AMD's release schedule. If you don't like it, then buy NVIDIA again. You know, it's not that difficult concept... but it gets really annoying listening to all of you whining about the same thing day after day and calling me an AMD fanboy day after day just because I'm not shitting all over AMD like all of you are.



Wow, you sound pretty mad, but whatever. Every, and i say EVERY, post i read coming from you is always pro AMD, i swear, i'm not even joking. That's the difference between someone biased and someone neutral; i don't care whose hardware i buy, i just buy whatever is best for my money



Vya Domus said:


> Everyone is biased to a degree , if you can't acknowledged that and think everyone beside you is a fanboy you are either delusional or simply a troll.
> 
> I also find it hilarious when one gets called a fanboy yet they own products from the opposing camp.
> 
> People need to make the difference between being a *fan *and a *fan*boy. Because believe it or not you can like one company and not be a mindless moron.



Wrong, i'm as neutral as one can be. i don't care about brands and stuff, i only buy what i think, and what is said, to be the best thing, and AMD, aside from Ryzen, isn't the best in any way atm, not even with Polaris


----------



## Vya Domus (Aug 3, 2017)

oxidized said:


> Wrong, i'm as neutral as one can be, i don't care about brands and stuff, i only buy what i think and what is said to be the best thing, and AMD, aside from ryzen, isn't best, in any kind of way atm, not even on polaris



Wrong in what way? That comment wasn't aimed at you, it was more of a general thing. If you buy whatever is said to be the best thing, then you let others' bias influence you. That's still bias at work there, you know, just not directly.


----------



## birdie (Aug 3, 2017)

> Coupled with a lower cost of entry for a FreeSync monitor, and the possibility for users to get even more value out of a particular Radeon Pack they're eyeing, this could potentially be a *killer deal*.



So much fanboyism, it's staggering.

Meanwhile the GTX 1070 has a 150W TDP, while the RX Vega 56 won't differ much from its older brother and will consume at least 300W. Some people here don't care about the cost of electricity, true, but I don't know anyone who doesn't care about the raw power dissipation which needs to be removed from their system.

Also, >95% of people around me have neither FreeSync nor G-Sync monitors, so you guys need to slow down a bit. Very few people actually care about tear-free monitors. Instead of a tear-free display, I'd rather buy something based on OLED, which supports true 10/12-bit colors and HDR.


----------



## oxidized (Aug 3, 2017)

Vya Domus said:


> Wrong in what way ? That comment wasn't aimed at you but it was more of a general thing. If you buy whatever is said to be the best thing then you let other's bias influence you. That's still bias at work there you know , just not directly.



I said i buy what i think is best, and what is said to be the best; ofc not said to be the best by one person, i read multiple sites and stuff, and draw my own conclusion.


----------



## Vya Domus (Aug 3, 2017)

oxidized said:


> I said i buy what i think it's best, and what is said to be the best, ofc not said to be the best from a person, i read multiple sites and stuff, and make my conclusion.



Still, one that is 100% neutral wouldn't really get involved in these discussions just to say they are neutral, would they? 

I mean, everyone can just go on review sites, right?


----------



## oxidized (Aug 3, 2017)

Vya Domus said:


> Still , one that is 100% neutral , wouldn't really get involved in these discussions , just to say they are neutral would they ?
> 
> I mean everyone can just go on review sites right ?



Nothing is 100%, that's why i used "as neutral as one can be"

Everyone can just go on review sites, but do they all really?


----------



## Gasaraki (Aug 3, 2017)

Sempron Guy said:


> I couldn't explain it right but I see tearing at 75fps in all the games I tested. Capping it at 74fps does the trick. Though I'm not sure about the explanation behind it.




That is the correct way to fix your "issue". This happens with G-Sync as well: once FreeSync/G-Sync reaches the maximum refresh rate of your monitor (in your case, 75 Hz), it turns off. But your video card can't maintain that frame rate CONSTANTLY, so when frames drop back below it, FreeSync/G-Sync turns back on. This switching off and on causes the tearing you see. Setting the frame cap below your monitor's maximum will keep FreeSync/G-Sync active at all times, so tearing won't occur.
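The rule described above can be sketched as a tiny predicate (a hypothetical simplification of my own; real driver behaviour, including low-framerate compensation below the VRR floor, is more involved):

```python
# Why capping just below the panel's max refresh keeps adaptive sync engaged:
# adaptive sync only operates inside the panel's VRR window, and oscillating
# across the ceiling toggles it on and off, which is where the tearing appears.
# A cap slightly below the ceiling keeps the frame rate inside the window.
def vrr_active(fps: float, vrr_min: float, vrr_max: float) -> bool:
    return vrr_min <= fps < vrr_max

# A 75 Hz FreeSync panel with an assumed 40-75 Hz VRR window:
assert not vrr_active(75, 40, 75)  # at the ceiling: sync disengages
assert vrr_active(74, 40, 75)      # capped at 74 fps: sync stays engaged
```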


----------



## B-Real (Aug 3, 2017)

birdie said:


> So much fanboyism, it's staggering.
> 
> Meanwhile the GTX 1070 has a 150 W TDP, while the RX Vega 56 won't differ much from its older brother and will consume at least 300 W. Some people here don't care about the cost of electricity, true, but I don't know anyone who doesn't care about raw power dissipation, which needs to be removed as heat.
> 
> Also, >95% of people around me have neither FreeSync nor G-Sync monitors, so you guys need to slow down a bit. Very few people actually care about tear-free monitors. Instead of a tear-free display, I'd rather buy something based on OLED, which supports true 10/12-bit color and HDR.


The Vega 56 has a 210 W TDP.
You cannot generalize from your environment's habits alone. There may be at least as many gamers who would like a *Sync monitor as there are gamers who would like an OLED HDR display... Not to mention that FreeSync doesn't really make a monitor much more expensive, unlike G-Sync.


----------



## RejZoR (Aug 3, 2017)

oxidized said:


> Wow, you sound pretty mad, but whatever. Every, and I mean EVERY, post I read coming from you is always pro-AMD, I swear, I'm not even joking. That's the difference between someone biased and someone neutral: I don't care whose hardware I buy, I just buy whatever is best for my money.
> 
> 
> 
> Wrong, I'm as neutral as one can be. I don't care about brands and stuff; I only buy what I think, and what is said to be, the best thing. And AMD, aside from Ryzen, isn't the best in any way at the moment, not even with Polaris.



*NOT* PISSING ON AMD DAY AFTER DAY DOESN'T FREAKING MAKE YOU *PRO* AMD.

It just means I don't see the point in pissing all over it day after day, because no matter how crappy you people think or say RX Vega is, it has its benefits and brings things that will most likely push the whole graphics industry forward. AMD has rarely been the absolute king of the hill, and yet if you look through history, they are the driving force behind many technologies used by everyone: for example tessellation (ATi TruForm), normal map compression (ATi 3Dc), and the Vulkan/DX12 low-level APIs (Mantle). And you can be assured that HBC will be used by everyone in the future. Maybe it won't prove super useful now, but it will certainly lead to yet another innovation pioneered by AMD.

People just love to accuse me of being an AMD fanboy, but they conveniently leave out all the times I say good things about NVIDIA, where I acknowledge their superiority, and the times I correct things that are BS on AMD's end (like the BS scaling of the graphs in the RX 560 "review" here on TPU). Go on, search a bit and you'll see. All these whiners calling me an AMD fanboy will never do that, because it's inconvenient for their narrative that I'm an AMD fanboy. I can remember off the top of my head saying several times that I'd have a hard time ever considering AMD again if they didn't include a Fast Sync-like feature. To my luck, they did (Enhanced Sync). I've also said several times that NVIDIA currently holds undisputed superiority in terms of performance. But whatever. Look it up and you'll see who's full of manure and who isn't.


----------



## Raevenlord (Aug 3, 2017)

birdie said:


> So much fanboyism, it's staggering.
> 
> Meanwhile the GTX 1070 has a 150 W TDP, while the RX Vega 56 won't differ much from its older brother and will consume at least 300 W. Some people here don't care about the cost of electricity, true, but I don't know anyone who doesn't care about raw power dissipation, which needs to be removed as heat.
> 
> Also, >95% of people around me have neither FreeSync nor G-Sync monitors, so you guys need to slow down a bit. Very few people actually care about tear-free monitors. Instead of a tear-free display, I'd rather buy something based on OLED, which supports true 10/12-bit color and HDR.



Yeah, I don't think that word means what you think it means.

A card with 20% better performance in the numbers covered in the article (not representative) and a 15% lower cost than current GTX 1070 pricing is *objectively*, *financially* more attractive than the alternative. Your mileage may vary with power consumption costs, yes, but that is something I know most users rank far lower on the shopping list than performance and retail pricing.

A Radeon Pack at $499 with two games, a $100 discount on a Ryzen/motherboard combo, and a $200 discount on a FreeSync monitor is *objectively better* than $460 pricing for a GTX 1070 for people who are interested in the extra parts. That's why I said it "has the potential to be a killer deal." For those who only want the card, there's a chance it won't be. For those who want more, it *will* be a killer deal.

Sentences like "Instead of a tear-free display, I'd rather buy something based on OLED" fall fully into the subjective realm, which the article didn't even approach. *Objectively, $-wise*, at the quoted prices, the RX Vega 56 is the better deal.
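The dollars-and-cents claim is simple arithmetic on the leaked, unverified figures; a back-of-the-envelope sketch, with the $399/$460 prices and the ~20% performance delta taken from the article:

```python
# Back-of-the-envelope value comparison from the quoted numbers.
# All inputs are leaked benchmarks and street prices, not verified results.

vega56_price = 399.0   # announced standalone MSRP
gtx1070_price = 460.0  # average street price quoted in the article
relative_perf = 1.20   # Vega 56 vs. GTX 1070, per the leaked averages

# Performance per dollar, normalized so the GTX 1070 = 1.0:
value_ratio = relative_perf * (gtx1070_price / vega56_price)
print(round(value_ratio, 2))  # 1.38, i.e. ~38% more performance per dollar
```

If either input moves (street price of Vega after launch, or the performance delta in full reviews), the ratio moves with it, which is exactly why the conclusion is conditional.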


----------



## RejZoR (Aug 3, 2017)

Gasaraki said:


> That is the correct way to fix your "issue". The same thing happens with G-Sync: once FreeSync or G-Sync reaches the maximum refresh rate of your monitor (in your case, 75 Hz), it turns off. But your video card cannot hold that frame rate CONSTANTLY, so when frames drop back below the maximum, FreeSync/G-Sync turns back on. This switching off and on causes the tearing you see. Capping the frame rate just below your monitor's maximum keeps FreeSync/G-Sync active at all times, so tearing won't occur.



That's not true. Just because the graphics card outputs the same number of frames as the refresh rate, or fewer, doesn't mean it won't tear. You can have a 60 Hz screen running at 60 fps and it can still tear like crazy: if every frame arrives 1/3 of a refresh interval late, the buffer swap lands 1/3 of the way through scan-out, so you get a tear line roughly 1/3 of the way down the screen, because the frames aren't aligned with the refresh. It just gets particularly bad at very high frame rates, because frames miss the refresh cycles even more often and in different places across the screen, making it way more noticeable and annoying. I think NVIDIA had actual presentations on Fast Sync where you can see what I mean. FreeSync and G-Sync, on the other hand, make sure frames are always in sync even when the frame rate is below the refresh rate, because they adapt the refresh to the framerate; it can't tear since the two are identical.
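The "tear lands a third of the way down" point is just proportional arithmetic. A toy sketch, where the `tear_line` helper and the 1080-row screen are illustrative assumptions that ignore real scan-out details like blanking intervals:

```python
# Toy model: without any sync, a frame handed to the display a fraction
# `offset_fraction` of the way through scan-out switches buffers at that
# fraction of the screen height, producing a visible tear line there.

def tear_line(offset_fraction: float, screen_height: int = 1080) -> int:
    """Approximate row at which the tear appears."""
    return round((offset_fraction % 1.0) * screen_height)

# A frame that is consistently 1/3 of a refresh interval late tears
# roughly a third of the way down a 1080-line screen:
print(tear_line(1 / 3))   # 360
print(tear_line(1.25))    # 270: only the fractional offset matters
```

The modulo is the point of the model: being late by a whole refresh interval changes nothing visually; only the fractional misalignment determines where the tear shows up.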


----------



## Vya Domus (Aug 3, 2017)

oxidized said:


> Nothing is 100%; that's why I said "as neutral as one can be".



So basically you agree with what I said: everyone is biased to a degree.



oxidized said:


> Everyone can just go to review sites, but do they all really?




So they come to places like this, right? But if everyone here is "neutral", with advice taken only from review sites, what's the point?

One needs to figure things out on their own to be truly neutral. If all you do is stare at charts, you're not doing a good job of remaining neutral.


----------



## birdie (Aug 3, 2017)

Raevenlord said:


> A card with 20% better performance in the numbers covered in the article (not representative) and a 15% lower cost than current GTX 1070 pricing is *objectively*, *financially* more attractive than the alternative. Your mileage may vary with power consumption costs, yes, but that is something I know most users rank far lower on the shopping list than performance and retail pricing.



You're comparing against current pricing for the GTX 1070, but you instantly forget that the RX 4XX and RX 5XX are nowhere to be found at their MSRPs, and if RX Vega proves to be a good mining card, it will be priced way above its MSRP, which makes your price comparison totally invalid. And it's 20% faster in cherry-picked titles that favor AMD.



Raevenlord said:


> A Radeon Pack at $499 with two games, a $100 discount on a Ryzen/motherboard combo, and a $200 discount on a FreeSync monitor is *objectively better* than $460 pricing for a GTX 1070 for people who are interested in the extra parts. That's why I said it "has the potential to be a killer deal." For those who only want the card, there's a chance it won't be. For those who want more, it *will* be a killer deal.



This could have been true six months ago when Ryzen was released. Again, most people have either already upgraded to Ryzen or will never upgrade to it at all (as an Intel Core i5-2500 owner, I don't care one bit about Ryzen; it has a similar IPC to my six-year-old CPU). Most people with actual money, fanboys notwithstanding, have already upgraded to a 1080/1080 Ti.



Raevenlord said:


> Sentences like "Instead of a tear-free display, I'd rather buy something based on OLED" fall fully into the subjective realm, which the article didn't even approach. *Objectively, $-wise*, at the quoted prices, the RX Vega 56 is the better deal.



There's *no* RX Vega 56 to speak of. It will be released on August 14 in unknown quantities at unknown prices. Two weeks before its hard launch, you are making absolutely ridiculous, unearthly claims.


----------



## basco (Aug 3, 2017)

i wonder how many people in these typical discussions have air conditioning and like it cold, or a car and like to go faster than they're allowed, and don't think about what the car or the aircon consumes, but uuh ahh ohh, the AMD GPU uses so much power, bäääh.

yes i know i love ya all


----------



## oxidized (Aug 3, 2017)

RejZoR said:


> *NOT* PISSING ON AMD DAY AFTER DAY DOESN'T FREAKING MAKE YOU *PRO* AMD.
> 
> It just means I don't see the point in pissing all over it day after day, because no matter how crappy you people think or say RX Vega is, it has its benefits and brings things that will most likely push the whole graphics industry forward. AMD has rarely been the absolute king of the hill, and yet if you look through history, they are the driving force behind many technologies used by everyone: for example tessellation (ATi TruForm), normal map compression (ATi 3Dc), and the Vulkan/DX12 low-level APIs (Mantle). And you can be assured that HBC will be used by everyone in the future. Maybe it won't prove super useful now, but it will certainly lead to yet another innovation pioneered by AMD.
> 
> People just love to accuse me of being an AMD fanboy, but they conveniently leave out all the times I say good things about NVIDIA, where I acknowledge their superiority, and the times I correct things that are BS on AMD's end (like the BS scaling of the graphs in the RX 560 "review" here on TPU). Go on, search a bit and you'll see. All these whiners calling me an AMD fanboy will never do that, because it's inconvenient for their narrative that I'm an AMD fanboy. I can remember off the top of my head saying several times that I'd have a hard time ever considering AMD again if they didn't include a Fast Sync-like feature. To my luck, they did (Enhanced Sync). I've also said several times that NVIDIA currently holds undisputed superiority in terms of performance. But whatever. Look it up and you'll see who's full of manure and who isn't.



The fact that you NEVER talk badly about AMD is what makes me say you're a fan; it's not just that you're not pissing on AMD, it's that you never talk badly about it, ever, let alone piss on it.



Vya Domus said:


> So basically you agree with what I said: everyone is biased to a degree.
> 
> 
> 
> ...



I agree up to a point. If 8 out of 10 reviews cover the same things and agree on 90+% of what they say, it's pretty hard for them all to be biased, unless they've all agreed to say the same things, which is again pretty unlikely since some of those websites hate each other. So even if there's 1% or even 5% bias, it's as good as unbiased.


----------



## B-Real (Aug 3, 2017)

birdie said:


> You're comparing against current pricing for the GTX 1070, but you instantly forget that the RX 4XX and RX 5XX are nowhere to be found at their MSRPs, and if RX Vega proves to be a good mining card, it will be priced way above its MSRP, which makes your price comparison totally invalid. And it's 20% faster in cherry-picked titles that favor AMD.



The card will start at its MSRP ($400). If miners want to get their hands on them, it will get more expensive within a few weeks. But at launch, it will be $400. So if someone wants one, they can get it.


----------



## vega22 (Aug 3, 2017)

RejZoR said:


> Ok, how in bloody hell does that even work? If AMD has a small market share, why would anyone bother tailoring their engines to favor AMD? Just pointing out the obvious. You know, maybe AMD is just good at it? Why can't that be a possibility? Why does that only apply when NVIDIA is good at it?



AMD owns the console market, which is the main target for like 75-80% of all game makers, so it makes sense for them to squeeze as much out of that as they can.

imo, anyway.

These results I take with a pinch of salt till I see them from respected reviewers. That being said, if they have tiled rasterization working now, the jump makes sense.


----------



## Vya Domus (Aug 3, 2017)

oxidized said:


> I agree up to a point. If 8 out of 10 reviews cover the same things and agree on 90+% of what they say, it's pretty hard for them all to be biased, unless they've all agreed to say the same things, which is again pretty unlikely since some of those websites hate each other. So even if there's 1% or even 5% bias, it's as good as unbiased.



I'll just give an example: when I bought my 1060, I initially wanted to buy an RX 480, and I would still buy one today. There was one issue, though: Nvidia's drivers spread work across more cores, whereas AMD's hammer down on just one core/thread.

My CPU would have been more of a bottleneck if I had gone with an RX 480, even though I would have preferred it over a 1060. Yet none of the review sites talked about this aspect, because 95% of them are very shallow with their reviews: just some charts put together in one day and pushed out as fast as possible to gain as much traffic as possible. I do not blame them, they are in the business of making money, but I cannot rely much on their relevance, and you shouldn't either.

I understand not everyone has the time to do research, and many just end up going to popular review sites, but one should acknowledge how inaccurate they can be and how little of the whole story they convey most of the time. Not to mention that some of the practices happening around review samples make me question their relevance and bias even more.

So again, I am not saying you can't use them, but please don't infer they are anywhere near 100% neutral or accurate.


----------



## Raevenlord (Aug 3, 2017)

birdie,



birdie said:


> You're comparing against current pricing for the GTX 1070, but you instantly forget that the RX 4XX and RX 5XX are nowhere to be found at their MSRPs, and if RX Vega proves to be a good mining card, it will be priced way above its MSRP, which makes your price comparison totally invalid. And it's 20% faster in cherry-picked titles that favor AMD.



I'm comparing the values we know *now*. With the information we have, in the current state of affairs, the GTX 1070 costs $460 on average and the Vega 56 costs $399. I'm not comparing the RX 400 or 500 series; I don't even mention them in the piece. I don't care about their pricing, because they're not relevant to the article. *When, and if, Vega sells at a higher price than MSRP, I'll revise my position accordingly.* *If these performance numbers are a dud, I'll revise my position accordingly.* Why don't you revise your current one?



birdie said:


> This could have been true six months ago when Ryzen was released. Again, most people have either already upgraded to Ryzen or will never upgrade to it at all (as an Intel Core i5-2500 owner, I don't care one bit about Ryzen; it has a similar IPC to my six-year-old CPU). Most people with actual money, fanboys notwithstanding, have already upgraded to a 1080/1080 Ti.



For the people who care about it, it's a killer deal. I don't understand your insistence on this point. Not everyone will think so. True. And? For those who care, it is. Those are the people I'm referring to.



birdie said:


> There's *no* RX Vega 56 to speak of. It will be released on August 14 in unknown quantities at unknown prices. Two weeks before its hard launch, you are making absolutely ridiculous, unearthly claims.



It's just a matter of reviewing our current knowledge of the situation; I really don't understand how you don't see it. What will happen doesn't matter. What matters, for the scope of the article, is what we know, what *is*.
When the facts change, I change my opinion. What do you do?


----------



## RejZoR (Aug 3, 2017)

oxidized said:


> The fact that you NEVER talk badly about AMD is what makes me say you're a fan; it's not just that you're not pissing on AMD, it's that you never talk badly about it, ever, let alone piss on it.
> 
> 
> 
> I agree up to a point. If 8 out of 10 reviews cover the same things and agree on 90+% of what they say, it's pretty hard for them all to be biased, unless they've all agreed to say the same things, which is again pretty unlikely since some of those websites hate each other. So even if there's 1% or even 5% bias, it's as good as unbiased.



Ok, so if being quiet and not bitching about AMD all the time automatically means I'm favoring AMD, then one would assume I'm constantly saying how garbage NVIDIA is, right? Well, good luck finding that, coz you won't find any of it (and don't pull stuff out of context). The only times I ever mentioned downsides on NVIDIA's side were about particular features or the FX and Kepler series, and about how Pascal is ridiculously fast but doesn't really bring any tech that excites me; it's just a very fast card. And that's about it. Wouldn't that, I don't know, kinda make me, you know, neutral? Just because I can find positives in the otherwise underwhelming launch of RX Vega doesn't mean I'm a fanboy.

It almost makes me want to buy RX Vega just to piss people off here at TPU. Seeing all of you implode here would be the best s***t ever.


----------



## bug (Aug 3, 2017)

RejZoR said:


> Ok, so if being quiet and not bitching about AMD all the time automatically means I'm favoring AMD, then one would assume I'm constantly saying how garbage NVIDIA is, right? Well, good luck finding that, coz you won't find any of it (and don't pull stuff out of context). The only times I ever mentioned downsides on NVIDIA's side were about particular features or the FX and Kepler series, and about how Pascal is ridiculously fast but doesn't really bring any tech that excites me; it's just a very fast card. And that's about it. Wouldn't that, I don't know, kinda make me, you know, neutral? Just because I can find positives in the otherwise underwhelming launch of RX Vega doesn't mean I'm a fanboy.
> 
> It almost makes me want to buy RX Vega just to piss people off here at TPU. Seeing all of you implode here would be the best s***t ever.


Do this exercise: go back and see how many positive posts you made about AMD and how many about Nvidia in the past couple of months. Then do the same for negative posts.


----------



## oxidized (Aug 3, 2017)

Vya Domus said:


> I'll just give an example: when I bought my 1060, I initially wanted to buy an RX 480, and I would still buy one today. There was one issue, though: Nvidia's drivers spread work across more cores, whereas AMD's hammer down on just one core/thread.
> 
> My CPU would have been more of a bottleneck if I had gone with an RX 480, even though I would have preferred it over a 1060. Yet none of the review sites talked about this aspect, because 95% of them are very shallow with their reviews: just some charts put together in one day and pushed out as fast as possible to gain as much traffic as possible. I do not blame them, they are in the business of making money, but I cannot rely much on their relevance, and you shouldn't either.
> 
> ...



Until about 3 months ago I had an old GTX 580. I decided to buy something new and found a 480 GTR Black Edition from XFX at something like €240 on Amazon France. I received it, was so happy, but afterwards found out it had consistent coil whine and a few other stupid issues, which together made me send it back and get a 1060 Gaming X, which has been doing pretty well, at least until now.

Now, I'd been reading all kinds of reviews in the months before buying the new card, and in pretty much ALL of them the 1060 was faster in most of the games tried; across the benchmarks it was something like 60/40 in favour of Nvidia. But the price of that GTR Black Edition was just too good, and you know what happened next. So what?

You're all obsessed with this idea that Nvidia pays everyone to make it look good and make AMD look bad. It's not like that. It could be like that in some cases, but if you examine 20 or so reviews, and at the end of the story 90% of them agree on most of the points, there's no way they could all be biased. That's all.

Also, I'm pretty sure some people (because I've already read it somewhere) would only start to think someone isn't biased if they started talking positively about AMD in every case.



RejZoR said:


> Ok, so if being quiet and not bitching about AMD all the time automatically means I'm favoring AMD, then one would assume I'm constantly saying how garbage NVIDIA is, right? Well, good luck finding that, coz you won't find any of it (and don't pull stuff out of context). The only times I ever mentioned downsides on NVIDIA's side were about particular features or the FX and Kepler series, and about how Pascal is ridiculously fast but doesn't really bring any tech that excites me; it's just a very fast card. And that's about it. Wouldn't that, I don't know, kinda make me, you know, neutral? Just because I can find positives in the otherwise underwhelming launch of RX Vega doesn't mean I'm a fanboy.
> 
> It almost makes me want to buy RX Vega just to piss people off here at TPU. Seeing all of you implode here would be the best s***t ever.



We're talking about AMD here, not Nvidia. I don't care what you think about Nvidia.



> It almost makes me want to buy RX Vega just to piss people off here at TPU. Seeing all of you implode here would be the best s***t ever.



And you don't even realize that AMD has most of the mindshare at the moment, especially on forums. Everyone just loves AMD because they're the underdog, so they must be good and right, and Nvidia is the villain that is wrong and only wants to steal our money. That's a fairy tale. 
If you're looking for a forum that's less harsh on AMD, or somewhere AMD fans are automatically right, just go to OCN; TPU is much more neutral, with every kind of people here, from Nvidia fanboys to AMD fanboys in pretty much equal numbers.


----------



## the54thvoid (Aug 3, 2017)

RejZoR said:


> Ok, so if being quiet and not bitching about AMD all the time automatically means I'm favoring AMD, then one would assume I'm constantly saying how garbage NVIDIA is, right? Well, good luck finding that, coz you won't find any of it (and don't pull stuff out of context). The only times I ever mentioned downsides on NVIDIA's side were about particular features or the FX and Kepler series, and about how Pascal is ridiculously fast but doesn't really bring any tech that excites me; it's just a very fast card. And that's about it. Wouldn't that, I don't know, kinda make me, you know, neutral? Just because I can find positives in the otherwise underwhelming launch of RX Vega doesn't mean I'm a fanboy.
> 
> *It almost makes me want to buy RX Vega just to piss people off here at TPU*. Seeing all of you implode here would be the best s***t ever.



I kinda wanted to do that with Ryzen. Now that I have Ryzen, I'm happy I gave AMD money, but I did not foresee Intel cutting its prices. Don't buy something to spite others; it makes you a fool. Buy it to experience something new. At least with my CPU, gaming is not badly affected, though it does hold back a 1080 Ti at 2 GHz, even at 1440p. At least the fps is high enough that it doesn't matter on a 60 Hz monitor.

The Vega 56 does sound a lot like Fury versus Fury X: the better value proposition, but wait for reviews to see how it manages. Also, don't forget, a custom 980 Ti tends to beat a 1070 (or level with it), so you're potentially looking at a two-year-old card matching a Vega 56 (20% overclocks on a stock 980 Ti are quite common). If you frame it that way, it's not quite as good-looking.

If I had the budget for a card like a 1070 or a Vega 56, I'd wait for reviews for sure. The few titles used to bench so far are not AMD-biased, but they absolutely paint AMD's better side. Just watch Nvidia discreetly lower 1070/1080 prices if Vega is a threat to them. Really, dude: wait, read, consider the options, then buy your card and enjoy it, whatever you choose.

EDIT: I don't want Vega to be too good, because I can't afford to buy a new graphics card yet (the upgrade itch is like herpes: it never truly goes away). Hoping I'm safe with my 1080 Ti on steroids.


----------



## RejZoR (Aug 3, 2017)

bug said:


> Do this exercise: go back and see how many positive posts you had about AMD and how many about Nvidia in the past couple of months. Then do the same for negative posts.



Of course there's been more discussion about Vega in recent months. Why would I talk about a year-old product (Pascal) that I don't care about, since it has all been discussed a trillion times already? Vega is what everyone talks about now, including me, you and everyone else.


----------



## Vya Domus (Aug 3, 2017)

oxidized said:


> You're all obsessed with this idea that Nvidia pays everyone to make it look good and make AMD look bad. It's not like that.



I have no idea whether they paid anyone, but one thing is clear: they did their best to push things like GameWorks and other tactics (shady ones, in my opinion), such as getting developers to use ludicrous amounts of tessellation because they knew AMD wasn't as efficient at it.

You may think these are perfectly legit methods, but that doesn't change the fact that yes, Nvidia did do their best to make AMD look bad.


----------



## springs113 (Aug 3, 2017)

B-Real said:


> Actually, you are the first one complaining about these results. Yes, AMD is faster in DX12 BF1, but it isn't 30%+ faster... It's faster in CoD IW, but AMD had nearly a 20% advantage there, and here it has less than 10%. It's faster in Doom under Vulkan. However, Civilization in DX12 is head-to-head (1060 6GB vs. RX 580 8GB shows a 10% difference for AMD, but compared to the 480 there is a 2 fps difference at 1440p).
> 
> So overall, IF these results are true, there won't be a 20% overall difference between the two; more like 10%, or at most 15%. But given the HBM2 and the promising features (which will be used even by NV-supported titles like Far Cry 5), the Vega 56 looks charming.
> 
> I was wondering whether AMD was trolling us by using the Vega 56 at the comparison events against the 1080, emphasizing *Sync... We will see.


In some ways I believe they were, because it (Vega 56) was touted by AMD as the ultimate FreeSync gaming card, so I wouldn't be surprised if they were somewhat sandbagging the numbers on their slides. After all, remember it was the Ryzen 7 1700 that they pitted against Intel throughout the Ryzen pre-release events. Nonetheless, I will be getting my hands on one.


----------



## bug (Aug 3, 2017)

RejZoR said:


> Of course there's been more discussion about Vega in recent months. Why would I talk about a year-old product (Pascal) that I don't care about, since it has all been discussed a trillion times already? Vega is what everyone talks about now, including me, you and everyone else.


Ok, try this then: in how many posts were you critical of Vega?


----------



## the54thvoid (Aug 3, 2017)

bug said:


> Ok, try this then: in how many posts were you critical of Vega?



He can't be critical of Vega - it's not been released yet...


----------



## oxidized (Aug 3, 2017)

Vya Domus said:


> I have no idea whether they paid anyone, but one thing is clear: they did their best to push things like GameWorks and other tactics (shady ones, in my opinion), such as getting developers to use ludicrous amounts of tessellation because they knew AMD wasn't as efficient at it.
> 
> You may think these are perfectly legit methods, but that doesn't change the fact that yes, Nvidia did do their best to make AMD look bad.



But we're talking about reviews here, not Nvidia. It's pretty reasonable that a company like Nvidia, or even AMD, will try to make the other look bad; that's how it works, and it's a completely different thing from "Nvidia is bribing reviewers and review websites". Completely different. In the end you'll have a game that runs better on Nvidia rather than AMD, and the reason can be whatever you want it to be, but the fact remains that the game ACTUALLY runs better on Nvidia. Same story if AMD did it: for example, DOOM only has Vulkan and OpenGL, but why exactly? I'm pretty sure that if it ran on DX11, Nvidia would have the upper hand there too. Needless to remind you that Vulkan is pretty much an iteration of Mantle, which was used in BF4, or 3, I can't remember, and boosted AMD performance. So what are we talking about here?


You people should really stop judging companies by their behaviour. They're for-profit companies; they'll always do shady things, whether it's Nvidia, AMD, Intel or whoever, and none of them is better than the others at that. There's only who does the better job in terms of product, and since you can't rely on their own tests, because of course those will be biased, you have to rely on third-party tests or even personal tests.


----------



## bug (Aug 3, 2017)

the54thvoid said:


> He can't be critical of Vega - it's not been released yet...


It's been announced that it will draw a lot of power; it's been announced that, far from offering breakthrough performance, it will offer something Nvidia gave us last year; and it's been announced that it will not be particularly cheap either. I think there were enough reasons to at least express concern for someone not explicitly rooting for AMD.


----------



## Vya Domus (Aug 3, 2017)

oxidized said:


> But we're talking about reviews here, not Nvidia. It's pretty reasonable that a company like Nvidia, or even AMD, will try to make the other look bad; that's how it works, and it's a completely different thing from "Nvidia is bribing reviewers



But if even as few as 1 in 10 games featured in a review have gone through "Nvidia's handy optimizations", doesn't that mean this bias gets carried through?



oxidized said:


> In the end you'll have a game that runs better on Nvidia rather than AMD, and the reason can be whatever you want it to be, but the fact remains that the game ACTUALLY runs better on Nvidia



OK, so you're fine with software and hardware that is gimped on purpose, as long as it gives you the illusion of something better? Whatever floats your boat, I guess.



oxidized said:


> Same story if AMD did it: for example, DOOM only has Vulkan and OpenGL, but why exactly? I'm pretty sure that if it ran on DX11, Nvidia would have the upper hand there too. Needless to remind you that Vulkan is pretty much an iteration of Mantle, which was used in BF4, or 3, I can't remember, and boosted AMD performance. So what are we talking about here?



Every single time someone goes out of their way to mention how Vulkan/Doom is biased towards AMD, it tells me how much they really understand about all of this.

I have already gone through this in a recent discussion: Vulkan DOES NOT favor anyone. Nvidia is at the top of the charts, aren't they? Did they need DX11 for that? No, because Vulkan is great at taking advantage of modern hardware; DX11 isn't anymore.


----------



## Sasqui (Aug 3, 2017)

Ok, I'm off to sell my brand new GTX 1070.

Seriously...


----------



## B-Real (Aug 3, 2017)

oxidized said:


> Now, I'd been reading all kinds of reviews in the months before buying the new card, and in pretty much ALL of them the 1060 was faster in most of the games tried; across the benchmarks it was something like 60/40 in favour of Nvidia. But the price of that GTR Black Edition was just too good, and you know what happened next. So what?



Well, it's very easy to answer: you looked at the initial reviews.
Actually, since the December re-tests of the RX 480 vs. the 1060, they are equal in DX11. Check the reviews, usually on YT, but there are updated reviews on sites like Hardware Canucks, for example. The typical ~10% advantage of the 1060 in DX11 titles has evaporated to 0-2%, not to mention the DX12 advantage of the RX 480. So yeah, you were unlucky with the coil-whine RX 480, but overall you got a slightly worse card (for more money).


----------



## neatfeatguy (Aug 3, 2017)

Just like all pre-released info on a new GPU lineup - I'll just bypass the info here and wait for actual benchmarks from reliable sites once NDA is done.

I do, however, greatly enjoy reading the biased responses people have on both sides of the fence when going off "leaked" or "rumored" data. You get those who will defend the data to the end and those who will dispute it with all their might. Most of the comments are laughable and make for good reading material.


----------



## bug (Aug 3, 2017)

Sasqui said:


> Ok, I'm off to sell my brand new GTX 1070.
> 
> Seriously...


Because of 4 leaked benchmarks? You know what they say about a fool and their money...


----------



## Fouquin (Aug 3, 2017)

bug said:


> Because of 4 leaked benchmarks? You know what they say about a fool and their money...



I'm fairly certain the "Seriously..." was in an exasperated tone in response to the cherry picked benchmarks, and that they were being sarcastic about selling their brand new GTX 1070.

I could be wrong, but it fits the tone of the rest of the comments on this article.


----------



## Th3pwn3r (Aug 3, 2017)

RejZoR said:


> Ok, if being quiet and not bitching over AMD all the time automatically means I'm favoring AMD, so one would assume I'm constantly saying how garbage NVIDIA is, right? Well, good luck finding that, coz you won't find any (and don't pull stuff out of context). Only time I ever mentioned downsides about NVIDIA was about particular features or FX and Kepler series. And how Pascal is ridiculously fast, but doesn't really bring any tech that would excite me. It's just a very fast card. And that's about it. Wouldn't that, I don't know, kinda make me you know, neutral? Just because I can find positives in otherwise underwhelming launch of RX Vega, that doesn't mean I'm a fanboy.
> 
> It almost makes me want to buy RX Vega just to piss people off here at TPU. Seeing all of you implode here would be the best s***t ever.



Lol, you have changed your mind at least five times about getting Vega. You even made an upset post saying you were getting a 1080 Ti and that was it. Then you said you were getting the liquid-cooled Vega... blah, blah.

I don't think you're too pro AMD but you sure are extremely indecisive.

I, on the other hand, will buy Vega if it can beat the 1080 Ti. If not, I'll buy another Nvidia card (1080 Ti).


----------



## Dimi (Aug 3, 2017)

There is no Vega 56 available at $399. There probably NEVER will be, due to mining.

Hell, the average price for 580s is $500!

Besides, who wants to buy crap-cooled blower-style standard editions anyway?

Half a dozen *aftermarket* 1070's were available for $349.99 or lower at one point. Don't forget this.


----------



## ppn (Aug 3, 2017)

Maybe it can get closer to the 1080 with a memory OC to 512 GB/s; compared to the default 410 GB/s, that's about 10% performance for free, and the $399 mark is perfect for a new product. The GTX 1070 will be the new $250 card soon enough, so there's no point comparing it to that.
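For reference, the bandwidth figures in that overclocking idea follow directly from the bus width: GB/s = (bus width in bits ÷ 8) × per-pin data rate in GT/s. A quick sketch; the per-pin data rates here are back-calculated assumptions, not confirmed specs:

```python
def hbm2_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    # Bandwidth = bus width in bytes x transfers per second per pin.
    return bus_width_bits / 8 * data_rate_gtps

stock = hbm2_bandwidth_gbs(2048, 1.6)  # ~410 GB/s, the Vega 56 default
oc = hbm2_bandwidth_gbs(2048, 2.0)     # 512 GB/s, the OC target above
print(f"{stock:.0f} -> {oc:.0f} GB/s, a {oc / stock - 1:.0%} bandwidth uplift")
```

Note that a ~25% bandwidth uplift translating into ~10% more frames is plausible only if the card is partially bandwidth-bound, which is itself speculation at this point.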


----------



## Vya Domus (Aug 3, 2017)

Dimi said:


> Half a dozen *aftermarket* 1070's were available for 349.99 or lower at one point. Don't forget this.



Yeah, *were*. If Vega 56 pricing gets screwed up, it'll be just as screwed up as 1070 pricing.


----------



## cdawall (Aug 3, 2017)

bug said:


> Because of 4 leaked benchmarks? You know what they say about a fool and their money...



To be fair, I am selling off my 980 Ti collection on eBay just in case the price tanks.


----------



## oxidized (Aug 3, 2017)

Vya Domus said:


> But if say even as little as 1/10 games featured in a review have gone through "Nvidia's handy optimizations" , doesn't that mean this bias gets carried through ?
> 
> 
> 
> ...



It's not an illusion of better, it IS better; the way they achieved it isn't relevant, as long as they didn't do anything illegal, and they didn't, last I checked.
Vulkan, and Doom specifically, run much better on AMD hardware; I understood everything, I assure you.
Vulkan favors AMD whether they like it or not, so much so that they sold Doom with AMD cards for a while. What are we talking about, come on, are you only sharp when it suits you? Vulkan favors AMD even because (and I'll say it again) it's Mantle's offspring.
I'd say Pascal is more modern than Polaris, considering consumption and everything else.

This last thing actually reminds me of those people for whom, when it's pro-AMD, it's certain and can't be wrong, and otherwise it's wrong: if a game runs better on an AMD card, then the game is good, well optimized, and well developed to gain from newer hardware, but if the game runs better on Nvidia, it must be badly optimized and not scale well enough with new hardware.

So that's practically an AMD fanboy; what else would you need?




B-Real said:


> Well, it's very easy to answer: You looked at the initial reviews.
> Actually since the december re-tests of RX480 vs 1060, they are equal in DX11. Check the reviews, usually on YT but there are updated reviews on sites like Hardware Canuks, for example. The tipical ~10% advantage of the 1060 in DX11 titles has evaporated to 0-2%, not to mention the DX12 advantage of RX480. So yeah, you were unlucky with the coil whine RX480, but overall, you got an overall bit worse card (for more money).



No, I looked at even the most recent reviews I could find, and the 1060 was still ahead by about the same percentage; the only thing really going head to head with the 1060 is the 580. I looked at Hardware Canucks' re-review, and it's probably the only one that showed that shift; I couldn't find it anywhere else. Besides, DX12 worked better on the 480, that is true, and you're probably right that I got the worse bang for the buck between those two, but the 1060 had more valid AIB choices and lower overall consumption, so in the end I'd say it's on par.


----------



## Vya Domus (Aug 3, 2017)

oxidized said:


> It's not an illusion of better, IT IS better, the way they achieved this isn't relevant. as long as they didn't to anything illegal, and they didn't last i checked.



It's not illegal, but it hampers progress, and it annoys me because it is not benefiting the consumer in any way.



oxidized said:


> Vulkan, and Doom specifically run much better on AMD hardware, i understood everything i assure you.
> Vulkan favors AMD whether they like it or not, so much they sold doom with amd cards for a while, what are we talking about cmon, are you only sharp when it suits you? Vulkan favors AMD even because (and i'll say it again) it's mantle's offspring, i'd say pascal is more modern than polaris seen consuption and everything else related.



You are simply wrong; like I said, I already had a pretty lengthy discussion on this matter, not gonna have it again.

Vulkan does not favor AMD. Hell, think about this: the president of The Khronos Group, which developed Vulkan, works at Nvidia. Whoop de doo, you still think it favors AMD? Would Nvidia, alongside other members and partners from all around the computer graphics technology world, let such a ridiculously biased thing happen? Just because a piece of hardware performs better in a certain application doesn't mean that application favors it. Come on, man. And do I have to remind you again that the Titan Xp is at the top of the charts in Doom while Vega, with the same raw power, isn't? Is that bias?



oxidized said:


> This last thing actually reminds me of those people that when it's pro AMD it's sure and can't be wrong, otherwise it's wrong, so if a game runs better on an AMD cards then it means the game is good, it's well optimized and well developed to gain from newer hardware, if the game runs better on nvidia but be badly optimized and doesn't scale well enough with new hardware.
> 
> So that's practically AMD fanboy, what else would you need?



Doom is well optimized to run on any newer hardware, Nvidia and AMD.


----------



## Nkd (Aug 3, 2017)

Assimilator said:


> Do you even proofread? The word you're looking for is "median".
> 
> And you're also completely ignoring that GTX 1070's MSRP is $349, i.e. $50 less than Vega 56, which is extremely fair considering the (supposed) relative performance of these cards - in fact I'd say the 1070 still wins on price/performance if these Vega 56 numbers are truthful. So calling Vega 56 a "GTX 1070 killer" is laughable.
> 
> ...



The 1070's MSRP was never reduced; that was the GTX 1080's. For the 1070 it was always $399.99, though they did start going on sale before the mining craze hit.


----------



## B-Real (Aug 3, 2017)

Dimi said:


> There is no Vega 56 at 399$ available. There probably NEVER will be due to mining.
> 
> Hell average price for 580's is 500$!
> 
> ...


So on one side you're saying Vega will not be available for $399 because the 580 *IS* $500 (which is simply a lie, because you can get RX 580s for $310-320, though yeah, that's way too expensive compared to the initial price), and on the other side you say that 1070s *WERE* available for $350 at one point. OMG, help me please.


----------



## oxidized (Aug 3, 2017)

Vya Domus said:


> It's not illegal but it hampers progress and it annoys me because it is not benefiting the consumer in way.



I'd say it hampers the competitor, not really the consumer directly, but I can give you that.



Vya Domus said:


> You are simply wrong , like I said I already had a pretty lengthy discussion on this matter , not gonna have it again.
> 
> Vulkan does not favor AMD , hell think about this : The president of The Khronos Group who developed Vulkan works at Nvidia. Whoop de doo , you still think it favors AMD ? Nvidia along side other members and partners from all around the computer graphics technology world would let such a ridiculously biased thing happen ? Just because a piece of hardware perform better in a certain application doesn't mean it favors it. Come on man.
> And do I have to remind you again that Titan Xp is at the top of the charts in Doom and Vega with the same raw power doesn't. Is that bias ?



Means nothing. I'm just saying Vulkan favors AMD, not that it's meant to favor AMD; whether or not it's intentional, it still greatly favors AMD cards' performance, and not for the reasons you give, because there's really nothing superior about Polaris compared to Pascal. Hell, even Vega isn't superior to Pascal if these latest results turn out to be true. The Titan Xp is a much, much more powerful chip than any Vega; I take it you're not serious.



Vya Domus said:


> Doom is well optimized to run on any newer hardware , Nvidia and AMD.


Doom is excellently optimized, but on-par GPUs still do much, much better on AMD; it's an AMD-friendly environment, if you don't want to call it bias.


----------



## Prince Valiant (Aug 3, 2017)

If the games AMD performs well on are AMD-friendly, that must mean that the games it performs badly on are Nvidia-friendly.


----------



## Vya Domus (Aug 3, 2017)

oxidized said:


> TitanXp is a much much more powerful chip than any Vega, i take it you're not serious.





oxidized said:


> but still on par GPUs does much much better on AMD, it's AMD friendly environment if you don't want to call it bias.



Oh really, is that so? Vega 64 has about 14 TFLOPS, right about where the Titan Xp sits as well, and die sizes are similar too. So yeah, they are on par in terms of raw power. Yet, if Vulkan favors AMD so "greatly", how come it doesn't beat it? You are on a witch hunt, mate; Doom/Vulkan is impartial to hardware.
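The "same raw power" claim can be sanity-checked with the usual peak-throughput formula: FP32 TFLOPS ≈ shaders × boost clock × 2 (one fused multiply-add, i.e. 2 FLOPs, per shader per cycle). A rough sketch; the boost clocks used are the commonly cited figures, so treat the outputs as ballpark numbers:

```python
def peak_fp32_tflops(shaders, boost_clock_ghz):
    # Each shader retires one FMA (2 FLOPs) per cycle at peak.
    return shaders * boost_clock_ghz * 2 / 1000

vega_64 = peak_fp32_tflops(4096, 1.677)   # ~13.7 TFLOPS
titan_xp = peak_fp32_tflops(3840, 1.582)  # ~12.1 TFLOPS
print(f"Vega 64: {vega_64:.1f} TFLOPS, Titan Xp: {titan_xp:.1f} TFLOPS")
```

Both cards do land within roughly 15% of each other on paper, though paper TFLOPS famously say little about delivered game performance.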


----------



## B-Real (Aug 3, 2017)

oxidized said:


> No i looked even at the most recent reviews i could find, and the 1060 was still above, and on the same percentage, the only thing going really head to head with the 1060 is the 580, i looked at hardware canuck's re-review, and it's probably the only one that showed those shift, which i could find anywhere else besides, dx12 worked better on 480, that is true, and also you're probably true about saying i got the worst bang for the buck between those 2, but 1060 had more valid AIB choices and the overall consumption is lower on 1060, so in the end i'd say it's on par.



Then I have no idea what you checked. First there's Hardware Canucks as a written review, then YT videos like HW Unboxed's, which get 1% in favor of the 1060 where at first it was a 12% difference. Yeah, it consumes 30-40 W more at default (which HW Unboxed says shouldn't be a deal breaker), but JayzTwoCents' XFX 480 video actually shows that under stress testing the RX 480 eats about 95 to 120 W. And with Crimson, AB, etc. you can rein in the hungrier RX 480s like the Sapphire or MSI. Also, some, like MSI's, later got BIOS updates that lowered their power consumption.

So yeah, overall the RX 480 (including the DX12 and Vulkan games) has been a somewhat faster (or equal) card compared to the GTX 1060 for about 8 months now, and before the mining fever it was about 30-40 bucks cheaper. At the start (summer 2016), I would say they were equal, considering cost, performance, and power consumption, but since December's Crimson drivers, the RX 480 is simply the better choice. As for the RX 580... yeah, it's actually a faster card than the 1060, and it was sold at around RX 480 prices till the fever hit... Yeah, it consumed more than the RX 480, that's for sure.


----------



## Gasaraki (Aug 3, 2017)

Dimi said:


> There is no Vega 56 at 399$ available. There probably NEVER will be due to mining.
> 
> Hell average price for 580's is 500$!
> 
> ...



No one knows if Vega will be good for mining so you can't say that. Vega is a totally different architecture.


----------



## oxidized (Aug 3, 2017)

Prince Valiant said:


> If the games AMD performs well on are AMD friendly that must mean that the games it performs bad on are Nvidia friendly.



No? Not necessarily, since the 1060 often performed better than the 480 by 4-5%, so no, that doesn't show any friendly environment for Nvidia.



Vya Domus said:


> Oh really , is that so ? Vega 64 has about 14 TFLOPS , right about where Titan Xp sits as well, die sizes are similar too. So yeah , they are on par in terms of power. Yet , if Vulkan favors "greatly" AMD how come it doesn't beat it ? You are on witch hunt mate , Doom/Vulkan is impartial to hardware.



Oh yeah? TFLOPS mean nothing; the 1060 had fewer than the 480 and was still the faster card in the beginning, and even now the gap has only narrowed a bit. The Titan Xp is a much faster card than any Vega, at least in games (the rest I don't care about), since that's what we're talking about. So again, Vega doesn't beat the Titan Xp under Vulkan because the Titan Xp is much faster than it in games. And NO, Doom isn't impartial at all, and neither is Vulkan for the moment; in the future we'll see.



B-Real said:


> Then I have no idea what you checked. First is Hardware Canuks as a written review, YT videos like HW Unboxed
> 
> 
> 
> ...



No, sir, the 1060 was and is a faster card, even if slightly, compared to the 480. JayzTwoCents' 480 was a pretty lucky one, since he could overclock it a lot while leaving the voltage untouched; I couldn't find anyone else who reached those frequencies. Also, there's nothing to control: Polaris is simply designed with a higher TDP (and a 232 mm² die vs. 200 mm², a 16% bigger die) than Pascal; there's nothing to adjust. So again, overall the 480 isn't a faster card than the 1060, it's the opposite. It's true the 480 gained performance over time, but it still couldn't catch the 1060, put it however you want, and again, what kept AMD afloat was price/performance, nothing else really.
The 580 was maybe a little faster, while costing more, and not at 480 prices as you say, because I checked (I was actually planning to buy one) and it cost as much as a good AIB 1060, while drawing more power and needing a bigger heatsink due to the higher TDP, which not every partner provided, like Sapphire on the Pulse version.


----------



## Vya Domus (Aug 3, 2017)

oxidized said:


> Doom isn't impartial at all, neither is Vulkan



And let me guess, all the other games and APIs aren't impartial? OK dude, you can keep your incorrect belief to yourself.


----------



## Prince Valiant (Aug 3, 2017)

I didn't think my joke would get taken seriously.

I can scarcely wait for proper reviews to come out. The speculation with Vega is as bad as Ryzen if not worse.


----------



## oxidized (Aug 3, 2017)

Vya Domus said:


> And let me guess all the other games and APIs aren't ? OK dude , you can keep your incorrect belief to yourself.




Other games and APIs don't show such a great shift in performance going from Nvidia to AMD or the opposite. Doom and Hitman are two titles that favor AMD much more than the rest favor Nvidia; again, no idea if it's intentional or not, but it's there.


----------



## T4C Fantasy (Aug 3, 2017)

Assimilator said:


> Do you even proofread? The word you're looking for is "median".
> 
> And you're also completely ignoring that GTX 1070's MSRP is $349, i.e. $50 less than Vega 56, which is extremely fair considering the (supposed) relative performance of these cards - in fact I'd say the 1070 still wins on price/performance if these Vega 56 numbers are truthful. So calling Vega 56 a "GTX 1070 killer" is laughable.
> 
> ...



MSRP doesn't exist for the 1070 anymore; it's $450+, and $500 for Founders Edition.

The editor is not wrong there.


----------



## DeathtoGnomes (Aug 3, 2017)

I'm convinced. There are too many posters in this thread being paid specifically to attack other people and hijack the discussion.

As for me, the unverified results are nice, considering AMD hasn't been in such a position in a long time. I think Nvidia will up its game with Volta, which might mean a delayed release until they're sure they can retake the top dog slot without much fanfare.


----------



## efikkan (Aug 3, 2017)

RejZoR said:


> GTX 1070 for 350 ahaha. Good luck finding one. Especially in Europe. Cheapest I could find was 413€. Zotac Mini. So, no, it's not cheaper...


Keep in mind most western European countries have local VAT included in their MSRP, but the US MSRP does not.



RejZoR said:


> Ok, how in bloody hell that even works? If AMD has small market share, why would anyone bother specializing their engines favoriting AMD? Just pointing out the obvious. You know, maybe AMD is just good at it? Why can't that be a possibility? Why that only applies when NVIDIA is good at it?


Nvidia is dominant in desktop gaming, but AMD is dominant in console gaming. Many games start out as console exclusives or console focused titles.


----------



## Captain_Tom (Aug 3, 2017)

No surprise here.

$400, and it slightly loses to the 1080 while having FAR better long-term technology.  This will sell well, and Vega64 should be at least 15% stronger than this!


----------



## efikkan (Aug 3, 2017)

Captain_Tom said:


> No surprise here.
> 
> $400, and it slightly loses to the 1080 while having FAR better long-term technology.  This will sell well, and Vega64 should be at least 15% stronger than this!


Precisely which "better" long-term technology are you talking about?


----------



## Captain_Tom (Aug 3, 2017)

efikkan said:


> Precisely which "better" long-term technology are you talking about?



FP16 (it will be a 26+ TFLOP card in many upcoming games), HBC, and asynchronous compute. There are more, but those are the big ones.


You are very "vocal" in these forums, so I already know you are aware of them, and I can't wait to see what your fanboy response will be.
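On the FP16 point: Vega's "rapid packed math" packs two FP16 operations into each FP32 lane, so peak FP16 throughput is simply double the FP32 figure. A hedged back-of-the-envelope sketch using the commonly cited Vega 64 shader count and boost clock:

```python
def peak_tflops(shaders, boost_clock_ghz, ops_per_shader_per_cycle):
    # Generic peak throughput: shaders x clock x ops issued per cycle.
    return shaders * boost_clock_ghz * ops_per_shader_per_cycle / 1000

fp32 = peak_tflops(4096, 1.677, 2)  # one FMA = 2 FLOPs per cycle
fp16 = peak_tflops(4096, 1.677, 4)  # packed math doubles the FP16 rate
print(f"FP32: {fp32:.1f} TFLOPS, FP16: {fp16:.1f} TFLOPS")
```

That puts peak FP16 around 27 TFLOPS, which is where the "26+" figure comes from; whether upcoming games actually exploit packed FP16 is the speculative part.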


----------



## B-Real (Aug 3, 2017)

oxidized said:


> No sir, the 1060 was and is a faster even if sligthly card compared to 480, JayzTwoCent's 480 was a pretty lucky one since he could overclock it much while leaving untouched the voltage, i couldn't find anyone else that reached those frequencies, also there's nothing to control, Polaris is just designed with a higher TDP (232mm^2 vs 200mm^2, which is 16% bigger die) than Pascal, there's nothing to adjust. So again, overall the 480 isn't a faster card than 1060, it's the opposite, it's true the 480 increased performance overtime but couldn't still catch the 1060, put it as you want, and again what kept AMD floating was price/performance, nothing else really.
> The 580 was maybe a little faster, costing more, and not at 480 price as you say, because i checked for that (i actually was planning to buy it) and it cost like a good AIB 1060, while having additional power draw, and it needed a bigger heatsink due the the higher TDP, which not everyone did, like sapphire on the pulse version.


As I wrote, the 480 is equal to (and better in DX12/Vulkan than) the 1060, whether you believe it or not.

And who the heck cares what kept AMD floating? :O It was ~10% slower for about $40 less, with a bit more power consumption. And 5-6 months later, the performance gap was reduced to 0-2% while still costing $30-40 less.


----------



## efikkan (Aug 3, 2017)

Captain_Tom said:


> FP16 (It will be a 26+ TFLOP card in many upcoming games), HBC, and asynchronous compute.   There are more, but those are the big ones.


- FP16 is surely going to be interesting in some years, but still doesn't make Vega a better buy than Pascal. And remember Volta will be here in a few months.
- HBC is relevant for pure compute workloads, but not for gaming.
- Asynchronous compute is supported by Nvidia as well.


----------



## DeathtoGnomes (Aug 3, 2017)

efikkan said:


> - FP16 is surely going to be interesting in some years, but still doesn't make Vega a better buy than Pascal. And remember Volta will be here in a few months.
> - HBC is relevant for pure compute workloads, but not for gaming.
> - Asynchronous compute is supported by Nvidia as well.


Volta might be delayed; they will try to slam the door shut on AMD with it, and that takes extra time.


----------



## the54thvoid (Aug 3, 2017)

Want Vega?

You're all screwed. It's allegedly the God of mining.

https://videocardz.com/71591/rumor-amd-radeon-rx-vega-64-to-be-great-for-mining


----------



## oxidized (Aug 3, 2017)

B-Real said:


> As i wrote, the 480 is equal (and better in DX12-Vulkan) compared to the 1060. Even if you believe it or not.
> 
> And who the heck cares what kept AMD floating? :O It was ~10% slower for about $40 cheaper and a bit more power consumption. And 5-6 months later, the performance gap was reduced to 0-2% and still costing $30-40 less.



Nothing you say happened, plain and simple.



Captain_Tom said:


> FP16 (It will be a 26+ TFLOP card in many upcoming games), HBC, and asynchronous compute.   There are more, but those are the big ones.
> 
> 
> Although you are very "vocal" in these forums, so I already know you are aware of them, and I can''t wait to see what your fanboy response will be.



I just hope you're right my friend...


----------



## arbiter (Aug 3, 2017)

B-Real said:


> As i wrote, the 480 is equal (and better in DX12-Vulkan) compared to the 1060. Even if you believe it or not.
> 
> And who the heck cares what kept AMD floating? :O It was ~10% slower for about $40 cheaper and a bit more power consumption. And 5-6 months later, the performance gap was reduced to 0-2% and still costing $30-40 less.


I remember a story about the 480 vs. the 1060 in Doom: as the CPU got slower, the 480 lost a lot of FPS, whereas the 1060 barely lost any. The same thing happened in both GL and Vulkan.


----------



## Vya Domus (Aug 3, 2017)

efikkan said:


> - HBC is relevant for pure compute workloads, but not for gaming.



You are so wrong, but just as always you're going to ignore the facts and carry on.

https://m.youtube.com/watch?feature=share&v=85ProuqAof0


----------



## oxidized (Aug 3, 2017)

Vya Domus said:


> You are so wrong , but just as always you're going to ignore facts and carry on.



Yeah, yeah, keep it up: AMD is the deus ex machina of hardware, while Nvidia and Intel are both evil companies that support stagnating technologies. It would actually make a nice fairy tale.


----------



## Vya Domus (Aug 3, 2017)

oxidized said:


> Yeah yeah keep it up, AMD is deus ex machina of hardware, while both nvidia and intel are evil companies that support stagnating technologies. Could actually make a nice fairy tale.



You can both hold hands, the almighty anti-AMD brigade.


----------



## oxidized (Aug 3, 2017)

Vya Domus said:


> You can both hold hands , the all mighty anti-AMD brigade.


Not anti-AMD at all. I'll most likely be buying Ryzen if Coffee Lake lacks performance; hell, I might buy it anyway. That's the difference between me and you: I don't care about anything except performance and overall quality.


----------



## Sasqui (Aug 3, 2017)

bug said:


> Because of 4 leaked benchmarks? You know what they say about a fool and their money...



Meh. 

The 1070 will be sold at a profit, and just wait to see how prices sit in the fall after mining peters out. I'll be back in with either red or green then, depending on price points.


----------



## Captain_Tom (Aug 3, 2017)

efikkan said:


> - FP16 is surely going to be interesting in some years, but still doesn't make Vega a better buy than Pascal. And remember Volta will be here in a few months.
> - HBC is relevant for pure compute workloads, but not for gaming.
> - Asynchronous compute is supported by Nvidia as well.



Volta is professional-only at the moment, and I don't expect that to change till MAYBE the end of 2018. In fact, I'm pretty sure Nvidia confirmed the next series is another Maxwell... cough... Pascal refresh (but likely with more GDDR5X/6). It will be stronger, but lack all of these features.

The only exception I can think of is possibly a cut-down Volta sold as a Titan card for $1500 - $2000.


----------



## Captain_Tom (Aug 3, 2017)

Sasqui said:


> Meh.
> 
> The 1070 will be sold at a profit, and just wait to see how prices sit in the Fall after mining peters out.  Back in with either red or green then depending on price points



Hate to say it, but I don't think mining will ever "peter out". For sure there will be periods of boom and bust, but mining is here to stay, buddy.

And if you think about it, it was only a matter of time before some program found a way to make money off of the massive computational power modern GPU's have.


----------



## the54thvoid (Aug 3, 2017)

Captain_Tom said:


> Volta is professional only at the moment, and I don't expect that to change till MAYBE the end of 2018.  In fact I am pretty sure Nvidia confirmed the next series is another maxwell...cough.... Pascal refresh



Is that not worrying? All Nvidia has to do is refresh Maxwell... cough, Pascal... to stay quite far ahead at minimal cost. All AMD is doing with these compute-heavy consumer cards is fueling the mining craze. There is no disputing AMD's consumer compute ability, but:

1) It's still not enough to be the fastest gaming GPU, and
2) It keeps the profit margins very low.

Anyway, if you see my post further up, Vega 64 is a mining monster. So it's going to disappear fast on one of those jumbo jets.  Not good news at all for gamers.


----------



## FrustratedGarrett (Aug 3, 2017)

RejZoR said:


> Just posted this in VEGA discussion thread few seconds before this one
> 
> Sure it's still a 210W TDP card, but people don't realize this is max. If you fire up Radeon Chill, you'll drop consumption dramatically. It usually halves the consumption. And if you use Enhanced Sync, it'll be locked to screen refresh. Which means consumption will again drop, maybe not as significantly as with Chill, but still worth mentioning considering how everyone scares buyers with the max TDP numbers...
> 
> Battlefield and Doom numbers are pretty significant.



Too little, too late! The GTX 1070 uses cheap GDDR5 memory and is based on a much smaller chip that is a whole year older. It doesn't matter at this point. Volta should be around the corner, with GDDR6 and a definite performance upgrade over Pascal; Vega is DOA.


----------



## RejZoR (Aug 3, 2017)

Dream on about both, Volta and GDDR6.


----------



## Sasqui (Aug 3, 2017)

Captain_Tom said:


> Hate to say it, but I don't think mining will ever "peter out".  For sure there will be periods of boom and bust, but mining is here to stay buddy.
> 
> And if you think about it, it was only a matter of time before some program found a way to make money off of the massive computational power modern GPU's have.



We shall see, and no... cryptocurrency is not going away; it's simply a highly speculative commodity.

And I'm not your buddy, that's just plain gay.


----------



## Fluffmeister (Aug 3, 2017)

the54thvoid said:


> Want Vega?
> 
> You're all screwed. It's allegedly the God of mining.
> 
> https://videocardz.com/71591/rumor-amd-radeon-rx-vega-64-to-be-great-for-mining



Looks like RX Vega is about to get mined, then!

Not sure the Radeon Packs are gonna stop the onslaught, but I guess it was worth a go.


----------



## birdie (Aug 3, 2017)

DeathtoGnomes said:


> I'm convinced. There are too many posters in this thread *being paid* specifically to attack other people and hijack the discussion.
> 
> As for me the unverified results are nice considering AMD hasnt been in such a position in a long time. I think Nvidia will up its game with Volta, which might mean a delayed release until they prove they retake top daug slot without much fanfare.



I guess you need to go see a shrink ASAP. Next you're gonna say that Jensen Huang frequents these forums and slanders AMD in his spare time.

I still hope you were joking.


----------



## efikkan (Aug 3, 2017)

Vya Domus said:


> efikkan said:
> 
> 
> > - HBC is relevant for pure compute workloads, but not for gaming.
> ...


Then please focus on the facts instead of personal attacks.

A predictive cache algorithm can only detect linear access patterns, just like a prefetcher does in a CPU. But it can't predict accesses when there are no patterns, because there is no way to predict random accesses. That's why HBC would work fine for linear traversal of a huge dataset, but it wouldn't work for game rendering, where the access patterns vary with game state, camera position, etc.
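The point about linear access patterns can be illustrated with a toy stride predictor: it only has something to prefetch when consecutive addresses differ by a constant stride, and it gives up on random accesses. This is a minimal sketch of the general idea, not a claim about how HBC is actually implemented:

```python
def predict_next(addresses):
    """Toy stride predictor: if recent accesses show a constant stride,
    guess the next address; otherwise return None (nothing to prefetch)."""
    if len(addresses) < 2:
        return None
    stride = addresses[-1] - addresses[-2]
    # Require the same stride on the previous pair too, when available.
    if len(addresses) >= 3 and addresses[-2] - addresses[-3] != stride:
        return None  # no detectable pattern -> random access, give up
    return addresses[-1] + stride

print(predict_next([100, 164, 228]))  # linear scan: predicts 292
print(predict_next([100, 7000, 42]))  # random accesses: predicts None
```

Whether real game-rendering access streams are "random enough" to defeat this kind of prediction is exactly the point under dispute in the thread.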



Captain_Tom said:


> Volta is professional only at the moment, and I don't expect that to change till MAYBE the end of 2018.  In fact I am pretty sure Nvidia confirmed the next series is another maxwell...cough.... Pascal refresh (But likely with more GDDR5X/6.   It will be stronger, but lack all of these features.
> 
> The only exception I can think of is possibly a cut-down Volta sold as a Titan card for $1500 - $2000.


GV102 and GV104 are already taped out, and the first test batch will arrive soon. So unless Nvidia runs into problems like with Fermi, they could be released anywhere from 5 to 10 months from today.


----------



## Assimilator (Aug 3, 2017)

Captain_Tom said:


> No surprise here.
> 
> $400, and it slightly loses to the 1080 while having FAR better long-term technology.  This will sell well, and Vega64 should be at least 15% stronger than this!



"Long-term technology" what is this BS you're spouting? There's no such thing as future-proofing when it comes to GPU technology - a GPU is relevant for a year, year-and-a-half, maybe even 2 years and then it's just too slow for gaming.


----------



## Vya Domus (Aug 3, 2017)

efikkan said:


> Then please focus on the facts instead of personal attacks.
> 
> A predictive cache algorithm can only detect linear access patterns, just like a prefetcher does in a CPU. But it can't predict accesses when there are no patterns, because there are no way to predict random accesses. That's why HBC would work fine for linear traversal of a huge datasets, but it wouldn't work for game rendering, where the access patterns vary by game state, camera position, etc.



I literally posted a presentation from AMD themselves where they showcased HBC in action in a game. Yet you insist it isn't possible. Right, AMD knows nothing about it and you do. You know, just because you bring up all that technical stuff that hasn't got much to do with the subject, it doesn't really help you prove that you know what you are talking about.

Facts? All you do is ignore them, mate. Just like you did in our previous discussion.

We got you bro, AMD hasn't got a clue about what they're doing and you do.


----------



## oxidized (Aug 3, 2017)

Vya Domus said:


> I literally posted a presentation from AMD themselves where they showcased HBC in action in a game. Yet you insist it isn't possible. Right, AMD knows nothing about it and you do. You know, just because you bring up all that technical stuff that hasn't got much to do with the subject, it doesn't really help you prove that you know what you are talking about.
> 
> Facts? All you do is ignore them, mate. Just like you did in our previous discussion.
> 
> We got you bro, AMD hasn't got a clue about what they're doing and you do.



So basically we can't trust reviews and reviewers but we SHOULD trust the company itself? I mean how can you not be trolling?



Assimilator said:


> "Long-term technology" what is this BS you're spouting? There's no such thing as future-proofing when it comes to GPU technology - a GPU is relevant for a year, year-and-a-half, maybe even 2 years and then it's just too slow for gaming.



Hey, you're wrong! AMD has the alien technology nobody has but everyone wants; that's why everyone's constantly copying them and trying to stop them by paying to make them look bad!


----------



## Vya Domus (Aug 3, 2017)

oxidized said:


> So basically we can't trust reviews and reviewers but we SHOULD trust the company itself? I mean how can you not be trolling?



Of course it looks like trolling when you ain't got a clue about how things work and are under the impression you know better than a team of engineers.

Right, wasted enough time with you both; you are both on ignore.


----------



## oxidized (Aug 3, 2017)

Vya Domus said:


> Of course it looks like trolling when you ain't got a clue about how things work and are under the impression you know better than a team of engineers.
> 
> Right, wasted enough time with you both; you are both on ignore.



The level of your stupidity is astonishing, really. I must compliment you.


----------



## the54thvoid (Aug 3, 2017)

I thought HBC stood for "Holy Batman, Catwoman!"


----------



## Fluffmeister (Aug 3, 2017)

the54thvoid said:


> I thought HBC stood for "Holy Batman, Catwoman!"



But seriously, buy cheap (if even possible), and then flog it to miners... hmm, I'm tempted.


----------



## vega22 (Aug 3, 2017)

the54thvoid said:


> Is that not worrying?  All Nvidia have to do is refresh maxwell...cough Pascal... to stay quite far ahead at minimal cost.  All AMD are doing with these compute heavy consumer cards is fueling the mining craze.  There is no disputing AMD's consumer compute ability but:
> 
> 1) It's still not enough to be the fastest gaming GPU and ;
> 2) It keeps the profit margins very low
> ...



Nah, that bubble is bursting as we speak. By the time they are on the shelves, most miners will be dumping their older, low-power cards and using those funds to buy Vega and 4K screens for gaming.


----------



## deu (Aug 3, 2017)

Sempron Guy said:


> I couldn't explain it right but I see tearing at 75fps in all the games I tested. Capping it at 74fps does the trick. Though I'm not sure about the explanation behind it.



Same with gsync as far as I understand.


----------



## Dimi (Aug 3, 2017)

deu said:


> Same with gsync as far as I understand.



I never see any tearing on my G-Sync monitor; it goes up to 165 Hz though.


----------



## DeathtoGnomes (Aug 3, 2017)

birdie said:


> I guess you need to go see a shrink ASAP. Next you're gonna say that Jensen Huang frequents these forums and slanders AMD in his spare time.
> 
> I still hope you're were joking.


Thanks for making my point.


----------



## Captain_Tom (Aug 3, 2017)

the54thvoid said:


> Is that not worrying?  All Nvidia have to do is refresh maxwell...cough Pascal... to stay quite far ahead at minimal cost.  All AMD are doing with these compute heavy consumer cards is fueling the mining craze.  There is no disputing AMD's consumer compute ability but:
> 
> 1) It's still not enough to be the fastest gaming GPU and ;
> 2) It keeps the profit margins very low
> ...




I never said it was good, but I wouldn't say this is "bad".

Nvidia made an excellent gaming arch with Maxwell (but it's worthless for most things besides gaming and also mining lol). Nvidia can afford to have 2 architectures at the same time, and AMD cannot. However, AMD is finally starting to make money again, so this will change by 2019.


Why are you so sure Vega won't pan out? It will be mediocre now, but it has FP16/HBC/RPM/etc. Consider that Far Cry 5 and many other games will be written in FP16 in order to support game consoles and AMD cards (*that makes Vega 64 a 27 TFLOP card in that game lol*). Vega is the foundation of their future, and it was designed when AMD had no money left for the Radeon department. Thus, even more so than GCN did, this arch will start out slow for its time but age VERY well.
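For what it's worth, the 27 TFLOPS figure is just the FP32 peak doubled via Rapid Packed Math. A back-of-the-envelope check, assuming Vega 64's 4,096 shaders at roughly 1.68 GHz (the liquid-cooled card's boost clock; the air-cooled card clocks lower):

```python
def peak_tflops(shaders, clock_ghz, flops_per_clock):
    """Theoretical peak: shaders x clock x FLOPs issued per shader per clock."""
    return shaders * clock_ghz * flops_per_clock / 1000.0

fp32 = peak_tflops(4096, 1.677, 2)  # FMA = 2 FLOPs/clock -> ~13.7 TFLOPS
fp16 = peak_tflops(4096, 1.677, 4)  # packed math doubles it -> ~27.5 TFLOPS
print(round(fp32, 1), round(fp16, 1))
```

These are theoretical peaks; real workloads reach only a fraction of them, and only FP16-tolerant effects benefit from the doubling.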


----------



## efikkan (Aug 3, 2017)

Captain_Tom said:


> Why are you so sure Vega won't pan out?  It will be mediocre now, but it has FP16/HBC/RPM/etc.   Consider that Far Cry 5 and many other games will be written in FP16 in order to support game consoles and AMD cards (*That makes Vega64 a 27 TFLOP card in that gamer lol*).   Vega is the foundation of their future, and designed when AMD had no money left for the Radeon department.   Thus even more-so than GCN did, this arch will start out slow for the time but age VERY well.


Well, for starters, we hear this every single time: buy an underperforming AMD card now and it will be better in the future, but it never pans out. The primary reason is that your expectations are too inflated. Secondly, AMD will shift their focus to Navi in a few months.

FP16 is certainly interesting and will gradually see more use in the future. But for the next 2-3 years there will be a limited number of games getting a little boost there, and even with this boost it still wouldn't beat a 1080 Ti. Also, keep in mind that AMD already needs to improve their scheduling, so usage of FP16 will result in even more idle resources. HBC wouldn't give it an advantage over Nvidia unless a game needs more memory than the competition can provide and the game uses intrinsics. So simply stated, even in your best-case scenario, Vega doesn't look good in comparison to Pascal.


----------



## bug (Aug 3, 2017)

Captain_Tom said:


> I never said it was good, but I wouldn't say this is "bad".
> 
> Nvidia made an excellent gaming arch with Maxwell (But it's worthless for most other things besides also mining lol).  Nvidia can afford to have 2 architectures at the same time, and AMD cannot.    However AMD is starting to finally make money again, and so this will change by 2019.
> 
> ...


Out of curiosity, what would qualify as bad in your opinion?


----------



## Fluffmeister (Aug 4, 2017)

I'm just impressed to hear Maxwell is worthless beyond gaming and mining (lol).


----------



## Th3pwn3r (Aug 4, 2017)

Assimilator said:


> "Long-term technology" what is this BS you're spouting? There's no such thing as future-proofing when it comes to GPU technology - a GPU is relevant for a year, year-and-a-half, maybe even 2 years and then it's just too slow for gaming.



By far the worst post in this thread. If you don't realize that AMD has gone against the grain then you need to read more.


----------



## S@LEM! (Aug 4, 2017)

All in all, for such a small group as RTG, with R&D crippled by the former management, I'm just glad for what they accomplished. They always push for new stuff no matter what, and they opted for an open-standards-friendly ecosystem for long-term functionality and future-proofing. They dominate the console market and are the kings of the APU; I remember when AMD's slogan was "The Future is Fusion", back when we were fighting over quad-core CPUs. Sure, they may deliver half-baked tech sometimes, but they strike hard along the way, and I'm sure the next Vega will be even more appealing for hardcore enthusiasts, not just for pros and developers.

Can't wait for the official reviews, and God save us from the miners!


----------



## xenocide (Aug 4, 2017)

Whether or not the Vega 56 is more cost-effective than the 1070 is irrelevant when you consider that the GTX 1070 has been out for about a year and the Vega 56 still isn't out. Most people already gave Nvidia their money, and the prospect of a lateral move isn't exactly appealing.


----------



## Frick (Aug 4, 2017)

@Vya Domus @oxidized I want to hear your arguments on whether HPC has a use in gaming. No videos.



vega22 said:


> nah that bubble is bursting as we speak. by the time they are on the shelves most miners will be dumping their older, low power, cards and using those funds to buy vega and 4k screens for gaming



Bursting or simply upgrading? Aye it will burst as mining gets harder, but not completely for a while.


----------



## renz496 (Aug 4, 2017)

B-Real said:


> Then I have no idea what you checked. First is Hardware Canuks as a written review, YT videos like HW Unboxed (
> 
> 
> 
> ...



Some people were misled by that video from Jay into thinking that *the whole card only uses that much power* when running a 3D application like Fire Strike. In reality, that figure is the power used by the GPU core section only, not the entire card, because there is no sensor on the card that can measure the total power being drawn. That's why reviewers like TechPowerUp and Tom's Hardware do not rely on the power consumption reported by MSI Afterburner (as Jay did at the time) and instead use dedicated equipment to measure the GPU's power consumption in isolation. This is a PCB analysis of the exact card Jay used (XFX RX480 GTR):
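To illustrate why a core-only sensor reading undershoots the real figure, here is a toy breakdown with purely hypothetical numbers (not measurements of any actual card): board power adds the memory rail, VRM conversion losses, and the fan on top of the core power the software reports.

```python
def board_power(core_w, memory_w, fan_w, vrm_efficiency):
    """Total board draw: rails the core sensor misses, plus VRM conversion loss."""
    rail_power = core_w + memory_w          # power delivered to the chips
    return rail_power / vrm_efficiency + fan_w

# Hypothetical example: a 110 W "GPU power" reading understates the real draw.
print(round(board_power(core_w=110, memory_w=35, fan_w=3, vrm_efficiency=0.85), 1))
```

Even with made-up but plausible proportions, the gap between the reported core figure and total board draw is tens of watts, which is why reviewers measure at the PCIe slot and power connectors instead.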


----------



## renz496 (Aug 4, 2017)

Captain_Tom said:


> No surprise here.
> 
> $400, and it slightly loses to the 1080 while having FAR better long-term technology.  This will sell well, and Vega64 should be at least 15% stronger than this!



That might have been wonderful if Vega had launched at the same time as the GTX 1080/1070 back in May 2016, but right now we're already nearing the end of Pascal's life cycle. If those features are really important to have, Nvidia will have them as well. Volta will be here in less than a year.


----------



## renz496 (Aug 4, 2017)

Captain_Tom said:


> FP16 *(It will be a 26+ TFLOP card in many upcoming games)*, HBC, and asynchronous compute.   There are more, but those are the big ones.
> 
> 
> Although you are very "vocal" in these forums, so I already know you are aware of them, and I can''t wait to see what your fanboy response will be.



But it's not going to double your frame rate, and the entire scene will not be computed in FP16, because certain effects specifically need FP32 computation; only certain effects can use it. That 26 TFLOPS figure is massive, but the use case is limited.


----------



## Assimilator (Aug 4, 2017)

Th3pwn3r said:


> By far the worst post in this thread. If you don't realize that AMD has gone against the grain then you need to read more.



What does that even mean?


----------



## renz496 (Aug 4, 2017)

Captain_Tom said:


> Volta is professional only at the moment, and I don't expect that to change till MAYBE the end of 2018.  *In fact I am pretty sure Nvidia confirmed the next series is another maxwell...cough.... Pascal refresh (But likely with more GDDR5X/6.   It will be stronger, but lack all of these features.*
> 
> The only exception I can think of is possibly a cut-down Volta sold as a Titan card for $1500 - $2000.



Another case of people underestimating what Nvidia can do. It seems you were hoping Nvidia would not integrate these more advanced features into their GPUs so AMD could get on top. Then let me tell you this:
1) Nvidia launched the 1080 Ti in March and dropped the 1080's price by $100, even without an AMD product competing in that segment.
2) AMD supplied pro drivers with Vega FE as an added feature that Nvidia's Titan did not have; about a month later, Nvidia responded by releasing pro drivers for the Titan.

So you still think Nvidia will let AMD easily one-up them? Consumer Volta by the end of 2018? That's your wishful thinking, right?


----------



## HisDivineOrder (Aug 4, 2017)

I hope it's true.  That would be nice for AMD, which is nice for competition.

That being said, I have to think that anyone who wanted that level of performance for that price has bought in already. And if they've waited this long, wouldn't they wait a few more months to see what nVidia does next? I mean, I know you can always argue "wait a while, get more for less", but in this case... AMD is so late with their product that they've already crossed into nVidia's next generation, and they're only barely keeping up with what came out way back when.

If AMD had leapfrogged nVidia, it'd be a different story. Mostly matching nVidia at similar pricing isn't really going to cut it, imo. Perhaps AMD believes that tying Ryzen's success to Vega will boost Vega, too? I just don't know. Are there really that many AMD fans out there who'll ditch their 1070s to get similar performance for similar pricing?


----------



## laszlo (Aug 4, 2017)

Wow! So much BS in every Vega thread/news comment section... green/red war nonstop... Fudzpowerup?


----------



## Manu_PT (Aug 4, 2017)

Captain_Tom said:


> No surprise here.
> 
> $400, and it slightly loses to the 1080 while having FAR better long-term technology.  This will sell well, and Vega64 should be at least 15% stronger than this!



"future proof" on a GPU? Cool story bro.


----------



## Vya Domus (Aug 4, 2017)

Frick said:


> @Vya Domus @oxidized I want to hear your arguments on wether HPC has a use in gaming. No videos.



Every new feature AMD makes either doesn't really exist or it sucks, even before you can see the damn things in action. If it's AMD, it's all fake news; everyone knows that, right?


----------



## vega22 (Aug 4, 2017)

Frick said:


> @Vya Domus @oxidized I want to hear your arguments on wether HPC has a use in gaming. No videos.
> 
> 
> 
> Bursting or simply upgrading? Aye it will burst as mining gets harder, but not completely for a while.



It will for sure come down to how deeply they're invested already, but I have seen plenty who have already bailed, cashing out to those late to the game; some from panic at the exchanges, and others preempting the jump in difficulty.

Burst is probably the wrong word, but I think the tide has turned and the swell is now receding.


----------



## bug (Aug 4, 2017)

Fluffmeister said:


> I'm just impressed to hear Maxwell is worthless beyond gaming and mining (lol).


I can confirm. Just the other day I told Maxwell I had laundry to do. It was totally useless.


----------



## oxidized (Aug 4, 2017)

Frick said:


> @Vya Domus @oxidized I want to hear your arguments on wether HPC has a use in gaming. No videos.
> 
> 
> 
> Bursting or simply upgrading? Aye it will burst as mining gets harder, but not completely for a while.




HBM on Fury was supposed to be a new era. Sure... a new era...

I can only imagine what Vega will be with its HBM2 and HBC...


Anyway, I'll just wait; I have no idea whether Vega will be good or bad yet. I only have a bad feeling: the TDPs are way too high, and the fact that they reached Nvidia's performance a year later makes me think I'm right that AMD probably put a huge amount of money into Ryzen and left RTG with basically nothing. So I'd say they're actually doing even too well, in my opinion; Polaris was good enough for a company in a not-so-great situation at the moment. Of only one thing I'm pretty sure: HBM2 will barely affect performance in most uses of those cards, especially video games. Fury set an example.


----------



## jabbadap (Aug 4, 2017)

Frick said:


> @Vya Domus @oxidized I want to hear your arguments on wether HPC has a use in gaming. No videos.
> 
> Bursting or simply upgrading? Aye it will burst as mining gets harder, but not completely for a while.



Well, I think you mean HBCC, not high-performance computing, aka HPC. HBCC has its use if a game demands more VRAM than your card has. When AMD demoed it, they were using Vega with "2 GB" of available VRAM and toggling HBCC off and on.
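As a rough mental model of that demo (my own LRU paging sketch under simplifying assumptions, not AMD's actual replacement policy), treat local VRAM as a page cache over a larger memory pool: a working set bigger than "VRAM" can still mostly hit as long as accesses show locality, while rarely touched pages get evicted.

```python
from collections import OrderedDict

class PageCache:
    """Toy LRU cache: a small set of 'VRAM' pages backed by a larger pool."""
    def __init__(self, capacity_pages):
        self.capacity = capacity_pages
        self.resident = OrderedDict()   # page id -> present (order = LRU order)
        self.hits = self.misses = 0

    def access(self, page):
        if page in self.resident:
            self.hits += 1
            self.resident.move_to_end(page)        # mark most recently used
        else:
            self.misses += 1                       # page fault: fetch remotely
            if len(self.resident) >= self.capacity:
                self.resident.popitem(last=False)  # evict least recently used
            self.resident[page] = True

# 8-page "VRAM" over a 32-page asset pool: the hot subset keeps hitting.
cache = PageCache(capacity_pages=8)
for frame in range(100):
    for page in [0, 1, 2, 3]:        # hot working set stays resident
        cache.access(page)
    cache.access(frame % 32)         # occasional touch of the cold pool
print(cache.hits, cache.misses)
```

The point of the sketch: with locality, most accesses hit local memory even though the pool is 4x the "VRAM", which is the scenario AMD's 2 GB demo was built to show. Without locality, the miss (fault) rate climbs, which is the failure mode discussed earlier in the thread.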




Frick said:


> @Vya Domus
> Bursting or simply upgrading? Aye it will burst as mining gets harder, but not completely for a while.


And then there will be the next coin to mine, and all the crap begins from the start again.


----------



## Frick (Aug 4, 2017)

jabbadap said:


> And then there will be next coin to mine and all crap begins from the start again.



There were years between the Bitcoin craze and this latest craze though, so we're likely to have some quiet time. I hope.


----------



## xenocide (Aug 4, 2017)

Manu_PT said:


> "future proof" on a GPU? Cool story bro.



The more I read that post of his the more hilariously stupid it sounds.


----------



## Frick (Aug 5, 2017)

xenocide said:


> The more I read that post of his the more hilariously stupid it sounds.



Depends on how you look at it I guess, and whether you consider turning down settings to be "proper" gaming. The 7970 lasted for a really long while.


----------



## xenocide (Aug 9, 2017)

Frick said:


> Depends on how you look at it I guess, and whether you consider turning down settings to be "proper" gaming. The 7970 lasted for a really long while.



While that's true, to imply--or outright say--that AMD GPUs have some magical ability to be future-proof is kind of absurd.  GCN was an anomaly that benefited from advancements in APIs such as DirectX 12 and Vulkan/OpenGL more than its competitors.  A lot of the cards that came out around that time are still viable with some adjustments, depending on the games you play.  No way in hell are you maxing out BF1 with something like a 7870 or 670, but you can easily drop some settings and get it to work well enough.  Yeah, in modern games the 7xxx series may have "aged" better than the 6xx series, but those cards are, what, 5 years old now?  None of them run modern games _well_.  It's just that while they both run them okayish, one does them slightly more okayish.


----------



## Vya Domus (Aug 9, 2017)

xenocide said:


> While that's true, to imply--or outright say--that AMD GPU's have some magical ability to be future proof is kind of absurd.  GCN was an anomaly that benefited from advancements in API's such as DirectX 12 and Vulkan/OGL more than its competitors.  A lot of the cards that came out around that time are still viable with some adjustments and depending on the games you play.  No way in hell are you maxing out BF1 with something like a 7870 or 670, but you can easily drop some settings and get it to work well enough.  Yea, in modern games the 7xxx series may have "aged" better than the 6xx series, but those cards are what 5 years old now?  None of them run modern games _well_.  It's just that while they both run them okayish, one does them slightly more okayish.



I think you have the wrong idea of what "future proofing" is. Future proofing is not the ability to run games maxed out for ages; that is obviously impossible. What it means is being able to play games without any major handicap while still supporting new software and not being left out. And in this regard, GCN is in fact more future-proof, even though you aren't maxing out games anymore. Not everyone is spending $300+ every 2 years, so slightly more okayish is actually preferable for something that's 4 years old.


----------



## Liviu Cojocaru (Aug 9, 2017)

5 days left, and hopefully all of these nonsense disputes will go away


----------



## Th3pwn3r (Aug 9, 2017)

Liviu Cojocaru said:


> 5 days left and hopefully all of this nonsense disputes will go away


It'll just create new ones, and more idiotic posts from biased people.


----------



## bug (Aug 9, 2017)

Liviu Cojocaru said:


> 5 days left and hopefully all of this nonsense disputes will go away


Only if somehow Vega beats expectations. Otherwise we'll be back into "futureproofing" and "drivers need to mature" discussion territory.


----------

