# AMD RX480 Confirmed at $199



## xkm1948 (Jun 1, 2016)

Not bad at all, not bad!

Now I am excited again for Vega this year.


----------



## xkm1948 (Jun 1, 2016)

And AMD's answer to 1080:

If one doesn't work, build in moar cards!!


----------



## chinmi (Jun 1, 2016)

will it be faster than nvidia's $200 card or not? that's the big question.
And Crossfire has lots of bugs. AFAIK it doesn't work with borderless windowed mode (at least my last AMD CF setup didn't), and it's not as efficient as one faster card.
I'd rather spend an extra $100 to $200 on one card than have to do Crossfire.


----------



## Xzibit (Jun 1, 2016)

This is the interesting picture. It's small


----------



## Fluffmeister (Jun 1, 2016)

xkm1948 said:


> And AMD's answer to 1080:
> 
> If one doesn't work, build in moar cards!!
> 
> View attachment 75098



The GTX 1070 isn't looking so bad after all! Solid performance every time, without worrying about Crossfire support and its inherent stuttering.

And that's Ashes too, the ultimate AMD game.


----------



## natr0n (Jun 1, 2016)

cheaper priced cards are always nice.


----------



## Fluffmeister (Jun 1, 2016)

natr0n said:


> cheaper priced cards are always nice.



Definitely, let's hope they get some joy against the 1060. As it stands, a nice custom 1070 for around $400-450 is looking like a killer deal.


----------



## Naito (Jun 1, 2016)

xkm1948 said:


> And AMD's answer to 1080:



You're being misleading. A lead in one benchmark/game is not what makes a good card.

Looks great for the price. We'll see some reviews soon enough. Hopefully I'm not speaking too soon by saying well done AMD. Things may become very interesting in this price range.


----------



## ViperXTR (Jun 1, 2016)

Fluffmeister said:


> The GTX 1070 isn't looking so bad after all! Solid performance every time, without worrying about Crossfire support and its inherent stuttering.
> 
> And that's Ashes too, the ultimate AMD game.


Ashes produces frame pacing issues even on single AMD cards (Fury series, as per Guru3D's FCAT tests vs the 1070/1080).

Also, $199 is for the 4GB, yeah? Probably $229 or $249 for the RX 480 8GB variant.

The GTX 1060 had better be gud to match its price/performance.


----------



## Naito (Jun 1, 2016)

ViperXTR said:


> GTX 1060



There have been rumours of a GP104-150-A1 chip around the web, though that is probably reserved for a 1060 Ti. Who knows how far off GP106 is.


----------



## Fluffmeister (Jun 1, 2016)

ViperXTR said:


> Ashes produces frame pacing issues even on single AMD cards (Fury series, as per Guru3D's FCAT tests vs the 1070/1080).
> 
> Also, $199 is for the 4GB, yeah? Probably $229 or $249 for the RX 480 8GB variant.
> 
> The GTX 1060 had better be gud to match its price/performance.



I hope you're right, but to quote Kanan: if you have similar performance already... why bother?

I suspect the 1060 will be a suitable headache.


----------



## Caring1 (Jun 1, 2016)

This just confirms what I said about wanting a 150W card that doesn't perform worse in gameplay compared to the older gen.
And to boot, it is Fury-sized once the cooler is removed and water added.
Bonus is, another wish of mine: no DVI, so it can be single slot.


----------



## newtekie1 (Jun 1, 2016)

xkm1948 said:


> Not bad at all, not bad!
> 
> Now I am excited again for Vega this year.



I'll wait for real performance numbers before making that judgement.  We know performance won't be close to the 1070, or they would have priced it as such.



xkm1948 said:


> And AMD's answer to 1080:
> 
> If one doesn't work, build in moar cards!!



Not exactly a good answer. It isn't hard to find a benchmark where two cheaper cards perform way better than a single, far more expensive one. I mean, two $200 GTX 960s outperformed a GTX 980; that didn't make the GTX 960 a good card, and people buying two for SLI right off the bat were stupid.


Xzibit said:


> This is the interesting picture. It's small




Very GTX 670-ish.


----------



## ViperXTR (Jun 1, 2016)

wait, just realized TDP is same as 1070 '__'


----------



## Naito (Jun 1, 2016)

ViperXTR said:


> wait, just realized TDP is same as 1070 '__'



Could it be they're just quoting 75W from PCIe lane + 75W from 6 pin PCIe power? Might not be actual power usage.
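Naito's reading can be sanity-checked with the connector wattages from the PCIe spec (75 W from an x16 slot, 75 W from a 6-pin, 150 W from an 8-pin). A minimal sketch; the two card configurations below are just the examples discussed in this thread:

```python
# Theoretical maximum board power = PCIe x16 slot budget plus each
# auxiliary power connector (wattages per the PCIe CEM spec).
CONNECTOR_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def max_board_power(aux_connectors):
    """Sum the slot's 75 W with every auxiliary connector on the card."""
    return CONNECTOR_WATTS["slot"] + sum(CONNECTOR_WATTS[c] for c in aux_connectors)

print(max_board_power(["6-pin"]))  # RX 480 reference (single 6-pin): 150
print(max_board_power(["8-pin"]))  # GTX 1070 FE (single 8-pin): 225
```

So a 150 W figure for the RX 480 may simply be the delivery ceiling of slot + 6-pin, not measured draw.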


----------



## Caring1 (Jun 1, 2016)

Naito said:


> Could it be they're just quoting 75W from PCIe lane + 75W from 6 pin PCIe power? Might not be actual power usage.


Partner boards may come out with an 8 pin to increase that and raise overclocking limits.


----------



## wiak (Jun 1, 2016)

Caring1 said:


> This just confirms what I said about wanting a 150W card that doesn't perform worse in gameplay compared to the older gen.
> And to boot, it is Fury-sized once the cooler is removed and water added.
> Bonus is, another wish of mine: no DVI, so it can be single slot.


AMD has had no DVI for a few years now, thank god. No need to live in the dark ages anymore: DP 1.4 + HDR, HDMI 2.0, and HEVC encode/decode at 4K 10-bit. Nice.


----------



## darkangel0504 (Jun 1, 2016)

Loading at 51%


----------



## RejZoR (Jun 1, 2016)

Ok, Ashes of the Singularity is a VERY specific example. Is there a direct link to the release video? I wanna watch it soon...


----------



## medi01 (Jun 1, 2016)

As a single card: nice. (Although a tad disappointed about 4GB; if 8GB is $229 we are still fine.)
But in Crossfire it is below the 1080, above the 1070.
But then it is more expensive than a 1070 too, so there's hardly any point buying if the 1070 is available at around $450.


----------



## Sempron Guy (Jun 1, 2016)

ViperXTR said:


> wait, just realized TDP is same as 1070 '__'



Well, you know TDP. Actual consumption could be less or more. But I doubt that, with only a single 6-pin, it will saturate the whole 150W.


----------



## ViperXTR (Jun 1, 2016)

Naito said:


> Could it be they're just quoting 75W from PCIe lane + 75W from 6 pin PCIe power? Might not be actual power usage.





Sempron Guy said:


> Well, you know TDP. Actual consumption could be less or more. But I doubt that, with only a single 6-pin, it will saturate the whole 150W.


Guess we'll just have to wait for benchmark tests, and looking at how small the card is, it's probably not gonna saturate that.



darkangel0504 said:


> Loading at 51%



50%, so that's like just 1 card running?  '__'


----------



## Xzibit (Jun 1, 2016)

RejZoR said:


> Ok, Ashes of the Singularity is a VERY specific example. Is there a direct link to the release video? I wanna watch it soon...



Polaris RX 480 @ 29:00+









I don't know if it's intentional or not, but note when the DOOM devs say "true asynchronous compute" @ 34:29.


----------



## medi01 (Jun 1, 2016)

Doesn't the 1070 use an 8-pin connector? (and even two on AIBs)

PS
RX 480 8GB & gtx1070
*Price difference - 65%*
DX11 performance difference - 8%
DX12 performance difference - <23%

RX 480(x) 8GB & gtx1070
*Price difference - 40%*
DX11 performance difference - <27%
DX12 performance difference - <14%

480 8GB reference - $229
gtx 1070 reference - $449

DX11 performance derived from 3DMark 11
DX12 performance derived from AotS


----------



## sabre23 (Jun 1, 2016)

I pray for the 8GB RX480 to come out on top of the GTX 970 in every game.
A $230 card will be available in India for $310, which almost doesn't break the budget of most Indian gamers.
Average pricing of the GTX 1070 here will be $575.


----------



## Caring1 (Jun 1, 2016)

sabre23 said:


> I pray for the 8GB RX480 to come out on top of the GTX 970 in every game.
> A $230 card will be available in India for $310, which almost doesn't break the budget of most Indian gamers.
> Average pricing of the GTX 1070 here will be $575.


Even if they are equal, one may perform better in some games than the other. The RX480 will be good value and worth buying.


----------



## dj-electric (Jun 1, 2016)

medi01 said:


> gtx 1070 reference - $449



$379. Nobody cares about NVIDIA's stupid FE.


----------



## Valdas (Jun 1, 2016)

Is it just me, or was that AotS comparison not entirely apples to apples? It looks like, at the very least, the shadow quality was different.


----------



## medi01 (Jun 1, 2016)

Dj-ElectriC said:


> $379. Nobody cares about NVIDIA's stupid FE.


Check AIB pricing for custom cards then.
Or street f*cking prices, for that matter, with a 789 Euro MSRP for the FE in Germany.
(The top Asus MSRP is 799, and that card sucks at OCing)



Valdas said:


> Is it just me, or was that AotS comparison not entirely apples to apples? It looks like, at the very least, the shadow quality was different.


Saw that too.
I was told AotS renders a smaller amount of snow on nVidia cards, as can be seen here (red arrow):







PS
Frankly, we've seen the 3DMark leak, with CF 480s being next to the 1080.
They shouldn't need tricks to beat the 1080 in an AMD-favoring game.
(assuming the 480 is C7 and not C4)


----------



## Caring1 (Jun 1, 2016)

medi01 said:


> PS
> Frankly, we've seen the 3DMark leak, with CF 480s being next to the 1080.
> They shouldn't need tricks to beat the 1080 in an AMD-favoring game.
> (assuming the 480 is C7 and not C4)


RX480 has to be the C4 as it is around GTX970 performance.
So the C7 should be interesting when that is announced.


----------



## ViperXTR (Jun 1, 2016)

> Ashes of the Singularity uses some form of procedural generation for its textures (as well as unit composition/behavior, to prevent driver cheats), which means that every game session and bench run will have various differences in some details.
> 
> You can see this quite well in the second image. Looking at the chasm-like drop-off in front of the mountain (top portion), you can see that on the 480 side it's actually half filled with snow, while the 1080 run is pretty much... "dry" down there. The same can be observed with various mountain ledges, where any remotely flat surface is covered in thick white snow on the 480 and hardly any on the 1080. Lastly, the plateau on top of the same mountain is basically all snow on the 480, with almost no rock texture retained, while on the 1080 the would-be snow layer is thin enough to show some of the rock's detail beneath.
> 
> ...
> 
> ...


----------



## Delta6326 (Jun 1, 2016)

Well, AMD said they want to get VR to 99% of people. This should help, but it doesn't look like many care.
They hit the #1 request and the last.


----------



## Valdas (Jun 1, 2016)

medi01 said:


> Check AIB pricing for custom cards then.
> Or street f*cking prices for that matter, with 789 Euro MSRP for FE in Germany.
> (Top Asus MSRP is 799 and that card sucks at OCing)
> 
> ...


789 Euro is actually a decent price: $699 * 1.19 (19% VAT in Germany) * 0.8990 (current USD/EUR exchange rate) puts you at 748 Euro.
EVGA's basic custom card comes in at $609; that's 652 Euro. Don't expect to pay anything less than that for a 1080 any time soon.

As for the image comparisons, I can clearly see now that the texture quality is different; the left side looks more washed out.
As for the 3DMark chart, the videocardz admin said those numbers cannot be verified and should therefore be taken with a grain of salt.

Edit: if it's procedural textures then it makes sense.
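Valdas's conversion can be reproduced directly. A minimal sketch, using the VAT rate and USD/EUR exchange rate quoted in the post (both are that post's figures, not current values):

```python
# US MSRP (ex-tax) -> German street price in EUR:
# add 19% German VAT, then convert at the quoted 0.8990 EUR-per-USD rate.
VAT_DE = 1.19
USD_TO_EUR = 0.8990

def german_price_eur(usd_msrp):
    return usd_msrp * VAT_DE * USD_TO_EUR

print(round(german_price_eur(699)))  # GTX 1080 FE at $699 -> ~748 EUR
print(round(german_price_eur(609)))  # EVGA custom at $609 -> ~652 EUR
```

Which is why the 789 Euro MSRP is only about a 5% markup over the converted FE price.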


----------



## mroofie (Jun 1, 2016)

darkangel0504 said:


> Loading at 51%


doesn't make any sense 

51% utilization


----------



## medi01 (Jun 1, 2016)

Talking about power consumption.
According to AMD's slides:
* the 14nm transition gives 1.7x power savings
* they claim they've achieved 2.8x

So, if the 390's TDP was 275W, for the 2.8x promise to stand, the 480 should have been ~100W, should it not?
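medi01's arithmetic, made explicit: if performance is held constant and perf-per-watt improves by some factor, power should drop by that same factor. A sketch of that assumption (275 W is the 390's TDP; 1.7x and 2.8x are the slide claims cited above):

```python
# Implied TDP when performance stays the same and perf/watt improves
# by `gain`: power scales as 1/gain.
def implied_tdp(old_tdp_watts, gain):
    return old_tdp_watts / gain

print(round(implied_tdp(275, 2.8)))  # ~98 W from the full claimed gain
print(round(implied_tdp(275, 1.7)))  # ~162 W from the process alone
```

Of course the 480 isn't a 390 at identical performance, so this is only a rough bound on what the slide numbers would imply.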





Caring1 said:


> RX480 has to be the C4 as it is around GTX970 performance.


They compared it to the 970/980... But yeah, makes sense.
Oh, C4 is exactly between the 970 and 980...
Makes me a bit less excited though, as even the 480X in CF doesn't reach the 1080 in 3DMark, and C7 should consume more power; yet we have the 1070 with a 150W TDP. (But yeah, even the 1070 FE uses an 8-pin power connector vs the 6-pin on the RX 480, so the theoretical max is 225W vs 150W.)




mroofie said:


> doesn't make any sense
> 
> 51% utilization



It's a game that favors AMD GPUs, so why not.


----------



## Valdas (Jun 1, 2016)

medi01 said:


> They compared it to the 970/980... But yeah, makes sense.
> Oh, C4 is exactly between the 970 and 980...
> Makes me a bit less excited though, as even the 480X in CF doesn't reach the 1080 in 3DMark, and C7 should consume more power; yet we have the 1070 with a 150W TDP. (But yeah, even the 1070 FE uses an 8-pin power connector vs the 6-pin on the RX 480, so the theoretical max is 225W vs 150W.)


It's confirmed that C7 is the RX 480. So the question now is whether the 480 was overclocked in 3DMark or whether it used old drivers:
*AMD Polaris 67DF:C7 — Radeon R9 480X?*
http://www.3dmark.com/3dm11/11257751 8GB 1266 MHz 2000 MHz 15524
http://www.3dmark.com/3dm11/11263084 8GB 1266 MHz 2000 MHz 18060


----------



## medi01 (Jun 1, 2016)

mroofie said:


> The 8GB option will cost 250$


But why? I don't recall AMD charging that much of a premium; $229 is more likely.
$250 was the "max" estimate from the "Crossfire 480 faster than 1080 in AotS" slide.



Valdas said:


> It's confirmed that C7 is RX 480.


OMG.
Wow.

Seriously:








PS


Valdas said:


> So the question now is whether 480 was overclocked in 3D Mark or whether it used old drivers:




P13 141:  Driver version 16.200.1014.0
P14 461:  Driver version 16.200.0.0

So the older drivers gave better results.
Weird, to say the least.


----------



## mroofie (Jun 1, 2016)

xkm1948 said:


> And AMD's answer to 1080:
> 
> If one doesn't work, build in moar cards!!
> 
> View attachment 75098





medi01 said:


> But why? I don't recall AMD charging that much of a premium; $229 is more likely.
> $250 was the "max" estimate from the "Crossfire 480 faster than 1080 in AotS" slide.
> 
> 
> ...



Look at the slide: 250 + 250 = 500


----------



## Dethroy (Jun 1, 2016)

CANNOT WAIT! Let the price war begin!


----------



## Tsukiyomi91 (Jun 1, 2016)

I wouldn't put all my eggs into AMD's basket yet, since it's still unproven that this $199 card is really a GTX 1070 fighter with only 4GB on a 256-bit memory interface.


----------



## NdMk2o1o (Jun 1, 2016)

Tsukiyomi91 said:


> I wouldn't put all my eggs into AMD's basket yet, since it's still unproven that this $199 card is really a GTX 1070 fighter with only 4GB on a 256-bit memory interface.



This isn't meant to compete with the 1070; it's the 380's successor.....


----------



## Caring1 (Jun 1, 2016)

Valdas said:


> It's confirmed that C7 is RX 480.


No, it's not.
That picture doesn't show what card it is, only some numbers for the chip.


----------



## medi01 (Jun 1, 2016)

Tsukiyomi91 said:


> unproven that this $199 card is really a GTX1070 fighter


Is it trolling?
I hope it's trolling.



mroofie said:


> Look at the slide: 250 + 250 = 500


Care to elaborate?



Caring1 said:


> That picture does not show what card it is, only some numbers for the chip.


But it says chipset is "... :C7":


----------



## Caring1 (Jun 1, 2016)

medi01 said:


> But it says chipset is "... :C7":


My point exactly: nowhere is there any proof that it is the 480.
C7 could be the 480X.


----------



## medi01 (Jun 1, 2016)

Caring1 said:


> C7 could be the 480X.


Agreed.
But then it's a $300 chip. ($100-300 was mentioned as the price range of the incoming cards)

Not very competitive.

PS
I wonder if AMD plans to talk about this stuff a bit more at E3 (2 weeks from now)


----------



## eidairaman1 (Jun 1, 2016)

You can say the same about SLI having tons of bugs too.


chinmi said:


> will it be faster than nvidia's $200 card or not? that's the big question.
> And Crossfire has lots of bugs. AFAIK it doesn't work with borderless windowed mode (at least my last AMD CF setup didn't), and it's not as efficient as one faster card.
> I'd rather spend an extra $100 to $200 on one card than have to do Crossfire.


----------



## ZoneDymo (Jun 1, 2016)

xkm1948 said:


> And AMD's answer to 1080:
> 
> If one doesn't work, build in moar cards!!
> 
> View attachment 75098



On that demonstration... did you also notice that the Nvidia rendering clearly had more detail in the game?


----------



## medi01 (Jun 1, 2016)

ZoneDymo said:


> Nvidia rendering clearly had more detail in the game?


That's the game, not the settings. More snow (a random thing) vs less snow.



chinmi said:


> will it be faster then nvidia's $200 card or not ? that's the big question.


Some say the 1060 is coming this fall at the earliest.
And the 960 was worse than the 380 too (perf-wise); the only advantage was lower power consumption.
It isn't easy to surprise anyone in this price range without cannibalizing the 1070.


----------



## eidairaman1 (Jun 1, 2016)

If I was in the market I'd probably grab a 480X or 490; however, I'm not.


----------



## ZoneDymo (Jun 1, 2016)

medi01 said:


> That's the game, not the settings. More snow (a random thing) vs less snow.
> 
> 
> Some say the 1060 is coming this fall at the earliest.
> ...



On the game:
idk about that; it had more detail on the Nvidia side, just check it again. It's not just the snow, it's also the buildings and units.


----------



## medi01 (Jun 1, 2016)

ZoneDymo said:


> idk about that; it had more detail on the Nvidia side, just check it again. It's not just the snow, it's also the buildings and units.


I don't see "more units".
Check the snow at the point marked with the red arrow.

Think about it from another perspective: CF 480s scored slightly below the 1080 in 3DMark.
They shouldn't need stinky tricks to beat it in an AMD-favoring game.


----------



## Fluffmeister (Jun 1, 2016)

There is definitely more detail on the right; around that blue building on the left you can see it's missing a number of details, and the ground is lacking some definition and shadowing too, in fact right across the board.


----------



## ZoneDymo (Jun 1, 2016)

medi01 said:


> I don't see "more units".
> Check the snow at the point marked with the red arrow.
> 
> Think about it from another perspective: CF 480s scored slightly below the 1080 in 3DMark.
> They shouldn't need stinky tricks to beat it in an AMD-favoring game.



ermm, I never spoke of "more units", so why do you put that in quotation marks?
I spoke of DETAIL, more DETAIL.

More detail in the units, more detail in the buildings, and yeah, more detail in the landscape as well.
I am 100% sure the Nvidia card is running at higher settings here.


----------



## medi01 (Jun 1, 2016)

ZoneDymo said:


> ermm I never spoke of "more units"


Mea culpa; the number of units was the only thing I could think of (at the beginning of the video there were fewer).

PS
Buildings:


----------



## Fluffmeister (Jun 1, 2016)

AMD using lower settings in a game that favours them already, oh dear.


----------



## ZoneDymo (Jun 1, 2016)

medi01 said:


> Mea culpa; the number of units was the only thing I could think of (at the beginning of the video there were fewer).
> 
> PS
> Buildings:
> ...



I went back and watched the vid again, and yes, the biggest difference comes from shadowing and terrain, which I still really doubt is not a difference in settings.
There is just much more depth on the Nvidia side.

Oh, and another: notice how that canyon on the right looks much more detailed, as does the shadowing on the big red unit on the left-ish; the other looks much flatter.


----------



## medi01 (Jun 1, 2016)

ZoneDymo said:


> Oh, and another: notice how that canyon on the right looks much more detailed, as does the shadowing on the big red unit on the left-ish; the other looks much flatter.



I added a white semi-transparent layer on the right ("fog", "snow", whatever); do you still see "more details" on the right?


----------



## ZoneDymo (Jun 1, 2016)

medi01 said:


> I added a white semi-transparent layer on the right ("fog", "snow", whatever); do you still see "more details" on the right?
> 
> View attachment 75110



ermm, yes, with ease. I mean, you don't?
I know certain settings in games add little for the performance they cost; hell, look at The Division, there is virtually no difference I can distinguish between many settings on or off, but here I can clearly see more detail on the right side.
The left in many places is just a mass of brown or black, while the right has little nuances to it: little bumps and differences in the terrain.


----------



## medi01 (Jun 1, 2016)

ZoneDymo said:


> ermm yes, with ease, I mean you dont?


Nope.
Not saying you are lying or something; I just don't see it myself. (as with the zoomed building in my earlier post)


----------



## ZoneDymo (Jun 1, 2016)

medi01 said:


> Nope.
> Not saying you are lying or something; I just don't see it myself. (as with the zoomed building in my earlier post)



Well, like I said with The Division, some people see differences; others just want to know they are running something at the highest settings.
But for me here, just stepping back and looking at both pictures, the right just looks much better.


----------



## TRWOV (Jun 1, 2016)

wow... I was hoping to grab a used Nano to replace my 7970 (not that I need the upgrade THAT much, playing at 1920x1200), but this might actually be a better buy.

Although it has worse performance per watt compared to the 1070, the price difference makes up for that.


----------



## KainXS (Jun 1, 2016)

If the 480 is closer to the 980 than to the 970, then I'm a buyer; for that price, that's a good buy. It seems like an even better deal than the 78XX cards were.


----------



## Dethroy (Jun 1, 2016)

Don't think the performance/watt difference will be as big as some people make it out to be. The card will probably draw somewhere around 120W under heavy load.


----------



## thesmokingman (Jun 1, 2016)

This is priced to move units.


----------



## F-Zero (Jun 1, 2016)

Great card to replace my old R9 270 ! I'm waiting for the reviews to see what it can do.


----------



## ZoneDymo (Jun 1, 2016)

TRWOV said:


> wow... I was hoping to grab a used Nano to replace my 7970 (not that I need the upgrade THAT much, playing at 1920x1200), but this might actually be a better buy.
> 
> Although it has worse performance per watt compared to the 1070, the price difference makes up for that.



Im still running an HD6950.....and yes...im feeling it....
But every gen since i bought my card has just not been where I wanted it to be, but this might be an upgrade for me purely of how affordable it seems.
Either that or ill splash more on the GTX1070...


----------



## mroofie (Jun 1, 2016)

Fluffmeister said:


> AMD using lower settings in a game that favours them already, oh dear.


This is not gonna go well.
RIP interwebs =P


----------



## AsRock (Jun 1, 2016)

mroofie said:


> Look at the slide: 250 + 250 = 500



I know where you got the x2 from, but $250 a pop? Who said that? The last video I'd seen had the RX480 at $200 a pop; I guess it will depend on where you live.


----------



## Frag_Maniac (Jun 1, 2016)

xkm1948 said:


> And AMD's answer to 1080:
> 
> If one doesn't work, build in moar cards!!



LOL, Crossfire?

I'd much rather pay $400 for a more powerful single-GPU card, because quite frankly, Crossfire can be a pain. Plus, they happened to pick the one game DX12 works well in. Most games would likely run better on one 1070, and for $20 less.

AMD's marketing is still as dysfunctional as ever.


----------



## mroofie (Jun 1, 2016)

AsRock said:


> I know where you got the x2 from, but $250 a pop? Who said that? The last video I'd seen had the RX480 at $200 a pop; I guess it will depend on where you live.


Forget about the comment.
I was searching for the price of the 8GB version; turns out it's $230 =D


----------



## newconroer (Jun 1, 2016)

Caring1 said:


> RX480 has to be the C4 as it is around GTX970 performance.
> So the C7 should be interesting when that is announced.



Makes me realize that $200 isn't so great overseas when the conversion isn't arranged fairly. I wouldn't be surprised if the 480 launches at £180 or more. The GTX 970 will come down in price (or just go used) and possibly still be the better buy.
Then again, 8GB on the 480 makes it far superior to the 970.


----------



## D007 (Jun 1, 2016)

This  is why I don't buy AMD.. Yea they always try to compete FPS/consumption wise but they just never look the same. Nvidia looks better.
AMD tends to use tricks, so it doesn't have to render the things nvidia chooses to render, which degrades quality on AMD cards.. I went red once.. Once.. Never again..


----------



## uuuaaaaaa (Jun 1, 2016)

D007 said:


> This  is why I don't buy AMD.. Yea they always try to compete FPS/consumption wise but they just never look the same. Nvidia looks better.
> AMD tends to use tricks, so it doesn't have to render the things nvidia chooses to render, which degrades quality on AMD cards.. I went red once.. Once.. Never again..



Just search for AMD (or ATI) vs Nvidia image quality before posting things like this.

Best regards


----------



## TRWOV (Jun 1, 2016)

D007 said:


> This  is why I don't buy AMD.. Yea they always try to compete FPS/consumption wise but they just never look the same. Nvidia looks better.
> AMD tends to use tricks, so it doesn't have to render the things nvidia chooses to render, which degrades quality on AMD cards.. I went red once.. Once.. Never again..



My experience is exactly the opposite. I suppose it depends on the setup and games.


----------



## RejZoR (Jun 1, 2016)

Why doesn't AMD just slam 2x RX 480 on a single card, give it dual 6-pins, and call it a day? I know it's a dual GPU, which I don't like, but if they push enough of these onto the market, they will force the change with developers. And at least in DX12, that means the same performance as a GTX 1080, or around there. It's a lot easier and cheaper for AMD and end customers. I guess they just aren't ready yet; that's why they are banking on multi-GPU with Navi, 2 years down the road...


----------



## ZoneDymo (Jun 1, 2016)

RejZoR said:


> Why doesn't AMD just slam 2x RX 480 on a single card, give it dual 6-pins, and call it a day? I know it's a dual GPU, which I don't like, but if they push enough of these onto the market, they will force the change with developers. And at least in DX12, that means the same performance as a GTX 1080, or around there. It's a lot easier and cheaper for AMD and end customers. I guess they just aren't ready yet; that's why they are banking on multi-GPU with Navi, 2 years down the road...



Well, there might be one coming; hasn't it always been like this with dual cards? That it took a while for those to come out?


----------



## D007 (Jun 1, 2016)

uuuaaaaaa said:


> Just search for amd (or ati) vs nvidia image quality, before posting things like this.
> 
> Best regards


There are examples "in this very topic" showing AMD using lower-grade quality...
Not going to argue about what I know is a fact and what I've seen with my own two eyes.. 'Nuff said, moving on..


----------



## RejZoR (Jun 1, 2016)

ZoneDymo said:


> Well, there might be one coming; hasn't it always been like this with dual cards? That it took a while for those to come out?



I'm not talking about a super expensive single-GPU solution that comes at the end of life of a product, where they extract the last breath out of it. I'm talking about a dual-GPU solution being released on day 1. They could even revive the "Maxx" brand for it.

The main issue with highest-end single-GPU cards is that the GPU is massive, and as such its wafers are super expensive compared to cheap small GPUs that you can stack by the metric ton on a single wafer. That's why lower-end cards are so cheap and higher-end ones are so expensive. Doing multi-GPU, and doing it right, would end this. It would make production of GPUs significantly cheaper, and since multi-GPU would be so mainstream, everyone from developers to graphics vendors would have to work a lot harder on compatibility and support, compared to the current situation where they need to support those 15 people owning a multi-GPU setup of top-of-the-line cards. I know AMD is aiming for that with Navi, but they could already start this revolution with Polaris.


----------



## ZoneDymo (Jun 2, 2016)

D007 said:


> There are examples "in this very topic" showing AMD using lower-grade quality...
> Not going to argue about what I know is a fact and what I've seen with my own two eyes.. 'Nuff said, moving on..



aka *puts fingers in ears* and goes "lalalala I can't hear you lalalalala"



RejZoR said:


> I'm not talking about a super expensive single-GPU solution that comes at the end of life of a product, where they extract the last breath out of it. I'm talking about a dual-GPU solution being released on day 1. They could even revive the "Maxx" brand for it.
> 
> The main issue with highest-end single-GPU cards is that the GPU is massive, and as such its wafers are super expensive compared to cheap small GPUs that you can stack by the metric ton on a single wafer. That's why lower-end cards are so cheap and higher-end ones are so expensive. Doing multi-GPU, and doing it right, would end this. It would make production of GPUs significantly cheaper, and since multi-GPU would be so mainstream, everyone from developers to graphics vendors would have to work a lot harder on compatibility and support, compared to the current situation where they need to support those 15 people owning a multi-GPU setup of top-of-the-line cards. I know AMD is aiming for that with Navi, but they could already start this revolution with Polaris.



I was talking about a dual-GPU solution... not sure why you thought otherwise.
The 7950 GX2, 3870 X2, etc., and recently the Pro Duo: did they not all come out a little while after their single-GPU counterparts?
So who knows; while not announced right now, maybe we will see a 480(X) X2 in the near future.


----------



## AsRock (Jun 2, 2016)

To me, AMD are trying to say you can have 2x RX480s and perform nearly as well as the 1080, but if their RX480 could really perform so well, they would have used not 2 but just 1. They say 51% utilization, again implying 1 480 can perform as well as the 1080, which I call bullshit. 

Then there is the quality difference, and to me, as others have said, the details are lower on the 480.

The question is whether 2 RX480s could run at 80-90%+ utilization and give the 1080 a fight while being cheaper.

As I don't tend to buy games when they first come out, Crossfire is an option, as most issues are when games are new.

However, if I was going to go CF, I think the best time to do that will be when they work like multi-core CPUs.


----------



## OneMoar (Jun 2, 2016)

they can say 50% because we all know Crossfire scaling is shit
so in practice more like 70%


----------



## AsRock (Jun 2, 2016)

OneMoar said:


> they can say 50% because we all know Crossfire scaling is shit
> so in practice more like 70%



But my point is: if 2 cards get 50% due to terrible scaling, surely one card would get near 100%, as CF would be taken out of the picture. But they did not do that, which makes me think only bad things, and this is a game engine they have put a lot of time into.

AMD crying wolf is only going to make no one believe them when they actually do have something good.

I just wish they would stop the BS, as it makes me not want to buy their stuff, but I guess we will find out how much BS is about soon enough.


----------



## rvalencia (Jun 2, 2016)

AsRock said:


> To me, AMD are trying to say you can have 2x RX480s and perform nearly as well as the 1080, but if their RX480 could really perform so well, they would have used not 2 but just 1. They say 51% utilization, again implying 1 480 can perform as well as the 1080, which I call bullshit.
> 
> Then there is the quality difference, and to me, as others have said, the details are lower on the 480.
> 
> ...


R9-290/290X was battling 780/780 Ti and then 970/980. Over time, 780/780 Ti wasn't able to keep up with 970/980/R9-290/290X.


----------



## OneMoar (Jun 2, 2016)

AsRock said:


> But my point is: if 2 cards get 50% due to terrible scaling, surely one card would get near 100%, as CF would be taken out of the picture. But they did not do that, which makes me think only bad things, and this is a game engine they have put a lot of time into.
> 
> AMD crying wolf is only going to make no one believe them when they actually do have something good.
> 
> I just wish they would stop the BS, as it makes me not want to buy their stuff, but I guess we will find out how much BS is about soon enough.


I expect real world performance to be around the 390 with the 480 being 5 - 8 fps or so faster and losing in some scenarios


----------



## ViperXTR (Jun 2, 2016)

ZoneDymo said:


> On that demonstration...did you also notice the Nvidia rendering clearly had more detail in the game?





> Ashes of the Singularity uses some form of procedural generation for its textures (as well as for unit composition/behavior, to prevent driver cheats), which means that every game session and bench run will have various differences in some details.
> 
> You can see this quite well in the second image. Looking at the chasm like drop off in front of the mountain (top portion) you can see that on the 480 side it's actually half filled with snow, while the 1080 run is pretty much... "dry" down there. Same can be observed with various mountain ledges where any remotely flat surface is covered in thick white snow on the 480 and hardly any on the 1080. Lastly the plateau on top of the same mountain is basically all snow on the 480 with almost no rock texture retained while on the 1080 the would-be snow layer is thin enough to show some of the rock's detail beneath.
> 
> ...


----------



## Fluffmeister (Jun 2, 2016)

That's gold, too much snow:







At the end of the day, it's a $200 card, you can't expect it to compete with the big boys (mid range Pascal).


----------



## rvalencia (Jun 2, 2016)

OneMoar said:


> I expect real world performance to be around the 390 with the 480 being 5 - 8 fps or so faster and losing in some scenarios


I expect the 480 (5.84 TFLOPS)'s real-world performance to be around R9-390X (5.9 TFLOPS) to Fury Pro level.

From 3DMark 11 scores, AMD GPUs follow their SKU levels, e.g. Fury Pro > R9-390X > R9-390 > R9-380X > R9-380.
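For reference, those TFLOPS figures come from the standard peak-FP32 formula: 2 FLOPs (one fused multiply-add) per shader per clock. A quick sketch, assuming the commonly cited shader counts and boost clocks (the RX 480 numbers were not AMD-confirmed at the time):

```python
# Peak FP32 throughput in TFLOPS = 2 ops (one FMA) x shaders x clock (GHz) / 1000.
# Shader counts and boost clocks below are assumed for illustration.
def peak_tflops(shaders: int, boost_ghz: float) -> float:
    return 2 * shaders * boost_ghz / 1000.0

rx_480 = peak_tflops(2304, 1.266)   # ~5.83 TFLOPS, the figure quoted above
r9_390x = peak_tflops(2816, 1.050)  # ~5.91 TFLOPS
```

Of course, peak TFLOPS says nothing about how well the frontend keeps those shaders fed, so it's only a rough proxy for real-world performance.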


----------



## OneMoar (Jun 2, 2016)

rvalencia said:


> I expect 480(5.84 TFLOPS)'s real world performance to be around R9-390X(5.9 TFLOPS) to Fury Pro.
> 
> From 3DMarks 11 scores, AMD GPUs follows their SKU levels e.g. Fury Pro > R9-390X > R9-390 > R9-380X > R9-380.


LOLNO


----------



## rvalencia (Jun 2, 2016)

OneMoar said:


> LOLNO


Your post is LOL.


----------



## uuuaaaaaa (Jun 2, 2016)

D007 said:


> There are examples in this very topic showing AMD using lower-grade quality...
> Not going to argue about what I know is a fact and what I've seen with my own two eyes. 'Nuff said, moving on.



Like the recent debacle of Titan X vs Fury X in Battlefield 4, where the Titan X produced blurry, washed-out frames compared to the Fury X? Or back in the days of the nVidia FX series, when they purposely did not render parts of the frames in 3DMark to get higher scores? Even recently I have seen guys on the World of Tanks forums complaining about the image quality produced by nVidia cards (after "upgrading" from AMD) at the same settings, blatantly worse on a shiny new GTX 970.

There is always that guy who listens to FLACs compressed to 128 kbps MP3s and claims that the latter sound better... 

Best regards


----------



## medi01 (Jun 2, 2016)

How cool would it be if Vega 10 (a 400mm²-ish chip; the 1080 is 300-ish, the 480 200-ish) arrives this fall.



ZoneDymo said:


> But to me here, just stepping back and looking at both pictures the right just looks much better


Anyhow, from AotS site itself, C7 in CF:
http://www.ashesofthesingularity.co...-details/ac88258f-4541-408e-8234-f9e96febe303

1080:
http://www.ashesofthesingularity.co...-details/a957db0f-59b3-4394-84cc-2ba0170ab699

The game versions are different, though both run at 1440p "Crazy".




D007 said:


> Yea they always try to compete FPS/consumption wise but they just never look the same.


Dude.
I'll assume you are neither trolling nor on an NV payroll ("chizow" and company).

If you'd check sites that dig deeper, such as AnandTech, you'd realize AMD is the best of the 3 major GPU manufacturers in that regard (things such as anisotropic filtering), with nVidia somewhat worse and Intel simply terrible. And it has been that way for quite a while.




newconroer said:


> Makes me realize that $200 isn't so great overseas, when the conversion isn't arranged fairly.


That's not how AMD handled things, at least in the past.
Heck, even nGreedia didn't do 699$ => 789€ last gen.




AsRock said:


> But my point is, if 2 cards get 50% due to terrible scaling, surely one card would get near 100% since CF would be taken out of the picture. But they did not show that, which makes me think only bad things, and this is a game engine they have put a lot of time into.


Come on.
There could be other bottlenecks.
Like the CPU, for instance.

The CF vs 1080 comparison is just PR for the lols anyhow.
The 480 sitting between the 970 and 980, or even the 980 and Fury, at $199/$229 is the real news.




RejZoR said:


> Why don't AMD just slam 2x RX 480 on a single card, give it dual 6pin and call it a day?


I think they could use a single 8-pin (225W).
And yeah, why not, provided there are enough chips to satisfy demand.

Timing is crucial: they must pump out as much as they can before the 1060 (which, given nVidia's arrogance, won't be priced too competitively) arrives, so if they simply can't produce...



PS
A case for multi-GPU (for dudes who are into VR):

Moving on, we have AMD’s compelling content goal, which is backed by their Affinity Multi-GPU technology. Short and to the point, Affinity Multi-GPU allows for each eye in a VR headset to be rendered in parallel by a GPU, as opposed to taking the traditional PC route of alternate frame rendering (AFR), which has the GPUs alternate on frames and in the process can introduce quite a bit of lag. Though multi-GPU setups are not absolutely necessary for VR, the performance requirements for high quality VR combined with the simplicity of this solution make it an easy way to improve performance (reduce latency) just by adding in a second GPU






At a lower level, Affinity Multi-GPU also implements some rendering pipeline optimizations to get rid of some of the CPU overhead that would come from dispatching two jobs to render two frames. *With each eye being nearly identical, it’s possible to cut down on some of this work by dispatching a single job and then using masking to hide from each eye what it can’t actually see.*






The part marked bold is what nVidia's "simultaneous multi-projection" is likely about.
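The AFR-vs-affinity contrast in that excerpt can be sketched in a few lines. This is purely illustrative Python, not any real graphics API: `render` is a hypothetical stand-in for GPU work, and the point is only the scheduling pattern (GPUs alternating whole frames vs each GPU owning one eye of the same frame):

```python
from concurrent.futures import ThreadPoolExecutor

def render(gpu: int, view: str, frame: int) -> str:
    # Hypothetical stand-in for real GPU work; returns a label for demonstration.
    return f"frame {frame} / {view} on GPU{gpu}"

def afr(frames: int) -> list:
    # Alternate-frame rendering: GPUs take turns on whole frames,
    # so each GPU's output lags a frame behind the other's.
    return [render(frame % 2, "both eyes", frame) for frame in range(frames)]

def affinity(frames: int) -> list:
    # Affinity multi-GPU: both GPUs work on the SAME frame in parallel,
    # one eye each, cutting per-frame latency instead of just raising FPS.
    with ThreadPoolExecutor(max_workers=2) as pool:
        out = []
        for frame in range(frames):
            left = pool.submit(render, 0, "left eye", frame)
            right = pool.submit(render, 1, "right eye", frame)
            out.append((left.result(), right.result()))
        return out
```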


----------



## Mr McC (Jun 2, 2016)

A lot of posters drinking Jen-jizz trying to pass it off as milk... Benchmarks will be here soon enough.


----------



## F-Zero (Jun 2, 2016)

Concerning the AOTS image quality controversy


----------



## the54thvoid (Jun 2, 2016)

F-Zero said:


> Concerning the AOTS image quality controversy


I'm sorry but a Reddit post from the guy who is AMD Technical Marketing isn't proof. 

Let's wait on benchmarks for neutral interpretation.


----------



## RejZoR (Jun 2, 2016)

F-Zero said:


> Concerning the AOTS image quality controversy



The rocks/rocky ridges on GTX 1080 still looked a bit more detailed to me, despite having "less" snow.


----------



## Frick (Jun 2, 2016)

Didn't the IQ debate die half a decade ago?


----------



## medi01 (Jun 2, 2016)

F-Zero said:


> Concerning the AOTS image quality controversy


Cheers.



RejZoR said:


> a bit more detailed to me


I tried hard to see that, and zoomed in on the buildings. No difference in them. The rest looks like "more fog vs less fog" to me.

http://imgur.com/a/pyC3r

PS
Choosing specs that favor your card is one thing, pretty much standard for marketing people.
But outright lies about something *THAT CAN BE EASILY VERIFIED 4 WEEKS FROM NOW* would not only be a new low, it would be crystallized idiocy... Why on Earth would they do it???


----------



## uuuaaaaaa (Jun 2, 2016)

About the image quality:
_
"At present the GTX 1080 is incorrectly executing the terrain shaders responsible for populating the environment with the appropriate amount of snow. The GTX 1080 is doing less work to render AOTS than it otherwise would if the shader were being run properly."_

http://videocardz.com/60860/amd-exp...ts-radeon-rx-480-cf-vs-gtx-1080-demonstration


----------



## AsRock (Jun 2, 2016)

OneMoar said:


> I expect real world performance to be around the 390 with the 480 being 5 - 8 fps or so faster and losing in some scenarios



Which would be nice for a $200-$230 card.



Fluffmeister said:


> That's gold, too much snow:
> 
> 
> 
> ...



Bad pic; in this case, except for the lack of trees, the AMD side looks better, unless, if you were playing, all that other shit started to pop up haha.

But in the end it is a $200 card; I just don't like the deceitfulness.


----------



## vega22 (Jun 2, 2016)

medi01 said:


> PS
> A case for multi-GPU (for dudes who are into VR):
> 
> Moving on, we have AMD’s compelling content goal, which is backed by their Affinity Multi-GPU technology. Short and to the point, Affinity Multi-GPU allows for each eye in a VR headset to be rendered in parallel by a GPU, as opposed to taking the traditional PC route of alternate frame rendering (AFR), which has the GPUs alternate on frames and in the process can introduce quite a bit of lag. Though multi-GPU setups are not absolutely necessary for VR, the performance requirements for high quality VR combined with the simplicity of this solution make it a easy way to improve performance (reduce latency) just by adding in a second GPU
> ...



I think DX12 and Vulkan also build on this, allowing one "screen" to be rendered by up to 4 GPUs, each taking a quarter.


----------



## Valdas (Jun 2, 2016)

F-Zero said:


> Concerning the AOTS image quality controversy


Makes more sense than the initial explanation of procedural textures, considering that there are numerous benchmarks on YouTube showing multiple GPUs side by side without any obviously noticeable difference in terrain.
Nonetheless, if 480 performance is between the 970 and 980, it makes sense for dual 480s to beat the 1080 in AotS, since dual 980s outperform the 1080 in most cases and this is a game that runs better on AMD cards.


----------



## zargana (Jun 2, 2016)

I would say something about the price of the card: looks like the survey from TechPowerUp didn't go unnoticed by AMD. Check the link if you want.


----------



## Fluffmeister (Jun 2, 2016)

uuuaaaaaa said:


> About the image quality:
> _
> "At present the GTX 1080 is incorrectly executing the terrain shaders responsible for populating the environment with the appropriate amount of snow. The GTX 1080 is doing less work to render AOTS than it otherwise would if the shader were being run properly."_
> 
> http://videocardz.com/60860/amd-exp...ts-radeon-rx-480-cf-vs-gtx-1080-demonstration



Yet it looks better? The damn game is a joke. 

Also, I assume AMD knew this already, yet they insist on using a game that isn't a true apples-to-apples comparison? OK.


----------



## Kanan (Jun 3, 2016)

rvalencia said:


> I expect 480(5.84 TFLOPS)'s real world performance to be around R9-390X(5.9 TFLOPS) to Fury Pro.
> 
> From 3DMarks 11 scores, AMD GPUs follows their SKU levels e.g. Fury Pro > R9-390X > R9-390 > R9-380X > R9-380.


Also don't forget that GCN 4 utilizes its shaders better because of the GPU's new frontend compared to GCN 2 and 3, so "TFLOPs" is pure theory; in fact, every Polaris shader is not only better, but better utilized too. That plus the higher clocks is why the RX 480 is most likely faster than the 390X even in high-utilization games like Tomb Raider (which favors the 390X's 2816 shaders). 



> Also, I assume AMD knew this already yet they insist on using the game that isn't a true apples to apples comparison? ok.


It's normal for a company to use for marketing whatever is good for them. Nvidia and every other company does the same; they would be crazy not to.


----------



## NdMk2o1o (Jun 4, 2016)

Boohoo, AMD's $200 card doesn't match up with the $650 card from Nvidia, stop the press.......

If this card is somewhere at 390X/Fury level for $200, as all the speculation seems to point towards, I couldn't give a fiddler's elbow whether the AotS demo was rigged or not (though, as many have pointed out, no run of AotS is ever the same and you will see more/different textures with every run... but hey, why listen to reason when you can spout a load of nonsense). If this is going to be THE go-to mid-range GPU, as it is possibly shaping up to be, then NV need to come back with something damn special, seeing as they are selling the 980 and 970 successors at 980 Ti and 980 prices; let's not even contemplate the next Titan card.

AMD have the right idea: flood the mid-range market with fast, affordable cards. I for one will be buying a 480 and they haven't even been released or showcased yet. That, right there, is marketing.


----------



## R-T-B (Jun 4, 2016)

Frick said:


> Didn't the IQ debate die half a decade ago?



Not with procedural generation exposing supposed minute differences...

Never mind the fact that procedural generation will differ every fucking run...  guys, trust me.  I'm a developer who does a lot of procedural generation.  It's a fully legit point.  There is no IQ difference.  There has not been for several generations of cards.  Mark my words.



> Boohoo AMD's $200 card doesn't match up with the $650 card from Nvidia OMG stop the press.......



True, but IQ should never differ.  It should just run slower.  That said, I really doubt it does.  Could you pick a worse possible title to compare on than one with frickin' procedural textures?


----------



## Fluffmeister (Jun 4, 2016)

Kanan said:


> It's normal for a company to use for marketing whatever is good for them. Nvidia and every other company does the same; they would be crazy not to.



I don't disagree at all, it makes perfect sense they focus on the one game that shows them in the best light.

But as R-T-B mentioned above, a game that generates content randomly sounds like a pretty poor baseline to start with, but hey, AMD chose it.

The fact that many think the game looks better when it's apparently being rendered wrong is just plain funny to me.

I guess the 267 people currently playing the game (or is that just the benchmark?) care, so maybe AMD are on to something:

http://steamcharts.com/app/228880


----------



## Kanan (Jun 4, 2016)

Fluffmeister said:


> I don't disagree at all, it makes perfect sense they focus on the one game that shows them in the best light.
> 
> But as R-T-B mentioned above, a game that randomly generates sounds like a pretty poor baseline to start with, but hey AMD choose it.
> 
> ...


It's not really important either; it's just nerds raging about it. I think AMD wanted a game to show off their Async Compute, something Nvidia isn't good at. We still have to wait for proper reviews and benchmarks; then we'll know. That said, the leaked 3DMark bench was better, but with that high GPU utilization it's only comparable to games like Tomb Raider. Worst case, if that bench is not fake, real-world gaming is -20% perf from that, but it could very well also be the real performance of the GPU; it depends on whether they closed the bottlenecks in GCN's frontend (shader utilization).


----------



## Fluffmeister (Jun 4, 2016)

Kanan said:


> It's not really important either; it's just nerds raging about it. I think AMD wanted a game to show off their Async Compute, something Nvidia isn't good at. We still have to wait for proper reviews and benchmarks; then we'll know. That said, the leaked 3DMark bench was better, but with that high GPU utilization it's only comparable to games like Tomb Raider. Worst case, if that bench is not fake, real-world gaming is -20% perf from that, but it could very well also be the real performance of the GPU; it depends on whether they closed the bottlenecks in GCN's frontend (shader utilization).



That's also the problem: AotS is a known quantity; we already know AMD do well in it. They didn't show us anything we haven't seen before.

I'm not knocking the card at all, the price is clearly great, but numbers from one game straight from the horse's mouth leave me wanting more. Roll on the end of the month.


----------



## Mussels (Jun 4, 2016)

not sure if this quote was shown in the previous pages:


----------



## TRWOV (Jun 4, 2016)

Two things I'm expecting:

- RX 480X... let's say, $250-$300?
- ITX version of RX 480


----------



## RejZoR (Jun 4, 2016)

I want to see what the RX 490X can offer. Or even an RX Fury. But we'll have to wait till autumn for that, AFAIK.


----------



## medi01 (Jun 4, 2016)

RejZoR said:


> I want to see what RX 490X can offer. Or even RX Fury. But We'll have to wait till autumn for that afaik.


Fury in autumn is rather unlikely (the 400mm² Vega 10 chip will likely be marketed as the 490/490X), although with those chips AMD should easily take on the 1080.

Fury should be Vega 11, in 2017.

PS
If you wonder why Vega 11 is sometimes the smaller chip and sometimes the bigger: the number is not related to the size of the chip at all; it only reflects the start date of the project.


----------



## Pehla (Jun 4, 2016)

Frag Maniac said:


> LOL, Crossfire?
> 
> I'd much rather pay $400 for a more powerful single GPU card, because quite frankly, Crossfire can be a pain. Plus they happened to pick the one game Dx12 works well on. Most games would likely run better on one 1070, and for $20 less.
> 
> AMD's marketing is still as dysfunctional as ever.



I guess you didn't watch that video a guy made about the "AMD master plan".
As weird as it sounds, it's all developing exactly as the guy described... there is a reason why they make smaller GPUs.
The Crossfire that everybody fears? AMD is going to fix that too, just give them some more time. They aren't going to bother with it themselves... they will hand it over to game developers to do it for them!! According to the "AMD MASTER PLAN".


----------



## HD64G (Jun 4, 2016)

Guys, let's make it simple.

The RX480 is a great-value GPU aiming to flood the market, OK?

Now, about its performance: if it lands just at 390 level it will be great. If it surpasses that, it will be the bargain of the century in GPU history imho. $200 is almost half of what previous-gen GPUs cost in FPS/$ terms. And please do not compare to used GPU prices... 

In conclusion, as nVidia chose to release performance GPUs first and AMD bargain GPUs, we have no direct competition for the next few months. We won't have that until next spring. So, anyone ready and willing to buy a GPU now, pick the best for your wallet's depth, but don't start flame wars. We will have time for that in 2017.


----------



## Frag_Maniac (Jun 4, 2016)

Pehla said:


> I guess you didn't watch that video a guy made about the "AMD master plan".
> As weird as it sounds, it's all developing exactly as the guy described... there is a reason why they make smaller GPUs.
> The Crossfire that everybody fears? AMD is going to fix that too, just give them some more time. They aren't going to bother with it themselves... they will hand it over to game developers to do it for them!! According to the "AMD MASTER PLAN".



No, actually I watched that vid, and I also noted that at the end of the 2nd one the vid's author correctly states, "...but if anyone can screw it up, it's AMD."

THAT was the most telling part of the videos. Look at how customer satisfaction with SLI compares to that with Crossfire. Look at customer reviews of AMD GPUs vs those of Nvidia ones. Also, note that his entire premise that AMD will eventually dominate gaming rests on the assumption that Nvidia will back out of gaming and go entirely industrial.

It's all predicated on what Nvidia decides, because that is the only way AMD will get a decent share of game endorsements back. It would literally take Nvidia giving up on gaming for that to happen, and honestly, I don't see that happening, especially with the huge success they've had at it lately.

Currently, AMD has a very small percentage of game endorsements, and that also has a lot to do with why they're so far behind on that Crossfire support you admit will take some time. In fact, if Nvidia stays strong in gaming, they're pretty much in the driver's seat, because AMD can't afford to spend the money it would take to catch up, and they're simply too mistake prone even if they did.

Have you not noticed on threads about multi GPU topics, that most prefer not to use Crossfire? I think you should also take a look at how the red vs green reviews compare. Nvidia isn't going anywhere. Those vids were nothing but a very AMD biased dream.


----------



## RejZoR (Jun 4, 2016)

Except AMD owns the console segment entirely. And we all know consoles are where everything starts these days.


----------



## Frag_Maniac (Jun 4, 2016)

RejZoR said:


> Except AMD owns the console segment entirely. And we all know consoles are where everything starts these days.




1. The console market has extremely small profits on the hardware end

2. Devs are voicing their discontent on how consoles are stagnating the evolution of gaming

3. There's been a lot of growth in new dev teams writing for PC

4. Players are getting fed up with the mediocre quality of AAA titles developed on console

5. There are quite a few former console players switching over to PC

6. Digital distribution has drastically increased overall PC game  sales

The above all spells an eventual shift in market share to larger PC growth in players and developed games. In fact it's been going on for some time. Pretty much ever since digital distribution. Consoles are FAR from where "everything" starts. More correctly, it's where most AAA titles start, and a decreasing number of exclusives. As I said though, the AAA game market is becoming unacceptably shoddy, repetitious,  and boring to many players.


----------



## RejZoR (Jun 4, 2016)

It's not about making money through console hardware, it's how AMD dictates the standards used as a whole. It's where everything begins. It's NVIDIA that needs to beg developers to do exclusive stuff for their hardware. And that reflects on PC as well. And numbers don't lie, console segment is way ahead of PC whether you like it or not. It may change in the future, but as things stand now, that's how it is.


----------



## Frag_Maniac (Jun 4, 2016)

RejZoR said:


> It's not about making money through console hardware, it's how AMD dictates the standards used as a whole. It's where everything begins. It's NVIDIA that needs to beg developers to do exclusive stuff for their hardware. And that reflects on PC as well. And numbers don't lie, console segment is way ahead of PC whether you like it or not. It may change in the future, but as things stand now, that's how it is.



LOL, "dictate"? Look at AMD's woeful market share. They're in no position to "dictate" anything. The funny thing is, YOU yourself are part of that equation: one of many who defend AMD in an argument, but when it comes time to shell out the money, go Intel and Nvidia (yeah, that's right, I can read your spec chart). Hell, having an AMD 7970, I'm more pro-AMD than YOU are. I'm just not inclined to deny the facts. Part of the reason I bought my 7970 is that AMD had VERY strong game endorsement at the time, and I got a solid 3-game bundle when purchasing it. That's not the case anymore; anyone can see that. And AMD has controlled the console market for some time, yet Nvidia's game endorsement is still growing while AMD's is declining.

Nvidia don't need to "beg" devs to get game endorsement. Devs are glad to because they know what a huge market of Nvidia GPU owners exists, especially since the 900 series. They're also glad to because Nvidia spend more time with them, and listen to them FAR better than AMD does.

Again, you need to look at the game market holistically, vs a handful of hyped-up AAA titles that are being shoddily developed, with many getting lower-than-expected sales. AAA titles cost a lot more to make, so when they don't sell well, entire series get canned. It's been happening increasingly for some time. PC game sales are still fine and, if anything, increasing in overall share. Even Cliff Bleszinski has recently admitted he was wrong about the PC platform, and that it's the best to develop for.


----------



## Xzibit (Jun 4, 2016)

The PC market is bigger, but it generates 1/3 of the revenue consoles do in the US.






Worldwide PC MMOs account for about 3/4th of total PC gaming revenue.


----------



## medi01 (Jun 5, 2016)

Sony sold 40 million consoles so far. (PS4's I mean)
XBone about 25 I guess.
So 65 million chips over 3 years.

PC gaming market (including those gaming on IGPs) is about 100-200 million (depending on definition of "gaming" ), as far as I remember.

Sony has likely financed the lion's share of the RX 480's development.
Microsoft (Xbone) is likely involved in either the 480 or Vega 11 (the next Xbox, "Scorpio", is rumored for 2017).
None of the cross-platform developers can ignore GCN.



RejZoR said:


> Except AMD owns the console segment entirely. And we all know consoles are where everything starts these days.



nVidia is rumored to have convinced Nintendo to use its chips in the upcoming NX console.

So much for all the lame "they are simply not interested, because low margins yada yada" arguments (with Nintendo's habit of making money on hardware, margins would be even more laughable). nVidia was likely very desperate to win at least one console manufacturer.

Oh, and they pissed off Microsoft big time last gen (regarding the argument about all companies being "just about making money": it surely does mean all of them would stoop as low as it gets for it).



Xzibit said:


> Worldwide PC MMOs account for about 3/4th of total PC gaming revenue.


Online/microtransaction games are not necessarily MMO, but yeah.


----------



## RejZoR (Jun 5, 2016)

Frag Maniac said:


> LOL, "dictate"? Look at AMD's woeful market share. They're in no position to "dictate" anything. The funny thing is, YOU yourself are part of that equation. One of many whom defend AMD in an argument, but when it comes time to shell out the money, goes Intel and Nvidia (Yeah that's right, I can read your spec chart). Hell, having an AMD 7970, I'm more pro AMD than YOU are. I'm just not inclined to deny the facts. Part of the reason I bought my 7970, is AMD had VERY strong game endorsement at the time, and I got a solid 3 game bundle when purchasing it. That's not the case anymore, anyone can see that. And AMD has controlled the console market for some time, yet Nvidia's game endorsement is still growing, while AMD's is declining.
> 
> Nvidia don't need to "beg" devs to get game endorsement. Devs are glad to because they know what a huge market of Nvidia GPU owners exists, especially since the 900 series. They're also glad to because Nvidia spend more time with them, and listen to them FAR better than AMD does.
> 
> Again, you need to look at the game market holistically, vs a handful of hyped up AAA titles that are being shoddily developed, with many getting less than expected sales. AAA titles cost a lot more to make, so when they don't sell well, entire series get canned. It's been happening increasingly for some time. . The PC game sales are still fine, and if anything, increasing in share overall. Even Cliff Bleszinski has recently admitted he was wrong about the PC platform, and that it's the best to develop for.



Lol, you call yourself more pro-AMD just because, after several years, I own 1 GeForce card. Funny man. My last GeForce before the GTX 980 was a 7600GT. Now go and count how many generations of AMD/ATi cards I had in between...

NVIDIA can have 90% of the PC market, but in the end the life of nearly every game, especially AAA games, begins on consoles. Maybe that will change in the near future, but as things currently stand, that's how it is. And there, AMD owns 100% of the market. But hey, who am I to list actual statistics...


----------



## medi01 (Jun 5, 2016)

Frag Maniac said:


> ...huge market of..


Let's not buy into too much FUD here, shall we?
Whoever is selling more of things last quarter is, of course, interesting, but misleading too.
As seen on Steam's site (which is as PC gaming as it gets):
http://store.steampowered.com/hwsurvey





First, the nVidia-to-AMD GPU market share is roughly 2 to 1, not the 3-4 to 1 seen in quarterly sales reports.
Second, nVidia has roughly half of Steam's (read: "PC gaming") market.
Is that quite a lot? Yes.
But nowhere near the 80% figures we saw recently.
Oh, and nVidia also has close to zero of the console market.

More interesting in this context is what a multi-platform game developer would use as a "base". And, oops, that's the Xbone and PS4. Which have what? AMD APUs.

Again, nothing stops gaming devs from eating the FUD, but the woefulness of AMD's GPU market share (sure, it would suffer if Polaris fails vs Pascal, although we see no signs of that at the moment; quite the opposite) is overrated, to say the least.


----------



## RejZoR (Jun 5, 2016)

AMD has done the analysis: PC gamers spend the most money on graphics cards within the $100-300 range. In the end it doesn't matter how awesome NVIDIA's top end is. A graphics card in the range of the current R9-390(X) at $199 is spectacular value. And because it is not power hungry, it can be hooked into a crappy PC with a lame PSU and it'll still work. We don't know how NVIDIA will react to this with the GTX 1060 or 1050, but it'll be an interesting time for casual gamers.


----------



## 64K (Jun 5, 2016)

medi01 said:


> Let's not buy into too much FUD here, shall we?
> Whoever is selling more of things last quarter is, of course, interesting, but misleading too.
> As seen on Steam's site (which is as PC gaming as it gets):
> http://store.steampowered.com/hwsurvey
> ...



A point to consider is that the Steam Hardware Survey is a current-usage survey. Some outside reporting agencies claim that Nvidia is outselling Radeon Tech Group by as much as 80%.

Radeon Tech will not fail against Pascal. They will bring healthy competition, just like we are accustomed to.

The murky water surrounding AMD is about Zen, which is probably going to be a disappointment imo. AMD's R&D budget is minuscule compared to Intel's and, yes, Jim Keller is a force to be reckoned with, but he brought all that he could within the constraints of his budget.

AMD is spiraling down the drain financially. But that's not the end of the ATI story. Radeon Tech Group will survive, of course. That's why Lisa Su split them off as a separate business entity from AMD in the first place.


----------



## OneMoar (Jun 5, 2016)

this thread serves no purpose
take your circle jerk speculation to reddit 
we will have benchmarks sometime around the 27th 
until then you are wasting valuable forum space that could be used for some other inane thread such as one about tacos


----------



## NdMk2o1o (Jun 5, 2016)

medi01 said:


> Let's not buy into too much FUD here, shall we?
> Whoever is selling more of things last quarter is, of course, interesting, but misleading too.
> As seen on Steam's site (which is as PC gaming as it gets):
> http://store.steampowered.com/hwsurvey
> ...



Someone speaking some sense... lol. I've always gone back and forth between NV/ATI for 15 years or so, ever since I got into building PCs; not much has changed, tbh. But I suspect a lot of the TPU base are of a younger generation and see Intel and NV as kings of the hill and AMD as a dying breed. That's not the case, and as has been pointed out, they own 100% of the console market. What these so-called statistics don't mention is how many people are running AMD APU-based systems; that's another market, like the console one, that NV can't get a foot into. But hey, what do I know, lol.


----------



## Frag_Maniac (Jun 5, 2016)

@Rej,
I'll debate it both past and present. Over the sum of my gaming years, I've spent more time on ATI/AMD GPUs than Nvidia, and I've also had AMD CPU-based gaming rigs; my first one was. We are arguing about the present, though, not years or decades ago, and right now, despite all your defense of AMD, you're running an Intel CPU and an Nvidia GPU, and you admit you feel the Intel runs better. So your actions speak for Intel, but somehow you aren't willing to really adhere to that with your words. Time to face reality with your words too.

@med,
Thanks for verifying what I already said, Nvidia has the majority of market share. Not long ago AMD led in number of units sold, but still had lower share due to smaller profit margins, but since Nvidia's 900 series took off, that isn't even the case anymore, and as long as AMD keeps putting out cheap cards that have to be run on their inferior Crossfire, that will continue to be the case.


----------



## RejZoR (Jun 5, 2016)

We used to have a term for people with a mindset like yours. It starts with F and ends with Y. Wondering why I'm running the "outdated" X99 platform with a 5820K instead of a brand spanking new 6700K? Why I'm not rushing to upgrade to Broadwell-E? Why I'm running a GTX 980 instead of an R9 Fury X? Why I'm using a 2TB SSD? There are countless things that may not be logical to you or someone else, but they are the way I want them. Sometimes they're rational, sometimes irrational. What difference does it make to you?

3/4 of the things I buy, I buy out of curiosity, not necessity. I bought a 6-core with HT because I was curious what a 12-thread system is like. Why do I have 32GB of RAM when I never even used the 18GB on the X58 I had? Why did I even buy the X99 platform when I kept saying I'd get myself Zen? Curiosity. Why did I buy a GTX 980 when the HD7950 was still serving me perfectly well? Why might I buy the current high-end generation from AMD again? Curiosity. Why have I changed several high-end soundcards even though there were no real revolutions in that segment? Curiosity. Would you believe me if I said I was this close to getting an FX-9590? Because fuck reason and logic, I was just curious. But then I decided on something else, because you can't have two entirely different platforms at once...


----------



## Frag_Maniac (Jun 5, 2016)

RejZoR said:


> Wondering why I'm running...



Again, off topic. All you need to address is the bit we actually discussed: why you're on an Intel CPU instead of AMD, and you already said it's because you think the Intel is better. So stop getting passive-aggressive with your F-and-Y insults and stick to the topic. The only reason you don't is that you have no point.

You also don't need to come up with BS excuses like being "curious", because, again, you already admitted the Intel was better. This kind of denial is what I firmly believe turns rational people into radical lunatics who don't even see they're arguing against themselves.


----------



## OneMoar (Jun 5, 2016)

where are the moderators ....
I am tired of seeing this crap on TPU. TPU is not Reddit, and so long as I draw breath it shall not become Reddit.
come on people, you are better than this


----------



## Frag_Maniac (Jun 5, 2016)

OneMoar said:


> where are the moderators ....
> I am tired of seeing this crap on TPU. TPU is not Reddit, and so long as I draw breath it shall not become Reddit.
> come on people, you are better than this



Agreed, I shall stop responding to him, since he's not staying on topic anyway.


----------



## OneMoar (Jun 5, 2016)

Frag Maniac said:


> Agreed, I shall stop responding to him, since he's not staying on topic anyway.


at the risk of sounding like an ass,
I'm getting real tired of these new users who don't have a goddamn clue and all they do is sit around here and argue. Those people need permabans. I and a lot of others on this board dedicate a lot of time to helping people, and seeing this shit just makes me want to log out and never come back.
#banthen00bs


----------



## Frag_Maniac (Jun 5, 2016)

OneMoar said:


> at the risk of sounding like an ass,
> I'm getting real tired of these new users who don't have a goddamn clue and all they do is sit around here and argue. Those people need permabans. I and a lot of others on this board dedicate a lot of time to helping people, and seeing this shit just makes me want to log out and never come back.
> #banthen00bs



Since I've been here nearly as long as you, I have to assume you mean someone else when you say "new users"?


----------



## Fluffmeister (Jun 6, 2016)

The thread is funny now. The talk of market share is also gold, because thanks to evil monopolies nVidia doesn't have the luxury of selling an x86 CPU with an integrated GPU on board.

Intel really isn't that far behind AMD on the Steam survey, and GPUs aren't even their thing.

Beer.


----------



## RejZoR (Jun 6, 2016)

Because Intel's IGPs are in most CPUs they sell, whether they're used or not. Their latest generations aren't that bad, but mostly just because AMD pressured them into improving with APUs. Otherwise they'd still be bundling outdated garbage with their CPUs.


----------



## Fluffmeister (Jun 6, 2016)

RejZoR said:


> Because Intel's IGPs are in most CPUs they sell, whether they're used or not. Their latest generations aren't that bad, but mostly just because AMD pressured them into improving with APUs. Otherwise they'd still be bundling outdated garbage with their CPUs.



As of May 2016, 17.53% of Steam gamers use them to play their beloved CS or whatever. The point is that plenty of Nvidia GPUs shipped never end up in gaming rigs or anything remotely close; medi01's post about the market share figures being wrong is... silly.


----------



## medi01 (Jun 6, 2016)

Frag Maniac said:


> inferior Crossfire


That's plain bullsh1t.
Inferior to what?
Crossfire is noticeably more effective than SLI.
Oh, and as demonstrated here, even more so with more cards:
http://iyd.kr/753



Frag Maniac said:


> and as long as AMD keeps putting out cheap cards.


The 960 outselling the 280, despite being about 10% slower, is a rather sad result of marketing; power consumption at that tier is laughable anyhow.

On the other hand, the 970 was good (to the point the 980 hardly made sense), and I can understand people caring about a 70W difference (although it was perceived as more) against the 10% faster 390.

The 980 Ti was a great card, AN ANSWER TO FURY, mind you (which wiped the floor with the 980), and thanks to glorious OCing, the best of its gen. Again, nothing wrong with it outselling the competition. (The Fury X was perceived as worse than it was, though; for at least the last 6 months it has been faster than a stock 980 Ti at resolutions from 1440p up.)

Titanium was BS, created mostly for FUD marketing, I guess.



Frag Maniac said:


> you're running an Intel CPU, and Nvidia GPU, and admit you feel the Intel runs better.


Wassup, why attack someone personally? Jeez. How does that prove your point? (If you even have one; I seem to be missing it.)
Even AMD ran an Intel CPU when demoing Fury.



Frag Maniac said:


> Nvidia has the majority of market share.


56% of Steam users use Nvidia,
26% AMD, 17% Intel,
as seen in the latest Steam survey.

Those are the guys who buy games on PC.
Oh, and that's why most Blizzard's games run even on Intel IGP.







This is the only market share relevant in the context we're discussing: which GPUs devs should pay attention to.


Oh, and there's nothing wrong with great $200-300 cards. I actually wonder why AMD even bothers with juggernaut chips.


----------



## Mindweaver (Jun 6, 2016)

Hey guys stop with the back and forth bickering. Either stay on topic or go to another thread. If you see something then report it and we will take care of it, warning issued.


----------



## HD64G (Jun 6, 2016)

Frag Maniac said:


> @med,
> Thanks for verifying what I already said, Nvidia has the majority of market share. Not long ago AMD led in number of units sold, but still had lower share due to smaller profit margins, but since Nvidia's 900 series took off, that isn't even the case anymore, and as long as AMD keeps putting out cheap cards that have to be run on their inferior Crossfire, that will continue to be the case.



A totally false-logic explanation of why the GeForce 9x0 series won over the AMD opposition. The 290X was too power-hungry while sitting at the same level of performance and price as nVidia, as it was a much larger chip. Most chose a 970/980 instead of a 290/290X after the mining craziness passed just because of that difference in power consumption, plus the (invalid for the last 2 years) point about nVidia's drivers being better. Not because AMD needed CF to beat nVidia, as you'd have us think. In fact, AMD won market share over the 260 and 280 using much smaller dies with the 4850/4870, and again with the 5850/5870 vs the 470/480. So, as history clearly shows, the strategy they've again chosen, smaller and lower-priced GPUs, is their best bet against nVidia's greediness, which showed its full extent with that Founders Edition BS...

Let's see how it goes from July on, eh?


----------



## Frag_Maniac (Jun 6, 2016)

HD64G said:


> A totally false-logic explanation of why the GeForce 9x0 series won over the AMD opposition.


You totally missed my point. First off, I said "*AND* as long as"... I'm not implying PAST models lost to the 900 series because of that. It was a response to the many here implying AMD's suggestion to get two 480s and run them in Crossfire is sound. You DID point out that AMD's product has other problems, but at the end of the day it still comes down to performance at a given price, and trust in driver and hardware quality.

Look at the reviews of AMD cards vs Nvidia. AMD always gets much worse feedback, even from those who DO trust them enough to buy them. And now that Nvidia is offering high-end performance at $300, many are jumping ship. It's as simple as that.

That said, ever since AMD failed to win the price/performance race on high-end GPUs with their 300 and Fury series, all they talk about is cheap cards in Crossfire, or that Duo monster that no one sane wants to bother with.

Face it, AMD is NOT competing effectively in the high-end GPU market lately. It's why almost every driver release is mostly just a bunch of added Crossfire support. They'll keep playing catch-up on multi-GPU support, though, since Nvidia has the majority of endorsements, and I don't see AMD's ownership of the console market changing that, as some suggested it would.

Like it or not, that's the reality, and I don't see that changing this summer as you imply.


----------



## OneMoar (Jun 6, 2016)

Enough .... you people are arguing in circles
this is a derailed thread if I have ever seen one


----------



## ViperXTR (Jun 6, 2016)

@OneMoar: Your avatar fits perfectly now


----------



## OneMoar (Jun 7, 2016)




----------



## medi01 (Jun 7, 2016)

Confirmation about nVidia's driver issues in AotS:
https://twitter.com/dankbaker/status/739880981612625920


----------



## dorsetknob (Jun 7, 2016)

medi01 said:


> Confirmation about nVidia's driver issues in AotS:
> https://twitter.com/dankbaker/status/739880981612625920


A totally pointless Nvidia post hijacking an AMD thread.
Start your own thread OR GET A LIFE, FANBOY.

EDIT: I didn't even bother to click the link to TWATTER


----------



## Xzibit (Jun 7, 2016)

dorsetknob said:


> A totally pointless Nvidia post hijacking an AMD thread.
> Start your own thread OR GET A LIFE, FANBOY.
> 
> EDIT: I didn't even bother to click the link to TWATTER



Not really pointless. It relates to what the last two pages of this thread were about. Whether that discussion itself is pointless is another matter.

As for the link: you have an Oxide dev saying the Nvidia driver had issues, the Nvidia rep saying that driver was press-only, and then others questioning why it was "press only" and not released as a "beta", so the press isn't reviewing with a different driver than what's available to the public.


----------



## Tatty_One (Jun 7, 2016)

I have just gone through the last page or two of this cluster, struggling to find anything that directly relates to the actual topic. I see a bunch of people saying how it's been derailed, and then they spend another page derailing it further. They have succeeded; it's closed.


----------

