Thursday, September 27th 2018

NVIDIA's Weakness is AMD's Strength: It's Time to Regain Mid-Range Users

It won't be easy for AMD to field a counterpart to the NVIDIA GeForce RTX 2080 Ti in the short term. Heck, it will be difficult to see any real competitor to the RTX 2000 Series anytime soon. But maybe AMD doesn't need to compete on that front. Not yet. The reason is sound and clear: RTX prices are NVIDIA's weakness, and that weakness, my fellow readers, could become AMD's biggest strength.

The prices NVIDIA and its partners are asking for the RTX 2080 and RTX 2080 Ti have been a clear topic of discussion since we learned about them. Most users have criticized those price points, even after NVIDIA explained the reasoning behind them. Those chips are bigger, more complex and more powerful than ever, so yes, costs have increased, and that has to be taken into account.
None of that matters. It doesn't even matter what ray tracing can bring to the game, and it doesn't matter whether those Tensor Cores will provide benefits beyond DLSS. I'm not even counting the fact that DLSS is a proprietary technology that will lock us (and developers, for that matter) a little further into another walled garden. Even so, the market perception is clear: whoever has the fastest graphics card is perceived as the tech/market leader.

There's certainly a chance that RTX takes off and NVIDIA again sets the bar in this segment: the reception of the new cards hasn't been overwhelming, but developers could begin to take advantage of all the benefits Turing brings. If they do, we will have a different discussion, one in which future cards such as the RTX 2070/2060 and their derivatives could bring a bright future for NVIDIA... and a dimmer one for AMD.

But the thing that matters now for a lot of users is pricing, and AMD could leverage that crucial element of the equation. In fact, the company could do so very soon. Some indications reveal that AMD could launch a new Polaris revision in the near future. This new silicon is allegedly being built on TSMC's 12 nm process, something AMD did successfully with its Ryzen 2000 Series of CPUs.
AMD must have learned a good lesson there: its CPU portfolio is easily the best in its history, and the latest woes at Intel are helping, driving forecast revisions that estimate a 30% global market share for AMD in Q4 2018. See? Intel's weakness is AMD's strength on this front.

2018 started with the worst possible news for Intel (Spectre and Meltdown), and things haven't gone much better since: the jump to the 10 nm process never seems to arrive, and Intel's messaging about those delays has not helped reinforce confidence in the manufacturer. Without those big problems the company would be three generations ahead of where it is now, and the situation for AMD would be quite different too.

Everything seems to make sense here: customers are upset with an RTX 2000 Series built for the elite, and that Polaris revision could arrive at the right time and in the right place. With a smaller node AMD could gain higher yields, decrease cost, increase clock frequencies and provide that 15% performance increase some publications are pointing to. Those are a lot of "coulds", and in fact there's no reason to believe that Polaris 30 is more than just a die shrink, so we would have the same unit counts and higher clocks.
That probably won't be enough for the hypothetical RX 680 to catch up with the GTX 1070: the latter performs on average 34% better than the RX 580 according to our tests. But even so, the refresh would give us a more competitive Radeon RX family that could win the price/performance battle, and that is no small feat.
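For the number-crunchers, here is a minimal sketch of that arithmetic (performance indexed to the RX 580 at 100; the street prices are illustrative assumptions, not figures from this article):

```python
# Relative-performance arithmetic behind the paragraph above.
rx_580 = 100.0                 # baseline performance index
gtx_1070 = rx_580 * 1.34       # GTX 1070: +34% on average in our tests
rx_680_est = rx_580 * 1.15     # rumored Polaris refresh: +15% uplift

print(f"GTX 1070: {gtx_1070:.0f}, RX 680 (est.): {rx_680_est:.0f}")
# -> GTX 1070: 134, RX 680 (est.): 115 -- still ~14% short of the 1070

# Hypothetical street prices, assumed purely for illustration:
for name, perf, price in [("RX 680", rx_680_est, 230),
                          ("GTX 1070", gtx_1070, 380)]:
    print(f"{name}: {perf / price:.2f} performance per dollar")
# -> the slower card can still win the price/performance battle
```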

The new cards wouldn't target just existing GTX 7/9 Series users, but also those older AMD Radeon users who were expecting a nice performance upgrade without having to sell their souls. And for the undecided, the ones thinking about getting a GTX 1050/Ti or a GTX 1060, AMD's offer could be quite attractive if its price/performance ratio hits NVIDIA where it hurts most.

That would put the new family of graphics cards (Radeon RX 600?) in a pretty good position to compete with the GeForce GTX 1000 Series. NVIDIA presumably would still be king of power consumption, but beyond that, AMD could position itself in that $300-$500 range (and even below it) with a really compelling suite of products.

So yes, AMD could have a winning hand here. Your move, AMD.

110 Comments on NVIDIA's Weakness is AMD's Strength: It's Time to Regain Mid-Range Users

#26
R0H1T
John Naylor: No they didn't ... not even close. Pre-9xx series, AMD was not competing in the top two price tiers. With 9xx, they lost the 3rd, and with 10xx they lost the 4th.

First the performance comparison ...... copied from an old post so not current pricing

Now let's look at the sales..... from the Steam hardware survey

The GTX 1060 is the most popular card hitting Steam servers.... representing 13.76% of the Steam market and up 0.87% in the last month.
The RX 480 sits in 28th place among cards hitting Steam servers.... representing 0.66% of the Steam market and up 0.07% in the last month. The 580 adds 0.44%, bringing market share to 1.10% ... Combined, the 1060 outsells them 12.51 to 1. That's 92.6% of the market in this price niche.

So not only is the 1060 outselling the 480 20.84 to one, the gap is widening. More new 1060s hit Steam servers in the last month than all 480s put together since its release.

The 480, power issues aside, is not a bad card at all... it had some power issues.... and its 2nd-place finish in this market niche was not that far behind. However, who remembers who won the silver medal at the Olympics... only the gold medal winner makes the Wheaties box. So, no... AMD has certainly not "regained" anything here. AMD, at this point in time, doesn't have a horse in the race from the 1060 level on up.

And price-wise, I think we have to wait for the dust to clear on several fronts. 1) We have to wait and see what prices are once the vendor price-gouging phase is over, where the focus is on taking advantage of those who must be the first on the block to have the new shiny thing. 2) It's no secret that nVidia has a large lot of pre-tariff-priced 10xx series cards that it wants to sell before ramping up production of the 20xx. And finally 3) let's wait till after January, when warehoused stock on this side of the pond for both companies has been sold and there's a 30% tariff on all goods imported into the US.
You're forgetting the big elephant in the room, or woolly mammoth if you ask me: mining. Is there any recent gen of cards outselling Polaris in this department? Do you have numbers for the sale of cards specifically for mining? What about Macs, do they count? The 1060 may show up in greater numbers than x80 Polaris in the Steam surveys, but it didn't sell anywhere close to that 12.5:1 ratio in the real world; I'd argue probably a quarter of that, so about 3-4x perhaps.
#27
kapone32
efikkan: Agreed.
The AMD fans seem more and more like a resistance movement or a cult, refusing to face the facts.

Do you want to vote with your wallet? Then refuse to buy inferior GPUs, and AMD will be forced to make something better, instead of a mob pressuring people to buy AMD GPUs for ideological reasons.

And for all of those who desperately cling to the hope of 7 nm saving AMD: 7 nm will definitely be great eventually, but remember that it will be even more beneficial for more efficient architectures. Since both vendors have access to the new nodes, these will just increase the performance gap between them.
Can you show me where AMD stated that the Vega 64 would be faster than a 1080 Ti? If I remember correctly, they were putting the 64 against the 1080 and the 56 against the 1070. In terms of inferior, I will tell you why I no longer buy Nvidia. My first cards (that I bought myself) were GTS 450s; I got 2 of them to run in SLI. 2 years later Nvidia releases the 580 and guess what? They put out a driver so that I could no longer use SLI on my 450s. Then I went for an AMD 6850, then the 7950, then 7950 crossfire, which gave me a perfect VR experience from 2011 to 2016. I did go for Polaris, but I found that the only thing that got better was power consumption. I upgraded to Vega and have not looked back. In AOTS I get 150+ FPS in 4K. I do agree that the price of Vega was obnoxious, until I saw the prices for the RTX cards. If anyone has noticed, AMD has not said one word in response to the RTX cards. All of the things that people used to complain about with AMD are in the past, e.g. power consumption, IPC and now 4K gaming. I have been gaming at 4K with Polaris crossfire and now my Vega. It would appear to me that those who complain about NVidia the most are people who don't actually own one.
#28
WikiFM
Some indications reveal that AMD could launch a new Polaris revision in the near future. This new silicon is allegedly being built on TSMC's 12 nm process, something AMD did successfully with its Ryzen 2000 Series of CPUs.
@dmartin Ryzen 2000 Series is being built on GF's 12 nm, not on TSMC's 12 nm.

With a smaller node AMD could gain higher yields, decrease cost, increase clock frequencies and provide that 15% performance increase some publications are pointing to. Those are a lot of "coulds", and in fact there's no reason to believe that Polaris 30 is more than just a die shrink, so we would have the same unit counts and higher clocks.
If Polaris 30 is built on TSMC's 12 nm then it is not comparable to Polaris 10 or 20, which are built on GF's 14 nm instead. The metrics are not compatible between different foundries; we can't be sure whether it is a smaller node or a die shrink after all, so any performance uplift is uncertain.
#29
Casecutter
As to:
"resistance movement or a cult, refusing to face the facts."

I just see two huge rantings/breakdowns of this "Editorial" in a matter of minutes, with all these current stats and claims, almost like they had them canned and waiting in the wings. And then there are attacks that this "new contributing editor" is a "plant", while two "short-timers" can construct volumes of rebuttal and purport that he and others must be some shill or resistance for not adhering to their tirades... Wow!

The thing with the Steam indicator is that all GTX 1060s are lumped together, the 6GB and the cheaper 3GB geldings. So, apples to apples, lump the 470/570 numbers into the AMD data also.
#30
kapone32
John Naylor: No they didn't ... not even close. Pre-9xx series, AMD was not competing in the top two price tiers. With 9xx, they lost the 3rd, and with 10xx they lost the 4th.

First the performance comparison ...... copied from an old post so not current pricing

=============================
Of course the correct choice for any individual will ultimately depend on what games you play, so I will speak in overall terms. What we know:

1. Which one - Not all cards are created equal, and this is especially true with the RX 480. TechPowerUp writes:

www.techpowerup.com/reviews/MSI/RX_480_Gaming_X/28.html



2. Out of the Box performance - So let's compare two cards from the same manufacturer (MSI) and model line (Gaming X). From the above link:





3. AIB Cards - From the above, we see that the MSI RX 480 is 7% faster overall in TPU's 16-game test suite. From below, the MSI 1060 Gaming X is 3% faster than the reference 1060 ... so we can conclude that at the time of testing, the MSI 1060 was 10% faster than the MSI 480 in the 16-game test suite.

4. Overclocking - We see there that the MSI 480 overclocks 8.6% and the MSI 1060 overclocks 15.1%. So when the 1060 (10% performance advantage) is overclocked, the relative difference would be:

www.techpowerup.com/reviews/MSI/RX_480_Gaming_X/26.html
www.techpowerup.com/reviews/MSI/GTX_1060_Gaming_X/27.html

110% x (115.1 / 108.6) = 116.6% of the 480's speed, or 16.6% faster (see the sketch after this list)

As for differences between brands ... the various brands trade wins depending on generation and model line, but the EVGA SC is one to avoid as, unlike the competition, it uses a reference PCB and reference-style cooling.

5. Driver improvements - AMD's driver improvements have boosted the performance of the 480 since it was originally tested. As we can see from the link here, TPU tested the results of the latest driver improvements and found an increase of 2.1% at 1080p, averaged across 21 games:

www.techpowerup.com/reviews/AMD/Radeon_Crimson_ReLive_Drivers/6.html

Unfortunately, that resource provides no info on what improvements have resulted from newer nVidia drivers, but suffice it to say, those improvements have not erased that 10% gap out of the box (16.6% with both overclocked).

6. Cost - Last I looked (yesterday), the MSI 1060 6GB was about $15 more than the MSI 480 8GB on Newegg. But there are other costs worth considering.

7. Power - There is a significant difference in power usage between the two cards. One of the reasons for the MSI 480's performance, as stated in the review, is that it is able to use more power than many other 480s. The difference is 75 watts in typical gaming and 99 watts at peak:

The MSI 480 draws from 196 - 224 watts
The MSI 1060 draws from 121 - 125 watts

www.techpowerup.com/reviews/MSI/RX_480_Gaming_X/21.html

8. Power Costs - While this is something you normally wouldn't consider, when cards are very close in performance, it may be of significance to many users, especially those in Europe and especially in urban / suburban locales (see the sketch after this list):

75 watts x 35 hours per week x 52.14 weeks per year x 3 years usage x $0.131 US average electric cost per kWh / (1000 watts per kW x 85% PSU efficiency) = $63.28

9. Case Cooling - The rule of thumb for case fans in a relatively quiet system is one (1) case fan per 75 watts of power. So for comparable interior case temps, you might want to include the cost of an extra case fan.

10. Noise - The 480 is 3 dBA louder than the 1060:

www.techpowerup.com/reviews/MSI/RX_480_Gaming_X/22.html
www.techpowerup.com/reviews/MSI/GTX_1060_Gaming_X/23.html

=============================================
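For anyone who wants to re-run the arithmetic in points 4 and 8, a minimal sketch; every input below is one of the figures quoted above, nothing else is assumed:

```python
# Point 4: relative performance once both cards are overclocked.
stock_gap = 1.10                  # MSI 1060 ~10% faster out of the box
oc_1060, oc_480 = 1.151, 1.086    # overclocking headroom from TPU's reviews
print(f"OC gap: {stock_gap * oc_1060 / oc_480:.3f}")  # ~1.166 -> ~16.6% faster

# Point 8: three-year electricity cost of the 480's extra power draw.
extra_watts = 75                  # typical gaming delta, 480 vs. 1060
hours_per_week, weeks_per_year, years = 35, 52.14, 3
usd_per_kwh, psu_efficiency = 0.131, 0.85
extra_kwh = extra_watts * hours_per_week * weeks_per_year * years \
            / (1000 * psu_efficiency)
print(f"Cost: ${extra_kwh * usd_per_kwh:.2f}")        # -> $63.28
```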

Now let's look at the sales..... from the Steam hardware survey

The GTX 1060 is the most popular card hitting Steam servers.... representing 13.76% of the Steam market and up 0.87% in the last month.
The RX 480 sits in 28th place among cards hitting Steam servers.... representing 0.66% of the Steam market and up 0.07% in the last month. The 580 adds 0.44%, bringing market share to 1.10% ... Combined, the 1060 outsells them 12.51 to 1. That's 92.6% of the market in this price niche.

So not only is the 1060 outselling the 480 20.84 to one, the gap is widening. More new 1060s hit Steam servers in the last month than all 480s put together since its release.

The 480, power issues aside, is not a bad card at all... it had some power issues.... and its 2nd-place finish in this market niche was not that far behind. However, who remembers who won the silver medal at the Olympics... only the gold medal winner makes the Wheaties box. So, no... AMD has certainly not "regained" anything here. AMD, at this point in time, doesn't have a horse in the race from the 1060 level on up.

And price-wise, I think we have to wait for the dust to clear on several fronts. 1) We have to wait and see what prices are once the vendor price-gouging phase is over, where the focus is on taking advantage of those who must be the first on the block to have the new shiny thing. 2) It's no secret that nVidia has a large lot of pre-tariff-priced 10xx series cards that it wants to sell before ramping up production of the 20xx. And finally 3) let's wait till after January, when warehoused stock on this side of the pond for both companies has been sold and there's a 30% tariff on all goods imported into the US.
Have you actually owned a 480? I can tell you that the power draw you are looking at is for crossfire 480s.
#31
Vayra86
notb: Such an awful text. Unbelievable. :-o
@hat what do you think about this one?

Nvidia logo in the header and full-blown AMD toadying inside.
And, inevitably, Meltdown and Intel's 10nm problems had to be mentioned. Because why not?

This sentence in particular looks like copied straight from AMD strategy presentation or internal mailing:
"The reason is sound and clear: RTX prices are NVIDIA's weakness, and that weakness, my fellow readers, could become AMD's biggest strength."

And then this:
"So yes, AMD could have a winning hand here. Your move, AMD. "

Show yourself @dmartin - joined: Tuesday at 12:35 PM - Messages: 2
Tell me, why do you care about AMD "winning" so much? And why do you think I should care?

@W1zzard why are we getting this?
+1

What is the point of these bits of text? To echo the popular sentiment on a forum after the fact? If the idea is to breathe life into a discussion or create one... well, mission accomplished, but there are a dozen threads that have done that already. And this piece literally adds nothing to them; all it serves to do is repeat them.

Regardless.... here's my take on the editorial you wrote up @dmartin

I think it's a mistake to consider Nvidia's Turing a 'weakness'. You guys act like they dropped Pascal on its head when they launched Turing and there is no way back. What Turing is, is an extremely risky and costly attempt to change the face of gaming. Another thing Turing is, is another performance bracket and price bracket above the 1080 Ti. Nothing more, nothing less. As for perf/dollar, we have now been completely stagnant for 3 years or more, and AMD has no real means to change that either.

You can have all sorts of opinions on that, but that does not change the landscape one bit, and a rebranded Polaris (a 2nd rebrand, mind you; where have we seen this before... oh yeah, AMD R9 - that worked out well!) won't either. AMD needs to bring an efficiency advantage to its architecture, and until then, Nvidia can always undercut them on price because they simply need less silicon to do more. Specifically in the midrange. Did you fail to realize that the midrange GPU offering realistically hasn't changed one bit with Turing's launch?

If anyone really thinks that an RX 680 or whatever will win back the crowd to AMD, all you need to do is look at recent history and see how that is not the case. Yes, AMD sold many 480s when GPUs were scarce, expensive and mostly consumed by miners. In the meantime, DESPITE mining, AMD still lost market share to Nvidia. That's how great they sell. Look at the Steam survey and you see a considerably higher percentage of 1060s than RX 480/580s. Look anywhere else with lots of data and you will see an overwhelming and crystal-clear majority of Nvidia versus AMD.

What I think is that while Nvidia may lose some market share and may have miscalculated the reception of RTRT/RTX, the Turing move is still a conscious and smart one, from which they can only stand to gain influence and profit, simply because Pascal is still for sale. They cover the entire spectrum regardless. Considering that, you can also conclude that the 'Pascal stock' really was no accident at all. Nvidia consciously chose to keep that ace up its sleeve, in case Turing wasn't all they made it out to be. There is really no other option here; Nvidia isn't stupid. And I think that choice was made the very moment Nvidia knew Turing was going to be fabbed on 12 nm. It had to be dialed back.

Nothing's changed, and until AMD gets a lean architecture and can fight for Nvidia's top spots again, this battle is already lost. Even if Turing costs 2000 bucks. The idiocy of stating you can compete with a midrange product needs to stop. That doesn't exist in GPUs. Today's midrange is tomorrow's shit performance; it has no future, it simply won't last.
#32
Mr.Mopar392
gamerman: What the heck did I just read... hehe

Nvidia's 'weakness' is AMD's strength...? Well, come on..

AMD's weakness is total; the next 3 generations of AMD GPUs will be lousy junk.

And if we're talking about Nvidia's weakness, I don't think the writer knows what he's taking on.
The RTX 2000 series GPUs are excellent, that's a fact, but it looks like every GPU NVIDIA MAKES must be 100% more performance, top notch, like the GTX 1000 series was.

We must remember that AMD can't win even with its new 6-month-old GPU, Vega, against Nvidia's stone-age, over-2-year-old GTX 1000 series!
So even if Nvidia had NOT released the RTX series, Nvidia would still lead in performance and efficiency. 'Weakness'... hehe

But now the RTX 2080 Ti has almost double the performance of the AMD Vega 64 and STILL eats LESS power!!! Do you understand how difficult that is...? I don't think so.

AMD can never build a GPU like that, ever. I would bet on that.

AMD doesn't have the skill, cash or engineers for doing that; that's a fact.
Because over the last 3 generations, counting back from the Radeon 200 series through Fury to Vega, TDP just rose a lot while performance AND, most interestingly, efficiency went totally lousy.

That only tells us AMD has no way to build better GPUs!
The only help is coming from another direction, and that is a smaller production line. That's it.
That's why, for example, the AMD Vega 64 eats so much more power than its competitor the GTX 1080, and of course the 1080 Ti. Btw, I don't remember this kind of complaining and crying when the Vega GPUs came out... huh!!

These days, when GPUs must run games in 4K on 27 to 32 inch monitors, a GPU needs a lot of power, and it's hard to build a faster GPU without raising power to the sky limit (which is what AMD GPUs have done for the last 2 generations, btw).

AMD's ways of fighting Nvidia are nearing an end, especially with Intel coming in 2020, not so long from now. So yes, Q1 2021 is D-day for AMD for sure.

I say that in the year 2021 there will be 2 GPU builders, Intel and Nvidia; AMD might build budget GPUs.. might.

Nvidia made efficiency its headline 6 years ago and it shows; they have done a lot of hard work to build great GPUs, and they really deserve respect. AMD does not; AMD offers customers old-tech GPUs generation after generation, just under a new name.

I can say this: AMD can never make a good GPU anymore. AMD's only 'hope' is the 7nm line; it will help a little, but not even close to enough...... For example, Nvidia always builds its GPUs on a larger line tech, but STILL they can build more efficient and faster GPUs.... is that enough..

Of the last 3 generations of AMD GPUs, none deserved an 'Editor's Choice' award.. that's how much power they eat for average performance. AMD doesn't deserve any sympathy or support because of that; they don't care, they offer people lousy GPUs (and CPUs too), and they try to win your sympathy by yelling the color red... that's all.

Nvidia might stop making GPUs at any time; they don't need to, as more than 70% of their cash comes from elsewhere....... so pray, respect and use them.
I don't understand a damn thing you said, other than green shades. Literally, that's all I got from you.
#33
Frick
Fishfaced Nincompoop
And again we see the need for a huge EDITORIAL tag for the forum post as well as the main page.

Polaris would still be good if those damned prices were lower. A 4GB RX 580 at €200 is what it should be.
#34
Mr.Mopar392
dwade: AMD is built on hopes and dreams.
They could do this and that. X Y and Z.
But the opposite usually happens because it’s AMD.
Like Zen, right? I totally get it!
#35
efikkan
kapone32: Can you show me where AMD stated that the Vega 64 would be faster than a 1080 Ti?
Can you show me where I said that?
kapone32: In terms of inferior, I will tell you why I no longer buy Nvidia. My first cards (that I bought myself) were GTS 450s; I got 2 of them to run in SLI. 2 years later Nvidia releases the 580 and guess what? They put out a driver so that I could no longer use SLI on my 450s.
So, you claim Nvidia's current products are inferior because a driver broke your SLI setup many years ago?
I'm not claiming Nvidia is close to perfect, but this just sounds to me like one of those people who get a little disappointed once and then boycott the vendor "forever".

I've experienced a lot over the years with GPUs, and have worn out or bricked a good stack of them (even killing a couple with my own flawed code…); I've seen countless crashes, broken drivers, etc., all of this without touching GPU overclocking. If I were to boycott a vendor every time I encountered a minor problem, I would have nowhere to go.
kapone32: In AOTS I get 150+ FPS in 4K.
Finally! Have we found that one guy who actually plays AotS?:cool: </sarcasm>
#36
TheinsanegamerN
Steevo: I think you fail to understand that the majority of users don't buy high-end cards, but mid-range or low-end cards. Which is why Intel is the leader in GPU sales.
I am fully aware that high end GPUs are not the big sellers.

Like I said, the 480 competed well, but with efficiency gains, pascal-based midrange cards are going to outperform AMD at all levels (performance, power consumption, cost), unless nvidia just decides NOT to do that and leaves AMD everything below the 2070.

If AMD wants to be competitive in the midrange, they need to be able to meet nvidia tit for tat. When you are shopping on a budget, one brand being consistently 10-20% faster is going to ensure you buy that brand. You get the best deal for the $$$, and that will be nvidia unless they don't compete.
#37
Casecutter
TheinsanegamerN: When you are shopping on a budget, one brand being consistently 10-20% faster is going to ensure you buy that brand. You get the best deal for the $$$, and that will be nvidia unless they don't compete.
So, you're saying that buying an RX 580 8GB for $230 is not as good a "bang-for-buck" purchase as a GTX 1060 6GB that normally prices out today at $25-30 (13%) more? Sorry, I don't see the data lately indicating the GTX 1060 6GB is 13% better, all while shorting you 2GB of memory.
#38
TheinsanegamerN
Frick: And again we see the need for a huge EDITORIAL tag for the forum post as well as the main page.

Polaris would still be good if those damned prices were lower. A 4GB RX 580 at €200 is what it should be.
I'd argue lower. The 480 8GB was $220 at launch over two years ago. An 8GB card should be under $200 now. I'd get the 8GB anyway; I got burned twice by VRAM limitations, first by my 550 Tis and then by my 770s.

I've seen games at 1200p already push over 4GB on my 480. Games are memory hungry now.
Casecutter: So, you're saying that buying an RX 580 8GB for $230 is not as good a "bang-for-buck" purchase as a GTX 1060 6GB that normally prices out today at $25-30 (13%) more? Sorry, I don't see the data lately indicating the GTX 1060 6GB is 13% better, all while shorting you 2GB of memory.
NO! God, why does everybody think I'm talking about old pascal cards?!?

Mid-range cards based on TURING would destroy AMD's bang-for-the-buck argument with GCN. A die shrink might not be enough for GCN; they really need to update their arch to stay competitive in any market segment. IF nvidia comes out with Turing midrange cards, AMD won't have much support until they get a new arch out, unless nvidia prices itself ridiculously or just doesn't bother to make anything below the 2070.

This was in response to somebody saying "well, AMD will still have the midrange". AMD needs to update their arch regardless of whether they are going to compete in the high end, because nvidia's arch improvements will make it to mid-range GPUs eventually, and once they do, Polaris will have to be so cheap that there will not be any profit in it for AMD or the AIBs.
#39
kapone32
TheinsanegamerN: I am fully aware that high end GPUs are not the big sellers.

Like I said, the 480 competed well, but with efficiency gains, pascal-based midrange cards are going to outperform AMD at all levels (performance, power consumption, cost), unless nvidia just decides NOT to do that and leaves AMD everything below the 2070.

If AMD wants to be competitive in the midrange, they need to be able to meet nvidia tit for tat. When you are shopping on a budget, one brand being consistently 10-20% faster is going to ensure you buy that brand. You get the best deal for the $$$, and that will be nvidia unless they don't compete.
If you buy Intel CPUs and Nvidia GPUs, yes, but with AMD... well, I will give you an example: in the days of AM3 to AM3+, you could spend $200 a year on a GPU, MB and CPU and see a measured improvement in the user experience. There is also the argument that a 10% difference in performance doesn't justify the difference in price. Nvidia GPUs are traditionally more expensive than ATI/AMD. The 7950 was $300 at launch and after about a year dropped to $199; how much were the 680 or the 780 at launch? I am willing to state that most owners of Tahiti GPUs were generally happy with them. Remember, Vega 64 was supposed to be $499 MSRP at launch. We all know that mining and alleged memory price fixing made the market buck the trend, so actual retail prices ran as high as $1500, but now they are in the $500 to $600 range, and cheaper on eBay or Kijiji. I do agree that they need a new mid-range card, but a Polaris refresh to 12 nm should see a drop in power consumption, potentially to at least the same levels as the 1060.
#40
TheinsanegamerN
kapone32: If you buy Intel CPUs and Nvidia GPUs, yes, but with AMD... well, I will give you an example: in the days of AM3 to AM3+, you could spend $200 a year on a GPU, MB and CPU and see a measured improvement in the user experience. There is also the argument that a 10% difference in performance doesn't justify the difference in price. Nvidia GPUs are traditionally more expensive than ATI/AMD. The 7950 was $300 at launch and after about a year dropped to $199; how much were the 680 or the 780 at launch? I am willing to state that most owners of Tahiti GPUs were generally happy with them. Remember, Vega 64 was supposed to be $499 MSRP at launch. We all know that mining and alleged memory price fixing made the market buck the trend, so actual retail prices ran as high as $1500, but now they are in the $500 to $600 range, and cheaper on eBay or Kijiji. I do agree that they need a new mid-range card, but a Polaris refresh to 12 nm should see a drop in power consumption, potentially to at least the same levels as the 1060.
The 480 can already match a 1060 with a bit of tweaking.

A Turing mid-range card is what AMD needs to watch for. They were losing money in the early 2010s, and the low price of GPUs was NOT helping matters. They don't want to get trapped in that situation again. Remember, those super low prices came after the utter collapse of the mining scene the first time around, when the market was flooded with AMD cards. Those prices were not sustainable, as was shown by the 300 series being more expensive as AMD attempted to make money somewhere.
#42
TheoneandonlyMrK
notb: Such an awful text. Unbelievable. :-o
@hat what do you think about this one?

Nvidia logo in the header and full-blown AMD toadying inside.
And, inevitably, Meltdown and Intel's 10nm problems had to be mentioned. Because why not?

This sentence in particular looks like copied straight from AMD strategy presentation or internal mailing:
"The reason is sound and clear: RTX prices are NVIDIA's weakness, and that weakness, my fellow readers, could become AMD's biggest strength."

And then this:
"So yes, AMD could have a winning hand here. Your move, AMD. "

Show yourself @dmartin - joined: Tuesday at 12:35 PM - Messages: 2
Tell me, why do you care about AMD "winning" so much? And why do you think I should care?

@W1zzard why are we getting this?
Why should we care if you care?

And the piece is right; the 2060, when it shows up, is perhaps going to compete on price with most AMD hardware, but without usable RTX and with a higher price it's also fighting all the old Pascals on eBay, i.e. a lot of stuff that is equal or better.

Nice to see our resident Intel and Nvidia power user involved in the debate; you're earning good coin.
#43
Casecutter
TheinsanegamerN: Mid-range cards based on TURING would destroy
Oh, we're discussing unknowns... like, we don't know the Turing price for the mid-range, or the 1440p level of improvement, or the amount of RAM, but "destroy", it is said... Just the facts, ma'am.

And I'm not saying AMD isn't behind the 8-ball here, or that Nvidia hasn't upped the ante; they're dropping cash on R&D and engineering with one singular focus. But it is still a bang-for-buck market, and given Nvidia's release of Pascal and now Turing, they haven't been slowing their roll on pricing. They're on a different path, needing to absolutely recoup those expenditures across all market segments, even the high-volume sellers, as not recouping them is bad, as in not having the cash to sustain themselves.

AMD is in the same situation as when Piledriver CPUs had to be the backstop for several years till Zen came about. This too shall pass.
TheinsanegamerN: This was in response to somebody saying "well, AMD will still have the midrange". AMD needs to update their arch regardless of whether they are going to compete in the high end, because nvidia's arch improvements will make it to mid-range GPUs eventually, and once they do, Polaris will have to be so cheap that there will not be any profit in it for AMD or the AIBs.
I'm sure AMD and the AIBs could still have decent margins with a re-spin that optimizes the current Polaris, going with GDDR6 as a "pipe-cleaner" exercise, given the design money is definitely spent. They'd pick up a little extra from just the 12 nm process, plus refined boost algorithms to push past 1500 MHz, all perhaps in a 175 W envelope. They'd end up with a card that provides strong, competent 1440p... and at a $230 price it would have a lot going for it. Then there are the geldings (570s) for $160, and 1080p is perhaps where their best attack lies, bolstering the "entry novice" segment, as that would keep them pulling market share.

People buy: A) on price; B) on monitor resolution, now and in the near future; C) when they see or find games that tax the experience/immersion of play.
#44
Vya Domus
John Naylor: 8. Power Costs - While this is something you normally wouldn't consider, when cards are very close in performance, it may be of significance to many users, especially those in Europe and especially in urban / suburban locales.
Ain't that the truth, you gotta save up those pennies. I too live in Europe, and I take a break every 15 minutes or so from my gaming session to check my power meter and make sure I don't go over my cap. I mean, we can afford $300 cards, but an extra $1 a month? Nah, that's going too far.

Seriously, why are you desperately trying to prove all this? Is this a business meeting where you're making a sales pitch? It's because of comments like these that people end up believing some of you are paid shills, for real. Just say this: "I think the 1060 is better because it's a bit faster and more power efficient", and spare us the block of text and numbers. It's been 2 goddamn years since these cards were released; we know the deal.
#45
ArbitraryAffection
A lot of people hate on GCN - especially Vega - for underwhelming performance in games and sub-par power efficiency. Which I can understand, especially when you look at NVIDIA's offerings. Vega is a troubled GPU, in my opinion. From what I have gathered from talking to people in the know, and doing some experimenting with my own Vega 56 card, the GPU (or specifically, its CU array) is massively underutilised in gaming workloads. Whether this is because of other architectural bottlenecks (such as ROP or geometry throughput, or even memory bandwidth) or lack of targeted optimisation by game developers, I do not know for sure. But it is worth considering that NVIDIA's market share and somewhat 'heavy-handed' approach mean that developers of major titles are going to be targeting GeForce cards first with all types of optimisation, whether they openly admit it or not.

AMD is fighting an uphill battle in that area, I have heard that NVIDIA are getting developers to actively change the rendering pipeline in major PC-ported game engines to better suit their GPU designs, at the expense of performance on GCN. Even with GCN in the major gaming consoles, when porting to PC, that all goes out of the window. I am not saying GCN is a perfect architecture, it is quite obvious that there needs to be some serious redesign, especially with regards to its resource balancing and scalability (64 ROP, 4 SE, 64 CU seems to be an architectural limitation at this stage).

I think GCN was designed with a different idea of what game engines would use in 3D graphics: a strong emphasis on compute performance, but a lighter approach to pixel and primitive performance. It is obvious that NVIDIA's architecture is more balanced / better suited to current game engines. That brings me to the utilisation point: I have been experimenting with GPUOpen's Radeon GPU Profiler and taking profiles of DX12 and Vulkan games that I own. From what I can see, Vega's shader array spends a significant amount of time doing literally nothing. This shows up in what is known as "Wavefront Occupancy", and I was actually shocked to see it visualised. It is obvious that Vega is not shader-bound in current games, as the performance gap when clock speeds are normalised between the 56 CU and 64 CU chips is ~5%, despite the latter having 14.5% more shading resources.

What I am trying to say with this somewhat lengthy post is that Vega is not the garbage so many people claim it is. AMD tried to fix some major bottlenecks with NGG, but for whatever reason it is not fully implemented in production silicon, or not enabled in the current driver stack - Vega 10 falls back to a legacy implementation of fixed-function geometry processing, all while its shader array spends more time than not idling. So take a moment to fully understand why this could be, rather than bashing the GPU as 'garbage'. Remember that RTG doesn't have the luxury of funds to develop multiple GPU lines for different uses; Vega is a multi-purpose chip delivering extremely competitive compute performance and serviceable gaming performance. It also has the highest level of DX12 feature support currently out of the dGPUs, even greater than Turing.

TLDR: Vega is heavily underutilised in games, potentially due to lack of real effort by devs to optimise for it, or being designed for a dramatically different type of workload (compute-heavy) than what current games demand. Here's hoping AMD can turn Radeon around. Thanks for reading. -Ash
#46
WikiFM
ppn: AMD's 7nm VEGA 20 is just around the corner; 8GB on a 4096-bit bus is doable to save costs, or even 12GB on a 3072-bit bus. That card should be 75% faster than VEGA 10, or 2080 Ti level of performance, based on memory bandwidth alone: 1225 vs. 484 GB/s.

The size of it, 325 mm², is insanely big for 7nm, since at 14nm the 510 mm², 12500 Mtransistor VEGA 10 comes to around 25 Mtr/mm², down from a pure 33 Mtr/mm² SRAM density.

And 7nm is 100 Mtr/mm² for SRAM; a final product is realistically 75 Mtr/mm². It can't be 50, because that wouldn't be 7nm but closer to 10nm, which is 60 for SRAM and 45 Mtr/mm² for a complex chip.

So VEGA 20 would be 25,000 million transistors, double that of VEGA 10. I don't believe it.
Vega 20 is expected to have 4096 cores like Vega 10, so the performance uplift would come mostly from higher clocks (maybe 2 GHz) and the 2.5x memory bandwidth; it would not be 75% faster.

The difference between TSMC's 10 nm and 7 nm is small, and since Vega 10 is built on GF's 14 nm, we can't compare between different foundries, so the real density is unclear.

Vega 20 is very unlikely to double its transistor count, since it has the same number of cores as Vega 10. Anyway, Vega 20 is not expected to launch in the gaming market; I believe the reason is that yields are not good enough for it to be priced at competitive levels (less than $800).
#47
ppn
TSMC is very clear about it: 10nm is 2x the density of 12/16nm, and 7nm is another 1.6x, so we have 30, 60 and 100 Mtr/mm².

Of course, AMD says it is only 2x density, so it is effectively 10nm. But why would they call it 7nm then?
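That density ladder is easy to reproduce; a minimal sketch, taking TSMC's scaling factors exactly as stated above:

```python
# SRAM density ladder implied by TSMC's stated scaling factors (Mtr/mm^2).
density_16nm = 30                    # 12/16 nm class, as quoted above
density_10nm = density_16nm * 2.0    # 10 nm: ~2x the 16 nm density
density_7nm = density_10nm * 1.6     # 7 nm: another ~1.6x on top of 10 nm

print(density_10nm, density_7nm)     # 60.0 96.0 -> the "30, 60 and 100" ladder, rounded
```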
#48
efikkan
ArbitraryAffection: A lot of people hate on GCN - especially Vega - for underwhelming performance in games and sub-par power efficiency. Which I can understand, especially when you look at NVIDIA's offerings. Vega is a troubled GPU, in my opinion. From what I have gathered from talking to people in the know, and doing some experimenting with my own Vega 56 card, the GPU (or specifically, its CU array) is massively underutilised in gaming workloads.
Right, so far. Vega has plenty of computational performance and memory bandwidth, and it works fine for simple compute workloads, so it all comes down to utilization under various workloads.
ArbitraryAffection: Whether this is because of other architectural bottlenecks (such as ROP or geometry throughput, or even memory bandwidth(1)) or lack of targeted optimisation by game developers(2), I do not know for sure. But it is worth considering that NVIDIA's market share and somewhat 'heavy-handed' approach mean that developers of major titles are going to be targeting GeForce cards first with all types of optimisation, whether they openly admit it or not.
1) It's not a lack of memory bandwidth. The RX Vega 64 has 483.8 GB/s, essentially the same as the GeForce 1080 Ti (484.3 GB/s), so there is plenty.

2) Well, considering most top games are console ports, today's game bias favors AMD more than ever. Still, most people are misguided about what is actually done in "optimizations" by developers. In principle, games are written using a common graphics API, and none of the big ones are optimized by design for any GPU architecture or specific model. Developers are of course free to create different render paths for various hardware, but this is rare and shouldn't be done; it's commonly only used when certain hardware has major problems with certain workloads. Many games are still marginally biased one way or the other; this is not intentional, but simply a consequence of most developers doing the critical development phases on one vendor's hardware and then, by accident, making design choices which favor one of them. This bias is still relatively small, rarely over 5-10%.

So let's put this one to rest once and for all; games don't suck because they are not optimized for a specific GPU. It doesn't work that way.
ArbitraryAffection: AMD is fighting an uphill battle in that area; I have heard that NVIDIA are getting developers to actively change the rendering pipeline in major PC-ported game engines to better suit their GPU designs, at the expense of performance on GCN.
Do you have concrete evidence of that?

Even if that is true, the point of benchmarking 15-20 games is that it will eliminate outliers.
ArbitraryAffection: I think GCN was designed with a different idea of what game engines would use in 3D graphics: a strong emphasis on compute performance, but a lighter approach to pixel and primitive performance. It is obvious that NVIDIA's architecture is more balanced / better suited to current game engines. That brings me to the utilisation point: I have been experimenting with GPUOpen's Radeon GPU Profiler and taking profiles of DX12 and Vulkan games that I own. From what I can see, Vega's shader array spends a significant amount of time doing literally nothing(3). This shows up in what is known as "Wavefront Occupancy", and I was actually shocked to see it visualised. It is obvious that Vega is not shader-bound in current games, as the performance gap when clock speeds are normalised between the 56 CU and 64 CU chips is ~5%, despite the latter having 14.5% more shading resources.

Vega is heavily underutilised in games, potentially due to lack of real effort by devs to optimise for it(4), or being designed for a dramatically different type of workload (compute-heavy) than what current games demand
Your observation of idle resources is correct (3); that is the result of the big problem with GCN.
You raise some important questions here, but the assessment is wrong (4).

As I've mentioned, GCN scales nearly perfectly on simple compute workloads. So if a piece of hardware can scale perfectly, you might be tempted to think that the error is not in the hardware but in the workload. Well, that's the most common "engineering" mistake; you have a problem (the task of rendering) and a solution (the hardware), and when the solution is not working satisfactorily, you re-engineer the problem, not the solution. This is why we always hear people scream that "games are not optimized for this hardware yet"; well, the truth is that games rarely are.

The task of rendering is of course in principle just math, but it's not as simple as people think. It's actually a pipeline of workloads, many of which may be heavily parallel within a block, but which may also have tremendous amounts of resource dependencies. The GPU has to divide the rendering task into small worker threads (GPU threads, not CPU threads) which run on the clusters, and based on the memory controller(s), cache, etc. it has to schedule things so that the GPU is well saturated at all times. Many things can cause stalls, but the primary ones are resource dependencies (e.g. multiple cores needing the same texture at the same time) and dependencies between workloads. Nearly all of Nvidia's efficiency advantage comes down to this, which answers your (3).

Even with the new "low-level APIs", developers still can't access low-level instructions or even low-level scheduling on the GPU. There are certainly things developers can do to render more efficiently, but most of these will be bigger things (on a logic or algorithmic level) that benefit everyone, like changing the logic in a shader program or achieving something with fewer API calls. The true low-level optimizations that people fantasize about are simply not possible yet, even if people wanted to do them.
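To illustrate the saturation argument, here is a deliberately crude toy model (the numbers are hypothetical; real wavefront scheduling is far more complex than a single min()):

```python
def utilization(units: int, ready_waves: int) -> float:
    """Fraction of execution units kept busy when only `ready_waves`
    independent wavefronts are eligible to issue at a given moment."""
    return min(ready_waves, units) / units

# A simple compute kernel exposes massive independent parallelism,
# so a wide shader array scales almost perfectly:
print(utilization(units=64, ready_waves=4096))  # 1.0

# A dependency-heavy render pass keeps far fewer wavefronts eligible
# (shared textures, workload dependencies), leaving units idle:
print(utilization(units=64, ready_waves=40))    # 0.625
```

Which is consistent with the 56-to-64 CU observation above: adding units doesn't help when the workload can't keep the existing ones fed.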
#49
looniam
the OP reminds me of what i was thinking and posted several times 2 years ago, just prior to polaris's launch: nv hadn't released any mid-range ($200-$300) cards yet, and AMD was in a position to grab all of it, at least for a minute.

we see how that went - to my utter disappointment. so here we are.

hawaii will be 5 y/o next month (happy birthday!) and though it was a hot, hungry power hog, its price/performance was undisputed. that was, imo, AMD's last successful launch.

and that makes a sad panda.
#50
Liquid Cool
I'm not an english instructor... but that first sentence in your article rolls off the tongue a little strange. "In the short time"? Other than this... I enjoyed the read, Dmartin.

I will mention... I just picked up an RX 480 for $120 on eBay. Just like brand new. There are a ton of RX 470/480s and RX 570/580s floating around dirt cheap out there. With today's 15% off everything... good deals galore.

For 1080p gaming...I'm more than ecstatic...borderline giddy!

:),

Liquid Cool