Thursday, September 27th 2018

NVIDIA's Weakness is AMD's Strength: It's Time to Regain Mid-Range Users

It won't be easy to see an AMD counterpart to the NVIDIA GeForce RTX 2080 Ti in the short term. Heck, it will be difficult to see any real competitor to the RTX 2000 Series anytime soon. But maybe AMD doesn't need to compete on that front. Not yet. The reason is sound and clear: RTX prices are NVIDIA's weakness, and that weakness, my fellow readers, could become AMD's biggest strength.

The prices NVIDIA and its partners are asking for RTX 2080 and RTX 2080 Ti have been a clear discussion topic since we learnt about them. Most users have criticized those price points, even after NVIDIA explained the reasoning behind them. Those chips are bigger, more complex and more powerful than ever, so yes, costs have increased and that has to be taken into account.
None of that matters. It doesn't even matter what ray tracing can bring to games, and it doesn't matter whether those Tensor Cores will provide benefits beyond DLSS. I'm not even counting the fact that DLSS is a proprietary technology that will lock us (and developers, for that matter) a little bit more into another walled garden. Even then, the market perception is clear: whoever has the fastest graphics card is perceived as the tech/market leader.

There's certainly a chance that RTX takes off and NVIDIA sets the bar again in this segment: the reception of the new cards hasn't been overwhelming, but developers could begin to take advantage of all the benefits Turing brings. If they do, we will have a different discussion, one in which future cards such as the RTX 2070/2060 and their derivatives could bring a bright future for NVIDIA... and a dimmer one for AMD.

But the thing that matters now for a lot of users is pricing, and AMD could leverage that crucial element of the equation. In fact, the company could do so very soon. Some indications suggest that AMD could launch a new Polaris revision in the near future. This new silicon is allegedly being built on TSMC's 12 nm process, something AMD did successfully with its Ryzen 2000 Series of CPUs.
AMD must have learnt a good lesson there: its CPU portfolio is easily the best in its history, and the latest woes at Intel are helping, prompting forecast revisions that estimate a 30% global market share for AMD in Q4 2018. See? Intel's weakness is AMD's strength on this front.

2018 started with the worst news for Intel (Spectre and Meltdown) and it hasn't gone much better since: the jump to the 10 nm process never seems to come, and Intel's messaging about those delays has not helped to reinforce confidence in the manufacturer. The company would be three generations further along than it is now without those big problems, and the situation for AMD would be quite different too.

Everything seems to make sense here: customers are upset with an RTX 2000 Series priced for the elite, and that Polaris revision could arrive at the right time and in the right place. With a smaller node AMD could gain higher yields, decrease costs, increase clock frequencies and provide the 15% performance increase some publications are pointing to. Those are a lot of "coulds", and in fact there's no reason to believe that Polaris 30 is more than just a die shrink, so we would have the same unit counts and higher clocks.
That probably won't be enough to make the hypothetical RX 680 catch up with a GTX 1070: the latter performs about 34% better than the RX 580 on average according to our tests. But even with that refresh we would have a more competitive Radeon RX family that could win the price/performance battle, and that is no small feat.
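To put those percentages in perspective, here is a minimal back-of-the-envelope sketch in Python. The "RX 680" name, its 15% uplift and both prices are assumptions drawn from the speculation above, not announced figures:

```python
# Relative performance, indexed to the RX 580 = 100 (GTX 1070 ~34% faster per our tests).
rx_580 = 100.0
gtx_1070 = rx_580 * 1.34

# Rumored ~15% uplift for the refreshed part (hypothetical "RX 680").
rx_680 = rx_580 * 1.15

gap_pct = (gtx_1070 - rx_680) / rx_680 * 100
print(f"A hypothetical RX 680 would still trail the GTX 1070 by ~{gap_pct:.0f}%")

# Price/performance with assumed street prices: even while trailing in raw speed,
# the cheaper card can win the perf-per-dollar battle.
for name, perf, price in [("RX 680", rx_680, 250), ("GTX 1070", gtx_1070, 400)]:
    print(f"{name} (assumed ${price}): {perf / price:.3f} perf/$")
```

In other words, a 15% uplift leaves a double-digit raw-performance gap, but at the right price the refresh could still come out ahead on value.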

The new cards would not target just existing GTX 700/900 Series users, but also those older AMD Radeon users who were expecting a nice performance upgrade without having to sell their souls. And for undecided users, the ones thinking about getting a GTX 1050/Ti or a GTX 1060, AMD's offer could be quite attractive if the price/performance ratio hits NVIDIA where it hurts most.

That would put this new family of graphics cards (Radeon RX 600?) in a pretty good position to compete with the GeForce GTX 1000 Series. NVIDIA presumably could still be king in power consumption, but besides that, AMD could position itself in that $300-$500 range (and even below it) with a really compelling suite of products.

So yes, AMD could have a winning hand here. Your move, AMD.

110 Comments on NVIDIA's Weakness is AMD's Strength: It's Time to Regain Mid-Range Users

#51
Casecutter
looniam: the OP reminds me of what i was thinking and posted several times 2 years ago just prior to polaris's launch. nv hadn't released any mid ranged ($200-$300) cards yet and AMD was in position to grab all of it, at least for a minute.
Well yes Nvidia can always hang back and squash AMD. AMD hasn't won at that game since the days of the 4870.

Context: Nvidia dropped the GP106 Pascal GTX 1060 less than a month after the RX480, though it wasn't till mid-to-late August that AIB reviews appeared, and real stock wasn't available till around September. AMD was briskly selling 480s and 470s at and below their MSRP, as folks weren't all seeing the "value" in a card that offered 2 GB (25%) less memory at 30% more cash for a 3-4% performance uptick.

The worst was that by October mining was already affecting 480/470 prices, making them less palatable, and by Christmas they were only getting bought by miners. The GTX 1060 stayed reasonable in price and miners weren't flocking to them. And sure, since Steam lumps all GTX 1060s (3 GB geldings and 6 GB) together as one, there's high usage. If mining had not been in play, and we looked at all Polaris cards (480/470/580/570), there's in all probability nowhere near the Steam discrepancy some tout today.
Posted on Reply
#52
hat
Enthusiast
notb: Such an awful text. Unbelievable. :-o
@hat what do you think about this one?
Seems like a good candidate for an editorial tag to me. If it were marked editorial, there'd be nothing wrong with it. Everyone gets to have an opinion. As far as any actual news goes, well, none of it is really news (to us) because we've kept up on the issues laid out in the thread and those topics have already been covered, but to the casual tech reader who isn't keeping up on that news every day, it's a sound opinion piece with already existing facts laid out to back up the author's opinion.
notb: And, inevitably, Meltdown and Intel's 10nm problems had to be mentioned. Because why not?
It's not really relevant to AMD vs nVidia, but the link between the issues here is that while some users are dissatisfied with nVidia's prices, some users are also dissatisfied with Intel's problems and may be looking to AMD instead. Without cracking open all the Intel vs AMD debates too much (again), it shows the reader that in one way or another, tech giants nVidia and Intel are both currently tripping on their own shoelaces, giving underdog AMD a better chance to compete in a tough environment. Again, though, this is an opinion piece, not solid fact, so I'm fine with it. The only issue I take is that we basically have Intel vs AMD slipped into a predominantly nVidia vs AMD topic. There's room for a whole new editorial about that.
notb: This sentence in particular looks like it was copied straight from an AMD strategy presentation or internal mailing:
"The reason is sound and clear: RTX prices are NVIDIA's weakness, and that weakness, my fellow readers, could become AMD's biggest strength."

And then this:
"So yes, AMD could have a winning hand here. Your move, AMD. "
Well, once again, as an editorial, it's fine. And I don't think the author is wrong. There are plenty of users who aren't... "brand loyal" who would be happy to purchase competitive AMD hardware in this way. Because of Turing's astronomical price, AMD doesn't exactly have to beat it, just come close enough and be cheap enough to score sales.
notb: Show yourself @dmartin - joined: Tuesday at 12:35 PM - Messages: 2
Tell me, why do you care about AMD "winning" so much? And why do you think I should care?
Not so sure the article said "I want AMD to win" as much as "AMD could win". Either way, in a market clearly dominated by nVidia, a win from AMD is good for everyone, even hardcore nVidia fanboys. At least they'll get better prices.
#53
Jism
Wasn't the MSRP for the 1060 approximately $350, and the MSRP for the RX480 $200?

If the mining craze wasn't there, those cards would be available for MSRP. Then you have these a-hole webshops who put their extra margin on top of the MSRP, and there you have it.
#54
efikkan
Liquid Cool: I will mention... I just picked up an RX 480 for $120 on eBay. Just like brand new. There's a ton of RX 470/480 and RX 570/580s floating around dirt cheap out there.
Buying used is fine, I've done that several times myself, both successful and not.
But used is used, and it should be priced accordingly. You should never pay more than you are willing to risk for a short-lived product. The second-hand market should never be compared to prices of new hardware. E.g. a two-year-old RX 480 at $120 should be considered a card with two years less lifetime vs. a brand new card. And especially in these mining times I would not buy any used graphics card.
#55
looniam
Casecutter: Well yes Nvidia can always hang back and squash AMD. AMD hasn't won at that game since the days of the 4870.

Context: Nvidia dropped the GP106 Pascal GTX 1060 less than a month after the RX480, though it wasn't till mid-to-late August that AIB reviews appeared, and real stock wasn't available till around September. AMD was briskly selling 480s and 470s at and below their MSRP, as folks weren't all seeing the "value" in a card that offered 2 GB (25%) less memory at 30% more cash for a 3-4% performance uptick.

The worst was that by October mining was already affecting 480/470 prices, making them less palatable, and by Christmas they were only getting bought by miners. The GTX 1060 stayed reasonable in price and miners weren't flocking to them. And sure, since Steam lumps all GTX 1060s (3 GB geldings and 6 GB) together as one, there's high usage. If mining had not been in play, and we looked at all Polaris cards (480/470/580/570), there's in all probability nowhere near the Steam discrepancy some tout today.
not disagreeing at all with your points. but what stood out to me was how, at launch, reviewers observed the power drawn via the pci-e slot running a little out of spec.

of course it was soon fixed w/firmware but unfortunately, the FUD of it causing your house to burn down had already spread like... wildfire (no pun intended!) so the later and more expensive 1060 seemed like the better choice regardless.

personally, i'm convinced i will be holding on to my 980ti until it blows up. if nothing is around by then, i give up w/gaming AAA titles.
#56
xkm1948
Is this TPU? I thought I was reading another r/AMD self-congratulatory thread.

When I look at this editorial, as well as the crazed RTG fanboys (and fangirls, fancats, etc.), all I can think of is this:

#57
Camm
I doubt AMD will bother. Even when AMD has competitive products, gamers don't buy them, so it's better to focus on the enterprise market where vendors ARE buying AMD cards, because Nvidia got too greedy with GRID licensing and AMD offers perfectly adequate performance in most cases.

IMO, AMD will reenter the gaming GPU market when HBM becomes affordable enough, and 7nm is sampling well enough to launch. Nvidia is sort of locked into this generation for the next year, so I wouldn't be surprised to see a Navi launch targeting the stack (IF Navi performs) around the same time as Zen 2, so Feb/Mar next year.

As for the muppets saying Polaris sold well, it did - to miners. The Steam survey shows the 1060 outpacing it around 5-to-1. Which is a shame; the 580 is a better card than the 1060 most of the time, so it should have sold better. But see the original point: gamers don't buy AMD even when they have competitive products.
#58
R-T-B
xkm1948: Is this TPU? I thought I was reading another r/AMD self-congratulatory thread.

When I look at this editorial, as well as the crazed RTG fanboys (and fangirls, fan_helicopters, etc.), all I can think of is this:

Editorials. They happen. They should be marked better though.

In the interest of avoiding political discussions, please leave your helicopter implications at home.
#59
NC37
AMD's next best thing is always "just around the corner," but then when the corner comes, there is always a sign that says "Want the next best thing? It's just around the next corner!"

My 390 has been solid with the exception of spending a year with a driver bug in Overwatch that took AMD forever to fix. That alone is reason enough for me not to return to AMD again. Their driver development team is so outclassed by nVidia's. When I had issues with nVidia, they'd get fixed much faster than that. However, nVidia choosing to dump 106-series chips, which have historically been relegated to midrange/lower-midrange GPUs, into the high end is something that doesn't make me want to go back either. It means nVidia is getting away with near robbery by charging hundreds more for a lower-mid/mid part. Granted, partly due to AMD's ineptitude in delivering competitive GPUs, nVidia has been able to get away with this.

To get the GPU I'd want to buy, I'd have to spend close to $500 now, when the sweet-spot price point was traditionally in the $300 area. Whatever nVidia dumps in the $300 segment will likely be both underperforming and crippled in terms of VRAM. Hence why I jumped ship back to AMD for the 390. I hope AMD really drills nVidia for this behavior with some solid boards. But more Polaris isn't going to cut it. Polaris is a dead-end chip that has been underwhelming since launch, and revisions do nothing to address the problems.
#60
TheoneandonlyMrK
NC37: AMD's next best thing is always "just around the corner," but then when the corner comes, there is always a sign that says "Want the next best thing? It's just around the next corner!"

My 390 has been solid with the exception of spending a year with a driver bug in Overwatch that took AMD forever to fix. That alone is reason enough for me not to return to AMD again. Their driver development team is so outclassed by nVidia's. When I had issues with nVidia, they'd get fixed much faster than that. However, nVidia choosing to dump 106-series chips, which have historically been relegated to midrange/lower-midrange GPUs, into the high end is something that doesn't make me want to go back either. It means nVidia is getting away with near robbery by charging hundreds more for a lower-mid/mid part. Granted, partly due to AMD's ineptitude in delivering competitive GPUs, nVidia has been able to get away with this.

To get the GPU I'd want to buy, I'd have to spend close to $500 now, when the sweet-spot price point was traditionally in the $300 area. Whatever nVidia dumps in the $300 segment will likely be both underperforming and crippled in terms of VRAM. Hence why I jumped ship back to AMD for the 390. I hope AMD really drills nVidia for this behavior with some solid boards. But more Polaris isn't going to cut it. Polaris is a dead-end chip that has been underwhelming since launch, and revisions do nothing to address the problems.
I've tried all the recent GPUs in some form over the last few years, and other than edge cases from both camps, both had similar driver compatibility issues, but every card's performance was better than expected, every time. It was one game, dude, really.
#61
ArbitraryAffection
efikkan: Right, so far. Vega has plenty of computational performance and memory bandwidth, and it works fine for simple compute workloads, so it all comes down to utilization under various workloads.


1) It's not a lack of memory bandwidth. The RX Vega 64 has 483.8 GB/s, practically the same as the GeForce 1080 Ti (484.3 GB/s), so there is plenty.

2) Well, considering most top games are console ports, the game bias today favors AMD more than ever. Still, most people are misguided about what "optimizations" from developers actually involve. In principle, games are written using a common graphics API, and none of the big ones are optimized by design for any GPU architecture or a specific model. Developers are of course free to create different render paths for various hardware, but this is rare and shouldn't be done; it's commonly only used when certain hardware has major problems with certain workloads. Many games are still marginally biased one way or the other, but this is not intentional, simply a consequence of most developers doing the critical development phases on one vendor's hardware, and then by accident making design choices which favor one of them. This bias is still relatively small, rarely over 5-10%.

So let's put this one to rest once and for all; games don't suck because they are not optimized for a specific GPU. It doesn't work that way.


Do you have concrete evidence of that?

Even if that is true, the point of benchmarking 15-20 games is that it will eliminate outliers.


Your observation of idle resources is correct (3); that is the result of the big problem with GCN.
You raise some important questions here, but the assessment is wrong (4).

As I've mentioned, GCN scales nearly perfectly on simple compute workloads. So if a piece of hardware can scale perfectly, then you might be tempted to think that the error is not the hardware but the workload? Well, that's the most common "engineering" mistake; you have a problem (the task of rendering) and a solution (hardware), and when the solution is not working satisfactorily, you re-engineer the problem, not the solution. This is why we always hear people scream that "games are not optimized for this hardware yet"; well, the truth is that games rarely are.

The task of rendering is of course in principle just math, but it's not as simple as people think. It's actually a pipeline of workloads, many of which may be heavily parallel within a block, but may also have tremendous amounts of resource dependencies. The GPU has to divide these rendering tasks into small worker threads (GPU threads, not CPU threads) which run on the clusters, and based on memory controller(s), cache, etc., it has to schedule things so that the GPU is well saturated at any time. Many things can cause stalls, but the primary ones are resource dependencies (e.g. multiple cores need the same texture at the same time) and dependencies between workloads. Nearly all of Nvidia's efficiency advantage comes down to this, which answers your (3).

Even with the new "low level APIs", developers still can't access low-level instructions or even low-level scheduling on the GPU. There are certainly things developers can do to render more efficiently, but most of that will be bigger things (on a logic or algorithmic level) that benefit everyone, like changing the logic in a shader program or achieving something with fewer API calls. The true low-level optimizations that people fantasize about are simply not possible yet, even if people wanted them.
Firstly I want to thank you for the informative post. I do appreciate it. I'm definitely no expert on this and I am just saying what I have heard talking to people, and playing with the tool I mentioned. Honestly I have been trying to chase down the 'cause' of Vega's somewhat lacklustre performance given its resource advantage over the GP104 chip.

On the note of memory bandwidth: why would AMD opt for a quad-stack, 4096-bit interface on Vega 20 (this is confirmed, as we have seen the chip being displayed), with potentially over 1 TB/s of raw memory bandwidth, if it wasn't at least somewhat limited by memory bandwidth? Or is that purely due to memory capacity reasons? Honestly, almost everyone I talk to about GCN says it is crying out for more bandwidth. It's also worth pointing out that NVIDIA's delta colour compression is significantly better than AMD's in Vega: the 1080 Ti almost certainly has quite a bit more effective bandwidth than Vega when that is factored in.

So resource utilisation is a major issue for Vega, then. Do you think there is any chance they could have 'fixed' any of this for Vega 20? I won't lie, I've been kinda hoping for some Magical Secret Sauce for Vega 10, perhaps NGG Fast Path or the fabled Primitive Shaders. -shrug- even if it doesn't happen, I am satisfied with my Vega 56 as it is, I am only playing at 1080p, 60 Hz so it is plenty fast enough.
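As a quick sanity check on the bandwidth figures in this exchange: raw HBM bandwidth is just bus width times per-pin data rate. A minimal sketch; the 2.0 Gbps per-pin rate for Vega 20 is an assumption for illustration, not a confirmed spec:

```python
def hbm_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Raw memory bandwidth in GB/s: (bus width in bytes) x (per-pin data rate in Gbps)."""
    return bus_width_bits / 8 * data_rate_gbps

# RX Vega 64: 2048-bit HBM2 at ~1.89 Gbps per pin, roughly the 483.8 GB/s cited above
vega64 = hbm_bandwidth_gbs(2048, 1.89)

# Vega 20: 4096-bit interface at an assumed 2.0 Gbps per pin, around 1 TB/s
vega20 = hbm_bandwidth_gbs(4096, 2.0)

print(f"Vega 64: {vega64:.1f} GB/s, Vega 20: {vega20:.1f} GB/s")
```

Doubling the interface width alone doubles the raw figure, which is why a quad-stack Vega 20 clears 1 TB/s even at modest per-pin rates.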
#62
Zubasa
Camm: I doubt AMD will bother. Even when AMD has competitive products, gamers don't buy them, so it's better to focus on the enterprise market where vendors ARE buying AMD cards, because Nvidia got too greedy with GRID licensing and AMD offers perfectly adequate performance in most cases.

IMO, AMD will reenter the gaming GPU market when HBM becomes affordable enough, and 7nm is sampling well enough to launch. Nvidia is sort of locked into this generation for the next year, so I wouldn't be surprised to see a Navi launch targeting the stack (IF Navi performs) around the same time as Zen 2, so Feb/Mar next year.

As for the muppets saying Polaris sold well, it did - to miners. The Steam survey shows the 1060 outpacing it around 5-to-1. Which is a shame; the 580 is a better card than the 1060 most of the time, so it should have sold better. But see the original point: gamers don't buy AMD even when they have competitive products.
Your statement is mostly true.

But the problem with the Steam survey is that it just shows the number of users who played on that GPU, not the actual number of different machines.
For example, the ratio of Intel/nVidia hardware sky-rocketed when they added Chinese users to the survey.
The reason is that East Asian users generally do not own their own PCs but play at Internet cafés on pre-built machines (which are generally Intel/nVidia), and each machine can serve hundreds of different users at different times.
#63
Emu
DeathtoGnomes: if etailers stopped price gouging, the market would still have favored Nvidia.
The only etailers that I have seen price gouging are a few trying to take advantage of the extremely limited quantities of 2080 (Ti) cards to charge too much on Amazon. Here in Australia, where we usually pay the "Australia tax", custom card pricing is actually pretty much in line with US pricing; I checked by taking the US price, converting to Australian dollars and adding the 10% tax.

The problem that I see for Nvidia, though, is that they have priced the 20x0 series at the price points where the 10x0 series offered comparable performance. This means that the only real performance-per-dollar increase is the as-yet unknown performance boost from DLSS and the image quality improvements from RTX. If Nvidia priced the 20x0 series at the same model price points as the previous 10x0 series then they would literally own the market and force AMD's GPU marketshare from around 15% down into the single digits.
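For what it's worth, the "Australia tax" check described above is simple enough to sketch. The exchange rate below is an assumed illustrative value, not a live quote:

```python
def au_street_price(us_price, usd_to_aud=1.38):
    """Expected Australian price from a US list price: FX conversion plus 10% GST."""
    return us_price * usd_to_aud * 1.10

# e.g. compare an assumed US$799 card against the local shelf price
print(round(au_street_price(799), 2))
```

If the local shelf price lands near that figure, there's no meaningful gouging on top of the usual tax.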
#64
Rockarola
Welcome to the hornet's nest. You wrote about red/green... now you have to live with a plethora of fanboys, trolls and a few shills.
(Good piece, but anything mentioning red/green in a positive fashion WILL get trolled into oblivion by those who subscribe to their own version of the truth.)
#65
Melvis
All AMD needs to do is bring out new video cards that compete in the GTX 1070/Ti/1080/1080 Ti range for a lot less money than Nvidia, and you've got yourself a winner. Yes, Vega is around 1080-ish performance (a little less on average) but it's way too expensive; bring in a card with the same performance or a little more for $200 less and people will buy, buy, buy.
#66
hat
Enthusiast
I think $200 less is overly optimistic for such a card, but damn if it wouldn't sell...
#67
Arjai
blah RED!
blah GREEN!

It really is a personal choice. I just bought a used 580 (PowerColor Red Devil 8GB) from a trusted TPU'er who mined on it, undervolted with boosted memory clocks. I went with the 580 because I am going to be using a 1080p monitor and it is right in line with a lot of cards at 1080p that are much more expensive to buy. So, once my gaming rig is finished, hopefully Sunday, I will be adding another 580 to the Steam results, LOL.

I am far from a "Gaming Enthusiast" and more of a play-when-I-can gamer. I am sure I will be plenty happy with my 580. My last gaming card, an MSI 7850 now in one of my T3500s, is going to do some extra duty as my gamer for now.
#68
Vayra86
Zubasa: Your statement is mostly true.

But the problem with the Steam survey is that it just shows the number of users who played on that GPU, not the actual number of different machines.
For example, the ratio of Intel/nVidia hardware sky-rocketed when they added Chinese users to the survey.
The reason is that East Asian users generally do not own their own PCs but play at Internet cafés on pre-built machines (which are generally Intel/nVidia), and each machine can serve hundreds of different users at different times.
While true, the low AMD share was always a fact, even long before the Chinese users popped up on Steam. And even so, those hundreds of users are using an Nvidia card regardless. Why not an AMD card? It's not like China said to Nvidia: "come in, we've told AMD they can sod off".

In a broader sense though, yes, the Steam survey can really paint a wrong picture if you consider it "the entire market".
Melvis: All AMD needs to do is bring out new video cards that compete in the GTX 1070/Ti/1080/1080 Ti range for a lot less money than Nvidia, and you've got yourself a winner. Yes, Vega is around 1080-ish performance (a little less on average) but it's way too expensive; bring in a card with the same performance or a little more for $200 less and people will buy, buy, buy.
Yes, and at that price AMD will be doing lots of work for nothing more than break-even. You know this won't happen unless they can build a GPU as efficiently as they built Zen: awesome yields, good efficiency, and performance that can be scaled across the whole product stack. The idea that you can somehow win on price alone just won't fly. You need a product that is competitive at an architectural/design level.
#69
hat
Enthusiast
Unfortunately, GPUs have to be monolithic, because SLI/xFire is, well, bad... unless they can design smaller parts that can work together in a better way.
#70
XiGMAKiD
dmartin: a bright future for NVIDIA... and a dimmer one for AMD.
Notice how their codenames go from the brightest star to a dimmer star :roll:
dmartin: AMD could position itself in that $300-$500 range
$300 for re-refreshed Polaris? :kookoo:
#71
londiste
Jism: Wasn't the MSRP for the 1060 approximately $350, and the MSRP for the RX480 $200?
If the mining craze wasn't there, those cards would be available for MSRP. Then you have these a-hole webshops who put their extra margin on top of the MSRP, and there you have it.
MSRPs:

summer 2016:
- rx470: $179
- rx480 4gb: $199
- rx480 8gb: $239
- 1060 3gb: $199
- 1060 6gb: $249

spring 2017:
- rx570: $169
- rx580 4gb: $199
- rx580 8gb: $239
Emu: If Nvidia priced the 20x0 series at the same model price points as the previous 10x0 series then they would literally own the market and force AMD's GPU marketshare from around 15% down into the single digits.
No, it would not. The Steam HW survey, despite its shortcomings, is good enough for some generic data, and it shows that GTX 1080 and up cards make up only around 5% of the market. That is not a big enough sector to shake things up.
Arjai: It really is a personal choice. I just bought a used 580 (PowerColor Red Devil 8GB) from a trusted TPU'er who mined on it, undervolted with boosted memory clocks. I went with the 580 because I am going to be using a 1080p monitor and it is right in line with a lot of cards at 1080p that are much more expensive to buy. So, once my gaming rig is finished, hopefully Sunday, I will be adding another 580 to the Steam results, LOL.
I am far from a "Gaming Enthusiast" and more of a play-when-I-can gamer. I am sure I will be plenty happy with my 580. My last gaming card, an MSI 7850 now in one of my T3500s, is going to do some extra duty as my gamer for now.
That is an awesome card, and congrats on getting one. A note though: these sold anywhere between $400 and $500 when new, basically twice the MSRP ;)
#72
c12038
I am still using my GTX 780, which is still going strong. All this hype about RTX cards, and not much to show really; it's still new tech, give it a chance to evolve, nothing happens overnight. As the saying goes, "Rome wasn't built in a day." I am waiting for all the bugs and issues to be sorted, then I may think of upgrading to a GTX 1080 Ti, as these will come down in price like always.

If you want better quality, hold back on paying for crap; use your wallets, not your impulsive buying nature, to show all these companies who owns them. Yes, it's you, the consumer, who owns their ass; without you they would be out of business... common logical sense. (Save money, buy old tech, wait until issues and bugs are sorted, then invest in the hardware. Simple.)


that's my 2p's worth
#73
Aldain
DeathtoGnomes: if etailers stopped price gouging, the market would still have favored Nvidia.
nope... it is not only on them... nvidia priced turing sky high and they are paying the price for it now
TheinsanegamerN: AMD already regained the middle ground with the RX480; it was competitive and sold well.

AMD's problem is trying to compete with an old arch. GCN was useful, but needs serious updating to be competitive. AMD instead wasted money on Vega, which didn't go anywhere. They also let their driver support go lax yet again; 18.6 to 18.9 have been trainwrecks of bugs and performance issues.

And now, with nvidia getting slovenly, AMD's GPU division is nowhere to be found. Navi *might* come out in 2019, but it may also just be tweaked Vega. Nvidia is gonna make a mint on this generation, just like the last one, and by the time AMD shows up, the market will be ready for true next-generation GPUs, not AMD's somewhat-competitive-with-2-year-old-GPUs design.
Vega did not go anywhere? Ironically, as a product Vega is the BEST arch AMD has had in a long time.
#74
dmartin
Vayra86: +1

What is the point of these bits of text? To echo the popular sentiment on a forum after the fact? If the idea is to breathe life into a discussion or create one... well, mission accomplished, but there are a dozen threads that have done that already. And this piece literally adds nothing to them; all it serves to do is repeat them.

Regardless.... here's my take on the editorial you wrote up @dmartin

I think it's a mistake to consider Nvidia's Turing a 'weakness'. You guys act like they dropped Pascal on its head when they launched Turing and there is no way back. What Turing is, is an extremely risky and costly attempt to change the face of gaming. Another thing Turing is, is another performance bracket and price bracket above the 1080ti. Nothing more, nothing less. As for perf/dollar, we have now been completely stagnant for 3 years or more - and AMD has no real means to change that either.

You can have all sorts of opinions on that, but that does not change the landscape one bit, and a rebranded Polaris (a 2nd rebrand, mind you; where have we seen this before... oh yeah, AMD R9 - that worked out well!) won't either. AMD needs to bring an efficiency advantage to its architecture, and until then Nvidia can always undercut them on price because they simply need less silicon to do more, specifically in the midrange. Did you fail to realize that the midrange GPU offering realistically hasn't changed one bit with Turing's launch?

If anyone really thinks that an RX680 or whatever will win back the crowd to AMD, all you need to do is look at recent history and see how that is not the case. Yes, AMD sold many 480s when GPUs were scarce, expensive and mostly consumed by miners. In the meantime, DESPITE mining, AMD still lost market share to Nvidia. That's how great they sell... Look at the Steam survey and you see a considerably higher percentage of 1060s than RX480/580s. Look anywhere else with lots of data and you can see an overwhelming and crystal-clear majority for Nvidia versus AMD.

What I think is that while Nvidia may lose some market share and they may have miscalculated the reception of RTRT/RTX, the Turing move is still a conscious and smart move where they can only stand to gain influence and profit, simply because Pascal is still for sale. They cover the entire spectrum regardless. Considering that, you can also conclude that the 'Pascal stock' really was no accident at all. Nvidia consciously chose to keep that ace up its sleeve, in case Turing wasn't all they made it out to be. There is really no other option here; Nvidia isn't stupid. And I think that choice was made the very moment Nvidia knew Turing was going to get fabbed on 12nm. It had to be dialed back.

Nothing's changed, and until AMD gets a lean architecture and can fight for Nvidia's top spots again, this battle is already lost. Even if Turing costs 2000 bucks. The idiocy of stating you can compete with a midrange product needs to stop. It doesn't exist in GPUs. Today's midrange is tomorrow's shit performance; it has no future, it simply won't last.
I only mentioned RTX prices as a weakness, not Turing as a whole. In fact, I think NVIDIA has been pretty bold. The easy thing would have been to give users a slight refresh on the die, clocks and performance.

I agree with you for the most part; the idea here was to express that shared sentiment: c'mon AMD, do something, react, give users an option. I think they really can take advantage of the pricing issues with RTX, but it won't be easy to see big changes. Let's see what happens; I guess I'm being too optimistic here.

Thank you for sharing your thoughts; as I've said, they are a good portrait of the current situation.
#75
del42sa
"This new silicon is allegedly being built on TSMC's 12 nm process, something AMD did successfully with its Ryzen 2000 Series of CPUs. "

AFAIK Ryzen is built on GloFo's 12 nm process...