Friday, February 22nd 2019
AMD Partners Cut Pricing of Radeon RX Vega 56 to Preempt GeForce GTX 1660 Ti
AMD cut pricing of the Radeon RX Vega 56 in select markets to preempt the GeForce GTX 1660 Ti and help the market digest inventory. The card can be had for as little as €269 (including VAT) for an MSI RX Vega 56 Air Boost, which is a close-to-reference product. The GTX 1660 Ti reportedly has a starting price of $279.99 (excluding taxes). This development is significant given that the GTX 1660 Ti is rumored to perform on par with the GTX 1070, which the RX Vega 56 outperforms. The RX Vega series is still very much a part of AMD's product stack, and AMD continues to release new game optimizations for the card. NVIDIA is expected to launch the GeForce GTX 1660 Ti before the end of February. Although based on the "Turing" architecture, it lacks the real-time ray tracing and AI acceleration features, yet retains the new generation's increased CUDA core IPC.
120 Comments on AMD Partners Cut Pricing of Radeon RX Vega 56 to Preempt GeForce GTX 1660 Ti
Vega (14 nm) is practically "unsalable" at this point, so I don't expect them to do more production runs of these chips, even though the bigger Navi chip might be many months away. Similar price dumps could be expected if they sell too slowly.
Comparing the two cards, the Vega 56 and the 1660 Ti, the Vega 56 wins out hardware-wise any day.
It's a compute card and a gaming card in one.
After all, Turing was never meant for 12 nm, so I'd expect 7 nm+ GPUs this year.
But you can't ignore the technological improvements of the new architecture, like the "small" optimization of splitting the FP32 and INT units in the shader processors. That gives some nice improvements in games whose shaders rely heavily on integer math, which more and more games do (see the toy sketch below).

From TSMC or Samsung?
TSMC 7nm+(EUV) are starting tapeouts now, so finished chips can theoretically arrive very late 2019 or early 2020. I believe Samsung is about the same or a little later.
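On that FP32/INT split: here's a toy back-of-the-envelope model of why a dedicated integer pipe helps shaders that mix float and integer math. It's purely illustrative, not how an SM actually schedules work, and the ~36-INT-per-100-FP instruction mix is an assumption (roughly the figure NVIDIA has cited for modern games).

```python
# Toy throughput model (illustrative only) of a shared ALU pipe vs. a
# split FP32/INT design. Real GPUs are far messier (dependencies, memory
# stalls, issue-port limits), so treat this as a back-of-the-envelope.

def issue_slots_shared(fp_ops: int, int_ops: int) -> int:
    """Every op, FP or INT, occupies a slot on the one shared pipe."""
    return fp_ops + int_ops

def issue_slots_split(fp_ops: int, int_ops: int) -> int:
    """INT ops co-issue alongside FP ops; the longer stream dominates."""
    return max(fp_ops, int_ops)

# Assumed instruction mix: ~36 integer ops per 100 FP ops in a shader.
fp, integer = 100, 36
shared = issue_slots_shared(fp, integer)  # 136 slots
split = issue_slots_split(fp, integer)    # 100 slots
print(f"shared pipe: {shared} slots, split pipes: {split} slots")
print(f"ideal-case speedup: {shared / split:.2f}x")  # ~1.36x
```

In this ideal case the integer work rides along for free, which is where the "nice improvements in integer-heavy games" come from; in practice the gain depends on how integer-heavy the shader actually is.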
I still haven't heard anything about a new consumer lineup from Nvidia this year, and I don't expect one, but something limited like a new Titan based on a potential new Tesla lineup is entirely possible.
The main reasons for Turing's poor sales are a couple of things. Firstly, a higher-than-normal overstock of old cards on discount led to an undeserved negative response from reviewers; the RTX 2070 and RTX 2080 are not that badly priced, but look bad vs. Pascal on discount. Secondly, the availability of the new cards has been a little low.
It's not the RX 480. They can't just push the clocks and recycle it, unless they make Navi just another GCN refresh on the old node.

This is based on...? Do we have some official data?
The 2000 series are high-end cards, and they're meant to sell in relatively low volume. But from what I've heard, sales are fine in the target group. Even looking at this forum, you can see many people decided to replace their 1080. Because RTX cards are just for the high end at this point. Nvidia released the 2060 first, hoping some people would increase their budget by those $50. It's like smartphone manufacturers who make the flagship phones a bit more expensive with every generation.
Now it's time for the mainstream cards.

"If you tell a lie a thousand times"...
Furthermore, the only performance uplift is found in the 2080 Ti. The rest of the stack is identical to Pascal in terms of raw performance, and costs as much or more, except for the 2060. This isn't progress; it's just Nvidia pushing an overpriced Titan variant to the market when it comes to performance. Been there, done that: it was never a good buy, and it isn't today.
Realistically, what we're looking at since Pascal's release is complete stagnation for the overwhelming majority of the target audience.
Rough estimates from TechPowerUp's reviews:
GTX 1060 => RTX 2060: +~63% (1440p), +~60% (1080p)
GTX 1070 => RTX 2070: +~44% (4K), +~41% (1440p)
GTX 1080 => RTX 2080: +~43% (4K), +~41% (1440p)
GTX 1080 Ti => RTX 2080 Ti: +~38% (4K), +~33% (1440p)
If only the shoe were on the other foot, you would have praised Turing.
Radeon VII 16 GB (February 2019, 7 nm) => GTX 1080 Ti (March 2017, 16 nm): -8% (1080p), -5% (1440p), ±0% (4K)
1070 ($379) >> 2070 ($499)
Should I continue? ... This isn't rocket science. Perf/dollar is essentially stagnant from Pascal to Turing, and the only interesting release is in the midrange, with performance that has been available for over two years now.
There is absolutely nothing to praise here, and this has been clear as day since this generation was announced. Nvidia does its usual twisting and bending of the numbers to extract more cash from a market that has been starving for upgrades for years; the question is whether you fall for it, or use common sense and hard numbers that factor in price. The reality is, if you look at TPU reviews, barely anything has changed across the board in terms of realistic purchase options: the stack got updated and we barely gained anything. A few 5-8% perf jumps at the same price point are hopeless (that is what AMD managed with just driver releases over Pascal's lifetime, for some perspective), especially with twice as much time between releases as usual.
But let's drive this home anyway, because apparently I'm saying strange things to you:
970 >> 1070: +40% perf, $329 vs $379 (the usual generational +30% got a little bonus, and it costs us 50 bucks extra; reasonable!)
980 >> 1080: +51% perf, $549 vs $599 (same deal, bigger jump)
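To make that concrete, here's a quick back-of-the-envelope script using nothing but the launch MSRPs and the rough uplift percentages quoted in this thread (a sanity check on the quoted figures, not a fresh benchmark):

```python
# Perf/dollar change per generation, computed from the MSRPs and the
# approximate TPU-based uplift figures quoted above in this thread.

def perf_per_dollar_gain(uplift_pct: float, old_price: float, new_price: float) -> float:
    """Relative perf/dollar change of the new card vs. the old one."""
    return (1 + uplift_pct / 100) * old_price / new_price - 1

cases = {
    "970  >> 1070": (40, 329, 379),
    "980  >> 1080": (51, 549, 599),
    "1070 >> 2070": (44, 379, 499),
}
for name, (uplift, old, new) in cases.items():
    print(f"{name}: {perf_per_dollar_gain(uplift, old, new):+.0%} perf/dollar")
# 970  >> 1070: +22% perf/dollar
# 980  >> 1080: +38% perf/dollar
# 1070 >> 2070: +9% perf/dollar
```

By these numbers, Pascal bought you roughly 20-40% more performance per dollar over Maxwell, while the 2070 buys you single digits over the 1070.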
Yeah, business as usual hm? :p My god... I thought we had smart people here. Titan was faster too back in the Kepler days, but nobody told you it was an interesting GPU to buy for gaming.
I think it's about time people cooled down their upgrade itch a bit and looked at the hard numbers, most notably the one with the dollar sign in front. There is little to be gained here for a high-end Pascal user, which most of us are.
Note: this does NOT mean there is a hard upper limit to the cost of a GPU. A good example is the 1080 Ti; lots of those have been sold in spite of a pretty hefty price tag, because it was simply great price/perf. And this is where Turing (most of it, anyway) falls flat on its face.
Price is a key factor, there is no denying it. If it weren't, we'd have had our ray tracing decades ago, and a computer farm in the basement to run it.
The only thing I was trying to correct was when you said "I think its about time people cooled down their upgrade itch a bit and look at the hard numbers". Then you also said "Cost is a thing to 95% or more of the target market, its too big to ignore."
The 95% is debatable, but then again, not everybody lives in a first-world country. My point was, there aren't many who need to "cool down their upgrade itch". If you're already price-conscious, you weren't planning to upgrade anyway. That's all. I hope that makes sense now.
How can you say the 2070 and 2080 are not badly priced? Haha, the 2080 took over the 1080 Ti's pricing while performing the same.
I'll be buying Nvidia GPUs again when 7 nm chips are out.
But good job greentwisting reality.
Another great excuse is how expensive the new process node is.
I'm also not presenting excuses, quite the opposite, in fact. Do you even recognize an ally when he drops a care package at your door, or are you simply blinded by anger?