Wednesday, June 12th 2019

NVIDIA's SUPER Tease Rumored to Translate Into an Entire Lineup Shift Upwards for Turing

NVIDIA's SUPER teaser hasn't crystallized into anything physical as of now, but we know it's coming - NVIDIA themselves saw to it that our (singularly) collective minds would be buzzing about what that teaser meant, looking to steal some thunder from AMD's E3 showing. Now, industry chatter suggests that teaser is coalescing into something concrete: an entire lineup upgrade for Turing products, with NVIDIA pulling each chip up one rung of the performance ladder.

Apparently, NVIDIA will be looking to increase performance across the board by shuffling its chips down a tier whilst keeping the current pricing structure. This means that NVIDIA's TU106 chip, which powered the RTX 2070 graphics card, will now be powering the RTX 2060 SUPER (with a reported core count of 2176 CUDA cores). The TU104 chip, which powers the current RTX 2080, will in the meantime be powering the SUPER version of the RTX 2070 (a reported 2560 CUDA cores are expected to be onboard), and the TU102 chip which powered their top-of-the-line RTX 2080 Ti will be brought down to the RTX 2080 SUPER (specs place this at 8 GB GDDR6 VRAM and 3072 CUDA cores). This carves the way for an even more powerful SKU in the RTX 2080 Ti SUPER, which should be launched at a later date. Salty waters say the RTX 2080 Ti SUPER will feature an unlocked chip which could be allowed to convert up to 300 W into graphics horsepower, so that's something to keep an eye - and a power meter - on, for sure. Less defined talk suggests that NVIDIA will be introducing an RTX 2070 Ti SUPER equivalent with a new chip as well.
This means that NVIDIA will be increasing performance by an entire tier across its Turing lineup, thus bringing improved RTX performance to lower pricing brackets than could be achieved with the original 20-series lineup. Industry sources (independently verified) indicate that NVIDIA plans to announce - and perhaps introduce - some of its SUPER GPUs as soon as next week.

Should these new SKUs dethrone NVIDIA's current Turing series from their current pricing positions and increase performance across the board, AMD's Navi may find itself thrown into a chaotic market it was never meant for - the RX 5700 XT at $449 offers performance on par with or slightly ahead of NVIDIA's current RTX 2070, but the SUPER version seems to pack just enough extra cores to offset that performance difference and then some, whilst also offering raytracing.
Granted, NVIDIA's TU104 chip powering the RTX 2080 does feature a grand 545 mm² die area, whilst AMD's RX 5700 XT makes do with less than half that at 251 mm² - barring higher wafer pricing for the newer 7 nm process employed by AMD's Navi, this means that AMD's dies are cheaper to produce than NVIDIA's, so a price correction for AMD's lineup should be pretty straightforward whilst still allowing AMD to keep healthy margins.
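As a rough back-of-the-envelope illustration of what that die-area gap means per wafer, here is a small sketch using the common dies-per-wafer approximation for a 300 mm wafer. The wafer prices in it are purely illustrative placeholders (actual TSMC 12 nm and 7 nm wafer pricing is not public), and yield is ignored.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Common back-of-the-envelope estimate: wafer area divided by die area,
    minus an edge-loss term for partial dies along the rim."""
    radius = wafer_diameter_mm / 2
    return math.floor(
        math.pi * radius ** 2 / die_area_mm2
        - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    )

# Die areas are the ones cited above; wafer costs are illustrative assumptions only.
for name, area, wafer_cost in [("TU104 (12 nm)", 545, 6000),
                               ("Navi 10 (7 nm)", 251, 10000)]:
    dies = dies_per_wafer(area)
    print(f"{name}: ~{dies} candidate dies per wafer, "
          f"~${wafer_cost / dies:.0f} per die before yield losses")
```

Even with the assumed 7 nm wafer costing substantially more, the smaller die comes out ahead per unit, which is the point the area comparison is making.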
Sources: WCCFTech, Videocardz

126 Comments on NVIDIA's SUPER Tease Rumored to Translate Into an Entire Lineup Shift Upwards for Turing

#76
bug
XaledDo you have proof that "they cost just as much"?
Just don't eat anything Nvidia throws at you, like "our cards cost too much, that's why they are expensive" or the reason why they didn't move to 7nm. They are just dishonest, and not only not telling the truth but obviously and intentionally lying. (Yes, there is a huge difference between keeping silent and not saying anything, and lying.) Nvidia didn't move to 7nm because there was no NEED for it (yet). That is the one and only reason for that.
Is this proof enough for you? www.anandtech.com/show/14528/amd-announces-radeon-rx-5700-xt-rx-5700-series
Posted on Reply
#77
Unregistered
Guess we'll have to wait and see what happens. If they can get current 2080 Ti performance into a card that costs around $700, I'll buy two of them and corresponding waterblock kits to go with them. If they don't, then I'll keep waiting for Intel Xe & the Nvidia 3000 series.
Posted on Reply
#78
Mephis
SteevoDevalued by the same exact card now being a tier lower in pricing.

I bought an X1800XT and lo, the X1900XT displaced it and devalued it, so it lost a couple hundred bucks in resale value.

Did it still work? Yes.
Was it worth the cost? No.

It's about consumer value: if I buy a premium product, I would expect it to be valued as a premium product.

Just my opinion.
Did you buy the card as an investment? Or did you, like most everyone else, buy the card to play games or do some other kind of work?

If it is the first one, then I'm sorry. If it was the second one, then whether or not something else came out later doesn't affect your ability to use the card. You still get the same "value" out of the card as you did before the new one came out.
Posted on Reply
#79
danbert2000
It's interesting that Nvidia feels the need to come out with a whole new product stack instead of just dropping prices on their current offerings. Perhaps these SUPER cards are going to hit more cost-efficient configurations of the current chips. Or maybe Nvidia just felt like "new" would sell better than "now cheaper." Whatever the case, the whole GPU space is going to be a nightmare now. 1650, 1660, 1660 Ti, 2060, 2060 Super, 2070, 2070 Super, rumored 2070 Ti, 2080, 2080 Super, 2080 Ti, 2080 Ti Super. 12 GPUs not counting laptop parts. That's obscene. And without a bump in the product numbers, there are going to be so many confused people buying inferior cards. It's going to take a while for retailers to discount the "old" GPUs, and so I can guess some kids are going to be buying the non-SUPER parts for more.

I really hope Nvidia is flushing their non-SUPER chips out of the market, because 12 GPUs is just ridiculous.

EDIT: Forgot about the rumored 1650 Ti, so that's at least 13 GPUs to keep track of! Pray for @W1zzard, those GPU tests are not going to be fun...
Posted on Reply
#80
Xzibit
danbert2000I really hope Nvidia is flushing their non-SUPER chips out of the market, because 12 GPUs is just ridiculous.
Last quarter's report showed inventory turnover at 143 days; normal is 60-ish. It's been above 120 for three quarters. It will be interesting to see what effect adding a new lineup has.
Posted on Reply
#81
AceKingSuited
MephisHow exactly is the card they just bought devalued? Does it all of a sudden get less fps in games, now that a new card is out?

Of course, you can always wait and get something newer and faster. But that will always be true. Should people not buy Zen 2 processors, because Zen 3 will be coming down the road, not to mention Zen 4 or 5?
I think it's the timing and the high price at release. The Turing cards were released ~6 months ago at astronomical prices, and now a new card is coming that is faster and cheaper. At least with the Zen chips, people knew the schedule was about a year out.
Posted on Reply
#82
Vayra86
SteevoDevalued by the same exact card now being a tier lower in pricing.

I bought an X1800XT and lo, the X1900XT displaced it and devalued it, so it lost a couple hundred bucks in resale value.

Did it still work? Yes.
Was it worth the cost? No.

It's about consumer value: if I buy a premium product, I would expect it to be valued as a premium product.

Just my opinion.
But it is - the 2080ti is not getting the SUPER treatment as far as I can read...

So the 'premium' product is still up there on its number one halo spot. The rest is not 'premium' in the stack, an x80 is nothing special in the usual line of things. The price is special, sure .... :D
Posted on Reply
#83
64K
Vayra86But it is - the 2080ti is not getting the SUPER treatment as far as I can read...

So the 'premium' product is still up there on its number one halo spot. The rest is not 'premium' in the stack, an x80 is nothing special in the usual line of things. The price is special, sure .... :D
The 2080 Ti should get a SUPER release too, but not at the same time as the 2060, 2070, and 2080 Supers, according to the current rumors. I'm sure Nvidia wants to keep milking the Titan RTX for $2,500 for a little while longer.
Posted on Reply
#84
Cybrshrk
MephisHow exactly is the card they just bought devalued? Does it all of a sudden get less fps in games, now that a new card is out?

Of course, you can always wait and get something newer and faster. But that will always be true. Should people not buy Zen 2 processors, because Zen 3 will be coming down the road, not to mention Zen 4 or 5?
Value is directly tied to performance per dollar.

The value of every frame per second the card can deliver is now cut in half because the card is now worth half as much as it was worth before.

The value (ie what the card could be sold to someone else for) has now decreased and their "investment" is going to return half as much as it could have if they had sold it before these cards existed.

That's the "value" most would be upset about losing.
MephisDid you buy the card as an investment? Or did you, like most everyone else, buy the card to play games or do some other kind of work?

If it is the first one, then I'm sorry. If it was the second one, then whether or not something else came out later doesn't affect your ability to use the card. You still get the same "value" out of the card as you did before the new one came out.
Sorry, but for many it's actually a bit of both: we buy the card to play games or whatnot, but we also take into account the resale value of the card and how long it can last without losing too much of its value.

The process has allowed me to upgrade almost yearly and lose very little value in my parts over the years.

This move will be the first time I will most likely have to spend double what I normally pay to upgrade, due to the rapid devaluation of the current 20 series.

It is a big deal to some of us for more than "clout" reasons.
Posted on Reply
#85
efikkan
CybrshrkSorry, but for many it's actually a bit of both: we buy the card to play games or whatnot, but we also take into account the resale value of the card and how long it can last without losing too much of its value.

The process has allowed me to upgrade almost yearly and lose very little value in my parts over the years.
Upgrading graphics cards yearly is probably the most expensive thing you can do, unless you rely on buying during heavy discounts. Resell value is also very hard to predict; 1.5 years ago resell value was good due to the mining mania, lately it has been very bad, and it also depends on the region/country.

If you buy a new card at x dollars every year, and sell the old for 67% of the price (which is optimistic), after three years you've spent 1.67x the original card price (probably >2x if we account for shipping), for very minimal upgrades. In general you would be much better off buying one or two tiers up and keeping it for three years.
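To put that arithmetic in concrete terms, here is a minimal sketch of the model being described, assuming a fixed card price and a flat 67% resale rate, and ignoring shipping and discounts; the $500 figure is just an illustrative placeholder.

```python
def total_spend(card_price, years, resale_fraction=0.67):
    """Net cost of buying a new card every year and selling the previous one."""
    spend = card_price  # year one: buy, nothing to sell yet
    for _ in range(years - 1):
        spend += card_price - resale_fraction * card_price  # buy new, sell old
    return spend

price = 500  # illustrative card price in dollars
print(f"{total_spend(price, 3) / price:.2f}x the card price over three years")
# -> roughly 1.66x, matching the 1.67x figure above
```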
Posted on Reply
#86
Cybrshrk
efikkanUpgrading graphics cards yearly is probably the most expensive thing you can do, unless you rely on buying during heavy discounts. Resell value is also very hard to predict; 1.5 years ago resell value was good due to the mining mania, lately it has been very bad, and it also depends on the region/country.

If you buy a new card at x dollars every year, and sell the old for 67% of the price (which is optimistic), after three years you've spent 1.67x the original card price (probably >2x if we account for shipping), for very minimal upgrades. In general you would be much better off buying one or two tiers up and keeping it for three years.
Well, for one, it's not really yearly, but I tend to sell right before a new release while value is still high, using backup cards in the meantime or just going without gaming. And I'm not overspending on crazy cards, as I only need base cards since I watercool.

My last investment was about $1,500 for my two 1080 Tis. I sold them for $1,250 two years later, took that $1,250, got a base 2080 Ti, and watercooled it.

All in all, I only "spent" about $250 for two years' worth of top-end gaming.

Now I'll be lucky to clear $700, and then I'll be reinvesting another $1,200 or more, meaning I'll have spent over $500 for six months or less of similar top-end gaming.

You see the issue I'm facing? Not saying I necessarily would upgrade, but the "value" of my setup is absolutely trashed by this.
Posted on Reply
#87
moproblems99
CybrshrkYou see the issue I'm facing?
Drag that 6 months out to 12 and you will have doubled your return. The fact you sold your 1080 Tis basically for what you paid for them is a fluke and shouldn't be used as the basis for any other transactions.

Also, by purchasing a top end GPU yearly, you should never expect any sort of return and always expect to lose your ass.
Posted on Reply
#88
Cybrshrk
moproblems99Drag that 6 months out to 12 and you will have doubled your return. The fact you sold your 1080 Tis basically for what you paid for them is a fluke and shouldn't be used as the basis for any other transactions.

Also, by purchasing a top end GPU yearly, you should never expect any sort of return and always expect to lose your ass.
I've managed a similar loss on each generation before, if just a little worse than the $125 a year I lost on the 1080 Tis. And again, it's not exactly yearly: I owned 780 SLI, 980 Ti SLI, 1080 SLI, and 1080 Ti SLI setups, and each time I sold the previous cards right before their price fell, with a year or more between most of those. I did get the most out of the 1080 Ti setup for sure, but my losses were never as bad as they would be if I were to follow the same path now with the 2080 Ti.

The only thing that will force my hand to upgrade at this point is HDMI 2.1 support, so if these cards have it I will upgrade; if not, I'll hold out for the next gen. As someone with a new OLED TV that has HDMI 2.1 and no sources that currently support it, the new-gen consoles and a video card that offers it are the only things I'm looking forward to right now.
Posted on Reply
#89
medi01
Have you noticed an elephant in the room?
A whole tier bump for the same price - when was the last time you saw that from greedy green?

Something is going on... hm... what could it be, hehehe...
efikkanTSMC 7nm is at least twice as expensive per density, probably more, since the old "16/12nm" node has reached its full potential, while the 7nm node is still maturing.
It MUST BE. Else we'd need to call BS on the whole "pretty cool margins".
Posted on Reply
#91
bug
medi01Have you noticed an elephant in the room?
A whole tier bump for the same price - when was the last time you saw that from greedy green?
Save for an odd release here and there, I've only seen this on every single launch I care to remember.
Posted on Reply
#92
moproblems99
bugSave for an odd release here and there, I've only seen this on every single launch I care to remember.
The only difference is that this isn't really a 'release' in the true sense of the word. Same arch, same node. All they are doing is releasing what they probably planned on 6 months ago before AMD decided to show up to the potluck without a dish again.
Posted on Reply
#93
bug
moproblems99The only difference is that this isn't really a 'release' in the true sense of the word. Same arch, same node. All they are doing is releasing what they probably planned on 6 months ago before AMD decided to show up to the potluck without a dish again.
Eh, solid arguments are lost on medi, no need to bother with little details like these.
Posted on Reply
#94
efikkan
moproblems99The only difference is that this isn't really a 'release' in the true sense of the word. Same arch, same node. All they are doing is releasing what they probably planned on 6 months ago before AMD decided to show up to the potluck without a dish again.
It's a product refresh. It's at least a change in the binning, if not a new stepping.
Yields were initially a little "sub-optimal" leading Nvidia to create the "A" and "non-A" versions of the chips.
Posted on Reply
#95
Valantar
efikkanIt's a product refresh. It's at least a change in the binning, if not a new stepping.
Yields were initially a little "sub-optimal" leading Nvidia to create the "A" and "non-A" versions of the chips.
Given the maturity of TSMC 12nm at the time of launch, I guess that gives us a pretty good indication of just how troublesome the gigantic die sizes of Turing were - and seemingly still are, or have at least continued being for quite a while.
Posted on Reply
#96
bug
ValantarGiven the maturity of TSMC 12nm at the time of launch, I guess that gives us a pretty good indication of just how troublesome the gigantic die sizes of Turing were - and seemingly still are, or have at least continued being for quite a while.
It's the same trouble as producing any large die: you get fewer dies from a wafer. If you get 20 dies out of a wafer, losing a die is losing 5%. If you only get 10 dies, then one defective transistor will cost 10% instead. Imprinting the pattern is pretty much copy/paste, the limiting factor here is the fab's ability to keep defects to a minimum. And while 12nm was over a year old at the time we got Turing, I don't think that's long enough to get to the optimum output for a given process.
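For a toy illustration of how die size amplifies those losses, here is a quick sketch assuming a simple Poisson defect model; the defect density is an illustrative placeholder, not TSMC's actual number, and the die areas are the two cited in the article plus TU102's roughly 754 mm².

```python
import math

def poisson_yield(die_area_mm2, defects_per_cm2=0.1):
    """Fraction of defect-free dies under a simple Poisson defect model."""
    defects_per_die = defects_per_cm2 * die_area_mm2 / 100  # convert mm^2 to cm^2
    return math.exp(-defects_per_die)

# Navi 10, TU104, and (approximately) TU102 die areas in mm^2
for area in (251, 545, 754):
    print(f"{area} mm^2: ~{poisson_yield(area):.0%} defect-free dies")
```

With the same assumed defect density, the bigger die loses a much larger share of its wafer to defects, which is the scaling problem being described.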

Regardless, we're getting cheaper Turing now.
Posted on Reply
#97
Ruru
S.T.A.R.S.
bugRegardless, we're getting cheaper Turing now.
It's easy to release something cheaper when the whole lineup is hella expensive :D
Posted on Reply
#98
Valantar
bugIt's the same trouble as producing any large die: you get fewer dies from a wafer. If you get 20 dies out of a wafer, losing a die is losing 5%. If you only get 10 dies, then one defective transistor will cost 10% instead. Imprinting the pattern is pretty much copy/paste, the limiting factor here is the fab's ability to keep defects to a minimum. And while 12nm was over a year old at the time we got Turing, I don't think that's long enough to get to the optimum output for a given process.

Regardless, we're getting cheaper Turing now.
A year into volume production for a process node as similar to its predecessor as TSMC's 12nm, it really ought to be quite mature - but as you say, even a single defect can have big consequences when the die is big enough, and Nvidia is really pushing things in that regard with Turing. Even the RTX 2070 and 2060 dies are gigantic compared to previous products in the same market segment (note: not price segment). Kind of funny how people were shocked at the size of the Fury X die back in the day at 596 mm² on 28 nm, yet here Nvidia is pushing out significantly bigger dice than that on 12 nm, with "mid-range" dice more than 2/3 of that size. Times are changing, I suppose.
Posted on Reply
#99
bug
Chloe PriceIt's easy to release something cheaper when the whole lineup is hella expensive :D
I didn't say it was hard ;)
Hell, till I see benchmarks I'm not even sure we're back into sane territory.
Posted on Reply
#100
64K
I thought I saw something on the Videocardz site about price reductions on the Supers, but I can't find it now. Maybe I just imagined it. It wouldn't make much sense for Nvidia to lower the price below what the cards are selling for now, unless the stockholders are getting nervous about the RTX cards not selling as well as expected, according to Mr Huang.
Posted on Reply