Saturday, September 19th 2020

NVIDIA Readies RTX 3060 8GB and RTX 3080 20GB Models

A GIGABYTE webpage meant for redeeming the RTX 30-series Watch Dogs Legion + GeForce NOW bundle lists the graphics cards eligible for the offer, including a large selection based on unannounced RTX 30-series GPUs. Among these are references to a "GeForce RTX 3060" with 8 GB of memory and, more interestingly, a 20 GB variant of the RTX 3080. The list also confirms the RTX 3070S with 16 GB of memory.

The RTX 3080, launched last week, comes with 10 GB of memory across a 320-bit memory interface using 8 Gbit memory chips, while the RTX 3090 reaches its 24 GB capacity by piggy-backing two of these chips per 32-bit channel (chips on either side of the PCB). It's conceivable that the RTX 3080 20 GB will adopt the same method. There is a vast price gap between the RTX 3080 10 GB and the RTX 3090, which NVIDIA could look to fill with the 20 GB variant of the RTX 3080. Whether you should wait for the 20 GB variant of the RTX 3080 or pick up the 10 GB variant right now will depend on the performance gap between the RTX 3080 and RTX 3090. We'll answer this question next week.
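For reference, here is that memory arithmetic spelled out as a minimal Python sketch. The bus widths and the 8 Gbit (1 GB) GDDR6X chip density are the publicly known figures; the 20 GB configuration is, of course, still speculation:

def vram_gb(bus_width_bits, chips_per_channel, chip_density_gbit=8):
    # Each 32-bit channel carries one memory chip, or two in a clamshell
    # layout (chips on both sides of the PCB). 8 Gbit = 1 GB per chip.
    channels = bus_width_bits // 32
    return channels * chips_per_channel * chip_density_gbit // 8

print(vram_gb(320, 1))  # RTX 3080: 10 GB
print(vram_gb(384, 2))  # RTX 3090: 24 GB
print(vram_gb(320, 2))  # rumored RTX 3080 20 GB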
Source: VideoCardz

157 Comments on NVIDIA Readies RTX 3060 8GB and RTX 3080 20GB Models

#126
Valantar
BoboOOZNvidia doesn't panic, but they plan ahead and take competition very seriously. That's why they win so often, even sometimes when they don't have the best performance or the best price-performance ratio. They rarely leave themselves open and that's how any well-organized company should be.
Well, the launch of the Super SKUs in the last generation kind of contradicts that - that was a pure reaction to AMD and clearly not one that was planned out to any significant degree beforehand. Had it been, they wouldn't have made such a mess of their lineup (some Supers replacing older SKUs, others coexisting with them, etc.) while torpedoing the value of their previous SKUs. Holding a dominant market position carries a certain momentum with it, which is far more likely the reason for Nvidia's continued success in the (relatively few) situations where they have been notably behind. Most customers are poorly informed and not quite rational, so brand recognition and customer trust (especially when coupled with huge marketing budgets) go quite far even when the competition has a superior product.
Posted on Reply
#127
BoboOOZ
ValantarWell, the launch of the Super SKUs in the last generation kind of contradicts that - that was a pure reaction to AMD and clearly not one that was planned out to any significant degree beforehand. Had it been, they wouldn't have made such a mess of their lineup (some Supers replacing older SKUs, others coexisting with them, etc.) while torpedoing the value of their previous SKUs. Holding a dominant market position carries a certain momentum with it, which is far more likely the reason for Nvidia's continued success in the (relatively few) situations where they have been notably behind. Most customers are poorly informed and not quite rational, so brand recognition and customer trust (especially when coupled with huge marketing budgets) go quite far even when the competition has a superior product.
I'm sure you agree with me that they had planned these new higher-VRAM SKUs exactly so that they don't need another short-notice reaction this time.
And for the last time, there was no way to launch new SKUs without hurting the sales of the older ones; it's natural. As soon as the 3080 20GB comes out, the 10GB version will be much less desirable, that's how it goes.
And as for their lineup being confusing, it may have been part of their strategy, or maybe they don't think it is important. Just look at their mobile lineup. There was no pressure from AMD there, and it's still all over the place with Super and Max-Q SKUs where you have no idea what to expect in terms of performance if you haven't watched 10 comparative reviews.
Posted on Reply
#128
londiste
BoboOOZIn this case, 12GB would've been great for the 3080, but potentially the bandwidth would've been the same as the 3090, which would have not created enough segmentation.
12GB vs 10GB would probably not be a big enough difference to matter in this context. Especially when the rumored competition has 16GB.
Posted on Reply
#129
EarthDog
ValantarWell, the launch of the Super SKUs in the last generation kind of contradicts that - that was a pure reaction to AMD and clearly not one that was planned out to any significant degree beforehand. Had it been, they wouldn't have made such a mess of their lineup (some Supers replacing older SKUs, others coexisting with them, etc.) while torpedoing the value of their previous SKUs.
Sorry, I disagree here. I follow the logic, but sometimes it isn't that easy...

Both Navi and (some) Super cards were launched in July 2019, right? If this was a response, shouldn't it have come after the Navi release? How are the Super cards neutered... by software or laser? You can't enable more bits, so they seem laser-cut. While rumors abound, if this was a response and not something planned, I find it difficult to believe they can software- or laser-cut to neuter the dies and get them out that fast. It wasn't like they didn't know AMD would be coming out with something. They didn't read a rumor on wccftech and suddenly start looking for ways to get a more competitive product out there.

So yes, the timing was likely in response, sure, but to think these were not planned beforehand feels a bit myopic to me. I'd bet my life Super cards would have come out regardless of AMD.

Did I miss something?
Posted on Reply
#130
BoboOOZ
londiste12GB vs 10GB would probably not be a big enough difference to matter in this context. Especially when the rumored competition has 16GB.
12GB would have been enough that they could have turned RT on in Battlefield 5 in the Digital Foundry comparison.
Frankly, I think that this leak (the fact that they are preparing a 20GB version) hurts their image much more than the fact that the competition has 16GB. 12GB would've been enough for 4 years at 4k.
EarthDogWhile rumors abound, if this was a response and not something planned, I find it difficult to believe they can software- or laser-cut to neuter the dies and get them out that fast. It wasn't like they didn't know AMD would be coming out with something.
Well, it's almost impossible to get solid proof of this, but from what I have heard from people in the industry, they are an extremely agile company, capable of taking a decision in a matter of days and implementing it in a matter of weeks. For the Supers, they managed to get info about the performance and the pricing of the 5700XT, and I would argue that their response, albeit rushed, was much less botched than that of AMD, who had to apply a last-minute overclock on the card and drop its price, which led to problems with the card being too hot, loud and power-hungry.
That is probably why AMD has finally learned their lesson and there are almost no leaks coming out this time.
Posted on Reply
#131
EarthDog
Oh yes, they do seem a bit more agile, but I highly doubt they had hard information in time to respond with the Supers. The supers were easily a twinkle in their eye long before AMD released Navi. I do imagine Navi sped up the release time frame for these cards, a response if you will, but these don't come out in weeks.
Posted on Reply
#132
medi01
BoboOOZFor the Supers, they managed to get info about the performance and the pricing of the 5700XT, and I would argue that their response, albeit rushed, was much less botched than that of AMD, who had to apply a last-minute overclock on the card and drop its price, which led to problems with the card being too hot, loud and power-hungry.
Lolwhat, 5700XT is power hungry?
Posted on Reply
#133
BoboOOZ
medi01Lolwhat, 5700XT is power hungry?
Well, not anymore :p .

Loudness and temperatures are much more disturbing; users will only start complaining about their 3080s next summer, since all the cooling solutions seem to be good quality.
Posted on Reply
#134
medi01
BoboOOZWell, not anymore :p .
Nor in the past.
AMD forced NV to respond with the Supers.
The 5700 series still sold beautifully despite being rather pricey by AMD's standards.
Posted on Reply
#135
Valantar
BoboOOZI'm sure you agree with me that they had planned these new higher-VRAM SKUs exactly so that they don't need another short-notice reaction this time.
And for the last time, there was no way to launch new SKUs without hurting the sales of the older ones; it's natural. As soon as the 3080 20GB comes out, the 10GB version will be much less desirable, that's how it goes.
And as for their lineup being confusing, it may have been part of their strategy, or maybe they don't think it is important. Just look at their mobile lineup. There was no pressure from AMD there, and it's still all over the place with Super and Max-Q SKUs where you have no idea what to expect in terms of performance if you haven't watched 10 comparative reviews.
I don't doubt they had higher VRAM SKUs planned all along (it's way too soon for those to be showing up now if it was a late addition), but they've likely been kept "secret" as Nvidia don't want to hurt sales of their current and upcoming cards - availability of 2GB GDDR6X chips isn't expected until 2021, after all. It's still pretty weird though, as they're leaving themselves with the choice of either cluttering up the lineup like last time, and thus pissing people off, or having higher VRAM SKUs replace the initial ones, pissing off early buyers. Either way it'll be a weird mess.
EarthDogSorry, I disagree here. I follow the logic, but sometimes it isn't that easy...

Both Navi and (some) Super cards were launched in July 2019, right? If this was a response, shouldn't it have come after the Navi release? How are the Super cards neutered... by software or laser? You can't enable more bits, so they seem laser-cut. While rumors abound, if this was a response and not something planned, I find it difficult to believe they can software- or laser-cut to neuter the dies and get them out that fast. It wasn't like they didn't know AMD would be coming out with something. They didn't read a rumor on wccftech and suddenly start looking for ways to get a more competitive product out there.

So yes, the timing was likely in response, sure, but to think these were not planned beforehand feels a bit myopic to me. I'd bet my life Super cards would have come out regardless of AMD.
That depends entirely on how you define "response". Your definition here seems to be a literalist one, i.e. that to be a response it must arrive after and have been thought out after the arrival of what it responds to. IMO that ignores the timescales and predictability of the GPU market, where it is entirely possible to... what should we call it, "preemptively respond"(?) to something. Nvidia clearly knew AMD had new GPUs coming, and that they were going to be competitive in their market segments. They also clearly had planned how to deliver such a response, and did so early in the hopes of showing themselves as having initiative and not simply being reactive. However, the positioning, performance and featuresets of the GPUs in question contradict this, as it is obvious that the Super lineup was in no way planned from the launch of Turing - if that was the case, they wouldn't have ended up with the confusing mess of a lineup they did (2060, 2060S, 2070, 2070S, 2080, 2080S, 2080 Ti), with confusion about which SKUs were discontinued and which weren't, etc. My impression is that Nvidia wanted to demonstratively preempt AMD's launch while also seeing an opportunity to sell lower binned (partially disabled) dice that they previously had no use for, letting them allocate fully enabled chips entirely to higher margin enterprise products. This also speaks to the possibility of there being worse yields of Turing than initially planned, as a pure price cut would otherwise have made more sense, though there's also an argument here that Nvidia didn't want to establish a precedent for a $499 RTX xx80 series. Either way, even if Nvidia was early it was clearly a response from their side. Was it planned months before? Obviously. Was it part of their roadmap for Turing all along? I highly doubt that. So was it a response to AMD becoming more competitive in these market segments, even if AMD's GPUs weren't out yet? Absolutely.
Posted on Reply
#136
EarthDog
ValantarThat depends entirely on how you define "response". Your definition here seems to be a literalist one, i.e. that to be a response it must arrive after and have been thought out after the arrival of what it responds to. IMO that ignores the timescales and predictability of the GPU market, where it is entirely possible to... what should we call it, "preemptively respond"(?) to something. Nvidia clearly knew AMD had new GPUs coming, and that they were going to be competitive in their market segments. They also clearly had planned how to deliver such a response, and did so early in the hopes of showing themselves as having initiative and not simply being reactive. However, the positioning, performance and featuresets of the GPUs in question contradict this, as it is obvious that the Super lineup was in no way planned from the launch of Turing - if that was the case, they wouldn't have ended up with the confusing mess of a lineup they did (2060, 2060S, 2070, 2070S, 2080, 2080S, 2080 Ti), with confusion about which SKUs were discontinued and which weren't, etc. My impression is that Nvidia wanted to demonstratively preempt AMD's launch while also seeing an opportunity to sell lower binned (partially disabled) dice that they previously had no use for, letting them allocate fully enabled chips entirely to higher margin enterprise products. This also speaks to the possibility of there being worse yields of Turing than initially planned, as a pure price cut would otherwise have made more sense, though there's also an argument here that Nvidia didn't want to establish a precedent for a $499 RTX xx80 series. Either way, even if Nvidia was early it was clearly a response from their side. Was it planned months before? Obviously. Was it part of their roadmap for Turing all along? I highly doubt that. So was it a response to AMD becoming more competitive in these market segments, even if AMD's GPUs weren't out yet? Absolutely.
You're basing your opinion on a lot of assumptions. I don't have the time to go into a diatribe about the whole thing, but I can tell you these were more than a twinkle in their eye. IDGAH(oot) about naming conventions.... correlation is not causation. Again, they just can't go, 'oh shit, AMD results', and then suddenly respond and get a product to market WITH the new AMD cards...regardless of bins/full chips, etc etc.

Nvidia came out with cards and price points that fit the market at the time. Knowing AMD would respond eventually, surely they had something being cooked up.
ValantarWas it planned months before? Obviously. Was it part of their roadmap for Turing all along? I highly doubt that.
Doubt it all you want... video cards aren't pulled out of asses (just Jensen's oven... :p). The Supers were all a part of things, surely.

We'll agree to disagree and move forward. ;)
Posted on Reply
#137
BoboOOZ
ValantarMy impression is that Nvidia wanted to demonstratively preempt AMD's launch while also seeing an opportunity to sell lower binned (partially disabled) dice that they previously had no use for, letting them allocate fully enabled chips entirely to higher margin enterprise products. This also speaks to the possibility of there being worse yields of Turing than initially planned, as a pure price cut would otherwise have made more sense, though there's also an argument here that Nvidia didn't want to establish a precedent for a $499 RTX xx80 series.
My opinion is that Nvidia tries to avoid straight price cuts; they either try to offer better performance at the same price, or, when doing a price cut, they justify it by removing some functionality to save face (like with the 2060 KO). That's definitely good marketing.
Posted on Reply
#138
EarthDog
BoboOOZMy opinion is that Nvidia tries to avoid straight price cuts; they either try to offer better performance at the same price, or, when doing a price cut, they justify it by removing some functionality to save face (like with the 2060 KO). That's definitely good marketing.
The 2060 KO wasn't an Nvidia product IIRC, it was Evga(?).
Posted on Reply
#139
Valantar
EarthDogYou're basing your opinion on a lot of assumptions. I don't have the time to go into a diatribe about the whole thing, but I can tell you these were more than a twinkle in their eye. IDGAH(oot) about naming conventions.... correlation is not causation. Again, they just can't go, 'oh shit, AMD results', and then suddenly respond and get a product to market WITH the new AMD cards.

Nvidia came out with cards and price points that fit the market. Knowing AMD would respond eventually, surely they had something being cooked up.
Doubt it all you want... video cards aren't pulled out of asses (just Jensen's oven... :p). The Supers were all a part of things, surely.

We'll agree to disagree and move forward. ;)
But I didn't say that. I explicitly said that they were planned, but planned as a response. AMD had been singing Navi's praises long before it arrived, and Nvidia obviously has as well placed sources within the industry as anyone else. So as I said, I think they saw an opportunity to shuffle their product stack, replace expensive fully enabled SKUs with cheaper cut-down ones, while responding to AMD on price and by not seeming to have stagnated. Do I believe Nvidia would have made a mid-cycle Turing refresh without Navi looming? They probably would have, but I sincerely doubt it would have looked anything like the Super lineup we came to know - Nvidia's preferred way of doing these things is to add higher priced, higher performance tiers, not cut prices outright. When was the last time Nvidia explicitly cut the price of anything?

Edit: nice ninja edit btw ;)
Posted on Reply
#140
EarthDog
ValantarBut I didn't say that. I explicitly said that they were planned, but planned as a response. AMD had been singing Navi's praises long before it arrived, and Nvidia obviously has as well placed sources within the industry as anyone else. So as I said, I think they saw an opportunity to shuffle their product stack, replace expensive fully enabled SKUs with cheaper cut-down ones, while responding to AMD on price and by not seeming to have stagnated. Do I believe Nvidia would have made a mid-cycle Turing refresh without Navi looming? They probably would have, but I sincerely doubt it would have looked anything like the Super lineup we came to know - Nvidia's preferred way of doing these things is to add higher priced, higher performance tiers, not cut prices outright. When was the last time Nvidia explicitly cut the price of anything?

Edit: nice ninja edit btw ;)
Yes, you said 'planned to any significant degree'. I disagree with that assertion...simple. I can't buy the assumptions you're selling and we're talking in grey areas, so we'll agree to disagree and move forward. :)

RE: The ninja edit... what's your point? It changed nothing. :rolleyes:
Posted on Reply
#141
BoboOOZ
EarthDogThe 2060 KO wasn't an Nvidia product IIRC, it was Evga(?).
It's a collaboration; the chip on the 2060 KO is a special chip provided by Nvidia. It's a cut-down version of a larger die but maintains some hardware from the larger die, such that encoding performance is better than on non-KO 2060s.
So basically, although it's got more performance, it's offered at a lower price. It was launched around the launch of the 5600XT, managed to steal the thunder of the AMD card, and many reviewers recommended the KO.
So no, this is not an Evga move; it's just another brilliant marketing move from Nvidia, made in collaboration with a trusted partner.
Posted on Reply
#142
EarthDog
BoboOOZIt's a collaboration; the chip on the 2060 KO is a special chip provided by Nvidia. It's a cut-down version of a larger die but maintains some hardware from the larger die, such that encoding performance is better than on non-KO 2060s.
So basically, although it's got more performance, it's offered at a lower price. It was launched around the launch of the 5600XT, managed to steal the thunder of the AMD card, and many reviewers recommended the KO.
So no, this is not an Evga move; it's just another brilliant marketing move from Nvidia, made in collaboration with a trusted partner.
Sorry, yes... they of course had to work with Nvidia on it. Question though... do any other board partners have the KO silicon (I don't know)?
Posted on Reply
#143
BoboOOZ
EarthDogSorry, yes... they of course had to work with Nvidia on it. Question though... do any other board partners have the KO silicon (I don't know)?
It's exclusive to Evga, indeed.
Posted on Reply
#144
Valantar
EarthDogYes, you said 'planned to any significant degree'. I disagree with that assertion...simple. I can't buy the assumptions you're selling and we're talking in grey areas, so we'll agree to disagree and move forward. :)

RE: The ninja edit... what's your point? It changed nothing. :rolleyes:
Yes, that seems to be the most productive approach at this point. My point about the ninja edit was that I saw that piece of writing for the first time quoted in my own response after posting it, which kind of undermined the message of your edit :P Just made me chuckle, that's all.
Posted on Reply
#145
R0H1T
BoboOOZMy opinion is that Nvidia tries to avoid straight price cuts; they either try to offer better performance at the same price, or, when doing a price cut, they justify it by removing some functionality to save face (like with the 2060 KO). That's definitely good marketing.
Nope, that's Intel 101 & part of the reason why I more often than not dislike Nvidia, much like Intel or indeed Apple. Planned obsolescence!
Posted on Reply
#146
BoboOOZ
R0H1TNope, that's Intel 101 & part of the reason why I more often than not dislike Nvidia, much like Intel or indeed Apple. Planned obsolescence!
I understand where you're coming from, but a large company cannot succeed only with good engineers. Good leadership, marketing, sales, and lawyers are all required for success; such are the rules of the market. Engineering is only one part of the equation.
And as for the alleged immoral or anti-consumer practices, the companies that you are talking about are doing very well, which means they are doing what they should be doing. It's AMD who has to improve. You either adapt to the market, or you disappear.
Posted on Reply
#147
Valantar
BoboOOZI understand where you're coming from, but a large company cannot succeed only with good engineers. Good leadership, marketing, sales, and lawyers are all required for success; such are the rules of the market. Engineering is only one part of the equation.
And as for the alleged immoral or anti-consumer practices, the companies that you are talking about are doing very well, which means they are doing what they should be doing. It's AMD who has to improve. You either adapt to the market, or you disappear.
... that's a downright shockingly naive stance. "They are doing well, which means they are doing what they should be doing" - so anticompetitive practices are "what you should be doing" as long as you get away with it? The only thing that matters is that the company is successful, no matter how they come about this success? Yeah, you really ought to rethink that statement with a bit more context taken into account.
Posted on Reply
#148
BoboOOZ
Valantar... that's a downright shockingly naive stance. "They are doing well, which means they are doing what they should be doing" - so anticompetitive practices are "what you should be doing" as long as you get away with it? The only thing that matters is that the company is successful, no matter how they come about this success? Yeah, you really ought to rethink that statement with a bit more context taken into account.
Naive? I'd call it realistic, maybe cynical, but certainly not naive. It's simply an amoral, evolutionary perspective. People vote with their wallets, and it's very obvious that lots of people vote for Apple, Intel, and Nvidia, in spite of their anticompetitive history.
I'm not saying I agree with it, and I certainly vote with my wallet as my conscience directs me, but that doesn't change reality: most people are not that disciplined or knowledgeable, or they simply do not care about these issues.
The only things that really stop companies from behaving very badly are existing laws and market acceptance. Even existing laws are not a hard line, because in many cases it is more profitable for these companies to break the rules and drag it out in court instead of simply obeying them. Both Nvidia and Intel did this in the past (I don't know much about Apple), and it allowed them to stomp some of their competitors. This is the reality, and understanding it is not naive; quite the contrary.
Posted on Reply
#149
Vayra86
efikkanAny time data needs to be swapped from system memory etc., there will be a penalty, there's no doubt about that. It's a latency issue; no amount of bandwidth for PCIe or SSDs will solve this. So you're right so far.

Games have basically two ways of managing resources:
- No active management - everything is allocated during loading (still fairly common). The driver may swap if needed.
- Resource streaming


This is where I have a problem with your reasoning: where is the evidence of this GPU being unbalanced?
The 3080 is two generations newer than the 1080; it has 2 GB more VRAM, more advanced compression, more cache, and a more advanced design which may utilize the VRAM more efficiently. Where is your technical argument for this being less balanced?
I'll say the truth is in benchmarking, not in anecdotes about how much VRAM "feels right". :rolleyes:
If you revisit GPUs from different time frames with new games, you can spot where some of those fall off faster than others. Sometimes that is even down to the memory wiring, such as with the 970, which lost more frames than it should have for a 4GB GPU. It just drowns earlier; similarly, the 7970 held its own for so long with 3GB and hefty bandwidth. Maybe I'm wrong, and yes, it's about gut feeling more than anything. Because really that's all we have looking forward in time. GPU history is not without design and balancing failures, we all know this.

In the end, the success or failure of advances in GPUs will come down to how devs (get to) utilize them, whether or not they understand them, and whether or not it's workable. I'm seeing the weight move from tried and trusted VRAM capacity to different areas. Not all devs are going to keep up. In that sense it's sort of a good sign that Nvidia's system for Ampere is trying to mimic next-gen console features, but still. This is going to be maintenance-intensive for sure.

I'll shut up about it now, I've had enough attention for my thoughts on the subject ;)
Posted on Reply
#150
lexluthermiester
Vayra86I'll shut up about it now, I've had enough attention for my thoughts on the subject ;)
Right there with you. This thread has turned into the same tired set of arguments from years past...
Posted on Reply