
NVIDIA Plans GeForce RTX 4060 Launch for Summer 2023, Performance Rivaling RTX 3070

So you're saying Nvidia could potentially exploit something so we are going to assume they are evil?
Where is the evidence of Nvidia holding back the lower SKUs? (beyond nonsense from some YouTube channels)
It's normal that lower chips follow in sequence. I thought people would remember this by now.


Renaming a product due to market backlash? How is this relevant to your claims?


GTX 980 was the top model for about half a year, and it remained in the high-end segment until it was succeeded by Pascal.


The mid-range cards of the 600-series used both GK106 and GK104 chips.
The 600-series was "short lived" compared to the current release tempo. Back then Nvidia used to release a full generation and a refreshed generation (with new silicon) every ~1.25-1.5 years or so.
Geforce GTX 480 was delayed due to at least three extra steppings.

And back in the 400-series they used a GF100 chip in the GTX 465, which scaled terribly.
You should spend some time looking through the List of Nvidia GPUs. The naming is arbitrary; in one generation a 06 chip is the lowest, in others the 08 chip is. What they do is design the biggest chip in the family first, then "cut down" the design into as many chips as they want to, and name them accordingly; 0, 2, 3, 4, 6, 7, 8. Sometimes they even make it more complicated by making 110, 114, etc., which seem like minor revisions of 100 and 104 respectively.
So listen and learn, or keep digging…



This might be your impression, but it doesn't match reality. Back in the ATI days, they used to offer higher value in the upper mid-range to lower high-end segments, but since then they have been all over the place.
The Fury cards didn't start things off well: low availability and high price. They were followed by the RX 480/580, which were very hard to come by at a good price, compared to the competing GTX 1060, which sold in massive volumes and was still widely available, even below MSRP at times. The RX Vega series was even worse; most have now forgotten that the $400/$500 price tag initially came with a game bundle, and it took months before they were somewhat available close to that price. Over the past 5+ years, AMD's supplies have been too low. Quite often the cheaper models people want are out of stock, while Nvidia's counterparts usually are in stock. This is why I said AMD needs to have plenty of supply to gain market share.

We need to stop painting Nvidia/AMD/(Intel) as villains or heroes. They are not our friends, they are companies who want to make money, and given the chance, they will all overcharge for their products.


RTX is their term for the overarching GPU architecture:
[attached image: rtxplatform002.png]

I doubt it will go away until their next major thing.

You kind of said a lot and said nothing at the same time. Why is there market backlash? Perhaps because of what I mentioned earlier. It doesn't hold up to the x80 tier. They are holding back the 4070 and 4060 until next year, too.

The GTX 465 was die harvested to move inventory. It's not that it used GF100 because it was designed around it, it was just a way to shift bad bins of higher end cards. No wonder it sucked. The GF100 at best felt like a prototype of the GF110, and I should know 'cause I had 3 480s in SLI, and then 2 580s back in the day.

No 11x-class chip has been released since the GK110, which goes back around 8 years at this point. They were intra-generational refreshes, same as the 20x chips such as GK208, or GM204/GM200. I don't know why you brought up the correlation between HBM cards (low-yield, expensive tech) and their midrange successors; both the GTX 1060 and Polaris sold tens of millions of units and are still amongst the most widely used GPUs of all time. The 480's very low $199 launch price may have been a little difficult to find at the beginning, but for a couple of years after they lost their shine and before the crypto boom, you could easily get them for change.

GA106 was not the smallest Ampere, for example. The GA107 was also used in some SKUs and in the mobile space, and there is also a GA107S intended for the embedded market. It's not really a hard rule, but the tiers are clearly delineated.

I... don't see how any of this was productive?
 
We need to stop painting Nvidia/AMD/(Intel) as villains or heroes. They are not our friends, they are companies who want to make money, and given the chance, they will all overcharge for their products.

They are more like our enemies, then.
No one asks them to be our friends; we simply ask them to fulfil their public and social duty. They serve a purpose, and that purpose is not to make money; making money is only a side effect of the capitalist market economy, and we don't know how much longer that will last.
 
Where is the evidence of Nvidia holding back the lower SKUs? (beyond nonsense from some YouTube channels)
It's normal that lower chips follow in sequence. I thought people would remember this by now.

The evidence is in their behavior. There is nothing "normal" about the Ada launch. I could spend half an hour of my life digging through release dates for each GPU architecture over the last decade to prove it, but I shouldn't have to.

I'll just say this: typically, the "80" class card comes near the very beginning, priced in the $500-700 range. A 70 card usually follows quickly, and then, yes, the lower SKUs filter in over the next few months. In the case of Ampere, IIRC the 3090 came first, but only by a couple of weeks; then we had the 3080 at $700 and the 3070 at $500 in quick succession, then the 3060 Ti at $400 not too long after that. Typically, Nvidia cards sell exceedingly well for at least a month or two after release, so well that they're very hard to find at first, even without a crypto-mining boom to spike demand.

This release is markedly different. We thus far only have two cards, the 4090 and 4080, both over $1,200 MSRP. The "4080 12 GB" that was slated for release at $900 was pre-emptively recalled, with rumors that we might see it released as the "4070 Ti" in January, at an unknown price point. Other than that, we have a vague expectation that Nvidia will launch a "4060," again at an unknown price point, in June of 2023, or roughly nine months after Ada's introduction.

And the 4080 is not selling well, immediately after its launch, because it's priced too high for what it is. Meanwhile, most of the Ampere stack remains at or above MSRP, while AMD's RDNA2 plummets in price:


Of course I can't prove that Nvidia is restricting sellers from lowering prices on Ampere, or anything to that effect, but what I can say beyond question is that the Ada launch isn't providing much in the way of downward pressure on the Ampere stack, and that Nvidia was apparently willing to risk lukewarm sales figures on the 4080 to make that happen. As to why, that becomes obvious when you look at the ~50% year-on-year decline in Nvidia's "gaming" revenue--which suggests that Nvidia did indeed gorge itself on mining-enhanced Ampere sales, and thus by extension that they produced a metric buttload of those cards.

The problem for them, and for a lot of the arguments I see on this forum, is that the mining boom is over, and even a corporation as large as Nvidia cannot single-handedly meme mining-boom-era pricing schemes into "the new normal."
 
You also need to mention that EVGA, with around 40% market share in North America alone, no longer offers GeForce.
This is also a major blow to Nvidia's ambitions; it will have to rethink and change its behaviour, because this was quite a significant event.
 
Just in to say: although AMD's upcoming GPUs will be priced lower than nGreedia's, their MSRPs, too, are ridiculous.
 
The evidence is in their behavior. There is nothing "normal" about the Ada launch. I could spend half an hour of my life digging through release dates for each GPU architecture over the last decade to prove it, but I shouldn't have to.
You claim there is evidence, yet you can't point to it.

Claiming they are holding products back would imply the products are completed and ready to be shipped in volumes. When the products are done with their engineering stage and in full mass production we usually get loads of leaks and sneak peeks. So where is the evidence that the rest of the lineup is ready? Because if there isn't evidence, I call BS.
 
You claim there is evidence, yet you can't point to it.

Claiming they are holding products back would imply the products are completed and ready to be shipped in volumes. When the products are done with their engineering stage and in full mass production we usually get loads of leaks and sneak peeks. So where is the evidence that the rest of the lineup is ready? Because if there isn't evidence, I call BS.
lol, no one has alleged that Nvidia is sitting on mountains of completed products. Good grief.

The allegation is that Nvidia is purposely holding back on producing a full stack of Ada cards--or even a partial stack with a single affordable product in the lineup--because they overproduced Ampere during the mining boom. The allegation is that they've positioned products this gen so as not to interfere with (the bulk of) Ampere price points. (Or so as to delay interfering with Ampere price points.)

These ideas aren't exactly controversial. The evidence is right before your eyes; all of it is publicly available. It's in launch announcements, MSRPs, earnings reports, and any sensible reading of timing and precedent. Are you honestly trying to suggest that Nvidia's current behavior isn't divergent, or that the most likely explanation for their divergent behavior has nothing to do with the historic mining-driven GPU shortage that dominated all other considerations for the last ~2 years? Have you been living under a rock?

Occam's Razor, man. This Ada launch is unprecedented in several ways, most of which I laid out already, and because you have no answer you instead resort to this ridiculous distortion, "Unless you can PROVE that there are MILLIONS of affordable Ada cards languishing in a warehouse RIGHT NOW, you have no argument." Ludicrous.
 
They are more like our enemies, then.
No one asks them to be our friends; we simply ask them to fulfil their public and social duty. They serve a purpose, and that purpose is not to make money; making money is only a side effect of the capitalist market economy, and we don't know how much longer that will last.
You can buy a North Korean or a Chinese GPU to have a peaceful product, one that is not made to earn money and will exist for at least the next millennium. :laugh:

You also need to mention that EVGA, with around 40% market share in North America alone, no longer offers GeForce.
This is also a major blow to Nvidia's ambitions; it will have to rethink and change its behaviour, because this was quite a significant event.
This is no blow to nVidia, but to "made in USA".

Closed ecosystem anything is bad. Eventually everything will be ray traced and RTX will certainly be a thing of the past.
A working closed ecosystem is better than a non-working open standard. That's why nVidia leads in RT.
 
lol, no one has alleged that Nvidia is sitting on mountains of completed products. Good grief.
The allegation is that Nvidia is purposely holding back on producing a full stack of Ada cards--or even a partial stack with a single affordable product in the lineup--because they overproduced Ampere during the mining boom…
Keep twisting and turning. That claim is even more far-fetched.

These ideas aren't exactly controversial. The evidence is right before your eyes; all of it is publicly available. It's in launch announcements, MSRPs, earnings' reports, and any sensible reading of timing and precedent.
That's not evidence of anything, it's called an anecdote, and just referring to vague "reports" doesn't back your claim.
If you knew the first thing about how microprocessors are developed, you'd know this idea is not very likely. Postponing wafers would require them to have something else to run instead, and chips that will end up as mid-range chips are very high volume compared to the larger chips, so what else would they fill in? Any such rescheduling would probably be done nearly a year in advance. And intentionally postponing a chip before it's fully developed is risky, because if the intended final stepping is bad, then the product will quickly be postponed even more.

And as I've been saying, mid-range models arriving months after a release is common. E.g. the RTX 2060 was ~4 months in, and the GTX 1660 was ~6 months in. The RTX 3060 was ~5 months after the RTX 3080. So nothing unusual this far.

It's funny that your post should mention Occam's Razor, when there are more obvious reasons why the RTX 4060 etc. isn't launching now: either it was planned this way from the beginning, or they need more tweaking at the engineering-sample stage(s). Anyone with basic knowledge of GPUs and their launch history, who is capable of logical deduction, would understand these explanations to be far more likely. You are jumping to an unlikely conclusion and pretending it's obvious. :rolleyes:

Please stop trolling.
 
It's funny that your post should mention Occam's Razor, when there are more obvious reasons why the RTX 4060 etc. isn't launching now: either it was planned this way from the beginning
lmao, no shit, Sherlock. Nvidia are in a position to plan out their releases. You're absolutely twisting in the wind here, trying to draw a hard distinction between "Nvidia planned Ada's releases to account for an overabundance of Ampere stock" and "Nvidia are holding back on reasonably priced Ada GPUs to make space for remaining Ampere stock." They are, in fact, the same claim, and you either know that but can't refute the former, or you're simply so fixated on defending Nvidia here that you've made yourself look painfully obtuse and utterly ridiculous.

"The historic mining boom of 2020-2021 and Nvidia's earnings reports are an ANECDOTE," efikkan screams, straining to pose as the reasonable party to the discussion.

And as I've been saying, mid-range models arriving months after a release is common. E.g. RTX 2060 was ~4 months in, and GTX 1660 was ~6 months in. GTX 3060 was ~5 months after GTX 3080. So nothing unusual this far.
lol, sure, nothing at all unusual. We have zero cards below $1,200 and won't get one until January; the last proposed price for that card, the former "4080 12 GB," was $900, and it's doubtful that its relaunch will make it much cheaper than $800. The 4060 is set to launch NINE MONTHS after Ada's introduction, and we have no reason to expect that it will be priced reasonably. But I suppose noting that Nvidia's new Ada stack conveniently fails to compete with the MSRP of any Ampere card below the 3080 is a "MEANINGLESS ANECDOTE." And the fact that 9 months > 4 months likewise, what kind of EVIDENCE is that? lmao, what a dork.

EDIT: Added a link to the earnings report, posted on this very site on the 16th of November, which of course makes it a super obscure item in this discussion involving Nvidia's financial incentives. "Vague 'reports,'" indeed.
 
"Nvidia are holding back on reasonably priced Ada GPUs to make space for remaining Ampere stock." They are, in fact, the same claim, and you either know that but can't refute the former, or you're simply so fixated on defending Nvidia here that you've made yourself look painfully obtuse and utterly ridiculous.
It's your obligation to provide evidence to support your claims, not the other party's obligation to prove a negative. You've been given plenty of chances to provide evidence to support your claims, but have provided none, so it's fairly safe to assume that you have none. Instead you choose to attack the one who calls out your BS, which is a very solid indication that you know you've lost.

"The historic mining boom of 2020-2021 and Nvidia's earnings reports are an ANECDOTE," efikkan screams, straining to pose as the reasonable party to the discussion.
You know very well I didn't call earnings reports an anecdote, I categorized your argumentation as anecdotal because you lack evidence to support your claim. And the link you provided here contains no evidence either.
What you are doing here is attempting a straw man argument, and it's yet another solid indicator that you lack support for your claims. And the rest of your post is just a meaningless rant, this is not how to make your case.

You should learn about how microchips are developed before making such bold claims, because it all makes more sense when you actually understand this.
When GPUs are taped out (~1-1.5 years ahead of release), they usually start with the largest GPU in the architecture and work their way down, in sequence. They don't complete the tapeout until they are confident in the design, and the more difficult the design is, the longer it will take. The current trend is that this process takes more time than it did 10 years ago, and possibly even longer in the future, and as a result, we get more cascading GPU generations instead of complete lineups arriving within a short window. Armed with this knowledge, it's easy to come up with two much more likely and rational explanations for why it takes more time; 1) It was planned this way from tapeout, due to the reasons above, and 2) there could be further delays postponing the release due to unforeseen events. These explanations are far superior to an unfounded claim of Nvidia intentionally holding products back, which makes no sense since their wafer supply is usually booked up to several years in advance.

And BTW; where is the outcry for AMD not launching a complete lineup?

lol, sure, nothing at all unusual.
It's fascinating to see how you react when confronted with facts. Very grown up.
You should study Nvidia GPUs and you'll learn that mid-range chips can arrive more than a few months later; in fact, that's been the trend for the past couple of generations.

<snip>
"MEANINGLESS ANECDOTE."
<snip>
You are the only one screaming here.
This is adolescent behavior, and you should go away and not come back until you've learned some manners.
 
3070 performance for $400 three years later? What a deal!
 
nVidia is not holding back; it reduced its ordered production capacity a year ago in order to prevent overcapacity. Keeping finished products in stock is a big loss-making business: it ties up a lot of capital, and the value falls over time. For nVidia, for AIBs, for stores. The forecasts on which the reduced capacities were based turned out to be correct: total shipments of GPUs over the last 12 months declined by 25%.
What's interesting is that nVidia's market share is actually growing. That means all the claims that bad nVidia products are reducing sales, or that used 3000-series cards are damaging nVidia, are false, as consumers bought even less AMD than nVidia. The new AMD cards could change the market share, but that is far from certain.
[attached image: dGPU.jpg]

If AMD has ordered a high production capacity, they will get problems soon.

Back to some specific price information. Used 3090s sell for around $550 in Argentina. My girlfriend has some relatives there. They are just finding out whether the export of used PC hardware is taxed, etc. Probably not, but it's always better to be sure. The 4000 series is still quite expensive and my system is a bit older. I can install a 3090 without further adjustments, so I'm going to get a used 3090 from Argentina. The relatives are coming to visit around Christmas; they'll bring me one. That would solve the issue of GPUs for me until the 5000 series around 2025/2026. I will keep my 2070 as a backup.
 
I've had:
Geforce 2 MX400
Geforce 4 MX440
Geforce FX 5700 Ultra
Geforce 6800 GS
Geforce RTX 3060 Ti

Radeon HD4870
Radeon HD7870
Radeon HD7970
Radeon RX 580
Radeon RX 6800 XT

Never had any driver problems. In 20+ years of gaming.
That's simply not true, and I've had far more graphics cards than that list.
Driver issues are always present, to some extent, but on RDNA it was really a nightmare.
 
I'm simply stating my case; I don't know about anyone else.

You skipped RDNA 1 and Vega, those were dark days for AMD driver stability. RDNA 2 really saw a huge improvement, and they are still hard at work refactoring driver code. But no GPU is bug free. I've been very lucky with Ampere (RTX 3090), but at the same time, I would 100% have run into that problem with Warzone 2.0/MW2 multiplayer if I played it... and that was the nastiest driver bug NV's had in some time.
 