Friday, October 14th 2022

NVIDIA Cancels GeForce RTX 4080 12GB, To Relaunch it With a Different Name

NVIDIA has decided to cancel the November 2022 launch of the GeForce RTX 4080 12 GB. The company will relaunch the card under a different name, though it hasn't announced the replacement name just yet. The naming of the RTX 4080 12 GB was the cause of much controversy. With the RTX 40-series "Ada," NVIDIA debuted three SKUs: the RTX 4090, which has already launched and is in stores right now; the RTX 4080 16 GB; and the RTX 4080 12 GB. Memory size notwithstanding, the RTX 4080 12 GB is a vastly different graphics card from the RTX 4080 16 GB.

The RTX 4080 12 GB and RTX 4080 16 GB don't even share the same silicon. The 16 GB model is based on the larger "AD103" die, with 9,728 CUDA cores and a 256-bit wide GDDR6X memory bus, while the RTX 4080 12 GB is based on the smaller "AD104" die, with just 7,680 CUDA cores (21% fewer) and a meager 192-bit wide GDDR6X memory bus. This had the potential to confuse buyers, especially given the $900 price. With criticism spanning social media and the press alike, NVIDIA decided to pull the plug on the RTX 4080 12 GB. The company will likely re-brand it as a successor to the RTX 3070 Ti, although it will then have a hard time justifying the $900 price-tag. The RTX 4080 16 GB, however, is on track for a November 16 availability date, with a baseline price of $1,200.
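For a rough sense of the gap, the deficits can be computed directly from the figures above. The Python snippet below is only an illustrative back-of-the-envelope calculation, not anything from NVIDIA's materials; it uses just the CUDA core counts and memory bus widths quoted in this article and ignores clock speeds and memory data rates, which also affect real-world performance.

```python
# Rough comparison of the two "RTX 4080" SKUs using only the figures
# quoted in the article (CUDA core counts and memory bus widths).
# Clock speeds and memory data rates are deliberately ignored here.

rtx_4080_16gb = {"cuda_cores": 9728, "bus_width_bits": 256}
rtx_4080_12gb = {"cuda_cores": 7680, "bus_width_bits": 192}

core_deficit = 1 - rtx_4080_12gb["cuda_cores"] / rtx_4080_16gb["cuda_cores"]
bus_deficit = 1 - rtx_4080_12gb["bus_width_bits"] / rtx_4080_16gb["bus_width_bits"]

print(f"CUDA core deficit: {core_deficit:.1%}")   # ~21.1% fewer cores
print(f"Memory bus deficit: {bus_deficit:.1%}")   # 25.0% narrower bus
```

By these two metrics alone, the 12 GB card trails its 16 GB namesake by roughly 21% and 25% respectively.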
Source: NVIDIA

423 Comments on NVIDIA Cancels GeForce RTX 4080 12GB, To Relaunch it With a Different Name

#376
InVasMani
sepheronxWe need a hero to save us. We need the one, the chosen one who can stop the evil that is Nvidia, and to bring decent value of GPU's back to the land so we can all be free, happy and entertained again.

Come sir Robin of Drop bears. Lead us so we can all get good cheap GPUs again.
Jensen: You underestimate my power...
Everyone:
Posted on Reply
#377
sepheronx
Ananas is a face to be feared and revered.
Posted on Reply
#378
InVasMani
Will have to remember that when people hate on pineapple on pizza... :laugh: for your crimes you have been sentenced to death by pineapple...
Posted on Reply
#379
sepheronx
InVasManiWill have to remember that when people hate on pineapple on pizza... :laugh: for your crimes you have been sentenced to death by pineapple...
You would eat this face?
Posted on Reply
#381
AusWolf
sepheronxWe need a hero to save us. We need the one, the chosen one who can stop the evil that is Nvidia, and to bring decent value of GPU's back to the land so we can all be free, happy and entertained again.

Come sir Robin of Drop bears. Lead us so we can all get good cheap GPUs again.
Intel Arc A770 entered the chat...

Hmm... "hero... chosen one... stop the evil..."

Intel Arc A770 left the chat...
Posted on Reply
#382
sepheronx
InVasManiIs it warm or cold? :laugh:
as cold as AusWolf's heart
AusWolfIntel Arc A770 entered the chat...

Hmm... "hero... chosen one... stop the evil..."

Intel Arc A770 left the chat...
I had an interest in the A770, until I realized I can't even get one here.
Posted on Reply
#383
AusWolf
sepheronxas cold as AusWolf's heart
Maybe not that cold. Particles completely stop moving, disrupting the known laws of space-time at those temperatures.
sepheronxI had an interest in the A770, until I realized I can't even get one here.
There is literally one store in the UK where I could put it on pre-order for £400, but that's just a big "NO".
Posted on Reply
#384
sepheronx
AusWolfMaybe not that cold. Particles completely stop moving, disrupting the known laws of space-time at those temperatures.


There is literally one store in the UK where I could put it on pre-order for £400, but that's just a big "NO".
I have quite a few GPUs I would have liked to do my own tests with and share with the community. Been wanting to start my own channel to go with a site I've been working on, but of course, without goods, I got no content. No content, can't get viewers. No viewers, can't get some kind of borrowed equipment to do further tests. The only way to get viewers is to do stupid videos where the video's front image has me with my mouth wide open in an "O", looking like I'm about to stick something phallic into it.

So yeah, would like one to also document performance gains over driver iterations.

But no pre-orders either. Just nothing. How absolutely pathetic Canada has become, beyond just our general stupidity. Can't even get goods now.
Posted on Reply
#385
AusWolf
sepheronxI have quite a few GPUs I would have liked to do my own tests with and share with the community.
Same here. I always say first-hand is the only real experience. I've been pleasantly surprised by massively hated products (6500 XT, Rocket Lake i7) and disappointed by massively loved ones (R5 3600). That's why I buy a lot more PC hardware than I need to. That's also why I want a Zen 4 system in the near future even though I have zero need for that too. I like seeing for myself what the fuss is about.
sepheronxHow absolutely pathetic Canada has become, beyond just our general stupidity.
That's the whole world, I'm afraid.
Posted on Reply
#386
sepheronx
If I find a place to buy an Arc A770 or A750, I'll grab one. Kinda wanted the Acer model.

I would have purchased a 4090 if it were cheaper, like $1,200, but at $1,600 USD it's just too much.

I don't have much faith in AMD either, honestly.

I've been telling people: want 4K (even if it isn't actually 4K) gaming? Cheap? Get a PS5 or an Xbox Series X.

Right now, even with GPU prices as low as they are, the value is still abysmal.
Posted on Reply
#387
Mussels
Freshwater Moderator
fevgatosWhat exactly are those scummy practices? What morally bankrupt actions are you talking about? I'd like to know; if that's the case, I won't buy Nvidia either
To be fair, there have been quite a few over the years.

The texture compression and lower quality rendering cheats of the old FX series
The 970 having 3.5GB and not 4GB VRAM
Deliberately selling cards in tiers that make them obsolete faster (1050Ti 4GB vs 1060 3GB - they'd BOTH be better off with the VRAM amounts swapped)
Then the modern shenanigans with FE cards being limited to certain countries, the 4080 12GB being a 4070 at 4080 prices


Hell, look at the laptops for the worst of it, where product names became meaningless: a laptop 1060 could have been just about anything. They used product names to mislead people, as well as different TDP variants with drastically differing performance that they did their best to keep hidden.

There's more, and AMD is not above this sort of thing either - I think all brands have done dodgy shit over the years.
Posted on Reply
#388
JustBenching
MusselsThe texture compression and lower quality rendering cheats of the old FX series
Both companies pulled that crap, and that was literally around 20 years ago, no?
MusselsThe 970 having 3.5GB and not 4GB VRAM
The 970 did in fact have 4 GB of VRAM. But I think it's kind of whatever; it was the best value-for-money GPU on the planet at the time, and I don't think whether it was advertised as 4 or 3.5 GB would have changed sales by one iota.

The only thing I consider a BS move is the 1030 shenanigans with DDR4 vs GDDR5, because even an informed buyer couldn't actually distinguish between the products.
Posted on Reply
#389
AusWolf
MusselsDeliberately selling cards in tiers that make them obsolete faster (1050Ti 4GB vs 1060 3GB - they'd BOTH be better off with the VRAM amounts swapped)
Since you mentioned that, I think the whole Ampere lineup deserves a few words too:
  • The 3090 Ti that only came out to milk flagship hunters beyond imagination after they've got their 3090s already,
  • The 3080 having 10 GB VRAM initially, with a 12 GB version unexpectedly coming out later,
  • The 3070 Ti 16 GB version being scrapped to force buyers into planned obsolescence with a choice of more VRAM on a lesser card or more performance with less VRAM,
  • The 3060 having 12 GB VRAM, basically pissing on everything that has only 8 or 10 GB even several tiers above, despite the fact that it doesn't necessarily need that much,
  • The 3050 with questionable performance for its tier selling for way more than it should just because it's RTX, and...
  • Having nothing below that. The 3050 is still based on a scrappy version of the mid-tier GA106, which means Nvidia is openly pissing on gamers on a budget.
The 20-series Super cards established a trend which Ampere only continued. That trend is "milk gamers now, then make them realise their mistake when the proper version comes out later". I'm pretty sure every "mistake" we see in Ada cards is planned ahead of time, and we'll see them somewhat rectified in Ti/Super releases next year. Only somewhat because someone will have to buy 50-series, too.

Edit: Before someone argues it, these are not dealbreaker moves, just plain scummy ones.

Like you said, AMD isn't innocent, either (their laptop chip naming scheme is horrendous), but at least they give you the best performance physically possible (fully enabled chips with decent amounts of VRAM), and not crippled versions that only tease you into the "refresh" coming next year.
Posted on Reply
#390
JustBenching
AusWolfSince you mentioned that, I think the whole Ampere lineup deserves a few words too:
  • The 3090 Ti that only came out to milk flagship hunters beyond imagination after they've got their 3090s already,
  • The 3080 having 10 GB VRAM initially, with a 12 GB version unexpectedly coming out later,
  • The 3070 Ti 16 GB version being scrapped to force buyers into planned obsolescence with a choice of more VRAM on a lesser card or more performance with less VRAM,
  • The 3060 having 12 GB VRAM, basically pissing on everything that has only 8 or 10 GB even several tiers above, despite the fact that it doesn't necessarily need that much,
  • The 3050 with questionable performance for its tier selling for way more than it should just because it's RTX, and...
  • Having nothing below that. The 3050 is still based on a scrappy version of the mid-tier GA106, which means Nvidia is openly pissing on gamers on a budget.
The 20-series Super cards established a trend which Ampere only continued. That trend is "milk gamers now, then make them realise their mistake when the proper version comes out later". I'm pretty sure every "mistake" we see in Ada cards is planned ahead of time, and we'll see them somewhat rectified in Ti/Super releases next year. Only somewhat because someone will have to buy 50-series, too.

Edit: Before someone argues it, these are not dealbreaker moves, just plain scummy ones.

Like you said, AMD isn't innocent, either (their laptop chip naming scheme is horrendous), but at least they give you the best performance physically possible (fully enabled chips with decent amounts of VRAM), and not crippled versions that only tease you into the "refresh" coming next year.
Have you heard of the XT moniker they used on Zen 2 and now on their RDNA cards? They have multiple versions of the 6900: the normal 6900 XT, the 6900 XT special with the binned core, and the 6950 XT. I mean, come on.
Posted on Reply
#391
AusWolf
fevgatosHave you heard of the XT moniker they used on Zen 2 and now on their RDNA cards? They have multiple versions of the 6900: the normal 6900 XT, the 6900 XT special with the binned core, and the 6950 XT. I mean, come on.
I had a feeling someone would bring this up.

Those are a different matter, because the 3800X for example, was a fully enabled chip with full capability compared to the 3800XT. Same as the 6900XT compared to the 6950XT. The only thing you lose with the "lesser" version is a couple percent performance, max, which you can bring back with overclocking if you're lucky. Nvidia's "Super" game is different, because nowadays, you mostly see vanilla (non-Ti) releases with crippled chips, which you cannot necessarily overcome with overclocking, especially if you pair it with crippled VRAM capacity as well. Additionally, my personal note is that anyone who knows a thing or two about GPUs can suspect from the crippled chips that the good ones are reserved for something coming later. They give you the scraps to tease you into the good stuff you didn't wait for - this is the scummy side of it, imo.
Posted on Reply
#392
JustBenching
AusWolfI had a feeling someone would bring this up.

Those are a different matter, because the 3800X for example, was a fully enabled chip with full capability compared to the 3800XT. Same as the 6900XT compared to the 6950XT. The only thing you lose with the "lesser" version is a couple percent performance, max, which you can bring back with overclocking if you're lucky. Nvidia's "Super" game is different, because nowadays, you mostly see vanilla (non-Ti) releases with crippled chips, which you cannot necessarily overcome with overclocking, especially if you pair it with crippled VRAM capacity as well. Additionally, my personal note is that anyone who knows a thing or two about GPUs can suspect from the crippled chips that the good ones are reserved for something coming later. They give you the scraps to tease you into the good stuff you didn't wait for - this is the scummy side of it, imo.
That's just the wrong buyer's mentality. If you bought a product at a specific point in time at a specific price, you MUST think that the product was worth it. So what difference does it make if 6 or 9 or 15 months later a different, better product is released? I bought a 3090, and then the 3090 Ti came out. It didn't bother me one bit. Only people who aren't sure about what they are buying have issues with these kinds of things.
Posted on Reply
#393
AusWolf
fevgatosThat's just the wrong buyer's mentality. If you bought a product at a specific point in time at a specific price, you MUST think that the product was worth it. So what difference does it make if 6 or 9 or 15 months later a different, better product is released? I bought a 3090, and then the 3090 Ti came out. It didn't bother me one bit. Only people who aren't sure about what they are buying have issues with these kinds of things.
I'm not suggesting that you're suddenly unhappy with your 3090. Of course you're not. It's that you could have had the choice to buy the 3090 or the Ti right from the start, but Nvidia didn't give you that choice. The scummy thing is that the 3090 Ti is not an entirely different product with its own development cycle - it's just a more enabled one that was reserved to make sure people buy the lesser version. It's not even that pronounced on this level, but if you look at the 3080 12 GB, that represents what I mean a bit better.

Edit: It's like a girlfriend that gives you the worse slice of pizza, then halfway through her better slice, realizes that she's full, and you're left thinking "great, I could have had that one right from the start".
Posted on Reply
#394
JustBenching
AusWolfI'm not suggesting that you're suddenly unhappy with your 3090. Of course you're not. It's that you could have had the choice to buy the 3090 or the Ti right from the start, but Nvidia didn't give you that choice. The scummy thing is that the 3090 Ti is not an entirely different product with its own development cycle - it's just a more enabled one that was reserved to make sure people buy the lesser version. It's not even that pronounced on this level, but if you look at the 3080 12 GB, that represents what I mean a bit better.

Edit: It's like a girlfriend that gives you the worse slice of pizza, then halfway through her better slice, realizes that she's full, and you're left thinking "great, I could have had that one right from the start".
But you don't think there was a technical reason for not launching the 3090 Ti from the get-go? Like, for example, yield issues? Generally speaking, as manufacturing matures, you get better products.

For example, even though they were sold as the same product, a 1st-gen Ryzen bought close to launch and one bought 10 months later were completely different products in terms of OCing. My 1600 needed an insane amount of voltage to hit 3.6 GHz, while two I bought for my friends were casually hitting 4 GHz with minimal voltage.
Posted on Reply
#395
Vya Domus
fevgatosYou said that they would have legal issues, meaning that they didn't know it for the last few months and realised it yesterday
No, they knew it was disingenuous from the start, they just hoped it would fly under the radar. I don't know why you are trying so hard to be obtuse on purpose and pretend to not understand the basic idea here.

We get it, you love Nvidia and won't admit that they did something shitty, even though they themselves have admitted the naming was confusing. But like you said, I am sure you know better than a multibillion-dollar company that decided to retract a product because they were too dumb or something, I don't know.
SisyphusThey won't be able to mislead informed consumers.
What even is an informed consumer? Millions of people play video games; it's evident that most won't have technical knowledge about these things. That's why there are laws against false advertising in the first place.
Posted on Reply
#396
AusWolf
fevgatosBut you don't think there was a technical reason for not launching the 3090 Ti from the get-go? Like, for example, yield issues? Generally speaking, as manufacturing matures, you get better products.

For example, even though they were sold as the same product, a 1st-gen Ryzen bought close to launch and one bought 10 months later were completely different products in terms of OCing. My 1600 needed an insane amount of voltage to hit 3.6 GHz, while two I bought for my friends were casually hitting 4 GHz with minimal voltage.
If there are technical issues during manufacturing (for example yield), then why do they only hit Nvidia GPUs specifically and nothing else? They are the only company (as far as I know) that has ever pulled off a launch of a new generation without having any single product based on a fully enabled die in the lineup.

OCing is a different matter. A 1700X bought at launch is a fully enabled chip, and so is one bought a year later, when you compare stock settings. Intel/AMD/Nvidia aren't selling OC - that's something you do for yourself.
Posted on Reply
#397
JustBenching
Vya DomusNo, they knew it was disingenuous from the start, they just hoped it would fly under the radar. I don't know why you are trying so hard to be obtuse on purpose and pretend to not understand the basic idea here.

We get it, you love Nvidia and won't admit that they did something shitty, even though they themselves have admitted the naming was confusing. But like you said, I am sure you know better than a multibillion-dollar company that decided to retract a product because they were too dumb or something, I don't know.
I love Nvidia? No I don't, but even if I did, that's not an actual argument. I could tell you that you hate Nvidia, blah blah blah; it doesn't matter.

What I'm saying is that naming is irrelevant as long as an informed consumer can differentiate between your products on a shelf. That wasn't the case with the two versions of the 1030, for example, but it was the case with the two versions of the 4080.

What do you think will actually change now that they took it back? They'll rename it to 4070 and sell it for the same price. WOW, huge win for the consumer, right?
AusWolfIf there are technical issues during manufacturing (for example yield), then why do they only hit Nvidia GPUs specifically and nothing else?

OCing is a different matter. A 1700X bought at launch is a fully enabled chip, and so is one bought a year later, when you compare stock settings. Intel/AMD/Nvidia aren't selling OC - that's something you do for yourself.
They don't only hit Nvidia. AMD does it, Intel does it. Sure, you can get down to the technicalities and claim it's different because Nvidia changes the amount of CUDA cores, but is it really different? Are 10% more CUDA cores immoral while 10% more clock speed is fine?
Posted on Reply
#398
Vya Domus
fevgatosWhat I'm saying is that naming is irrelevant as long as an informed consumer
As I wrote above, not everyone can be an informed consumer, because not everyone has the knowledge or interest to be one. That's why the law protects these consumers.
fevgatoshuge win for the consumer, right?
Actually, yes, it is a huge win, because a huge corporation doesn't get to mislead its consumers.
Posted on Reply
#399
AusWolf
fevgatosWhat do you think will actually change now that they took it back? They'll rename it to 4070 and sell it for the same price. WOW, huge win for the consumer, right?
They won't be able to sell it for $900.

If I'm mistaken, and they actually do sell it for that price, then I'll agree with you that people are stupid for buying it (besides being disappointed by the human intellect for the Nth time).
fevgatosThey don't only hit Nvidia. AMD does it, Intel does it. Sure, you can get down to the technicalities and claim it's different because Nvidia changes the amount of CUDA cores, but is it really different? Are 10% more CUDA cores immoral while 10% more clock speed is fine?
Sorry, I edited my post a bit late, so I'll write it down again (my bad, really).

Nvidia is the only company I've ever seen to pull off a full launch of an entirely new generation without one single product based on a fully enabled die anywhere in the product stack. Why is that? If Intel can release the 12900K together with the 12700K and 12600K, if AMD can release the 7950X and 7700X together with the 7900X and 7600X (or the 6900XT together with the 6800 and 6800XT for that matter), then why can't Nvidia do the same?
Posted on Reply
#400
BSim500
sepheronxI have quite a few GPUs I would have liked to do my own tests with and share with the community. Been wanting to start my own channel to go with a site I've been working on, but of course, without goods, I got no content. No content, can't get viewers. No viewers, can't get some kind of borrowed equipment to do further tests. The only way to get viewers is to do stupid videos where the video's front image has me with my mouth wide open in an "O", looking like I'm about to stick something phallic into it.
I say go for it. You've certainly nailed the YouTube algorithm... :D
Posted on Reply