
NVIDIA Cancels GeForce RTX 4080 12GB, To Relaunch it With a Different Name

It beats a 3090 and a 3080; calling it a 4070 would have been entirely justified if they had launched it under that name right away.
The highest they can charge for a 4070 is $600, even accounting for inflation. The 3070 beat the 2080 Ti for less than half the price.
The 3080 was damn near the 3090 (within 13%) for less than half the price; now the 4080 is FARTHER behind in performance BUT CLOSER in price (how anyone could want to buy this shit is beyond me). The 4080 should cost $800-850 at most, accounting for inflation.

If AMD can match the 4080 with the 7800 XT for $750 (more like $850), that would be awesome...
 
As I pointed out in another discussion thread, it makes zero sense trying to compare model numbers between GeForce generations.

This announcement drives that point home with brutal clarity.

Prospective customers simply need to assess the various models in the current generation and decide whether each SKU's features are worth the asking price, disregarding the model number printed on the card and the retail packaging.

NVIDIA changes its idea of what each model number represents. Essentially, a --70 card is simply a product positioned between a --60 model and an --80 model in the same generation's product stack. Comparing the 1070, 2070, 3070, and 4070 lacks relevance because of NVIDIA's constant reinterpretation of its model numbers.

My answer has nothing to do with older model numbers, simply that the card below the (real) 4080 can't have just 8 GB of VRAM, especially when the 4080 gets 16 GB.

Nvidia continues to push its RTX narrative, along with DLSS and 4K gaming, and 8 GB is simply not enough on a high-end card. The 3070 with 8 GB and even the 3080 with 10 GB were already starved for VRAM; the 4070 would be even more so.
 
I did. Why is its box so huge? :kookoo:
It seems Galax is taking the Gigabyte RX 6600 Eagle approach? (But it's probably just a placeholder box used for the show.)

 
My answer has nothing to do with older model numbers, simply that the card below the (real) 4080 can't have just 8 GB of VRAM, especially when the 4080 gets 16 GB. Nvidia continues to push its RTX narrative, along with DLSS and 4K gaming, and 8 GB is simply not enough on a high-end card. The 3070 with 8 GB and even the 3080 with 10 GB were already starved for VRAM; the 4070 would be even more so.

I never said that releasing a 4070 with 8GB VRAM is best. However, I think that's what NVIDIA will do.

Two different concepts.

I admire NVIDIA for their engineering, both their hardware and their software (and I'm not even a developer).

I have a separate opinion about how they run their business; however, I choose not to go into any more detail today. I've already said enough on the topic this week.
 
I never said that releasing a 4070 with 8GB VRAM is best. However, I think that's what NVIDIA will do.

Two different concepts.

They will have to explain, because 8 GB is a limitation and a deal-breaker for potential buyers who think for themselves.
This would be like launching the RTX 3080 with 8 GB, when even 10 GB led Nvidia to cheat by lowering texture resolution and image quality to maintain framerates.
 
They will have to explain, because 8 GB is a limitation and a deal-breaker for potential buyers who think for themselves.
This would be like launching the RTX 3080 with 8 GB, when even 10 GB led Nvidia to cheat by lowering texture resolution and image quality to maintain framerates.

Well, NVIDIA sold a ton of 3080 10GB cards before they released the 3080 Ti 12GB and, eventually, the 3080 12GB.

I have a 3060 Ti 8GB card, and at 1440p I don't recall running into VRAM limits apart from maybe one or two titles. 8GB of VRAM is more of a factor for 4K gaming.

Clearly they could debut a 4070 8GB card and later release a 10GB or 12GB variant. It's not like NVIDIA is forced to pick a specific VRAM amount for any given GPU. And consumers will likely buy whatever is offered regardless.
 
Nvidia listened to customers for once? This is incredible.

So clearly they could debut a 4070 8GB card and later release a 10GB or 12GB variant.
A 10GB variant would then come with a 160-bit memory bus. Having a bus that narrow on a xx70 card would be beyond insane.
 
Well, NVIDIA sold a ton of 3080 10GB cards before they released the 3080 Ti 12GB and, eventually, the 3080 12GB.

So clearly they could debut a 4070 8GB card and later release a 10GB or 12GB variant. It's not like NVIDIA is forced to pick a specific VRAM amount for any given GPU. And consumers will likely buy whatever is offered regardless.

I know, and it's bizarre. Next they can try a 4 GB 4070. I guess that would also sell a ton.
 
A 10GB variant would then come with a 160-bit memory bus. Having a bus that narrow on a xx70 card would be beyond insane.

I wouldn't put anything past them. After all, they were ready to launch this 4080 12GB card at an abominable price.
 
I wouldn't put anything past them. After all, they were ready to launch this 4080 12GB card at an abominable price.
True. But a card with a 160-bit bus sold at a high-end price makes no sense at all. Then again, it's Ngreedia...
 
A product name change is of no importance. The price depends on demand and competition. AMD is late; nVidia has about six months to cash in with its AD102 and AD103. Then we will see what prices AMD asks for GPUs with comparable performance. A secondary impact will come from the recession, together with inflation. It is difficult to forecast the effect on the chip market and pricing.
 
A product name change is of no importance. The price depends on demand and competition. AMD is late; nVidia has about six months to cash in with its AD102 and AD103. Then we will see what prices AMD asks for GPUs with comparable performance. A secondary impact will come from the recession, together with inflation. It is difficult to forecast the effect on the chip market and pricing.
Names have meaning. For a long time, Nvidia's top-tier GPU was the x80. That changed to the x80 Ti when AMD surprised them with Hawaii, and it remained so until Ampere. I would also like to know your source regarding a delay in RDNA3.
 
A product name change is of no importance.
Wrong.

Per NV's own research (it was in their slides), they were shocked to discover that customers stick with a series (e.g. 970 => 1070 => 2070) rather than sticking with a price bracket.

The 4080 losing its "80" would be a major hit.
 
True. But a card with a 160-bit bus sold at a high-end price makes no sense at all. Then again, it's Ngreedia...

Due to the relationship between VRAM capacity and memory bus width, the 40 Series has created some tricky choices.

Most consumers will just look at the amount of VRAM printed on the retail box and figure that 12GB is better than 10GB even if the bus width is narrower on the former.
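
As a rough sketch of that relationship (assuming one 32-bit channel per memory chip and the 2 GB GDDR6X chips used across the 40 Series; the numbers are illustrative, not official specs):

```python
# Minimal sketch of the VRAM capacity / bus width relationship.
# Assumption: each GDDR6X chip sits on its own 32-bit channel and
# holds 2 GB, as on the 40 Series cards discussed in this thread.

CHANNEL_WIDTH_BITS = 32  # one 32-bit channel per memory chip
CHIP_DENSITY_GB = 2      # 2 GB per chip; 1 GB chips would double the bus width

def bus_width_bits(vram_gb: int) -> int:
    """Bus width implied by a given VRAM capacity under these assumptions."""
    chips = vram_gb // CHIP_DENSITY_GB
    return chips * CHANNEL_WIDTH_BITS

for vram_gb in (8, 10, 12, 16):
    print(f"{vram_gb} GB -> {bus_width_bits(vram_gb)}-bit bus")
# 8 GB -> 128-bit, 10 GB -> 160-bit, 12 GB -> 192-bit, 16 GB -> 256-bit
```

Which is why a 10GB card implies the 160-bit bus mentioned above, and why the 12GB (192-bit) and 16GB (256-bit) 4080s differ in more than just capacity.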
 
Maybe, just maybe, the Chinese-looking fellow in the pictures a few posts back just wanted to give people something to chew on so they don't notice the real problem: the co-existence of the RTX 30xx and 40xx series on the market, with the latter at much higher prices, just so nvidiots can prop up their stocks?
The Nvidia excrement show continues...
 
Names have meaning. For a long time, Nvidia's top-tier GPU was the x80. That changed to the x80 Ti when AMD surprised them with Hawaii, and it remained so until Ampere. I would also like to know your source regarding a delay in RDNA3.
Names are PR. Consumers need to inform themselves about the specs. I will not defend people who buy their GPUs based on naming or nicely colored packaging.
The delay is an estimate. Fill in whatever timespan you like.
 
Names are PR. Consumers need to inform themselves about the specs. I will not defend people who buy their GPUs based on naming or nicely colored packaging.

There are lawyers who will defend them, though, because putting the 4080 brand on a 12 GB AD104 card is like claiming a C-Class Mercedes is an S-Class.
 
Nvidia listened to customers for once? This is incredible.


A 10GB variant would then come with a 160-bit memory bus. Having a bus that narrow on a xx70 card would be beyond insane.

More like they got caught with their pants down and are now desperately trying to fix things:

- Pretending there is a scalper/miner shortage of the 4090
- A "loyalty program" for buying the FE 4090, only for current Nvidia owners
- Removing LHR from the 30 Series
- And now, cancelling the dumb 4080 12GB.

More "unusual moves" are to be expected as Jensen continues to wake up from his hubris-induced coma.
 