Tuesday, November 7th 2023

NVIDIA Plans 2024 CES (January) Launch for RTX 40-series SUPER?

NVIDIA is reportedly planning to time the launch of its GeForce RTX 40-series SUPER line of graphics cards with the 2024 International CES, scheduled for January. This is according to kopite7kimi, a reliable source for NVIDIA leaks. The company tends to plan special GeForce RTX gaming events on the sidelines of big trade shows for new product launches.

The mid-life refresh of the RTX 40-series "Ada" will see the introduction of three new enthusiast-segment graphics card models without disturbing the RTX 4090 as the series flagship. At the higher end, the company is planning to launch the RTX 4080 SUPER, which could push the performance envelope further above that of the AMD Radeon RX 7900 XTX, and possibly prepare for the upcoming RX 7950 XTX. The unusually named RTX 4070 Ti SUPER could be designed to consolidate the $800 price-point against the RX 7900 XT. The new RTX 4070 SUPER, meanwhile, could be designed to shore up the company's standing at the $600 mark by beating the RX 7800 XT.
Update 05:29 UTC: MEGAsizeGPU revealed the retail branding inserts for the boxes of the RTX 40-series SUPER family. NVIDIA has changed the SUPER logo, replacing a stylized typeface with the same bland sans-serif typeface it uses for the main GeForce RTX branding. It even confirms the existence of the RTX 4070 Ti SUPER (two brand extensions clumped together).
Sources: kopite7kimi (Twitter), VideoCardz, MEGAsizeGPU (Twitter)

81 Comments on NVIDIA Plans 2024 CES (January) Launch for RTX 40-series SUPER?

#2
Hyderz
Well, I'm genuinely surprised NVIDIA is releasing the Super refresh...
but I guess the sanctions imposed on China by the United States caused them to lose money...
I guess this is a stop-gap plan until the 50 series...
Posted on Reply
#4
AusWolf
So we'll have a 4070, a 4070 Super, a 4070 Ti and a 4070 Ti Super. Why? Just why?! :slap:
Posted on Reply
#5
Hyderz
AusWolf: So we'll have a 4070, a 4070 Super, a 4070 Ti and a 4070 Ti Super. Why? Just why?! :slap:
I guess that's just how NVIDIA is releasing the Super this time. The 2060 to 2060 Super, I believe, was a decent upgrade... but the rest of the 20 series got minor bumps in clocks, VRAM, etc.
Posted on Reply
#6
BigDogDING
Яid!culousOwO: Need a 4060 SUPER
But there's little difference between the 4060 and the 4060 Ti.
Posted on Reply
#7
hsgawdsaefryjt
RTX 4090 price breakdown (these prices are taken from the halfway point between the card's launch and the next generation's launch, or the present, averaged at its manufacturing peak):
(AD102 die) costs about 200-220 USD per die (roughly 18K-20K USD per wafer, including the GPU wafer tax I believe) (the enabled die is only 88.89% of the full chip)
(24GB 21Gb/s VRAM) costs about 95-120 USD for all 24GB (roughly 4-5 USD per 1GB of GDDR6X VRAM; the modules are 2GB each, 12 in total, but that doesn't impact the price by more than 5% at best)
The RTX 4090's PCB costs around 20-25 USD at best (just so you guys know, the GeForce GTX 285 had a PCB cost of 7.37 USD and the GTX 580 had a PCB cost of 7.29 USD)
The cooler probably costs around 80-100 USD to make, if not less (raw metal costs barely account for the total price; it's mainly the machining and painting (this goes for any air cooler, btw))
395-465 USD for the entire thing, excluding software R&D.
This accounts for FE coolers only (which NVIDIA has to make on THEIR end)
I'm pretty sure AIB coolers are as cheap as they can get when it comes to machining, as they have to do this every generation of GPUs


Now we have the RTX 3090 price breakdown (these prices are taken from the halfway point between the card's launch and the next generation's launch, or the present, averaged at its manufacturing peak):
(GA102 die) cost around 62.5-67.5 USD per die (roughly 5.5K-6K USD per wafer, including the GPU wafer tax I believe) (the enabled die is 97.62% of the full chip)
(24GB 19.5Gb/s VRAM) cost around 385-430 USD for all 24GB (roughly 16-18 USD per 1GB of GDDR6X VRAM; the modules are 1GB each, 24 in total, but this also doesn't impact the price by more than 5% at best)
The RTX 3090's PCB cost around 25-30 USD at best (the reason it's more is that its design was far different from the previous generation of GPUs)
The cooler probably cost around 120-135 USD at best to make (once again, it's more expensive than the 40-series cooler because it was a far different design overall compared to the previous generation)
592.5-662.5 USD for the entire thing, excluding software R&D (once again, this is for the FEs only)
The RTX 3080 10GB was around 355-400 USD to make
Please don't ask for sources; if you want sources, find them yourself, as I don't get paid for this, nor should I.

The pricing for the 40 series across the board sucks, and it gets worse after looking at this (the totals are re-checked in the quick sketch below)
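
As a quick sanity check on the arithmetic, here is a minimal sketch that simply re-adds the per-component ranges quoted above. The figures are the poster's own estimates, not verified costs, and the helper name is purely illustrative.

# Re-add the per-component BOM estimates quoted above
# (the poster's own figures, not verified costs).
def bom_range(parts):
    """Sum the (low, high) USD estimates of each component."""
    low = sum(lo for lo, _ in parts.values())
    high = sum(hi for _, hi in parts.values())
    return low, high

rtx_4090 = {"AD102 die": (200, 220), "24GB GDDR6X": (95, 120),
            "PCB": (20, 25), "FE cooler": (80, 100)}
rtx_3090 = {"GA102 die": (62.5, 67.5), "24GB GDDR6X": (385, 430),
            "PCB": (25, 30), "FE cooler": (120, 135)}

print("RTX 4090: %.1f-%.1f USD" % bom_range(rtx_4090))  # 395.0-465.0
print("RTX 3090: %.1f-%.1f USD" % bom_range(rtx_3090))  # 592.5-662.5

The sums reproduce the 395-465 USD and 592.5-662.5 USD totals stated in the post.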
Posted on Reply
#8
AusWolf
Hyderz: I guess that's just how NVIDIA is releasing the Super this time. The 2060 to 2060 Super, I believe, was a decent upgrade... but the rest of the 20 series got minor bumps in clocks, VRAM, etc.
What about the Ti versions this time? I want a 4070 Ti Super Ti! :laugh:
Posted on Reply
#9
the54thvoid
Super Intoxicated Moderator
Just crazy. Nvidia is throwing suffixes around and, in doing so, making sure they mean nothing. The 'Ti' naming is now rendered pointless by Super. Super, which for previous gens was the step above base but below Ti, now becomes the better product. Perhaps next they'll release the Ultra? 4070 Ti Super Ultra.

Someone will pop in and say: but why does that matter? It matters because a company uses a suffix to explain the product's performance. So when the company abandons that hierarchy, they make it meaningless. So if it's meaningless, why use it at all?
Posted on Reply
#10
AusWolf
the54thvoid: Just crazy. Nvidia is throwing suffixes around and, in doing so, making sure they mean nothing. The 'Ti' naming is now rendered pointless by Super. Super, which for previous gens was the step above base but below Ti, now becomes the better product. Perhaps next they'll release the Ultra? 4070 Ti Super Ultra.

Someone will pop in and say: but why does that matter? It matters because a company uses a suffix to explain the product's performance. So when the company abandons that hierarchy, they make it meaningless. So if it's meaningless, why use it at all?
This. Besides, does anybody remember the time when the Ti/Ultra suffix meant the fully unlocked chip, the better, "ultimate" version? Now you've got the 4070 Super which is not an upgraded 4070 Ti, but a demoted 4080. Not that it matters to 99% of buyers. I just find it strange that a smaller and smaller portion of Nvidia's product stack is based on a fully enabled die these days.
Posted on Reply
#12
Bwaze
Hyderz: Well, I'm genuinely surprised NVIDIA is releasing the Super refresh...
but I guess the sanctions imposed on China by the United States caused them to lose money...
I guess this is a stop-gap plan until the 50 series...
I don't think it's really meant to bring in any significant revenue. 40-series sales have not been good, but there was no chance of living up to the inflated sales of the 30-series crypto frenzy and the generally high market demand during the lockdowns (which Nvidia gladly abused by giving priority to crypto fraudsters). Also, most people are living in a recession with no visible upturn in salaries, only high inflation.

So I think this 40-series refresh is just the work of an Nvidia division that is resting on the laurels of previous generations' income and is also not under any pressure to deliver high revenue - that part is now in the hands of Data Center and their exploding sales of AI-related hardware - which has not transferred to the gaming division, contrary to Nvidia's forecasts that home AI acceleration would drive the sales of "previously just gaming (and cryptomining, of course) cards".

So we could even be seeing worse price/performance in the upcoming Super refresh, rising prices for the base models, or their discontinuation.
Posted on Reply
#13
Vayra86
the54thvoid: Just crazy. Nvidia is throwing suffixes around and, in doing so, making sure they mean nothing. The 'Ti' naming is now rendered pointless by Super. Super, which for previous gens was the step above base but below Ti, now becomes the better product. Perhaps next they'll release the Ultra? 4070 Ti Super Ultra.

Someone will pop in and say: but why does that matter? It matters because a company uses a suffix to explain the product's performance. So when the company abandons that hierarchy, they make it meaningless. So if it's meaningless, why use it at all?
Ti never meant anything. The 660ti was what exactly? A great 660? A castrated 670? And what was the 680 compared to that? The 750ti Boost... was what exactly... that's a new arch prototype, called Ti. I mean...

And it doesn't get better no matter what gen or part of the stack you look at. Ti, just like Super, never meant jack shit. Nvidia refreshed Kepler with the 700 series... that was effectively a Super line-up. The 770 was a 680 with faster memory, for example. But then Pascal got a mid-gen refresh with faster memory (11 Gbps) without any Ti or Super changes. And then there was a 780 Ti on top, the ultimate enthusiast card that is now just an x90, no Ti. But surely it meant the full die there? Nope, the 980 Ti begs to differ, as does the 1080 Ti, and... and...
the54thvoid: abandons that hierarchy
They don't right? Cards still sit right on top of one another, there is hierarchy in the stack. Cards only differentiate on performance really.
Posted on Reply
#14
AusWolf
Vayra86: Ti never meant anything. The 660ti was what exactly? A great 660? A castrated 670? And what was the 680 compared to that? The 750ti Boost... was what exactly... that's a new arch prototype, called Ti. I mean...

And it doesn't get better no matter what gen or part of the stack you look at. Ti, just like Super, never meant jack shit. Nvidia refreshed Kepler with the 700 series... that was effectively a Super line-up. The 770 was a 680 with faster memory, for example. And then there was a 780 Ti on top, the ultimate enthusiast card that is now just an x90, no Ti. But surely it meant the full die there? Nope, the 980 Ti begs to differ, as does the 1080 Ti, and... and...
Sure, but everybody always knew that Ti is better than non-Ti. Now we have Ti and Super. How do they stack up against each other? It'll confuse regular buyers for sure.

It kind of reminds me of ATi's X1800 series that had XT, XL, Pro, GT, GTO, and even GTO2 that no one could make any sense of. Or let's not even cross to the red side: Nvidia confused people way back in the GeForce 6 era with the 6800 that had a top-of-the-line Ultra and a GT below that, but also released an XT which was a renamed LE, which was basically a 6600 GT with an extra vertex shader.
Posted on Reply
#15
Vayra86
AusWolf: Sure, but everybody always knew that Ti is better than non-Ti. Now we have Ti and Super. How do they stack up against each other? It'll confuse regular buyers for sure.
True, there is that. But then again, we also had OEM versions of the same SKU. And what about the 3 GB vs 6 GB 1060? And and and. You mention ATI.

The gist has always been that you need to carefully look at what part you're getting. The fact some people are too stupid to do so doesn't change reality.
Posted on Reply
#16
the54thvoid
Super Intoxicated Moderator
Vayra86: Ti never meant anything. The 660ti was what exactly? A great 660? A castrated 670? And what was the 680 compared to that? The 750ti Boost... was what exactly... that's a new arch prototype, called Ti. I mean...

And it doesn't get better no matter what gen or part of the stack you look at. Ti, just like Super, never meant jack shit. Nvidia refreshed Kepler with the 700 series... that was effectively a Super line-up. The 770 was a 680 with faster memory, for example. But then Pascal got a mid-gen refresh with faster memory (11 Gbps) without any Ti or Super changes. And then there was a 780 Ti on top, the ultimate enthusiast card that is now just an x90, no Ti. But surely it meant the full die there? Nope, the 980 Ti begs to differ, as does the 1080 Ti, and... and...


They don't right? Cards still sit right on top of one another, there is hierarchy in the stack. Cards only differentiate on performance really.
Of course cards sit above (or below) one another. A 4090 sits above a 4060. But even those are names too. The suffixes aren't based on RGB candy from AIBs. They're given by Nvidia to delineate products. They've also meant a certain level of performance. I know what you're trying to say, but you're missing the point. I also think you'll not try to see the point. Regardless, what's the point of a suffix when you reshuffle their 'marketing' meaning? That's rhetorical - you don't need to answer.
Posted on Reply
#17
AusWolf
Vayra86: True, there is that. But then again, we also had OEM versions of the same SKU. And what about the 3 GB vs 6 GB 1060? And and and.
6 is better than 3, there's nothing confusing about that.
Vayra86: The gist has always been that you need to carefully look at what part you're getting. The fact some people are too stupid to do so doesn't change reality.
While I agree with the basic notion, not everybody knows or cares about computers as much as we do. They might consider looking into specs a waste of time, just like I consider looking into DIY plumbing a waste of time when I can just call a plumber. I don't think there's anything wrong with that, nor do I think that these people should be deceived by dubious model names.
Posted on Reply
#18
Vayra86
AusWolf: 6 is better than 3, there's nothing confusing about that.
It doesn't mention the missing shaders. It's really just a different tier of GPU altogether.
Posted on Reply
#19
AusWolf
Vayra86: It doesn't mention the missing shaders. It's really just a different tier of GPU altogether.
True, I'll give you that. Although, it brings up the question of whether a 10% difference in shaders should denote an entire product class or not.
Posted on Reply
#20
the54thvoid
Super Intoxicated Moderator
Vayra86: The fact some people are too stupid to do so doesn't change reality.
Hope that's a generalisation and not personal. The true problem is most people don't understand tech (they're not stupid). The tech-ignorant consumer tends to rely on naming conventions, so when you tear them up, it becomes an issue for them. Sure, they can do research but it doesn't change the fact Nvidia are throwing terms around that have lost any meaning.

I'll agree the worst renaming is for memory allocations or speeds, when the consumer isn't even notified. That is bad practice.
Posted on Reply
#21
Vayra86
AusWolf: True, I'll give you that. Although, it brings up the question of whether a 10% difference in shaders should denote an entire product class or not.
Look at the gap between what's out today... and the gap there will be in the Ada stack between the Tis and Supers...

It's very human to try and attach some structure or system to what we see. It's how our mind helps us understand shit. But it's a figment of our imagination. And marketing handily plays into it.

iPhone Max... Pro... Ultra. Nuff said. Every customer is extra ultra special.
the54thvoid: Hope that's a generalisation and not personal. The true problem is most people don't understand tech (they're not stupid). The tech-ignorant consumer tends to rely on naming conventions, so when you tear them up, it becomes an issue for them. Sure, they can do research but it doesn't change the fact Nvidia are throwing terms around that have lost any meaning.

I'll agree the worst renaming is for memory allocations or speeds, when the consumer isn't even notified. That is bad practice.
Of course it's a generalisation. The majority here doesn't suffer from buying the wrong product; we all know this, and you allude to that difference too. These naming schemes are built to confuse the average Joe. Why did we ever try to attach any sort of sense to it to begin with? This is a matter of perspective on how we (all) look at marketing, products and releases. A lot of things are implicit, and we turn them into something explicit and then consider it a rule or fact. It's nonsense.
Posted on Reply
#22
AusWolf
Vayra86: Look at the gap between what's out today... and the gap there will be in the Ada stack between the Tis and Supers...

It's very human to try and attach some structure or system to what we see. It's how our mind helps us understand shit. But it's a figment of our imagination. And marketing handily plays into it.

iPhone Max... Pro... Ultra. Nuff said.
That kind of answers my (hypothetical) question: Nvidia used to think that a 10% difference isn't worth introducing an entirely new product name, but now, they're throwing 5% differences at us with fancy new suffixes in hopes of more people falling for the trap and buying them. How much the face of a company can change in 7 years...
Posted on Reply
#23
Vayra86
AusWolf: That kind of answers my (hypothetical) question: Nvidia used to think that a 10% difference isn't worth introducing an entirely new product name, but now, they're throwing 5% differences at us with fancy new suffixes in hopes of more people falling for the trap and buying them. How much the face of a company can change in 7 years...
How much customers have changed. There's a new generation of stupid, it's clear, and it clearly influences the quality of product releases too.

We are our own worst enemies. Idiocracy is happening, for real. The fact Nvidia can do this in the face of an economic downturn says a lot. We, at large, need to stop pointing fingers at 'something else'. We're the architects of our own demise if we keep doing that. That lens fits almost everything today. The system is broken. I too believe in customer rights. But I also strongly believe in due diligence, and in the latter... people are screwing up big time and are paying their own damages throughout life. But they also set a norm for how companies treat them.

There is one prime example here: 'RTX'. It was nothing, and today it's still a whole lot of nothing. People buy it, despite paying a premium on underpowered hardware. I mentioned the iPhone earlier: Nvidia is selling iPhones now. Software + hardware in a holy combination of efficiency and marketing, with Nvidia as the gatekeeper to features, and people buy it. Gosh, I wonder what they'll get more of now. Let's appreciate the fact that in this move, even one generation of cards is already enough to lose feature access (DLSS3+).
Posted on Reply
#24
Legacy-ZA
And as soon as they've launched this idiotic refresh:

"Introducing the all-new RTX5090 now with DLSS 4.0, we will lock out functionality for older cards, all this for a low price of just a mere $5000, buy now!"
Posted on Reply
#25
AusWolf
Vayra86: How much customers have changed. There's a new generation of stupid, it's clear, and it clearly influences the quality of product releases too.

We are our own worst enemies. Idiocracy is happening, for real. The fact Nvidia can do this in the face of an economic downturn says a lot. We, at large, need to stop pointing fingers at 'something else'. We're the architects of our own demise if we keep doing that. That lens fits almost everything today. The system is broken. I too believe in customer rights. But I also strongly believe in due diligence, and in the latter... people are screwing up big time and are paying their own damages throughout life. But they also set a norm for how companies treat them.

There is one prime example here: 'RTX'. It was nothing, and today it's still a whole lot of nothing. People buy it, despite paying a premium on underpowered hardware. I mentioned the iPhone earlier: Nvidia is selling iPhones now. Software + hardware in a holy combination of efficiency and marketing, with Nvidia as the gatekeeper to features, and people buy it. Gosh, I wonder what they'll get more of now. Let's appreciate the fact that in this move, even one generation of cards is already enough to lose feature access (DLSS3+).
Unless Nvidia is desperate because nobody is buying the 40-series and they're trying to boost sales by playing on our real or imagined stupidity.
Posted on Reply