Monday, July 22nd 2024

NVIDIA GeForce "Blackwell" Won't Arrive Before January 2025?

It looks like 2024 will go down as the second consecutive year without a new GPU generation launch from either NVIDIA or AMD. Kopite7kimi, a reliable source for NVIDIA leaks, says that the GeForce RTX 50-series "Blackwell" generation won't debut before the 2025 International CES (January 2025). The company was earlier expected to launch at least its top two SKUs—the RTX 5090 and RTX 5080—toward the end of 2024, and ramp the series up through 2025. There is no explanation for this "delay." Like everyone else, NVIDIA could be rationing its foundry allocation of 3 nm wafers from TSMC for its high-margin "Blackwell" AI GPUs. The company now makes over five times as much revenue from AI GPUs as it does from gaming GPUs, so this development should come as little surprise.

Things aren't any different for NVIDIA's rivals in this space, AMD and Intel. AMD's RDNA 4 graphics architecture, and the Radeon RX series GPUs based on it, aren't expected to arrive before 2025. AMD is making several architectural upgrades with RDNA 4, particularly to its ray tracing hardware, and the company is expected to build these GPUs on a new foundry node. Meanwhile, Intel's Arc B-series gaming GPUs based on the Xe2 "Battlemage" graphics architecture are expected to arrive in 2025, too, although these chips are rumored to use a more mature 4 nm-class foundry node.
Sources: kopite7kimi (Twitter), Videocardz

36 Comments on NVIDIA GeForce "Blackwell" Won't Arrive Before January 2025?

#26
Minus Infinity
Oh no, now what will I spend my $3K on? Oh, I know, a life.
#27
tussinman
Onasi said: Meh, at this point with my low "needs" I might just grab a 4060 or something, just to have an up-to-date video engine and access to DLSS, and call it a day. Nothing graphics-intensive interests me anyway, and waiting with bated breath to be honored with an opportunity to spend like 800 bucks or whatever NV decides to ask for a 5070 is just pointless. I genuinely feel like checking out of the GPU game completely. It's all so tiresome.
Thinking about doing the same. A friend has offered to sell me his 12 GB 6750 XT for $220, which isn't bad considering it handles high settings at 1440p and very high settings at 1080p. Might be a decent stopgap, because right now the current-gen cards in the $250-500 range are underwhelming.
#28
dj-electric
NVIDIA has no reason to rush generational GPU releases for the consumer market anymore. They can take their time, and take it they do.
#29
ARF
dj-electric said: NVIDIA has no reason to rush generational GPU releases for the consumer market anymore. They can take their time, and take it they do.
I don't think so.
Lack of DisplayPort 2.1.
Low RT performance.
Extremely high power consumption.
Extremely large cards that sag and melt their power connectors.

Plenty of reasons for improvement.
#30
dj-electric
ARF said: I don't think so.
Lack of DisplayPort 2.1.
Low RT performance.
Extremely high power consumption.
Extremely large cards that sag and melt their power connectors.

Plenty of reasons for improvement.
No, no... there are plenty of reasons for improvement, but NVIDIA GPUs are selling well enough for them anyway.
The reasons I was referring to are not technical reasons for us; they were sales-related ones for them. I appreciate the naivety, though.

The model of coasting on an architecture for 2 years is being slowly replaced by one of doing so for 3 years, seemingly.

To touch on some of the echo-chamber points brought up, though, from a retail and user-experience perspective:
- DSC, and the effectively widespread lack of DP 2.1 availability in monitors, might not push this feature to exist next-gen. I own an FO32U2P; feel free to doubt this. The user experience is not going to change much without DSC.
- Low RT performance is a matter of perspective. Who is going to beat them at it? If the answer on this flow chart is nobody, it's not going to improve much.
- Power consumption is high, but new PSU standards and massive coolers are making this somewhat of a non-issue consumer-wise.
- Newer cases can accommodate those larger cards, and new revisions of the 12VHPWR and 12V2x6 connectors provide relatively high contact surface area. I know from quite a large personal connection to retail numbers that this is effectively such a low-percentage issue that it's no longer counted in any feasible RMA statistic. AMD is also expected to adopt this new connector in its upcoming generation.
#31
mahirzukic2
bug said: You seem to be a little confused. Companies releasing something each year doesn't mean you have to buy each year. I have only upgraded from one generation to the next when AMD released the second-gen s939 CPUs with dual cores. Other than that, if I'm not seeing a sizable (~25-50%) improvement, I'm not upgrading. More recently, I went 2500K -> 6600K -> 12600K. Didn't feel I missed anything either.
You are good, my man. I went from a 2600K (got it in 2015) to a 13700K (on release date), and a year after to a 13900K due to my demand for multi-threading.
Prior to the 2600K I had some quad-core AMD from 2009 or so. And the sole reason I went with the Intel 13 series is not that it performed better than the AMD 7000 series, but rather that it was better value, which is very ironic.

I suppose I will switch to AMD Zen 6 next year, hopefully the 10950X variant.
#32
A&P211
Zen 5 is the one coming out next year.
mahirzukic2 said: You are good, my man. I went from a 2600K (got it in 2015) to a 13700K (on release date), and a year after to a 13900K due to my demand for multi-threading.
Prior to the 2600K I had some quad-core AMD from 2009 or so. And the sole reason I went with the Intel 13 series is not that it performed better than the AMD 7000 series, but rather that it was better value, which is very ironic.

I suppose I will switch to AMD Zen 6 next year, hopefully the 10950X variant.
#33
Minus Infinity
A&P211 said: Zen 5 is the one coming out next year.
What are you talking about? Do you mean RDNA 5? Zen 5 is about to officially launch in a few days, and some people already have it.
#34
KhalidAbusaud
I think companies should focus more on optimizing software instead of churning out new hardware every single fucking year like maniacs. Not everyone has the budget to keep up with new hardware.
#35
DudeBeFishing
KhalidAbusaud said: I think companies should focus more on optimizing software instead of churning out new hardware every single fucking year like maniacs. Not everyone has the budget to keep up with new hardware.
I still haven't worked out all the quirks with my old PC, X58 motherboard with GTX 1080. Every now and then I find a driver or software setting that improves performance and stability. I feel like most hardware and software releases recently are from mediocre employees trying to justify their job rather than make something that just works.

Does a company really need multiple RTX 4080 SKUs with marginal differences in core clock? 99% of customers aren't going to care. Same with how each motherboard brand has at least five motherboards per chipset, with little to no difference in feature sets.
#36
Godrilla
64K said: Well that's a bummer, but I guess we should have seen it coming. Gaming GPUs are pretty much an afterthought to Nvidia right now. If the launch of the 5090 and 5080 is early next year, then it will be even longer before the more affordable Blackwell GPUs roll out. :(
I have seen others echoing my theory on this very topic ever since Nvidia became the new favorite blue-chip stock last year. Nvidia knows there is a possibility that the AI market will reach a saturation point (some call it the bubble), where demand is satisfied at a slower pace than Nvidia's manufacturing supply rate (simple supply and demand, especially since competition in AI is not stagnating on any front, with even Nvidia's own customers publicly saying as much; for example, Sam Altman seeking a way to make his own AI hardware, as well as others). Since no one can deny the possibility that the market will be saturated with AI hardware and satisfied due to oversupply, Nvidia has no choice but to default to plan B: the gaming market, $10.5 billion to be exact and still growing. While this is chump change in comparison to AI growth, Nvidia still controls and has a monopoly on this market, and will not likely just give up majority market share here. Nvidia could simply fall back on the gaming market once AI is saturated, to meet forecast goals. The AI market will probably have cycling saturation points; next stop, hyper-efficiency at maximum performance.
I also predicted that there would be a Titan AI card (512-bit) when the 448-bit 5090 rumors started to circulate a few months back.
How does Nvidia get gamers to pay more for an x080 Ti card? Just name it the 5090. How does Nvidia get consumers to pay more for an x090 card? Just call it the Titan AI with more VRAM.
While I do agree Nvidia's primary focus is AI, I also believe they will be ready with a spectrum of tiers of gaming graphics cards (ranging from Titan AI status to the low end) if the saturation point hits prematurely. One thing to consider: the Chinese AI market is quenching Nvidia's thirst, and Nvidia is taking the opportunity, while everyone is distracted by Trump's assassination attempt and Biden dropping out, to supply the Chinese market with Blackwell in Q4 2024 (source: Videocardz article from the past week).
I believe this market specifically is probably the reason Nvidia possibly delayed the Blackwell cards till Q1 2025.