Monday, July 22nd 2024
NVIDIA GeForce "Blackwell" Won't Arrive Before January 2025?
It appears that 2024 will go down as the second consecutive year without a new GPU generation launch from either NVIDIA or AMD. Kopite7kimi, a reliable source for NVIDIA leaks, says that the GeForce RTX 50-series "Blackwell" generation won't debut before the 2025 International CES (January 2025). It was earlier expected that the company would launch at least its top two SKUs—the RTX 5090 and RTX 5080—toward the end of 2024, and ramp up the rest of the series through 2025. There is no stated explanation for this "delay." Like everyone else, NVIDIA could be rationing its foundry allocation of 3 nm wafers from TSMC for its high-margin "Blackwell" AI GPUs. The company now makes over five times the revenue from selling AI GPUs than it does from gaming GPUs, so this development should come as little surprise.
Things aren't any different with NVIDIA's rivals in this space, AMD and Intel. AMD's RDNA 4 graphics architecture and the Radeon RX series GPUs based on it, aren't expected to arrive before 2025. AMD is making several architectural upgrades with RDNA 4, particularly to its ray tracing hardware; and the company is expected to build these GPUs on a new foundry node. Meanwhile, Intel's Arc B-series gaming GPUs based on the Xe2 "Battlemage" graphics architecture are expected to arrive in 2025, too, although these chips are rumored to be based on a more mature 4 nm-class foundry node.
Sources:
kopite7kimi (Twitter), Videocardz
36 Comments on NVIDIA GeForce "Blackwell" Won't Arrive Before January 2025?
Lack of DisplayPort 2.1.
Low RT performance.
Extremely high power consumption.
Extremely large, that sag, and melt their power connectors.
Plenty of reasons for improvement.
The reasons I was referring to are not technical reasons for us - they were sales related ones for them. I appreciate the naivety, though.
The model of coasting on an architecture for 2 years is being slowly replaced by one of doing so for 3 years, seemingly.
To touch on some of the echo-chamber points brought up, from a retail and user experience perspective, though:
- DSC and the effective widespread lack of DP 2.1 availability in monitors mean this might not change next gen. I own an FO32U2P, feel free to doubt this. The user experience is not going to change much as long as DSC is doing the heavy lifting.
- Low RT performance is a matter of perspective. Who is going to beat them at it? If the answer on this flow-chart is nobody, it's not going to improve much.
- Power consumption is high, but new PSU standards and massive coolers are making this somewhat of a non-issue consumer-wise.
- Newer cases can accommodate those larger cards, and new revisions of 12VHPWR and 12V-2x6 provide relatively high contact surface area in the connectors. I know from quite a large personal connection to retail numbers that this is effectively such a low-percentage issue that it no longer registers in any meaningful RMA statistic. AMD is also expected to adopt this new connector in its upcoming generation.
Prior to the 2600K I had some quad-core AMD from 2009 or so. And the sole reason I went to Intel 13th Gen is not that it performed better than AMD's 7000 series, but rather that it was better value, which is very ironic.
I suppose I will hopefully switch to AMD Zen 6 next year, the 10950X variant.
Does a company really need multiple RTX 4080 SKUs with marginal differences in core clock? 99% of customers aren't going to care. Same with how each motherboard brand has at least five motherboards per chipset, with little to no difference in feature sets.
I also predicted that there would be a TITAN ai card (512-bit) when the 448-bit 5090 rumors started to circulate a few months back.
How does Nvidia get gamers to pay more for a x080ti card? Just name it the 5090. How does Nvidia get consumers to pay more for a x090 card? Just call it the Titan ai with more vram.
While I do agree Nvidia's primary focus is AI, I also believe they will be ready with a spectrum of gaming graphics card tiers (ranging from Titan ai status down to the low end) if the saturation point hits prematurely. One thing to consider is that the Chinese AI market is satisfying Nvidia's thirst, and Nvidia is taking the opportunity, while everyone is distracted by Trump's assassination attempt and Biden dropping out, to supply the Chinese market with Blackwell in Q4 2024 (source: Videocardz article from the past week).
I believe this market specifically is probably the reason that Nvidia possibly delayed the Blackwell cards till Q1 2025.