Thursday, July 27th 2023

NVIDIA Cancels GeForce RTX 4090 Ti, Next-Gen Flagship to Feature 512-bit Memory Bus

NVIDIA has reportedly shelved plans, at least in the short term, to release the rumored GeForce RTX 4090 Ti flagship graphics card, according to Kopite7kimi, a reliable source of NVIDIA leaks. The card had been extensively leaked over the past few months as featuring a cinder block-like 4-slot thickness and a unique PCB that lies along the plane of the motherboard rather than perpendicular to it. From the looks of it, sales in the high-end/halo segment are too slow and the competition too weak: the current RTX 4090 remains the fastest graphics card you can buy, and the company seems unfazed by the alleged Radeon RX 7950 series, given that AMD has already maxed out the "Navi 31" silicon and there are only so many things the red team can try to beat the RTX 4090.

That said, the company is reportedly planning more SKUs based on the AD103 and AD106 silicon. The AD103 powers the GeForce RTX 4080, which nearly maxes it out. The AD104 has been maxed out by the RTX 4070 Ti, and there could be a gap between the RTX 4070 Ti and the RTX 4080 that AMD could try to exploit by competitively pricing its RX 7900 series and certain upcoming SKUs. This creates scope for new SKUs based on a cut-down AD103 and the GPU's 256-bit memory bus. The AD106 is nearly maxed out by the RTX 4060 Ti; however, there's still room to unlock its last remaining TPC, use faster GDDR6X memory, and attempt to narrow the wide gap between the RTX 4060 Ti and the RTX 4070.
In related news, Kopite7kimi also claims that NVIDIA's next-generation flagship GPU could feature a 512-bit wide memory interface, in what could be an early hint that the company is sticking with GDDR6X (currently as fast as 23 Gbps) rather than transitioning over to the GDDR7 standard, which starts at 32 Gbps and offers roughly double the speed of GDDR6.
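To put the rumored 512-bit bus in perspective, peak memory bandwidth is simply bus width times per-pin data rate. The minimal sketch below compares the RTX 4090's known memory configuration against two hypothetical 512-bit setups using the GDDR6X and GDDR7 speeds quoted above; the 512-bit rows are illustrative assumptions based on the rumor, not confirmed specifications.

# Back-of-the-envelope peak memory bandwidth: bus width (bits) x data rate (Gbps) / 8.
# The RTX 4090 row uses its known specs; the 512-bit rows are hypothetical,
# based only on the rumored bus width and the per-pin speeds mentioned above.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

configs = {
    "RTX 4090 (384-bit, 21 Gbps GDDR6X)":          (384, 21.0),
    "Rumored flagship (512-bit, 23 Gbps GDDR6X)":  (512, 23.0),
    "Same flagship with GDDR7 (512-bit, 32 Gbps)": (512, 32.0),
}

for name, (width, rate) in configs.items():
    print(f"{name}: {peak_bandwidth_gb_s(width, rate):.0f} GB/s")

Even staying on 23 Gbps GDDR6X, the wider bus alone would yield roughly 1,472 GB/s, about 46% more than the RTX 4090's 1,008 GB/s, which may explain why NVIDIA could afford to skip GDDR7 for another generation.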
Sources: VideoCardz, kopite7kimi (Twitter), kopite7kimi (Twitter)

75 Comments on NVIDIA Cancels GeForce RTX 4090 Ti, Next-Gen Flagship to Feature 512-bit Memory Bus

#26
kondamin
Well yeah, no one is buying consumer-grade stuff; all the money is in AI
Posted on Reply
#27
Vayra86
Haha these 'leakers' have created their own full-circle 'oh rumor' 'oh cancelled' bullshit train.

And you're all buying it because there's a pic of a 4-slot nondescript piece of metal? Pffff

I bet kopite7 has bought a nice number of 4090s on you guys already
Posted on Reply
#28
phanbuey
TheLostSwede: Maybe even the reinforced slots couldn't handle it?
Either that, or it was cracking the T-joint that connected the board to the slot... there are so many things that can go wrong when you hang a brick, connected by a tiny T-PCB, off your motherboard.
Posted on Reply
#29
Chry
Maybe their manufacturing is too busy baking those 10 million chips for OpenAI?
Posted on Reply
#30
Mister300
Nothing to see here, move along. AMD had a 512-bit card in 2015, my XFX 390X. And they had the Fury X too, with HBM.
Posted on Reply
#31
Assimilator
TheLostSwede: Maybe Nvidia figured no-one wanted to pay $3,000 for a consumer graphics card?
You think a 512-bit bus is gonna be cheap?
ARF: It depends on AMD. If it releases the long-awaited Radeon RX 7950 XTX, then NVIDIA will have to do something.


www.techpowerup.com/gpu-specs/radeon-rx-7950-xtx.c3980
Performance estimated based on architecture, shader count and clocks.

But then, literacy has never been your strong suit.
Posted on Reply
#32
Dan.G
Plans for a new Titan, perhaps?
I wonder what the power draw will be with that 512-bit bus...
Better have a fire extinguisher at the ready!
Posted on Reply
#33
QUANTUMPHYSICS
GOOD. I'd hate to see all these "kids" running out to make yet another unnecessary/expensive upgrade.

Just wait for the 5000 series.

BETTER YET: wait for games that actually justify the upgrade.
Posted on Reply
#34
wheresmycar
btarunr: ...and there could be a gap between the RTX 4070 Ti and the RTX 4080...
That's what I'm hoping to see too. A 16 GB 4080 GRE (golden reptile edition).

tbf, Nvidia already has an approachable stack of products. They just need to open the doors with big price corrections and let everyone inside. The 4080 for $800, although expensive, would be an acceptable start.

NVIDIA Reportedly Cans "RTX 4090 Ti,"

I can't get over the 4070 Ti's price; the rest is just in the wind
Posted on Reply
#35
kondamin
QUANTUMPHYSICS: GOOD. I'd hate to see all these "kids" running out to make yet another unnecessary/expensive upgrade.

Just wait for the 5000 series.

BETTER YET: wait for games that actually justify the upgrade.
What gives you the idea the 5000 series will be affordable/good for gamers?
Posted on Reply
#36
wheresmycar
kondamin: What gives you the idea the 5000 series will be affordable/good for gamers?
Yep, been there, done that! Oh, the optimism! Every gen-up is just another slap in the face and a disappointment for the wallet. For a while, the optimism returned with AMD climbing the ladder, but that too was quickly shot down with those top-tier 60/70-series.

Word of advice for the optimists: keep your resolutions small, keep your image count moderately smooth and throw epic/ultra in the bin. That will get you close to affordable.
Posted on Reply
#37
pavle
512-bit, eh? From one extreme (128-bit on the 4050/4060/Ti) to the other! Looks like another GTX 280. Adios My Dineros might just bring out another HD 4870 (small, but zippy), if they get their heads out of their behinds.
Posted on Reply
#38
mechtech
And I was so looking forward to a $5000 gaming card ;)
Posted on Reply
#39
Darmok N Jalad
Noreng: The solution is obvious: crank the power limit to 750 W (5x 8-pin), and stack some V-Cache on the MCDs.
If they used 3D V-Cache, if that's even possible, they may not even need insane clocks. It would only help if Navi was starved for bandwidth, though.
pavle: 512-bit, eh? From one extreme (128-bit on the 4050/4060/Ti) to the other! Looks like another GTX 280. Adios My Dineros might just bring out another HD 4870 (small, but zippy), if they get their heads out of their behinds.
It was actually the 3870 where AMD started the small-and-zippy approach. I had one and it was one heckofa card. Bioshock looked so good.
Posted on Reply
#40
N/A
pavle: 512-bit, eh? From one extreme (128-bit on...
It's not extreme, but a gradual incrementation of the bus:

4090 384-bit -> 5090 512-bit.
4070 192-bit -> 5070 256-bit.

That's what we wanted anyway, but we're not getting it until Ada-next on the same node, since N3 is not needed for that. 512-bit means a big die, double the size of AD103.
Posted on Reply
#41
Minus Infinity
"This creates scope for new SKUs based on cut-down AD103 and the GPU's 256-bit memory bus."

The 4070 series was planned to be on AD103 all along and was changed to AD104 late in the game. Most likely because Nvidia wanted to move specs down a tier but raise prices a tier, and people would gladly pay. That has backfired spectacularly, not that they give a fcuk.

Tom's Hardware is saying these new cards will probably be China-only, though.
Posted on Reply
#42
jigar2speed
Prima.Vera: That 4-slot abomination must be the winner of the World's Ugliest Video Card.

That reminded me of another monster, but winner of the World's Coolest Video Card:
What a beauty
Posted on Reply
#43
TheLostSwede
News Editor
Assimilator: You think a 512-bit bus is gonna be cheap?
£3.50?
Posted on Reply
#44
80251
QUANTUMPHYSICS: GOOD. I'd hate to see all these "kids" running out to make yet another unnecessary/expensive upgrade.

Just wait for the 5000 series.

BETTER YET: wait for games that actually justify the upgrade.
Metro Exodus Enhanced Edition already puts the stick on my RTX 4090. If I set the shading quality to anything > high (and everything else maxed, incl. RTX), it can't maintain 60FPS @ 4K.

So there will be no full die Ada GPU, outside of HPC/AI industrial applications? Or will Titan Ada still be released to the consumer market for big, big $$$!?
Posted on Reply
#46
QUANTUMPHYSICS
80251: Metro Exodus Enhanced Edition already puts the stick on my RTX 4090. If I set the shading quality to anything > high (and everything else maxed, incl. RTX), it can't maintain 60FPS @ 4K.

So there will be no full die Ada GPU, outside of HPC/AI industrial applications? Or will Titan Ada still be released to the consumer market for big, big $$$!?
As I said: wait for the 5090!!!
Posted on Reply
#47
pavle
Darmok N Jalad: ...
It was actually the 3870 where AMD started the small-and-zippy approach. I had one and it was one heckofa card. Bioshock looked so good.
Indeed; however, the Radeon HD 4870 was the one that whooped the GTX 280's behind. Just look at its shading power (old but honest benchmark).

Posted on Reply
#48
Unregistered
Meh, it would have been unobtanium anyway based on price if it were a thing.
Darmok N Jalad: It was actually the 3870 where AMD started the small-and-zippy approach. I had one and it was one heckofa card. Bioshock looked so good.
I loved my Toxic 3870 so much back in the day I bought a second :D
That was really the only foray I had with CF though; after that I just stuck with single cards.
TheLostSwede: £3.50?
Tree-fiddy*
Posted on Reply
#49
AnotherReader
crlogic: Wiz's review shows you can gain about 7% from an overclock on the 7900 XTX, so I'm not sure how they're going to pull off a 30% uplift when they've already run out of core. Here's hoping though!
The non-reference 7900 XTX SKUs show more benefit from overclocking: 14 to 15%. Still, that isn't enough to catch up to the 4090, which can be overclocked too. Nvidia knows this, and that's why they don't need to waste fully functional AD102 dies on a 4090 Ti.
Posted on Reply
#50
sethmatrix7
HBSound: What is sad is that GPUs are getting more expansive. GPUs are getting larger and larger, and cell phones are getting smaller and smaller!
It is sad not to see a good redesign compress the modern-day GPU. At this rate, every time the GPU is updated (2 years max) you have to get a new case and PSU.
Increasing performance requires additional size and power as we reach the limits of shrinking the silicon. There are options available that consume less power and/or come in smaller form factors.
Posted on Reply