Sunday, May 5th 2024

NVIDIA to Only Launch the Flagship GeForce RTX 5090 in 2024, Rest of the Series in 2025

NVIDIA debuted the current RTX 40-series "Ada" in 2022 and refreshed it earlier this year, which means the company is expected to debut its next generation in some shape or form in 2024. We've known for a while that the new GeForce RTX 50-series "Blackwell" could see a 2024 debut, which, going by past trends, would cover the top two or three SKUs, followed by a ramp-up the following year. We're now learning through a new Moore's Law is Dead leak, however, that the launch could be limited to just the flagship product: the GeForce RTX 5090, the SKU that succeeds the RTX 4090.

Even a launch limited to the flagship RTX 5090 would give us a fair idea of the new "Blackwell" architecture, its various new features, and how the other SKUs in the lineup could perform at their relative price points, because the launch should at least include a technical overview of the architecture. NVIDIA "Blackwell" is expected to introduce another generational performance leap over the current lineup. NVIDIA's reasons for a more conservative GeForce "Blackwell" launch could be to let the market digest inventories of the current RTX 40-series, and to accord higher priority to AI GPUs based on the architecture, which fetch the company much higher margins.
Source: Moore's Law is Dead (YouTube)

154 Comments on NVIDIA to Only Launch the Flagship GeForce RTX 5090 in 2024, Rest of the Series in 2025

#26
Waweq
They did the same with the 4090, seems like a winning strategy. Also AMD won't fight back lol
#27
Space Lynx
Astronaut
DragokarBeyond the forums and just with simple gamers, most of them don't even know what upscaling is, no matter if it's called DLSS FG, FSR FG or XEsomething. They build or buy the PC and it just works the way they built it. Most of them never ever even open any driver panel. That's "just us".
Aye, my sister recently played a game and texted me, "what do I pick, DLSS, XESS, etc" lol I am aware we are a niche crowd
#28
Outback Bronze
I wonder if it will need 2x 12V-2x6 connectors to power it :wtf: Surely hope not. Double the trouble.
#29
Minus Infinity
Space Lynxnah most people want DLSS these days I think, that being said I am really happy with my shiny new 7900 XT. Latest revision model too, it runs ice cold. Witcher 3 ultra, 1440p, 165 Hz/165 FPS: 50 °C core and 62 °C hotspot; my last 7900 XT ran about 15 °C hotter in the same test. I love this beast, was 1/3 the cost of the 4090. Ray tracing doesn't interest me, so I am happy to skip RDNA 4 and the 5090. This is a solid rig. :) I probably won't upgrade until the RTX 6090 and 11800X3D, and keep the same mobo/RAM, etc.
AMD's next-gen FSR with AI should address most concerns. Already the latest update of FSR 3.1 is a huge improvement despite lacking AI upscaling. Going forward that'll be one less thing to bash them over. And then we have the hopeful trumor about much better RT performance.
#30
nguyen
Minus InfinityAMD's next-gen FSR with AI should address most concerns. Already the latest update of FSR 3.1 is a huge improvement despite lacking AI upscaling. Going forward that'll be one less thing to bash them over. And then we have the hopeful trumor about much better RT performance.
Not only is this the wrong thread, but FSR 3.1 doesn't exist yet, so where is the "huge improvement" coming from?
#31
stimpy88
Wanna bet the only reason is they can't sell the datacenter version to the Chinese, but these "gamer" cards on the other hand... Good luck getting one if you live outside of China.
#32
TheDeeGee
Space Lynxaye, I don't care either way at the moment. I just do everything old school still whenever I can... native resolution only, no scaling, MSAA x4 and the other one x8 or x16. And then like a mixture of medium/high settings, unless it's an old game and easy to achieve max FPS for my monitor, then ultra.

This is how I have always done things and I am probably not going to change, at least not until it's forced upon me, like it is in some games, but in those games I'm like meh, it is what it is.
Good luck forcing MSAA and SSAA in DX10/11/12 titles, as that's simply not possible because of deferred shading.
#33
N/A
zo0lykas
28 Gbps GDDR7 on a 512-bit bus is 1,792 GB/s, or 1,344 GB/s in the case of 384-bit.
Also, 24,576 is the full chip; gimped to 20-22K CUDA, it's safe to predict 18,432 for a 5080 Ti, or a 75%/384-bit enabled die with 90% performance scaling. This is what most would want anyway. The 5080 is shaping up as only 50% of the 5090, a total joke. It would be a 5070 Ti at $1,199.
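For reference, those bandwidth figures follow directly from per-pin data rate times bus width; here's a quick sketch (the 28 Gbps rate and both bus widths are the rumored numbers, not confirmed specs):

```python
# Peak memory bandwidth: per-pin data rate (Gbps) x bus width (bits),
# divided by 8 to convert bits to bytes.
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(peak_bandwidth_gbs(28, 512))  # 1792.0 GB/s -- rumored full 512-bit bus
print(peak_bandwidth_gbs(28, 384))  # 1344.0 GB/s -- cut-down 384-bit config
```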
#34
TheDeeGee
Outback BronzeI wonder if it will need 2x 12V-2x6 connectors to power it :wtf: Surely hope not. Double the trouble.
Would be for the better actually, as it will balance the load.
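Back-of-the-envelope, splitting the same load across two connectors halves the current each pin carries; a minimal sketch, assuming the 600 W 12V-2x6 rating, six current-carrying pins per connector, and an ideal even split:

```python
# Per-pin current for a given board power, assuming an ideal even split
# across all current-carrying pins (6 per 12V-2x6 connector, 12 V rail).
def amps_per_pin(board_power_w: float, connectors: int,
                 volts: float = 12.0, pins_per_connector: int = 6) -> float:
    return board_power_w / (volts * connectors * pins_per_connector)

print(round(amps_per_pin(600, 1), 1))  # ~8.3 A per pin on one connector
print(round(amps_per_pin(600, 2), 1))  # ~4.2 A per pin split across two
```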
#35
john_
zo0lykas
I googled around and saw the rumors about an MCM design for the top chips and 24K CUDA cores. I really doubt it. I don't expect the 5090 to go over those 16K cores of the 4090. For gaming I am expecting just whatever comes from the better architecture, faster and more VRAM, and new and better features, meaning better software and drivers.
#36
CyberPomPom
nguyenI'm getting 80-100 FPS @ 4K path-traced + DLSS Balanced + Frame Generation in both Cyberpunk 2077 and Alan Wake 2; if the 5090 can get 2x the RT performance of the 4090 then it would sit comfortably in the 160-200 FPS range.

I'm interested in trying path tracing in Dragon's Dogma 2 and RE4 Remake too.
The thing is that DLSS Balanced + Frame Gen is about 1 "true" pixel per 6 pixels displayed, so there is that.
I know there is no such thing as a "true" pixel, and real-time 3D rendering is a matter of cheating in a visually convincing way, but with path tracing we are still far from playable at 4K and high refresh rates without cheating a bit too much.

The thing that irks me with DLSS (and FSR/XeSS) is NVIDIA getting us to call a 50-60 fps @ 1200p setting 80-100 fps @ 4K Bal+FG.
Display has not much in common with render. Why call a render setting (internal resolution) by a display configuration name (4K) that has nothing to do with it?
Playing at a lower resolution and frame generation are not new techs. DLSS is by far the best implementation, but my 15-year-old TV did it.
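For what it's worth, the 1-in-6 figure checks out under the usual assumptions of a 58% per-axis render scale for DLSS Balanced and frame generation interpolating every other displayed frame:

```python
# Rendered-to-displayed pixel ratio for DLSS Balanced + 2x frame generation,
# assuming the 58% per-axis render scale and one interpolated frame per
# rendered frame (ignoring reconstruction detail, just counting pixels).
render_scale = 0.58
rendered_fraction = render_scale ** 2 / 2  # pixels rendered per pixel displayed
print(f"1 rendered pixel per {1 / rendered_fraction:.1f} displayed")  # ~5.9
```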
#37
mama
If it's just the 5090 then definitely a paper release. No question.
#38
Vayra86
freeagentI said I would get a 5090, and I will... after I know for sure it won't try to self-immolate.
Already ran out of that 12GB?! :peace:
john_I am expecting the RTX 4090 to drop to $1500 and the RTX 5090 to introduce a new price point at $2000-$2200.
Good times ahead :p
What makes you think Nvidia won't be giving this new flagship 16 GB and an xx103 SKU now...? It's the ONLY WAY they can keep it at the low low price of $2,000. It does fit the trend of the specs being adjusted downwards. Not the TDP though; the 5090 will be able to do 1 kW in a six-slot form factor.

Can't be giving full dies to these GeForce plebs. You're being way too positive here. You don't need hardware. All you need is DLSS 4.2, which will reconstruct a game out of an AI-generated image.
#39
Daven
Minus Infinitytrumor
I’m not sure if this was intentional or a typo but I love it…

trumor = truth + rumor

Lol. Perfectly sums up leaks like this.
#40
mb194dc
Technological stagnation is here... Outside of the likes of RT, the generations since 2020 will do fine for pretty much any gaming use case. Little point in them launching newer cards when inventory from 3 or 4 years ago is still out there in the channel. In fact, in the UK you can still get new 3070s and 3080s from the big e-tailers and they're still at MSRP pretty much. It's pretty tragic, can't believe anyone actually buys them but who knows.

Which is why they're pushing RT, with upscaling, which for me just leads to an overall worse visual experience (plus other side effects), because they've got nothing to sell to people outside of that.
#41
john_
mb194dcIt's pretty tragic, can't believe anyone actually buys them but who knows.
When hardware reaches a certain point, where it is more than enough for the majority of consumers, why replace it? When looking to buy a 4 TB HDD for storage, does it matter if it is a one- or ten-year-old model? Does it matter enough that someone starts looking at specs before buying and not just the price? Does it matter if the Wi-Fi receiver uses a ten-year-old chip, as long as there are drivers for Win11? There are motherboards with audio chipsets that are 5+ years old. Does anyone care to see if they are old chips? Maybe check whether the motherboard supports 6 or 8 channels, and whether it offers a digital audio out port. Nothing more.
I do understand that we're talking about graphics here, but the majority out there doesn't really need much. AMD could be selling a boatload of RX 580s if they could market them at $100 and still make a good profit. And most people would have been totally happy with such a card.

P.S. GT 1030, GT 710, are still available. And I don't really want to mention GT 210 here....eeewwwwwww
#42
_larry
I will never buy a GPU that costs more than my mortgage.. unless I win the lottery of course lol.
#43
Sithaer
Well, that's gonna be a long wait for me, until the 5060/Ti comes out and then waiting for prices to drop a bit, or picking up a second-hand 4070 Super or something similar in 2025. :oops: 'not interested in anything higher up, way too expensive for me'
#44
nguyen
CyberPomPomThe thing is that DLSS Balanced + Frame Gen is about 1 "true" pixel per 6 pixels displayed, so there is that.
I know there is no such thing as a "true" pixel, and real-time 3D rendering is a matter of cheating in a visually convincing way, but with path tracing we are still far from playable at 4K and high refresh rates without cheating a bit too much.

The thing that irks me with DLSS (and FSR/XeSS) is NVIDIA getting us to call a 50-60 fps @ 1200p setting 80-100 fps @ 4K Bal+FG.
Display has not much in common with render. Why call a render setting (internal resolution) by a display configuration name (4K) that has nothing to do with it?
Playing at a lower resolution and frame generation are not new techs. DLSS is by far the best implementation, but my 15-year-old TV did it.
4K DLSS Performance (so 1080p internally) looks miles better than 1440p native (and at much higher FPS). Lots of tech reviewers who do pixel peeping have already concluded that 4K DLSS Quality provides equivalent static image quality to 4K native while providing 40-50% higher FPS (meaning much higher motion fluidity).

Let's see your 15-year-old TV play a game then
#45
Intervention
I'd love to see what NVIDIA is going to price this thing at upon release, given the 4090 is already a blood-sucking vampire with its sucky power connector.
#46
P4-630
Nvidia: "We have designed a new 800 Watt power connector!"....
#47
FiRe
nguyenI'm getting 80-100 FPS @ 4K path-traced + DLSS Balanced + Frame Generation in both Cyberpunk 2077 and Alan Wake 2; if the 5090 can get 2x the RT performance of the 4090 then it would sit comfortably in the 160-200 FPS range.

I'm interested in trying path tracing in Dragon's Dogma 2 and RE4 Remake too.
If you're using DLSS Balanced, you're not doing 4K path-traced; you're doing less than 1440p path-traced, basically.
#48
Onasi
So that, if the rumor is true, would indicate that the mid-range (i.e. cards that actually matter) would drop at the end of spring 2025 at best. Cool, I guess there are a lot of 4060s and 4060 Tis to sell still. Great. Maybe AMD and Intel could actually react to this and release good $200-400 options while NV jerks itself off with how much money they make on AI before that bubble inevitably bursts. They won't though, probably, since Intel still has ways to go and a drunk hobo and his three-legged dog are more competent than current Radeon leadership. Sadge.
Sorry, that was unnecessarily snarky of me. True, though.
#49
Chaitanya
john_Well, people who pay for the top model (I see you have the 4090 and I am guessing you will be one of the first to go for a 5090) usually sell their current card months before the new one becomes available. So, I am guessing in the following months many will start putting up ads for their 4090s at prices over $1400, sell them (easily, thanks to the AI craze), stay with a mid-range GPU until the 5090 comes out, and enjoy an upgrade (plus a full new warranty) for less than $1000.
While GPUs are overpriced, one of the advantages of going RTX is that the prices stay more or less stable over the years.
A quick scan of Newegg points to the cheapest 4090 at $1,699, with most models going upwards of that. Ideally, those who want a 4090 should just wait for the 5090 to launch and then get a good used 4090 at a decent discount.
www.newegg.com/p/pl?N=100007709%20601408874%208000%204814&Order=1
P4-630Nvidia: "We have designed a new 800 Watt power connector!"....
www.techpowerup.com/319242/nvidia-rtx-50-series-blackwell-to-debut-16-pin-pcie-gen-6-power-connector-standard
#50
neatfeatguy
Space LynxAye, my sister recently played a game and texted me, "what do I pick, DLSS, XESS, etc" lol I am aware we are a niche crowd
My brother was looking through the graphics settings of a game we were about to start playing a couple weeks back and asked me what DLSS is. I explained it to him and he said, "That's stupid, why would anyone want to use that?" He asked if I used it; I just laughed and told him no, that I don't have a need nor a want to use it.

I'm just waiting to hear Nvidia claim they have a "new" way to make more frames, or better downscale-to-upscale rendering, in a new AI software version that only works with the 50xx series.