Sunday, May 5th 2024

NVIDIA to Only Launch the Flagship GeForce RTX 5090 in 2024, Rest of the Series in 2025

NVIDIA debuted the current RTX 40-series "Ada" in 2022, and having refreshed it earlier this year, the company is expected to debut its next generation in some shape or form in 2024. We've known for a while that the new GeForce RTX 50-series "Blackwell" could see a 2024 debut, which, going by past trends, would cover the top two or three SKUs, followed by a ramp-up the following year. We're now learning through a new Moore's Law is Dead leak, however, that the launch could be limited to just the flagship product: the GeForce RTX 5090, the SKU that succeeds the RTX 4090.

Even a launch limited to the flagship RTX 5090 would give us a fair idea of the new "Blackwell" architecture, its various new features, and how the other SKUs in the lineup could perform at their relative price points, because the launch could at least include a technical overview of the architecture. NVIDIA "Blackwell" is expected to introduce another generational performance leap over the current lineup. The reasons NVIDIA is going with a more conservative launch of GeForce "Blackwell" could be to allow the market to digest inventories of the current RTX 40-series, and to accord higher priority to AI GPUs based on the architecture, which fetch the company much higher margins.
Source: Moore's Law is Dead (YouTube)

154 Comments on NVIDIA to Only Launch the Flagship GeForce RTX 5090 in 2024, Rest of the Series in 2025

#126
TheinsanegamerN
I mean, that's fine with me. Further extends the life of my 6800xt.
Posted on Reply
#127
DemonicRyzen666
wolfI can't recall a single time, tbh. He very begrudgingly feels forced to admit (because Tim's upscaling testing conclusively bears it out) that DLSS is the superior upscaling tech, if that matters to you as a buyer, and routinely says RT shouldn't matter to most buyers looking at the midrange or lower end of the market.
He keeps making that comment about DLSS being superior, yet he has no physical evidence that it's even true.
If someone wants to claim something like that, they need to use a scientific method to prove it.
Every time he makes this statement, he says it's from his own eyes; therefore it's nothing but the opinion of the reviewer, not fact.
Posted on Reply
#128
GhostRyder
RedelZaVednoI'm betting on $1999 IF Jensen doesn't get carried away. IF he does, MSRP will likely be $2199 or higher.
You may be right. I was thinking he might do a slow burn, but it's going to come down to how they view the XX90-series cards behind the scenes. Most people buying those are streamers who use them for their business and can afford ridiculous high-end upgrades now that those cards have hit those pricing brackets. It's really hard to justify spending that much money on a GPU; heck, I fought with myself over spending $1K on a GPU years ago after spending $550 on the top-end card (I bought 3, so I guess I did spend that much LOL, but it's hard for me to do that for one card).

I think if they decide it's just for streamers/people whose job relies on it, they will push it to $2K, especially if there is nothing else close. However, if they worry about upsetting the market or seeing backlash, they may do $1,799 or $1,899.
Posted on Reply
#129
HOkay
DemonicRyzen666He keeps making that comment about DLSS being superior, yet he has no physical evidence that it's even true.
If someone wants to claim something like that, they need to use a scientific method to prove it.
Every time he makes this statement, he says it's from his own eyes; therefore it's nothing but the opinion of the reviewer, not fact.
...no evidence except all the video he shows where DLSS looks better than FSR? There's a chance he's got loads of video showing FSR looking better which he's hiding, but I doubt it. Have you compared the two yourself? It varies massively from game to game, but DLSS almost always looks a little bit better to me. It's subjective of course; DLSS usually has more of a vaseline-smear look, compared to FSR's frequent failures to deal with aliased objects.
I don't use either unless I really need to for the frame rate I want, but if there's options for DLSS & FSR I'll pretty much always prefer DLSS.

What scientific method could you possibly use for comparisons? Surely which one looks better when you're playing a game is what matters?
john_They've been pros for years. They just go with the flow. And as with most of the tech press, they will use harsh titles for AMD, but will be much more careful when having to talk about Nvidia or Intel.
Your example proves it.
They use the word "scam" for the 5700. They just avoided using the same word for the RTX 3050 6GB. In fact, if I remember correctly, they praised it as the best card at 75 W. They also avoided using that word for Intel's fiasco with its CPUs. You seem not to understand that the title of a video is much more influential than some excuses hidden inside it. Many tech sites do this constantly. Many tech sites keep certain companies happy by putting the right titles on their videos/articles.

So let's recap. A model with a different name is a scam because it also comes with different specs. On the other hand, a model with worse specs and the same name is not. CPUs that become unstable and degrade after a short time in use aren't a scam either. They are just Intel's mess. A mistake, but not a scam. Something that can be fixed. Not a scam. Selling CPUs at frequencies that aren't stable for at least the warranty period is not a scam. Selling a graphics card with fewer CUDA cores, at lower frequencies, under the same model name, is not a scam. Selling a CPU with less cache as a DIFFERENT model is a scam. Am I saying it correctly?

Tim has been praising DLSS and trashing FSR in every video he's made for the last year. He will lower the video speed to 25%, zoom in 4x, and call any FSR pixel that is not at the same position the DLSS algorithm puts it in "in the wrong position." As for Steve, I will repeat: he goes with the flow. Praising a 6-core/6-thread Intel CPU based on 720p benchmarking in games that couldn't use more than 4 threads against the R5 1600, then a few years later praising the R5 1600 over that Intel CPU, and the last couple of years praising Intel again, with chips like the 7800X3D being exceptions, because, let's face it, even Intel in a public review would have had to admit that the 7800X3D is one of the top options for gaming.


Blackwell for servers is a dual-die chip. I will be surprised if Nvidia sells a dual-die chip to gamers, except if it goes for a full AI push in the retail market and sells the 5080 with one full Blackwell die, then comes later with a dual-die, dual-memory 5090 at $3,000, with the second die offering some extra performance/features in AI and other cases. While I can't think of something specific, I bet a $2.5 trillion company will think of something.


LOL nice one.
I watch a lot of HWUB, enough that I am subbed to their Floatplane to give them a little bit of income. I don't _always_ agree with their conclusions, but the vast majority of the time I do. Their conclusions aside, they are very clear about their test methods and their results, so I can form my own conclusions anyway. You seem more obsessed with their thumbnail choices than their data. I can honestly say I don't even notice what's in any thumbnails other than the title these days. You could be right that thumbnails have a large effect on people's opinions; I'm honestly not sure either way on that one. They could inadvertently be causing bias with their thumbnails, but I believe they're just trying to produce clickbaity thumbnails & titles to get more viewers. That's the name of the game on YouTube, unfortunately!
Posted on Reply
#130
wolf
Better Than Native
DemonicRyzen666He keeps making that comment about DLSS being superior, yet he has no physical evidence that it's even true.
If someone wants to claim something like that, they need to use a scientific method to prove it.
Every time he makes this statement, he says it's from his own eyes; therefore it's nothing but the opinion of the reviewer, not fact.
What physical evidence do you want? It's a consensus across every publication that tests upscaling; they show side-by-sides and you can literally see the difference. And it's plainly evident they're all correct, from the perspective of gamers who can use them all.
Posted on Reply
#131
DemonicRyzen666
wolfWhat physical evidence do you want? It's a consensus across every publication that tests upscaling; they show side-by-sides and you can literally see the difference. And it's plainly evident they're all correct, from the perspective of gamers who can use them all.
Edge AI:
A side-by-side comparison often involves a degree of subjectivity, as it relies on the viewer’s perception and interpretation of the images. It can provide a visual representation of the differences between two images, but it might not offer quantifiable or factual proof of superiority.

For example, one viewer might prefer an image with more vibrant colors, while another might prefer an image with more realistic colors. Therefore, the “better” image can vary depending on individual preferences.

If you’re looking for a more objective or factual comparison, you might want to consider methods that involve measurable criteria, such as pixel count, color accuracy, or contrast ratio. However, even these methods might not capture all aspects of what makes an image “better” to a particular viewer.

In conclusion, while side-by-side comparisons can be useful, they often involve a degree of subjectivity and might not provide definitive proof of one image’s superiority over another. It’s always important to consider the context and purpose of the comparison.
subjectivity: the quality of being based on or influenced by personal feelings, tastes, or opinions.
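
For what it's worth, the "measurable criteria" that answer mentions do exist in image-quality testing. Below is a minimal sketch in Python (assuming scikit-image is installed, and hypothetical matched screenshots "native.png", "dlss.png", and "fsr.png" of the same frame) that scores each upscaler against a native-resolution reference with two standard metrics, PSNR and SSIM:

from skimage.io import imread
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Hypothetical captures of the same frame; all images must share dimensions.
reference = imread("native.png")
candidates = {"DLSS": imread("dlss.png"), "FSR": imread("fsr.png")}

for name, img in candidates.items():
    psnr = peak_signal_noise_ratio(reference, img)                 # higher = closer to native
    ssim = structural_similarity(reference, img, channel_axis=-1)  # 1.0 = identical
    print(f"{name}: PSNR = {psnr:.2f} dB, SSIM = {ssim:.4f}")

Numbers like these take individual eyes out of the loop, though they still miss temporal artifacts such as shimmering, which is why reviewers also lean on slowed-down video comparisons.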
Posted on Reply
#132
wolf
Better Than Native
@DemonicRyzen666 I see your point; of course it relies on people's individual eyes, but in this instance I don't think you'll get any harder evidence than all of the tech press coming to the same conclusion, bolstered by the vast majority of people who have RTX hardware, and thus can view all solutions in action, saying the same thing. It's a consensus for a reason, and in the absence of physical evidence, this is an acceptable scientific process for something visual: take a large sample size of individuals and industry experts, record all their results, and see what pattern emerges.

You're free to set the criteria for you personally accepting a given outcome, but I doubt it's going to convince anyone else to only accept a conclusion based on your criteria.

Out of curiosity, when you look at comparisons by the likes of HUB or DF, do you not see the difference? They zoom in and slow down the footage to make sure you look exactly where the artefacts/comparative differences are, but when gaming and viewing the live render with your own eyes, these artefacts and differences are easy to spot; most draw my eye right to them.
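
That "large sample size, see what pattern emerges" process can also be made statistically rigorous. Here is a minimal sketch (assuming SciPy, with made-up vote counts) of how a blind A/B preference poll could be scored with a binomial test:

from scipy.stats import binomtest

# Hypothetical blind poll: viewers pick whichever clip they prefer,
# without knowing which upscaler produced it.
votes_for_a = 78
total_votes = 100

# Null hypothesis: no perceptible difference, i.e. each vote is a coin flip.
result = binomtest(votes_for_a, total_votes, p=0.5, alternative="greater")
print(f"{votes_for_a}/{total_votes} preferred A; p-value = {result.pvalue:.6f}")

A tiny p-value would mean the preference is very unlikely to be random noise, which is about as close to hard evidence as a subjective visual comparison can get.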
Posted on Reply
#133
nguyen
@DemonicRyzen666
So when people tell you a chicken wing is much better than a turd, they are not being objective but rather subjective? :roll: Weird logic there, man. Also, a copy-paste answer from Edge AI? What the heck.
Posted on Reply
#134
AGlezB
I was thinking about nVidia releasing the 5080 first and wondering what would motivate me to do the same if I were in their position. I found a simple answer.

If I release the 5090 first I'll get all the buyers looking for 5090s and maybe some of the 5080 buyers as well. Nothing new here.

But if I first release a limited batch of 5080s and immediately after release the 5090s, I'll sell all the 5080s, but I will also leave a lot of potential 5080 buyers waiting.
That alone pushes a lot of 5080 buyers to upgrade their buying decision and take the 5090. There are several reasons for it, but creating scarcity to guide buyers is also nothing new.

That explanation works if nVidia is trying to maximize profits in the RTX market, but there is another, maybe more likely explanation: they'll concentrate their production on 5080s and use most of the production capacity allocated for 5090s (which require better binning) for AI and other enterprise products, which are way more profitable for them. They'll still sell pretty much every 5080 they put on the shelves, and by making the 5090 a rarer product, its perceived value increases, so buyers will be more willing to pay for it. It's a win-win for them.

This is complicated and I certainly don't have a crystal ball. We'll see in a few months how it goes.
Posted on Reply
#135
HOkay
AGlezBI was thinking about nVidia releasing the 5080 first and wondering what would motivate me to do the same if I were in their position. I found a simple answer.

If I release the 5090 first I'll get all the buyers looking for 5090s and maybe some of the 5080 buyers as well. Nothing new here.

But if I first release a limited batch of 5080s and immediately after release the 5090s, I'll sell all the 5080s, but I will also leave a lot of potential 5080 buyers waiting.
That alone pushes a lot of 5080 buyers to upgrade their buying decision and take the 5090. There are several reasons for it, but creating scarcity to guide buyers is also nothing new.

That explanation works if nVidia is trying to maximize profits in the RTX market, but there is another, maybe more likely explanation: they'll concentrate their production on 5080s and use most of the production capacity allocated for 5090s (which require better binning) for AI and other enterprise products, which are way more profitable for them. They'll still sell pretty much every 5080 they put on the shelves, and by making the 5090 a rarer product, its perceived value increases, so buyers will be more willing to pay for it. It's a win-win for them.

This is complicated and I certainly don't have a crystal ball. We'll see in a few months how it goes.
I think they generally go for the opposite approach though, release the 5090 first & tempt all those potential 5080 buyers into stepping up to a 5090 because it's right there & they could have it right now if they just pay a little (a lot!) more. I imagine that's probably the best tactic for pushing buyers up the stack since they keep doing it that way. Keeping the best silicon for AI products does complicate things though, they could give us something lower in the stack, or they might even just give us a much more cut down 5090 :oops:
Posted on Reply
#136
stimpy88
nguyen@DemonicRyzen666
So when people tell you a chicken wing is much better than a turd, they are not being objective but rather subjective? :roll: Weird logic there, man. Also, a copy-paste answer from Edge AI? What the heck.
You obviously have never owned a dog. I'd never bet my life on which one of these it would choose!
Posted on Reply
#137
AGlezB
HOkayI think they generally go for the opposite approach though, release the 5090 first & tempt all those potential 5080 buyers into stepping up to a 5090 because it's right there & they could have it right now if they just pay a little (a lot!) more. I imagine that's probably the best tactic for pushing buyers up the stack since they keep doing it that way. Keeping the best silicon for AI products does complicate things though, they could give us something lower in the stack, or they might even just give us a much more cut down 5090 :oops:
Remember, they can't sell 5090s in China. What they did with the 4090 was cut it down and sell it as the 4090D, which means they had to leave profit on the table. Plus, TSMC's fab capacity is sold all the way through 2027, so they have to make do with the capacity they currently have. The AI gold rush just makes it more profitable to go all-in on enterprise products, because there is a waitlist for those. As I see it, the only reason nVidia has to keep pushing products in the consumer market is to keep the competition at bay, because the real money is in the enterprise market. That is also why they will not lower their prices: as long as they're competitive against Intel and AMD, they don't care about the rest.
Posted on Reply
#138
john_
HOkayYou seem more obsessed with their thumbnail choices
It's an indication of when a reviewer is biased, either by fear or for gain. JMO. When a publication uses strong wording against company A while being more careful when talking about company B or C, you know that, no matter their data, they are NOT 100% objective. Now, HU is famous for their tests, so even I will accept that their data will be accurate. They have to, or they're done. But Steve's wording in his conclusion will be carefully biased, based on which company's hardware is getting tested and what the findings are. And then there is Tim. He just can't hide it. Are there openings at Nvidia this period?
Posted on Reply
#139
AGlezB
john_But Steve's wording in his conclusion will be carefully biased, based on which company's hardware is getting tested and what the findings are. And then there is Tim. He just can't hide it. Are there openings at Nvidia this period?
The whole point of subscribing to a YouTube tech channel is to hear what they think about stuff, a.k.a. what their bias is. Complaining about it is like going to the circus and complaining about the clowns.

There are no unbiased youtube channels.
There are no unbiased instagram reels.
There are no unbiased facebook posts.
Even scientific publications get their fair share of bias no matter how hard they try to keep it out.
And as a result there are, so far, no unbiased LLMs (AIs) because they're trained with biased datasets.

Complaining about bias makes no sense to me. That is the way to an early anxiety disorder.
Posted on Reply
#140
john_
AGlezBThe whole point of subscribing to a YouTube tech channel is to hear what they think about stuff, a.k.a. what their bias is. Complaining about it is like going to the circus and complaining about the clowns.
Sorry, I can't understand this line as an argument.
AGlezBThere are no unbiased youtube channels.
There are no unbiased instagram reels.
There are no unbiased facebook posts.
Your point? Other than agreeing that HU is not totally objective?
AGlezBEven scientific publications get their fair share of bias no matter how hard they try to keep it out.
And as a result there are, so far, no unbiased LLMs (AIs) because they're trained with biased datasets.
Scientific publications need bias when the people writing them can't explain everything yet try to convince others that they can.
LLMs... How did we get from reviews to LLMs, compilers, and whatever? Please, focus.
AGlezBComplaining about bias makes no sense to me. That is the way to an early anxiety disorder.
When it makes sense to you, you'll be heading in the right direction. And no, it has nothing to do with an "anxiety disorder". How did that become an argument anyway? Personal experience?
We are not brainless drones who can't criticize the stuff we see around us. It's just human behavior.
Posted on Reply
#141
AGlezB
john_Sorry, I can't understand this line as an argument.
It's very simple. Water is wet, fire is hot, circuses have clowns, humans have biases.

You talk about objectivity and the lack of it in a medium that is by definition biased.
That means complaining about their bias has exactly the same value as complaining that water is wet, fire is hot, or circuses have clowns.

This conversation wouldn't have gone past the first post if you had just said "I don't like them" instead of going out of your way to justify your own biases against them.
You see, this whole time I've been trying to avoid the use of the word "hypocrisy".
Posted on Reply
#142
john_
AGlezBIt's very simple. Water is wet, fire is hot, circuses have clowns, humans have biases.

You talk about objectivity and the lack of it in a medium that is by definition biased.
That means complaining about their bias has exactly the same value as complaining that water is wet, fire is hot, or circuses have clowns.

This conversation wouldn't have gone past the first post if you had just said "I don't like them" instead of going out of your way to justify your own biases against them.
You see, this whole time I've been trying to avoid the use of the word "hypocrisy".
OK, OK, you're just trying to convince me that HU being biased is normal and expected.
Sorry, I hadn't realized that the whole time you were supporting my opinion.
Thanks?
Posted on Reply
#143
AGlezB
john_OK, OK, you're just trying to convince me that HU being biased is normal and expected.
Sorry, I hadn't realized that the whole time you were supporting my opinion.
Thanks?
Not quite "supporting your opinion". It's always been pretty obvious to me that opinion equals bias so I was just surprised you thought it could be any different to the point of complaining about it.

AFAIC, the only thing in the world that isn't biased is raw measurement data, and that's only if the instruments used were properly calibrated and set up. Even then there is a not-insignificant chance of uncertainty introduced by the precision and accuracy of the instruments, which is why you'll usually hear terms like "margin of error", and why measurements are taken over and over again until an acceptable level of confidence is achieved. "Confidence" meaning "we know the data isn't perfect, but this is the best we can do".
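
To illustrate the margin-of-error point, here is a minimal Python sketch that reduces repeated benchmark runs to a mean and a 95% confidence interval; the FPS samples are made up for illustration:

import math
import statistics

runs = [141.2, 139.8, 142.5, 140.1, 141.9, 140.6]   # hypothetical FPS samples

mean = statistics.mean(runs)
stdev = statistics.stdev(runs)                      # sample standard deviation
# Normal-approximation 95% interval; a t-distribution would be stricter
# for only six samples, but this shows the idea.
margin = 1.96 * stdev / math.sqrt(len(runs))

print(f"{mean:.1f} FPS ± {margin:.1f} (95% confidence)")

The more runs you take, the smaller that margin gets, which is exactly the "measure until you reach an acceptable level of confidence" loop described above.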
Posted on Reply
#145
Godrilla
Rumors are changing by the day: the 5080 is now rumored to launch before the 5090, with 4090D-like performance, 16 GB of VRAM, and 95 SMs on the 4N node. With plenty of 4090s still in circulation, Nvidia doesn't want a 3090 Ti-like event where prices dropped by half. Hence the rumors changing every day. Daily reminder that Nvidia is the master of market manipulation: leaks, rumors, misinformation and all!
Posted on Reply
#146
64K
GodrillaRumors are changing by the day: the 5080 is now rumored to launch before the 5090, with 4090D-like performance, 16 GB of VRAM, and 95 SMs on the 4N node. With plenty of 4090s still in circulation, Nvidia doesn't want a 3090 Ti-like event where prices dropped by half. Hence the rumors changing every day. Daily reminder that Nvidia is the master of market manipulation: leaks, rumors, misinformation and all!
If the 5080 only performs like a 4090, that will be a disappointment indeed. The xx80 has always been a good bit faster than the previous generation's gaming flagship GPU, but as you said, the rumor mill is churning right now.
Posted on Reply
#147
Godrilla
64KIf the 5080 only performs like a 4090, that will be a disappointment indeed. The xx80 has always been a good bit faster than the previous generation's gaming flagship GPU, but as you said, the rumor mill is churning right now.
There is always the smoke-and-mirrors trick at a premium: DLSS features exclusive to Blackwell. Remember when Nvidia said DLSS would reach level 10? When the mid-range 4000 series dropped, I already knew that Nvidia would be moving away from the hardware-improvement model to a software-improvement premium model. Who remembers those frame-generation charts Nvidia dropped on us in 2022 that no one asked for and mostly no one wants/uses? If your hardware is strong enough you mostly won't use it, and if it's too weak you can't use it. Who wants to add lag and/or artifacts to their experience? I know I don't, outside of maybe one or two titles where I'm pretty sure I could have waited for the hardware to catch up instead.
Posted on Reply
#148
stimpy88
GodrillaThere is always the smoke-and-mirrors trick at a premium: DLSS features exclusive to Blackwell. Remember when Nvidia said DLSS would reach level 10? When the mid-range 4000 series dropped, I already knew that Nvidia would be moving away from the hardware-improvement model to a software-improvement premium model. Who remembers those frame-generation charts Nvidia dropped on us in 2022 that no one asked for and mostly no one wants/uses? If your hardware is strong enough you mostly won't use it, and if it's too weak you can't use it. Who wants to add lag and/or artifacts to their experience? I know I don't, outside of maybe one or two titles where I'm pretty sure I could have waited for the hardware to catch up instead.
I actually think a lot of the nGreedia algorithms are done in software, in the driver, rather than in dedicated fixed-function hardware. We have had a few instances where nGreedia stated that some new feature was enabled by dedicated hardware, yet things have been hacked to work on older-gen cards. I think a lot of this DLSS "AI" is BS and is just a fake tax on game studios. I mean, sure, they have an algorithm that makes DLSS work, and yeah, in its early days they may well have taken samples from games to improve it, but to say it's AI done at nGreedia HQ on state-of-the-art hardware is pure marketing BS. The last feature I remember them saying needed specific hardware was RTX Video; then it came to the 20x0 series in the end, after somebody found you could make it work, and nearly a year later nGreedia released it for those cards.

But I would not be surprised if Blackwell did bring in some new fixed-function hardware to speed certain things up, as it's been a while; UE5 comes to mind after hearing rumours of some hardware acceleration for Lumen. But everything I have seen and heard about Blackwell so far seems to indicate that it's the same chip as the 40x0 cards, just some 50% more of it. The same as the 30x0 series of cards: 50% more of the same. It's been a while since we have seen actual new hardware in the die, with much of it ending up being pure software.
Posted on Reply
#149
arni-gx
GodrillaRumors are changing by the day: the 5080 is now rumored to launch before the 5090, with 4090D-like performance, 16 GB of VRAM, and 95 SMs on the 4N node. With plenty of 4090s still in circulation, Nvidia doesn't want a 3090 Ti-like event where prices dropped by half. Hence the rumors changing every day. Daily reminder that Nvidia is the master of market manipulation: leaks, rumors, misinformation and all!
OMG, plz no..... 16 GB of VRAM is not enough anymore for 2160p in AAA PC games in 2024-2026!!

anyway, is this the first Nvidia GPU (RTX 5080/5090) that has full support for PCIe 5.0 x16??
Posted on Reply
#150
Blueberries
If the 5090 dual-die rumors are true, it might offer enough performance to warrant upgrading from my 4090, but I was hoping to get 5 years out of it.
Posted on Reply