Sunday, May 5th 2024
NVIDIA to Only Launch the Flagship GeForce RTX 5090 in 2024, Rest of the Series in 2025
NVIDIA debuted the current RTX 40-series "Ada" in 2022 and refreshed it earlier this year, which means the company is expected to debut its next generation in some shape or form in 2024. We've known for a while that the new GeForce RTX 50-series "Blackwell" could see a 2024 debut, which, going by past trends, would cover the top two or three SKUs, followed by a ramp-up the following year. We're now learning through a new Moore's Law is Dead leak, however, that the launch could be limited to just the flagship product: the GeForce RTX 5090, the SKU that succeeds the RTX 4090.
Even a launch limited to the flagship RTX 5090 would give us a fair idea of the new "Blackwell" architecture, its various new features, and how the other SKUs in the lineup could perform at their relative price points, because the launch could at least include a technical overview of the architecture. NVIDIA "Blackwell" is expected to introduce another generational performance leap over the current lineup. The reasons NVIDIA is going with a more conservative launch of GeForce "Blackwell" could be to allow the market to digest inventories of the current RTX 40-series, and to accord higher priority to AI GPUs based on the architecture, which fetch the company much higher margins.
Source:
Moore's Law is Dead (YouTube)
154 Comments on NVIDIA to Only Launch the Flagship GeForce RTX 5090 in 2024, Rest of the Series in 2025
If someone wants to actually claim something like that, they need to use a scientific method to prove it to be true.
Every time he makes this statement, he says it's based on his own eyes; therefore it's nothing but the opinion of the reviewer, not fact.
I think if they decide that it's just for streamers/people whose job relies on it, they will push it to $2K, especially if there is nothing else close. However, if they worry about upsetting the market or seeing backlash, they may do $1,799 or $1,899.
I don't use either unless I really need to for the frame rate I want, but if there's options for DLSS & FSR I'll pretty much always prefer DLSS.
What scientific method could you possibly use for comparisons? Surely which one looks better when you're playing a game is what matters? I watch a lot of HWUB, enough that I am subbed to their Floatplane to give them a little bit of income. I don't _always_ agree with their conclusions, but the vast majority of the time I do. Their conclusions aside, they are very clear about their test methods & their results, so I can form my own conclusions anyway. You seem more obsessed with their thumbnail choices than their data. I can honestly say I don't even notice what's in any thumbnails other than the title these days. You could be right on thumbnails having a large effect on people's opinions; I'm honestly not sure either way on that one. They could inadvertently be causing bias due to their thumbnails, but I believe they're just trying to produce clickbaity thumbnails & titles to get more viewers. That's the name of the game for YouTube, unfortunately!
You're free to set the criteria for you personally accepting a given outcome, but I doubt it's going to convince anyone else to only accept a conclusion based on your criteria.
Out of curiosity, when you look at comparisons by the likes of HUB or DF, do you not see the difference? They zoom in and slow down the footage to make sure you look exactly where the artefacts/comparative differences are, but even when gaming and viewing the live render with your own eyes, these artefacts and differences are easy to spot; most draw my eye right to them.
So when people tell you a chicken wing is much better than a turd, they are not being objective but rather subjective? :roll: Weird logic there, man. Also, a copy-paste answer from Edge AI? What the heck.
If I release the 5090 first I'll get all the buyers looking for 5090s and maybe some of the 5080 buyers as well. Nothing new here.
But if I first release a limited batch of 5080s and immediately after release the 5090s, I'll sell all the 5080s but also leave a lot of potential 5080 buyers waiting.
That alone pushes a lot of 5080 buyers to upgrade their buying decision and take the 5090. There are several reasons for it, but creating scarcity to guide buyers is also nothing new.
That explanation works if NVIDIA is trying to maximize profits in the RTX market, but there is another, maybe more likely explanation: they'll concentrate their production on 5080s and use most of the production capacity allocated for 5090s (which require better binning) for AI and other enterprise products, which are way more profitable for them. They'll still sell pretty much every 5080 they put on the shelves, and by making the 5090 a rarer product, its perceived value increases, so buyers will be more willing to pay for it. It's a win-win for them.
This is complicated, and I certainly don't have a crystal ball. We'll see in a few months how it goes.
There are no unbiased youtube channels.
There are no unbiased instagram reels.
There are no unbiased facebook posts.
Even scientific publications get their fair share of bias no matter how hard they try to keep it out.
And as a result there are, so far, no unbiased LLMs (AIs) because they're trained with biased datasets.
Complaining about bias makes no sense to me. That is the way to an early anxiety disorder.
LLMs... How did we get from reviews to LLMs, compilers, and whatever? Please, focus. When it makes sense to you, you will be headed in the right direction. And no, it has nothing to do with an "anxiety disorder". How did this become an argument anyway? Personal experience?
We are not brainless drones who never criticize the stuff we see around us. It's just human behavior.
You talk about objectivity and the lack of it in a medium that is by definition biased.
That means complaining about their bias has the exact same value as complaining that water is wet or fire is hot or clowns in a circus.
This conversation wouldn't have gone past the first post if you had just said "I don't like them" instead of going out of your way to justify your own biases against them.
You see, this whole time I've been trying to avoid the use of the word "hypocrisy".
Sorry, I hadn't realized that the whole time you were supporting my opinion.
Thanks?
AFAIC the only thing in the world that isn't biased is raw measurement data, and even that only if the instruments used were properly calibrated and set up. Even then, there is a not-insignificant uncertainty introduced by the precision and accuracy of the instruments, which is why you'll usually hear terms like "margin of error", and why measurements are taken over and over again until an acceptable level of confidence is achieved. "Confidence" meaning "we know the data isn't perfect, but this is the best we can do".
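For anyone curious how that "margin of error" idea works in practice, here's a minimal sketch in Python. The FPS numbers are made up purely for illustration; it just shows how repeated measurements of the same run yield a mean and a 95% confidence interval via the normal approximation.

```python
import math
import statistics

# Hypothetical repeated FPS measurements of the same benchmark run
samples = [141.2, 139.8, 142.5, 140.1, 141.9, 140.6, 141.0, 139.5]

n = len(samples)
mean = statistics.mean(samples)
stdev = statistics.stdev(samples)  # sample standard deviation

# 95% confidence interval half-width using the normal approximation
# (z ~= 1.96); for very small n, a t-distribution critical value
# would be slightly more precise.
margin = 1.96 * stdev / math.sqrt(n)

print(f"mean = {mean:.2f} FPS +/- {margin:.2f} (95% CI)")
```

The more runs you take, the smaller the margin gets (it shrinks with the square root of n), which is exactly why reviewers repeat benchmarks rather than trusting a single pass.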
I hope an RTX 5080 with 20 GB of VRAM comes first, in Q3-Q4 2024, and that it will still be fully compatible with PCIe 3.0 x16.
But I would not be surprised if Blackwell did bring in some new fixed-function hardware to speed certain things up, as it's been a while; UE5 comes to mind after hearing rumours of some hardware acceleration for Lumen. From everything I have seen and heard about Blackwell so far, though, it seems to be the same chip as the 40x0 cards, just some 50% more of it, much like the 30x0 series was 50% more of the same. It's been a while since we have seen genuinely new hardware in the die, with much of the progress ending up being pure software.
Anyway, is this (RTX 5080/5090) the first NVIDIA GPU generation with full support for PCIe 5.0 x16?