Sunday, December 24th 2023

NVIDIA GeForce RTX 50 Series "Blackwell" On Course for Q4-2024

NVIDIA's next-generation GeForce RTX 50-series "Blackwell" gaming GPUs are on course to debut toward the end of 2024, with a Moore's Law is Dead report pinning the launch to Q4 2024. This timeline is easy to predict, as every GeForce RTX generation tends to get about two years of market presence: the RTX 40-series "Ada" debuted in Q4 2022 (October 2022), and the RTX 30-series "Ampere" in late Q3 2020 (September 2020).

NVIDIA's roadmap for 2024 sees a Q1 debut of the RTX 40-series SUPER, with three high-end SKUs refreshing the upper half of the RTX 40-series. The MLID report goes on to speculate that the generational performance uplift of "Blackwell" over "Ada" will be smaller than that of "Ada" over "Ampere." With AI HPC GPUs outselling gaming GPUs by 5:1 in terms of revenue, and AMD rumored to be retreating from the enthusiast segment with its next-generation RDNA4, it is easy to see why.
Source: Moore's Law is Dead (YouTube)

126 Comments on NVIDIA GeForce RTX 50 Series "Blackwell" On Course for Q4-2024

#76
ThrashZone
stimpy88: Well, I couldn't justify spending £1200 to £2200 on the only two 40x0 series cards that make any kind of sense. Even the 4080 is a bad buy because it's so slow in modern games @4k. The gap between it and the 4090 is just too wide; if the 4080 had been 20% faster, I would have pulled the trigger. The 40x0 series is the worst-value series nGreedia have ever released. I don't think their shenanigans will stop with the 50x0 series.

I know a lot of people who felt the same and are holding out for the 50x0 series instead, as hopefully even the lower-range cards will offer some meaningful improvements over the 20x0 and 30x0 series cards everyone still uses.
Hi,
Being someone still on the 10 series, I haven't seen any value in anything to date worth buying, seeing as I wasn't a crypto miner :laugh:
#77
dalekdukesboy
Space Lynx: Not going to lie, if I was rich a 2nm Arrow Lake and RTX 5090 would be epic as fuck combo. lol

Considering the price I paid for my current rig though, $200 cpu and $110 mobo, and $705 gpu on sale... meh. I am happy where I am.

If someone hires me with a decent salary next year though, I might consider selling my current rig and getting my dream Ultima 5090 and 2nm Arrow Lake combo in winter 2024.
Ditto. The 7900 XTX would be slightly better in your rig, but as it is, the XT might be bottlenecked by the CPU, etc. You did make the most of it though!
#78
ThrashZone
Prima.Vera: AI is here to stay. I foresee that toilet seats will soon have AI here in Japan...
lol
Reminds me of this clip from Better Call Saul :laugh:
#79
stimpy88
gffermari: ...but the 4080 is a 1440p card, not a 4K one.

The 2000 series was the worst-value lineup ever released. There was no use case to get the most out of them. They had way more value later (DLSS) than when they were released.
Oh I know. 1080p if you want the full UE5 experience with ultra quality and hopefully 30-50fps! Not bad for the low, low price of £1200!

I have a 2070 and think it's not a bad card. I still get 30 FPS in Cyberpunk with all features maxed: HDR, the enhanced 4K texture pack, using DLSS Quality on a 4K monitor, no RT of course.
#80
gffermari
stimpy88: Oh I know. 1080p if you want the full UE5 experience with ultra quality and hopefully 30-50fps! Not bad for the low, low price of £1200!

I have a 2070 and think it's not a bad card. I still get 30 FPS in Cyberpunk with all features maxed: HDR, the enhanced 4K texture pack, using DLSS Quality on a 4K monitor, no RT of course.
...any GPU north of a 2080 Ti/3060 Ti/6800 is capable of delivering a ridiculous number of FPS in raster games at almost any resolution, as long as the VRAM requirement doesn't exceed what's available.
You don't need a 4080 for that.

The 2070 is OK.
I had a 2080 and a 2080 Ti before moving to the 4080. DLSS is a godsend for these GPUs.

On topic:
The price of the 4090 increased a year after release.
That's not a good sign for the 5000 series.

In general, nVidia goes in a high, low, high, low pattern:
1000 - 2000 - 3000 - 4000 - 5000 series
Low - High - Low - High - Low

I think we may see a second round of the 3000 series, where the MSRPs are low but the actual market prices are ridiculous due to AI and crypto craziness.
#81
umeng2002
If AMD doesn't start competing with nVidia on ray tracing and other features, nVidia will just keep increasing prices. Remember when Intel kept core counts at four and was charging like $500 for them for a decade?
#82
THU31
umeng2002: If AMD doesn't start competing with nVidia on ray tracing and other features, nVidia will just keep increasing prices. Remember when Intel kept core counts at four and was charging like $500 for them for a decade?
You're right about the core counts, but not about the prices. You could always buy an i5 around $200 and an i7 around $300. CPU prices have barely gone up over the last 14 years. The main difference is that you can only overclock the K models from Intel, and those have gotten significantly more expensive. 2500K was $216, 14600K is $319. But the 2400 was $184, and the 13400 is $221 (so it's actually cheaper if you account for inflation).
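(A quick sanity check on that inflation point; a minimal Python sketch, where the ~38% cumulative US CPI inflation for 2011-2023 is my assumption and the prices are the ones quoted above:)

```python
# Real-terms CPU price comparison, 2011 vs. 2023 (prices from the post).
# Assumption: roughly 38% cumulative US CPI inflation between 2011 and 2023.
INFLATION_2011_TO_2023 = 1.38

pairs = {
    "2500K -> 14600K (unlocked)": (216, 319),
    "2400 -> 13400 (locked)": (184, 221),
}

for label, (price_2011, price_2023) in pairs.items():
    adjusted = price_2011 * INFLATION_2011_TO_2023  # 2011 price in 2023 dollars
    change = price_2023 - adjusted
    print(f"{label}: ${adjusted:.0f} adjusted vs ${price_2023} today ({change:+.0f})")
```

The unlocked pair comes out roughly $20 more expensive in real terms, while the locked pair comes out roughly $30 cheaper, which matches the point above.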

But if AMD doesn't have high-end RDNA4 chips, then the 5090 and 5080 will be even more expensive. Personally though, I don't care about those cards anyway; I'm not buying another GPU that draws more than 200 W (unless it's cheap and I can limit the power for the summer).
#83
umeng2002
Well, Intel's BIG BOY gaming chips were always 4-core/8-thread and $400 for like a decade.

AMD's RT performance is a generation behind nVidia. My RTX 4070 Ti performs better than a 7900 XTX in Cyberpunk Path Tracing even when FSR 3 is modded in. And even then, a 4070 Ti is just barely good enough at 1440p.
#84
sethmatrix7
gffermari: ...but the 4080 is a 1440p card, not a 4K one.

The 2000 series was the worst-value lineup ever released. There was no use case to get the most out of them. They had way more value later (DLSS) than when they were released.
I want what you're smoking if the $1200 4080 is not a 4K card.

#85
THU31
@sethmatrix7 You should calculate an average from all the games that have come out since that original review.

I'm gonna do that for you.

Avatar - 45 FPS
Alan Wake II - 50 FPS (no RT), 33 FPS (regular RT)
Lords of the Fallen - 41 FPS
Cyberpunk PL - 53 FPS (no RT), 27 FPS (regular RT)
Immortals of Aveum - 37 FPS
Atlas Fallen - 115 FPS
Ratchet & Clank - 87 FPS (no RT), 53 FPS (RT)
Remnant II - 36 FPS
Jedi Survivor - 56 FPS (no RT), 53 FPS (RT)
The Last of Us - 59 FPS
Hogwarts Legacy - 58 FPS (no RT), 26 FPS (RT)
Dead Space - 57 FPS
Forspoken - 64 FPS (no RT), 53 FPS (RT)

Average - 58 FPS (no RT)

I focused on "next-gen" games. There were some cross-gen titles that obviously ran really well (Resident Evil 4, Atomic Heart, AC Mirage).

Clearly I don't blame the card for this. Most of these games were completely unoptimized, and some of them pushed crazy visuals. But with an average of 58 FPS, I don't consider this a native 4K card. And if you have to upscale, it's not a 4K card, just like the 3090 was not an 8K card just because it supported DLSS Ultra Performance.

Whoever is to blame, paying $1200 for 58 FPS without ray tracing is just pathetic. A 4090 can't even hit 60 in some of these games.
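(For reference, the 58 FPS average checks out; a minimal Python sketch using the no-RT numbers listed above:)

```python
# No-RT 4K results listed above, in FPS.
no_rt_fps = {
    "Avatar": 45, "Alan Wake II": 50, "Lords of the Fallen": 41,
    "Cyberpunk PL": 53, "Immortals of Aveum": 37, "Atlas Fallen": 115,
    "Ratchet & Clank": 87, "Remnant II": 36, "Jedi Survivor": 56,
    "The Last of Us": 59, "Hogwarts Legacy": 58, "Dead Space": 57,
    "Forspoken": 64,
}

average = sum(no_rt_fps.values()) / len(no_rt_fps)
print(f"Average: {average:.0f} FPS")  # -> Average: 58 FPS
```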
#86
gffermari
sethmatrix7: I want what you're smoking if the $1200 4080 is not a 4K card.

So all the cards from the 2080 Ti and 3070 and above are 4K cards as well, since they can score 60+ FPS average??

Most GPUs can perform well in simple raster games at 4K. That does not make them 4K cards.
RT/PT conditions are the only place where performance really matters, and the 4080 can do that up to UW 1440p.
#87
chrcoluk
john_: I would expect the 5090 to offer +30% performance over the 4090, costing $2,200 at 600 W typical, and a 5060 Ti at $500 with 8 GB of VRAM and performance equal to a 4060 Ti, but with DLSS 4.0 and a new type of ray reconstruction, both offered only on the 5000 series.
I wouldn't put it past Nvidia to avoid putting 16 GB on the 5070 and 5060 Ti. 8 GB would be really taking the mickey though; 12 GB is surely the absolute minimum.
#88
Dawora
ThrashZone: Hi,
Being someone still on the 10 series, I haven't seen any value in anything to date worth buying, seeing as I wasn't a crypto miner :laugh:
If the 1080 is already kinda slow, then everything below it is really slow.

No money to buy, and then people say "I haven't seen any value." It's like lying to yourself to feel better.
gffermari: So all the cards from the 2080 Ti and 3070 and above are 4K cards as well, since they can score 60+ FPS average??

Most GPUs can perform well in simple raster games at 4K. That does not make them 4K cards.
RT/PT conditions are the only place where performance really matters, and the 4080 can do that up to UW 1440p.
There are options in games,
so yes, slower cards can be used at 4K just fine.
THU31: @sethmatrix7 You should calculate an average from all the games that have come out since that original review. [...] But with an average of 58 FPS, I don't consider this a native 4K card. [...] Whoever is to blame, paying $1200 for 58 FPS without ray tracing is just pathetic. A 4090 can't even hit 60 in some of these games.
Kinda BS... PC is not a console.
There are options available.
Using the best optimal settings:

Avatar - 84 FPS
Alan Wake II - 72 FPS
Lords of the Fallen - 79 FPS
Cyberpunk PL - 104 FPS (no RT)
Immortals of Aveum - 62 FPS
Atlas Fallen - 188 FPS
Ratchet & Clank - 175 FPS (no RT)
Remnant II - 71 FPS
Jedi Survivor - 88 FPS (no RT)
The Last of Us - 94 FPS
Hogwarts Legacy - 102 FPS
Dead Space - 91 FPS
Forspoken - 102 FPS (no RT)
#89
KLMR
Why do you still create hype before hardware releases? Haven't you suffered enough new-product marketing cycles?

AMD has released good products in all segments; the problem is, reviews don't update prices over time and place, so after a few months most read as an advertisement rather than a review.

If TPU or similar used a local price crawler tied to their review information (mostly updated after each part release), people would have a good decision tool, and I'm sure that would favor AMD in many more cases. People think they're getting some 4090 when they buy a 3060 Ti, or completely discard Intel GPUs because of old, bad reviews.
If you look up the Intel Arc A770 review to evaluate a purchase, you'll have outdated information. Who looks at a 7800 XT review to evaluate the performance of the 3060 Ti or an Intel card? Few.

There is no competition for the 4090, true. But below that there is competition in every single tier, and it's different in every region or country depending on local prices. I think it's the worst unsolved problem of written and YouTube reviews.
What AMD can't afford is not competing with products, but with marketing in the "high end" segment.
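(The core of such a decision tool would be simple enough; a minimal Python sketch, where the card names, FPS figures, and prices are all placeholder assumptions and a real version would crawl live regional prices:)

```python
# Hypothetical price-aware review helper: combine fixed review FPS data
# with freshly crawled local prices to rank cards by value.
# All numbers below are placeholders, not real benchmark or price data.

review_fps_4k = {            # average FPS from a (static) review
    "RTX 4070 Ti": 70,
    "RX 7900 XT": 75,
    "Arc A770": 35,
}

def rank_by_value(local_prices: dict[str, float]) -> list[tuple[str, float]]:
    """Rank cards by FPS per currency unit at today's local prices."""
    value = {
        card: review_fps_4k[card] / price
        for card, price in local_prices.items()
        if card in review_fps_4k
    }
    return sorted(value.items(), key=lambda kv: kv[1], reverse=True)

# Prices as a crawler might report them for one region today (made up):
print(rank_by_value({"RTX 4070 Ti": 800, "RX 7900 XT": 750, "Arc A770": 280}))
```

With these made-up prices the cheap card wins on FPS per dollar, which is exactly the kind of regional, time-sensitive result a static review can't show.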
#90
umeng2002
Just to reiterate: with path tracing, there is no competition. The 7900 XTX absolutely fails to deliver playable frame rates, even with frame generation.
#91
rjc34
Onasi: It is still the future in the absolute sense; at some point rasterization will simply run out of potential improvements and rendering cheats that can be applied. However, we aren't talking 2025, or even 2030. People who fell for NVIDIA's hype and genuinely thought that GPUs capable of full RTRT rendering, with the level of fidelity expected from modern high-budget titles, were just around the corner were being silly. We are going to have at least a decade of the hybrid approach, where RT is used for certain effects only, at a (hopefully) steadily decreasing cost. And this isn't even talking about full-fat path tracing, which is significantly more expensive.
It seems like you've got both parts of the equation here, but you haven't put them together yet.

In the same way rasterization fidelity improved through boatloads of rendering cheats and other techniques learned by experience, developers and game engines will build up a similar portfolio of experience with how to optimally leverage RT on available hardware.

It's a chicken-and-egg situation. Can't get devs to adopt new rendering technology without an available hardware install base to run it on. You also can't just suddenly start dedicating half your die area to rendering hardware for quickly improving technology that comes at the direct cost of die area for your rasterization hardware. There has to be some kind of concrete-ish long term goal to transition towards, and Nvidia has pretty clearly communicated their hat is in the ring of RT lighting and ML upscaling, with full RTGI as one of the long term benchmarks.

Current gen consoles will continue to ensure all major releases are still built upon a fundamentally traditional rasterized/render cheated baseline, but as titles like Alan Wake II are showing off, more and more will learn to smoothly integrate various RTGI type systems into their existing non-RT bases. We'll have a broadly hybrid approach for the next 5 years or so, but I can certainly see over the next 2-3 years some major titles starting to ship with RTGI as their "definitive" baseline and non- or limited-RT versions built to run on the consoles delivering a noticeably downgraded visual experience.
#92
TechnoLadz
Dirt Chip: Hoping for a proper 1080 Ti successor.
All it takes is 4080-level performance with the 4090's memory capacity, 4070 wattage, and a 4070 Ti price.

But as presumably only NV is left on the table, well...
I mean, you will likely see that with Blackwell as the 5070 or 5070 Ti. But that is likely over a year away (presuming the 5090 and 5080 land in Dec 2024, the 5070 Ti in Feb 2025 (announced at CES though), and the 5070 in April 2025).
#93
Waldorf
@KLMR
Except the best hardware is useless if you don't have (properly) working drivers.
Just because Intel makes a decent GPU doesn't mean it performs (well) enough.
#94
Dirt Chip
TechnoLadz: I mean, you will likely see that with Blackwell as the 5070 or 5070 Ti. But that is likely over a year away (presuming the 5090 and 5080 land in Dec 2024, the 5070 Ti in Feb 2025 (announced at CES though), and the 5070 in April 2025).
I will wait patiently :)
#95
Prima.Vera
I'm struggling really hard to think of any game out there worth the investment in a new video card...
#96
lexluthermiester
Prima.Vera: I'm struggling really hard to think of any game out there worth the investment in a new video card...
I look at it this way: one game might be the reason for the timing of the upgrade, but it will never be the only beneficiary of said purchase. Every game that follows and precedes it will also benefit. Allowing a single game to influence your purchase timing is fine; there's nothing wrong with that, because it will very likely happen sooner or later anyway.
#97
Waldorf
@Prima.Vera
How about people running 1080p or 1440p who are switching to 4K?
Not everyone centers their purchase on a (single) game...

Never mind that if I still had my old job, the next xx80 with a waterblock would have been mine,
without it having anything to do with games...
#98
Nordic
umeng2002: Just to reiterate: with path tracing, there is no competition. The 7900 XTX absolutely fails to deliver playable frame rates, even with frame generation.
Would you say that the RTX 3090 fails to deliver playable frame rates with frame generation or not?
#99
sLowEnd
Prima.Vera: I'm struggling really hard to think of any game out there worth the investment in a new video card...
Everyone's got different gaming needs. I don't need a graphics card upgrade for what I currently play either, but that could always change in the future if there's a new and demanding game that captures my interest. As it stands right now though, I don't know what I'd be doing with more GPU power except maybe fold more. *shrug*
#100
Waldorf
@sLowEnd
I thought the same until a few years ago; once you start looking at the 0.1% worst FPS, it can change.
For the past 20 years I was usually on xx60 Ti/xx70 chips, basically providing just enough frames for gaming,
but since my 2080S is liquid-cooled and has more than enough power for my games, I started paying attention to
the worst-performing parts of results (including in-game stuff), and could see a small but noticeable difference versus my friend's Ti,
even though his was on air (less and shorter boost).
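(For reference, a minimal Python sketch of how a 0.1% low figure is typically derived from frame times; the sample data is made up:)

```python
# Compute average and 0.1% low FPS from a list of per-frame times (ms).
# The frame-time samples here are made up for illustration.
frame_times_ms = [16.7] * 2000 + [40.0, 55.0]   # mostly smooth, two bad stutters

def low_fps(times_ms: list[float], percent: float = 0.1) -> float:
    """FPS of the slowest `percent`% of frames (higher is smoother)."""
    worst = sorted(times_ms, reverse=True)        # slowest frames first
    n = max(1, int(len(worst) * percent / 100))   # e.g. 0.1% of 2002 -> 2 frames
    return 1000 / (sum(worst[:n]) / n)            # mean ms per frame -> FPS

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
print(f"average: {avg_fps:.0f} FPS, 0.1% low: {low_fps(frame_times_ms):.0f} FPS")
```

Two stutters barely move the average (~60 FPS) but drag the 0.1% low down to ~21 FPS, which is why the metric exposes differences the average hides.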

It's similar to cars for me.
Sure, a Prius or other small car is usually fuel-efficient, but when you don't put your foot down,
i.e. driven the same way, most (European) cars can have 2-3 times the output
yet still do the same or even better (e.g. 17 mpg vs. 19 for a BMW M3).
I saw it similarly in the late 90s when I had a (slightly tweaked) Golf with a third of the output of my father's E55;
yet when driving at a normal pace/speed, he needed less than half a gallon more per 100 km,
as I basically had to put my foot down to stay with him "cruising" along...