Sunday, December 24th 2023
NVIDIA GeForce RTX 50 Series "Blackwell" On Course for Q4-2024
NVIDIA's next-generation GeForce RTX 50-series "Blackwell" gaming GPUs are on course to debut toward the end of 2024, with a Moore's Law is Dead report pinning the launch to Q4-2024. This is an easy timeline to predict, as each GeForce RTX generation tends to get about two years of market presence: the RTX 40-series "Ada" debuted in Q4-2022 (October 2022), and the RTX 30-series "Ampere" in late Q3-2020 (September 2020).
NVIDIA's roadmap for 2024 sees a Q1 debut of the RTX 40-series SUPER, with three high-end SKUs refreshing the upper half of the RTX 40-series. The MLID report goes on to speculate that the generational performance uplift of "Blackwell" over "Ada" will be smaller still than that of "Ada" over "Ampere." With AI HPC GPUs outselling gaming GPUs by 5:1 in terms of revenue, and AMD rumored to be retreating from the enthusiast segment with its next-gen RDNA4, it's easy to see why.
Source:
Moore's Law is Dead (YouTube)
126 Comments on NVIDIA GeForce RTX 50 Series "Blackwell" On Course for Q4-2024
Being someone still on the 10 series, I haven't seen any value in anything to date worth buying, seeing as I wasn't a crypto miner :laugh:
Reminds me of this clip from Better Call Saul :laugh:
I have a 2070, and I think it's not a bad card. I still get 30 FPS in Cyberpunk with all features maxed, HDR, the enhanced 4K texture pack, using DLSS Quality on a 4K monitor, no RT of course.
You don't need a 4080 for this.
The 2070 is ok.
I had a 2080 and a 2080 Ti before moving to a 4080. DLSS is a godsend feature for these GPUs.
On topic:
The price of the 4090 was increased a year after release.
That's not a good sign for the 5000 series.
In general, nVidia goes in a high, low, high, low pattern:
1000 - 2000 - 3000 - 4000 - 5000 series
Low - High - Low - High - Low
I think we may get a second round of the 3000 series, where the MSRPs are low but the actual market prices are ridiculous due to AI and crypto craziness.
But if AMD doesn't have high-end RDNA4 chips, then the 5090 and 5080 will be even more expensive. Though personally I don't care about those cards anyway; I'm not buying another GPU that draws more than 200 W (unless it's cheap and I can limit the power for the summer).
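(For what it's worth, capping board power doesn't even need a reboot. A minimal sketch using the pynvml bindings, assuming the NVIDIA driver and the nvidia-ml-py package are installed and the script runs with admin rights:)

```python
# Sketch: cap an NVIDIA GPU's board power for the summer.
# Assumes the nvidia-ml-py (pynvml) package plus an NVIDIA driver;
# setting the limit usually requires root/admin rights, and the value
# must fall within the board's supported min/max range.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# NVML reports power in milliwatts.
current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
print(f"Current limit: {current_mw / 1000:.0f} W")

# Cap at 200 W, the figure from the post above.
pynvml.nvmlDeviceSetPowerManagementLimit(handle, 200 * 1000)

pynvml.nvmlShutdown()
```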
AMD's RT performance is a generation behind nVidia's. My RTX 4070 Ti performs better than a 7900 XTX in Cyberpunk path tracing, even with FSR 3 modded in. And even then, a 4070 Ti is just barely good enough at 1440p.
I'm gonna do that for you.
Avatar - 45 FPS
Alan Wake II - 50 FPS (no RT), 33 FPS (regular RT)
Lords of the Fallen - 41 FPS
Cyberpunk PL - 53 FPS (no RT), 27 FPS (regular RT)
Immortals of Aveum - 37 FPS
Atlas Fallen - 115 FPS
Ratchet & Clank - 87 FPS (no RT), 53 FPS (RT)
Remnant II - 36 FPS
Jedi Survivor - 56 FPS (no RT), 53 FPS (RT)
The Last of Us - 59 FPS
Hogwarts Legacy - 58 FPS (no RT), 26 FPS (RT)
Dead Space - 57 FPS
Forspoken - 64 FPS (no RT), 53 FPS (RT)
Average - 58 FPS (no RT)
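(Quick sanity check on that figure; a throwaway Python snippet averaging the no-RT numbers from the list above:)

```python
# Average of the "no RT" figures from the list above.
no_rt_fps = [45, 50, 41, 53, 37, 115, 87, 36, 56, 59, 58, 57, 64]
print(f"{sum(no_rt_fps) / len(no_rt_fps):.1f} FPS")  # -> 58.3 FPS
```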
I focused on "next-gen" games. There were some cross-gen titles that obviously ran really well (Resident Evil 4, Atomic Heart, AC Mirage).
Clearly, I don't blame the card for this. Most of these games are poorly optimized, and some of them push crazy visuals. But with an average of 58 FPS, I don't consider this a native 4K card. And if you have to upscale, it's not a 4K card, just like the 3090 was not an 8K card just because it supported DLSS Ultra Performance.
Whoever is to blame, paying $1200 for 58 FPS without ray tracing is just pathetic. A 4090 can't even hit 60 in some of these games.
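(For context on the upscaling point: DLSS renders internally at a fraction of the output resolution. A small sketch using the commonly documented per-axis scale factors; these figures come from public DLSS documentation, not from this thread:)

```python
# Internal render resolution behind each DLSS preset, using the
# commonly documented per-axis scale factors: Quality ~0.667,
# Balanced 0.58, Performance 0.5, Ultra Performance ~0.333.
presets = {"Quality": 2 / 3, "Balanced": 0.58,
           "Performance": 0.5, "Ultra Performance": 1 / 3}

out_w, out_h = 7680, 4320  # "8K" output, as in the 3090 example
for name, scale in presets.items():
    print(f"{name}: {round(out_w * scale)}x{round(out_h * scale)}")
# Ultra Performance at 8K renders at just 2560x1440 internally.
```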
Since they can score 60+ FPS average.
Most GPUs can perform fine in simple raster games at 4K; that does not make them 4K cards. RT/PT conditions are where performance actually matters, and the 4080 can handle those up to ultrawide 1440p. Everything below it is really slow.
People with no money to buy then say "I haven't seen any value." It's like lying to yourself to feel better. There are options in games, so yes, slower cards can be used at 4K just fine. The "not a 4K card" thing is kinda BS; a PC is not a console, there are settings available.
Using best optimized settings:
Avatar - 84 FPS
Alan Wake II - 72 FPS
Lords of the Fallen - 79 FPS
Cyberpunk PL - 104 FPS (no RT)
Immortals of Aveum - 62 FPS
Atlas Fallen - 188 FPS
Ratchet & Clank - 175 FPS (no RT)
Remnant II - 71 FPS
Jedi Survivor - 88 FPS (no RT)
The Last of Us - 94 FPS
Hogwarts Legacy - 102 FPS
Dead Space - 91 FPS
Forspoken - 102 FPS (no RT)
AMD has released good products in all segments. The problem is, reviews don't update prices over time or by region, so after a few months most of them read more like an advertisement than a review.
If TPU or similar sites paired a local price crawler with their review data (mostly updated after each part's release), people would have a good decision tool, and I'm sure that would tip things toward AMD in many more cases. People think they're getting some 4090 when they buy a 3060 Ti, or completely discard Intel GPUs because of old, bad reviews.
If you look up the Intel 770 review to evaluate a purchase, you'll have outdated information. Who looks up a 7800 XT review to evaluate the performance of the 3060 Ti, or an Intel one? Few.
There is no competition for the 4090, true. But below that, there is competition in every single tier, and it's different in every region or country depending on local prices. I think it's the worst unsolved problem of written and YouTube reviews.
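(A minimal sketch of that idea: join a review's performance numbers with crawled local prices and rank cards by FPS per currency unit. The card names, FPS figures, and prices below are made-up placeholders, not real data:)

```python
# Sketch of a review + local-price "decision tool": rank GPUs by FPS
# per unit of local currency. All figures are hypothetical
# placeholders; a real tool would crawl current local listings.
review_fps = {"GPU A": 120, "GPU B": 95, "GPU C": 60}     # avg FPS
local_price = {"GPU A": 900, "GPU B": 550, "GPU C": 300}  # local price

ranked = sorted(review_fps,
                key=lambda g: review_fps[g] / local_price[g],
                reverse=True)
for gpu in ranked:
    value = review_fps[gpu] / local_price[gpu] * 100
    print(f"{gpu}: {value:.1f} FPS per 100 spent")
```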
What AMD can't afford isn't the lack of competitive products, but the lack of marketing in the "high end" segment.
In the same way rasterization fidelity improved through boatloads of rendering cheats and other techniques learned by experience, developers and game engines will build up a similar portfolio of experience with how to optimally leverage RT on available hardware.
It's a chicken-and-egg situation. Can't get devs to adopt new rendering technology without an available hardware install base to run it on. You also can't just suddenly start dedicating half your die area to rendering hardware for quickly improving technology that comes at the direct cost of die area for your rasterization hardware. There has to be some kind of concrete-ish long term goal to transition towards, and Nvidia has pretty clearly communicated their hat is in the ring of RT lighting and ML upscaling, with full RTGI as one of the long term benchmarks.
Current-gen consoles will continue to ensure all major releases are still built upon a fundamentally traditional rasterized/render-cheated baseline, but as titles like Alan Wake II are showing, more and more will learn to smoothly integrate various RTGI-type systems into their existing non-RT bases. We'll have a broadly hybrid approach for the next 5 years or so, but I can certainly see some major titles over the next 2-3 years starting to ship with RTGI as their "definitive" baseline, with non- or limited-RT versions built to run on the consoles delivering a noticeably downgraded visual experience.
Except the best hardware is useless if you don't have (properly) working drivers. Just because Intel makes a decent GPU doesn't mean it performs well enough.
How about people running 1080p or 1440p who are switching to 4K? Not everyone centers their purchase on a (single) game...
Ignoring that, if I still had my old job, the next xx80 with a water block would have been mine, without it having anything to do with games...
I thought the same until a few years ago; once you start looking at the 0.1% worst FPS, it can change. In the past 20 years I was usually on xx60 Ti/xx70 chips, basically just getting enough frames for gaming. But since my 2080S is liquid-cooled and has more than enough power for my games, I started paying attention to the worst-performing parts of results (including in-game stuff), and could see a small but noticeable difference versus my friend's Ti, even though his was on air (lower and shorter boost).
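(For anyone unfamiliar with the metric, a minimal sketch of how 1% and 0.1% lows are typically derived from a frame-time log; the frame times here are invented for illustration:)

```python
# Sketch: derive 1% / 0.1% "low" FPS from a frame-time log (in ms).
# The log below is invented; real tools capture thousands of frames,
# so with only 10 samples both metrics collapse to the worst frame.
frame_times_ms = [16.7, 16.9, 16.5, 17.0, 33.5,
                  16.8, 16.6, 45.2, 16.7, 16.9]

def low_fps(times_ms, percent):
    """Average FPS over the slowest `percent` of frames."""
    worst = sorted(times_ms, reverse=True)  # slowest frames first
    n = max(1, round(len(times_ms) * percent / 100))
    avg_ms = sum(worst[:n]) / n
    return 1000 / avg_ms

print(f"1% low:   {low_fps(frame_times_ms, 1):.1f} FPS")
print(f"0.1% low: {low_fps(frame_times_ms, 0.1):.1f} FPS")
```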
It's similar to cars for me. Sure, a Prius or another small car is usually fuel efficient, but when you don't put your foot down, as in driven the same way, most (European) cars can have 2-3 times the output yet still do the same or even better (e.g., 17 mpg vs 19 for a BMW M3). I saw it similarly in the late '90s when I had a (slightly tweaked) Golf with a third of the output of my father's E55, yet when driving at a normal pace/speed, he needed less than half a gallon more per 100 km, as I basically had to put my foot down to stay with him "cruising" along...