Tuesday, September 20th 2022

NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

NVIDIA just kicked off its GTC Autumn 2022 keynote address, which culminates in Project Beyond, the company's launch vehicle for its next-generation GeForce RTX 40-series graphics cards based on the "Ada" architecture. These are expected to nearly double the performance of the present generation, ushering in a new era of photo-real graphics as we inch closer to the metaverse. NVIDIA CEO Jensen Huang is expected to take center stage to launch these cards.

15:00 UTC: The show is on the road.
15:00 UTC: AI remains the central focus, including how it plays into gaming.

15:01 UTC: Racer X is a real-time interactive tech demo. Coming soon.
15:02 UTC: "Future games will be simulations, not pre-baked" - Jensen Huang
15:03 UTC: This is seriously good stuff (Racer X). It runs on a single GPU, in real-time, and uses RTX Neural Rendering.
15:05 UTC: Ada Lovelace is a huge GPU
15:06 UTC: 76 billion transistors, over 18,000 shaders, and Micron GDDR6X memory. Shader Execution Reordering is a major innovation, as big as out-of-order execution was for CPUs, with gains of up to 25% in in-game performance. Ada is built on TSMC 4N, a custom 4 nm process designed together with NVIDIA.

There's a new streaming multiprocessor design, with a total of 90 TFLOPS. Power efficiency is doubled over Ampere.
Ray tracing is now in its third generation, with 200 RT TFLOPS and twice the triangle intersection speed.
Deep learning AI uses 4th-gen Tensor Cores with 1,400 TFLOPS, plus an "Optical Flow Accelerator".
15:07 UTC: Shader Execution Reordering is similar to the approach we saw with Intel Xe-HPG.
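To make the idea concrete, here is a minimal Python sketch of the principle behind Shader Execution Reordering: regrouping divergent ray-hit work by material so each batch executes coherently. This is purely illustrative, not NVIDIA's API or hardware mechanism.

```python
# Minimal sketch of the idea behind Shader Execution Reordering (SER):
# bucket divergent ray-hit work by a material key so each bucket runs as a
# coherent batch. Illustrative only - not NVIDIA's API or hardware mechanism.
from collections import defaultdict

def shade(material, hits):
    # Stand-in for invoking one shader over a coherent group of hits.
    return [f"{material}:{h}" for h in hits]

def shade_divergent(hit_list):
    # Without reordering: neighboring threads hit different materials,
    # so execution diverges (one "launch" per hit here).
    return [shade(mat, [h])[0] for mat, h in hit_list]

def shade_reordered(hit_list):
    # With reordering: group hits by material first, then shade each group.
    buckets = defaultdict(list)
    for mat, h in hit_list:
        buckets[mat].append(h)
    results = []
    for mat, group in buckets.items():
        results.extend(shade(mat, group))  # coherent batch per material
    return results

rays = [("glass", 0), ("metal", 1), ("glass", 2), ("skin", 3), ("metal", 4)]
print(shade_divergent(rays))  # 5 divergent invocations
print(shade_reordered(rays))  # 3 coherent batches: glass, metal, skin
```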
15:08 UTC: Several new hardware-accelerated ray tracing innovations with 3rd gen RTX.
15:09 UTC: DLSS 3 is announced. It brings several new innovations, including temporal components and Reflex latency optimizations, and it generates new frames without involving the graphics pipeline.
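Conceptually, optical-flow frame generation synthesizes an intermediate frame from two rendered frames and a motion-vector field, without re-running the game's pipeline. Below is a hedged Python sketch of that idea; NVIDIA's actual method uses a trained network plus the hardware Optical Flow Accelerator, so the nearest-neighbor warp here is only an illustration.

```python
# Conceptual sketch of optical-flow frame generation: synthesize an
# in-between frame from two rendered frames and per-pixel motion vectors.
# Illustrative only - not NVIDIA's DLSS 3 implementation.
import numpy as np

def generate_intermediate(frame_a, frame_b, flow, t=0.5):
    """Warp frame_a part-way along the motion vectors, then blend with frame_b."""
    h, w = frame_a.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # flow[..., 0] is dx, flow[..., 1] is dy (pixels, frame_a -> frame_b).
    # Approximate backward warp: fetch each output pixel from frame_a,
    # offset against the motion vector, with nearest-neighbor sampling.
    src_x = np.clip((xs - t * flow[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip((ys - t * flow[..., 1]).astype(int), 0, h - 1)
    warped = frame_a[src_y, src_x]
    return (1 - t) * warped + t * frame_b  # simple temporal blend

frame_a = np.zeros((4, 4, 3), dtype=np.float32)  # dark frame
frame_b = np.ones((4, 4, 3), dtype=np.float32)   # bright frame
flow = np.zeros((4, 4, 2), dtype=np.float32)     # no motion: pure blend
print(generate_intermediate(frame_a, frame_b, flow)[0, 0])  # [0.5 0.5 0.5]
```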
15:11 UTC: Cyberpunk 2077 to get DLSS 3 and SER. A 16x increase in effective performance is claimed using DLSS 3 vs. DLSS 1. MS Flight Simulator to get DLSS 3 support.
15:13 UTC: Portal with RTX, a remaster in the vein of Quake II RTX, available from November, created with Omniverse RTX Remix.
15:14 UTC: Ada offers a giant leap in total performance. Everything has been increased: shader 40 -> 90 TFLOPS, RT 78 -> 200 TFLOPS, OFA 126 -> 300, Tensor 320 -> 1400 TFLOPS.
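For reference, the generation-over-generation uplift factors implied by the figures quoted above, computed in a few lines of Python (numbers as quoted; the OFA pair is left unitless):

```python
# Quick arithmetic on the quoted Ampere -> Ada throughput figures.
figures = {
    "Shader TFLOPS": (40, 90),
    "RT TFLOPS": (78, 200),
    "OFA": (126, 300),
    "Tensor TFLOPS": (320, 1400),
}
for name, (old, new) in figures.items():
    print(f"{name}: {old} -> {new} ({new / old:.2f}x)")
# Shader 2.25x, RT ~2.56x, OFA ~2.38x, Tensor ~4.38x
```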
15:17 UTC: Power efficiency is more than doubled, but power goes up to 450 W now.
15:18 UTC: GeForce RTX 4090 will be available on October 12, priced at $1600. It comes with 24 GB GDDR6X and is 2-4x faster than RTX 3090 Ti.
15:18 UTC: RTX 4080 is available in two versions, 16 GB and 12 GB. The 16 GB version starts at $1200, the 12 GB at $900. 2-4x faster than RTX 3080 Ti.
15:19 UTC: New pricing for RTX 30-series, "for mainstream gamers", RTX 40-series "for enthusiasts".
15:19 UTC: "Ada is a quantum leap for gamers"—improved ray tracing, shader execution reordering, DLSS 3.
15:20 UTC: Updates to Omniverse

15:26 UTC: Racer X demo was built by a few dozen artists in just 3 months.
15:31 UTC: Digital twins will play a vital role in product development and lifecycle maintenance.
15:31 UTC: Over 150 connectors to Omniverse.
15:33 UTC: GDN (graphics delivery network) is the new CDN. Graphics rendering over the Internet will be as big in the future as streaming video is today.
15:37 UTC: Omniverse Cloud, a planetary-scale GDN
15:37 UTC: THOR SuperChip for automotive applications.

15:41 UTC: NVIDIA next-generation Drive

333 Comments on NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

#176
cvaldes
Hofnaerrchen: I know, but it makes quite a difference whether it is said in the keynote or just on the website. I doubt many people go to the website to check it out these days. Most people watch videos and don't read tech sheets (at all^^) anymore.
Massive power consumption is likely not something NVIDIA prefers to highlight in a limited duration event. They can't touch on every single feature and data point so they need to be thoughtful about what to address and what to leave out.

Do some people want to know the thicknesses of the thermal pads on the ROG Strix cooler? I'm sure there are a few people here who would love some information.

I know there are a lot of lazy people on the Internet, many more now than there were 10-15 years ago when people actually visited corporate websites and knew what the acronyms RTFM and STFW meant. And not just here at TPU. Everywhere online about pretty much every topic.

And if Jensen had stepped on stage at CES in Las Vegas twenty years ago, he still wouldn't have had time to go over all of the details on power requirements.

Should I throw a tantrum because Mercedes-Benz didn't mention how heavy the E-Class seatbelt buckles are in their 15-second television commercial? Should TPU post the entire contents of the ASUS website for a given product, plus exploded-view schematics? Should a software review of DaVinci Resolve also include the 1000+ page manual?
#177
Darller
Nice. The price is about what I expected. Looking forward to that sweet 4090 goodness. I may upgrade to triple 4K OLEDs for the sim rig.
#178
N3M3515
RTX 3080 MSRP: $700 (EXPENSIVE) - RTX 4080 MSRP: $900... WTF?
#179
IllIIIl
Price (USD) / performance (avg 1080p FPS):
RX 6600: 260 / 109 ≈ 2.4
RX 6800 XT: 600 / 203 ≈ 3.0
RTX 3070 Ti: 600 / 163 ≈ 3.7
RTX 4090: 1600 / 422 ≈ 3.8
RTX 3090 Ti: 1100 / 211 ≈ 5.2

Based on previous reviews, assuming the 4090 is indeed about twice the performance of the 3090 Ti on average, using the 4090's MSRP and the approximate prices I've found now. Performance figures are from the YouTuber Hardware Unboxed.

The price/performance ratio of the 4090 is indeed better than the 3090 Ti's, but the 3090 Ti's own price/performance ratio is terrible.

And that doesn't include continued price cuts for old graphics cards, or new graphics cards costing more than MSRP.
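For anyone who wants to reproduce the arithmetic, here is the same price-per-frame calculation in Python. Prices and FPS figures are as listed above; the 4090 row rests on the poster's ~2x 3090 Ti assumption.

```python
# Price-per-frame arithmetic from the list above (USD / avg 1080p FPS).
# Lower is better. The 4090 FPS figure assumes ~2x RTX 3090 Ti performance.
cards = {
    "RX 6600": (260, 109),
    "RX 6800 XT": (600, 203),
    "RTX 3070 Ti": (600, 163),
    "RTX 4090": (1600, 422),
    "RTX 3090 Ti": (1100, 211),
}
for card, (price, fps) in sorted(cards.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{card}: ${price} / {fps} fps = {price / fps:.2f} USD per frame")
```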
#180
Sisyphus
ARF: It really depends on how AMD wants to manage its sales. It now has a golden chance to dismantle NVIDIA completely and make its lineup DOA.
If I were AMD, I would launch an RX 7600 XT with "4080-12" performance for $649 and call it a day.
The 4090 will sell well to a special peer group. The 4080's price will drop later, depending on AMD's real performance and pricing. NVIDIA's 10-15% price premium over AMD will continue (if only rasterization is compared) for various reasons.
#181
Dirt Chip
I will wait for those 175 W 40-series GPUs, but it's nice to see the dramatic uplift in performance at the same wattage (100-300 W range).

Ada's TSMC 4 nm looks like an order of magnitude better than Ampere's Samsung 8 nm.
That will allow NVIDIA's cards to stretch their legs in comfort, for whoever is willing to pay the temperature and wattage bill.
#182
usiname
If you think that a 4080 12 GB for $900, with slower rasterization than the 3090/3090 Ti and a crappy 192-bit, 500 GB/s memory bandwidth, is stupid, just wait for the "real" 4070 with the same die but fewer cores (48 of 60), on par with the 3080 for just $700.
#183
ARF
Tek-Check: It seems to me that it's the 7800 that will try to be competitive with the 4080 12 GB, the 7800 XT with the 4080 16 GB, and the 7700 XT with the 4070.
If this happens, it would be a complete nightmare and a disaster/epic fail for AMD.
Really, AD104 should be called RTX 4060 Ti, not "RTX 4080-12".

AD104: 7,680 shaders / 500 GB/s over a 192-bit bus
GA102: 8,704 shaders / 760 GB/s over a 320-bit bus

:confused::kookoo:
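As a side note, the quoted bandwidth figures follow directly from bus width times per-pin data rate. A small sketch, assuming the commonly cited GDDR6X speeds (21 Gbps for AD104's 192-bit configuration, 19 Gbps for the RTX 3080's 320-bit GA102 configuration); those data rates are my assumption, not from the post above.

```python
# bandwidth (GB/s) = (bus width in bits / 8 bytes) * per-pin data rate (Gbps).
# Data rates below (21 and 19 Gbps GDDR6X) are assumed, commonly cited values.
def bandwidth_gb_s(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(192, 21))  # AD104 "4080-12": 504 GB/s (~500 GB/s quoted)
print(bandwidth_gb_s(320, 19))  # GA102 RTX 3080: 760 GB/s
```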
#184
dogwitch
Oh, the wait for 40-series pricing. "It will be cheaper"...
That will age poorly.
#185
Valantar
AusWolf: I guess there's no 4080 12 GB FE, only AIB cards. It really should have been called the 4070. I also guess it would have looked really bad to release a 285 W x70-series card; that's why the two vastly different 4080s. I'm curious what the x60 and x50 tiers will look like, and how the 4070 will be positioned if there's gonna be one.

I still hold my position - not impressed.
Yep, that would definitely not have looked good. I wonder if calling the 16GB the 4080 Ti would have been better, though that would have left them without an option for a later release of a more fully enabled chip (4080 Ti Super? Ugh.).

This to me looks like Nvidia is adding even more SKUs than previously, and trying to drive up ASPs across each category. Which IMO doesn't bode well for lower end SKUs and value/performance increases. I'd be happy if we see anything more than the 1:1 perf/price increase of the previous generation, but I'm not very hopeful.
#186
ARF
Valantar: Yep, that would definitely not have looked good. I wonder if calling the 16GB the 4080 Ti would have been better, though that would have left them without an option for a later release of a more fully enabled chip (4080 Ti Super? Ugh.).

This to me looks like Nvidia is adding even more SKUs than previously, and trying to drive up ASPs across each category. Which IMO doesn't bode well for lower end SKUs and value/performance increases. I'd be happy if we see anything more than the 1:1 perf/price increase of the previous generation, but I'm not very hopeful.
How do you explain the performance gap between 4090 and "4080-16" - 16,384 vs 9,728 shaders?
4090 has 68% more shaders?

This is not normal.

Look:

RTX 3090 Ti - 10,752
RTX 3090 - 10,496
RTX 3080 Ti - 10,240
RTX 3080 - 8,704

3090 Ti has only 23.5% more shaders than 3080.
And 3090 Ti has 2% more shaders than 3090.
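Those percentages check out; a quick verification of the gaps from the listed shader counts:

```python
# Checking the shader-count gaps cited above.
shaders = {
    "RTX 4090": 16384, "RTX 4080 16GB": 9728,
    "RTX 3090 Ti": 10752, "RTX 3090": 10496,
    "RTX 3080 Ti": 10240, "RTX 3080": 8704,
}
print(f"4090 vs 4080 16GB: +{shaders['RTX 4090'] / shaders['RTX 4080 16GB'] - 1:.1%}")  # +68.4%
print(f"3090 Ti vs 3080:   +{shaders['RTX 3090 Ti'] / shaders['RTX 3080'] - 1:.1%}")    # +23.5%
print(f"3090 Ti vs 3090:   +{shaders['RTX 3090 Ti'] / shaders['RTX 3090'] - 1:.1%}")    # +2.4%
```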
#187
Dirt Chip
Hopefully we can have DLSS 3 without RT (RT off), so a 4050/4060 will be just fine for 4K 60 FPS at settings below max/ultra.
ARF: How do you explain the performance gap between 4090 and "4080-16" - 16,384 vs 9,728 shaders?
4090 has 68% more shaders?

This is not normal.

Look:

RTX 3090 Ti - 10,752
RTX 3090 - 10,496
RTX 3080 Ti - 10,240
RTX 3080 - 8,704

3090 Ti has only 23.5% more shaders than 3080.
And 3090 Ti has 2% more shaders than 3090.
4080 12gb
4080 12gb ti
4080 12gb super
4080 12gb super ti

4080 16gb
4080 16gb ti
4080 16gb super
4080 16gb super ti

4090 20gb/ti/super/super ti

4090 24gb ti/super/ super ti

and you can have every bit of +/-256 shaders you like
:)
#188
maxfly
I forgot who said it here, but they obviously anticipate having to cut pricing at some point. Start higher so you can go lower in a year. Then release the Tis to fill the gaping performance holes they've purposely left open, so once again there's that "I'm getting a deal" feel.

The above moves forward by 6 months if AMD throws a wrench in the works. Throw away, Lisa Su, throw away. We need you to increase that market share, girl.
#189
Valantar
ARF: How do you explain the performance gap between 4090 and "4080-16" - 16,384 vs 9,728 shaders?
4090 has 68% more shaders?

This is not normal.

Look:

RTX 3090 Ti - 10,752
RTX 3090 - 10,496
RTX 3080 Ti - 10,240
RTX 3080 - 8,704

3090 Ti has only 23.5% more shaders than 3080.
And 3090 Ti has 2% more shaders than 3090.
It absolutely isn't. It just establishes the 90 tier as a hyper-exclusive, pie-in-the-sky tier for the ultra wealthy - but at the same time the 80 tier is following its pricing closely, which makes even less sense. This also makes me wonder why the performance differences in Nvidia's slides (not that there was much in terms of performance numbers) were relatively modest.

I gotta say, I don't have much hope for accessibly priced GPUs from Nvidia. With a $1200 80 tier GPU, I wouldn't be surprised at all if the 4000 series gets a $499 or higher 60 tier card.
#190
ARF
Valantar: It absolutely isn't. It just establishes the 90 tier as a hyper-exclusive, pie-in-the-sky tier for the ultra wealthy - but at the same time the 80 tier is following its pricing closely, which makes even less sense. This also makes me wonder why the performance differences in Nvidia's slides (not that there was much in terms of performance numbers) were relatively modest.

I gotta say, I don't have much hope for accessibly priced GPUs from Nvidia. With a $1200 80 tier GPU, I wouldn't be surprised at all if the 4000 series gets a $499 or higher 60 tier card.
Well, Resident Evil Village shows some 70-80% higher performance for the 4090 over the "4080-12".

#191
Dirt Chip
Valantar: It absolutely isn't. It just establishes the 90 tier as a hyper-exclusive, pie-in-the-sky tier for the ultra wealthy - but at the same time the 80 tier is following its pricing closely, which makes even less sense. This also makes me wonder why the performance differences in Nvidia's slides (not that there was much in terms of performance numbers) were relatively modest.

I gotta say, I don't have much hope for accessibly priced GPUs from Nvidia. With a $1200 80 tier GPU, I wouldn't be surprised at all if the 4000 series gets a $499 or higher 60 tier card.
erased, I made a mistake.
#192
ARF
Dirt Chip: Remember the "starting at $329" for the 4060 in the slide
Hmm, that was for the 3060? Can you post an image?
#193
Valantar
ARF: Well, Resident Evil Village shows some 70-80% higher performance for the 4090 over the "4080-12".

Aren't all those comparisons with RT and DLSS enabled? I'm not all that interested in edge cases like that, I want something more representative.
Dirt Chip: Remember the "starting at $329" for the 4060 in the slide, implying that we will see the same 4080 ("starting at $899") 12/16 GB trick in lower tiers.
You mean the 3060? There's no mention of any 40-series SKU below the 4080 12GB. And that's just the MSRP for that card; it's been that since day 1. Nvidia is just for some reason promoting that a year and a half after its launch you might actually have a chance at finding one close to MSRP (rather than any kind of price drop, which would be the logical thing to expect that long after launch).

Which just supports my point: if Nvidia is saying "hey, the 3060 is great value at $329" in mid-to-late 2022, then I wouldn't be surprised at all to see them launch a 4060 Ti at $499 or higher in early 2023 (with a 4060 around $399-450).
#194
Dirt Chip
Valantar: Aren't all those comparisons with RT and DLSS enabled? I'm not all that interested in edge cases like that, I want something more representative.

You mean the 3060? There's no mention of any 40-series SKU below the 4080 12GB. And that's just the MSRP for that card; it's been that since day 1. Nvidia is just for some reason promoting that a year and a half after its launch you might actually have a chance at finding one close to MSRP (rather than any kind of price drop, which would be the logical thing to expect that long after launch).

Which just supports my point: if Nvidia is saying "hey, the 3060 is great value at $329" in mid-to-late 2022, then I wouldn't be surprised at all to see them launch a 4060 Ti at $499 or higher in early 2023 (with a 4060 around $399-450).
You are right, my mistake.
I thought it was the 4060 in the slide.
#195
ARF
Valantar: Aren't all those comparisons with RT and DLSS enabled? I'm not all that interested in edge cases like that, I want something more representative.
I think this is the best representation of the least CPU-bound scenario possible. When the CPU is a bottleneck, the cards literally show no performance difference.
Very odd.
Valantar: You mean the 3060? There's no mention of any 40-series SKU below the 4080 12GB. And that's just the MSRP for that card; it's been that since day 1. Nvidia is just for some reason promoting that a year and a half after its launch you might actually have a chance at finding one close to MSRP (rather than any kind of price drop, which would be the logical thing to expect that long after launch).

Which just supports my point: if Nvidia is saying "hey, the 3060 is great value at $329" in mid-to-late 2022, then I wouldn't be surprised at all to see them launch a 4060 Ti at $499 or higher in early 2023 (with a 4060 around $399-450).
I think NVIDIA uses the old series to fill the gaps because of the unlimited supply - it's hard to get rid of the stock. But EVGA said "bye-bye, NVIDIA" :D
#196
Valantar
ARF: I think this is the best representation of the least CPU-bound scenario possible. When the CPU is a bottleneck, the cards literally show no performance difference.
Very odd.
That's possible, but it's still rather suspect that they're exclusively showing off RT+DLSS titles - though at this point we know that Nvidia cares more about selling their proprietary tech (and thus tying people into their ecosystem) than selling GPUs.
ARF: I think NVIDIA uses the old series to fill the gaps because of the unlimited supply - it's hard to get rid of the stock. But EVGA said "bye-bye, NVIDIA" :D
Well, more like they've got so much stock that they can't launch anything new to replace it without having a fire sale to clear inventory. That would of course eat into their margins, and force them to give retroactive rebates to AIB partners to get them to agree to the sale. Nvidia would much rather tout the value of a two-year-old product still not selling at MSRP.

I really, really hope AMD comes out swinging with 7000 series pricing. This just seems like such a massive opportunity for them, regardless of absolute performance.
#197
Keullo-e
S.T.A.R.S.
Have fun playing with these with electricity prices rising.

Also, WTF is that "4080 12GB"? It should be named a 4070 or 4060 Ti.
#198
THU31
ARF: How do you explain the performance gap between 4090 and "4080-16" - 16,384 vs 9,728 shaders?
4090 has 68% more shaders?

This is not normal.

Look:

RTX 3090 Ti - 10,752
RTX 3090 - 10,496
RTX 3080 Ti - 10,240
RTX 3080 - 8,704

3090 Ti has only 23.5% more shaders than 3080.
And 3090 Ti has 2% more shaders than 3090.
Well, actually the 30 series was not normal in this regard. Look at all the previous generations. The x80 Ti usually had ~40% more shaders. And the Titan (full GPU) had about 50% more. But those Ti cards usually came out a year after a normal x80.

The 3090 and 3090 Ti were absolutely terrible value next to the 3080. The 3080 is one of their best cards, and it was the first time since Kepler (GTX 700 series) that they used their biggest GPU in a normal x80 card.

The 4090 actually looks to be better value than the 4080s. Its biggest problem is the heat output. I could probably save up to buy one, but the card would be unusable in the summer. Undervolted it will consume 300-350 W, but that is still too much for me.
#199
Sisyphus
Valantar: It absolutely isn't. It just establishes the 90 tier as a hyper-exclusive, pie-in-the-sky tier for the ultra wealthy...
The 3090, 4090, etc. are for semi-professionals and creatives, not for gamers; nobody needs 24 GB of VRAM for gaming. But it's nice for mixed use, and a lot cheaper than the Quadro series.
Valantar: ...but at the same time the 80 tier is following its pricing closely, which makes even less sense. This also makes me wonder why the performance differences in Nvidia's slides (not that there was much in terms of performance numbers) were relatively modest. I gotta say, I don't have much hope for accessibly priced GPUs from Nvidia. With a $1200 80 tier GPU, I wouldn't be surprised at all if the 4000 series gets a $499 or higher 60 tier card.
The only explanation for me: gamers love to have the "best" card, and naming/PR is everything. AMD isn't much better, nor much cheaper, so there is room for a good margin. NVIDIA will continue to maximize revenue as long as AMD is behind in RT/Tensor-core technology. Clients who don't care about RT or DLSS are just fine with AMD. No problem at all.
#200
medi01
cvaldes: NVIDIA has something like 80% of the discrete desktop graphics card market
That's revenue share, not unit share, AFAIK.