Tuesday, September 20th 2022

NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

NVIDIA just kicked off the GTC Autumn 2022 Keynote address that culminates in Project Beyond, the company's launch vehicle for its next-generation GeForce RTX 40-series graphics cards based on the "Ada" architecture. These are expected to nearly double the performance over the present generation, ushering in a new era of photo-real graphics as we inch closer to the metaverse. NVIDIA CEO Jensen Huang is expected to take center-stage to launch these cards.

15:00 UTC: The show is on the road.
15:00 UTC: AI remains the center focus, including how it plays with gaming.

15:01 UTC: Racer X is a real-time interactive tech demo. Coming soon.
15:02 UTC: "Future games will be simulations, not pre-baked," says Jensen Huang.
15:03 UTC: This is seriously good stuff (Racer X). It runs on a single GPU, in real time, and uses RTX Neural Rendering.
15:05 UTC: Ada Lovelace is a huge GPU
15:06 UTC: 76 billion transistors, over 18,000 shaders, Micron GDDR6X memory. Shader Execution Reordering is a major innovation, as big as out-of-order execution was for CPUs, with gains of up to 25% in-game performance. Ada is built on TSMC's 4 nm node, using 4N, a custom process designed together with NVIDIA.

There's a new streaming multiprocessor design, with a total of 90 TFLOPS. Power efficiency is doubled over Ampere.
Ray Tracing is on the third generation now, with 200 RT TFLOPS and twice the triangle intersection speed.
Deep Learning AI uses 4th-gen Tensor Cores (1,400 TFLOPS) and a new "Optical Flow Accelerator".
15:07 UTC: Shader Execution Reordering is similar to what we saw with Intel's Xe-HPG.
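Conceptually, SER lets a ray-tracing dispatch regroup threads by what they hit before shading, so neighboring threads execute the same material code instead of diverging. Here is a toy sketch of the idea in Python (purely illustrative; the real feature lives in the GPU's scheduler and is exposed to shaders through NVIDIA's HLSL extensions, and all names below are made up):

```python
# Toy illustration of shader execution reordering (SER), not NVIDIA's API.
# After ray traversal, neighboring threads often hit different materials, so
# shading in launch order diverges. Regrouping hits by a coherence key (the
# material ID here) lets each group run the same shader back-to-back.
from collections import defaultdict

MATERIAL_SHADERS = {
    0: lambda hit: ("diffuse", hit["t"]),
    1: lambda hit: ("glossy", hit["t"]),
}

def shade_in_launch_order(hits):
    # Divergent: adjacent hits may invoke different material shaders.
    return [MATERIAL_SHADERS[h["material"]](h) for h in hits]

def shade_with_reordering(hits):
    # SER-like: bucket hits by material, then shade each bucket coherently.
    buckets = defaultdict(list)
    for h in hits:
        buckets[h["material"]].append(h)
    results = []
    for material, group in sorted(buckets.items()):
        shader = MATERIAL_SHADERS[material]
        results.extend(shader(h) for h in group)
    return results

hits = [{"material": i % 2, "t": float(i)} for i in range(8)]
print(shade_with_reordering(hits))  # all diffuse hits shade together, then glossy
```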
15:08 UTC: Several new hardware-accelerated ray tracing innovations with 3rd gen RTX.
15:09 UTC: DLSS 3 is announced. It brings several new innovations, including temporal components and Reflex latency optimizations, and it generates new frames without involving the graphics pipeline.
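Put differently, the generated frames come from warping rendered frames along motion vectors rather than from the game engine. A toy sketch of the warping idea (a crude stand-in of my own; DLSS 3's Optical Flow Accelerator and neural network are vastly more involved, and `generate_midpoint_frame` is an illustrative name):

```python
import numpy as np

def generate_midpoint_frame(frame_a, flow):
    """Toy flow-based frame generation: warp frame_a halfway along each
    pixel's motion vector to synthesize the in-between frame. DLSS 3 uses a
    hardware optical-flow field plus a neural network, and also consults the
    next rendered frame to fill occlusions; this sketch does neither."""
    h, w = frame_a.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Backward warp: each output pixel looks half a motion vector upstream.
    src_x = np.clip((xs - 0.5 * flow[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip((ys - 0.5 * flow[..., 1]).astype(int), 0, h - 1)
    return frame_a[src_y, src_x]

# A bright column moving 2 px right per frame appears 1 px right in between.
frame_a = np.zeros((4, 4)); frame_a[:, 1] = 1.0
flow = np.zeros((4, 4, 2)); flow[..., 0] = 2.0  # dx = +2 px/frame
print(generate_midpoint_frame(frame_a, flow))   # column 2 is now lit
```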
15:11 UTC: Cyberpunk 2077 to get DLSS 3 and SER. 16 times increase in effective performance using DLSS 3 vs. DLSS 1. MS Flight Simulator to get DLSS 3 support
15:13 UTC: Portal RTX, a remaster just like Quake II RTX, available from November, created with Omniverse RTX Remix.
15:14 UTC: Ada offers a giant leap in total performance. Everything has been increased: 40 -> 90 shader TFLOPS, 78 -> 200 RT TFLOPS, 126 -> 300 OFA TOPS, 320 -> 1400 Tensor TFLOPS.
15:17 UTC: Power efficiency is more than doubled, but power goes up to 450 W now.
15:18 UTC: GeForce RTX 4090 will be available on October 12, priced at $1600. It comes with 24 GB GDDR6X and is 2-4x faster than RTX 3090 Ti.
15:18 UTC: RTX 4080 is available in two versions, 16 GB and 12 GB. The 16 GB version starts at $1200, the 12 GB at $900. 2-4x faster than RTX 3080 Ti.
15:19 UTC: New pricing for RTX 30-series, "for mainstream gamers", RTX 40-series "for enthusiasts".
15:19 UTC: "Ada is a quantum leap for gamers"—improved ray tracing, shader execution reordering, DLSS 3.
15:20 UTC: Updates to Omniverse

15:26 UTC: Racer X demo was built by a few dozen artists in just 3 months.
15:31 UTC: Digital twins will play a vital role in product development and lifecycle maintenance.
15:31 UTC: Over 150 connectors to Omniverse.
15:33 UTC: GDN (graphics delivery network) is the new CDN. Graphics rendering over the Internet will be as big in the future as streaming video is today.
15:37 UTC: Omniverse Cloud, a planetary-scale GDN
15:37 UTC: DRIVE Thor SuperChip for automotive applications.

15:41 UTC: NVIDIA next-generation Drive

333 Comments on NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

#301
ratirt
THU31: Basic living costs, yes. But not electronics, video games, streaming services and other things like that.

I will give you a simple example of what I mean. BLC means bills and food, stuff you absolutely need.
Poland - 1250 EUR average income - 750 EUR basic living costs (60%) - 500 EUR left
Germany - 4000 EUR average income - 2400 EUR basic living costs (60%) - 1600 EUR left

Who has an easier time buying a graphics card that costs 1000 EUR or more (same price in both countries)? ;)
New video games cost about 70 EUR over here, which is completely insane.
Again, electricity in Poland is cheaper. That much I know for sure.
Germany at 4K EUR? I'm not sure that is true. My friend is a nurse and she gets a measly 2K-2.4K EUR, and there are plenty of less lucrative professions. In Poland that would be considered a lot, yet in Germany you also pay a lot more for everything else.
Data from 2021 show the average Polish salary is around 90,720 PLN, while Germans averaged 47,700 EUR in 2020. Don't get me wrong, but those are gross figures. Factor in the tax rates of both countries and you will see your numbers are off. In Poland you pay 12%, and 32% on whatever exceeds 120K PLN, so on the salary above a Polish person keeps about 80K PLN per year after tax. In Germany the rate runs from 14% to 42% across the 9,985-58,596 EUR range, so on the number above you are left with roughly 26K EUR. These are averages, and it also depends on which tax category you land in in Germany.
Just looking at the numbers, you can see Germans don't have it that sweet. Working in Germany while living in Poland is a totally different story. You have to consider that things in Germany are far more expensive than in Poland.
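For what it's worth, the bracket arithmetic is easy to sanity-check. A minimal sketch using the rates quoted above (simplified: it ignores allowances, deductions, and social contributions, so real take-home pay is lower):

```python
def progressive_tax(gross, brackets):
    """Tax under simple marginal brackets.
    brackets: list of (upper_limit_or_None, marginal_rate), ascending."""
    tax, lower = 0.0, 0.0
    for upper, rate in brackets:
        top = gross if upper is None else min(gross, upper)
        if top > lower:
            tax += (top - lower) * rate
        if upper is None or gross <= upper:
            break
        lower = upper
    return tax

# Poland (2022, simplified): 12% up to 120k PLN, 32% above.
pl_gross = 90_720
pl_net = pl_gross - progressive_tax(pl_gross, [(120_000, 0.12), (None, 0.32)])
print(f"Poland: {pl_net:,.0f} PLN net")  # ~80k PLN, matching the post

# Germany's 14-42% zone rises smoothly rather than in flat brackets, and
# social contributions come on top, which is how 47,700 EUR gross can shrink
# toward the ~26K EUR figure above; a flat-bracket sketch would overshoot.
```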
Posted on Reply
#302
ModEl4
AusWolf: What the F is this? 90 < 80 < 81 < 117 fps? Did nvidia go full retard? :kookoo:
I know, I agree, it's confusing. The bars don't represent FPS but how much faster the 4090 is compared to the previous gen. (They reused the earlier charts, which were all about how much faster the 4090 is relative to the 3090 Ti, and just added the achieved FPS on top.)
THU31: If Portal runs at 117 FPS with 6x DLSS performance increase, does that mean it runs at 19.5 FPS without DLSS, on a 4090? Sounds pretty bad. :D
Was it 6x? I only saw the graph below (around 2.85x). So probably 117 FPS with frame interpolation, rendered at 1080p (DLSS 4K Performance), and around 41 FPS at 4K native.

Posted on Reply
#303
THU31
ModEl4: Was it 6x? I only saw the graph below (around 2.85x). So probably 117 FPS with frame interpolation, rendered at 1080p (DLSS 4K Performance), and around 41 FPS at 4K native.
Look at the picture I quoted in my last message, you can see it a few posts before mine.

It shows a comparison between native and DLSS 3. Portal goes up to 6x (or almost that, I guess about 5.6x).
Posted on Reply
#304
ModEl4
THU31: Look at the picture I quoted in my last message, you can see it a few posts before mine.

It shows a comparison between native and DLSS 3. Portal goes up to 6x (or almost that, I guess about 5.6x).
You're right: in the picture I quoted, NVIDIA implies the 3090 Ti is running DLSS Performance mode, not native 4K.
So if I interpret the graphs correctly, it's around 5.6x (3090 Ti native 4K vs. 4090 at 4K DLSS Performance (1080p) + frame interpolation = DLSS 3.0),
or around 2.8x (3090 Ti at 4K DLSS Performance (1080p) vs. 4090 at 4K DLSS Performance (1080p) + frame interpolation = DLSS 3.0).
117/5.6 = 20-21 FPS, but for the 3090 Ti at 4K native, not for the 4090!
Posted on Reply
#305
THU31
Where are you getting 3090 Ti from? The slide does not mention any GPUs at all. It is meant to show what DLSS 3 is doing on the same card (whichever one it might be, but I think it is safe to assume it is the 4090).

I expect the games on the left side of the slide are CPU-limited, that is why only the interpolation is causing a 2x increase in framerate. Anything above 2x means that lowering the rendering resolution gives a boost, after which the framerate is doubled by DLSS 3.

The bottom of the slide only mentions that the resolution is 4K with DLSS Performance and the interpolation enabled.

cdn.wccftech.com/wp-content/uploads/2022/09/NVIDIA-Ada-Lovelace-GPU-GeForce-RTX-4090-RTX-4080-Series-Graphics-Cards-_7-1480x833.jpg
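One way to read that slide is as a simple two-stage model: upscaling raises the rendered frame rate until it hits the CPU ceiling, and frame generation then doubles whatever comes out. A toy sketch (all numbers are made up for illustration; the 2.8x upscaling gain is my assumption, not an NVIDIA figure):

```python
def dlss3_fps(native_gpu_fps, cpu_limit_fps, upscale_speedup=2.8):
    """Toy model of DLSS 3 scaling on a single card.
    native_gpu_fps: what the GPU renders at native 4K.
    cpu_limit_fps: ceiling imposed by the CPU and game logic.
    upscale_speedup: assumed gain from rendering at 1080p (DLSS Performance)."""
    rendered = min(native_gpu_fps * upscale_speedup, cpu_limit_fps)
    return rendered * 2  # frame generation roughly doubles delivered frames

# CPU-limited title: lowering render resolution can't help, so the gain over
# plain upscaling is only the ~2x from interpolation.
print(dlss3_fps(native_gpu_fps=60, cpu_limit_fps=90))   # 180.0 (2x over 90)
# GPU-limited title: both stages stack, approaching the ~5.6x Portal figure.
print(dlss3_fps(native_gpu_fps=21, cpu_limit_fps=500))  # 117.6, vs 21 native
```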
Posted on Reply
#306
Sisyphus
Raw compute performance 3090 30 TFLOPS, 4090 90 TFLOPS.
Posted on Reply
#307
ModEl4
THU31: Where are you getting 3090 Ti from? The slide does not mention any GPUs at all. It is meant to show what DLSS 3 is doing on the same card (whichever one it might be, but I think it is safe to assume it is the 4090).

I expect the games on the left side of the slide are CPU-limited, that is why only the interpolation is causing a 2x increase in framerate. Anything above 2x means that lowering the rendering resolution gives a boost, after which the framerate is doubled by DLSS 3.

The bottom of the slide only mentions that the resolution is 4K with DLSS Performance and the interpolation enabled.

cdn.wccftech.com/wp-content/uploads/2022/09/NVIDIA-Ada-Lovelace-GPU-GeForce-RTX-4090-RTX-4080-Series-Graphics-Cards-_7-1480x833.jpg
Yeah, you're probably right!
I can't wait for independent testing, because if it is indeed 4090-only data, then NVIDIA claims, for example, that in Cyberpunk with max RT your frame rate quadruples when you enable DLSS 3.0!
I wonder, can it be that good in games that push ray tracing hard?
Anyway, in classic raster, since Jensen claimed the 4090 will be 2x vs. the 3090 Ti, I expect it to be less (10-15% lower, so 1.7-1.8x).
I think Jensen also said 2x vs. the 2080 at the 3080 launch, but the actual difference at launch was around 1.65x, not 2x, so the logical assumption is around 1.65x this time as well!
Posted on Reply
#308
THU31
I made a new slide for NVIDIA's marketing. :)



Original for comparison.

Posted on Reply
#309
Sisyphus
2080 Ti: 14.2 TFLOPS, RTX Titan: 16.3 TFLOPS (all FP32, rasterization).
On price: there have been significant price cuts across all 30-series products and AMD GPUs. Whoever wants the newest high-end card, barely on the market, has to pay the early-adopter price and could run into early-adopter problems. Enough said. I will wait six months to see real performance, the impact of the new AMD series, a likely second revision of AIB cards, and consolidated market pricing. Until then, have a nice time, and don't forget to enjoy what you already have.
Posted on Reply
#310
dogwitch
Sisyphus: 2080 Ti: 14.2 TFLOPS, RTX Titan: 16.3 TFLOPS (all FP32, rasterization).
On price: there have been significant price cuts across all 30-series products and AMD GPUs. Whoever wants the newest high-end card, barely on the market, has to pay the early-adopter price and could run into early-adopter problems. Enough said. I will wait six months to see real performance, the impact of the new AMD series, a likely second revision of AIB cards, and consolidated market pricing. Until then, have a nice time, and don't forget to enjoy what you already have.
Most AMD GPUs are in the server market; they're making a killing on them. The gamer market gets the leftovers.
Posted on Reply
#311
gffermari
dir_d: I don't know how you can say that when games get more complicated, with more textures, every year. I run 4K 120 Hz on a 3080 Ti and I would still love more raster for a lower price than NVIDIA is providing. RT adds a tiny bit of immersion for me, but most of the time I am too focused on the game to really look at RT features. I need high FPS with no dips or tearing in the latest games.
A tiny bit?
Maybe I'm getting old, or have played too many games, but I can no longer play games without RT: simple baked lighting, floating objects, and a flat image.
To be fair, there are games with no RT that manage to capture the environment, not correctly, but in a way that doesn't trigger the player's brain.
ratirt: Apparently it counts for you, since 90% of gamers don't care much and the majority of games still use raster.
Just because a company is pushing RT does not mean it has become mainstream, especially with kinda pathetic performance.
Only those who play competitive games, and only those, do not care about RT.
It's the holy grail in single-player games.

Performance will always be something we have to work around when new tech is added. When the GeForce 3 Ti introduced Pixel Shader 1.x, performance was bad. Should we have stayed with the MX cards because they were faster (showing flat shxxxxxxty surfaces everywhere...)?
Posted on Reply
#312
oxrufiioxo
gffermari: A tiny bit?
Maybe I'm getting old, or have played too many games, but I can no longer play games without RT: simple baked lighting, floating objects, and a flat image.
To be fair, there are games with no RT that manage to capture the environment, not correctly, but in a way that doesn't trigger the player's brain.

Only those who play competitive games, and only those, do not care about RT.
It's the holy grail in single-player games.

Performance will always be something we have to work around when new tech is added. When the GeForce 3 Ti introduced Pixel Shader 1.x, performance was bad. Should we have stayed with the MX cards because they were faster (showing flat shxxxxxxty surfaces everywhere...)?
I agree, baked lighting looks substantially worse in most cases vs. RT lighting done properly; same with SSR vs. RT reflections. Hell, even ambient occlusion looks substantially better when done with RT.

Only games with very static environments, like The Last of Us Part II, pull off baked lighting pretty decently, but I'm personally not a huge fan of games that linear, and even then it would benefit from RT.

Flagship/high-end GPUs should not only be able to do RT but do it well. Midrange I can still give a pass, for now.
Posted on Reply
#313
80-watt Hamster
gffermari: Only those who play competitive games, and only those, do not care about RT.
It's the holy grail in single-player games.
Ray tracing has its benefits, sure. But both of these claims are... bold, let's say. Don't get me wrong; I'm all for tools that will make good lighting easier to implement (lighting is probably my most common gripe in 3D games), but that's all RT is: another tool in the box.
Posted on Reply
#314
Valantar
gffermari: A tiny bit?
Maybe I'm getting old, or have played too many games, but I can no longer play games without RT: simple baked lighting, floating objects, and a flat image.
To be fair, there are games with no RT that manage to capture the environment, not correctly, but in a way that doesn't trigger the player's brain.

Only those who play competitive games, and only those, do not care about RT.
It's the holy grail in single-player games.

Performance will always be something we have to work around when new tech is added. When the GeForce 3 Ti introduced Pixel Shader 1.x, performance was bad. Should we have stayed with the MX cards because they were faster (showing flat shxxxxxxty surfaces everywhere...)?
What you're describing is poorly done lighting, not an inherent RT/non-RT split. If objects appear floating or the image appears flat, then the developer did a poor job with their lighting. RT makes this easier, and you have to spend far less time applying all the various hacks and tricks inherent to making non-RT lighting look good, but it can definitely be done well that way as well.
Posted on Reply
#315
THU31
I got curious about another thing when I looked at the specs. The AD104 is 295 mm2. How can a card with such a tiny die have a TDP of 285 W? All these cards must be factory overclocked like crazy, just like the 3090 Ti was. They probably did it so that the cards seem super powerful in hopes of justifying the 4080 naming.

In the GTX era, 104/204 dies were always under 200 W on the initial launch. Even the 3070 was only 220 W, and that was 393 mm2.

These cards will undervolt like a dream. You can probably cut the power consumption by 50% and only lose 20% performance or less. If the prices ever get changed to normal, these cards will be very attractive.
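That intuition follows from the physics: dynamic power scales roughly with frequency times voltage squared, and the last few hundred MHz demand disproportionate voltage. A back-of-the-envelope sketch (the operating points below are hypothetical, not measured data):

```python
def relative_power(f_rel, v_rel):
    # Dynamic switching power scales roughly with frequency * voltage^2.
    return f_rel * v_rel ** 2

stock = relative_power(1.00, 1.00)        # stock operating point = 100%
undervolted = relative_power(0.85, 0.78)  # hypothetical: -15% clock, -22% voltage
print(f"power: {undervolted / stock:.0%} of stock")  # ~52%
print("performance: roughly tracks clock, so ~85% of stock")
```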
Posted on Reply
#316
Valantar
THU31: I got curious about another thing when I looked at the specs. The AD104 is 295 mm2. How can a card with such a tiny die have a TDP of 285 W? All these cards must be factory overclocked like crazy, just like the 3090 Ti was. They probably did it so that the cards seem super powerful in hopes of justifying the 4080 naming.

In the GTX era, 104/204 dies were always under 200 W on the initial launch. Even the 3070 was only 220 W, and that was 393 mm2.

These cards will undervolt like a dream. You can probably cut the power consumption by 50% and only lose 20% performance or less. If the prices ever get changed to normal, these cards will be very attractive.
GPUs used to be architecturally limited to relatively low clocks, which inherently limits per-area power draw (at least to some degree - you can still make a crazy inefficient design, obviously). AMD has had a steadier progression in clock speeds, while Nvidia jumped to just below 2 GHz with Pascal and stayed there for three generations, but has now jumped another 600-700 MHz. All the while, AMD's RDNA3 is reported to be nearing 4 GHz, which is downright insane for a GPU. This naturally brings higher per-area power draw, as each transistor is running at higher speeds.

On top of this, AD104 has 38 billion transistors in that area, vs. 17 billion for GA104 - so not only are the transistors running much faster, but there are twice as many of them, in a smaller overall area.
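Those two factors compound. A quick density check from the figures quoted in this thread (transistor counts and die areas as stated above):

```python
# Transistor density from the figures quoted above.
ad104_density = 38e9 / 295   # AD104: ~38B transistors in 295 mm^2
ga104_density = 17e9 / 393   # GA104: ~17B transistors in 393 mm^2 (RTX 3070 die)
print(f"AD104: {ad104_density / 1e6:.0f}M transistors/mm^2")
print(f"GA104: {ga104_density / 1e6:.0f}M transistors/mm^2")
print(f"ratio: {ad104_density / ga104_density:.1f}x denser")
```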
Darller: You are such a spaz. I stated facts:

1. If you can’t afford it you aren’t the target market.
2. I work very hard(12+ hours daily, 6 days per week for over 20 years) and that’s why I can afford these things now, where they were a hardship when I was younger.

If you’re offended by either of those statements, or need to read more into it than what I explicitly stated… well, maybe you should figure out what’s making you so unhappy in life and fix it.
It's always fun when people have nothing to back up their "arguments" and instead resort to name-calling and personal attacks. It really, really isn't my problem that you are clearly incapable of seeing the ideological undertones of your own statements. Because they are there, whether you like it or not. Clear as day to anyone with even the most minute ability to interpret writing. This isn't me being offended, it's me telling you that you're blind to the implications of your own beliefs and words, and attempting to inform you that there might be things that you're missing in what you're saying. Your initial responses made it seem like you didn't really want to be expressing these kinds of things, so maybe you should try and reconsider those beliefs, and fix that instead?
Posted on Reply
#317
AusWolf
THU31: I got curious about another thing when I looked at the specs. The AD104 is 295 mm2. How can a card with such a tiny die have a TDP of 285 W? All these cards must be factory overclocked like crazy, just like the 3090 Ti was. They probably did it so that the cards seem super powerful in hopes of justifying the 4080 naming.

In the GTX era, 104/204 dies were always under 200 W on the initial launch. Even the 3070 was only 220 W, and that was 393 mm2.

These cards will undervolt like a dream. You can probably cut the power consumption by 50% and only lose 20% performance or less. If the prices ever get changed to normal, these cards will be very attractive.
Well, you have nearly 300 W distributed over nearly 300 mm2. My overclocked 6500 XT runs quite cool at 100 W (that is GPU die power, not total board power like on NVIDIA cards) with a roughly 100 mm2 die while maintaining a steady 2.95 GHz. While the GPU temp sits at 55-60 °C, the hotspot can reach 90 °C at times, which isn't bad for a modern GPU.

Coolers will have to have good contact to avoid hotspots, that's for sure. Other than that, my only worry is the heat passed on to the rest of the case. GPU temps should be fine, I think.
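For the record, the power densities of the two dies come out remarkably close. A quick check using the figures above (the 6500 XT's Navi 24 die being roughly 107 mm2 is my assumption):

```python
# Power density (W/mm^2) from the figures discussed above.
dies = {
    "AD104 (RTX 4080 12 GB)": (285.0, 295.0),  # TDP in W, die area in mm^2
    "Navi 24 (RX 6500 XT OC)": (100.0, 107.0), # GPU die power as quoted above
}
for name, (watts, area) in dies.items():
    print(f"{name}: {watts / area:.2f} W/mm^2")
# Both land just under 1 W/mm^2, which is why cooler contact quality and
# hotspot monitoring matter more than headline wattage alone.
```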
Posted on Reply
#318
Dyatlov A
Seems like more 4080s are on the way.
Posted on Reply
#319
AusWolf
Dyatlov A: Seems like more 4080s are on the way.
I don't have money for these... where's the 2 GB version?
Posted on Reply
#320
Dyatlov A
AusWolf: I don't have money for these... where's the 2 GB version?
Maybe that's already a bit too small to call it a 4080.
Posted on Reply
#321
AusWolf
Dyatlov A: Maybe that's already a bit too small to call it a 4080.
4070 maybe?
Posted on Reply
#322
Valantar
AusWolf: 4070 maybe?
4070 Ti, minimum. Or maybe 4080 LE? It's about time they brought back that designation!
Posted on Reply
#323
Dyatlov A
Valantar: 4070 Ti, minimum. Or maybe 4080 LE? It's about time they brought back that designation!
LE? I remember that in the GeForce 2 era, MX, MX200, and MX400 denoted the budget versions. So yes, the 2 GB version could be the GeForce 4080 MX200.
Posted on Reply