
NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

If Portal runs at 117 FPS with 6x DLSS performance increase, does that mean it runs at 19.5 FPS without DLSS, on a 4090? Sounds pretty bad. :D
Or maybe it runs at 117 fps without DLSS, and 702 fps with it? :D
 
Basic living costs, yes. But not electronics, video games, streaming services and other things like that.

I will give you a simple example of what I mean. BLC means bills and food, stuff you absolutely need.
Poland - 1250 EUR average income - 750 EUR basic living costs (60%) - 500 EUR left
Germany - 4000 EUR average income - 2400 EUR basic living costs (60%) - 1600 EUR left

Who has an easier time buying a graphics card that costs 1000 EUR or more (same price in both countries)? ;)
New video games cost about 70 EUR over here, which is completely insane.
Again, electricity in Poland is cheaper. That one I know for sure.
Germany, 4k EUR? I'm not sure that's true. My friend is a nurse and she gets a measly 2K-2.4K EUR, and imagine how many less lucrative professions there are. In Poland that would be considered a lot, yet in Germany you pay a lot for other things as well. Everything is way more expensive there.
Data from 2021 show that the average Polish salary is around 90,720 PLN, while Germans got 47,700 EUR on average per 2020 data. Don't get me wrong, but those are all gross figures. Now, if you factor in the tax rates in both countries, you will see your numbers are way off. In Poland you pay 12%, and 32% on anything above 120k PLN, so with the salary given earlier a Polish person is left with about 80k PLN per year after tax. In Germany you've got 14% to 42% for 9,985-58,596 euros, so with the number given above you are left with about 26k EUR. These are all averages, and it also depends on which tax category you end up in in Germany.
Just looking at the numbers, you can see Germans don't have it that sweet. Working in Germany while living in Poland is a totally different perspective. You have to consider that things in Germany are way more expensive than in Poland.
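For anyone who wants to play with the numbers, here is a rough back-of-the-envelope sketch in Python using the figures quoted above. The exchange rate, the 60% basic-living-cost share, and the flat German effective rate are my own assumptions, so treat the output as illustrative only, not official statistics.

```python
# Rough disposable-income comparison using the figures from the posts above.
# Assumptions (mine, not official data): ~4.7 PLN/EUR exchange rate, 60% of
# net income going to basic living costs, and a ~45% effective German rate
# (tax plus social contributions) to roughly match the ~26k EUR net quoted.

PLN_PER_EUR = 4.7

def net_poland(gross_pln):
    """Simplified 2022 PIT: 12% up to 120k PLN, 32% above."""
    if gross_pln <= 120_000:
        return gross_pln * 0.88
    return 120_000 * 0.88 + (gross_pln - 120_000) * 0.68

def net_germany(gross_eur, effective_rate=0.45):
    """Very rough flat effective rate; real German tax is a progressive formula."""
    return gross_eur * (1 - effective_rate)

poland_net = net_poland(90_720) / PLN_PER_EUR   # ~17k EUR/year (~80k PLN)
germany_net = net_germany(47_700)               # ~26k EUR/year

for country, net_eur in (("Poland", poland_net), ("Germany", germany_net)):
    monthly = net_eur / 12
    disposable = monthly * 0.40                 # what's left after the assumed 60% BLC
    print(f"{country}: {monthly:.0f} EUR/month net, {disposable:.0f} EUR disposable, "
          f"{1000 / disposable:.1f} months of saving for a 1000 EUR GPU")
```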
 
What the F is this? 90 < 80 < 81 < 117 fps? Did nvidia go full retard? :kookoo:
I know, I agree, it's confusing. The bars don't represent fps but how much faster the 4090 is compared to the previous gen (they reused the earlier charts, which were all about how much faster the 4090 was relative to the 3090 Ti, and just added the achieved fps on top of them).

If Portal runs at 117 FPS with 6x DLSS performance increase, does that mean it runs at 19.5 FPS without DLSS, on a 4090? Sounds pretty bad. :D
Was it 6X? I only saw the graph below (around 2.85X). So probably 117 fps with frame interpolation, rendered at 1080p (4K DLSS Performance), and around 41 fps at native 4K.

[Attached image: DLSS 3 performance comparison graph]
 
Was it 6X? I only saw the graph below (around 2.85X). So probably 117 fps with frame interpolation, rendered at 1080p (4K DLSS Performance), and around 41 fps at native 4K.
Look at the picture I quoted in my last message, you can see it a few posts before mine.

It shows a comparison between native and DLSS 3. Portal goes up to 6x (or almost that, I guess about 5.6x).
 
Look at the picture I quoted in my last message, you can see it a few posts before mine.

It shows a comparison between native and DLSS 3. Portal goes up to 6x (or almost that, I guess about 5.6x).
You're right, in the picture I quoted Nvidia implies that the 3090 Ti is running DLSS Performance mode, not native 4K.
So if I'm interpreting the graphs correctly, it's around 5.6X (3090 Ti at native 4K vs 4090 at 4K DLSS Performance (1080p) + frame interpolation = DLSS 3.0),
or around 2.8X (3090 Ti at 4K DLSS Performance (1080p) vs 4090 at 4K DLSS Performance (1080p) + frame interpolation = DLSS 3.0).
117/5.6 = 20-21 fps, but that's for the 3090 Ti at native 4K, not for the 4090!
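Just to spell out that arithmetic, here is a quick Python sketch; the 5.6X reading of the bar and the assumption that frame generation roughly doubles presented frames are mine, not confirmed by the slide.

```python
# Back-of-the-envelope reading of the Portal RTX bar, per the interpretation above.
dlss3_fps = 117        # figure shown on the slide
total_speedup = 5.6    # approximate bar height read off the chart (assumed)
interp_factor = 2.0    # frame generation roughly doubles presented frames (assumed)

baseline_native_fps = dlss3_fps / total_speedup      # ~21 fps, i.e. the old card at native 4K
rendered_before_interp = dlss3_fps / interp_factor   # ~58 fps actually rendered at 1080p internal

print(f"{baseline_native_fps:.0f} fps implied native-4K baseline, "
      f"{rendered_before_interp:.0f} fps rendered before frame generation")
```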
 
Where are you getting 3090 Ti from? The slide does not mention any GPUs at all. It is meant to show what DLSS 3 is doing on the same card (whichever one it might be, but I think it is safe to assume it is the 4090).

I expect the games on the left side of the slide are CPU-limited, that is why only the interpolation is causing a 2x increase in framerate. Anything above 2x means that lowering the rendering resolution gives a boost, after which the framerate is doubled by DLSS 3.

The bottom of the slide only mentions that the resolution is 4K with DLSS Performance and the interpolation enabled.
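A minimal sketch of that decomposition, assuming frame generation roughly doubles presented frames and a hard CPU cap on the rendered frame rate; the specific numbers are illustrative, not taken from the slide.

```python
def dlss3_fps(native_4k_fps, upscaling_gain, cpu_cap_fps=float("inf"),
              interpolation_factor=2.0):
    """Rendered fps benefits from the lower internal resolution but cannot exceed
    the CPU limit; frame generation then roughly doubles the presented frame rate."""
    rendered = min(native_4k_fps * upscaling_gain, cpu_cap_fps)
    return rendered * interpolation_factor

# GPU-limited, heavy RT: the lower render resolution helps a lot -> well above 2x
print(dlss3_fps(native_4k_fps=21, upscaling_gain=2.8))                   # ~118 fps, ~5.6x

# CPU-limited game: upscaling adds nothing, only interpolation counts -> ~2x
print(dlss3_fps(native_4k_fps=90, upscaling_gain=2.8, cpu_cap_fps=90))   # 180 fps, 2x
```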

 
Where are you getting 3090 Ti from? The slide does not mention any GPUs at all. It is meant to show what DLSS 3 is doing on the same card (whichever one it might be, but I think it is safe to assume it is the 4090).

I expect the games on the left side of the slide are CPU-limited, that is why only the interpolation is causing a 2x increase in framerate. Anything above 2x means that lowering the rendering resolution gives a boost, after which the framerate is doubled by DLSS 3.

The bottom of the slide only mentions that the resolution is 4K with DLSS Performance and the interpolation enabled.

Yeah, you are probably right!
I can't wait for independent testing, because if it is indeed the 4090 only, then Nvidia is claiming, for example, that in Cyberpunk with max RT your frame rate quadruples when you enable DLSS 3.0!
I wonder, can it really be that good in games that push ray tracing hard?
Anyway, in classic raster, since Jensen claimed the 4090 will be 2X vs the 3090 Ti, I expect it to be less (10-15% less, so 1.7-1.8X).
I think Jensen also said 2X vs the 2080 at the 3080 launch, but the actual difference at launch was around 1.65X, not 2X, so the logical assumption is around 1.65X this time as well!
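In other words, if you assume the same claim-to-reality ratio as last generation (my assumption, obviously not a measurement):

```python
# Discounting the "2x" claim by how much of the 3080-vs-2080 claim materialized.
claimed = 2.0
last_gen_ratio = 1.65 / 2.0   # what reviews showed vs what was claimed
print(f"expected 4090 raster gain vs 3090 Ti: ~{claimed * last_gen_ratio:.2f}x")  # ~1.65x
```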
 
I made a new slide for NVIDIA's marketing. :)

RTX 40 Giant Leap.png


Original for comparison.

RTX 40 Giant Leap by NV.png
 
2080 Ti 14.2 TFLOPS, RTX Titan 16.3 TFLOPS (all FP32, rasterization)
About price: there have been significant price cuts across all 30x0 products and AMD GPUs. Whoever wants the newest high-end card, barely on the market, has to pay the early adopter price and could run into early adopter problems. Enough said. I will wait six months to see real performance, the impact of the new AMD series, a likely second revision of AIB cards, and a settled market price. Until then, have a nice time, everyone, and don't forget to enjoy what you already have.
 
2080 Ti 14.2 TFLOPS, RTX Titan 16.3 TFLOPS (all FP32, rasterization)
About price: there have been significant price cuts across all 30x0 products and AMD GPUs. Whoever wants the newest high-end card, barely on the market, has to pay the early adopter price and could run into early adopter problems. Enough said. I will wait six months to see real performance, the impact of the new AMD series, a likely second revision of AIB cards, and a settled market price. Until then, have a nice time, everyone, and don't forget to enjoy what you already have.
Most AMD GPUs are in the server market. They're making a killing on them. The gamer market gets the leftovers.
 
I don't know how you can say that when games get more complicated, with more textures, every year. I run 4K 120 Hz on a 3080 Ti and I would still love more raster for a lower price than Nvidia is providing. RT adds a tiny bit of immersion for me, but most of the time I am too focused on the game to really look at RT features. I need high FPS with no dips or tearing in the latest games.

A tiny bit?
Maybe I'm getting old or have played too many games, but I can no longer play games with no RT, or with simple baked lighting, floating objects and a flat image.
To be fair, there are games with no RT that manage to capture the environment; not correctly, but in a way that doesn't trigger the player's brain.

Apparently it counts for you, since 90% of gamers don't care much and the majority of games still use raster.
Just because a company is pushing for RT doesn't mean it has become mainstream. Especially with kind of pathetic performance.

Only those who exclusively play competitive games don't care about RT.
It's the holy grail in single-player games.

Performance will always be something we have to work around when new tech is added. When the GeForce 3 Ti introduced pixel shader 1, the performance was bad. Should we have stayed with the MXs because they were faster? (Showing flat, sh*tty surfaces everywhere...)
 
A tiny bit?
Maybe I'm getting old or have played too many games, but I can no longer play games with no RT, or with simple baked lighting, floating objects and a flat image.
To be fair, there are games with no RT that manage to capture the environment; not correctly, but in a way that doesn't trigger the player's brain.



Only those who exclusively play competitive games don't care about RT.
It's the holy grail in single-player games.

Performance will always be something we have to work around when new tech is added. When the GeForce 3 Ti introduced pixel shader 1, the performance was bad. Should we have stayed with the MXs because they were faster? (Showing flat, sh*tty surfaces everywhere...)

I agree, baked lighting looks substantially worse in most cases vs RT lighting done properly; same with SSR vs RT reflections. Hell, even ambient occlusion looks substantially better when done with RT.

Only games with very static environments, like The Last of Us Part II, pull off baked lighting pretty decently, but I'm personally not a huge fan of games that linear, and even then they would benefit from RT.

Flagship/high-end GPUs should not only be able to do RT but do it well. Midrange I can still give a pass for now.
 
Only those who exclusively play competitive games don't care about RT.
It's the holy grail in single-player games.

Ray tracing has its benefits, sure. But both of these claims are... bold, let's say. Don't get me wrong; I'm all for tools that will make good lighting easier to implement (lighting is probably my most common gripe in 3D games), but that's all RT is: Another tool in the box.
 
A tiny bit?
Maybe I'm getting old or have played too many games, but I can no longer play games with no RT, or with simple baked lighting, floating objects and a flat image.
To be fair, there are games with no RT that manage to capture the environment; not correctly, but in a way that doesn't trigger the player's brain.



Only those who exclusively play competitive games don't care about RT.
It's the holy grail in single-player games.

Performance will always be something we have to work around when new tech is added. When the GeForce 3 Ti introduced pixel shader 1, the performance was bad. Should we have stayed with the MXs because they were faster? (Showing flat, sh*tty surfaces everywhere...)
What you're describing is poorly done lighting, not an inherent RT/non-RT split. If objects appear floating or the image appears flat, then the developer did a poor job with their lighting. RT makes this easier, and you have to spend far less time applying all the various hacks and tricks inherent to making non-RT lighting look good, but it can definitely be done well that way as well.
 
I got curious about another thing when I looked at the specs. The AD104 is 295 mm2. How can a card with such a tiny die have a TDP of 285 W? All these cards must be factory overclocked like crazy, just like the 3090 Ti was. They probably did it so that the cards seem super powerful in hopes of justifying the 4080 naming.

In the GTX era, 104/204 dies were always under 200 W on the initial launch. Even the 3070 was only 220 W, and that was 393 mm2.

These cards will undervolt like a dream. You can probably cut the power consumption by 50% and only lose 20% performance or less. If the prices ever get changed to normal, these cards will be very attractive.
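A quick sketch of the efficiency math behind that undervolting guess; the -50% power / -20% performance split is my assumption, not measured data.

```python
tdp_w = 285          # 285 W board power for the AD104 card
power_scale = 0.50   # assumed power after undervolting / power limiting
perf_scale = 0.80    # assumed remaining performance

undervolted_w = tdp_w * power_scale
efficiency_gain = perf_scale / power_scale
print(f"~{undervolted_w:.0f} W for {perf_scale:.0%} of stock performance "
      f"-> about {efficiency_gain:.1f}x better perf/W")
```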
 
I got curious about another thing when I looked at the specs. The AD104 is 295 mm2. How can a card with such a tiny die have a TDP of 285 W? All these cards must be factory overclocked like crazy, just like the 3090 Ti was. They probably did it so that the cards seem super powerful in hopes of justifying the 4080 naming.

In the GTX era, 104/204 dies were always under 200 W on the initial launch. Even the 3070 was only 220 W, and that was 393 mm2.

These cards will undervolt like a dream. You can probably cut the power consumption by 50% and only lose 20% performance or less. If the prices ever get changed to normal, these cards will be very attractive.
GPUs used to be architecturally limited to relatively low clocks, which inherently (at least to some degree - you can still make a crazy inefficient design, obviously) limits the per-area power draw. AMD has had more of a steady progression in clock speeds, while Nvidia jumped to just below 2 GHz with Pascal and stayed there for three generations, but has now jumped another 600-700 MHz. All the while, AMD's RDNA3 is reported to be nearing 4 GHz, which is downright insane for a GPU. This naturally brings with it higher per-area power draw, as each transistor is running at a higher speed.

On top of this, AD104 has 38 billion transistors in that area, vs. 17 billion for GA104 - so not only are the transistors running much faster, but there are twice as many of them, in a smaller overall area.
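Putting rough numbers on that (die sizes and transistor counts as quoted in this thread, so treat them as approximate):

```python
# Transistor and power density, GA104 vs AD104, using the figures quoted above.
chips = {
    # name: (transistors in billions, die area in mm2, board power in W)
    "GA104 (3070)":      (17, 393, 220),
    "AD104 (4080 12GB)": (38, 295, 285),
}

for name, (xtors_b, area_mm2, power_w) in chips.items():
    print(f"{name}: {xtors_b / area_mm2 * 1000:.0f} M transistors/mm2, "
          f"{power_w / area_mm2:.2f} W/mm2")
# GA104: ~43 M/mm2, ~0.56 W/mm2
# AD104: ~129 M/mm2, ~0.97 W/mm2
```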

You are such a spaz. I stated facts:

1. If you can’t afford it you aren’t the target market.
2. I work very hard(12+ hours daily, 6 days per week for over 20 years) and that’s why I can afford these things now, where they were a hardship when I was younger.

If you’re offended by either of those statements, or need to read more into it than what I explicitly stated… well, maybe you should figure out what’s making you so unhappy in life and fix it.
It's always fun when people have nothing to back up their "arguments" and instead resort to name-calling and personal attacks. It really, really isn't my problem that you are clearly incapable of seeing the ideological undertones of your own statements. Because they are there, whether you like it or not. Clear as day to anyone with even the most minute ability to interpret writing. This isn't me being offended, it's me telling you that you're blind to the implications of your own beliefs and words, and attempting to inform you that there might be things that you're missing in what you're saying. Your initial responses made it seem like you didn't really want to be expressing these kinds of things, so maybe you should try and reconsider those beliefs, and fix that instead?
 
I got curious about another thing when I looked at the specs. The AD104 is 295 mm2. How can a card with such a tiny die have a TDP of 285 W? All these cards must be factory overclocked like crazy, just like the 3090 Ti was. They probably did it so that the cards seem super powerful in hopes of justifying the 4080 naming.

In the GTX era, 104/204 dies were always under 200 W on the initial launch. Even the 3070 was only 220 W, and that was 393 mm2.

These cards will undervolt like a dream. You can probably cut the power consumption by 50% and only lose 20% performance or less. If the prices ever get changed to normal, these cards will be very attractive.
Well, you have nearly 300 W distributed over nearly 300 mm2. My overclocked 6500 XT runs quite cool at 100 W (that is GPU die power, not total board power like on Nvidia cards) with a roughly 100 mm2 die while maintaining a steady 2.95 GHz. While the GPU temp sits at 55-60 °C, the hotspot temp can get up to 90 °C sometimes, which isn't bad for a modern GPU.

Coolers will have to have good contact to avoid hotspots, that's for sure. Other than that, my only worry is the heat passed on to the rest of the case. GPU temps should be fine, I think.
 
4070 Ti, minimum. Or maybe 4080 LE? It's about time they brought back that designation!
LE? I remember in the GeForce 2 era, MX, MX200 and MX400 denoted the budget versions. So yes, the 12 GB version could be the GeForce 4080 MX200.
 