Tuesday, September 20th 2022

NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

NVIDIA just kicked off its GTC Autumn 2022 keynote address, which culminates in Project Beyond, the company's launch vehicle for its next-generation GeForce RTX 40-series graphics cards based on the "Ada" architecture. These are expected to nearly double the performance of the present generation, ushering in a new era of photo-real graphics as we inch closer to the metaverse. NVIDIA CEO Jensen Huang is expected to take center stage to launch these cards.

15:00 UTC: The show is on the road.
15:00 UTC: AI remains the center focus, including how it plays with gaming.

15:01 UTC: Racer X is a real-time interactive tech demo. Coming soon.
15:02 UTC: Future games will be simulations, not pre-baked, says Jensen Huang.
15:03 UTC: This is seriously good stuff (RacerX). It runs on a single GPU, in real-time, uses RTX Neural Rendering
15:05 UTC: Ada Lovelace is a huge GPU
15:06 UTC: 76 billion transistors, over 18,000 shaders, Micron GDDR6X memory. Shader Execution Reordering is a major innovation, as big as out-of-order execution was for CPUs, with gains of up to 25% in-game performance. Ada is built on TSMC 4 nm, using 4N, a custom process co-designed with NVIDIA.

There's a new streaming multiprocessor design with a total of 90 TFLOPS. Power efficiency is doubled over Ampere.
Ray tracing is on its third generation now, with 200 RT TFLOPS and twice the triangle intersection speed.
Deep learning AI uses 4th-gen Tensor Cores, 1,400 TFLOPS, and an "Optical Flow Accelerator."
15:07 UTC: Shader Execution Reordering similar to the one we saw with Intel Xe-HPG
15:08 UTC: Several new hardware-accelerated ray tracing innovations with 3rd gen RTX.
15:09 UTC: DLSS 3 is announced. It brings with it several new innovations, including temporal components, and Reflex latency optimizations. Generates new frames without involving the graphics pipeline.
15:11 UTC: Cyberpunk 2077 to get DLSS 3 and SER. 16 times increase in effective performance using DLSS 3 vs. DLSS 1. MS Flight Simulator to get DLSS 3 support
15:13 UTC: Portal with RTX, a remaster in the vein of Quake II RTX, available from November, created with Omniverse RTX Remix.
15:14 UTC: Ada offers a giant leap in total performance. Everything has been increased: 40 -> 90 TFLOPS shader, 78 -> 200 TFLOPS RT, 126 -> 300 TFLOPS OFA, 320 -> 1400 TFLOPS Tensor.
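As a quick sanity check, the generational uplift implied by these Ampere-to-Ada peak-throughput figures can be worked out directly (a back-of-the-envelope sketch using only the TFLOPS numbers quoted above; peak throughput does not translate 1:1 into game performance):

```python
# Generational uplift implied by the quoted Ampere -> Ada peak figures (TFLOPS).
ampere = {"shader": 40, "rt": 78, "ofa": 126, "tensor": 320}
ada = {"shader": 90, "rt": 200, "ofa": 300, "tensor": 1400}

# Ratio of Ada to Ampere throughput for each unit type.
uplift = {k: ada[k] / ampere[k] for k in ampere}
for name, factor in uplift.items():
    print(f"{name}: {factor:.2f}x")
```

The shader and RT figures land in the 2.25-2.6x range, while the Tensor figure (4.4x) is the outlier, which lines up with DLSS 3 being the headline feature.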
15:17 UTC: Power efficiency is more than doubled, but power goes up to 450 W now.
15:18 UTC: GeForce RTX 4090 will be available on October 12, priced at $1600. It comes with 24 GB GDDR6X and is 2-4x faster than RTX 3090 Ti.
15:18 UTC: RTX 4080 is available in two versions, 16 GB and 12 GB. The 16 GB version starts at $1200, the 12 GB at $900. 2-4x faster than RTX 3080 Ti.
15:19 UTC: New pricing for RTX 30-series, "for mainstream gamers", RTX 40-series "for enthusiasts".
15:19 UTC: "Ada is a quantum leap for gamers"—improved ray tracing, shader execution reordering, DLSS 3.
15:20 UTC: Updates to Omniverse

15:26 UTC: Racer X demo was built by a few dozen artists in just 3 months.
15:31 UTC: Digital twins will play a vital role in product development and lifecycle maintenance.
15:31 UTC: Over 150 connectors to Omniverse.
15:33 UTC: GDN (graphics delivery network) is the new CDN. Graphics rendering over the Internet will be as big in the future as streaming video is today.
15:37 UTC: Omniverse Cloud, a planetary-scale GDN
15:37 UTC: THOR SuperChip for automotive applications.

15:41 UTC: NVIDIA next-generation Drive

333 Comments on NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

#151
Tek-Check
ARF: For me, too. I always look for the next Radeon generation.
The question is whether AMD cares about its customers or not. I mean, we don't want an RX 7700 XT for $879, lol.
It's going to be interesting to see the pricing of RDNA3 in November. Brace for some surprises, as there is a flood of GPUs on secondary markets now.
The 7700 XT is expected to perform similarly to or better than the 3090, which has ~69% better raster performance in relative terms. If the RDNA3 architecture brings that kind of uplift, which it should, then AMD needs to convince people to buy a 7700 XT instead of a used 3090 or a new 4070.

If you can buy a 3090 for less than $700 on eBay, AMD will need to offer a convincing price for the 7700 XT to attract buyers. It cannot cost more than a used 3090, so I expect $600-$650 max. But that is far away, in H1 2023. Things can change a bit.
Posted on Reply
#153
THU31
ARF: Performance:


NVIDIA introduces GeForce RTX 4090/4080 series, RTX 4090 launches October 12th for 1599 USD - VideoCardz.com
The 4080s look pretty bad in games without ray tracing. The 3090 Ti is only 25% faster than the 3080, and the "4080-12" is actually slower here. It must be the terrible memory bandwidth, considering how high the GPU clocks are.

I will hold off until a game with some insane RT visuals comes out (that I actually want to play). My 3080 is more than enough for rasterization. Maybe they will drop the prices next year. No way they can keep this up after the 30 series is gone.
Posted on Reply
#154
RandallFlagg
ModEl4: Just look at the size of the Aorus, it's just ridiculous!
3090 Strix vs. 4090 Strix:

Posted on Reply
#155
ARF
THU31: The 4080s look pretty bad in games without ray tracing. The 3090 Ti is only 25% faster than the 3080, and the "4080-12" is actually slower here. It must be the terrible memory bandwidth, considering how high the GPU clocks are.

I will hold off until a game with some insane RT visuals comes out (that I actually want to play). My 3080 is more than enough for rasterization. Maybe they will drop the prices next year. No way they can keep this up after the 30 series is gone.
Yeah, it just shows that this is really a 4060 Ti, because its memory configuration is very similar to the 3060 Ti's.
And hey, what is a "4080-12" with a 192-bit bus? What's next? A 4060 with a 64-bit bus? lol
Posted on Reply
#156
Abula
Tek-Check: It's going to be interesting to see the pricing of RDNA3 in November.
I don't think it's going to be that interesting. My guess is that the 7900 XT will undercut NVIDIA by $100, release at $1,499 MSRP, and call it a day.
Posted on Reply
#157
ARF
Abula: I don't think it's going to be that interesting. My guess is that the 7900 XT will undercut NVIDIA by $100, release at $1,499 MSRP, and call it a day.
It really depends on how AMD wants to manage its sales. It now has a golden chance to dismantle NVIDIA completely and make its lineup DOA.
If I were AMD, I would launch an RX 7600 XT with "4080-12" performance for $649 and call it a day.
Posted on Reply
#158
cvaldes
ARF: It really depends on how AMD wants to manage its sales. It now has a golden chance to dismantle NVIDIA completely and make its lineup DOA.
If I were AMD, I would launch an RX 7600 XT with "4080-12" performance for $649 and call it a day.
NVIDIA has something like 80% of the discrete desktop graphics card market. I doubt that AMD can defeat NVIDIA with one sharply priced card.

At most, AMD has a chance at nibbling away at NVIDIA's market share.

Let's not forget that NVIDIA's GPUs have excellent adoption as compute engines (AI, etc.) and that NVIDIA's gaming business is now smaller than its data center business.
Posted on Reply
#159
Valantar
cvaldes: NVIDIA has something like 80% of the discrete desktop graphics card market. I doubt that AMD can defeat NVIDIA with one sharply priced card.

At most, AMD has a chance at nibbling away at NVIDIA's market share.

Let's not forget that NVIDIA's GPUs have excellent adoption as compute engines (AI, etc.) and that NVIDIA's gaming business is now smaller than its data center business.
AMD catching up to Nvidia would need to be a concerted multi-year effort, true. But they're not going to get there at all unless they outprice Nvidia.

As for that last point, it has nothing to do with GeForce sales or desktop GPU market share.
Posted on Reply
#160
DarkS0ul
Restricting the new DLSS 3.0 to the latest series, leaving the rest of the RTX lineup out, is a spit in the face of players. I could understand if a given series, like RTX 2000 or RTX 3000, differed in performance by 10-30% depending on how strong the card is. But now it seems as if everything was planned from the beginning. The cryptocurrency boom is over, and suddenly we have technology available only on the latest series? After all, this is not a technological change like GTX vs. RTX. So maybe it is a pre-planned process: on one hand a sudden, huge leap; on the other, no (full) support for cards from the same RTX family?! It's like showing the middle finger to players.
We have to wait for reliable tests to verify the technology, but it shows that the current duopoly, and before it cryptocurrencies, turned out to be deadly for players while generating huge, disproportionate profits for NVIDIA and AMD. Gaming on PC is becoming an abstraction.

If this is confirmed, I don't fully understand game producers and publishers. They limit the potential number of customers with questionable technology. Currently, RT works like an on/off switch for lighting, shadows, etc., yet many titles without RT looked great. A true technological leap would make sense, but this looks like a lack of support. It's like buying a new iPhone, and after a year your iPhone 12 or 13 is no longer usable for basic functions (a performance drop of several dozen percent). With an update, you lose the usefulness of your phone within a year or two. You have to switch to new products as soon as the new series premieres, because the life cycle is extremely short.

Someone will say: great, we have a huge performance increase. And I will say: the RTX 2000 series gave a very small increase in performance in games without RT. There were issues with RT games, and the performance was mediocre: barely 7-10% over the previous series at the same price, while the GTX 1070-1080 Ti cost more but delivered much greater performance gains.
For example:
www.techpowerup.com/review/nvidia-geforce-gtx-1070/24.html shows what we got for $379 ;)
Even with the next series, the fastest card, the RTX 3090 Ti costing $2,000, only managed 25-50 FPS at 4K with RT in some games. For $2,000! If the leap in performance is this large only with DLSS 3.0, it would mean that NVIDIA, having banked its cryptocurrency profits, didn't have to worry about what players think. One market, cryptocurrencies, is over; now, if you want to play, everyone "has" to buy the new series.
Posted on Reply
#161
cvaldes
steen: Reviews are gonna be tricky. No DLSS vs. no DLSS, DLSS 2 vs. DLSS 2, and DLSS 2 vs. DLSS 3.
Graphics card reviewers have already been dealing with this for a while.

The better reviewers typically compare a number of games on pure 3D rasterization alone. They then add some comparisons with RT on but without enhancements like DLSS or FSR. Finally, there are some comparisons between DLSS and FSR, particularly in the handful of titles that support both. With each passing week, more gaming titles support these two technologies.

And we will soon see the addition of XeSS to the mix.

It's not like hardware reviewers are being blindsided by DLSS 3.0.

For the most part, today's graphics cards have enough 3D rasterization performance on a single PCB, which is why NVLink wasn't even included on the Ada cards. Supersampling technologies like DLSS, FSR, and XeSS are most beneficial when ray tracing is enabled.
Posted on Reply
#162
Valantar
AusWolf: No mention of TDP, and the prices are a slap in the face. I don't care about omniverses, either. What a disappointment!
Full specs are on their site, but it's rather telling that you have to go looking for them, and look pretty hard too: you have to click and scroll through several meaningless feature lists to get to some actual specs. It seems like Nvidia really doesn't want to oversell these cards, at least for now (my guess is they won't push them until 30-series supplies are much lower). Here's everything relevant that they've posted:


So, stock power limits match the 30 series, though there are strong indications that AIB partner cards will dramatically exceed them. The FE coolers are huge chunguses, except for the 12 GB, which ... doesn't have one? Uh, okay? The 4080 12 GB should be very, very noticeably slower than the 16 GB.
Posted on Reply
#163
cvaldes
Valantar: AMD catching up to Nvidia would need to be a concerted multi-year effort, true. But they're not going to get there at all unless they outprice Nvidia.

As for that last point, it has nothing to do with GeForce sales or desktop GPU market share.
To my knowledge, AMD Radeon has historically been less expensive than comparable GeForce products, at least over the past five years. From a performance-per-dollar metric on 3D rasterization, Radeon is the better buy, at least at 1080p and 1440p. However, AMD's position weakens at 4K and 8K, as well as with other features like RT and DLSS/FSR.

With more gaming titles adopting RT and DLSS/FSR (and now XeSS), the performance comparison now takes multiple charts, tables, and graphs. The old paradigm of "run this 3D benchmark utility to compare scores" from five years ago is increasingly obsolete.

And cherry-picking one or two games (MS Flight Simulator, Cyberpunk 2077, Red Dead Redemption, whatever) isn't particularly useful. The better analyses use a battery of games to average out the architectural advantages and disadvantages of each manufacturer, whether it be DX11 vs. DX12 or newer features like supersampling.

I don't know about anyone else here but I happen to play some older titles on newer hardware, including games that aren't always part of a graphics card review.
Posted on Reply
#164
Tek-Check
Abula: I don't think it's going to be that interesting. My guess is that the 7900 XT will undercut NVIDIA by $100, release at $1,499 MSRP, and call it a day.
The highest market tier is not going to be interesting for most people, so there is no excitement in 7900 XT vs. 4090. You are right.
More interesting will be the upper mid-range and mid-range. Do not forget that AMD will also offer DP 2.0 ports, a PCIe 5.0 x16 and x8 interface, as well as more VRAM, for more future-proof products.
ARF: It really depends on how AMD wants to manage its sales. It now has a golden chance to dismantle NVIDIA completely and make its lineup DOA.
If I were AMD, I would launch an RX 7600 XT with "4080-12" performance for $649 and call it a day.
It seems to me that it's the 7800 that will try to be competitive with the 4080 12 GB, the 7800 XT with the 4080 16 GB, and the 7700 XT with the 4070.
Posted on Reply
#165
Chrispy_
dir_d: Those prices are garbage.
I feel like Nvidia might be losing the plot here. We on TPU (according to the survey) definitely fit into the minority niche of hardware enthusiasts. Something approaching 99% of the population will ask themselves: "Should I buy a graphics card for my PC, or should I buy a PS5/Xbox?" When the GPU costs so much more than a decent console and comes with the additional burden of needing a new PSU as well, it's not really relevant to the vast majority.

If, when the sub-$500 cards launch, they can run on a reasonable PSU that someone with a last-gen PC might have (probably a 650-850 W unit), then we can tell for sure whether Nvidia has lost the plot or not.
Posted on Reply
#166
Arkz
My favourite part is where they decided to call the 4070 the 4080 12GB instead. That will sell more units. Watch the 4060 come out as the 4080 8GB.
Posted on Reply
#167
Chrispy_
Valantar: Full specs are on their site
The specs are kinda useless, since Nvidia have once again redefined what counts as "one CUDA core" in the architecture. Per CUDA core, Turing was vastly better than Ampere, but the numbers went up regardless. Until we get a full independent review of Lovelace vs. Ampere, the core and clock numbers are meaningless, because we don't know how potent each core is relative to the current gen.
Arkz: My favourite part is where they decided to call the 4070 the 4080 12GB instead. That will sell more units. Watch the 4060 come out as the 4080 8GB.
I'm in the queue for my low-profile, single-slot 4080 4GB DDR4 :D
Posted on Reply
#168
Tek-Check
Valantar: AMD catching up to Nvidia would need to be a concerted multi-year effort, true. But they're not going to get there at all unless they outprice Nvidia.
Not just that, but RDNA3 cards will offer PCIe 5.0 connectivity, DP 2.0 and more VRAM. It should be an attractive package. We shall see.
Posted on Reply
#169
ModEl4
Valantar: Full specs are on their site, but it's rather telling that you have to go looking for them, and look pretty hard too: you have to click and scroll through several meaningless feature lists to get to some actual specs. It seems like Nvidia really doesn't want to oversell these cards, at least for now (my guess is they won't push them until 30-series supplies are much lower). Here's everything relevant that they've posted:

So, stock power limits match the 30 series, though there are strong indications that AIB partner cards will dramatically exceed them. The FE coolers are huge chunguses, except for the 12 GB, which ... doesn't have one? Uh, okay? The 4080 12 GB should be very, very noticeably slower than the 16 GB.
I expect around -18% at QHD and a little more at 4K.
The reference 3090 with a 350 W TBP needed two PCIe 8-pins, and the 4080 with 320 W needs three?
Posted on Reply
#170
ymdhis
$1200 and $1600, fucking lmao, I called it.
Posted on Reply
#172
Valantar
oxrufiioxo: Ouch, and us Americans think we have it bad....

Also, maybe this was mentioned before, but the 12 GB is a partner-only model.
That's what VAT does to prices when you compare them to US MSRPs, which are always listed without tax. Germany even has it pretty good at 19%; in Norway and Sweden it's 25%.
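To illustrate the point, here is a rough sketch of the conversion: US MSRPs exclude sales tax, while EU shelf prices include VAT. The exchange rate here is a placeholder assumption (EUR/USD was near parity in late 2022), and real retail prices also bake in distribution costs, so actual figures will differ.

```python
# Convert a pre-tax US MSRP to an approximate VAT-inclusive euro price.
def vat_inclusive_eur(usd_msrp: float, vat_rate: float, usd_per_eur: float = 1.0) -> float:
    """usd_msrp excludes tax; the returned euro price includes VAT."""
    return (usd_msrp / usd_per_eur) * (1 + vat_rate)

# RTX 4090's $1,599 MSRP under different VAT regimes (rate assumptions above):
print(round(vat_inclusive_eur(1599, 0.19)))  # Germany, 19% VAT
print(round(vat_inclusive_eur(1599, 0.25)))  # Norway/Sweden, 25% VAT
```

This lands in the right ballpark for the German launch price, with the remaining gap down to exchange-rate movement and retailer margins.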
Posted on Reply
#173
oxrufiioxo
Valantar: That's what VAT does to prices when you compare them to US MSRPs, which are always listed without tax. Germany even has it pretty good at 19%; in Norway and Sweden it's 25%.
Still, it's like $1,730 vs. €1,950.... At least the 4080 16 GB is getting the same cooler as the 4090, and not the super-gimped version they used on everything below the 3090 last gen.
Posted on Reply
#174
AusWolf
Valantar: Full specs are on their site, but it's rather telling that you have to go looking for them, and look pretty hard too: you have to click and scroll through several meaningless feature lists to get to some actual specs. It seems like Nvidia really doesn't want to oversell these cards, at least for now (my guess is they won't push them until 30-series supplies are much lower). Here's everything relevant that they've posted:

So, stock power limits match the 30 series, though there are strong indications that AIB partner cards will dramatically exceed them. The FE coolers are huge chunguses, except for the 12 GB, which ... doesn't have one? Uh, okay? The 4080 12 GB should be very, very noticeably slower than the 16 GB.
I guess there's no 4080 12 GB FE, only AIB cards. It really should have been called the 4070. I also guess it would have looked really bad to release a 285 W x70-series card; that's why there are two vastly different 4080s. I'm curious what the x60 and x50 tiers will look like, and how the 4070 will be positioned if there's going to be one.

I still hold my position: not impressed.
Posted on Reply
#175
Hofnaerrchen
cvaldes: There are published specifications for power.
I know, but it makes quite a difference whether it's stated in the keynote or just on the website. I doubt many people go to the website to check these days. Most people watch videos and don't read (at all ^^) spec sheets anymore.
Posted on Reply