Tuesday, September 20th 2022

NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

NVIDIA just kicked off the GTC Autumn 2022 Keynote address that culminates in Project Beyond, the company's launch vehicle for its next-generation GeForce RTX 40-series graphics cards based on the "Ada" architecture. These are expected to nearly double the performance over the present generation, ushering in a new era of photo-real graphics as we inch closer to the metaverse. NVIDIA CEO Jensen Huang is expected to take center-stage to launch these cards.

15:00 UTC: The show is on the road.
15:00 UTC: AI remains the center focus, including how it plays with gaming.

15:01 UTC: Racer X is a real-time interactive tech demo. Coming soon.
15:02 UTC: "Future games will be simulations, not pre-baked," says Jensen Huang.
15:03 UTC: This is seriously good stuff (Racer X). It runs on a single GPU, in real-time, and uses RTX Neural Rendering.
15:05 UTC: Ada Lovelace is a huge GPU
15:06 UTC: 76 billion transistors, over 18,000 shaders, Micron GDDR6X memory. Shader execution reordering is a major innovation, as big as out-of-order execution was for CPUs, with gains of up to 25% in in-game performance. Ada is built on TSMC 4 nm, using 4N, a custom process designed together with NVIDIA.

There's a new streaming multiprocessor design, with a total of 90 TFLOPS. Power efficiency is doubled over Ampere.
Ray Tracing is on the third generation now, with 200 RT TFLOPS and twice the triangle intersection speed.
Deep Learning AI uses 4th gen Tensor Cores with 1400 TFLOPS and an "Optical Flow Accelerator".
15:07 UTC: Shader Execution Reordering similar to the one we saw with Intel Xe-HPG
15:08 UTC: Several new hardware-accelerated ray tracing innovations with 3rd gen RTX.
15:09 UTC: DLSS 3 is announced. It brings with it several new innovations, including temporal components, and Reflex latency optimizations. Generates new frames without involving the graphics pipeline.
15:11 UTC: Cyberpunk 2077 to get DLSS 3 and SER. 16 times increase in effective performance using DLSS 3 vs. DLSS 1. MS Flight Simulator to get DLSS 3 support
15:13 UTC: Portal RTX, a remaster just like Quake II RTX, available from November, created with Omniverse RTX Remix.
15:14 UTC: Ada offers a giant leap in total performance. Everything has been increased: 40 -> 90 TFLOPS shader, 78 -> 200 TFLOPS RTX, 126 -> 300 TFLOPS OFA, 320 -> 1400 TFLOPS Tensor.
15:17 UTC: Power efficiency is more than doubled, but power goes up to 450 W now.
15:18 UTC: GeForce RTX 4090 will be available on October 12, priced at $1600. It comes with 24 GB GDDR6X and is 2-4x faster than RTX 3090 Ti.
15:18 UTC: RTX 4080 is available in two versions, 16 GB and 12 GB. The 16 GB version starts at $1200, the 12 GB at $900. 2-4x faster than RTX 3080 Ti.
15:19 UTC: New pricing for RTX 30-series, "for mainstream gamers", RTX 40-series "for enthusiasts".
15:19 UTC: "Ada is a quantum leap for gamers"—improved ray tracing, shader execution reordering, DLSS 3.
15:20 UTC: Updates to Omniverse

15:26 UTC: Racer X demo was built by a few dozen artists in just 3 months.
15:31 UTC: Digital twins will play a vital role in product development and lifecycle maintenance.
15:31 UTC: Over 150 connectors to Omniverse.
15:33 UTC: GDN (graphics delivery network) is the new CDN. Graphics rendering over the Internet will be as big in the future as streaming video is today.
15:37 UTC: Omniverse Cloud, a planetary-scale GDN
15:37 UTC: THOR SuperChip for automotive applications.

15:41 UTC: NVIDIA next-generation Drive

333 Comments on NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

#51
oxrufiioxo
CallandorWoTand that image of much less Cuda cores in the 12gb variant is official? so if you really want a 4080 you want the 16gb version which is $1200... man if thats true, I hope RDNA3 kicks ass.
Well, it depends how RDNA 3 is priced; my guess is not cheap. Also, if the 4080 12GB is 25-35% faster in rasterization and around 60% faster in RT than the 3090 Ti, it'll be OK. If it performs about the same, it'll be a major yawn at that price.
Posted on Reply
#52
evernessince
diatribePretty much in line with past generations. At least they didn't go crazy with a $2500 4090. $1599, $1199 and $899 for the 4090, 4080(16?) and 4080(12) are not bad considering.

October 12th.
How is $900 for the xx80 SKU not bad? They've somehow managed to increase the price of that SKU by $200 over Turing's trash pricing, on top of it being only 12GB. It's a joke. The 1080 had an MSRP of $599, and that was for a massive performance and efficiency jump. The 980 was $550. Any way you slice it, it's another price hike.
#53
cvaldes
DragokarOh, trust me they easily could answer that, but also there is a certain amount of $$ going up when something is not really available even if it is ;)

The 4080 12G should not be called 4080........but yeah lets wait and see.
Well, NVIDIA can't predict if scalpers will scoop up available inventory, regardless of whether or not miners are interested at this moment in time.

My guess is that NVIDIA prefers to focus on the positive aspects of the new technology in today's event rather than business in today's rockier economic climate.
#54
ARF
LFaWolfThat is not how corporate America works. It will be cheaper… by maybe a $100?
The most important thing we all need is fair competition, because behind-the-scenes agreements between corporations against the greater good of customers aren't good for anyone.
#55
Valantar
evernessinceHow is $900 for the xx80 SKU not bad? They've somehow managed to increase the price of that SKU $200 OVER turing's trash pricing. That's on top of it being only 12GB. It's a joke.
"But $900 is so much less than $1500! Just imagine how much it could have cost!"

/s, in case that wasn't clear.
LFaWolfThat is not how corporate America works. It will be cheaper… by maybe a $100?
I'd really love to see AMD compete on value again. Sadly, they haven't seemed motivated to do so since they caught up with the competition in the past couple of years. I might be wrong, but I blame shareholder pressure (and the leadership culture it encourages) pushing to increase margins rather than sales volumes. Such an odd strategy for a company with ~20% market share, though.
#56
Hyderz
Mmmm, RTX 4090... want one, but I'm happy with my RTX 3090 for a few more generations :)
#57
oxrufiioxo
ValantarThat's pretty reasonable, sure, as long as you have reliable performance data to go off of, and are also keeping on top of reasonable expectations for gen-over-gen price/perf increases. It's very easy to start thinking "hey, this is faster, so it's reasonable that it costs more" and to forget that we used to get those performance increases for the same price as the previous generation of the same tier.
I've owned pretty much every flagship over the last decade.

Other than the 1080 Ti, which launched quite a bit after the 1080 (the flagship Pascal at launch), we typically got 30-40% gains at most gen on gen; the 2080 Ti with Turing offered almost none outside of RT.

This past gen we got 50% or so going from the 2080 Ti to the 3090, and now it seems to be 100% for this gen.


Don't get me wrong, I think the 4080s are priced too high and the 4090 is priced OK, but I'll reserve final judgment for reviews.
#58
evernessince
ValantarI'd really love to see AMD compete on value again. Sadly they haven't seemed motivated to do so since they caught up with the competition in the past couple of years. I might be wrong, but I blame shareholder pressure (and the leadership culture that encourages) to increase margins rather than sales volumes. Such an odd strategy for a company with ~20% market share though.
Yeah, unfortunately AMD might just follow suit with pricing.
#59
StefanM
So Ada Lovelace has a lower compute capability (8.9) than Hopper (9.0)
Nobody had expected that ...
#60
evernessince
oxrufiioxoThe 4080s are more expensive than I expected, but the 4090 is cheaper; if performance is 2x the 3080 Ti and 3090 Ti like they claim, it's not terrible, I guess.
Nvidia is changing prices to encourage people to buy a higher SKU. The 4090 is still terrible value, it's just that the 4080 (both SKUs) is now worse value making the 4090 look like a better value.
#61
oxrufiioxo
evernessinceYeah, unfortunately AMD might just follow suit with pricing.
I think they'll be cheaper, maybe even much cheaper, for the RX 7900X, but likely due to much lower RT performance and no answer to DLSS 3.0, which can drastically improve framerates even in CPU-limited games.
#63
oxrufiioxo
evernessinceNvidia is changing prices to encourage people to buy a higher SKU. The 4090 is still terrible value, it's just that the 4080 (both SKUs) is now worse value making the 4090 look like a better value.
AMD is sorta doing the same with the 7950X; I guess that's business 101.
#64
Dragokar
cvaldesWell, NVIDIA can't predict if scalpers will scoop up available inventory, regardless of whether or not miners are interested at this moment in time.

My guess is that NVIDIA prefers to focus on the positive aspects of the new technology in today's event rather than business in today's rockier economic climate.
They could, but a proper system costs money, and they don't really care which user is buying them; why should they? The inventory is easily predicted, and only in certain timeframes can you get in trouble, like many did during the pandemic and the mining craze.

That is also a disadvantage of just-in-time production, since barely anyone wants to keep parts in a stock warehouse.

I am still happy to see new GPUs, but I will probably skip the 4080 and 4090 due to their TGP/TDP. My personal max is between 200-250 W, and I am not going down the route that the industry is going. Now it's a matter of waiting for the full release and RDNA3 as well, besides the Intel VaporArc.
#65
evernessince
dick_cheneyPrices are insane, $899 for not even a top sku.

$899 for a slower xx80 SKU with only 12GB. The faster SKU is $1,199. Crazy.
#66
Valantar
oxrufiioxoI've owned pretty much every flagship over the last decade.

Other than the 1080ti which launched quite a bit after the 1080 which was the flagship pascal at launch we typically got 30-40% gains at most gen on gen with almost none other than the 2080ti with Turing other than RT.

This past gen we got 50% ish 2080ti to 3090
And now it seems 100% for this gen.


Don't get me wrong I think the 4080s are priced too high and the 4090 priced ok but I'll reserve final judgment for reviews.
It's true that a 2x increase - if true - is well above the norm for a gen-on-gen increase. But on the other hand, gen-on-gen increases for the past couple of generations have either been lacklustre (Turing - which also bumped prices, yay!) or coupled with a massive price hike (Ampere). That we now seem to be getting a big gen-on-gen increase, but with another price increase is really not good in light of this. It's not as if prices have been stagnant for the past half decade.
oxrufiioxoAmd is sorta doing the same with the 7950X, I guess that's business 101.
Are they? IMO the 7600X looks pretty damn good next to the 7950X. Clock differences are there, sure, but they're not that significant.
#67
AnotherReader
Valantar"But $900 is so much less than $1500! Just imagine how much it could have cost!"

/s, in case that wasn't clear.

I'd really love to see AMD compete on value again. Sadly they haven't seemed motivated to do so since they caught up with the competition in the past couple of years. I might be wrong, but I blame shareholder pressure (and the leadership culture that encourages) to increase margins rather than sales volumes. Such an odd strategy for a company with ~20% market share though.
Yes, the old AMD that used to undercut Nvidia is gone for good. You're right, though; it's an odd strategy for the player with only 20% market share. I hope Intel can kick both incumbents out of their complacency, but I'm not holding my breath.
#68
Hyderz
Those who held onto 10-series cards like the 1080/Ti or 1070/Ti would get a massive increase in performance.
#69
RandallFlagg
oxrufiioxoI've owned pretty much every flagship over the last decade.

Other than the 1080ti which launched quite a bit after the 1080 which was the flagship pascal at launch we typically got 30-40% gains at most gen on gen with almost none other than the 2080ti with Turing other than RT.

This past gen we got 50% ish 2080ti to 3090
And now it seems 100% for this gen.


Don't get me wrong I think the 4080s are priced too high and the 4090 priced ok but I'll reserve final judgment for reviews.
My take on the higher-end GPU models is that they don't compare to the previous gen outside of the model designation.

i.e., I don't think it's correct to compare a 2080 Ti to anything other than a 3080 Ti. Otherwise you have to start looking at the Titan, which doesn't exist now.

It's like if you bought a Ford Fusion before the Ford Taurus came out and chose to compare its pricing to a new Taurus because the Fusion was the top-line Ford sedan at the time you bought it. It's just not a valid comparison; it's a different, more upscale model. Everyone is doing this in all industries; you can't compare house prices from 30 years ago to new houses, because new houses are 40% larger.

Me personally I agree with @Dragokar. I want to see what both AMD and Nvidia can do below 225W.

It's not a tree hugger / green thing; I just don't want to get a bigger PSU and a noisier GPU for something I spend maybe 10% of my time doing. I also don't like how the market has moved beyond what a typical consumer can do with an OEM rig, which will usually limit you to a 6-pin power connector for a GPU. It gets even worse when you look at 75 W cards; it seems like forever since that market has moved at all.
#70
oxrufiioxo
ValantarIt's true that a 2x increase - if true - is well above the norm for a gen-on-gen increase. But on the other hand, gen-on-gen increases for the past couple of generations have either been lacklustre (Turing - which also bumped prices, yay!) or coupled with a massive price hike (Ampere). That we now seem to be getting a big gen-on-gen increase, but with another price increase is really not good in light of this. It's not as if prices have been stagnant for the past half decade.
Definitely agree, but everything is getting more expensive unfortunately: cars, phones, electricity, food... GPUs aren't immune to this, and personally I'd rather they go all out doubling performance than give us 35% more performance at the same price and call it a flagship. I'm as happy about the pricing as everyone else, but I also try to be realistic. Again, once reviews come out and competing RDNA3 cards are released, I'll decide how good or bad these cards are at a given price.

All these cards could be good or bad depending on performance vs. Ampere and RDNA3. I'm personally rooting for much better-priced AMD cards, but I'm also not holding my breath.
#71
Abula
Well, I expected worse; I thought the RTX 4090 was going to be $2k. Not saying I'm happy with $1600, but it's lower than what I thought. That said, I expect the custom coolers to go close to $2k.

I do think the games used to compare the 4090 were a little fishy; I want to see it vs. real AAA games. Does the NDA lift before Oct 12, so we can see 3rd-party reviews before we buy?
#72
cvaldes
DragokarThey could, but a proper system costs money, and they don't really care which user is buying them, why should they. The inventory is easily predicted and only in certain timeframes you can get in trouble like many did within the pandemic and the mining craze.
NVIDIA most certainly knows how many Ada Lovelace GPUs they delivered to AIB partners and they probably have a good idea how many of each card is available to ship.

The point is that they don't know who the buyer is: a university purchasing agent for some research lab, a PC gamer, a crypto miner (less likely today), or a scalper who won't actually use the card but will sell it to one of the first three.

NVIDIA should actually care about scalpers, because customer satisfaction is partially based on the value proposition. By using AIB partners and third-party retail marketplaces, NVIDIA has very little control over scalping. They had very limited success thwarting scalping of Ampere cards in the USA, even for their own Founders Edition models sold directly through their sole retail representative, Best Buy.

Certainly mining demand won't be there like it was two years ago but there's nothing preventing scalpers from scooping up available 4090 inventory and reselling to gamers, technical users, content creators, etc.

We'll all just have to wait and see what happens in October with the 4090 release.
#73
Tek-Check
I cannot see any indication that the Nvidia 4000 series features DisplayPort 2.0 with at least 40 Gbps per port. I can see that MSI cards have DP 1.4 only.
New DP 2.0 monitors are being validated and certified as we speak, and will launch soon.

For such expensive, almost-2023 products, it's not acceptable not to move to the newly available DP standard. Let's see whether AMD's RDNA3 cards deliver on what has been revealed by Phoronix code. Those cards are thought to have support for 80 Gbps ports in software.
#74
oxrufiioxo
RandallFlaggMy take on the GPU market higher end models, is that they don't compare to previous gen outside of the model designation.

i.e. I don't think it's correct to compare a 2080 Ti to anything other than a 3080 Ti. Otherwise you have to start looking at Titan which doesn't exist now.

It's like if you bought a Ford Fusion before the Ford Taurus came out, and choose to compare the pricing to a new Taurus because the Fusion was the top line Ford sedan at the time you bought it. It's just not a valid comparison, it's a different model that is more upscale. Everyone is doing this in all industries, can't compare house prices from 30 years ago to new houses because new houses are 40% larger.

Me personally I agree with @Dragokar. I want to see what both AMD and Nvidia can do below 225W.

It's not a tree hugger / green thing, I just don't want to get a bigger PSU and noisier GPU for something I spend maybe 10% of my time doing. I also don't like how the market has moved beyond what a typical consumer can do with an OEM rig, which will usually limit you down to a 6-pin power connector for a GPU. It gets even worse when you look at 75W cards, seems like forever since that market has moved much at all.
I don't disagree with your take, because neither of us is wrong or right; everyone has to do their own research and decide, based on a number of factors, whether a $900 GPU is right for them. A car is a necessity for most people; a GPU is a commodity and much less important. But I don't necessarily look at cars any differently: I decide what I want to spend and buy whatever offers me the most in that price range.
#75
Hofnaerrchen
Only mentioning efficiency and not actual power consumption is a bad sign for end customers, while it is good news for the professional sector. The cards, already coming with a nice markup, will be even more expensive once their power needs are taken into consideration. The only good news: right now it's just an announcement. Undervolting tests will show whether this generation might still be a worthwhile upgrade. Time will tell; right now, RTX 4000 is rather disappointing.