
NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

RDNA3 will bring balance to the Force. Be patient. The leather jacket is Dark, wait for the Light.
That is not how corporate America works. It will be cheaper… by maybe $100?
 
And that image showing far fewer CUDA cores in the 12GB variant is official? So if you really want a 4080, you want the 16GB version, which is $1,200... man, if that's true, I hope RDNA3 kicks ass.

Well, it depends on how RDNA3 is priced; my guess is not cheap... Also, if the 4080 12GB is 25-35% faster in rasterization and 60% or so faster in RT than the 3090 Ti, it'll be OK. If it performs about the same, it'll be a major yawn at that price.
 
Pretty much in line with past generations. At least they didn't go crazy with a $2,500 4090. $1,599, $1,199 and $899 for the 4090, 4080 (16GB?) and 4080 (12GB) are not bad, considering.

October 12th.

How is $900 for the xx80 SKU not bad? They've somehow managed to increase the price of that SKU by $200 OVER Turing's trash pricing. That's on top of it having only 12GB. It's a joke. The 1080 had an MSRP of $599, and that was with a massive performance and efficiency jump. The 980 was $550. Any way you slice it, it's another price hike.
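To put numbers on that, here's a quick back-of-the-envelope sketch in Python. The 980, 1080 and 4080 12GB figures are from this post; the 2080 and 3080 figures are the commonly cited $699 launch MSRPs, added for context:

# xx80-class launch MSRPs in USD; prints the gen-over-gen price change.
msrps = [("GTX 980", 550), ("GTX 1080", 599), ("RTX 2080", 699),
         ("RTX 3080", 699), ("RTX 4080 12GB", 899)]

for (prev, p0), (cur, p1) in zip(msrps, msrps[1:]):
    print(f"{prev} -> {cur}: {100 * (p1 - p0) / p0:+.0f}%")

Even with the 3080 holding the line at $699, that's roughly a 50% climb from the 1080 to the 4080 12GB before inflation.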
 
Oh, trust me, they could easily answer that, but prices also tend to go up when something isn't really available, even if it officially is. ;)

The 4080 12GB should not be called a 4080... but yeah, let's wait and see.

Well, NVIDIA can't predict if scalpers will scoop up available inventory, regardless of whether or not miners are interested at this moment in time.

My guess is that NVIDIA prefers to focus on the positive aspects of the new technology in today's event rather than on business in the current, rockier economic climate.
 
That is not how corporate America works. It will be cheaper… by maybe $100?

The most important thing we all need is fair competition, because behind-the-scenes agreements between corporations, made against the greater good of customers, aren't good for anyone.
 
How is $900 for the xx80 SKU not bad? They've somehow managed to increase the price of that SKU by $200 OVER Turing's trash pricing. That's on top of it having only 12GB. It's a joke.
"But $900 is so much less than $1500! Just imagine how much it could have cost!"

/s, in case that wasn't clear.
That is not how corporate America works. It will be cheaper… by maybe $100?
I'd really love to see AMD compete on value again. Sadly, they haven't seemed motivated to do so since catching up with the competition over the past couple of years. I might be wrong, but I blame shareholder pressure (and the leadership culture it encourages) for pushing margins over sales volume. Such an odd strategy for a company with ~20% market share, though.
 
Mmmm, RTX 4090... I want one, but I'm happy with my RTX 3090 for a few more generations. :)
 
That's pretty reasonable, sure, as long as you have reliable performance data to go off of, and are also keeping on top of reasonable expectations for gen-over-gen price/perf increases. It's very easy to start thinking "hey, this is faster, so it's reasonable that it costs more" and to forget that we used to get those performance increases for the same price as the previous generation of the same tier.

I've owned pretty much every flagship over the last decade.

Other than the 1080 Ti, which launched quite a bit after the 1080 (the flagship Pascal card at launch), we typically got 30-40% gains at most gen on gen, and with Turing we got almost none outside of RT, the 2080 Ti aside.

This past gen we got 50% or so from the 2080 Ti to the 3090,
and now it seems to be 100% for this gen.


Don't get me wrong: I think the 4080s are priced too high and the 4090 is priced OK, but I'll reserve final judgment for reviews.
 
I'd really love to see AMD compete on value again. Sadly, they haven't seemed motivated to do so since catching up with the competition over the past couple of years. I might be wrong, but I blame shareholder pressure (and the leadership culture it encourages) for pushing margins over sales volume. Such an odd strategy for a company with ~20% market share, though.

Yeah, unfortunately AMD might just follow suit with pricing.
 
So Ada Lovelace has a lower compute capability (8.9) than Hopper (9.0)
Nobody had expected that ...
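For anyone who wants to verify what their own card reports, here's a minimal sketch, assuming a CUDA-enabled PyTorch build is installed:

import torch

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)  # e.g. (8, 9) on Ada
    print(f"Compute capability: {major}.{minor}")
else:
    print("No CUDA device found")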
 
The 4080s are more expensive than I expected, but the 4090 is cheaper than I expected. If performance is 2x the 3080 Ti and 3090 Ti like they claim, it's not terrible, I guess.

Nvidia is changing prices to encourage people to buy a higher SKU. The 4090 is still terrible value; it's just that the 4080 (both SKUs) is now even worse value, making the 4090 look better by comparison.
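To illustrate the upsell structure, here's a rough dollars-per-performance sketch. The performance multipliers are placeholder assumptions pulled from claims in this thread (4090 at 2x a 3090 Ti, 4080 12GB at roughly 1.3x), not benchmarks, and the 4080 16GB figure is a pure guess:

# (launch MSRP in USD, assumed perf relative to a 3090 Ti -- NOT measured).
cards = {
    "RTX 3090 Ti":   (1999, 1.0),
    "RTX 4090":      (1599, 2.0),
    "RTX 4080 16GB": (1199, 1.5),  # hypothetical multiplier
    "RTX 4080 12GB": (899, 1.3),
}

for name, (msrp, perf) in cards.items():
    # Dollars per unit of relative performance: lower is better value.
    print(f"{name}: ${msrp / perf:,.0f} per perf unit")

With those assumed numbers, the whole Ada stack lands near the same dollars-per-performance figure, so the flagship no longer carries the usual premium; that's the nudge toward the higher SKU. Swap in real review numbers once they exist.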
 
Yeah, unfortunately AMD might just follow suit with pricing.

I think they'll be cheaper, maybe even much cheaper, for the RX 7900X, but likely due to much lower RT performance and no answer to DLSS 3.0, which can drastically improve framerates even in CPU-limited games.
 
Nvidia is changing prices to encourage people to buy a higher SKU. The 4090 is still terrible value; it's just that the 4080 (both SKUs) is now even worse value, making the 4090 look better by comparison.

AMD is sorta doing the same with the 7950X; I guess that's business 101.
 
Well, NVIDIA can't predict if scalpers will scoop up available inventory, regardless of whether or not miners are interested at this moment in time.

My guess is that NVIDIA prefers to focus on the positive aspects of the new technology in today's event rather than on business in the current, rockier economic climate.
They could, but a proper system costs money, and they don't really care which user is buying them; why should they? Inventory is easy to predict, and you only get into trouble in certain timeframes, like many did during the pandemic and the mining craze.

That is also a disadvantage of just-in-time production, since barely anyone wants to keep parts sitting in a warehouse.

I am still happy to see new GPUs, but I probably won't use the 4080 or 4090 due to their TGP/TDP. My personal max is between 200-250 W, and I am not going down the route the industry is taking. Now it's a matter of waiting for the full release and RDNA3, besides the Intel VaporArc.
 
I've owned pretty much every flagship over the last decade.

Other than the 1080 Ti, which launched quite a bit after the 1080 (the flagship Pascal card at launch), we typically got 30-40% gains at most gen on gen, and with Turing we got almost none outside of RT, the 2080 Ti aside.

This past gen we got 50% or so from the 2080 Ti to the 3090,
and now it seems to be 100% for this gen.


Don't get me wrong: I think the 4080s are priced too high and the 4090 is priced OK, but I'll reserve final judgment for reviews.
It's true that a 2x increase - if true - is well above the norm for a gen-on-gen increase. But on the other hand, gen-on-gen increases for the past couple of generations have either been lacklustre (Turing - which also bumped prices, yay!) or coupled with a massive price hike (Ampere). That we now seem to be getting a big gen-on-gen increase, but with another price increase is really not good in light of this. It's not as if prices have been stagnant for the past half decade.

AMD is sorta doing the same with the 7950X; I guess that's business 101.
Are they? IMO the 7600X looks pretty damn good next to the 7950X. Clock differences are there, sure, but they're not that significant.
 
"But $900 is so much less than $1500! Just imagine how much it could have cost!"

/s, in case that wasn't clear.

I'd really love to see AMD compete on value again. Sadly, they haven't seemed motivated to do so since catching up with the competition over the past couple of years. I might be wrong, but I blame shareholder pressure (and the leadership culture it encourages) for pushing margins over sales volume. Such an odd strategy for a company with ~20% market share, though.
Yes, the old AMD that used to undercut Nvidia is gone for good. You're right, though; it's an odd strategy for the player with only 20% market share. I hope Intel can kick both incumbents out of their complacency, but I'm not holding my breath.
 
Those who held onto a 10-series card would get a massive increase in perf, like the 1080/Ti and 1070/Ti.
 
I've owned pretty much every flagship over the last decade.

Other than the 1080 Ti, which launched quite a bit after the 1080 (the flagship Pascal card at launch), we typically got 30-40% gains at most gen on gen, and with Turing we got almost none outside of RT, the 2080 Ti aside.

This past gen we got 50% or so from the 2080 Ti to the 3090,
and now it seems to be 100% for this gen.


Don't get me wrong: I think the 4080s are priced too high and the 4090 is priced OK, but I'll reserve final judgment for reviews.

My take on higher-end GPU models is that they don't compare to the previous gen outside of the model designation.

i.e., I don't think it's correct to compare a 2080 Ti to anything other than a 3080 Ti; otherwise you have to start looking at the Titan, which doesn't exist now.

It's like if you bought a Ford Fusion before the Ford Taurus came out and chose to compare its pricing to a new Taurus because the Fusion was the top-line Ford sedan at the time you bought it. It's just not a valid comparison; it's a different, more upscale model. Everyone is doing this in all industries; you can't compare house prices from 30 years ago to new houses, because new houses are 40% larger.

Personally, I agree with @Dragokar. I want to see what both AMD and Nvidia can do below 225 W.

It's not a tree-hugger/green thing; I just don't want a bigger PSU and a noisier GPU for something I spend maybe 10% of my time doing. I also don't like how the market has moved beyond what a typical consumer can do with an OEM rig, which will usually limit you to a 6-pin power connector for a GPU. It gets even worse when you look at 75 W cards; it seems like forever since that market has moved much at all.
 
It's true that a 2x increase - if true - is well above the norm for a gen-on-gen increase. But on the other hand, gen-on-gen increases for the past couple of generations have either been lacklustre (Turing - which also bumped prices, yay!) or coupled with a massive price hike (Ampere). That we now seem to be getting a big gen-on-gen increase, but with another price increase is really not good in light of this. It's not as if prices have been stagnant for the past half decade.

Definitely agree, but unfortunately everything is getting more expensive: cars, phones, electricity, food... GPUs aren't immune to this, and personally I'd rather they go all out doubling performance than give us 35% more performance at the same price and call it a flagship. I'm about as happy with the pricing as everyone else, but I also try to be realistic. Again, once reviews come out and competing RDNA3 cards are released, I'll decide how good or bad these cards are at a given price.

All these cards could be good or bad depending on performance vs. Ampere and RDNA3. I'm personally rooting for much better-priced AMD cards, but I'm also not holding my breath.
 
Well, I expected worse; I thought the RTX 4090 was going to be $2k. Not saying I'm happy with $1,600, but it's lower than I thought. That said, I expect the custom-cooled cards to go for close to $2k.

I do think the choice of games used to compare the 4090 was a little fishy; I want to see it against real AAA games. Does the NDA lift before Oct 12, so we can see third-party reviews before we buy?
 
They could, but a proper system costs money, and they don't really care which user is buying them; why should they? Inventory is easy to predict, and you only get into trouble in certain timeframes, like many did during the pandemic and the mining craze.

NVIDIA most certainly knows how many Ada Lovelace GPUs they delivered to AIB partners, and they probably have a good idea of how many of each card are available to ship.

The point is that they don't know who the buyer is: a university purchasing agent for some research lab, a PC gamer, a crypto miner (less likely today), or a scalper who won't actually use the card but will sell it to one of the first three.

NVIDIA should actually care about scalpers, because customer satisfaction is partially based on the value proposition. By using AIB partners and third-party retail marketplaces, NVIDIA has very little control over scalping. They had very limited success thwarting the scalping of Ampere cards in the USA, even for their own Founders Edition models sold directly through their sole retail partner, Best Buy.

Certainly mining demand won't be there like it was two years ago but there's nothing preventing scalpers from scooping up available 4090 inventory and reselling to gamers, technical users, content creators, etc.

We'll all just have to wait and see what happens in October with the 4090 release.
 
I don't see the NVIDIA 4000 series featuring DisplayPort 2.0 ports with at least 40 Gbps of bandwidth; from what I can tell, the MSI cards have DP 1.4 only.
New DP 2.0 monitors are being validated and certified as we speak, and will launch soon.

For such expensive, almost-2023 products, it's not acceptable not to move to the newly available DP standard. Let's see whether AMD's RDNA3 cards deliver on what has been revealed in the driver code covered by Phoronix; those cards are thought to support 80 Gbps ports in software.
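To put rough numbers on why the port matters, here's a sketch comparing uncompressed bandwidth needs against the effective link rates, assuming a 4K 240 Hz 10-bit target and ignoring blanking overhead:

# Effective link bandwidth in Gbps after line coding:
# DP 1.4 uses 8b/10b, DP 2.0 uses 128b/132b.
links = {
    "DP 1.4 HBR3":   32.4 * 8 / 10,
    "DP 2.0 UHBR10": 40.0 * 128 / 132,
    "DP 2.0 UHBR20": 80.0 * 128 / 132,
}

# Uncompressed 4K 240 Hz at 10 bits per channel (30 bits per pixel).
need = 3840 * 2160 * 240 * 30 / 1e9

print(f"4K 240 Hz 10-bit needs ~{need:.1f} Gbps uncompressed")
for name, cap in links.items():
    verdict = "fits" if cap >= need else "needs DSC or chroma subsampling"
    print(f"{name}: {cap:.1f} Gbps -> {verdict}")

Only UHBR20 (~77.6 Gbps effective) carries that mode uncompressed, which is presumably why the 80 Gbps figure in the RDNA3 code is getting attention.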
 
My take on higher-end GPU models is that they don't compare to the previous gen outside of the model designation.

i.e., I don't think it's correct to compare a 2080 Ti to anything other than a 3080 Ti; otherwise you have to start looking at the Titan, which doesn't exist now.

It's like if you bought a Ford Fusion before the Ford Taurus came out and chose to compare its pricing to a new Taurus because the Fusion was the top-line Ford sedan at the time you bought it. It's just not a valid comparison; it's a different, more upscale model. Everyone is doing this in all industries; you can't compare house prices from 30 years ago to new houses, because new houses are 40% larger.

Personally, I agree with @Dragokar. I want to see what both AMD and Nvidia can do below 225 W.

It's not a tree-hugger/green thing; I just don't want a bigger PSU and a noisier GPU for something I spend maybe 10% of my time doing. I also don't like how the market has moved beyond what a typical consumer can do with an OEM rig, which will usually limit you to a 6-pin power connector for a GPU. It gets even worse when you look at 75 W cards; it seems like forever since that market has moved much at all.

I don't disagree with your take, because neither of us is wrong or right; everyone has to do their own research and decide, based on a number of factors, whether a $900 GPU is right for them. A car is a necessity for most people, while a GPU is discretionary and much less important, but I don't necessarily look at cars any differently: I decide what I want to spend and buy whatever offers me the most in that price range.
 