Tuesday, September 20th 2022

NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

NVIDIA just kicked off the GTC Autumn 2022 Keynote address that culminates in Project Beyond, the company's launch vehicle for its next-generation GeForce RTX 40-series graphics cards based on the "Ada" architecture. These are expected to nearly double the performance over the present generation, ushering in a new era of photo-real graphics as we inch closer to the metaverse. NVIDIA CEO Jensen Huang is expected to take center-stage to launch these cards.

15:00 UTC: The show is on the road.
15:00 UTC: AI remains the center focus, including how it plays with gaming.

15:01 UTC: Racer X is a real-time interactive tech demo. Coming soon.
15:02 UTC: "Future games will be simulations, not pre-baked." - Jensen Huang
15:03 UTC: This is seriously good stuff (Racer X). It runs on a single GPU, in real time, and uses RTX Neural Rendering.
15:05 UTC: Ada Lovelace is a huge GPU
15:06 UTC: 76 billion transistors, over 18,000 shaders, Micron GDDR6X memory. Shader Execution Reordering is a major innovation, as big as out-of-order execution was for CPUs, with in-game performance gains of up to 25%. Ada is built on TSMC 4N, a custom 4 nm-class process designed together with NVIDIA.

There's a new streaming multiprocessor design, with a total of 90 TFLOPS. Power efficiency is doubled over Ampere.
Ray Tracing is on the third generation now, with 200 RT TFLOPS and twice the triangle intersection speed.
Deep Learning AI uses 4th gen Tensor Cores, 1400 TFLOPS, "Optical Flow Accelerator"
15:07 UTC: Shader Execution Reordering similar to the one we saw with Intel Xe-HPG
15:08 UTC: Several new hardware-accelerated ray tracing innovations with 3rd gen RTX.
15:09 UTC: DLSS 3 is announced. It brings with it several new innovations, including temporal components, and Reflex latency optimizations. Generates new frames without involving the graphics pipeline.
15:11 UTC: Cyberpunk 2077 to get DLSS 3 and SER. 16 times increase in effective performance using DLSS 3 vs. DLSS 1. MS Flight Simulator to get DLSS 3 support
15:13 UTC: Portal RTX, a remaster just like Quake II RTX, available from November, created with Omniverse RTX Remix.
15:14 UTC: Ada offers a giant leap in total performance. Everything has been increased: Shader 40 -> 90 TFLOPS, RT 78 -> 200 TFLOPS, OFA 126 -> 300 TOPS, Tensor 320 -> 1400 TFLOPS.
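For reference, the gen-on-gen uplift implied by the numbers quoted on stage works out as follows. This is just a quick sketch of the arithmetic, taking the keynote figures at face value:

```python
# Rough arithmetic on the quoted Ampere -> Ada throughput figures.
# Numbers are as presented in the keynote, taken at face value.
figures = {
    "Shader (TFLOPS)": (40, 90),
    "RT (TFLOPS)": (78, 200),
    "OFA": (126, 300),
    "Tensor (TFLOPS)": (320, 1400),
}

for name, (ampere, ada) in figures.items():
    print(f"{name}: {ampere} -> {ada} ({ada / ampere:.2f}x)")
```

That is roughly 2.3-2.6x for shader, RT, and optical flow throughput, and about 4.4x for Tensor throughput.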
15:17 UTC: Power efficiency is more than doubled, but power goes up to 450 W now.
15:18 UTC: GeForce RTX 4090 will be available on October 12, priced at $1600. It comes with 24 GB GDDR6X and is 2-4x faster than RTX 3090 Ti.
15:18 UTC: RTX 4080 is available in two versions, 16 GB and 12 GB. The 16 GB version starts at $1200, the 12 GB at $900. 2-4x faster than RTX 3080 Ti.
15:19 UTC: New pricing for RTX 30-series, "for mainstream gamers", RTX 40-series "for enthusiasts".
15:19 UTC: "Ada is a quantum leap for gamers"—improved ray tracing, shader execution reordering, DLSS 3.
15:20 UTC: Updates to Omniverse

15:26 UTC: Racer X demo was built by a few dozen artists in just 3 months.
15:31 UTC: Digital twins will play a vital role in product development and lifecycle maintenance.
15:31 UTC: Over 150 connectors to Omniverse.
15:33 UTC: GDN (graphics delivery network) is the new CDN. Graphics rendering over the Internet will be as big in the future as streaming video is today.
15:37 UTC: Omniverse Cloud, a planetary-scale GDN
15:37 UTC: THOR SuperChip for automotive applications.

15:41 UTC: NVIDIA next-generation Drive

333 Comments on NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

#201
Valantar
Sisyphus: The 3090, 4090 etc. are for semiprofessionals and creative people, not for gamers; nobody needs 24 GB of VRAM in gaming. But it's nice for mixed use, and a lot cheaper than the Quadro series.

The only explanation for me: gamers love to have the "best" card; naming/PR is everything. AMD isn't much better, nor much cheaper. So there is room for a good margin.
But that's precisely where this specific interpretation falls apart: Nvidia specifically cancelled the Titan programme, and renamed the Titan-equivalent cards Geforce. Geforce, crucially, is a gaming GPU series. So, they removed a "prosumer" class designation and instead included these cards in the consumer-facing Geforce series (yes, early Titans were Geforce; later ones were not). Literally the only logic to explain this is that they saw the Titan branding as something that discouraged gamers from buying these GPUs - which makes sense, as Geforce is the gaming brand, so by calling them Titan they were saying "well, you can use these for gaming, sure, but that's not really what they're for". As such, the move from Titan to Geforce is an explicit attempt at presenting these GPUs as consumer or gamer oriented, not prosumer cards (despite their technical details definitely making more sense in that context). Through removing the Titan brand and instead creating the 90 tier, Nvidia is preying on the exact mechanism you're describing - of gamers wanting "the best", no matter what it is.

Outside of this being a marketing/branding move that borders very close on being exploitative in and of itself, this has other problems: the Titan branding also brought with it price separation. Titans were wildly expensive compared to their close Geforce siblings, which was defensible through these being for business, not for gaming. Through including these same cards into the Geforce lineup, they now have the "freedom" to lift pricing for the rest of the Geforce lineup up so that instead of being a major step upwards in price, it's instead continuous with lower tier cards. That's how you go from $1200 Titan X(p) and $700 1080 Ti (+71% price) to $1600 RTX 4090 and $1200 4080 16GB (+33%). The net effect of this is not the Titan-class card being "a lot cheaper than the Quadro series" - they always were - but instead the entirety of the Geforce lineup becoming more expensive through a persistent lifting of the price ceiling for such cards, and thus slowly shifting the marketing/pricing equivalent of the Overton window - the window of what is seen as acceptable and reasonable pricing for a GPU.
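The shrinking price gap described above is easy to verify from the launch MSRPs cited in this thread:

```python
# Percent premium of the top card over its closest sibling,
# using the launch MSRPs cited in the comment above.
def gap(top, next_down):
    """Percent price gap between the top card and the next card down."""
    return (top / next_down - 1) * 100

pascal = gap(1200, 700)   # Titan X(p) vs. GTX 1080 Ti
ada = gap(1600, 1200)     # RTX 4090 vs. RTX 4080 16GB
print(f"Pascal: +{pascal:.0f}%  Ada: +{ada:.0f}%")
```

The premium over the next card down drops from roughly +71% to +33%, which is the "continuous pricing" effect being described: the halo card no longer sits far above the rest of the stack, the rest of the stack has moved up toward it.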

There is no other explanation of this that makes any type of sense other than Nvidia wanting to increase prices and squeeze more money out of gamers.
Posted on Reply
#202
kiakk
oxrufiioxo: Technically the 4090 is $400 cheaper than the launch 3090 Ti; given that it's 2-4x faster depending on what you're doing with it, that isn't bad.
Not a smart way to factor the performance gain into the price. With this logic... a GTX **60 mid-range card could be priced 20-30% higher from generation to generation. By that logic, from the GTX 260 until the RTX 3060, the RTX 3060 should be $1000-1100 at least, or even more...
So by this logic, if we paid $700-800 for an RTX 3060, should we be happy with a $400 "discount"? I do not think so...
Only a moderate price increase is acceptable, one that covers inflation and design and manufacturing complexity. But given the mining era and NVIDIA's high margins, the price increases of the last few years are not acceptable.

Just look at the balance sheets of these companies, NVIDIA, AMD... Do you see a similar financial increase in your personal life, or among your family or friends? They are growing like crazy, while the average person in society has even fewer financial possibilities than years before.
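The compounding effect kiakk is pointing at can be sketched quickly; the starting price, rate, and generation count below are illustrative assumptions, not actual launch MSRPs:

```python
# Illustration of how a steady per-generation price increase compounds.
# Starting price, rate, and generation count are hypothetical, chosen
# only to show the shape of the argument, not real launch MSRPs.
start_price = 250    # hypothetical mid-range starting point, in USD
rate = 0.25          # assumed 25% increase per generation
generations = 7      # roughly the GTX 260 -> RTX 3060 span

price = start_price
for gen in range(1, generations + 1):
    price *= 1 + rate
    print(f"gen {gen}: ${price:,.0f}")
```

Under these assumptions a $250 card ends up around $1,190 after seven generations, which is the order of magnitude kiakk arrives at: small-sounding per-generation bumps compound into prices no mid-range buyer would accept.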
Posted on Reply
#203
Valantar
kiakk: Not a smart way to factor the performance gain into the price. With this logic... a GTX **60 mid-range card could be priced 20-30% higher from generation to generation. By that logic, from the GTX 260 until the RTX 3060, the RTX 3060 should be $1000-1100 at least, or even more...
So by this logic, if we paid $700-800 for an RTX 3060, should we be happy with a $400 "discount"? I do not think so...
Only a moderate price increase is acceptable, one that covers inflation and design and manufacturing complexity. But given the mining era and NVIDIA's high margins, the price increases of the last few years are not acceptable.

Just look at the balance sheets of these companies, NVIDIA, AMD... Do you see a similar financial increase in your personal life, or among your family or friends? They are growing like crazy, while the average person in society has even fewer financial possibilities than years before.
I tend to agree, except for one thing: naming and marketing tiers are arbitrary, while price points relate to people's real money and ability to pay - which tends not to scale with inflation in recent decades (the US has had wage stagnation for, what, 50 years?). So, price increases following inflation would still make cards unaffordable over time. Instead, the pressure should be on chipmakers to design more cost-effective chips for each tier, while still delivering gen-on-gen performance increases for the same price. When 6-tier GPUs were ~$200-250 seven years ago, it's unacceptable for them to be $329-400 in 2020, and likely $400-500 in 2022-3 even if this matches inflation. Why? Because soon enough you'll run out of product tiers going down - and it's not like Nvidia is noticeably expanding their range downwards. Those $200-$250 6 tier GPUs in 2016 were accompanied by $140 5 Ti and $110 5-tier GPUs, after all, while today the best they've got on offer is the $250 (doubled!) RTX 3050. (Please don't mention the GTX 1630.)

What does this mean? That GPUs are getting increasingly unaffordable, and the main culprit is that GPU makers simply aren't making and selling affordable GPUs, instead insisting on artificially inflating margins. The RTX 3050 could easily be $150 if Nvidia was focused on designing an affordable GPU. These are conscious product segmentation choices made on the level of chip design and tiering. Instead, Nvidia is working concertedly towards increasing prices across the board. And sadly, so far AMD is happy to follow suit.
Posted on Reply
#204
oxrufiioxo
kiakk: Not a smart way to factor the performance gain into the price. With this logic... a GTX **60 mid-range card could be priced 20-30% higher from generation to generation. By that logic, from the GTX 260 until the RTX 3060, the RTX 3060 should be $1000-1100 at least, or even more...
So by this logic, if we paid $700-800 for an RTX 3060, should we be happy with a $400 "discount"? I do not think so...
Only a moderate price increase is acceptable, one that covers inflation and design and manufacturing complexity. But given the mining era and NVIDIA's high margins, the price increases of the last few years are not acceptable.

Just look at the balance sheets of these companies, NVIDIA, AMD... Do you see a similar financial increase in your personal life, or among your family or friends? They are growing like crazy, while the average person in society has even fewer financial possibilities than years before.
I don't really care what they price the 60-tier cards at; I'd never buy one regardless, whether they were $50 or $1000.

I don't care what others can or can't afford. If they're too expensive for you or anyone else, don't buy them. It's simple; these aren't necessities in life we are talking about.

Same with me: if the 80/90 tier ever goes out of my price range, guess what, I'll stop buying them. Life goes on...

Getting upset over how a company prices its own products is pointless... They're not a charity or your friends; they will all try to make the most money possible.
Posted on Reply
#205
Valantar
oxrufiioxo: I don't really care what they price the 60-tier cards at; I'd never buy one regardless, whether they were $50 or $1000.

I don't care what others can or can't afford. If they're too expensive for you or anyone else, don't buy them. It's simple; these aren't necessities in life we are talking about.

Same with me: if the 80/90 tier ever goes out of my price range, guess what, I'll stop buying them. Life goes on...
That is an incredibly privileged and myopic point of view. Which is of course your right to have - but it also demonstrates a severe lack of perspective and base level compassion on your part. These not being "necessities" doesn't make the potential consequences of not affording them any less real - from losing access to a preferred way of relaxing/unwinding, to losing access to a social group, to even limitations on future job opportunities (if you're not able to play games, it's extremely unlikely that you'll ever end up working in or around game development). And more. Are any of these things world-ending? No, but very few things in life are. That doesn't make them any less painful or real to the people experiencing them. Sure, videogames are ultimately trivial for the vast majority of people playing them. That doesn't mean their potential positive effects on social and mental well-being in an increasingly precarious world are any less real. And losing access to a social circle, an activity shared with friends, or a way of unwinding and relaxing after an exhausting and draining workday can indeed be downright horrible.
Posted on Reply
#206
THU31
oxrufiioxo: I don't really care what they price the 60-tier cards at; I'd never buy one regardless, whether they were $50 or $1000.

I don't care what others can or can't afford. If they're too expensive for you or anyone else, don't buy them. It's simple; these aren't necessities in life we are talking about.

Same with me: if the 80/90 tier ever goes out of my price range, guess what, I'll stop buying them. Life goes on...
This sounds like EA's Battlefield V logic "if you don't like our game, don't buy it", or Don Mattrick's "if you don't have internet, stick with Xbox 360".

I think we all know how both of these turned out.

NVIDIA can do whatever they want. But I will take this advice and simply not buy it. If they drop prices at some point, I might. If not, I will wait for the next generation (or at least consider a Radeon card, although I do want great RT performance and I do love DLSS).
Posted on Reply
#207
oxrufiioxo
Valantar: That is an incredibly privileged and myopic point of view. Which is of course your right to have - but it also demonstrates a severe lack of perspective and base level compassion on your part. These not being "necessities" doesn't make the potential consequences of not affording them any less real - from losing access to a preferred way of relaxing/unwinding, to losing access to a social group, to even limitations on future job opportunities (if you're not able to play games, it's extremely unlikely that you'll ever end up working in or around game development). And more. Are any of these things world-ending? No, but very few things in life are. That doesn't make them any less painful or real to the people experiencing them. Sure, videogames are ultimately trivial for the vast majority of people playing them. That doesn't mean their potential positive effects on social and mental well-being in an increasingly precarious world are any less real. And losing access to a social circle, an activity shared with friends, or a way of unwinding and relaxing after an exhausting and draining workday can indeed be downright horrible.
Don't get me wrong: if I could wave a magic wand and make GPUs affordable for everyone, I would, but that's not reality. Everyone has to decide with their own hard-earned money what they can afford or what they would tolerate as far as pricing. To me, $1200 and $1600 is OK assuming the performance uplift justifies it; to others it's a slap in the face. Neither of us is wrong.

@kiakk Calling me out about a flagship being cheaper than a previous-gen flagship, and then going off on a tangent about 60-tier pricing and how it should only be priced in a way that's OK with him, makes no sense to me though.
THU31: This sounds like EA's Battlefield V logic "if you don't like our game, don't buy it", or Don Mattrick's "if you don't have internet, stick with Xbox 360".

I think we all know how both of these turned out.

NVIDIA can do whatever they want. But I will listen to this advice and simply do not buy it. If they drop prices at some point, I might. If not, I will wait for the next generation (or at least consider a Radeon card, although I do want great RT performance and I do love DLSS).
But what other choice do we have? I mean, if you don't like the price, the only option is not to buy it. Getting upset in a tech forum about it does absolutely nothing.

The 4080 12GB is stupidly priced at $900; I wouldn't touch it, and I doubt reviews will change that. That leaves me with the 4080 16GB and 4090 if I want a noticeable upgrade this gen... Will I buy one? No idea, but I will at least reserve judgment until third-party reviews go live. I still think this has more to do with them wanting to clear out Ampere stock; I mean, they still consider three Ampere cards part of their current offerings as it is.

I personally like RT, especially in games like Control and Cyberpunk with 3-4 different implementations, so AMD is probably not an option this generation again. It sounds like their implementation will be quite a bit weaker, but I'd love to be wrong.
Posted on Reply
#208
Valantar
oxrufiioxo: Don't get me wrong: if I could wave a magic wand and make GPUs affordable for everyone, I would, but that's not reality. Everyone has to decide with their own hard-earned money what they can afford or what they would tolerate as far as pricing. To me, $1200 and $1600 is OK assuming the performance uplift justifies it; to others it's a slap in the face. Neither of us is wrong.
You're not wrong about this in a vacuum, but you're entirely ignoring the causes and power relations behind these developments. Nvidia could keep per-tier GPU pricing entirely fixed if it wanted to, while increasing performance each generation - it would just need to prioritize differently when laying out their GPU dice and making their reference designs. It would make per-generation gains look less impressive most of the time - but Turing demonstrated that this really isn't a problem for Nvidia. The point being: it would be entirely possible for them to design $200 6-tier GPUs, and more - if they wanted to. Instead, they're prioritizing absolute peak performance and gradually increasing average sale prices, mainly because that makes shareholders happy. But this also intrinsically pushes away their customer base - just look at how many people are still using GTX 1060s - and I can't help but think this is going to come back to bite them in the long run. For now, they're being propped up by AMD not wanting to compete on price but instead also wanting to raise average sale prices - which is understandable when AMD hasn't had any free fab capacity for something like three years now. But if that changes, if TSMC's drop-off in orders means AMD now has capacity to spare, we might be looking at a new market share/sales volume push for AMD. This is clearly me hoping for something to happen rather than necessarily thinking that it will - AMD's shareholders also love high average sale prices - but there's definitely an opportunity here for AMD.
oxrufiioxo: But what other choice do we have? I mean, if you don't like the price, the only option is not to buy it.
Yes, exactly. And how, precisely, is this a sustainable attitude for Nvidia - or any other mass market company - to hold towards their customers?

I mean, there are other choices. Used market, consoles (a gift to AMD, really), finding other hobbies. Nvidia has nothing to gain from alienating customers - but that's what they're doing. They've been pushing hard for quite a while to get as much money as they can from a rapidly expanding customer base, but we've already reached a saturation point, and rather than accepting that they overplayed their hand, or played hard and went out of bounds, and accepting the (small!) cost of this, Nvidia are instead continuing on a path of maximizing profits rather than designing attractive products people can actually afford.
Posted on Reply
#209
oxrufiioxo
Valantar: You're not wrong about this in a vacuum, but you're entirely ignoring the causes and power relations behind these developments. Nvidia could keep per-tier GPU pricing entirely fixed if it wanted to, while increasing performance each generation - it would just need to prioritize differently when laying out their GPU dice and making their reference designs. It would make per-generation gains look less impressive most of the time - but Turing demonstrated that this really isn't a problem for Nvidia. The point being: it would be entirely possible for them to design $200 6-tier GPUs, and more - if they wanted to. Instead, they're prioritizing absolute peak performance and gradually increasing average sale prices, mainly because that makes shareholders happy. But this also intrinsically pushes away their customer base - just look at how many people are still using GTX 1060s - and I can't help but think this is going to come back to bite them in the long run. For now, they're being propped up by AMD not wanting to compete on price but instead also wanting to raise average sale prices - which is understandable when AMD hasn't had any free fab capacity for something like three years now. But if that changes, if TSMC's drop-off in orders means AMD now has capacity to spare, we might be looking at a new market share/sales volume push for AMD. This is clearly me hoping for something to happen rather than necessarily thinking that it will - AMD's shareholders also love high average sale prices - but there's definitely an opportunity here for AMD.
I definitely see your point. I personally would prefer they push absolute performance. You're not wrong though.

A lot of Ada's design was probably decided during the peak of the mining boom. I wonder, in retrospect, if they would change anything; I'm sure they are still salty about retailers making more than they did during the first 6-8 months of Ampere's life. I personally expected the two 4080s to be cheaper but the 4090 to be more expensive, but here we are.

Going forward, I'm as interested as anyone in how the market reacts. Turing didn't do so well, but Ampere did amazingly for them, due to reasons. It'll be interesting to see how Ada does.


Kudos to those who got a 10GB 3080 near MSRP at launch, though; in retrospect that was a hell of a deal, same with the 6800 XT to a lesser extent.
Posted on Reply
#210
Valantar
oxrufiioxo: I definitely see your point. I personally would prefer they push absolute performance. You're not wrong though.
I think there could be ways for them to do both - they could for example come up with some sort of "premium" branding for higher end cards. Titan could arguably have been that - but instead, they started out with Turing pushing per-tier pricing higher with that weird 20/16 split ("yes, there are now four 6-tier SKUs, why not?"). If they had instead expanded Titan into a "premium gaming" segment, that could have brought with it a ton of prestige (and thus desirability and sales) at high prices, while leaving attractive-seeming tier designations available for lower-end cards, opening the door for perceived good value GPUs as well. Instead they've essentially abandoned the $200 price point for two full generations now, focusing solely on the higher end.

To me, this is part of why I tend towards believing EVGA's version of their split: their description is consistent with Nvidia's behaviour towards only caring about increasing margins, ASPs and per-tier pricing for more than half a decade now. It's likely not the full story, but that glove definitely does seem to fit nonetheless.

Of course, mobile also likely plays a big part in this - and the relatively large dice and expensive designs for the past few generations match well with these being targeted towards lower-clocked mobile chips that still deliver good absolute performance. Smaller, cheaper designs would inevitably do worse in mobile by simple virtue of not having the ability to clock as high. And, of course, two successive crypto booms and a two-year global lockdown have no doubt also driven a belief that whatever the price, the GPUs will sell. We'll see what kind of correction this leads to in the near future, if anything. For now, their expressed tactic is to not change anything and try to rake in profits as much as possible. Which really does leave the door open for AMD.
Posted on Reply
#211
Darller
oxrufiioxo: I don't really care what they price the 60-tier cards at; I'd never buy one regardless, whether they were $50 or $1000.

I don't care what others can or can't afford. If they're too expensive for you or anyone else, don't buy them. It's simple; these aren't necessities in life we are talking about.

Same with me: if the 80/90 tier ever goes out of my price range, guess what, I'll stop buying them. Life goes on...

Getting upset over how a company prices its own products is pointless... They're not a charity or your friends; they will all try to make the most money possible.
Pretty much this. It's like complaining Ferraris are too expensive; if they're too expensive you're not the target market.

I have worked extremely hard my entire adult life (I am now 44) to be in the position where my hobbies aren't a financial hardship, and this is just one example where that work has paid off.
Posted on Reply
#212
Valantar
Darller: Pretty much this. It's like complaining Ferraris are too expensive; if they're too expensive you're not the target market.

I have worked extremely hard my entire adult life (I am now 44) to be in the position where my hobbies aren't a financial hardship, and this is just one example where that work has paid off.
And you've thus also clearly had the privilege of having that hard work actually pay off - unlike a lot of people. Generally, the hardest-working people you'll find are those working two or three shit jobs, barely covering rent and basic living expenses. Please don't confuse being lucky enough to have things work out for you with less lucky people not having worked as hard, or not being as deserving of good things.

Also, 6-tier GPUs are distinctly not Ferraris - they're supposed to be the Toyotas of the GPU world. And if supposedly cheap Toyotas start being priced like Ferraris, there's something seriously wrong going on.
Posted on Reply
#213
oxrufiioxo
Valantar: To me, this is part of why I tend towards believing EVGA's version of their split: their description is consistent with Nvidia's behaviour towards only caring about increasing margins, ASPs and per-tier pricing for more than half a decade now. It's likely not the full story, but that glove definitely does seem to fit nonetheless.
That was a huge blow to me; I've owned at least 10 EVGA GPUs over the last decade. They are a small company not far from me, fewer than 300 employees I believe. I've also tried to support them because they are local. I hope they change their minds and at least make some AMD GPUs; a Kingpin 7900 XT that beats the 4090 in rasterization would be lovely.
Posted on Reply
#214
Blueberries
I noticed something about the DLSS 3 demonstration. See if you can spot it.


Posted on Reply
#215
80-watt Hamster
Valantar: What does this mean? That GPUs are getting increasingly unaffordable, and the main culprit is that GPU makers simply aren't making and selling affordable GPUs, instead insisting on artificially inflating margins. The RTX 3050 could easily be $150 if Nvidia was focused on designing an affordable GPU. These are conscious product segmentation choices made on the level of chip design and tiering. Instead, Nvidia is working concertedly towards increasing prices across the board. And sadly, so far AMD is happy to follow suit.
I've complained about this before, but the 3050 is a 6-series card wearing Groucho glasses. Its power consumption is in line with earlier xx60, as is the generational performance uplift (well, outside of the huge 960-1060 leap). The best we could have hoped for was ~$200, the general historical price point for that level of card (again excepting the $300 1060). Two factors preventing what we know as the 3050 launching as a 3060: Nvidia's obvious campaign to push what's considered entry-level and midrange ever higher, and the existence of the 1660S/ti. The 3050's raster performance is right on top of those two, at a higher TDP, no less! Naming it the 3060 would have been branding suicide.
Posted on Reply
#216
oxrufiioxo
Blueberries: I noticed something about the DLSS 3 demonstration. See if you can spot it.


DF has some comparisons with the 3090ti... I definitely want to see a comparison with just the game maxed out though.


Posted on Reply
#217
Valantar
80-watt Hamster: I've complained about this before, but the 3050 is a 6-series card wearing Groucho glasses. Its power consumption is in line with earlier xx60, as is the generational performance uplift (well, outside of the huge 960-1060 leap). The best we could have hoped for was ~$200, the general historical price point for that level of card (again excepting the $300 1060). Two factors preventing what we know as the 3050 launching as a 3060: Nvidia's obvious campaign to push what's considered entry-level and midrange ever higher, and the existence of the 1660S/ti. The 3050's raster performance is right on top of those two, at a higher TDP, no less! Naming it the 3060 would have been branding suicide.
Yeah, that's a good point. But honestly, at this point Nvidia needs to realize that their current product stack strategy is ... well, not workable. Ever since killing off the GT/GTX distinction, they've had 10 tiers to use, of which 1 has been "this is a display adapter", 3 has been "uh ... guess this can do some 3D", 5 has been entry level gaming, 6 mid-range, 7 upper mid-range to premium, and 8 premium-to-flagship. But then as higher-than-1080p resolutions have proliferated while 1080p has stayed put, the range of usable performance has widened, while the product stack has only widened by one tier - 9, halo/flagship/"this is really a Titan". And their reluctance to use more lower tier designations is understandable - there's definitely an argument to be made for 3, 4 and 5 tiers being unattractive. Not many people want to buy a Core i3, even if it's good. You buy that if it's what you can afford.

Still, I think Nvidia (and AMD for that matter, though they seem a tad more flexible currently) needs to shift things. If they just made the leap, told people that "what used to be called 6 is now called 5", and accepted the dampening effect of this on sales for one generation (which they could most likely counteract by marketing the crap out of that new 50-series), they'd have a lot more flexibility in product segmentation. Instead they're forcing themselves to fit a massive range of performance into just four numbered tiers. Which is just stupid. So instead we've now got ... what, eight SKUs across four tiers, ranging from the 3060 to 3090 Ti. The 3050 is clearly an afterthought, and a card they don't really seem to want to sell, given its pricing and market positioning.

As I suggested above, they could also have alleviated this by creating a "premium" brand of some sort, moving perhaps the top three SKUs to this tier. Call them Titan, call them GeforceX, call them Xtreme XGeforce XRTX - whatever. This way they could have called the 3050 the 3060, priced it at $200, and sold tons and tons of them. And they could still have had $700 80-tier premium/high end tier cards that would be aspirational for the people buying $200 GPUs - most of those spend far less on their PC than the cost of a 3090, so selling them on the 3090 being great isn't much of an advertisement anyhow.

Of course, the mining+lockdown WFH/gaming booms have made such shifts unnecessary - up until now. I guess we'll see how this plays out, but if they insist on keeping the "Geforce [four numbers]" structure as their only GPU brand, they need to start shifting things - the sooner the better for them. But for now, their preferred tactic seems to be "profits today, screw tomorrow".
#218
THU31
Darller: Pretty much this. It's like complaining Ferraris are too expensive; if they're too expensive, you're not the target market.

I have worked extremely hard my entire adult life (I am now 44) to be in the position where my hobbies aren't a financial hardship, and this is just one example where that work has paid off.
I think Ferraris have always been expensive. Graphics cards have not. If you look at the x80 cards in the last decade, we started at $500 and we got gradual increases of about $100 over the years, which is understandable even if not necessary. But now we suddenly go from $700 to $1200 for no reason. There really will come a point when people will stop buying new cards, because this does not make sense. This is just greed, and of course they have a right to be greedy.

But maybe it is time to change things. If cards are too expensive to manufacture, then they should make smaller GPUs with simpler PCBs and coolers. Maybe we should go back to 200 W being the maximum power consumption. Make smaller performance increases, but make it possible for people to actually buy cards in the range of $200 to $500.

All other PC components are super affordable, including CPUs, which have had gigantic leaps in performance in the last 5 years, but prices have basically stayed the same. So it can be done.
#219
Valantar
Blueberries: I noticed something about the DLSS 3 demonstration. See if you can spot it.


That's quite interesting. Do they mention what GPU that's running on though? It might not be the 4090.
THU31: But maybe it is time to change things. If cards are too expensive to manufacture, then they should make smaller GPUs with simpler PCBs and coolers. Maybe we should go back to 200 W being the maximum power consumption. Make smaller performance increases, but make it possible for people to actually buy cards in the range of $200 to $500.

All other PC components are super affordable, including CPUs, which have had gigantic leaps in performance in the last 5 years, but prices have basically stayed the same. So it can be done.
Exactly this. These are conscious design decisions on GPU makers' parts, and decisions that could have been made differently. Yes, there are other factors in play - increasing materials prices, faster I/O requiring more expensive PCB materials, etc. - but this can be accounted for to various degrees. Instead, they're consciously pushing for higher prices in every way they can.
#220
dyonoctis
Valantar: But that's precisely where this specific interpretation falls apart: Nvidia specifically cancelled the Titan programme, and renamed the Titan-equivalent cards Geforce. Geforce, crucially, is a gaming GPU series. So, they removed a "prosumer" class designation and instead included these cards in the consumer-facing Geforce series (yes, early Titans were Geforce; later ones were not). Literally the only logic to explain this is that they saw the Titan branding as something that discouraged gamers from buying these GPUs - which makes sense, as Geforce is the gaming brand, so by calling them Titan they were saying "well, you can use these for gaming, sure, but that's not really what they're for". As such, the move from Titan to Geforce is an explicit attempt at presenting these GPUs as consumer or gamer oriented, not prosumer cards (despite their technical details definitely making more sense in that context). Through removing the Titan brand and instead creating the 90 tier, Nvidia is preying on the exact mechanism you're describing - of gamers wanting "the best", no matter what it is.

Outside of this being a marketing/branding move that borders very close on being exploitative in and of itself, this has other problems: the Titan branding also brought with it price separation. Titans were wildly expensive compared to their close Geforce siblings, which was defensible through these being for business, not for gaming. Through including these same cards into the Geforce lineup, they now have the "freedom" to lift pricing for the rest of the Geforce lineup up so that instead of being a major step upwards in price, it's instead continuous with lower tier cards. That's how you go from $1200 Titan X(p) and $700 1080 Ti (+71% price) to $1600 RTX 4090 and $1200 4080 16GB (+33%). The net effect of this is not the Titan-class card being "a lot cheaper than the Quadro series" - they always were - but instead the entirety of the Geforce lineup becoming more expensive through a persistent lifting of the price ceiling for such cards, and thus slowly shifting the marketing/pricing equivalent of the Overton window - the window of what is seen as acceptable and reasonable pricing for a GPU.

There is no other explanation of this that makes any type of sense other than Nvidia wanting to increase prices and squeeze more money out of gamers.
There's something interesting happening right now: we used to have a lot of computers that were expensive but still sold because they were the best thing around by a long shot (like Silicon Graphics). Wintel won and became mainstream because it was cheap and comparable. But now it seems like Nvidia is determined to turn the PC into a niche that's more about bleeding-edge technological showmanship than affordability. And while it seems that AMD has huge margin to increase their prices as well, I get the impression that it would be unwise for them to do so (at least on the same level as Nvidia) because of the perceived value of Radeon against Geforce. Nvidia made it clear that Ada is so great because of the whole closed software ecosystem around it. The RTX 3070 is going to stay the representation of the $500 market for a long time, so when AMD presents the successor to the RX 6800, they have the opportunity to strike a low blow. The new RTX family still includes the Ampere 3060-3080, which haven't dropped in price. History has shown that people like a good deal, and now consoles also have high refresh rates, and it's easier to have a good HDR experience with a TV. The PC is betting everything on ray tracing at a very heavy cost; at some point people might wonder if it's really worth it (especially with engines like UE5 not relying on conventional ray tracing).
#221
80-watt Hamster
Valantar: That's quite interesting. Do they mention what GPU that's running on, though? It might not be the 4090.


Exactly this. These are conscious design decisions on GPU makers' parts, and decisions that could have been made differently. Yes, there are other factors in play - increasing materials prices, faster I/O requiring more expensive PCB materials, etc. - but this can be accounted for to various degrees. Instead, they're consciously pushing for higher prices in every way they can.
Maybe they see the discrete market contracting long-term, and are laying the groundwork to pull the same net profit out of fewer units. Still crappy from our perspective, but with a much lower "Wut!?" factor. Or it's an allocation thing: the chips are more valuable for HPC, so it has to be worth Nvidia's while to even make consumer-bound chips.
dyonoctis: There's something interesting happening right now: we used to have a lot of computers that were expensive but still sold because they were the best thing around by a long shot (like Silicon Graphics). Wintel won and became mainstream because it was cheap and comparable. But now it seems like Nvidia is determined to turn the PC into a niche that's more about bleeding-edge technological showmanship than affordability. And while it seems that AMD has huge margin to increase their prices as well, I get the impression that it would be unwise for them to do so because of the perceived value of Radeon against Geforce. Nvidia made it clear that Ada is so great because of the whole closed software ecosystem around it. The RTX 3070 is going to stay the representation of the $500 market for a long time, so when AMD presents the successor to the RX 6800, they have the opportunity to strike a low blow. The new RTX family still includes the Ampere 3060-3080, which haven't dropped in price. History has shown that people like a good deal, and now consoles also have high refresh rates, and it's easier to have a good HDR experience with a TV. The PC is betting everything on ray tracing at a very heavy cost; at some point people might wonder if it's really worth it (especially with engines like UE5 not relying on conventional ray tracing).
Desktop market share is only going to shrink, with or without NV's influence. Laptops are more convenient, phones even more so. I feel a little gross defending NV even a little, but they may not have much of an option, broadly speaking.
#222
oxrufiioxo
dyonoctis: There's something interesting happening right now: we used to have a lot of computers that were expensive but still sold because they were the best thing around by a long shot (like Silicon Graphics). Wintel won and became mainstream because it was cheap and comparable. But now it seems like Nvidia is determined to turn the PC into a niche that's more about bleeding-edge technological showmanship than affordability. And while it seems that AMD has huge margin to increase their prices as well, I get the impression that it would be unwise for them to do so because of the perceived value of Radeon against Geforce. Nvidia made it clear that Ada is so great because of the whole closed software ecosystem around it. The RTX 3070 is going to stay the representation of the $500 market for a long time, so when AMD presents the successor to the RX 6800, they have the opportunity to strike a low blow. The new RTX family still includes the Ampere 3060-3080, which haven't dropped in price. History has shown that people like a good deal, and now consoles also have high refresh rates, and it's easier to have a good HDR experience with a TV. The PC is betting everything on ray tracing at a very heavy cost; at some point people might wonder if it's really worth it (especially with engines like UE5 not relying on conventional ray tracing).
Not like Jensen was trying to hide anything; he stated prior to these announcements that any Ada GPU launched this year would be layered on top of existing Ampere GPUs. Personally, I figured that meant we were only getting a 4090... Guess I was wrong.
#223
Valantar
dyonoctis: There's something interesting happening right now: we used to have a lot of computers that were expensive but still sold because they were the best thing around by a long shot (like Silicon Graphics). Wintel won and became mainstream because it was cheap and comparable. But now it seems like Nvidia is determined to turn the PC into a niche that's more about bleeding-edge technological showmanship than affordability. And while it seems that AMD has huge margin to increase their prices as well, I get the impression that it would be unwise for them to do so because of the perceived value of Radeon against Geforce. Nvidia made it clear that Ada is so great because of the whole closed software ecosystem around it. The RTX 3070 is going to stay the representation of the $500 market for a long time, so when AMD presents the successor to the RX 6800, they have the opportunity to strike a low blow. The new RTX family still includes the Ampere 3060-3080, which haven't dropped in price. History has shown that people like a good deal, and now consoles also have high refresh rates, and it's easier to have a good HDR experience with a TV. The PC is betting everything on ray tracing at a very heavy cost; at some point people might wonder if it's really worth it (especially with engines like UE5 not relying on conventional ray tracing).
Yeah, I think that's spot on. And I also think that if Nvidia were to get their way going down that path, it would be the end of PC gaming - barring someone else stepping in to fill their shoes. PC gaming as a hyper-expensive niche just isn't sustainable as a business. If anything, the Steam Deck has demonstrated that the future of gaming might lie in the exact opposite direction, and that we might be reaching a point of "good enough" in a lot of ways.

I also agree that this situation presents a massive opportunity for AMD. Regardless of where RDNA3 peaks in terms of absolute performance, if they flesh out the $300-700 segment with high performing, good value options relatively quickly, they have an unprecedented opportunity to overtake Nvidia - as this pitch from Nvidia (as well as their earnings call) confirms that they've got more Ampere stock than they know what to do with, and they don't want to stomach the cost of cutting prices unless they really have to. If RDNA3 actually delivers its promised 50% perf/W increase, and die sizes and MCM packaging costs allow for somewhat competitive pricing, this could be the generation where AMD makes a significant move up from their perennial ~20% market share. That's my hope - but knowing AMD and their recent preference for high ASPs and following in Nvidia's pricing footsteps, I'm not that hopeful. Fingers crossed, though.
80-watt Hamster: Maybe they see the discrete market contracting long-term, and are laying the groundwork to pull the same net profit out of fewer units. Still crappy from our perspective, but with a much lower "Wut!?" factor. Or it's an allocation thing: the chips are more valuable for HPC, so it has to be worth Nvidia's while to even make consumer-bound chips.



Desktop market share is only going to shrink, with or without NV's influence. Laptops are more convenient, phones even more so. I feel a little gross defending NV even a little, but they may not have much of an option, broadly speaking.
This is absolutely true. There's also the simple fact of GPUs largely being good enough for quite a lot of things for quite a while. We're no longer seeing anyone need a GPU upgrade after even 3 years - unless you're an incorrigible snob, at least. Lots of factors play into this, from less growth in graphical requirements to the massive presence of AA/lower budget/indie games in the current market to open source upscaling tech to lower resolutions in many cases still being fine in terms of graphical fidelity. It could absolutely be that Nvidia is preparing for a future of (much) lower sales, but I don't think they can do that by pushing for $1500 GPUs becoming even remotely normal - the number of people affording those isn't going to increase anytime soon. Diversifying into other markets is smart, though, and AMD also needs to do the same (not that they aren't already). But the impending/already happened death of the 1-2 year GPU upgrade cycle won't be solved by higher prices and more premium products.
#224
Sisyphus
Valantar: But that's precisely where this specific interpretation falls apart: Nvidia specifically cancelled the Titan programme, and renamed the Titan-equivalent cards Geforce. Geforce, crucially, is a gaming GPU series. So, they removed a "prosumer" class designation and instead included these cards in the consumer-facing Geforce series (yes, early Titans were Geforce; later ones were not). Literally the only logic to explain this is that they saw the Titan branding as something that discouraged gamers from buying these GPUs - which makes sense, as Geforce is the gaming brand, so by calling them Titan they were saying "well, you can use these for gaming, sure, but that's not really what they're for". As such, the move from Titan to Geforce is an explicit attempt at presenting these GPUs as consumer or gamer oriented, not prosumer cards (despite their technical details definitely making more sense in that context). Through removing the Titan brand and instead creating the 90 tier, Nvidia is preying on the exact mechanism you're describing - of gamers wanting "the best", no matter what it is.
I understand; that's within my logic. Professionals care about big VRAM and NVLink for memory pooling. Whether you call the product Titan or 40x0 does not matter; the 4090 has both, so it's the choice. Put 2-3 cards in a graphics server, and the software/hardware will pool everything: CUDA cores, RT cores, memory. A powerful and cheap solution for small companies and advanced students. The price was always >$1000 - no change at all.
That was my logic: Nvidia is using PR to get gamers to buy the 40x0 products too. And successfully. The explanation lies within the gamer community. Professionals mostly know what they are buying and what for.
Outside of this being a marketing/branding move that borders very close on being exploitative in and of itself, this has other problems: the Titan branding also brought with it price separation. Titans were wildly expensive compared to their close Geforce siblings, which was defensible through these being for business, not for gaming. Through including these same cards into the Geforce lineup, they now have the "freedom" to lift pricing for the rest of the Geforce lineup up so that instead of being a major step upwards in price, it's instead continuous with lower tier cards. That's how you go from $1200 Titan X(p) and $700 1080 Ti (+71% price) to $1600 RTX 4090 and $1200 4080 16GB (+33%). The net effect of this is not the Titan-class card being "a lot cheaper than the Quadro series" - they always were - but instead the entirety of the Geforce lineup becoming more expensive through a persistent lifting of the price ceiling for such cards, and thus slowly shifting the marketing/pricing equivalent of the Overton window - the window of what is seen as acceptable and reasonable pricing for a GPU.

There is no other explanation of this that makes any type of sense other than Nvidia wanting to increase prices and squeeze more money out of gamers.
I agree partially. RT on gaming cards was an innovation based on technology Nvidia used in the professional markets. They implemented it in gamer hardware to realize synergies and market leadership (AMD is clearly behind in this area). I remember all the discussions about the 20x0 series. My position was: wait some years, and there will be no high-end card without ray tracing. This guarantees Nvidia technological leadership in the gaming sector without additional development costs, as they need RT/CUDA/Tensor for the HPC sector anyway. So what is going on? Nvidia blurs the line between semi-professional use and high-end gaming, to serve different peer groups with one product. And it has been this way from the beginning: CUDA support was enabled in the gaming drivers as far back as Maxwell. Every student who needed ray tracing or CUDA power could do some crazy calculations with cheap gaming cards. My first Nvidia card was a 970 for exactly that reason; before that I had a Radeon. Once you have adapted to a particular software/hardware environment, you will not switch easily.
People who spend >$1000 on gaming - well, that's OK. What I don't like is Nvidia's low- and mid-tier pricing. But that's an opportunity for AMD. So there is something for everyone.
#225
oxrufiioxo
Valantar: Yeah, I think that's spot on. And I also think that if Nvidia were to get their way going down that path, it would be the end of PC gaming - barring someone else stepping in to fill their shoes. PC gaming as a hyper-expensive niche just isn't sustainable as a business. If anything, the Steam Deck has demonstrated that the future of gaming might lie in the exact opposite direction, and that we might be reaching a point of "good enough" in a lot of ways.

I also agree that this situation presents a massive opportunity for AMD. Regardless of where RDNA3 peaks in terms of absolute performance, if they flesh out the $300-700 segment with high performing, good value options relatively quickly, they have an unprecedented opportunity to overtake Nvidia - as this pitch from Nvidia (as well as their earnings call) confirms that they've got more Ampere stock than they know what to do with, and they don't want to stomach the cost of cutting prices unless they really have to. If RDNA3 actually delivers its promised 50% perf/W increase, and die sizes and MCM packaging costs allow for somewhat competitive pricing, this could be the generation where AMD makes a significant move up from their perennial ~20% market share. That's my hope - but knowing AMD and their recent preference for high ASPs and following in Nvidia's pricing footsteps, I'm not that hopeful. Fingers crossed, though.
I think AMD is more focused on CPU than GPU; all their products are made on the same TSMC node, unlike Nvidia, which gets a special node just for its GPUs. I think AMD is happy with 20-25% of the market and just making a profit, on top of being in the PS5 and Series X/S.

That being said, I still expect the 7900 XT to be $1,200 or less and the 7800 XT to be $800 or less. I just don't think AMD cares, as long as it's $50-100 cheaper than the comparable Nvidia SKU, even if it's much cheaper to make. I also wouldn't be surprised if the 7900 XT was $1,500, though... I doubt anything below these two cards will launch anytime soon, with maybe a 7700 XT early next year; to me, that will be the more interesting card in terms of how they price it.

Another thing, and this goes for Nvidia as well: TSMC is really the one dictating prices. Not sure if it's true, but the new Ada GPUs are supposedly much more costly to make than Ampere. I'd really like to see a breakdown of component costs, though; I know SMD components are getting more expensive as well.