
NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
NVIDIA has something like 80% of the discrete desktop graphics card market
That's revenue share, not unit share, afaik.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
The 3090, 4090 etc. are for semiprofessionals and creative people, not for gamers; nobody needs 24 GB of VRAM in gaming. But it's nice for mixed use, and a lot cheaper than the Quadro series.

The only explanation for me: gamers love to have the "best" card; naming/PR is everything. AMD isn't much better, nor much cheaper, so there is room for a good margin.
But that's precisely where this specific interpretation falls apart: Nvidia specifically cancelled the Titan programme and renamed the Titan-equivalent cards Geforce. Geforce, crucially, is a gaming GPU series. So they removed a "prosumer" class designation and instead included these cards in the consumer-facing Geforce series (yes, early Titans were Geforce; later ones were not). The only logic that explains this is that they saw the Titan branding as something that discouraged gamers from buying these GPUs - which makes sense, as Geforce is the gaming brand, so by calling them Titan they were saying "well, you can use these for gaming, sure, but that's not really what they're for".

As such, the move from Titan to Geforce is an explicit attempt at presenting these GPUs as consumer or gamer oriented, not prosumer cards (despite their technical details definitely making more sense in that context). By removing the Titan brand and creating the 90 tier instead, Nvidia is preying on the exact mechanism you're describing - gamers wanting "the best", no matter what it is.

Beyond this being a marketing/branding move that comes very close to being exploitative in and of itself, it has other problems: the Titan branding also brought with it price separation. Titans were wildly expensive compared to their close Geforce siblings, which was defensible because they were for business, not for gaming. By folding these same cards into the Geforce lineup, they now have the "freedom" to lift pricing for the rest of the Geforce lineup so that, instead of being a major step up in price, the top card is continuous with the lower tiers. That's how you go from a $1200 Titan X(p) and a $700 1080 Ti (+71% price) to a $1600 RTX 4090 and a $1200 4080 16GB (+33%). The net effect is not the Titan-class card being "a lot cheaper than the Quadro series" - it always was - but the entire Geforce lineup becoming more expensive through a persistent lifting of the price ceiling, slowly shifting the marketing/pricing equivalent of the Overton window - the window of what is seen as acceptable and reasonable pricing for a GPU.
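The two price gaps quoted above are easy to verify with a quick sketch. The MSRPs are the ones cited in this post (taken as given, not independently verified):

```python
# Flagship-vs-next-tier price gap, using the launch MSRPs quoted above.
def pct_gap(lower: float, upper: float) -> float:
    """Percent step up from the lower-tier card to the flagship."""
    return (upper / lower - 1) * 100

pascal = pct_gap(700, 1200)   # GTX 1080 Ti ($700) -> Titan X(p) ($1200)
ada = pct_gap(1200, 1600)     # RTX 4080 16GB ($1200) -> RTX 4090 ($1600)

print(f"Pascal-era gap: +{pascal:.0f}%")  # +71%
print(f"Ada-era gap:    +{ada:.0f}%")     # +33%
```

The shrinking gap is the point: the flagship's price no longer looks like an outlier, so it drags the perceived "normal" price of the tiers below it upwards.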

There is no explanation for this that makes any kind of sense other than Nvidia wanting to increase prices and squeeze more money out of gamers.
 
Joined
Jun 16, 2022
Messages
76 (0.08/day)
Location
Hungary
System Name UNDERVOLTED (UV) silenced, 135W limited, high energy efficient PC
Processor Intel i5-10400F undervolt
Motherboard MSI B460 Tomahawk
Cooling Scythe Ninja 3 rev.B @ 620RPM
Memory Kingston HyperX Predator 3000 4x4GB UV
Video Card(s) Gainward GTX 1060 6GB Phoenix UV 775mV@1695MHz, 54% TDP (65 Watt) LIMIT
Storage Crucial MX300 275GB, 2x500GB 2.5" SSHD Raid, 1TB 2.5" SSHD, BluRay writer
Display(s) Acer XV252QZ @ 240Hz
Case Logic Concept K3 (Smallest Full ATX/ATX PSU/ODD case ever..)
Audio Device(s) Panasonic Clip-On, Philips SHP6000, HP Pavilion headset 400, Genius 1250X, Sandstrøm Hercules
Power Supply Be Quiet! Straight Power 10 500W CM
Mouse Logitech G102 & SteelSeries Rival 300 & senior Microsoft IMO 1.1A
Keyboard RAPOO VPRO 500, Microsoft All-In-One Keyboard, Cougar 300K
Software Windows 10 Home x64 Retail
Benchmark Scores More than enough Fire Strike: 3dmark.com/fs/28356598 Time Spy: 3dmark.com/spy/30561283
Technically the 4090 is $400 cheaper than the launch 3090 Ti, and given that it's 2-4x faster depending on what you're doing with it, that isn't bad

Not a smart way to fold the performance gain into the price. By this logic, a GTX **60 mid-range card could get 20-30% more expensive from generation to generation. Compounded from the GTX 260 until the RTX 3060, the 3060 should then cost at least $1000-1100, or even more...
So by this logic, if we paid $700-800 for an RTX 3060, we should be happy with a $400 "discount"? I do not think so...
Only a moderate price increase is acceptable - one that covers inflation and design and manufacturing complexity. But in the age of mining and NVIDIA's high margins, the price increases of the last few years are not acceptable.

Just look at the financial statements of these companies - NVIDIA, AMD... Do you see a similar financial increase in your personal life, or among your family or friends? They are growing like crazy, while the average person in society has even less financial room than years before.
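The compounding argument in this post can be illustrated numerically. The starting price and the generation count below are my own rough assumptions for illustration, not exact MSRP history:

```python
# Why per-generation price hikes compound: a flat 20-30% bump per
# generation balloons quickly over several generations.
def compounded_price(start: float, rate: float, generations: int) -> float:
    """Price after `generations` hikes of `rate` each (compound growth)."""
    return start * (1 + rate) ** generations

start = 230.0  # hypothetical x60-class launch price (assumption)
gens = 7       # rough generation count, GTX 260 -> RTX 3060 (assumption)

for rate in (0.20, 0.30):
    print(f"{rate:.0%}/gen over {gens} gens -> "
          f"${compounded_price(start, rate, gens):,.0f}")
```

With these assumed inputs, the result lands in the high hundreds to low thousands of dollars for a "mid-range" card, which is the absurdity the post is pointing at.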
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Not a smart way to fold the performance gain into the price. By this logic, a GTX **60 mid-range card could get 20-30% more expensive from generation to generation. Compounded from the GTX 260 until the RTX 3060, the 3060 should then cost at least $1000-1100, or even more...
So by this logic, if we paid $700-800 for an RTX 3060, we should be happy with a $400 "discount"? I do not think so...
Only a moderate price increase is acceptable - one that covers inflation and design and manufacturing complexity. But in the age of mining and NVIDIA's high margins, the price increases of the last few years are not acceptable.

Just look at the financial statements of these companies - NVIDIA, AMD... Do you see a similar financial increase in your personal life, or among your family or friends? They are growing like crazy, while the average person in society has even less financial room than years before.
I tend to agree, except for one thing: naming and marketing tiers are arbitrary, while price points relate to people's real money and ability to pay - which tends not to scale with inflation in recent decades (the US has had wage stagnation for, what, 50 years?). So, price increases following inflation would still make cards unaffordable over time. Instead, the pressure should be on chipmakers to design more cost-effective chips for each tier, while still delivering gen-on-gen performance increases for the same price. When 6-tier GPUs were ~$200-250 seven years ago, it's unacceptable for them to be $329-400 in 2020, and likely $400-500 in 2022-3 even if this matches inflation. Why? Because soon enough you'll run out of product tiers going down - and it's not like Nvidia is noticeably expanding their range downwards. Those $200-$250 6 tier GPUs in 2016 were accompanied by $140 5 Ti and $110 5-tier GPUs, after all, while today the best they've got on offer is the $250 (doubled!) RTX 3050. (Please don't mention the GTX 1630.)

What does this mean? That GPUs are getting increasingly unaffordable, and the main culprit is that GPU makers simply aren't making and selling affordable GPUs, instead insisting on artificially inflating margins. The RTX 3050 could easily be $150 if Nvidia was focused on designing an affordable GPU. These are conscious product segmentation choices made on the level of chip design and tiering. Instead, Nvidia is working concertedly towards increasing prices across the board. And sadly, so far AMD is happy to follow suit.
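The inflation point above can be sanity-checked with a tiny sketch. The ~2.5%/year average rate is my own illustrative assumption, not an official CPI figure:

```python
# Even if a ~$225 card from 2016 had only tracked inflation, it would
# land far below today's x60-class prices.
def inflate(price: float, annual_rate: float, years: int) -> float:
    """Price adjusted forward by compound annual inflation."""
    return price * (1 + annual_rate) ** years

base_2016 = 225.0  # midpoint of the $200-250 range cited above
adjusted_2022 = inflate(base_2016, 0.025, 6)  # assumed 2.5%/yr average

print(f"Inflation-adjusted 2022 price: ${adjusted_2022:.0f}")
```

Under these assumptions the adjusted figure stays in the mid-$200s - well short of the $400-500 the post expects for the next 6-tier card - which is the gap the argument hinges on.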
 
Joined
Sep 10, 2018
Messages
6,922 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Not a smart way to fold the performance gain into the price. By this logic, a GTX **60 mid-range card could get 20-30% more expensive from generation to generation. Compounded from the GTX 260 until the RTX 3060, the 3060 should then cost at least $1000-1100, or even more...
So by this logic, if we paid $700-800 for an RTX 3060, we should be happy with a $400 "discount"? I do not think so...
Only a moderate price increase is acceptable - one that covers inflation and design and manufacturing complexity. But in the age of mining and NVIDIA's high margins, the price increases of the last few years are not acceptable.

Just look at the financial statements of these companies - NVIDIA, AMD... Do you see a similar financial increase in your personal life, or among your family or friends? They are growing like crazy, while the average person in society has even less financial room than years before.

I don't really care what they price the 60-tier cards at; I'd never buy one regardless, whether they were 50 USD or 1000 USD.

I don't care what others can or can't afford. If they're too expensive for you or anyone else, don't buy them; it's that simple. These aren't necessities in life we're talking about.

Same with me: if the 80/90 tier ever goes out of my price range, guess what, I'll stop buying them. Life goes on...

Getting upset over how a company prices their own products is pointless... They're not a charity or your friends; they will all try to make the most money possible.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I don't really care what they price the 60-tier cards at; I'd never buy one regardless, whether they were 50 USD or 1000 USD.

I don't care what others can or can't afford. If they're too expensive for you or anyone else, don't buy them; it's that simple. These aren't necessities in life we're talking about.

Same with me: if the 80/90 tier ever goes out of my price range, guess what, I'll stop buying them. Life goes on...
That is an incredibly privileged and myopic point of view. Which is of course your right to have - but it also demonstrates a severe lack of perspective and base level compassion on your part. These not being "necessities" doesn't make the potential consequences of not affording them any less real - from losing access to a preferred way of relaxing/unwinding, to losing access to a social group, to even limitations on future job opportunities (if you're not able to play games, it's extremely unlikely that you'll ever end up working in or around game development). And more. Are any of these things world-ending? No, but very few things in life are. That doesn't make them any less painful or real to the people experiencing them. Sure, videogames are ultimately trivial for the vast majority of people playing them. That doesn't mean their potential positive effects on social and mental well-being in an increasingly precarious world are any less real. And losing access to a social circle, an activity shared with friends, or a way of unwinding and relaxing after an exhausting and draining workday can indeed be downright horrible.
 
Joined
Dec 12, 2012
Messages
774 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
I don't really care what they price the 60-tier cards at; I'd never buy one regardless, whether they were 50 USD or 1000 USD.

I don't care what others can or can't afford. If they're too expensive for you or anyone else, don't buy them; it's that simple. These aren't necessities in life we're talking about.

Same with me: if the 80/90 tier ever goes out of my price range, guess what, I'll stop buying them. Life goes on...
This sounds like EA's Battlefield V logic "if you don't like our game, don't buy it", or Don Mattrick's "if you don't have internet, stick with Xbox 360".

I think we all know how both of these turned out.

NVIDIA can do whatever they want. But I will take this advice and simply not buy it. If they drop prices at some point, I might. If not, I will wait for the next generation (or at least consider a Radeon card, although I do want great RT performance and I do love DLSS).
 
Joined
Sep 10, 2018
Messages
6,922 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
That is an incredibly privileged and myopic point of view. Which is of course your right to have - but it also demonstrates a severe lack of perspective and base level compassion on your part. These not being "necessities" doesn't make the potential consequences of not affording them any less real - from losing access to a preferred way of relaxing/unwinding, to losing access to a social group, to even limitations on future job opportunities (if you're not able to play games, it's extremely unlikely that you'll ever end up working in or around game development). And more. Are any of these things world-ending? No, but very few things in life are. That doesn't make them any less painful or real to the people experiencing them. Sure, videogames are ultimately trivial for the vast majority of people playing them. That doesn't mean their potential positive effects on social and mental well-being in an increasingly precarious world are any less real. And losing access to a social circle, an activity shared with friends, or a way of unwinding and relaxing after an exhausting and draining workday can indeed be downright horrible.

Don't get me wrong, if I could wave a magic wand and make GPUs affordable for everyone, I would, but that's not reality. Everyone has to decide with their own hard-earned money what they can afford or what they'll tolerate as far as pricing. To me, $1200 and $1600 is OK assuming the performance uplift justifies it; to others it's a slap in the face. Neither of us is wrong.

@kiakk Calling me out about a flagship being cheaper than a previous-gen flagship, and then going off on a tangent about 60-tier pricing and how it should only be priced in a way that is OK with him, makes no sense to me though.

This sounds like EA's Battlefield V logic "if you don't like our game, don't buy it", or Don Mattrick's "if you don't have internet, stick with Xbox 360".

I think we all know how both of these turned out.

NVIDIA can do whatever they want. But I will take this advice and simply not buy it. If they drop prices at some point, I might. If not, I will wait for the next generation (or at least consider a Radeon card, although I do want great RT performance and I do love DLSS).

But what other choice do we have? I mean, if you don't like the price, the only option is not to buy it. Getting upset in a tech forum about it does absolutely nothing.

The 4080 12GB is stupidly priced at $900; I wouldn't touch it, and I doubt reviews will change that. That leaves me with the 4080 16GB and the 4090 if I want a noticeable upgrade this gen... Will I buy one? No idea, but I'll at least reserve judgment until third-party reviews go live. I still think this has more to do with them wanting to clear out Ampere stock; I mean, they still consider three Ampere cards part of their current offerings as it is.

I personally like RT, especially in games like Control and Cyberpunk with 3-4 different implementations, so AMD is probably not an option this generation either. It sounds like their implementation will be quite a bit weaker, but I'd love to be wrong.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Don't get me wrong, if I could wave a magic wand and make GPUs affordable for everyone, I would, but that's not reality. Everyone has to decide with their own hard-earned money what they can afford or what they'll tolerate as far as pricing. To me, $1200 and $1600 is OK assuming the performance uplift justifies it; to others it's a slap in the face. Neither of us is wrong.
You're not wrong about this in a vacuum, but you're entirely ignoring the causes and power relations behind these developments. Nvidia could keep per-tier GPU pricing entirely fixed if it wanted to, while increasing performance each generation - it would just need to prioritize differently when laying out their GPU dice and making their reference designs. It would make per-generation gains look less impressive most of the time - but Turing demonstrated that this really isn't a problem for Nvidia. The point being: it would be entirely possible for them to design $200 6-tier GPUs, and more - if they wanted to. Instead, they're prioritizing absolute peak performance and gradually increasing average sale prices, mainly because that makes shareholders happy. But this also intrinsically pushes away their customer base - just look at how many people are still using GTX 1060s - and I can't help but think this is going to come back to bite them in the long run. For now, they're being propped up by AMD not wanting to compete on price but instead also wanting to raise average sale prices - which is understandable when AMD hasn't had any free fab capacity for something like three years now. But if that changes, if TSMC's drop-off in orders means AMD now has capacity to spare, we might be looking at a new market share/sales volume push for AMD. This is clearly me hoping for something to happen rather than necessarily thinking that it will - AMD's shareholders also love high average sale prices - but there's definitely an opportunity here for AMD.

But what other choice do we have? I mean, if you don't like the price, the only option is not to buy it.
Yes, exactly. And how, precisely, is this a sustainable attitude for Nvidia - or any other mass market company - to hold towards their customers?

I mean, there are other choices. Used market, consoles (a gift to AMD, really), finding other hobbies. Nvidia has nothing to gain from alienating customers - but that's what they're doing. They've been pushing hard for quite a while to get as much money as they can from a rapidly expanding customer base, but we've already reached a saturation point, and rather than accepting that they overplayed their hand, or played hard and went out of bounds, and accepting the (small!) cost of this, Nvidia are instead continuing on a path of maximizing profits rather than designing attractive products people can actually afford.
 
Joined
Sep 10, 2018
Messages
6,922 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
You're not wrong about this in a vacuum, but you're entirely ignoring the causes and power relations behind these developments. Nvidia could keep per-tier GPU pricing entirely fixed if it wanted to, while increasing performance each generation - it would just need to prioritize differently when laying out their GPU dice and making their reference designs. It would make per-generation gains look less impressive most of the time - but Turing demonstrated that this really isn't a problem for Nvidia. The point being: it would be entirely possible for them to design $200 6-tier GPUs, and more - if they wanted to. Instead, they're prioritizing absolute peak performance and gradually increasing average sale prices, mainly because that makes shareholders happy. But this also intrinsically pushes away their customer base - just look at how many people are still using GTX 1060s - and I can't help but think this is going to come back to bite them in the long run. For now, they're being propped up by AMD not wanting to compete on price but instead also wanting to raise average sale prices - which is understandable when AMD hasn't had any free fab capacity for something like three years now. But if that changes, if TSMC's drop-off in orders means AMD now has capacity to spare, we might be looking at a new market share/sales volume push for AMD. This is clearly me hoping for something to happen rather than necessarily thinking that it will - AMD's shareholders also love high average sale prices - but there's definitely an opportunity here for AMD.

I definitely see your point. I personally would prefer they push absolute performance. You're not wrong though.

A lot of Ada's design was probably decided during the peak of the mining boom; I wonder, in retrospect, if they would change anything. I'm sure they're still salty about retailers making more than they did for the first 6-8 months of Ampere's life. I personally expected the two 4080s to be cheaper but the 4090 to be more expensive, but here we are.

Going forward, I'm as interested as anyone in how the market reacts. Turing didn't do so well, but Ampere did amazingly for them, due to reasons. It'll be interesting to see how Ada does.


Kudos to those who got a 10GB 3080 near MSRP at launch, though; in retrospect that was a hell of a deal, same with the 6800 XT to a lesser extent.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I definitely see your point. I personally would prefer they push absolute performance. You're not wrong though.
I think there could be ways for them to do both - they could for example come up with some sort of "premium" branding for higher end cards. Titan could arguably have been that - but instead, they started out with Turing pushing per-tier pricing higher with that weird 20/16 split ("yes, there are now four 6-tier SKUs, why not?"). If they had instead expanded Titan into a "premium gaming" segment, that could have brought with it a ton of prestige (and thus desirability and sales) at high prices, while leaving attractive-seeming tier designations available for lower-end cards, opening the door for perceived good value GPUs as well. Instead they've essentially abandoned the $200 price point for two full generations now, focusing solely on the higher end.

To me, this is part of why I tend towards believing EVGA's version of their split: their description is consistent with Nvidia's behaviour towards only caring about increasing margins, ASPs and per-tier pricing for more than half a decade now. It's likely not the full story, but that glove definitely does seem to fit nonetheless.

Of course, mobile also likely plays a big part in this - and the relatively large dice and expensive designs for the past few generations match well with these being targeted towards lower clocked mobile chips that still deliver good absolute performance. Smaller, cheaper designs would inevitably do worse in mobile by simple virtue of not having the ability to clock as high. And, of course, two successive crypto booms and a two-year global lockdown have no doubt also driven a belief that whatever the price, the GPUs will sell. We'll see what kind of correction this leads to in the near future, if anything. For now, their expressed tactic is to not change anything and try to rake in profits as much as possible. Which really does leave the door open for AMD.
 
Joined
Feb 9, 2015
Messages
41 (0.01/day)
System Name Raistlin
Processor Ryzen 5 5600X
Motherboard MSI X470 Gaming Pro
Cooling Noctua NH-D15S with dual fans
Memory 32GB G.Skill 3600MHz DDR4 CL16 (F4-3600C16-16GTZNC)
Video Card(s) Nvidia RTX 3090 (MSI Suprim X)
Storage 1 x 960GB SX8200, 1 x 1TB SX8200, 1 x 2TB Seagate HDD
Display(s) LG 34GP950G, 2x DELL S2721D, LG 48" C2 OLED (OLED48C2PUA), HiSense 75U78KM (75" Mini-LED 4K TV)
Case Thermaltake Core X9
Audio Device(s) Topping E30 + Drop O2 Amplifer + Sennheiser HD 600 / HIFIMAN HE4XX / Sound BlasterX Katana
Power Supply EVGA SuperNova 1300 G2
Mouse Razer Naga Pro wireless
Keyboard Ducky One 2 full size
VR HMD HP Reverb G2
Software Windows 10 Professional
I don't really care what they price the 60-tier cards at; I'd never buy one regardless, whether they were 50 USD or 1000 USD.

I don't care what others can or can't afford. If they're too expensive for you or anyone else, don't buy them; it's that simple. These aren't necessities in life we're talking about.

Same with me: if the 80/90 tier ever goes out of my price range, guess what, I'll stop buying them. Life goes on...

Getting upset over how a company prices their own products is pointless... They're not a charity or your friends; they will all try to make the most money possible.
Pretty much this. It's like complaining Ferraris are too expensive; if they're too expensive you're not the target market.

I have worked extremely hard my entire adult life (I am now 44) to be in the position where my hobbies aren't a financial hardship, and this is just one example where that work has paid off.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Pretty much this. It's like complaining Ferraris are too expensive; if they're too expensive you're not the target market.

I have worked extremely hard my entire adult life (I am now 44) to be in the position where my hobbies aren't a financial hardship, and this is just one example where that work has paid off.
And you've thus also clearly had the privilege of having that hard work actually pay off - unlike a lot of people. Generally, the hardest-working people you'll find are those working two or three shit jobs, barely covering rent and basic living expenses. Please don't confuse having been lucky enough for things to work out with less lucky people not having worked as hard, or not being as deserving of good things.

Also, 6-tier GPUs are distinctly not Ferraris - they're supposed to be the Toyotas of the GPU world. And if supposedly cheap Toyotas start being priced like Ferraris, something is seriously wrong.
 
Joined
Sep 10, 2018
Messages
6,922 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
To me, this is part of why I tend towards believing EVGA's version of their split: their description is consistent with Nvidia caring only about increasing margins, ASPs and per-tier pricing for more than half a decade now. It's likely not the full story, but that glove definitely does seem to fit nonetheless.

That was a huge blow to me; I've owned at least 10 EVGA GPUs over the last decade. They're a small company not far from me, fewer than 300 employees I believe. I've also tried to support them because they're local. I hope they change their minds and at least make some AMD GPUs; a Kingpin 7900XT that beats the 4090 in rasterization would be lovely to me.
 
Joined
Sep 15, 2016
Messages
484 (0.16/day)
I noticed something about the DLSS 3 demonstration. See if you can spot it.


dlsshmm.png
 
Joined
Aug 21, 2015
Messages
1,725 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marthon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
What does this mean? That GPUs are getting increasingly unaffordable, and the main culprit is that GPU makers simply aren't making and selling affordable GPUs, instead insisting on artificially inflating margins. The RTX 3050 could easily be $150 if Nvidia were focused on designing an affordable GPU. These are conscious product segmentation choices made at the level of chip design and tiering. Instead, Nvidia is working concertedly towards increasing prices across the board. And sadly, so far AMD is happy to follow suit.

I've complained about this before, but the 3050 is a 6-series card wearing Groucho glasses. Its power consumption is in line with earlier xx60 cards, as is the generational performance uplift (well, outside of the huge 960-1060 leap). The best we could have hoped for was ~$200, the general historical price point for that level of card (again excepting the $300 1060). Two factors prevented what we know as the 3050 from launching as a 3060: Nvidia's obvious campaign to push what's considered entry-level and midrange ever higher, and the existence of the 1660S/Ti. The 3050's raster performance is right on top of those two, at a higher TDP, no less! Naming it the 3060 would have been branding suicide.
 
Joined
Sep 10, 2018
Messages
6,922 (3.05/day)
Joined
May 2, 2017
Messages
7,762 (2.81/day)
I've complained about this before, but the 3050 is a 6-series card wearing Groucho glasses. Its power consumption is in line with earlier xx60, as is the generational performance uplift (well, outside of the huge 960-1060 leap). The best we could have hoped for was ~$200, the general historical price point for that level of card (again excepting the $300 1060). Two factors preventing what we know as the 3050 launching as a 3060: Nvidia's obvious campaign to push what's considered entry-level and midrange ever higher, and the existence of the 1660S/ti. The 3050's raster performance is right on top of those two, at a higher TDP, no less! Naming it the 3060 would have been branding suicide.
Yeah, that's a good point. But honestly, at this point Nvidia needs to realize that their current product stack strategy is ... well, not workable. Ever since killing off the GT/GTX distinction, they've had 10 tiers to use, of which tier 1 has been "this is a display adapter", tier 3 "uh ... guess this can do some 3D", tier 5 entry-level gaming, 6 mid-range, 7 upper mid-range to premium, and 8 premium-to-flagship. But then, as higher-than-1080p resolutions have proliferated while 1080p has stayed put, the range of usable performance has widened, while the product stack has only widened by one tier - 9, the halo/flagship/"this is really a Titan" tier. And their reluctance to use more lower-tier designations is understandable - there's definitely an argument to be made for the 3, 4 and 5 tiers being unattractive. Not many people want to buy a Core i3, even if it's good. You buy that if it's what you can afford.

Still, I think Nvidia (and AMD for that matter, though they seem a tad more flexible currently) needs to shift things. If they just made the leap, told people that "what used to be called 6 is now called 5", and accepted the dampening effect of this on sales for one generation (which they could most likely counteract by marketing the crap out of that new 50-series), they'd have a lot more flexibility in product segmentation. Instead they're forcing themselves to fit a massive range of performance into just four numbered tiers. Which is just stupid. So instead we've now got ... what, eight SKUs across four tiers, ranging from the 3060 to 3090 Ti. The 3050 is clearly an afterthought, and a card they don't really seem to want to sell, given its pricing and market positioning.

As I suggested above, they could also have alleviated this by creating a "premium" brand of some sort, moving perhaps the top three SKUs to this tier. Call them Titan, call them GeforceX, call them Xtreme XGeforce XRTX - whatever. This way they could have called the 3050 the 3060, priced it at $200, and sold tons and tons of them. And they could still have had $700 80-tier premium/high end tier cards that would be aspirational for the people buying $200 GPUs - most of those spend far less on their PC than the cost of a 3090, so selling them on the 3090 being great isn't much of an advertisement anyhow.

Of course, the mining+lockdown WFH/gaming booms have made such shifts unnecessary - up until now. I guess we'll see how this plays out, but if they insist on keeping the "Geforce [four numbers]" structure as their only GPU brand, they need to start shifting things - the sooner the better for them. But for now, their preferred tactic seems to be "profits today, screw tomorrow".
 
Joined
Dec 12, 2012
Messages
774 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
Pretty much this. It's like complaining Ferraris are too expensive; if they're too expensive you're not the target market.

I have worked extremely hard my entire adult life(I am now 44) to be in the position where my hobbies aren't a financial hardship, and this is just one example where that work has paid off.

I think Ferraris have always been expensive. Graphics cards have not. If you look at the x80 cards in the last decade, we started at $500 and we got gradual increases of about $100 over the years, which is understandable even if not necessary. But now we suddenly go from $700 to $1200 for no reason. There really will come a point when people will stop buying new cards, because this does not make sense. This is just greed, and of course they have a right to be greedy.

But maybe it is time to change things. If cards are too expensive to manufacture, then they should make smaller GPUs with simpler PCBs and coolers. Maybe we should go back to 200 W being the maximum power consumption. Make smaller performance increases, but make it possible for people to actually buy cards in the range of $200 to $500.

All other PC components are super affordable, including CPUs, which have had gigantic leaps in performance in the last 5 years, but prices have basically stayed the same. So it can be done.
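To put rough numbers on that x80 trajectory, here's a quick sketch using commonly cited launch MSRPs (some generations had multiple editions at different prices, so treat the figures as approximate):

```python
# x80-class launch MSRPs in USD (commonly cited figures; approximate,
# since some generations had multiple editions at different prices).
msrps = {
    "GTX 680": 499,
    "GTX 980": 549,
    "GTX 1080": 599,
    "RTX 2080": 699,
    "RTX 3080": 699,
    "RTX 4080 16GB": 1199,
}

def generational_changes(prices):
    """Return (card, absolute change, percent change) for each step."""
    names = list(prices)
    return [
        (cur, prices[cur] - prices[prev],
         round(100 * (prices[cur] - prices[prev]) / prices[prev], 1))
        for prev, cur in zip(names, names[1:])
    ]

for card, delta, pct in generational_changes(msrps):
    print(f"{card}: {delta:+d} USD ({pct:+.1f}%)")
```

Every step before the 4080 is at most a $100 bump; the last one is +$500, over 70% in one generation.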
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
I noticed something about the DLSS 3 demonstration. See if you can spot it.


View attachment 262494
That's quite interesting. Do they mention what GPU that's running on though? It might not be the 4090.

But maybe it is time to change things. If cards are too expensive to manufacture, then they should make smaller GPUs with simpler PCBs and coolers. Maybe we should go back to 200 W being the maximum power consumption. Make smaller performance increases, but make it possible for people to actually buy cards in the range of $200 to $500.

All other PC components are super affordable, including CPUs, which have had gigantic leaps in performance in the last 5 years, but prices have basically stayed the same. So it can be done.
Exactly this. These are conscious design decisions on GPU makers' parts, and decisions that could have been made differently. Yes, there are other factors in play - increasing materials prices, faster I/O requiring more expensive PCB materials, etc. - but this can be accounted for to various degrees. Instead, they're consciously pushing for higher prices in every way they can.
 
Joined
Oct 28, 2012
Messages
1,190 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD sn550 1To/WD ssd sata 1To /WD black sn750 1To/Seagate 2To/WD book 4 To back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
But that's precisely where this specific interpretation falls apart: Nvidia specifically cancelled the Titan programme, and renamed the Titan-equivalent cards Geforce. Geforce, crucially, is a gaming GPU series. So, they removed a "prosumer" class designation and instead included these cards in the consumer-facing Geforce series (yes, early Titans were Geforce; later ones were not). Literally the only logic to explain this is that they saw the Titan branding as something that discouraged gamers from buying these GPUs - which makes sense, as Geforce is the gaming brand, so by calling them Titan they were saying "well, you can use these for gaming, sure, but that's not really what they're for". As such, the move from Titan to Geforce is an explicit attempt at presenting these GPUs as consumer or gamer oriented, not prosumer cards (despite their technical details definitely making more sense in that context). Through removing the Titan brand and instead creating the 90 tier, Nvidia is preying on the exact mechanism you're describing - of gamers wanting "the best", no matter what it is.

Outside of this being a marketing/branding move that borders very close on being exploitative in and of itself, this has other problems: the Titan branding also brought with it price separation. Titans were wildly expensive compared to their close Geforce siblings, which was defensible through these being for business, not for gaming. Through including these same cards into the Geforce lineup, they now have the "freedom" to lift pricing for the rest of the Geforce lineup up so that instead of being a major step upwards in price, it's instead continuous with lower tier cards. That's how you go from $1200 Titan X(p) and $700 1080 Ti (+71% price) to $1600 RTX 4090 and $1200 4080 16GB (+33%). The net effect of this is not the Titan-class card being "a lot cheaper than the Quadro series" - they always were - but instead the entirety of the Geforce lineup becoming more expensive through a persistent lifting of the price ceiling for such cards, and thus slowly shifting the marketing/pricing equivalent of the Overton window - the window of what is seen as acceptable and reasonable pricing for a GPU.

There is no other explanation of this that makes any type of sense other than Nvidia wanting to increase prices and squeeze more money out of gamers.
There's something interesting happening right now: we used to have a lot of computers that were expensive but still sold because they were the best thing around by a long shot (like Silicon Graphics). Wintel won and became mainstream because it was cheap and comparable. But now it seems like Nvidia is determined to turn the PC into a niche that's more about bleeding-edge technological display than affordability. And while it seems that AMD has a huge margin to increase their prices as well, I get the impression that it would be unwise for them to do so (at least to the same degree as Nvidia) because of the perceived value of Radeon against Geforce. Nvidia made it clear that Ada is so great because of the whole closed software ecosystem around it. The RTX 3070 is going to stay a representation of the $500 market for a long time, so when AMD presents the successor of the RX 6800 they have the opportunity to strike a low blow. The new RTX family still includes Ampere 3060-3080 cards that haven't dropped in price. History has shown that people like a good deal, and now consoles also have high refresh rates, and it's easier to have a good HDR experience with a TV. The PC is betting it all on ray tracing at a very heavy cost; at some point people might wonder if it's really worth it (especially with engines like UE5 not relying on conventional ray tracing).
1663777473312.png
 
Last edited:
Joined
Aug 21, 2015
Messages
1,725 (0.51/day)
That's quite interesting. Do they mention what GPU that's running on though? It might not be the 4090.


Exactly this. These are conscious design decisions on GPU makers' parts, and decisions that could have been made differently. Yes, there are other factors in play - increasing materials prices, faster I/O requiring more expensive PCB materials, etc. - but this can be accounted for to various degrees. Instead, they're consciously pushing for higher prices in every way they can.

Maybe they see the discrete market contracting long-term, and are laying the groundwork to pull the same net profit out of fewer units. Still crappy from our perspective, but with a much lower "Wut!?" factor. Or it's an allocation thing: the chips are more valuable for HPC, so it's got to be worth Nvidia's while to even make consumer-bound chips.

There's something interesting happening right now: we used to have a lot of computers that were expensive, but still sold because they were the best thing around by a long shot (like Silicon Graphics). ...
View attachment 262502

Desktop market share is only going to shrink, with or without NV's influence. Laptops are more convenient, phones even more so. I feel a little gross defending NV at all, but they may not have much of an option, broadly speaking.
 
Joined
Sep 10, 2018
Messages
6,922 (3.05/day)
There's something interesting happening right now: we used to have a lot of computers that were expensive, but still sold because they were the best thing around by a long shot (like Silicon Graphics). ...
View attachment 262502

Not like Jensen was trying to hide anything; he stated prior to these announcements that any Ada GPU launched this year would be layered on top of existing Ampere GPUs. Personally, I figured that meant we were only getting a 4090..... Guess I was wrong.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
There's something interesting happening right now: we used to have a lot of computers that were expensive, but still sold because they were the best thing around by a long shot (like Silicon Graphics). ...
View attachment 262502
Yeah, I think that's spot on. And I also think that if Nvidia were to get their way going down that path, it'll be the end of PC gaming - barring someone else stepping in to fill their shoes. PC gaming as a hyper-expensive niche just isn't sustainable as a business. If anything, the Steam Deck has demonstrated that the future of gaming might lie in the exact opposite direction, and that we might be reaching a point of "good enough" in a lot of ways.

I also agree that this situation presents a massive opportunity for AMD. Regardless of where RDNA3 peaks in terms of absolute performance, if they flesh out the $300-700 segment with high performing, good value options relatively quickly, they have an unprecedented opportunity to overtake Nvidia - as this pitch from Nvidia (as well as their earnings call) confirms that they've got more Ampere stock than they know what to do with, and they don't want to stomach the cost of cutting prices unless they really have to. If RDNA3 actually delivers its promised 50% perf/W increase, and die sizes and MCM packaging costs allow for somewhat competitive pricing, this could be the generation where AMD makes a significant move up from their perennial ~20% market share. That's my hope - but knowing AMD and their recent preference for high ASPs and following in Nvidia's pricing footsteps, I'm not that hopeful. Fingers crossed, though.
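For what that claimed 50% perf/W figure would actually mean, a quick sketch (the baseline numbers here are made up, just to show the two directions of the trade-off):

```python
# Two readings of a perf/W multiplier: more performance at the same power,
# or the same performance at less power. Baseline numbers are illustrative.

def perf_at_power(base_perf, base_watts, watts, perf_per_watt_gain):
    """Relative performance at a given power budget after the uplift."""
    return base_perf * perf_per_watt_gain * (watts / base_watts)

def watts_for_same_perf(base_watts, perf_per_watt_gain):
    """Power needed to match the old performance after the uplift."""
    return base_watts / perf_per_watt_gain

# Same 300 W budget, 1.5x perf/W: 1.5x the performance.
print(perf_at_power(100, 300, 300, 1.5))    # 150.0
# Same performance target: ~200 W instead of 300 W.
print(watts_for_same_perf(300, 1.5))        # 200.0
```

Of course, whether that headroom shows up as more performance, lower power, or lower prices is entirely a product decision.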

Maybe they see the discrete market contracting long-term, and are laying the groundwork to pull the same net out of fewer units. Still crappy from our perspective, but with a much lower "Wut!?" factor. Or it's an allocation thing. The chips are more valuable for HPC, so it's got to be worth Nvidia's time to even make consumer-bound chips.



Desktop market share is only going to shrink, with or without NV's influence. Laptops are more convenient, phones even more so. I feel a little gross defending NV even a little, but they may not have much of an option, broadly speaking.
This is absolutely true. There's also the simple fact of GPUs largely being good enough for quite a lot of things for quite a while. We're no longer seeing anyone need a GPU upgrade after even 3 years - unless you're an incorrigible snob, at least. Lots of factors play into this, from slower growth in graphical requirements to the massive presence of AA/lower-budget/indie games in the current market to open-source upscaling tech to lower resolutions in many cases still being fine in terms of graphical fidelity. It could absolutely be that Nvidia is preparing for a future of (much) lower sales, but I don't think they can do that by pushing for $1500 GPUs becoming even remotely normal - the number of people who can afford those isn't going to increase anytime soon. Diversifying into other markets is smart, though, and AMD also needs to do the same (not that they aren't already). But the impending/already-happened death of the 1-2 year GPU upgrade cycle won't be solved by higher prices and more premium products.
 
Joined
Sep 13, 2021
Messages
86 (0.07/day)
But that's precisely where this specific interpretation falls apart: Nvidia specifically cancelled the Titan programme, and renamed the Titan-equivalent cards Geforce. ...
I understand; that's within my logic. Professionals care about big VRAM and NVLink for memory pooling. Whether you call the product Titan or 40x0 does not matter. The 4090 has both, so it's the choice. Put 2-3 cards in a graphics server, and the software/hardware will pool everything: CUDA cores, RT cores, memory. A powerful and cheap solution for small companies and advanced students. The price was always >$1000, no change at all.
It was my logic that nVidia is using PR to get gamers to buy the 40x0 products too. And successfully. The explanation lies within the gamer community. Professionals mostly knew what they were buying, and what for.

Outside of this being a marketing/branding move that borders very close on being exploitative in and of itself, this has other problems: the Titan branding also brought with it price separation. ...
I agree partially. RT on gaming cards was an innovation based on technology nVidia used in the professional markets. They implemented it in gamer hardware to realize synergies and market leadership (AMD is clearly behind in this area). I remember all the discussions about the 20x0 series. My position was: wait some years, and there will be no high-end card without ray tracing. This guarantees nVidia technological leadership in the gaming sector without additional development costs, as they need RT/CUDA/tensor for the HPC sector anyway. So what is going on? nVidia raises the limits between semi-professional use and high-end gaming, to serve different peer groups with one product. And this from the beginning. CUDA core support was even enabled in the gaming drivers back to Maxwell. Every student who needs ray tracing or CUDA power could do some crazy calculations with cheap gaming cards. My first nVidia card was a 970 for exactly that reason; before that I had Radeon. Once you are adapted to a special software/hardware environment, you will not change easily.
People who spend >$1000 for gaming - well, it's ok. What I don't like is nVidia's low- and mid-tier pricing. But that's an opportunity for AMD. So there is something for everyone.
 
Last edited: