
Will You Buy An RTX 4090?


  • Yes, on release even if I have to buy one from a scalper

    Votes: 3 1.4%
  • Yes, on release if I can get one at close to MSRP

    Votes: 17 7.9%
  • Yes, eventually

    Votes: 10 4.6%
  • Maybe

    Votes: 16 7.4%
  • No

    Votes: 169 78.2%
  • Yes, I will buy one for other uses than gaming

    Votes: 1 0.5%

  • Total voters
    216
running games at 1440p at 60 fps and medium-high settings, which is already above average gamer needs
That is your opinion.

I prefer high fps (120+) gaming @1440p in several games..

Example, GTA V looks like a sh1t slide show @ 60fps imo.....
 
That is your opinion.

I prefer high fps (120+) gaming @1440p in several games..
The Steam hardware and settings survey isn't an opinion, and you are a niche, a rather small niche too. You have money for that kind of hardware; not everyone does. Also, people play tons of older games too.

Example, GTA V looks like a sh1t slide show @ 60fps imo.....
Okay, edgelord.
 
I'm getting a 4090 delivered this week, upgrading from a 2080 Ti. I bought a 4K OLED mid-30 series and took the "just wait" approach. I'm looking forward to 120fps 4k gaming.
 
Maybe interesting for some that bought a 4090


[Attachment: Screenshot 2022-10-17 182453.png]
 
Again, any data to support that? You are just talking out of yer bum.
Since you started this nonsense, you prove yours first. Go on, let's see you try.

Sorry for being harsh, but it really grinds my gears when Lex doesn't seem to understand that there is a supply and demand curve with an optimal point set by quantity and price. It's not just that curve either: the company also carries fixed overhead, which doesn't scale linearly with the number of units produced. This is truly economics 101, not some advanced stuff. The market that could absorb such a high expense is also dynamic and unstable; the mining boom ending basically means that market got smaller, and now the RTX 4090 has to be carried by gamers/prosumers alone. Despite the fact that Ada cards are basically cut-down enterprise cards, nVidia just can't charge pro-card margins in the consumer market. Basically all consumer-tier products are inherently lower-margin than enterprise ones, and those in turn are lower-margin than data center ones. nVidia most likely puts the most R&D into data center (that's why it already gets the newer arch and process node), then scales it down to enterprise, checks yields and stability, and only after that long development process do the chips finally end up in gamer products like the RTX 4090.

Your typical gaming bro is actually the most unstable and intolerant market, because gaming is just entertainment, and if he doesn't have cash for it, entertainment and superfluous expenses get axed first. Therefore prices can't be too high even for the RTX 4090, which already targets rich people with cash to burn. And hello, we are now in a worldwide recession, after what I would call a temporary boom (post-lockdown people seemingly wanted to spend a lot, and stocks went high for no reason). So nVidia won't sell as many cards in general, and units sold will be much lower for the highest and high tiers in the consumer market.

Enterprises and data centers don't care so much, because they must buy cards to make dollars; for us it's just gaming, maybe streaming, and perhaps occasionally some amateurish video production. Actual numbers would let us see that effect more clearly, but Lex seems to be saying that this isn't how nVidia works and that there are many people wanting and actually buying xx90-tier cards. He just seems disconnected from reality. Not to mention that besides the worldwide recession, the whole European market had energy prices climb several times over, and people may legitimately go broke over their energy bills. So, unlike ever before, people are super vigilant about the wattage of cards, which again makes the RTX 4090 a less appealing product. And even if you have money for energy, perhaps you will save it anyway, just to show solidarity. On top of all this, used cards got faster and cheaper. You can get a GTX 1080 for 250 EUR and it beats the RX 6600, not to mention that it is capable of running games at 1440p at 60 fps and medium-high settings, which is already above the average gamer's needs. Therefore RTX xx90 cards aren't a necessity or a requirement for entertainment, but rather just something nice if you feel like you can and want to splurge on something superfluous. The reality is that this audience got squeezed hard and nVidia will have to take a haircut. The Dutch tulips withered, the music stopped...
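To make the "economics 101" point concrete, here is a toy sketch of a profit-maximizing price under a downward-sloping demand curve with fixed overhead. Every number in it (the demand curve, unit cost, overhead) is made up for illustration, not real nVidia data:

```python
# Toy model: demand falls as price rises, and fixed overhead does not
# scale with units produced. Pricing a halo card far above the optimum
# shrinks profit fast. All figures below are hypothetical.

def units_demanded(price, a=20000, b=10):
    """Hypothetical linear demand: higher price -> fewer buyers."""
    return max(0, a - b * price)

def profit(price, unit_cost=700, fixed_overhead=2_000_000):
    q = units_demanded(price)
    return q * (price - unit_cost) - fixed_overhead

best_price = max(range(700, 2001), key=profit)
print(best_price, profit(best_price))  # the sweet spot is in the middle, not at the top
```

With these made-up numbers, the profit-maximizing price lands well below the highest price the richest buyers would tolerate, which is the whole argument above in miniature.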
That was adorable. And no, that nonsense doesn't count.

Maybe interesting for some that bought a 4090


[Attachment 265852]
That's very clever. Should make a lot of people's installs much easier.
 
Since you started this nonsense, you prove yours first. Go on, let's see you try.
And so I did, even though it was very obvious. I checked the 2021 annual report and found out that in 2021 nVidia made a bit more than half its money from the gaming sector. Big surprise: mining was in the "others" category with a tiny percentage. You would think nV sold more cards to "gamers" (well, to miners, that is), but that's false, because the gaming revenue share has remained basically the same since 2016, and even absolute revenue changed very little. What did grow a lot was the data center revenue share, but that's neither gamers nor miners, so we don't care about it here.

I wanted to find out how many of each card were sold, but sadly wasn't able to find that data, so I had to resort to the next best thing: the Steam survey. I thought about including price in the equation too, but there's no good data about it, and plugging MSRPs into the table would have been disingenuous at best. It's also not really critical for seeing roughly where the cut-off point is, where cards sold clearly drop off. Here is the chart:
[Chart: Steam survey share by RTX 30-series card model]

Here you see the percentage of Steam users playing on each card model. I didn't include the RTX 3050 Ti, because it wasn't launched worldwide and was most likely an OEM-only deal. I had to lump the normal RTX 3080 together with the 12GB model, because the Steam HW survey sees them as the same card, despite their core count and vRAM differing. And the RTX 3090 Ti has less than a 0.15% player share, so it was folded into the "others" category and it's unclear how many were sold.

As you can see, the most popular card was the RTX 3060, which managed to reach a 3.40% share among Steam gamers. Fewer RTX 3050s were sold, perhaps due to worse value per dollar (it was worse value than the RX 6600), and perhaps because its performance, or the perception of its performance, didn't meet expectations (it was usually a 75-watt card needing no power connector). Beyond the RTX 3060, sales dropped dramatically, literally by half, despite there not being a 2x price delta. Surprisingly, the RTX 3070 reversed that trend a bit, but then the RTX 3070 Ti sold less than half as well as the RTX 3070. Past this point strong diminishing returns are already observed, but there clearly exists a rich-people market with products to serve it. The RTX 3080 sold relatively well, but that's technically two cards, so the data is unreliable. Then we have the RTX 3080 Ti: it sold worse than the RTX 3070 Ti, but we no longer see sales dropping by multiples. The RTX 3090 sold worse than the RTX 3080 Ti, again not several times worse, though the decline seems to have accelerated slightly. And finally we have the RTX 3090 Ti; at this point we see the most dramatic reduction in units sold. How much is unclear, but it's less than a 0.15% share of all cards used on Steam, which works out to at least 3 or 4 times fewer units sold than the RTX 3090, despite there not being an equivalent price increase. So basically the RTX 3090 was the highest-end SKU that nVidia could ever hope to sell well. Obviously it didn't sell at MSRP, so I had to find the range of actual prices, and here is what I found:

We can see that over 2021 the price range on eBay was between 2599 and 3107 dollars. So what, you were right all along? Not at all. The data so far seems to indicate that you were right, but let's take a look at 2022 data, basically after ETH croaked:

Basically, the three-month range was just 1270-1700 dollars for the RTX 3090, and that's still the best-"selling" SKU among the highest-end ones. After the mining haze dissipated, we see that miners made the problem worse, but gamers clearly didn't want to pay those high prices. And let's talk about the RTX 3090 Ti: its price range was 1470-2000 dollars, and the market clearly thinks that's too much for it. On a three-month average, the RTX 3090 cost around 1523 dollars, and that was the most gamers would pay. The RTX 3090 Ti's average price of 1707 dollars was already too high for gamers. So basically ~1500 USD is the max that gamers will pay for a GPU, no matter how fast it is. Meanwhile, the most sensible cards for the vast majority of people were the RTX 3060 and RTX 3070. Those cards are now in the 400-600 USD range.
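The price figures quoted above can be sanity-checked with a few lines. Only the eBay ranges and the quoted 3-month averages come from the post; everything else is plain arithmetic:

```python
# Sanity check: each quoted 3-month average should sit inside its
# observed eBay price range, and comparing it to the range midpoint
# shows roughly where demand clustered.
ranges = {
    "RTX 3090": (1270, 1700, 1523),     # (low, high, quoted 3-month average)
    "RTX 3090 Ti": (1470, 2000, 1707),
}
for card, (lo, hi, avg) in ranges.items():
    assert lo <= avg <= hi, f"{card}: quoted average outside observed range"
    midpoint = (lo + hi) / 2
    print(f"{card}: range midpoint {midpoint:.0f}, quoted average {avg}")
```

Note the 3090's average ($1523) sits above its range midpoint ($1485), while the 3090 Ti's ($1707) sits below its midpoint ($1735), consistent with the claim that ~1500 USD is where gamer demand tops out.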

So yeah, a 3000-dollar GPU today would make zero sense, and no, it won't sell well at all, not even at the high end. Whether it would work out financially for nV even with low units sold is unknown, but most likely it won't. nVidia can make the most cash with RTX xx80-tier cards, granted they sell well; if that fails, then they have to rely on RTX xx60 or xx70-tier cards for most of the profit. It also looks like there may not be an RTX xx90 Ti-tier card anymore, as the RTX 4090 is already a huge GPU, which has more cores than their best GPU, the Hopper H100, and almost the same transistor count too. And no, nVidia won't release an even bigger die for consumers (or in SXM form factor). The Hopper H100 cannot be given to gamers, because we already have the Ampere and Ada archs; having a third arch just for the beefiest card wouldn't make sense, especially when Ada hasn't really paid off yet. It would only tarnish nV's reputation, not to mention ruin the reputation of their next generation of architecture, and they would have to throw together a new arch in basically no time without a lithography improvement. And this time just making dies bigger doesn't work, because we are already at hard power limits.

That was adorable. And no, that nonsense doesn't count.
My advice for next time: stop being a clown, gather data, have some common sense about basic economics, and don't assume that crazy hype lasts forever.
 
And so I did, even if it was very obvious. [...] we already are at hard power limits.
That was a very lengthy explanation, matched only by its uselessness. You are disregarding several points, including worldwide inflation, elitism, and the fact that even non-rich/wealthy people will pay for the best. Your attempt at sidestepping is refused and disregarded.
 
When will games need it?

I don't see the point of having such a card and running games at 500 fps.

I would need a triple monitor setup at 4K, and to change my entire PC.

I am a gamer and use my GPU for gaming, not mining or working.

Maybe in 2-3 years I will, but I am good for now (and there will be new cards by then).

I am waiting for PSVR2.

I skipped the 3000 series and will do the same with the 4000.
 
That was a very lengthy explanation, matched only by it's uselessness. You are disregarding several points, which include worldwide inflation, eliteism and the fact that even non-rich/wealthy people would pay for the best. Your attempt at sidestepping is refused and disregarded.
Okay, show me your data. Either you do that, or you are just an irrelevant clown living in the Metaverse.
 
I'm getting a 4090 delivered this week, upgrading from a 2080 Ti. I bought a 4K OLED mid-30 series and took the "just wait" approach. I'm looking forward to 120fps 4k gaming.
can you identify your 4k OLED display for 120fps gaming please?
 
can you identify your 4k OLED display for 120fps gaming please?

Probably an LG C1 or CX, as they are the best for gaming currently. The C2 is probably too recent.
 
Are we really expecting the future 4090 Ti to clock in at USD$2200 per wccftech?
 
Tens of them, eventually, provided they can show benefits beyond a 3090.

The performance isn't really relevant beyond a certain point, but 24GB of VRAM has plenty of use in workstations, and the RTX A5000 is woefully expensive (3x the price of a 3090 these days, and as slow as a 3070).
 
Are we really expecting the future 4090 Ti to clock in at USD$2200 per wccftech?
That'll be more like $5500+ here in Gougelandastan, as the ROG Strix RTX 4090 is already at a nauseating $4399 Gougelandastani Plunketts.
 
I am waiting for the 4090 Ti since I play at 4K but the 4090 is a great GPU too!
 
I am waiting for the 4090 Ti since I play at 4K but the 4090 is a great GPU too!
I doubt there will be one. At best one with slightly increased clock speeds, but the 4090 already uses too many watts, so it's not very likely.
 
There is no good reason not to wait for RDNA 3 to compare.
 
Sold out. Nvidia should price the next generation at $3000; they are stupid if they don't.
 
Just gonna chime in that my vote is an emphatic 'No'. I have my eyes set on AMD's offerings, namely the RX 7900 XT (since I'll only be able to buy one from February onward; I'll be in Canada for the next few months, and I might even see if AMD has anything on the RX 7950 XT by then). I suspect it prolly won't be able to match the RTX 4090 in RT, but for pure rasterized performance (no DLSS, no FSR) I expect it to come close to or match the RTX 4090.
 