Saturday, December 28th 2024

Potential RTX 5090 and RTX 5080 Pricing in China Leaks

What we've all been waiting for might just have appeared: the pricing of NVIDIA's upcoming graphics cards. @wxnod has posted a single screenshot on X/Twitter of what could be the MSRPs of the RTX 5090 and RTX 5080 in China. The MSRP of the RTX 4080 was 9,499 RMB, and the RTX 5080 appears to be not much higher at 9,999 RMB, which still equates to about US$1,370, though do note that Chinese prices include a 13 percent sales tax/VAT.

Now as for the RTX 5090, things won't be as rosy. The RTX 4090 had an MSRP of 12,999 RMB in China, and the RTX 5090 comes in at an insane 18,999 RMB, or about US$2,600. That's a not-insignificant price hike of 46 percent over the RTX 4090, which might make it the most expensive consumer graphics card ever released. We'd suggest taking these prices with a helping of NaCl, just to be on the safe side. According to the screenshot, the cards are expected to be available sometime in January.
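For reference, here is a quick sketch of the arithmetic behind those figures; the ~7.3 RMB per USD exchange rate is our assumption based on late-2024 rates, and the prices are the rumored MSRPs above.

# Sketch of the currency/VAT math behind the rumored prices (Python).
RMB_PER_USD = 7.3   # assumed approximate late-2024 exchange rate
VAT = 0.13          # China's 13 percent VAT, included in the RMB MSRPs

rtx_5080_rmb, rtx_5090_rmb, rtx_4090_rmb = 9_999, 18_999, 12_999

print(f"RTX 5080: ${rtx_5080_rmb / RMB_PER_USD:,.0f}")          # ~US$1,370
print(f"RTX 5090: ${rtx_5090_rmb / RMB_PER_USD:,.0f}")          # ~US$2,600
print(f"RTX 5090 pre-tax: ${rtx_5090_rmb / (1 + VAT) / RMB_PER_USD:,.0f}")  # ~US$2,300
print(f"Hike over RTX 4090: {rtx_5090_rmb / rtx_4090_rmb - 1:.0%}")         # 46%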

Update 15:34 UTC: A second picture was posted in the same thread on X/Twitter that shows the expected launch months of the lower-tier RTX 5000-series cards as well and it appears to be taken from a video.
Sources: @wxnod X/Twitter, @harukaze5719 X/Twitter for additional details

173 Comments on Potential RTX 5090 and RTX 5080 Pricing in China Leaks

#101
Knight47
Merluz: The real question is what games people want to play with such ultra-expensive hardware.

The cost of making AAA games has skyrocketed, their release timelines have stretched, and their quality and fun are very meh. An over-$1,000/EUR GPU to play what?
Some notorious 2024 success like Balatro?
Or to justify enabling some graphics option in the settings, like that Trojan horse of RT?
For 2K 144 Hz+ you need these expensive GPUs. FF14 should run on a potato from what I've heard, but I need an $800-1,000 GPU (4070 Ti Super) for a stable 2K 120 FPS.
#102
Vayra86
Onasi: Very based, honestly. And quite in the vein of actual PC enthusiasts of old. What the fuck happened? We used to buy cheaper hardware and tinker with it for more performance, we tweaked the shit out of games for more frames at comparable image quality, hardcore FPS enthusiasts used console commands to run Q3 with flat textures to frag more efficiently on their old 800x600 CRTs. We had tons of genres that weren’t HW intensive, yet QUINTESSENTIALLY PC. And some that WERE intensive AND PC-centric. Consoles used to be memed on. Now it’s all about “I bought this 9999$ GPU and a scalped 800$ CPU, turned up PT to Ultra at 16K in the latest AAA console slop and shit doesn’t work good. Fucking NVidia, PC gaming is dead”. I tell ya, we didn’t gatekeep hard enough.
Nah, let the mainstreamers mainstream, it's their loss; my money is in my pocket and I'm gaming like never before. Life's good, honestly. There's more content than you can possibly play. You need to filter a LOT more, really; that's the biggest issue. The filtering includes people, sometimes. There's just a high likelihood you run into people who don't and never will get it, because they just don't want to. They're too busy figuring out the MSRP and availability of their next purchase.

Between the Deck and my PC I have access to... everything. Literally everything. I haven't even got proper time to play new releases; I frankly barely even look at them until months or years later.
#103
Prima.Vera
kony: A lot of games can be played on an iGPU, even new ones. Most demanding games are AAA slop not worth playing. It's hard to justify going high-end nowadays.
My cousin has a laptop with a mobile RTX 4070 with 8 GB of VRAM, and he plays ALL of the newest games on it at maximum details at ~100 FPS.
People are too brainwashed nowadays, thinking you need 20 GB of VRAM to play the newest and the latest properly.
I play at 3K with a 10 GB card, and so far I have yet to find a single game that runs out of VRAM.
People tend to forget that game engines cache most of the available VRAM without actually using it all.
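That allocated-versus-used distinction is easy to check for yourself. Below is a minimal sketch using the NVML Python bindings; it assumes an NVIDIA GPU and the nvidia-ml-py package, and note that NVML reports memory allocated by processes (which includes engine caches), not the working set a game actually needs.

# Query VRAM totals via NVML (pip install nvidia-ml-py).
# "used" counts allocations by all processes; an engine that caches
# aggressively shows high usage without strictly needing it all.
from pynvml import (
    nvmlInit, nvmlShutdown,
    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)   # first GPU
    mem = nvmlDeviceGetMemoryInfo(handle)    # bytes: .total / .used / .free
    gib = 1024 ** 3
    print(f"total:            {mem.total / gib:.1f} GiB")
    print(f"used (allocated): {mem.used / gib:.1f} GiB")
    print(f"free:             {mem.free / gib:.1f} GiB")
finally:
    nvmlShutdown()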
#104
Vayra86
Knight47: For 2K 144 Hz+ you need these expensive GPUs. FF14 should run on a potato from what I've heard, but I need an $800-1,000 GPU (4070 Ti Super) for a stable 2K 120 FPS.
The real question is why you tell yourself you need 120 FPS stable at that specific res. That's a want, not a need.
#105
Why_Me
lepudruk: And yet another leak, this time from Australia (pricing comparable):
Big difference in prices between Australia and the US. Australia, New Zealand, and Canada are used to getting hosed, so this shouldn't be a shocker to them.
#106
Bomby569
So many leaks, and they're all different; could it be... No! Let's comment on them all as if they're real, that seems like the smartest thing to do here. /s
#107
Garrus
Vayra86: The real question is why you tell yourself you need 120 FPS stable at that specific res. That's a want, not a need.
Because 120 FPS is perfect for OLED; it's smooth and gives you a better-than-console experience. I play all my games at a locked 120 FPS on a 360 Hz monitor.
TheinsanegamerN: Highly unlikely; not only would that create a dual market with little benefit for AMD in terms of sales (we went through this with socket FM2+ and AM1 being out at the same time), but Strix Halo would be a very expensive chip and would require LPDDR5X memory, which only comes soldered. At that point, why not just buy a 7600? It'd be cheaper and you'd have upgradeability and dedicated VRAM.

Lies! I was told that 8 GB was plenty and only lazy game devs couldn't make it work? You mean that having less memory than the consoles became... a bottleneck?!?

Luckily for you, we have choice as consumers. You don't HAVE to buy the 5090! You'll still have the 5080, 5070, 5060, 5050, all their Ti/Super variants, the RTX 4000 lineup left in the pipe, gently used RTX 3000s, and at the 5070 level you'll also have AMD and hopefully Intel with their GPUs, and with competition, prices are likely to be lower.

Why would you pay $1,100 for a console that can barely maintain 30 FPS in modern games, is entirely reliant on one expensive software store, and loses out on all the other benefits of PC software, just because nvidia bad?

You know you don't have to buy the 5090, right?
The point of Strix Halo is to advertise the true cost of building a system.

Intel's new motherboards and CPUs are $300 each for the cheapest ones. That's $900 total for an RTX 4060 plus an Intel 245K plus motherboard.

AMD literally has $900 USD to play with.

If I can have a mini-ITX motherboard with a built-in Strix Halo APU + 32 GB of RAM, I'll buy it for $900.
#108
Daven
L0stS0ul: The more you pay, the less you get. :rolleyes:
Nvidia has been subbing out CUDA core increases gen over gen with more fluff like Tensor and RT cores. I for one would like a pure raster GPU, but it looks like that ship has sailed across all GPU manufacturers.
#109
oddrobert
Hi FI monopoly, I've already contributed too much to evil corp over the years. Please don't be a gamer.
Nothing to envy.
#110
lepudruk
Vayra86: [...] That's a want, not a need.
It's irrelevant, as long as you pay your own bills and can afford such luxuries. Money has little value in the afterlife, you know?
#111
Sound_Card
It doesn't matter if Nvidia charges 3k for a 5090 and 1.5k for a 5080. The green sheep will still buy them like hotcakes and tell everyone, and themselves, that they're helping industry progress by buying the best and fastest.
#112
Kelben
Activating translation at the relevant moment in the video, the voice simply announces that the expected prices are those shown in the image, without any indication of a possible source. We can therefore only rely on the general content of the video, which seems to be a summary of all the rumours already circulating on the web. As it stands, the most logical conclusion is that the 18,999-yuan and 9,999-yuan prices shown in the image are merely the personal speculation of the video's author.


hardwareand.co/actualites/breves/1400e-pour-la-rtx-5080-2650e-pour-la-rtx-5090-rumeur-de-prix-round-2
#113
Dr. Dro
Sound_Card: It doesn't matter if Nvidia charges 3k for a 5090 and 1.5k for a 5080. The green sheep will still buy them like hotcakes and tell everyone, and themselves, that they're helping industry progress by buying the best and fastest.
Yeah, we'll be helping industry progress by buying the other multi-billion-dollar international megacorporation's product instead :kookoo:
#114
Sound_Card
Dr. Dro: Yeah, we'll be helping industry progress by buying the other multi-billion-dollar international megacorporation's product instead :kookoo:
This is a flatly disingenuous argument. It doesn't matter that Intel or AMD have billion-plus-dollar valuations. You are making a false equivalence and intentionally muddying the argument.

Nvidia is exploiting its consumer base via brand loyalty and market psychology. Nvidia can justify its outrageous prices precisely because its customer base rationalizes them. This is just like Apple, but even worse.
#115
freeagent
Sound_Card: Nvidia is exploiting its consumer base via brand loyalty and market psychology
Or.. just hear me out here..

They make a better product..

:D
#116
yzonker
Prima.Vera: My cousin has a laptop with a mobile RTX 4070 with 8 GB of VRAM, and he plays ALL of the newest games on it at maximum details at ~100 FPS.
People are too brainwashed nowadays, thinking you need 20 GB of VRAM to play the newest and the latest properly.
I play at 3K with a 10 GB card, and so far I have yet to find a single game that runs out of VRAM.
People tend to forget that game engines cache most of the available VRAM without actually using it all.
Try Indiana Jones. It brought my 3080 Ti to its knees by running out of VRAM on a 3440x1440 display. I had to reduce the texture buffer quite a bit (to medium, IIRC) to get it running reasonably well.
#117
bgx
Well, we can't be too sure about the prices until they're officially announced (since they can change the MSRP pretty much until the last minute).
So let's wait and see.

However, these last few years Nvidia has priced its GPUs based on performance, including relative to the previous gen, which doesn't see much of a price reduction.
Hence we get more and more expensive GPUs, with close to zero (10% at best) perf/US$ improvement.
Performance increases, but prices increase as well.
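To make the flat perf-per-dollar point concrete, here is a small illustrative calculation; the relative-performance and price numbers below are hypothetical assumptions, not measurements.

# Hypothetical example: when price scales with performance,
# perf-per-dollar barely improves (and can even regress) gen over gen.
gens = {
    "previous gen": {"relative_perf": 100, "price_usd": 1599},  # assumed baseline
    "new gen":      {"relative_perf": 146, "price_usd": 2600},  # ~46% faster, ~63% pricier
}

for name, g in gens.items():
    perf_per_dollar = g["relative_perf"] / g["price_usd"]
    print(f"{name}: {perf_per_dollar:.4f} perf points per US$")
# previous gen: 0.0625, new gen: 0.0562 -- roughly a 10% regression here.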

Maybe young people think it's justified, but those of us who've been around a while were used to much better improvement per US$, maybe 33-50% when a new gen came out.

AMD could probably cut prices, but they don't seem willing to cut by more than 10%.
Intel may be the savior here, with the B580 priced very adequately, and it could even be discounted further.

Bottom line: following the trend, US$2,600 for the RTX 5090 and ~US$1,250 for the half-5090 (= 5080) seems in line with what we've seen.

Personally, I won't invest more than 500 euros in a GPU, even though I keep it for years. Nvidia can die with this strategy.

While I shouldn't care much how others waste their money, if people keep overpaying for GPUs, the prices will never go down FOR ME.

Hence, please stop buying overpriced stuff, or else they will never stop with this pricing strategy.
#118
Sound_Card
freeagent: Or.. just hear me out here..

They make a better product..

:D
AMD's prices are set according to the features and performance they offer relative to the competition. The 7900 XT is often just a few bucks more than the 4070 Super, but guess which one is outselling the other 30 to 1?
The 7800 XT is $100 cheaper; while its performance is not as good, the extra 4 GB offers better longevity, plus frame generation. Doesn't matter. Market psychology. There is no other industry that rationalizes the way this particular segment of the PC industry does.
#119
R-T-B
3valatzy: illegal
Interesting take, but with no foundation in the same "free market" you argue exists.
#120
Dr. Dro
Sound_Card: Nvidia is exploiting its consumer base via brand loyalty and market psychology. Nvidia can justify its outrageous prices precisely because its customer base rationalizes them. This is just like Apple, but even worse.
No. It's a fact that AMD has no answer to the RTX 4090 over 2 years later, that this will continue to be the case at least until UDNA comes around, and that's assuming UDNA isn't a flop either. It is also a fact that throughout the stack, GeForce cards tend to be faster, more efficient, more feature-complete, more compatible, and definitely offer superior software support; usually a combination of these characteristics, and in certain segments all of them at once. This is why people buy them, and 90% of the dGPU market has spoken loud and clear. It's up to AMD to listen. They must stop neglecting compute and ship ROCm for consumer Radeons ASAP, they must stop neglecting RT, they must stop neglecting older APIs, etc.

If the only provider of high quality products decides to charge for the privilege... tough luck, it's supply and demand.
Sound_Card: AMD's prices are set according to the features and performance they offer relative to the competition. The 7900 XT is often just a few bucks more than the 4070 Super, but guess which one is outselling the other 30 to 1?
The 7800 XT is $100 cheaper; while its performance is not as good, the extra 4 GB offers better longevity, plus frame generation. Doesn't matter. Market psychology. There is no other industry that rationalizes the way this particular segment of the PC industry does.
Critical mistake. NVIDIA and AMD are not on equal footing when it comes to drivers and features; by the time 16 GB matters, the 7800 XT will have long since faded into irrelevance due to its low performance. This is true even for the RTX 4070 Ti Super and the duo of 4080 cards. Moreover, FSR and its frame generation technology are rudimentary compared to DLSS and even Intel's XeSS. The evidence is everywhere, and thankfully TPU now has image quality comparison reviews for all 3 major upscalers... FSR always comes dead last behind DLSS and XeSS, with the rare few exceptions where it usually still takes a back seat to DLSS.

www.techpowerup.com/review/stalker-2-dlss-vs-fsr-vs-xess-comparison/
The FSR 3.1 upscaling implementation is extremely underwhelming in this game. At 4K resolution in its "Quality" mode, the small details in tree leaves, vegetation and of thin steel objects are noticeably degraded, the overall image looks very blurry, even when standing still[...]the FSR 3.1 image is completely broken, producing simply a wrong image quality with extreme loss of all details, it looks like an oil painting
Very reassuring! Sounds pleasant to look at; excuse me while I go out to buy a Radeon card. And if you dig through all of these comparison reviews, you'll find there are maybe two games where it's not a disaster among the past 10 they reviewed. It's really God of War Ragnarök, and that's about it.

The only reason anyone should buy a 7800 XT over a 4070 Super is if they use Linux, making the Windows driver situation inconsequential, and even then, thanks to the AI push, Nvidia has been spending some time on its Linux stack. If AMD doesn't watch out, their Linux advantage will be reduced to just an "open source advantage", which will surely strain things with anyone who stops short of being a Stallmanian free-software activist.

If this sounds like doomposting, that's because the situation really is that bad. If 90-plus percent of the market is choosing to purchase the competition's product at inflated prices, the problem is YOU. And this is what AMD needs to understand.
#121
FoulOnWhite
If I had the cash to pee up the wall, I would buy a 5090 even if it was $3,500. Why not? If you have a six-figure bank account, would you even miss it? How many hobbies do you have to soak up your spare cash? I'm poor, but I really wish I had money to blow on these top-shelf GPUs.
#122
Krit
Dr. Dro: The only reason anyone should buy a 7800 XT over a 4070 Super
The main problem is that when the RX 7800 XT was released, there was no such thing as a 4070 Super. The RX 7800 XT at the time was a better deal than the vanilla RTX 4070. Right now the RX 7900 XT is priced very close to the RTX 4070 Super, and overall the RX 7900 XT is the better deal! 320-bit vs. 192-bit low-end slavery!
#123
Dr. Dro
Krit: The main problem is that when the RX 7800 XT was released, there was no such thing as a 4070 Super. The RX 7800 XT at the time was a better deal than the vanilla RTX 4070. Right now the RX 7900 XT is priced very close to the RTX 4070 Super, and overall the RX 7900 XT is the better deal! 320-bit vs. 192-bit low-end slavery!
I looked at PCPartPicker, and it seems there is currently a $100 USD delta between them (cheapest 4070S vs. cheapest 7900 XT); given their difference in stature, I tend to agree. The extra horsepower will let you have the same experience while forgoing upscaling, and even with AMD's RT deficit, it should still hold up the same frame rates on average; at 4K, even with the compute deficit, the frame rates should still be a little better thanks to the generous memory bandwidth. In this scenario, I would pick the 7900 XT. The 7800 XT, however, has no such redeeming qualities as a product.

In my opinion, $100 is acceptable wiggle room, since a GPU is not something you buy every month or even every year. But it must be noted that the 4070 Super is still $100 cheaper while providing comparable performance in more scenarios than it ought to, given that it is basically 3 tiers below. And just like that... you might have made the Nvidia card attractive from a financial perspective, and that is mind-blowing.
#124
Hankieroseman
TumbleGeorge: There is no such thing as freedom of choice in a consumer society. You are obliged to buy. Someone writes in the topic: "I have a card that is enough for me for the next 5 years. I will not buy!" And the next day he raids his child's education savings and buys himself an RTX 5090. He has no choice and no free will, even though he claims so in the thread.
I'm putting my 5090 card in my will for my grandson.

Snoop Dogg: "...like peanut butter 'n jelly on gold." ... "Y'all giving it away too fast, (Nvidia) slow down."
Snoop Dogg Bowl Ad
#125
Prima.Vera
yzonker: Try Indiana Jones. It brought my 3080 Ti to its knees by running out of VRAM on a 3440x1440 display. I had to reduce the texture buffer quite a bit (to medium, IIRC) to get it running reasonably well.
I am, and he is also playing it right now. No RT, no issues.