
NVIDIA has revealed the prices for the RTX 5090, 5080 and 5070


So ngreedia still has not learned. OK, hard pass... The stack should have been 1200/750/500/350, but they will learn the hard way if, as I'm hoping, users en masse see sense and don't pay these prices. I also sincerely hope these cards flop from a perf POV, as in they end up only marginally faster than the previous gen, and only thanks to higher power draw. Unless I can finagle a deal like I did for my 4080, where I actually only paid £300 for it, I will pass on this cycle.
 
The stack should have been 1200/750/500/350, but they will learn the hard way if, as I'm hoping, users en masse see sense and don't pay these prices.
I too enjoy science fiction. NVidia will be in shambles, no doubt, won’t sell a single card.
 
I wouldn't go that far... it will be interesting to see if there's a run on the 40 series if those cards get significant price drops - people who don't care about frame gen and are more interested in raw rendering power have little to look forward to with the RTX 50 cards (i.e. as a generational leap in terms of core fps/MHz it seems minor compared to, say, going from RTX 20 > 30), especially if availability isn't great early on and stock hoarding / scalping comes into play.

Although, that said, as Nvidia haven't actually moved the real grunt performance forward, this does represent a good opportunity, especially for AMD, to catch up a bit on RT and ML performance, at least in this round / generation.

On a side note, it will be interesting to see how the reviews for these end up summing things up - a lot of the outcome really comes down to performance vs cost. I think, generally, ignoring DLSS and the fake-frame generation tricks / hacks, reviewers are going to struggle to find much in the way of architectural performance improvements - I'm hoping I'm wrong - as it seems Nvidia spent a lot more time working on the tensor / AI / DLSS / inference bits with this series. Depending on the games people will be playing, that could literally have no value to some potential customers.
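
For what it's worth, the value math reviewers tend to land on is simple enough to sketch. A minimal example - every figure below is an invented placeholder, not a benchmark result or a real price:

```python
# Hypothetical perf-per-dollar comparison. All figures are invented
# placeholders, not benchmark results or real prices.
cards = {
    "last-gen card": {"avg_fps": 100, "price_usd": 950},
    "new card":      {"avg_fps": 115, "price_usd": 1000},
}

for name, c in cards.items():
    value = c["avg_fps"] / c["price_usd"] * 100
    print(f"{name}: {value:.1f} fps per $100")
```

If the fps-per-dollar figure barely moves between generations, the "improvement" is mostly a bigger number on the box.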
 
If the 54x$ for the 5080 holds and it performs like the 4090, like Jensen said, then that's OK with me.
But going from 54x$ to $1,999 for the 5090 is insane. Looking back at the 1080 / 1080 Ti (I owned both), the 1080 was VERY good for the money and VERY quiet compared to the 1080 Ti.
So if the 5080 runs whisper quiet, I would prefer it to the 5090.
 
If the 54x$ for the 5080 holds and it performs like the 4090, like Jensen said
He said the 5070, and only with DLSS 4 and frame gen. So, not really a 4090.

The stack should have been 1200/750/500/350,
NGL, I'd say we'd better start forgetting about 350 for anything mid-range.
 
I too enjoy science fiction. NVidia will be in shambles, no doubt, won’t sell a single card.
But "en masse" doesn't necessarily mean "the majority".
 
I'll wait until AMD has made up their mind about whatever price range the RX 9000 series is going to get, and until reviews for everything are out. I don't care much for the ML/AI stuff, and I don't have high-res/high-refresh requirements, so that's irrelevant for me; I can probably just stick to either the 5070 or the 9070. It all depends on pricing.
 
Except the one I'm going to buy. :pimp: They're dirt cheap. I think I'll get 2, and upgrade that crappy 3090 that's 4 years old now.
I read you're sitting on golden eggs!!! I will polish the eggs for the small fee of a 5090 :)
 
But "en masse" doesn't necessarily mean "the majority".
Let's not be delusional here, we know how this will play out. By this time next year, the 5060 will climb into the Top 3 GPUs on Steam, and the rest of the stack will have a good presence in the Top 20. No AMD GPUs are likely to be found there. It's like watching a play you've already read the script of.
 
I'm just retired with a decent (nest egg) pension. I spend my money on tequila, beer and computer parts because I'm old and bored. I'll leave it all to #1 Grandson.
If I had golden eggs, I'd be cruising around in my Dodge Hellcat, or in my Suburban headed to the fishin' hole.
 
Only the 5090 is a decent generational uplift - and obviously its price is as well.

The rest looks like single-digit improvements with a lot of "AI" horse manure shovelling.

What we need in reviews are image quality and gameplay experience comparisons. Obviously you can get more FPS with other trade-offs.

Stagnation or even going backwards is the name of the game.
 
Genuine question:
If it only performs like that when you use DLSS + frame gen, is that still OK with you?
No.

If that performance isn't native, then it doesn't count. If the 5080 is genuinely only single-digit % faster than the 4080, then it's not worth anything, and I'll be sitting on my 6800 XT for another 2 years waiting for an actual upgrade path.
 
Well, the 5090 is crazy, but I had zero interest in it anyway. The rest - I can live with that pricing. The 5080 I'm interested in and will pre-order, because there just aren't many 4080s around here anymore and I need a new card badly.
 
I feel like the 40 series was a bigger jump - this is 30% more performance for 30% more power, plus some more fake frames inserted. It will be interesting to see if these actually move.

The 5070 Ti looks decent if you skipped the 4xxx series.
 
I feel like the 40 series was a bigger jump - this is 30% more performance for 30% more power, plus some more fake frames inserted. It will be interesting to see if these actually move.

The 5070 Ti looks decent if you skipped the 4xxx series.
That's expected, as this is on the same node. Last time we jumped from Samsung 8 nm to TSMC 4N.
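
To put the "same node" point in numbers: if performance and power both rise by the same ~30%, perf-per-watt doesn't move at all. A quick sketch with purely illustrative figures:

```python
# Illustrative only: equal percentage gains in performance and power
# leave perf-per-watt unchanged.
old_perf, old_power = 100.0, 320.0            # arbitrary baseline units
new_perf, new_power = old_perf * 1.3, old_power * 1.3

old_eff = old_perf / old_power
new_eff = new_perf / new_power
print(f"efficiency change: {(new_eff / old_eff - 1) * 100:+.1f}%")
# -> +0.0%: the extra performance is bought entirely with extra power,
# which is roughly what you'd expect without a node shrink.
```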
 
Depending on the raw performance, the 5090 might be worth it. If it's actually around 50% (they claimed 2x), it's worth it. I mean, roughly double the TFLOPs and RT cores might actually make it happen. Also, that 32 GB of GDDR7 memory will be even more of a leap for semi-open to open-world high-end games than current cards offer. I just hope the Suprim doesn't end up being like 2300-2500.
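
As a rough sanity check on that - the spec ratio and the scaling factor below are guesses, not measurements:

```python
# Back-of-the-envelope uplift estimate from spec ratios. Both numbers
# here are assumptions for illustration, not measured values.
spec_ratio = 2.0          # assumed: ~2x shader TFLOPs / RT cores on paper
scaling_efficiency = 0.5  # guessed: fraction of paper gains that become fps
expected_uplift = (spec_ratio - 1) * scaling_efficiency
print(f"expected real-world uplift: ~{expected_uplift * 100:.0f}%")
# ~50% with these guesses, but only independent reviews will tell.
```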
 
They can keep that 5090. Not paying Jensen's 50 series tax. It can sit and rot.
 
It was... in 2016.
Yeah, at this point it's up to AMD and Intel to fill the 350-500 range... and I'm not sure either of them actually wants to.

Intel has to keep prices low, because there's no way anyone would buy their GPUs otherwise - they're still not all the way there (in both hardware and software).

And AMD... honestly, I don't even know what to say about them. Their RX 9000 announcement wasn't exactly an announcement, more like an afterthought.
 
At the rate Nvidia keeps adding all these AI tricks/features, some games could start shipping with "AMD and Intel not supported".
Who cares about AMD users anymore when Nvidia has 90% market share?
Let's be real here: Nvidia really worked hard to introduce ray tracing and DLSS - many people laughed at them, even me - while all Intel and AMD do is copy and paste; they are the Samsung of this tech space. Nvidia deserves to charge whatever they want; they are a for-profit company and have no obligation to run a charity here.
 
At the rate Nvidia keeps adding all these AI tricks/features, some games could start shipping with "AMD and Intel not supported".
Who cares about AMD users anymore when Nvidia has 90% market share?
Let's be real here: Nvidia really worked hard to introduce ray tracing and DLSS - many people laughed at them, even me - while all Intel and AMD do is copy and paste; they are the Samsung of this tech space. Nvidia deserves to charge whatever they want; they are a for-profit company and have no obligation to run a charity here.
DLSS was huge. RT is still meh - to me it looks worse than conventional lighting, since it's a grainy mess. DLSS ray reconstruction let the real secret slip: you don't need RT, or at least full RT, if you have ML that can reconstruct lighting 100x faster with better realism.

Between TAA and RT, I feel like graphics have regressed a bit. Games from 2016, to me, look as good as, if not better than, some games today, and it's a little weird.

It's going to be interesting to see what happens to the market in this space. The 9070 might be well positioned to win back some market share here.
 
At the rate Nvidia keeps adding all these AI tricks/features, some games could start shipping with "AMD and Intel not supported".
Who cares about AMD users anymore when Nvidia has 90% market share?
Let's be real here: Nvidia really worked hard to introduce ray tracing and DLSS - many people laughed at them, even me - while all Intel and AMD do is copy and paste; they are the Samsung of this tech space. Nvidia deserves to charge whatever they want; they are a for-profit company and have no obligation to run a charity here.
They can implement whatever they want, but unless there is widespread API support, little will come of it. Game makers code to the platform, unless they're getting their development money from somewhere else - in which case I guess they wouldn't care and would make a vendor-specific game... although it would probably be released generally eventually if it was any good.

Whilst I applaud their ray-tracing efforts as being the best, at this point in time it is still a feature of dubious value, in the sense that a) the performance penalty is massive, and b) it isn't always an improvement in graphics quality (be it due to the game engine or other software / hardware limitations). This isn't like the move from 16-bit to 32-bit colour rendering, which some graphics chips struggled with more than others (or, if you were 3dfx, you just shrugged and did the best you could to play off not supporting it until you couldn't afford not to), because that was an overall improvement - i.e. nothing looked worse in any scenario.

DLSS I would never say I laughed at - I laughed more at the fact that the solution to improving performance at high resolutions is to render at a low resolution and then use logic to upscale the output: a black-comedy-like complete U-turn from how the GPU makers were thinking 20+ years ago, when anti-aliasing meant rendering higher and downsampling.
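
The U-turn is easy to see in raw pixel counts. A quick sketch, assuming DLSS Quality mode renders at roughly 2/3 of the output resolution per axis, against old-school 4x supersampling at 2x per axis:

```python
# Pixel-count comparison for a 4K (3840x2160) output: modern upscaling
# vs. old-school supersampling. Scale factors are per axis.
out_w, out_h = 3840, 2160
output_pixels = out_w * out_h

modes = {
    "DLSS Quality (~0.67x per axis, upscale)": 2 / 3,
    "Native (1.0x)": 1.0,
    "4x SSAA (2.0x per axis, downsample)": 2.0,
}

for name, axis_scale in modes.items():
    rendered = int(out_w * axis_scale) * int(out_h * axis_scale)
    print(f"{name}: {rendered / output_pixels:.2f}x the output pixel work")
# Roughly 0.44x vs 4.00x - the industry went from rendering 4x the
# pixels to rendering less than half of them.
```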
 
NGL, considering they'll have no competition in the high-end market, these are better prices than I expected. But it looks like raster performance is only up 20-30%, so it's a very 'meh' generation.

However, seeing Nvidia compare fake frames to even more fake frames, and based on that claiming 5070 = 4090, left a really bad taste in my mouth (again, but even worse than what they did with the 4000 series, where they at least put raster results in the slides too). This is plain deception.
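
To make that concrete with invented numbers: if one card generates one extra frame per rendered frame and another generates three, the fps chart mostly measures the multiplier, not the GPU:

```python
# Toy illustration of how frame generation skews fps comparisons.
# The rendered-fps figures are invented, not benchmarks.
def displayed_fps(rendered_fps, generated_per_rendered):
    """Displayed fps = rendered frames plus generated ones."""
    return rendered_fps * (1 + generated_per_rendered)

old_card = displayed_fps(rendered_fps=60, generated_per_rendered=1)  # 2x FG
new_card = displayed_fps(rendered_fps=40, generated_per_rendered=3)  # 4x MFG

print(f"old card: {old_card} fps displayed (60 rendered)")
print(f"new card: {new_card} fps displayed (40 rendered)")
# The chart shows 160 vs 120 in favour of the card that actually
# renders a third fewer real frames.
```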
 