
NVIDIA has revealed the prices for the RTX 5090, 5080 and 5070

Nvidia is set up to do a Cisco circa 2000. Massively overconfident because of their insane valuation, and sales of products that have little front-end $$$ use case.

The 5090 is expensive, but at least it has a big uplift in hardware specs compared to the 4090. It's worth $2k if the 4090 was worth $1.6k.

The 5070 only has 4.35% more shaders than the 4070, plus GDDR7. In the real world at native 1440p, I doubt there will be much difference between them.
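For the curious, that 4.35% falls straight out of the announced CUDA core counts (assuming those hold for retail cards):

Code:
# Shader-count uplift, 5070 vs 4070 (announced CUDA core counts)
cores_5070, cores_4070 = 6144, 5888
print(f"{cores_5070 / cores_4070 - 1:.2%}")   # -> 4.35%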

Just mountains of AI horse manure served up to polish it.
 
With three times the latency. Literally.
Latency is one part of the shyte show. Artifacts, ghosting, shimmering and other anomalies have to be ironed out before determining its worth. I guess if you remove Reflex from the marketing comparison slides and add it to Blackwell exclusively, the latency does seem improved. Digital Foundry has shown no significant latency penalty with more generated frames, in line with what Lossless Scaling claims.
 
Especially since it seems like NVIDIA is not pushing for too much raw performance gains
They are focusing on raw perf gains... for their ML performance, not game raster.
Converting the NN to transformers is hardly of note; that takes a bare minimum amount of effort.
I still find it pretty cool and would love to see how they did it. I'm still using CNNs for my research; that kind of thing makes me want to move towards ViTs as well, but I'd like a clearer performance comparison (both in quality and in compute cost).
I was close..... Hate to even imagine what the 5090 is gonna cost.... Even Canada doesn't get screwed that hard.
Here in Brazil it'll likely cost around 3.3k USD, whereas the minimum wage is ~$230. High import taxes + shitty currency really don't help.
I'm fxxked either way. It's hard to keep everybody happy.
Then keep no one happy and post a pic of your 5090 as a doorstop :laugh:
Prices look good, except 5090. Jensen's in fantasy land with that one. I'm not paying two THOUSAND dollars for that. WAY overpriced. NOT worth it. All others are in line with previous gen, so this is just gouging.
Many people will be paying that. Given that the 4090 had the most VRAM you could get in a consumer product, the next step up was either the RTX A6000 (~3090 chip with 48GB GDDR6) for almost $5k, or an RTX 5000 Ada (way slower than a 4090, but with 32GB GDDR6) for $4k+.
Considering that the 5090 has the same amount of VRAM as the RTX 5000 Ada while being a good 50~100% faster than the 4090 for AI tasks (80% more memory bandwidth, way higher tensor perf), $2k actually becomes quite cheap, at half the price of the previous options.
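The bandwidth part of that is easy to sanity-check from the announced specs (assuming the widely reported numbers hold: 512-bit GDDR7 at 28 Gbps on the 5090 vs 384-bit GDDR6X at 21 Gbps on the 4090):

Code:
# Memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps)
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

bw_4090 = bandwidth_gbs(384, 21)   # ~1008 GB/s
bw_5090 = bandwidth_gbs(512, 28)   # ~1792 GB/s
print(f"{bw_4090:.0f} -> {bw_5090:.0f} GB/s (+{bw_5090 / bw_4090 - 1:.0%})")  # +78%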

That's a very good point. Blender or games with path tracing should reveal the true increase in ray tracing performance. The 4090, for instance, is twice as fast as the 3090 in these cases.
Yeah, the 5090 should be twice as fast as the 4090 when it comes to Blender, or pretty close to that. And let's not forget that CUDA is already pretty fast in Blender, but OptiX makes it 2~3x faster than CUDA.
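For anyone benchmarking this themselves, here's a rough sketch (run inside Blender; treat the API details as from memory) of forcing Cycles onto the OptiX backend instead of CUDA:

Code:
import bpy

# Point Cycles at the OptiX backend instead of CUDA
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"   # set to "CUDA" for the slower baseline
prefs.get_devices()                   # refresh the device list
for dev in prefs.devices:
    dev.use = (dev.type == "OPTIX")   # enable only the OptiX devices

bpy.context.scene.cycles.device = "GPU"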
 
Was talking about Lossless Scaling for $7 a month. It's really laggy. Feels like you're streaming the game.
 
The RTX 5090 will have great longevity due to its large 32GB memory size and high memory bandwidth. It will probably hold up well even against cards released 4-5 years from now. Obviously paying $2K+ for a card is crazy, but that's the price of the Nvidia tax and of the fastest card on the market.

To me the RTX 5080 is the worst card of the bunch due to it having only 16GB of memory. The RTX 5070 and RTX 5070 Ti look much better, especially if performance is good at the reported prices.
 
32GB is total overkill. 16GB works great; the consoles have 16GB of RAM and they are the benchmark for most game devs. The 5080 is likely right at 4080 performance, and you don't need more than 16GB to saturate it.

Look at where the mighty 3090 is in the charts now. The GPU will age much faster than that memory bus.

Which is great, it's always fun to watch a titan get kneecapped and fall. Problem is, AMD seems content to just let that happen.... One can dream of the lost RX 8900 XTX.
 
Subjectively, DLSS 3.5 frame gen feels laggy to me as well outside of a few titles.
 
It's not overkill, but I can agree that 18/24GB could be enough. For example, I'd like to run some AI processing while playing a game (e.g. resemble-enhance, OpenAI Whisper, etc.), and some people want to run local LLMs to experiment; you can hardly do that with 16GB.
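As a rough sense of scale (weights only; KV cache, activations, and the game itself come on top), the back-of-the-envelope VRAM math for common open-model sizes:

Code:
# Weights-only VRAM for an LLM: parameters * bytes per parameter
def weights_gb(params_billion, bits):
    return params_billion * bits / 8   # 1e9 params * (bits/8) bytes = GB

for params in (7, 13, 32):             # typical open-model sizes (billions)
    for bits in (16, 8, 4):            # fp16, int8, 4-bit quant
        print(f"{params:>2}B @ {bits:>2}-bit: {weights_gb(params, bits):5.1f} GB")

A 13B model at fp16 is already 26GB of weights on its own, before the game gets anything.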
 
Yes, that is the problem. PC gaming has stagnated because developers design/optimize games to meet console specs, then port buggy and slow games to the PC.

The RTX 5080 being a high-end card (priced at $1,000+?), released in 2025, with only 16GB is a bad deal, especially for users who run at 4K and don't upgrade for some years.
 
This should bump their earnings up quite a bit this gen. I bet next gen will be insane.
 
I can assure you, that's not it. Developers are making super unoptimized POS games for both consoles and PCs. Plenty of console games are dropping below 720p (I kid you not, Jedi: Survivor was casually dropping to 648p) and below 20 fps. It's not a PC issue.
 
Their allocation for 2025 is sold out so it's already a done deal, another year of money printing!
 

According to Nvidia, the new transformer models take 4x the compute. It will be interesting to see how this impacts the 2000 and 3000 series, in addition to lower-end cards. The quality should be very good if they implemented it correctly.
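A multiplier like that is plausible just from the shape of the math. A toy comparison, with made-up layer sizes (the actual DLSS models are unpublished), of one conv layer versus one global self-attention layer over the same tile:

Code:
# Toy FLOP count: 3x3 conv vs one global self-attention layer (made-up sizes)
H, W, C, k = 64, 64, 64, 3         # hypothetical feature tile, channels, kernel
N = H * W                          # tokens, if every pixel becomes one token

conv_flops = 2 * N * k * k * C * C             # one 3x3 conv layer
attn_flops = 8 * N * C * C + 4 * N * N * C     # Q/K/V/out projections + scores

print(f"conv: {conv_flops/1e6:.0f} MFLOPs, attn: {attn_flops/1e6:.0f} MFLOPs "
      f"({attn_flops/conv_flops:.1f}x)")

Real ViT-style models use windowed attention and fewer layers, so the practical ratio lands much lower than this toy's; the point is just that attention's N-squared term is where the extra cost comes from.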

I'll be looking into a 5080/5090. Everybody's psychology is different though.

What's your use case / goal? If it's max performance at any cost, you might as well go 5090, seeing as the price scales almost exactly with what you get resource-wise. I'm not sure a 5080 will be enough of an upgrade over your existing 4080 otherwise; the core count increase is pretty small. Have to wait for reviews on that one. If the 5080 isn't as performant as the 4090 and doesn't carry efficiency improvements, you could always consider picking up a 4090 as well. 24GB is very nice to have IMO.
 
If you ask me, I think it's choice architecture to an extent. The 5080 is genuinely at a decent price point, assuming you can even get it at that MSRP. The 5070 Ti and 5070 are fine. I wouldn't put it past NVIDIA to use choice architecture.

Honestly, I feel ya. Especially since it seems like NVIDIA is not pushing for too much raw performance gains, and AMD is aiming for mid-range. If stock of certain PC components weren't suddenly disappearing, right now would be a great time to buy used components. But I can attest that the wells are drying up again.

I have serious doubts the majority of 5080s will be $999 past the initial wave of FE cards. The majority of 4080 cards were $1,100-1,200.
 
I remember when milk was 50 cents a gallon too…

While I appreciate that your message is that inflation exists...let me provide the context.

At an average inflation rate of 4% a year...because we might as well be reasonable...you've got 1.04^3 as the gap between generations. Even stretching to 1.04^4 = 1.1699, a 17% price hike per generation could realistically be called reasonable, if you estimate inflation high and pretend there are actually that many whole years between launches. If that were reasonable, your milk would cost $0.58, and the xx80 cards, taking the 3080 as your base, would be as follows:
2020 - 3080 - $699 MSRP (the base)
2022 - 4080 - $786.28 inflation-adjusted ($1,199 actual)
2025 - 5080 - $884.46 inflation-adjusted ($999 actual)

That's assuming a very high rate of inflation and rising costs rather than relatively decreasing ones with no change in node investment, and we are still $114.54 off of inflation, i.e. 12.95% above the inflation-adjusted value if we take that value as our base. What exactly justifies that level of premium over MSRP, when the primary distinction between the products is that the software they want you to run is basically within spitting distance on both without the assistance of frame generation? I think the 4xxx series as a whole was highway robbery, but pretending that Nvidia isn't charging a premium above and beyond what inflation justifies is...just silly. I get that some pricing will go up by virtue of base inflation, despite the decrease in the cost of computing due to miniaturization and optimizations, but pretending they aren't pricing themselves as a premium brand is missing the forest for all those pesky trees.
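If you want to check my compounding, it's three lines of arithmetic (a sketch of the math above, assuming 4%/year over each 3-year generation gap):

Code:
base = 699.00                  # 3080 MSRP
gen1 = base * 1.04 ** 3        # inflation-adjusted 4080 price
gen2 = gen1 * 1.04 ** 3        # inflation-adjusted 5080 price
print(f"4080: ${gen1:.2f} vs $1199 actual")   # $786.28
print(f"5080: ${gen2:.2f} vs $999 actual")    # $884.46
print(f"5080 premium: ${999 - gen2:.2f} ({(999 - gen2) / gen2:.2%})")  # $114.54, 12.95%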



Before you ask...I actually support Nvidia doing this. It's the same garbage you'd expect from so many other things. Xerox had the printer market locked up, until they lost the monopoly and were relegated to third-class actors. Craftsman was an amazing tool brand until they cut quality. Harley Davidson was an amazing bike, until they relocated overseas and pursued profitability per unit over selling a good product. The common thread amongst all of these entities is that they were the best, and charged for it, until the day they weren't. The second that rug is pulled from Nvidia we'll see how much their brand matters, because believe it or not they are not Apple. Their brand is not inherently flashy or inherently valuable, so if they continue pricing themselves as a premium brand they're going to have to get better...and their current strategy is not that. Their current strategy is to fabricate better results with AI tech...which will not be viable if AMD and Intel can release a high-volume mover at a reasonable price point. Spending $2,000 for a gaming computer is not something most people want to do...especially when people balked at the new PlayStation pricing. My gamble is that Nvidia will one day soon lose the gaming market below halo products because they price themselves out of it...and they won't regret it until they need gamers again. It's at that moment Nvidia will decide whether they are Apple, or whether they are GE.
 
I believe the poster is being facetious. Even in 1940, milk had an average price of 52 cents per gallon in the USA.
 
Sadly, South African prices are messed up. The 4090 I think was $1,599? Can't remember, as I haven't checked prices lately. That comes to 30.2k in my currency. They were selling them for 45k+ new. That is basically 15k profit. Yes, import tax etc. should be factored in, but on bulk buys the price should also be better; give or take, 12-13k profit. That is how greedy people are here. At the converted rate, the 5090 should be 37.8k. Our retailers will sell them for 55k+. You can get a decent second-hand car for that price. Why pay 45k for a 4090 when you can get a 4080, albeit with slightly lower settings, for half the price? The 7900 XTX is also a lot cheaper than the 4080, but it comes down to FSR vs DLSS, or ray tracing vs no ray tracing. It doesn't matter how fast a card is. It is not worth 45k...
 

But that will keep my wife happy and not me! It's hard to keep everybody happy (or not) matey :)

Can't wait for W1z's reviews now. Looking forward to the price, performance, efficiency and cluster fxxx of comments about the Nvidia 5 series.
 
I was figuring about $2,000 for the 5090; at $2,600 I wouldn't get it. Selling my 4090 makes it somewhat palatable. Although I would never gouge and ask more than I paid for the 4090. I really don't know how people live with themselves.
 

It's going to be fun, especially with the lower-end cards (5080/70 Ti/70), because if they only offer 4060 Ti-like generational improvements, the comments from both sides will be pretty hilarious... I got my popcorn ready.

(attached image: Two Buttons.png)
 