
NVIDIA has revealed the prices for the RTX 5090, 5080 and 5070

And that was the last time I got an EVGA GeForce brand new. :( RIP, EVGA.
The same little computer store I got it at also went under in July 2024! :cry:
 
Great interview imo fyi.
 
I believe Nvidia has incorporated significant AI capabilities into these GPUs primarily for the AI business, and subsequently allocated resources to leverage that AI power to gaming as well. If gaming were the sole focus, the GPU architecture might be different, but they need to cater to multiple demands.
 
I wonder what the price of the 5090 will be in Europe.

2500€ or less?
 
I did a graph some time ago comparing the different generations and each product's percentage of the top die, similar to the one that has been floating around here, except that one compares an assumed "top product" for each generation instead of the actual die NVIDIA uses:
I actually did a graph once comparing the biggest die of each generation, and the percentage that each consumer product fell in (not sure if I posted it in this forum before):
View attachment 377261

Blackwell is supposed to make this graph even bigger, it'll be fun to update it once the products are available.


I had also thought about this, but there is no GB204 chip so far (the x04 dies are what the previous x70 models usually used), so it seems the tier just got downgraded; the segmentation between the 70 Ti and the 70 is nothing new, though.
I decided to give it a go and update it today with the known values of the blackwell gen:
(image: updated die-percentage graph including the Blackwell generation)

Conclusions are left to each reader.
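For anyone who wants to reproduce that kind of graph, the calculation is just each card's shader count divided by the full top die of its generation. A minimal sketch, using the commonly cited public core counts for two flagships (treat the numbers as approximate):

```python
# Illustrative: consumer-card CUDA core counts as a percentage of the
# full top die of each generation. Figures are the commonly quoted
# public specs, not official confirmations.
top_die = {
    "Ada (AD102)": 18432,       # full AD102
    "Blackwell (GB202)": 24576,  # full GB202
}
cards = {
    "RTX 4090": ("Ada (AD102)", 16384),
    "RTX 5090": ("Blackwell (GB202)", 21760),
}

for name, (gen, cores) in cards.items():
    pct = 100 * cores / top_die[gen]
    print(f"{name}: {cores} / {top_die[gen]} = {pct:.1f}% of {gen}")
```

Extending the dict down the stack (5080, 5070 Ti, 5070, ...) reproduces the segmentation picture the graph shows.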
 
Update: the Lossless Scaling app now makes use of an iGPU or a second GPU to improve latency. It goes straight for the jugular of whatever DLSS 10.0 they have.
 
DLSS was huge. RT is still meh; to me it looks worse than conventional lighting since it's a grainy mess. DLSS Ray Reconstruction let the real secret slip: you don't need RT, or at least full RT, if you have ML that can reconstruct lighting 100x faster with better realism.

Between TAA and RT, I feel like graphics have regressed a bit. Games from 2016 look as good to me, if not better, than some games today, and it's a little weird.

It's going to be interesting to see what happens to the market in this space. The 9070 might be well positioned to win back some market share here.
My thoughts also: we have regressed visually whilst needing tons more horsepower.
 
My thoughts also: we have regressed visually whilst needing tons more horsepower.
Dune Awakening subjectively looks worse than Crysis, lol, but needs a 5090 to run at 4K 60 FPS.

NVIDIA is enabling this shyte-show, brute-forced approach to game optimization. Disgusting, imo!
 
Well, unless I can finagle a deal like I did for my 4080, where I actually only paid £300 for it, I will pause on this cycle.
For what it's worth, I haven't bought a GPU new for over 10 years at this point. I have been trading my stuff in to cex.co.uk to upgrade.
 
Lmao, genius... it's like the second coming of physx.
 
I wonder what the price of the 5090 will be in Europe.

2500€ or less?
3000 - 3500 more likely. We always get the short stick.
 
3000 - 3500 more likely. We always get the short stick.
A VideoCardz article on Gigabyte's lineup lists one 5080 model at MSRP and some as high as 35% above it, before VAT. So probably something similar for the 5090.
 
Will the 5090's performance remain consistent whether it operates at x16 or x8 bandwidth? It's worth noting that the 4090 showed very little performance difference between x16 and x8.
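For context on what halving the lanes actually costs, theoretical PCIe link bandwidth is just the per-lane transfer rate times the encoding efficiency times the lane count. A quick back-of-the-envelope sketch (rates per the PCIe spec; whether a given game ever saturates the link is a separate question):

```python
# Rough theoretical PCIe bandwidth: per-lane rate (GT/s) times encoding
# efficiency (128b/130b for Gen 3+), divided by 8 bits/byte, times lanes.
def pcie_bw_gbs(gt_per_s: float, lanes: int) -> float:
    return gt_per_s * (128 / 130) / 8 * lanes

print(f"Gen4 x16: {pcie_bw_gbs(16, 16):.1f} GB/s")  # ~31.5 GB/s
print(f"Gen4 x8:  {pcie_bw_gbs(16, 8):.1f} GB/s")   # ~15.8 GB/s
print(f"Gen5 x16: {pcie_bw_gbs(32, 16):.1f} GB/s")  # ~63.0 GB/s
print(f"Gen5 x8:  {pcie_bw_gbs(32, 8):.1f} GB/s")   # ~31.5 GB/s
```

Notably, Gen5 x8 matches Gen4 x16, which is part of why the 4090 barely cared about the lane count in practice.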
 