
RTX 5080 - premature review - it sucks

It was a glitch in the publishing system, computers sometimes do that, you cannot prevent it completely... ;) It happens even to the best of reviewers.
Moose Muffins! It was entirely preventable. If you don't want something published, you DON'T copy it into the server software space until you're ready to click the publish button.
 
And there is this analysis...
 
I don't give a crap about either Nvidia or AMD launches; my wallet prevails. That being said, no, I have no reason to "upgrade"; I'm playing at 1440p (ultra in most cases).

P.S. It's quite funny: the last time I bought a new card was a GTX 970 (Gigabyte Gaming whatever, piece of garbage). Since then, it's been the second-hand market, at least in my region.
 
Another similar video:


It really seems that Nvidia is trying to do the same thing it previously did with the weak, consumer-rejected 12GB 4080.

This lame 5080 should be unlaunched too and reintroduced as a 70-series card.
 
People keep focusing on the gen-on-gen gains, something I find ridiculous, and not focusing enough on the price.

I blame influencers for this idiocy. I guess if you get free shit every time, it makes sense from a freeloading perspective.
 
Given all the previous data we had on the 5000 series, that's about in line with what was expected. No big surprise there. Slightly faster VRAM, slightly higher TDP, and marginal core improvements only add up so far. As someone here on TPU said, it's a software upgrade.
This.

Just count shaders. It's not rocket science. Same node, similar arch; marketing takes the 1+1 to eleven.

At the same time, you can just buy a 5070 now and get 4090 perf, so stop whining

And my god, can we stop posting these idiot YouTube nobodies doing their ugliest duckfaces for us with 15 minutes of nerd rambling you could have caught in one headline? It's part of the problem that a lot of people thought this 5080 was ever going to be more than what it literally said on the box.
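The "just count shaders" point above can be sketched as a back-of-the-envelope calculation. The core counts and boost clocks below are the published 4080/5080 specs; the assumption that performance scales roughly with shaders times clock is a deliberate simplification, not a measurement:

```python
# Naive "just count shaders" uplift estimate.
# Published specs; scaling assumption (perf ~ cores * clock) is mine.
rtx_4080 = {"cuda_cores": 9728, "boost_ghz": 2.51}
rtx_5080 = {"cuda_cores": 10752, "boost_ghz": 2.62}

def throughput(card):
    # Crude proxy for shader throughput.
    return card["cuda_cores"] * card["boost_ghz"]

gain = throughput(rtx_5080) / throughput(rtx_4080) - 1
print(f"Naive expected uplift: {gain:.1%}")  # Naive expected uplift: 15.4%
```

Even this optimistic paper estimate lands around 15%, and real reviews came in lower, which is the point: there is no hidden performance to find.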
 
I wonder how much of the R&D money shifted to AI.
 
And my god, can we stop posting these idiot YouTube nobodies doing their ugliest duckfaces for us with 15 minutes of nerd rambling you could have caught in one headline? It's part of the problem that a lot of people thought this 5080 was ever going to be more than what it literally said on the box.
Bro they gotta "build their brand" by "getting themselves out there" doncha know?
 
People keep focusing on the gen-on-gen gains, something I find ridiculous, and not focusing enough on the price.

I blame influencers for this idiocy. I guess if you get free shit every time, it makes sense from a freeloading perspective.
Well, the thing is, gen-to-gen gains are important because the price hasn't changed and we got less gen-to-gen performance.

[attached graph: 1738233828641.png]


Oh, and that graph is from an "influencer", from the OP's post right above yours.
 
This.

Just count shaders. It's not rocket science. Same node, similar arch; marketing takes the 1+1 to eleven.
Agreed. I can't wait for the "I swapped my 4080 for a 5080 and I don't see an increase, do you think I need a new CPU?" posts. :roll:

And my god, can we stop posting these idiot YouTube nobodies doing their ugliest duckfaces for us with 15 minutes of nerd rambling you could have caught in one headline? It's part of the problem that a lot of people thought this 5080 was ever going to be more than what it literally said on the box.
Agreed as well. I could just write "it's a 4080 Super" and call it a review. I don't see why such a simple thing needs a 15-minute video.

No offence to anyone in the community here, but even the TPU review took me a minute and a half at most. "Relative performance", "power consumption", done. Oh, and I also looked at the "Architecture" page, only to notice that Nvidia couldn't be bothered to send out a block diagram.
 
Sure, I did. I had an HD 6970 and a 1080 Ti for a while, and also an RTX 2080 Ti for a while. The HD 5870 was not a high-end card, nor was the GTX 770.
There were a few games I played that indeed couldn't maintain a minimum framerate of 60 fps at ultra on 1080p.
The HD 5870 was the flagship Evergreen GPU. The GTX 770 was a renamed 680, the flagship first-gen Kepler GPU.
There were games like Crysis and Metro that were really demanding in terms of hardware resources, but they fuckin' looked good. What I'm trying to say is that today we get shitty games that not only look like shit, but even require a shitload of compute power to look that shitty. And that is insane. This is something that was not standard before. Many games are made for consoles, then just ported to PCs. Consoles are basically much less powerful than PCs. Still, they do optimize those games for consoles to achieve a 60 fps framerate. Sure, graphics on PC can be much better.
Most console games do not hit 60 FPS. Many struggle to maintain 30.
 
Sure, I did. I had an HD 6970 and a 1080 Ti for a while, and also an RTX 2080 Ti for a while. The HD 5870 was not a high-end card, nor was the GTX 770.
There were a few games I played that indeed couldn't maintain a minimum framerate of 60 fps at ultra on 1080p.

There were games like Crysis and Metro that were really demanding in terms of hardware resources, but they fuckin' looked good. What I'm trying to say is that today we get shitty games that not only look like shit, but even require a shitload of compute power to look that shitty. And that is insane. This is something that was not standard before. Many games are made for consoles, then just ported to PCs. Consoles are basically much less powerful than PCs. Still, they do optimize those games for consoles to achieve a 60 fps framerate. Sure, graphics on PC can be much better.
Just FYI, every game that is unoptimized on PC runs like complete crap on consoles as well. We are talking about 640p-and-below resolutions and 15 to 20 fps. Jedi Survivor, Forspoken, etc.
 
Just FYI, every game that is unoptimized on PC runs like complete crap on consoles as well. We are talking about 640p-and-below resolutions and 15 to 20 fps. Jedi Survivor, Forspoken, etc.

There are plenty of examples where that isn't true. When devs HAVE to target modest fixed specs, they usually manage to extract a lot better performance than they would on PC, where they can just raise the system requirements and expect users to brute-force it.
 
There are plenty of examples where that isn't true. When devs HAVE to target modest fixed specs, they usually manage to extract a lot better performance than they would on PC, where they can just raise the system requirements and expect users to brute-force it.
Sure, go ahead and name those plenty of examples.
 
People keep focusing on the gen-on-gen gains, something I find ridiculous, and not focusing enough on the price.
$1,000 for an RTX 5070/Ti-class GPU? Are you mad, or just working for Nvidia?
 
Looking at CC, the prices are insane for these cards.
 

Looking at CC, the prices are insane for these cards.
Right now it costs around €1,600. I personally wouldn't buy that piece of shit even for €999, because it's an RTX 5070 Ti-tier GPU at best. As Daniel said, it should cost around $749-799.
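As a quick sanity check on that markup, take the €1,600 street price quoted in this post against the $999 US MSRP. Treating euros and dollars as roughly 1:1 is a simplification (VAT and exchange rate are ignored), but it shows the scale of the gap:

```python
# Street-price markup sketch; €1,600 is the street price from the
# thread, $999 the US MSRP. EUR/USD parity is an assumed simplification.
street_eur = 1600
msrp_usd = 999

markup = street_eur / msrp_usd
print(f"~{markup:.2f}x MSRP")  # ~1.60x MSRP
```

That 1.6x figure is consistent with the 1.6-1.8x MSRP range mentioned later in the thread for European retailers.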
 
Sure, go ahead and name those plenty of examples.

Flight Simulator is incredibly demanding on PC yet manages to run decently on an Xbox Series S with - on paper - puny system specs.

The Last of Us Part 1 used the same engine as TLOU2, which ran flawlessly on a base PS4. Yet on PC the system requirements became dramatically higher.

Grand Theft Auto IV still doesn't run well even on modern PCs, yet on a Series S it runs at a flawless 60 fps even though it's emulated.

Forza Motorsport loads a lot quicker on a Series S (there are lots of loading screens, not just the initial load) than on a PC with an NVMe drive, which is faster on paper. Why? Ask the devs, I guess. The game also runs flawlessly, which it doesn't always do even on a good PC, where it seems to hit the CPU hard. The Series S has weaker everything but does better.

Then we have all the PC games which use the Unreal Engine or Denuvo, either of which can cause stutters in PC versions that don't exist on console.

The idea that all badly coded PC games run even worse on console is false - some do, some don't.

Another one off the top of my head: the old Sega Rally Revo didn't run properly on PC because it was hard-coded to a strange internal frame rate that didn't map evenly onto a 30 or 60 Hz refresh, so there was nothing the user could do to make it smooth, even by capping the frame rate or using vsync.

If that doesn't convince you (as I expect will be the case), then just google bad PC ports; it is hardly an unknown phenomenon.
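The Sega Rally point above is a frame-pacing issue that is easy to demonstrate: on a fixed-refresh display, an internal rate that doesn't divide the refresh rate produces an uneven cadence of one- and two-refresh frames. A small sketch, using an illustrative 50 fps rate rather than the game's actual (undocumented here) internal rate:

```python
# On a 60 Hz display, each finished frame appears on the next vsync.
# With a 50 fps internal rate (illustrative assumption), some frames
# persist for 2 refreshes and others for 1, which reads as judder.
refresh_hz = 60
sim_fps = 50

# Refresh index on which frame i first appears (integer ceiling
# division avoids float rounding at exact vsync boundaries).
displayed = [(i * refresh_hz + sim_fps - 1) // sim_fps for i in range(10)]

# Refreshes each frame stays on screen: an uneven 2,1,1,1,1,... cadence.
gaps = [b - a for a, b in zip(displayed, displayed[1:])]
print(gaps)  # [2, 1, 1, 1, 1, 2, 1, 1, 1]
```

Capping or vsync can't fix this, because the unevenness comes from the mismatch between the simulation rate and the display's refresh grid, not from the GPU running out of headroom.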
 
I can't comprehend why anyone would buy a 5080... get a 2nd hand 4090 instead, way better value...
 
Flight Simulator is incredibly demanding on PC yet manages to run decently on an Xbox Series S with - on paper - puny system specs.

The Last of Us Part 1 used the same engine as TLOU2, which ran flawlessly on a base PS4. Yet on PC the system requirements became dramatically higher.

Grand Theft Auto IV still doesn't run well even on modern PCs, yet on a Series S it runs at a flawless 60 fps even though it's emulated.

Forza Motorsport loads a lot quicker on a Series S (there are lots of loading screens, not just the initial load) than on a PC with an NVMe drive, which is faster on paper. Why? Ask the devs, I guess. The game also runs flawlessly, which it doesn't always do even on a good PC, where it seems to hit the CPU hard. The Series S has weaker everything but does better.

Then we have all the PC games which use the Unreal Engine or Denuvo, either of which can cause stutters in PC versions that don't exist on console.

The idea that all badly coded PC games run even worse on console is false - some do, some don't.

Another one off the top of my head: the old Sega Rally Revo didn't run properly on PC because it was hard-coded to a strange internal frame rate that didn't map evenly onto a 30 or 60 Hz refresh, so there was nothing the user could do to make it smooth, even by capping the frame rate or using vsync.

If that doesn't convince you (as I expect will be the case), then just google bad PC ports; it is hardly an unknown phenomenon.
I'm getting over 200 fps in TLOU maxed out on a 2021 CPU. Obviously you are comparing a maxed-out PC version vs. a low-settings 720p console version. That's not apples to apples.
 
I'm getting over 200 fps in TLOU maxed out on a 2021 CPU. Obviously you are comparing a maxed-out PC version vs. a low-settings 720p console version. That's not apples to apples.

No, TLOU Part 1 ran terribly on systems which make the PS4 look like a toaster. Also, TLOU2 runs at 1080p on PS4, and not at low settings either; it's still better looking than probably most games on PC. And its predecessor, TLOU Remastered, ran at 1080p60 on a base PS4 and looked 95% as good as Part 1. (I have all the games on both platforms.)

Anyway, you're just going to move the goalposts when given the evidence you asked for, so this is obviously a waste of time. You have something you believe, and that's it.
 
@LittleBro
So far, it's 1 GPU giving up on me (out of the 20 cards I've had), while it's 20 out of 20 women, and I don't have to worry about buying them things.

Besides, if you need to buy flowers to make your partner happy, you're doing it wrong.
 
Reasons why the 5080 sucks, same as its manufacturer:
1. It's out of stock.
If this is not enough:

it has a horrendous price increase over MSRP at retailers in Europe
you need PCIe 5.0 not to lose 1% of that 10% perf increase
you need an ATX 3.1 850 W PSU and a special connector most users probably hate
10% with a new architecture, PCIe 5, GDDR7 and ATX 3.1 is pathetic
their 4th-generation RT core improvement is pathetic
only 16 GB of VRAM barely fits 2K in AAA games
which only makes sense if you turn off fake frames
but buying an 80-series card with the VRAM butchered, not 24 or 32 GB, means it's rather an x70 card, and you still pay 1.6-1.8x MSRP, in Europe at least; well, you are scalped
/rant over - upset at Scrooge McDuck in the crocodile jacket
 