
RTX 5080 - premature review - it sucks

[screenshot attached]

That part seems correct
 
Killzone. And Killzone 3 on the PS3. Absolute graphical marvels for that hardware.


Speaking of Guerrilla, I think Horizon made a mark for the PS4 too. It took quite a while to appear on PC, and we agree it was an excellent port - definitely performance-wise - but the performance baseline was designed around the PS4.
How can Killzone be unoptimized for PC when it is a PS exclusive? I think you are missing what the discussion was about.
 
How can Killzone be unoptimized for PC when it is a PS exclusive? I think you are missing what the discussion was about.
Look at PC games from that era with similar hardware running similar shooters; it's not hard, you're evading the point. The only reason you don't see more recent games like that on the PS4/5, which are usually directly ported, is that they're built not with a console limitation in mind, but adapted to it. There is no economical sense in putting massive effort into console performance just so you can raise a slider from medium to high. It's not a matter of 'can't do', it's a matter of 'won't do'.

I also gave you the Horizon example. Ported much later. Designed PS4 first.
 
Look at PC games from that era with similar hardware running similar shooters; it's not hard, you're evading the point. The only reason you don't see more recent games like that on the PS4/5, which are usually directly ported, is that they're built not with a console limitation in mind, but adapted to it. There is no economical sense in putting massive effort into console performance just so you can raise a slider from medium to high. It's not a matter of 'can't do', it's a matter of 'won't do'.

I also gave you the Horizon example. Ported much later. Designed PS4 first.
I still think you are missing what the discussion was about. I said those games that are unoptimized on PC are also unoptimized on consoles. They are just unoptimized, full stop; PC has nothing to do with it. Jedi, Gollum, Forspoken and the like run like crap everywhere.
 
I still think you are missing what the discussion was about. I said those games that are unoptimized on PC are also unoptimized on consoles. They are just unoptimized, full stop; PC has nothing to do with it. Jedi, Gollum, Forspoken and the like run like crap everywhere.
Yes, that is true for this gen of games, and I'm not denying it either, but I'm showing you the bigger picture.

We saw something similar with the era of meh console ports during the PS3 days. Except now the overall situation seems much more problematic. But that's true for a selection of games. There will always be shitty games, and they're shitty everywhere, no surprise...
 
How can Killzone be unoptimized for PC when it is a PS exclusive? I think you are missing what the discussion was about.
No, his point stands, somewhat.

Killzone 2 is an example of a beautiful-looking game running on what is a very inefficient Cell processor architecture and an obviously outdated GPU. Yet, with optimization, it looks great and runs well enough.

Something like that is, for the most part, dead. I can't really think of a well-optimized game these days. Maybe the Doom series and the new Indiana Jones.
 
Well, the thing is, gen-to-gen gains are important because the price hasn't changed and we got less gen-to-gen performance.

View attachment 382454

Oh, and that graph is from an "influencer", from the OP's post right above yours.

Not sure what your point is here. Like I said, the problem is the price, not the gen-on-gen improvement.
 
Not sure what your point is here. Like I said, the problem is the price, not the gen-on-gen improvement.
The point is that Paul also mentioned the price creep that's tied to this.
 
Not sure what your point is here. Like I said, the problem is the price, not the gen-on-gen improvement.
The problem is that it's a 4080 Super at 4080 Super price.

As the video the screenshot is from points out, the gen-on-gen increase in performance has historically been way above inflation and the resulting price hike. Nvidia has been closing that gap recently, raising prices in step with performance. Historically, you've been getting 40, 50, even 70% more performance at a 10-15% higher price, but now you're only getting 10% more performance for the same price.

It's basically the Intel quad-core era in Nvidia flavour.
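
To put rough numbers on that, here's a minimal perf-per-dollar sketch using the illustrative percentages from the post above (assumed figures for the argument, not benchmark data):

```python
# Relative perf-per-dollar gain of a new generation over the old one,
# using the illustrative percentages quoted above (not measured data).

def value_gain(perf_increase: float, price_increase: float) -> float:
    """Fractional improvement in performance per dollar."""
    return (1 + perf_increase) / (1 + price_increase) - 1

# Historical pattern: ~40-70% more performance at ~10-15% higher price
print(f"40% perf / 15% price: {value_gain(0.40, 0.15):+.0%}")  # about +22% value
print(f"70% perf / 10% price: {value_gain(0.70, 0.10):+.0%}")  # about +55% value

# 5080 vs 4080 Super: ~10% more performance at the same price
print(f"10% perf /  0% price: {value_gain(0.10, 0.00):+.0%}")  # only +10% value
```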
 
The problem is that it's a 4080 Super at 4080 Super price.

As the video the screenshot is from points out, the gen-on-gen increase in performance has historically been way above inflation and the resulting price hike. Nvidia has been closing that gap recently, raising prices in step with performance. Historically, you've been getting 40, 50, even 70% more performance at a 10-15% higher price, but now you're only getting 10% more performance for the same price.

It's basically the Intel quad-core era in Nvidia flavour.
Yeah, that's why I expect a 40% gen-on-gen uplift from AMD. Basically a card twice as fast as the 6800 XT at $649. If they don't deliver that, they are just as greedy.
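
For what it's worth, the "twice the 6800 XT" figure follows from compounding 40% over the two generations between RDNA2 and RDNA4 (a back-of-the-envelope sketch, not a benchmark):

```python
# Compounding a 40% gen-on-gen uplift across the two generations
# separating the 6800 XT (RDNA2) from the 9070 XT (RDNA4).
per_gen_uplift = 1.40
generations = 2
print(f"cumulative: {per_gen_uplift ** generations:.2f}x")  # 1.96x, i.e. roughly twice as fast
```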
 
Yeah, that's why I expect a 40% gen-on-gen uplift from AMD. Basically a card twice as fast as the 6800 XT at $649. If they don't deliver that, they are just as greedy.
I'm not sure if that expectation is realistic.
I mean, everybody is under the impression that they postponed the launch because they saw how cheap Nvidia's 50-series is, so they had to adjust the price downwards.
But what if it's the opposite? What if they just waited to see what turds of an offer the 5080 and 5070 Ti are, so that they can up the price on the 9070 XT accordingly?

Let me be wrong, please. :oops:
 
I'm not sure if that expectation is realistic.
I mean, everybody is under the impression that they postponed the launch because they saw how cheap Nvidia's 50-series is, so they had to adjust the price downwards.
But what if it's the opposite? What if they just waited to see what turds of an offer the 5080 and 5070 Ti are, so that they can up the price on the 9070 XT accordingly?

Let me be wrong, please. :oops:
Why isn't it possible? Read your previous post; 40% is at the low end of what's expected.
 
Why isn't it possible? Read your previous post; 40% is at the low end of what's expected.
I didn't say it's not possible. I just said it's probably not realistic. I don't think AMD intends to give anything away for free just to claw back some market share.

But again, I hope I'm wrong.
 
I can't comprehend why anyone would buy a 5080... get a 2nd hand 4090 instead, way better value...

RTX 5080 = ~€1250, used 4090 = ~€1750 unless you find a really good deal.
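
At those prices, a quick perf-per-euro sanity check is telling (the ~15% performance edge of the 4090 over the 5080 is an assumed illustrative figure here, not a benchmark result):

```python
# Back-of-the-envelope value comparison for the prices quoted above.
# The 15% 4090-over-5080 performance gap is an assumption for illustration.
cards = {
    "RTX 5080 (new)":  {"price_eur": 1250, "rel_perf": 1.00},
    "RTX 4090 (used)": {"price_eur": 1750, "rel_perf": 1.15},
}

for name, c in cards.items():
    print(f"{name}: {c['rel_perf'] / c['price_eur'] * 1000:.2f} perf per 1000 EUR")
# At these prices the new 5080 wins on raw perf-per-euro; the used
# 4090 only makes sense where the price gap closes, as noted below.
```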
 
Australian input:

[attached graph: each GPU tier as a percentage of its generation's flagship]

LOL that graph. The 5090 is not considered an 'extreme flagship' (whatever the fck that is?!) and they count it as the 100% GPU. No shit, the rest looks far worse now. Lmao
Except it really is, if you consider the price but also its TDP and shader-count upgrade. And then, suddenly, the 5090 is the '130%(+)' card relative to the 4090, and the 5080 is actually a 65%(+) card, making it better than what Ada had on offer in this segment. After all, this gen is just a shader bump. It makes no sense not to compare it directly to Ada; it's practically the same hardware.

And this is evidently also the truth. Goes to show how the way you look at things makes the picture. It's no secret that GPU stacks in the last few generations cover a much greater range of performance than the generations prior: 1080p > 4K is a new thing, and it's a whole other thing than the choice between 720p and even a very high-end 1600p of yesteryear, and those super-high resolutions usually took SLI/Crossfire setups. So obviously, the gap between the slowest and fastest SKU is going to increase.

See, this is why you don't need YT. It's just smoke and mirrors loaded with utter bullshit to make a video.
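
The "130% / 65%" framing is easy to check against the published CUDA-core counts (21760 for the 5090, 16384 for the 4090, 10752 for the 5080; verify the specs yourself):

```python
# Shader-count ratios behind the "130% / 65%" framing above,
# using the published CUDA-core counts for each card.
cores = {
    "RTX 5090": 21760,
    "RTX 5080": 10752,
    "RTX 4090": 16384,  # Ada flagship as the baseline
}

baseline = cores["RTX 4090"]
for name, count in cores.items():
    print(f"{name}: {count / baseline:.0%} of the 4090's shaders")
# RTX 5090: 133%, RTX 5080: 66%, RTX 4090: 100%
```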
 
RTX 5080 = ~€1250, used 4090 = ~€1750 unless you find a really good deal.

That HEAVILY depends on the country you are from. Here, used 4090s can be had for the same price as an RTX 5080. And the 4090 is just a better product in every way. The other people talking about warranty do have a point in regard to 2nd-hand hardware, but even so, I'd get a 2nd-hand 4090 over a new RTX 5080 - if the price is the same.
 
... I'd get a 2nd-hand 4090 over a new RTX 5080 - if the price is the same.
Not all 4090s come from OCD-level careful millionaire entrepreneurs who had so little time that they used the card half an hour a week.

Some may have been used a lot; some may have been subjected to mechanical stress, water damage, liquid-metal damage, or improper handling (including bad application of thermal pads and paste) while removing air coolers and mounting waterblocks, and then again while removing the waterblocks and mounting the air coolers back on.

4090s are still pretty expensive, and I would be very anxious about buying a used one for the above reasons.
 
Not all 4090s come from OCD-level careful millionaire entrepreneurs who had so little time that they used the card half an hour a week.

Some may have been used a lot; some may have been subjected to mechanical stress, water damage, liquid-metal damage, or improper handling (including bad application of thermal pads and paste) while removing air coolers and mounting waterblocks, and then again while removing the waterblocks and mounting the air coolers back on.

4090s are still pretty expensive, and I would be very anxious about buying a used one for the above reasons.

Again, depends where you live - in Denmark there are auction sites for Danish people only, where you have to register with your personal ID, which means it's rare to get scammed. And unlike on eBay, they won't be cards from mining farms.

View attachment 382688

LOL that graph. The 5090 is not considered an 'extreme flagship' (whatever the fck that is?!) and they count it as the 100% GPU. No shit, the rest looks far worse now. Lmao
Except it really is, if you consider the price but also its TDP and shader-count upgrade. And then, suddenly, the 5090 is the '130%(+)' card relative to the 4090, and the 5080 is actually a 65%(+) card, making it better than what Ada had on offer in this segment. After all, this gen is just a shader bump. It makes no sense not to compare it directly to Ada; it's practically the same hardware.

And this is evidently also the truth. Goes to show how the way you look at things makes the picture. It's no secret that GPU stacks in the last few generations cover a much greater range of performance than the generations prior: 1080p > 4K is a new thing, and it's a whole other thing than the choice between 720p and even a very high-end 1600p of yesteryear, and those super-high resolutions usually took SLI/Crossfire setups. So obviously, the gap between the slowest and fastest SKU is going to increase.

See, this is why you don't need YT. It's just smoke and mirrors loaded with utter bullshit to make a video.

What you say would make sense if the 5090 were the 4090 Ti or a Titan - but as a new gen it provides little uplift, so the video is spot on imo :) 'Extreme flagships' are what the Titans (and the 3090 Ti) are considered, in case you haven't seen their vids before.
 
LOL that graph. The 5090 is not considered an 'extreme flagship' (whatever the fck that is?!) and they count it as the 100% GPU. No shit, the rest looks far worse now. Lmao
Except it really is, if you consider the price but also its TDP and shader-count upgrade. And then, suddenly, the 5090 is the '130%(+)' card relative to the 4090, and the 5080 is actually a 65%(+) card, making it better than what Ada had on offer in this segment. After all, this gen is just a shader bump. It makes no sense not to compare it directly to Ada; it's practically the same hardware.
Finally I'm not the only one saying this, thank you! :)

Turing, Ampere, Ada and Blackwell are all the same hardware! Any perf bump we ever saw came from clock-speed increases due to smaller nodes - which Blackwell didn't get, and now the secret sauce is spilling.

And this is evidently also the truth. Goes to show how the way you look at things makes the picture. It's no secret that GPU stacks in the last few generations cover a much greater range of performance than the generations prior: 1080p > 4K is a new thing, and it's a whole other thing than the choice between 720p and even a very high-end 1600p of yesteryear, and those super-high resolutions usually took SLI/Crossfire setups. So obviously, the gap between the slowest and fastest SKU is going to increase.

See, this is why you don't need YT. It's just smoke and mirrors loaded with utter bullshit to make a video.
It's true, but Nvidia is really taking it to the extreme. I mean, 600 W in a GPU? Does anyone seriously need that?
 
Again, depends where you live - in Denmark there are auction sites for Danish people only, where you have to register with your personal ID, which means it's rare to get scammed. And unlike on eBay, they won't be cards from mining farms.



What you say would make sense if the 5090 were the 4090 Ti or a Titan - but as a new gen it provides little uplift, and the video is spot on imo :)
It's just a name. Look at the specs.

You mean that Titan they started calling an x90 three consecutive generations ago? Anything built on a chip bigger than x104 is 'Titan' territory, bud. Again... just look at the shader counts of the older Titans. A name is just a name.

It's true, but Nvidia is really taking it to the extreme. I mean, 600 W in a GPU? Does anyone seriously need that?
Well... I hate to hold the mirror up to you, but wasn't it you who said this class of GPU is the new alternative to SLI? Divide by two and you've got two very 'reasonable' 300 W GPUs, and you don't pull 600 W continuously from a 5090 either.
 