
Are people planning an upgrade?

The most advanced tech has always been made with the most advanced / future hardware in mind. Just like Ubersampling in The Witcher 2. Nobody was meant to use it when the game came out.


I'm not sure most are aware that DLDSR even exists. Most just go "look at my dee-el-es-es powah, I've got a billion frames, bah". :slap:

 
I know the tech - I read this exact same article back in the day. :)

I just never had a go at it in person, so I have/had no idea how good it is. I only have experience with "regular" super resolution (DSR on Nvidia, VSR on AMD), which is good, but not necessary for my experience.
 
Are you trying to say nobody should ...

I have no idea where you get that from what I'm saying - zero.
I'm just stating reality as it is right now in front of you, nothing more. It's like you asking me, a couple of years ago when I would say no one cares about Bulldozer, if I meant no one should buy AMD ever again. Makes no sense; that's not the point at all.
 
With the impending release of new generation Nvidia, Intel and AMD cards, new AMD and Intel CPUs and APUs, is anyone thinking of upgrading their rigs? Maybe an upgrade to older generation parts which now might come down in price? This is not a question for new builds.

Yeah, I've been waiting for an upgrade beyond the 3080 for a while now. I'm not in a rush; although I'm mostly GPU-bottlenecked at the mo, it still handles everything I throw at it quite well. It has crept into that borderline-acceptable territory where I have to make a few compromises on quality settings to hit the desired performance, which is definitely triggering my upgrade itch.

I've got a few set-in-stone requirements: a minimum of 16GB VRAM without bandwidth skimping, a bare minimum of 50% performance uplift, and hopefully something in the ~£800 region. For this kind of money I would have expected a far greater performance uplift (+80-100%) without the artificial adornments, but seeing how the GPU performance segmentation (raster) is deliberately playing out, I've settled on lower expectations going forward.

I'm on the fence right now. It all comes down to the benchmarks! I might go for the 5070 Ti, but if the 5080 delivers performance close to the 4090, I might just 'splurge' on that instead. As for backup options, if prices drop considerably, the 7900 XTX/4080 SUPER are on my radar, but without a heavy trim on price I won't settle for anything below +50% perf. Anyway, I'm not in a rush; I prefer upgrading well after the initial release buzz (50-series), so there's plenty of time to consider all options.
 
I have no idea where you get that from what I'm saying - zero.
I'm just stating reality as it is right now in front of you, nothing more. It's like you asking me, a couple of years ago when I would say no one cares about Bulldozer, if I meant no one should buy AMD ever again. Makes no sense; that's not the point at all.

You just keep going in circles about how nobody should buy AMD GPUs and how Nvidia is the only option. I get it, you like your beastly 3060 Ti that was so good Nvidia couldn't come up with a worthy successor at the $400 mark, but hey, almost zero progress means we can keep our GPUs longer - win/win, I guess.
 
Saw this article with some leaked 3DMark scores.

The leak gives the 5090 a Time Spy Extreme score of "24000+" and Speedway: "13500+".

Stock scores for 4090 were ~19000 for Time Spy Extreme and ~9950 for Speedway. That's +26% for non-RT and about +35.7% for the RT benchmark.

Going from this, if you ignore MFG: it would mean +26 to 36% performance uplift for +25% price and +28% power. i.e. little to no change in performance/price from 4090.
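
If anyone wants to sanity-check that arithmetic, here's a quick back-of-the-envelope in Python using the figures above; note that the $1,999 / $1,599 MSRPs and 575 W / 450 W board-power numbers are my own assumptions for illustration, not part of the leak:

```python
# Quick check of the leaked 5090 vs stock 4090 uplift figures quoted above.
# Benchmark scores are from the post; MSRP and board power are assumed values.

scores = {
    "Time Spy Extreme": (24000, 19000),  # (5090 leak, 4090 stock)
    "Speedway":         (13500, 9950),
}

for test, (new_score, old_score) in scores.items():
    print(f"{test}: +{new_score / old_score - 1:.1%}")

price_delta = 1999 / 1599 - 1  # assumed MSRPs (USD)
power_delta = 575 / 450 - 1    # assumed total board power (W)
print(f"Price: +{price_delta:.1%}, Power: +{power_delta:.1%}")

# Prints roughly +26.3% (TSE), +35.7% (Speedway), +25.0% price, +27.8% power,
# which is where the "little to no change in performance/price" reading comes from.
```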

Also, the article looks at the scores of overclocked 4090's (there's an update with stock scores) from 3DMark's hall of fame, indicating that the 5090 (assuming stock) is ~5.3% faster than one of the fastest 4090's ever in TSE and ~8% faster than the fastest 4090 in Speedway.

The article stops there, but I was curious, so I looked at those stock 4090 numbers vs the fastest 3090ti scores...there was a ~30% increase in TSE and ~39% increase in Speedway from the fastest 3090ti to a stock 4090. That gets even more drastic when you consider the stock 4090 is likely running at lower power levels than those overclocked 3090ti's and you cannot say that for the stock 5090 vs overclocked 4090's (I can't even get my 4090 to 575W haha).

We obviously still have to wait for real reviews to get data we can trust (grains of salt always required for leaks lol), but all the current information indicates that this is a generation for AI improvements, not aimed at gamers. The AI improvements can help with frame gen...but for most gaming purposes, it's just slight improvements over 4000-series from a generational improvement perspective.
 
I have no idea where you get that from what I'm saying - zero.
I'm just stating reality as it is right now in front of you, nothing more. It's like you asking me, a couple of years ago when I would say no one cares about Bulldozer, if I meant no one should buy AMD ever again. Makes no sense; that's not the point at all.
I could say I don't understand why anybody would buy anything other than a Ford if all cars cost the same, because only Ford has heated windscreens and it's a must-have feature. I love my heated windscreen because I don't even need to keep an ice scraper in the car, let alone do anything about the car freezing over. I just sit down, press a button, and 5-10 minutes later, I drive off.

But like I said, people have different priorities; not everybody cares about a heated windscreen. Getting out to scrape ice is no trouble for some, and if I said that's only because they haven't experienced the greatness of heated windscreens, it would be an extremely dumb and ignorant statement. Some people just prefer the design, or comfort, or whatever in X car, and it's more important to them than a heated windscreen.

See what I mean now? Even with price and performance being the same between AMD and Nvidia, they're not the same, and it's not a matter of better or worse, just a preference of technologies, company image, etc.

At the end of the day, they both sell you a GPU that runs your games. Not worth arguing about on a forum (unless you're an avid fan, I suppose).
 
But this brings up the big question: if DLDSR can lower the performance penalty compared to regular DSR while still providing a much crisper-than-native image, then why is DLSS celebrated like the best thing since the wheel was invented, and not this? This is actually awesome tech that no one talks about for some reason. I don't get why.
Well, not a lot of people know about it because it's not part of games. You don't activate it from within the game but from the NVCP (Nvidia Control Panel).
 
Well, not a lot of people know about it because it's not part of games. You don't activate it from within the game but from the NVCP (Nvidia Control Panel).
That wouldn't be a problem, but it's not being advertised well enough, imo.

If the difference in the performance impact between DLDSR and regular DSR is really that huge, then I'd much more easily get behind it than I ever could behind fake frames.
 
That wouldn't be a problem, but it's not being advertised well enough, imo.

If the difference in the performance impact between DLDSR and regular DSR is really that huge, then I'd much more easily get behind it than I ever could behind fake frames.
Yeah, the difference is pretty much linear - 4x scaling for DSR vs 2.25x for DLDSR. So 1620p DLDSR looks as good as 2160p DSR. But then you add DLSS and it kinda turns into 1200p vs 2160p, so yeah, you end up with ~80% more performance for almost identical IQ compared to 4x DSR.

EG1: The above numbers assume you are on a 1080p display.
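
To make the pixel math behind that concrete, here's a tiny Python sketch; it assumes a 1080p panel (as per the note above) and the commonly cited ~0.667x per-axis render scale for DLSS Quality, which is my assumption rather than something stated in this thread:

```python
# Rendered pixel counts on a 1080p panel: 4x DSR vs 2.25x DLDSR,
# optionally with DLSS Quality stacked on top (assumed ~0.667x per-axis scale).

def rendered_pixels(width, height, axis_scale=1.0):
    """Pixels actually rendered at a given per-axis scale factor."""
    return round(width * axis_scale) * round(height * axis_scale)

base_w, base_h = 1920, 1080

dsr_4x     = rendered_pixels(base_w, base_h, 2.0)  # 3840x2160 -> ~8.3 MPix
dldsr_225x = rendered_pixels(base_w, base_h, 1.5)  # 2880x1620 -> ~4.7 MPix

print(f"4x DSR       : {dsr_4x / 1e6:.1f} MPix")
print(f"2.25x DLDSR  : {dldsr_225x / 1e6:.1f} MPix")
print(f"DSR / DLDSR  : {dsr_4x / dldsr_225x:.2f}x")  # ~1.78x, i.e. the '~80% more performance'

# Stacking DLSS Quality on the 1620p DLDSR image drops the internal render
# resolution back to roughly 1920x1080:
dldsr_dlss_q = rendered_pixels(2880, 1620, 2 / 3)
print(f"DLDSR + DLSS Q: {dldsr_dlss_q / 1e6:.1f} MPix")
```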
 
Saw this article with some leaked 3DMark scores.

The leak gives the 5090 a Time Spy Extreme score of "24000+" and Speedway: "13500+".

Stock scores for 4090 were ~19000 for Time Spy Extreme and ~9950 for Speedway. That's +26% for non-RT and about +35.7% for the RT benchmark.

Going from this, if you ignore MFG: it would mean +26 to 36% performance uplift for +25% price and +28% power. i.e. little to no change in performance/price from 4090.

Also, the article looks at the scores of overclocked 4090's (there's an update with stock scores) from 3DMark's hall of fame, indicating that the 5090 (assuming stock) is ~5.3% faster than one of the fastest 4090's ever in TSE and ~8% faster than the fastest 4090 in Speedway.

The article stops there, but I was curious, so I looked at those stock 4090 numbers vs the fastest 3090ti scores...there was a ~30% increase in TSE and ~39% increase in Speedway from the fastest 3090ti to a stock 4090. That gets even more drastic when you consider the stock 4090 is likely running at lower power levels than those overclocked 3090ti's and you cannot say that for the stock 5090 vs overclocked 4090's (I can't even get my 4090 to 575W haha).

We obviously still have to wait for real reviews to get data we can trust (grains of salt always required for leaks lol), but all the current information indicates that this is a generation for AI improvements, not aimed at gamers. The AI improvements can help with frame gen...but for most gaming purposes, it's just slight improvements over 4000-series from a generational improvement perspective.

25% in raster and 33% in RT sounds about right, and considering they're on the same node, power will go up as well, since it doesn't look like they've overhauled too much in the architecture itself. But I guess we'll know once reviews hit.

I think the biggest increase will come in bandwidth-limited workloads, so I guess some AI workloads and the like.
 
Because when both are used together, you take a small hit to performance vs native, and it looks substantially better.
Careful saying such things around here! There are still people who somehow believe that it's not possible to get better IQ than rendering at a given panel's native resolution :kookoo:

Personally, running 4K, DLSS Quality is so often either a match for or better than native + TAA that it's just not worth the FPS drop, the IQ shortcomings of subpar TAA, or the increased heat and power consumption to justify it (or to justify the performance cost of DLAA, for that matter, with a 3080 at 4K). With any luck, the same will be able to be said for FSR4; with FSR3 at 4K Quality I still found the image instability, shimmer and disocclusion artefacts too distracting and immersion-breaking.

Naturally, if you have excessive GPU power on tap relative to the rendering task at hand, DLAA/FSR Native AA etc., or DLDSR/supersampling (whatever the in-game menus might call it), is such gorgeous icing on the cake.
 
Careful saying such things around here! There are still people who somehow believe that it's not possible to get better IQ than rendering at a given panel's native resolution :kookoo:

Personally, running 4K, DLSS Quality is so often either a match for or better than native + TAA that it's just not worth the FPS drop, the IQ shortcomings of subpar TAA, or the increased heat and power consumption to justify it (or to justify the performance cost of DLAA, for that matter, with a 3080 at 4K). With any luck, the same will be able to be said for FSR4; with FSR3 at 4K Quality I still found the image instability, shimmer and disocclusion artefacts too distracting and immersion-breaking.

Naturally, if you have excessive GPU power on tap relative to the rendering task at hand, DLAA/FSR Native AA etc., or DLDSR/supersampling (whatever the in-game menus might call it), is such gorgeous icing on the cake.

Some prefer TAA issues over DLSS issues; neither is perfect, and it also comes down to the game the person is playing.

I personally can't stand TAA and believe it's the bane of modern game development, but honestly, I only really like DLSS at 4K. At 1440p, which is what most people likely game at, the artifacts are too obvious, and 1080p is a lost cause to my eyes.

I'm hoping this new DLSS changes that.
 
Some prefer TAA issues over DLSS issues; neither is perfect, and it also comes down to the game the person is playing.

I personally can't stand TAA and believe it's the bane of modern game development, but honestly, I only really like DLSS at 4K. At 1440p, which is what most people likely game at, the artifacts are too obvious, and 1080p is a lost cause to my eyes.

I'm hoping this new DLSS changes that.
I think there's a lot to be said for which artefacts, or let's say image quality shortcomings, individuals are sensitive to. I can't stand shimmer, garbling/pixelation, or screen-space reflections disappearing with vertical camera movement (barf). What's much less apparent to me is ghosting (unless it's absolutely egregious), and it's not that 4K DLSS Quality is soft, but I can easily tolerate some softness if I need to drop the DLSS setting to keep other visual settings intact. And some arguments against go like "haha, so you're playing at 1080p" - well, actually just run 1080p and tell me which one looks a LOT better lol.

What DLSS does exceptionally well, imo, is overall image stability, and secondarily, for me, the lack of shortcomings that draw my eye to them and break immersion.

From what I've seen of AMD's research project / FSR4, it would appear they have come a hell of a long way toward solving the issues that I cannot tolerate.
 
I think there's a lot to be said for which artefacts, or let's say image quality shortcomings, individuals are sensitive to. I can't stand shimmer, garbling/pixelation, or screen-space reflections disappearing with vertical camera movement (barf). What's much less apparent to me is ghosting (unless it's absolutely egregious), and it's not that 4K DLSS Quality is soft, but I can easily tolerate some softness if I need to drop the DLSS setting to keep other visual settings intact. And some arguments against go like "haha, so you're playing at 1080p" - well, actually just run 1080p and tell me which one looks a LOT better lol.

What DLSS does exceptionally well, imo, is overall image stability, and secondarily, for me, the lack of shortcomings that draw my eye to them and break immersion.

From what I've seen of AMD's research project / FSR4, it would appear they have come a hell of a long way toward solving the issues that I cannot tolerate.

It's pretty great on my C1/G2, but ever since I swapped to my G8, I only use it if I have no choice and need the extra FPS, which is thankfully rare...
 
Not entirely sure it belongs here, but there are rumors that AMD was preparing a huge GPU (MCM, 2x 9070 XT dies) but cancelled it when they got wind of the 5090 specs. They thought that, based on the specs, it would outperform their high-end offering. Now that the 5090 has turned out to be average, I guess they're banging their heads against the wall at AMD HQ.

AMD never loses an opportunity to lose an opportunity.
 
Heh, I have a 980 Ti and will just keep that. I lost interest in games. However, a new Threadripper CPU would be nice!
 
Not entirely sure it belongs here, but there are rumors that AMD was preparing a huge GPU (MCM, 2x 9070 XT dies) but cancelled it when they got wind of the 5090 specs. They thought that, based on the specs, it would outperform their high-end offering. Now that the 5090 has turned out to be average, I guess they're banging their heads against the wall at AMD HQ.

AMD never loses an opportunity to lose an opportunity.

Yeah, if they had just made a monolithic chip the size of the 7900 XTX, they likely would have competed decently, assuming the leaked performance is accurate - around 500 mm².
 
To be fair, when you're driving on the highway and see cars coming straight at you and start shouting "hey, why are these idiots all going the wrong way", odds are (high ones) that you're the one in the wrong.
Someone buying the product everyone buys hardly makes them a fanboy, but buying the product no one cares about really does.
I don't think there's anything wrong with buying whatever product. It's just money spent, and really, it shouldn't be important to people 'what the others do'. That herd mentality is the core of many of today's problems, if you ask me. Multinationals thrive on camp mentality, because they use it to create the herd, and then all they have to do is manage that single herd, not individual customers. That's what influencers do for them now, and YouTubers, and tech sites, many of them oblivious to what they're helping others achieve.

And on the receiving end, the herds are also oblivious to what's being done to them; they are being manipulated, and it's that same herd saying what you say up there: 'you're the one in the wrong'. But it might just be that the individual is the only one seeing it right. Not always, but it happens more often than you might think. We have countless examples in history where whole countries placed themselves on the wrong side of it. WW2 rings a bell, or the current situation in Russia. Are the few who stand up against that propaganda the ones in the wrong? Hmmmmmm :)
 
Not necessarily, if they don't see it as being lucrative enough. AMD and Intel are riding the AI bandwagon, too. Nobody cares about gamers.


They're obviously not trying if all they have is "more fake frames" and no data to offer on raw performance. The 5070, for example, has almost the same core count as the 4070. I doubt it'll be a lot faster in pure raster. 4070 Super level, perhaps.

The only thing they're upgrading is the x90 level because that's where people seriously interested in AI are shopping, and like I said, AI is where the big money is.
Just seen a UK tiktoker post up telling everyone to buy a 5070 because it matches the 4090. :laugh:
 
Just seen a UK tiktoker post up telling everyone to buy a 5070 because it matches the 4090. :laugh:
Brace yourselves guys. Cheap used 4090s are coming! :roll:
 
Just seen a UK tiktoker post up telling everyone to buy a 5070 because it matches the 4090. :laugh:

You gotta give it to Nvidia. They don't just sell graphics cards - they sell dreams. In a good night's sleep you can get them for free, but these masters of marketing wizardry will have you buying them.
 