
Are people planning an upgrade?

I'm a fan of hardware and progress in general, so as long as both are offering that in their own way, awesome.
I'll drink to that. :toast:

I'm also a believer that everybody has different software and hardware needs which, if we're true geeks, we need to respect and understand. When someone asks me for advice and help building a PC, I always look at the person to see their needs. If they care about RT and DLSS, of course I'll get them a GeForce. If they don't, I'll get them whatever is cheap and available. I built a mini PC for a friend's daughter with an RX 6400 about a year ago and she loves it. I've also built PCs for colleagues with Nvidia GPUs. There is no right or wrong. Whoever thinks there is, is the fanboy, not me.
 
Definitely no plans to upgrade at this point. I'm very happy with my current hardware. I've mentioned it before, but the upgrade to my 7900 XTX blew me away with its performance, and even if it doesn't get FSR4, I'm happy with its raw raster power.

At worst I'll be expanding the storage a little as I get more games.
 
The games I own? My system still happily puts out decent fps and settings… the 9900K still purrs, and as for the GPU, I'll just think of my 3090 as an RTX 5060 Ti.

Intel's offering is expensive for lower perf than the previous gen.
AMD's 9800X3D is nice, but it's gonna mean a complete change of system.
Nvidia's 50 series… sure, it's nice to have a new shiny toy, but I think my 3090 can do another 2 years.

So it's 2026 or even 2027… I was planning on this year.
I was gonna upgrade the RAM, CPU and mobo, but I'll wait.

Unless some parts decide to go kaboom; then I'll tank it with my laptop and Steam Deck.
 
Nope. My backlog is hella long, and I don't want to sell a kidney to go from 40 fps to 55 fps at 4K DLSS Quality.
An increase from 40 fps (below the VRR range of most adaptive sync / G-Sync Compatible monitors) to 55 fps is a major improvement with respect to variable refresh rate.

The "golden" range is 48-60 Hz for standard VRR monitors (60 Hz 4K monitors, or cheap ones)… so even raising it to 48 would be a major improvement (with minor dips to 44).
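The range logic above can be sanity-checked in a few lines. This is a toy sketch: the 48-60 Hz window is taken from the post as typical for a basic 60 Hz 4K VRR monitor; real windows vary per panel, and some drivers cover the low end with LFC.

```python
# Toy classifier for the VRR ranges discussed above. The 48-60 Hz
# window and the example framerates come from the post; actual VRR
# windows differ per monitor, and LFC can cover the low end.

VRR_MIN, VRR_MAX = 48, 60  # assumed window for a basic 60 Hz VRR panel

def vrr_state(fps: float) -> str:
    """Say how a given framerate interacts with the VRR window."""
    if fps < VRR_MIN:
        return "below range: judder/tearing unless LFC kicks in"
    if fps <= VRR_MAX:
        return "in range: VRR matches refresh to framerate"
    return "above range: capped, tearing, or VSync, depending on settings"

for fps in (40, 44, 48, 55):
    print(f"{fps} fps -> {vrr_state(fps)}")
```

So the jump from 40 to 55 fps crosses from outside the window to comfortably inside it, which is why it feels like a bigger improvement than the raw numbers suggest.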
 
It's not about DLSS being a thousand times better. It's about FSR being much worse, much later to the party, and much harder to update for an end user. With DLSS, you just download a new .dll and you're rocking it. With FSR, however, you're mostly out of luck, because only the game developers know where they buried it.

Probably your standards are extremely low. I have limited attention because of head injuries, and the image tends to blur on its own in motion. Even considering that, I see A WHOLE LOT of difference: whilst DLSS is mostly okay-ish in 70+% of games and considerably bad in the rest, with FSR there's just one single game where it didn't make me want to throw my monitor out of the window. And that's it. Total massacre. FSR4, however, proved to be much better, but still, is it really worth buying a brand new video card..?

I guess to each his own. I don't know if my standards are extremely low, as I'm pretty sensitive to motion clarity and jaggies, though I guess I'm less bothered by shimmering, and I'm one who universally thinks RT looks like crap and tanks performance for fuck-all reason. When I use DLSS or FSR, either they both work or they both don't. I had issues with both DLSS and FSR in Horizon 5 (lighting) and Warzone (dropped weapons had moving dots around them or something) and a few other games.

The DLLs can be downloaded for FSR 3.1 too; most new games allow it. They were late to the party on that one, though.

The 290X was released 11 years ago and is no longer supported. Last drivers: mid-'22. A resounding almost 9 years of support. Massive TGP. Extreme size.
The 970 was released 10 years ago. It still receives updates. Half the power consumption. Overclocks better.

View attachment 379355 · View attachment 379356 · View attachment 379357
The performance difference is insignificant. The GTX 970 was cheaper at launch; today's price difference is also insignificant.

Literally any RTX GPU makes fun of AMD GPUs in RT titles, which are becoming more and more of a thing. The only exception arises when we're talking 8 GB GPUs; these suck. Even if they're NVIDIA, they still suck. But the 3080 Ti aged leagues better than the RX 6900 XT did. They cost around the same.

FTFY

Mate, I don't know if you've actually used the 290X and the GTX 970, but we were talking about longevity, not launch reviews, and the 290X aged massively better than the 970 and 980 ever did. The GTX 970 was released one year after the 290X, and reviews praised it to the moon and back because it matched the 290X while consuming 160W instead of 260W. It was almost as if that extra 100W was the end of the world and all that mattered, without once asking which arch was more future-proof. Thankfully I didn't fall for that BS (and had a custom loop anyway), kept my 290X, and still have it to this day. The GTX 970? Forgetting the absolute crock of lies that Nvidia pulled with the 3.5GB shenanigans, it was equal to the 290X at launch, 10% slower a year and a half later, and it continued to fall off as time wore on. It still having driver support means fuck all if its performance started tanking ages ago. I mean, TPU graphs don't lie; these are from the GTX 970 launch and a year and a half later.

Know what's even funnier about this longevity argument? The 290X was supposed to be the GTX 780's competitor, not the 9xx series'. Guess how the GTX 780 aged compared to the 290X. Let me give you a hint: don't even bother looking it up, because the 290X aged 2x better than the 780 ever did. It literally aged better than both the previous and the following Nvidia generation of its time.

I also have the 6900 XT and a 3090, and I'm not sure what you mean when you say the 3080 Ti aged better. I run benches, and sure enough, in raster they are still about equal in general. RT is where the 3090 pulls ahead, but we knew that at launch.

Your last line makes zero sense; I'll ignore it.
 

Attachments

  • RP.jpg
Sure, but it's been worse than ever lately. :(

Or what if I talk about the fact that I've had lots of Intel, Nvidia and AMD hardware in my life, that I was on Intel + Nvidia for roughly 7 years before I built my current rig, and that my two HTPCs are both Intel + Nvidia... Oh no, people don't want to hear about such things because I'm an AMD fanboy, right? Or am I an Intel and Nvidia fanboy? I can't follow these things anymore. I guess I'm everything. :D
You are a value fanboy. Join the club and be proud of it. :toast:
 
First prices just came in (EU): €2.2k for the Asus Astral. Very decent price considering it's the highest-end model, and with the Asus tax.



And then I realized that's for the 5080. The 5090 is at €3.4k. :roll:


I guess I'll stick to my 4090. 20 fps is fine.
 
At 8K native? Which game? :eek:
Nah, anything that looks very pleasing to the eyes is kinda slow. I'd really like to use DLDSR to downscale from a higher res, with DLSS Q on top of that upscaling to the higher res, but the GPU doesn't have the oomph in heavy games. Looks amazing when I can hit 80+ fps, but a lot of games can't. Anything with any decent amount of RT is out of the question with DLDSR, and then I have to drop to plain old DLSS Q.
 
Nah, anything that looks very pleasing to the eyes is kinda slow. I'd really like to use DLDSR to downscale from a higher res, with DLSS Q on top of that upscaling to the higher res, but the GPU doesn't have the oomph in heavy games. Looks amazing when I can hit 80+ fps, but a lot of games can't. Anything with any decent amount of RT is out of the question with DLDSR, and then I have to drop to plain old DLSS Q.
That's mad considering that I'm just about starting to feel my 6750 XT struggle. Strange how different our expectations are, but those differences are what make life great, eh? :)
 
That's mad considering that I'm just about starting to feel my 6750 XT struggle. Strange how different our expectations are, but those differences are what make life great, eh? :)
Well, for example, Stalker 2 with DLDSR + DLSS Q drops to the 40-50 range. Even with FG it looks smooth, but the delay is noticeable. So I basically have to give up on DLDSR completely and use just DLSS Q to get a decent enough framerate. Once you realize how insanely crisp DLDSR looks, oh boy :D
 
Well, for example, Stalker 2 with DLDSR + DLSS Q drops to the 40-50 range. Even with FG it looks smooth, but the delay is noticeable. So I basically have to give up on DLDSR completely and use just DLSS Q to get a decent enough framerate. Once you realize how insanely crisp DLDSR looks, oh boy :D

You poor thing, how can you even survive not being able to use DLDSR.... Such a tragedy.


I get you though, DLDSR + DLSS Quality looks amazing.
 
Well, for example, Stalker 2 with DLDSR + DLSS Q drops to the 40-50 range. Even with FG it looks smooth, but the delay is noticeable. So I basically have to give up on DLDSR completely and use just DLSS Q to get a decent enough framerate. Once you realize how insanely crisp DLDSR looks, oh boy :D
Perhaps. People around here know how harshly I like to speak against DLSS (and upscaling in general), but I've never tried DLDSR, so I can't comment.
 
Perhaps. People around here know how harshly I like to speak against DLSS (and upscaling in general), but I've never tried DLDSR, so I can't comment.

It's a combination of supersampling and DLSS, almost cheating... you are basically rendering at around 4K, but it looks similar to 8K.

Other than on the 4090 and soon the 5090, it's pretty heavy. I use it mostly for older games that support DLSS.
 
Perhaps. People around here know how harshly I like to speak against DLSS (and upscaling in general), but I've never tried DLDSR, so I can't comment.
Well, if you hate DLSS, you'll love DLDSR. It's literally the exact opposite: it takes a higher internal res and downscales it to your monitor.
 
Well, if you hate DLSS, you'll love DLDSR. It's literally the exact opposite: it takes a higher internal res and downscales it to your monitor.
Now that actually sounds cool. Does it compare to AMD's VSR in any way? It's weird that the media is full of DLSS vs FSR tests, but not so much of this.
 
Now that actually sounds cool. Does it compare to AMD's VSR in any way? It's weird that the media is full of DLSS vs FSR tests, but not so much of this.
Yes, in a way. VSR is akin to Nvidia's DSR, which is basically just a stupid normal downscaler. Being normal stupid downscalers, they are both super heavy, much heavier than DLDSR. VSR / DSR is literally just playing at a higher resolution than your monitor, with the performance penalty that comes with it. DLDSR, on the other hand, uses the same algorithm as DLSS (in reverse, I guess), so instead of having to render at 4x your native res (like VSR / DSR) and take 4 times the performance impact, DLDSR can land at the same image quality with only 2.25x the render resolution.

TLDR: say you have a 1080p screen. With VSR / DSR you need to render at 4K to get the same quality as 1620p with DLDSR. The magical part of the equation is that you can add DLSS into the mix to get into some Inception-style kind of mushrooms.

E.g. it's one of the many reasons I'm saying native is dead. DLDSR + DLSS Balanced looks better than native with just a tiny perf impact.

Just an example. As you can see, there is a performance penalty even with DLSS Balanced (so I guess DLSS P would equal native in framerate), but the image quality is oh my god better.

dldsr.JPG
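The resolution math from the post above can be sketched roughly like this. It's a toy calculation, not anything official: DSR/DLDSR factors multiply pixel area, and the DLSS Quality ratio used here is the commonly cited ~2/3 per axis.

```python
# Rough sketch of the render resolutions involved. Factors here are
# the commonly cited ones: DSR/DLDSR multiply pixel *area*, while
# DLSS presets scale each *axis* (Quality ~ 2/3).

def render_res(width, height, area_factor=1.0, axis_scale=1.0):
    """Apply a DSR/DLDSR area factor, then a DLSS per-axis scale."""
    axis = area_factor ** 0.5
    return round(width * axis * axis_scale), round(height * axis * axis_scale)

native = (1920, 1080)
print(render_res(*native, area_factor=4.0))                  # DSR 4x   -> (3840, 2160)
print(render_res(*native, area_factor=2.25))                 # DLDSR    -> (2880, 1620)
print(render_res(*native, area_factor=2.25, axis_scale=2/3)) # + DLSS Q -> (1920, 1080)
```

So on a 1080p screen, DLDSR 2.25x plus DLSS Quality ends up shading roughly a native pixel count while the final image is downscaled from 1620p, which is the "inception" trick the post describes.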
 
Well if you hate DLSS, youll love DLDSR. It's literally the exact opposite. It takes a higher internal res and downscales it down to your monitor.

I think Ubersampling in The Witcher 2 was my first time using anything similar... I remember barely getting OK framerates with GTX 680 SLI, lol... I actually tried to get it working well with 7970 CrossFire, but it was too buggy.
 
So on one hand you're saying both cards are identical, and now you're moving the goalposts again. What you really wanted to ask is whether I like Nvidia's or AMD's features better, and whether that would lead me to choose Nvidia at the same price; the answer to that is yes, currently. Although AMD would only need to make FSR comparable to wipe that advantage out, and FSR4 is looking pretty good.

My bad, I didn't put my idea across exactly in the initial post, but still, it's a bit obtuse to not realize what my point is even after I explained it. People didn't buy AMD when they were cheaper with more VRAM; would they ever buy them if they were the same price? No. It's a fact.

Just look at the collection of posts here, everyone is asking AMD to make their cards cheaper, why?
 
Just look at the collection of posts here, everyone is asking AMD to make their cards cheaper, why?
Because if Nvidia is expensive, if they charge $550 for a 12 GB card in 2025, then one can at least hope for AMD to bring fairer prices to the mix.

Yes, in a way. VSR is akin to Nvidia's DSR, which is basically just a stupid normal downscaler. Being normal stupid downscalers, they are both super heavy, much heavier than DLDSR. VSR / DSR is literally just playing at a higher resolution than your monitor, with the performance penalty that comes with it. DLDSR, on the other hand, uses the same algorithm as DLSS (in reverse, I guess), so instead of having to render at 4x your native res (like VSR / DSR) and take 4 times the performance impact, DLDSR can land at the same image quality with only 2.25x the render resolution.

TLDR: say you have a 1080p screen. With VSR / DSR you need to render at 4K to get the same quality as 1620p with DLDSR. The magical part of the equation is that you can add DLSS into the mix to get into some Inception-style kind of mushrooms.

E.g. it's one of the many reasons I'm saying native is dead. DLDSR + DLSS Balanced looks better than native with just a tiny perf impact.

Just an example. As you can see, there is a performance penalty even with DLSS Balanced (so I guess DLSS P would equal native in framerate), but the image quality is oh my god better.

I can't see much in the screenshot because of Youtube's compression, but I can imagine what it's like. I've played around with super resolutions on both AMD and Nvidia, I just don't use it because of the performance penalty, and because it messes with windows on two screens (at least on Windows - I haven't tried it on Linux).

But this brings the big question: if DLDSR can lower the performance penalty over regular DSR while still providing a much crisper than native image, then why is DLSS celebrated like the next best thing since the wheel was invented, and not this? This is actually awesome tech that no one talks about for some reason. I don't get why.
 
My bad, I didn't put my idea across exactly in the initial post, but still, it's a bit obtuse to not realize what my point is even after I explained it. People didn't buy AMD when they were cheaper with more VRAM; would they ever buy them if they were the same price? No. It's a fact.

Just look at the collection of posts here, everyone is asking AMD to make their cards cheaper, why?

Are you trying to say nobody should buy AMD dGPUs anymore, that they should leave the market and just let Nvidia do whatever they want?

Me personally, I want the same from them as I do from any hardware company: to offer us better products at the same price, or at the very least better price-to-performance each generation.

I don't understand why you're so obsessed with the AMD vs Nvidia dGPU market and what others should or shouldn't buy. Most people buy Nvidia, but it wouldn't be a good thing if that were the only option.

AMD GPUs do tend to be cheaper and offer more VRAM; to some that's appealing, and what the green team offers in their price range isn't. Nobody needs to justify their purchases to anyone but themselves.

Everyone has biases; it's human nature. There isn't anything wrong with having a preference for a specific company, even if the reason only makes sense to them.

You're obviously dead set on purchasing an Nvidia GPU. Worry about how good the 50 series is, and hopefully Nvidia offers you something that isn't terrible in whatever price range you decide to buy in, not about what someone who leans AMD may or may not purchase.

We at least know that if you like interpolated frames, they will be very good at that.

But this brings the big question: if DLDSR can lower the performance penalty over regular DSR while still providing a much crisper than native image, then why is DLSS celebrated like the next best thing since the wheel was invented, and not this? This is actually awesome tech that no one talks about for some reason. I don't get why.

Because when both are used together, you take a small hit to performance vs native and it looks substantially better.

If you just use DLDSR 2.25x, you are still going to halve your frame rate for something that looks much sharper, but if you add DLSS, you can get similar image quality for a much smaller 10-15% hit.
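A back-of-envelope version of that cost argument, assuming shading cost scales linearly with rendered pixels. That's a simplification: the real 10-15% figure above includes the upscaler's own fixed overhead, which this toy model ignores, and the DLSS per-axis ratios (Quality ~2/3, Balanced ~0.58) are the commonly cited ones.

```python
# Relative shading cost vs native, assuming cost ~ rendered pixel count.
# area_factor is the DSR/DLDSR area multiplier; axis_scale is the DLSS
# per-axis ratio (Quality ~ 2/3, Balanced ~ 0.58).

def relative_cost(area_factor=1.0, axis_scale=1.0):
    return area_factor * axis_scale ** 2

print(relative_cost(2.25))        # DLDSR alone: 2.25x pixels, so roughly half the fps
print(relative_cost(2.25, 2/3))   # DLDSR + DLSS Quality: ~1.0x, near-native cost
print(relative_cost(2.25, 0.58))  # DLDSR + DLSS Balanced: ~0.76x pixels
```

Which lines up with the post: DLDSR 2.25x alone roughly halves the framerate, while adding DLSS claws most of that back, leaving only the upscaling overhead as the net hit.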
 
Everyone has biases; it's human nature. There isn't anything wrong with having a preference for a specific company, even if the reason only makes sense to them.
I'm biased towards generational improvements, fair prices and open standards. If that makes me a fan in someone's eyes, it says more about that person than it says about me.

And I really don't care what other people are buying. I buy stuff to satisfy my own needs, not theirs (let alone follow trends, which is dumb as heck).

Because when both are used together, you take a small hit to performance vs native and it looks substantially better.

If you just use DLDSR 2.25x, you are still going to halve your frame rate for something that looks much sharper, but if you add DLSS, you can get similar image quality for a much smaller 10-15% hit.
Then DLSS should be celebrated together with DLDSR, not on its own.

With DLSS, you get a blurrier image and a bit more performance. Woohoo! :sleep:
But if you add DLDSR into the mix, you get a crisper image, but a smaller hit to performance than you would with regular DSR and no DLSS. Now, that's actually something!
 
I'm biased towards generational improvements, fair prices and open standards. If that makes me a fan in someone's eyes, it says more about that person than it says about me.

And I really don't care what other people are buying. I buy stuff to satisfy my own needs, not theirs (let alone follow trends, which is dumb as heck).


Then DLSS should be celebrated together with DLDSR, not on its own.

With DLSS, you get a blurrier image and a bit more performance. Woohoo! :sleep:
But if you add DLDSR into the mix, you get a crisper image, but a smaller hit to performance than you would with regular DSR and no DLSS. Now, that's actually something!

Because it isn't practical on most GPUs or in games with heavy RT... Nvidia would rather show Cyberpunk for the 18th time with interpolated frames than talk about the cool shit that actually improves image quality.

It's mostly a perk of owning a 4090; I don't think I'd use it on a lesser GPU.

RT used to be their big talking point; now it's frame gen X9000...

That demo they showed was impressive, though. I wish they would allow us to download it like they did with their older tech demos.

I've been using DLDSR in combination with DLSS since it launched in 2022. I'm not even sure most are aware you can use both together.
 
Interesting that the 9070 XT's rectangular die is pretty much the same size as the 4080 Super's, with the same bus width as well. It's safe-ish to assume it'll be close enough to it in raster and RT, with the gap in RT being larger.

Price it at $450-475 and it'll surely sell a boatload, and I might just snag one to play with the last of the RDNA cards. But please, for the love of god, AMD, stop with the stupid launch prices and get your fucking drivers ready and baked in before launch. Launch 2 months later, who cares, just don't f with the drivers or the release price.
 
Because it isn't practical on most GPUs or in games with heavy RT... Nvidia would rather show Cyberpunk for the 18th time with interpolated frames than talk about the cool shit that actually improves image quality.

It's mostly a perk of owning a 4090; I don't think I'd use it on a lesser GPU.

RT used to be their big talking point; now it's frame gen X9000...

That demo they showed was impressive, though. I wish they would allow us to download it like they did with their older tech demos.
The most advanced tech has always been made with the most advanced / future hardware in mind. Just like Ubersampling in The Witcher 2. Nobody was meant to use it when the game came out.

I've been using DLDSR in combination with DLSS since it launched in 2022. I'm not even sure most are aware you can use both together.
I'm not sure most are aware that DLDSR even exists. Most just go "look at my dee-el-es-es powah, I've got a billion frames, bah". :slap:
 