
So what does the Ada release by Nvidia mean to you?

Has the Ada release by Nvidia been good, in your opinion?


Total voters: 101
Status
Not open for further replies.
That's the reality we live in, though: Nvidia's $h!+ is worth more than AMD's $h!+.
I don't care whose shit is worth what. I'm not gonna pay extra for features that I don't care about, features that make my 1080p image look like crap, features that are for streamers, features that are supposed to justify simple cards with small GPUs and little VRAM being sold at premium prices, features that I call gimmicks because they are just that. Simple.

Edit: typo
 
A noble and just sentiment. I hope you'll enjoy the card you currently use just as much in ten or twenty years, since you'll never be able to buy another.
 
I remember trying DLSS in F1 2021 at 1080p (IIRC the track was Australia), and the resulting picture quality was... with a little bit of exaggeration, traumatic.
I tried DLSS in Forza Horizon 5 yesterday and found the result acceptable. Still prefer native, though. DLAA(?) is also a nice one, but I don't need it. (I'm still on 1080p High at 144 Hz; Ultra is a bit unstable for whatever reason, and I don't really need DLSS here. I tried it just for lolz.)

I ended up completely not caring about CP2077, Hogwarts and Starfield, so no super-heavy games here.
Every game I play for now can do 1080p High/Ultra at >75 fps, so that's very, very fine, but I plan to go 4K around next month.
It will be interesting to see how my 3070 with 8 GB of VRAM handles 4K native. I should be fine with 45 fps if I don't end up spoiled by high refresh rates; maybe I'll have a taste of DLSS at 4K, but I don't think I'll like it, and stability may also throw a spanner in the works here.
If that's the case... I'll probably curse the stupid pricing / possibly wrong amount of VRAM in the 4060 Ti~4070 Ti range, and try AMD (likely the 7800 XT) again.
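For a rough sense of why 4K stresses 8 GB sooner than 1080p does, here's a back-of-the-envelope sketch; the 4-bytes-per-pixel and triple-buffering figures are illustrative assumptions, and in real games textures and G-buffers dominate VRAM usage:

```python
# Back-of-the-envelope: colour-buffer memory alone at common resolutions.
# Assumes 4 bytes per pixel and a triple-buffered swapchain (illustrative
# numbers only; real VRAM usage is dominated by textures and G-buffers).

def render_target_mb(width, height, bytes_per_pixel=4, buffer_count=3):
    return width * height * bytes_per_pixel * buffer_count / 1024**2

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of colour buffers")
```

The point isn't the absolute numbers but the ratio: 4K has exactly 4x the pixels of 1080p, so every per-pixel buffer the engine allocates scales by the same factor.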

The 4080 and 4090 are super impressive, but I'm not shopping at that kind of price point.
Everything below the 4080 is still at the wrong price after all these months, IMO. So I voted no here.

EDIT: I somehow had about the same amount of instability with my last card (a 5700 XT, and I think that was after AMD sorted out most of the driver issues) as with my current 3070. The card before that (a 1070) was a tad better, IIRC. So, barring bugs big enough to make the news, the driver stability argument is not something I'd consider this gen.
 
A noble and just sentiment. I hope you'll enjoy the card you currently use just as much in ten or twenty years, since you'll never be able to buy another.
Who is this meant to reply to?
 
Yours, given that I very much doubt we'll see cards without the features you detest and refuse to buy in the lifetimes of anyone here.
 
He will buy AMD cards with the same (but worse) features as long as they're cheaper and offer a little bit more raster performance; nothing wrong with that.

Even I picked up a 6700XT because what Nvidia offers at 300 usd is horrible.

I'm sure the 8800XT and 9800XT (assuming they keep the naming, which AMD seems allergic to) will be good cards. Who knows what price segment they'll belong to, since AMD brands its 8-series cards anywhere from the low end all the way to the high end depending on the generation.

Edit: I should have added that there's no sense paying extra for features you don't use or don't like.
 
Exactly.

I never said I'd never buy a card with DLSS, Tensor cores, or any other gimmick/feature that I'm not interested in. I still have a 2070 on the shelf, after all. What I said was that I'm not willing to pay an extra penny for said gimmicks/features if there's a cheaper alternative that suits my needs just fine (especially if that cheaper alternative comes with more VRAM and/or raw performance).

Speaking of Tensor cores, there's actually one use of them that I find quite cool. It's called Nvidia Canvas. If someone figures out how to run it on RDNA 3 AI cores, let me know, please. :D
 
Originally I was going to leave the AMD part out, because I think you would buy an Nvidia/Intel alternative priced the same with the same amount of VRAM and similar performance. But let's be real: I doubt things will be all that different in the next generation, or the one after that.
 
So, as stated, but with no preset theoretical statement.

I want your opinions; keep them real, polite and non-flamey! I'll add mine after page 5.
I would have loved to buy a 4080.. almost did. But it was hella cash after tax.. like 1800 Canadian Pesos. Saved 800 by buying the 4070 Ti :D

Do I regret it? Maybe just for the benchmarks.. but we got to go on a weeks vacation in the mountains instead :)

And.. I still don't trust AMD after my last experience with their GPUs. These guys talked me into buying AM4 when I was ready to build a couple of years ago.

And even now, AMD is doing great with GPU performance.. but it isn't all sunshine and lollipops with them, even for just daily stuff.

Having been hanging around with it for most of a year now, it kinda just "works". Good and bad, I guess. The good is that I've got better things to do than guinea-pig for these companies for free, in hopes that they'll eventually fix stuff.

The bad is that it's just not that interesting. Nothing really new about how the GPUs fundamentally work; just more of the same, a bit better. Clocks/power/volts follow the same rules from years ago. No real revolutionary stuff. I've tried RT a few times, and compared to things like DCS in VR, I'm not sure RT deserves to be described as revolutionary. DLSS is solid as usual, but none of that's new if you're not using frame gen.

The 4070 was a little interesting for its efficiency and that compact FE cooler. The 4090 is the 4090. Pretty telling that "kinda small" and "really expensive" are the only real highlights. The 4070 Ti gets a pass because I'm biased, but it's just a more tolerable 3070 Ti, speaking as a previous 3070 Ti owner. No real interesting low-profile/slot-powered stuff either (that GB 4060 definitely does not count).
 
Maybe I'm in the minority, but I do hope Nvidia continues to push RTX Remix. I'm pretty excited about the Half-Life 2 version and hope other games get the treatment. I thought Quake 2 was pretty neat, and it made the remaster less exciting for me.

Even Apple is pushing RT now. Not sure it makes sense on a phone, but cool, I guess.
 
Personally, I was fully set on an XTX, but there were none to be had on launch day or the next few days, and I needed a card. Also, it was just $300 cheaper at launch than the 4090, which soured the whole thing further. Hence, a 4090 it was.
 
I played CP with the DLSS 2 update, so yeah... DLSS is crap at 1080p. It gives you some extra frames when you really need them... like when you're playing on a 2070, for example. ;) Upscaling is an aid to get the performance that you don't have. Nothing more. That's my opinion and it's final. I've had this argument a million times over and I don't intend to do it again.


Just the price. And that's why I think the entire 40-series sucks.

Upscaling is not great at low resolutions, nothing new. 1080p is a joke anyway. My phone has run at higher than 1080p for many years. 1440p is the bare minimum for me on PC. However, DLDSR and DLAA can improve 1080p visuals immensely. AMD has no features that even come close to DLDSR and DLAA.

DLSS is not only for performance; it works as great AA, and if you only want visual quality, there's DLAA, which beats any other AA method on the market with ease. DLAA is a DLSS preset now, meaning DLSS/DLAA can be used in 450+ games, and DLDSR can be used in all games.

FSR is even worse than DLSS at resolutions lower than 4K/2160p. DLSS at Quality is magic for 1440p users, while FSR is mediocre at 1440p regardless of preset and works best at 2160p only, where it still loses to DLSS.

Price, haha, yeah. AMD is cheaper for a reason: inferior features, inferior support and drivers, lower resale value. Nothing new. You get what you pay for. My last AMD GPU was a 5700 XT and it was horrible: the first 9 months were driver issues and black screen issues plus hot VRMs (simply google it). It took AMD close to a year to fix it via software (with slightly degraded performance), and most less popular games ran like crap. It is very obvious that AMD spends its time optimizing for the games that get benchmarked a lot; when you leave those, performance is mediocre.

If you look at AMD's prices, they are high for what you actually get, considering how many years they are behind Nvidia on features. This is why AMD is losing more and more market share. Too many people had bad experiences with AMD and avoid them now, plus they want superior features or good ray tracing performance. People vote with their wallets, and AMD is closer to dropping below 10% market share than to reaching 20%, if you look at the Steam Hardware Survey.

I would not be surprised if Intel gobbles up some more of AMD's GPU market share over the next few years. Nvidia dominates the high-end market and has for years. This is probably why AMD won't release high-end GPUs with RDNA4 / the 8000 series: low sales. AMD needs to improve its features and RT performance and focus on performance per dollar. However, this costs a lot of money, and AMD doesn't have it. They spend most of their R&D funds on the CPU/APU segment, as they should, because the GPU segment is barely profitable for them.
 
Originally I was going to leave the AMD part out, because I think you would buy an Nvidia/Intel alternative priced the same with the same amount of VRAM and similar performance. But let's be real: I doubt things will be all that different in the next generation, or the one after that.
Totally! :) I have nothing against (upscaling) tech except that I don't need it, don't want it, and don't think anybody should need it at 1080p anyway.

Out of the 8 GPUs I currently own, only 3 are AMD, and out of the 7 CPUs I have, 5 are Intel. I buy whatever piques my curiosity; I couldn't care less what colour the box is. The reasons I never had, and am not interested in, an Ampere or Ada card are: 1. they're way overpriced, and 2. they don't offer anything on top of Turing that I'm interested in, and I do have a 2070. I've also wanted to buy an A770 to have in my collection, but it's too expensive, especially for that purpose alone. I might get an A380 one day, though, just for shits and giggles.
 
I'm with you, but I hope you realize how myopic it is to call it useless because you are playing at 1080p. It's just useless for your very particular use case.

I find them extremely useful because I game on a laptop a lot. On 14" screens, FSR looks fine because the screen is so small you don't notice its shortcomings, and there are many games I just couldn't play on my laptop without it.
 
Upscaling is not great at low resolutions, nothing new. 1080p is a joke anyway. My phone has run at higher than 1080p for many years. 1440p is the bare minimum for me on PC.
I disagree. 1080p or higher is useful on a phone because you look at it from literally inches away; you don't want to be staring at pixel chunks from that close. You sit further away from a monitor, so if a 24" 1080p panel was good enough 5-8 years ago, it's damn good enough now, too. If you're thinking of bigger sizes, then you should probably look at 1440p, for sure.
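To put some numbers on the viewing-distance argument, here's a minimal sketch computing pixels per degree of visual angle; the panel sizes and distances (~30 cm for a phone, ~60 cm for a monitor) are assumed typical values, not measurements:

```python
import math

# Angular pixel density: how many pixels fit in one degree of vision,
# given a panel's diagonal, resolution, and viewing distance.
def pixels_per_degree(diag_inches, res_w, res_h, distance_cm):
    ppi = math.hypot(res_w, res_h) / diag_inches       # pixels per inch
    distance_in = distance_cm / 2.54
    # width of one degree of visual angle at that distance, in inches
    one_degree_in = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * one_degree_in

phone = pixels_per_degree(6.1, 2340, 1080, 30)     # phone at ~30 cm
monitor = pixels_per_degree(24, 1920, 1080, 60)    # 24" monitor at ~60 cm
print(f"phone: {phone:.0f} px/deg, monitor: {monitor:.0f} px/deg")
```

What matters for perceived sharpness is angular resolution, not raw panel resolution, which is why the same 1080p pixel count reads so differently on a phone held close versus a monitor at arm's length.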

Price, haha, yeah. AMD is cheaper for a reason: inferior features, inferior support and drivers, lower resale value. Nothing new. You get what you pay for. My last AMD GPU was a 5700 XT and it was horrible: the first 9 months were driver issues and black screen issues plus hot VRMs (simply google it). It took AMD close to a year to fix it via software (with slightly degraded performance), and most less popular games ran like crap. It is very obvious that AMD spends its time optimizing for the games that get benchmarked a lot; when you leave those, performance is mediocre.

If you look at AMD's prices, they are high for what you actually get, considering how many years they are behind Nvidia on features. This is why AMD is losing more and more market share. Too many people had bad experiences with AMD and avoid them now, plus they want superior features or good ray tracing performance. People vote with their wallets, and AMD is closer to dropping below 10% market share than to reaching 20%, if you look at the Steam Hardware Survey.
What inferior support? What inferior drivers? What are you talking about? Oh wait, you're talking about the 5700 XT. That really wasn't a great card. That one really did have crap drivers, was overheating (even my Asus Strix, which was a botched series anyway), came with underpowered VRMs, etc. But guess what: times change. ;) RDNA 2 is just a brilliant series altogether. AMD and the card manufacturers learned from all the mistakes made with the 5700 XT and did everything right the next time. No driver issues, either. And it's not just the lack of issues: the GUI works much better than Nvidia's Windows 95-style crap, not to mention that Nvidia still hasn't fixed the age-old issue where some driver settings reset themselves to default after an update.

Condemning AMD based on the 5700 XT alone is like sending the kids and grandkids to prison for something their grandfather did in 1939. It reeks of narrow-minded thinking. Every brand has good and bad product series. Every single one.

I'm with you, but I hope you realize how myopic it is to call it useless because you are playing at 1080p. It's just useless for your very particular use case.
Except that 60.75% of gamers are still on 1080p, according to the Steam survey, so I don't think my use case is that special.

If you find good use of upscaling, good for you and enjoy. :)
 
60% of those Steam users are probably on a laptop and/or on integrated graphics. They don't care about games because they can't even run them.


With that said, and as I was telling you before regarding monitors, this is what you can do with DLSS.

[Screenshot: DLSS output compared against a 720p native image]
 

Nah, 1080p is absolute trash compared to 1440p and up. Horrible visuals at 1080p. 1080p can only be saved with DLAA, or DLDSR with 4K downsampling. Sadly, you can't use that.

AMD 100% has inferior support and worse drivers, plus it very often lacks day-one drivers for new games. Features are way worse in general. You keep denying this and saying you don't use or need them. Duh, you can't use DLSS/DLAA/DLDSR; that is why you hate them instead. It is like having an SDR TV and claiming HDR is a joke. FOMO in full force. You are in denial to justify your purchase. Human nature, I know.

The 5700 XT is one of AMD's best-selling GPUs of all time, along with the RX 480/470/580/570. Stop being clueless. The Radeon 6000 series sold like crap compared to these GPUs. Go take a look at the Steam Hardware Survey's top GPUs (which are dominated by Nvidia); the Radeon 6000 and 7000 series are barely present there.

Nvidia drivers don't reset after an update, sigh :laugh: :laugh:

FYI I have tested 6700XT, 6800XT and 7900XT as well.

Most 1080p users on Steam are using laptops with inferior hardware, meaning DLSS/FSR is highly relevant: you either get bad performance or find a balance.
Or they play eSports games on 360-500 Hz 1080p panels.

1080p for regular AAA PC gaming is absolutely horrible compared to 1440p and up. Terrible image quality. You must be half blind if you think 1080p is actually great in 2023. 1440p/144 Hz IPS panels are literally dirt cheap.

Hold on... You are ACTUALLY USING 1080p @ 60 Hz? This is a joke right? :D
 
Nah, 1080p is absolute trash compared to 1440p and up. Horrible visuals at 1080p. 1080p can only be saved with DLAA, or DLDSR with 4K downsampling. Sadly, you can't use that.
Yeah, I've been enjoying a trash gameplay experience for the last 7 years. Lol! :laugh:

AMD 100% has inferior support and worse drivers, plus it very often lacks day-one drivers for new games. Features are way worse in general. You keep denying this and saying you don't use or need them. Duh, you can't use DLSS/DLAA/DLDSR; that is why you hate them instead. It is like having an SDR TV and claiming HDR is a joke. FOMO in full force. You are in denial to justify your purchase. Human nature, I know.
I have a 2070 that I used for 2 years after I bought it. If I want to use an Nvidia feature, I can just pop it into my PC and use it. I also just bought a 7800 XT. If I had any FOMO, why didn't I buy a 4060 Ti? Or cough up a few extra bucks for a 4070? I honestly don't know why it hurts your ego so much that not everybody is an Nvidia fan, but I'd recommend facing this reality sooner rather than later.

Nvidia drivers don't reset after an update, sigh :laugh: :laugh:
I challenge you: go to the 3D settings in the driver, change the AA method to multisampling, update the driver and see for yourself. I actively use two Nvidia cards in my HTPCs, so I know what I'm talking about.

FYI I have tested 6700XT, 6800XT and 7900XT as well.
And I assume you weren't impressed. So what? Am I supposed to throw a tantrum just like you did, or what? Unlike you, I'm fine with whatever you like or dislike. I honestly don't care. ;)

Hold on... You are ACTUALLY USING 1080p @ 60 Hz? This is a joke right? :D
And that hurts your feelings because...?
 
Why would I force the AA method to multisampling in the drivers... like, seriously.

It does not hurt my feelings at all. I am just amazed that some people are still using 1080p at 60 Hz for desktop PC gaming, especially people on a hardware forum who just bought a new mid-range GPU made for 1440p.

I was impressed with the 6800XT and 7900XT in pure rasterization, but I missed DLSS, DLAA, DLDSR and Reflex, plus ray tracing was not usable at all because performance dropped like crazy and FSR looked too bad to make up for it.
 
60% of those Steam users are probably on a laptop and/or on integrated graphics. They don't care about games because they can't even run them.


With that said, and as I was telling you before regarding monitors, this is what you can do with DLSS.

[Screenshot: DLSS output compared against a 720p native image]
Why compare to a 720p native image? Does anyone even have a 720p monitor these days? 4K + DLSS should always be compared to 4K native, so that you know how much image quality you're sacrificing for the extra performance.

Why would I force the AA method to multisampling in the drivers... like, seriously.
It was just an example. There are other settings that reset, but I can't remember which. I'll check next time I update drivers on my HTPCs.

It does not hurt my feelings at all. I am just amazed that some people are still using 1080p at 60 Hz for desktop PC gaming, especially people on a hardware forum who just bought a new mid-range GPU made for 1440p.
Have you never played on a 1080p screen in your life? If you have, I'm sure it was good enough back then. ;) I just never felt the urge to upgrade, that's all. I bought a mid-tier GPU for longevity and curiosity, not because I actually need it. Not to mention higher resolutions need more expensive hardware to drive them.

I find playing with PC hardware just as much fun as playing on it. :)

I was impressed with the 6800XT and 7900XT in pure rasterization, but I missed DLSS, DLAA, DLDSR and Reflex, plus ray tracing was not usable at all because performance dropped like crazy and FSR looked too bad to make up for it.
That's your opinion, and I respect that. The world would be a boring place if we all liked the same things.
 
I think you are again missing the point. I kinda get the feeling you are doing it on purpose now.

I'll try one last time; if you still don't get it, it's on purpose. I have a 27" 4K monitor and a 27" 1440p monitor. Playing on the 4K one with DLSS Quality gives me a huge image quality increase at a similar framerate compared to 1440p native. So DLSS increases image quality, by a ton.
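This is easy to sanity-check against DLSS's internal render resolutions, using the commonly cited per-axis scale factors (Quality ≈ 2/3, Balanced ≈ 0.58, Performance = 0.5; exact ratios can vary per title). A quick sketch:

```python
# Internal render resolution for DLSS presets at a given output target.
# Per-axis scale factors are the commonly cited defaults; games may differ.
PRESETS = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(out_w, out_h, scale):
    return round(out_w * scale), round(out_h * scale)

for name, s in PRESETS.items():
    w, h = internal_resolution(3840, 2160, s)
    print(f"4K {name}: renders internally at {w}x{h}")
```

4K output with DLSS Quality renders internally at 2560x1440, which is why it lands near native-1440p framerates while the upscaler reconstructs the extra detail.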
 
So DLSS increases image quality, by a ton.
No, your monitor does. DLSS only lets you play at that monitor's native resolution with an acceptable frame rate while decreasing image quality. If you find that decrease acceptable, or even better than your old monitor's native, that's up to you.

We've established earlier that upscaling is way more useful at higher resolutions, so I don't know why we have to run the same circles again.
 
What do you mean, my monitor does?

I'm telling you, DLSS improves image quality MASSIVELY when you target the same framerate as native. That's not really up for debate; the screenshot up there kinda proves the point.
 