
Next-Gen GPUs: What Matters Most to You?

  • Raster Performance: 6,487 votes (27.0%)
  • RT Performance: 2,490 votes (10.4%)
  • Energy Efficiency: 3,971 votes (16.5%)
  • Upscaling & Frame Gen: 662 votes (2.8%)
  • VRAM: 1,742 votes (7.3%)
  • Pricing: 8,667 votes (36.1%)

  • Total voters: 24,019
  • Poll closed.
I'm also speaking from slightly dated experience. That was the latest version... as of Dec '23. However, on my 27" 4K display, I failed to see why I should use native instead of DLSS P. Image quality was just a smidge worse, but I had like 90 percent more FPS.
That's because you're on 27" 4K. The situation is way different when you're on something like a 24" 1080p, which is more like what budget gamers use.
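A quick way to see why the panel matters: DLSS Performance renders at roughly half the output resolution per axis, so a 4K output still has a 1080p-class internal image to work from, while a 1080p output drops to 540p. A minimal sketch; the scale factors are the commonly cited defaults and can vary per game and DLSS version:

```python
# Internal render resolution for common DLSS modes.
# The per-axis scale factors below are the commonly cited defaults;
# actual values can vary per title and DLSS version.
DLSS_SCALE = {"Quality": 0.67, "Balanced": 0.58, "Performance": 0.50}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for out_w, out_h in [(3840, 2160), (1920, 1080)]:
    w, h = internal_resolution(out_w, out_h, "Performance")
    print(f"{out_w}x{out_h} DLSS Performance renders internally at {w}x{h}")
# 3840x2160 -> 1920x1080: still a dense image on a 27" panel
# 1920x1080 -> 960x540: very little detail left to upscale from
```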
 
Raster performance leading, yet no one buys AMD...

Better raster performance is the most important thing for anyone gaming, whether on Nvidia, AMD or Intel GPUs. Even games that use RT only do so with a mixture of RT and rasterization; RT is only implemented in small ways for now and in the near future. Also, gamers buying entry-level GPUs aren't interested in RT to begin with, but better raster performance definitely is a plus. The vast majority of gamers buy entry-level through midrange.
 
Raster performance leading, yet no one buys AMD...
People aren't too smart, and that's a big problem right here... pricing at only 14% of voters means almost no one cares about next-gen prices. :) So nVIDIA can easily put up higher prices again and there will be no resistance.
 
People aren't too smart, and that's a big problem right here... pricing at only 14% of voters means almost no one cares about next-gen prices. :) So nVIDIA can easily put up higher prices again and there will be no resistance.
A bad economy makes people poorer. Being poor makes people depressed. To compensate for depression, people spend more on luxuries like alcohol, cigarettes, or GPUs. It's psychological.

Besides, we're a tech forum. You can't fault people for having a hobby, however expensive it may be. :)
 
In any case, thinking about the poll choices at a general level:

Energy efficiency: Guaranteed to improve for the same performance level, so it's a moot point.
Performance: Guaranteed to be the same or better.
Upscaling/Frame gen: Tied to general performance improvements and software support.
VRAM: Tied to what NVidia will allow users to have in order not to cannibalize their datacenter lineup (many at a professional level—think universities or small startups—ended up using 3090/4090 and sometimes 4080 for their compute+VRAM needs). Depends on commercial factors.
Pricing: Of course, it's going to be the same or more expensive if the GPUs are better in some capacity. Depends on commercial factors.

I wouldn't say that the poll is poorly designed, but the only things that truly matter, if you think about it, are VRAM and pricing, because the other points are going to see the usual incremental technological improvements that the professional market already needs anyway. Efficiency in particular is really important given how much power datacenters need, but it shouldn't be mistaken for "maximum power required": if the useful work per Joule improves, then the GPUs will be more efficient (and we also already know that for gaming cards, NVidia and AiB partners tend to milk the last 5~10% of performance for 50% higher power consumption or so).
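To make the efficiency point concrete, here is a tiny worked example in the "useful work per Joule" sense; the numbers are hypothetical, chosen only to mirror the "last 5~10% of performance for ~50% more power" pattern described above:

```python
# Efficiency as useful work per Joule, approximated here as FPS per watt.
# All numbers are hypothetical, illustrating the pattern described above.
def fps_per_watt(fps: float, watts: float) -> float:
    return fps / watts

stock = fps_per_watt(100, 200)   # 0.50 FPS/W at a sane power limit
pushed = fps_per_watt(110, 300)  # ~0.37 FPS/W: +10% FPS for +50% power

print(f"stock: {stock:.2f} FPS/W, factory-pushed: {pushed:.2f} FPS/W")
# The pushed card is 'faster' on the box yet ~27% less efficient.
```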
 
What matters to me is that ngreedia would not scam potential buyers of the next-gen cards with cut-down eunuchs instead of normal GPUs. I bought an RTX 4070 Ti for the equivalent of $800 in my country, which is twice the median monthly salary around here, and got myself a card that will probably die in most new games a couple of years down the line because this schmuck decided to put 12 gigs on a 192-bit bus in it. And then, the same sucker, less than a year later, launches a normal card with a sufficient 16 gigs and a 256-bit bus and calls it a Super. But I don't have another 800-1000 bucks, so shame on me.
But what the hell am I talking about? Of course they will scam us every gen. Because they will never forgive themselves for the Pascal architecture and how they failed to line their pockets, because for most gamers the 10-series was sufficient to play any game, up until UE5 maybe.
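For context on the bus-width complaint above: peak memory bandwidth scales linearly with bus width at a given per-pin data rate. A back-of-the-envelope sketch, assuming the 21 Gbps GDDR6X both cards are specced with (treat the exact figures as approximate):

```python
# Peak bandwidth (GB/s) = bus width (bits) x per-pin data rate (Gbps) / 8.
# Assumes 21 Gbps GDDR6X on both cards; figures are approximate.
def peak_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

print(peak_bandwidth_gbs(192, 21.0))  # 504.0 GB/s: 4070 Ti, 12 GB
print(peak_bandwidth_gbs(256, 21.0))  # 672.0 GB/s: 4070 Ti Super, 16 GB
# The wider bus brings ~33% more bandwidth on top of the extra 4 GB.
```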
You always have a choice not to buy anything you don't consider worth buying.
 
This poll is moving all over the place. At the current 16k votes, RT and SS/FG together are at 10%. This is a more reasonable result. The vast majority of buyers with little to no knowledge do not care about or have any idea about these features. I am still surprised pricing is not getting over 50% of the vote.
 
All of them.
This right here, why can't I want improved Raster, RT, VRAM, efficiency and Upscaling all at once? I'll be chasing an improvement in all areas when I upgrade from a 3080.
Feature sets that work everywhere will remain the only real development. Proprietary isn't gonna fly, even with 80% Nvidia market share or more. It'll last as long as it'll last, but it'll never carry an industry forward as a whole.
Hard disagree, I'd say that Nvidia has moved the industry forward as a whole technologically by pioneering RTRT in games and ML Upscaling. Even if/when 5-10-15+ years from now their proprietary feature set isn't the enduring standard, we wouldn't have any open/enduring standards if it wasn't for them pushing the bar forward, or at the very best, we'd have them later.

It's also amusing to see upscaling continue to be called a gimmick (hint - it's categorically not), but I can see why people that can't/don't use DLSS might say that, I've certainly enabled FSR and laughed at the results before.
 
This right here, why can't I want improved Raster, RT, VRAM, efficiency and Upscaling all at once? I'll be chasing an improvement in all areas when I upgrade from a 3080.
I agree with you there...

Hard disagree, I'd say that Nvidia has moved the industry forward...
...but hard disagree there. The only thing Nvidia moved forward is their brand recognition and their wallet. Proprietary technologies never help any industry as a whole, only the company that makes them.

It's also amusing to see upscaling continue to be called a gimmick (hint - it's categorically not), but I can see why people that can't/don't use DLSS might say that, I've certainly enabled FSR and laughed at the results before.
It depends on where you're coming from. If you have a 4K TV that you're sitting 3 metres away from, and you've got a relatively low-end HTPC attached to it, then it's great. But if you have a high-end PC and you only game at 1440p or below, then chances are you don't need it.
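The distance argument can be put in numbers as pixels per degree (PPD) of viewing angle: the higher the PPD, the harder upscaling artefacts are to spot. A rough sketch, with illustrative panel sizes and distances (not taken from anyone's actual setup):

```python
import math

# Pixels per degree (PPD): horizontal pixels divided by the horizontal
# field of view the screen occupies. Sizes/distances are illustrative.
def pixels_per_degree(h_pixels: int, diag_inches: float, distance_m: float) -> float:
    width_m = diag_inches * 0.0254 * 16 / math.hypot(16, 9)  # 16:9 panel width
    h_fov_deg = math.degrees(2 * math.atan(width_m / (2 * distance_m)))
    return h_pixels / h_fov_deg

print(round(pixels_per_degree(3840, 55, 3.0)))   # ~167 PPD: 4K TV at 3 m
print(round(pixels_per_degree(1920, 24, 0.6)))   # ~40 PPD: 1080p desk monitor
# Artefacts that vanish at ~167 PPD are plainly visible at ~40 PPD.
```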
 
hard disagree there.
Agree to disagree then? The way I see it, Nvidia has absolutely moved the industry forward in at least two areas since they launched RTX (feature set), but I can accept that not everyone agrees with that, despite it seeming obvious to me.
It depends on where you're coming from. If you have a 4K TV that you're sitting 3 metres away from, and you've got a relatively low-end HTPC attached to it, then it's great. But if you have a high-end PC and you only game at 1440p or below, then chances are you don't need it.
So if it has great use cases even in your books, wouldn't that make it not a gimmick? I see it as a very useful feature with broad spanning utility. And just like optimized settings, it's another tool in the box to tweak IQ and FPS to the taste of the user.
 
Agree to disagree then? The way I see it, Nvidia has absolutely moved the industry forward in at least two areas since they launched RTX (feature set), but I can accept that not everyone agrees with that, despite it seeming obvious to me.
Ok, let's settle with that. You say Nvidia moved the industry forward by developing technologies that AMD and Intel also made equivalents of. I say they only really helped their own pockets by developing a proprietary technology that locked the competition out of the game (DLSS) and put them at a disadvantage. We may both be right in our own ways.

So if it has great use cases even in your books, wouldn't that make it not a gimmick? I see it as a very useful feature with broad spanning utility. And just like optimized settings, it's another tool in the box to tweak IQ and FPS to the taste of the user.
Yeah, it's got some great uses. I finished Hogwarts Legacy on a 6500 XT thanks to FSR. It wasn't pretty, but it was certainly usable. I just wish some developers didn't use upscaling as an excuse to shove mediocre graphics down our throats that only run great on a 4090 without it.
 
Nvidia moved the industry forward by developing technologies that AMD and Intel also made equivalents of.
Which happened first, though? We'll never know the answer to this next hypothetical question, but would AMD or Intel have pursued RTRT or GPU upscaling if Nvidia didn't trailblaze them both? It certainly feels reactionary from both of them, but hey, they might have gotten there themselves eventually; we'll never know. Chicken and egg, really, but I believe we have NVidia to thank for RTRT in games (and on consoles), and for FSR and XeSS's existence. Given where we are today, I'd even say DLSS is to thank for FSR and XeSS not being locked out of games.
I just wish some developers didn't use upscaling as an excuse to shove mediocre graphics down our throats that only run great on a 4090 without it.
I also wish some developers did better and didn't need to count on upscaling to get poorly performing games running at acceptable framerates; it should remain a bonus to improve performance, not a requirement for bare-minimum FPS levels.
 
Which happened first, though? We'll never know the answer to this next hypothetical question, but would AMD or Intel have pursued RTRT or GPU upscaling if Nvidia didn't trailblaze them both? It certainly feels reactionary from both of them, but hey, they might have gotten there themselves eventually; we'll never know. Chicken and egg, really, but I believe we have NVidia to thank for RTRT in games (and on consoles), and for FSR and XeSS's existence. Given where we are today, I'd even say DLSS is to thank for FSR and XeSS not being locked out of games.
Like I said, it depends on which way you look at it. You can say that Nvidia created something revolutionary with RTRT and DLSS that AMD and Intel followed with their own standards because they also wanted a piece of the cake. The way I look at it, though, is Nvidia created DLSS and made it fully dependent on their in-house designed tensor cores to lock the competition out of the game and gain an unfair advantage. They're a for-profit company, so I don't have any illusions of them having the slightest intention to move anything forward other than their own bank accounts. If both AMD and Intel succeeded in creating versions of their own standards that run on any hardware, then what prevented Nvidia from doing so other than greed?

I also wish some developers did better and didn't need to count on upscaling to get poorly performing games running at acceptable framerates; it should remain a bonus to improve performance, not a requirement for bare-minimum FPS levels.
Exactly my thoughts. Whenever a developer showcases a game saying "look how great it runs with DLSS Q", I just think "show it without DLSS first, and only then will I decide whether upscaling is necessary or not".
 
If both AMD and Intel succeeded in creating versions of their own standards that run on any hardware, then what prevented Nvidia from doing so other than greed?
This.

Similar things occurred with FreeSync, except there Nvidia clearly lost the battle, because it's much easier to 'finish' the development of that feature. But this has been happening throughout history. New technology simply becomes possible and various companies will chase it. Someone will always be first with it, but the industry only moves forward once it becomes a standard feature.

Right now, upscaling is partly vendor-agnostic, and DLSS's 'unique selling point' space is rapidly dwindling, as per the G-Sync/FreeSync situation. We will arrive at a point where it's totally irrelevant. Nvidia is already using constant updates and options to expand the feature to keep it 'unique', and it directly makes you pay for them by not offering a lot of it on past-gen cards. Now of course, they can, simply because DLSS still commands unique selling points; I think there's no denying that. The question is whether you should or want to pay for them, because you do.

It's not difficult to see the similarities here; they are striking, and time will prove this point. Nvidia isn't a magician; they just pre-empt industry developments and make you pay for it. VRR would have come regardless of Nvidia; RT much the same; look at Nanite, or Crytek's implementation in Neon Noir. There is no question in my mind that all of that would have been created regardless of whether Nvidia exists; rather, I think believing otherwise is incredibly naive and narrow-minded. Humans simply arrive at certain development stages in all things. It's like the invention of fire or writing: different cultures in totally disconnected places discovered those things without 'stealing' any ideas from each other. Developments simply become logical at some point in time. Pre-empting them can be a strategy to corner the market, but it will never last. You can draw that parallel in almost everything: the iPod, the iPhone... electric vehicles... the internet... first vs second and third world countries... Facebook.

Even RT wasn't new when Nvidia introduced it to the real-time GPU processing arena. They just accelerated it differently. That's all RTX was, and still is.

we'd have them later.
Agreed. A lot of these chicken-and-egg things have been moved into a vibrant industry thanks to Nvidia, or at least 'faster' than without them. The underlying question, though, is whether any gamer really benefits; I think that really depends on your perspective. Because part of the move to RTX has been 'less hardware for your money', since you're partly buying feature sets now. Nvidia wants to be a software company, so you're paying for software. It doesn't really give you better games. Au contraire, even: apart from a few poster children, the overall quality of graphics-showoff AAA has been abysmal for the last five to seven years.

If you are primarily interested in the prettiest pictures in your gaming, and you pixel-peep a lot and are truly interested in the tech development, I think Nvidia has something on offer. But if you're primarily interested in gaming for its mechanics, its gameplay, the game itself, with whatever it has graphically being secondary... Nvidia's move has just made your gaming more expensive. Substantially, I might add.
 
Surprised 'Pricing' isn't in the lead when the global economy is edging toward a recession, unless the most common priorities have shifted and it no longer matters how much a GPU costs; people (in general) will buy it.
Honestly, for three decades (until 2017, thanks Bitcoin) it was the price/performance ratio that dictated the winner in the various market segments.
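Price/performance is also the easiest of the poll options to actually compute, which is partly why it used to settle these arguments. A trivial sketch with hypothetical cards and numbers, purely for illustration:

```python
# FPS per dollar: the metric that historically decided each segment.
# Cards, FPS figures and prices are hypothetical, for illustration only.
cards = {
    "Hypothetical Card A": {"fps": 120, "price_usd": 800},
    "Hypothetical Card B": {"fps": 100, "price_usd": 550},
}
for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['price_usd'] * 100:.1f} FPS per $100")
# Card A: 15.0 FPS per $100
# Card B: 18.2 FPS per $100 -> the better buy despite lower raw FPS
```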
 
Surprised 'Pricing' isn't in the lead when the global economy is edging toward a recession, unless the most common priorities have shifted and it no longer matters how much a GPU costs; people (in general) will buy it.
Honestly, for three decades (until 2017, thanks Bitcoin) it was the price/performance ratio that dictated the winner in the various market segments.
I agree, but I think it's because:
A bad economy makes people poorer. Being poor makes people depressed. To compensate for depression, people spend more on luxuries like alcohol, cigarettes, or GPUs. It's psychological.

Besides, we're a tech forum. You can't fault people for having a hobby, however expensive it may be. :)
 
Like I said, it depends on which way you look at it.
Always.
You can say that Nvidia created something revolutionary with RTRT and DLSS that AMD and Intel followed with their own standards because they also wanted a piece of the cake. The way I look at it, though, is Nvidia created DLSS and made it fully dependent on their in-house designed tensor cores to lock the competition out of the game and gain an unfair advantage.
I don't see it through such a negative lens at all. Nvidia created a product/feature that didn't exist, and in the corporate world we live in, you'd be downright foolish not to at least try to capitalise on that while you can, if the market conditions and your position within them allow it.
They're a for-profit company, so I don't have any illusions of them having the slightest intention to move anything forward other than their own bank accounts. If both AMD and Intel succeeded in creating versions of their own standards that run on any hardware, then what prevented Nvidia from doing so other than greed?
AMD and Intel are for-profit too, last time I checked, and I'd wager heavily on them trying to capitalise on pioneering a new-to-market feature/product of the same nature if they were in an equally market-dominant position, be it monetarily or as another ploy to have you point your wallet at them instead of the competition. They're all greedy, and if anyone thinks they're not, I've got a bridge to sell them; I've certainly seen enough from both AMD and Intel to believe this to be true.
I'm glad you agree that "Nvidia has moved the industry forward as a whole technologically by pioneering RTRT in games and ML Upscaling. Even if/when 5-10-15+ years from now their proprietary feature set isn't the enduring standard, we wouldn't have any open/enduring standards if it wasn't for them pushing the bar forward, or at the very best, we'd have them later."
 
I don't see it through such a negative lens at all. Nvidia created a product/feature that didn't exist, and in the corporate world we live in, you'd be downright foolish not to at least try to capitalise on that while you can, if the market conditions and your position within them allow it.
Sure, from Nvidia's point of view, this is an entirely positive thing. They created something hugely popular that the competition will never even have a chance to access because it requires their own hardware to work. If I was Mr Jensen, I'd be laughing all day and night. But the thing is that I'm not. I'm just an ordinary home user, and I benefit far more from open standards than from proprietary tech that limits my buying choice to one single brand.

AMD and Intel are for-profit too, last time I checked, and I'd wager heavily on them trying to capitalise on pioneering a new-to-market feature/product of the same nature if they were in an equally market-dominant position, be it monetarily or as another ploy to have you point your wallet at them instead of the competition. They're all greedy, and if anyone thinks they're not, I've got a bridge to sell them; I've certainly seen enough from both AMD and Intel to believe this to be true.
I didn't say that was not the case (even though I have never seen a single proprietary feature on an AMD GPU besides TrueAudio, which never gained any traction, but that's beside the point). I'm not defending any company here. But like I said, I, a home user, have to decide what's best for me (not for them) and vote with my wallet. Open standards promote freedom of choice, which is exactly what I want. Relying on closed features limits your choice to a single brand, the acceptance of which is the precursor to a monopoly. We're already seeing signs of what such a monopolistic situation brings with it, in terms of GPU prices relative to their value.
 
Holy crap, the number of votes just jumped to 24k and now energy efficiency is over 50%! RT continues to drop. Crazy poll results. If this is really the majority preference, the 600W 5090 is going to go over like a lead balloon.
 
What's to be confused about? When someone posts that barely anyone has heard of RT or wants it, it's pretty clear what they are saying, and they then went on to accuse the poll of being spammed with RT votes. That's why I posted what I did several comments back, before what you are replying to.

The bottom line is that you don't have to use RT, so there is no stick for Nvidia to beat you over the head with to begin with, and it is OK for a lot of gamers to want more RT progress as well.

The vote was spammed. Raster and price were a good margin ahead of RT, and within the space of roughly 12 hours there were thousands of votes for RT. It is a stick for Nvidia, just the same as all their proprietary technologies that AMD is behind on, used to justify the ridiculous prices they are charging for GPUs these days. Honestly, £/$1,500-2,000 for a top-end card and 1,200 for the next rung down is extortionate, and people can mention inflation, COVID, energy prices, etc., but they are making more money now per SKU than they ever have, by a huge margin. It's just a "tough shit, this is the price, don't like it, don't buy it" mentality. Don't get me wrong, AMD are just as greedy and price theirs accordingly: "so we don't have x, y, z features but are close enough in raster, so we will just go $100-$200 cheaper". And yes, you may have had previous comments alluding to this; I didn't quote them, I quoted one and replied to it. We don't have to go back through everything you have ever posted for you to make your point, and my previous reply to you is still valid, IMO.

You...do realize that this is a website for nerds? Nerds who basically all want to demonstrate that they have the best of the best, and one means to do so is by having the latest game and the latest bleeding edge tech.

I ask and answer this because it's like going to a bacon lovers' conference and asking whether there's potentially an overblown link between eating bacon and high cholesterol. Your sampling pool is extremely biased... and one of the things which currently differentiates AMD from Nvidia is RT performance. So instead of measuring a thing that both cards can do (raster, or price), you artificially remove discussion of the opposition (and demonstrate you "made the right choice") by choosing a metric which it is not competing in.


This accounts for "spamming the vote." It accounts for most discrepancies between "have" and "have not" logic. It also removes the consideration that people are not adequately representing the statistical spread of purchased components, given that from point zero we know the pool of respondents is more likely than not Nvidia purchasers, purchasers of middle-to-high-end cards, and people willing to jump through hoops to test new tech.
 
@AusWolf I love open standards too, even more so, but I'm also OK with the path to them potentially being paved with an innovative, trailblazing closed standard. From my chair, we're all benefitting in the long term from that having happened in this space in particular in the recent past. Now if an open standard can topple DLSS (or even just cure the artefacts that ruin my immersion the most), I will absolutely cheer for it and welcome it with open arms, and I have used FSR on non-RTX hardware to positive effect too, even in its current state; hell, AMD's FG has completely mooted Nvidia FG as a selling-point feature to me, yet we wouldn't have it without DLSS FG.

And with that, perhaps it seems like we're right back to agree to disagree, yet we both open this can of worms relatively often, don't we :peace::roll:
 
Given I use laptops, the discrete GPU is more interesting to me. My Dell XPS is an older machine with a GTX 1050 Ti 4GB. The machine has a 4K LCD, so it's more interesting to see what works at 3840x2160.
 
I'm glad you agree that "Nvidia has moved the industry forward as a whole technologically by pioneering RTRT in games and ML Upscaling. Even if/when 5-10-15+ years from now their proprietary feature set isn't the enduring standard, we wouldn't have any open/enduring standards if it wasn't for them pushing the bar forward, or at the very best, we'd have them later."
I only agree that we would have them later. Look at Intel XeSS; they would have gone through with that anyway, and the DX12 Ultimate standard was already a unified box of tricks. Similarly, AMD just needs FSR on consoles. That's not Nvidia's achievement in any possible way; in fact, Nvidia told us the lie that it would be impossible to upscale on anything pre-Turing. The only thing Nvidia achieved is pushing its own accelerator hardware for it. They weren't, nor are they, in it to make gaming better; they're in it to sell GPUs, and whatever games come out of it is secondary. And it shows: not a single game is truly influenced by RT in any way, mechanically or in gameplay. Nothing's really been added in innovative power for gaming. It's just a prettier picture at immense performance cost, again something Nvidia is keen to promote. So I guess that's a bar you could say has moved forward. Gaming got more expensive. Yay.

Even Nvidia's upscaler is strategically placed to make sure you don't really gain anything from it; you'll still upgrade, because now you need iteration 3, or 4. The industry isn't gaining anything from it either; on the contrary, developers are now forced to implement up to three upscalers, and there is more segmentation in hardware capabilities on top of it, the latter only thanks to Nvidia. Nvidia also takes dev time away from the game itself and diverts more of it to the RT implementation, even if it contributes to that as well. And all that for historically very minor graphical changes: whether it's different grass or hair physics (HairWorks/TurfFX) or RT, it's all more of the same, and none of it has really stuck just yet. And if it DOES stick, they're ready to axe it themselves at any point it stops earning money: see PhysX.
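To make the "up to three upscalers" burden concrete, here's a hypothetical sketch of the kind of selection layer engines end up maintaining. The names and logic are invented for illustration; real integrations go through each vendor's own SDK (DLSS, FSR, XeSS):

```python
from enum import Enum, auto

# Hypothetical sketch of the integration burden: one game-facing choice,
# three vendor backends to implement, test and ship. Names are invented;
# real engines wrap each vendor's SDK behind an abstraction like this.
class Upscaler(Enum):
    DLSS = auto()   # NVIDIA RTX only (Tensor-core path)
    FSR = auto()    # vendor-agnostic
    XESS = auto()   # best path on Intel Arc, fallback path elsewhere

def pick_upscaler(vendor: str, has_tensor_cores: bool) -> Upscaler:
    """Pick the best-supported backend for the detected GPU."""
    if vendor == "nvidia" and has_tensor_cores:
        return Upscaler.DLSS
    if vendor == "intel":
        return Upscaler.XESS
    return Upscaler.FSR  # runs everywhere, so it's the safe default

print(pick_upscaler("nvidia", True))   # Upscaler.DLSS
print(pick_upscaler("amd", False))     # Upscaler.FSR
```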

But let's look back a bit further, for relative comparison, at what AMD has done on the API front. Now that's a real industry push. They enabled and accelerated the adoption of the APIs we have now, and those APIs have directly contributed to more complex games, better threading, and, in large part, fixing the CPU issue in gaming performance. In turn, those APIs even enabled RTX. And us consumers? We just stood there and watched it happen. GPUs didn't get more expensive for it; they actually just got a lot better at gaming overall, suffered less bottlenecking, and work more smoothly on any desktop system, and we gained tiled rendering and various other improvements out of it. Real ones, which enable the far more complex environments we now see in every game and engine.
 
I only agree that we would have them later.
No worries. When I said "later", that was an optimistic best-case scenario given what we know, so we still got some open features sooner, but I'll leave it there for that topic. As for the rest, I have varying levels of agreement and disagreement, but I think I've hogged enough thread space making my point.

Very much looking forward to what the next 6-9 months bring, and there's no way I'm spending $1500+ USD / $2500+ AUD on a video card, so I'm very interested in the RDNA4 vs midrange Blackwell battle. $700 USD tops, so here's hoping for that upgrade in all key areas, which I think will be realistic in that segment.
 