
Next-Gen GPUs: What Matters Most to You?

  • Raster Performance: 6,487 votes (27.0%)
  • RT Performance: 2,490 votes (10.4%)
  • Energy Efficiency: 3,971 votes (16.5%)
  • Upscaling & Frame Gen: 662 votes (2.8%)
  • VRAM: 1,742 votes (7.3%)
  • Pricing: 8,667 votes (36.1%)

  • Total voters: 24,019
  • Poll closed.
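
A quick sanity check, assuming the vote counts above are the final cleaned totals: the listed percentages follow directly from the counts. A minimal Python sketch:

```python
# Recompute the poll percentages from the raw vote counts above.
votes = {
    "Raster Performance": 6_487,
    "RT Performance": 2_490,
    "Energy Efficiency": 3_971,
    "Upscaling & Frame Gen": 662,
    "VRAM": 1_742,
    "Pricing": 8_667,
}
total = sum(votes.values())  # 24,019 -- matches the "Total voters" figure
for option, count in votes.items():
    # Each share reproduces the percentage listed in the poll results.
    print(f"{option}: {count / total:.1%}")
```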
Unless my browser cache is not updating for some reason, it looks like W1zzard just cleaned up the junk votes. Before the cleanup, the total was almost 70K; now it's just over 21K, and all the choices have changed ranks.
Yeah, that did change things around more than just a little bit.
 
Well, this is good news. Now the poll will be useful and truly representative. I knew some shenanigans had happened when @_roman_ posted in post #91 that he had voted multiple times for VRAM. :shadedshu:
 
Now you're just being ignorant. How about datacenters running several thousand GPUs, instead of just your house? You might not think a datacenter affects you, but they drive up the cost of water and electricity, and that is reflected in your energy bill.
I assumed the survey was for desktop users buying a single gaming card, not a data centre manager shopping for thousands of AI GPUs.

Not ignorant, just specific... a single gaming card for home use.

And water? Not where I live. I don't even have a meter; it's a flat rate based on the number of rooms in the house, so I could use 10,000 US gal and my water bill would be the same. That's one of the perks of living in the middle of nowhere, with hardly any population and a lake every mile. Electricity prices are regulated here as well, so they go up predictably every few years, with no craziness like at the gas pumps.

Edit: looks like about 80K people in 141,268.51 km² (54,544.08 sq mi).
 
Just pointing out that the vote thing is bugged by design. Please consider this a bug report, or at least a note that the dataset may not be accurate.
 
Just pointing out that the vote thing is bugged by design. Please consider this a bug report, or at least a note that the dataset may not be accurate.
He returns to the scene of the crime!!! That takes ballz.

So here’s a question. Would you rather have min/max 120 native 4k fps synced raster at the highest quality and antialiasing settings or avg 60 FG/SS 4k fps synced raster/RT mix at the highest quality settings?
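
For context on that trade-off, here is a back-of-the-envelope sketch. It assumes frame generation roughly doubles output fps, so "avg 60 FG fps" means about 30 rendered frames per second, and that input is only sampled on rendered frames; real pipelines vary.

```python
# Rough frame-timing arithmetic for the 120-native vs. 60-frame-generated question.
# Assumption (for illustration): FG doubles a ~30 fps rendered stream to 60 fps shown,
# and input responsiveness tracks the rendered rate, not the displayed rate.
def frame_time_ms(fps: float) -> float:
    """Milliseconds between frames at a given frame rate."""
    return 1000.0 / fps

print(f"native 120 fps: new image and fresh input every {frame_time_ms(120):.1f} ms")
print(f"FG 60 fps:      new image every {frame_time_ms(60):.1f} ms, "
      f"fresh input only every ~{frame_time_ms(30):.1f} ms")
```

Under those assumptions, the frame-generated option shows a new image every ~16.7 ms but reacts to input only every ~33 ms, while native 120 fps does both every ~8.3 ms; that latency gap is the crux of the question.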
 
If we had to choose one or the other...
avg 60 FG/SS 4k fps synced raster/RT mix at the highest quality settings?
This.

min/max 120 native 4k fps synced raster at the highest quality
Not this.

antialiasing settings
Never this.

However, who says we need to use the highest settings? Only benchmarking reviewers generally do that; most everyone else adjusts their settings to suit their needs/preferences. So instead, how about we tailor our settings so we get RT without AA, turn a few things down, and get most of the pretty and all of the 120 frames per second? That's what I generally do. 4K is nice, but it's not critical. 1080p is still a great gaming res.
 
Without AA?
Absolutely! I haven't used AA since the days of CRTs. I've tried it a few times since then and still don't like the effect or the hit to performance. AA is something I always shut off and leave off. AA can be useful at resolutions lower than 720p, but above that the benefit is minimal at best. Not worth using at all, IMPO, especially at 1080p+.
 
Absolutely! I haven't used AA since the days of CRTs. I've tried it a few times since then and still don't like the effect or the hit to performance. AA is something I always shut off and leave off.
Well, then you must have a tiny display or otherwise a high tolerance for aliasing at that 1080p res.
 
Well, then you must have a tiny display or otherwise a high tolerance for aliasing at that 1080p res.
See edit. Try it sometime, and for more than 5 minutes.
EDIT: Also, I went back to 1080p screens (a 24" and a 27") after having been on 1440p for a number of years. Too many scaling problems, and I didn't want to jump to 2160p. 1080p is a sweet-spot resolution; it has been for a long time and likely will be for a few more years, no matter how many people push for 2160p.
 
I have 27" and 32" displays. It's all good. Unless you're sitting 36 cm from the screen or closer (and at that point you've got bigger issues), it's just not a thing.
I have more than an arm's length to my 3440x1440 34", and when I tried a 28" 4K monitor, it was sharper both in Windows and in games. 27" @ 1080p is hellish.
(but it felt small compared to the ultrawide, so it went back)
 
Take a look at the latest review of a game here on VRAM use, which is just one more example of the 'need moar VRAM' hysteria being nonsense. An entry-level gamer at 1080p with 8 GB isn't going to be using ultra settings to begin with, and a midrange gamer is fine with 12 GB even on higher settings at 1440p and above.
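
For a rough sense of why resolution alone doesn't blow up a VRAM budget, here is a back-of-the-envelope sketch; the bytes-per-pixel and render-target count are assumptions for illustration, and in real games textures, not resolution, dominate VRAM use (the chart below shows measured totals):

```python
# Illustrative estimate of render-target memory at common resolutions.
# Assumed: 4 B/px color + 4 B/px depth across ~6 G-buffer/intermediate targets.
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "2160p": (3840, 2160)}
BYTES_PER_PIXEL = 8   # 32-bit color + 32-bit depth (assumption)
NUM_TARGETS = 6       # assumed target count; varies per engine

for name, (w, h) in RESOLUTIONS.items():
    mib = w * h * BYTES_PER_PIXEL * NUM_TARGETS / 2**20
    print(f"{name}: ~{mib:.0f} MiB in render targets")
# ~95 MiB at 1080p, ~169 MiB at 1440p, ~380 MiB at 2160p: a few hundred MiB,
# so stepping up a resolution tier doesn't by itself break an 8 GB budget.
```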

[Attachment: vram.png]


When people argue opinions against facts, most people go with the facts, and that mostly explains Nvidia's dominance of the consumer market. Yes, part of it is mindshare, but there's nothing AMD or Intel can do about that except compete by actually competing, not by slapping unnecessary gobs of VRAM onto their spec sheets. That is why their commitment to much-improved RT performance in the next gen is smart.
What about Harry Potter or Dragon's Dogma 2? One game is not the entire market.
 
So now that the poll results have been ‘corrected’, the rankings make a little more sense. I still would have thought the pricing (21%) and raster (51%) results should be the reverse of what they are now, but everything else looks more or less understandable.

End users want traditional raster performance at a good price. Everything else is a gimmick.
 
There is a lot of narrative vs. truth in this thread. Some of the tech media pumped ray tracing and DLSS just like any other Nvidia hook; PhysX comes to mind. Yet as soon as AMD caught Nvidia on raster, all the tech media started reporting on DLSS as if it were a competitor in itself. Just look at how many people wax on about DLSS as though it could work without raster in the first place.

I have a 4K Mini LED, and the most dramatic thing I can do to improve picture quality is turning up the contrast and saturation and turning down the brightness. You can do that in AMD software.

Jensen is a great salesman, though. Nvidia is the only company I know that can make a card that was over $1,000 two years ago seem meh compared to the extra... DLSS performance for $400 more. I watched it happen with the 2080 Ti and the 3090, and I am willing to bet we will see it with the 4090 when the 5090 launches. The truth, however, is in this very poll.
 
RT performance. Consoles will be using it; it's the way forward.
 
I have more than an arm's length to my 3440x1440 34", and when I tried a 28" 4K monitor, it was sharper both in Windows and in games. 27" @ 1080p is hellish.
(but it felt small compared to the ultrawide, so it went back)
Personal preference on both points. I can't stand ultrawide. Though "hellish" seems very much like an exaggeration to me. It's all good, though.
 
I haven't needed more of anything in the poll since used RTX 3090s became affordable.
It's not like there are games that really challenge the 3090, and it doesn't have a burn-my-house-down connector, which is a plus.

I'm a little surprised GPGPU / AI isn't on the list, because the only reason I'd upgrade is to have non-gaming workloads happen faster. VRAM helps somewhat, but really it's just more CUDA cores and bandwidth.
 