
CPU Reviews 720p/1080p/4K or 1080p/1440p/4K?

  • 720p/1080p/4K — Votes: 28 (26.7%)
  • 1080p/1440p/4K — Votes: 77 (73.3%)

  Total voters: 105
Lemming wasn't being negative... it's just a popular belief that many follow, is all... :)

What I think may be getting missed here is that many people buy high-end cards with old CPUs. Those CPUs are going to put a glass ceiling on some titles with higher-end cards... this testing would show that. :)

Edit... overclocking on the CPUs would be good to see as well... a good way to tell whether titles scale with the CPU and GPU.
 
In my country it's not a nice thing to say:
an idiot is someone who follows the sheep over the cliff (if one jumps, I follow blindly).
But I can take it.

But I understand your point.

And thinking about it again, maybe it's enough to test at 1080p, where I can see a difference?
You see, I am open-minded. I really hope so.
 
I feel personally that, beyond just the choice of resolutions, we should be voting on the actual benchmarks used as well - the vast majority of newer games we see in reviews these days are extremely GPU-limited, and/or console ports scaled to the weak 8-core Jaguar setup.

I would be very interested in seeing actual play sessions in several MMOs, for example, but a BF1 high-player-count map would also be interesting material. End-game in a 4X such as Civilization would also be useful. Even in online environments, while more difficult, it is possible to set up a reproducible test run. For example: logging in on Wednesday (server reset) during prime time (20:00-21:00) and standing in Dalaran (WoW) would provide a very similar scenario every time. Or a 15-man raid in the same instance.

These are actual real world scenarios in gaming that heavily stress the CPU. Much more so than the difference between running at 720p/1080p.

Another thing most CPU gaming reviews don't cover at all is multitasking, which is realistically something most gamers do, and increasingly so. Run a game + have YouTube running + the Discord app open + a Twitch stream, for example. Or run a game + Windows' built-in recording functionality. These tests would definitely show the value of having more cores, favoring newer CPUs like Ryzen, which could help put some perspective on the i7 7700K as the 'ultimate gaming CPU' - and show the merit of generally lower-clocked, higher-core-count workstation builds.
 
Low-res gaming tests are a dud. Highly unreliable, and not even related to the "real world" usage.
I'd say, drop 720p in favor of 1080p low. If high-FPS testing is really that critical - you can always add a couple of lightweight titles, like Dota 2, CS:GO, WoW, WoT or anything that gives upwards of 100FPS @ 1080p even on my puny GTX950.
 
When i read a review i want to see what it would perform like in my PC case. So anything under 1080p (while it might be relevant to the overall performance of the chip) is useless IMHO.
 
When i read a review i want to see what it would perform like in my PC case. So anything under 1080p (while it might be relevant to the overall performance of the chip) is useless IMHO.

Fun fact: most reviewers test on a test bench, i.e. an open case, and, not unusually, with below-pleasant ambient temps of 16-18 °C.
 
I care about performance, not cooling. Cooling depends on multiple factors, from ambient temps to the speed of the system fans.
 
Keep all: 720p, 1080p, 1440p, and 4K.

Personally, I couldn't care less about 1440p though. And 4K is just unrealistic for me, but I read 4K results for kicks.

720p and 1080p are what the majority of people use.
 
Fun fact: most reviewers test on a test bench, i.e. an open case, and, not unusually, with below-pleasant ambient temps of 16-18 °C.
They do? Interesting. I test on an open bench, but ambient is kept to around 22 °C... I note temps before testing, and if it's higher/lower, I normalize it.

Who tests in such conditions consistently? Links?
 
They do? Interesting. I test on an open bench, but ambient is kept to around 22 °C... I note temps before testing, and if it's higher/lower, I normalize it.

Who tests in such conditions consistently? Links?

Your avatar says you test while submerged in LN2?
 
When i read a review i want to see what it would perform like in my PC case. So anything under 1080p (while it might be relevant to the overall performance of the chip) is useless IMHO.
And you're annoyed frequently when internet reviews don't revolve around your PC case :D
 
As I keep saying, you need a low resolution when benching CPU framerate performance, so I've voted for 720p.

Not doing so just bottlenecks it with the graphics card and makes all the CPUs in the test look the same. This is so mind-numbingly obvious that I can't believe anyone actually argues this point. :rolleyes:

And again, this is in addition to the higher resolution tests, not replacing them.
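The bottleneck argument above can be reduced to a tiny model: the observed framerate is roughly the minimum of what the CPU can prepare and what the GPU can render at a given resolution. A minimal sketch, with all numbers hypothetical:

```python
# Hypothetical numbers (assumptions, not measurements): two CPUs with
# different game-logic/draw-call throughput, and per-resolution framerate
# caps for a single GPU.
cpu_fps = {"CPU A": 190.0, "CPU B": 140.0}             # frames/s the CPU can prepare
gpu_fps = {"720p": 260.0, "1080p": 150.0, "4K": 60.0}  # frames/s the GPU can render

# The slower of the two stages sets the framerate you actually observe.
effective = {
    (res, cpu): min(c, g)
    for res, g in gpu_fps.items()
    for cpu, c in cpu_fps.items()
}

for (res, cpu), fps in sorted(effective.items()):
    print(f"{res:>5} {cpu}: {fps:.0f} FPS")
```

With these numbers, both CPUs read identically at 4K (GPU-bound at 60 FPS), while at 720p the 190-vs-140 gap between them is fully visible - which is exactly why low-res runs are used to isolate the CPU.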

I agree that the lower resolution provides a better side-by-side comparison, assuming raw power in X game scenario is actually useful. The problem I see with it is that it's not indicative of current or future performance for actual users. You can get an idea of how a specific engine is handled by each CPU design, but for one-off games I don't see this as particularly useful in the long run.

That said, I think a more in-depth analysis of something like 'Unreal Engine' and 'Unity' at a low res may provide useful estimates for future games, whereas if you actually care about raw horsepower, there are plenty of synthetic and computational benchmarks that provide more useful long-term comparisons.

I would, however, appreciate more in-depth framerate-lows and latency-consistency analysis across the board.
 
And you're annoyed frequently when internet reviews don't revolve around your PC case :D

This was my point about open bench and ambient temp. And back to the subject: what you want in a review is THEREFORE: the best case scenario.

Also @Dippyskoodlez, 100% agreed on all of that, especially that last sentence. It's really awkward that we don't get min FPS on TPU at the very, very least (in GPU reviews, that is).
 
I agree that the lower resolution provides a better side-by-side comparison, assuming raw power in X game scenario is actually useful. The problem I see with it is that it's not indicative of current or future performance for actual users. You can get an idea of how a specific engine is handled by each CPU design, but for one-off games I don't see this as particularly useful in the long run.

That said, I think a more in-depth analysis of something like 'Unreal Engine' and 'Unity' at a low res may provide useful estimates for future games, whereas if you actually care about raw horsepower, there are plenty of synthetic and computational benchmarks that provide more useful long-term comparisons.

I would, however, appreciate more in-depth framerate-lows and latency-consistency analysis across the board.

Agree 100% with this!
 
As long as 1080p isn't held back by the GTX1080(and I don't believe it will be) then 1080p is as low as you need to go.
 
This was my point about open bench and ambient temp. And back to the subject: what you want in a review is THEREFORE: the best case scenario.
What, that reviewers typically review at 65 °F or less, as your post implies? If they do, I'd say: so what. We should know that a rise of 1 °C in ambient yields roughly a 1 °C rise in temps... So it just takes paying attention, and hopefully any review (still no links.............. I don't know of any.....) mentions its ambient...
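The "1 °C of ambient ≈ 1 °C of core temp" rule of thumb mentioned above makes the normalization trivial. A minimal sketch (the function name and the 22 °C reference are my assumptions, not anything from the thread):

```python
def normalize_temp(measured_c: float, ambient_c: float,
                   reference_ambient_c: float = 22.0) -> float:
    """Shift a measured temperature to what it would roughly read at the
    reference ambient, assuming ~1 degree of ambient maps to ~1 degree
    of component temperature (a common reviewer rule of thumb)."""
    return measured_c - ambient_c + reference_ambient_c

# e.g. 68 degrees C measured at an 18 degrees C ambient corresponds to
# about 72 degrees C at a 22 degrees C reference ambient.
print(normalize_temp(68.0, 18.0))  # → 72.0
```

This is only as good as the 1:1 assumption; real deltas shift slightly with fan curves and cooler behavior, which is why noting the ambient in the review matters in the first place.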
 
I don't see the point in 720p anymore... Low end mobile chipsets do 720p... I want to see better from a desktop.
 
Lower-end or mobile chips should include 720p (768p?); high-end should start at 1080p.
 
I agree that the lower resolution provides a better side-by-side comparison, assuming raw power in X game scenario is actually useful. The problem I see with it is that it's not indicative of current or future performance for actual users. You can get an idea of how a specific engine is handled by each CPU design, but for one-off games I don't see this as particularly useful in the long run.

That said, I think a more in-depth analysis of something like 'Unreal Engine' and 'Unity' at a low res may provide useful estimates for future games, whereas if you actually care about raw horsepower, there are plenty of synthetic and computational benchmarks that provide more useful long-term comparisons.

I would, however, appreciate more in-depth framerate-lows and latency-consistency analysis across the board.
Well, it certainly is indicative of current performance, as that's how the CPU is performing right now on a particular game. Extrapolating to future games is more iffy though due to changes in game engines and Windows versions, including DirectX version, but then again, Sandy Bridge had strong gaming performance and that's still true 5 years after I bought my 2700K, so perhaps it is indicative?

In the end, comparing the framerate performance of different CPUs by having the framerates capped by the graphics card is idiotic beyond belief and really doesn't need any explanation why. I mean seriously, how hard can this be to understand? I can't even believe that we're having this discussion!

Perhaps the biggest reason to have these low res tests, is so that a user can pick the fastest CPU and know what the max framerates it can achieve are. They will then be safe in the knowledge that it will provide the least bottleneck when they upgrade to a faster graphics card down the road, since CPUs aren't upgraded that often. I've had a slow CPU bottleneck a fast new card and it sucks, I can tell you.

By all means have the high res tests as that's the real world scenario, just don't cut out the low res ones. I remember UT2003 actually had a benchmark mode where the graphics card was taken out of the loop altogether, simulating an infinitely fast card. If I remember correctly, it did this by terminating draw calls before they were sent to the graphics card. All you saw was a static picture while the benchmark ran.

@Vayra86 You've been making a valiant effort at explaining, in simple and clear language, why we need low-res tests, but at 35 votes to 14 right now and with the kind of comments being posted all wanting to cut out the low-res ones, it's clearly falling on deaf ears, so it might not be worth wasting any more time on it, to protect your sanity. :ohwell: I'm gonna get flamed by them, aren't I? lol

@newtekie1 A GTX 1080 can certainly bottleneck at 1080p as I discovered when playing CoD: Infinite Warfare a while back. It was still running pretty fast mind you and still perfectly smoothly, but definitely rather slower than at the lowish resolution I compared it with.
 
Voted for option #1: if you're really testing the CPU, it's important to include at least one scenario where the GPU is NOT the bottleneck. Also, it happens to cover a wider range of resolutions. You need the numbers for 1440? You can derive them from 1080 and 4k (mostly).

This!

Benching a CPU is benching a CPU. You need to stress the CPU, not the GPU. Lower res to see how fast the CPU is in games, and higher resolution to show how the GPU becomes the bottleneck... That's it! Yes, almost nobody plays at 720p, but you will see how fast the CPU is vs the competition!

If you only do tests that show a GPU bottleneck, it's like doing a GPU benchmark at 1280x1024 and measuring a CPU bottleneck. At that point, don't bench games at all, or do only 1080p/4K and nothing else, since the GPU will start being the bottleneck with newer games anyway. Older games will show a CPU bottleneck anyway if you reach 200+ FPS xD
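The earlier suggestion that 1440p numbers can mostly be derived from 1080p and 4K results can be sketched as an interpolation in frame time versus pixel count. This is purely illustrative (the function and its linearity assumption are mine, not anything a reviewer actually uses), and it only holds while the game is GPU-bound:

```python
def estimate_fps_1440(fps_1080: float, fps_4k: float) -> float:
    """Estimate 1440p FPS from 1080p and 4K FPS by interpolating frame
    time linearly in rendered pixel count (a rough GPU-bound assumption)."""
    p1080, p1440, p4k = 1920 * 1080, 2560 * 1440, 3840 * 2160
    t1080, t4k = 1.0 / fps_1080, 1.0 / fps_4k      # seconds per frame
    frac = (p1440 - p1080) / (p4k - p1080)          # where 1440p sits between them
    t1440 = t1080 + frac * (t4k - t1080)
    return 1.0 / t1440

print(round(estimate_fps_1440(120.0, 60.0), 1))  # ≈ 95.3
```

In a CPU-bound scenario the estimate breaks down, of course, since frame time then barely moves with resolution at all - which is the whole premise of this thread.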
 
Cool. We're going to win this argument definitively. Resistance is futile!! Good riddance 720p!!!
 
To be fair I think a better combination would be 720p/1080p/4k

720p : pure CPU perf
1080p: most common res
4K: GPU limited

The point of 1440p eludes me a little bit.

@qubit thanks for the support bud :)
 
720p/1080p/4K

It's the 2017 version of low, medium, and high graphics settings, and is simple for just about everyone to understand in terms of gaming performance (with direct parallels to consoles).
 
TPU members have spoken and voted! It seems overly clear that 720p should be dropped if you really want/need to drop something...

 