
ViewSonic Readying a 27-inch 520 Hz OLED Gaming Monitor for CES 2025 Launch

Yet somehow I said that 90% of people can't distinguish anything higher than 100 Hz and that 500 Hz is ridiculous.

Strawman much?

Most probably you, yes. Where did you invent that random number "90%" from?

Windows ain't the same as Android/iOS though. Even Windows 11 has UI elements that don't scale, meaning if you're not at 100% they'll be fuzzy.

Microsoft is lazy. The only thing they need to do is add size/bold options to the fonts.
 
Most probably you, yes. Where did you invent that random number "90%" from?

Obviously you have a comprehension impairment and, apart from asserting that people are wrong, have nothing to reply with. lol.

Also, I like that the video you linked to was made 'in partnership with NVIDIA'... Right... seems legit.

It's just asking the conman whether this is a con...

And V-Sync and screen tearing: those have been solved by VRR, which has nothing to do with high refresh rates. That is a real feature, not just a big number to drive people to buy useless hardware.
 
Yeah, but OLEDs are rather new; the coatings on them, as well as text quality, haven't been the greatest.
As I already have an ultrawide monitor, I'd rather it had a higher resolution than 3440x1440, for instance.
I'm on a 34" IPS ultrawide as well.

5120x2160 @ 144 or 240 Hz OLED is nice, but that would have to be a 40-42" ultrawide for me, and it comes with a more taxing load on the GPU.

In the medium term I would love to see more 38" 3840x1600 @ 144 or 240 Hz OLED options; these would have a smaller hit on the GPU to run.
 
Or you are wrong, because for me it perfectly aligns with my experience. High refresh rate is a gimmick to sell hardware. You are just unwilling to admit you have been conned.
I agree, but only up to a point. Everyone is different & that of course includes their visual perception of high refresh rates. But 520Hz? This is getting ridiculous, imo. 165Hz is good enough for me & I play action games.
 
Ah, another chance for a big old refresh rate discussion, gotta love them! It's a fascinating topic though. So, to roll out my thoughts on it once again:

Everyone is different. Everyone has very different subjective experiences of refresh rate. You can blind test me & I can tell 240Hz vs 144Hz just from moving the mouse in a circle in Windows for a second. My dad, however, literally can't tell 60Hz from 30Hz! I've blind tested him a few times & he definitely can't tell. I don't believe I'm a super human, so if some people say they can see the difference between 520Hz & 240Hz then I'm inclined to believe that's possible, though I'd still like to see a blind test to confirm.

I've recently gone from a 42" 120Hz W-OLED to a 32" 240Hz QD-OLED, & the refresh rate difference wasn't super obvious the first minute I used the 240Hz. However, I've definitely become accustomed to it. I reused a DP cable I already had routed when I installed this 240Hz monitor & it occasionally drops back to 120Hz. What's interesting is that when it's happened on startup, or I missed the telltale flicker, I've still noticed pretty quickly that it's dropped to 120Hz; it's just a tiny bit less responsive.

So with all that in mind, I'm kinda tempted to try this 27" 1440p 520Hz monitor to see if I grow accustomed to 520Hz, or whether there is a point at which I just simply cannot tell, & cannot train myself to be able to tell. For some reason I find this topic absolutely fascinating.
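For what it's worth, here's a minimal sketch (in Python, with made-up trial and answer counts, nothing from an actual test) of how you could score such a blind test against pure guessing; a small p-value means the tester really can tell the modes apart:

```python
# Minimal blind-test scoring sketch; the 20 trials and 15 correct
# answers are hypothetical numbers for illustration, not real results.
from math import comb

def p_value(correct: int, trials: int) -> float:
    # One-sided binomial test against guessing (p = 0.5): the chance
    # of getting at least `correct` answers right by luck alone.
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2**trials

trials, correct = 20, 15
print(f"{correct}/{trials} correct, p = {p_value(correct, trials):.4f}")  # ~0.0207
```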
 
500+ Hz is ridiculous. I would bet that 90% of people are unable to distinguish anything beyond 100 Hz. It's just big numbers to compensate for something else, even though it's patently useless.
Agree. Personally I don't see any difference after 120Hz.
 
As someone who just recently went back to 27" screens from 32" screens: no, 27" isn't too small.
In this case I think everyone is referring to 4k specifically.

27" at 1440p is fine and pretty standard.

At 4K on 27" you have to start playing with scaling in Windows, etc.
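To put rough numbers on that (a sketch; PPI is just diagonal pixels over diagonal inches, and the sizes are the common panels discussed in this thread):

```python
# Rough pixel-density math behind the scaling point.
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return hypot(width_px, height_px) / diagonal_in

for name, w, h, d in [('27" 1440p', 2560, 1440, 27),
                      ('27" 4K', 3840, 2160, 27),
                      ('32" 4K', 3840, 2160, 32)]:
    print(f'{name}: {ppi(w, h, d):.0f} PPI')
# 27" 1440p ~ 109 PPI is comfortable at 100% scaling; 27" 4K ~ 163 PPI
# is why Windows pushes you toward 150% scaling there.
```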
 
In this case I think everyone is referring to 4k specifically.

27" at 1440p is fine and pretty standard.

At 4K on 27" you have to start playing with scaling in Windows, etc.
It's all subjective and depends on personal preference. My point was to show that the statement they made was not as definitive as they directly implied.

Agree. Personally I don't see any difference after 120Hz.
You could likely tell the difference in smoothness between 120 Hz and 240 Hz, but above 240 Hz it's dubious at best, as it enters the range of extreme diminishing returns.
 
It would be interesting if the monitor could toggle between 240 Hz 4K and 520 Hz 1440p. I think they still need to develop better anti-burn-in tech before I'd consider buying a $1k+ OLED.
 
It would be interesting if the monitor could toggle between 240 Hz 4K and 520 Hz 1440p. I think they still need to develop better anti-burn-in tech before I'd consider buying a $1k+ OLED.
A 240 Hz 4K monitor is not capable of 520 Hz at 1440p.
Instead it would be 480 Hz at 1080p, because 240×2 = 480, and 4K divided by 4 (in pixel count) is 1080p.
It must double or divide evenly.
 
A 240 Hz 4K monitor is not capable of 520 Hz at 1440p.
Instead it would be 480 Hz at 1080p, because 240×2 = 480, and 4K divided by 4 (in pixel count) is 1080p.
It must double or divide evenly.
The issue is that it's not integer scaling, so 1440p on a 4K panel wouldn't look quite as clear as integer-scaled 1080p. Bandwidth-wise, 1440p is 2.25× easier to run than 4K, so 4K 240 Hz equals 1440p 540 Hz; you could make a monitor that supports both of those modes, it just wouldn't look great at 1440p.
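The raw pixel-rate math checks out (a quick sketch; active pixels only, ignoring blanking intervals and Display Stream Compression):

```python
# Pixel-rate comparison behind the 2.25x claim.
modes = {'4K 240 Hz': 3840 * 2160 * 240,
         '1440p 540 Hz': 2560 * 1440 * 540}
for name, px_per_s in modes.items():
    print(f'{name}: {px_per_s / 1e9:.2f} Gpx/s')
# Both print 1.99 Gpx/s: (3840*2160) / (2560*1440) = 2.25, and
# 240 Hz * 2.25 = 540 Hz, so the two modes cost the same bandwidth.
```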
 
240Hz 4K monitor is not capable of 520Hz and 1440p.
Instead it will be 480Hz 1080p because 240*2=480 and 1080p because 4K/4=1080p.
It must evenly double or divide.
which is why I want a 27" 5K (5120x2880) 240 Hz / 1440p 480 Hz monitor :D
 
which is why I want a 27" 5K (5120x2880) 240 Hz / 1440p 480 Hz monitor :D
Aahh, that actually makes a lot of sense! ...I also now want that :D Though in 32" still, please; I can't go back to 27".
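The integer-scaling logic in numbers (a sketch of why the 5K panel maps cleanly to 1440p while 4K doesn't):

```python
# The per-axis scale factor has to be a whole number for each rendered
# pixel to land on an exact block of panel pixels.
panels = {'4K (3840x2160)': 3840, '5K (5120x2880)': 5120}
target_width = 2560  # 1440p
for name, width in panels.items():
    scale = width / target_width
    verdict = 'integer, crisp' if scale.is_integer() else 'non-integer, soft'
    print(f'{name} -> 1440p: {scale}x per axis ({verdict})')
# 5K/1440p = 2.0 (each 1440p pixel becomes a clean 2x2 block);
# 4K/1440p = 1.5 (pixels get smeared across neighbours).
```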
 
What stupid nonsense.
They should focus on other areas of improvement instead of a useless refresh rate no one can make use of anyway.
 
What stupid nonsense.
They should focus on other areas of improvement instead of a useless refresh rate no one can make use of anyway.
Why can't anyone make use of it? There are a few games that can be driven at 500+ fps at 1440p with competitive settings. You may think that type of gaming is just for the sweaty kids (& I'd agree tbh), but that is a market segment, & not an insignificant one.
 
Stop the trolling, bickering, arguing, etc.
Don't report problems and then go back and keep the BS going.
Stick to the topic.
 
I can actually see the 520Hz panel being useful-ish if it allows for a reasonably high refresh ceiling with little brightness loss while using BFI (black frame insertion). Should be an almost flawless motion performance experience then. A very niche use-case, but one that exists nonetheless.
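Back-of-envelope BFI math (a sketch with assumed numbers; the 1-lit : 1-black pattern is just one illustrative configuration, not anything ViewSonic has announced):

```python
# BFI trade-off sketch: alternating one lit and one black refresh on a
# 520 Hz panel. All figures are illustrative assumptions.
panel_hz = 520
lit_slots, total_slots = 1, 2                      # 1 lit : 1 black
content_hz = panel_hz * lit_slots // total_slots   # 260 Hz effective
persistence_ms = 1000 / panel_hz                   # each image lit for one slot
duty = lit_slots / total_slots                     # fraction of time lit
print(f'effective refresh: {content_hz} Hz')
print(f'persistence: {persistence_ms:.2f} ms '
      f'(vs {1000 / content_hz:.2f} ms sample-and-hold at {content_hz} Hz)')
print(f'brightness: ~{duty:.0%} of non-BFI')
```

So the high ceiling buys you 260 Hz content at half the sample-and-hold persistence, at the cost of half the brightness, which is where OLED headroom matters.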
 
It's actually amazing this dumb "discussion" is even going on; I don't know what fuels it that people can be so passionate about it...

Either way, watch some DF videos; they explain in detail why higher refresh rates help. Blur Busters, the experts on display motion blur, have said that an OLED needs about 1000 Hz before you can have a totally clean image; otherwise black frame insertion could help, but that is apparently really hard to get running perfectly.
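That ~1000 Hz figure follows from simple persistence math (a sketch; the 2000 px/s pan speed is just an assumed example):

```python
# Sample-and-hold motion blur rule of thumb: perceived smear is roughly
# eye-tracked object speed times frame persistence (1/refresh).
speed_px_s = 2000  # assumed pan speed in pixels per second
for hz in (120, 240, 520, 1000):
    blur_px = speed_px_s / hz
    print(f'{hz:>4} Hz: ~{blur_px:.1f} px of smear')
# Even 520 Hz leaves ~3.8 px of smear on a fast pan; around 1000 Hz the
# blur drops toward the 2 px range that reads as "clean".
```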


 
People can see unlimited Hz, which means people will be able to tell the difference between 1000 Hz and 10,000 Hz.

I think it is right to claim something else: humans adapt. If you use a 60 Hz screen and see nothing better, it will be fine for you; if you use a 500 Hz screen for some time and see nothing better, it will be fine for you; if you use a 10,000 Hz screen and see nothing better, it will be fine for you. Until you see that difference, and your brain recognises the existence of that better thing.

Cellphones have very high pixel density per area, like 400 or 500 PPI, which corresponds to ultra-high crispness and image quality.
The assumption here is wrong to begin with. Higher is not universally better. It applies to ppi, to refresh rates and even things like brightness.

Higher peak FPS and higher refresh = more frametime variance. Higher brightness = more tiring on eyes and likely less color accurate; higher resolution = more processing power and a reduced competitive edge in various games. Etc etc

It's exactly about what you grow used to. The longer you stay on something, the more you adapt to it. So it's entirely true that higher isn't better; it's just higher. I've got and played on high refresh and the difference is there. But it doesn't make or break much.

There is actually only one certainty with 'bigger and higher' numbers: they are actively used to funnel higher amounts of money out of your wallet. The rest is just perception.
 
Why can't anyone make use of it? There are a few games that can be driven at 500+ fps at 1440p with competitive settings. You may think that type of gaming is just for the sweaty kids (& I'd agree tbh), but that is a market segment, & not an insignificant one.
I am 200% certain no one can tell the difference between 120 and 500 Hz in a blind test. This is a marketing gimmick and a useless product at best.
These so-called competitive gamers are nothing but a bunch of clowns who parrot nonsense they heard from other Counter-Strike "patients". Or their sponsors.
 
MFs on 120-240 Hz LCDs arguing with 100% confidence that most people can't see past 100 Hz is something special.
The motion clarity difference is insane across 60/90/120 Hz OLED screens (not just a scrolling test [at the same touch sample rate] but actual high-fps video).
But dropping ad hominems after someone points out clear flaws in your logic is just off-putting.
 
I am 200% certain no one can tell the difference between 120 and 500 Hz in a blind test. This is a marketing gimmick and a useless product at best.
These so-called competitive gamers are nothing but a bunch of clowns who parrot nonsense they heard from other Counter-Strike "patients". Or their sponsors.

Point is, it does not matter what you are certain of or what you believe to be true; just don't go spreading unsubstantiated beliefs around.
 
The assumption here is wrong to begin with. Higher is not universally better. It applies to ppi, to refresh rates and even things like brightness.

Higher peak FPS and higher refresh = more frametime variance. Higher brightness = more tiring on eyes and likely less color accurate; higher resolution = more processing power and a reduced competitive edge in various games. Etc etc

It's exactly about what you grow used to. The longer you stay on something, the more you adapt to it. So it's entirely true that higher isn't better; it's just higher. I've got and played on high refresh and the difference is there. But it doesn't make or break much.

There is actually only one certainty with 'bigger and higher' numbers: they are actively used to funnel higher amounts of money out of your wallet. The rest is just perception.
Or you could just buy the older and now cheaper model, since the new bigger-and-better number is actively used to funnel higher amounts of money out of your wallet.

Higher is not universally better, but there are already many metrics which can be more specific in addressing specific wants. Marketing is marketing, and it will always try to throw out simple sentences featuring big numbers. But if there are reviewers whose livelihoods depend on accurate measurement, reporting, and analysis of the benefits of metrics like %APL brightness, response times, and such, then I believe there is a reason people care.

A higher number does not equal better until you define the what, how, and result of "better". A 10,000-nit edge-lit display may be no more useful than 1,000 nits, but 2,000 nits vs 1,000 nits at 10% APL, or, all else being the same, 1,024 dimming zones vs OLED: man, that lightning looks so good. Or 360 Hz OLED vs 240 Hz LCD: man, those shots I fired are straight liquid; did I really just do that?

High-end studio monitors can pump high brightness with unmatched colour reproduction, high resolution allows you to see more detail from the same source file, and higher motion resolution allows you to see a moving object more clearly. There is constant demand for "higher" and "better"; some people are mistaken, some are led astray, some are straight-up lied to about the benefits of bigger numbers. But those things happen because of that demand.

On the other hand, higher frametime variance doesn't equal bad either. Sure, a frametime graph jumping between 1 ms and 20 ms is going to feel brutal in any game, but buy™ a 9800X3D and 4090 rig and the frametimes may very well become 1-4, 1-3 or 1-2 ms (you could go below 1, but then you'd need state-of-the-art equipment for truly diminishing returns for the purpose of HCI with a monitor, keyboard and mouse). Throw in a 200+ Hz OLED, and I'm somewhat confident that setup produces a more enjoyable result than the few guys arguing that 1xx Hz is enough, most can't feel it, it's diminishing returns, etc. would have you believe.

With the rig attached to my profile, which I must admit is not the GOAT setup to date, I hit ~130 fps 0.1% lows in my competitive anger-management software of choice, on my 170 Hz non-strobed (at least not a good implementation) M27Q (rev. 2, because I got scammed by Gigabyte). But it's been a few years now and those are rookie numbers; why not find out the engine limits (for % lows) with a NASA PC? You could always buy a "90% of" product for less than "90% of" the cost. And that's what I'm waiting for: a cheaper OLED. Not this one, of course.

Except the GPU market; fk the GPU market.

I am 200% certain no one can tell the difference between 120 and 500 Hz in a blind test. This is a marketing gimmick and a useless product at best.
These so-called competitive gamers are nothing but a bunch of clowns who parrot nonsense they heard from other Counter-Strike "patients". Or their sponsors.
 

[Attachment: Untitled.png]