
Alienware 500 Hz Gaming Monitor Leaks Ahead of CES Reveal

I think most of the criticism in this thread is about 500Hz being a joke when not even 240Hz displays can really do what they claim. There is only one LCD display made to date (the Odyssey G7) that can actually draw a whole frame in 1/240th of a second, and that's with caveats.
Yeah, the issues are everywhere: G2G, input frame info, server tick rates, frame time variance... and that's omitting the actual performance of your own PC to begin with. This is pure marketing BS, if not an outright lie.
 
Motion blur comes from two sources.
Motion blur in games is far higher than in the real world. I think that's because the typical graphics card (a GTX 1060, going by the Steam survey) isn't fast enough to render fast-moving game scenes at high resolution, so blur effects are used to mask the reduced detail.
 
Yeah, the issues are everywhere: G2G, input frame info, server tick rates, frame time variance... and that's omitting the actual performance of your own PC to begin with. This is pure marketing BS, if not an outright lie.
Which is why I wish manufacturers would focus on strobing/BFI :)

My CPU and GPU are capable of >120fps averages with minimal dropped frames. Sure, I might be using DLSS and medium settings on a 5800X and 3070, but even 120fps is demanding in some games. I just want that 120Hz experience to be as visually good, clear, and fluid as it can be.

500fps? LOL. Sign me up for a time machine so I can grab an i9 18900KS and RTX 8090Ti please!
 
That's maybe something for the 0.01% of competitive gamers who can actually use this... (and of course not those who pretend to be competitive but play just as badly at 144Hz or 500Hz alike).
I feel most "competitive gamers" are like that and just want to think of themselves as competitive when they're nowhere near the skill level where any gear matters :D Back in the day I was captain of the Finnish national team at the Quake 2 world championships two years in a row, and I've done other competitive stuff too, but I feel most people who think of themselves as competitive are really casuals. Many of them have a streaming setup but don't actually stream or have viewers... and most have a racing chair :D No offense to anyone, sorry if this sounds harsh :) As you can see, I'm a long-time member and haven't posted often (just 7 posts including this one), but I felt I needed to comment on that "people thinking they need competitive gear" thing, because that's how I see it.
 
I used to play in sponsored Q3A clan tournaments and even when I was at my best (I am too old and slow now) I chose 85Hz at 1024x768 over 120Hz at 640x480 because the refresh rate wasn't the bottleneck, it was always about getting your mouse and mat combo right.

I feel like 85Hz would be a distinct disadvantage on an LCD these days. 85Hz sounds low, but the pixel response time on those CRTs was a true 0ms (actually 0.01-0.02ms) with zero sample-and-hold blur. Many of us old-school competitive gamers will probably still say that LCD displays never really caught up; I think they finally have, but it took LCDs around 25 years to do so and it's not like-for-like either, because CRTs still outperform every other technology for sample-and-hold blur, which affects motion clarity more than anything else.

I'm not giving up my strobing 120Hz Odyssey. There are far too many other advantages to modern displays, but sample-and-hold blur is the key LCD disadvantage all of these high-refresh gaming displays are trying to fight and in that respect the CRT is still unbeaten. We're just finally getting to the point where it's good enough, either at 240fps at 240Hz, or at lower framerates with strobing/BFI.

500Hz would be an upgrade over 240Hz, but only if the technology behind it can actually generate images in 1/500th of a second. IPS cannot, for sure - and very few combinations of GPU/CPU/game engine can, either.
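
For anyone wondering why sample-and-hold blur is the thing to beat, here's the back-of-envelope persistence math as a rough sketch: perceived blur width while eye-tracking is roughly motion speed times how long each frame is held on screen. The 960 px/s panning speed and the ~1ms strobe pulse below are assumed illustrative values, not measurements.

```python
# Back-of-envelope blur estimate: blur width (px) ~= motion speed (px/s) * persistence (s).
# Illustrative numbers only; the panning speed and strobe pulse width are assumptions.

def blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960  # px/s, a typical test-pattern panning speed (assumed)

cases = [
    ("60 Hz sample-and-hold",  1000 / 60),   # frame held for the full refresh
    ("120 Hz sample-and-hold", 1000 / 120),
    ("240 Hz sample-and-hold", 1000 / 240),
    ("500 Hz sample-and-hold", 1000 / 500),
    ("120 Hz strobed (~1 ms pulse)", 1.0),   # CRT/BFI-like short persistence
]

for label, persistence in cases:
    print(f"{label:30s} ~{blur_px(speed, persistence):4.1f} px of blur")
```

By that yardstick, 500Hz sample-and-hold only just gets into the same ballpark as a strobed or CRT-like display, which lines up with the "finally good enough" sentiment above.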
 
Back in the day (see my Quake 2 post just before yours), I had a 22" Trinitron running 800x600 at 204Hz for Quake games. I definitely felt that 204Hz was a good thing, but I wouldn't have lost a match for only having 100Hz, to name something.
 
Exactly.

I later (2002? Post tournament years for sure) bought my last CRT which was a Mitsubishi Diamondtron capable of 1600x1200@85Hz, but it let me run 640x480 at whatever my Geforce3 could handle, 240Hz, I think. I still played a lot of Q3A and CS back then and there was no competitive edge from the higher refresh. If anything, I found the reduced details needed to hit high framerates a bigger problem.

Competitive eSports titles are typically optimised for 120Hz at most now. Sure, the engine may run locally at higher framerates, but you're relying on client-side prediction, which means your drawn frame isn't actually the truth of what happened. "OMG I'm sure that was a perfect headshot" didn't actually happen on the game server. Your GPU made it up, locally, as a lie.
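
To unpack the "your GPU made it up" point, here's a toy sketch of client-side prediction (hypothetical names and structure, not any actual engine's netcode): the client draws a locally predicted state at whatever framerate it likes, while only the server's lower-rate tick decides what actually happened.

```python
# Toy sketch of client-side prediction vs. an authoritative server
# (hypothetical structure, not any real engine's netcode).

class PredictingClient:
    def __init__(self):
        self.confirmed_pos = 0.0    # last position the server acknowledged
        self.pending_moves = []     # moves sent to the server, not yet confirmed

    def render_frame(self, move):
        # Runs at your local framerate (hundreds of fps if you like): predict ahead
        # so the screen responds instantly. This predicted frame is the "lie".
        self.pending_moves.append(move)
        return self.confirmed_pos + sum(self.pending_moves)

    def on_server_tick(self, authoritative_pos, acked_moves):
        # Runs at the server tick rate (e.g. 60-128 Hz): only this result counts.
        # Hit registration ("that perfect headshot") is decided here, not locally.
        self.confirmed_pos = authoritative_pos
        self.pending_moves = self.pending_moves[acked_moves:]

client = PredictingClient()
print(client.render_frame(0.5))             # drawn immediately from prediction: 0.5
client.on_server_tick(0.4, acked_moves=1)   # server says the move actually landed elsewhere
print(client.render_frame(0.5))             # reconciled: 0.4 + 0.5 = 0.9
```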
 
I have nothing to add, as you said it all pretty much :) I agree with everything you said. It's just this new competitive-casual crowd that wants better gear even when it doesn't matter. I know this sounds a bit hostile to some, but I have no bad feelings; it just feels like people want to feel important. :) EDIT: I was top 20 in Europe in Guild Wars 2, to name something. Other games too, but it was never about having the best Hz :D It was mostly about understanding the art of war. If you're curious, SteelSeries contacted me about Guild Wars 2 and wanted to sponsor me; I said no and left the game. I don't do games as a job. Huge respect btw, as you're a Quake 3 player; lots of good FPS players there and an insanely high skill level.
 
On an iPhone with a shitty screen protector it takes 0.723 seconds. I'm sure someone with a faster screen and a mouse can be faster. Let's try it?

 
And I've always said "why not just wait for 1kHz" when people have suggested a high refresh monitor for me. A year or two and those will probably hit the market too...
 
I kind of want to go to CES this year. It looks like we are going to get some crazy tech reveals.
 
On an iPhone with a shitty screen protector it takes 0.723 seconds. I'm sure someone with a faster screen and a mouse can be faster. Let's try it?

My bestest on gamer gear was 219ms. I'm old :(

Still winning a good number of Apex matches though, and keep saving my team in Darktide :D
 
I'm sure you already know this, but I want to say it: we do have 240 Hz OLED monitors.
Appreciate the heads up about the LG. I’ve been keeping an eye on the AW3423DW but the triangular pixel setup and the resulting fringing/text clarity turned me off. Not to mention the risk of burn-in after extended use.
 
It'll be popular with the professional e-sports crowd, where the smaller size works best for slapping a dozen PCs right next to each other.

99% of techpowerup commentators can't see past 2 fps.
Forums update at one frame per F5, who needs a higher refresh rate?

On an iPhone with a shitty screen protector it takes 0.723 seconds. I'm sure someone with a faster screen and a mouse can be faster. Let's try it?

Me on my 70Hz VA display
 
On an iPhone with a shitty screen protector it takes 0.723 seconds. I'm sure someone with a faster screen and a mouse can be faster. Let's try it?

Best of 3 runs:

255ms on a wireless mouse and VA TV at 60Hz
13ms of input lag on the TV according to RTINGS, and 17ms between frames.

217ms on a wired mouse and VA monitor at 240Hz
4ms of input lag on the monitor according to RTINGS, and 4ms between frames.

So, there should be a 22ms difference due to the displays, add a bit for the wireless mouse, and a bit of variance because I'm made of meat, not silicon. 500Hz displays sure do seem a bit pointless when you're only going to gain 2-3 milliseconds; 90% of the problem is me, and the other 9% is stuff unrelated to refresh rate.
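
For what it's worth, the 22ms figure falls out of a very simple model: the quoted RTINGS input lag plus one refresh interval per display. Real scanout and pipeline timing are messier, so treat this as a rough sketch using the numbers above.

```python
# Crude display-delay model: quoted input lag + one full refresh interval.
# The lag figures are the RTINGS numbers quoted above; the model is a simplification.

def display_delay_ms(input_lag_ms: float, refresh_hz: float) -> float:
    return input_lag_ms + 1000.0 / refresh_hz

tv_60hz   = display_delay_ms(13, 60)    # ~29.7 ms
mon_240hz = display_delay_ms(4, 240)    # ~8.2 ms

print(f"60 Hz TV:       {tv_60hz:.1f} ms")
print(f"240 Hz monitor: {mon_240hz:.1f} ms")
print(f"difference:     {tv_60hz - mon_240hz:.1f} ms")  # ~21.5 ms, close to the 22 ms above
```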
 
The thing is, using a browser like that is a worst-case scenario:
you're running with Vsync on and hitting that Vsync limit, so in a 60Hz setup it's adding at least 50ms of latency. (This is what running a -2 FPS cap is all about: cutting off that plain old render latency.)

If I can get 210 from a web browser with a 50ms delay, I'm going to do a lot better in a proper low-latency environment or on a high refresh display.
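
Rough numbers for where that ~50ms comes from, under a simplified model where Vsync back-pressure leaves a few whole frames queued ahead of scanout. The 3-frame queue depth is an assumption (browsers and drivers vary), so this is a sketch rather than a measurement.

```python
# Simplified Vsync back-pressure model: each queued frame costs one refresh interval.
# The queue depth is an assumption; real pipelines vary by driver, compositor and game.

def queued_latency_ms(refresh_hz: float, queued_frames: int) -> float:
    return queued_frames * 1000.0 / refresh_hz

# Vsync on at 60 Hz with ~3 frames in flight: roughly the 50 ms mentioned above.
print(queued_latency_ms(60, 3))   # 50.0

# Capping FPS a couple below the refresh rate keeps the queue drained,
# so you pay roughly one refresh interval instead.
print(queued_latency_ms(60, 1))   # ~16.7
```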
 
I can tell the difference between 120Hz and 144Hz, but it's subtle.
1. I can too, but only on the desktop while moving windows around.

In a game? Not at all. A picture in motion has too much happening at once for me to notice the difference between 120 and 144 fps/Hz.

2. All this talk of refresh rate made me remember that I switched my phone to 60Hz a few weeks ago to save battery (from 120 Hz). I noticed the difference for the first few hours and then promptly forgot that there was even a difference. :laugh:

Ah brain you beautiful adaptive thing.

3. Back in the day I used to play Crysis 2 at 20 fps (everything on lowest possible settings) and remember having shit tons of fun with it. Even beat it on hardest difficulty.

Sigh. I wish I had gotten a 60 Hz HDR400 IPS Monitor instead of wasting money on a basic 144 Hz IPS Monitor.

4. I have noticed (in my limited testing) that tearing is a lot less noticeable when fps is less than half the refresh rate. Like I don't notice tearing at 50 fps on my 144 Hz monitor but I definitely notice it at 100 fps on my 144 Hz monitor.

Maybe that's a good use case for 500 Hz? No visible tearing at 200 fps. :p
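
On point 4, the rough model I'd use (assuming every unsynced buffer flip lands somewhere inside an active scanout, which is approximately true): tear lines per second track the framerate, and the share of refreshes containing a tear scales with fps relative to the refresh rate. An illustrative sketch:

```python
# Rough tearing model with sync off: roughly one tear line per presented frame,
# and the fraction of refreshes containing a tear scales with fps / refresh rate.
# Simplified; it ignores vblank time and assumes unsynced flips land mid-scanout.

def tears_per_second(fps: float) -> float:
    return fps

def torn_refresh_fraction(fps: float, refresh_hz: float) -> float:
    return min(fps / refresh_hz, 1.0)

for fps in (50, 100):
    print(f"{fps:3d} fps on 144 Hz: ~{tears_per_second(fps):.0f} tears/s, "
          f"~{torn_refresh_fraction(fps, 144):.0%} of refreshes torn")
```

By the same logic, 200fps on a 500Hz panel wouldn't tear any less often, but each torn refresh would only sit on screen for about 2ms, which is presumably why it stops being noticeable.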
 
@WhoDecidedThat

1. Definitely on the desktop. In a game, it depends on the scene and what I'm doing. If it's something requiring fast reflexes, then it helps for sure.

2. My iPad Pro does 120Hz. Switching that off results in awfully juddery-looking animation. However, my iPhone is 60Hz and it's much less noticeable there, probably because the screen is so much smaller.

3. I couldn't play any game at 20fps.

4. It's true, when the framerate is that high, the tears are much more fleeting and small, so they can be hard or impossible to see. The best solution, of course, is adaptive sync. Since I bought my G-SYNC compatible monitor, it's been like night and day. On top of that, I no longer feel the need to upgrade my graphics card. My ancient 2700K, however, is another matter, but it can still game competently in most titles in the meantime. I've had that CPU for 11 years now and it still feels fast. It's been one helluva great investment! :cool: I'll be sorry to see it go once I upgrade.
 
It's true, when the framerate is so high, the tears are much more fleeting and small, so can be hard or impossible to see.
To have any tearing at all at 500Hz, you'd need a framerate above 500FPS
And then any given tear would only be visible for about 2ms at most (one 500Hz refresh)


Your adaptive sync is still Vsync on, so zero tearing is possible
(Vsync off disables adaptive sync/Gsync/Freesync)

Vsync on, VRR (variable refresh rate, whatever the brand name) on, and an FPS cap 2 under your refresh rate, and you'll get that amazing experience you're after
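
Putting numbers on that recipe, this is just the refresh interval and the usual "refresh minus a couple" cap; the 2-frame margin is the common rule of thumb rather than a hard requirement:

```python
# Refresh interval per rate, plus the common "cap a couple FPS under refresh" rule of thumb.

def refresh_interval_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

def suggested_cap(refresh_hz: int, margin: int = 2) -> int:
    return refresh_hz - margin

for hz in (60, 144, 240, 500):
    print(f"{hz:3d} Hz: {refresh_interval_ms(hz):5.2f} ms per refresh, cap at ~{suggested_cap(hz)} fps")
```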
 
It's possible to have tearing when the GPU FPS is below the monitor's refresh too, as all that matters is that they're not synced for a new frame to start being drawn halfway through a scan cycle. The exact amount of tearing and how visible it is depends on various factors though, including the monitor refresh, GPU FPS and game engine*. I know, because I've seen it myself: UT2004 on an FX5200 will produce this nicely as the card is so low performance it couldn't even hit 60FPS on a 60Hz CRT much of the time. It really looked a mess lol. Man, that takes me back 20 years... btw, I still have that card, a nice Asus AGP low profile one with VGA output only. Hasn't been used for years now.

As far as my VRR setup is concerned, I have it set to VSYNC on above the monitor's refresh rate. When I play CoD (online or single player) my ancient 2700K CPU can't hit 144Hz anyway, even at low res, so I'm always in the G-SYNC range**. It's absolute bliss I tell you. Having experienced it now, I can say that VRR is one of the best advancements in PC tech for years and significantly reduces the need to upgrade the graphics card for improved framerate. Good thing too at today's ridiculous prices.

*This applies whether the GPU rendering is above or below the monitor refresh rate. Agreed that with a 500Hz refresh rate it's likely impossible to ever see it and that's a Good Thing. I certainly noticed very little tearing when running the monitor at 144Hz and the GPU hitting 200-400 FPS VSYNC off in older games, so at 500Hz, I doubt anyone will ever see it, although it's still there.

**Sometimes it even drops below 48FPS, the bottom of the G-SYNC range for my monitor and looks bad, but don't tell anyone.
 
It's possible to have tearing when the GPU FPS is below the monitor's refresh too
Only if you run Vsync forced off, which stops your variable refresh rate technology from working anyway
 
Only if you run Vsync forced off, which stops your variable refresh rate technology from working anyway
VSYNC can be set to switch off above the monitor's refresh rate, with G-SYNC taking over below it. Sounds like it might work differently on AMD perhaps?
 
VSYNC can be set to switch off above the monitor's refresh rate, with G-SYNC taking over below it. Sounds like it might work differently on AMD perhaps?
That's not what he said is happening

If he had adaptive sync and adaptive v-sync (very different things), he'd only get tearing above refresh rate
 
That's not what he said is happening

If he had adaptive sync and adaptive v-sync (very different things), he'd only get tearing above refresh rate
I suspect we might be at cross purposes somewhere.

I know how these things work and haven't said anything incorrect, which you seem to think I have.
 