
BOE Creates 27-Inch Full HD Display With 500 Hz Refresh Rate

Joined
Dec 17, 2011
Messages
359 (0.08/day)
Considering that the human eye cannot distinguish single frames above 80 Hz and cannot perceive a framerate above 220 Hz, a 500 Hz framerate is a waste. Give us a quality 240 Hz display and call it a day.
Source please?

The 80 Hz one makes sense to me anecdotally, though. I have noticed that when increasing fps up to and beyond 70-80 fps on my 144 Hz screen, there is a noticeable improvement in smoothness/butteriness.
 
Joined
Feb 3, 2005
Messages
499 (0.07/day)
Sure, whatever you say.

Nobody's being dismissive. But this is early adopter stuff, not many will care about it at this point.
And then there's diminishing returns.
Also, it's Chinese. Until someone reviews this properly, we can't tell how many corners were cut.

BOE supplies a lot of displays to major brands.
 
Joined
Jun 10, 2014
Messages
3,019 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Considering that the human eye cannot distinguish single frames above 80 Hz and cannot perceive a framerate above 220 Hz, a 500 Hz framerate is a waste. Give us a quality 240 Hz display and call it a day.
It's been around a decade since I looked into the research on this topic, and I believe the limit is in the ~200 Hz range, as you said. But it is worth mentioning that it's situation-dependent and even varies to some extent between individuals.

The most important takeaway, though, is that human vision is much more sensitive to smoothness of motion than to detecting individual frames. So while having >60 Hz is certainly useful, frame rate consistency is even more important. Years ago, I conducted an experiment rendering at ~60 FPS (on a 60 Hz panel) with stutter in the ~1-2 ms range vs. <0.1 ms, and the difference was easily noticeable. So in order for higher frame rates to be useful, the computer needs to be able to produce new frames with correspondingly higher timing precision. The reason high frame rates are advantageous is not that details appear earlier on the screen; it's mostly that it's easier for the brain to filter out what is actually moving. And stutter is the worst enemy of this, as it distracts the brain when processing the image. I know I'm fairly sensitive to stutter, and I find it quite straining.

So 500 Hz is not just wasteful because people can't see the difference; it's also a bad idea because it cuts the tolerances for frame rate consistency in half, to the point where the picture can become noticeably worse. At 500 Hz there are only 2 ms between frames, and with the precision of the Windows scheduler you will struggle to keep good consistency at these rates.
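To put some rough numbers on that tolerance argument (a back-of-the-envelope sketch of my own, not from any study - the 1 ms jitter figure is only an illustrative assumption), the same amount of frame-time jitter eats a much larger share of the frame budget as the refresh rate climbs:

Code:
# Sketch: how much of the frame budget a fixed amount of frame-time
# jitter consumes at various refresh rates. 1.0 ms is an assumed,
# illustrative figure for what a busy OS scheduler can add.
JITTER_MS = 1.0

for hz in (60, 144, 240, 360, 500):
    budget_ms = 1000.0 / hz                # time available per frame
    share = 100.0 * JITTER_MS / budget_ms  # jitter as % of that budget
    print(f"{hz:>3} Hz: {budget_ms:5.2f} ms per frame, "
          f"{JITTER_MS:.1f} ms of jitter = {share:3.0f}% of the budget")

At 240 Hz that 1 ms is roughly a quarter of the frame budget; at 500 Hz it is half of it, which is the halved tolerance described above.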

But I believe no one has addressed the biggest elephant in the room: can games even produce unique frames at this rate?
Modern game engines work at a fixed tick rate, and if you render frames at a higher rate than this, the GPU will just render multiple identical frames, rendering the 500 Hz screen utterly pointless (pun intended).
A few years ago, I remember CS:GO had a 120 Hz tick rate (30 Hz server), and 60-100 Hz was fairly typical. I haven't checked the most recent games, but I doubt there are many running at >120 Hz.
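For anyone unfamiliar with what a "fixed tick rate" means here, below is a minimal sketch of the classic fixed-timestep loop (a generic illustration in Python, not taken from any particular engine; the 120 Hz figure is just the example rate from above):

Code:
import time

TICK_RATE = 120           # simulation ticks per second (example figure)
TICK_DT = 1.0 / TICK_RATE

def run(update, render, target_fps=500):
    """Simulate in fixed steps, render as often as the frame cap allows.
    When target_fps exceeds TICK_RATE, many renders happen between ticks
    and draw the exact same world state (unless the engine interpolates
    between the last two ticks)."""
    frame_dt = 1.0 / target_fps
    accumulator = 0.0
    previous = time.perf_counter()
    while True:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        while accumulator >= TICK_DT:   # advance simulation in fixed steps
            update(TICK_DT)
            accumulator -= TICK_DT
        render()                        # may redraw an unchanged world state
        time.sleep(max(0.0, frame_dt - (time.perf_counter() - now)))

Engines can also interpolate between the last two ticks when rendering, in which case the extra frames aren't literally identical, but the underlying simulation still only advances at the tick rate.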
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.88/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
This has to be the dumbest thing the tech industry made in quite some time.
It's "future tech" - commercial protoypes

A 500 Hz monitor is of no use to gamers, but in certain industries it'd be magical... imagine if you were testing high-frame-rate, slow-motion videography.


The 8K 120 Hz one is definitely made for commercial purposes and not for home users; they could literally use that for a small cinema display, or slap it on the outside of buildings like they do in NYC.
 

bug

Joined
May 22, 2015
Messages
13,960 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Chinese does not automatically mean it's not true. They make a lot of stuff that checks out. Besides, a 480 Hz prototype panel was demoed nearly 5 years ago:
I didn't say it was not true. But their standards tend to be different from ours. Quite often.
Besides, this is just a prototype; we don't know whether it's even meant to hit retail this decade.
 
Joined
Dec 9, 2007
Messages
38 (0.01/day)
System Name Boris
Processor C2D Q6600 @ 3.6ghz 24/7 H20
Motherboard Gigabyte GA-X48-DS4
Cooling 2x240mm + 1x120mm + 1x360mm rads one loop.
Memory 4gb Corsair XMS2 6400 @ 1100mhz
Video Card(s) 2x ASUS HD4870 XFire H20
Storage 2x320gb Seagates
Display(s) 2 x 22" AOC + 1 x 24" AOC extended desktop
Case Boris
Audio Device(s) X-Fi xtreme music
Power Supply Enermax Liberty 620W
Software Vista x64
Benchmark Scores 1 Penis push up, then it broke.
Having switched from a rubbish Samsung Odyssey G7 240 Hz to an LG C1 120 Hz, the C1 is LEAGUES better than the G7. LCD is just a blurry mess and can barely display 60 Hz without blur (or some hacky backlight flickering that doesn't work with G-Sync/FreeSync). LCD either needs to die or substantially improve, otherwise 500 Hz on this display is going to be entirely pointless. The proof will be in the pudding, though.
 
Joined
May 21, 2009
Messages
275 (0.05/day)
Processor AMD Ryzen 5 4600G @4300mhz
Motherboard MSI B550-Pro VC
Cooling Scythe Mugen 5 Black Edition
Memory 16GB DDR4 4133Mhz Dual Channel
Video Card(s) IGP AMD Vega 7 Renoir @2300mhz (8GB Shared memory)
Storage 256GB NVMe PCI-E 3.0 - 6TB HDD - 4TB HDD
Display(s) Samsung SyncMaster T22B350
Software Xubuntu 24.04 LTS x64 + Windows 10 x64
One word, why?

Because I like to feel like a champion in Counter-Strike



:)
 
Joined
Dec 17, 2011
Messages
359 (0.08/day)
Modern game engines work at a fixed tick rate, and if you render frames at a higher rate than this, the GPU will just render multiple identical frames
Uh, what? I haven't looked much into this, but I know for sure that Doom Eternal has its tick rate tied to fps. I have seen the relevant console commands, and speedrunners have to deal with the implications of this in their runs.

Having switched from a rubbish Samsung Odyssey G7 240 Hz to an LG C1 120 Hz, the C1 is LEAGUES better than the G7. LCD is just a blurry mess and can barely display 60 Hz without blur (or some hacky backlight flickering that doesn't work with G-Sync/FreeSync). LCD either needs to die or substantially improve, otherwise 500 Hz on this display is going to be entirely pointless. The proof will be in the pudding, though.
So you're saying that something like the 175 Hz Alienware OLED is gon be gud?
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.88/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Uh, what? I haven't looked much into this, but I know for sure that Doom Eternal has its tick rate tied to fps. I have seen the relevant console commands, and speedrunners have to deal with the implications of this in their runs.


So you're saying that something like the 175 Hz Alienware OLED is gon be gud?
Some shitty games use low server tick rates.

As an example, PUBG did this and it varied per region - after some big fancy upgrades, Americans got a whopping 60 Hz tick rate, while us Aussies got 20 Hz.
That led to a lot of "what shot me, I was behind cover" moments and so on.
Fortnite was 30 Hz, and so on.

It not only varies between games, but also within a match itself... so it'll speed up at the end of the game as fewer players are alive, but run like a dog's ass early on with all 100 players.
The info below is screencaps from the highlights of this video:
PUBG 60Hz Tickrate Update 14 Netcode Analysis - YouTube
[screenshot]

Early PUBG
[screenshot]

Updated PUBG
[screenshot]
(Aussie PUBG is the red bar at the bottom)
(It's almost like they want to save money on the servers)

Now, is that relevant here? Not really, because not every game does things this way, and since you can't sync your PC's refresh rate to the server's ticks due to distance, a faster monitor refresh rate does give you a higher chance of receiving the visual update before your opponent, if they don't have a ping advantage.

Examples with the math are in the screenshot below. One critical example is how Fortnite keeps the network latency much, much lower despite only running at 30 Hz. Anti-cheat, server location, server power - all sorts of things add up far beyond just tick rate.
[screenshot]
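For a rough feel for the numbers (a simplified model of my own, not taken from the video - it ignores render time, input sampling and pixel response), the average wait before you can see a server-side event is roughly half a tick interval, plus the one-way network latency, plus half a refresh interval:

Code:
# Simplified, illustrative model of the average "event happens on the
# server -> it appears on your screen" delay: half a tick (the event
# waits for the next server tick on average), the one-way network
# latency, and half a refresh (the finished frame waits for the next
# scanout). Ignores render/processing time and pixel response.
def avg_visible_delay_ms(tick_hz, one_way_ping_ms, refresh_hz):
    return 500.0 / tick_hz + one_way_ping_ms + 500.0 / refresh_hz

for tick_hz in (20, 30, 60):
    for refresh_hz in (144, 360, 500):
        d = avg_visible_delay_ms(tick_hz, 30, refresh_hz)
        print(f"tick {tick_hz:>2} Hz, 30 ms ping, {refresh_hz} Hz panel"
              f" -> ~{d:.1f} ms")

With a 20 Hz tick the half-tick term alone is 25 ms, so tick rate and ping dominate; going from 360 Hz to 500 Hz only shaves a fraction of a millisecond off the display term, which is why a faster panel can only decide who sees the update first when everything else is roughly equal.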
 

Joined
Jul 5, 2013
Messages
28,660 (6.79/day)
The only good thing that refresh rates higher than 240 Hz give is less eye strain thanks to reduced blur, but besides that we still need proof of any benefit.

Good topic about eye strain: https://forums.blurbusters.com/viewtopic.php?t=8446

What helped me contain that eye strain was blocking the constant blue light. I had to buy special glasses that block almost all blue light, and so far they have been a lifesaver. They really work great.

Another article about glasses that block blue light.

Not everyone suffers from eye strain. For example, it doesn't affect me at all.

Source please?
I would if I could remember where I read it, but I can't find it. It was a university study done some 15-odd years ago. Sorry. This is known science though; if you go looking you'll find it. I know there's been more research done since then.

But it is worth mentioning that it's situation-dependent and even varies to some extent between individuals.
The most important takeaway, though, is that human vision is much more sensitive to smoothness of motion than to detecting individual frames.
This is true. However, where flat-panel displays are concerned, we can't perceive framerate changes above 200 Hz to 220 Hz. About 8 years ago, a few companies were testing out 480 Hz displays with 480 Hz content and comparing them to 60 Hz, 120 Hz and 240 Hz displays. They did this at my local Best Buy. Everyone could see the difference between 60 -> 120 -> 240 Hz. But when they switched to 480 Hz, no one could tell much of a difference, if any. This is why it never took off. 480 Hz displays have been made, but there's no point, as the human eye just can't see the difference.
 
Joined
Feb 18, 2005
Messages
5,889 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) Dell S3221QS(A) (32" 38x21 60Hz) + 2x AOC Q32E2N (32" 25x14 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G604
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Joined
Dec 17, 2011
Messages
359 (0.08/day)
480 Hz displays have been made, but there's no point, as the human eye just can't see the difference.
The fps needed to take advantage of such a high refresh rate would only be feasible in competitive multiplayer games (and even there, the improvement over existing 360 Hz will be limited). But since those have at most a 128 Hz tick rate, having the fps be 4 times the tick rate doesn't really make a lot of sense.

For leisure games like Metro Exodus, God of War, Sekiro, etc., even 120 Hz is more than plenty fast, not to mention the hardware needed to reach 100+ fps in these titles. VRR support in monitors is far more useful than 300+ Hz for most of us.

I also have to agree with Lay-kun that OLED is the tech with the ultra fast response time, not LCDs. It would be interesting to see how 200+ Hz OLEDs perform.
 
Joined
Sep 15, 2011
Messages
6,832 (1.40/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
I've got the same "fluidity" with G-Sync/FreeSync enabled at ~40-50 fps/Hz compared to 200 Hz monitors. I tested it right on the spot ;)
 
Joined
Dec 17, 2011
Messages
359 (0.08/day)
360 Hz means 1 frame every 2.8 ms. 500 Hz means 1 frame every 2 ms. The difference is 0.8 ms. There is no meaningful latency advantage to going 500 Hz, in case anyone is wondering. The only usable advantage may be smoothness, if anyone's eyes can see the difference between 360 Hz and 500 Hz.

As additional data, 240 Hz means 1 frame every 4.2 ms, so the latency difference is 2.2 ms. Very difficult to notice.
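(For anyone who wants to check the arithmetic, a quick throwaway calculation:)

Code:
# Frame interval at each refresh rate, and the gap versus a 500 Hz panel.
for hz in (240, 360, 500):
    interval_ms = 1000 / hz
    print(f"{hz} Hz -> {interval_ms:.1f} ms per frame, "
          f"{interval_ms - 1000 / 500:.1f} ms more than 500 Hz")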
 
Joined
Aug 6, 2020
Messages
729 (0.45/day)
In my CRT days, I couldn't tell the difference between gaming at 85 Hz and 120 Hz.

I also couldn't tell the difference at 75 Hz (except for noticeable flicker) - 60 Hz was noticeable versus the 75 Hz option, though.

After transitioning to both an OLED TV (B7, 120 Hz at 1080p) and a TN panel (1 ms, 1080p, running at an overclocked 75 Hz), I still can't tell the difference between the two!
 
Joined
Jun 10, 2014
Messages
3,019 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Uh, what? I haven't looked much into this, but I know for sure that Doom Eternal has its tick rate tied to fps. I have seen the relevant console commands, and speedrunners have to deal with the implications of this in their runs.
Adding to what others have said in the meantime:
It's very uncommon to have the tick rate tied to the frame rate today, at least in real-time "precision" games and especially multiplayer games.
You can have a local and a server tick rate (e.g. 120 Hz local and 30 Hz server, which used to be the defaults for CS:GO ~5 years ago). The way this works is that the local client simulates the game while waiting for the next server tick, then corrects any difference when that tick finally arrives. So in theory, this means you can see yourself kill an opponent on your screen, only to be immediately "corrected" and killed yourself. Usually this kind of glitching is minimal, but it can certainly be noticeable, especially when watching other players move rapidly.

Also keep in mind that even if the server tick rate is fairly high, you still have to live with the latency difference, so there will be edge cases where "strange things" happen.

I assume engines like Unreal, id Tech, etc. have similar mechanisms for "latency compensation".
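A very stripped-down sketch of that predict-then-correct idea (purely illustrative; the names and structure are made up rather than taken from CS:GO or any real engine):

Code:
class PredictedClient:
    """Client-side prediction: simulate locally every local tick, remember
    the inputs, and when an (older) authoritative server state arrives,
    snap to it and replay the inputs the server has not processed yet."""

    def __init__(self, initial_state, simulate):
        self.state = initial_state
        self.simulate = simulate   # deterministic movement/physics step
        self.pending = []          # (sequence, input) pairs not yet acked
        self.sequence = 0

    def local_tick(self, player_input, dt):
        # Runs at the local tick rate (e.g. 120 Hz): apply the input
        # immediately so the player gets an instant response on screen.
        self.sequence += 1
        self.pending.append((self.sequence, player_input))
        self.state = self.simulate(self.state, player_input, dt)

    def on_server_state(self, server_state, acked_sequence, dt):
        # Runs at the server tick rate (e.g. 30 Hz): the server is
        # authoritative, so rewind to its state and re-apply the inputs it
        # has not seen. If the result differs from what was already shown,
        # the player perceives a small correction ("I shot first, yet I died").
        self.pending = [(s, i) for s, i in self.pending if s > acked_sequence]
        self.state = server_state
        for _, player_input in self.pending:
            self.state = self.simulate(self.state, player_input, dt)

The correction step is where the glitching described above comes from: the larger the gap between local and server tick rates (plus the latency), the more local guesses can be overturned at once.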

The fps needed to take advantage of such a high refresh rate would only be feasible in competitive multiplayer games (and even there, the improvement over existing 360 Hz will be limited). But since those have at most a 128 Hz tick rate, having the fps be 4 times the tick rate doesn't really make a lot of sense.
Yes, you're starting to get it. Technically there is a very minor latency gain, though, at least for the first of those four frames. So you gain a tiny bit in best-case input lag, but nothing in smoothness.
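Roughly quantifying "very minor" (my own estimate under a simple assumption: the only term counted is the wait for the next refresh slot, ignoring render time and scanout):

Code:
# A freshly simulated frame waits, on average, half a refresh interval
# for the next scanout slot before it can be displayed.
for refresh_hz in (240, 360, 500):
    avg_wait_ms = 500.0 / refresh_hz   # half of 1000 / refresh_hz
    print(f"{refresh_hz} Hz panel: new frame waits ~{avg_wait_ms:.2f} ms on average")

Going from 360 Hz to 500 Hz trims that average wait by about 0.4 ms - the "tiny bit" of best-case input lag.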

After transitioning to both an OLED TV (B7, 120 Hz at 1080p) and a TN panel (1 ms, 1080p, running at an overclocked 75 Hz), I still can't tell the difference between the two!
Have you tried just dragging a window quickly around on your screens?
At least I can easily see that on 60 vs. 120/144 Hz.
 
Joined
Aug 6, 2020
Messages
729 (0.45/day)
Have you tried just dragging a window quickly around on your screens?
At least I can easily see that on 60 vs. 120/144 Hz.

Haven't noticed that in at least a decade - most modern notebooks are plenty fast enough for "interactiveness" at 60 Hz (so why you would imagine a TN screen running without notebook power-consumption limits would be so much worse, I can't imagine).


I do agree that 120 Hz brings other video-playback benefits, but it's more than enough for gaming (and overkill for basic desktop use).
 
Joined
Dec 17, 2011
Messages
359 (0.08/day)
Have you tried just dragging a window quickly around on your screens? At least I can easily see that on 60 vs. 120/144 Hz.
I recently switched from 120 Hz to 144 Hz, and I can tell there is a difference in smoothness between 120 and 144 while scrolling text (not in games, though). It really is noticeable.
 
Joined
Aug 6, 2020
Messages
729 (0.45/day)
I recently switched from 120 Hz to 144 Hz, and I can tell there is a difference in smoothness between 120 and 144 while scrolling text (not in games, though). It really is noticeable.
And do you actually ever scroll text in your daily work? Or is it only something you encounter when testing your display on one of these "Motion Clarity Overkill Review+++" sites?

You do realize that these sites were dreamed up before monitor makers added overdrive, right? (And that makes even dog-slow VA acceptable for most!)

A worst-case test failure doesn't mean that you're ever going to notice the difference in the real world.
 
Joined
Jun 10, 2014
Messages
3,019 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
And do you actually ever scroll text in your daily work? Or is it only something you encounter when testing your display on one of these "Motion Clarity Overkill Review+++" sites?
So, you don't ever scroll on a web page, document, etc.? :eek:

There are probably hundreds of thousands of developers who spend all day looking at text, not to mention all the people working with documents.
I was actually surprised to notice that coding on a high-refresh monitor was more comfortable (I noticed it when switching back). It's certainly noticeable and more comfortable, but nowhere close to a necessity. But like many other factors, such as general responsiveness and using a tactile mechanical keyboard, it does help productivity a tiny bit.
 
Joined
Aug 6, 2020
Messages
729 (0.45/day)
So, you don't ever scroll on a web page, document, etc.? :eek:

There are probably hundreds of thousands of developers who spend all day looking at text, not to mention all the people working with documents.
I was actually surprised to notice that coding on a high-refresh monitor was more comfortable (I noticed it when switching back). It's certainly noticeable and more comfortable, but nowhere close to a necessity. But like many other factors, such as general responsiveness and using a tactile mechanical keyboard, it does help productivity a tiny bit.
Sorry man, I thought you were talking about some overkill motion-testing site (think Blur Busters).

I haven't noticed any smearing while scrolling vertically in my last ten years of using desktop LCD displays here at work.
 

bug

Joined
May 22, 2015
Messages
13,960 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
360 Hz means 1 frame every 2.8 ms. 500 Hz means 1 frame every 2 ms. The difference is 0.8 ms. There is no meaningful latency advantage to going 500 Hz, in case anyone is wondering. The only usable advantage may be smoothness, if anyone's eyes can see the difference between 360 Hz and 500 Hz.

As additional data, 240 Hz means 1 frame every 4.2 ms, so the latency difference is 2.2 ms. Very difficult to notice.
Reminds me of the time Sony announced their first 4K phone. Journalists were marveling at how sharp the image was when they got to play with the actual unit. And then the Sony representative showed up: "Umm, that's the FHD unit; the 4K unit is over here."

People just have a way of seeing what they want to see...
 