
NVIDIA Quietly Relaxes Certification Requirements for NVIDIA G-SYNC Ultimate Badge

Raevenlord

News Editor
Joined
Aug 12, 2016
Messages
3,755 (1.23/day)
Location
Portugal
System Name The Ryzening
Processor AMD Ryzen 9 5900X
Motherboard MSI X570 MAG TOMAHAWK
Cooling Lian Li Galahad 360mm AIO
Memory 32 GB G.Skill Trident Z F4-3733 (4x 8 GB)
Video Card(s) Gigabyte RTX 3070 Ti
Storage Boot: Transcend MTE220S 2TB, Kintson A2000 1TB, Seagate Firewolf Pro 14 TB
Display(s) Acer Nitro VG270UP (1440p 144 Hz IPS)
Case Lian Li O11DX Dynamic White
Audio Device(s) iFi Audio Zen DAC
Power Supply Seasonic Focus+ 750 W
Mouse Cooler Master Masterkeys Lite L
Keyboard Cooler Master Masterkeys Lite L
Software Windows 10 x64
UPDATED January 19th 2021: NVIDIA, in a statement to Overclock3D, had this to say on the issue:

Late last year we updated G-SYNC ULTIMATE to include new display technologies such as OLED and edge-lit LCDs.

All G-SYNC Ultimate displays are powered by advanced NVIDIA G-SYNC processors to deliver a fantastic gaming experience including lifelike HDR, stunning contrast, cinematic colour and ultra-low latency gameplay. While the original G-SYNC Ultimate displays were 1000 nits with FALD, the newest displays, like OLED, deliver infinite contrast with only 600-700 nits, and advanced multi-zone edge-lit displays offer remarkable contrast with 600-700 nits. G-SYNC Ultimate was never defined by nits alone nor did it require a VESA DisplayHDR1000 certification. Regular G-SYNC displays are also powered by NVIDIA G-SYNC processors as well.

The ACER X34 S monitor was erroneously listed as G-SYNC ULTIMATE on the NVIDIA web site. It should be listed as "G-SYNC" and the web page is being corrected.

NVIDIA has quietly updated its G-SYNC Ultimate requirements relative to their original definition. Born as a spin-off of NVIDIA's G-SYNC program (whose own requirements have also been relaxed since the days when a custom, expensive G-SYNC module had to be built into every monitor design), the G-SYNC Ultimate badge is supposed to denote the best of the best among PC monitors: panels that pair NVIDIA's proprietary G-SYNC module with a VESA DisplayHDR 1000 certification. This sits above NVIDIA's current G-SYNC Compatible tier (monitors without the G-SYNC module that support VESA's VRR standard for variable refresh rates) and the plain G-SYNC tier (monitors that carry the G-SYNC module but may be lax in their HDR support).





The new, quietly edited requirements drop the DisplayHDR 1000 certification; instead, NVIDIA now only asks for "lifelike HDR" capabilities from monitors that receive the G-SYNC Ultimate badge, whatever that means. The fact of the matter is that at this year's CES, MSI's MEG381CQR and LG's 34GP950G were announced with the NVIDIA G-SYNC Ultimate badge despite "only" carrying VESA DisplayHDR 600 certifications. This certainly complicates matters for users, who previously only had to check for the Ultimate badge to know they were getting the best of the best in gaming monitors (per NVIDIA's own guidelines). Now they are back to combing through spec lists to find out whether a particular monitor has the characteristics they want (or require). It remains to be seen whether previously released monitors that shipped without the G-SYNC Ultimate certification will now be retroactively certified; if I were a monitor manufacturer, I would certainly demand that for my products.

View at TechPowerUp Main Site
 
Joined
Sep 17, 2014
Messages
22,666 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Bwhahahaahahahaa jokers.

Maybe ask VESA to devise a special badge for you, so you can take a spot next to that ridiculous set of 'HDR standards' for monitor settings that can barely be called pleasant to look at and can't ever be called HDR in any usable form.

We've seriously crossed into idiot land with all these certifications and add-on technologies. The lengths companies go to, to hide inferior technology at its core... sheet. And if you bought into that Ultimate badge, you even paid for inferior on top... oh yeah, 'subject to change' of course. Gotta love that premium feel.

(No, I'm not sour today, dead serious. And laughing my ass off at all the nonsense 2020 brought us, and 2021 seems eager to continue bringing)
 
Joined
Feb 11, 2009
Messages
5,570 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
17,769 (2.42/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/yfsd9w
I guess doing proper backlight with FALD was just too costly and Nvidia wanted more "Ultimate" monitors in the market...

We've seriously crossed into idiot land with all these certifications and add-on technologies. The lengths companies go to, to hide inferior technology at its core... sheet. And if you bought into that Ultimate badge, you even paid for inferior on top... oh yeah, 'subject to change' of course. Gotta love that premium feel.
In all fairness, the few monitors that met the old standard were actually quite decent, if you could afford them...
 
Joined
Sep 17, 2014
Messages
22,666 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I guess doing proper backlight with FALD was just too costly and Nvidia wanted more "Ultimate" monitors in the market...


In all fairness, the few monitors that met the old standard were actually quite decent, if you could afford them...
Yeah, all two of them?

The whole ultimate badge is technology pushed beyond reasonable at astronomical price increases. Now, Nvidia figured that out too because the stuff didn't sell or make waves, and adapts accordingly. Meanwhile, there are still hundreds of monitors out there with VESA HDR spec that are going to get a new Ultimate badge now.

Selling nothing is bad for sales, I know, but did mommy not teach us not to lie?

Nvidia is still marketing Gsync as something special, and one by one, all those supposed selling points fall apart as reality sets in.

I wonder where DX12 Ultimate is going to end up, I'm still trying to come to terms with all the goodness that API has brought to gaming. /s
 
Joined
Feb 23, 2019
Messages
6,105 (2.87/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3600 CL14
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
They're doing the same thing with G-Sync, where G-Sync Compatible is often listed as "Nvidia G-Sync" on the packaging or on stickers attached to the display itself.
 
Joined
Nov 3, 2013
Messages
2,141 (0.53/day)
Location
Serbia
Processor Ryzen 5600
Motherboard X570 I Aorus Pro
Cooling Deepcool AG400
Memory HyperX Fury 2 x 8GB 3200 CL16
Video Card(s) RX 6700 10GB SWFT 309
Storage SX8200 Pro 512 / NV2 512
Display(s) 24G2U
Case NR200P
Power Supply Ion SFX 650
Mouse G703 (TTC Gold 60M)
Keyboard Keychron V1 (Akko Matcha Green) / Apex m500 (Gateron milky yellow)
Software W10
VESA's retarded certifications can go hand in hand with the entire idiocy that is USB 3.0.1.2.gen 4.3.C.5gbps.

No 2 such clusterfucks have been released into the tech space.
 

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
17,769 (2.42/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/yfsd9w
Yeah, all two of them?

The whole ultimate badge is technology pushed beyond reasonable at astronomical price increases. Now, Nvidia figured that out too because the stuff didn't sell or make waves, and adapts accordingly. Meanwhile, there are still hundreds of monitors out there with VESA HDR spec that are going to get a new Ultimate badge now.

Selling nothing is bad for sales, I know, but did mommy not teach us not to lie?

Nvidia is still marketing Gsync as something special, and one by one, all those supposed selling points fall apart as reality sets in.

I wonder where DX12 Ultimate is going to end up, I'm still trying to come to terms with all the goodness that API has brought to gaming. /s
It seems like there's a massive 17 of them. 10 of them are below 4K. Two of them are 65-inches and shouldn't really be counted as monitors imho.

I guess it's a matter of trying to drive technology forward as well, but yeah, the pricing is insane in this case, as it would seem the technology isn't quite ready to transition to the kind of specs Nvidia wanted. This is also most likely why they're quietly scaling it back now.

I guess we need to move from Ultimate to Preeminent for the next generation of kit...

VESA's retarded certifications can go hand in hand with the entire idiocy that is USB 3.0.1.2.gen 4.3.C.5gbps.

No 2 such clusterfucks have been released into the tech space.
Did you see they added a new 500 category, as well as True Black 400 and 500?
 
Joined
Sep 17, 2014
Messages
22,666 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
It seems like there's a massive 17 of them. 10 of them are below 4K. Two of them are 65-inches and shouldn't really be counted as monitors imho.

I guess it's a matter of trying to drive technology forward as well, but yeah, the pricing is insane in this case, as it would seem the technology isn't quite ready to transition to the kind of specs Nvidia wanted. This is also most likely why they're quietly scaling it back now.

I guess we need to move from Ultimate to Preeminent for the next generation of kit...


Did you see they added a new 500 category, as well as True Black 400 and 500?
The more I see, the stronger the impression I get that Nvidia wants to pre-empt everything in order to own it, but the reality is that the movement exists without them as well, just not on their terms.

Gsync is the perfect example. It has already lost the war of standards, yet we still get an upsell version of it. Like... what the actual f... take your loss already, you failed. It's like presenting us with a new alternative to Blu-ray that does almost nothing better but still 'exists' so you can throw money at it.

Did you see they added a new 500 category, as well as True Black 400 and 500?
WOW.

Stuff like this makes me wonder if my VESA mount might not just fall off the wall at some point anyway. What are those stickers worth anymore?

It's really just that simple... you either have OLED or you have nothing. I guess that's not the best marketing story, but yeah.
 
Joined
Sep 28, 2012
Messages
982 (0.22/day)
System Name Poor Man's PC
Processor Ryzen 7 9800X3D
Motherboard MSI B650M Mortar WiFi
Cooling Thermalright Phantom Spirit 120 with Arctic P12 Max fan
Memory 32GB GSkill Flare X5 DDR5 6000Mhz
Video Card(s) XFX Merc 310 Radeon RX 7900 XT
Storage XPG Gammix S70 Blade 2TB + 8 TB WD Ultrastar DC HC320
Display(s) Xiaomi G Pro 27i MiniLED
Case Asus A21 Case
Audio Device(s) MPow Air Wireless + Mi Soundbar
Power Supply Enermax Revolution DF 650W Gold
Mouse Logitech MX Anywhere 3
Keyboard Logitech Pro X + Kailh box heavy pale blue switch + Durock stabilizers
VR HMD Meta Quest 2
Benchmark Scores Who need bench when everything already fast?
Hoping for broader and wider adoption? Funny, I wonder where that GSync Compatible came from.
 
Joined
Oct 28, 2012
Messages
1,194 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD sn550 1To/WD ssd sata 1To /WD black sn750 1To/Seagate 2To/WD book 4 To back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
VESA's retarded certifications can go hand in hand with the entire idiocy that is USB 3.0.1.2.gen 4.3.C.5gbps.

No 2 such clusterfucks have been released into the tech space.
That reminds me of something I learned while doing a research paper on why some institutions fail at communication: "Scientists and engineers often wish to handle all the communication by themselves, because after all, who better to talk about a project than the people directly involved with it? The issue is that sometimes they are so deep into their high-level work, surrounded by coworkers at the same level, that they become clueless when it comes to talking to 'normal' people." It took so long before they finally decided to streamline Wi-Fi naming, and meanwhile the USB-IF made a huge mess... USB-C is just a physical connector: not every USB-C port can handle 10 Gbps, a video signal and power delivery (some smartphones support video out while others don't, and it's not something that's clearly specified), and you still need to find the right cable...

I wish the VESA certifications were used to enforce a strict minimum for the general quality of screens: 100% sRGB, factory calibration, a 1000:1 contrast ratio and 400 nits. I was looking for a decent laptop two months ago and was baffled to see that even with a budget of 800-900€ it's nearly impossible to get a decent screen, yet put the same money into a phone or tablet and you'll end up with a great display.
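To make that USB-C point concrete, here is a minimal Python sketch (the port names and figures below are invented for illustration, not taken from any real spec sheet) of how the same physical connector can hide very different capabilities:

from dataclasses import dataclass

@dataclass
class UsbCPort:
    """A physical USB-C port; none of these capabilities are implied by the connector itself."""
    name: str
    data_gbps: float            # e.g. 0.48 (USB 2.0 only), 5, 10, 20, 40
    displayport_alt_mode: bool  # whether video out works over this port
    pd_watts: int               # maximum power delivery the port negotiates

# Hypothetical example devices: all three expose an identical-looking USB-C connector.
ports = [
    UsbCPort("budget phone",   data_gbps=0.48, displayport_alt_mode=False, pd_watts=18),
    UsbCPort("flagship phone", data_gbps=10.0, displayport_alt_mode=True,  pd_watts=45),
    UsbCPort("laptop",         data_gbps=40.0, displayport_alt_mode=True,  pd_watts=100),
]

for p in ports:
    print(f"{p.name:>14}: {p.data_gbps:>5} Gbps, video out: {p.displayport_alt_mode}, PD: {p.pd_watts} W")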
 
Joined
Sep 17, 2014
Messages
22,666 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
That reminds me of something I learned while doing a research paper on why some institutions fail at communication: "Scientists and engineers often wish to handle all the communication by themselves, because after all, who better to talk about a project than the people directly involved with it? The issue is that sometimes they are so deep into their high-level work, surrounded by coworkers at the same level, that they become clueless when it comes to talking to 'normal' people." It took so long before they finally decided to streamline Wi-Fi naming, and meanwhile the USB-IF made a huge mess... USB-C is just a physical connector: not every USB-C port can handle 10 Gbps, a video signal and power delivery (some smartphones support video out while others don't, and it's not something that's clearly specified), and you still need to find the right cable...

I wish the VESA certifications were used to enforce a strict minimum for the general quality of screens: 100% sRGB, factory calibration, a 1000:1 contrast ratio and 400 nits. I was looking for a decent laptop two months ago and was baffled to see that even with a budget of 800-900€ it's nearly impossible to get a decent screen, yet put the same money into a phone or tablet and you'll end up with a great display.

The thing is every certification gets into those details anyway so you have to kinda know what you buy regardless. Even if they keep it simple, they find some way to introduce FUD and push clueless people to something it really isn't.

HD Ready.
Watt RMS
Dynamic Contrast

The list is long, and it's not getting simpler but more complicated as technology gets pushed further. Look at VESA's HDR spec. They HAVE a range of requirements, much like you say... but then they lack some of the key points that would truly make for a solid panel.

The spec intentionally leaves out certain key parts to leave wiggle room for display vendors to tweak their panels 'within spec' with the same old shit they've always used, just calibrated differently. G2G rise and fall time, for example... VESA only measures rise time. G2G has always been cheated with, and now it's part of a badge, so we no longer even look at a fake 'x ms' G2G number (which was actually more accurate than measuring rise time alone)... so the net effect of VESA HDR is that we know even less. Basically all we've won is... 'your ghosting/overshoot will look a specific way'.
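As a rough illustration of that rise-versus-fall point (a sketch with made-up time constants, not anyone's measured numbers): for a simple exponential pixel transition the 10-90% time is tau * ln(9), so a panel whose falling transitions are much slower than its rising ones still gets flattered by a rise-only figure.

import math

def time_10_90(tau_ms: float) -> float:
    """10%-90% transition time of an exponential luminance step with time constant tau (ms)."""
    return tau_ms * math.log(9)  # ln(0.9 / 0.1)

# Assumed, illustrative time constants: the rise is fast, the fall (the ghosting you see) is slow.
rise_tau, fall_tau = 1.5, 4.0  # milliseconds
rise, fall = time_10_90(rise_tau), time_10_90(fall_tau)

print(f"rise 10-90%: {rise:.1f} ms  <- the only number a rise-only measurement reports")
print(f"fall 90-10%: {fall:.1f} ms  <- ignored, despite being the slower transition")
print(f"averaged G2G: {(rise + fall) / 2:.1f} ms")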

The new additions also underline that the previous versions actually had black levels that were not good - which is obvious with weak local dimming on a backlit LCD. 'True black' actually spells 'Not true black elsewhere'.

Kill it with fire. :)
 
Joined
Aug 31, 2016
Messages
104 (0.03/day)
The whole idea of certifying HDR together with G-sync is somewhat strange, and if you really want to certify HDR, then make some precise categories, because "1000 nits FALD" is rather vague. I generally think the different G-sync tiers should signify the quality of the VRR implementation and nothing else, and the HDR mention should be removed entirely, because if we start calling displays with 8-zone edge-lit local dimming, which need to blow out half the screen to illuminate a small bright object, "lifelike HDR", then that is nothing but misinformation and marketing nonsense. Certifications should work as quality assurance, and basically everything that has happened around HDR so far does the opposite. For example, the VESA HDR certifications are completely useless, allowing abominations like the Samsung G9 (only a few edge-lit zones on a massive screen) to get HDR1000 certification, implying it is some kind of high-end HDR screen, while in reality it is unusable and the first thing everyone does is turn local dimming off, effectively turning off any actual HDR.
 

bug

Joined
May 22, 2015
Messages
13,843 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Personally, I think DisplayHDR 600 is good enough. But if it's not required in the spec, that's a problem.
And yes, anything better than DisplayHDR 600 is as rare as hen's teeth (and DisplayHDR 600 monitors are only slightly easier to find). But the failure is not Nvidia's; the failure lies with the manufacturers. Monitors are presented at various shows, yet only actually become available a year later. Even later than that, in some cases. Same goes for mini and microLED. Those were supposed to improve the HDR experience, yet here we are, years after at least miniLED was supposed to be available, and we can buy like 3 or 4 monitors for like 3 grand. It looks like LG will be able to give us OLED monitors (I know, with pros and cons) before anyone else can tame mini or microLED.
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Maybe ask VESA to devise a special badge for you, so you can take a spot next to that ridiculous set of 'HDR standards' for monitor settings that can barely be called pleasant to look at and can't ever be called HDR in any usable form.
That "low brightness" bitching is getting old.
OLED TVs dish out 600-700nit (and not full screen, of course) and even that is "it hurts me eyes" level of bright.
And that is something that is viewed from several meters distance.

That figure naturally needs to go down, when one is right next to the screen.
400/500nit "true black" (i.e. OLED) makes perfect sense on a monitor.

1000+ nit on a monitor is from "are you crazy, dude" land. People who think that makes sense should start with buying an actual HDR device (and not "it supports HDR" crap like TCL and other bazingas)
 
Joined
Sep 10, 2015
Messages
530 (0.16/day)
System Name My Addiction
Processor AMD Ryzen 7950X3D
Motherboard ASRock B650E PG-ITX WiFi
Cooling Alphacool Core Ocean T38 AIO 240mm
Memory G.Skill 32GB 6000MHz
Video Card(s) Sapphire Pulse 7900XTX
Storage Some SSDs
Display(s) 42" Samsung TV + 22" Dell monitor vertically
Case Lian Li A4-H2O
Audio Device(s) Denon + Bose
Power Supply Corsair SF750
Mouse Logitech
Keyboard Glorious
VR HMD None
Software Win 10
Benchmark Scores None taken
I guess it's a matter of trying to drive technology forward as well...

You can't drive technology forward with closed standards. If anything, that could have been learned from the acquisition of 3dfx... Which closed NVIDIA standard ever managed to spread industry-wide? Some professional stuff, like CUDA, managed to get along, but all the consumer stuff was either killed, terminated or simply left behind when an open standard came to take its place. Even if it was inferior. Anyone remember PhysX GPU acceleration? Or maybe 3D Vision?

Left behind... All of them... PhysX has been ground down to one of many already existing CPU physics engines, where it was unable to compete with Havok. Why? Because Havok could be unleashed on all CPU cores, while PhysX had to be restricted in order to make up the difference between GPU and CPU PhysX... 2016 saw six titles with PhysX, the years after saw one each, except 2018, which saw none...
 

bug

Joined
May 22, 2015
Messages
13,843 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
That "low brightness" bitching is getting old.
OLED TVs dish out 600-700nit (and not full screen, of course) and even that is "it hurts me eyes" level of bright.
And that is something that is viewed from several meters distance.

That figure naturally needs to go down, when one is right next to the screen.
400/500nit "true black" (i.e. OLED) makes perfect sense on a monitor.

1000+ nit on a monitor is from "are you crazy, dude" land. People who think that makes sense should start with buying an actual HDR device (and not "it supports HDR" crap like TCL and other bazingas)
For bonus points, the 1,000 nits in DisplayHDR 1000 isn't for the whole screen either; it's only local, and for a brief period of time.
Sure, an OLED will not be as good as a traditional LED panel when a scene cuts from a cave to full-blown sunlight. But try to count how often you see that in a day.

I also have an OLED TV (an LG, which is supposed to be on the dimmer side of OLEDs) and I can vouch that the brightness is high enough that I don't go near 100%. Except when viewing HDR content, because you can't control brightness in Dolby Vision mode.
 
Joined
Oct 22, 2014
Messages
14,170 (3.81/day)
Location
Sunshine Coast
System Name H7 Flow 2024
Processor AMD 5800X3D
Motherboard Asus X570 Tough Gaming
Cooling Custom liquid
Memory 32 GB DDR4
Video Card(s) Intel ARC A750
Storage Crucial P5 Plus 2TB.
Display(s) AOC 24" Freesync 1m.s. 75Hz
Mouse Lenovo
Keyboard Eweadn Mechanical
Software W11 Pro 64 bit
The thing is every certification gets into those details anyway so you have to kinda know what you buy regardless. Even if they keep it simple, they find some way to introduce FUD and push clueless people to something it really isn't.

HD Ready.
Watt RMS
Dynamic Contrast
Watts RMS is fine, it's Watts PMPO that is useless.
 
Joined
Aug 31, 2016
Messages
104 (0.03/day)
That "low brightness" bitching is getting old.
OLED TVs dish out 600-700nit (and not full screen, of course) and even that is "it hurts me eyes" level of bright.
And that is something that is viewed from several meters distance.

That figure naturally needs to go down, when one is right next to the screen.
400/500nit "true black" (i.e. OLED) makes perfect sense on a monitor.

1000+ nit on a monitor is from "are you crazy, dude" land. People who think that makes sense should start with buying an actual HDR device (and not "it supports HDR" crap like TCL and other bazingas)
1000+ nit brightness is for the most part only used locally, for small highlights; the base of the frame is still around 120 nits, and the average frame brightness across entire movies is typically around 160 nits. That is, of course, if you have a display with enough light precision, because with something like 8-zone edge-lit local dimming it is very common for a display to blow out the entire screen in the process of illuminating small highlights. The most common complaint about HDR from mainstream users was actually that it is too dim to watch in bright rooms, so manufacturers are now starting to implement ambient-light-dependent tone mapping; "too bright HDR" isn't really a thing unless it's either implemented wrong or displayed wrong.
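To put rough numbers on that (a small sketch using the standard SMPTE ST 2084 / PQ encoding curve; the luminance levels are just the examples from the paragraph above): the bulk of a graded HDR frame sits around the middle of the signal range, and everything from 1,000 nits up occupies only the top end reserved for highlights.

# SMPTE ST 2084 (PQ) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Map absolute luminance (cd/m^2, up to 10,000) to a normalized PQ signal value in 0..1."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for nits in (120, 160, 600, 1000, 10000):
    print(f"{nits:>5} nits -> PQ signal level {pq_encode(nits):.3f}")

Roughly: 120 nits lands near 0.53 of the signal range, 1,000 nits near 0.75, and everything above that is highlight headroom.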

But I agree about 500 nit OLED making perfect sense, I'd take that over 1000 nit FALD any day. Natural performance of the panel is always more important and applies universally to everything with no side effects.
 

lluvia

New Member
Joined
Jul 8, 2020
Messages
14 (0.01/day)
When others are so confident I start to doubt myself, but... on the current market, "HDR" on computer monitors is itself a gimmick.

If any of the monitor tests back in the day proved anything, aside from game developers having to spend more time tuning their games for HDR for it to even be a thing, it's that computer monitors would need DisplayHDR 1000 levels of brightness sustained constantly just to pretend it's HDR. Otherwise brightness would not matter (which would be nice, especially for me, who can barely handle an IPS at full brightness), since it's only trying to make up for the lack of true contrast via FALD; DisplayHDR 600 or maybe even lower would be fine, as long as FALD on computer monitors were actually a thing. On top of that, even monitors that purportedly met "G-SYNC Ultimate" requirements before couldn't do high refresh rate (120 Hz+), 1440p resolution or higher, and "HDR" together anyway (misleading customers from the beginning), because HDMI 2.1 was only finally introduced (on "consumer" graphics cards) last year with the mythical Nvidia 30xx series. I don't care about this whole mumbo jumbo, since it's something I can worry about after I get an HDMI 2.1/DP 2.0 graphics card and Cyberpunk 2077 is actually playable (or at least supports mods, so we can have something like USSEEP/USLEEP) in the way I envisioned, lol.

tl;dr G-SYNC Ultimate is a stupid "certification" to begin with and will remain that way for several more years, but Jensen needs to deceive consumers so he can buy designer leather jackets, one in every color to match his spatulas.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.87/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
I don't normally see standards watered down, and it can't be a good thing. The fact that they've done it quietly is very telling.

I reckon it's happened because royalty-free FreeSync is killing royalty-bearing G-SYNC, regardless of technical merit, since it reduces the price of monitors.
 
Joined
Jul 20, 2018
Messages
128 (0.05/day)
System Name Multiple desktop/server builds
Processor Desktops: 13900K, 5800X3D, 12900K | Servers: 2 x 3900X, 2 x 5950X, 3950X, 2950X, 8700K
Motherboard Z690 Apex, X570 Aorus Xtreme, Z690-I Strix
Cooling All watercooled
Memory DDR5-6400C32, DDR4-3600C14, DDR5-6000C36
Video Card(s) 4090 Gaming OC, 4090 TUF OC, 2 x 3090, 2 x 2080Ti, 1080Ti Gaming X EK, 2 x 1070, 2 x 1060
Storage dozens of TBs of SSDs, 112TB NAS, 140TB NAS
Display(s) Odyssey Neo G9, PG35VQ, P75QX-H1
Case Caselabs S8, Enthoo Elite, Meshlicious, Cerberus X, Cerberus, 2 x Velka 7, MM U2-UFO, Define C
Audio Device(s) Schiit Modius + SMSL SP200, Grace DAC + Drop THX AAA, Sony HT-A9, Nakamichi 9.2.4
Power Supply AX1200, Dark Power Pro 12 1500W
Mouse G Pro X Superlight Black + White
Keyboard Wooting 60HE, Moonlander
VR HMD Index, Oculus CV1
DisplayHDR1000 certification became a joke to me when they gave it to the Odyssey G9 with its 10 vertical dimming zones. RTINGS measured the contrast ratio with HDR enabled and it DROPPED to 446:1 from the native 2231:1.
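For context on what that measurement means: a simultaneous (checkerboard) contrast ratio is just white-patch luminance divided by black-patch luminance, so the implied black levels look roughly like this (a sketch assuming, purely for illustration, a ~1000-nit white patch; the actual white levels in the two tests differ).

def implied_black_level(white_nits: float, contrast: float) -> float:
    """Black-patch luminance implied by a simultaneous (checkerboard) contrast measurement."""
    return white_nits / contrast

white_nits = 1000.0  # assumed peak-white figure, for illustration only
for label, contrast in (("native, local dimming off", 2231), ("HDR, local dimming on", 446)):
    print(f"{label}: {contrast}:1 -> ~{implied_black_level(white_nits, contrast):.2f} nit blacks")

In other words, a bright patch next to a dark one lifts the dark patch to a couple of nits, which is exactly what a handful of coarse edge-lit dimming zones cannot avoid.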
 

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
17,769 (2.42/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/yfsd9w
You can't drive technology forward with closed standards. If anything, that could have been learned from the acquisition of 3dfx... Which closed NVIDIA standard ever managed to spread industry-wide? Some professional stuff, like CUDA, managed to get along, but all the consumer stuff was either killed, terminated or simply left behind when an open standard came to take its place. Even if it was inferior. Anyone remember PhysX GPU acceleration? Or maybe 3D Vision?

Left behind... All of them... PhysX has been ground down to one of many already existing CPU physics engines, where it was unable to compete with Havok. Why? Because Havok could be unleashed on all CPU cores, while PhysX had to be restricted in order to make up the difference between GPU and CPU PhysX... 2016 saw six titles with PhysX, the years after saw one each, except 2018, which saw none...
I was referring to things like FALD in this case, which was part of "Ultimate", as in driving forward new monitor tech. Not talking about anything Nvidia specific.
 
Joined
May 27, 2014
Messages
18 (0.00/day)
That "low brightness" bitching is getting old.
OLED TVs dish out 600-700nit (and not full screen, of course)...
New semi-pro Panasonic OLED has beefier power supply and does it full-screen. It has no ABL (automatic brightness limiter) at all.

DisplayHDR1000 certification became a joke to me when they gave it to the Odyssey G9 with its 10 vertical dimming zones. RTINGS measured the contrast ratio with HDR enabled and it DROPPED to 446:1 from the native 2231:1.
Is it not possible to enable HDR mode and disable local dimming hence getting ~1000 nits and standard 2000:1 contrast?
 

bug

Joined
May 22, 2015
Messages
13,843 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
New semi-pro Panasonic OLED has beefier power supply and does it full-screen. It has no ABL (automatic brightness limiter) at all.
Increasing the brightness is not much of an issue. The issue is that increasing the brightness also increases the chance of image retention.
Is it not possible to enable HDR mode and disable local dimming hence getting ~1000 nits and standard 2000:1 contrast?
You can't meet the contrast required for HDR without local dimming on an LCD. That said, RTINGS did not actually find lowered brightness in HDR mode; they found it's about the same: https://www.rtings.com/monitor/reviews/samsung/odyssey-g9
But that monitor is an abomination even before factoring in the lame local dimming implementation.
And yes, it's a major VESA failure that they did not mandate a proper local dimming implementation. They did this because, to this day, very few monitors do a semi-proper job of implementing local dimming, but when you're making standards to help customers choose, bending the standards to get on the good side of manufacturers is not the way to go. And don't get me started on DisplayHDR 400.
 