
NVIDIA GeForce GTX 660 (Gigabyte) Windforce. HDR support?

Joined
Nov 24, 2022
Messages
253 (0.43/day)
I have an NVIDIA GeForce GTX 660 (Gigabyte Windforce) connected to this screen. I guess it does not support HDR or DRM, even though those functions can be turned on in the screen's menu?

I bought an AOC Gaming G2 CU34G2XP/BK WLED in Sweden, where I live, for 4489 SEK ($428.62). In the US it is much cheaper, but shipping costs around $200. It is the deluxe model of the CU34G2XE/BK and has PiP. Basic HDR10 support is provided at the VESA DisplayHDR 400 level, so not very bright by HDR standards and no local dimming required, plus a restrictive colour gamut by HDR standards, but enough to give a different look to things with 10-bit colour processing put to use. A 1ms grey-to-grey response time is specified alongside 1ms MPRT using the MBR strobe backlight setting; approach with caution as usual. 'Low Input Lag' is also noted. The stand provides tilt, swivel and height adjustment and can be removed using a quick-release catch to reveal provision for 100 x 100mm VESA mounting. The ports face downwards and include: 2 DP 1.4 ports, 2 HDMI 2.0 ports, a 3.5mm audio output, 4 USB ports and AC power input (internal power converter).

I read: "All NVIDIA GPUs from the 900 and 1000 series support HDR display output. The presence of HDMI 2.0 provides the bandwidth necessary for the higher-quality signal desirable for HDR. The 1000 series adds support for HDR over future DisplayPort revisions."
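For a rough sense of why link bandwidth matters for HDR, here is a back-of-the-envelope estimate (my own approximation, assuming 8b/10b coding and a ~20% blanking overhead, not an exact timing calculation):

```python
# Sketch: does a 4K60 10-bit HDR signal fit in HDMI 1.4 or 2.0?
# HDMI 1.4 link rate is 10.2 Gbps, HDMI 2.0 is 18 Gbps; both use
# 8b/10b coding, so the usable payload is about 80% of the link rate.
EFFECTIVE = {"HDMI 1.4": 10.2e9 * 0.8, "HDMI 2.0": 18.0e9 * 0.8}

def needed_gbps(width, height, hz, bits_per_channel, blanking=1.2):
    """Approximate bit rate for RGB video, with ~20% blanking overhead."""
    return width * height * hz * blanking * bits_per_channel * 3

rate = needed_gbps(3840, 2160, 60, 10)  # 4K60, 10-bit RGB
for name, cap in EFFECTIVE.items():
    verdict = "fits" if rate <= cap else "does not fit"
    print(f"{name}: need ~{rate/1e9:.1f} Gbps, have ~{cap/1e9:.1f} Gbps, {verdict}")
```

With these rough numbers, full 10-bit RGB at 4K60 does not even fit HDMI 2.0, which is why such links fall back to chroma subsampling for 4K HDR.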
 
Joined
Oct 22, 2014
Messages
13,504 (3.81/day)
Location
Sunshine Coast
System Name Lenovo ThinkCentre
Processor AMD 5650GE
Motherboard Lenovo
Memory 32 GB DDR4
Display(s) AOC 24" Freesync 1ms 75Hz
Mouse Lenovo
Keyboard Lenovo
Software W11 Pro 64 bit
The GTX 660 only has HDMI 1.4a.
It might be possible (I don't know) over a DisplayPort cable.
 
Joined
Dec 25, 2020
Messages
5,247 (4.06/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Razer DeathAdder Essential White
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
Not possible, and not really a matter of HDMI version - it's the Kepler GPU itself that does not support HDR. This limitation applies even to the Titans and the GTX 690. Fortunately for you, just about anything would be a massive upgrade: buy an RTX 4060 as a replacement, or if on a very low budget, the RX 6600 will do the trick.

Anyhow... that display is one of the "fake HDR" ones and you're better off using it in SDR. Upgrade your GPU ASAP though; the GTX 660 is an obsolete, unsupported midrange model that is over 12 years old by now. It's just not a good card to have anymore, even if you don't play much in the way of video games.
 
Joined
Nov 24, 2022
Messages
253 (0.43/day)
Thanks. Is there much quality loss between HDMI 1.4 and 2.0? Yes, I will find another card. Sad that it is a fake-HDR screen; that is misleading marketing. What does "fake HDR" mean? Does it not support HDR at all? Its roughly 300 nits is too little, when 400 is required.
 
Joined
Dec 25, 2020
Messages
5,247 (4.06/day)
Location
São Paulo, Brazil
Thanks. Is there much quality loss between HDMI 1.4 and 2.0? Yes, I will find another card. Sad that it is a fake-HDR screen; that is misleading marketing. What does "fake HDR" mean? Does it not support HDR at all? Its roughly 300 nits is too little, when 400 is required.

There is no inherent quality loss; HDMI 1.4 simply has lower bandwidth than 2.0 or 2.1, which limits the maximum resolution and refresh rate. For a 1080p display, HDMI 1.4 is more than enough unless you want refresh rates above 165 Hz. At higher resolutions you lose refresh rate instead; your monitor appears to be 3440x1440 @ 180 Hz with a VA panel. It's a pretty decent monitor, but it is well beyond your GTX 660's capabilities to drive to its full potential. To get playable frame rates near that potential in modern games, you will want at least an RTX 4070 SUPER.
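As a rough sketch of those refresh ceilings (my own approximation, assuming 8b/10b coding and ~20% blanking overhead; real limits depend on exact timings and DSC support):

```python
# Approximate maximum refresh rate for 3440x1440 at 8-bit RGB per link.
# All three links use 8b/10b coding, so usable payload is ~80% of link rate.
LINKS = {
    "HDMI 1.4": 10.2e9 * 0.8,
    "HDMI 2.0": 18.0e9 * 0.8,
    "DP 1.4 (HBR3)": 32.4e9 * 0.8,
}

def max_refresh(width, height, effective_bps, bpc=8, blanking=1.2):
    """Effective payload divided by the bits in one (blanked) frame."""
    bits_per_frame = width * height * blanking * bpc * 3
    return effective_bps / bits_per_frame

for name, bps in LINKS.items():
    print(f"{name}: ~{max_refresh(3440, 1440, bps):.0f} Hz")
```

By this estimate, HDMI 1.4 caps out well under 60 Hz at 3440x1440, HDMI 2.0 around 100 Hz, and only DP 1.4 clears the monitor's 180 Hz.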

By "fake HDR" I mean that it's just not good enough for it to truly take advantage of HDR, for that you will want an OLED monitor or TV, and even the more basic models cost between twice and thrice the cost of your monitor. Don't worry about it. That doesn't mean it's bad, it's just a traditional monitor that won't have self-emissive capabilities or at the very least, localized dimming zones in generous quantities, which are really important for HDR. There's a reason the 42-inch LG C-series OLED is a favorite for PC gamers, it's one of the cheapest truly HDR capable displays out there. ;)
 
Joined
Nov 24, 2022
Messages
253 (0.43/day)
I guess I need a graphics card from March 2016 or later, as DisplayPort 1.4 was released then. The AOC Gaming G2 CU34G2XP/BK WLED is 3440x1440, but my computer shows 3840x2160 as an option. Can I run that resolution when I see it offered?
 
Joined
Dec 25, 2020
Messages
5,247 (4.06/day)
Location
São Paulo, Brazil
I guess I need a graphics card from March 2016 or later, as DisplayPort 1.4 was released then. The AOC Gaming G2 CU34G2XP/BK WLED is 3440x1440, but my computer shows 3840x2160 as an option. Can I run that resolution when I see it offered?

Not advised. 4K on Kepler also forces 4:2:0 chroma subsampling, which reduces image quality. I don't recommend it.
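For anyone curious what 4:2:0 actually costs, here is a quick sketch of the arithmetic (bits per pixel only; the visible impact is mostly on coloured text and sharp edges):

```python
# 4:2:0 keeps full-resolution luma (Y) but samples each chroma plane
# (Cb, Cr) at half resolution in both axes, so total pixel data shrinks
# to half of full 4:4:4 / RGB.
def bits_per_pixel(bpc, subsampling):
    """Average bits per pixel for YCbCr at the given subsampling."""
    chroma_samples = {"4:4:4": 2.0, "4:2:2": 1.0, "4:2:0": 0.5}
    return bpc + chroma_samples[subsampling] * bpc  # luma + chroma share

full = bits_per_pixel(8, "4:4:4")  # same as 24-bit RGB
sub = bits_per_pixel(8, "4:2:0")
print(f"4:4:4: {full:.0f} bpp, 4:2:0: {sub:.0f} bpp -> {sub/full:.0%} of the data")
```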

The minimum you should purchase is an RTX 2060, but for that monitor I recommend the aforementioned RTX 4070 SUPER or faster.
 