
What's the downside of using a 2nd GPU for a single display?

Joined
Mar 23, 2005
Messages
195 (0.03/day)
System Name Bessy 6.0
Processor i7-7700K @ 4.8GHz
Motherboard MSI Z270 KRAIT Gaming
Cooling Swiftech H140-X + XSPC EX420 + Reservoir
Memory G.Skill Ripjaws V 32GB DDR-3200 CL14 (B-die)
Video Card(s) MSI GTX 1080 Armor OC
Storage Samsung 960 EVO 250GB x2 RAID0, 940 EVO 500GB, 2x WD Black 8TB RAID1
Display(s) Samsung QN90a 50" (the IPS one)
Case Lian Li something or other
Power Supply XFX 750W Black Edition
Software Win10 Pro
So I have a monitor (it's a TV actually) that supports HDMI 2.1 (4k@120hz w/VRR) but I'm still using a GTX 1080 with HDMI 2.0 so it'll only do 60hz@4k. I've been running at 1080p so I can run 120hz and VRR.

Is there a downside (besides power consumption) to using a 2nd GPU to drive the monitor but still run games on the 1080? Latency? Image quality? Anything?

I was thinking of getting the cheapest HDMI 2.1 capable card I can find for this and I have no desire, or need, to upgrade to a more powerful GPU at this time.

Also, would I run into trouble if the 2nd GPU is AMD with the primary being Nvidia?

Thanks :)

(and yes, I tried to look this up but it kept taking me to dual-monitor articles/posts)
 
Joined
May 22, 2024
Messages
405 (2.66/day)
System Name Kuro
Processor AMD Ryzen 7 7800X3D@65W
Motherboard MSI MAG B650 Tomahawk WiFi
Cooling Thermalright Phantom Spirit 120 EVO
Memory Corsair DDR5 6000C30 2x48GB (Hynix M)@6000 30-36-36-76 1.36V
Video Card(s) PNY XLR8 RTX 4070 Ti SUPER 16G@200W
Storage Crucial T500 2TB + WD Blue 8TB
Case Lian Li LANCOOL 216
Power Supply MSI MPG A850G
Software Ubuntu 24.04 LTS + Windows 10 Home Build 19045
Benchmark Scores 17761 C23 Multi@65W
Hybrid graphics is not trivial. I don't think that is possible with another dGPU, but others should know more. Please correct me if I'm wrong.

Your best option right now might be either using your upgrade GPU alone or weighing how much you really want 4K@120Hz. Is there no DisplayPort on your TV?
 
Joined
Mar 23, 2005
Messages
195 (0.03/day)
System Name Bessy 6.0
Processor i7-7700K @ 4.8GHz
Motherboard MSI Z270 KRAIT Gaming
Cooling Swiftech H140-X + XSPC EX420 + Reservoir
Memory G.Skill Ripjaws V 32GB DDR-3200 CL14 (B-die)
Video Card(s) MSI GTX 1080 Armor OC
Storage Samsung 960 EVO 250GB x2 RAID0, 940 EVO 500GB, 2x WD Black 8TB RAID1
Display(s) Samsung QN90a 50" (the IPS one)
Case Lian Li something or other
Power Supply XFX 750W Black Edition
Software Win10 Pro
Hybrid graphics is not trivial. I don't think that is possible with another dGPU, but others should know more. Please correct me if I'm wrong.

Your best option right now might be either using your upgrade GPU alone or weighing how much you really want 4K@120Hz. Is there no DisplayPort on your TV?

No Displayport on the TV, but it just hit me, they have Displayport to HDMI 2.1 adapters. Is there a downside to that? The GTX 1080 has DP 1.4.
 
Joined
Sep 20, 2019
Messages
519 (0.28/day)
Processor i9-9900K @ 5.1GHz (H2O Cooled)
Motherboard Gigabyte Z390 Aorus Master
Cooling CPU = EK Velocity / GPU = EK Vector
Memory 32GB - G-Skill Trident Z RGB @ 3200MHz
Video Card(s) AMD RX 6900 XT (H2O Cooled)
Storage Samsung 860 EVO - 970 EVO - 870 QVO
Display(s) Samsung QN90A 50" 4K TV & LG 20" 1600x900
Case Lian Li O11-D
Audio Device(s) Presonus Studio 192
Power Supply Seasonic Prime Ultra Titanium 850W
Mouse Logitech MX Anywhere 2S
Keyboard Matias RGB Backlit Keyboard
Software Windows 10 & macOS (Hackintosh)
GTX 1080 with HDMI 2.0 so it'll only do 60hz@4k
I wouldn't know where the settings are for Nvidia because I've been using AMD cards for so long now, but you should be able to use 4:2:0 chroma subsampling (often just called compression) to get 4K 120Hz out of an HDMI 2.0-capable port. Using 4:2:0 is enough to allow for 4K 120Hz. I would think it's wherever you select color settings in Nvidia's control panel.

I did this for a while with an RX 5700 XT (HDMI 2.0 port) going to a 4K 120Hz TV (HDMI 2.1 port). I did buy a new cable for this... don't recall if it was actually necessary though... I doubt it was, because the whole point of using this compression was to fit the signal in the available bandwidth of a 2.0 port/cable, but I got an 8K 60Hz-capable cable anyway since I knew I was eventually going to upgrade the GPU to one with a 2.1 port.
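If you want to sanity-check the numbers, here's a rough back-of-the-envelope sketch (my assumptions: standard CTA-861 4K timing of 4400x2250 including blanking, and roughly 14.4 Gbit/s of usable HDMI 2.0 data rate after 8b/10b encoding) of why 8-bit 4:2:0 squeezes in at 4K 120Hz while 10-bit or 4:4:4 doesn't:

```python
# Back-of-the-envelope check of which 4K formats fit in HDMI 2.0.
# Assumptions: CTA-861 4K timing (4400 x 2250 total, incl. blanking) and
# ~14.4 Gbit/s of effective HDMI 2.0 data rate (18 Gbit/s raw minus 8b/10b overhead).

HDMI20_EFFECTIVE_GBPS = 14.4
TOTAL_H, TOTAL_V = 4400, 2250  # active 3840x2160 plus blanking

def data_rate_gbps(refresh_hz, bits_per_component, subsampling):
    # Average components per pixel: 4:4:4 sends 3, 4:2:2 averages 2,
    # 4:2:0 averages 1.5 (chroma shared across a 2x2 block).
    components = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    pixel_clock_hz = TOTAL_H * TOTAL_V * refresh_hz
    return pixel_clock_hz * bits_per_component * components / 1e9

for hz, bpc, sub in [(60, 8, "4:4:4"), (120, 8, "4:4:4"),
                     (120, 8, "4:2:0"), (120, 10, "4:2:0")]:
    rate = data_rate_gbps(hz, bpc, sub)
    verdict = "fits" if rate <= HDMI20_EFFECTIVE_GBPS else "does NOT fit"
    print(f"4K@{hz}Hz {bpc}-bit {sub}: {rate:.2f} Gbit/s -> {verdict}")
```

8-bit 4:2:0 comes out around 14.3 Gbit/s, just under the limit, while anything 10-bit or 4:4:4 at 120Hz blows past it, which would also explain why HDR (10-bit) is iffy at 4K 120Hz over a 2.0 port.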


Also, would I run into trouble if the 2nd GPU is AMD with the primary being Nvidia?
I've never mixed AMD/Nvidia on Windows before. I did it in macOS with a 2009 Mac Pro, but GPU drivers are very different in that ecosystem. Likely not for the faint of heart to set up in Windows, if it's even possible.

No Displayport on the TV, but it just hit me, they have Displayport to HDMI 2.1 adapters. Is there a downside to that?
I would do the compression method I mention above instead. It's cheaper (free) and there are no worries about compatibility. I looked into these adapters for the same exact reason when they hit the market a few years back, but obviously went with the free option... because... free lol
 
Joined
May 22, 2024
Messages
405 (2.66/day)
System Name Kuro
Processor AMD Ryzen 7 7800X3D@65W
Motherboard MSI MAG B650 Tomahawk WiFi
Cooling Thermalright Phantom Spirit 120 EVO
Memory Corsair DDR5 6000C30 2x48GB (Hynix M)@6000 30-36-36-76 1.36V
Video Card(s) PNY XLR8 RTX 4070 Ti SUPER 16G@200W
Storage Crucial T500 2TB + WD Blue 8TB
Case Lian Li LANCOOL 216
Power Supply MSI MPG A850G
Software Ubuntu 24.04 LTS + Windows 10 Home Build 19045
Benchmark Scores 17761 C23 Multi@65W
No Displayport on the TV, but it just hit me, they have Displayport to HDMI 2.1 adapters. Is there a downside to that? The GTX 1080 has DP 1.4.
Other than the fact that the vast majority of those adapters (not sure about the ones claiming DP 1.4 to HDMI 2.1) do not appear to actually support 4K@120Hz, I don't see much else. I have never used one myself.
 
Joined
Mar 23, 2005
Messages
195 (0.03/day)
System Name Bessy 6.0
Processor i7-7700K @ 4.8GHz
Motherboard MSI Z270 KRAIT Gaming
Cooling Swiftech H140-X + XSPC EX420 + Reservoir
Memory G.Skill Ripjaws V 32GB DDR-3200 CL14 (B-die)
Video Card(s) MSI GTX 1080 Armor OC
Storage Samsung 960 EVO 250GB x2 RAID0, 940 EVO 500GB, 2x WD Black 8TB RAID1
Display(s) Samsung QN90a 50" (the IPS one)
Case Lian Li something or other
Power Supply XFX 750W Black Edition
Software Win10 Pro
I wouldn't know where the settings are for Nvidia because I've been using AMD cards for so long now, but you should be able to use 4:2:0 chroma subsampling (often just called compression) to get 4K 120Hz out of an HDMI 2.0-capable port. Using 4:2:0 is enough to allow for 4K 120Hz. I would think it's wherever you select color settings in Nvidia's control panel.

I did this for a while with an RX 5700 XT (HDMI 2.0 port) going to a 4K 120Hz TV (HDMI 2.1 port). I did buy a new cable for this... don't recall if it was actually necessary though... I doubt it was, because the whole point of using this compression was to fit the signal in the available bandwidth of a 2.0 port/cable, but I got an 8K 60Hz-capable cable anyway since I knew I was eventually going to upgrade the GPU to one with a 2.1 port.



I've never mixed AMD/Nvidia on Windows before. I did it in macOS with a 2009 Mac Pro, but GPU drivers are very different in that ecosystem. Likely not for the faint of heart to set up in Windows, if it's even possible.


I would do the compression method I mention above instead. It's cheaper (free) and there are no worries about compatibility. I looked into these adapters for the same exact reason when they hit the market a few years back, but obviously went with the free option... because... free lol
Doing a quick search I see that though this is possible, you lose HDR and VRR. It seems you lose VRR with an adapter as well (if it works at all!). That sucks.
 

bug

Joined
May 22, 2015
Messages
13,682 (3.98/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
So I have a monitor (it's a TV actually) that supports HDMI 2.1 (4k@120hz w/VRR) but I'm still using a GTX 1080 with HDMI 2.0 so it'll only do 60hz@4k. I've been running at 1080p so I can run 120hz and VRR.

Is there a downside (besides power consumption) to using a 2nd GPU to drive the monitor but still run games on the 1080? Latency? Image quality? Anything?

I was thinking of getting the cheapest HDMI 2.1 capable card I can find for this and I have no desire, or need, to upgrade to a more powerful GPU at this time.

Also, would I run into trouble if the 2nd GPU is AMD with the primary being Nvidia?

Thanks :)

(and yes, I tried to look this up but it kept taking me to dual-monitor articles/posts)
I don't get it. You want a cheap video card to use as a sort of a pass-through for your GTX1080, only to add HDMI 2.1 out? I don't think that's possible.
 
Joined
Feb 6, 2021
Messages
2,852 (2.11/day)
Location
Germany
Processor AMD Ryzen 7 7800X3D
Motherboard ASRock B650E Steel Legend Wifi
Cooling Arctic Liquid Freezer III 280
Memory 2x16GB Corsair Vengeance RGB 6000 CL30 (A-Die)
Video Card(s) RTX 4090 Gaming X Trio
Storage 1TB Samsung 990 PRO, 4TB Corsair MP600 PRO XT, 1TB WD SN850X, 4x4TB Crucial MX500
Display(s) Alienware AW2725DF, LG 27GR93U, LG 27GN950-B
Case Streacom BC1 V2 Black
Audio Device(s) Bose Companion Series 2 III, Sennheiser GSP600 and HD599 SE - Creative Soundblaster X4
Power Supply bequiet! Dark Power Pro 12 1500w Titanium
Mouse Razer Deathadder V3
Keyboard Razer Black Widow V3 TKL
VR HMD Oculus Rift S
Software ~2000 Video Games
Flipping the GPU on eBay and buying a new one would be the best solution.
 
Joined
Sep 20, 2019
Messages
519 (0.28/day)
Processor i9-9900K @ 5.1GHz (H2O Cooled)
Motherboard Gigabyte Z390 Aorus Master
Cooling CPU = EK Velocity / GPU = EK Vector
Memory 32GB - G-Skill Trident Z RGB @ 3200MHz
Video Card(s) AMD RX 6900 XT (H2O Cooled)
Storage Samsung 860 EVO - 970 EVO - 870 QVO
Display(s) Samsung QN90A 50" 4K TV & LG 20" 1600x900
Case Lian Li O11-D
Audio Device(s) Presonus Studio 192
Power Supply Seasonic Prime Ultra Titanium 850W
Mouse Logitech MX Anywhere 2S
Keyboard Matias RGB Backlit Keyboard
Software Windows 10 & macOS (Hackintosh)
Is there no DisplayPort on your TV?
I've never seen any TV with DP; they are always HDMI.

Doing a quick search I see that, though this is possible, you lose HDR and VRR. It seems you lose VRR with an adapter as well. That sucks.
That is definitely wrong about losing VRR via 4:2:0 compression. I used VRR like that, with 4:2:0 compression; at the very least I know VRR was used all the time in any game. HDR I only used in a couple of games, and I may have been using 1440p because the games were too much to get 120+ FPS (Doom Eternal, Hitman III and the like)... but I want to say HDR worked at 4K 120Hz too. Maybe HDR is too much for that bandwidth even with compression, I don't recall exactly for HDR. But VRR was super important to me and the main driving point of me getting the TV, which was a Samsung QN90A. Wouldn't know about the adapter, but your quick search regarding use of compression is wrong for VRR (might be right for HDR though).

Only HDR would maybe be a lost capability at 4K 120. Any lower resolution would absolutely also fit HDR, like 3200x1800 120Hz for example.
 
Joined
Mar 23, 2005
Messages
195 (0.03/day)
System Name Bessy 6.0
Processor i7-7700K @ 4.8GHz
Motherboard MSI Z270 KRAIT Gaming
Cooling Swiftech H140-X + XSPC EX420 + Reservoir
Memory G.Skill Ripjaws V 32GB DDR-3200 CL14 (B-die)
Video Card(s) MSI GTX 1080 Armor OC
Storage Samsung 960 EVO 250GB x2 RAID0, 940 EVO 500GB, 2x WD Black 8TB RAID1
Display(s) Samsung QN90a 50" (the IPS one)
Case Lian Li something or other
Power Supply XFX 750W Black Edition
Software Win10 Pro
I don't get it. You want a cheap video card to use as a sort of a pass-through for your GTX1080, only to add HDMI 2.1 out? I don't think that's possible.
I assumed it was, since Windows can now select which GPU is used per-app. I'm running with the integrated graphics and can still run programs on it with no monitor connected to it:
[screenshot: Windows per-app graphics preference settings]
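For reference, Windows apparently stores those per-app picks in the registry too. Here's a minimal sketch; my assumption is that the Settings > System > Display > Graphics choices land under HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences as one string value per executable, and the exe path below is just a made-up example:

```python
# Sketch: set a per-app GPU preference the same way the Settings UI appears to.
# Assumption: values live under HKCU\Software\Microsoft\DirectX\UserGpuPreferences,
# named by executable path, with data like "GpuPreference=2;"
# (0 = let Windows decide, 1 = power saving, 2 = high performance).
import winreg

KEY_PATH = r"Software\Microsoft\DirectX\UserGpuPreferences"
EXE = r"C:\Games\SomeGame\game.exe"  # hypothetical path, substitute your own

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
    # Pin this executable to the high-performance GPU
    winreg.SetValueEx(key, EXE, 0, winreg.REG_SZ, "GpuPreference=2;")
```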

I've never seen any TV with DP; they are always HDMI.


That is definitely wrong about losing VRR via 4:2:0 compression. I used VRR like that, with 4:2:0 compression; at the very least I know VRR was used all the time in any game. HDR I only used in a couple of games, and I may have been using 1440p because the games were too much to get 120+ FPS (Doom Eternal, Hitman III and the like)... but I want to say HDR worked at 4K 120Hz too. Maybe HDR is too much for that bandwidth even with compression, I don't recall exactly for HDR. But VRR was super important to me and the main driving point of me getting the TV, which was a Samsung QN90A. Wouldn't know about the adapter, but your quick search regarding use of compression is wrong for VRR (might be right for HDR though).

Only HDR would maybe be a lost capability at 4K 120. Any lower resolution would absolutely also fit HDR, like 3200x1800 120Hz for example.
Actually I meant that VRR won't work with an adapter (but HDR will). This is what I've read so far; I have more research to do.
 

bug

Joined
May 22, 2015
Messages
13,682 (3.98/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
I assumed it was, since Windows can now select which GPU is used per-app. I'm running with the integrated graphics and can still run programs on it with no monitor connected to it:
[screenshot: Windows per-app graphics preference settings]
I meant it's not possible to output games rendered on your GTX 1080 through the second card. Again, not sure that's what you were asking; that's just what I understood.
 
Joined
Dec 25, 2020
Messages
6,389 (4.58/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Hybrid graphics is not trivial. I don't think that is possible with another dGPU, but others should know more. Please correct me if I'm wrong.

Your best option right now might be either using your upgrade GPU alone or weighing how much you really want 4K@120Hz. Is there no DisplayPort on your TV?

Perfectly possible nowadays. Since Windows 7, multi-vendor configurations have been explicitly supported, as multiple graphics drivers can now be loaded. What you cannot do is load two versions of the same driver (e.g. 391.35 for Fermi and a current driver version); however, you can have both the AMD and NVIDIA drivers coexist and be active at the same time. Some machines even require this by design (such as my AMD+NV laptop).

I assumed it was, since Windows can now select which GPU is used per-app. I'm running with the integrated graphics and can still run programs on it with no monitor connected to it:
[screenshot: Windows per-app graphics preference settings]


Actually I meant that VRR won't work with an adapter (but HDR will). This is what I've read so far; I have more research to do.

Indeed, it can be done. There are no downsides other than perhaps a struggle for system resources (PCIe bandwidth) depending on your PC specs. Sometimes that might not be worth losing 8 primary GPU lanes for, but depending on the hardware you have, it may not matter either. However, expect a slight hit in rendering performance when operating your GPU headless, as Windows will have to copy the output to the framebuffer of the adapter connected to the display. Worst-case scenario, about 10%; this is the same situation as on laptops without a MUX switch.
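To put a rough number on that copy (a sketch, assuming a 32-bit framebuffer at 4K 120Hz and about 0.985 GB/s of usable bandwidth per PCIe 3.0 lane):

```python
# Rough estimate (not a measurement) of the PCIe traffic added by copying each
# rendered frame from the headless GPU to the GPU that drives the display.
# Assumptions: 4-byte (32-bit) framebuffer, ~0.985 GB/s usable per PCIe 3.0 lane.

WIDTH, HEIGHT, BYTES_PER_PIXEL, REFRESH_HZ = 3840, 2160, 4, 120
PCIE3_GBPS_PER_LANE = 0.985

copy_traffic = WIDTH * HEIGHT * BYTES_PER_PIXEL * REFRESH_HZ / 1e9  # GB/s
for lanes in (8, 16):
    link = lanes * PCIE3_GBPS_PER_LANE
    print(f"x{lanes} link: frame copies take {copy_traffic:.1f} of {link:.1f} GB/s "
          f"({copy_traffic / link:.0%} of the link)")
```

So at 4K 120Hz the copies alone are roughly 4 GB/s, which is a sizable chunk of an x8 Gen3 link but much less of a concern on x16.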

I meant it's not possible to output games rendered on your GTX 1080 through the second card. Again, not sure that's what you were asking; that's just what I understood.

Headless operation is possible and can be done; it is also explicitly supported on Windows 10 and newer. The mobile RTX 3050, in fact, is specifically designed for this: it does not support a MUX switch, so the laptop panel is connected to the integrated graphics and the dedicated GPU's output is managed by Windows.
 
Joined
Mar 23, 2005
Messages
195 (0.03/day)
System Name Bessy 6.0
Processor i7-7700K @ 4.8GHz
Motherboard MSI Z270 KRAIT Gaming
Cooling Swiftech H140-X + XSPC EX420 + Reservoir
Memory G.Skill Ripjaws V 32GB DDR-3200 CL14 (B-die)
Video Card(s) MSI GTX 1080 Armor OC
Storage Samsung 960 EVO 250GB x2 RAID0, 940 EVO 500GB, 2x WD Black 8TB RAID1
Display(s) Samsung QN90a 50" (the IPS one)
Case Lian Li something or other
Power Supply XFX 750W Black Edition
Software Win10 Pro
On a side note: I was just looking at benchmark comparisons and it seems it will cost at least $300 to match the performance of this 7-year-old GTX 1080 with an HDMI 2.1-capable card (I would never buy a used GPU, that's like buying an open jar of mayo, ya just never know lol). Either that 1080 was an incredible value at $550, or we haven't come very far. Then again, that's around $700 in 2024 dollars, but still...

$130 just for HDMI 2.1 (cheapest new 2.1 card I could find) isn't too appealing either.
 
Joined
Dec 25, 2020
Messages
6,389 (4.58/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
On a side note: I was just looking at benchmark comparisons and it seems it will cost at least $300 to match the performance of this 7-year-old GTX 1080 with an HDMI 2.1-capable card (I would never buy a used GPU, that's like buying an open jar of mayo, ya just never know lol). Either that 1080 was an incredible value at $550, or we haven't come very far. Then again, that's around $700 in 2024 dollars, but still...

$130 just for HDMI 2.1 (cheapest new 2.1 card I could find) isn't too appealing either.

There has been massive progress since Pascal on all fronts, including performance; what happened in the meantime was the unfortunate death of the low-end GPU and prices that never truly recovered after Covid. I have a 1070 Ti and it's almost rudimentary when put to work next to my RTX 4080, to the point that I want to grab a 50-series card (hopefully a 5090) and just retire the 4080 to a secondary role instead of selling it.
 
Joined
Mar 23, 2005
Messages
195 (0.03/day)
System Name Bessy 6.0
Processor i7-7700K @ 4.8GHz
Motherboard MSI Z270 KRAIT Gaming
Cooling Swiftech H140-X + XSPC EX420 + Reservoir
Memory G.Skill Ripjaws V 32GB DDR-3200 CL14 (B-die)
Video Card(s) MSI GTX 1080 Armor OC
Storage Samsung 960 EVO 250GB x2 RAID0, 940 EVO 500GB, 2x WD Black 8TB RAID1
Display(s) Samsung QN90a 50" (the IPS one)
Case Lian Li something or other
Power Supply XFX 750W Black Edition
Software Win10 Pro
After screwing around with this, I realized that I can indeed run 4k @120hz with HDR (and VRR works fine). I'm assuming this capability is what the "Plus" means in "Input Signal Plus". It's limited to 8-bit color and 4:2:0 though. The 4:2:0 I can handle, but 8-bit banding gets on my nerves. Crap. :wtf:
 
Joined
Sep 20, 2019
Messages
519 (0.28/day)
Processor i9-9900K @ 5.1GHz (H2O Cooled)
Motherboard Gigabyte Z390 Aorus Master
Cooling CPU = EK Velocity / GPU = EK Vector
Memory 32GB - G-Skill Trident Z RGB @ 3200MHz
Video Card(s) AMD RX 6900 XT (H2O Cooled)
Storage Samsung 860 EVO - 970 EVO - 870 QVO
Display(s) Samsung QN90A 50" 4K TV & LG 20" 1600x900
Case Lian Li O11-D
Audio Device(s) Presonus Studio 192
Power Supply Seasonic Prime Ultra Titanium 850W
Mouse Logitech MX Anywhere 2S
Keyboard Matias RGB Backlit Keyboard
Software Windows 10 & macOS (Hackintosh)
After screwing around with this, I realized that I can indeed run 4k @120hz with HDR (and VRR works fine). I'm assuming this capability is what the "Plus" means in "Input Signal Plus". It's limited to 8-bit color and 4:2:0 though. The 4:2:0 I can handle, but 8-bit banding gets on my nerves. Crap. :wtf:

Glad you checked it out!

Here is a pointer for HDR with these Samsung TVs (assuming you have one too, since you mention "Input Signal Plus"). I thought HDR looked like SHIT on my TV until I realized the color space settings were wacky. It was not using the correct color space for SDR mode; it was using the "Native" color space and would therefore totally over-saturate things in SDR. So when I would use HDR, by relative comparison it would look much, much duller and darker as a result. The TV shipped like this because they want the "o0o0o0o0.....aaahhHhHhH" effect when they are displayed in store. Generally people are going to think the brighter/over-saturated TV is better. It's kinda like music: a lot of people are going to say the louder one is better even though its mix might be worse by comparison.

When I change the color space for SDR to either Auto or Custom, it looks appropriate for SDR. If you are used to seeing things over-saturated in SDR, this will now look a little washed out by relative comparison, but it's going to be accurate to the source material and will make for an appropriate difference when switching into HDR.
When in HDR mode, it needs to use the Native color space.
With those settings changed correctly, both SDR and HDR modes look great.

Wanted to share that because I did not discover the cause of this until at least a year after getting my TV lol. My TV saves separate settings for each picture mode (Game, Film/Movie, etc.) along with SDR and HDR, so this allows me to set the color space appropriately whether running a game in SDR or HDR. I hope your model works the same way!

My eureka moment was somehow stumbling onto this article one day. It will explain this phenomenon better than me regurgitating points.

Oh, I also do NOT set my PC to be detected as a PC, because I think that makes some additional changes in that setup. I instead make the TV think my PC is a "Game Console", which still allows for use of Game Mode (low input lag) and VRR, but colors look better in my opinion. I might not be remembering the details, but there is definitely a reason why I landed on this choice. Might have been HDR-related too. I want to say PC mode makes everything way too bright/cool. If you haven't played around with that, you should try it out too.
 
Joined
Mar 23, 2005
Messages
195 (0.03/day)
System Name Bessy 6.0
Processor i7-7700K @ 4.8GHz
Motherboard MSI Z270 KRAIT Gaming
Cooling Swiftech H140-X + XSPC EX420 + Reservoir
Memory G.Skill Ripjaws V 32GB DDR-3200 CL14 (B-die)
Video Card(s) MSI GTX 1080 Armor OC
Storage Samsung 960 EVO 250GB x2 RAID0, 940 EVO 500GB, 2x WD Black 8TB RAID1
Display(s) Samsung QN90a 50" (the IPS one)
Case Lian Li something or other
Power Supply XFX 750W Black Edition
Software Win10 Pro
Actually I was wrong. HDR does not work at 4k 120hz.

Glad you checked it out!

Here is a pointer for HDR with these Samsung TVs (assuming you have one too, since you mention "Input Signal Plus"). I thought HDR looked like SHIT on my TV until I realized the color space settings were wacky. It was not using the correct color space for SDR mode; it was using the "Native" color space and would therefore totally over-saturate things in SDR. So when I would use HDR, by relative comparison it would look much, much duller and darker as a result. The TV shipped like this because they want the "o0o0o0o0.....aaahhHhHhH" effect when they are displayed in store. Generally people are going to think the brighter/over-saturated TV is better. It's kinda like music: a lot of people are going to say the louder one is better even though its mix might be worse by comparison.

When I change the color space for SDR to either Auto or Custom, it looks appropriate for SDR. If you are used to seeing things over-saturated in SDR, this will now look a little washed out by relative comparison, but it's going to be accurate to the source material and will make for an appropriate difference when switching into HDR.
When in HDR mode, it needs to use the Native color space.
With those settings changed correctly, both SDR and HDR modes look great.

Wanted to share that because I did not discover the cause of this until at least a year after getting my TV lol. My TV saves separate settings for each picture mode (Game, Film/Movie, etc.) along with SDR and HDR, so this allows me to set the color space appropriately whether running a game in SDR or HDR. I hope your model works the same way!

My eureka moment was somehow stumbling onto this article one day. It will explain this phenomenon better than me regurgitating points.

Oh, I also do NOT set my PC to be detected as a PC, because I think that makes some additional changes in that setup. I instead make the TV think my PC is a "Game Console", which still allows for use of Game Mode (low input lag) and VRR, but colors look better in my opinion. I might not be remembering the details, but there is definitely a reason why I landed on this choice. Might have been HDR-related too. I want to say PC mode makes everything way too bright/cool. If you haven't played around with that, you should try it out too.
When in PC mode, the color space is locked to Native. It has two picture modes though: Graphic and Entertain. Graphic seems fine, but Entertain feels like my retinas are getting burned out of my eyeballs. lol

I'm not noticing any lag at all in PC mode though.

I realized that I'm pretty much at my limit on power (I like to stay as close to 50% load on the PSU as I can for efficiency) and I don't want to have to buy a new PSU to run two cards.
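Rough numbers on why I figure I'm already near that 50% mark (these are estimates I'm assuming, not measurements):

```python
# Ballpark power budget check against a 750 W PSU at a ~50% load target.
# All figures below are rough assumptions, not measured draws.

PSU_WATTS = 750
TARGET_FRACTION = 0.5  # stay near 50% load for best efficiency

draws = {
    "i7-7700K @ 4.8GHz (est.)": 120,
    "GTX 1080 (board power)": 180,
    "drives, fans, pump, misc (est.)": 70,
}
second_card_watts = 55  # e.g. a low-end HDMI 2.1 card, estimated

base = sum(draws.values())
print(f"current: ~{base} W ({base / PSU_WATTS:.0%} of PSU)")
print(f"with a 2nd card: ~{base + second_card_watts} W "
      f"({(base + second_card_watts) / PSU_WATTS:.0%} of PSU), target {TARGET_FRACTION:.0%}")
```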

I'm just going to replace the card with an RX 7600 XT and sell the 1080 and the 1060 I have in a box here somewhere; plus it comes with two games I might be able to sell. It's roughly 15% faster than the 1080 at stock, so what the hell. I might even spring for a 7700 XT, but I'll have to do some research to see if it's worth it.
 