
LG OLED TVs Receive NVIDIA G-SYNC Upgrade Starting This Week

AleksandarK

News Editor
Staff member
Joined
Aug 19, 2017
Messages
2,543 (0.96/day)
LG Electronics USA announced that 2019 OLED TVs are receiving a firmware update starting this week, enabling stunning gameplay via NVIDIA G-SYNC Compatible technology.

NVIDIA G-SYNC Compatible support will be available on the E9 series (65 and 55 inches) and C9 series (77, 65 and 55 inches) as well as the B9 series (65 and 55 inches) LG OLED TV models in the United States in November.


With G-SYNC Compatible support, the critically acclaimed LG OLED TVs will deliver the smoothest, most immersive gaming experience without the flicker, tearing or stuttering common to most displays. Gamers have long valued LG OLED TVs for their exceptional picture quality, low input lag and ultra-fast response time. The addition of G-SYNC Compatible support allows gamers with GeForce RTX 20-Series or GTX 16-Series GPUs to fully enjoy extreme responsiveness and optimized visuals on LG's large OLED TVs from 55 inches up to an immersive 77 inches.

"If you'll pardon the pun, this is truly a game changer for the legions of gamers out there," said Tim Alessi, head of home entertainment product marketing at LG Electronics USA. "Partnering with NVIDIA to integrate their G-SYNC Compatible support into our category-leading LG OLED TVs delivers a new standard in gaming performance and opens a new world of large-screen 4K gaming experiences only found on LG OLED TVs."

For more information on LG TVs, visit LG.com.

 
Joined
Jun 24, 2011
Messages
121 (0.02/day)
Location
Germany
System Name MonsterPC
Processor AMD X7950X3D
Motherboard MSI MEG E670X ACE
Cooling Corsair Cappelix H150i
Memory 128GB DDR5 @5600
Video Card(s) AMD RX7900XTX
Storage 120TB HDD + 3x 2TB SSD RAID, External - 390 TB Various HDD drives
Display(s) C49RG90
Case BIG Tower
Audio Device(s) Creative X-FI 4 + SteelSeries Arctis Pro USB
Power Supply Corsair AX1200i
Mouse SteelSeries Rival710
Keyboard SteelSeries Apex Pro
VR HMD Valve Index
Software W10 x64
Why everyone agrees to be misled by Nvidia's PR department is beyond my understanding...

"NVIDIA G-SYNC Compatible" - as in the article - has nothing to do with G-Sync.
On monitors (over DisplayPort) it is Adaptive-Sync (also called Freesync) - a standard developed by AMD and the VESA consortium - or,
on monitors/TVs (over HDMI), it is VRR (Variable Refresh Rate) - a standard developed by AMD and the HDMI consortium (also sometimes called Freesync over HDMI).

Nvidia played no role in developing either of these standards.
All they "did" was - after a few years of milking their G-Sync fanbase - partially stop, and agree to support the free standards... eventually... but as we see, even this they try to sell as their own...
Shady marketing practices...
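[Editor's note: whatever the branding, Adaptive-Sync, Freesync and HDMI VRR all share the same core mechanism: the display refreshes when the GPU delivers a frame, within a supported interval window, instead of on a fixed clock. A toy sketch of that idea (an illustrative model only, not any vendor's actual signaling protocol; the 40-120 Hz window is an assumed example):]

```python
# Toy model of variable refresh rate: each refresh tracks frame delivery,
# clamped to the panel's supported refresh-interval window.

def vrr_refresh_times(frame_times_ms, min_hz=40, max_hz=120):
    """Return the instants (ms) at which a VRR display refreshes."""
    min_interval = 1000.0 / max_hz   # cannot refresh faster than max_hz
    max_interval = 1000.0 / min_hz   # must refresh before min_hz expires
    refreshes, last = [], 0.0
    for ft in frame_times_ms:
        # Frame arrives ft ms after the previous refresh; the panel follows
        # it as long as that lands inside the supported window.
        interval = min(max(ft, min_interval), max_interval)
        last += interval
        refreshes.append(last)
    return refreshes

# Uneven GPU output (ms per frame) is displayed without tearing or repeats,
# because each refresh simply waits for the actual frame.
print(vrr_refresh_times([10.0, 16.7, 22.0, 14.0]))
```

A fixed 60 Hz panel would instead tear or repeat frames that miss its clock; here the refresh cadence bends to the frame cadence as long as it stays between 1000/120 ms and 1000/40 ms.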
 
Joined
Feb 3, 2017
Messages
3,732 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
B9 getting that as well is good news.

On monitors (over DisplayPort) it is Adaptive-Sync (also called Freesync) - a standard developed by AMD and the VESA consortium - or,
on monitors/TVs (over HDMI), it is VRR (Variable Refresh Rate) - a standard developed by AMD and the HDMI consortium (also sometimes called Freesync over HDMI).
Displayport Adaptive-sync has become synonymous with Freesync. Freesync is AMD's implementation of Displayport Adaptive-sync on their cards; Gsync Compatible is its Nvidia counterpart - late, and building on the AMD-initiated standard, but a counterpart nonetheless. Gsync Compatible also includes a real value-add in the form of an actual certification program for monitors.
Freesync over HDMI is not a standard but AMD proprietary technology.

LG's 2019 models have HDMI 2.1 with its optional VRR standard.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,083 (4.65/day)
Location
Kepler-186f
Why everyone agrees to be misled by Nvidia's PR department is beyond my understanding...

"NVIDIA G-SYNC Compatible" - as in the article - has nothing to do with G-Sync.
On monitors (over DisplayPort) it is Adaptive-Sync (also called Freesync) - a standard developed by AMD and the VESA consortium - or,
on monitors/TVs (over HDMI), it is VRR (Variable Refresh Rate) - a standard developed by AMD and the HDMI consortium (also sometimes called Freesync over HDMI).

Nvidia played no role in developing either of these standards.
All they "did" was - after a few years of milking their G-Sync fanbase - partially stop, and agree to support the free standards... eventually... but as we see, even this they try to sell as their own...
Shady marketing practices...


Nvidia has 70% market share of the GPU world, so it makes sense that manufacturers want to put that name on their monitors if possible. Also, there are a lot of terrible Freesync monitors out there that do not qualify for the Nvidia G-Sync Compatible label per Nvidia's requirements, which is good for people trying to find quality.

Overall though, yes, I agree with you, but I also understand their side of it; they could word it differently.
 
Joined
Oct 25, 2019
Messages
203 (0.11/day)
Nvidia's implementation was and still is unironically superior, not plagued by the stuttering and ghosting limitations of Freesync. Plus, monitor manufacturers provide G-Sync for the highest-end, cutting-edge monitors because 4K60+ is still implausible at ultra settings on the Radeon VII/5700 XT.

The onboard chip also did most of the heavy lifting, so framerate drops weren't anywhere near as precipitous as with Freesync. It could have been developed into something much better.
 
Joined
Feb 3, 2017
Messages
3,732 (1.32/day)
All they "did" was - after a few years of milking their G-Sync fanbase - partially stop, and agree to support the free standards... eventually... but as we see, even this they try to sell as their own...
If you look at the situation with monitors today, this is a skewed comparison. You might want to get the timeline right:
2013 October - Nvidia demos Gsync.
2013 December - Nvidia releases Gsync Module that basically fails to do anything market-wise.
2014 January - Gsync monitors are released, wider availability in spring.
2014 January - AMD demos prototype Freesync on laptops and eDP.
2014 May - Displayport 1.2a standard gets optional Adaptive-sync added.
2015 March - AMD announces Freesync
2015 Summer - Freesync-over-HDMI is demoed.
2015 Autumn - Availability of Freesync monitors.
2015 November - LFC is introduced.
2015 December - Freesync-over-HDMI is launched with monitor availability in early 2016.

Gsync (the one with the module) has been a static thing from the beginning when it comes to features and VRR-related specs: high refresh rates, frame-doubling/LFC. When you see the Gsync logo, this is what you know you get.
The first batch of Freesync monitors at release (1.5 years after the Gsync release) included a lot of 48-75Hz monitors, and even for monitors that were capable, LFC arrived another quarter later.
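[Editor's note: the frame-doubling/LFC mentioned above can be sketched in a few lines: when the game's frame rate drops below the panel's minimum refresh rate, the driver shows each frame an integer number of times so the effective refresh rate stays inside the VRR window. Illustrative logic only (the 48-120 Hz panel is an assumed example), not AMD's or Nvidia's actual driver code:]

```python
# Low Framerate Compensation: repeat frames to lift the effective refresh
# rate back into the panel's supported VRR window.

def lfc_multiplier(fps, panel_min_hz=48, panel_max_hz=120):
    """Smallest repeat count that lifts the effective rate into range."""
    n = 1
    while fps * n < panel_min_hz:
        n += 1
    if fps * n > panel_max_hz:      # can't fit: would have to fall back
        return None
    return n

# 30 fps on a 48-120 Hz panel: each frame shown twice -> 60 Hz effective.
print(lfc_multiplier(30))   # -> 2
print(lfc_multiplier(20))   # -> 3 (60 Hz effective)
print(lfc_multiplier(50))   # -> 1 (already inside the window)
```

This also shows why those early 48-75Hz panels could not get LFC: a 40 fps game doubles to 80 Hz, past a 75 Hz ceiling, so LFC needs the panel's maximum to be roughly at least twice its minimum.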

For almost 2 years, Freesync was not on par with Gsync on a purely technical level. After that it took another year or so for a significant number of competitive Freesync monitors to appear.
This is why Nvidia went for a proprietary solution and their own scaler hardware. At that point there were no other solutions available.

I did not like the price part of that equation, and AMD's inability to compete in the high-end GPU market for a significant amount of time definitely played a part here. Trying to look at it objectively, though: judging by the trouble AMD had to go through to get traction on VRR standards in terms of monitors on the market, I really cannot blame Nvidia for going this way.

Edit:
AMD made significant mistakes with Freesync as well. They clearly went for quantity over quality. Any DP Adaptive-sync capable monitor gets the Freesync logo - some that do not work well, and some that are not even authorized by AMD (Freesync is their logo, so they should authorize monitors to carry it). This is what left the door open for Gsync Compatible. If we overlook the AMD vs Nvidia thing, Gsync Compatible has significantly more restrictive requirements than Freesync, and monitors are more closely scrutinized before being certified. This makes it a more desirable logo to have.

Both AMD and Nvidia seem to have overestimated the significance (or rather, the availability) of HDR in the monitor market with Freesync 2 and Gsync Ultimate. Both are effectively stillborn.

Edit2:
Leaving logos and marketing aside, VRR fortunately seems to be well on the road to ubiquity. DP Adaptive Sync is there for most higher-refresh rate monitors, HDMI 2.1 is coming and brings its standard VRR with it. All good things. I wish we could get rid of both Freesync and Gsync branding but that's unfortunately not likely to happen.
 
Joined
Aug 23, 2018
Messages
88 (0.04/day)
LG 2019 models have HDMI 2.1 with its optional standard VRR.

That may be, but the 20- and 16-Series GPUs have HDMI 2.0b, which means there is something else going on here.

I wonder if there is money involved, or if it's simply a partnership meant to benefit both parties. I'd really be interested to know, because it's possible there are some shenanigans going on here, considering that AMD is more or less the proprietor of VRR via HDMI 2.0b and earlier versions.
 
Joined
May 2, 2017
Messages
7,762 (2.82/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Question: Is there anything beyond firmware/software locks blocking this from working with AMD GPUs/FreeSync?
 
Joined
Feb 3, 2017
Messages
3,732 (1.32/day)
AMD's Freesync-over-HDMI is not a standard solution. It is a custom thing that should be based on DP Adaptive-sync signaling.
Given that LG's 2019 OLEDs have HDMI 2.1 with VRR, it is most likely that Nvidia's Gsync Compatible thing here is the same type of mixed tech: HDMI 2.1 VRR signaling from an HDMI 2.0 source.

Question: Is there anything beyond firmware/software locks blocking this from working with AMD GPUs/FreeSync?
No. It might need custom firmware developed with AMD, though.
Pretty sure when GPUs with HDMI 2.1 come out - next gen for both manufacturers, I would assume - this will be a moot point.
 
Joined
May 2, 2017
Messages
7,762 (2.82/day)
No. It might need custom firmware developed with AMD, though.
Pretty sure when GPUs with HDMI 2.1 come out - next gen for both manufacturers, I would assume - this will be a moot point.
Thanks :) I'll be in the market for both a new GPU and a new TV in the coming year or so, so stuff like this is ... interesting.
 
Joined
Feb 3, 2017
Messages
3,732 (1.32/day)

The important technical bit about this news is that you can get 1440p@120Hz (40-120 Hz range) or 2160p@60Hz (40-60 Hz range).
The TVs can do 2160p@120Hz, but it simply won't fit through HDMI 2.0.
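[Editor's note: the "won't fit" part is simple arithmetic. A back-of-envelope sketch, counting raw pixel data only at 8-bit RGB; real links also carry blanking intervals, so actual requirements are somewhat higher. The 14.4 Gbps figure is HDMI 2.0's usable data rate after 8b/10b encoding of its 18 Gbps link.]

```python
# Rough required data rate for a given video mode, raw pixel data only.
def raw_gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

HDMI_20_DATA_RATE = 14.4   # Gbps usable (18 Gbps link, 8b/10b encoding)

for w, h, hz in [(3840, 2160, 60), (2560, 1440, 120), (3840, 2160, 120)]:
    need = raw_gbps(w, h, hz)
    verdict = "fits" if need <= HDMI_20_DATA_RATE else "does not fit"
    print(f"{w}x{h}@{hz}Hz needs ~{need:.1f} Gbps -> {verdict}")
```

2160p@120Hz needs roughly 23.9 Gbps of raw pixel data, well past HDMI 2.0, while 2160p@60Hz (~11.9 Gbps) and 1440p@120Hz (~10.6 Gbps) both fit, which matches the two modes above.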
 
Joined
Aug 23, 2018
Messages
88 (0.04/day)
AMD's Freesync-over-HDMI is not a standard solution. It is a custom thing that should be based on DP Adaptive-sync signaling.

That's exactly my point. So what sort of technology is Nvidia using, since it was mostly, if not entirely, AMD who developed VRR via HDMI before the HDMI 2.1 standard fully implemented it? Maybe it's a simple driver update to support the HDMI 2.1 standard, but I wasn't sure that was possible. However, AMD has apparently said that's exactly what they'll do...

"...Radeon™ Software will add support for HDMI 2.1 Variable Refresh Rate (VRR) technology on Radeon™ RX products in an upcoming driver release. This support will come as an addition to the Radeon™ FreeSync technology umbrella, as displays with HDMI 2.1 VRR support reach market."

So, I believe these new TVs should work fine with AMD cards if/when there is a driver update that enables support. For all I know, support is already built into current AMD drivers. If it turns out the TVs won't work with AMD cards, then I think people should cry foul, since the HDMI 2.1 standard should be brand- and manufacturer-agnostic.

My Samsung QLED TV fully supports VRR via HDMI, but it uses HDMI 2.0b and, presumably, some sort of AMD-developed proprietary technology or IP, since it predates HDMI 2.1 VRR. Currently, VRR on my TV won't work with my GTX 1070 (via HDMI), but it works fine with Polaris and newer AMD cards. The real disappointment is that Nvidia might arbitrarily block, or otherwise fail to support, VRR on Pascal/10-series and Maxwell/900-series cards, even if a driver update could enable it. Those of us with 10- and/or 900-series cards should make enough noise to get Nvidia to add support, if possible.

The important technical bit about this news is that you can get 1440p@120Hz (40-120 Hz range) or 2160p@60Hz (40-60 Hz range).
The TVs can do 2160p@120Hz, but it simply won't fit through HDMI 2.0.

Exactly, and that's currently what I can do on my Samsung QLED, except that it supposedly supports down to 20 Hz at 1440p. Because of the limited VRR range at 2160p/4K, I usually run games at 1440p @ 120 Hz, which the 5700 XT I'm currently using is better equipped for anyway.
 
Joined
Jan 8, 2017
Messages
9,407 (3.29/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Now that everything is stained by Nvidia and their nifty little branding tricks, the chances of VRR becoming standard across all displays have been drastically lowered.
 
Joined
Feb 3, 2017
Messages
3,732 (1.32/day)
That's exactly my point. So what sort of technology is Nvidia using, since it was mostly, if not entirely, AMD who developed VRR via HDMI before the HDMI 2.1 standard fully implemented it? Maybe it's a simple driver update to support the HDMI 2.1 standard, but I wasn't sure that was possible.
With these LG 2019 OLEDs? Pretty sure it is just HDMI 2.1 VRR backported to work with an HDMI 2.0 signal source (read: Nvidia GPUs). The TV does not need anything Nvidia-specific, besides perhaps enabling access to the feature even when HDMI 2.1 is not used.
Now that everything is stained by Nvidia and their nifty little branding tricks, the chances of VRR becoming standard across all displays have been drastically lowered.
How do you figure? Anything with DP Adaptive-sync works with both AMD and Nvidia cards. The logo on the box is pretty irrelevant. HDMI 2.1 will follow suit.
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Nvidia has 70% market share of the GPU world, so it makes sense
65%, if even that.
And AMD has 35% of the discrete GPU market and nearly 100% of the console market; want to talk about it?

No, it makes no sense to piss on a sizable (if not greater, since consoles are almost guaranteed to be connected to a TV, unlike PCs) part of your customer base.

Most likely yet another portion of shady shit from NV, such as "you can advertise it as NV compatible only if you avoid mentioning Freesync".
 
Joined
Jan 8, 2017
Messages
9,407 (3.29/day)
Logo on the box is pretty irrelevant.

"Partnering with NVIDIA to integrate their G-SYNC Compatible support into our category-leading LG OLED TVs delivers a new standard in gaming performance and opens a new world of large-screen 4K gaming experiences only found on LG OLED TVs."

Irrelevant, sure. All of these companies are just really good buddies.
 
Joined
Feb 3, 2017
Messages
3,732 (1.32/day)
Nvidia partners with LG, a little while ago AMD partnered with Samsung for the exact same reason. This is business and marketing, nothing else. Why get upset about it?
 
Joined
Jan 8, 2017
Messages
9,407 (3.29/day)
Joined
Feb 3, 2017
Messages
3,732 (1.32/day)
Both.
If a monitor or TV supports DP Adaptive-sync or HDMI 2.1 VRR, I could not care less whether the box or the logos on it are red, green or blue.
On the other hand, expecting companies to stop branding things is too much optimism, so logos will inevitably be there.
 
Joined
Nov 6, 2016
Messages
1,740 (0.59/day)
Location
NH, USA
System Name Lightbringer
Processor Ryzen 7 2700X
Motherboard Asus ROG Strix X470-F Gaming
Cooling Enermax Liqmax Iii 360mm AIO
Memory G.Skill Trident Z RGB 32GB (8GBx4) 3200Mhz CL 14
Video Card(s) Sapphire RX 5700XT Nitro+
Storage Hp EX950 2TB NVMe M.2, HP EX950 1TB NVMe M.2, Samsung 860 EVO 2TB
Display(s) LG 34BK95U-W 34" 5120 x 2160
Case Lian Li PC-O11 Dynamic (White)
Power Supply BeQuiet Straight Power 11 850w Gold Rated PSU
Mouse Glorious Model O (Matte White)
Keyboard Royal Kludge RK71
Software Windows 10
Nvidia partners with LG, a little while ago AMD partnered with Samsung for the exact same reason. This is business and marketing, nothing else. Why get upset about it?

Because it's worth getting upset about... anytime a company takes anything that should otherwise be a completely open standard and starts closing it off, it's a default loss for consumers... and who wouldn't be upset about having to pay for something that should otherwise be free.

Also, I'd like to point out how the following attitude is completely toxic and counterproductive:

"Well, who cares if Nvidia uses underhanded tactics and strategies, all companies do it"

Just because despicable behavior becomes ubiquitous doesn't make it any more justified or less despicable. Just throwing up your hands, shrugging, and saying "oh well, there's nothing I can do about it" is exactly why this behavior has become commonplace and why it will continue to be.

Fighting against this behavior isn't about being pro-AMD or anti-Nvidia, and any fanboys who try to place it in that context are doing so either to knowingly or ignorantly silence debate by framing the discussion in the false binary of AMD vs Nvidia, when it has nothing to do with that. ALL behavior that results in negative consequences for the PC community (even those too blinded by their own loyalties to understand it) should be zealously fought against, regardless of the source.
 
Joined
Feb 3, 2017
Messages
3,732 (1.32/day)
ALL behavior that results in negative consequences for the PC community (even those too blinded by their own loyalties to understand it) should be zealously fought against, regardless of the source.
What is a negative consequence of adding VRR capability to a TV here?
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
What is a negative consequence of adding VRR capability to a TV here?
It's not "adding VRR capability", it's "possibly adding vendor lock-in for VRR" or, in the best case, deceiving customers.
 
Joined
Feb 3, 2017
Messages
3,732 (1.32/day)
It's not "adding VRR capability", it's "possibly adding vendor lock-in for VRR" or, in the best case, deceiving customers.
Ok, again:
- LG 2019 OLED TVs have HDMI 2.1 with VRR support. Very much a standard thing.
- What Nvidia arranged was to add support for the same feature in their current GPU lineup, which only has HDMI 2.0.
- No extra cost, no effect on the standard feature.

What exactly is here to be upset about?
What vendor lock? Deceiving how?
 