System Name | Hotbox |
---|---|
Processor | AMD Ryzen 7 5800X, 110/95/110, PBO +150MHz, CO -7,-7,-20(x6) |
Motherboard | ASRock Phantom Gaming B550 ITX/ax |
Cooling | LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14 |
Memory | 32GB G.Skill FlareX 3200c14 @3800c15 |
Video Card(s) | PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W |
Storage | 2TB Adata SX8200 Pro |
Display(s) | Dell U2711 main, AOC 24P2C secondary |
Case | SSUPD Meshlicious |
Audio Device(s) | Optoma Nuforce μDAC 3 |
Power Supply | Corsair SF750 Platinum |
Mouse | Logitech G603 |
Keyboard | Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps |
Software | Windows 10 Pro |
> LG OLED TVs do not have a bespoke G-Sync implementation. They have HDMI 2.1 and its VRR, which is a pretty standard thing (and not compatible with the bespoke FS-over-HDMI). Although no cards have HDMI 2.1 ports, Nvidia added support for some HDMI 2.1 features - in this context, namely VRR - to some of their cards with HDMI 2.0. Nothing really prevents AMD from doing the same. FS-over-HDMI will not be added, but AMD can add VRR support in the same way Nvidia did, and it will probably be branded as FreeSync something or other.
>
> Not confirmed, but I am willing to bet both next-gen GPUs will have HDMI 2.1 ports and VRR support.

Implementing HDMI 2.1 features on a GPU with only HDMI 2.0 hardware is by definition a bespoke implementation: it bypasses and supersedes the standard and is made specifically for that (combination of) part(s) - in other words, bespoke and custom-made. Beyond that, nothing you said contradicts anything I said. To reiterate: LG has still not confirmed whether the 2019 OLEDs will support HDMI 2.1 VRR universally - which would after all make sense to do, given that both next-gen consoles support it, as will upcoming GPUs. The absence of an unequivocal confirmation might mean nothing at all, or it might mean that LG didn't bother to implement this part of the standard properly (which isn't unlikely given how early their implementation arrived). And yes, I am also willing to bet both camps will have HDMI 2.1 ports with VRR support on their upcoming GPUs.
I'm not arguing the same as @ARF here, but using on-paper boost specs for Nvidia-vs-AMD comparisons is quite misleading. GPU Boost 3.0 means that every card exceeds its boost clock spec: most reviews place real-world boost speeds for FE cards in the high 1800s or low 1900s MHz, well above the 1620MHz spec. AMD's "boost clock" spec, on the other hand, is a peak clock, with the "game clock" being the expected real-world speed (yes, it's quite dumb - why does the boost spec exist at all?). Beyond that, though, I agree (and frankly find it preposterous that anyone would disagree) that Nvidia still has a significant architectural efficiency advantage (call it "IPC" or whatever). They still get more gaming performance per shader core and per TFLOP, and they are on par in perf/W despite being on a much less advanced node. That said, AMD has gained on Nvidia dramatically over the past generation (partly thanks to their node advantage, but also due to RDNA's architectural improvements - just look at the Radeon VII vs. the 5700 XT, both on 7nm), with the 5700 (non-XT) and especially the 5600 outright beating Nvidia's best in perf/W for the first time in recent history. With RDNA 2 promising another dramatic perf/W increase while Nvidia moves to a better (though still not quite matched) node, this is shaping up to be a very interesting launch cycle.
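To make the clock-spec point concrete, here is a rough back-of-envelope sketch (Python) of how much an on-paper boost clock understates effective FP32 throughput. The 2304 shaders and 1620MHz spec boost match the RTX 2070's reference figures; the ~1900MHz "observed" clock is an illustrative assumption based on the real-world GPU Boost behaviour described above, not a measured value.

```python
# Back-of-envelope FP32 throughput: 2 ops per clock (FMA) per shader.
def fp32_tflops(shaders: int, clock_mhz: float) -> float:
    return 2 * shaders * clock_mhz * 1e6 / 1e12

# RTX 2070 reference figures; the observed clock is an assumption
# standing in for typical real-world GPU Boost 3.0 behaviour.
spec = fp32_tflops(2304, 1620)  # ~7.5 TFLOPS on the spec sheet
real = fp32_tflops(2304, 1900)  # ~8.8 TFLOPS as actually clocked

print(f"spec boost: {spec:.2f} TFLOPS, observed boost: {real:.2f} TFLOPS "
      f"({(real / spec - 1) * 100:.0f}% above what the spec sheet implies)")
```

Since throughput scales linearly with clock, the ~17% gap carries straight into any TFLOPS or perf-per-TFLOP comparison built on the spec number - which is exactly why spec-sheet boost clocks are a poor basis for cross-vendor comparisons.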
It probably can, but WTF does that have to do with my point? I wasn't even replying to you - you've just successfully trolled the discussion with your inability to understand English.