| Processor | i9-9900K @ 5.1 GHz (H2O Cooled) |
|---|---|
| Motherboard | Gigabyte Z390 Aorus Master |
| Cooling | CPU = EK Velocity / GPU = EK Vector |
| Memory | 32 GB G-Skill Trident Z RGB @ 3200 MHz |
| Video Card(s) | AMD RX 6900 XT (H2O Cooled) |
| Storage | Samsung 860 EVO - 970 EVO - 870 QVO |
| Display(s) | Samsung QN90A 50" 4K TV & LG 20" 1600x900 |
| Case | Lian Li O11-D |
| Audio Device(s) | Presonus Studio 192 |
| Power Supply | Seasonic Prime Ultra Titanium 850 W |
| Mouse | Logitech MX Anywhere 2S |
| Keyboard | Matias RGB Backlit Keyboard |
| Software | Windows 10 & macOS (Hackintosh) |
Long story short, I have it, but am holding off on a review until Club3D has a chance to fix some bugs.
Out of the box it really doesn't work well with Radeon GPUs, at least. I'm not sure whether it's a brand issue, whether LG falsely advertised my TV as HDMI 2.1 compatible, or whether the adapter itself is defective. There's a lot to diagnose.
I'd like to hear more context on this when you have a chance: the specific equipment you used, namely the GPU, the TV/monitor, and the length, type, and/or brand of the cable(s) connecting the GPU and display.
I too have an RX 5700 XT (going by your listed system specs) and would very likely purchase this adapter to go with a new TV purchase, like LG's 48" CX OLED, which can do 4K 120 Hz. Assuming everything is advertised in good faith, that setup should perform exactly as expected on paper.

I noticed they mention a specific cable length and type, so it seems important to make sure you're not using a standard HDMI 2.0 (2.0b) cable, which wouldn't have the required bandwidth; see the rough math below. Also, many TVs seem finicky about needing a specific "mode" enabled before some or all of their inputs will accept the full bandwidth they're advertised at. Just my two cents; I know it's basic stuff that most here already know.
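As a quick sanity check on the cable point, here's a back-of-the-envelope sketch of the bandwidth involved (a minimal sketch assuming the standard CTA-861 4K timing of 4400 x 2250 total pixels per frame; exact timings vary by display):

```python
# Rough data-rate check for 4K 120 Hz over HDMI.
# Assumes the standard CTA-861 4K timing: 4400 x 2250 total pixels per
# frame (3840 x 2160 active plus blanking). Exact timings vary.

def video_gbps(h_total: int, v_total: int, refresh_hz: int, bits_per_pixel: int) -> float:
    """Uncompressed video data rate in Gbit/s, before link-level encoding."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

for bpp, label in ((24, "8-bit RGB"), (30, "10-bit RGB / HDR")):
    print(f"4K120 {label}: ~{video_gbps(4400, 2250, 120, bpp):.1f} Gbit/s")

# Prints ~28.5 and ~35.6 Gbit/s. HDMI 2.0 tops out at 18 Gbit/s TMDS
# (~14.4 Gbit/s of actual video data after 8b/10b encoding), so a
# 2.0/2.0b cable can't carry 4K120 without chroma subsampling or DSC.
# HDMI 2.1 FRL goes up to 48 Gbit/s, but the GPU (or adapter), the
# cable, and the TV input all have to support it end to end.
```

Either way, the numbers are well past what HDMI 2.0 can carry, which would explain why they call out a specific cable length and type.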