System Name | Diablo | Baal | Mephisto |
---|---|---|---|
Processor | Ryzen 9800X3D | 2x Xeon E5-2697v4 | i7-13900H |
Motherboard | ASRockRack B650D4U-2L2T/BCM | Supermicro X10DRH-iT | Lenovo Thinkpad P1 Gen 6 |
Cooling | Custom loop | SC846 chassis cooling | dual-fanned heatpipes with LM |
Memory | 64GiB DDR5-5600 ECC | 256GiB DDR4-3200 ECC RDIMM | 64GiB DDR5-5600 |
Video Card(s) | RTX 3090 Ti Founder's Edition | Embedded ASPEED2400 | RTX 5000 Ada Mobile (80W) |
Storage | many, many SSDs and HDDs... | | |
Display(s) | Dell U3014 + Dell U3011 | SMCI IPMI KVMoIP | 3840×2400 Samsung OLED |
Case | CaseLabs TH10A | Supermicro SC846 | Lenovo Thinkpad P1 Gen 6 |
Audio Device(s) | Creative SoundBlaster X4 | None | On-board + Moondriver2 Ti + Bluetooth |
Power Supply | Corsair AX1600 | 1200W PSU (Delta) | Lenovo 230W or 300W |
Mouse | Logitech G604 | | |
Keyboard | 1985 IBM Model F 122-key, Lenovo integrated | | |
VR HMD | The wait for 4K per eye is long and winding... | | |
Software | FAAAR too much to list | | |
*facepalm*
There are two monumental problems with 8K:
#1: No cable can carry it. I have my doubts whether DisplayPort can even be expanded to handle it.
#2: Even if #1 were solved, GPUs could handle 8K today without a problem as long as the workload is only 2D. The "monumental problem" is that any load heavier than desktop software is going to make any GPU croak at 8K, and the only way to combat that is more transistors, which means bigger chips, which means more power, which means more heat. Unless there is some breakthrough, displays are going to run away from graphics technology, because graphics can't scale at the rate LCD panels do.
The demand for these panels is coming from the film and TV industry, where the GPU's only task is to render video frames, not process massive amounts of triangles. I don't think gaming will see reasonable 4K for a long time, never mind 8K. These things are mostly for film enthusiasts and professionals, not gamers. Games will have to be played at lower-than-native resolution to get acceptable frame rates.
Oh, and speaking of the film industry: HDMI is going to have to be kicked to the curb, and a new standard (probably adapted from DisplayPort) will have to replace it to handle 8K. Phasing out HDMI in favor of something newer is going to take a very long time.
#1: DisplayPort can do ~25Gbit/s right now, and according to the DisplayPort page on Wikipedia, 8K at 24-bit @ 60Hz lands around 50-60Gbit/s. For 8K at 30-bit @ 120Hz, 125-150Gbit/s is the bandwidth we're looking at. CAT-8 cabling (4-pair, 8 wires) is currently being finalized to provide 40Gbit/s over 100m. DP is a 4-lane cable (one "pair" per lane). Using CAT-8-grade cabling with the right transceivers over the usual 5m maximum length of DP cabling (a 5m CAT-8 cable should be good for at least 100Gbit/s), 8K is perfectly feasible with current tech. Expensive, but feasible. Hell, odds are CAT-8 cabling will be good for 100Gbit/s over the full 100m thanks to new electronics; Pennsylvania State University researchers theorized back in 2007 that 32 or 22nm circuits would do 100Gbit/s over 100m of CAT-7A.
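If you want to sanity-check those Wikipedia figures yourself, here's a minimal back-of-the-envelope sketch in Python. The 25% overhead factor is an assumption standing in for blanking intervals and line coding (DisplayPort's 8b/10b encoding alone costs 20%), and the function name is mine, so treat the output as ballpark only:

```python
# Back-of-the-envelope video link bandwidth calculator.
# Assumes a flat ~25% overhead for blanking intervals and line coding;
# real links (e.g. DisplayPort with 8b/10b encoding) vary in exact overhead.

def video_bandwidth_gbps(h_px, v_px, bits_per_pixel, refresh_hz, overhead=1.25):
    """Raw pixel data rate times an assumed overhead factor, in Gbit/s."""
    raw_bps = h_px * v_px * bits_per_pixel * refresh_hz  # bits per second
    return raw_bps * overhead / 1e9

# 8K (7680x4320), 24-bit color, 60 Hz -> roughly 50-60 Gbit/s
print(f"8K 24bpp @ 60 Hz:  ~{video_bandwidth_gbps(7680, 4320, 24, 60):.0f} Gbit/s")

# 8K, 30-bit color, 120 Hz -> roughly 125-150 Gbit/s
print(f"8K 30bpp @ 120 Hz: ~{video_bandwidth_gbps(7680, 4320, 30, 120):.0f} Gbit/s")
```

The raw numbers (~48 and ~119 Gbit/s before overhead) line up with the 50-60 and 125-150 Gbit/s ranges quoted above.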
#2: Most of us care about 8K for productivity, not games. I for one am happy with only the current 2K screens for games, but olawd the text is nasty :(
And HDMI can go diaf. It's a shitty hack based on DVI with a licensing fee (wuuut, with the free-to-use DisplayPort also around?!), and it's limited to external interfaces only. Personally, the only place I need HDMI is for TVs. If TVs had DisplayPort inputs, I wouldn't need HDMI at all!