That's true. Sort of.
Ray reconstruction occurs after the ray tracing calculations happen (which are done at the API level and are vendor-independent), but it replaces the normal denoiser with a custom pre-trained AI denoiser. That's the "reconstruction" part.
Technically, AMD and Intel could come up with their own denoiser that leverages their own tensor/vector/matrix cores.
It's just math.
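To illustrate what that separation looks like in practice: the ray tracing dispatch is the vendor-independent, API-level part, and the denoiser is just a post-pass over the noisy output that anyone could swap. Here's a rough C++ sketch of that idea; all the type and function names are hypothetical, not actual DXR/Vulkan or vendor SDK calls.

```cpp
#include <vector>

// Noisy per-pixel radiance produced by the vendor-independent RT pass
// (DXR / Vulkan RT dispatch); modelled here as a simple flat RGB buffer.
struct RadianceBuffer {
    int width = 0, height = 0;
    std::vector<float> rgb;  // width * height * 3
};

// The denoiser is a post-process stage with the same inputs and outputs
// no matter who implements it.
class IDenoiser {
public:
    virtual ~IDenoiser() = default;
    virtual void denoise(RadianceBuffer& image) = 0;
};

// "Normal" denoiser: a hand-tuned spatial/temporal filter.
class FilterDenoiser : public IDenoiser {
public:
    void denoise(RadianceBuffer& image) override {
        (void)image;  // ...edge-aware blur over image.rgb...
    }
};

// Ray-reconstruction-style denoiser: a pre-trained network whose inference
// runs on tensor/vector/matrix cores. Same contract, different math engine.
class NeuralDenoiser : public IDenoiser {
public:
    void denoise(RadianceBuffer& image) override {
        (void)image;  // ...run the trained model over image.rgb...
    }
};

// The frame loop doesn't care which denoiser is plugged in: the RT work
// stays identical, only the cleanup stage changes.
void renderFrame(RadianceBuffer& noisy, IDenoiser& denoiser) {
    // traceRays(noisy);  // hypothetical call standing in for the API-level RT pass
    denoiser.denoise(noisy);
}
```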
That information comes from two MLID videos/podcasts.
- Tom covered that Microsoft is trying to pressure AMD for lower pricing on next-gen custom silicon and has threatened to go to ARM or Intel.
- Microsoft also did this last-gen and AMD called their bluff.
- Intel is super-hungry for Microsoft's business, which is when Tom and Dan started spitballing the idea of the Xbox using XeSS.
- They also mentioned why it's highly unlikely that Microsoft would switch to Intel unless Intel just made them a ridiculous at-cost deal.
- Tom and his guest in a more recent podcast mentioned that Sony is rumored to be developing an upscaling tech. It's important to point out that:
- The PS4 and PS5 don't use DirectX or Vulkan. They have their own low-level APIs, GNM and GNMX, and their shader language is PSSL.
- They're developing their own hardware-accelerated upscaling tech for the same reason Microsoft are.
So why are Microsoft and Sony developing their own tech?
I hope I explained it more clearly this time rather than just making a simple statement. I sometimes forget that most people don't have a lot of experience working in software or hardware development.
- To prevent vendor lock-in (e.g. switching from AMD to Intel)
- To make it easier to develop for and port games from console to PC and vice versa.
- Developing for three different proprietary upscaling technologies means committing extensive engineering time to three different code paths and at least three more quality control efforts (see the sketch after this list). That's ridiculously expensive.
- Coincidentally, that's also why most studios ship with FSR initially - they can hit both consoles and the PC in one swim lane.
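To make the "three code paths" cost concrete, here's a rough sketch of what a cross-vendor upscaler wrapper tends to end up looking like. Every enum value is another SDK integration, another set of required inputs, and another QA matrix. Names are hypothetical, not any engine's real API.

```cpp
// Each backend is a separate SDK integration with its own required inputs
// (motion vectors, depth, exposure, reactive masks) and its own QA burden.
enum class UpscalerBackend { FSR, DLSS, XeSS };

struct UpscaleDesc {
    int renderWidth = 0, renderHeight = 0;   // internal render resolution
    int outputWidth = 0, outputHeight = 0;   // presented resolution
};

class Upscaler {
public:
    explicit Upscaler(UpscalerBackend backend) : backend_(backend) {}

    void dispatch(const UpscaleDesc& desc) {
        (void)desc;  // each backend consumes these differently in a real integration
        switch (backend_) {
        case UpscalerBackend::FSR:
            // One path that covers both consoles and every PC GPU,
            // which is why many studios ship this one first.
            // fsrDispatch(desc);   // hypothetical wrapper
            break;
        case UpscalerBackend::DLSS:
            // NVIDIA-only PC path: separate integration, tuning and QA.
            // dlssDispatch(desc);  // hypothetical wrapper
            break;
        case UpscalerBackend::XeSS:
            // Intel path (with a fallback for other GPUs): yet another
            // integration and QA pass.
            // xessDispatch(desc);  // hypothetical wrapper
            break;
        }
    }

private:
    UpscalerBackend backend_;
};
```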
However, because the "just math" performance requirements of RT are so high, I don't think the technical separation between the RT algorithms and the proprietary tech used to run that math fast enough for real-time gaming is a significant one, at least not currently or in the near future. I also don't think it's realistic to assume open standards will win just because they're open; rather, they provide a minimum implementation that vendors can agree on. In fact, I'd argue that the further segmentation of AI upscaling tech is likely there to compensate for supporting open-standard DirectX/Vulkan RT, while still offering the advantages of massively funded, heavily developed proprietary tech that delivers significant IQ and performance gains and keeps local hardware cheap. Sure, companies could expect everyone to run a supercomputer to get native RT at high FPS, but the vast majority won't spend that much.
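As a rough illustration of "open standard as the floor, proprietary on top": the common pattern is to run the standardized RT path everywhere and only enable a vendor's AI denoise/upscale stage where the runtime reports support. The capability struct and function below are hypothetical, not a real driver API.

```cpp
#include <string>

// Hypothetical capability report from the driver/SDK layer.
struct GpuCaps {
    bool supportsStandardRT = false;  // DXR / Vulkan RT: the open baseline
    bool hasVendorAIDenoise = false;  // tensor/matrix-core denoiser available
    bool hasVendorUpscaler  = false;  // proprietary upscaling runtime present
};

// The open standard is the minimum every vendor agrees on; proprietary tech
// is layered on top where available rather than replacing the standard path.
std::string choosePipeline(const GpuCaps& caps) {
    if (!caps.supportsStandardRT)
        return "raster only";                               // no RT support
    if (caps.hasVendorAIDenoise && caps.hasVendorUpscaler)
        return "standard RT + vendor AI denoise/upscale";   // enhanced path
    return "standard RT + generic denoise";                 // guaranteed floor
}
```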
Cheaper NVIDIA cards lower in the stack matching the RT performance of otherwise faster AMD cards higher in the stack, combined with the advantage of superior proprietary tech delivering better IQ, promotes dominance. I don't think it's a stretch to say that dominance is, in large part, due to said proprietary tech, and therefore it's unlikely to disappear.
Arguing technicalities is fun, and we are on a technical forum, but I just don't think you can meaningfully separate RT from those upscaling/AI technologies.
The PlayStation 5 not using DirectX or Vulkan is a vulnerability in my opinion. Sure, it's sold better than the Xbox, but I wonder how much of that is due to brand recognition and comfortable repeat-purchase habits. The PS5 has a serious shortage of exclusive games, and while that could be attributed to everything being cross-platform these days, you could also make the argument that it's because the console uses an uncommon API.
Xbox made its own compromise/mistake by having two massively different tiers of processing power in the Series X and the Series S. From what I hear, developers hate it, since every Xbox game has to run on the Series S.
Nintendo likely getting DLSS in the Switch 2 is also a big boon for proprietary upscaling tech in my opinion, so I really do think the statement that "all proprietary upscaling tech will be shit-canned" is misleading.