- Joined: Jan 8, 2017
- Messages: 9,505 (3.27/day)
System Name | Good enough |
---|---|
Processor | AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge |
Motherboard | ASRock B650 Pro RS |
Cooling | 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30 |
Memory | 32GB - FURY Beast RGB 5600 MHz |
Video Card(s) | Sapphire RX 7900 XT - Alphacool Eisblock Aurora |
Storage | 1x Kingston KC3000 1TB, 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB, 1x Samsung 860 EVO 500GB |
Display(s) | LG UltraGear 32GN650-B + 4K Samsung TV |
Case | Phanteks NV7 |
Power Supply | GPS-750C |
I bet they want it disabled just so that people can't run AVX-512 benchmarks that would expose even more laughable power consumption figures. Other than that, disabling it speeds up the validation process, and practically no consumer software needs AVX-512, so it's completely irrelevant whether it's there or not.
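For context, this is roughly how consumer software probes for AVX-512 and quietly falls back when it isn't there. A minimal sketch assuming GCC or Clang on x86-64; a complete check would also verify OS register-state support via XGETBV.

```cpp
// Sketch: runtime AVX-512 detection, GCC/Clang on x86-64.
// If the bit isn't set (e.g. the cores fused it off), code simply
// takes the AVX2/scalar path instead.
#include <cpuid.h>
#include <cstdio>

static bool has_avx512f() {
    unsigned int eax = 0, ebx = 0, ecx = 0, edx = 0;
    // CPUID leaf 7, sub-leaf 0: EBX bit 16 = AVX-512 Foundation
    if (!__get_cpuid_count(7, 0, &eax, &ebx, &ecx, &edx))
        return false;
    return (ebx >> 16) & 1;
}

int main() {
    std::printf("AVX-512F: %s\n",
                has_avx512f() ? "present" : "absent -- falling back to AVX2/scalar");
    return 0;
}
```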
> God, the PS3 is a RISC-based platform with an undocumented GPU. You can't just use OpenCL or CUDA to "emulate" a certain specific console and its hardware.

That doesn't really mean anything from an emulation standpoint; at the end of the day you still need to emulate more or less the same thing irrespective of the ISA. The reason you couldn't use CUDA or OpenCL is not that the CPU is RISC, but that the software running on those SPEs needs complex thread synchronization logic that simply can't be done on a GPU. The PS3's GPU is documented; it's just a run-of-the-mill Nvidia GeForce 7000-series architecture, nothing special there, so there's no point in using anything other than OpenGL or any other graphics API.
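For what it's worth, here's a rough sketch of the kind of blocking SPE-to-PPE handshake an emulator has to reproduce. The class name and the one-slot mailbox are simplified stand-ins, not taken from any real emulator; the point is just that each emulated SPE ends up on a host CPU thread that can stall and wake on a mutex/condition variable, which a CUDA or OpenCL kernel has no way to express.

```cpp
// Hypothetical sketch: an emulated SPE outbound mailbox as a blocking,
// fine-grained handshake between two host threads.
#include <condition_variable>
#include <cstdint>
#include <cstdio>
#include <mutex>
#include <optional>
#include <thread>

// One 32-bit slot; the writer blocks while the slot is full.
class SpuOutMailbox {
    std::mutex m;
    std::condition_variable cv;
    std::optional<uint32_t> slot;
public:
    void write(uint32_t value) {   // called by the emulated SPE thread
        std::unique_lock<std::mutex> lk(m);
        cv.wait(lk, [&] { return !slot.has_value(); });  // stall until drained
        slot = value;
        cv.notify_all();
    }
    uint32_t read() {              // called by the emulated PPE thread
        std::unique_lock<std::mutex> lk(m);
        cv.wait(lk, [&] { return slot.has_value(); });
        uint32_t v = *slot;
        slot.reset();
        cv.notify_all();
        return v;
    }
};

int main() {
    SpuOutMailbox mbox;
    std::thread spe([&] {          // stands in for one emulated SPE program
        for (uint32_t i = 0; i < 4; ++i)
            mbox.write(0xCAFE0000 + i);
    });
    for (int i = 0; i < 4; ++i)    // PPE side consumes the results
        std::printf("mailbox -> 0x%08X\n", mbox.read());
    spe.join();
    return 0;
}
```

The graphics side is the easier half for exactly the reason above: the RSX command stream is understood, so it gets translated into OpenGL (or whatever graphics API) calls rather than being emulated instruction by instruction.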