Sunday, March 24th 2019
Quantic Dream's Detroit: Become Human to Make Use of Vulkan API on PC, System Requirements Revealed
The former PS4-exclusive Detroit: Become Human, from Quantic Dream, will make use of the Vulkan API in its PC version. This isn't entirely unexpected: PlayStation doesn't feature DirectX like the Xbox does, and its custom API, geared for an AMD-based solution, would certainly borrow from Vulkan's programming model. This likely means that Detroit: Become Human will see improved performance on AMD hardware, if history is any guide: Vulkan-based games have tended to run better on AMD than on NVIDIA.
An NVIDIA GTX 660 graphics card is set as the minimum graphical requirement, alongside an i5-2400 and a measly 4 GB of system RAM. The recommended specs, however, list an i7-2700K, a much more relevant 12 GB of system RAM, and an NVIDIA GTX 1080. That's a huge disparity between the minimum and recommended system requirements, particularly on the RAM and graphics side of the equation, which leaves us wondering exactly what kind of settings will be adjustable in-game. Detroit: Become Human ran at 2160p checkerboard at 30 FPS on the PS4 Pro, with volumetric lighting rendered at 235x135x64 resolution, on a graphics chip that is close to (but slower than) an RX 470 (4.2 TFLOPS on the PS4 Pro chip versus 4.9 TFLOPS on the RX 470).
Source:
Epic Games Store Page
14 Comments on Quantic Dream's Detroit: Become Human to Make Use of Vulkan API on PC, System Requirements Revealed
What about 4C/4T CPUs with much higher IPC, like the i5-6600K/7600K?
What about CPUs like the Ryzen 3 1200?
Lazy requirements. Lazy as all hell.
“PlayStation doesn't feature DirectX like the Xbox does, and its custom API, geared for an AMD-based solution, would certainly borrow from Vulkan's programming model.”
GNM is D3D12/Vulkan-like
GNMX is D3D11/OpenGL-like
PS4 uses PSSL which is allegedly similar to D3D11's HLSL.
They're definitely not copy-pasta, but going with Vulkan could mean easier ports to Android, iOS, OS X, etc. in the future.
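To make that distinction concrete, here's a purely illustrative C++ sketch. These are toy types invented for this post, not the real GNM/GNMX or Vulkan/OpenGL APIs: the first interface executes each call immediately (the D3D11/OpenGL/GNMX style), while the second records work into an explicit command buffer and submits it in one batch (the D3D12/Vulkan/GNM style).

```cpp
#include <functional>
#include <iostream>
#include <vector>

// Toy "GNMX/OpenGL-like" interface: each call executes right away,
// with the runtime managing state behind the scenes.
struct ImmediateRenderer {
    void draw(const char* mesh) { std::cout << "draw " << mesh << "\n"; }
};

// Toy "GNM/Vulkan-like" interface: work is recorded up front...
struct CommandBuffer {
    std::vector<std::function<void()>> commands;
    void recordDraw(const char* mesh) {
        commands.push_back([mesh] { std::cout << "draw " << mesh << "\n"; });
    }
};

// ...and submitted to a queue explicitly, as one batch.
struct Queue {
    void submit(const CommandBuffer& cb) {
        for (const auto& cmd : cb.commands) cmd();  // replay recorded work
    }
};

int main() {
    ImmediateRenderer imm;
    imm.draw("android");        // immediate style: executes on the spot

    CommandBuffer cb;           // explicit style: record first...
    cb.recordDraw("android");
    cb.recordDraw("city");
    Queue{}.submit(cb);         // ...then submit the whole batch
}
```

The explicit model shifts responsibility for batching and synchronization from the driver to the application, which is exactly why GNM and Vulkan are grouped together above.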
Seeing how PS4 is an 8c/8t CPU, 8t is probably a minimum requirement.
How The Crew was ported to PlayStation 4
I'll cite The Witcher 2 as an example of porting from Xbox 360/PC to PS3 being a disaster.
Code that's designed to execute in parallel can create terrible locking scenarios when stacked (see the sketch after this post). It can then, in turn, be too costly to fix, because it would require an entire paradigm shift in the engine. At that point, you might as well forget about porting the game and focus on changing it for the next game.
Xbox and Xbox One are the only consoles that ever really shared software similarities with PC. The rest might as well be from another planet.
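A minimal C++ sketch of the locking hazard mentioned above, using hypothetical subsystem names (nothing here is taken from any actual game engine): two subsystems that each grab the same pair of mutexes in opposite order can deadlock when run concurrently, and the standard fix is to acquire both locks atomically.

```cpp
#include <iostream>
#include <mutex>
#include <thread>

std::mutex physics, audio;

// Hazard: each subsystem locks the same pair of mutexes in opposite
// order. Run concurrently, thread A can hold `physics` while thread B
// holds `audio`, and each then waits forever on the other.
void subsystemA_bad() {
    std::lock_guard<std::mutex> a(physics);
    std::lock_guard<std::mutex> b(audio);   // may deadlock against B
}
void subsystemB_bad() {
    std::lock_guard<std::mutex> a(audio);
    std::lock_guard<std::mutex> b(physics); // may deadlock against A
}

// Fix: std::scoped_lock (C++17) acquires both mutexes with a
// deadlock-avoidance algorithm, regardless of the order listed.
void subsystem_safe(const char* name) {
    std::scoped_lock both(physics, audio);
    std::cout << name << " holds both locks\n";
}

int main() {
    // Only the safe version is run here; the _bad pair is left as an
    // illustration, since actually racing them can hang the program.
    std::thread t1(subsystem_safe, "A");
    std::thread t2(subsystem_safe, "B");
    t1.join();
    t2.join();
}
```

Untangling this after the fact is exactly the "paradigm shift in the engine" problem: the fix requires agreeing on a global lock-acquisition discipline, not patching one call site.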
"An NVIDIA GTX 660 graphics card is set as the minimum graphical requirement, alongside an i5-2400 and a measly 4 GB of system RAM. "
Also, comparing this console gen to the PS3/X360 is... well. Like comparing planes and cars, yes they both go from A to B...
And there are two distinct things here: one is parallelism and one is concurrency. You don't usually get locking scenarios through parallelism, because that paradigm comes with fundamental properties and guarantees about the type of computation you do, which ensure its scalability and hazard-free behavior.
E.g.: execute the same sequence of code 100 times for 100 different pieces of data simultaneously, and write back the answers in 100 different places. Here you exploit the notion of parallelism, and there is no way your threads can become interlocked.
And then you have concurrency. E.g.: execute 100 different sequences of code, which may or may not share data, simultaneously (to whatever extent that is possible).
Parallelism is usually easy and risk-free; concurrency isn't. Parallelism requires multiple execution units (or cores); concurrency doesn't. What you described are the problems that come with concurrency, and those aren't necessarily linked to hardware (number of threads, cores, etc.). It basically all comes down to your ability as a programmer to manage your code.
So, to wrap it all up: no, there is no fundamental reason why something that was written and run on an 8c/8t machine couldn't run just fine on a 4c/4t machine, unless of course that machine is simply not fast enough for your purposes (which may very well be the case). The sketch below illustrates the distinction.
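A minimal C++ sketch of the two ideas, assuming nothing beyond the standard library: the parallel half writes each result to its own slot and needs no locks, while the concurrent half shares one accumulator and needs explicit coordination.

```cpp
#include <iostream>
#include <mutex>
#include <thread>
#include <vector>

int main() {
    // Parallelism: the same code runs over disjoint slices of data,
    // each thread writing only to its own output slots. No shared
    // mutable state, so no locks and no possibility of interlocking.
    std::vector<int> data(100, 2), out(100);
    std::vector<std::thread> workers;
    for (int t = 0; t < 4; ++t) {
        workers.emplace_back([&, t] {
            for (size_t i = t; i < data.size(); i += 4)
                out[i] = data[i] * data[i];  // distinct index per write
        });
    }
    for (auto& w : workers) w.join();

    // Concurrency: different tasks touch the *same* data, so
    // correctness now depends on the programmer coordinating access
    // (here, via a mutex around the shared accumulator).
    int shared_total = 0;
    std::mutex m;
    auto add = [&](int n) {
        std::lock_guard<std::mutex> guard(m);
        shared_total += n;
    };
    std::thread c1(add, out[0]), c2(add, out[99]);
    c1.join();
    c2.join();

    std::cout << "total: " << shared_total << "\n";
}
```

Note that the parallel section would behave identically on a 4c/4t or an 8c/8t machine, just faster or slower, which is the point being made above.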