IGP performance is primarily latency-sensitive and, all else being equal, bandwidth-sensitive.
So DDR5-6400 CL32 has the same absolute latency as DDR4-3200 CL16. In a like-for-like IGP test you'd expect them to perform similarly at 720p, but in games where the IGP has enough grunt to push 1080p and beyond, the DDR5 option would scale better (well, less poorly).
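As a sanity check on that like-for-like claim: absolute CAS latency in nanoseconds is just CL divided by the memory clock, and for DDR the clock in MHz is half the MT/s rating. A minimal sketch of the arithmetic in Python:

```python
def cas_latency_ns(cl: int, mts: int) -> float:
    """Absolute CAS latency in nanoseconds.

    For DDR memory the clock in MHz is half the transfer rate (MT/s),
    so one clock cycle lasts 2000 / mts nanoseconds.
    """
    return cl * 2000 / mts

# DDR4-3200 CL16 and DDR5-6400 CL32 both land on 10 ns:
print(cas_latency_ns(16, 3200))  # 10.0
print(cas_latency_ns(32, 6400))  # 10.0
```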
Given that even the UHD 750 is too slow to deliver meaningful framerates at 720p in many current AAA titles, scaling up to higher resolutions isn't even relevant, so the DDR4-vs-DDR5 discussion matters less than "Hey Intel, where the hell are the other 64 EUs in my IGP?"
In this test with a 5600G, Doom: Eternal 1080p low, speed scaling at CL16:
- 2666 - 38.9 fps
- 3200 - 41.2 fps
- 3600 - 42.9 fps
- 4000 - 45.0 fps
And latency scaling at a fixed speed (tightest to loosest):
* 42.3 fps
* 41.2 fps
* 40.9 fps
So, much more effect from speed than from latency.
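For scale, here's that comparison as rough percentages (a quick sketch using the numbers above; it assumes the second list is the same kit at a fixed speed with progressively looser timings):

```python
# Speed scaling at fixed CL16: 2666 -> 4000 MT/s is +50% bandwidth
speed_gain = (45.0 / 38.9 - 1) * 100    # ~15.7% more fps

# Latency scaling at a fixed speed: tightest vs loosest timing tested
latency_gain = (42.3 / 40.9 - 1) * 100  # ~3.4% more fps

print(f"speed: +{speed_gain:.1f}% fps, latency: +{latency_gain:.1f}% fps")
```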