Different test scene, different graphics card, and they have HairWorks turned off for The Witcher 3.
Still, the titles used aren't that popular, and the fact that
- "Multi-core optimizations, overclocking, and Turbo tweaks were disabled" - Most of which are *enabled* by default, particularly turbo.
is definitely skewing the results and could easily lead a reader to believe RAM speed makes only a small difference. Some of the most popular titles show significant performance gaps; even with the 7700K it made a big difference:
https://www.eurogamer.net/articles/digitalfoundry-2017-intel-kaby-lake-core-i7-7700k-review (make sure to scroll down to the memory tests). You can even watch them run through a scene, so how on earth did TechSpot find an area where the difference between 2133 and 4000 was under 2 fps, with no linear gains across any of the speeds? Testing that many speeds without being comprehensive doesn't make sense.
Battlefield 1 is another example: RAM speed made a difference with a 6700K / GTX 1080, yet TPU's BF1 tests show zero scaling and no consistency. It even makes a difference with a (weaker) 6700K and a GTX 1080, but not with an 8700K and a GTX 1080?
https://www.gamersnexus.net/game-bench/2677-bf1-ram-benchmark-frequency-8gb-enough/page-2
But not with an 8600K and a 1080 Ti?
Literally every test I've seen with a 6700K, 7700K, or 8700K, or even i5-class CPUs, paired with a GTX 1060 or better, has shown linear performance scaling with RAM frequency. These are the only results I've found that show almost none, and in similar titles (BF1, The Witcher 3). Even weaker configurations, which you'd logically expect to need less memory bandwidth, still show linear scaling.
And since I doubt an 8700K + GTX 1080 is somehow immune to those effects, further investigation is needed. Only the Hitman results looked close to normal (though they went overboard with the number of memory speeds tested). BF1 and Witcher 3 performance *should* scale with RAM speed in a roughly linear fashion, just like other similar open-world or large-map titles.
The only logical explanation is that the setup, the lack of Turbo, or something else is affecting the memory results by a large amount, which readers should be told about, or they'll be misled into thinking 2133 vs 3000 equals 2 fps.
For the gaming benchmarks in particular, I think running with Turbo off is similar to doing CPU benchmarks at 4K ultra: it masks the difference completely, and it's probably why the 4000 MHz RAM results are all over the place. If the CPU were running at its stock Turbo clocks, I believe it would be enough to saturate DDR4 bandwidth when paired with a GTX 1080 in those titles, and we would see a much bigger difference and linear scaling as bandwidth improves with each RAM speed increment.
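To put rough numbers on the bandwidth side of that argument, here's a quick back-of-the-envelope sketch of theoretical peak dual-channel DDR4 bandwidth at the speeds in question. It assumes a standard 64-bit channel and dual-channel operation and ignores timings, so real sustained throughput will be lower; the point is the relative gap between speeds, not the absolute numbers.

```python
# Rough theoretical peak bandwidth for dual-channel DDR4.
# Peak = transfer rate (MT/s) * 8 bytes per transfer * 2 channels.
# Real sustained bandwidth is lower and timing-dependent; this only
# shows the relative headroom between the tested speeds.

BYTES_PER_TRANSFER = 8   # 64-bit memory channel
CHANNELS = 2             # typical dual-channel setup

for mts in (2133, 2666, 3000, 3600, 4000):
    peak_gbs = mts * BYTES_PER_TRANSFER * CHANNELS / 1000
    print(f"DDR4-{mts}: ~{peak_gbs:.1f} GB/s theoretical peak")
```

Going from 2133 (~34 GB/s) to 4000 (~64 GB/s) nearly doubles the theoretical bandwidth, so if the CPU really were bandwidth-limited at stock Turbo clocks, a sub-2 fps spread across every speed would be very hard to explain.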