Benchmarking
With the information from the previous page, we now know the weak spot of DRAM-less SSDs: random writes with relatively small block sizes.
I tested this by issuing 4K random writes to test files of varying size; the file size controls the locality of the writes and thus how much stress is placed on the flash translation layer.
The results follow an asymptotic curve almost exactly, which suggests the controller uses some kind of small, fixed-size buffer that is increasingly overwhelmed by far-apart requests as the test area grows.
For our synthetic testing on the following pages, I include two sets of data (where relevant):
- Our standard testing, which uses a test area size of 128 GB
- A smaller test area of 16 GB to show performance with more consumer-oriented workloads (without cherry-picking a best case like 4 GB or smaller, as many benchmark programs do by default).
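The methodology above can be sketched in code. This is a hypothetical, simplified helper (not the actual tool used for this review), using buffered I/O with an fsync per write; real benchmarking would use direct, unbuffered I/O via a tool such as fio to bypass the OS page cache entirely:

```python
import os
import random
import time

def random_write_bench(path, area_bytes, block=4096, writes=1000, seed=0):
    """Issue `writes` random 4K writes within the first `area_bytes` of a
    file and return the achieved write IOPS. A larger area spreads the
    writes farther apart, putting more pressure on the SSD controller's
    address-mapping cache."""
    rng = random.Random(seed)
    buf = os.urandom(block)
    blocks = area_bytes // block
    fd = os.open(path, os.O_RDWR | os.O_CREAT)
    try:
        os.ftruncate(fd, area_bytes)  # pre-size the test file
        start = time.perf_counter()
        for _ in range(writes):
            # pick a random block-aligned offset inside the test area
            os.pwrite(fd, buf, rng.randrange(blocks) * block)
            os.fsync(fd)  # force each write to reach the device
        elapsed = time.perf_counter() - start
    finally:
        os.close(fd)
    return writes / elapsed
```

Running this with `area_bytes` swept from a few GB up to 128 GB (at a fixed write count) would reproduce the shape of the curve described above: throughput falls as the test area grows, then levels off once locality no longer helps the controller at all.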