If it’s constantly reading the same data, that data ends up in cache, which is significantly faster than reading from the actual drive. Since the graph shows average latency and cache hits are very fast, they pull the average down.
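A rough sketch of the math (numbers made up, just to illustrate): the reported average is a blend of cache-served reads and drive reads, weighted by the hit rate.

# Hypothetical latencies; real values depend on the hardware.
cache_latency_us = 50      # assumed latency for a read served from cache
drive_latency_us = 5000    # assumed latency for a read that hits the drive

for hit_rate in (0.0, 0.5, 0.9, 0.99):
    # Average latency is the hit-rate-weighted mix of the two paths.
    avg = hit_rate * cache_latency_us + (1 - hit_rate) * drive_latency_us
    print(f"hit rate {hit_rate:.0%}: average latency {avg:.0f} us")

Even a modest hit rate drags the average well below the raw drive latency, which is why a workload that keeps re-reading the same data looks so fast in that graph.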
IIRC, it’s ancient, which may sound bad, but it means most implementations have matured and won’t cause issues.