This just isn't true for a lot of use cases. If your requirement is low-latency responses over large data sets, you need to cluster data carefully and compress it to reduce I/O. That's why data warehouses use column storage and strong typing.
Compressed storage size for uniform datatypes can be phenomenally efficient. I've seen 10,000x size reduction in ideal cases like monotonically varying integers stored using double delta codec + ZSTD compression.
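To see why double delta works so well on monotonic data: for an arithmetic sequence the first differences are constant and the second differences are all zero, so a general-purpose compressor is left with a stream of near-identical bytes. A minimal sketch (using stdlib zlib as a stand-in for ZSTD, which typically does at least as well):

```python
import struct
import zlib

def double_delta(values):
    # Keep the first value and first delta, then store second-order
    # differences. For an arithmetic sequence these are all zero.
    deltas = [b - a for a, b in zip(values, values[1:])]
    dd = [b - a for a, b in zip(deltas, deltas[1:])]
    return values[:1] + deltas[:1] + dd

# Monotonically increasing integers with a constant step: the ideal case.
values = list(range(0, 3_000_000, 3))
raw = struct.pack(f"<{len(values)}q", *values)        # plain 64-bit encoding

encoded = double_delta(values)
packed = struct.pack(f"<{len(encoded)}q", *encoded)   # mostly zero bytes now
compressed = zlib.compress(packed, 9)

ratio = len(raw) / len(compressed)
print(f"{len(raw)} -> {len(compressed)} bytes ({ratio:,.0f}x)")
```

On this kind of input the ratio runs into the hundreds or thousands; real columnar engines get further gains from variable-width encoding of the residuals rather than packing them back into 64-bit words as this sketch does.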