Data compression exploits statistical redundancy and predictable patterns; how far a given source can be compressed is fundamentally limited by its entropy.
Low entropy, high predictability: a string such as AAAAAABBBBCCCC contains exploitable patterns.
High entropy, maximum randomness: a string such as 3F7A92E1D5B8... contains no exploitable patterns.
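The contrast can be made concrete by estimating the per-symbol Shannon entropy, H = -Σ p(x) log2 p(x), of each example string. The sketch below is illustrative only; empirical entropy on such short samples is a rough estimate, and the helper name `entropy_per_symbol` is our own.

```python
# Estimate Shannon entropy per symbol from observed character frequencies.
from collections import Counter
from math import log2

def entropy_per_symbol(s: str) -> float:
    counts = Counter(s)
    n = len(s)
    # H = -sum over symbols of p * log2(p)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(entropy_per_symbol("AAAAAABBBBCCCC"))  # ~1.56 bits/symbol: patterns to exploit
print(entropy_per_symbol("3F7A92E1D5B8"))    # ~3.58 bits/symbol: near the 4-bit hex maximum
```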
Shannon's Source Coding Theorem: no lossless compression scheme can, on average, encode data in fewer bits than its entropy. For truly random data drawn from a uniform distribution, the entropy equals the bit length of the data itself, so no compression is possible on average.
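A quick empirical illustration (a sketch, not a proof of the theorem): a general-purpose compressor shrinks a repetitive byte stream dramatically but cannot shrink uniformly random bytes; the exact sizes vary, so the printed numbers below are only indicative.

```python
# Compare zlib compression of low-entropy (repetitive) vs high-entropy (random) data.
import os
import zlib

patterned = b"AAAAAABBBBCCCC" * 10_000     # low-entropy, highly repetitive stream
random_data = os.urandom(len(patterned))   # high-entropy, uniformly random bytes

for name, data in [("patterned", patterned), ("random", random_data)]:
    compressed = zlib.compress(data, 9)
    print(f"{name}: {len(data)} -> {len(compressed)} bytes")
# Typically the patterned stream shrinks to a small fraction of its size,
# while the random stream stays the same size or grows slightly (format overhead),
# consistent with the entropy limit.
```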
Kolmogorov Complexity: the algorithmic information content of data is the length of the shortest program that produces it. Random data has maximal complexity; no program meaningfully shorter than the data itself can describe it.
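The intuition can be sketched in a few lines: a long but regular string is completely described by a tiny generating program, so its algorithmic information content is small, whereas a random string of the same length admits no such short description. The example below is hypothetical and only meant to make the idea tangible.

```python
# A million-character string described by a one-line program:
# its Kolmogorov complexity is bounded by the length of that program,
# not by the length of the output.
def generate() -> str:
    return "AB" * 500_000

s = generate()
print(len(s))                       # 1000000 characters of output
print(len('"AB" * 500_000'))        # ...described by a ~15-character expression
```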