Understanding Entropy and Unknown Information in Systems
Entropy can be understood as the amount of unknown information in a system: the gap between what is known about it (its macroscopic description) and its full, exact microscopic state. As more information about the system is acquired, the set of microscopic states consistent with that knowledge shrinks, and the entropy correspondingly decreases.
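As a minimal sketch of this idea, Shannon entropy quantifies "unknown information" as the average number of bits needed to pin down the exact state; the function name and the example distributions below are illustrative choices, not from the original text.

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: average information needed to identify the exact state."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equally likely microstates: 2 bits of unknown information.
full = shannon_entropy([0.25] * 4)

# After learning one bit (say, which half of the state space we are in),
# two equally likely states remain: 1 bit still unknown.
partial = shannon_entropy([0.5, 0.5])

print(full, partial)  # 2.0 1.0
```

Each bit of acquired information halves the number of consistent microstates, reducing the entropy by exactly one bit.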