Understanding Entropy and Unknown Information in Systems
37:29 - 39:28 (01:59)

Entropy is the amount of information about a system that remains unknown: the gap between what is known about the system and its full, exact microscopic state. As more information is needed to specify the system, the dimensionality of the space describing it grows correspondingly.
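As a rough illustration of the first sentence (a sketch not taken from the clip, using standard Shannon entropy in bits): the entropy of a probability distribution over microstates measures how much information is still missing before the exact microstate is pinned down, and acquiring information shrinks that gap.

```python
import math

def missing_information_bits(probabilities):
    """Shannon entropy in bits: how much information is still
    unknown about which microstate the system is actually in."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Suppose what we know (the macrostate) is compatible with 8
# equally likely microstates: 3 bits of information are missing.
print(missing_information_bits([1/8] * 8))   # 3.0

# Acquiring information (a measurement that rules out half the
# microstates) reduces the unknown part by exactly 1 bit.
print(missing_information_bits([1/4] * 4))   # 2.0
```

When all that is known leaves W equally likely microstates, this reduces to log2(W) bits, the familiar Boltzmann counting form of entropy.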