Video - What is Information Entropy?
Entropy is a measure of the uncertainty in a random variable (a message source). Claude Shannon defined the "bit" as the unit of entropy: one bit is the uncertainty of a fair coin flip. In this video, information entropy is introduced intuitively using bounce machines and yes/no questions.
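The video itself contains no code, but as a minimal sketch of the idea, Shannon's formula H = -sum(p * log2(p)) over all outcomes can be computed directly. The function below is illustrative, not from the source; it also shows the connection to yes/no questions mentioned above.

    import math

    def entropy(probabilities):
        # Shannon entropy in bits: H = -sum(p * log2(p)), skipping outcomes with p = 0.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A fair coin flip carries exactly 1 bit of uncertainty.
    print(entropy([0.5, 0.5]))    # 1.0

    # A biased coin is more predictable, so its entropy is lower.
    print(entropy([0.9, 0.1]))    # ~0.469

    # Entropy also counts the yes/no questions needed on average:
    # identifying one of 8 equally likely outcomes takes log2(8) = 3 questions.
    print(entropy([1/8] * 8))     # 3.0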
TRANSCRIPT
Written by Melvin Draupnir on November 27, 2013.