Shannon's Entropy
Digital Print
2023
Claude Shannon (1916--2001), rendered using ones and zeros with increasing entropy from top to bottom.
Background and Inspiration
In 1948, Shannon adapted the physical concept of entropy to measure the information contained in sequences of bits (1s and 0s), leading to a theory of how much information a bit sequence can communicate.
One can think of entropy as measuring the uncertainty of a sequence. For example, a coin flip has two outcomes (heads or tails). If the coin is biased so that it always lands heads (or always tails), a sequence of flips carries no information, and its entropy is 0. A truly random (fair) coin produces a flip sequence with an entropy of 1 bit per flip.
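As a worked version of this example (with $p$ denoting the probability of heads, a symbol not used above), Shannon's binary entropy function recovers both endpoints:
\[
  H(p) = -p \log_2 p \;-\; (1-p)\log_2(1-p),
  \qquad H(0) = H(1) = 0, \qquad H(\tfrac{1}{2}) = 1,
\]
using the convention $0 \log_2 0 = 0$. Every bias strictly between these extremes yields an entropy strictly between 0 and 1 bit per flip.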