
Shannon entropy
Shannon entropy is a measure of the uncertainty or unpredictability in a source of information. It quantifies how much surprise or variability there is when predicting the next message or data point. For example, if a coin toss always lands the same way, there is no uncertainty, so the entropy is zero. If the coin is equally likely to land on heads or tails, the uncertainty is at its maximum and the entropy is greatest (one bit for a fair coin). In essence, Shannon entropy tells us how much information is needed on average to describe or transmit data, with higher entropy indicating more unpredictability.
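Concretely, the entropy of a set of outcome probabilities p is the sum of -p * log2(p) over all outcomes, measured in bits. As a rough illustration, the short Python sketch below applies this to the two coin examples above; the function name shannon_entropy is just illustrative, not part of any particular library.

    import math

    def shannon_entropy(probabilities):
        """Return the Shannon entropy, in bits, of a discrete distribution."""
        # Outcomes with probability 0 contribute nothing to the sum.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A coin that always lands heads: no uncertainty, 0 bits.
    print(shannon_entropy([1.0, 0.0]))   # 0.0

    # A fair coin: maximum uncertainty for two outcomes, 1 bit.
    print(shannon_entropy([0.5, 0.5]))   # 1.0

The fair coin needs one full bit per toss to describe, while the predictable coin needs none, matching the intuition in the paragraph above.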