Shannon's Entropy

Shannon's Entropy is a way to measure how much unpredictability, or surprise, is contained in a source of information. Imagine flipping a biased coin: if it lands heads most of the time, a heads outcome carries little surprise, so entropy is low. If both outcomes are equally likely, the result is maximally unpredictable and entropy is at its highest. In essence, entropy quantifies the average information, or uncertainty, per message from a source, which tells us how efficiently that data can be encoded or transmitted by capturing its inherent unpredictability.
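
To make the coin example concrete, here is a minimal sketch in Python (illustrative only; the function name shannon_entropy is a placeholder, not from the original text) that computes the standard Shannon entropy H = -Σ p·log₂(p) for a biased coin and a fair coin:

```python
import math

def shannon_entropy(probabilities):
    """Return the Shannon entropy (in bits) of a discrete distribution."""
    # H = -sum of p * log2(p) over outcomes with nonzero probability
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A heavily biased coin (heads 90% of the time): little surprise, low entropy
print(shannon_entropy([0.9, 0.1]))  # ~0.469 bits

# A fair coin: both outcomes equally likely, maximum entropy for two outcomes
print(shannon_entropy([0.5, 0.5]))  # 1.0 bits
```

The fair coin yields exactly 1 bit per flip, while the biased coin yields less than half a bit, reflecting that its outcomes are more predictable and could be encoded more compactly on average.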