Spectral Entropy

Spectral entropy quantifies the complexity or randomness of a signal's frequency content. Concretely, it is the Shannon entropy of the power spectrum treated as a probability distribution: the power spectral density is normalized so its bins sum to one, and the entropy of that distribution measures how evenly energy is spread across frequencies. Low spectral entropy indicates energy concentrated in a few frequencies (more predictable), while high spectral entropy means energy is spread broadly (more random). For example, a pure tone has low spectral entropy, whereas white noise has high spectral entropy. This metric helps analyze signals like EEGs or audio recordings, providing insight into their underlying structure and variability.
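The pure-tone versus white-noise contrast above can be sketched in code. This is a minimal illustration, not a definitive implementation: it computes the Shannon entropy of the FFT-based power spectrum, normalized to [0, 1] by dividing by the maximum possible entropy (log of the number of frequency bins). The function name and the normalization choice are assumptions for this sketch.

```python
import numpy as np

def spectral_entropy(signal):
    """Normalized spectral entropy of a real-valued signal (sketch).

    Computes the Shannon entropy of the power spectrum treated as a
    probability distribution, scaled to [0, 1].
    """
    # Power spectrum: squared magnitude of the real FFT
    psd = np.abs(np.fft.rfft(signal)) ** 2
    # Normalize so the bins form a probability distribution
    p = psd / psd.sum()
    # Drop zero bins to avoid log(0)
    p = p[p > 0]
    # Shannon entropy, normalized by the maximum log2(number of bins)
    return -np.sum(p * np.log2(p)) / np.log2(len(psd))

# A pure tone concentrates energy in one bin -> entropy near 0;
# white noise spreads energy across bins -> entropy near 1.
fs = 1000                                  # sampling rate (Hz), illustrative
t = np.arange(fs) / fs                     # 1 second of samples
tone = np.sin(2 * np.pi * 50 * t)          # 50 Hz sine
noise = np.random.default_rng(0).standard_normal(fs)

print(spectral_entropy(tone))   # low  (energy in a single bin)
print(spectral_entropy(noise))  # high (energy spread across bins)
```

Because the 50 Hz tone completes an integer number of cycles in the window, its energy falls almost entirely into one FFT bin, so its entropy is close to zero; the noise spectrum fluctuates around a flat level, so its entropy is close to one.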