Gibbs phenomenon

Gibbs phenomenon occurs in signal processing when a sharp jump (discontinuity) in a function is approximated by a series of smooth waves (sine and cosine functions). Instead of matching the jump exactly, the partial sums of the series produce repetitive overshoots and undershoots near the discontinuity. These oscillations do not disappear as more terms are added: they become narrower, but the peak overshoot settles at roughly 9% of the jump height no matter how many terms are used. This effect highlights a fundamental limitation of modeling sudden changes with smooth waves and matters in fields such as data compression and digital signal processing.
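The behavior can be seen numerically with a minimal sketch (the function names here are illustrative, not from any particular library): summing the odd-harmonic Fourier series of a square wave and measuring the peak of the partial sum near the jump. As the number of terms grows, the peak does not approach 1 but stays near (2/π)·Si(π) ≈ 1.179, i.e. an overshoot of about 9% of the jump of size 2.

```python
import numpy as np

def square_wave_partial_sum(x, n_terms):
    """Partial Fourier sum of a square wave (amplitude +/-1, jump at x=0):
    S_N(x) = (4/pi) * sum over odd k of sin(k*x)/k."""
    s = np.zeros_like(x)
    for k in range(1, 2 * n_terms, 2):  # odd harmonics 1, 3, ..., 2*n_terms - 1
        s += (4 / np.pi) * np.sin(k * x) / k
    return s

# Sample densely just to the right of the jump at x = 0, where the
# first (largest) overshoot of each partial sum occurs.
x = np.linspace(1e-4, 0.5, 50_000)

for n in (10, 100, 1000):
    peak = square_wave_partial_sum(x, n).max()
    # The peak stays near 1.179 for every n: the overshoot narrows
    # (it moves closer to x=0) but its height does not shrink.
    print(f"{n:5d} terms: peak = {peak:.4f}, "
          f"overshoot = {(peak - 1) / 2:.2%} of the jump")
```

Running this shows the peak hovering around 1.179 for all three values of n, which is the Gibbs constant for a unit square wave; only the location of the peak moves toward the discontinuity.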