
Leaky ReLU
Leaky ReLU, or Leaky Rectified Linear Unit, is an activation function used in artificial neural networks. Like other activation functions, it determines how much of a neuron's input is passed along as data moves through the network. Unlike the standard ReLU, which outputs zero for all negative inputs, Leaky ReLU returns the input unchanged when it is positive and scales it by a small slope (commonly 0.01) when it is negative. This small "leak" helps avoid the "dying ReLU" problem, in which neurons that only ever receive negative inputs output zero, get zero gradient, and stop learning. By keeping a small gradient flowing for negative inputs, Leaky ReLU keeps neurons active and can improve learning in tasks such as image recognition and natural language processing.
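
To make the behavior concrete, here is a minimal sketch of Leaky ReLU in NumPy. The function name leaky_relu and the slope parameter alpha are illustrative choices (alpha defaults to the commonly used 0.01); deep learning libraries typically provide a built-in version of this activation.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU: pass positive inputs through unchanged,
    scale negative inputs by a small slope alpha instead of zeroing them."""
    # alpha=0.01 is a common default; it is a tunable hyperparameter.
    return np.where(x > 0, x, alpha * x)

# Example: negative inputs yield small non-zero outputs, so the
# corresponding neurons still receive a gradient during training.
x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu(x))  # [-0.02  -0.005  0.     1.     3.   ]
```

Compared with standard ReLU, the only change is that negative inputs are multiplied by alpha rather than clamped to zero, which is what keeps the gradient from vanishing for those inputs.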