
Rectified Linear Unit (ReLU)

The Rectified Linear Unit (ReLU) is a mathematical function widely used in artificial intelligence and machine learning, especially as an activation function in neural networks. Defined as f(x) = max(0, x), it passes positive input values through unchanged and outputs zero for any input that is zero or negative. This simple rule lets models learn complex, nonlinear patterns in data while remaining cheap to compute. Because its gradient is 1 for positive inputs, ReLU also helps mitigate the vanishing gradient problem, which speeds up the training of deep models. Overall, it is a fundamental building block in many AI applications.
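
As a minimal sketch of the definition above (assuming NumPy is available; the function name relu and the sample inputs are illustrative, not from the original text), the code below applies ReLU element-wise to an array of values:

import numpy as np

def relu(x):
    # ReLU: positive inputs pass through unchanged, everything else becomes 0.
    return np.maximum(0, x)

# Example usage: negative values are zeroed out, positive values are kept.
inputs = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(inputs))  # [0.  0.  0.  1.5 3. ]

Because the operation is just an element-wise maximum with zero, it is very cheap to evaluate, which is part of why ReLU supports fast training in practice.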