
ReLU (Rectified Linear Unit)
The Rectified Linear Unit (ReLU) is an activation function widely used in deep learning. Defined as f(x) = max(0, x), it passes positive inputs through unchanged and maps negative inputs to zero. This introduces the non-linearity that lets neural networks learn complex patterns, while remaining extremely cheap to compute. Compared with saturating activations such as the sigmoid or tanh, ReLU also keeps gradients from shrinking for positive inputs, which generally speeds up training. These properties have made it a default choice in modern neural network architectures.
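
As a minimal sketch (assuming NumPy is available), ReLU can be implemented and applied element-wise to an array of inputs:

import numpy as np

def relu(x):
    # Element-wise max(0, x): keep positive values, zero out negatives
    return np.maximum(0, x)

# Negative inputs become 0; positive inputs pass through unchanged
print(relu(np.array([-2.0, -0.5, 0.0, 1.0, 3.0])))  # [0. 0. 0. 1. 3.]

In practice, deep learning frameworks provide this as a built-in operation, but the underlying computation is exactly this one-line comparison, which is part of why ReLU is so inexpensive.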