
ReLU
ReLU, or Rectified Linear Unit, is an activation function used in neural networks to help models learn complex patterns. It applies a simple rule to each input value: if the value is positive, it passes through unchanged; if it is negative or zero, the output is zero. In other words, ReLU(x) = max(0, x). This introduces non-linearity into the network, which lets it model complicated relationships in data, while remaining very cheap to compute. Because only positive inputs produce a non-zero output and the gradient for positive inputs is constant, training tends to be fast and gradients flow more easily than with saturating functions such as sigmoid or tanh, which is why ReLU is a common default activation in deep learning systems.
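As a rough illustration of the rule described above, here is a minimal Python sketch of ReLU applied element-wise with NumPy; the function name relu and the example array are illustrative, not part of any particular library's API.

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: keep positive values, replace negatives (and zero) with 0.
    return np.maximum(0, x)

# Example: negative inputs are zeroed out, positive inputs pass through unchanged.
inputs = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(inputs))  # [0.  0.  0.  1.5 3. ]
```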