
ReLU Activation

Mathematical Rectification

Function Analysis


f(x) = max(0, x)
Linear for x > 0: positive inputs are preserved.
Zero for x ≤ 0: negative inputs are nullified (rectified).
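The piecewise definition above can be sketched directly; a minimal implementation using NumPy's element-wise maximum (the function name `relu` is our own choice, not from the source):

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: max(0, x), applied element-wise."""
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # negatives are nullified, positives pass through unchanged
```

Because `np.maximum` broadcasts, the same function works unchanged on scalars, vectors, or whole activation tensors.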

Simulation Control

[Interactive simulation: toggle between RAW INPUT and RECTIFIED views. An example run reports a sparsity (fraction of zeros) of 52% and a mean activation of 0.45.]
Scientific Note:
ReLU mitigates the vanishing gradient problem: for positive inputs the gradient is a constant 1, so error signals pass through active units undiminished, enabling deep networks to train faster.
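The constant-gradient property can be checked numerically; a minimal sketch of the ReLU derivative, taking the subgradient 0 at x = 0 (a common convention):

```python
import numpy as np

def relu_grad(x):
    """Derivative of ReLU: 1 where x > 0, 0 where x <= 0."""
    return (x > 0).astype(float)

x = np.array([-1.0, 0.5, 2.0])
print(relu_grad(x))  # gradient is exactly 1 wherever the unit is active
```

Unlike sigmoid or tanh, whose derivatives shrink toward 0 for large |x|, this gradient never decays for active units, which is why stacked ReLU layers propagate error signals without attenuation.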