Activation Function
Data / Training / Evaluation
Introduces non-linearity into neural networks.
Activation functions allow neural networks to model complex, non-linear relationships. Without them, a stack of layers would collapse into a single linear transformation, no matter how deep the network is.
- Examples: ReLU, Sigmoid, Tanh, Softmax (see the sketch after this list).
- Impact: Affects training stability and representational power.
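To make the listed functions concrete, here is a minimal NumPy sketch of each one; the function names and example scores are illustrative only, not tied to any particular framework.

```python
import numpy as np

def relu(x):
    # ReLU: zeroes out negative inputs, passes positives through unchanged
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid: squashes inputs into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Tanh: squashes inputs into the range (-1, 1), zero-centered
    return np.tanh(x)

def softmax(x):
    # Softmax: turns a vector of scores into a probability distribution
    # (subtracting the max first improves numerical stability)
    shifted = x - np.max(x)
    exp = np.exp(shifted)
    return exp / np.sum(exp)

# Example: applying each activation to the same raw scores
scores = np.array([-2.0, 0.0, 3.0])
print(relu(scores))     # [0. 0. 3.]
print(sigmoid(scores))  # each value between 0 and 1
print(tanh(scores))     # each value between -1 and 1
print(softmax(scores))  # non-negative values that sum to 1
```

Each of these applies element-wise except softmax, which normalizes across the whole vector and is therefore typically reserved for a network's output layer.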