Activation Function

    Mathematical functions used in neural networks to determine whether, and how strongly, a neuron is activated. They introduce non-linearity into the network, enabling it to learn complex patterns; without them, a stack of layers would collapse into a single linear transformation. Common activation functions include ReLU, sigmoid, and tanh. The choice of activation function significantly affects a neural network's learning capability and performance.
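
    The three activation functions named above can be sketched directly from their standard definitions; the function names here are illustrative, not from the entry itself:

```python
import math

def relu(x):
    # ReLU: zero for negative inputs, identity for non-negative ones
    return max(0.0, x)

def sigmoid(x):
    # Sigmoid: squashes any real input into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Tanh: squashes any real input into the open interval (-1, 1)
    return math.tanh(x)

# Compare how each function treats negative, zero, and positive inputs
for x in (-2.0, 0.0, 2.0):
    print(f"x={x:+.1f}  relu={relu(x):.4f}  "
          f"sigmoid={sigmoid(x):.4f}  tanh={tanh(x):.4f}")
```

    Note how ReLU discards negative inputs entirely, while sigmoid and tanh compress them smoothly; this difference in behavior is one reason the choice of activation matters in practice.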

    Wikipedia