Activation Functions
Topics to be covered:
1.1 What is an Activation Function?
1.2 Different types of Activation Function
- Linear Function
- Step Function
- Sigmoid Function
- Tangent Hyperbolic Function
- Arc Tan Function
- ReLU Function
- Leaky ReLU Function
Linear Function:
The linear function is the identity function: the output is simply the input itself, unchanged.
f(x)=x
Fig: Linear Function
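A minimal Python sketch of the identity behaviour (NumPy and the helper name linear are illustrative assumptions, not from the text):

```python
import numpy as np

def linear(x):
    # Identity activation: the output is exactly the input.
    return x

print(linear(np.array([-2.0, 0.0, 3.0])))  # [-2.  0.  3.]
```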
Step Function:
The step function is also called the Heaviside function. A binary step function is a threshold-based activation function: if the input value is above the threshold, the neuron is activated (output 1); otherwise it is not activated (output 0). For a threshold of 0, f(x)=1 if x≥0 and f(x)=0 otherwise.
Fig: Step Function
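A minimal Python sketch of the binary step function, assuming a threshold of 0 (NumPy and the helper name step are illustrative):

```python
import numpy as np

def step(x, threshold=0.0):
    # Fire (output 1) when the input reaches the threshold, else output 0.
    return np.where(x >= threshold, 1.0, 0.0)

print(step(np.array([-2.0, 0.0, 3.0])))  # [0. 1. 1.]
```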
Sigmoid Function:
The sigmoid function is also called the logistic function. It is a smooth, continuous function whose output varies gradually between 0 and 1: whatever the input value, the output always lies in the interval (0, 1). This property makes it very useful in neural networks.
f(x)=1/(1+e^(-x))
Fig: Sigmoid Function
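A minimal NumPy sketch of the sigmoid (the helper name sigmoid is an assumption):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-5.0, 0.0, 5.0])))  # ~[0.0067 0.5 0.9933]
```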
Tangent Hyperbolic Function:
The tangent hyperbolic (tanh) function is similar to the sigmoid function, but its output varies gradually from -1 to +1, so it is centered around zero. It is one of the most widely used activation functions in neural networks.
f(x)=tanh(x)
Fig: Tangent Hyperbolic Function
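A minimal sketch using NumPy's built-in tanh (the helper name is illustrative):

```python
import numpy as np

def tanh_activation(x):
    # Zero-centered squashing of the input into (-1, +1).
    return np.tanh(x)

print(tanh_activation(np.array([-2.0, 0.0, 2.0])))  # ~[-0.964 0. 0.964]
```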
Arc Tan Function:
The arctan function is an alternative to the sigmoid or tanh functions. Its output ranges between (-π/2, +π/2). The arctan function takes the following form.
f(x)=arctan(x)=tan⁻¹(x)
Fig: Arctan Function
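A minimal sketch using NumPy's built-in arctan (the helper name is illustrative):

```python
import numpy as np

def arctan_activation(x):
    # Output is bounded to the open interval (-pi/2, +pi/2).
    return np.arctan(x)

print(arctan_activation(np.array([-10.0, 0.0, 10.0])))  # ~[-1.471 0. 1.471]
```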
ReLU Activation Function:
The ReLU activation function is a very inexpensive function to compute. It removes the negative part of the input: negative values are mapped to 0, while positive values pass through unchanged. It is the most popular activation function in deep neural networks. The ReLU activation function is defined as follows.
f(x)=max(0,x)
Fig: ReLU Activation Function
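A minimal NumPy sketch of ReLU (the helper name relu is an assumption):

```python
import numpy as np

def relu(x):
    # Negative inputs become 0; positive inputs pass through unchanged.
    return np.maximum(0.0, x)

print(relu(np.array([-3.0, 0.0, 4.0])))  # [0. 0. 4.]
```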
Leaky ReLU Activation Function:
The Leaky ReLU function is similar to the ReLU activation function, but it does not map negative inputs to 0. Instead, it reduces their magnitude by multiplying them by a small slope α (commonly 0.01): f(x)=x for x>0 and f(x)=αx for x≤0.
Fig: Leaky ReLU Activation Function
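A minimal NumPy sketch of Leaky ReLU; the slope alpha=0.01 is a common default assumed here, not fixed by the text:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Negative inputs are scaled by a small slope alpha instead of being zeroed.
    return np.where(x > 0, x, alpha * x)

print(leaky_relu(np.array([-3.0, 0.0, 4.0])))  # [-0.03 0. 4.]
```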