Thursday 6 August 2020

Activation Functions and Their Types


Activation Functions

Topics to be covered:

In this topic we are going to get a brief idea about:
            1.1 What is an Activation Function?
            1.2 Different Types of Activation Functions

1.1 What is an Activation Function?

        An Activation Function decides whether a Neuron should be activated or not by calculating the weighted sum of its inputs and adding a bias to it. The purpose of an Activation Function is to introduce non-linearity into the output of a Neuron.
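
        As a quick illustration, the following is a minimal sketch in Python (with NumPy, using made-up inputs, weights, and bias of my own) of a Neuron computing its weighted sum plus bias and then deciding whether to activate:

        import numpy as np

        # Hypothetical inputs, weights, and bias for a single neuron
        x = np.array([0.5, -1.2, 3.0])   # inputs
        w = np.array([0.8, 0.1, -0.4])   # weights
        b = 0.2                          # bias

        z = np.dot(w, x) + b             # weighted sum plus bias
        activated = 1 if z > 0 else 0    # a simple threshold decides activation
        print(z, activated)              # about -0.72, so not activated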

1.2 Different Types of Activation Functions

        The following are the different types of activation functions, which apply a non-linear transformation to the input, making the network capable of learning and performing more complex tasks.
        • Linear Function
        • Step-Function
        • Sigmoid Function 
        • Tangent Hyperbolic Function
        • Arc Tan Function
        • ReLU Function
        • Leaky ReLU Function

Linear Function:

        The Linear Function is also known as the identity function. Linear Functions are those whose graph is a straight line. A Linear Function has the following form:

                                                                    f(x)=x

Fig: Linear Function
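
        A minimal sketch of the Linear (identity) Function in Python (the function name is my own):

        def linear(x):
            # Identity activation: the output equals the input
            return x

        print(linear(-2.0), linear(3.5))   # -2.0 3.5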

Step Function:

        The Step Function is also called the Heaviside Function. A binary step function is a threshold-based activation function: if the input value is above a certain threshold, the neuron is activated; otherwise it is not.

Fig: Step Function
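
        A minimal sketch of the Step Function in Python, assuming a threshold of 0 (the name and default threshold are my own choices):

        def step(x, threshold=0.0):
            # Outputs 1 when the input reaches the threshold, 0 otherwise
            return 1 if x >= threshold else 0

        print(step(-0.5), step(0.7))   # 0 1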

Sigmoid Function:

        The Sigmoid Function is also called the Logistic Function. It is a continuous function that varies gradually between the values 0 and 1: whatever the input value may be, the output will lie between 0 and 1 only. This is very useful in neural networks. The Sigmoid Function has the following form:

                                                     f(x)=1/(1+e^(-x))

Fig: Sigmoid Function
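
        A minimal sketch of the Sigmoid Function in Python (the function name is my own):

        import math

        def sigmoid(x):
            # Squashes any real input into the open interval (0, 1)
            return 1.0 / (1.0 + math.exp(-x))

        print(sigmoid(-4), sigmoid(0), sigmoid(4))   # ~0.018 0.5 ~0.982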

Tangent Hyperbolic Function:

        The Tangent Hyperbolic Function is similar to the sigmoid function, but it gradually varies from -1 to +1. It is one of the important functions for Neural Networks. It has the following form:

                                                             f(x)=tanh(x)

Fig: Tangent Hyperbolic Function
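
        A minimal sketch of the Tangent Hyperbolic Function in Python, using the standard library's math.tanh (the wrapper name is my own):

        import math

        def tanh_activation(x):
            # Squashes any real input into (-1, +1); zero-centred, unlike sigmoid
            return math.tanh(x)

        print(tanh_activation(-2), tanh_activation(0), tanh_activation(2))   # ~-0.964 0.0 ~0.964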

Arc Tan Function:

        The Arc Tan Function is an alternative to the sigmoid or tanh function. Here the output values will range between (-π/2, +π/2). The Arc Tan Function has the following form:

                                                 f(x)=arctan(x)=tan⁻¹(x)

Fig: Arctan Function
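
        A minimal sketch of the Arc Tan Function in Python, using the standard library's math.atan (the wrapper name is my own):

        import math

        def arctan_activation(x):
            # Output lies strictly between -pi/2 and +pi/2
            return math.atan(x)

        print(arctan_activation(-100), arctan_activation(0), arctan_activation(100))
        # ~-1.561 0.0 ~1.561, approaching -pi/2 and +pi/2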

ReLU Activation Function:

        The ReLU Activation Function is a very inexpensive function to compute. It removes the negative part of the input by outputting zero for all negative values. It is very popular in deep learning and neural networks. The following is the ReLU Activation Function:

                                                          f(x)=max(0,x)

Fig: ReLU Activation Function
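
        A minimal sketch of the ReLU Activation Function in Python (the function name is my own):

        def relu(x):
            # Passes positive inputs through unchanged and zeroes out negatives
            return max(0.0, x)

        print(relu(-3.0), relu(5.0))   # 0.0 5.0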

Leaky ReLU Activation Function:

        The Leaky ReLU Function is similar to the ReLU Activation Function, but it does not map negative inputs to 0; it just reduces their magnitude by a small factor (commonly 0.01). With that common choice, it has the following form:

                                                          f(x)=max(0.01x, x)

Fig: Leaky ReLU Activation Function
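
        A minimal sketch of the Leaky ReLU Function in Python (the function name is my own, and the slope alpha is a tunable parameter):

        def leaky_relu(x, alpha=0.01):
            # Like ReLU, but negative inputs are scaled by a small slope alpha
            # (alpha = 0.01 is a common choice, not a fixed constant)
            return x if x > 0 else alpha * x

        print(leaky_relu(-3.0), leaky_relu(5.0))   # -0.03 5.0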
