Activation Functions in Brief
Introduction
The Internet gives us access to a huge amount of information today; whatever we need is just a Google search away. But with so much information available, the challenge is to separate the relevant from the irrelevant.
When our brain is fed a lot of information simultaneously, it tries hard to understand and classify it into "useful" and "not-so-useful" pieces. We need a similar mechanism for classifying incoming information as "useful" or "less useful" in the case of Neural Networks.
This matters for how a network learns because not all the information is equally useful; some of it is just noise. This is where activation functions come into the picture: they help the network use the important information and suppress the irrelevant data points.
Let us go through these activation functions, learn how they work, and figure out which activation function fits well into what kind of problem statement.
Brief Overview
An Artificial Neural Network tries to mimic similar behavior. The network shown below is a neural network made of connected neurons. Each neuron is characterized by its weight, bias, and activation function.
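As a concrete illustration, here is a minimal Python sketch of a single neuron under these definitions. The input values, weights, and bias are made-up numbers for demonstration, and sigmoid stands in for whichever activation function the layer uses:

```python
import numpy as np

def sigmoid(z):
    # Squash the pre-activation value into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def neuron(x, w, b):
    # A single artificial neuron: weighted sum of inputs plus bias,
    # passed through an activation function.
    z = np.dot(w, x) + b
    return sigmoid(z)

# Illustrative values only: three inputs, arbitrary weights and bias
x = np.array([0.5, -1.2, 3.0])
w = np.array([0.4, 0.3, -0.2])
b = 0.1
print(neuron(x, w, b))  # a value between 0 and 1
```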
Popular types of activation functions and when to use them (a short code sketch of these follows the list):
- Binary Step
- Linear
- Sigmoid
- Tanh
- ReLU
- Leaky ReLU
- Parameterised ReLU
- Exponential Linear Unit
- Swish
- Softmax
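To make the list above concrete, below is a minimal NumPy sketch of these functions using their standard definitions. Parameter defaults such as alpha=0.01 for Leaky ReLU are common illustrative choices, not values prescribed by this article:

```python
import numpy as np

def binary_step(z):
    # 1 if the input crosses the threshold 0, else 0
    return np.where(z >= 0, 1, 0)

def linear(z, a=1.0):
    # Identity up to a constant slope; rarely used in hidden layers
    return a * z

def sigmoid(z):
    # Squashes any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Like sigmoid, but zero-centered, with outputs in (-1, 1)
    return np.tanh(z)

def relu(z):
    # Passes positive values through, zeroes out negatives
    return np.maximum(0, z)

def leaky_relu(z, alpha=0.01):
    # Like ReLU, but lets a small gradient through for negatives
    return np.where(z > 0, z, alpha * z)

def parameterised_relu(z, alpha):
    # Same shape as Leaky ReLU, but alpha is learned during training
    return np.where(z > 0, z, alpha * z)

def elu(z, alpha=1.0):
    # Exponential Linear Unit: smooth saturation for negative inputs
    return np.where(z > 0, z, alpha * (np.exp(z) - 1))

def swish(z):
    # Swish: z * sigmoid(z), a smooth, non-monotonic function
    return z * sigmoid(z)

def softmax(z):
    # Converts a vector of scores into a probability distribution;
    # subtracting the max improves numerical stability
    e = np.exp(z - np.max(z))
    return e / e.sum()

z = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(z))     # [0.  0.  0.  1.5]
print(softmax(z))  # non-negative values that sum to 1
```

Softmax differs from the others in that it operates on a whole vector of outputs rather than one value at a time, which is why it is typically used in the final layer of a multi-class classifier.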
For more, visit InsideAIML and join the courses.