How to write a ReLU function and its derivative in Python?
Hi everyone! In this article I want to discuss activation functions in neural networks. There are already many articles on activation functions, but here I want to cover everything about them: their derivatives, the Python code, and when to use each one.

This article will cover:

Function equations and their derivatives
Types of activation function:
Sigmoid
Tanh (Hyperbolic tangent)
ReLU (Rectified Linear Unit)

Now we will look at each of these.

1) Sigmoid:

It is also called the logistic activation function.

f(x) = 1/(1 + exp(-x))

The function's range is (0, 1).

Derivative of sigmoid: just the simple u/v (quotient) rule, i.e. (v du - u dv)/v², with u = 1 and v = 1 + exp(-x).

df(x) = [(1 + exp(-x))·d(1) - 1·d(1 + exp(-x))] / (1 + exp(-x))²
d(1) = 0 and d(1 + exp(-x)) = d(1) + d(exp(-x)) = -exp(-x)
so df(x) = exp(-x) / (1 + exp(-x))²
df(x) = [1/(1 + exp(-x))] * [exp(-x)/(1 + exp(-x))]
df(x) = [1/(1 + exp(-x))] * [1 - 1/(1 + exp(-x))]   (since exp(-x)/(1 + exp(-x)) = 1 - 1/(1 + exp(-x)))
df(x) = f(x) * (1 - f(x))

Python Code:

import matplotlib.pyplot as plt
import numpy as np

def sigmoid(x):
    s = 1 / (1 + np.exp(-x))   # sigmoid value f(x)
    ds = s * (1 - s)           # its derivative f(x)*(1 - f(x))
    return s, ds
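Since matplotlib is imported above, a natural next step is to plot the curve and its derivative. Here is a minimal sketch of how that could look, assuming the sigmoid function defined above (which returns both the value and the derivative):

x = np.linspace(-10, 10, 200)   # sample points on the x-axis
s, ds = sigmoid(x)              # sigmoid values and their derivatives

plt.plot(x, s, label="sigmoid")
plt.plot(x, ds, label="derivative")
plt.xlabel("x")
plt.legend()
plt.grid(True)
plt.show()

The sigmoid curve rises from 0 to 1, while the derivative is a small bump peaking at 0.25 when x = 0.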
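As a quick sanity check of the derivation df(x) = f(x)*(1 - f(x)) (this check is my own addition, not part of the original article), you can compare the analytic derivative against a central finite-difference approximation:

h = 1e-5
for x0 in (-2.0, 0.0, 3.0):
    s, ds = sigmoid(x0)
    # central finite difference of the sigmoid value
    numeric = (sigmoid(x0 + h)[0] - sigmoid(x0 - h)[0]) / (2 * h)
    print(x0, ds, numeric)   # ds and numeric should agree closely

Both columns should match to several decimal places, confirming the closed-form derivative.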