ReLU Activation Function (Rectified Linear Unit)
The derivative of ReLU is,

f'(x) = 1 if x > 0, and 0 otherwise
A simple Python function to mimic the derivative of the ReLU function is as follows,
import numpy as np

def der_ReLU(x):
    # Derivative is 1 for positive inputs, 0 otherwise
    data = [1 if value > 0 else 0 for value in x]
    return np.array(data, dtype=float)
ReLU is used extensively nowadays, but it has a problem: if the input is less than 0, the output is zero and so is the gradient, so the affected neurons stop updating and backpropagation can no longer flow through them. This problem is commonly known as the Dying ReLU problem. To get rid of it, we use a modified version of ReLU called Leaky ReLU, sketched below.
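As an illustration in the same NumPy style as the functions in this post (not part of the original code), a minimal Leaky ReLU sketch might look like the following; the slope of 0.01 for negative inputs is an assumed, commonly used default:

def Leaky_ReLU(x, alpha=0.01):
    # Pass positive values through; scale negative values by a small slope
    # instead of zeroing them out, so the gradient never becomes exactly 0
    data = [value if value > 0 else alpha * value for value in x]
    return np.array(data, dtype=float)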
The mathematical representation of the ReLU Activation Function is,

f(x) = max(0, x)
The coding logic for the ReLU function is simple,
if input_value > 0:
    return input_value
else:
    return 0
A simple Python function to mimic the ReLU function is as follows,
def ReLU(x):
    # Apply max(0, value) element-wise over the input
    data = [max(0, value) for value in x]
    return np.array(data, dtype=float)
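As a quick check (a hypothetical usage example, not from the original post), calling these functions on a small sample array shows the expected behaviour:

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(ReLU(x))      # negative entries are zeroed out, positives pass through
print(der_ReLU(x))  # 0 for non-positive inputs, 1 for positive inputs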