How to write a ReLU function and its derivative in Python?

Hi musketeers,

Here I want to discuss activation functions in neural networks. In general, there are numerous articles on activation functions.

Here I will discuss everything about activation functions: their derivatives, Python code, and when to use each one.

This article will cover ….





Function Equations and their Derivatives

Types of Activation function:

  1. Sigmoid
  2. Tanh (Hyperbolic Tangent)
  3. ReLU Activation Function (Rectified Linear Unit)

Now we will look at each of these.

1) Sigmoid:

It is also called the logistic activation function.

f(x)=1/(1+exp(-x)); the function's range is (0,1).

Derivative of sigmoid:

Just apply the simple u/v (quotient) rule, i.e. d(u/v)=(v*du-u*dv)/v², here with u=1 and v=1+exp(-x):

df(x)=[(1+exp(-x))*d(1)-d(1+exp(-x))*1]/(1+exp(-x))²

d(1)=0,

d(1+exp(-x))=d(1)+d(exp(-x))=-exp(-x), so

df(x)=exp(-x)/(1+exp(-x))²

df(x)=[1/(1+exp(-x))]*[1-(1/(1+exp(-x)))]

df(x)=f(x)*(1-f(x))
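The identity df(x)=f(x)*(1-f(x)) can also be sanity-checked numerically. A quick sketch (my own check, not part of the original post) compares it against a central-difference approximation of the derivative:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

x = np.linspace(-5, 5, 101)

# Central-difference approximation of the derivative
h = 1e-5
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)

# Closed-form derivative from the derivation above
analytic = sigmoid(x) * (1 - sigmoid(x))

# The two agree to within numerical error
print(np.max(np.abs(numeric - analytic)))
```

If the derivation is right, the printed maximum difference is tiny (on the order of floating-point error).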

Python Code:

import matplotlib.pyplot as plt
import numpy as np

def sigmoid(x):
    # Returns the sigmoid value and its derivative
    s = 1 / (1 + np.exp(-x))
    ds = s * (1 - s)
    return s, ds

x = np.arange(-6, 6, 0.01)
s, ds = sigmoid(x)

# Setup centered axes
fig, ax = plt.subplots(figsize=(9, 5))
ax.spines['left'].set_position('center')
ax.spines['right'].set_color('none')
ax.spines['top'].set_color('none')
ax.xaxis.set_ticks_position('bottom')
ax.yaxis.set_ticks_position('left')

# Create and show plot
ax.plot(x, s, color="#307EC7", linewidth=3, label="sigmoid")
ax.plot(x, ds, color="#9621E2", linewidth=3, label="derivative")
ax.legend(loc="upper right", frameon=False)
plt.show()
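To answer the question in the title, ReLU and its derivative can be written the same way. A minimal sketch mirroring the sigmoid helper above (the function name relu and the convention that the derivative is 0 at x = 0 are my own choices):

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x)
    r = np.maximum(0, x)
    # Derivative: 1 where x > 0, else 0 (we pick 0 at x = 0,
    # where ReLU is not differentiable)
    dr = (x > 0).astype(float)
    return r, dr

x = np.arange(-6, 6, 0.01)
r, dr = relu(x)
```

Both curves can then be plotted exactly as in the sigmoid example above.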
