Implementation Of The Tanh Activation Function

Tanh helps solve the non-zero-centered output problem of the sigmoid function. Tanh squashes a real-valued number to the range (-1, 1), and it is non-linear too.


Its derivative behaves much like the sigmoid's derivative.

It fixes sigmoid's non-zero-centered drawback, but it still can't fully remove the vanishing gradient problem.
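As a rough illustration of why the gradient still vanishes, here is a minimal sketch using NumPy's built-in np.tanh (this snippet is added for illustration and is not part of the original post):

import numpy as np

# Derivative of tanh: 1 - tanh(z)^2, which shrinks toward 0 as |z| grows
for z in [0.0, 2.0, 5.0, 10.0]:
    grad = 1 - np.tanh(z) ** 2
    print(f"z = {z:>5}: tanh'(z) = {grad:.8f}")

# tanh'(0) is 1, but for |z| >= 5 the gradient is already close to 0,
# so saturated units pass almost no learning signal to earlier layers.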

When we compare the tanh activation function with sigmoid, the difference is clear: tanh's output is centered around zero in (-1, 1), while sigmoid's output stays in (0, 1). The short comparison below illustrates this.
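A minimal comparison sketch, assuming NumPy; the sigmoid helper here is added only for illustration:

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

z = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
print("sigmoid:", sigmoid(z))   # outputs lie in (0, 1), all positive
print("tanh:   ", np.tanh(z))   # outputs lie in (-1, 1), centered at 0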


import numpy as np

# tanh activation function
def tanh(z):
    return (np.exp(z) - np.exp(-z)) / (np.exp(z) + np.exp(-z))

# Derivative of tanh activation function
def tanh_prime(z):
    return 1 - np.power(tanh(z), 2)
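A quick usage check of the two functions above (a sketch; the sample inputs are arbitrary):

z = np.array([-2.0, 0.0, 2.0])
print(tanh(z))        # approximately [-0.964  0.     0.964]
print(tanh_prime(z))  # approximately [ 0.071  1.     0.071]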

For more, visit InsideAIML.
