Posts

Showing posts from January, 2022

Namespaces in Python

In Python, a namespace is a mapping that assigns a unique name to each object; variables, functions, and classes are all objects. Python implements namespaces as dictionaries. A directory-file system on a computer is a useful analogy: a file with the same name may exist in numerous folders, but supplying the absolute path of the file identifies exactly one of them. A namespace serves much the same purpose as a surname in the real world. There may be more than one "Alice" in a class, but when you specifically ask for "Alice Lee" or "Alice Clark" (with a surname), there is only one (assume for now that no two students share both first name and surname). In the same way, the Python interpreter uses the namespace to determine which particular function or variable a name refers to.
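The surname analogy can be sketched in a few lines: the same name can live in several namespaces (here, the module's global namespace, a function's local namespace, and a class namespace) without clashing, just like same-named files in different folders.

```python
x = "global"  # lives in the module's global namespace

def outer():
    x = "local"  # shadows the global x inside outer's local namespace
    # globals() exposes the module namespace as a plain dictionary
    return x, globals()["x"]

class Registry:
    x = "class attribute"  # lives in the class's own namespace

print(outer())           # ('local', 'global')
print(Registry.x)        # class attribute
print("x" in globals())  # True -- namespaces really are dicts of names
```

Each `x` is resolved against its own namespace, so all three coexist without ambiguity.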

Implementation Stage Of Tanh Activation Function

Tanh helps fix the non-zero-centered output problem of the sigmoid function. Tanh squashes a real-valued number into the range (-1, 1), and it is non-linear too. Its derivative has nearly the same shape as the sigmoid's derivative. Tanh removes sigmoid's zero-centering drawback, but it still cannot fully eliminate the vanishing gradient problem. Comparing the tanh activation function with sigmoid side by side makes this clear.

    import numpy as np

    # tanh activation function
    def tanh(z):
        return (np.exp(z) - np.exp(-z)) / (np.exp(z) + np.exp(-z))

    # derivative of the tanh activation function
    def tanh_prime(z):
        return 1 - np.power(tanh(z), 2)

For more, visit InsideAIML.
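As a quick sanity check of the functions above (a sketch, assuming NumPy is installed): tanh(0) = 0, large inputs saturate toward ±1, and the derivative collapses toward 0 for large |z|, which is exactly the vanishing-gradient behaviour mentioned.

```python
import numpy as np

def tanh(z):
    return (np.exp(z) - np.exp(-z)) / (np.exp(z) + np.exp(-z))

def tanh_prime(z):
    return 1 - np.power(tanh(z), 2)

z = np.array([-5.0, 0.0, 5.0])
print(tanh(z))           # roughly [-0.9999, 0.0, 0.9999] -- squashed into (-1, 1)
print(tanh_prime(0.0))   # 1.0 -- the gradient is largest at zero
print(tanh_prime(5.0))   # ~0.00018 -- the gradient vanishes in the saturated region
```

Because the output is centered at 0 (unlike sigmoid, centered at 0.5), successive layers receive inputs with mixed signs, which generally speeds up training.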

Sigmoid Function (Activation Function in Neural Networks)

A sigmoid activation function is a mathematical function with a characteristic S-shaped curve. There are a number of common sigmoid functions, such as the logistic function, the hyperbolic tangent, and the arctangent; the term "sigmoid function" is commonly used to refer specifically to the logistic function, also called the logistic sigmoid function. All sigmoid functions share the property that they map the entire real number line into a small range such as between 0 and 1, or -1 and 1, so one use of a sigmoid function is to convert a real value into one that can be interpreted as a probability. Sigmoid functions have become popular in deep learning because they can be used as activation functions in artificial neural networks; they were inspired by the activation potential in biological neural networks. Sigmoid functions are also useful for numerous machine learning applications where a real number needs to be compressed into a bounded range.
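A minimal sketch of the logistic sigmoid described above (assuming NumPy, in the same style as the tanh snippet), showing how it squashes any real input into the open interval (0, 1):

```python
import numpy as np

# Logistic sigmoid: maps any real number into the open interval (0, 1),
# so the output can be interpreted as a probability.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Its derivative, sigmoid(z) * (1 - sigmoid(z)), peaks at 0.25 when z = 0
# and vanishes for large |z| -- the source of the vanishing-gradient problem.
def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1 - s)

z = np.array([-10.0, 0.0, 10.0])
print(sigmoid(z))          # roughly [0.000045, 0.5, 0.999955]
print(sigmoid_prime(0.0))  # 0.25
```

Note that sigmoid(0) = 0.5, so its outputs are centered at 0.5 rather than 0, which is the zero-centering drawback that tanh addresses.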