
Tanh Function

The tanh activation function is another non-linear activation function that can be used between the layers of a neural network. It shares many properties with the sigmoid activation function. Unlike the sigmoid function, which maps input values to the range (0, 1), tanh maps values to the range (-1, 1). As with the sigmoid function, one of the interesting properties of tanh is that its derivative can be expressed in terms of the function itself: d/dx tanh(x) = 1 - tanh(x)^2.

When to use which Activation Function in a Neural Network?

It depends on the type of problem and the value range of the expected output. For example, to predict values that are larger than 1, tanh and sigmoid are not suitable for the output layer; ReLU can be used instead. On the other hand, if the output values have to be in the range (0, 1) or (-1, 1), then sigmoid or tanh can be used, respectively.
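
Since the post is tagged with Python, here is a minimal NumPy sketch of the function and its derivative; the helper name tanh_derivative is illustrative, not from the original post:

```python
import numpy as np

def tanh_derivative(x):
    # Derivative written via the function itself: tanh'(x) = 1 - tanh(x)^2
    return 1.0 - np.tanh(x) ** 2

x = np.linspace(-4.0, 4.0, 9)
print(np.tanh(x))           # outputs squashed into (-1, 1)
print(tanh_derivative(x))   # equals 1 at x = 0, approaches 0 for large |x|

# Sanity check against a numerical central-difference derivative
h = 1e-6
numeric = (np.tanh(x + h) - np.tanh(x - h)) / (2 * h)
print(np.allclose(numeric, tanh_derivative(x)))  # True
```

Writing the derivative as 1 - tanh(x)^2 is convenient during backpropagation: the activation computed in the forward pass can be reused directly to compute the gradient, with no extra evaluation of the exponentials.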