
Rectified Linear Activation Function (ReLU)

The rectified linear activation function (ReLU) is a piecewise linear function: if the input is positive, say x, the output is x; otherwise, it outputs zero. The mathematical representation of the ReLU activation function is

f(x) = max(0, x)

The coding logic for the ReLU function is simple,

if input_value > 0:
    return input_value
else:
    return 0

A simple python function to mimic a ReLU function is as follows,

import numpy as np

def ReLU(x):
    # return x for positive inputs and 0 otherwise, element-wise
    data = [max(0, value) for value in x]
    return np.array(data, dtype=float)

The derivative of ReLU is

f'(x) = 1 if x > 0, else 0

A simple python function to mimic the derivative of the ReLU function is as follows,

def der_ReLU(x):
    # gradient is 1 for positive inputs and 0 otherwise
    data = [1 if value > 0 else 0 for value in x]
    return np.array(data, dtype=float)

ReLU is used extensively at present, but it has a problem: whenever the input is lower than 0, the output is zero and so is the gradient, so the neural network cannot continue the backpropagation algorithm through that unit. This problem is commonly known as Dying ReLU. To get rid of this problem we use a modified version of ReLU, called Leaky ReLU.
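The post points to Leaky ReLU as the usual fix for Dying ReLU but does not show it here, so below is a minimal sketch in the same NumPy style. The function names leaky_ReLU and der_leaky_ReLU and the negative slope alpha=0.01 are illustrative choices, not taken from the original post.

import numpy as np

def leaky_ReLU(x, alpha=0.01):
    # positive inputs pass through unchanged; negative inputs are
    # scaled by a small slope alpha instead of being set to zero
    data = [value if value > 0 else alpha * value for value in x]
    return np.array(data, dtype=float)

def der_leaky_ReLU(x, alpha=0.01):
    # derivative is 1 for positive inputs and alpha for the rest
    data = [1 if value > 0 else alpha for value in x]
    return np.array(data, dtype=float)

# example: negative inputs give small non-zero outputs and gradients
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(leaky_ReLU(x))
print(der_leaky_ReLU(x))

Because the negative side keeps a small non-zero slope, the gradient never becomes exactly zero, so units whose inputs stay negative can still be updated during backpropagation, which is precisely what plain ReLU prevents in the Dying ReLU case.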