How to apply the ReLU function in the Week 3 exercise (optional)

I have implemented the code and get a value of 0.2113 instead of the asserted value of 0.2129. I want to use ReLU to observe its performance.
Kindly tell me how I can implement ReLU to get correct predictions in the Week 3 Programming Assignment.

Hi @Ashar_Javid

Could you please identify the exercise and which part of the lab you are referring to?

Neural Networks, Week 3: Programming Exercise (Planar Data Classification with One Hidden Layer), Exercise 4: Forward Propagation, calculating A2.
I want to apply ReLU instead of tanh in the hidden layer to observe its impact and behavior.

That is, using sigmoid for A2 and ReLU for computing A1.
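
Roughly, the change I have in mind looks like this. It is just a sketch: `relu` is a helper I would add myself, the structure mirrors the notebook's forward_propagation, and sigmoid is only redefined here to keep the snippet self-contained.

```python
import numpy as np

def sigmoid(z):
    # Same as the helper the notebook already provides
    return 1 / (1 + np.exp(-z))

def relu(z):
    # ReLU: element-wise max(0, z), so negative inputs become 0
    return np.maximum(0, z)

def forward_propagation_relu(X, parameters):
    # Same structure as the assignment's forward_propagation,
    # but with relu in place of np.tanh for the hidden layer.
    W1, b1 = parameters["W1"], parameters["b1"]
    W2, b2 = parameters["W2"], parameters["b2"]

    Z1 = np.dot(W1, X) + b1
    A1 = relu(Z1)              # was: A1 = np.tanh(Z1)
    Z2 = np.dot(W2, A1) + b2
    A2 = sigmoid(Z2)           # output layer stays sigmoid

    cache = {"Z1": Z1, "A1": A1, "Z2": Z2, "A2": A2}
    return A2, cache
```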

Hi @Ashar_Javid ,

The implementation instructions specify using tanh() for A1 and sigmoid() for A2.
We need to understand what these activation functions do in order to use them properly.

tanh() outputs values in the range (-1, 1), relu() outputs values in [0, ∞), and sigmoid() outputs values in (0, 1).
If we were to change the activation function for A1 from tanh() to relu(), then we would lose all the negative values.

By all means experiment: plug in relu() for A1 and see what happens.
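
Just to illustrate those ranges, here is a quick check you can run (a small sketch, nothing assignment-specific):

```python
import numpy as np

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])

print(np.tanh(z))            # tanh: values in (-1, 1), negatives preserved
print(np.maximum(0, z))      # relu: every negative becomes exactly 0
print(1 / (1 + np.exp(-z)))  # sigmoid: values in (0, 1)
```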

It’s a great idea to experiment with using ReLU as the hidden layer activation in this exercise. You always learn something interesting when you try to extend the ideas in the course. Of course you will need to change more than just the forward prop logic: the derivatives of the activation functions affect the back prop as well.
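
For example, in backward_propagation() the term that comes from the derivative of tanh has to be replaced by the derivative of ReLU, which is 1 where Z1 > 0 and 0 elsewhere. Here is a sketch that assumes the same parameters/cache dictionary layout as the notebook (the exact names are my assumption):

```python
import numpy as np

def backward_propagation_relu(parameters, cache, X, Y):
    # Same structure as the assignment's backward_propagation,
    # with the tanh derivative term swapped for the ReLU derivative.
    m = X.shape[1]
    W2 = parameters["W2"]
    A1, A2, Z1 = cache["A1"], cache["A2"], cache["Z1"]

    dZ2 = A2 - Y
    dW2 = (1 / m) * np.dot(dZ2, A1.T)
    db2 = (1 / m) * np.sum(dZ2, axis=1, keepdims=True)

    # tanh version: dZ1 = np.dot(W2.T, dZ2) * (1 - np.power(A1, 2))
    # ReLU version: g'(Z1) is 1 where Z1 > 0 and 0 elsewhere
    dZ1 = np.dot(W2.T, dZ2) * (Z1 > 0)

    dW1 = (1 / m) * np.dot(dZ1, X.T)
    db1 = (1 / m) * np.sum(dZ1, axis=1, keepdims=True)

    return {"dW1": dW1, "db1": db1, "dW2": dW2, "db2": db2}
```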

There have been a couple of other threads about this in the past, e.g. this one. I was able to get pretty good accuracy using ReLU but it requires a lot more hidden units to get results equivalent to what you can get with tanh and 4 hidden units.
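
If you want to try that comparison yourself, one way is to sweep a few hidden layer sizes once the ReLU versions of forward and back prop are plugged in. This sketch assumes the notebook's nn_model and predict helpers and the already loaded X, Y; the accuracy line is a simplification of the formula the notebook uses.

```python
import numpy as np

# Compare training accuracy for a few hidden layer sizes
for n_h in [4, 10, 20, 50]:
    parameters = nn_model(X, Y, n_h, num_iterations=10000, print_cost=False)
    predictions = predict(parameters, X)
    accuracy = float(np.mean(predictions == Y) * 100)
    print(f"n_h = {n_h}: training accuracy = {accuracy:.1f}%")
```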