W4_A2_Ex-1/2_Two_layer_model

Hello, Huzaifa Amjad.

Look at the activations you are using while performing the forward pass, and also at the input layer that you are dealing with.

OK, now the only problem is that an error occurs saying dA2 is not defined. Kindly tell me how to define dA2?

For backward propagation, you start by computing dA2 for the output layer using Y and A2. The formula for that is covered in various places; e.g., look at the L_model_backward logic in the Step by Step exercise that you just completed. Then you call linear_activation_backward for layer 2 and then for layer 1.
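For concreteness, here is a minimal sketch of that computation, assuming the cross-entropy cost and the relu/sigmoid layer setup of the two-layer model; cache1 and cache2 are assumed to be the caches returned by the forward pass:

```python
import numpy as np

# Derivative of the cross-entropy cost with respect to A2, the sigmoid
# output of layer 2; Y holds the true labels.
dA2 = - (np.divide(Y, A2) - np.divide(1 - Y, 1 - A2))

# Layer 2 first (sigmoid output), then layer 1 (relu hidden layer).
dA1, dW2, db2 = linear_activation_backward(dA2, cache2, activation="sigmoid")
dA0, dW1, db1 = linear_activation_backward(dA1, cache1, activation="relu")
```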

Thanks @Paul @Nobu @Rashmi, I have passed!

Dear Rashmi,

I had the same problem for several days. I checked that type(costs[0]) is numpy.ndarray. Please help me…

I solved it. In parameters = update_parameters(parameters, grads, learning_rate), I had passed the hard-coded value 0.0075 where the learning_rate variable should be.
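In other words (a sketch of the fix; update_parameters is the helper from the exercise, and the call sits inside the two_layer_model training loop):

```python
# Buggy: hard-coding 0.0075 here ignores the learning_rate argument
# that was passed into two_layer_model.
parameters = update_parameters(parameters, grads, 0.0075)

# Fixed: forward the learning_rate parameter itself.
parameters = update_parameters(parameters, grads, learning_rate)
```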

Thanks.

Glad to know that you have solved the query on your own, Sung Yun!

One of my problems, for which I found an answer here, was that the activation functions are applied in reverse order in the back propagation stage.

Right! Since back propagation happens in the opposite order, you deal with the output layer first and then work through the hidden layers from later to earlier. The derivative of the activation function is part of that computation, so you need to use the activation function that corresponds to the layer you are handling at each stage.
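For example, in the two-layer model the forward pass uses relu then sigmoid, so the backward calls must mirror that order with matching activations (a sketch using the helper names from the Step by Step exercise):

```python
# Forward pass: relu for the hidden layer, sigmoid for the output layer.
A1, cache1 = linear_activation_forward(X, W1, b1, activation="relu")
A2, cache2 = linear_activation_forward(A1, W2, b2, activation="sigmoid")

# Wrong: reusing the forward order and passing "relu" for layer 2.
# dA1, dW2, db2 = linear_activation_backward(dA2, cache2, activation="relu")

# Right: layer 2 is the sigmoid layer, layer 1 is the relu layer.
dA1, dW2, db2 = linear_activation_backward(dA2, cache2, activation="sigmoid")
dA0, dW1, db1 = linear_activation_backward(dA1, cache1, activation="relu")
```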