Programming Assignment for Week 4 - Deep Neural Network - Application

Hello everyone,

I am working on the last programming assignment of DLS Course 1, Week 4, and I am running into the following issue with the two-layer model:

When trying to forward-propagate, I get the following error from the grader:

~/work/release/W4A2/ in linear_activation_forward(A_prev, W, b, activation)
    209         A, activation_cache = relu(Z)
--> 211     assert (A.shape == (W.shape[0], A_prev.shape[1]))
    212     cache = (linear_cache, activation_cache)

UnboundLocalError: local variable 'A' referenced before assignment

The error refers to the variable A, which comes from applying ReLU to Z (the pre-activation value of the first layer). So it seems to me that A is never being computed at all inside the grader's code that produces the activation_cache, which I do not have access to.

I am computing Z myself before calling the forward-pass function, using the standard dot product of the first layer's weights with the features, plus the bias. So the call looks clean to me.
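For reference, the linear step I am computing looks something like this (the shapes here are made up for illustration, not the assignment's actual data):

```python
import numpy as np

# Hypothetical shapes for illustration only.
np.random.seed(1)
X = np.random.randn(4, 3)    # 4 features, 3 examples
W1 = np.random.randn(5, 4)   # first hidden layer with 5 units
b1 = np.zeros((5, 1))        # first-layer bias

# The standard linear step: Z = W · A_prev + b
Z1 = np.dot(W1, X) + b1
print(Z1.shape)              # (5, 3)
```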

Any ideas what might be making the grader crash?

Thank you kindly

My guess is that you are not calling linear_activation_forward with the right value for the activation parameter. That parameter can only take the string “sigmoid” or “relu”, and it is case-sensitive.
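To see why a wrong activation value produces an UnboundLocalError rather than a cleaner message, here is a simplified sketch that mirrors the structure of the assignment's function (this is not the grader's actual code; relu and sigmoid are stand-ins for the course helpers):

```python
import numpy as np

def relu(Z):
    # stand-in for the course's relu helper: returns activation and cache
    return np.maximum(0, Z), Z

def sigmoid(Z):
    # stand-in for the course's sigmoid helper
    return 1 / (1 + np.exp(-Z)), Z

def linear_activation_forward(A_prev, W, b, activation):
    # simplified sketch of the assignment's function
    Z = np.dot(W, A_prev) + b
    linear_cache = (A_prev, W, b)
    if activation == "sigmoid":
        A, activation_cache = sigmoid(Z)
    elif activation == "relu":
        A, activation_cache = relu(Z)
    # If activation matched neither string, A was never assigned,
    # so the assert below raises UnboundLocalError when it reads A.shape.
    assert A.shape == (W.shape[0], A_prev.shape[1])
    cache = (linear_cache, activation_cache)
    return A, cache

np.random.seed(0)
A_prev = np.random.randn(4, 3)
W = np.random.randn(5, 4)
b = np.zeros((5, 1))

A, cache = linear_activation_forward(A_prev, W, b, activation="relu")  # works
# linear_activation_forward(A_prev, W, b, activation="ReLU")  # UnboundLocalError
```

Passing anything other than the exact strings "sigmoid" or "relu" (for example a precomputed Z array, or "ReLU" with the wrong case) means neither branch runs, A is never bound, and the first line that touches A blows up with exactly the error you are seeing.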


Thanks for the pointer, Alberto. I was computing Z myself and passing the array as the activation argument. Passing the string instead fixed the problem.

Thank you very much