Hello everyone,
I am working on the final programming assignment of DLS Course 1, Week 4, and I am running into the following issue with the two-layer model:
When I try to forward-propagate, the grader raises this error:
```
~/work/release/W4A2/dnn_app_utils_v3.py in linear_activation_forward(A_prev, W, b, activation)
    209     A, activation_cache = relu(Z)
    210
--> 211     assert (A.shape == (W.shape[0], A_prev.shape[1]))
    212     cache = (linear_cache, activation_cache)
    213
UnboundLocalError: local variable 'A' referenced before assignment
```
The error refers to the variable A, which should come from applying ReLU to Z (the pre-activation values of the first layer). It seems to me, then, that A is never being assigned inside the grader's linear_activation_forward helper in dnn_app_utils_v3.py, whose internals I do not have access to.
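To make my reading of the traceback concrete, here is a hypothetical reconstruction of how such a helper might branch on the activation string. Only the three traceback lines are confirmed; everything else is my assumption, since the real body in dnn_app_utils_v3.py is hidden from us:

```python
import numpy as np

def linear_activation_forward_sketch(A_prev, W, b, activation):
    # Hypothetical reconstruction based only on the traceback above;
    # the real helper lives in dnn_app_utils_v3.py.
    Z = np.dot(W, A_prev) + b                            # assumed linear step
    linear_cache = (A_prev, W, b)

    if activation == "sigmoid":
        A, activation_cache = 1 / (1 + np.exp(-Z)), Z   # assumed sigmoid branch
    elif activation == "relu":
        A, activation_cache = np.maximum(0, Z), Z       # traceback line 209
    # If `activation` matches neither exact string (e.g. "ReLU" or a typo),
    # neither branch runs and A is never assigned, so the assert below
    # raises UnboundLocalError -- exactly the crash in my traceback.

    assert (A.shape == (W.shape[0], A_prev.shape[1]))   # traceback line 211
    cache = (linear_cache, activation_cache)            # traceback line 212
    return A, cache
```

With a structure like this, passing any string other than exactly "sigmoid" or "relu" would reproduce the UnboundLocalError above.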
On my side, I compute Z before calling the forward-pass function, using the standard linear step for the first layer: the dot product of the weights with the input features, plus the bias. So the call itself looks clean to me (see the sketch below).
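For concreteness, this is a minimal sketch of that linear step; the names and shapes (X, W1, b1) are placeholders I made up for illustration, not my actual notebook values:

```python
import numpy as np

# Placeholder shapes, invented purely for illustration
X = np.random.randn(4, 3)     # 4 input features, 3 training examples
W1 = np.random.randn(5, 4)    # 5 units in the first hidden layer
b1 = np.zeros((5, 1))

# The standard first-layer linear step: weights times features plus bias
Z1 = np.dot(W1, X) + b1
print(Z1.shape)               # (5, 3) == (W1.shape[0], X.shape[1])
```

The resulting shape, (W1.shape[0], X.shape[1]), is exactly what the assert on traceback line 211 checks.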
Any ideas what might be making the grader crash?
Thank you kindly
Jonathan