Issues with submitting the deep learning Course 1, Week 4 assignment

I am trying to submit my Week 4 deep learning application assignment, but the submission takes too long and never completes. I read the note saying we should ignore this and that it would be resolved, but it has been a while now and my pset still has not been accepted/submitted :(

Kindly help!

If the servers are having problems, it sometimes takes a while for them to get fixed. How long has this been happening?

Hi @Sheldon_Otieno,

Please make sure you have the default assignment file name.

Best,
Mubsi

It finally got submitted! Thank you all!

Hello there, I am having difficulties trying to ask a fresh question; I get an error when I try, so I am just going to ask my new question here.

I have an issue with "Building your Deep Neural Network: Step by Step", specifically Exercise 9, the L_model_backward (backward propagation) function. I have an error involving dA1, which I suspect comes from how I call the linear_activation_backward function to get the gradients for backward propagation. There is no clear and obvious error in how I am implementing or calling the function, which is why I really need your help. Thank you so much. Below is the error my program is producing.


KeyError                                  Traceback (most recent call last)
in
      1 t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
----> 2 grads = L_model_backward(t_AL, t_Y_assess, t_caches)
      3
      4 print("dA0 = " + str(grads['dA0']))
      5 print("dA1 = " + str(grads['dA1']))

in L_model_backward(AL, Y, caches)
     60     # YOUR CODE STARTS HERE
     61     current_cache = caches[l]
---> 62     dA_prev_temp, dW_temp, db_temp = linear_activation_backward(grads["dA" + str(l+1)], current_cache, activation = "relu")
     63     grads["dA" + str(l)] = dA_prev_temp
     64     grads["dW" + str(l + 1)] = dW_temp

KeyError: 'dA1'

Any help is highly appreciated! Thanks again.

That error is telling you that at that line of code the grads dictionary does not have an entry for “dA1”. So you need to debug that by tracking the state of the grads dictionary. Where do entries get added to that? E.g. you could add print statements showing what key you are adding each time you add an entry. Most likely this error has to do with the logic in the “for” loop over the hidden layers. It would also be useful to do the “dimensional analysis” on the test case here, so that you understand what to expect at each layer.
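If it helps, here is a minimal sketch of that debugging idea. The record helper below is hypothetical and not part of the assignment code; the point is simply to print the dictionary's keys every time you add an entry, so you can see whether "dA1" actually exists before the relu loop tries to read grads["dA" + str(l + 1)].

```python
# Minimal debugging sketch (hypothetical helper, not the assignment solution):
# log every gradient the moment it is stored, so you can watch the keys grow.

def record(grads, key, value):
    """Store a gradient under `key` and print which keys grads holds so far."""
    grads[key] = value
    print(f"added {key}; grads now contains: {sorted(grads.keys())}")

# Toy usage with dummy values standing in for the real dA/dW/db arrays.
# Since the test case prints dA0 and dA1, the output (sigmoid) step should be
# the one that writes "dA1", "dW2", "db2" before the relu loop runs.
grads = {}
record(grads, "dA1", "dummy_dA1")
record(grads, "dW2", "dummy_dW2")
record(grads, "db2", "dummy_db2")
# If "dA1" never appears in the printout before the loop iteration that reads
# grads["dA" + str(l + 1)], then the output-layer step (or an earlier loop
# iteration) is writing its result under the wrong key.
```

The same print statements also make the dimensional analysis easy: printing value.shape alongside each key tells you whether the arrays you are storing match what you expect at each layer of the test case.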