I am on the Week 4, part 1 assignment, writing code for L_model_forward. For the forward activations through layers 1 to L-1, when calling linear_activation_forward, I am getting W and b from parameters as parameters['W'] and parameters['b'], as below:

A, cache = linear_activation_forward(A_prev, parameters['W'], parameters['b'], relu)

but when I run it, it gives KeyError: 'W'.
I cross-checked that during initialization the same key 'W' was added to parameters.
Can you help me see where I am going wrong?

I also noticed that the for loop in the given code for the ReLU layers runs up to L, while I expected it to run up to L-1.

Hi @vikkr ,

You have posted your query in the MLS Course 1 Week 3 forum; it should be in the DLS Course 1 Week 4 Assignment 1 forum. You will get a quicker response if your query is posted in the correct forum.

The KeyError is raised because your code is trying to access a key that doesn't exist in the parameters dictionary. Please refer to initialize_parameters_deep() to check how parameters is constructed.
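For reference, here is a minimal sketch (not the assignment's exact code; the scaling and seed are illustrative) of how a dictionary like parameters is typically built, with one 'W' key and one 'b' key per layer:

```python
import numpy as np

def initialize_parameters_deep_sketch(layer_dims):
    """Sketch: build a dict with keys 'W1', 'b1', 'W2', 'b2', ...
    following the assignment's naming convention."""
    np.random.seed(1)
    parameters = {}
    L = len(layer_dims)  # number of layers, including the input layer
    for l in range(1, L):
        parameters['W' + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
        parameters['b' + str(l)] = np.zeros((layer_dims[l], 1))
    return parameters

params = initialize_parameters_deep_sketch([5, 4, 3])
print(sorted(params.keys()))  # ['W1', 'W2', 'b1', 'b2']
```

Note that the keys are 'W1', 'W2', ... (letter plus layer number), so a plain 'W' lookup raises a KeyError.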

Thanks, Kic. I even tried parameters['W[l]'], but it is still not recognized. I have changed the topic to DLS; thanks for highlighting that.

You have to access the W and b of layer l. If you don't know how to do that, check initialize_parameters_deep() for a hint.


But after correcting this, you will face one more error, shown below:

UnboundLocalError: local variable 'linear_cache' referenced before assignment

Read this in advance.
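For context, here is a hedged sketch of why that UnboundLocalError can appear, assuming (as in the assignment) that linear_activation_forward assigns linear_cache only inside the if/elif branches that test the activation string. If neither branch matches, for example because the string is misspelled or miscapitalized, the return statement references a variable that was never assigned:

```python
def linear_activation_forward_sketch(A_prev, W, b, activation):
    # Sketch only: the real function also computes Z and the activation A.
    if activation == "sigmoid":
        linear_cache = (A_prev, W, b)
    elif activation == "relu":
        linear_cache = (A_prev, W, b)
    # If activation matches neither string, linear_cache is never assigned,
    # so the next line raises UnboundLocalError.
    return linear_cache

try:
    linear_activation_forward_sketch(None, None, None, "Relu")  # wrong case
except UnboundLocalError as e:
    print(type(e).__name__)  # UnboundLocalError
```

So the fix is to pass the activation exactly as the function expects it, e.g. the lowercase string "relu".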


Hi @vikkr ,

The key should be a string with the layer number as part of it. If you check the initialize_parameters_deep() function, you can see how the key is constructed: 'W' + str(l), where l is the layer number. str(l) converts the number stored in l into a string, and the + sign concatenates the letter W with that converted number.
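Putting the pieces together, the per-layer lookup inside the ReLU loop would look roughly like this (a sketch with dummy values, not the graded solution):

```python
# Dummy parameters dict using the 'W1', 'b1', ... naming convention.
parameters = {'W1': 'W1-matrix', 'b1': 'b1-vector',
              'W2': 'W2-matrix', 'b2': 'b2-vector'}
L = len(parameters) // 2  # two entries (W and b) per layer

for l in range(1, L):                  # layers 1 .. L-1 (the ReLU layers)
    W = parameters['W' + str(l)]       # e.g. 'W1' -- not 'W' or 'W[l]'
    b = parameters['b' + str(l)]
    print('layer', l, 'uses keys', 'W' + str(l), 'and', 'b' + str(l))
```

Layer L (the sigmoid output layer) is then handled after the loop with parameters['W' + str(L)] and parameters['b' + str(L)], which is why the loop itself only goes up to L-1.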

Thanks a lot, that made it clear for me.