I am working on the Week 4 assignment 1 and writing code for L_model_forward. For the forward activations through layers 1 to L-1, when calling linear_activation_forward, I am getting W and b from parameters as parameters['W'] and parameters['b'], like this:
A, cache = linear_activation_forward(A_prev, parameters['W'], parameters['b'], relu)
But when I run it, I get KeyError: 'W'.
I cross-checked that during initialization the same key 'W' was added to parameters.
Can you help me figure out where I am going wrong?
I also noticed that the for loop in the given code for the ReLU layers runs up to L, while I expected it to stop at L-1.
You have posted your query in the MLS Course 1 Week 3 forum; it should go in the DLS Course 1 Week 4 Assignment 1 forum. You will get a quicker response if your query is posted in the correct place.
The KeyError is raised because your code is trying to access a key that doesn't exist in the parameters dictionary. Please refer to initialize_parameters_deep() to check how parameters is constructed.
The key should be a string with the layer number as part of it. So it would look like this:
W1
If you check the initialize_parameters_deep() function, you can see how each key is constructed:
'W' + str(l), where l is the layer number. str(l) converts the number stored in l into a string, and the + sign concatenates the letter W with that converted number.
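To make the key construction concrete, here is a minimal sketch (not the course's exact code — the layer sizes and scaling are just placeholder assumptions) showing how a parameters dictionary keyed by layer number is typically built, and why looking up a bare 'W' fails:

```python
import numpy as np

# Hypothetical layer sizes for illustration only:
# input of size 5, one hidden layer of 4 units, output of 3 units.
layer_dims = [5, 4, 3]

parameters = {}
L = len(layer_dims) - 1  # number of weight layers

for l in range(1, L + 1):
    # Keys are 'W1', 'b1', 'W2', 'b2', ... -- layer number concatenated
    # onto the letter, exactly as 'W' + str(l) describes.
    parameters['W' + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
    parameters['b' + str(l)] = np.zeros((layer_dims[l], 1))

print(sorted(parameters.keys()))   # ['W1', 'W2', 'b1', 'b2']

# parameters['W'] raises KeyError: there is no bare 'W' key.
# Inside L_model_forward's hidden-layer loop you would therefore access:
#   parameters['W' + str(l)], parameters['b' + str(l)]
```

This also explains the loop-bound question: in Python, `range(1, L)` stops at L-1, so a loop written "till L" with range() actually covers only the hidden layers 1 through L-1.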