Week 4 Exercise 5 - L_model_forward W

May I ask why my code below raises an error about 'W'?

In the for loop I use:
A, cache = linear_activation_forward(A_prev, parameters["W"], parameters["b"], activation="relu")

Am I passing the wrong key for W? Why?

When a neural network has multiple layers, each layer has its own weights. In this exercise, we concatenate "W" with the layer number "l" to form the key that identifies a specific layer's weights in a Python dictionary.
So you need to append the layer number, as a string, to "W" when reading or writing the weights for a specific layer. Be careful to convert the number to a string with str() when building the dictionary key.
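To illustrate, here is a minimal sketch of that key construction (the parameter shapes below are made up for illustration; the real ones come from initialize_parameters_deep):

```python
import numpy as np

# Hypothetical parameters dictionary for a 2-layer network.
parameters = {
    "W1": np.zeros((4, 3)), "b1": np.zeros((4, 1)),
    "W2": np.zeros((1, 4)), "b2": np.zeros((1, 1)),
}

L = len(parameters) // 2  # two entries (W and b) per layer
for l in range(1, L + 1):
    key = "W" + str(l)   # str() turns the int counter into a string
    W = parameters[key]  # looks up "W1", then "W2"; plain "W" is not a key
```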

Hi Nobu,
I changed it to str(parameters["W"]), but I still get the same error… may I ask why?

You are welcome.
"W" is already a string, so you do not need str() there. What we need is to create, say, "W1" for the weights of the first hidden layer. So you need to concatenate "W" with "1", where the "1" actually comes from the loop counter. In other words, the key you need to build is the string "W" + str(l). With that key, you can extract the value from the parameters dictionary.

Nobu's point is that the keys are not just "W", right? They are "W1" or "W2" or "b1", as Python strings. So how do you construct a string like that in Python? As with everything, there are several ways, but they demonstrate one technique for you at several points in the template code. Take a look at how they handle that in the template for initialize_parameters_deep, for example, and read the comments in the loop. Or see the code Nobu showed you in his most recent reply.
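For completeness, a few equivalent ways to build such a key in Python (l here is just a sample loop counter):

```python
l = 1
key_concat = "W" + str(l)     # concatenation, as in the template code
key_format = "W{}".format(l)  # str.format
key_fstring = f"W{l}"         # f-string (Python 3.6+)
```

Any of these produces the string "W1" that indexes the parameters dictionary.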