# Week 4 question with "for l in range(1,L)"

In week 4 assignment 1, the parameter initialization function doesn’t make sense to me.

Suppose that we have a 3-layer NN, and that layer_dims = [2, 4, 1]. In this case, we have L = len(layer_dims) = 3.

The `for l in range(1, L)` loop therefore runs only twice: it initializes W1, b1, W2, and b2, but what about W3 and b3? It seems this loop will never initialize parameters for the output layer, yet we do need to return WL and bL. (So I thought we should use `range(1, L+1)` instead.) Or did I miss something?

```python
# GRADED FUNCTION: initialize_parameters_deep

def initialize_parameters_deep(layer_dims):
    """
    Arguments:
    layer_dims -- python array (list) containing the dimensions of each layer in our network

    Returns:
    parameters -- python dictionary containing your parameters "W1", "b1", ..., "WL", "bL":
                  Wl -- weight matrix of shape (layer_dims[l], layer_dims[l-1])
                  bl -- bias vector of shape (layer_dims[l], 1)
    """

    np.random.seed(3)
    parameters = {}
    L = len(layer_dims)  # number of layers in the network

    for l in range(1, L):
        # (≈ 2 lines of code)
        parameters['W' + str(l)] =
        parameters['b' + str(l)] =

        assert(parameters['W' + str(l)].shape == (layer_dims[l], layer_dims[l - 1]))
        assert(parameters['b' + str(l)].shape == (layer_dims[l], 1))

    return parameters
```

Remember that the input layer does not have weights, so if `layers_dims = [2, 4, 1]` you only need W1, b1 and W2, b2, right? The first entry in layers_dims is the number of input features, which is used to define the shape of W1.

To put it another way: if layers_dims has 3 elements (as in this case), then that is a 2-layer network, right? So you could argue that the comment on the setting of L is misleading: it's not the number of layers in the network, but one greater than that. That bug was filed quite a while ago, but there has been no action on fixing the comment yet.
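You can verify the counting with a quick sketch. This fills in the two blank lines with one common choice of initialization (small random weights scaled by 0.01 and zero biases; the actual assignment may specify a different scaling), then inspects which keys the loop actually produces for `layer_dims = [2, 4, 1]`:

```python
import numpy as np

def initialize_parameters_deep(layer_dims):
    # Small-random-weight init is one common choice; the graded
    # assignment may use a different scaling factor.
    np.random.seed(3)
    parameters = {}
    L = len(layer_dims)  # len(layer_dims) = number of layers + 1

    for l in range(1, L):
        parameters['W' + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
        parameters['b' + str(l)] = np.zeros((layer_dims[l], 1))

    return parameters

params = initialize_parameters_deep([2, 4, 1])
print(sorted(params.keys()))                    # ['W1', 'W2', 'b1', 'b2'] -- no W3/b3
print(params['W1'].shape, params['W2'].shape)   # (4, 2) (1, 4)
```

Note that W2 has shape (1, 4), mapping the 4 hidden units to the single output unit, so the output layer is already covered by `range(1, L)`.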


Thank you so much for the reply! So `layer_dims` contains n_x as its first element and hence our DNN has `len(layer_dims)-1` layers.