# Improvement suggestion to W4A1

The value of the variable `L` does not match its comment (“number of layers in the network”).

See details below:

1. Open “/notebooks/release/W4A1/Building_your_Deep_Neural_Network_Step_by_Step.ipynb”

2. Find the cell under “Exercise 2 - initialize_parameters_deep”:

3. The current code template looks like this:
```python
def initialize_parameters_deep(layer_dims):
    ...
    L = len(layer_dims)  # number of layers in the network
    ...
    for l in range(1, L):
        ...
```

4. Ideally, it should be:

```python
def initialize_parameters_deep(layer_dims):
    ...
    L = len(layer_dims) - 1  # number of layers in the network
    ...
    for l in range(1, L + 1):
        ...
```
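For context, here is a minimal runnable sketch of the suggested version once the loop body is filled in. The `* 0.01` scaling and zero-initialized biases are assumptions on my part, since the template above elides the loop body:

```python
import numpy as np

def initialize_parameters_deep(layer_dims):
    """Initialize weights and biases for an L-layer network.

    layer_dims[0] is the input size; layer_dims[1:] are the layer sizes,
    so the network has len(layer_dims) - 1 layers of parameters.
    """
    parameters = {}
    L = len(layer_dims) - 1  # number of layers in the network

    for l in range(1, L + 1):
        # W^l has shape (units in layer l, units in layer l-1); b^l is a column vector
        parameters['W' + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
        parameters['b' + str(l)] = np.zeros((layer_dims[l], 1))

    return parameters

params = initialize_parameters_deep([5, 4, 3])
print(sorted(params.keys()))  # ['W1', 'W2', 'b1', 'b2']
```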

Hello @zshao,

`for l in range(1, L)` is correct. If you changed `L` to `L + 1` there, it would raise an `IndexError`, because `layer_dims` has only `L` items (valid indices `0` to `L - 1`).

We start `l` from `1` instead of `0` because the 0th element of `layer_dims` represents the number of features of our input X. It is not a hidden layer and so we do not need to initialize any parameters for it.
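Here is a small sketch of the failure mode I mean, with an arbitrary `layer_dims` for illustration:

```python
layer_dims = [5, 4, 3]  # 5 input features, then layers of 4 and 3 units
L = len(layer_dims)     # the template's definition: L == 3 here

# With this L, range(1, L) already visits every parameterized layer:
print(list(range(1, L)))  # [1, 2]

# Extending the loop to range(1, L + 1) while keeping L = len(layer_dims)
# would index layer_dims[3], which does not exist:
try:
    for l in range(1, L + 1):
        _ = layer_dims[l]
except IndexError as e:
    print("IndexError:", e)
```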

Cheers,
Raymond


Please note that in my “ideal code” I also changed the definition of `L` to `len(layer_dims) - 1`.

If `L` is defined as the number of layers in the network, it should be one less than `len(layer_dims)`, since the input layer is not counted among the network’s layers.

Note that the code semantics are exactly the same; the only change I am suggesting is to make the value of `L` and its comment (“number of layers in the network”) match each other. This also matches the course slide deck better, since the deck uses notation like W^(1) … W^(L) and b^(1) … b^(L).
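To show the equivalence concretely, both conventions iterate over exactly the same layer indices (again with an arbitrary `layer_dims` for illustration):

```python
layer_dims = [5, 4, 3]

# Original template: L counts the input "layer" too
L_orig = len(layer_dims)
orig_indices = list(range(1, L_orig))

# Suggested version: L is the number of parameterized layers
L_new = len(layer_dims) - 1
new_indices = list(range(1, L_new + 1))

print(orig_indices == new_indices)  # True: the same layers are initialized
print(L_new)                        # 2, matching the W^(1)..W^(L) notation
```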

Oh! Thanks for pointing it out. I had missed the change in your assignment of `L`.