Bug in reference initialize_parameters_zeros (week 1 assignment 1)

There seem to be two bugs in the reference notebook Initialization.ipynb for the week 1 assignment. (The two bugs happen to cancel each other out.)

  1. The number L of layers is not len(layers_dims); instead it is len(layers_dims) - 1.

  2. The loop for l in range(1, L) should be for l in range(1, L+1). Otherwise, WL and bL are not set!

import numpy as np

def initialize_parameters_zeros(layers_dims):
    """
    Arguments:
    layers_dims -- python array (list) containing the size of each layer.
    
    Returns:
    parameters -- python dictionary containing your parameters "W1", "b1", ..., "WL", "bL":
                    W1 -- weight matrix of shape (layers_dims[1], layers_dims[0])
                    b1 -- bias vector of shape (layers_dims[1], 1)
                    ...
                    WL -- weight matrix of shape (layers_dims[L], layers_dims[L-1])
                    bL -- bias vector of shape (layers_dims[L], 1)
    """
    
    parameters = {}
    L = len(layers_dims)            # number of layers in the network
    
    for l in range(1, L):
        #(≈ 2 lines of code)
        # parameters['W' + str(l)] = 
        # parameters['b' + str(l)] = 
        # YOUR CODE STARTS HERE
        parameters['W' + str(l)] = np.zeros((layers_dims[l], layers_dims[l - 1]))
        parameters['b' + str(l)] = np.zeros((layers_dims[l], 1))
        # YOUR CODE ENDS HERE
    return parameters

You are correct that L here is one greater than the number of layers, but that is exactly why the loop as written is correct. Remember that Python's range excludes its stop value, so range(1, L) runs l from 1 through L - 1. Try running the following loop and watch what happens:

for ii in range(1,5):
    print(f"ii = {ii}")

So it's really only the comment # number of layers in the network that is wrong: with this code, L is actually the number of layers plus one.

Throughout the course, L denotes the number of layers. It would seem unwise to redefine its meaning just in this function.

Also note that the docstring line WL -- weight matrix of shape (layers_dims[L], layers_dims[L-1]) is only correct under the normal meaning of L (the number of layers); with L = len(layers_dims), the index layers_dims[L] would be out of range.
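To double-check, here is a minimal sketch of the completed function applied to a small network (the two np.zeros lines are the completion hinted at in the skeleton). With layers_dims = [3, 4, 1], the loop for l in range(1, len(layers_dims)) still sets W1, b1, W2, and b2, i.e. WL and bL under the usual meaning of L = 2:

```python
import numpy as np

def initialize_parameters_zeros(layers_dims):
    parameters = {}
    L = len(layers_dims)  # note: this is the number of layers PLUS one
    for l in range(1, L):  # l = 1, ..., L - 1, i.e. one pass per actual layer
        parameters['W' + str(l)] = np.zeros((layers_dims[l], layers_dims[l - 1]))
        parameters['b' + str(l)] = np.zeros((layers_dims[l], 1))
    return parameters

# A 2-layer network: 3 inputs, a hidden layer of 4 units, 1 output unit.
params = initialize_parameters_zeros([3, 4, 1])
print(sorted(params))        # ['W1', 'W2', 'b1', 'b2'] -- W2/b2 (the last layer) ARE set
print(params['W2'].shape)    # (1, 4)
```

So despite the misleading comment, no parameters are skipped.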