W4 Assignment 1 - Calculation of L doesn't seem right

2023-05-19T18:30:00Z

Hi,

def L_model_forward(X, parameters):
    L = len(parameters) // 2  # number of layers in the neural network

Since parameters will have W & b for each layer excluding the input layer, shouldn't this code be
L = len(parameters) // 2 + 1 (to add back the input layer?)

Requesting a clarification. Thank you.

Uday

We don’t have parameters for the input layer. Just for hidden and output layers.
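For illustration only, here is a minimal sketch (a hypothetical parameters dictionary, not the assignment code) showing why len(parameters) // 2 already equals the number of layers that have weights, i.e. hidden + output:

import numpy as np

# Hypothetical parameters for a tiny 2-layer network: input size 3, hidden size 4, output size 1.
# Only layers 1 and 2 get W/b; the input layer has no parameters at all.
parameters = {
    "W1": np.random.randn(4, 3) * 0.01, "b1": np.zeros((4, 1)),
    "W2": np.random.randn(1, 4) * 0.01, "b2": np.zeros((1, 1)),
}

L = len(parameters) // 2  # 4 entries // 2 = 2 layers with parameters (hidden + output)
print(L)                  # 2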

Of course. My question, rather, is: what is L supposed to represent - the number of layers including or excluding the input layer? From the above code, what we get is the count excluding the input layer.

In that case, the for loop in the following code should either start from 0, or L should be incremented by 1 to add the input-layer count back to it.

def L_model_forward(X, parameters):
[screenshot of the rest of L_model_forward]

We don’t count the input layer as a layer, so if the total number of layers is 10, it means we have 9 hidden layers and 1 output layer.

Regarding the indexing in the for loop, range(1, L) means that the starting point is 1, not 0.
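As a quick illustration of that indexing (just a sketch of Python's range semantics, not the assignment code):

L = 5
print(list(range(1, L)))  # [1, 2, 3, 4] - starts at 1 and stops before L, so index 0 is never visited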

Best,
Saif.

Thanks for the quick reply. There is an inconsistency in what L represents, which is confusing.
In the earlier function it included the count of the input layer; however, the for loop started from 1, thus discounting the input layer.

[screenshots of initialize_parameters_deep, where L = len(layer_dims) includes the input layer]

However, if L does not count the input layer in the function below, why does the for loop start from 1?
[screenshot of L_model_forward]

Can you point to the function where the input layer is counted in the loop?

In the screenshot you shared, it is clearly mentioned that the loop starts at 1 because layer 0 is the input (and we don’t want the input layer to be part of our loop).

If L includes the input layer, then the for loop should start from 1, thus discounting the input layer. This is clear to me, and it is the case in this code. Note that here L does include the input layer.
[screenshots of initialize_parameters_deep, where L = len(layer_dims)]

However, if L counts only the hidden + output layers, as in the code below, why doesn't the for loop start from 0?
[screenshot of L_model_forward, where L = len(parameters) // 2]

Let me give you an example:

Let’s say layers_dims = [12288, 20, 7, 5, 1]. This is a 4-layer model (3 hidden layers and 1 output layer), right?
We have L = len(layer_dims), which is 5 in this case. But we don't want to run the loop from 0 to 4; we want to run it from 1 to 4, thus covering all 4 layers (3 hidden and 1 output) in the loop. We do not want the input layer (index = 0) to be part of the for loop.
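A minimal sketch of that loop (assuming the initialize_parameters_deep structure discussed in this thread; the actual assignment code may differ in its details):

import numpy as np

def initialize_parameters_deep(layer_dims):
    # layer_dims includes the input layer, e.g. [12288, 20, 7, 5, 1]
    parameters = {}
    L = len(layer_dims)            # 5 here: input + 3 hidden + 1 output
    for l in range(1, L):          # l = 1, 2, 3, 4 -> skips the input layer (index 0)
        parameters["W" + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
        parameters["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return parameters

params = initialize_parameters_deep([12288, 20, 7, 5, 1])
print(len(params) // 2)  # 4 - W/b pairs exist only for the 4 non-input layers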

We are initiating from 1, not 0.

Your explanation of the for loop makes perfect sense!

However, my confusion comes from the fact that the variable L sometimes includes the count of the input layer [as in the very example you gave above] and sometimes excludes it, as in the code below.
[screenshot of L_model_forward, where L = len(parameters) // 2]

Thank you very much for taking the time to help!

L = len(parameters) // 2 comes from L_model_forward, right? This is different from initialize_parameters_deep, where we use L = len(layer_dims).

In Python, for l in range(1, L) means that the starting point is 1 and L itself is not included, so the end point is L-1. So, as in the above example where L = len(layer_dims) was 5, the loop runs from 1 to 4; you got the intuition.

However, if L = len(parameters) // 2, then for l in range(1, L) means start from 1 and leave out L (the last layer). Here, we are doing the calculation for the hidden layers only; the output layer is handled separately. Check the L_model_forward function.
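To see why the loop stops before L, here is a stripped-down sketch of the shape of L_model_forward (assumptions: ReLU for the hidden layers and sigmoid for the output, no caches; the real assignment function also returns caches and uses its own helper functions):

import numpy as np

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z))

def relu(Z):
    return np.maximum(0, Z)

def L_model_forward(X, parameters):
    L = len(parameters) // 2           # layers that have W/b: hidden + output
    A = X
    for l in range(1, L):              # l = 1 .. L-1: the hidden layers only
        A = relu(parameters["W" + str(l)] @ A + parameters["b" + str(l)])
    # Layer L (the output layer) is handled outside the loop, with sigmoid activation
    AL = sigmoid(parameters["W" + str(L)] @ A + parameters["b" + str(L)])
    return AL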

Best,
Saif.

Thank you, I got it!