Inconsistent quiz question and assignment: what is actually correct?

In the quiz, this answer is wrong:

However, in the programming assignment, the L_model_forward implementation has a for loop over layers [1, L), followed by a separate computation outside the loop for the last layer, the output layer:

def L_model_forward(X, parameters):

    # Implement [LINEAR → RELU]*(L-1). Add "cache" to the "caches" list.
    # The for loop starts at 1 because layer 0 is the input
    for l in range(1, L):
        ...

    # Implement LINEAR → SIGMOID. Add "cache" to the "caches" list.
    AL, cache = linear_activation_forward(…)
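For concreteness, here is a minimal self-contained sketch of that structure. The helper `linear_activation_forward` and the parameter layout (`"W1"`, `"b1"`, …) are assumptions based on the course conventions, not the actual assignment code:

```python
import numpy as np

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z)), Z

def relu(Z):
    return np.maximum(0, Z), Z

def linear_activation_forward(A_prev, W, b, activation):
    # LINEAR step followed by the chosen activation; cache inputs for backprop
    Z = W @ A_prev + b
    A, activation_cache = (relu if activation == "relu" else sigmoid)(Z)
    return A, ((A_prev, W, b), activation_cache)

def L_model_forward(X, parameters):
    caches = []
    A = X
    L = len(parameters) // 2  # number of layers (one W/b pair per layer)

    # [LINEAR -> RELU] * (L-1): hidden layers 1 .. L-1 only
    for l in range(1, L):
        A, cache = linear_activation_forward(
            A, parameters["W" + str(l)], parameters["b" + str(l)], "relu")
        caches.append(cache)

    # LINEAR -> SIGMOID: layer L, the output layer, handled outside the loop
    AL, cache = linear_activation_forward(
        A, parameters["W" + str(L)], parameters["b" + str(L)], "sigmoid")
    caches.append(cache)
    return AL, caches
```

The point is that the loop covers only the hidden layers, and the output layer gets its own line because it uses a different activation (sigmoid instead of ReLU).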

So the question is: which approach is correct? And if the assignment's approach is the right one (which I think it is), why does the quiz ask for something different, using a different notation/convention?

Confusing :slight_smile:

Hi @Oleksandra_Sopova

This is because when you use the for loop

for l in range(1, L):

it starts at 1 and ends at L-1. We have L layers, so one layer is not computed inside the loop: the output layer. In the lecture, the notation loops over all L layers, but that is just an illustration to simplify the lecture.
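You can check the loop bounds directly; `range(1, L)` excludes its upper bound, so for a hypothetical L = 4:

```python
# range(1, L) yields 1, 2, ..., L-1: the hidden layers only.
L = 4  # e.g. a 4-layer network (illustrative value)
hidden_layers = list(range(1, L))
print(hidden_layers)  # [1, 2, 3] -- layer 4, the output layer, is not included
```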

I hope this answers your questions.
Please feel free to ask any more questions.
Thanks,
Abdelrahman

You are right that the output layer is not computed in the loop, and that is fine, because the output layer can use a different activation function, such as sigmoid.

Ah, I just saw that I missed "g is the activation function used in all layers". That explains it: under that assumption we would not need a separate line outside the for loop, because all layers use the same activation function.
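Under the quiz's assumption of one shared activation g, the special case disappears and a single loop over layers 1..L suffices. A hypothetical sketch (the function name and parameter layout are assumptions, not course code):

```python
import numpy as np

def g(Z):
    # Suppose g, the shared activation, is sigmoid in every layer
    return 1 / (1 + np.exp(-Z))

def forward_all_same_g(X, parameters):
    # With one shared activation g, layers 1..L fit in a single loop,
    # matching the quiz's notation -- no special case for the output layer.
    A = X
    L = len(parameters) // 2
    for l in range(1, L + 1):  # note L + 1: the loop now covers layer L too
        A = g(parameters["W" + str(l)] @ A + parameters["b" + str(l)])
    return A
```

The only difference from the assignment version is the loop's upper bound: `range(1, L + 1)` includes the output layer because it no longer needs a different activation.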