W 4 | Quiz | Error in Q.7 or am I just not thinking straight?


This is a very small detail, but I can’t let it go until I know for sure. The question was:

If L is the number of layers of a neural network, then dZ^{[L]} = A^{[L]} - Y. True/False?

I assumed the answer here is False, since the question does not specify which activation function the last layer uses. The answer would be True only if we assumed a sigmoid activation. Am I correct in this logic?

Hello, Oskar.

The answer should be True here.

As you saw in Week 3, each activation function has a different derivative. During backpropagation you therefore need to know which activation was used in forward propagation in order to compute the correct gradient.
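To make that point concrete, here is a minimal sketch (function names are my own, not from the assignment) of how the hidden-layer gradient dZ = dA * g'(Z) changes depending on which activation g was used in forward prop:

```python
import numpy as np

# Each activation g has its own derivative g'(z), so backprop must
# pick the matching "backward" function for each layer.

def sigmoid_backward(dA, Z):
    s = 1.0 / (1.0 + np.exp(-Z))
    return dA * s * (1 - s)            # g'(z) = s(z) * (1 - s(z))

def tanh_backward(dA, Z):
    return dA * (1 - np.tanh(Z) ** 2)  # g'(z) = 1 - tanh(z)^2

def relu_backward(dA, Z):
    return dA * (Z > 0)                # g'(z) = 1 if z > 0, else 0

Z = np.array([-1.0, 0.5, 2.0])
dA = np.ones_like(Z)
print(relu_backward(dA, Z))  # gradient is zeroed where Z <= 0
```

Using the wrong backward function here would silently give incorrect gradients, which is why the quiz statement only holds once the output activation is pinned down.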

We use the sigmoid function as the output activation because the task asks for a "Yes"/"No" answer, i.e. a probability between 0 and 1. Other useful activation functions could be used elsewhere in the network, which is worth keeping in mind for the overall picture of the assignment. If you have any other questions, please let us know.

We are only dealing with binary classifications here in Course 1, so the activation function at the output layer is fixed: it is always sigmoid. So the correct answer there is “True”. As Rashmi says, you have choices for the activations in the hidden layers, but not at the output layer.
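To see why the statement is true for the sigmoid output layer, here is a small numerical check (a sketch, not code from the course notebooks): with a sigmoid output and binary cross-entropy loss, the analytic gradient A - Y matches a finite-difference gradient of the loss with respect to Z.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cross_entropy(z, y):
    # Binary cross-entropy applied after the sigmoid.
    a = sigmoid(z)
    return -(y * np.log(a) + (1 - y) * np.log(1 - a))

z, y = 0.7, 1.0
analytic = sigmoid(z) - y  # dZ = A - Y

# Central finite difference of the loss with respect to z.
eps = 1e-6
numeric = (cross_entropy(z + eps, y) - cross_entropy(z - eps, y)) / (2 * eps)

print(abs(analytic - numeric))  # the two gradients agree to high precision
```

The sigmoid's derivative a(1 - a) cancels against the cross-entropy's 1/(a(1 - a)) factor, which is exactly why the combined gradient simplifies to A - Y; with a different output activation or loss, that cancellation would not happen.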

Thanks for your replies. I believe it would be beneficial to rephrase the question to state explicitly that a sigmoid activation is used. Leaving that assumption to the learner will cause confusion 🙂