Hi,
I just finished the quiz for course 4, week 2. It took me a few tries to understand my mistakes, because this week covered so much material.
However, I think that there is a mistake in one of the quiz questions - the one about the ResNet block. I got this question on my first try of the quiz, but not on my second and third tries. I have made a screenshot, which I pasted below (I apologise for the spoiler, but it’s unavoidable in this case).
I think I gave the right answer here, but, as you see, it was marked incorrect. I don’t see what is wrong with it. I double-checked my notes and the lectures, and I also noticed that this exact equation popped up in a re-formulated version of the question (“What is the skip connection part?”) on my second try of the quiz.
Am I overlooking something, or was my answer indeed correct? (Also, why is there a ‘b’ added just after the question? Is this some hint that should have been removed?)
Thanks,
Reinier
I think this question is meant to clarify where the skip connection, a^{[l]}, should be merged.
Please see Andrew’s chart. a^{[l]}, on the left-hand side, goes into the next linear layer. In parallel, it skips one layer and goes into the 2nd linear layer. The key point is where it should land.
Each neuron receives the output from the previous layer, then 1) applies a linear operation with W and b to calculate z, and then 2) applies an activation function.
As you can see, the skip connection is merged after the linear layer, which is “before” the ReLU activation. In this sense, a^{[l]} needs to be added to the output of the linear operation. Then the activation function g() (ReLU in Andrew’s chart) is applied.
So, please reconsider where a^{[l]} should land.
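To make the merge point concrete, here is a minimal NumPy sketch of a residual block’s forward pass. This is my own illustration, not code from the course, and it uses fully connected layers for simplicity (the lecture’s example uses the same structure): a^{[l]} is added to z^{[l+2]}, after the second linear operation but before the final ReLU.

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def residual_block(a_l, W1, b1, W2, b2):
    """Forward pass of a simple (fully connected) residual block.

    The skip connection a^{[l]} is added AFTER the second linear
    operation (z^{[l+2]}) but BEFORE the final ReLU activation:
        a^{[l+2]} = g(z^{[l+2]} + a^{[l]})
    """
    z1 = W1 @ a_l + b1        # first linear layer: z^{[l+1]}
    a1 = relu(z1)             # first activation: a^{[l+1]}
    z2 = W2 @ a1 + b2         # second linear layer: z^{[l+2]}
    return relu(z2 + a_l)     # merge skip connection, then activate

# Tiny usage example (hypothetical shapes, just to show the flow)
rng = np.random.default_rng(0)
a_l = rng.standard_normal((4, 1))
W1 = rng.standard_normal((4, 4)); b1 = np.zeros((4, 1))
W2 = rng.standard_normal((4, 4)); b2 = np.zeros((4, 1))
a_l2 = residual_block(a_l, W1, b1, W2, b2)
print(a_l2.shape)
```

One quick sanity check: if the second layer’s weights are all zero, the block reduces to relu(a^{[l]}), which shows the skip connection really does land inside the final activation.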
Thank you very much for your answer, @anon57530071 !
It looks like I actually understood it correctly all along, but somehow confused myself completely, probably because of all the expansions. I can’t believe I got this wrong.
Good to know.
And please edit it. I appreciate your consideration for the community.