Course 1 Week 3 Quiz

Hi there,

I’m wondering why the shape of W[1] is not (4, 1) here?

{moderator edit - quiz question and answers removed}

Thanks in advance!



Because there are 2 inputs and 4 output neurons, so the shape is 4 x 2.
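A quick shape check in numpy may make this concrete (a minimal sketch; m, the number of samples, is arbitrary here):

```python
import numpy as np

m = 5                        # arbitrary number of samples
X = np.random.randn(2, m)    # 2 input features per sample
W1 = np.random.randn(4, 2)   # 4 hidden units, each seeing 2 inputs
b1 = np.zeros((4, 1))        # bias broadcasts across the m columns

Z1 = W1 @ X + b1             # (4, 2) @ (2, m) -> (4, m)
print(Z1.shape)              # (4, m)
```

A (4, 1) matrix could not even be multiplied with the (2, m) input, which is why that shape is ruled out.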

Sorry, but for the output layer, shouldn't there be 4 inputs and 1 output?

Ah, sorry, my apologies. I didn’t look closely enough at the question. The answer I gave was for W^{[1]}. But for W^{[2]}, note that there are 4 inputs and 1 output. So the dimensions would be 1 x 4, not 4 x 1, right? The math relationship is:

Z^{[2]} = W^{[2]} \cdot A^{[1]} + b^{[2]}

For this network the shape of A^{[1]} will be 4 x m, where m is the number of samples.
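To make the full chain of dimensions concrete, here is a minimal numpy sketch of the forward pass for this 2-4-1 network (the tanh hidden activation is only an illustrative assumption; any elementwise activation leaves the shapes unchanged):

```python
import numpy as np

m = 5                            # number of samples
X = np.random.randn(2, m)        # inputs: (2, m)

W1 = np.random.randn(4, 2)       # layer 1: 4 units x 2 inputs
b1 = np.zeros((4, 1))
W2 = np.random.randn(1, 4)       # layer 2: 1 unit x 4 inputs
b2 = np.zeros((1, 1))

Z1 = W1 @ X + b1                 # (4, m)
A1 = np.tanh(Z1)                 # (4, m), the 4 x m shape mentioned above
Z2 = W2 @ A1 + b2                # (1, 4) @ (4, m) + (1, 1) -> (1, m)
print(A1.shape, Z2.shape)        # (4, m) (1, m)
```

If W^{[2]} were 4 x 1 instead, the product W^{[2]} \cdot A^{[1]} would not even be defined.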
