Subtle bug in the Week 3 programming assignment

In the Week 3 graded lab, in the forward_propagation function, after the second END CODE HERE, there’s an assert that I believe is incorrect:

assert(Y_hat.shape == (n_x, X.shape[1]))

But Y_hat.shape should be (n_y, X.shape[1]). This is harmless in the single-variable part of the lab because n_x == n_y == 1. In the two-variable section, though, I expected the bug to be exposed. It wasn't: everything worked as expected. Eventually I realized what's going on: in the body of forward_propagation, n_x and n_y are not computed from the inputs; they are picked up from the notebook's global scope (Python resolves a function's free variables at call time, from the scope the function was defined in). The global n_y is 1 throughout the lab, and evidently the global n_x is also 1 whenever forward_propagation is called, so the assertion passes by accident.
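To illustrate the scoping behavior for readers who haven't run into it: the snippet below (not the lab's code, just a minimal demonstration) shows how a function body silently picks up a notebook-level variable when the name is neither a parameter nor a local.

```python
import numpy as np

n_x = 1  # notebook-level (global) variable

def check_shape(Y_hat, m):
    # n_x is neither a parameter nor a local, so Python looks it up
    # in the global scope each time the assert executes.
    assert Y_hat.shape == (n_x, m)

check_shape(np.zeros((1, 5)), 5)    # passes, but only because the global n_x is 1
n_x = 2
# check_shape(np.zeros((1, 5)), 5)  # would now raise an AssertionError
```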

So I'm really identifying two bugs: the assertion should be assert(Y_hat.shape == (n_y, X.shape[1])), and, before the assertion, n_y should be computed from the inputs, e.g. n_y, n_x = W.shape.
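For concreteness, here is a minimal sketch of what the corrected function could look like. The bare linear computation W @ X + b is an assumption based on the Ŷ = WX + b model discussed below; the lab's actual body may differ, but the point is that the sizes are derived from the arguments:

```python
import numpy as np

def forward_propagation(X, W, b):
    # X: (n_x, m) inputs, W: (n_y, n_x) weights, b: (n_y, 1) bias.
    # Derive the sizes locally instead of relying on notebook-level
    # globals leaking into the function body.
    n_y, n_x = W.shape
    Y_hat = W @ X + b  # b broadcasts from (n_y, 1) to (n_y, m)
    assert Y_hat.shape == (n_y, X.shape[1])  # corrected: n_y, not n_x
    return Y_hat
```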


Right, I agree.
Just to elaborate and provide more context for readers:
Given that Ŷ = WX + b, consider multiple linear regression where W = [w1 w2] is 1×2, X = [x1 x2]^T is 2×m, the bias b is 1×1, and the training set size is m.

The dimensions of Ŷ would then be 1×m (see below):

Ŷ = (1×2) @ (2×m) + (1×1)
Ŷ = (1×m) + (1×m)    (b broadcast from 1×1)
Ŷ = (1×m)

Hence the assertion should be assert(Y_hat.shape == (n_y, X.shape[1])), not assert(Y_hat.shape == (n_x, X.shape[1])).
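A quick NumPy check of the dimension analysis above, using illustrative values (m = 5 and the particular weights are arbitrary):

```python
import numpy as np

m = 5                        # example training set size
W = np.array([[0.3, -0.2]])  # (1, 2): n_y = 1 output, n_x = 2 inputs
X = np.random.rand(2, m)     # (2, m) inputs
b = np.array([[0.1]])        # (1, 1) bias, broadcast across the m columns

Y_hat = W @ X + b
print(Y_hat.shape)           # (1, m), i.e. (n_y, X.shape[1])
```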

For further reading, refer to Section 3.2 ("Neural Network Model with a Single Perceptron and Two Input Nodes").
