Planar Data Classification Exercise 8

I'm on Exercise 8 and all of my code passes up until the `nn_model` portion. I've checked with a friend of mine who is also taking the course and we have the same lines of code, but I'm getting an error and he isn't. I don't understand what I'm doing wrong. Can someone please share some insight/correct me?



```
ValueError                                Traceback (most recent call last)
      1 t_X, t_Y = nn_model_test_case()
----> 2 parameters = nn_model(t_X, t_Y, 4, num_iterations=10000, print_cost=True)
      4 print("W1 = " + str(parameters["W1"]))
      5 print("b1 = " + str(parameters["b1"]))

in nn_model(X, Y, n_h, num_iterations, print_cost)
---> 46     A2, cache = forward_propagation(X, parameters)
     47     cost = compute_cost(A2, Y)
     48     grads = backward_propagation(parameters, cache, X, Y)

in forward_propagation(X, parameters)
     32     # A2 = ...
---> 34     Z1 =,X)+b1
     35     A1 = np.tanh(Z1)
     36     Z2 =,A1)+b2

<__array_function__ internals> in dot(*args, **kwargs)

ValueError: shapes (4,5) and (2,3) not aligned: 5 (dim 1) != 2 (dim 0)
```

Hi @mandrews , and welcome to the specialization. This is your first time posting so I will remind you that directly posting your code in Discourse is a violation of the Coursera Honor Code. I have removed that for you; please refrain from doing so in your future posts. Thanks! :grinning:

The traceback is referring you to the `forward_propagation` function that you worked on earlier. The matrix multiplication performed by `` on line 34 is invalid due to the dimensions of `W1` and `X`: the column dimension of `W1` must equal the row dimension of `X`. Note that the traceback also points this out. In fact, with shapes `(4,5)` and `(2,3)` there is no way for these two matrices to conform to matrix multiplication.

These matrices are initially set by the `initialize_parameters` function further up in the notebook. Did you by chance "hard code" the dimension arguments to that function? In other words, the code that you complete inside that function must inherit the arguments in the function's "signature" (i.e., what follows the `def` statement): `n_x`, `n_h`, and `n_y`.


Thank you for fixing that for me. I appreciate it!

And yep, that was the issue. I’m going to blame the lack of sleep and/or caffeine for this one. Thank you for the help!