# Exercise 8 problem

Hello. I passed all the tests before Exercise 8, but on it I got this error:

```
ValueError: shapes (1,3) and (4,3) not aligned: 3 (dim 1) != 4 (dim 0)
```

I know those are the shapes of `w.T` and `X`.

But I went through my code and did not find any hardcoding. I also looked through the other comments and, likewise, could not find the answer to my problem.

Please, help me to find the problem.

Hi and welcome @Lilit.Ghalachyan,

Which line of code gives you this exception?

## Thank you for the quick response. Here is the error

```
ValueError                                Traceback (most recent call last)
in
----> 1 model_test(model)

~/work/release/W2A2/public_tests.py in model_test(target)
    109     y_test = np.array([1, 0, 1])
    110
--> 111     d = target(X, Y, x_test, y_test, num_iterations=50, learning_rate=1e-4)
    112
    113     assert type(d['costs']) == list, f"Wrong type for d['costs']. {type(d['costs'])} != list"

in model(X_train, Y_train, X_test, Y_test, num_iterations, learning_rate, print_cost)
     38     w,b=initialize_with_zeros(X_train.reshape(X_train.shape[0], -Y_train.shape[1]).T.shape[0])
     39
---> 40     parameters, grads, costs =optimize(w, b, X_train, Y_train, num_iterations, learning_rate, print_cost)
     41
     42     w=params["w"]

in optimize(w, b, X, Y, num_iterations, learning_rate, print_cost)
     35     # grads, cost = ...
     36     # YOUR CODE STARTS HERE
---> 37     grads, cost = propagate(w, b, X, Y)
     38
     39     # YOUR CODE ENDS HERE

in propagate(w, b, X, Y)
     29     # cost = ...
     30     # YOUR CODE STARTS HERE
---> 31     A=sigmoid(np.dot(w.T,X)+b)
     32     cost=-1/m*np.sum((np.dot(np.log(A),Y.T),(np.dot(np.log(1-A),(1-Y).T))))
     33     # YOUR CODE ENDS HERE

<__array_function__ internals> in dot(*args, **kwargs)

ValueError: shapes (1,3) and (4,3) not aligned: 3 (dim 1) != 4 (dim 0)
```

Hint: your `initialize_with_zeros` call doesn't look right.

I changed it to the dimension of w:

```python
w,b=initialize_with_zeros(X_train.shape[1])
```

## But got the same error

```
ValueError                                Traceback (most recent call last)
in
----> 1 model_test(model)

~/work/release/W2A2/public_tests.py in model_test(target)
    109     y_test = np.array([1, 0, 1])
    110
--> 111     d = target(X, Y, x_test, y_test, num_iterations=50, learning_rate=1e-4)
    112
    113     assert type(d['costs']) == list, f"Wrong type for d['costs']. {type(d['costs'])} != list"

in model(X_train, Y_train, X_test, Y_test, num_iterations, learning_rate, print_cost)
     38     w,b=initialize_with_zeros(X_train.shape[1])
     39
---> 40     parameters, grads, costs =optimize(w, b, X_train, Y_train, num_iterations, learning_rate, print_cost)
     41
     42     w=params["w"]

in optimize(w, b, X, Y, num_iterations, learning_rate, print_cost)
     35     # grads, cost = ...
     36     # YOUR CODE STARTS HERE
---> 37     grads, cost = propagate(w, b, X, Y)
     38
     39     # YOUR CODE ENDS HERE

in propagate(w, b, X, Y)
     29     # cost = ...
     30     # YOUR CODE STARTS HERE
---> 31     A=sigmoid(np.dot(w.T,X)+b)
     32     cost=-1/m*np.sum((np.dot(np.log(A),Y.T),(np.dot(np.log(1-A),(1-Y).T))))
     33     # YOUR CODE ENDS HERE

<__array_function__ internals> in dot(*args, **kwargs)

ValueError: shapes (1,3) and (4,3) not aligned: 3 (dim 1) != 4 (dim 0)
```

Think about the dimensions of w, b, and X.

Inside `propagate`, you can always `print(w.shape)` and `print(X.shape)` to see if your matrix multiplication works or not.

In the general case, for a matrix product A × B, the number of columns of A must match the number of rows of B.
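That rule can be checked directly in NumPy. Here is a small sketch that reproduces the exact shapes from the error message above, then shows the product succeeding once `w` matches the row count of `X` (the arrays here are stand-ins, not the assignment's data):

```python
import numpy as np

# Shapes mirroring the error: w.T is (1, 3) but X is (4, 3).
w = np.zeros((3, 1))          # mis-sized: should match the number of rows of X
X = np.random.rand(4, 3)      # 4 rows, 3 columns

try:
    np.dot(w.T, X)            # (1, 3) @ (4, 3): columns of A != rows of B
except ValueError as e:
    print(e)                  # shapes (1,3) and (4,3) not aligned: 3 (dim 1) != 4 (dim 0)

# With w sized to X's row count, the product is defined:
w = np.zeros((4, 1))
print(np.dot(w.T, X).shape)   # (1, 3): one value per column of X
```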

You still have an error in your initialization call.

In my `propagate` function I wrote `A=sigmoid(np.dot(w.T,X)+b)`, so the matrix multiplication works.
In the initialization call I should pass the number of parameters, which is `X_train.shape[1]`, shouldn't I?

I am sorry, I have looked through the code and still can't find the error.

The matrix multiplication only works if w has been initialized correctly. I advise you to `print(X_train.shape)`. What is `X_train.shape[0]`? What is `X_train.shape[1]`? You can also rewatch the videos where they talk about vectorization and the layout of X; hopefully that will give you a hint for how to initialize your parameters using the dimensions of `X_train`. As I also said in my earlier post, print the shapes inside `propagate` and you will see that the matrix multiplication does not work.

OK, thank you very much.


You are welcome! When you run into a problem, you can never debug and `print` too much while figuring out what is going on.