Week 2 Exercise 6 optimize

Hi,
I’m trying to solve exercise 6, but I’m getting an error message that I can’t figure out how to fix. Can anyone help, please? :slight_smile:

All code before this step is working fine

ValueError Traceback (most recent call last)
in
----> 1 params, grads, costs = optimize(w, b, X, Y, num_iterations=100, learning_rate=0.009, print_cost=False)
2
3 print ("w = " + str(params[“w”]))
4 print ("b = " + str(params[“b”]))
5 print ("dw = " + str(grads[“dw”]))

in optimize(w, b, X, Y, num_iterations, learning_rate, print_cost)
36 # YOUR CODE STARTS HERE
37
---> 38 grads, cost = propagate(w, b, X, Y)
39
40 # YOUR CODE ENDS HERE

in propagate(w, b, X, Y)
30 # YOUR CODE STARTS HERE
31
---> 32 A = sigmoid(np.dot(w.T, X) + b)
33 cost = -1/m * np.sum(Y*np.log(A) + (1-Y)*np.log(1-A))
34

ValueError: operands could not be broadcast together with shapes (2,3) (2,2)

thanks a lot for the help

That means that both your w^T \cdot X and your b values are the wrong shapes down in propagate. The product w^T \cdot X should be a row vector of dimension 1 \times m, where m is the number of columns of X, and b should be a scalar. The error message shows shapes (2, 3) and (2, 2) instead, so w is the wrong shape as well. Note that the bug is not in propagate: it is in optimize. Since the w and b values that were passed into the test case for optimize had the correct shapes, this implicates your “update” logic. Please check that logic. One good thing to try would be to add print statements after the “update” step to show the new values and shapes of w and b. I predict you will find some surprises there. :nerd_face:
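For reference, here is a minimal sketch of what that update step plus the suggested debugging prints could look like. This is just an illustration, not the assignment’s solution code: the names dw, db and learning_rate follow the notebook, while the helper name update_step is made up for this example.

def update_step(w, b, grads, learning_rate):
    # w has shape (n, 1) and b is a scalar; dw matches w's shape, db is a scalar
    w = w - learning_rate * grads["dw"]
    b = b - learning_rate * grads["db"]
    print("w shape after update:", w.shape)   # expect (n, 1), e.g. (2, 1) in this test case
    print("b after update:", b)               # expect a single number, not an array
    return w, b

If either print shows an unexpected array shape, the line just above it is where the shapes are getting corrupted.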

Found the bug!
Thank you very much!!!

That’s great! Congrats!

Let me guess: you were using dw to update b, right?
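In case anyone else lands here with the same error: assuming that guess is right, the fix is simply to update each parameter with its own gradient, e.g.

w = w - learning_rate * dw
b = b - learning_rate * db   # db here, not dw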