On the week two programming assignment, my code follows the intended logic and everything should be working, but I am just not passing the tests. Here is more detail on what I mean:
- For the initialization exercise, where we have to initialize the parameters w and b, I think my logic is correct, but my code is not passing the tests. There is an assertion error, but the message itself is empty. Here is the error text:
AssertionError Traceback (most recent call last)
in
2 w, b = initialize_with_zeros(dim)
3
----> 4 assert type(b) == float
5 print ("w = " + str(w))
6 print ("b = " + str(b))
AssertionError:
What could be the issue? Can I get a TA to help, or something like that?
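One guess I have (and please correct me if I'm wrong): maybe the test wants b to be a plain Python float rather than a NumPy value. Here is a tiny snippet illustrating my guess, not the grader's actual code:

```python
import numpy as np

# My guess at what the check cares about (not the grader's code):
b_as_array = np.zeros(1)  # type is np.ndarray, so `assert type(b) == float` would fail
b_as_float = 0.0          # type is float, so the same assert would pass

print(type(b_as_array))   # <class 'numpy.ndarray'>
print(type(b_as_float))   # <class 'float'>
```

Is that the kind of thing that could make the assert fail even if w and b have the right values?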
- The same thing happens with the next exercise, the forward and backward propagation problem. Here is the error output again:
AssertionError Traceback (most recent call last)
in
5 grads, cost = propagate(w, b, X, Y)
6
----> 7 assert type(grads["dw"]) == np.ndarray
8 assert grads["dw"].shape == (2, 1)
9 assert type(grads["db"]) == np.float64
AssertionError:
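My reading of those asserts is that dw has to be a NumPy array of shape (2, 1) and db a NumPy float64. Here is a small snippet showing what I understand the checks to expect (my own reconstruction with toy values, not the actual test code):

```python
import numpy as np

# Toy values matching my reading of the asserts (not the grader's code)
dw = np.zeros((2, 1))   # an np.ndarray of shape (2, 1)
db = np.float64(0.0)    # a NumPy float64 scalar
grads = {"dw": dw, "db": db}

assert type(grads["dw"]) == np.ndarray
assert grads["dw"].shape == (2, 1)
assert type(grads["db"]) == np.float64
print("These pass with the toy values above.")
```

So is the problem maybe that I am returning dw or db as, say, a Python list or a plain float, even if the math itself is right?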
- For the next question (the optimize function), I get the following error, even though I have not changed anything that should affect the dimensions of the train and test sets:
ValueError Traceback (most recent call last)
in
----> 1 params, grads, costs = optimize(w, b, X, Y, num_iterations=100, learning_rate=0.009, print_cost=False)
2
3 print ("w = " + str(params[“w”]))
4 print ("b = " + str(params[“b”]))
5 print ("dw = " + str(grads[“dw”]))
in optimize(w, b, X, Y, num_iterations, learning_rate, print_cost)
35 # grads, cost = …
36 # YOUR CODE STARTS HERE
----> 37 grads, cost = propagate(w, b, X, Y)
38
39 # YOUR CODE ENDS HERE
in propagate(w, b, X, Y)
30 # cost = …
31 # YOUR CODE STARTS HERE
----> 32 A = sigmoid(np.dot(w.T, X) + b)
33 cost = -1./m* np.sum(Y*np.log(A) + (1-Y)*np.log(1-A))
34
ValueError: operands could not be broadcast together with shapes (1,3) (2,2)
The same thing happens with the last question.
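To try to understand the ValueError, I put together a small shape check with toy inputs shaped the way I believe the test values are shaped, based on the (2, 1) expected for dw and the (1, 3) in the error message (these are my assumed shapes, not the actual test data):

```python
import numpy as np

# Assumed shapes only, not the notebook's real test values
w = np.zeros((2, 1))
b = 0.0
X = np.arange(6, dtype=float).reshape(2, 3)  # shape (2, 3)
Y = np.array([[1, 0, 1]])                    # shape (1, 3)

z = np.dot(w.T, X) + b  # (1, 2) dot (2, 3) -> (1, 3); adding a scalar b broadcasts fine
print(z.shape)          # (1, 3)
```

With these shapes I only ever get a (1, 3) result, so I don't see where the (2, 2) operand in the error comes from. Could my w or b be ending up with the wrong shape somewhere earlier in the notebook?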
Any help would be greatly appreciated! Thanks