I already checked each step one by one, even though the tests were passing. I spent around 4 hours but didn't find the issue.
My code for gradient descent is as simple as it should be.
```python
for i in range(num_iterations):
    # (≈ 1 lines of code)
    # Cost and gradient calculation
    # grads, cost = ...
    # YOUR CODE STARTS HERE
    grads, cost = propagate(w, b, X, Y)
    # YOUR CODE ENDS HERE

    # Retrieve derivatives from grads
    dw = grads["dw"]
    db = grads["db"]

    # update rule (≈ 2 lines of code)
    # w = ...
    # b = ...
    # YOUR CODE STARTS HERE
    w = w - learning_rate * dw
    b = b - learning_rate * db
    # YOUR CODE ENDS HERE
```
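For reference, here is a minimal self-contained sketch of how I understand the full loop to work, assuming `propagate` computes the standard logistic-regression cost and gradients (sigmoid activation, cross-entropy cost). The `propagate` and `optimize` bodies below are my own reconstruction for illustration, not the course's reference code, so names and defaults outside the snippet above are assumptions.

```python
import numpy as np

def sigmoid(z):
    # Plain sigmoid; sufficient for this small sketch.
    return 1 / (1 + np.exp(-z))

def propagate(w, b, X, Y):
    # Forward pass: activations and cross-entropy cost.
    m = X.shape[1]
    A = sigmoid(np.dot(w.T, X) + b)          # shape (1, m)
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
    # Backward pass: gradients of the cost w.r.t. w and b.
    dw = np.dot(X, (A - Y).T) / m            # same shape as w
    db = np.sum(A - Y) / m
    return {"dw": dw, "db": db}, float(cost)

def optimize(w, b, X, Y, num_iterations=100, learning_rate=0.009):
    costs = []
    for i in range(num_iterations):
        grads, cost = propagate(w, b, X, Y)
        dw, db = grads["dw"], grads["db"]
        # Gradient descent update rule.
        w = w - learning_rate * dw
        b = b - learning_rate * db
        if i % 100 == 0:
            costs.append(cost)
    return w, b, costs

# Tiny usage example with random data.
np.random.seed(0)
X = np.random.randn(2, 5)
Y = (np.random.rand(1, 5) > 0.5).astype(float)
w, b, costs = optimize(np.zeros((2, 1)), 0.0, X, Y)
print(costs[0])
```

With this standalone version the cost decreases over iterations, which is why I believe the update rule in my submitted code is not the problem.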