Hello, I have a problem calculating the cost function using the `propagate(w, b, X, Y)` method; the test keeps saying "wrong values for costs". Has anyone faced this?

Here are my outputs for that test case:

```
w = [[0.19033591]
[0.12259159]]
b = 1.9253598300845747
dw = [[0.67752042]
[1.41625495]]
db = 0.21919450454067654
Costs = [array(5.80154532)]
All tests passed!
```

But notice that the test you failed returns 2 cost values. The first one, after 0 iterations, agrees, but the second one, after 100 iterations, is wrong. Note also that your dw and db values are correct, and I assume your *propagate* code passes its tests. So that means your gradients dw and db are not the problem.

So what else is new in *optimize*? It is the updating of the parameters, right? I would suggest that is the first place to look for mistakes. Are you sure you faithfully implemented what the math formulas tell you to do for the "update" step?
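To make that concrete, here is a rough sketch of the usual shape of *optimize* in this kind of exercise. This is not the official assignment code; the signature of *propagate* (returning a grads dict plus the cost) and the default arguments are my assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def propagate(w, b, X, Y):
    # Forward pass: compute the logistic regression cost.
    # Backward pass: compute gradients dw, db.
    m = X.shape[1]
    A = sigmoid(w.T @ X + b)
    cost = -np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m
    dw = X @ (A - Y).T / m
    db = np.sum(A - Y) / m
    return {"dw": dw, "db": db}, cost

def optimize(w, b, X, Y, num_iterations=100, learning_rate=0.009):
    costs = []
    for i in range(num_iterations):
        grads, cost = propagate(w, b, X, Y)
        # The "update" step: note that it must use the learning_rate
        # argument, not a hard-coded number.
        w = w - learning_rate * grads["dw"]
        b = b - learning_rate * grads["db"]
        if i % 100 == 0:
            costs.append(cost)
    return w, b, costs
```

If the update step is wrong (wrong sign, wrong factor, or a literal instead of `learning_rate`), the cost after 0 iterations will still be right but every later cost will be off, which is exactly the failure pattern here.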

*Update*: I figured out how to get exactly the wrong value you show for the second cost:

`AssertionError: Wrong values for costs. [array(5.80154532), array(1.05593344)] != [5.80154532, 0.31057104]`

There are two things different about the second test case, the one that fails: it uses 101 iterations instead of 100 (that's what causes you to get 2 cost values), and it uses a learning rate of 0.1. But if I use the default learning rate of 0.009, I get exactly the wrong value you show. So somehow you are overriding or "hard-coding" the learning rate. You can examine the test cases yourself by clicking "*File → Open*", opening the file *public_tests.py*, and having a look around.
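In other words, the bug probably looks something like this (a minimal illustration with made-up values, not your actual code):

```python
import numpy as np

def update_buggy(w, dw, learning_rate):
    # Bug: ignores the learning_rate argument and hard-codes 0.009,
    # so a test that passes learning_rate=0.1 is silently ignored.
    return w - 0.009 * dw

def update_fixed(w, dw, learning_rate):
    # Fix: use the parameter, so the caller's value takes effect.
    return w - learning_rate * dw

w = np.array([[1.0], [2.0]])
dw = np.array([[0.5], [0.5]])

# With the default rate of 0.009 the two versions agree, which is why
# the first test passes; with learning_rate=0.1 they diverge.
print(update_buggy(w, dw, 0.1))
print(update_fixed(w, dw, 0.1))
```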

Yes, that was it. The error was raised because I used the raw value of the learning rate, "0.009", instead of the `learning_rate` variable in my code. Thank you for your help.