W2_A2_Ex-6_Understanding computation_Grads = {"dw": dw, "db": db}

Why are we doing this in the propagate function:

    grads = {"dw": dw,
             "db": db}
and then in the optimization function:

    # Retrieve derivatives from grads
    dw = grads["dw"]
    db = grads["db"]

and how do we get to the update rule where the weights and bias become, respectively:

    w = w - learning_rate * dw
    b = b - learning_rate * db

Thanks for any help.

OK, I will give you some guidance:

Here you are storing the gradients of the weights and bias (dw and db) in a dictionary as key:value pairs; read about Python dictionaries online to get an idea of how they work.

And here you are reading them back out of the dictionary, using the same string keys. For example:
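Here is a minimal, self-contained sketch of that round trip; the numeric values below are just placeholders for illustration, not values from the assignment:

    # Inside propagate: pack the gradients into one dictionary so they can be
    # returned together as a single object.
    dw = 0.5    # placeholder gradient of the cost with respect to w
    db = -0.1   # placeholder gradient of the cost with respect to b
    grads = {"dw": dw, "db": db}

    # Inside optimize: read the same values back out using the same string keys.
    dw = grads["dw"]
    db = grads["db"]
    print(dw, db)   # prints: 0.5 -0.1

This way propagate can return both gradients as a single object, and optimize can unpack them by name.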

I think at this point, if you have followed the lectures, you should be able to understand this. What is basically happening here is that you are updating both the weights and the bias by subtracting from them the gradients (dw and db) multiplied by the learning rate.
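As a concrete illustration of one update step (again with placeholder numbers, and with w as a small NumPy array; this is only a sketch of the rule, not the graded code):

    import numpy as np

    learning_rate = 0.01
    w = np.array([[1.0], [2.0]])    # placeholder weights
    b = 0.5                         # placeholder bias
    dw = np.array([[0.2], [-0.4]])  # placeholder gradient of the cost w.r.t. w
    db = 0.1                        # placeholder gradient of the cost w.r.t. b

    # One step of gradient descent: move each parameter a small amount,
    # scaled by the learning rate, in the direction that lowers the cost.
    w = w - learning_rate * dw
    b = b - learning_rate * db

Each iteration of the optimization loop repeats this step with freshly computed gradients from propagate.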

Go back to the lectures again; if you follow them properly, you should be in a position to understand this.