Course 1, Week 2: Gradient descent

Are the parameters w and b updated simultaneously, or are they updated one after the other?

You have to update both w and b. If by "simultaneously" you mean at the same step (i.e., both new values are computed from the current values of w and b before either one is overwritten), the answer is yes. There is no parallel programming involved, though :slight_smile:

# update rule (≈ 2 lines of code); assumes alpha, dj_dw, dj_db
# were already computed from the current values of w and b
w = w - alpha * dj_dw
b = b - alpha * dj_db
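
For context, here is a minimal sketch of the full loop that update rule sits in, assuming the one-variable squared-error linear regression from the course; the function name gradient_descent and the names alpha and num_iters are illustrative, not the assignment's exact code. The point to notice is that both gradients are computed from the current (w, b) before either parameter is overwritten, which is exactly what "simultaneous update" means:

import numpy as np

def gradient_descent(x, y, w, b, alpha, num_iters):
    # One-variable linear regression: f(x) = w * x + b,
    # squared-error cost J(w, b) = (1 / (2m)) * sum((f(x) - y)**2).
    m = x.shape[0]
    for _ in range(num_iters):
        # Both gradients are evaluated at the CURRENT (w, b)
        # before either parameter changes.
        err = w * x + b - y          # prediction errors, shape (m,)
        dj_dw = (err @ x) / m        # partial derivative of J w.r.t. w
        dj_db = err.sum() / m        # partial derivative of J w.r.t. b

        # Simultaneous update: both assignments use the gradients
        # computed above, never a half-updated w or b.
        w = w - alpha * dj_dw
        b = b - alpha * dj_db
    return w, b

Because dj_dw and dj_db are computed before either assignment, the order of the two update lines does not matter. The non-simultaneous variant would update w first and then recompute dj_db using the new w, which is not what the course teaches.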

They are updated simultaneously, as Dr. Andrew Ng says in the ML course.