Course 1: Week 2 Logistic_Regression_with_a_Neural_Network_mindset

Why are we creating a deep copy of w and b and saving them back into themselves in the optimize function?
```python
import copy

def optimize(w, b, X, Y, num_iterations=100, learning_rate=0.009, print_cost=False):
    """
    This function optimizes w and b by running a gradient descent algorithm.

    Arguments:
    w -- weights, a numpy array of size (num_px * num_px * 3, 1)
    b -- bias, a scalar
    X -- data of shape (num_px * num_px * 3, number of examples)
    Y -- true "label" vector (containing 0 if non-cat, 1 if cat), of shape (1, number of examples)
    num_iterations -- number of iterations of the optimization loop
    learning_rate -- learning rate of the gradient descent update rule
    print_cost -- True to print the loss every 100 steps

    Returns:
    params -- dictionary containing the weights w and bias b
    grads -- dictionary containing the gradients of the weights and bias with respect to the cost function
    costs -- list of all the costs computed during the optimization; this will be used to plot the learning curve.

    Tips:
    You basically need to write down two steps and iterate through them:
        1) Calculate the cost and the gradient for the current parameters. Use propagate().
        2) Update the parameters using the gradient descent rule for w and b.
    """

    w = copy.deepcopy(w)
    b = copy.deepcopy(b)

    costs = []

    for i in range(num_iterations):
        # Cost and gradient calculation
        grads, cost = propagate(w, b, X, Y)

        # Retrieve derivatives from grads
        dw = grads["dw"]
        db = grads["db"]

        # Gradient descent update rule
        w = w - learning_rate * dw
        b = b - learning_rate * db

        # Record the costs
        if i % 100 == 0:
            costs.append(cost)

        # Print the cost every 100 training iterations
        if print_cost and i % 100 == 0:
            print("Cost after iteration %i: %f" % (i, cost))

    params = {"w": w, "b": b}
    grads = {"dw": dw, "db": db}

    return params, grads, costs
```
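For anyone who wants to run the snippet standalone: optimize() depends on propagate(), which the assignment defines earlier in the notebook. Here is a minimal sketch of it under the assignment's conventions (sigmoid forward pass, cross-entropy cost, gradients averaged over the m examples), followed by a call on toy data. The toy data shapes and values are made up for illustration.

```python
import numpy as np

def sigmoid(z):
    # Logistic function applied elementwise
    return 1 / (1 + np.exp(-z))

def propagate(w, b, X, Y):
    # Forward pass: predicted probabilities for all m examples
    m = X.shape[1]
    A = sigmoid(np.dot(w.T, X) + b)                  # shape (1, m)
    # Cross-entropy cost averaged over the m examples
    cost = -np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m
    # Backward pass: gradients of the cost w.r.t. w and b
    dw = np.dot(X, (A - Y).T) / m                    # shape (n, 1)
    db = np.sum(A - Y) / m                           # scalar
    return {"dw": dw, "db": db}, float(np.squeeze(cost))

# Toy data: 4 features, 6 examples
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 6))
Y = (rng.random((1, 6)) > 0.5).astype(float)
w0 = np.zeros((4, 1))
b0 = 0.0

params, grads, costs = optimize(w0, b0, X, Y, num_iterations=300,
                                learning_rate=0.1, print_cost=True)
print("final cost:", costs[-1])
```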

I have the same problem.

Please see this thread for the explanation.
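In short (my understanding, not the official answer): copy.deepcopy breaks the aliasing between the local w, b and the arrays the caller passed in. With the rebinding update w = w - learning_rate * dw the copy is technically redundant, but if the update were written in place (w -= learning_rate * dw), the caller's array would be silently mutated, and the grader reuses the same arrays across tests. A tiny demo of the aliasing behavior:

```python
import copy
import numpy as np

w_caller = np.ones((3, 1))

# Without a copy: an in-place update mutates the caller's array
w = w_caller
w -= 0.5 * np.ones((3, 1))
print(w_caller.ravel())   # [0.5 0.5 0.5] -- caller's data changed!

# With a deep copy: the caller's array is untouched
w_caller = np.ones((3, 1))
w = copy.deepcopy(w_caller)
w -= 0.5 * np.ones((3, 1))
print(w_caller.ravel())   # [1. 1. 1.]
```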