Exercise 7 in Week 3 asks for an implementation of the function update_parameters, which takes three arguments: parameters, grads, and learning_rate = 1.2. Within the function body we are asked to supply the gradients. However, computing the gradients of course requires the training data. One obvious way to calculate them (following the logic of the notebook) is to use the previously defined function backward_propagation, like so:
grads = backward_propagation(parameters=copied_parameters, cache=cache, X=X, Y=Y)
However, the argument list of the sought-after function update_parameters does not include cache, X, or Y, nor are these values available inside the function body or provided as optional arguments.
This leaves me puzzled about how to satisfy the requirements of this exercise.
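For context, here is a minimal sketch of a function matching the stated signature. This is my own illustration, not the notebook's solution: it assumes grads is computed by the caller (e.g. via backward_propagation) and merely applied inside the function, and the key names W1/b1 and dW1/db1 are my assumption based on the notebook's convention of prefixing each gradient with "d".

```python
import numpy as np

def update_parameters(parameters, grads, learning_rate=1.2):
    # One gradient-descent step: theta := theta - learning_rate * d_theta.
    # Note that grads arrives fully computed; no training data is needed here.
    # Key names ("W1" -> "dW1", etc.) are assumed from the notebook's convention.
    return {key: value - learning_rate * grads["d" + key]
            for key, value in parameters.items()}

# Hypothetical toy values, just to show the call pattern:
parameters = {"W1": np.array([[1.0, 2.0]]), "b1": np.array([[0.5]])}
grads = {"dW1": np.array([[0.1, 0.2]]), "db1": np.array([[0.1]])}
updated = update_parameters(parameters, grads)
```

Under this reading, grads would be produced outside the function and passed in, rather than recomputed inside it.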