Question on GradientTape

In the “Introduction to TensorFlow” exercise, forward_propagation takes parameters, but the gradient is later computed with respect to trainable_variables = [W1, b1, W2, b2, W3, b3]:

    # inside forward_propagation: unpack the tf.Variables from the parameters dict
    W1 = parameters['W1']
    b1 = parameters['b1']
    W2 = parameters['W2']
    b2 = parameters['b2']
    W3 = parameters['W3']
    b3 = parameters['b3']

    ...            

            # in the training loop, for each minibatch:
            with tf.GradientTape() as tape:
                Z3 = forward_propagation(tf.transpose(minibatch_X), parameters)
                minibatch_total_loss = compute_total_loss(Z3, tf.transpose(minibatch_Y))

            train_accuracy.update_state(minibatch_Y, tf.transpose(Z3))
            # the same tf.Variable objects, collected for tape.gradient
            trainable_variables = [W1, b1, W2, b2, W3, b3]

This is a bit confusing. Do I understand correctly that TensorFlow knows the W1, …, b3 in trainable_variables reference the same objects that forward_propagation pulls out of parameters?

Yes. parameters['W1'] and W1 are two Python references to the same tf.Variable object. A GradientTape automatically watches every trainable tf.Variable touched while the tape is recording, and it identifies variables by object identity, not by name. So when trainable_variables is later passed to tape.gradient, the tape matches each variable in the list against the ones it recorded inside forward_propagation, no matter which reference was used to reach them.
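
Here is a minimal standalone sketch (not from the assignment; the names parameters, W, and loss are just for illustration) showing the tape resolving a variable by identity: the dict reference is used inside the tape, while the bare name is passed to tape.gradient:

    import tensorflow as tf

    # two names for the same tf.Variable, mirroring parameters['W1'] and W1
    parameters = {'W': tf.Variable(3.0)}
    W = parameters['W']
    print(W is parameters['W'])      # True: identical objects

    with tf.GradientTape() as tape:
        # use the dict reference inside the tape...
        loss = parameters['W'] ** 2

    # ...but differentiate with respect to the bare reference
    grads = tape.gradient(loss, [W])
    print(grads[0].numpy())          # 6.0, i.e. d(W^2)/dW = 2*W at W = 3

The flip side: if the list contained a freshly created variable that never participated in the forward pass, tape.gradient would return None for it, since the tape has no recorded operations tied to that object.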