C4 W4 A2: Possible broken link in the Neural Style Transfer notebook

Exercise 6 - train_step
Implement the train_step() function for transfer learning

- Use the Adam optimizer to minimize the total cost J.
- Use a learning rate of 0.03.
- Adam Optimizer documentation
- You will use tf.GradientTape to update the image. Within the tf.GradientTape():
  - Compute the encoding of the generated image using vgg_model_outputs.
  - Compute the total cost J, using the variables a_C, a_S and a_G.
  - Use alpha = 10 and beta = 40.
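For anyone landing here from the notebook, the steps above can be sketched roughly as follows. This is a minimal runnable sketch, not the notebook's solution: `vgg_model_outputs`, `a_C`, `a_S`, and the cost helpers are replaced by toy scalar stand-ins so it runs outside the assignment, and the real cost functions (Gram-matrix style cost, content cost) differ.

```python
import tensorflow as tf

alpha, beta = 10, 40
optimizer = tf.keras.optimizers.Adam(learning_rate=0.03)

# Toy stand-ins (hypothetical): in the notebook these come from VGG-19.
vgg_model_outputs = lambda img: img      # placeholder "encoder"
a_C = tf.constant(1.0)                   # content encoding of the content image
a_S = tf.constant(0.0)                   # style encoding of the style image

def compute_content_cost(a_C, a_G):
    # Simplified squared-difference cost, standing in for the notebook's version.
    return tf.reduce_mean(tf.square(a_C - a_G))

def compute_style_cost(a_S, a_G):
    # Simplified squared-difference cost, standing in for the Gram-matrix cost.
    return tf.reduce_mean(tf.square(a_S - a_G))

generated_image = tf.Variable(0.5)

@tf.function
def train_step(generated_image):
    with tf.GradientTape() as tape:
        # 1) Compute the encoding of the generated image
        a_G = vgg_model_outputs(generated_image)
        # 2) Total cost J = alpha * J_content + beta * J_style
        J = (alpha * compute_content_cost(a_C, a_G)
             + beta * compute_style_cost(a_S, a_G))
    # Gradient of J with respect to the generated image, then one Adam step
    grad = tape.gradient(J, generated_image)
    optimizer.apply_gradients([(grad, generated_image)])
    return J
```

Running `train_step` repeatedly should drive J down as the "image" (here just a scalar) moves toward a compromise between the content and style targets.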

The link on tf.GradientTape() in the instructions above appears to be broken.

(I just wanted to report this.)

Have a nice day.

Thanks for your report. I’ll submit a support ticket to fix that.