C4 W4 A2 - Neural Style Transfer

I’m attempting to implement the TensorFlow train_step() function, but I keep getting the error below.
I’ve tried looking at the docs as well as the gradient tape lecture, but still can’t resolve it. Please assist.

ValueError: in user code:

<ipython-input-133-effcf516998e>:34 train_step  *
    optimizer.apply_gradients([(grad, generated_image)])
/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/optimizer_v2/optimizer_v2.py:513 apply_gradients  **
    grads_and_vars = _filter_grads(grads_and_vars)
/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/optimizer_v2/optimizer_v2.py:1271 _filter_grads
    ([v.name for _, v in grads_and_vars],))

ValueError: No gradients provided for any variable: ['Variable:0'].

Please click my name and message your notebook as an attachment.

GradientTape is quite a useful tool, but you need to be careful that the target variables are computed within the scope of “with tf.GradientTape() as tape:”.
In this case, J seems to be the problem. While waiting for Balaji’s inspection, please check the indentation of J = … It needs to be at the same level as J_style, J_content, … i.e., within the scope of the GradientTape.
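Very roughly, the structure should look like the sketch below. This is only an outline, assuming the notebook’s helpers and globals (vgg_model_outputs, compute_style_cost, compute_content_cost, total_cost, clip_0_1, a_S, a_C, optimizer) are defined as in the assignment; your actual lines may differ:

```python
import tensorflow as tf

# Rough sketch only, not a drop-in solution: it assumes the notebook's
# helpers and globals (vgg_model_outputs, compute_style_cost,
# compute_content_cost, total_cost, clip_0_1, a_S, a_C, optimizer)
# are already defined, as they are in the assignment.
@tf.function()
def train_step(generated_image):
    with tf.GradientTape() as tape:
        # Every op the cost depends on must run inside this block so the
        # tape records how J depends on generated_image.
        a_G = vgg_model_outputs(generated_image)
        J_style = compute_style_cost(a_S, a_G)
        J_content = compute_content_cost(a_C, a_G)
        J = total_cost(J_content, J_style)  # same indentation level as J_style / J_content

    # The gradient lookup and the update happen after the block exits.
    grad = tape.gradient(J, generated_image)
    optimizer.apply_gradients([(grad, generated_image)])
    generated_image.assign(clip_0_1(generated_image))
    return J
```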

Thanks. I checked and the indentation is good: J does get computed within the context of the gradient tape. The call to tape is outside the context though… but the hard-coded Python doesn’t set the persistent flag, so I didn’t change any code outside of the “place your code here” sections.

The call to tape is outside the context though…

That’s OK; tape.gradient() is meant to be called after the with block has exited. So, please wait for Balaji’s check.

There’s no need to do this when invoking vgg_model_outputs: generated_image = tf.Variable(tf.image.convert_image_dtype(content_image, tf.float32)). Use the function parameter generated_image instead, so that the gradients are tracked with respect to the variable the optimizer updates.
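Roughly speaking, the difference is the following (illustrative only; your surrounding code may differ):

```python
# Problematic: a brand-new Variable built from content_image is not the
# variable the optimizer updates, so no gradient reaches 'Variable:0'.
# a_G = vgg_model_outputs(
#     tf.Variable(tf.image.convert_image_dtype(content_image, tf.float32)))

# Correct: pass the train_step parameter directly so the tape tracks it.
a_G = vgg_model_outputs(generated_image)
```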


It worked :laughing: Thanks, Balaji!