Hi,
I’m confused because I’ve hit an error in section 3.3, “Train the Model”, of the programming exercise, even though all the prior code executes and passes its tests. This is despite the section requiring no additional code from me.
When I run the code block:
parameters, costs, train_acc, test_acc = model(new_train, new_y_train, new_test, new_y_test, num_epochs=100)
The error I get is:
ValueError: No gradients provided for any variable: ['Variable:0', 'Variable:0', 'Variable:0', 'Variable:0', 'Variable:0', 'Variable:0'].
It would seem that the gradients I’m supplying to optimizer.apply_gradients aren’t defined, but I’m not sure why that would be when all the other code runs successfully. Any pointers greatly appreciated!
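For context, this is the pattern I understand tf.GradientTape and optimizer.apply_gradients expect: the forward pass and the loss have to be computed inside the tape context so the tape can trace them back to the trainable variables. The snippet below is a small standalone toy example I put together to illustrate that pattern (toy tensors and a squared-error loss, not the notebook’s variables or helper functions), mirroring the tape.gradient / apply_gradients lines that appear in my traceback:

    import tensorflow as tf

    # two toy trainable variables standing in for the notebook's parameters
    W = tf.Variable(tf.random.normal((3, 2)), name="W")
    b = tf.Variable(tf.zeros((3, 1)), name="b")
    trainable_variables = [W, b]
    optimizer = tf.keras.optimizers.Adam(learning_rate=0.01)

    X = tf.random.normal((2, 5))   # toy minibatch of inputs
    Y = tf.random.normal((3, 5))   # toy targets

    with tf.GradientTape() as tape:
        # forward pass and loss inside the tape context, so the tape records
        # the ops connecting the loss to W and b
        Z = tf.matmul(W, X) + b
        loss = tf.reduce_mean(tf.square(Z - Y))

    # gradients of the loss w.r.t. the trainable variables (should not be None)
    grads = tape.gradient(loss, trainable_variables)
    optimizer.apply_gradients(zip(grads, trainable_variables))

Here is the full traceback: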
ValueError                                Traceback (most recent call last)
<ipython-input-...> in <module>
----> 1 parameters, costs, train_acc, test_acc = model(new_train, new_y_train, new_test, new_y_test, num_epochs=100)

<ipython-input-...> in model(X_train, Y_train, X_test, Y_test, learning_rate, num_epochs, minibatch_size, print_cost)
     74             grads = tape.gradient(minibatch_total_loss, trainable_variables)
     75             print(grads) ##CAH added
---> 76             optimizer.apply_gradients(zip(grads, trainable_variables))
     77             epoch_total_loss += minibatch_total_loss
     78

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/optimizer_v2/optimizer_v2.py in apply_gradients(self, grads_and_vars, name, experimental_aggregate_gradients)
    511       ValueError: If none of the variables have gradients.
    512     """
--> 513     grads_and_vars = _filter_grads(grads_and_vars)
    514     var_list = [v for (_, v) in grads_and_vars]
    515

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/optimizer_v2/optimizer_v2.py in _filter_grads(grads_and_vars)
   1269   if not filtered:
   1270     raise ValueError("No gradients provided for any variable: %s." %
-> 1271                      ([v.name for _, v in grads_and_vars],))
   1272   if vars_with_empty_grads:
   1273     logging.warning(

ValueError: No gradients provided for any variable: ['Variable:0', 'Variable:0', 'Variable:0', 'Variable:0', 'Variable:0', 'Variable:0'].