W2 A2 Fine-tuning the model, metrics problem

Hello!

I have a problem in the 'Transfer Learning' assignment with Exercise 3.

I've used the metric `tf.keras.metrics.Accuracy(name='accuracy')`, but the assertion fails with "'Accuracy' object is not subscriptable".
I don't know what to try next, so I would be very thankful for any help.

Sincerely,
Anastasia

@anastasia, try checking if you have any extra commas in your lines of code.
If `metrics` is the last argument, it might not need a trailing comma.

I don't really have any extra commas. Here is a bigger piece of the code:

```python
loss_function = tf.keras.losses.BinaryCrossentropy(from_logits=True)
optimizer = tf.keras.optimizers.Adam(learning_rate=0.1 * base_learning_rate)
metrics = tf.keras.metrics.Accuracy()

model2.compile(loss=loss_function,
               optimizer=optimizer,
               metrics=metrics)
```

Then there is the assertion where I get the error "'Accuracy' object is not subscriptable":

```python
assert metrics[0] == 'accuracy', "Wrong metric"
```

Hey, you need to pass the accuracy metric in a list, since TensorFlow expects a list of metrics in the `metrics` argument of `compile()`.

And as in the previous assignments (and once earlier in this assignment as well), you just need to use the string `'accuracy'` rather than `tf.keras.metrics.Accuracy()`, because the test checks for the string.
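Putting both fixes together, the compile call would look like the sketch below. The tiny `Dense` model and the `base_learning_rate` value are stand-ins of mine, not the assignment's actual `model2` (which is MobileNetV2-based) or its learning rate; the point is only the shape of the `metrics` argument:

```python
import tensorflow as tf

# Stand-in for model2 from the assignment (the real one is MobileNetV2-based).
model2 = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

base_learning_rate = 0.001  # assumed value; use the one defined in the notebook

loss_function = tf.keras.losses.BinaryCrossentropy(from_logits=True)
optimizer = tf.keras.optimizers.Adam(learning_rate=0.1 * base_learning_rate)
metrics = ['accuracy']  # a list containing the string, not a Metric object

model2.compile(loss=loss_function,
               optimizer=optimizer,
               metrics=metrics)

# The grader's check now passes: a list is subscriptable and metrics[0]
# is the string 'accuracy'.
assert metrics[0] == 'accuracy', "Wrong metric"
```

With a `tf.keras.metrics.Accuracy()` object instead of the list, `metrics[0]` raises the "not subscriptable" error from the original post.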


Thanks a lot!! :smile:

Hey!
Thank you for the answer! :slight_smile:
So just to be sure: calling `tf.keras.metrics.Accuracy()` is equivalent to passing `['accuracy']`, it just doesn't pass the assert test, right? :face_with_monocle:


You'll still have to put `tf.keras.metrics.Accuracy()` in a list, so the equivalent statements would be `[tf.keras.metrics.Accuracy()]` and `['accuracy']`, but yes :slightly_smiling_face:.

Another fun thing the `metrics` argument allows is passing your own custom metrics, which you can build by extending `tf.keras.metrics.Metric` and adding them to the metrics list.
So if I made an F1-score custom metric named `f_1`, I could easily add it by passing `metrics=['accuracy', f_1]` to `compile()`.
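As a sketch of what that custom metric might look like (the class name, the `threshold` parameter, and the epsilon smoothing are my own choices, not anything from the assignment), you subclass `tf.keras.metrics.Metric` and implement `update_state`, `result`, and `reset_state`:

```python
import tensorflow as tf

class F1Score(tf.keras.metrics.Metric):
    """Binary F1 score accumulated from running TP/FP/FN counts."""

    def __init__(self, name="f_1", threshold=0.5, **kwargs):
        super().__init__(name=name, **kwargs)
        self.threshold = threshold
        # State variables persist across batches until reset_state().
        self.tp = self.add_weight(name="tp", initializer="zeros")
        self.fp = self.add_weight(name="fp", initializer="zeros")
        self.fn = self.add_weight(name="fn", initializer="zeros")

    def update_state(self, y_true, y_pred, sample_weight=None):
        y_true = tf.cast(tf.reshape(y_true, [-1]), tf.float32)
        # Binarize predictions at the threshold.
        y_pred = tf.cast(tf.reshape(y_pred, [-1]) > self.threshold, tf.float32)
        self.tp.assign_add(tf.reduce_sum(y_true * y_pred))
        self.fp.assign_add(tf.reduce_sum((1.0 - y_true) * y_pred))
        self.fn.assign_add(tf.reduce_sum(y_true * (1.0 - y_pred)))

    def result(self):
        precision = self.tp / (self.tp + self.fp + 1e-7)
        recall = self.tp / (self.tp + self.fn + 1e-7)
        return 2.0 * precision * recall / (precision + recall + 1e-7)

    def reset_state(self):
        for v in (self.tp, self.fp, self.fn):
            v.assign(0.0)

f_1 = F1Score()
```

Then `model2.compile(..., metrics=['accuracy', f_1])` would report both accuracy and F1 during training, assuming a binary task like the one in this assignment.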


Ok, thanks for the quick answer!
I just came back after a few months away, and the new assignments are interesting (and better, imo)!