Week 2, Transfer Learning, Alpaca

I am using the code:
loss_function = tf.keras.losses.BinaryCrossentropy(from_logits=True)
optimizer = tf.keras.optimizers.Adam(0.1 * base_learning_rate)
metrics = tf.keras.metrics.Accuracy()
model2.compile(optimizer=optimizer,
               loss=loss_function,
               metrics=metrics)

but get the error:

TypeError Traceback (most recent call last)
<ipython-input-...> in <module>
      3 assert type(optimizer) == tf.keras.optimizers.Adam, "This is not an Adam optimizer"
      4 assert optimizer.lr == base_learning_rate / 10, "Wrong learning rate"
----> 5 assert metrics[0] == 'accuracy', "Wrong metric"
      6
      7 print('\033[92mAll tests passed!')

TypeError: 'Accuracy' object is not subscriptable

Changing to metrics=['accuracy'] doesn't help either.
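For context, the test cell asserts metrics[0] == 'accuracy', so it expects the variable metrics to be a plain Python list containing the string 'accuracy', not a tf.keras.metrics.Accuracy object. Here is a minimal, runnable sketch of that pattern; the toy model and the base_learning_rate value below are stand-ins for whatever the notebook defines, not the assignment's actual code:

import tensorflow as tf

# Stand-in for the notebook's model2, just so the sketch runs on its own.
model2 = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
base_learning_rate = 0.001  # placeholder; the notebook sets its own value

loss_function = tf.keras.losses.BinaryCrossentropy(from_logits=True)
optimizer = tf.keras.optimizers.Adam(learning_rate=0.1 * base_learning_rate)
metrics = ['accuracy']  # a list of strings, so metrics[0] == 'accuracy' holds

model2.compile(optimizer=optimizer, loss=loss_function, metrics=metrics)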

You need to pass the optimizer by calling its constructor correctly. Follow this documentation.

In Keras, metrics is a list. You have to pass the metric as a list. Follow this documentation.
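A quick illustration of the difference (a hedged sketch, not code from the assignment):

import tensorflow as tf

metrics = tf.keras.metrics.Accuracy()
# metrics[0]                        # TypeError: 'Accuracy' object is not subscriptable

metrics = ['accuracy']              # Keras also accepts metric names as strings in a list
print(metrics[0] == 'accuracy')     # True, which is what the test asserts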

Hey, I think this is a TensorFlow bug.
I used:

model2.compile(optimizer=optimizer,
               loss=loss_function,
               metrics=[tf.keras.metrics.Accuracy()])

and then:
model2.metrics
gave me an empty list.
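That empty list is not necessarily a bug: as far as I know, metrics passed to compile() only show up in model.metrics after the model has been trained or evaluated on actual data. A small sketch with a toy model (not the assignment's model2) to illustrate:

import numpy as np
import tensorflow as tf

toy_model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
toy_model.compile(optimizer='adam',
                  loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
                  metrics=['accuracy'])

print(toy_model.metrics)                    # typically [] right after compile; metrics are built lazily

x = np.random.rand(8, 4).astype('float32')
y = np.random.randint(0, 2, size=(8, 1)).astype('float32')
toy_model.evaluate(x, y, verbose=0)

print([m.name for m in toy_model.metrics])  # e.g. ['loss', 'accuracy'] once they are built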

Or you can just try metrics = ["accuracy"]

Yup, still gives an error.

Try metrics = ["accuracy"]. Assign it to a variable and then pass it to model2.compile() with

metrics=metrics

Yup, I tried that. I guess it's a TensorFlow bug; TF version 2.3.0 has it.

I am running into similar issues. Can anyone confirm whether or not this is a TensorFlow bug?