TypeError: 'Accuracy' object is not subscriptable

UNQ_C3

base_model = model2.layers[4]
base_model.trainable = True

# Let's take a look to see how many layers are in the base model

print("Number of layers in the base model: ", len(base_model.layers))

# Fine-tune from this layer onwards

fine_tune_at = 120

# START CODE HERE

# Freeze all the layers before the fine_tune_at layer
for layer in base_model.layers[:fine_tune_at]:
    layer.trainable = False

# Define a BinaryCrossentropy loss function. Use from_logits=True

loss_function=tf.keras.losses.BinaryCrossentropy(from_logits=True)

# Define an Adam optimizer with a learning rate of 0.1 * base_learning_rate

optimizer = tf.keras.optimizers.Adam(learning_rate=base_learning_rate * 0.1)

# Use accuracy as evaluation metric

metrics=tf.keras.metrics.Accuracy()

# END CODE HERE

model2.compile(loss=loss_function,
               optimizer=optimizer,
               metrics=metrics)


assert type(loss_function) == tf.python.keras.losses.BinaryCrossentropy, "Not the correct layer"
assert loss_function.from_logits, "Use from_logits=True"
assert type(optimizer) == tf.keras.optimizers.Adam, "This is not an Adam optimizer"
assert optimizer.lr == base_learning_rate / 10, "Wrong learning rate"
assert metrics[0] == 'accuracy', "Wrong metric"

print('\033[92mAll tests passed!')

Output:

TypeError                                 Traceback (most recent call last)
<ipython-input-...> in <module>
      3 assert type(optimizer) == tf.keras.optimizers.Adam, "This is not an Adam optimizer"
      4 assert optimizer.lr == base_learning_rate / 10, "Wrong learning rate"
----> 5 assert metrics[0] == 'accuracy', "Wrong metric"
      6
      7 print('\033[92mAll tests passed!')

TypeError: 'Accuracy' object is not subscriptable
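For context, the assert fails because metrics was set to a single tf.keras.metrics.Accuracy object, which cannot be indexed, while the test expects a list whose first element is the string 'accuracy'. A minimal sketch of the difference (assuming TensorFlow 2.x):

import tensorflow as tf

metric_object = tf.keras.metrics.Accuracy()
# metric_object[0]      # raises TypeError: 'Accuracy' object is not subscriptable

metrics = ['accuracy']  # a list of metric names
print(metrics[0])       # 'accuracy' -- exactly what the test checks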


I’m running into this same error.

I have metrics set up the same as you. I also tried passing the argument (name='accuracy') and got the same error.

I found the answer here: Convolutional Neural Network (2 week, programming exercise 2) - #2 by fabhertz


Could you show that or explain the code?

metrics = ['accuracy']
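For completeness, here is a sketch of the whole compile step with the metric passed as a list of names; model2 and base_learning_rate are assumed to be defined earlier in the notebook:

# Define a BinaryCrossentropy loss function. Use from_logits=True
loss_function = tf.keras.losses.BinaryCrossentropy(from_logits=True)

# Define an Adam optimizer with a learning rate of 0.1 * base_learning_rate
optimizer = tf.keras.optimizers.Adam(learning_rate=0.1 * base_learning_rate)

# Use accuracy as evaluation metric -- a list, so metrics[0] == 'accuracy'
metrics = ['accuracy']

model2.compile(loss=loss_function,
               optimizer=optimizer,
               metrics=metrics)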


Hey, I did the same thing and passed the test, but isn't this a bug? I would expect some real operation to be executed here.
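It should not be a no-op: as far as I understand the Keras API, the string 'accuracy' is resolved to a concrete metric when compile() runs (Keras picks a binary or categorical accuracy variant based on the loss and output shape), so a real accuracy computation still happens during training and evaluation. Note also that tf.keras.metrics.Accuracy counts exact matches between y_true and y_pred, which is not what you want when the model outputs logits. A small sketch showing that the string is just a registered alias:

import tensorflow as tf

# tf.keras.metrics.get resolves a metric name to the underlying metric function
m = tf.keras.metrics.get('accuracy')
print(m)  # e.g. <function accuracy at 0x...>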