Convolutional Neural Networks (Week 2, Programming Exercise 2)

Hi,

I am trying to finish the second programming exercise from Week 2 of the CNN course.
I think I am done, but I keep getting the following error:

TypeError: 'Accuracy' object is not subscriptable

The last code snippet is as follows:

for layer in base_model.layers[:120]:
    layer.trainable = False

loss_function = tf.keras.losses.BinaryCrossentropy(from_logits=True)
optimizer = tf.keras.optimizers.Adam(learning_rate=0.1 * base_learning_rate)
model2.compile(loss=loss_function,
               optimizer=optimizer,
               metrics=["accuracy"])

I have also tried to add a line:
metrics = tf.keras.metrics.Accuracy(name="accuracy")

but it does not appear to solve the problem.

Many thanks in advance.

Kind regards,

jose.


Look at how the metrics argument is specified in model2.compile(), and you may notice a difference from your implementation :slight_smile:


Hi Fabhertz,

Many thanks for your quick response. I have just adapted the code and now I get "All tests passed". Nevertheless, when I submit the exercise, I only get 33/100. I guess I have done something wrong, but I don't get any error message in the Python notebook.

How can I solve this?

Many thanks in advance!

Kind regards,

Jose.

Specifically, if I open the submission report I get the following two errors:

Tests failed on 2 cell(s)! These tests could be hidden. Please check your submission.

The following cell failed:

from test_utils import summary, comparator

alpaca_summary = [['InputLayer', [(None, 160, 160, 3)], 0],
                    ['Sequential', (None, 160, 160, 3), 0],
                    ['TensorFlowOpLayer', [(None, 160, 160, 3)], 0],
                    ['TensorFlowOpLayer', [(None, 160, 160, 3)], 0],
                    ['Functional', (None, 5, 5, 1280), 2257984],
                    ['GlobalAveragePooling2D', (None, 1280), 0],
                    ['Dropout', (None, 1280), 0, 0.2],
                    ['Dense', (None, 1), 1281, 'linear']] #linear is the default ac...

comparator(summary(model2), alpaca_summary)

for layer in summary(model2):
    print(layer)

The error was:

---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
in
     10 ['Dense', (None, 1), 1281, 'linear']] #linear is the de…
     11
---> 12 comparator(summary(model2), alpaca_summary)
     13
     14 for layer in summary(model2):

NameError: name 'model2' is not defined

==========================================================================================
The following cell failed:

assert type(loss_function) == tf.python.keras.losses.BinaryCrossentropy, "Not the c...
assert loss_function.from_logits, "Use from_logits=True"
assert type(optimizer) == tf.keras.optimizers.Adam, "This is not an Adam optimizer"
assert optimizer.lr == base_learning_rate / 10, "Wrong learning rate"
assert metrics[0] == 'accuracy', "Wrong metric"

print('\033[92mAll tests passed!')

The error was:

---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
in
----> 1 assert type(loss_function) == tf.python.keras.losses.BinaryCrossentropy, "N…
      2 assert loss_function.from_logits, "Use from_logits=True"
      3 assert type(optimizer) == tf.keras.optimizers.Adam, "This is not an Adam op…
      4 assert optimizer.lr == base_learning_rate / 10, "Wrong learning rate"
      5 assert metrics[0] == 'accuracy', "Wrong metric"

NameError: name 'loss_function' is not defined


Thank you for the tip, but I still don't understand why tf.keras.metrics.Accuracy(name="accuracy") won't work here! ^^"


Solved or not? I am stuck here as well.

Write it as:

metrics=['accuracy']
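
For context, a minimal sketch of how the whole compile cell can look with the string metric (assuming model2 and base_learning_rate are defined earlier in the notebook, as in the assignment):

import tensorflow as tf

# Binary classifier with a linear output layer, so compute the loss from logits.
loss_function = tf.keras.losses.BinaryCrossentropy(from_logits=True)

# Adam with the reduced fine-tuning learning rate.
optimizer = tf.keras.optimizers.Adam(learning_rate=0.1 * base_learning_rate)

# A list containing the string 'accuracy'; the test cell later checks metrics[0].
metrics = ['accuracy']

model2.compile(loss=loss_function,
               optimizer=optimizer,
               metrics=metrics)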


Can anyone give an explanation? I still don't understand why tf.keras.metrics.Accuracy(name="accuracy") doesn't work.

Why does this work and the other doesn't? Is it the grader, or something I should understand for when I try this out in my own projects?
I'm referring to using ['accuracy'] instead of tf.keras.metrics.Accuracy.

The elements of metrics are plain strings; when the model is compiled, Keras resolves the string 'accuracy' to a matching metric object (a binary-accuracy metric here, given the binary cross-entropy loss). A bare tf.keras.metrics.Accuracy object, on the other hand, is not a list, so the notebook's check metrics[0] == 'accuracy' fails with the TypeError you saw.
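
To make the difference concrete, here is a small sketch (the string-to-metric resolution described above is standard TF 2.x behavior):

import tensorflow as tf

metrics = ['accuracy']                    # a list holding a string: subscriptable
print(metrics[0])                         # -> 'accuracy', so the test cell passes

metric_obj = tf.keras.metrics.Accuracy()  # a single metric object, not a list
# metric_obj[0]                           # -> TypeError: 'Accuracy' object is not subscriptable

Even wrapped in a list, tf.keras.metrics.Accuracy would be the wrong choice for this model: it counts exact matches between labels and raw predictions, while the string 'accuracy' lets Keras pick a binary-accuracy metric that fits the binary cross-entropy loss.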

These quizzes and practical labs are riddled with trick questions :rofl: It's as used in model2.compile, not as literally seen in the TF docs :sweat_smile: