Transfer_learning_with_MobileNet_v1: Assignment 2 of Week 2, Exercise 3.3

I don't know where it's going wrong. Any help is appreciated.
The error states:
----> 1 assert type(loss_function) == tf.python.keras.losses.BinaryCrossentropy, "Not the correct layer"
      2 assert loss_function.from_logits, "Use from_logits=True"
      3 assert type(optimizer) == tf.keras.optimizers.Adam, "This is not an Adam optimizer"
      4 assert optimizer.lr == base_learning_rate / 10, "Wrong learning rate"
      5 assert metrics[0] == 'accuracy', "Wrong metric"

AssertionError: Not the correct layer

This is my code:

for layer in base_model.layers[:fine_tune_at]:
    layer.trainable = False

# Define a BinaryCrossentropy loss function. Use from_logits=True

loss_function=True

# Define an Adam optimizer with a learning rate of 0.1 * base_learning_rate

optimizer = tf.keras.optimizers.Adam(learning_rate=0.1 * base_learning_rate)

# Use accuracy as evaluation metric

metrics='accuracy'

To define the BinaryCrossentropy loss, you have to use the function from tf.keras, as below:
loss_function=tf.keras.losses.BinaryCrossentropy(from_logits=True)

The optimizer is correct in your code.

The accuracy metric should be passed in a list, as below:
metrics = ['accuracy']
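
Putting the pieces together, here is a minimal sketch of the whole step, not the official assignment solution. It assumes base_model, base_learning_rate, and fine_tune_at are already defined as in the notebook; the values used here are placeholders.

import tensorflow as tf

# Placeholder values; the notebook defines its own base_learning_rate and fine_tune_at
base_learning_rate = 0.001
fine_tune_at = 120

# Assumed setup: MobileNetV2 base, as in the assignment
base_model = tf.keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights='imagenet')
base_model.trainable = True

# Freeze all the layers before the fine_tune_at layer
for layer in base_model.layers[:fine_tune_at]:
    layer.trainable = False

# Define a BinaryCrossentropy loss function, using from_logits=True
loss_function = tf.keras.losses.BinaryCrossentropy(from_logits=True)

# Define an Adam optimizer with a learning rate of 0.1 * base_learning_rate
optimizer = tf.keras.optimizers.Adam(learning_rate=0.1 * base_learning_rate)

# Use accuracy as the evaluation metric; note it must be a list
metrics = ['accuracy']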
