I don't know where it's going wrong…

Any help is appreciated.

The error states:

----> 1 assert type(loss_function) == tf.python.keras.losses.BinaryCrossentropy, "Not the correct layer"
2 assert loss_function.from_logits, "Use from_logits=True"
3 assert type(optimizer) == tf.keras.optimizers.Adam, "This is not an Adam optimizer"
4 assert optimizer.lr == base_learning_rate / 10, "Wrong learning rate"
5 assert metrics[0] == 'accuracy', "Wrong metric"

AssertionError: Not the correct layer

This is my code:

for layer in base_model.layers[:fine_tune_at]:
    layer.trainable = False

# Define a BinaryCrossentropy loss function. Use from_logits=True
loss_function = True

# Define an Adam optimizer with a learning rate of 0.1 * base_learning_rate
optimizer = tf.keras.optimizers.Adam(learning_rate=0.1 * base_learning_rate)

# Use accuracy as evaluation metric
metrics = 'accuracy'
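For comparison, here is a sketch of values that would satisfy those assertions. The `base_learning_rate = 0.001` line is just a placeholder so the snippet runs on its own (use whatever value your notebook defines). The key differences from the code above: `loss_function` must be a `BinaryCrossentropy` object rather than `True`, and `metrics` must be a list so that `metrics[0]` is the string `'accuracy'`.

```python
import tensorflow as tf

base_learning_rate = 0.001  # placeholder; use your notebook's value

# A BinaryCrossentropy loss *object*, created with from_logits=True
loss_function = tf.keras.losses.BinaryCrossentropy(from_logits=True)

# Adam optimizer at one tenth of the base learning rate
optimizer = tf.keras.optimizers.Adam(learning_rate=0.1 * base_learning_rate)

# A list, so that metrics[0] == 'accuracy'
metrics = ['accuracy']
```

The assertion `type(loss_function) == ... BinaryCrossentropy` fails first because `True` is a `bool`, not a loss object, which is why the error reported is "Not the correct layer".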