Why isn't callback function working?

[snippet removed by mentor]

This is my code. The output I am getting is as follows:

Epoch 1/10
60000/60000 [==============================] - 9s 155us/sample - loss: 0.2242 - acc: 0.9354
Epoch 2/10
60000/60000 [==============================] - 9s 157us/sample - loss: 0.0936 - acc: 0.9713
Epoch 3/10
60000/60000 [==============================] - 9s 157us/sample - loss: 0.0609 - acc: 0.9812
Epoch 4/10
60000/60000 [==============================] - 9s 145us/sample - loss: 0.0440 - acc: 0.9855
Epoch 5/10
60000/60000 [==============================] - 9s 145us/sample - loss: 0.0334 - acc: 0.9890
Epoch 6/10
60000/60000 [==============================] - 9s 148us/sample - loss: 0.0253 - acc: 0.9917
Epoch 7/10
60000/60000 [==============================] - 9s 148us/sample - loss: 0.0187 - acc: 0.9943
Epoch 8/10
60000/60000 [==============================] - 8s 142us/sample - loss: 0.0166 - acc: 0.9942
Epoch 9/10
60000/60000 [==============================] - 9s 145us/sample - loss: 0.0123 - acc: 0.9960
Epoch 10/10
60000/60000 [==============================] - 8s 140us/sample - loss: 0.0105 - acc: 0.9964


([0, 1, 2, 3, 4, 5, 6, 7, 8, 9], 0.9964)

I don’t understand why the callback function is not working.


Changing ‘accuracy’ to ‘acc’ should do the trick.
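For later readers, here is a minimal sketch of such a callback using the ‘acc’ key (the class name and the 99% threshold are illustrative, not the assignment’s exact code):

```python
import tensorflow as tf

class MyCallback(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        acc = logs.get('acc')  # must match the key Keras actually logs
        if acc is not None and acc > 0.99:
            print("\nReached 99% accuracy so cancelling training!")
            self.model.stop_training = True
```

You would pass an instance of it via `model.fit(..., callbacks=[MyCallback()])`.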


Thanks a lot, it worked. But ‘acc’ and ‘accuracy’ are almost the same, right? What’s the difference? Why does one work and the other doesn’t?


It’s exactly the same metric. The name ‘accuracy’ was introduced later, in newer versions. What’s important here is the concept behind it, and that does not change.
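If you are unsure which name your installed version logs, you can check empirically. A small sketch (the model and data here are throwaway, only there to inspect the history keys):

```python
import numpy as np
import tensorflow as tf

# Tiny disposable model, just to see which key the metric is logged under.
model = tf.keras.Sequential([tf.keras.Input(shape=(2,)),
                             tf.keras.layers.Dense(1)])
model.compile(optimizer='adam', loss='mse', metrics=['acc'])

history = model.fit(np.zeros((4, 2)), np.zeros((4, 1)),
                    epochs=1, verbose=0)
# Which spelling appears here depends on your TF/Keras version and on the
# string passed to compile(); match your callback's logs.get() to it.
print(sorted(history.history.keys()))
```

Whichever spelling shows up in the printed keys is the one the callback lookup must use.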


The course’s previous code examples use ‘accuracy’ in the callback function, so it’s natural that students continue using the full name in the assignment, but this results in a failure when submitting. I spent 40 minutes debugging my code before realizing the cause was the difference between code versions.
Shouldn’t the assignment point this out in its beginning notes? (And we do appreciate the other notes, including the ones about the epochs and accuracy.)


Could you folks vote on this?


I ran into this same issue. I found it helpful to list the keys passed in the logs dictionary.

    class LogKeysCallback(tf.keras.callbacks.Callback):
        def on_epoch_end(self, epoch, logs=None):
            # Print exactly which metric names Keras logged for this epoch
            keys = list(logs.keys())
            print("End epoch {} of training; got log keys: {}".format(epoch, keys))

In this context, logs is a Python dictionary using key-value pairs. In dictionary lookup, close doesn’t count…the key is either in the dictionary or it’s not. Which means that the key used in the callback lookup must match the one used in the model compile() statement when the metric was defined. You can use the string ‘acc’ or ‘accuracy’ as the key, but pick one and use it consistently because in this context they are most definitely not exactly the same.
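A plain-Python illustration of that exact-match lookup (the dict here just mimics what Keras passes to on_epoch_end):

```python
# Toy stand-in for the logs dict a callback receives at the end of an epoch
logs = {"loss": 0.0105, "acc": 0.9964}

print(logs.get("acc"))       # the metric was logged under 'acc', so this works
print(logs.get("accuracy"))  # None: 'accuracy' is a different, absent key
```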

In this case the code fragment provided above is:

[snippet removed by mentor]

so you have to use ‘acc’ as the key in the callback. Otherwise the condition in

    if logs.get('accuracy') is not None:

evaluates to False, because logs.get('accuracy') returns None when that key is not in the dictionary, and the body of the if never runs.


My callback is not working.
All 10 epochs run, and every one ends with ‘acc’ = 0.0987 - see image.
I have used ‘acc’ throughout and NOT ‘accuracy’.
What is the problem?

[snippet removed by mentor]


Another post spells out how to fix things when the callback function is not working.


Please remove code from all your posts. It’s okay to share stacktrace(s) though.

You already pointed out the error, i.e. using acc as a metric instead of accuracy.

Click my name and message your notebook and expanded grader feedback if the grader fails your submission.


I have the same problem: I reach 99% accuracy, but my callback does not stop the training; it keeps executing all 10 epochs.

This is my callback function

and my train function

Could someone help me, or give me a hint if my callback or train function is wrong?


Posting code in a public topic is discouraged and can get your account suspended. It’s okay to share a stacktrace in a public post and to send code to a mentor via direct message. Please clean up your recent reply.

Since your code isn’t clear, you should look at the ungraded labs (C1_W2_Lab_2_callbacks.ipynb) to see what boolean value is used to ensure that the model does stop training. Here’s another link you’ll find useful.


Fix the typo in self.model.stop_trainig (it should be self.model.stop_training) to get the code to work.
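For anyone hitting the same issue, the typo fails silently because assigning to a misspelled attribute just creates a brand-new attribute instead of raising an error. A quick demonstration with a stand-in object:

```python
class FakeModel:
    """Stand-in for a Keras model; real models also just read stop_training."""
    stop_training = False

m = FakeModel()
m.stop_trainig = True    # typo: silently creates a NEW attribute, no error
print(m.stop_training)   # False - training would continue

m.stop_training = True   # correct spelling actually flips the flag
print(m.stop_training)   # True
```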


Solved, thanks a lot
