I have defined my callback with accuracy, but I don’t understand why the accuracy is going down over epochs.
What input shape did you use?
Go back and check your callback code (the `on_epoch_end` method), and make sure the input shape you pass to the model matches the shape of your data.
I hope you used a Flatten layer with the input shape, followed by another two Dense layers.
I first used a Flatten layer with input_shape=(28, 28), then three Dense layers: the first with 510 units and the second with 250 units, both with relu activation, and the third with 10 units and linear activation. I did not pass input_shape to any of the Dense layers.
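For reference, here is a minimal sketch of the architecture described above (the 28x28 input suggests Fashion-MNIST-style data, which is an assumption on my part):

```python
import tensorflow as tf

# Sketch of the model as described: Flatten, then three Dense layers.
# The 'linear' activation on the last layer is the one under discussion.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(510, activation='relu'),
    tf.keras.layers.Dense(250, activation='relu'),
    tf.keras.layers.Dense(10, activation='linear'),
])
```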
I passed tf.keras.callbacks.Callback into my myCallback class, gave the on_epoch_end method the arguments self, epoch, and logs={}, and set self.model.stop_training to True.
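A minimal sketch of a callback along those lines; the 90% accuracy threshold is an assumption for illustration, since the thread doesn't state the exact condition used:

```python
import tensorflow as tf

class myCallback(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs={}):
        # logs holds the metrics for the epoch that just finished
        acc = logs.get('accuracy')
        if acc is not None and acc > 0.9:
            print('\nReached 90% accuracy, stopping training.')
            self.model.stop_training = True
```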
Resolved!
I should have used a softmax activation on the final layer to get class probabilities, rather than the raw scores that the linear activation produces.
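For anyone hitting the same issue, a sketch of the corrected final layer (the earlier layers are assumed from the description above): softmax turns the 10 raw scores into probabilities that sum to 1, which is what `loss='sparse_categorical_crossentropy'` expects unless you pass `from_logits=True`.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(510, activation='relu'),
    tf.keras.layers.Dense(250, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),  # was 'linear'
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
```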