Week 2 assignment: callback

Hi ML friends!

I'm confused about the code below, specifically the `is not None` check (it was shown in bold). Perhaps someone can explain. I understand that logs.get('accuracy') > 0.99 should trigger cancelling training, but why is there another condition here? I'm not sure what it should be. I have deliberately not included any of my own code, so as not to violate community standards; I am asking a conceptual question that should not require code. Thanks!

class myCallback():
    # Define the correct function signature for on_epoch_end
    def on_epoch_end(None, None, None=None):
        if logs.get('accuracy') is not None and logs.get('accuracy') > 0.99:
            print("\nReached 99% accuracy so cancelling training!")

Thanks!
Dan

If my memory serves me right, the grader invokes this callback even before the first epoch, i.e. before training has really started. I'm guessing this is based on a much older version of TensorFlow (< 2.7). As a result, you'll get an error when you compare None with 0.99.

With TensorFlow >= 2.7, this method is invoked only after the end of the first epoch, so it's safe to check logs['accuracy'] directly inside the callback.
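For reference, here is a minimal sketch of what the filled-in callback might look like (this assumes the standard tf.keras.callbacks.Callback API; the exact signature the grader expects may differ). The is not None guard keeps the comparison safe if logs ever arrives empty, and self.model.stop_training = True is what actually cancels training:

import tensorflow as tf

class myCallback(tf.keras.callbacks.Callback):
    # on_epoch_end receives the epoch index and a dict of logged metrics
    def on_epoch_end(self, epoch, logs=None):
        accuracy = (logs or {}).get('accuracy')
        # Only compare once we know the metric is actually present
        if accuracy is not None and accuracy > 0.99:
            print("\nReached 99% accuracy so cancelling training!")
            # Signal Keras to stop training after this epoch
            self.model.stop_training = True

You would then pass an instance to training with model.fit(..., callbacks=[myCallback()]).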

I too was confused by the "is not ____" in the if statement. I used 0 (zero) and passed the lab, but I don't know what was expected or why.

When the framework invokes the on_epoch_end callback before the end of the first epoch, logs.get(any_key) will yield None because no metrics have been computed yet. So please check for None, not for 0.
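A quick way to see why the guard must check for None rather than 0 (plain Python, no TensorFlow required): dict.get returns None for a missing key, and comparing None with a float raises a TypeError, which is the error described above:

logs = {}  # what the callback may receive before any metric has been computed

print(logs.get('accuracy'))  # prints None, because the key is missing

try:
    logs.get('accuracy') > 0.99  # None compared with a float
except TypeError as err:
    print(err)  # "'>' not supported between instances of 'NoneType' and 'float'"

# The `is not None` guard short-circuits before the comparison can fail
acc = logs.get('accuracy')
if acc is not None and acc > 0.99:
    print("would stop training here")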