A really strange thing happened in the U-Net programming assignment. When I trained my model, during epoch 39 the accuracy suddenly dropped from 0.97 to 0.42. I got the highest score for this task, so it seems unlikely that the problem is in my code. What could be the reason?
Screenshots: [training logs showing the accuracy drop at epoch 39]
Strange. Maybe some error in the calculations or in memory storage.
The solution surfaces for deep neural networks are incredibly complicated. There is never any guarantee that you will get smooth monotonic convergence: you can go “off a cliff” at any point in the process. The other thing to note here is that the process is not deterministic in the notebook as written. I’ve even tried setting the TF global random seeds here and still find that just doing “Kernel → Restart and Clear Output” and then “Cell → Run All” does not give consistent results from run to run. Here’s another thread about this from a while back. I’ve been meaning to do more research into how the Keras model “fit” process works and whether there’s a way to make it deterministic, but I have not gotten back to that. If anyone has time to pursue this, please share anything that you learn!
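For reference, this is the sort of thing I mean by setting the global seeds. It’s a minimal sketch assuming a recent TF 2.x version (the exact APIs depend on your TF release), and even this does not guarantee full determinism on GPU:

```python
import tensorflow as tf

# Seed Python's `random`, NumPy, and TensorFlow in one call
# (tf.keras.utils.set_random_seed was added in TF 2.7; in older
# versions you would set each of the three seeds separately).
tf.keras.utils.set_random_seed(42)

# Ask TF to use deterministic kernels where they exist (TF 2.8+).
# This can slow training down, and some ops will raise an error if
# no deterministic implementation is available.
tf.config.experimental.enable_op_determinism()
```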
Yeah, in theory it is possible for the optimization process to jump out of a local minimum, so you have made meaningful points here, and it also makes sense that the results are not reproducible from run to run. I had thought the system would be more robust, for example by adjusting the learning rate as you mention. NNs are very complicated, that’s true.
This also proves to me that there is an element of irrationality in any numerical system, because irrationality is embedded within the system itself. I mean that the way maths segments nature is not really rational; it does not exist in reality.
It is a good question whether the optimization algorithms that are provided by TF/Keras are doing more sophisticated management of the learning rate, rather than using a fixed learning rate. I would hope and expect that they are doing something more sophisticated “under the covers”. But even with that, you can’t guarantee smooth convergence apparently.
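For what it’s worth, the standard optimizers do adapt a per-parameter step size (Adam, for example, keeps running moment estimates), but the base learning rate stays fixed unless you manage it explicitly, e.g. with a callback. Here’s a minimal sketch of what that could look like; the model and data names are placeholders, not the assignment code:

```python
import tensorflow as tf

# Adam adapts an effective step size per parameter, but the base
# learning rate below stays fixed unless you change it yourself.
optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)

# One common way to add explicit learning-rate management:
# halve the rate whenever the validation loss stops improving.
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(
    monitor="val_loss", factor=0.5, patience=3, min_lr=1e-6
)

# model.compile(optimizer=optimizer, loss=..., metrics=["accuracy"])
# model.fit(x_train, y_train, validation_data=(x_val, y_val),
#           epochs=40, callbacks=[reduce_lr])
```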
The philosophical question of whether math describes reality or not is pretty deep waters. I’d say reality is just a “low res” version of what the math describes. But that’s beyond the scope of the course here and “above my pay grade”. 
Also, if you’re talking mathematics, you have to be a bit cautious about how you use the term “rational”. That has a specific technical meaning in math and is taken seriously. “Irrational” and “transcendental” also mean very specific things in math which do not map to the colloquial meanings of those terms.
Your input is most welcome and very valuable, and you are right about using the terms with caution, but it was intended more as a philosophical observation.