Course 4 Week 3 Strange things with U-Net

The solution surfaces for deep neural networks are incredibly complicated. There is no guarantee that you will get smooth, monotonic convergence: the cost can go "off a cliff" at any point in training. The other thing to note is that the training process is not deterministic in the notebook as written. I have even tried setting the TF global random seeds and still find that doing "Kernel → Restart and Clear Output" followed by "Cell → Run All" does not give consistent results from run to run. Here's another thread about this from a while back. I've been meaning to research how the Keras model `fit` process works and whether there's a way to make it deterministic, but I have not gotten back to that. If anyone has time to pursue this, please share anything that you learn!
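For what it's worth, recent TensorFlow versions expose `tf.keras.utils.set_random_seed` (which seeds the Python, NumPy, and TF generators in one call) and `tf.config.experimental.enable_op_determinism` (which forces deterministic kernels, at some speed cost). I haven't verified that these make this particular notebook fully reproducible, so treat the following as a sketch of the reseeding pattern, with Python's stdlib and NumPy standing in for TF; the TF calls to try are noted in the comments.

```python
import random
import numpy as np

def seeded_draws(seed):
    # Reset every relevant RNG before each "run". The TF 2.8+ analogue
    # would be:
    #   tf.keras.utils.set_random_seed(seed)
    #   tf.config.experimental.enable_op_determinism()
    # called once at the top of the notebook, before building the model.
    random.seed(seed)
    np.random.seed(seed)
    return [random.random(), float(np.random.rand())]

# Two runs with the same seed give identical draws;
# a different seed gives different ones.
assert seeded_draws(42) == seeded_draws(42)
assert seeded_draws(42) != seeded_draws(7)
```

Note that even with all seeds fixed, nondeterministic GPU kernels and multi-threaded data pipelines can still cause run-to-run differences, which is why the op-determinism switch exists as a separate step.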