Hi! If you’re wondering how backpropagation works, I created a lecture a year ago that walks through backprop visually, with simple math. No fancy calculus, and no linear algebra. It was a prototype lecture for a PyTorch lecture series, but the concepts are the same whether you use PyTorch, TensorFlow, or code it up in plain Python.
Here is the Google Colab notebook with the lecture videos embedded inside it: Backpropagation with simple math Jupyter Notebook.
If you prefer to just watch the videos, here is the YouTube playlist: Backpropagation with simple math YouTube Playlist.
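If you want a taste of the "simple math" idea before diving in: backprop is just the chain rule applied step by step with ordinary arithmetic. Here's a tiny plain-Python sketch of that (my own toy example, not code from the notebook):

```python
# A tiny model: y = w * x + b, with squared-error loss.
x, w, b, target = 2.0, 3.0, 1.0, 10.0

y = w * x + b              # forward pass: y = 7.0
loss = (y - target) ** 2   # loss = 9.0

# Backward pass: chain rule with plain arithmetic, no calculus machinery.
dloss_dy = 2 * (y - target)  # how loss changes as y changes
dloss_dw = dloss_dy * x      # since dy/dw = x
dloss_db = dloss_dy * 1      # since dy/db = 1

# Sanity check: nudge w a little and watch the loss move.
eps = 1e-6
loss_nudged = ((w + eps) * x + b - target) ** 2
numeric_dw = (loss_nudged - loss) / eps

print(dloss_dw, round(numeric_dw, 3))  # both come out to about -12.0
```

The finite-difference check at the end is the same intuition the lecture builds on: a gradient is just "how much does the loss change if I nudge this weight a little."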
-Eddy