How does backpropagation work?

I’m trying to build a neural network without any libraries, but it seems like backpropagation wasn’t really explained in the course. I think it’s one of the most important topics, so do you have any suggestions? Any help is appreciated.

It’s mathematically quite involved, because it uses calculus (the chain rule) to compute the partial derivative of the cost with respect to every weight value.

Here is one tutorial on the topic. Be aware that it doesn’t apply directly to this course, because they use a different cost function, but it gives you a flavor of the complexity.

The math is outside the scope of this course. Since we use TensorFlow to handle backpropagation automatically, the course doesn’t present the underlying math.
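Since you’re building a network without libraries, here is a minimal sketch of the idea in pure Python for the simplest possible case: one input, one weight, one bias, a sigmoid activation, and a squared-error cost. The chain rule factors the cost gradient into three local derivatives. This is an illustrative toy, not the course’s notation or cost function, and the learning rate and epoch count are arbitrary choices.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(x, y, w=0.5, b=0.0, lr=0.5, epochs=1000):
    """Gradient descent on J = (a - y)^2 / 2 for a single sigmoid neuron."""
    for _ in range(epochs):
        # Forward pass
        z = w * x + b
        a = sigmoid(z)
        # Backward pass via the chain rule:
        # dJ/da = (a - y), da/dz = a * (1 - a), dz/dw = x, dz/db = 1
        dJ_dz = (a - y) * a * (1 - a)
        # Update each parameter against its gradient
        w -= lr * dJ_dz * x
        b -= lr * dJ_dz
    return w, b

w, b = train(x=1.0, y=1.0)
prediction = sigmoid(w * 1.0 + b)
```

In a multi-layer network the same pattern repeats: each layer multiplies the gradient arriving from the layer above by its own local derivatives and passes the result back, which is why the algorithm is called backpropagation.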
