Lecture notes missing backpropagation equations

The lecture notes for course 2 week 4 (https://community.deeplearning.ai/uploads/short-url/fc4IV56aXbmAu4qKlK0sh03NG9f.pdf) are missing the backpropagation equations.

The video shows what these equations should look like, but they are not reproduced in the notes.

These are needed to complete section 2.5 Backprop of the assignment (where they are also not given).
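
For reference, the missing equations are roughly of the following standard form for a network with one ReLU hidden layer. The notation (X, Y, Z1, A1, Z2, A2, W1, b1, W2, b2, m training examples) and the sigmoid-output/cross-entropy error term in the first line are my assumptions, so the exact formulas in the video may differ:

$$
\begin{aligned}
\partial Z^{[2]} &= A^{[2]} - Y \\
\partial W^{[2]} &= \frac{1}{m}\,\partial Z^{[2]} A^{[1]T} \\
\partial b^{[2]} &= \frac{1}{m}\sum_{i=1}^{m} \partial Z^{[2](i)} \\
\partial Z^{[1]} &= \left(W^{[2]T}\,\partial Z^{[2]}\right)\odot \mathbf{1}\!\left[Z^{[1]} > 0\right] \\
\partial W^{[1]} &= \frac{1}{m}\,\partial Z^{[1]} X^{T} \\
\partial b^{[1]} &= \frac{1}{m}\sum_{i=1}^{m} \partial Z^{[1](i)}
\end{aligned}
$$

The indicator in the fourth line is the derivative of ReLU, which is where the step function comes in.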

Hi @Izak_van_Zyl_Marais,

There is a very clear note on that page about the missing information.

As for the equations needed in the assignment, the code cell for the back_prop exercise contains enough information to help you implement the code.

Best,
Mubsi

Hi Mubsi,

I hope you are well, but I would have to politely disagree.

  1. The lecture video omits the formula for the gradient of b2.

  2. There are no comments in the assignment's code cell to aid with computing the gradients (see the sketch after this list).

  3. On a pedagogical note, discrepancies between the formulas in different places make things harder for learners. The simplifications in the Jupyter Notebook don't really make anything simpler: among other things, they drop a constant, and, most importantly, they confuse beginners by, for example, juxtaposing ReLU with the step function.

  4. I am also afraid your response sounds like an excuse, and it makes for bad UX. People are paying for this course, so it would make sense to keep all the materials updated and in sync with one another.

  5. As a wider note, people on this forum who have pointed out that something is genuinely missing from this assignment have been brushed aside rather than taken seriously. Sometimes the user is right.
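
To make point 2 concrete, here is a minimal sketch of what the gradient computation could look like. The function name back_prop, the argument list, the array shapes, and the sigmoid/cross-entropy output error are all assumptions on my part and will not match the notebook's actual starter code exactly; it is meant as an illustration of the chain rule, not as the assignment's solution.

```python
import numpy as np

def back_prop(X, Y, W1, b1, W2, b2, Z1, A1, Z2, A2):
    """Gradients for a one-hidden-layer network with ReLU hidden units.

    Hypothetical signature and shapes: X is (n_x, m), Y is (1, m),
    W1 is (n_h, n_x), W2 is (1, n_h); Z1/A1/Z2/A2 come from the forward pass.
    The output error dZ2 below assumes a sigmoid output with cross-entropy
    loss; a different loss changes only that line.
    """
    m = X.shape[1]

    # Output layer: error, weight gradient, and the bias gradient db2
    # (the formula the lecture video omits).
    dZ2 = A2 - Y
    dW2 = (1 / m) * dZ2 @ A1.T
    db2 = (1 / m) * np.sum(dZ2, axis=1, keepdims=True)

    # Hidden layer: the derivative of ReLU is the step function,
    # written here as the mask (Z1 > 0).
    dZ1 = (W2.T @ dZ2) * (Z1 > 0)
    dW1 = (1 / m) * dZ1 @ X.T
    db1 = (1 / m) * np.sum(dZ1, axis=1, keepdims=True)

    return dW1, db1, dW2, db2
```

Note the 1/m factor on every gradient; that is the constant the notebook's simplified formulas drop.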

P.S. Great job for posting this, Izak.

Hi Sia,

Thank you for your feedback.

Best,
Mubsi