DLS 2, Week 1, Assignment 3 - Gradient Checking final part


I did get the correct expected output at the end of the assignment:

There is a mistake in the backward propagation!	difference = 0.2850931567761623

However, I got it by calculating the difference using just the norms (not their squares) on purpose. Isn't this an error in the assignment? From my understanding, the formula should contain the norms raised to the power of two, in both the numerator and the denominator. That's how I calculated the difference in the first half of the assignment, and it passed the grader. In the second half of the assignment, calculating it that way didn't pass the grader, and I spent some time hunting for a bug until I realized that, suddenly, the difference shouldn't use **2 in the numerator and denominator. Am I missing something? Thank you!

Hi @Volodymyr_Bezguba

The norm here is the L2 norm, not the norm squared, so you shouldn't raise grad or gradapprox to the power of 2 (in either the numerator or the denominator); that isn't correct. You just use np.linalg.norm().
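As a minimal sketch of the point above: the gradient-checking difference uses plain L2 norms in both the numerator and the denominator. The vectors below are made-up placeholders for the flattened gradients, not values from the assignment.

```python
import numpy as np

# Hypothetical stand-ins for the flattened analytic and numerical gradients.
grad = np.array([0.5, -1.2, 3.0])
gradapprox = np.array([0.5001, -1.2001, 2.9999])

# difference = ||grad - gradapprox||_2 / (||grad||_2 + ||gradapprox||_2)
# Note: np.linalg.norm() already returns the L2 norm; no **2 anywhere.
numerator = np.linalg.norm(grad - gradapprox)
denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
difference = numerator / denominator

print(difference)  # small, since gradapprox closely matches grad
```

With close gradients like these, the difference comes out well below the usual 1e-4 threshold; a buggy backprop produces a much larger value, as in the expected output quoted above.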

please feel free to ask any questions,

Actually, you got it right, mate: that mistake is intentional on the course's part, and it is indeed the expected output.
Later you will see an ungraded part of the assignment that helps you find and correct the mistake, if you want.