Hello All,
My apologies, as I know I am doing something silly, but I am receiving the following error when using gradient_check:
There is a mistake in the backward propagation! difference = 0.9999999999999799
Forward Prop function (ex1) => theta * x
Backward Prop function (ex2) => dtheta = x (as stated in the exercise prompt)
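For context, here is roughly what those two functions look like on my end (a minimal sketch with my own paraphrased names and signatures, not the exact starter code):

```python
def forward_propagation(x, theta):
    # Exercise 1: J(theta) = theta * x
    J = theta * x
    return J

def backward_propagation(x, theta):
    # Exercise 2: dJ/dtheta = x, as stated in the exercise prompt
    dtheta = x
    return dtheta
```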
gradient_check function is as follows:
# moderator edit: code removed
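Since the code block itself was removed above, here is a generic sketch of the standard two-sided check the notebook describes, just so it is clear where the reported difference comes from (this is the textbook recipe with the usual epsilon/grad/gradapprox names, not my exact submission):

```python
import numpy as np

def gradient_check_sketch(x, theta, epsilon=1e-7):
    # Uses the forward_propagation / backward_propagation sketches above.
    # Two-sided numerical approximation of dJ/dtheta
    J_plus = forward_propagation(x, theta + epsilon)
    J_minus = forward_propagation(x, theta - epsilon)
    gradapprox = (J_plus - J_minus) / (2 * epsilon)

    # Analytic gradient from backward propagation
    grad = backward_propagation(x, theta)

    # Relative difference between the two gradients
    numerator = np.linalg.norm(grad - gradapprox)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    difference = numerator / denominator
    return difference
```

If I understand that formula correctly, a difference this close to 1.0 means grad and gradapprox disagree almost completely, rather than differing by a small rounding error.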
Am I overlooking something here? Thank you in advance!