{moderator edit; image of quiz question removed}
I’m not a student of that course, but they must have given you the formula for gradient descent. You just apply that formula with the values that they give you. What do you get when you do that?
Is it one formula, “Gradient descent”, and when can I use it?
@Areej_Sa:
You’ve posted in a General Discussion forum. Mentors for your course are unlikely to find your thread here.
I think you should move the thread to the correct forum area for whatever course you are attending.
You can use the “pencil” icon in the thread title to move your thread.
thank you
Yes, it is one formula and you can use it anytime you are doing backpropagation to get a good solution for your network. But note that the formula is “general” in the sense that it uses the derivatives of the functions in your particular network, and those can be different depending on what kind of problem your network solves (classification or regression). Note that they actually gave you the function and its gradient in the question. So all you have to do is plug the given values into the formula, using the given function and its gradient.
If they are asking the question the way they did, they must have covered this formula in the lectures. Did you watch them?
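For reference, the general update rule is the one below. Here f is whatever cost function your problem defines, \alpha is the learning rate, and x_k is the current estimate; these symbols are generic placeholders, not the specific values from your quiz:

$$
x_{k+1} = x_k - \alpha \,\nabla f(x_k)
$$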
The answer you chose in the quiz is the gradient calculation. That’s not what you were asked to do.
The question asked you to compute the gradients, then apply the learning rate, and compute the value of x_1.
This is explained in the C2 W2 video titled:
“Optimization using Gradient Descent in two variables - Part 2”.
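As a concrete sketch (using a made-up function and made-up starting values, since the quiz image isn’t reproduced here), one step of the update in two variables looks like this in Python:

```python
import numpy as np

# Hypothetical example, NOT the function from the quiz:
# f(x, y) = x**2 + 3*y**2, so grad f(x, y) = (2*x, 6*y)
def grad_f(x, y):
    return np.array([2 * x, 6 * y])

alpha = 0.1                   # learning rate (made-up value)
x0 = np.array([1.0, 2.0])     # starting point (made-up value)

# One gradient descent step: x_1 = x_0 - alpha * grad f(x_0)
x1 = x0 - alpha * grad_f(*x0)
print(x1)                     # -> [0.8 0.8]
```

The quiz asks for exactly this kind of single step: evaluate the gradient at the given point, scale it by the learning rate, and subtract the result from the starting point.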
Thanks so much. I watched them, but I was confused about whether or not I could solve the equation with this method.
thank you