Optional lab - Gradient Descent

Greetings everyone,
I could not find the definition (def) of gradient_function(x, y, w, b). Is it my bad sight, or is it really missing from the notebook of this lab?
Thanks in advance, T.G.

Hello @tsvika_greener, welcome to our community!

I suppose you are asking about the gradient_function variable used as an input argument here

if so, and since it is an input argument, we need to look at where the function gradient_descent is being called, which, for example, can be this one:

and we see that compute_gradient is the name of the variable whose content is assigned to gradient_function. Following this lead, we can finally find the def line for compute_gradient by searching for it in the notebook.
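To make that hand-off concrete, here is a minimal, self-contained sketch. The names gradient_descent and compute_gradient follow the lab, but the function bodies, signatures, and the single-point data below are simplified placeholders, not the lab's actual code:

```python
def compute_gradient(x, y, w, b):
    # Hypothetical gradients of the squared error (w*x + b - y)**2
    err = w * x + b - y
    dj_dw = 2 * err * x
    dj_db = 2 * err
    return dj_dw, dj_db

def gradient_descent(x, y, w, b, alpha, num_iters, gradient_function):
    # gradient_function is just a local name; it refers to whatever
    # function object the caller passed in (here, compute_gradient)
    for _ in range(num_iters):
        dj_dw, dj_db = gradient_function(x, y, w, b)
        w -= alpha * dj_dw
        b -= alpha * dj_db
    return w, b

# compute_gradient is passed by name, with no parentheses:
w, b = gradient_descent(2.0, 5.0, 0.0, 0.0, 0.05, 200, compute_gradient)
```

So the def line exists only under the name compute_gradient; inside gradient_descent it is simply known by the parameter name gradient_function.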

Cheers,
Raymond

Hi @rmwkwok and thanks for your rapid response. Nevertheless, according to the comment of the gradient_descent() function, gradient_function is itself a function, gradient_function(), which the gradient_descent() function later calls - please see the attached screen captures.



Now I am completely confused since I could not find the definition of that function - gradient_function()
Could you please help?
Thanks in advance, T.G.

Hi T.G., if we can focus on your last screenshot, gradient_function is an input argument when you call gradient_descent, is that right?

Hi @rmwkwok , yes you are right :slight_smile: , but then again, in the comments of the gradient_descent() function, in the Args section, there is a line where it is written that gradient_function is a function to produce the gradient (I marked it with a blue box in my last screenshot). In addition, the for loop (in my first screenshot) calls gradient_function(x, y, w, b) with the input arguments x, y, w, b (I also marked it with a blue box in my first screenshot).
Thanks T.G.

T.G., an input argument can be a value or a function, because both values and functions are referred to by variable names.
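For example, here is a tiny illustration of that point (the names apply, double, and result are toy names for this example, not from the lab):

```python
def apply(f, v):            # f arrives as a function, v as a value;
    return f(v)             # both were passed in under new local names

def double(n):
    return 2 * n

result = apply(double, 21)  # the function object 'double' is the argument
```

Inside apply, the caller's function double is only ever seen under the parameter name f, exactly as compute_gradient is only seen as gradient_function inside gradient_descent.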

OK, so because functions in Python are objects we can pass them as arguments to another function - agree :slight_smile:
So now back to my original question, where is the (definition) def gradient_function(x, y, w, b) written in the notebook of this lab? I did not find it :thinking:
Thank you for your patience and understanding

Hi, T.G., can you show here a screenshot of when the function gradient_descent is being called, and highlight the variable that’s passed into that call as the last argument? And do you know why this will help answer your question?

Sorry for the confusion, I meant gradient_function(x, y, w, b), as I have previously referred to in my messages. I could not find where the definition, def gradient_function(x, y, w, b), of this function is in the notebook of this lab:
https://www.coursera.org/learn/machine-learning/ungradedLab/lE1al/optional-lab-gradient-descent
Also, please see the attached screenshots:


Thanks T.G.

Hello T.G.,

There is no confusion here. There is a reason I asked you what variable is passed into the function gradient_descent: it will lead you to the answer you want. If you don’t follow me, we can’t make any progress.

So far you have asked your question three times, and your question is pretty clear. Now, please try to answer my following 4 questions, and do not repeat your question a fourth time.

  1. In the following scenario of only 4 lines of code, where do you find the declaration of the variable ‘name’?
student = 'sally'

def print_student_name(name):
    print(name)

print_student_name(student)
  2. You ask about gradient_function, but I believe you are also aware that gradient_function is called inside another function named gradient_descent, do you agree?

  3. If you agree, gradient_function is not the only variable being used inside gradient_descent, do you agree?

  4. If you agree, then I would tell you that, apart from gradient_function, the following are also used: cost_function, num_iters, alpha, w_in, and b_in. Now, my question: can you pick any one of them and tell me where it is declared?

T.G. @tsvika_greener, there is no confusion, so please don’t repeat your question again. I ask you to please move your attention away from your exact question for just a little while, and try to answer my 4 questions so that we can make some progress. There is nothing I can do if you keep thinking there is a confusion and just repeat your question over and over again.

If you don’t answer my questions, and just repeat your question, I am afraid I will have to leave your question to other mentors, so please try, ok? :wink:

Raymond

Dear @rmwkwok, I appreciate your patience and understanding very much.
Regarding your questions:

  1. Agreed
  2. Agreed
  3. Agreed
  4. Yes, you made your point and I think that I got it - thanks
    So just one last thing: from the implementation of Gradient Descent (screenshot is attached), it is shown that gradient_descent(…) utilizes compute_gradient(…) and compute_cost(…)

    I am curious to know if the argument cost_function in gradient_descent(…) is actually the function compute_cost(…) that was defined earlier, and the argument gradient_function in gradient_descent(…) is actually the function compute_gradient(…) that was defined earlier?
    Thank you very much and I appreciate your efforts to clarify it to me. T.G.

Hi T.G.

Yes, but I hope you know that you can verify that yourself by examining the code.

I am copying 2 lines of code here and you can find them in the lab.

  1. def gradient_descent(x, y, w_in, b_in, alpha, num_iters, cost_function, gradient_function)
  2. w_final, b_final, J_hist, p_hist = gradient_descent(x_train, y_train, w_init, b_init, tmp_alpha,
    iterations, compute_cost, compute_gradient)

The first line is the line that defines gradient_descent, and the second is the line that calls the gradient_descent.

Now, we focus on the arguments:

  1. x, y, w_in, b_in, alpha, num_iters, cost_function, gradient_function
  2. x_train ,y_train, w_init, b_init, tmp_alpha, iterations, compute_cost, compute_gradient

Do you see that the number of arguments is the same? They are matched up in one-to-one correspondence, by position. Knowing how to verify this yourself is more important for the future.
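As an illustration of that position-by-position matching, here is a trimmed-down, hypothetical sketch (the real lab version has fuller bodies and also updates w and b; only the argument wiring is shown here):

```python
def gradient_descent(x, y, w_in, b_in, alpha, num_iters,
                     cost_function, gradient_function):
    # inside the function, these names refer to whatever the caller passed:
    # the 7th argument becomes cost_function, the 8th gradient_function
    return cost_function.__name__, gradient_function.__name__

def compute_cost(x, y, w, b):
    return (w * x + b - y) ** 2

def compute_gradient(x, y, w, b):
    err = w * x + b - y
    return 2 * err * x, 2 * err

names = gradient_descent(1.0, 2.0, 0.0, 0.0, 0.01, 10,
                         compute_cost, compute_gradient)
```

Printing names shows that cost_function is indeed compute_cost and gradient_function is indeed compute_gradient, which is exactly the verification described above.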

Raymond


Many thanks @rmwkwok


You are welcome T.G. :slight_smile:

Raymond