In this slide, I don't get why the derivative of u with respect to b is 2 instead of c. In real-time computation, since b and c are both variables, they should not be fixed numbers and should be constantly changing over time. Does this mean that with each iteration of gradient descent, we need to recompute the derivatives with respect to them (b, c) over and over again?

Well, I don't know which video this slide is from. It would be better if you shared a link to the video too.

But I can see that c = 2 is mentioned here. So, the derivative of u with respect to b is c, and c is equal to 2. Without knowing the proper background of the video, I cannot comment further.

Best,

Saif.

It is from Course 1, Module 2 (the computational graph lecture) of the Deep Learning Specialization.

Hello @WONG_Lik_Hang_Kenny,

However, to actually compute the derivative, we need to know the values involved: \frac{\partial u}{\partial b} = c = 2.

Given that the slide provides the values of a, b, and c, it is natural that we actually compute the numerical answers.
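To illustrate, here is a minimal Python sketch. It assumes the lecture's computational-graph example J = 3(a + bc) with a = 5, b = 3, c = 2 (if your slide uses a different function, the same idea applies): the gradient *formulas* stay fixed in symbolic form, but their numerical values are re-evaluated with the current variable values on every pass, which is exactly the "recompute each iteration" you asked about.

```python
def forward(a, b, c):
    # Forward pass through the computational graph (assumed example).
    u = b * c      # u = bc
    v = a + u      # v = a + u
    J = 3 * v      # J = 3v
    return J

def gradients(a, b, c):
    # Backward pass: the symbolic formulas never change,
    # but they are evaluated at the *current* values of a, b, c.
    dJ_dv = 3
    dv_du = 1
    du_db = c                        # dU/db = c; equals 2 only while c == 2
    du_dc = b                        # dU/dc = b
    dJ_da = dJ_dv * 1                # = 3
    dJ_db = dJ_dv * dv_du * du_db    # = 3c
    dJ_dc = dJ_dv * dv_du * du_dc    # = 3b
    return dJ_da, dJ_db, dJ_dc

print(gradients(5, 3, 2))   # (3, 6, 9)  -> dJ/db = 6 because c = 2
print(gradients(5, 3, 4))   # (3, 12, 9) -> same formula, new value once c changes
```

So the "2" on the slide is just the formula du/db = c evaluated at the slide's value c = 2; if c changed, the formula would stay the same and only the number would change.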

Were you thinking that the purpose of the discussion was to find the formulae for all of the gradients in variable form, so that the formulae could be reused across iterations? If so, was that also said in the lecture, and at what time mark?

Cheers,

Raymond