Greetings,

I have a question about the “forward_propagation(X, parameters)” Python function in the Week 3 assignment. The function uses the “parameters” randomly initialized in the “initialize_parameters(n_x, n_h, n_y)” function. However, shouldn’t the “forward_propagation” function be fed with unknown parameters W1, b1, W2 and b2, so that at the end we obtain our formula for the cost function? Shouldn’t the parameters be initialized when we implement the gradient descent algorithm, not before that?

Hi @Ayoub, we meet again. Not sure exactly what you mean by “unknown parameters W1, b1, W2 and b2”. You need to have some values to start the forward propagation step; otherwise you cannot do the calculation in the first forward pass. Isn’t that the same as randomly initializing, or do you mean something different?
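To make the point concrete, here is a minimal sketch (not the exact course code; the shapes and the 0.01 scaling follow the assignment's convention) of why some numerical values must exist before the first forward pass: with randomly initialized W1, b1, W2, b2 the matrix products are computable, whereas with no values at all there is nothing to multiply.

```python
import numpy as np

def initialize_parameters(n_x, n_h, n_y):
    # Small random weights and zero biases: placeholder values that
    # make the very first forward pass numerically computable.
    return {
        "W1": np.random.randn(n_h, n_x) * 0.01,
        "b1": np.zeros((n_h, 1)),
        "W2": np.random.randn(n_y, n_h) * 0.01,
        "b2": np.zeros((n_y, 1)),
    }

parameters = initialize_parameters(n_x=2, n_h=4, n_y=1)
X = np.random.randn(2, 5)                      # 5 example inputs
Z1 = parameters["W1"] @ X + parameters["b1"]   # needs concrete W1, b1
print(Z1.shape)                                # (4, 5)
```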

Regards, Stephanus

Yep. We meet again.

In fact, what isn’t clear to me yet is this part: “You need to have some values to start the forward propagation step; otherwise you cannot do the calculation in the first forward pass. Isn’t that the same as randomly initializing?”

We know that the variables of any function are symbols that act as placeholders for expressions or quantities that may vary. Thus, in order to define J, we first need to define the predictions A2, which are functions of four variables (W1, b1, W2 and b2).
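In symbols, taking the tanh hidden layer and sigmoid output used in this assignment (my reading of the course notation), the dependence of A2 and J on the four parameters is:

```latex
A^{[2]} = \sigma\!\big(W^{[2]} \tanh(W^{[1]} X + b^{[1]}) + b^{[2]}\big),
\qquad
J = -\frac{1}{m} \sum_{i=1}^{m}
\Big( y^{(i)} \log a^{[2](i)} + \big(1 - y^{(i)}\big) \log\big(1 - a^{[2](i)}\big) \Big)
```

So J is indeed a function of (W1, b1, W2, b2) once X and the labels are fixed.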

But after thinking about it more deeply, I believe the “forward_propagation(X, parameters)” function does define the predictions in the sense I described, and at the same time evaluates that definition at a particular input X using the particular (randomly initialized) parameter values it is given.
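A sketch of that dual role, again assuming the tanh/sigmoid architecture of this assignment (not the graded code itself): the function body defines A2 as a function of (W1, b1, W2, b2), while each call evaluates it at whatever numerical parameter values it receives, first the random initialization and later the values updated by gradient descent.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_propagation(X, parameters):
    # Symbolically: A2 = sigmoid(W2 @ tanh(W1 @ X + b1) + b2).
    # Numerically: evaluated at the particular values in `parameters`.
    W1, b1 = parameters["W1"], parameters["b1"]
    W2, b2 = parameters["W2"], parameters["b2"]
    A1 = np.tanh(W1 @ X + b1)
    A2 = sigmoid(W2 @ A1 + b2)
    return A2

# The same function is reused with updated parameters on every gradient
# descent iteration; initialization just supplies the first evaluation point.
rng = np.random.default_rng(0)
params = {"W1": rng.standard_normal((4, 2)) * 0.01, "b1": np.zeros((4, 1)),
          "W2": rng.standard_normal((1, 4)) * 0.01, "b2": np.zeros((1, 1))}
A2 = forward_propagation(rng.standard_normal((2, 3)), params)
print(A2.shape)  # (1, 3)
```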