I am trying to understand how, in a neural network model's layers (especially the first one), the algorithm decides which values to take for w and b in the equation z = w*x + b. Here x is the input, and sigmoid(z) (or another activation function) is the output. But how does the algorithm decide the values of w and b for the neurons?

Hello @MuratAn

Welcome to our community.

The algorithm starts with random values for all the weights; b can be initialized to a random value or to 0. From there on, the gradient descent algorithm modifies the values of w and b on each iteration, until the convergence criterion is met (or the stipulated number of iterations is completed).
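To make that concrete, here is a minimal sketch of the idea for a single neuron with a sigmoid activation. The dataset, learning rate, and iteration count are made up for illustration; w starts random, b starts at 0, and gradient descent nudges both toward values that fit the data:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Tiny made-up dataset: label is 1 when x > 0.5
data = [(0.1, 0), (0.3, 0), (0.7, 1), (0.9, 1)]

random.seed(0)
w = random.uniform(-1, 1)  # weight starts at a random value
b = 0.0                    # bias can start at zero

lr = 0.5  # learning rate (illustrative choice)
for _ in range(5000):
    for x, y in data:
        a = sigmoid(w * x + b)  # forward pass: a = sigmoid(w*x + b)
        dz = a - y              # gradient of the cross-entropy loss w.r.t. z
        w -= lr * dz * x        # gradient descent update for w
        b -= lr * dz            # gradient descent update for b

print(sigmoid(w * 0.1 + b), sigmoid(w * 0.9 + b))
```

After training, the neuron's output is close to 0 for the small inputs and close to 1 for the large ones, even though w started out random. Real frameworks do exactly this, just vectorized and over many weights at once.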


The weight values are updated while training the NN. The method is called "backpropagation", if you want to research it online.

This course doesn't discuss the NN training process in detail. It's handled for you automatically when you use an ML package like TensorFlow.


Thank you guys for the responses. I understand now.