Why do we update each weight and bias to minimize log loss in neural networks?
We update the weights and bias values in order to reach the minimum cost. The log loss is the cost function here, so the lower it gets, the closer the network's predictions are to the true labels.
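In case a concrete picture helps, here is a minimal sketch (not code from any particular course or assignment) of gradient descent updating a single neuron's weights and bias to reduce log loss. The toy data, learning rate, and iteration count are illustrative assumptions:

```python
# A minimal sketch: one sigmoid neuron trained by gradient descent
# to minimize log loss (binary cross-entropy). Toy data only.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))               # 100 samples, 2 features
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # toy binary labels

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate

for step in range(1000):
    z = X @ w + b
    a = 1.0 / (1.0 + np.exp(-z))            # sigmoid prediction
    # Log loss being minimized: L = -mean(y*log(a) + (1-y)*log(1-a))
    grad_z = a - y                          # dL/dz for sigmoid + log loss
    grad_w = X.T @ grad_z / len(y)          # dL/dw
    grad_b = grad_z.mean()                  # dL/db
    w -= lr * grad_w                        # step downhill on the cost
    b -= lr * grad_b
```

Each update moves `w` and `b` a small step in the direction that decreases the log loss, which is what "updating the weights and bias to reach the minimum cost" means in practice.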
Does that mean each layer in the neural network runs multiple times in order to find the best weight values from one perceptron to another? Please correct me if I'm wrong. Thanks!
Yes.
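For a concrete picture of that repetition, here is a minimal sketch assuming a toy two-layer NumPy network (the layer sizes, data, and learning rate are illustrative assumptions, not official course code). Every training iteration runs all the layers forward, then nudges every layer's weights and biases:

```python
# A minimal sketch: a tiny 2-layer network where the forward and
# backward passes through every layer repeat on each iteration.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)  # toy labels

W1, b1 = rng.normal(size=(2, 4)) * 0.5, np.zeros(4)   # hidden layer
W2, b2 = rng.normal(size=(4, 1)) * 0.5, np.zeros(1)   # output layer
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(2000):            # the layers "run multiple times"
    # Forward pass: every layer computes its output.
    a1 = sigmoid(X @ W1 + b1)
    a2 = sigmoid(a1 @ W2 + b2)
    # Backward pass: gradients of the log loss for each layer.
    dz2 = (a2 - y) / len(y)
    dz1 = (dz2 @ W2.T) * a1 * (1 - a1)
    # Update: every weight and bias takes a small step downhill.
    W2 -= lr * (a1.T @ dz2)
    b2 -= lr * dz2.sum(axis=0)
    W1 -= lr * (X.T @ dz1)
    b1 -= lr * dz1.sum(axis=0)
```

So no single pass finds the "best" weights; the same layers are re-run many times, with each pass improving the weights a little.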
Sorry, please ignore these two messages; I posted them on the wrong thread. My apologies.