In the first hidden layer, how does a neuron determine the values for w and b? Isn't gradient descent needed to do that?


Hi @Arisha_Prasain, the answer you are looking for is not fully explained in this course. The algorithm that updates the weights of the hidden layers and the input layer is known as the backpropagation algorithm. The comparison of the prediction (y_predict) with the real output (y) takes place at the output layer, and the error is propagated back to the previous layers to update their weights. This concept is discussed in detail in the Neural Networks and Deep Learning course.
Here is a 3Blue1Brown video explaining how the algorithm works.
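To make the idea concrete, here is a minimal NumPy sketch of one such training loop, assuming a single hidden layer with sigmoid activations and a squared-error loss. The shapes, names, and learning rate below are illustrative assumptions, not taken from the course materials:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))                # 4 samples, 2 features (toy data)
y = np.array([[0.0], [1.0], [1.0], [0.0]])

# Randomly initialized weights and biases; gradient descent updates them.
W1, b1 = rng.normal(size=(2, 3)), np.zeros((1, 3))
W2, b2 = rng.normal(size=(3, 1)), np.zeros((1, 1))
lr = 0.5

for step in range(1000):
    # Forward pass
    a1 = sigmoid(X @ W1 + b1)              # hidden-layer activations
    y_predict = sigmoid(a1 @ W2 + b2)      # output-layer prediction

    # Backward pass: the error is computed at the output layer...
    d2 = (y_predict - y) * y_predict * (1 - y_predict)
    # ...and transmitted back through W2 to the hidden layer.
    d1 = (d2 @ W2.T) * a1 * (1 - a1)

    # Gradient-descent update for every layer's w and b
    W2 -= lr * (a1.T @ d2)
    b2 -= lr * d2.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d1)
    b1 -= lr * d1.sum(axis=0, keepdims=True)
```

The key point is that the same gradient-descent update rule applies to every layer; backpropagation is just the procedure for computing each layer's gradients from the output error.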


Thank you for sharing the resource! You mean the Deep Learning Specialization offered by DeepLearning.AI also explains this, right?

@Arisha_Prasain Yes, it is explained in detail in the first course of the Deep Learning Specialization.
