When is the cost function calculated in a neural network: during forward prop or back prop? Andrew Ng says it is calculated in forward prop, but ChatGPT says it is calculated in back prop.
Prof. Andrew Ng's statement is correct.
During forward propagation, you feed input data through the neural network to make predictions. The input passes through each layer: at every neuron a weighted sum of the inputs is computed and an activation function is applied, until the network produces an output. At the end of this forward pass you calculate the cost (or loss) function, which measures the difference between the network's predictions and the actual target values. The cost function quantifies how well or poorly the network is performing with its current parameters.
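Here is a minimal NumPy sketch of that idea, using a made-up tiny network and toy data (the shapes, initialization, and binary cross-entropy cost are illustrative assumptions, not taken from the course). Note where the cost is computed: at the end of the forward pass, before any backprop happens.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 4 examples with 3 features each (illustrative values only)
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))        # shape: (features, examples)
Y = np.array([[0, 1, 1, 0]])       # true labels

# Parameters of a tiny one-hidden-layer network (assumed architecture)
W1 = rng.normal(size=(5, 3)) * 0.01
b1 = np.zeros((5, 1))
W2 = rng.normal(size=(1, 5)) * 0.01
b2 = np.zeros((1, 1))

# --- Forward propagation ---
Z1 = W1 @ X + b1                   # weighted sum at the hidden layer
A1 = np.tanh(Z1)                   # activation
Z2 = W2 @ A1 + b2                  # weighted sum at the output layer
A2 = sigmoid(Z2)                   # network output (predictions)

# The cost is computed HERE, at the end of forward prop:
# binary cross-entropy between predictions A2 and labels Y
cost = -np.mean(Y * np.log(A2) + (1 - Y) * np.log(1 - A2))
print(f"cost after forward prop: {cost:.4f}")
```

With near-zero initial weights the predictions sit close to 0.5, so the cost comes out near log 2 ≈ 0.69, which is the expected starting point for an untrained binary classifier.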
During backpropagation, you update the parameters (weights and biases) to minimize the cost function. Backpropagation computes the gradients of the cost function with respect to the network's parameters; those gradients are then used by gradient descent (or another optimization algorithm) to adjust the parameters in a direction that reduces the cost.
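To keep the backprop step easy to follow, here is a sketch using a single sigmoid neuron (logistic regression) on toy data; the data, learning rate, and setup are my own illustrative assumptions. The gradients come from backprop, and the parameter update uses them to lower the cost measured on the next forward pass.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy setup (illustrative): a single-neuron "network"
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))        # 4 examples, 3 features
Y = np.array([[0, 1, 1, 0]])
W = rng.normal(size=(1, 3)) * 0.01
b = np.zeros((1, 1))
m = X.shape[1]
lr = 0.5                           # learning rate (assumed)

def forward_cost(W, b):
    """Forward prop, then the cost on the current parameters."""
    A = sigmoid(W @ X + b)
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
    return A, cost

A, cost_before = forward_cost(W, b)

# --- Backpropagation: gradients of the cost w.r.t. the parameters ---
dZ = A - Y                         # combined sigmoid + cross-entropy derivative
dW = (dZ @ X.T) / m                # gradient for the weights
db = np.mean(dZ, keepdims=True)    # gradient for the bias

# Gradient-descent update in the direction that reduces the cost
W = W - lr * dW
b = b - lr * db

_, cost_after = forward_cost(W, b)
print(cost_before, cost_after)
```

Running the forward pass again after the update shows a lower cost, which is the whole point: backprop supplies the gradients, and the optimizer uses them to move the parameters downhill.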
I recommend you avoid using chat tools as a study aid. They're not reliable (as you have discovered).