Please help with understanding Newton's optimization method
When using Newton's method, which finds a zero of a function, we modify it slightly to find a minimum instead: we apply it to the first derivative, which also requires the second derivative. But later we say that a gradient of 0 only tells us the point is a candidate for a minimum, and that the Hessian matrix being positive definite or negative definite tells us whether the point is a local minimum or a local maximum.
So, according to Newton's optimization method, do we need to check all of those local minima and local maxima in order to find the global minimum or maximum?
Hi @ayush.hakhu
When the gradient (first derivative) of the function is zero at a point, that point is a stationary point: a candidate for a minimum, a maximum, or a saddle point. However, the gradient alone cannot tell you which type of stationary point it is. For example, f(x) = x^3 has a zero derivative at x = 0, yet that point is neither a minimum nor a maximum.
The Hessian matrix consists of the second-order partial derivatives of the function. If the Hessian evaluated at the stationary point is positive definite (all eigenvalues positive), the point is a local minimum. If it is negative definite (all eigenvalues negative), the point is a local maximum. If it has both positive and negative eigenvalues, the point is a saddle point.
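To make the classification concrete, here is a minimal Python sketch (the function f(x, y) = x^2 - y^2, the constant Hessian H, and the helper name classify_stationary_point are all assumptions for illustration, not anything from the course material):

```python
import numpy as np

# Assumed example: f(x, y) = x**2 - y**2 has a stationary point at the origin
# (the gradient is zero there), but its Hessian has eigenvalues of both signs,
# so the origin is a saddle point rather than a minimum or maximum.
H = np.array([[2.0, 0.0],
              [0.0, -2.0]])  # Hessian of f(x, y) = x**2 - y**2 (constant)

def classify_stationary_point(hessian):
    """Classify a stationary point from the eigenvalues of its Hessian."""
    eigvals = np.linalg.eigvalsh(hessian)  # eigenvalues of a symmetric matrix
    if np.all(eigvals > 0):
        return "local minimum (positive definite)"
    if np.all(eigvals < 0):
        return "local maximum (negative definite)"
    if np.any(eigvals > 0) and np.any(eigvals < 0):
        return "saddle point (mixed-sign eigenvalues)"
    return "inconclusive (some eigenvalue is zero)"

print(classify_stationary_point(H))  # saddle point (mixed-sign eigenvalues)
```

Note that np.linalg.eigvalsh assumes a symmetric matrix, which the Hessian is for twice continuously differentiable functions.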
Newton’s method is not guaranteed to find the global minimum or maximum of a function. It can converge to a local minimum or maximum depending on the initial guess and the function’s characteristics. To find the global minimum or maximum, multiple initial guesses may need to be tried, or alternative optimization methods such as gradient descent with random restarts or simulated annealing may be used.
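As a hedged sketch of the restart idea, the following code runs Newton's method on the derivative of a one-dimensional function from many random starting points, keeps the candidates where the second derivative is positive (local minima), and picks the best one. The example function f(x) = x^4 - 3x^2 + x and every helper name here are made up for illustration:

```python
import numpy as np

# Newton's method for optimization iterates x <- x - f'(x) / f''(x),
# i.e. it searches for a zero of the gradient. Different starting points
# can converge to different stationary points.
def f(x):
    return x**4 - 3 * x**2 + x   # assumed example: two local minima, one local maximum

def df(x):
    return 4 * x**3 - 6 * x + 1  # first derivative

def d2f(x):
    return 12 * x**2 - 6         # second derivative

def newton(x0, tol=1e-10, max_iter=100):
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)    # Newton step applied to f'
        x -= step
        if abs(step) < tol:
            break
    return x

rng = np.random.default_rng(0)
candidates = [newton(x0) for x0 in rng.uniform(-3, 3, size=20)]

# Keep only converged points where the second derivative is positive
# (local minima), then take the lowest one as the global-minimum estimate.
minima = [x for x in candidates if abs(df(x)) < 1e-6 and d2f(x) > 0]
best = min(minima, key=f)
print(f"global minimum estimate: x = {best:.6f}, f(x) = {f(best):.6f}")
```

Each restart can land in a different basin of attraction, which is why a single run of Newton's method by itself tells you nothing about the global minimum.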