Why calculate the eigenvalues of the Hessian to determine the concavity of a function?

If, for a function of one variable, a point x is a local minimum when f''(x) > 0 and vice versa, wouldn't a point (x, y) be a local minimum of a function of two variables if f_xx > 0 and f_yy > 0 at that point, and vice versa? So wouldn't we just have to look at the elements on the main diagonal of the Hessian matrix?

Is this true in all cases? It seemed to be true for the examples shown in the lecture, but I wanted to know what the point is of calculating the eigenvalues of the Hessian matrix.

You are asking about what's called the second partial derivative test, in which you check whether det(Hessian) is positive or negative and whether f_xx is positive or negative at the critical point. See the info in the link below.
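To see why the diagonal alone is not enough, here is a small counterexample (my own, not from the lecture): f(x, y) = x^2 + 4xy + y^2 has f_xx = f_yy = 2 > 0 at the origin, yet (0, 0) is a saddle point. Both the determinant check and the eigenvalue signs detect this:

```python
import numpy as np

# Hessian of f(x, y) = x^2 + 4xy + y^2 at the origin:
# both diagonal entries are positive, but (0, 0) is a saddle point.
H = np.array([[2.0, 4.0],
              [4.0, 2.0]])

eigenvalues = np.linalg.eigvalsh(H)  # symmetric matrix -> real eigenvalues
det = np.linalg.det(H)               # 2*2 - 4*4 = -12

print(eigenvalues)  # one negative, one positive -> indefinite -> saddle point
print(det)          # negative -> saddle point by the second partial derivative test
```

The mixed partial f_xy = 4 couples the two variables, which is exactly what looking only at the main diagonal misses.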


Thank you for your answer. I guess the determinant test only works with two variables, and the general form looks at the eigenvalues.
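Right: with n variables the general rule classifies the critical point by the signs of all the Hessian's eigenvalues (all positive: local minimum; all negative: local maximum; mixed signs: saddle). A quick three-variable sketch, using a function I made up for illustration:

```python
import numpy as np

# Hypothetical example: f(x, y, z) = x^2 + y^2 + z^2 + xy.
# Its (constant) Hessian is:
H = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 2.0]])

eig = np.linalg.eigvalsh(H)  # eigenvalues in ascending order
print(eig)  # all positive -> Hessian is positive definite -> local minimum
```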