### Multivariable Optimization using The Second Derivative Test Example 1

We can use a tool called the second derivative test to classify the critical points (local maxima, minima, and saddle points) of a multivariable function. In single-variable calculus, this only required the first and second derivatives of the function. With two variables, we need all of the first and second partial derivatives to classify the points.

Find all the local maximums, minimums, and saddle points for \(f(x,y) = {y^3} + 3{x^2}y - 6{x^2} - 6{y^2} + 2\)

The first step is to find the first partial derivatives, set each to zero, and solve for all of the critical points. I will use f with a subscript to denote the partial derivatives here.

\[{f_x} = 6xy - 12x\]

\[6xy - 12x = 0\]

Factoring gives \(6x(y - 2) = 0\), so x = 0 or y = 2.

\[{f_y} = 3{x^2} + 3{y^2} - 12y\]

Setting \({f_y} = 0\) and dividing through by 3:

\[{x^2} + {y^2} - 4y = 0\]

\[{y^2} - 4y = - {x^2}\]

We now substitute x = 0 and then y = 2 into this equation to find the critical points.

First let's do x = 0:

\[{y^2} - 4y = 0\]

Factoring gives \(y(y - 4) = 0\), so when x = 0, y = 0 or y = 4.

Now do this for y = 2 and solve for x:

\[{x^2} + {(2)^2} - 4(2) = 0\]

\[{x^2} = 4\]

So when y = 2, x = 2 or x = -2.

So the critical points are (0,0), (0,4), (2,2), and (-2,2). When working with a complicated function on an exam, be careful and look for every solution. Students often overlook a critical point entirely, which costs many marks.
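As a quick sanity check (a sketch of my own, not part of the worked example), we can plug each candidate point into both first partials in Python and confirm they vanish:

```python
# First partial derivatives of f(x,y) = y^3 + 3x^2*y - 6x^2 - 6y^2 + 2.
# The names fx and fy are my own labels for this check.

def fx(x, y):
    return 6 * x * y - 12 * x

def fy(x, y):
    return 3 * x**2 + 3 * y**2 - 12 * y

critical_points = [(0, 0), (0, 4), (2, 2), (-2, 2)]

for x, y in critical_points:
    # Both partials must be exactly zero at a critical point.
    assert fx(x, y) == 0 and fy(x, y) == 0
    print(f"({x}, {y}): f_x = {fx(x, y)}, f_y = {fy(x, y)}")
```

A check like this is a cheap way to catch a missed or incorrect point before moving on to the second derivative test.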

The next step is to find all of the second partial derivatives. Note that \({f_{xy}} = {f_{yx}}\) for the functions at our level of study: differentiating with respect to x and then y gives the same result as differentiating with respect to y and then x.

\[{f_{xx}} = 6y - 12\]

\[{f_{yy}} = 6y - 12\]

\[{f_{xy}} = 6x\]
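If you want to double-check the second partials, central finite differences of f give a quick numerical confirmation (this helper code is my own sketch, not part of the example):

```python
# f and its closed-form second partials from the worked example.
def f(x, y):
    return y**3 + 3 * x**2 * y - 6 * x**2 - 6 * y**2 + 2

def fxx(x, y):
    return 6 * y - 12

def fyy(x, y):
    return 6 * y - 12

def fxy(x, y):
    return 6 * x

# Central finite differences at an arbitrary test point.
h = 1e-4
x0, y0 = 1.3, -0.7

fxx_num = (f(x0 + h, y0) - 2 * f(x0, y0) + f(x0 - h, y0)) / h**2
fyy_num = (f(x0, y0 + h) - 2 * f(x0, y0) + f(x0, y0 - h)) / h**2
fxy_num = (f(x0 + h, y0 + h) - f(x0 + h, y0 - h)
           - f(x0 - h, y0 + h) + f(x0 - h, y0 - h)) / (4 * h**2)

assert abs(fxx_num - fxx(x0, y0)) < 1e-3
assert abs(fyy_num - fyy(x0, y0)) < 1e-3
assert abs(fxy_num - fxy(x0, y0)) < 1e-3
```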

The second derivative test is:

\[D = {f_{xx}} \cdot {f_{yy}} - {\left[ {{f_{xy}}} \right]^2}\]

And you check the sign of D at each critical point.

If D>0 and \({f_{xx}}\) < 0, the point is a local maximum.

If D>0 and \({f_{xx}}\) > 0, the point is a local minimum.

If D<0, the point is a saddle point.

If D=0, the test is inconclusive and the point can be a local maximum, minimum, or saddle point.
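The four rules above can be bundled into a short helper using this example's second partials (a sketch of mine; the name `classify` is not from the example):

```python
# Second derivative test for f(x,y) = y^3 + 3x^2*y - 6x^2 - 6y^2 + 2.
def classify(x, y):
    fxx = 6 * y - 12
    fyy = 6 * y - 12
    fxy = 6 * x
    D = fxx * fyy - fxy**2
    if D > 0 and fxx < 0:
        return "local maximum"
    if D > 0 and fxx > 0:
        return "local minimum"
    if D < 0:
        return "saddle point"
    return "inconclusive"

for point in [(0, 0), (0, 4), (2, 2), (-2, 2)]:
    print(point, classify(*point))
```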

As an example, let's apply the second derivative test at the point (0,4):

\[D(0,4) = (12)(12) - {(0)^2}\]

\[D(0,4) = 144\]

D is positive (144) and \({f_{xx}}\) is positive (12), so the point (0,4) is a local minimum.

Repeating for the point (0,0), we get D = 144 and \({f_{xx}} = -12 < 0\), so (0,0) is a local maximum.

Checking (2,2) and (-2,2) each gives D = -144, so those two points are saddle points.
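As a final sanity check (again my own sketch, not part of the example), we can sample f on a tiny circle around each point: every nearby value should sit below f(0,0) for the maximum, above f(0,4) for the minimum, and on both sides of f(2,2) for a saddle:

```python
import math

def f(x, y):
    return y**3 + 3 * x**2 * y - 6 * x**2 - 6 * y**2 + 2

def neighborhood_values(cx, cy, r=0.01, n=16):
    # Sample f at n points on a circle of radius r around (cx, cy).
    return [f(cx + r * math.cos(2 * math.pi * k / n),
              cy + r * math.sin(2 * math.pi * k / n)) for k in range(n)]

assert all(v < f(0, 0) for v in neighborhood_values(0, 0))   # local max
assert all(v > f(0, 4) for v in neighborhood_values(0, 4))   # local min
saddle = neighborhood_values(2, 2)
assert min(saddle) < f(2, 2) < max(saddle)                   # saddle
```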

Thus we've classified the multivariable function's critical points using the second derivative test. The main difficulty in these problems is making sure you have found all of the critical points and remembering how to take partial derivatives.

Please see our other worked-out examples of multivariable optimization using either Lagrange multipliers or the second derivative test.

Lagrange Multiplier Examples:

Lagrange Multipliers Optimization Ex2

Lagrange Multipliers Optimization Ex3

Return to the Multivariable Optimization hub page:

Multivariable Optimization