# 5 Optimization

## 5.1 First and second derivative tests

Read Stewart, Chapter 14; Thomas, Chapter 14.

We will study multivariable scalar functions \[ f: D \to \mathbb{R}\,,\] where \(D\subseteq \mathbb{R}^n\), \(n\geq 2\).

**Definition 5.1 **A function \(f:D \to \mathbb{R}\) has a **local maximum** at \(\mathbf{x_0}\) if
\(f(\mathbf{x_0}) \geq f(\mathbf{x})\) for all \(\mathbf{x} \in B_\delta(\mathbf{x_0})\), for small enough \(\delta\).
\(f\) has a **global maximum** at \(\mathbf{x_0}\) if
\(f(\mathbf{x_0}) \geq f(\mathbf{x})\) for all \(\mathbf{x} \in D\).
\(f\) has a **local (global) minimum** at \(\mathbf{x_0}\) if
\(-f\) has a local (global) maximum at \(\mathbf{x_0}\).

**Theorem 5.1 (First derivative test) **Let \(f:D \to \mathbb{R}\) be a function.
If \(f\) has a local minimum (or maximum) at \(\mathbf{x_0}\) and the partial derivatives of \(f\) exist at \(\mathbf{x_0}\),
then
\[\begin{equation*}
\partial_{x_i} f(\mathbf{x}_0) = 0 \quad \text{for all } i = 1, \dots, n\,.
\end{equation*}\]

The converse is not true: \(\nabla f(\mathbf{x}_0) = \mathbf{0}\) does not imply that \(f\) has a local minimum at \(\mathbf{x}_0\).

**Exercise 5.1 **Think of a function for which the converse of the above theorem fails.

This leads to the following notion.

**Definition 5.2 **\(\mathbf{x}_0\) is said to be a **critical point** of \(f:D\to \mathbb{R}\) if
\[\begin{equation*}
\nabla f(\mathbf{x}_0) = \mathbf{0}
\end{equation*}\]
or one of the partial derivatives \(\partial_{x_i} f(\mathbf{x}_0)\) fails to exist.

**Pay close attention to the “fails to exist” condition.**
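To see why this condition matters, consider the cone \(f(x,y) = \sqrt{x^2+y^2}\) (our own example): it has a global minimum at the origin, yet neither partial derivative exists there, so the origin would be missed if we only looked for points where \(\nabla f = \mathbf{0}\). A minimal Python sketch makes this visible by comparing one-sided difference quotients:

```python
import math

def f(x, y):
    # Cone f(x, y) = sqrt(x^2 + y^2): global minimum at the origin,
    # yet the partial derivatives fail to exist there.
    return math.sqrt(x**2 + y**2)

def partial_x(x, y, h):
    # One-sided difference quotient (f(x+h, y) - f(x, y)) / h.
    return (f(x + h, y) - f(x, y)) / h

# Approaching the origin from the right gives slope +1;
# approaching from the left gives slope -1, so f_x(0, 0) does not exist.
right = partial_x(0.0, 0.0, 1e-6)
left = partial_x(0.0, 0.0, -1e-6)
```

Since the two one-sided slopes disagree, the origin is a critical point by the "fails to exist" clause, even though no equation \(\partial_{x_i} f = 0\) detects it.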

**Theorem 5.2 (Second derivative test for functions of 2 variables) **Suppose the second partial derivatives of \(f\) are continuous near \((a,b)\)
and suppose that \((a,b)\) is a critical point of \(f\).
Let
\[\begin{equation*}
D = f_{xx}(a,b) f_{yy}(a,b) - f_{xy}(a,b)^2\,.
\end{equation*}\]

If \(D>0\) and \(f_{xx}(a,b) >0\), then \(f(a,b)\) is a local minimum.

If \(D>0\) and \(f_{xx}(a,b) <0\), then \(f(a,b)\) is a local maximum.

If \(D<0\), then \(f(a,b)\) is neither a local maximum nor a local minimum (the point \((a,b)\) is a **saddle point**).

If \(D=0\), then the test is inconclusive.
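As an illustration, here is a minimal Python sketch (not part of the notes; the test functions are our own choices) that approximates the second partials by central differences and applies the test:

```python
def second_partials(f, a, b, h=1e-4):
    # Central-difference approximations of f_xx, f_yy, f_xy at (a, b).
    fxx = (f(a + h, b) - 2 * f(a, b) + f(a - h, b)) / h**2
    fyy = (f(a, b + h) - 2 * f(a, b) + f(a, b - h)) / h**2
    fxy = (f(a + h, b + h) - f(a + h, b - h)
           - f(a - h, b + h) + f(a - h, b - h)) / (4 * h**2)
    return fxx, fyy, fxy

def classify(f, a, b):
    # Second derivative test: D = f_xx * f_yy - f_xy**2 at a critical point.
    fxx, fyy, fxy = second_partials(f, a, b)
    D = fxx * fyy - fxy**2
    if D > 0:
        return "local minimum" if fxx > 0 else "local maximum"
    if D < 0:
        return "saddle point"
    return "inconclusive"

# f(x, y) = x**2 - y**2 has its only critical point at the origin,
# where f_xx = 2, f_yy = -2, f_xy = 0, so D = -4 < 0: a saddle point.
result = classify(lambda x, y: x**2 - y**2, 0.0, 0.0)
```

Note that with finite differences a computed \(D\) very close to \(0\) should be treated as inconclusive, just as in the theorem.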

**Theorem 5.3 (Extreme value theorem) **If \(f\) is continuous on a *closed* and *bounded* set \(D\), then
\(f\) attains an absolute minimum and an absolute maximum on \(D\).

### 5.1.1 Algorithm to find absolute maxima and minima on closed bounded regions

1. Find the values of \(f\) at the critical points of \(f\) in \(D\).

2. Find the extreme values of \(f\) on the boundary of \(D\).

3. The largest of the values from steps 1 and 2 is the absolute maximum value; the smallest of these values is the absolute minimum value.
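The steps above can be sketched numerically. In this Python sketch (our own example, not from the notes) we take \(f(x,y) = x^2 + y^2 - 2x\) on the rectangle \([-2,2]\times[-2,2]\); the interior critical point is found by hand and the boundary is sampled:

```python
# Closed-region algorithm on the rectangle [-2, 2] x [-2, 2]
# for f(x, y) = x**2 + y**2 - 2*x  (our own illustrative example).

def f(x, y):
    return x**2 + y**2 - 2 * x

# Step 1: interior critical points. Here grad f = (2x - 2, 2y) = 0
# gives the single critical point (1, 0), found by hand.
candidates = [(1.0, 0.0)]

# Step 2: sample the four boundary edges of the rectangle.
n = 400
for i in range(n + 1):
    t = -2.0 + 4.0 * i / n
    candidates += [(t, -2.0), (t, 2.0), (-2.0, t), (2.0, t)]

# Step 3: compare all candidate values.
values = [f(x, y) for x, y in candidates]
abs_min, abs_max = min(values), max(values)
```

Since \(f(x,y) = (x-1)^2 + y^2 - 1\), the absolute minimum \(-1\) occurs at the interior critical point \((1,0)\), and the absolute maximum \(12\) occurs at the boundary corners \((-2,\pm 2)\), matching the comparison in step 3.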

## 5.2 Constrained optimization

Constrained optimization takes various forms, depending on the assumptions. We will deal with the most straightforward form. The problem we will study is the following:

Maximize/minimize a function \(f:D\to \mathbb{R}\), subject to a constraint (side condition) of the form \(g(\mathbf{x}) = k\), for some constant \(k\in \mathbb{R}\).

**Theorem 5.4 (Method of Lagrange Multipliers) **Suppose the maximum/minimum values of \(f\) exist and \(\nabla g(\mathbf{x}) \neq \mathbf{0}\) at the points where \(g(\mathbf{x}) = k\).
To find the maximum and minimum values of \(f\) subject to the constraint
\(g(\mathbf{x}) = k\), we do the following:

1. Find all values of \(\mathbf{x}\) and \(\lambda \in \mathbb{R}\) such that \[\begin{equation*} \nabla f(\mathbf{x}) =\lambda \nabla g(\mathbf{x})\,, \end{equation*}\] and \[\begin{equation*} g(\mathbf{x}) = k \,. \end{equation*}\]

2. Evaluate \(f\) at all the points \(\mathbf{x}\) that result from step 1. The largest of these values is the maximum value of \(f\); the smallest is the minimum value of \(f\).
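As a worked check (our own example, not from the notes), take \(f(x,y) = xy\) subject to \(g(x,y) = x^2 + y^2 = 2\). The Lagrange system is solved by hand in the comments, and the candidate values are then compared numerically against samples of the constraint circle:

```python
import math

# Extremize f(x, y) = x*y subject to g(x, y) = x**2 + y**2 = 2.
# Step 1 by hand: grad f = lambda * grad g gives
#   y = 2*lambda*x,  x = 2*lambda*y,  x**2 + y**2 = 2,
# whose solutions are the four points (+-1, +-1).

def f(x, y):
    return x * y

# Candidate points from step 1.
candidates = [(1, 1), (1, -1), (-1, 1), (-1, -1)]

# Step 2: evaluate f at the candidates and compare.
values = [f(x, y) for x, y in candidates]
f_max, f_min = max(values), min(values)

# Sanity check: parametrize the circle as (sqrt(2)*cos t, sqrt(2)*sin t)
# and confirm no sampled constraint point beats the candidates.
samples = [f(math.sqrt(2) * math.cos(t), math.sqrt(2) * math.sin(t))
           for t in [2 * math.pi * k / 1000 for k in range(1000)]]
```

On the circle, \(f = 2\cos t \sin t = \sin 2t\), so the sampled values stay within \([-1, 1]\), confirming that the maximum is \(1\) at \((\pm 1, \pm 1)\) with equal signs and the minimum is \(-1\) with opposite signs.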