---
title: "NMinValue"
language: "en"
type: "Symbol"
summary: "NMinValue[f, x] gives the global minimum value of f with respect to x. NMinValue[f, {x, y, ...}] gives the global minimum value of f with respect to x, y, .... NMinValue[{f, cons}, {x, y, ...}] gives the global minimum value of f subject to the constraints cons. NMinValue[..., x \\[Element] reg] constrains x to be in the region reg."
keywords: 
- constrained optimization
- cost function
- differential evolution
- extremization
- flexible polyhedron method
- global minimization
- goal functions
- integer programming
- linear programming
- minimization
- Nelder-Mead
- numerical minimization
- objective functions
- operations research
- optimization
- pay-off functions
- random search
- simulated annealing
- argmin
- minpos
- NelderMead
- DifferentialEvolution
- SimulatedAnnealing
- RandomSearch
- AMOEBA
- CONSTRAINED_MIN
- DFPMIN
- POWELL
- minimize
canonical_url: "https://reference.wolfram.com/language/ref/NMinValue.html"
source: "Wolfram Language Documentation"
related_guides: 
  - 
    title: "Optimization"
    link: "https://reference.wolfram.com/language/guide/Optimization.en.md"
  - 
    title: "Solvers over Regions"
    link: "https://reference.wolfram.com/language/guide/GeometricSolvers.en.md"
  - 
    title: "Symbolic Vectors, Matrices and Arrays"
    link: "https://reference.wolfram.com/language/guide/SymbolicArrays.en.md"
  - 
    title: "Convex Optimization"
    link: "https://reference.wolfram.com/language/guide/ConvexOptimization.en.md"
related_functions: 
  - 
    title: "NArgMin"
    link: "https://reference.wolfram.com/language/ref/NArgMin.en.md"
  - 
    title: "NMinimize"
    link: "https://reference.wolfram.com/language/ref/NMinimize.en.md"
  - 
    title: "NMaxValue"
    link: "https://reference.wolfram.com/language/ref/NMaxValue.en.md"
  - 
    title: "MinValue"
    link: "https://reference.wolfram.com/language/ref/MinValue.en.md"
  - 
    title: "FindMinValue"
    link: "https://reference.wolfram.com/language/ref/FindMinValue.en.md"
  - 
    title: "Min"
    link: "https://reference.wolfram.com/language/ref/Min.en.md"
  - 
    title: "LinearOptimization"
    link: "https://reference.wolfram.com/language/ref/LinearOptimization.en.md"
  - 
    title: "ConvexOptimization"
    link: "https://reference.wolfram.com/language/ref/ConvexOptimization.en.md"
  - 
    title: "GeometricOptimization"
    link: "https://reference.wolfram.com/language/ref/GeometricOptimization.en.md"
  - 
    title: "RegionDistance"
    link: "https://reference.wolfram.com/language/ref/RegionDistance.en.md"
related_tutorials: 
  - 
    title: "Numerical Mathematics: Basic Operations"
    link: "https://reference.wolfram.com/language/tutorial/NumericalCalculations.en.md"
  - 
    title: "Numerical Optimization"
    link: "https://reference.wolfram.com/language/tutorial/NumericalOperationsOnFunctions.en.md#24524"
  - 
    title: "Numerical Nonlinear Global Optimization"
    link: "https://reference.wolfram.com/language/tutorial/ConstrainedOptimizationGlobalNumerical.en.md"
  - 
    title: "Constrained Optimization"
    link: "https://reference.wolfram.com/language/tutorial/ConstrainedOptimizationOverview.en.md"
  - 
    title: "Unconstrained Optimization"
    link: "https://reference.wolfram.com/language/tutorial/UnconstrainedOptimizationOverview.en.md"
  - 
    title: "Implementation notes: Numerical and Related Functions"
    link: "https://reference.wolfram.com/language/tutorial/SomeNotesOnInternalImplementation.en.md#10453"
---
# NMinValue

NMinValue[f, x] gives the global minimum value of f with respect to x.

NMinValue[f, {x, y, …}] gives the global minimum value of f with respect to x, y, …. 

NMinValue[{f, cons}, {x, y, …}] gives the global minimum value of f subject to the constraints cons.

NMinValue[…, x∈reg] constrains x to be in the region reg.

## Details and Options

* The problem solved by ``NMinValue`` is also known as global optimization (GO).

* ``NMinValue`` always attempts to find a global minimum of ``f`` subject to the constraints given.

* ``NMinValue`` is typically used to find the smallest possible values given constraints. In different areas, this may be called the best strategy, best fit, best configuration and so on.

[image]

* If ``f`` and ``cons`` are linear or convex, the result given by ``NMinValue`` will be the global minimum, over both real and integer values; otherwise, the result may sometimes only be a local minimum.
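
For instance, a convex quadratic objective with a linear constraint is solved to global optimality (an illustrative sketch; the minimum value $2$ is attained at $(1,1)$):

```wl
In[1]:= NMinValue[{x^2 + y^2, x + y ≥ 2}, {x, y}]

Out[1]= 2.
```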

* If ``NMinValue`` determines that the constraints cannot be satisfied, it returns ``Infinity``.
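
An illustrative sketch of the infeasible case (the warning message that is also issued is omitted here):

```wl
In[1]:= NMinValue[{x^2, x ≥ 1 && x ≤ 0}, x]

Out[1]= ∞
```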

* ``NMinValue`` supports a modeling language where the objective function ``f`` and constraints ``cons`` are given in terms of expressions depending on scalar or vector variables. ``f`` and ``cons`` are typically parsed into very efficient forms, but as long as ``f`` and the terms in ``cons`` give numerical values for numerical values of the variables, ``NMinValue`` can often find a solution.

* The constraints ``cons`` can be any logical combination of:

|                                            |                                          |
| :----------------------------------------- | :--------------------------------------- |
| lhs == rhs                                 | equations                                |
| lhs > rhs, lhs ≥ rhs, lhs < rhs, lhs ≤ rhs | inequalities (LessEqual, …)              |
| lhs \[VectorGreater] rhs, lhs \[VectorGreaterEqual] rhs, lhs \[VectorLess] rhs, lhs \[VectorLessEqual] rhs | vector inequalities (VectorLessEqual, …) |
| {x, y, …}∈rdom                             | region or domain specification           |

* ``NMinValue[{f, cons}, x∈rdom]`` is effectively equivalent to ``NMinValue[{f, cons && x∈rdom}, x]``.
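
As a sketch of this equivalence, both forms give the same minimum, $-1$, attained at $(0,-1)$:

```wl
In[1]:= NMinValue[{x + y, x ≥ 0}, {x, y}∈Disk[]]

Out[1]= -1.

In[2]:= NMinValue[{x + y, x ≥ 0 && {x, y}∈Disk[]}, {x, y}]

Out[2]= -1.
```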

* For ``x∈rdom``, the different coordinates can be referred to using ``Indexed[x, i]``.
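
For example, minimizing the sum of the coordinates of a vector variable constrained to the unit disk (a sketch; the minimum $-\sqrt{2}\approx -1.41421$ is attained at $(-1/\sqrt{2},-1/\sqrt{2})$):

```wl
In[1]:= NMinValue[Indexed[x, 1] + Indexed[x, 2], x∈Disk[]]

Out[1]= -1.41421
```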

* Possible domains ``rdom`` include:

|                       |                                                                                                |
| --------------------- | ---------------------------------------------------------------------------------------------- |
| Reals                 | real scalar variable                                                                           |
| Integers              | integer scalar variable                                                                        |
| Vectors[n, dom]       | vector variable in $\mathbb{R}^n$ or $\mathbb{Z}^n$              |
| Matrices[{m, n}, dom] | matrix variable in $\mathbb{R}^{m\times n}$ or $\mathbb{Z}^{m\times n}$ |
| ℛ                     | vector variable restricted to the geometric region $\mathcal{R}$ |

* By default, all variables are assumed to be real.

* The following options can be given:

|                    |                  |                                                  |
| :----------------- | :--------------- | :----------------------------------------------- |
| AccuracyGoal       | Automatic        | number of digits of final accuracy sought        |
| EvaluationMonitor  | None             | expression to evaluate whenever f is evaluated   |
| MaxIterations      | Automatic        | maximum number of iterations to use              |
| Method             | Automatic        | method to use                                    |
| PrecisionGoal      | Automatic        | number of digits of final precision sought       |
| StepMonitor        | None             | expression to evaluate whenever a step is taken  |
| WorkingPrecision   | MachinePrecision | the precision used in internal computations      |

* The settings for ``AccuracyGoal`` and ``PrecisionGoal`` specify the number of digits to seek in both the position of the minimum and the value of the function at the minimum.

* ``NMinValue`` continues until either of the goals specified by ``AccuracyGoal`` or ``PrecisionGoal`` is achieved.

* The methods for ``NMinValue`` fall into two classes. The first class of guaranteed methods uses properties of the problem so that, when the method converges, the minimum found is guaranteed to be global. The second class of heuristic methods may use multiple local searches, commonly adjusted by some stochasticity, to home in on a global minimum. These methods often do find the global minimum, but are not guaranteed to do so.

* Methods that are guaranteed to give a global minimum when they converge to a solution include:

|          |                                                       |
| -------- | ----------------------------------------------------- |
| "Convex" | use only convex methods                               |
| "MOSEK"  | use the commercial MOSEK library for convex problems  |
| "Gurobi" | use the commercial Gurobi library for convex problems |
| "Xpress" | use the commercial Xpress library for convex problems |

* Heuristic methods include:

|                         |                                                                         |
| ----------------------- | ----------------------------------------------------------------------- |
| "NelderMead"            | simplex method of Nelder and Mead                                       |
| "DifferentialEvolution" | use differential evolution                                              |
| "SimulatedAnnealing"    | use simulated annealing                                                 |
| "RandomSearch"          | use the best local minimum found from multiple random starting points   |
| "Couenne"               | use the Couenne library for non-convex mixed-integer nonlinear problems |
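
A method can also be given with suboptions in a list; for example, increasing the number of starting points used by "RandomSearch" (an illustrative sketch assuming the "SearchPoints" suboption of that method):

```wl
In[1]:= NMinValue[{Sin[x] + Sin[10 x/3], 2 ≤ x ≤ 7}, x, Method -> {"RandomSearch", "SearchPoints" -> 100}]

Out[1]= -1.8996
```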

---

## Examples (65)

### Basic Examples (4)

Find the global minimum value of a univariate function:

```wl
In[1]:= NMinValue[2x ^ 2 - 3x + 5, x]

Out[1]= 3.875
```

---

Find the global minimum value of a multivariate function:

```wl
In[1]:= NMinValue[(x y - 3) ^ 2 + 1, {x, y}]

Out[1]= 1.
```

---

Find the global minimum value of a function subject to constraints:

```wl
In[1]:= NMinValue[{x - 2y, x ^ 2 + y ^ 2 ≤ 1}, {x, y}]

Out[1]= -2.23607
```

---

Find the global minimum value of a function over a geometric region:

```wl
In[1]:= NMinValue[x + y, {x, y}∈Disk[]]

Out[1]= -1.41421
```

### Scope (40)

#### Basic Uses (12)

Find the minimum value of $x+2y$ subject to the constraints $x^2+2y^2\leq 3$, $x+y=2$, $x\geq 1$ :

```wl
In[1]:= NMinValue[{x + 2y, x^2 + 2y^2 ≤ 3, x + y == 2, x ≥ 1}, {x, y}]

Out[1]= 2.33333
```

---

Several linear inequality constraints can be expressed with ``VectorGreaterEqual`` :

```wl
In[1]:= NMinValue[{x + y, VectorGreaterEqual[{{x + 2y, x}, {3, -1}}]}, {x, y}]

Out[1]= 1.
```

Use Esc ``v>=`` Esc or ``\[VectorGreaterEqual]`` to enter the vector inequality sign:

```wl
In[2]:= NMinValue[{x + y, {x + 2y, x}\[VectorGreaterEqual]{3, -1}}, {x, y}]

Out[2]= 1.
```

An equivalent form using scalar inequalities:

```wl
In[3]:= NMinValue[{x + y, x + 2y ≥ 3, x ≥ -1}, {x, y}]

Out[3]= 1.
```

---

Use a vector variable $v=\{x,y\}$:

```wl
In[1]:= NMinValue[{{1, 1}.v, {{1, 2}, {1, 0}}.v\[VectorGreaterEqual]{3, -1}}, v]

Out[1]= 1.
```

---

The inequality $a.x+b\succeq 0$ may not be the same as $a.x\succeq -b$ due to possible threading in $a.x+b$:

```wl
In[1]:= NMinValue[{{1, 1}.x, {{1, 2}, {1, 0}}.x + {-3, 1}\[VectorGreaterEqual]0}, x]

Out[1]= 3.

In[2]:= NMinValue[{{1, 1}.x, {{1, 2}, {1, 0}}.x\[VectorGreaterEqual]{3, -1}}, x]

Out[2]= 1.
```

To avoid unintended threading in $a.x+b$, use ``Inactive[Plus]``:

```wl
In[3]:= NMinValue[{{1, 1}.x, Inactive[Plus][{{1, 2}, {1, 0}}.x, {-3, 1}]\[VectorGreaterEqual]0}, x]

Out[3]= 1.
```

---

Use constant parameter equations to avoid unintended threading in $a.x+b$ :

```wl
In[1]:= parEqs = {a == {{1, 2}, {1, 0}}, b == {3, -1}};

In[2]:= NMinValue[{{1, 1}.x, {parEqs, a.x + b\[VectorGreaterEqual]0}}, x]

Out[2]= -1.
```

---

``VectorGreaterEqual`` represents a conic inequality with respect to the ``"NonNegativeCone"`` :

```wl
In[1]:= constraint = VectorGreaterEqual[{{{1, 2}, {1, 0}}.v, {3, -1}}, "NonNegativeCone"]

Out[1]= VectorGreaterEqual[{{{1, 2}, {1, 0}}.v, {3, -1}}, "NonNegativeCone"]
```

To explicitly specify the dimension of the cone, use ``{"NonNegativeCone", n}``:

```wl
In[2]:= constraint = VectorGreaterEqual[{{{1, 2}, {1, 0}}.v, {3, -1}}, {"NonNegativeCone", 2}]

Out[2]= VectorGreaterEqual[{{{1, 2}, {1, 0}}.v, {3, -1}}, {"NonNegativeCone", 2}]
```

Find the minimum value:

```wl
In[3]:= NMinValue[{{1, 1}.v, constraint}, v]

Out[3]= 1.
```

---

Find the minimum value of $x+y$ subject to the constraint $x^2+y^2\leq 9$ :

```wl
In[1]:= NMinValue[{x + y, x^2 + y^2 ≤ 9}, {x, y}]

Out[1]= -4.24264
```

Specify the constraint $x^2+y^2\leq 9$ using a conic inequality with ``"NormCone"``:

```wl
In[2]:= constraint = VectorGreaterEqual[{{x, y, 3}, 0}, {"NormCone", 3}]

Out[2]= VectorGreaterEqual[{{x, y, 3}, 0}, {"NormCone", 3}]
```

Find the minimum value:

```wl
In[3]:= NMinValue[{x + y, constraint}, {x, y}]

Out[3]= -4.24264
```

---

Find the minimum value of the function $\sum _{i=1}^3x_i$ subject to the constraints $a.x\succeq b$, $x_1\geq 1$, $x\in \mathbb{R}^3$ :

```wl
In[1]:= {a, b} = {{{1, 1, 0}, {0, 1, 1}, {1, 1, 1}}, {2, 2, -1}};
```

Use ``Indexed`` to access components of a vector variable, e.g. $x_1$ :

```wl
In[2]:= NMinValue[{Total[x], a.x\[VectorGreaterEqual]b, Indexed[x, 1] ≥ 1}, x]

Out[2]= 3.
```

---

Use ``Vectors[n, dom]`` to specify the dimension and domain of a vector variable when it is ambiguous:

```wl
In[1]:= NMinValue[{Indexed[x, {1}] + 2 Indexed[x, {2}], 
	VectorGreaterEqual[{{{Indexed[x, {1}], 1}, {1, Indexed[x, {2}]}}, 0}, {"SemidefiniteCone", 2}]}, x]
```

NMinValue::itdim: The dimensionality of variable x is not well specified.

```wl
Out[1]= NMinValue[{Indexed[x, {1}] + 2 Indexed[x, {2}], VectorGreaterEqual[{{{Indexed[x, {1}], 1}, {1, Indexed[x, {2}]}}, 0}, {"SemidefiniteCone", 2}]}, x]

In[2]:= NMinValue[{Indexed[x, {1}] + 2 Indexed[x, {2}], 
	VectorGreaterEqual[{{{Indexed[x, {1}], 1}, {1, Indexed[x, {2}]}}, 0}, {"SemidefiniteCone", 2}]}, x∈Vectors[2, Reals]]

Out[2]= 2.82843
```

---

Specify non-negative constraints using ``NonNegativeReals`` ($\mathbb{R}_{\geq \, 0}$):

```wl
In[1]:= NMinValue[{{1, 1}.x, VectorGreaterEqual[{{Indexed[x, {1}], Indexed[x, {2}], 1}, 0}, {"NormCone", 3}]}, x∈Vectors[2, NonNegativeReals]]

Out[1]= -8.218923451501242`*^-7
```

An equivalent form using the vector inequality $x\succeq 0$:

```wl
In[2]:= NMinValue[{{1, 1}.x, VectorGreaterEqual[{{Indexed[x, {1}], Indexed[x, {2}], 1}, 0}, {"NormCone", 3}], x\[VectorGreaterEqual]0}, x]

Out[2]= -8.218923451501242`*^-7
```

---

Specify non-positive constraints using ``NonPositiveReals`` ($\mathbb{R}_{\leq \, 0}$):

```wl
In[1]:= NMinValue[{-Total[v], {1, 2}.v\[VectorLessEqual]-3}, v∈Vectors[2, NonPositiveReals]]

Out[1]= 1.5
```

An equivalent form using vector inequalities:

```wl
In[2]:= NMinValue[{-Total[v], {1, 2}.v\[VectorLessEqual]-3, v\[VectorLessEqual]0}, v]

Out[2]= 1.5
```

---

``Or`` constraints can be specified:

```wl
In[1]:= NMinValue[{x + y, x ^ 2 + y ^ 2 ≤ 1 || (x + 2) ^ 2 + (y + 2) ^ 2 ≤ 1}, {x, y}]//Quiet

Out[1]= -5.41421
```

#### Domain Constraints (4)

Specify integer domain constraints using ``Integers`` :

```wl
In[1]:= NMinValue[{x + y, x + 2y ≥ 3, x ≥ -2}, {x, y∈Integers}]

Out[1]= 1.
```

---

Specify integer domain constraints on vector variables using ``Vectors[n, Integers]`` :

```wl
In[1]:=
NMinValue[{Total[x], a == {{1, 2}, {1, 0}}, b == {3, -2}, 
	a.x\[VectorGreaterEqual]b}, x∈Vectors[2, Integers]]

Out[1]= 1.
```

---

Specify non-negative integer domain constraints using ``NonNegativeIntegers`` ($\mathbb{Z}_{\geq \, 0}$):

```wl
In[1]:= NMinValue[{x + 2y, x^2 + 2y^2 ≤ 3, x + y == 2, x∈NonNegativeIntegers}, {x, y}]

Out[1]= 3.
```

---

Specify non-positive integer domain constraints using ``NonPositiveIntegers`` ($\mathbb{Z}_{\leq \, 0}$):

```wl
In[1]:= NMinValue[{-x + y, x + 2y ≥ 3, x∈NonPositiveIntegers}, {x, y}]

Out[1]= 1.5
```

#### Region Constraints (5)

Find the minimum value of $z$ over a region:

```wl
In[1]:=
t = RotationTransform[{{0, 0, 1}, {1, 1, 1}}];
ℛ = TransformedRegion[Ellipsoid[{0, 0, 0}, {1, 2, 3}], t];

In[2]:= NMinValue[z, {x, y, z}∈ℛ]

Out[2]= -2.16025
```

---

Find the minimum distance between two regions:

```wl
In[1]:=
Subscript[ℛ, 1] = Disk[];
Subscript[ℛ, 2] = InfiniteLine[{{-2, 0}, {0, 2}}];

In[2]:= NMinValue[EuclideanDistance[{x, y}, {u, v}], {{x, y}∈Subscript[ℛ, 1], {u, v}∈Subscript[ℛ, 2]}]

Out[2]= 0.414214
```

---

Find the minimum of $r$ such that the triangle and ellipse still intersect:

```wl
In[1]:=
Subscript[ℛ, 1] = Triangle[{{0, 0}, {1, 0}, {0, 1}}];
Subscript[ℛ, 2] = Disk[{1, 1}, {2r, r}];

In[2]:= NMinValue[{r, {x, y}∈Subscript[ℛ, 1] && {x, y}∈Subscript[ℛ, 2]}, {r, x, y}]

Out[2]= 0.447213
```

---

Find the minimum radius of a disk that contains the given three points:

```wl
In[1]:= Subscript[ℛ, 3] = Disk[{a, b}, r];

In[2]:= NMinValue[{r, ({0, 0} | {1, 0} | {0, 1})∈Subscript[ℛ, 3]}, {a, b, r}]

Out[2]= 0.707107
```

Using ``Circumsphere`` gives the same result directly:

```wl
In[3]:= Circumsphere[{{0, 0}, {1, 0}, {0, 1}}]//N

Out[3]= Sphere[{0.5, 0.5}, 0.707107]
```

---

Use $x\in \mathcal{R}$ to specify that $x$ is a vector in $\mathbb{R}^3$ with $\| x\| =1$ :

```wl
In[1]:= ℛ = Sphere[];

In[2]:= NMinValue[x.{1, 2, 3}, x∈ℛ]

Out[2]= -3.74166
```

#### Linear Problems (5)

With linear objectives and constraints, when a minimum is found it is global:

```wl
In[1]:= NMinValue[{x + y, 3x + 2y ≥ 7 && x + 2y ≥ 6 && x ≥ 0 && y ≥ 0}, {x, y}]

Out[1]= 3.25
```

---

The constraints can be equality and inequality constraints:

```wl
In[1]:= NMinValue[{x - y, x + y + z == 1 / 2, x - 2z == 1, 2x - y ≥ 1}, {x, y, z}]

Out[1]= 0.428571
```

---

Use ``Equal`` to express several equality constraints at once:

```wl
In[1]:= NMinValue[{x - y, {x + y, x - y} == {1 / 2, 1}}, {x, y}]

Out[1]= 1.
```

An equivalent form using several scalar equalities:

```wl
In[2]:= NMinValue[{x - y, x + y == 1 / 2, x - y == 1}, {x, y}]

Out[2]= 1.
```

---

Use ``VectorLessEqual`` to express several ``LessEqual`` inequality constraints at once:

```wl
In[1]:= NMinValue[{x + y, VectorLessEqual[{{x - 2y, -x + y}, {3, 2}}]}, {x, y}]

Out[1]= -12.
```

Use Esc ``v<=`` Esc to enter the vector inequality in a compact form:

```wl
In[2]:= NMinValue[{x + y, {x - 2y, -x + y}\[VectorLessEqual]{3, 2}}, {x, y}]

Out[2]= -12.
```

An equivalent form using scalar inequalities:

```wl
In[3]:= NMinValue[{x + y, {x - 2y ≤ 3, -x + y ≤ 2}}, {x, y}]

Out[3]= -12.
```

---

Use ``Interval`` to specify bounds on variables:

```wl
In[1]:= NMinValue[{x + y, x + 2y ≥ 3}, {x∈Interval[{-1, 2}], y∈Interval[{-1, 1}]}]

Out[1]= 2.
```

#### Convex Problems (7)

Use ``"NonNegativeCone"`` to specify linear constraints of the form $a.x\succeq b$:

```wl
In[1]:=
NMinValue[{Total[x], 
	VectorGreaterEqual[{{{-1, -1 / 2}, {1, -1 / 2}, {0, 1}}.x, {-1 / 2, -1 / 2, -1 / 2}}, {"NonNegativeCone", 3}]}, x]

Out[1]= -1.25
```

Use Esc ``v>=`` Esc to enter the vector inequality in a compact form:

```wl
In[2]:= NMinValue[{Total[x], {{-1, -(1/2)}, {1, -(1/2)}, {0, 1}}.x\[VectorGreaterEqual]-(1/2)}, x]

Out[2]= -1.25
```

---

Find the minimum value of a convex quadratic function subject to linear constraints:

```wl
In[1]:= NMinValue[{(x - 1)^2 + y^2, x + y / 2 ≤ 1 / 2, x - y ≥ 0}, {x, y}]

Out[1]= 0.2
```

---

Find the minimum value of a convex quadratic function subject to a set of convex quadratic constraints:

```wl
In[1]:= NMinValue[{(x - 1)^2 + y^2, x^2 + (y^2/2) ≤ (1/2), 2x^2 ≤ 3y}, {x, y}]

Out[1]= 0.198003
```

---

Find the minimum distance between two convex regions:

```wl
In[1]:=
Subscript[ℛ, 1] = Triangle[{{1, 1}, {2, 1}, {1, 2}}];
Subscript[ℛ, 2] = Disk[];

In[2]:= NMinValue[EuclideanDistance[x, y], {x∈Subscript[ℛ, 1], y∈Subscript[ℛ, 2]}]

Out[2]= 0.414211
```

---

Find the minimum value of $10 x+11y$ such that $\left(
\begin{array}{cc}
 x & 1 \\
 1 & y \\
\end{array}
\right)$ is positive semidefinite:

```wl
In[1]:= NMinValue[{10x + 11y, 
	VectorGreaterEqual[{{{x, 1}, {1, y}}, 0}, {"SemidefiniteCone", 2}]}, {x, y}]

Out[1]= 20.9762
```

---

Minimize the convex objective function $-\log (x+y)$ such that $\left(
\begin{array}{cc}
 x+y & 1 \\
 1 & x-y \\
\end{array}
\right)$ is positive semidefinite and $1\leq x\leq 10,-1\leq y\leq 1$ :

```wl
In[1]:= NMinValue[{-Log[x + y], 
	VectorGreaterEqual[{{{x + y, 1}, {1, x - y}}, 0}, {"SemidefiniteCone", 2}], 1 ≤ x ≤ 10, -1 ≤ y ≤ 1}, {x, y}]

Out[1]= -2.3979
```

---

Find the minimum value of a convex objective function over a 4-norm unit disk:

```wl
In[1]:= NMinValue[{(x - 2)^2 + (y - 1 / 2)^2, Inactive[Norm][{x, y}, 4] ≤ 1}, {x, y}]

Out[1]= 1.0222
```

#### Transformable to Convex (4)

Find the minimum perimeter of a rectangle with area 1 such that the height is at most half the width:

```wl
In[1]:= NMinValue[{2w + 2h, h w == 1, h ≤ (1/2)w, h ≥ 0, w ≥ 0}, {h, w}]

Out[1]= 4.24266
```

This problem is log-convex and can be solved by making the transformation ``{h -> Exp[\[ScriptH]], w -> Exp[\[ScriptW]]}`` and taking logarithms to get the convex problem:

```wl
In[2]:= NMinValue[{Log[Exp[\[ScriptW] + Log[2]] + Exp[\[ScriptH] + Log[2]]], \[ScriptH] + \[ScriptW] == 0, \[ScriptH] - \[ScriptW] ≤ Log[1 / 2]}, {\[ScriptH], \[ScriptW]}]

Out[2]= 1.44519

In[3]:= Exp[%]

Out[3]= 4.24264
```

---

Find the minimum value of the quasi-convex function $-x y$ subject to inequality and norm constraints. The objective is quasi-convex because it is a product of a non-negative function and a non-positive function over the domain:

```wl
In[1]:= NMinValue[{-x y, x ≥ 0, y ≥ 0, Norm[{x - 1, y - 2}] ≤ 1}, {x, y}]

Out[1]= -4.68174
```

Quasi-convex problems can be solved as parametric convex optimization problems for the parameter $\alpha$ :

```wl
In[2]:=
pfun = ParametricConvexOptimization[0, {-x y ≤ α, x ≥ 0, y ≥ 0, 
	Norm[{x - 1, y - 2}] ≤ 1}, {x, y}, α, "PrimalMinimizerRules"]

Out[2]= ParametricFunction[<>]
```

Plot the objective as a function of the level-set $\alpha$ :

```wl
In[3]:=
fun[α_ ? NumericQ] := -x * y /. pfun[α];
Plot[fun[α], {α, -4.68, 0}, PlotRange -> All]

Out[3]= [image]
```

For a level-set value in the interval $[-4.682,-4.68]$, the smallest objective is found:

```wl
In[4]:= pfun[-4.68]

Out[4]= {x -> 1.81699, y -> 2.57617}
```

The problem becomes infeasible when the level-set value is increased:

```wl
In[5]:= pfun[-4.69]
```

ParametricConvexOptimization::nsolc: There are no points that satisfy the constraints.

```wl
Out[5]= {x -> Indeterminate, y -> Indeterminate}
```

---

Minimize $x^2-y^2$ subject to the constraint $\|\{x,y\}\|\leq 2$. The objective is not convex, but it can be represented as a difference of convex functions $f(x,y)-g(x,y)$, where $f$ and $g$ are convex:

```wl
In[1]:= res = NMinValue[{x ^ 2 - y ^ 2, Norm[{x, y}] <= 2}, {x, y}]

Out[1]= -4.
```

Plot the region and the minimizing point:

```wl
In[2]:= Show[Plot3D[x ^ 2 - y ^ 2, {x, -2, 2}, {y, -2, 2}, ...], Graphics3D[{PointSize[0.04], Point[{0, -2, res}]}]]

Out[2]= [image]
```

---

Minimize $x^2+y$ subject to the constraints $1\leq \|\{x,y\}\|\leq 2$. The constraint $1\leq \|\{x,y\}\|$ is not convex, but it can be represented by a difference-of-convex constraint $f(x,y)-g(x,y)\leq 0$, where $f$ and $g$ are convex functions:

```wl
In[1]:= res = NMinValue[{x ^ 2 + y, 1 <= Norm[{x, y}] <= 2}, {x, y}]

Out[1]= -2.
```

Plot the region and the minimizing point:

```wl
In[2]:= Show[Plot3D[x ^ 2 + y, {x, -2, 2}, {y, -2, 2}, ...], Graphics3D[{PointSize[0.04], Point[{0, -2, res}]}]]

Out[2]= [image]
```

#### General Problems (3)

Find the minimum value of a linear objective subject to nonlinear constraints:

```wl
In[1]:= NMinValue[{x + y, Sin[2x] + Cos[3y] ≤ 1, Norm[{x, y}] ≤ 2}, {x, y}]

Out[1]= -2.82843
```

---

Find the minimum value of a nonlinear objective subject to linear constraints:

```wl
In[1]:= NMinValue[{Sin[2x] + Cos[x], -2 ≤ x ≤ 3}, x]

Out[1]= -1.76017
```

---

Find the minimum value of a nonlinear objective subject to nonlinear constraints:

```wl
In[1]:= NMinValue[{Sin[x^2 + y], Sin[2x] + Cos[3y] ≤ 1, Norm[{x^2, y}] ≤ 2}, {x, y}]

Out[1]= -1.
```

### Options (7)

#### AccuracyGoal & PrecisionGoal (2)

This enforces the convergence criteria $\left\|x_k-x^*\right\| \leq \max \left(10^{-9},10^{-8}\left\|x_k\right\|\right)$ and $\left\|\nabla f\left(x_k\right)\right\| \leq 10^{-9}$ :

```wl
In[1]:= NMinValue[Sin[Tan[x] / 2], x, AccuracyGoal -> 9, PrecisionGoal -> 8]

Out[1]= -1.
```

---

This enforces the convergence criteria $\left\|x_k-x^*\right\| \leq \max \left(10^{-20},10^{-18}\left\|x_k\right\|\right)$ and $\left\|\nabla f\left(x_k\right)\right\| \leq 10^{-20}$, which is not achievable with the default machine-precision computation:

```wl
In[1]:= NMinValue[Sin[Tan[x] / 2], x, AccuracyGoal -> 20, PrecisionGoal -> 18]
```

NMinValue::cvmit: Failed to converge to the requested accuracy or precision within 100 iterations.

```wl
Out[1]= -1.
```

Setting a high ``WorkingPrecision`` makes the process convergent:

```wl
In[2]:= NMinValue[Sin[Tan[x] / 2], x, AccuracyGoal -> 20, PrecisionGoal -> 18, WorkingPrecision -> 40]

Out[2]= -1.000000000000000000000000000000000000000
```

#### EvaluationMonitor (1)

Record all the points evaluated during the solution process of a function with a ring of minima:

```wl
In[1]:=
f[x_, y_] := (x ^ 2 + y ^ 2 - 16) ^ 2;
{sol, pts} = Reap[
	NMinValue[f[x, y], {{x, -5, 5}, {y, -5, 5}},   Method -> "DifferentialEvolution", EvaluationMonitor :> Sow[{x, y}]]];
```

Plot all the visited points that are close in objective function value to the final solution:

```wl
In[2]:=
ContourPlot[f[x, y], {x, -5, 5}, {y, -5, 5}, ColorFunction -> (Hue[.6(1 - #1)]&), 
	Epilog -> Map[Point, Cases[First[pts], x_ /; Abs[f@@x - sol] ≤ .05]], Contours -> 4Range[0, 10] ^ 2]

Out[2]= [image]
```

#### Method (2)

Some methods may give suboptimal results for certain problems:

```wl
In[1]:=
objective[x_] := Sin[.4 x ^ 2] + .7 Cos[.6 x] x;
NMinValue[{objective[x], -9 < x < 9}, {x}, Method -> #]& /@ {"NelderMead", "DifferentialEvolution", "SimulatedAnnealing"}

Out[1]= {-3.87335, -3.87335, -3.87335}
```

The automatically chosen method gives the optimal solution for this problem:

```wl
In[2]:= minVal  = NMinValue[{objective[x], -9 < x < 9}, {x}]

Out[2]= -4.68261
```

Plot the objective function along with the global minimum value:

```wl
In[3]:= Plot[{objective[x], minVal}, {x, -9, 9}]

Out[3]= [image]
```

---

Use method ``"NelderMead"`` for problems with many variables when speed is essential:

```wl
In[1]:=
n = 50;
AbsoluteTiming[NMinValue[Sum[i * (x[i] - i) ^ 2 + Sin[x[i]], {i, 1, n}], Table[x[i], {i, 1, n}], Method -> "NelderMead"]]

Out[1]= {0.0515778, -0.645913}
```

#### StepMonitor (1)

Steps taken by ``NMinValue`` in finding the minimum of the classic Rosenbrock function:

```wl
In[1]:=
pts = Reap[NMinValue[(1 - x) ^ 2 + 100(-x ^ 2 + y) ^ 2, {x, y}, Method -> "NelderMead", StepMonitor :> Sow[{x, y}]]][[2, 1]];
pts = Join[{{-1.2, 1}}, pts];

In[2]:= ContourPlot[(1 - x) ^ 2 + 100(-x ^ 2 + y) ^ 2, {x, -1.3, 1.5}, {y, -1.5, 1.4}, Epilog -> {Arrow[pts], Point[pts]}, Contours -> Table[10 ^ -i, {i, -2, 10}], ColorFunction -> (Hue[(Log[10, #] + 10) / 12]&), ColorFunctionScaling -> False]

Out[2]= [image]
```

#### WorkingPrecision (1)

With the working precision set to $20$, by default ``AccuracyGoal`` and ``PrecisionGoal`` are set to half the working precision, i.e. $10$ :

```wl
In[1]:= NMinValue[Cos[x ^ 2 - 3 y] + Sin[x ^ 2 + y ^ 2], {x, y}, WorkingPrecision -> 20]

Out[1]= -2.0000000000000000000
```

### Applications (6)

#### Geometry Problems (3)

Find the minimum distance between two disks of radius 1 centered at $c_1=\{0,0\}$ and $c_2=\{2,1\}$. Let $p_1$ be a point on disk 1. Let $p_2$ be a point on disk 2. The objective is to minimize $\left\|p_1-p_2\right\|$ subject to constraints $\left\|p_1\right\|\leq 1,\left\|p_2-c_2\right\|\leq 1$ :

```wl
In[1]:= res = NMinValue[{Norm[p1 - p2], c2 == {2, 1}, Norm[p1] ≤ 1, Norm[p2 - c2] ≤ 1}, {p1∈Vectors[2, Reals], p2∈Vectors[2, Reals]}]

Out[1]= 0.236069
```

---

Find the radius $r$ of a minimal enclosing ball that encompasses a given region:

```wl
In[1]:= region = [image];
```

Find the minimum value of the radius $r$ subject to the constraints $\left\|p_i-c\right\|\leq r,i=1,2,\ldots ,n$ :

```wl
In[2]:= constraints = Table[Norm[Inactive[Plus][pi, -c]] ≤ r, {pi, MeshCoordinates[region]}];

In[3]:= NMinValue[{r, constraints}, {r, c∈Vectors[3, Reals]}]

Out[3]= 1.20794
```

The minimal enclosing ball can be found efficiently using ``BoundingRegion`` :

```wl
In[4]:= BoundingRegion[region, "MinBall"]

Out[4]= Ball[{-0.0047416, 4.578623380344049`*^-16, 0.31701}, 1.20794]
```

Visualize the enclosing ball:

```wl
In[5]:= Show[Graphics3D[{Opacity[0.2], Green, %}], region]

Out[5]= [image]
```

---

Find the smallest square that can contain $n$ circles of given radius $r_i$ for $i=1,\text{...},n$ that do not overlap. Specify the number of circles and the radius of each circle:

```wl
In[1]:=
n = 20;
r = BlockRandom[RandomReal[{0.5, 1.5}, n], RandomSeeding -> 123];
```

If $c_i$ is the center of circle $i$, then the objective is to minimize $\max _{i=1,\ldots,n}\left(\|c_i\|_{\infty }+r_i\right)$. The objective can be transformed so as to minimize $s$ with $-s\leq \|c_i\|_{\infty }+r_i\leq s$, $i=1,\ldots,n$ :

```wl
In[2]:= objectiveConstraint = Table[-s <= Norm[c[i], Infinity] + r[[i]] <= s, {i, n}];
```

The circles must not overlap:

```wl
In[3]:=
nonOverlapConstraint = Table[Norm[c[i] - c[j]] >= r[[i]] + r[[j]], 
	{i, 1, n}, {j, i + 1, n}];
```

Collect the variables:

```wl
In[4]:= vars = Append[Table[Element[c[i], Vectors[2, Reals]], {i, n}], Element[s, Reals]];
```

The circles are contained in the square $[-s,s] \times  [-s,s]$. Find the bounds:

```wl
In[5]:= NMinValue[{s, objectiveConstraint, nonOverlapConstraint}, vars, Method -> "DifferenceOfConvex"]

Out[5]= 4.65931
```

#### Data-Fitting Problems (1)

Find the Gaussian width parameter $\epsilon$ that minimizes the residual for an $L_1$ fit to nonlinear discrete data:

```wl
In[1]:= data = Block[...];
```

Fit the data using the basis $\phi _i(x;\epsilon )=e^{-(\epsilon (x-c_i))^2}$ :

```wl
In[2]:= basis[ϵ_] = Exp[-(ϵ(x - #)) ^ 2]& /@ Subdivide[-3, 3, 10];
```

The function will be approximated by $s(x;\epsilon )=\sum _i\lambda _i\phi _i(x;\epsilon )$ :

```wl
In[3]:=
a[ϵ_] = DesignMatrix[data, basis[ϵ], x, IncludeConstantBasis -> False];
output = data[[All, 2]];
```

Construct a parametric function that gives the residual:

```wl
In[4]:= pfun[ϵ_ ? NumericQ] := NMinValue[Norm[Inactive[Plus][a[ϵ].λ, -output], 1], λ]
```

Show the residual as a function of $\epsilon$ :

```wl
In[5]:= ListLinePlot[Table[{ϵ, pfun[ϵ]}, {ϵ, .5, 2, .1}]]

Out[5]= [image]
```

Find the scaling parameter that produces the minimum residual:

```wl
In[6]:= Quiet[FindArgMin[pfun[ϵ], {ϵ, 1}]]

Out[6]= {0.950462}
```

#### Iterated Optimization (1)

Find parameter $\alpha \geq 0$ such that the distance from the ellipse $x^2/\alpha ^2+\alpha ^2y^2\leq 1$ to the point ``(1, 2)`` is as small as possible:

```wl
In[1]:=
pfun[α_ ? NumericQ] := NMinValue[{Norm[{(x - 1), (y - 2)}], 
	 Norm[{x / α, α y}] ≤ 1}, {x, y}]
```

Show the distance as a function of $\alpha$ :

```wl
In[2]:= Plot[pfun[α], {α, 0.1, 2}]

Out[2]= [image]
```

Find the optimal parameter $\alpha$ that minimizes the distance:

```wl
In[3]:= Subscript[α, opt] = FindArgMin[{pfun[α], α > 0}, {α, .5}][[1]]

Out[3]= 0.372614
```

Visualize the point with respect to the disk:

```wl
In[4]:= Graphics[{{LightBlue, Disk[{0, 0}, {Subscript[α, opt], 1 / Subscript[α, opt]}]}, {Red, Point[{1, 2}]}}, Axes -> True, AspectRatio -> Automatic]

Out[4]= [image]
```

#### Trajectory Optimization (1)

Find the minimum length of a path between the start and end points while avoiding circular obstacles:

```wl
In[1]:=
p = {{2, 2}, {7, 6}, {3, 5}, {4, 8}, {8, 9}, {6, 2.5}};
r = {1, 1.3, 1.5, 1, 0.7, 1.5};
{start, end} = {{0, 0}, {10, 10}};
domain = Graphics[{MapThread[Circle[#1, #2]&, {p, r}], 
	{PointSize[0.02], Point[{start, end}]}}, Frame -> True]

Out[1]= [image]
```

The path is discretized into $n$ points. The distance between consecutive points must be at most $l/n$, where $l$ is the total length to be minimized:

```wl
In[2]:=
n = 30;
distanceConstraints = Table[Inactive[Norm][x[i] - x[i - 1]] <= l / n, {i, n}];
```

The points cannot be inside the circular obstacles:

```wl
In[3]:= objectConstraints = Table[Inactive[Norm][-p[[j]] + x[i]] >= r[[j]], {i, 1, n}, {j, 1, 6}];
```

The start and end points are known:

```wl
In[4]:= positionConstraints = {x[0] == start, x[n] == end};
```

Collect the variables:

```wl
In[5]:= vars = Append[Table[x[i]∈Vectors[2, Reals], {i, 0, n}], l∈Reals];
```

Minimize the length $l$ subject to the constraints:

```wl
In[6]:=
NMinValue[{l, distanceConstraints, 
	objectConstraints, positionConstraints}, vars, Method -> "DifferenceOfConvex"]

Out[6]= 14.5693
```
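To see the minimizing path itself, ``NMinimize`` can be used in place of ``NMinValue``, since it also returns rules for the discretized points. A sketch, reusing ``n``, the constraint lists, ``vars`` and the obstacle plot ``domain`` defined above:

```wl
(* Sketch: solve for the minimizing point positions as well as the length l *)
{len, sol} = NMinimize[{l, distanceConstraints, objectConstraints,
     positionConstraints}, vars, Method -> "DifferenceOfConvex"];
path = Table[x[i], {i, 0, n}] /. sol;
Show[domain, Graphics[{Thick, Blue, Line[path]}]]
```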

### Properties & Relations (8)

``NMinimize`` gives the minimum value and rules for the minimizing values of the variables:

```wl
In[1]:= NMinimize[{x.x  + y, Norm[x + y] ≤ 1}, {Element[x, Vectors[2, Reals]], y}]

Out[1]= {-0.832107, {x -> {0.25, 0.25}, y -> -0.957107}}
```

``NArgMin`` gives a list of the minimizing values:

```wl
In[2]:= NArgMin[{x.x  + y, Norm[x + y] ≤ 1}, {Element[x, Vectors[2, Reals]], y}]

Out[2]= {{0.25, 0.25}, -0.957107}
```

``NMinValue`` gives only the minimum value:

```wl
In[3]:= NMinValue[{x.x  + y, Norm[x + y] ≤ 1}, {Element[x, Vectors[2, Reals]], y}]

Out[3]= -0.832107
```

---

Maximizing a function ``f`` is equivalent to minimizing ``-f`` :

```wl
In[1]:= NMinValue[{x ^ 2 + y ^ 2, x + 2 y  ≥ 3}, {x, y}]

Out[1]= 1.8

In[2]:= -NMaxValue[{-(x ^ 2 + y ^ 2), x + 2 y  ≥ 3}, {x, y}]

Out[2]= 1.8
```

---

For convex problems, ``ConvexOptimization`` may be used to obtain additional solution properties:

```wl
In[1]:= NMinValue[{x ^ 2 + y ^ 2, x + 2 y  ≥ 3}, {x, y}]

Out[1]= 1.8

In[2]:= ConvexOptimization[x ^ 2 + y ^ 2, x + 2 y  ≥ 3, {x, y}, "PrimalMinimumValue"]

Out[2]= 1.8
```

Get the dual solution:

```wl
In[3]:= ConvexOptimization[x ^ 2 + y ^ 2, x + 2 y  ≥ 3, {x, y}, "DualMaximizer"]

Out[3]= {{1.2}}
```

---

For convex problems with parameters, using ``ParametricConvexOptimization`` gives a ``ParametricFunction`` :

```wl
In[1]:= pfun = ParametricConvexOptimization[x + 2 y, (x - α) ^ 2 + (y - β) ^ 2 ≤ 1, {x, y}, {α, β}, "PrimalMinimumValue"]

Out[1]= ParametricFunction[<>]
```

The ``ParametricFunction`` may be evaluated for values of the parameters:

```wl
In[2]:= {pfun[0., 0.], pfun[.5, 1.], pfun[2., 1.]}

Out[2]= {-2.23606, 0.263932, 1.76393}
```

Define a function for the parametric problem using ``NMinValue`` :

```wl
In[3]:=
fun[α_ ? NumericQ, β_ ? NumericQ]  := NMinValue[{x + 2 y, (x - α) ^ 2 + (y - β) ^ 2 ≤ 1}, {x, y}];
{fun[0., 0.], fun[.5, 1.], fun[2., 1.]}

Out[3]= {-2.23607, 0.263932, 1.76393}
```

Compare the speeds of the two approaches:

```wl
In[4]:=
testData = RandomReal[{0, 4}, {1000, 2}];
{First[AbsoluteTiming[pfun @@@ testData]], First[AbsoluteTiming[fun @@@ testData]]}

Out[4]= {1.37004, 2.07656}
```

Derivatives of the ``ParametricFunction`` can also be computed:

```wl
In[5]:= D[pfun[α, β], {{α, β}}] /. {α -> 2., β -> 1.}

Out[5]= {1.00001, 2.}
```
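The ``NMinValue``-based function ``fun`` is a black box for symbolic differentiation, but its gradient can be estimated with a central difference. A sketch for comparison with the ``ParametricFunction`` derivative; the step size ``h`` is an arbitrary choice:

```wl
(* Sketch: central-difference estimate of the gradient of fun at {2., 1.} *)
h = 10.^-4;
{(fun[2. + h, 1.] - fun[2. - h, 1.])/(2 h),
 (fun[2., 1. + h] - fun[2., 1. - h])/(2 h)}
```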

---

For convex problems with parametric constraints, ``RobustConvexOptimization`` finds an optimum that works for all possible values of the parameters:

```wl
In[1]:= {rMinValue, rMin} = RobustConvexOptimization[2x + 3y, x + y ≥ α && x - y ≤ β && -1 ≤ α ≤ 1 && 0 ≤ β ≤ 1, {x, y}, {α, β}, {"PrimalMinimumValue", "PrimalMinimizerRules"}]

Out[1]= {2.5, {x -> 0.5, y -> 0.5}}
```

``NMinValue`` may find a smaller minimum value for particular values of the parameters:

```wl
In[2]:= NMinValue[{2x + 3y, x + y ≥ α && x - y ≤ β && α == 0 && β == 0}, {x, y}]

Out[2]= 0.
```

The minimizer that gives this value does not satisfy the constraints for all allowed values of $\alpha$ and $\beta$ :

```wl
In[3]:= minimizer = Last[NMinimize[{2x + 3y, x + y ≥ α && x - y ≤ β && α == 0 && β == 0}, {x, y}]]

Out[3]= {x -> 0., y -> 0.}

In[4]:= (x + y ≥ α && x - y ≤ β  /. minimizer) /. {α -> 1, β -> 1}

Out[4]= False
```

The minimum value found for particular values of the parameters is less than or equal to the robust minimum:

```wl
In[5]:=
fun[α_, β_] /; (-1 ≤ α ≤ 1 && 0 ≤ β ≤ 1)  := 
	NMinValue[{2x + 3y, x + y ≥ α && x - y ≤ β}, {x, y}];

In[6]:=
testValues = RandomVariate[UniformDistribution[{{-1, 1}, {0, 1}}], 100];
AllTrue[testValues, ((fun@@#) ≤ rMinValue)&]

Out[6]= True
```

---

``NMinValue`` can solve linear programming problems:

```wl
In[1]:= NMinValue[{2x + 3y - z, 1 ≤ x + y + z ≤ 2 && 1 ≤ x - y + z ≤ 2 && x - y - z == 3}, {x, y, z}]

Out[1]= 3.
```

``LinearProgramming`` can be used to solve the same problem given in matrix notation:

```wl
In[2]:=
c = {2., 3., -1.};
m = {{1, 1, 1}, {1, 1, 1}, {1, -1, 1}, {1, -1, 1}, {1, -1, -1}};
b = {{1, 1}, {2, -1}, {1, 1}, {2, -1}, {3, 0}};

In[3]:= c.LinearProgramming[c, m, b, -Infinity]

Out[3]= 3.
```
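The same problem can also be posed with ``LinearOptimization``, which accepts the symbolic constraint form directly. A sketch; it should reproduce the value 3. found above:

```wl
(* Sketch: the same linear program solved with LinearOptimization *)
LinearOptimization[2 x + 3 y - z,
  {1 <= x + y + z <= 2, 1 <= x - y + z <= 2, x - y - z == 3},
  {x, y, z}, "PrimalMinimumValue"]
```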

---

Use ``RegionDistance`` to compute the minimum distance from a point to a region:

```wl
In[1]:=
ℛ = Cone[{{-1, 0, 1}, {2, 3, 4}}, 3];
p = {3., 2., 1.};

In[2]:= RegionDistance[ℛ, p]

Out[2]= 1.58346
```

Compute the distance using ``NMinValue`` :

```wl
In[3]:= NMinValue[Norm[{x, y, z} - p], {x, y, z}∈ℛ]

Out[3]= 1.58346
```

---

Use ``RegionBounds`` to compute the bounding box:

```wl
In[1]:=
f = x^4 + 3. x^2 y + 2. x^2 y^2 - y^3 + y^4 ≤ 0;
ℛ = ImplicitRegion@@{f, {x, y}}

Out[1]= ImplicitRegion[x^4 + 3. x^2 y + 2. x^2 y^2 - y^3 + y^4 ≤ 0, {x, y}]

In[2]:= RegionBounds[ℛ]

Out[2]= {{-0.880086, 0.880086}, {-0.5625, 1.}}
```

Use ``NMaxValue`` and ``NMinValue`` to compute the same bounds:

```wl
In[3]:= {{x1, x2}, {y1, y2}} = {NMinValue[#, {x, y}∈ℛ], NMaxValue[#, {x, y}∈ℛ]}& /@ {x, y}

Out[3]= {{-0.880086, 0.880086}, {-0.5625, 1.}}

In[4]:= Show[{Graphics[{LightBlue, Rectangle[{x1, y1}, {x2, y2}]}], RegionPlot[f, {x, x1 - 1, x2 + 1}, {y, y1 - 1, y2 + 1}]}]

Out[4]= [image]
```

## See Also

* [`NArgMin`](https://reference.wolfram.com/language/ref/NArgMin.en.md)
* [`NMinimize`](https://reference.wolfram.com/language/ref/NMinimize.en.md)
* [`NMaxValue`](https://reference.wolfram.com/language/ref/NMaxValue.en.md)
* [`MinValue`](https://reference.wolfram.com/language/ref/MinValue.en.md)
* [`FindMinValue`](https://reference.wolfram.com/language/ref/FindMinValue.en.md)
* [`Min`](https://reference.wolfram.com/language/ref/Min.en.md)
* [`LinearOptimization`](https://reference.wolfram.com/language/ref/LinearOptimization.en.md)
* [`ConvexOptimization`](https://reference.wolfram.com/language/ref/ConvexOptimization.en.md)
* [`GeometricOptimization`](https://reference.wolfram.com/language/ref/GeometricOptimization.en.md)
* [`RegionDistance`](https://reference.wolfram.com/language/ref/RegionDistance.en.md)

## Tech Notes

* [Numerical Mathematics: Basic Operations](https://reference.wolfram.com/language/tutorial/NumericalCalculations.en.md)
* [Numerical Optimization](https://reference.wolfram.com/language/tutorial/NumericalOperationsOnFunctions.en.md#24524)
* [Numerical Nonlinear Global Optimization](https://reference.wolfram.com/language/tutorial/ConstrainedOptimizationGlobalNumerical.en.md)
* [Constrained Optimization](https://reference.wolfram.com/language/tutorial/ConstrainedOptimizationOverview.en.md)
* [Unconstrained Optimization](https://reference.wolfram.com/language/tutorial/UnconstrainedOptimizationOverview.en.md)
* [Implementation notes: Numerical and Related Functions](https://reference.wolfram.com/language/tutorial/SomeNotesOnInternalImplementation.en.md#10453)

## Related Guides

* [`Optimization`](https://reference.wolfram.com/language/guide/Optimization.en.md)
* [Solvers over Regions](https://reference.wolfram.com/language/guide/GeometricSolvers.en.md)
* [Symbolic Vectors, Matrices and Arrays](https://reference.wolfram.com/language/guide/SymbolicArrays.en.md)
* [Convex Optimization](https://reference.wolfram.com/language/guide/ConvexOptimization.en.md)

## History

* [Introduced in 2008 (7.0)](https://reference.wolfram.com/language/guide/SummaryOfNewFeaturesIn70.en.md) \| [Updated in 2014 (10.0)](https://reference.wolfram.com/language/guide/SummaryOfNewFeaturesIn100.en.md) ▪ [2021 (12.3)](https://reference.wolfram.com/language/guide/SummaryOfNewFeaturesIn123.en.md) ▪ [2021 (13.0)](https://reference.wolfram.com/language/guide/SummaryOfNewFeaturesIn130.en.md) ▪ [2022 (13.2)](https://reference.wolfram.com/language/guide/SummaryOfNewFeaturesIn132.en.md) ▪ [2024 (14.1)](https://reference.wolfram.com/language/guide/SummaryOfNewFeaturesIn141.en.md)