
### Nelder-Mead Optimization Example in Python

Optimization is the process of finding the minimum or maximum of a function over a set of available options. A simple example of an optimization problem is finding the shortest path from point A to point B by evaluating several alternative routes.

The Nelder-Mead algorithm is a direct search method for solving optimization problems. In this tutorial, I'll explain how to use the Nelder-Mead method to find the minimum of a given function in Python. The SciPy API provides the minimize() function, which serves as a common interface to several optimization methods, and we can apply the Nelder-Mead method through this function.
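As a quick illustration before the tutorial's own function, a minimal call might look like this (using a simple quadratic objective chosen for this sketch, not the tutorial's function):

```python
from scipy.optimize import minimize

# minimize() starts the Nelder-Mead search from the initial guess x0.
# The objective receives x as a length-1 array, so we index x[0].
result = minimize(lambda x: (x[0] - 3.0)**2, x0=0.0, method="nelder-mead")
print(result.x)  # close to [3.]
```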

The tutorial covers:

1. Nelder-Mead method implementation
2. Source code listing

We'll start by loading the required libraries.

```
from scipy.optimize import minimize
import matplotlib.pyplot as plt
import numpy as np
```

The Nelder-Mead method does not require gradient information, and it is often used for multidimensional nonlinear functions. The method requires a starting point for the search.
First, we'll prepare the objective function as below and visualize it in a graph.

```
def func(x):
    return x**2 + 2*np.sin(x*np.pi)
```
```
x = np.arange(-2, 2, 0.01)
y = func(x)

plt.title("Given function in x range")
plt.plot(x, y)
plt.grid()
plt.show()
```

We'll use the minimize() function to find the optimum with the Nelder-Mead method. The minimize() function is a common interface to various optimization methods. It requires an initial starting point, which we define as x0, and we pass 'nelder-mead' as the method argument.

```
x0 = -1
result = minimize(func, x0, method="nelder-mead")
print(result)
```

```
final_simplex: (array([[-0.45380859],
       [-0.45390625]]), array([-1.77303645, -1.77303644]))
           fun: -1.7730364463746562
       message: 'Optimization terminated successfully.'
          nfev: 30
           nit: 15
        status: 0
       success: True
             x: array([-0.45380859])
```

Here, result contains the following attributes:

- fun - the value of the objective function at the solution
- nfev - the number of function evaluations
- nit - the number of iterations
- success - whether the optimizer exited successfully
- x - the solution of the optimization
- status - the termination status
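These attributes can be read directly from the result object. As a small sketch, the options dict below passes Nelder-Mead-specific tolerances; the xatol/fatol values are illustrative, not required:

```python
from scipy.optimize import minimize
import numpy as np

def func(x):
    # x arrives as a length-1 array from minimize(); return a scalar
    return x[0]**2 + 2*np.sin(x[0]*np.pi)

# xatol/fatol bound the acceptable error in x and in the function value.
result = minimize(func, -1, method="nelder-mead",
                  options={"xatol": 1e-8, "fatol": 1e-8})

print(result.x)        # location of the minimum found
print(result.fun)      # objective value there
print(result.nit)      # iterations used
print(result.success)  # True when the search converged
```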

We can show the determined optimum in a graph as below.

```
plt.title("Describing the function minimum")
plt.plot(x, y, label="y")
plt.plot(result['x'], result['fun'], 'sr', label="minimum")
plt.grid()
plt.show()
```

We can also try multiple starting points and search for the minimum from each.

```
x0 = [-1, 1]
fig, ax = plt.subplots(2, figsize=(6, 8))
for i in range(len(x0)):
    result = minimize(func, x0[i], method="nelder-mead")
    ax[i].plot(x, y, label="y")
    ax[i].plot(result['x'], result['fun'], 'sr', label="minimum")
    ax[i].set_title("Starts from " + str(x0[i]))
    ax[i].grid()

plt.tight_layout()
plt.show()
```

In this tutorial, we've briefly learned how to use the Nelder-Mead method with the minimize() function in Python. The full source code is listed below.

Source code listing

```
from scipy.optimize import minimize
import matplotlib.pyplot as plt
import numpy as np

def func(x):
    return x**2 + 2*np.sin(x*np.pi)

x = np.arange(-2, 2, 0.01)
y = func(x)

plt.title("Given function in x range")
plt.plot(x, y)
plt.grid()
plt.show()

x0 = -1
result = minimize(func, x0, method="nelder-mead")
print(result)

plt.title("Describing the function minimum")
plt.plot(x, y, label="y")
plt.plot(result['x'], result['fun'], 'sr', label="minimum")
plt.grid()
plt.show()

x0 = [-1, 1]
fig, ax = plt.subplots(2, figsize=(6, 8))
for i in range(len(x0)):
    result = minimize(func, x0[i], method="nelder-mead")
    ax[i].plot(x, y, label="y")
    ax[i].plot(result['x'], result['fun'], 'sr', label="minimum")
    ax[i].set_title("Starts from " + str(x0[i]))
    ax[i].grid()

plt.tight_layout()
plt.show()
```