Univariate Function Optimization Example in Python

    Optimization is the process of finding the minimum or maximum value of a function from a set of available options. A simple example of an optimization problem is finding the shortest path from point A to point B by evaluating multiple alternative routes. 

    Univariate function optimization means finding the single scalar input that produces the optimal (here, minimum) output of the target function. In other words, it involves optimizing with respect to a single input variable. This is a simpler case than multivariate optimization, where the function has multiple input variables. 

    The SciPy API provides the minimize_scalar() function to perform univariate optimization with a given method. In this tutorial, you'll learn how to perform univariate function optimization by using the minimize_scalar() function with the Brent method in Python. The tutorial covers:

  1. Univariate function optimization
  2. Source code listing

 We'll start by loading the required libraries.


from scipy.optimize import minimize_scalar 
import matplotlib.pyplot as plt
import numpy as np 
 

Univariate Function Optimization
 
    A univariate function contains a single variable, so finding its optimal value means finding a single optimal input.
    First, we'll define the objective function as below and visualize it in a graph. 
 
 
def func(x):
    # Objective function: f(x) = x^2 + 2*sin(pi*x)
    return x**2 + 2*np.sin(x*np.pi)
 
 
x = np.arange(-2, 2, 0.01)
y = func(x)

plt.title("Given function in x range")
plt.plot(x, y)
plt.grid()
plt.show() 
 
 

 
The plot shows the shape of the function over the given range.
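    As a quick sanity check (not part of the original code), we can evaluate the function at a few points by hand; the values suggest a minimum somewhere around x = -0.5, which agrees with the plot.


# Quick sanity check: evaluate func at a few points.
# f(-0.5) = 0.25 + 2*sin(-pi/2) = 0.25 - 2 = -1.75, the smallest of these values.
for xi in (-1.0, -0.5, 0.0, 0.5, 1.0):
    print(f"func({xi:+.1f}) = {func(xi):+.4f}")
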
    We'll use the minimize_scalar() function to find the optimum. minimize_scalar() minimizes a univariate function with a given method such as 'brent', 'bounded', or 'golden'.
 
 
result = minimize_scalar(func, method="brent")
print(result)
 
fun: -1.773036468129433
nfev: 14
nit: 10
success: True
x: -0.4538535458172619
 
 

    In this example, func is a simple univariate function, and minimize_scalar() finds its minimum with the Brent method.

    Here, result contains the following attributes:

    fun - the value of the objective function at the minimum
    nfev - the number of function evaluations
    nit - the number of iterations
    success - whether the optimizer exited successfully
    x - the solution of the optimization (the minimizing input)
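
    Since result is a SciPy OptimizeResult object, its fields can be read either with dictionary-style indexing, as in the plotting code below, or as plain attributes. A minimal illustration:


# Both access styles return the same values from the OptimizeResult object.
print(result.x, result.fun)        # attribute access
print(result['x'], result['fun'])  # dictionary-style access
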

    Now we can visualize the detected minimum of the function in a graph.

 
plt.title("Describing the function minimum")
plt.plot(x, y, label="y")
plt.plot(result['x'], result['fun'], 'sr', label="minimum")
plt.legend(loc='best', fancybox=True, shadow=True)
plt.grid()
plt.show()
 

    
    This is a basic illustration, and the minimize_scalar() function provides various methods for univariate optimization, including the Brent method, golden section search, and bounded methods. The choice of method depends on the characteristics of the function and the specific requirements of the optimization problem.
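    For instance, a minimal sketch of the other two methods is shown below; note that the bounded method requires a search interval via the bounds argument, and the interval (-1, 0) here is only an illustrative choice based on the plot above.


# 'bounded' searches only within the supplied interval; (-1, 0) is an illustrative choice.
bounded_result = minimize_scalar(func, bounds=(-1, 0), method="bounded")
print(bounded_result.x, bounded_result.fun)

# 'golden' (golden section search) is called the same way as 'brent'.
golden_result = minimize_scalar(func, method="golden")
print(golden_result.x, golden_result.fun)
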
    In this tutorial, we've briefly learned how to perform univariate function optimization with the minimize_scalar() function in Python. The full source code is listed below. 
 
 
Source code listing

 
from scipy.optimize import minimize_scalar 
import matplotlib.pyplot as plt
import numpy as np


def func(x):
    # Objective function: f(x) = x^2 + 2*sin(pi*x)
    return x**2 + 2*np.sin(x*np.pi)


x = np.arange(-2, 2, 0.01)
y = func(x)

plt.title("Given function in x range")
plt.plot(x, y)
plt.grid()
plt.show()


result = minimize_scalar(func, method="brent")
print(result)

plt.title("Describing the function minimum")
plt.plot(x, y, label="y")
plt.plot(result['x'], result['fun'], 'sr', label="minimum")
plt.legend(loc='best', fancybox=True, shadow=True)
plt.grid()
plt.show() 
  
 
 
