Note

This notebook can be downloaded here: Optimization_Tutorial.ipynb

Optimization tutorials (practical session)

The Rosenbrock function

The Rosenbrock function is a classical benchmark for optimization algorithms. It is defined by the following equation:

\[f(x, y) = (1-x)^2 + 100 (y-x^2)^2\]
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt

def Rosen(X):
    """
    Rosenbrock function
    """
    x, y = X
    return (1-x)**2 + 100. * (y-x**2)**2


x = np.linspace(-2., 2., 100)
y = np.linspace(-1., 3., 100)
X, Y = np.meshgrid(x, y)
Z = Rosen((X, Y))

fig = plt.figure(0)
plt.clf()
plt.contourf(X, Y, Z, 20)
plt.colorbar()
plt.contour(X, Y, Z, 20, colors="black")
plt.grid()
plt.xlabel("x")
plt.ylabel("y")
plt.show()
[Figure: filled contour plot of the Rosenbrock function over the grid above]
# Locate the grid point where Z reaches its minimum.
index = np.where(Z == Z.min())
print(index)
print(X[index])
print(Y[index])
(array([49]), array([74]))
[ 0.98989899]
[ 0.97979798]
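The grid minimum (0.9899, 0.9798) is close to, but not exactly at, the analytical minimum of the Rosenbrock function, (1, 1): the accuracy of such a grid search is limited by the grid step (here 4/99, about 0.04).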

Questions

  1. Find the minimum of the function using brute force. Comment on the accuracy and on the number of function evaluations (a sketch is given after this list).
  2. Same question with the simplex (Nelder-Mead) algorithm (see the example below).
  3. Try to locate the two minima of the function \((x, y) \mapsto x^4+3x^3y+6x^2y^2+3xy^3+y^4-4(x-y)(x+3y)\) (a sketch follows the Nelder-Mead example).
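
A possible answer to question 1 is sketched below using scipy.optimize.brute, which evaluates the function on a regular grid and returns the best grid point; the search box and the number of points Ns are illustrative choices matching the plot above.

from scipy import optimize

# Sketch of a brute-force search over the plotting box (assumed bounds).
rranges = ((-2., 2.), (-1., 3.))
xmin = optimize.brute(Rosen, rranges, Ns=100, finish=None)  # finish=None: no local polish
print(xmin)
# Ns=100 points per axis costs 100 * 100 = 10000 function evaluations,
# and the accuracy is limited by the grid step (about 0.04 here).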
from scipy import optimize
X0 = [-1, -1]  # initial guess
sol = optimize.minimize(Rosen, X0, method="nelder-mead")
sol
final_simplex: (array([[ 0.99999886,  0.99999542],
      [ 0.99996261,  0.99992484],
      [ 0.9999812 ,  0.99996926]]), array([  5.30934392e-10,   1.41212609e-09,   5.05898700e-09]))
          fun: 5.3093439186371615e-10
      message: 'Optimization terminated successfully.'
         nfev: 125
          nit: 67
       status: 0
      success: True
            x: array([ 0.99999886,  0.99999542])
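
For question 3, one possible approach is to run Nelder-Mead from several starting points and compare the converged points. The sketch below uses initial guesses on either side of the origin, an illustrative choice motivated by the function being invariant under \((x, y) \mapsto (-x, -y)\).

from scipy import optimize

def g(X):
    """Function of question 3, expected to have two symmetric minima."""
    x, y = X
    return (x**4 + 3.*x**3*y + 6.*x**2*y**2 + 3.*x*y**3 + y**4
            - 4.*(x - y)*(x + 3.*y))

for X0 in [(-2., 1.), (2., -1.)]:  # illustrative starting guesses
    sol = optimize.minimize(g, X0, method="nelder-mead")
    print(sol.x, sol.fun)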

Curve fitting

  1. Choose a mathematical function \(y = f(x, a, b)\) and code it.
  2. Choose target values of \(a\) and \(b\) that you will try to recover using optimization.
  3. Evaluate the function on a grid of \(x\) values.
  4. Add some noise to the result.
  5. Recover \(a\) and \(b\) using curve_fit (see the sketch after this list).
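
One possible realization of these steps is sketched below; the model function, the target values of \(a\) and \(b\), and the noise level are illustrative choices, not imposed by the exercise.

import numpy as np
from scipy.optimize import curve_fit

def f(x, a, b):
    """Model function: a and b are the parameters to recover (step 1)."""
    return a * x + b * np.sin(x)

a_true, b_true = 2., 5.  # target values (step 2)
x = np.linspace(0., 10., 50)  # grid of x values (step 3)
y_noisy = f(x, a_true, b_true) + np.random.normal(0., 1., x.shape)  # noisy data (step 4)

popt, pcov = curve_fit(f, x, y_noisy)  # step 5
print(popt)  # should be close to (a_true, b_true)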