fminunc Solves a multi-variable unconstrained optimization problem

Calling Sequence

xopt = fminunc(f,x0)
xopt = fminunc(f,x0,options)
[xopt,fopt] = fminunc(.....)
[xopt,fopt,exitflag]= fminunc(.....)
[xopt,fopt,exitflag,output]= fminunc(.....)
[xopt,fopt,exitflag,output,gradient]=fminunc(.....)
[xopt,fopt,exitflag,output,gradient,hessian]=fminunc(.....)

Parameters

f :

a function, representing the objective function of the problem

x0 :

a vector of doubles, containing the starting point of the variables.

options:

a list, containing the options for the user to specify. See below for details.

xopt :

a vector of doubles, the computed solution of the optimization problem.

fopt :

a scalar of double, the function value at xopt.

exitflag :

a scalar of integer, containing the flag which denotes the reason for termination of the algorithm. See below for details.

output :

a structure, containing the information about the optimization. See below for details.

gradient :

a vector of doubles, containing the gradient of the objective function at the solution.

hessian :

a matrix of doubles, containing the Hessian of the objective function at the solution.

Description

Search the minimum of an unconstrained optimization problem specified by: find x that minimizes f(x), i.e.

    min  f(x)
     x

The routine calls Ipopt, an optimization library written in C++, to solve the unconstrained optimization problem.

The options argument allows the user to set various parameters of the optimization problem. It should be defined as type "list" and contains the following fields (see the sketch after this list).

  • Syntax : options = list("MaxIter", [---], "CpuTime", [---], "Gradient", ---, "Hessian", ---);
  • MaxIter : a scalar, containing the maximum number of iterations that the solver should take.
  • CpuTime : a scalar, containing the maximum amount of CPU time that the solver should take.
  • Gradient : a function, representing the gradient function of the objective in vector form.
  • Hessian : a function, representing the Hessian function of the objective in symmetric matrix form.
  • Default Values : options = list("MaxIter", [3000], "CpuTime", [600]);
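
The following is a minimal sketch of how such an options list might be built; myGrad and myHess are placeholder names for user-defined gradient and Hessian functions like the ones in the examples below.

//Limit the solver to 500 iterations and 100 units of CPU time
options = list("MaxIter", [500], "CpuTime", [100]);
//Optionally supply analytic derivatives as well
//(myGrad and myHess are assumed to be user-defined Scilab functions)
options = list("MaxIter", [500], "CpuTime", [100], "Gradient", myGrad, "Hessian", myHess);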

The exitflag allows the user to know the status of the optimization, as reported back by Ipopt.

  • exitflag=0 : Optimal Solution Found
  • exitflag=1 : Maximum Number of Iterations Exceeded. Output may not be optimal.
  • exitflag=2 : Maximum CPU Time exceeded. Output may not be optimal.
  • exitflag=3 : Stop at Tiny Step.
  • exitflag=4 : Solved To Acceptable Level.
  • exitflag=5 : Converged to a point of local infeasibility.

For more details on exitflag, see the Ipopt documentation at http://www.coin-or.org/Ipopt/documentation/
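
As an illustration, the returned flag could be inspected as in the following sketch (assuming xopt, fopt and exitflag were obtained from an earlier call to fminunc):

select exitflag
case 0 then
    mprintf("Optimal solution found, f(xopt) = %f\n", fopt);
case 1 then
    mprintf("Maximum number of iterations exceeded; output may not be optimal.\n");
case 2 then
    mprintf("Maximum CPU time exceeded; output may not be optimal.\n");
else
    mprintf("Solver stopped with exitflag = %d\n", exitflag);
end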

The output data structure contains detailed information about the optimization process. It is of type "struct" and contains the following fields.

  • output.Iterations: The number of iterations performed during the search
  • output.Cpu_Time: The total CPU time spent during the search
  • output.Objective_Evaluation: The number of objective evaluations performed during the search
  • output.Dual_Infeasibility: The dual infeasibility of the final solution

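These fields can be read like any Scilab struct; a small sketch, assuming output was returned by a previous call to fminunc:

//Print a short summary of the optimization run
mprintf("Iterations            : %d\n", output.Iterations);
mprintf("CPU time              : %f\n", output.Cpu_Time);
mprintf("Objective evaluations : %d\n", output.Objective_Evaluation);
mprintf("Dual infeasibility    : %e\n", output.Dual_Infeasibility);
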
Examples

//Find x in R^2 such that it minimizes the Rosenbrock function
//f = 100*(x2 - x1^2)^2 + (1-x1)^2
//Objective function to be minimised
function y=f(x)
    y = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
endfunction
//Starting point
x0=[-1,2];
//Gradient of objective function
function y=fGrad(x)
    y = [-400*x(1)*x(2) + 400*x(1)^3 + 2*x(1) - 2, 200*(x(2) - x(1)^2)];
endfunction
//Hessian of Objective Function
function y=fHess(x)
    y = [1200*x(1)^2 - 400*x(2) + 2, -400*x(1); -400*x(1), 200];
endfunction
//Options
options=list("MaxIter", [1500], "CpuTime", [500], "Gradient", fGrad, "Hessian", fHess);
//Calling Ipopt
[xopt,fopt,exitflag,output,gradient,hessian]=fminunc(f,x0,options)

Examples

//Find x in R^2 such that the below function is minimum
//f = x1^2 + x2^2
//Objective function to be minimised
function y=f(x)
    y = x(1)^2 + x(2)^2;
endfunction
//Starting point
x0=[2,1];
//Calling Ipopt
[xopt,fopt]=fminunc(f,x0)

Examples

//The below problem is an unbounded problem:
//Find x in R^2 such that the below function is minimum
//f = - x1^2 - x2^2
//Objective function to be minimised
function y=f(x)
    y = -x(1)^2 - x(2)^2;
endfunction
//Starting point
x0=[2,1];
//Gradient of objective function
function y=fGrad(x)
    y = [-2*x(1), -2*x(2)];
endfunction
//Hessian of Objective Function
function y=fHess(x)
    y = [-2, 0; 0, -2];
endfunction
//Options
options=list("MaxIter", [1500], "CpuTime", [500], "Gradient", fGrad, "Hessian", fHess);
//Calling Ipopt
[xopt,fopt,exitflag,output,gradient,hessian]=fminunc(f,x0,options)

Authors

  • R. Vidyadhar, Vignesh Kannan