Minimization with Gradient and Hessian Sparsity Pattern
This example shows how to solve a nonlinear minimization problem whose tridiagonal Hessian matrix is approximated by sparse finite differences instead of computed explicitly.
The problem is to find x that minimizes

$$f(x) = \sum_{i=1}^{n-1} \left(x_i^2\right)^{\left(x_{i+1}^2+1\right)} + \left(x_{i+1}^2\right)^{\left(x_i^2+1\right)},$$

where n = 1000.
n = 1000;
To use the trust-region method in fminunc, you must compute the gradient in the objective function; unlike in the quasi-newton method, the gradient is not optional.
The helper function brownfg at the end of this example computes the objective function and gradient.
To allow efficient computation of the sparse finite-difference approximation of the Hessian matrix H(x), the sparsity structure of H must be predetermined. In this case, the structure Hstr, a sparse matrix, is available in the file brownhstr.mat. Using the spy command, you can see that Hstr is, indeed, sparse (only 2998 nonzeros).
load brownhstr
spy(Hstr)
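The nonzero count is consistent with a tridiagonal pattern: 3n − 2 = 2998 nonzeros for n = 1000. For readers following along outside MATLAB, a SciPy sketch that builds the same tridiagonal sparsity pattern (this constructs the structure directly rather than loading brownhstr.mat):

```python
import numpy as np
from scipy.sparse import diags

# Tridiagonal sparsity pattern: main diagonal plus the two adjacent diagonals
n = 1000
Hstr = diags(
    [np.ones(n - 1), np.ones(n), np.ones(n - 1)],  # sub-, main, super-diagonal
    offsets=[-1, 0, 1],
    format="csr",
)
print(Hstr.nnz)  # → 2998, matching the spy(Hstr) count in the example
```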
Set the HessPattern option to Hstr using optimoptions. When such a large problem has obvious sparsity structure, not setting the HessPattern option uses a great amount of memory and computation unnecessarily, because fminunc attempts to use finite differencing on a full Hessian matrix of one million nonzero entries.
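The savings come from column grouping: in a tridiagonal Hessian, columns j, j+3, j+6, ... never share a nonzero row, so all n columns can be recovered from just three perturbed gradient evaluations instead of n. A NumPy sketch of this idea on a toy quadratic with a known tridiagonal Hessian (this illustrates the principle only; the function and its gradient here are illustrative, not fminunc internals, which use a more general coloring):

```python
import numpy as np

def grad(x):
    # Gradient of f(x) = 0.5 * sum((x[i+1] - x[i])^2); its Hessian is tridiagonal
    g = np.zeros_like(x)
    d = np.diff(x)
    g[:-1] -= d
    g[1:] += d
    return g

n = 8
x0 = np.random.default_rng(0).standard_normal(n)
g0 = grad(x0)
h = 1e-6
H = np.zeros((n, n))
for color in range(3):                  # three groups suffice for a tridiagonal pattern
    cols = np.arange(color, n, 3)       # columns color, color+3, color+6, ...
    e = np.zeros(n)
    e[cols] = 1.0
    diff = (grad(x0 + h * e) - g0) / h  # sum of the Hessian columns in this group
    for j in cols:                      # each row meets at most one column per group,
        lo, hi = max(j - 1, 0), min(j + 1, n - 1)
        H[lo:hi + 1, j] = diff[lo:hi + 1]  # so the columns separate cleanly

# Compare with the analytic tridiagonal Hessian of the toy quadratic
Hexact = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
Hexact[0, 0] = Hexact[-1, -1] = 1.0
print(np.abs(H - Hexact).max())  # tiny: three gradient calls recovered all of H
```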
To use the Hessian sparsity pattern, you must use the trust-region algorithm of fminunc. This algorithm also requires you to set the SpecifyObjectiveGradient option to true using optimoptions.
options = optimoptions(@fminunc,'Algorithm','trust-region',...
    'SpecifyObjectiveGradient',true,'HessPattern',Hstr);
Set the objective function to @brownfg. Set the initial point to -1 for odd components and +1 for even components.
xstart = -ones(n,1);
xstart(2:2:n,1) = 1;
fun = @brownfg;
Solve the problem by calling fminunc using the initial point xstart and options options.
[x,fval,exitflag,output] = fminunc(fun,xstart,options);
Local minimum found. Optimization completed because the size of the gradient is less than the value of the optimality tolerance.
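For comparison, a SciPy sketch of the same minimization on a smaller instance (SciPy's minimize has no HessPattern analogue, so this sketch uses L-BFGS-B with the analytic gradient; iteration counts will not match fminunc):

```python
import numpy as np
from scipy.optimize import minimize

def brownfg(x):
    # Objective and gradient of the Brown function (NumPy port of the MATLAB helper)
    a, b = x[:-1], x[1:]
    f = np.sum((a**2) ** (b**2 + 1) + (b**2) ** (a**2 + 1))
    g = np.zeros_like(x)
    g[:-1] += (2 * (b**2 + 1) * a * (a**2) ** (b**2)
               + 2 * a * (b**2) ** (a**2 + 1) * np.log(b**2))
    g[1:] += (2 * b * (a**2) ** (b**2 + 1) * np.log(a**2)
              + 2 * (a**2 + 1) * b * (b**2) ** (a**2))
    return f, g

n = 100                   # smaller than the n = 1000 in the example
xstart = -np.ones(n)
xstart[1::2] = 1.0        # -1 for odd components, +1 for even (1-based indexing)
res = minimize(brownfg, xstart, jac=True, method="L-BFGS-B")
print(res.fun)            # the achieved objective value
```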
Examine the solution and solution process.
disp(fval)
7.4738e-17
disp(exitflag)
1
disp(output)
         iterations: 7
          funcCount: 8
           stepsize: 0.0046
       cgiterations: 7
      firstorderopt: 7.9822e-10
          algorithm: 'trust-region'
            message: 'Local minimum found....'
    constrviolation: []
The objective function is a sum of powers of squares and, therefore, is nonnegative. The solution fval is nearly zero, so it is clearly a minimum. The exit flag 1 also indicates that fminunc finds a solution. The output structure shows that fminunc takes only seven iterations to reach the solution.
Display the largest and smallest elements of the solution.
disp(max(x))
1.9955e-10
disp(min(x))
-1.9955e-10
The solution is near the point where all elements of x are 0.
Helper Function
This code creates the brownfg helper function.
function [f,g] = brownfg(x)
% BROWNFG Nonlinear minimization test problem
%
% Evaluate the function
n = length(x);
y = zeros(n,1);
i = 1:(n-1);
y(i) = (x(i).^2).^(x(i+1).^2+1) + ...
       (x(i+1).^2).^(x(i).^2+1);
f = sum(y);
% Evaluate the gradient if nargout > 1
if nargout > 1
    i = 1:(n-1);
    g = zeros(n,1);
    g(i) = 2*(x(i+1).^2+1).*x(i).* ...
           ((x(i).^2).^(x(i+1).^2)) + ...
           2*x(i).*((x(i+1).^2).^(x(i).^2+1)).* ...
           log(x(i+1).^2);
    g(i+1) = g(i+1) + ...
             2*x(i+1).*((x(i).^2).^(x(i+1).^2+1)).* ...
             log(x(i).^2) + ...
             2*(x(i).^2+1).*x(i+1).* ...
             ((x(i+1).^2).^(x(i).^2));
end
end
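A quick way to validate the analytic gradient in brownfg is to compare it against central differences at a test point. A NumPy sketch (the port of brownfg and the point x0 are for illustration; x0 is chosen away from zero, where log(x^2) is finite):

```python
import numpy as np

def brownfg(x):
    # Objective and gradient of the Brown function (NumPy port of the MATLAB helper)
    a, b = x[:-1], x[1:]
    f = np.sum((a**2) ** (b**2 + 1) + (b**2) ** (a**2 + 1))
    g = np.zeros_like(x)
    g[:-1] += (2 * (b**2 + 1) * a * (a**2) ** (b**2)
               + 2 * a * (b**2) ** (a**2 + 1) * np.log(b**2))
    g[1:] += (2 * b * (a**2) ** (b**2 + 1) * np.log(a**2)
              + 2 * (a**2 + 1) * b * (b**2) ** (a**2))
    return f, g

# Central-difference check of the analytic gradient at an arbitrary nonzero point
x0 = np.array([0.5, -0.8, 1.2, 0.3])
_, g0 = brownfg(x0)
h = 1e-6
num = np.zeros_like(x0)
for i in range(len(x0)):
    e = np.zeros_like(x0)
    e[i] = h
    num[i] = (brownfg(x0 + e)[0] - brownfg(x0 - e)[0]) / (2 * h)
print(np.abs(num - g0).max())  # small: analytic and numerical gradients agree
```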