I am trying to use the fminunc function for convex optimization. However, in my case I am taking the gradient with respect to log(x). Let my objective function be F. Then the gradient will be
dF/dx = (dF/dlogx) * (1/x)
=> dF/dlogx = (dF/dx) * x
So
logx_new = logx_old + learning_rate * x * (dF/dx)
x_new = exp(logx_new)
How can I implement this in fminunc?
It's possible and described in the documentation:
If the gradient of fun can also be computed and the GradObj option is 'on', as set by options = optimset('GradObj','on') then the function fun must return, in the second output argument, the gradient value g, a vector, at x.
So for example, if

f = @(x) x.^2;

then df/dx = 2*x, and you can use a function that returns both the objective value and this gradient. You can then pass that function to fminunc:
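A minimal sketch of what this could look like (the function name myfun and the starting point x0 = 5 are placeholders):

% myfun.m -- returns the objective value and, as second output, the gradient
function [f, g] = myfun(x)
    f = x.^2;    % objective value
    g = 2*x;     % gradient df/dx
end

options = optimset('GradObj', 'on');        % tell fminunc that myfun also returns the gradient
x0 = 5;                                     % arbitrary starting point
[x, fval] = fminunc(@myfun, x0, options);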
For your logx gradient, this becomes:
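One way to read this is to let fminunc optimize over z = log(x), so the gradient it needs is dF/dz = (dF/dx) * x, and after optimizing you take exp(z) to get back x. A sketch under that assumption, using F(x) = (x - 3)^2 as a stand-in convex objective with a positive minimizer and logxfun as a placeholder name:

% logxfun.m -- objective and gradient expressed in z = log(x)
function [f, g] = logxfun(z)
    x = exp(z);           % recover x from the log-space variable
    f = (x - 3).^2;       % F(x), a placeholder convex objective
    dFdx = 2*(x - 3);     % dF/dx
    g = dFdx .* x;        % dF/dlogx = (dF/dx) * x
end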
The fminunc call itself stays the same.
If you want, you can also use anonymous functions:
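Since an anonymous function can only contain a single expression, a common trick is to return both the value and the gradient through deal. A sketch for the same placeholder objective in z = log(x):

fun = @(z) deal((exp(z) - 3).^2, 2*(exp(z) - 3).*exp(z));   % value and dF/dlogx in one line

options = optimset('GradObj', 'on');
[z, fval] = fminunc(fun, 0, options);   % optimize over z = log(x)
x = exp(z);                             % map the result back to x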
An additional example, for f = (log(x))^2:
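In terms of z = log(x) this objective is simply z^2 with derivative 2*z, so a sketch might be (the starting point is arbitrary):

fun = @(z) deal(z.^2, 2*z);             % f = (log(x))^2 and df/dlogx, with z = log(x)

options = optimset('GradObj', 'on');
z = fminunc(fun, 3, options);           % converges towards z = 0
x = exp(z);                             % i.e. x = 1, the minimizer of (log(x))^2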
For multiple variables, e.g. f(x,y), you'll have to put your variables into a vector, for example:
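A sketch of such a function, using f(x,y) = x^2 + y^2 (the name paraboloid and the exact objective are assumptions):

% paraboloid.m -- x(1) and x(2) play the role of x and y
function [f, g] = paraboloid(x)
    f = x(1)^2 + x(2)^2;   % objective value
    g = 2*x;               % gradient [2*x(1), 2*x(2)]
end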
This function corresponds to a paraboloid. Of course, you'll also have to use a vector for the initial starting parameters, in this case e.g. x0 = [-5 3].
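Putting it together, the call could look like:

options = optimset('GradObj', 'on');
x0 = [-5 3];                                          % vector of starting values
[xmin, fval] = fminunc(@paraboloid, x0, options);     % xmin approaches [0 0]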