I am trying to create a function, and a function handle to it, where the function takes in the output parameters from its previous call together with new input parameters and computes a gradient from them, as in the toy example below. I would like to pass the function handle to hmcSampler, but I am having trouble creating the handle and would like some help.
To clarify: I want to call logPosterior with a new value of theta, but also with the theta and logpdf output from the previous call. I need to do this through a function handle that will be called many times by a function I don't control, so either logPosterior or its handle has to manage storing the old values. On the first call, theta and old_theta should differ so that the function can get going.
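For reference, hmcSampler always calls the logpdf handle with a single input and expects two outputs, so everything else has to be captured by the handle or remembered inside the function itself. A bare-bones illustration of that calling shape with a standard-normal toy target (the names here are mine, not part of my model):
% hmcSampler expects a one-argument handle returning [logp, gradlogp]
toy_logpdf = @(theta) deal(-0.5*sum(theta.^2), -theta); % log density of a standard normal and its gradient
toy_smp = hmcSampler(toy_logpdf, randn(2,1));           % data, priors, and any state must be captured or stored elsewhere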
%% Toy example using the hmcSampler class in MATLAB
NumPredictors = 2;
trueIntercept = 2;
trueBeta = [3;0];
NumData = 100;
rng('default') %For reproducibility
X = rand(NumData,NumPredictors);
mu = X*trueBeta + trueIntercept;
y = mu;
% define the mean and standard deviation of the normal prior on each parameter (intercept + 2 coefficients)
means = [0; 0; 0];
standevs = [1; 1; 1];
% create multivariate normal log probability distribution
[logpdf, grad_logpdf] = @(theta)logPosterior(theta, old_theta, X, y, means, standevs, old_logpdf); % <- How to write this?
% create the startpoint from which sampling starts
startpoint = randn(3, 1); % intercept + 2 coefficients
% create an HMC sampler object
smp = hmcSampler(logpdf, startpoint);
% estimate maximum of log probability density
[xhat, fitinfo] = estimateMAP(smp);
num_chains = 4;
chains = cell(num_chains, 1);
burnin = 50000;
num_samples = 2000000;
function [logpdf, grad_logpdf] = logPosterior(theta, old_theta, X, y, means, standevs, old_logpdf)
    % values
    intercept = theta(1);
    beta = theta(2:end);
    y_computed = X*beta + intercept;
    log_likelihood = log(y_computed);
    del_loglikelihood = log_likelihood - old_logpdf;
    del_params = theta - old_theta;
    grad_params1 = del_loglikelihood/del_params;
    % compute log priors and gradients of parameters
    log_prior_params = 0;
    grad_params2 = [];
    for i = 1:3
        [lp, grad] = normalDistGrad(theta(i), means(i), standevs(i));
        log_prior_params = log_prior_params + lp;
        grad_params2 = [grad_params2; grad];
    end
    % return the log posterior and its gradient
    logpdf = log_likelihood + log_prior_params;
    grad_logpdf = grad_params1 + grad_params2;
end
function [lpdf, glpdf] = normalDistGrad(X, Mu, Sigma)
    % log density of a univariate normal and its gradient with respect to X
    Z = (X - Mu)./Sigma;
    lpdf = sum(-log(Sigma) - .5*log(2*pi) - .5*(Z.^2));
    glpdf = -Z./Sigma;
end
I would implement logPosterior as follows for it to keep track of the values from the last function call. A persistent variable is local to the function, but persists between calls, which makes it an ideal tool for giving the function memory.
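Here is a minimal sketch of what that could look like. It reuses the toy computation from your question; summing the log terms so logpdf is a scalar, switching / to ./ for the elementwise difference quotient, and seeding the persistent variables on the first call are my own guesses at your intent:
function [logpdf, grad_logpdf] = logPosterior(theta, X, y, means, standevs)
    % previous call's values live here between calls
    persistent old_theta old_logpdf
    if isempty(old_theta)
        % first call: seed the "previous" values so the function can get going
        old_theta = theta + 0.1;   % arbitrary offset so del_params is nonzero
        old_logpdf = 0;
    end
    % same toy model as in the question (note log(y_computed) goes complex if any entry is negative)
    intercept = theta(1);
    beta = theta(2:end);
    y_computed = X*beta + intercept;
    log_likelihood = sum(log(y_computed));   % summed so logpdf is a scalar
    % crude finite-difference gradient built from the previous call
    del_loglikelihood = log_likelihood - old_logpdf;
    del_params = theta - old_theta;
    grad_params1 = del_loglikelihood ./ del_params;
    % log priors and their gradients
    log_prior_params = 0;
    grad_params2 = zeros(numel(theta), 1);
    for i = 1:numel(theta)
        [lp, grad] = normalDistGrad(theta(i), means(i), standevs(i));
        log_prior_params = log_prior_params + lp;
        grad_params2(i) = grad;
    end
    logpdf = log_likelihood + log_prior_params;
    grad_logpdf = grad_params1 + grad_params2;
    % remember this call's values for the next call
    old_theta = theta;
    old_logpdf = log_likelihood;
end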
You would now take a function handle as follows; func is the handle you'd pass into whatever function will call it (here, hmcSampler). You could initialize the previous variables by running your function once beforehand (I'm not sure what a suitable initialization is!):
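Something along these lines; the priming call at startpoint is just one possible choice:
% one-argument handle in the form hmcSampler expects
func = @(theta) logPosterior(theta, X, y, means, standevs);
% throwaway call to seed the persistent old_theta / old_logpdf
func(startpoint);
% hand the handle to the sampler as usual
smp = hmcSampler(func, startpoint);
[xhat, fitinfo] = estimateMAP(smp);
If you want to start over, clear logPosterior resets the persistent variables.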