Why isn't my MATLAB code for a randomly generated covariance matrix producing a positive definite matrix?


Here is my code. I'm getting an error from chol(V) saying that V is not positive definite. I would have thought that, by construction, it must be positive definite. Any idea what's going wrong?

% I want 10000 draws of a 5x1 multivariate normal distribution
N =5;
T = 10000;

% randomly generate standard deviations
sigma = 1 + .1*rand(N,1);

% randomly generate correlations which are between [-1,1]
rho = -1+2*rand(nchoosek(N,2),1);

% This grabs the indices of the elements in the lower triangle below the main diagonal
% itril comes from https://www.mathworks.com/matlabcentral/fileexchange/23391-triangular-and-diagonal-indexing
I = itril(N,-1);

% Initialize correlation matrix
corr = zeros(N);

% Fill in lower triangle of correlation matrix with generated correlations
corr(I) = rho;

% make correlation matrix symmetric with 1s on diagonal
corr = corr+corr'+eye(N);

% Variance matrix is sigma_i*sigma_j*corr(i,j)
V = (sigma*sigma').*corr;

% means vector
mu = rand(N,1);

% generate multivariate normal draws
e = mu' + randn(T,N)*chol(V);

1 Answer

That's just not how to create a correlation matrix. A matrix that is symmetric, has 1s on the diagonal, and has off-diagonal values between -1 and 1 is not necessarily a correlation matrix. For example, the way you build your matrix, you could end up with correlation +1 between random variables X1 and X2, +1 between X2 and X3, and -1 between X1 and X3, which is impossible if the entries come from correlations between "real" random variables. So there is no guarantee that the matrix you generate is positive (semi-)definite.
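A quick check in MATLAB on exactly that counterexample (a minimal sketch; the matrix R is just this illustration) shows it has a negative eigenvalue, so chol rejects it:

% "Correlation" matrix with corr(X1,X2) = +1, corr(X2,X3) = +1, corr(X1,X3) = -1
R = [ 1  1 -1;
      1  1  1;
     -1  1  1];

eig(R)      % eigenvalues are 2, 2 and -1, so R is not positive semi-definite
% chol(R)   % uncommenting this reproduces the "not positive definite" error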

Instead, generate a matrix X that encodes the (linear) dependencies between your random variables; the covariance matrix is then simply X' * X (or X * X', depending on how you orient X), which is positive semi-definite by construction.
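A minimal sketch of that approach (variable names N, T, sigma, mu are reused from the question; the rescaling step is just one way to impose your desired standard deviations):

N = 5;
T = 10000;

% Random matrix whose columns define the linear dependencies between variables
X = randn(N, N);

% X'*X is symmetric positive semi-definite by construction
% (and positive definite with probability 1 for a random square X)
V = X' * X;

% Optionally rescale so the standard deviations match the desired sigma
sigma = 1 + .1*rand(N,1);
D = diag(sigma ./ sqrt(diag(V)));
V = D * V * D;                     % now diag(V) equals sigma.^2

% Draw the multivariate normal samples as in the question; chol(V) now succeeds
mu = rand(N,1);
e = mu' + randn(T,N) * chol(V);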