Implementation of EM algorithm for Gaussian Mixture Models using Matlab

89 Views Asked by At

Using the EM algorithm, I want to train a Gaussian Mixture model with four components on a given dataset. The set is three dimensional and contains 300 samples.

The problem is that after about 6 rounds of the EM algorithm, the covariance matrices sigma become close to singular according to matlab (rank(sigma) = 2 instead of 3). This in turn leads to undesired results like complex values evaluating the gaussian distribution gm(k,i).

Furthermore I used the log of the gaussian to account for underflow troubles - see E-step. I am not sure if this is correct and if I have to take the exp of the responsibilites p(w_k | x^(i), theta) somewhere else?

Can you tell me if my implementation of the EM algorithm is correct so far? And how to account for the problem with the close to singular covariance sigma?

Here is my implementation of the EM algorithm:

First I initialized the means and the covariance of the components using kmeans:

load('data1.mat');

X = Data'; % 300x3 data set
D = size(X,2); % dimension
N = size(X,1); % number of samples
K = 4; % number of Gaussian Mixture components

% Initialization
p = [0.2, 0.3, 0.2, 0.3]; % arbitrary pi
[idx,mu] = kmeans(X,K); % initial means of the components

% compute the covariance of the components
sigma = zeros(D,D,K);
for k = 1:K
    sigma(:,:,k) = cov(X(idx==k,:));
end

For the E-step I am using the following formula to calculate the responsibilities.

responsibility

w_k are the k gaussian components.

x^(i) is a single datapoint (sample)

theta stands for the parameters of the gaussian mixture model: mu, Sigma, pi.

Here is the corresponding code:

% variables for convergence 
converged = 0;
prevLoglikelihood = Inf;
prevMu = mu;
prevSigma = sigma;
prevPi = p;
round = 0;
while (converged ~= 1)
    round = round +1
    gm = zeros(K,N); % gaussian component in the nominator
    sumGM = zeros(N,1); % denominator of responsibilities
    % E-step:  Evaluate the responsibilities using the current parameters
    % compute the nominator and denominator of the responsibilities
    for k = 1:K
        for i = 1:N
             Xmu = X-mu;
             % I am using log to prevent underflow of the gaussian distribution (exp("small value"))
             logPdf = log(1/sqrt(det(sigma(:,:,k))*(2*pi)^D)) + (-0.5*Xmu*(sigma(:,:,k)\Xmu'));
             gm(k,i) = log(p(k)) * logPdf;
             sumGM(i) = sumGM(i) + gm(k,i);
         end
    end

    % calculate responsibilities
    res = zeros(K,N); % responsibilities
    Nk = zeros(4,1);
    for k = 1:K
        for i = 1:N
            % I tried to use the exp(gm(k,i)/sumGM(i)) to compute res but this leads to sum(pi) > 1.
            res(k,i) = gm(k,i)/sumGM(i);
        end
        Nk(k) = sum(res(k,:));
    end

Nk(k) is computed using the formula given in the M-step and is used in the M-step to calculate the new probabilities p(k).

M-step

reestimate parameters using current responsibilities

    % M-step: Re-estimate the parameters using the current responsibilities
    for k = 1:K
        for i = 1:N
            mu(k,:) = mu(k,:) + res(k,i).*X(k,:);
            sigma(:,:,k) = sigma(:,:,k) + res(k,i).*(X(k,:)-mu(k,:))*(X(k,:)-mu(k,:))';
        end
        mu(k,:) = mu(k,:)./Nk(k);
        sigma(:,:,k) = sigma(:,:,k)./Nk(k);
        p(k) = Nk(k)/N;
    end

Now in order to check for convergence the log-likelihood is computed using this formula:

log-likelihood

    % Evaluate the log-likelihood and check for convergence of either 
    % the parameters or the log-likelihood. If not converged, go to E-step.
    loglikelihood = 0;
    for i = 1:N
        loglikelihood = loglikelihood + log(sum(gm(:,i)));
    end


    % Check for convergence of parameters
    errorLoglikelihood = abs(loglikelihood-prevLoglikelihood);
    if (errorLoglikelihood <= eps)
        converged = 1; 
    end

    errorMu = abs(mu(:)-prevMu(:));
    errorSigma = abs(sigma(:)-prevSigma(:));
    errorPi = abs(p(:)-prevPi(:));

    if (all(errorMu <= eps) && all(errorSigma <= eps) && all(errorPi <= eps))
        converged = 1;
    end

    prevLoglikelihood = loglikelihood;
    prevMu = mu;
    prevSigma = sigma;
    prevPi = p;

end % while 

Is there something wrong with my Matlab implementation of EM algorithm for Gaussian Mixture Models?

0

There are 0 best solutions below