How to deal with very very large matrices?

% input variables
T = 2274;
N = 58;
B = 10000;
H = 250;
K = 3;

% I simulate matrices of similar size to illustrate the problem
rng('default') % reproducibility is not strictly needed here

% first set (based on dimension)
P0 = lognrnd(0, 1, T, N);
R0 = rand(T, N) - rand(T, N);
% second set (based on dimension)
R = rand(T, N, K) - rand(T, N, K);
V0 = rand(T, N, K);
Z0 = randn(T, N, K);

% third and fourth set (based on dimension)
Z1 = randn(H, B, N, K);
R1 = rand(H, B, N, K) - rand(H, B, N, K);
V1 = rand(H, B, N, K);
P1 = lognrnd(0, 1, H, B, N, K);

Z2 = randn(H, B, N, K);
R2 = rand(H, B, N, K) - rand(H, B, N, K);
V2 = rand(H, B, N, K);
P2 = lognrnd(0, 1, H, B, N, K);
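
To give a sense of scale: each of the eight H-by-B-by-N-by-K arrays above holds 250*10000*58*3 = 435 million doubles, so roughly 3.5 GB per array and about 28 GB in total:

% rough memory footprint of the third and fourth sets
bytesPerArray = H * B * N * K * 8;                           % ~3.48e9 bytes per double array
fprintf('one 4-D array : %.2f GB\n', bytesPerArray / 1e9);   % ~3.48 GB
fprintf('all eight     : %.2f GB\n', 8 * bytesPerArray / 1e9);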

While MATLAB can handle the computations themselves, saving and loading these large matrices is extremely slow, and loading the .mat files often crashes MATLAB. I have tried the matfile function to access the variables without loading them fully into memory, but it does not provide a satisfactory solution in terms of speed either. I would greatly appreciate any suggestions on how to handle matrices of this size efficiently in MATLAB. Is there a more optimized way to manage these large datasets, particularly for the save and load operations?
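
For reference, this is roughly the pattern I have been using (the file name is just illustrative); I save with -v7.3 since that format is required for variables larger than 2 GB:

% save in the v7.3 (HDF5-based) format, required for variables > 2 GB
save('bigdata.mat', 'Z1', 'R1', 'V1', 'P1', 'Z2', 'R2', 'V2', 'P2', '-v7.3');

% read only a slice instead of loading the whole variable
m = matfile('bigdata.mat');
chunk = m.Z1(:, 1:100, :, :);   % pulls just this block from disk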
