Do I have a memory inconsistency? (macOS)


I currently use Activity Monitor to track memory, but it reports a serious inconsistency. I have a running program that builds and stores a 60 x 100,000 array of 10-dimensional GSL vectors in double precision, and another 6 x 60 array of 16,807-dimensional GSL vectors.

I code in C++ and am using the GSL library out of convenience for the moment.

GSL vectors are essentially an array of double-precision data plus a pointer, so I think it should be accurate to measure their usage in terms of just the double-precision components.

Now, by my calculations, I should be storing around 500 MB of data (8 bytes per double). However, macOS Activity Monitor tells me I'm using 1.4 GB of "real" memory. This may be a very inaccurate way to measure memory usage, but it is not inaccurate at predicting when my machine will switch from RAM to swap and become very slow! When I increase the first array to 60 x 400k, for example, I run out of memory and everything stops dead.
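
For the record, my back-of-the-envelope arithmetic counts only the double-precision payload:

```cpp
#include <cstdio>

// Back-of-the-envelope check of the raw double-precision payload only
// (ignores gsl_vector structs, std::vector headers and allocator overhead).
int main() {
    const double bytes_per_double = 8.0;

    // 60 x 100,000 GSL vectors, each holding 10 doubles
    double first  = 60.0 * 100000.0 * 10.0 * bytes_per_double;   // ~480 MB

    // 6 x 60 GSL vectors, each holding 16,807 doubles
    double second = 6.0 * 60.0 * 16807.0 * bytes_per_double;     // ~48 MB

    std::printf("first:  %.1f MB\n", first  / 1e6);
    std::printf("second: %.1f MB\n", second / 1e6);
    std::printf("total:  %.1f MB\n", (first + second) / 1e6);
}
```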

So is it that my math is wrong, or is there something going wrong with the way my computer is estimating how much data it's storing?

EDIT: Or is it something about the way I'm storing pointer-based data that's causing the allocator to massively over-allocate relative to the actual storage need?

EDIT 2: The data is stored in std::vector<std::vector<gsl_vector*>> structures. I read that Eigen avoids dynamic memory allocation for fixed-size types: could this lead to a substantial improvement in memory management?
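
For reference, a simplified sketch of roughly how that container gets built (not my actual code, but the same shape):

```cpp
#include <cstddef>
#include <vector>
#include <gsl/gsl_vector.h>

// Simplified sketch of the container layout described above: an outer
// std::vector of inner std::vectors, each element a separately
// heap-allocated gsl_vector.
std::vector<std::vector<gsl_vector*>> build_store(std::size_t rows,
                                                  std::size_t cols,
                                                  std::size_t dim) {
    std::vector<std::vector<gsl_vector*>> store(rows);
    for (std::size_t i = 0; i < rows; ++i) {
        store[i].reserve(cols);
        for (std::size_t j = 0; j < cols; ++j) {
            // one set of heap allocations per element
            store[i].push_back(gsl_vector_calloc(dim));
        }
    }
    return store;
}

// e.g. build_store(60, 100000, 10) for the first array
```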

1 Answer

BEST ANSWER

You are creating 6,000,000 heap-allocated GSL vectors (60 x 100,000), plus the nested std::vectors that hold their pointers. That's a lot of overhead. None of those containers is simply an array: a std::vector carries its own pointer, size, capacity and alignment bookkeeping, and each gsl_vector_alloc call makes separate heap allocations for the gsl_vector struct, its gsl_block, and the data array, each with its own allocator header and padding.
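
To put rough numbers on that (a sketch, not an exact account: the per-allocation malloc overhead is a guess and the struct sizes assume a typical 64-bit build):

```cpp
#include <cstddef>
#include <cstdio>
#include <gsl/gsl_vector.h>

// Rough per-element cost of one heap-allocated 10-dimensional gsl_vector.
// gsl_vector_alloc makes three separate allocations: the gsl_vector struct,
// the gsl_block struct, and the data array. The header/alignment cost per
// allocation is a rough guess and varies by allocator.
int main() {
    const std::size_t dim             = 10;
    const std::size_t n_elements      = 60u * 100000u;  // first structure
    const std::size_t malloc_overhead = 16;             // guess, per allocation

    std::size_t payload  = dim * sizeof(double);        // 80 bytes of useful data
    std::size_t per_elem = payload
                         + sizeof(gsl_vector)           // ~40 bytes
                         + sizeof(gsl_block)            // ~16 bytes
                         + 3 * malloc_overhead          // three allocations
                         + sizeof(gsl_vector*);         // pointer held in std::vector

    std::printf("payload per element : %zu bytes\n", payload);
    std::printf("total per element   : %zu bytes\n", per_elem);
    std::printf("whole structure     : %.2f GB\n",
                double(per_elem) * n_elements / 1e9);
}
```

With the second structure and the std::vector bookkeeping on top, that lands much closer to the 1.4 GB Activity Monitor reports than to the ~500 MB payload estimate.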

Eigen may partially solve your problem, since its Matrix class has fixed-size specializations that store their data inline rather than on the heap. However, you should really be thinking about a proper contiguous 3D array/tensor object.
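
As an illustration of what contiguous storage buys you, here is a minimal hand-rolled sketch (the Tensor3 name is made up; libraries such as Eigen's unsupported Tensor module or Boost.MultiArray give you the same idea ready-made):

```cpp
#include <cstddef>
#include <vector>

// Minimal sketch of a contiguous 3D container: one flat allocation holding
// rows * cols * dim doubles, indexed by hand. Payload only -- no per-element
// structs or allocations, so memory use tracks the 8-bytes-per-double estimate.
class Tensor3 {
public:
    Tensor3(std::size_t rows, std::size_t cols, std::size_t dim)
        : rows_(rows), cols_(cols), dim_(dim),
          data_(rows * cols * dim, 0.0) {}

    double& operator()(std::size_t i, std::size_t j, std::size_t k) {
        return data_[(i * cols_ + j) * dim_ + k];
    }
    const double& operator()(std::size_t i, std::size_t j, std::size_t k) const {
        return data_[(i * cols_ + j) * dim_ + k];
    }

private:
    std::size_t rows_, cols_, dim_;
    std::vector<double> data_;
};

// Usage: Tensor3 store(60, 100000, 10); store(5, 42, 3) = 1.0;
```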