Out of memory on ndarray conditional selection


I've read that Python can use all of the physical memory available on the machine, so it should not run out of memory before actually filling the 9+ GB free on my laptop.

However, using laspy to parse a 10M-point cloud (200 MB) and selecting points as follows produces an out-of-memory error:

import sys
import numpy as np
from laspy.file import File

inFile = File(sys.argv[1], mode="r")
all_points = np.vstack([inFile.x, inFile.y, inFile.z, inFile.return_num, inFile.intensity]).transpose()
lower_points = all_points[1 > inFile.z]
upper_points = all_points[1 <= inFile.z]

The last conditional selection triggers the memory error. 2M points satisfy the first condition and there are 10M points in total, so 8M points should satisfy the second condition.

If I instead make upper_points a plain Python list ([]) and .append every point whose z is greater than 1, it works without problems.
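For concreteness, here is a minimal sketch of the two approaches side by side, using a small random array in place of the laspy file (the array shape and threshold are stand-ins, not the real data): boolean-mask indexing, which allocates the whole selected copy at once, versus the list-append loop described above.

```python
import numpy as np

# Stand-in for the stacked point array from the question:
# 10 sample rows with columns x, y, z, return_num, intensity.
rng = np.random.default_rng(0)
all_points = rng.random((10, 5))
z = all_points[:, 2]

# Boolean-mask selection: NumPy allocates one new contiguous
# array holding a copy of every selected row at once.
upper_points_np = all_points[z >= 0.5]

# List-append workaround: grow a Python list one row at a time
# instead of allocating the full result up front.
upper_points_list = []
for point in all_points:
    if point[2] >= 0.5:
        upper_points_list.append(point)

# Both approaches select the same rows.
assert np.array_equal(upper_points_np, np.array(upper_points_list))
```

The selected rows are identical either way; only the allocation pattern differs.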

8M points should come to roughly 200 MB, or a bit more, so I don't really understand the problem. What am I missing?
