Alea GPU Memory Allocation Limit


I am using Alea GPU with a GTX 1080 Ti, which has 11 GB of global memory.

When I use Alea's Gpu.Default.Allocate method to allocate memory on the GPU, I get an "out of memory" error once the allocation approaches roughly 3 GB.

My code is:

private static int Length = 2147000000;   // ~2.1 billion ints, about 8.6 GB
...
var gpu = Gpu.Default;
gpu.Allocate<int>(Length);                // throws a CUDA "out of memory" exception

This should allocate about 8.6 GB of GPU RAM (2,147,000,000 ints × 4 bytes), but it throws a CUDA out-of-memory exception. I've tried lower Length values, and according to Task Manager the Dedicated GPU Memory only climbs to around 3 GB before the error occurs.
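
For illustration, here is a minimal sketch of the kind of step-down test I mean: it tries progressively smaller Length values until one fits. The step size is arbitrary, and the Gpu.Free cleanup call is my assumption about how the device buffer gets released, so treat it as a sketch rather than exact code.

using System;
using Alea;

static class AllocProbe
{
    static void Main()
    {
        var gpu = Gpu.Default;

        // Try progressively smaller allocations to see where the ceiling sits.
        for (int length = 2000000000; length >= 250000000; length -= 250000000)
        {
            double gb = (double)length * sizeof(int) / (1024.0 * 1024 * 1024);
            try
            {
                var buffer = gpu.Allocate<int>(length);
                Console.WriteLine($"OK: {length:N0} ints (~{gb:F1} GB)");
                Gpu.Free(buffer);   // assumed cleanup call for the device buffer
                break;              // stop at the first size that fits
            }
            catch (Exception ex)    // the CUDA out-of-memory error surfaces as an exception
            {
                Console.WriteLine($"Failed: {length:N0} ints (~{gb:F1} GB): {ex.Message}");
            }
        }
    }
}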

However, when I do the same allocation with raw CUDA code (outside of Alea), my limit is a bit over 80% of the 11 GB, or almost 9 GB (since Windows 10 limits how much can be allocated).

Does anyone know why I'm getting an "out of memory" error at only 3 GB?

Thanks.

Best answer:

(Posting this as an answer since it seems to have solved your issue.)

It sounds like you may be compiling in 32-bit mode. A 32-bit process only has a few GB of addressable memory, which would explain the ceiling you're hitting at around 3 GB. Make sure you are using a 64-bit OS and compiling in 64-bit mode. You wouldn't be the first to accidentally compile in 32-bit mode :)
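
As a quick sanity check (plain .NET, nothing Alea-specific), you can verify at startup that the process really is 64-bit; in Visual Studio also confirm that the Platform target is x64, or at least that "Prefer 32-bit" is unchecked for AnyCPU builds.

using System;

static class BitnessCheck
{
    static void Main()
    {
        // A 64-bit OS can still run your app as a 32-bit (WOW64) process,
        // so check the process itself, not just the OS.
        Console.WriteLine($"64-bit OS:      {Environment.Is64BitOperatingSystem}");
        Console.WriteLine($"64-bit process: {Environment.Is64BitProcess}");
        Console.WriteLine($"IntPtr.Size:    {IntPtr.Size}");   // 8 in a 64-bit process, 4 in a 32-bit one

        if (!Environment.Is64BitProcess)
            throw new InvalidOperationException(
                "Running as a 32-bit process; large GPU allocations will fail well below the card's capacity.");
    }
}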