R Memory Management: Getting values of a large raster file throws error: cannot allocate vector of size n GB


I am aware that there are similar questions and problems around but mine is very specific.

I'm working with the WorldClim and CHELSA climate data sets at 30 arc-second resolution, and I want to write specific datasets into a ncdf4 file. To do so, the raster data must be converted into a vector and cast into a matrix; that matrix is then reversed and finally transposed. The output is then cast into an array to be written to the ncdf file.
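To illustrate the reorientation step on a toy matrix (just to show what the reverse-then-transpose combination does; the real data is of course far larger):

```r
# a tiny stand-in for the raster's value matrix
m <- matrix(1:4, nrow = 2, byrow = TRUE)
# m:    1 2
#       3 4
rot <- t(apply(m, 2, rev))  # reverse each column, then transpose = 90 deg clockwise
# rot:  3 1
#       4 2
```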

My example code looks like this.

library(raster)
library(ncdf4)

# Download one of the raster files (here a 110 MB .tif raster from the CHELSA server)
temp.raster <- raster("https://envidatrepo.wsl.ch/uploads/chelsa/chelsa_V1/bioclim/integer/CHELSA_bio10_08.tif")

#determining the cellsize
cellsize <- (temp.raster@extent@xmax - temp.raster@extent@xmin)/temp.raster@ncols
# Set Longitudes
lon <- as.array(seq(temp.raster@extent@xmin+cellsize/2,
                    temp.raster@extent@xmax-cellsize/2,
                    (temp.raster@extent@xmax - temp.raster@extent@xmin)/temp.raster@ncols))
nlon <- length(lon)

# Set Latitudes
lat <- as.array(seq(temp.raster@extent@ymin+cellsize/2,
                    temp.raster@extent@ymax-cellsize/2,
                    (temp.raster@extent@xmax - temp.raster@extent@xmin)/temp.raster@ncols))
nlat <- length(lat)


temp.raster.data <- t(apply(matrix(data = as.numeric(values(temp.raster)),
                                   nrow = nlat,
                                   ncol = nlon,
                                   byrow = TRUE),
                            2,
                            rev))
temp.raster.data <- array(temp.raster.data, dim=c(nlon,nlat,1))

The error occurs at the values() function, in the step that extracts the values from the raster file.

Setting a higher memory limit as suggested in other questions did not work, and adding more RAM is not an option.

Does anyone have a suggestion for a workaround, or an even simpler solution, to this problem?

My sessionInfo() is

R version 4.0.2 (2020-06-22)

Platform: x86_64-w64-mingw32/x64 (64-bit)

Running under: Windows 10 x64 (build 18363)

With 12 GB of RAM (memory.limit() --> 11984)

Thank you for your time!

1 Answer (accepted)

A simpler way would be to use writeRaster:

f <- "https://envidatrepo.wsl.ch/uploads/chelsa/chelsa_V1/bioclim/integer/CHELSA_bio10_08.tif"
bf <- basename(f)
download.file(f, bf, mode="wb")

library(raster)
r <- raster(bf)
ncfile <- gsub("\\.tif$", ".nc", bf)  # anchor and escape the dot: gsub takes a regex
x <- writeRaster(r, ncfile)
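If you want control over the netCDF variable and dimension names, writeRaster with format = "CDF" (which writes via the ncdf4 package) accepts extra metadata arguments. A small self-contained sketch, using an in-memory raster as a stand-in for the CHELSA layer (the variable/long names below are illustrative):

```r
library(raster)

# small in-memory raster standing in for the downloaded layer
r <- raster(nrows = 10, ncols = 10, xmn = 0, xmx = 10, ymn = 0, ymx = 10)
values(r) <- runif(ncell(r))

ncfile <- tempfile(fileext = ".nc")
# the metadata arguments are optional; sensible defaults are used otherwise
x <- writeRaster(r, ncfile, format = "CDF",
                 varname = "bio10_08",
                 longname = "mean temperature of warmest quarter",
                 xname = "lon", yname = "lat")
```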

Also, you are reinventing the wheel a lot; consider the helpers below.

# determining the cell size; res() returns c(xres, yres)
cellsize <- res(r)

# Lon/lat
lon <- xFromCol(r, 1:ncol(r))
lat <- yFromRow(r, 1:nrow(r))
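As a side note on the original memory error: the raster package can also process a layer in blocks via blockSize() and getValues(), so the whole layer never has to fit in RAM at once. A minimal sketch on a small in-memory raster (the per-block work here is just a sum; in your case it could be writing each chunk to the ncdf file):

```r
library(raster)

r <- raster(nrows = 100, ncols = 100)  # stand-in for the large CHELSA layer
values(r) <- runif(ncell(r))

bs <- blockSize(r)  # suggested row chunks for this raster
total <- 0
for (i in seq_len(bs$n)) {
  # read only the rows of block i into memory
  v <- getValues(r, row = bs$row[i], nrows = bs$nrows[i])
  total <- total + sum(v, na.rm = TRUE)  # process the chunk here
}
```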