I'm trying to open a NetCDF file, and I've read that files larger than available memory can be handled by "breaking them into chunks" with dask. However, I get:
ValueError: unrecognized chunk manager dask - must be one of: []
I'm also not sure whether requesting chunks of a certain size works the way I attempted. I expected the printed dataset to show variables of type dask.array with a chunksize attribute.
I do have dask installed, but Python can't find the module when I try to import it.
Other people with this issue said installing an older version of xarray worked, but that didn't help in my case.
I'm very new to Python, so I'll do my best to provide more info if needed. Here's what I've got up to the point of the error:
import os

import numpy as np
import matplotlib.pyplot as plt
import xarray as xr

folder = 'foldername'
file = 'jan2mT.nc'
# use os.path.join so a path separator is inserted between folder and file
filename = os.path.join(folder, file)

# ask xarray to lazily load the data as dask chunks of ~250 MB along time
ds = xr.open_dataset(filename, decode_times=True, chunks={'time': '250MB'})
print(ds)
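In case it helps diagnose, here's a quick check of which interpreter is running and whether dask is visible to it. From what I've read, a common cause of this error is dask being installed into a different environment than the one running the script, so this is just a sketch to surface that:

```python
import sys
import importlib.util

# Show which Python interpreter is actually running this script;
# packages must be installed into this same interpreter's environment.
print("interpreter:", sys.executable)

# Check whether dask is importable from this environment, without importing it.
spec = importlib.util.find_spec("dask")
print("dask location:", spec.origin if spec else "not found in this environment")
```

If this reports dask as not found, I gather that installing it into that exact interpreter (e.g. `python -m pip install dask`) and rerunning should make the chunk manager available, but I'm not certain.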