cannot import s3fs in pyspark


When I try to import the s3fs library in PySpark with the following code:

import s3fs

I get the following error:

An error was encountered:

cannot import name 'maybe_sync' from 'fsspec.asyn' (/usr/local/lib/python3.7/site-packages/fsspec/asyn.py)
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/s3fs/__init__.py", line 1, in <module>
    from .core import S3FileSystem, S3File
  File "/usr/local/lib/python3.7/site-packages/s3fs/core.py", line 12, in <module>
    from fsspec.asyn import AsyncFileSystem, sync, sync_wrapper, maybe_sync
ImportError: cannot import name 'maybe_sync' from 'fsspec.asyn' (/usr/local/lib/python3.7/site-packages/fsspec/asyn.py)

The fsspec package is installed in my notebook, and I had actually been using it without problems for a long time before this suddenly started happening. I tried googling, but could not find this specific error. Has anyone come across this before, and if so, do you know how to solve it?
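For reference, the versions currently installed can be checked from the notebook before changing anything (standard pip; s3fs and fsspec are the two packages named in the traceback):

pip show s3fs fsspec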


There are 2 answers below

BEST ANSWER

Glad to hear this wasn't just me. It looks like a version mismatch: newer fsspec releases removed maybe_sync from fsspec.asyn, while older s3fs releases still import it. Pinning s3fs==0.5.1 and fsspec==0.8.3 should fix it.
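In a notebook cell that would be something like the following (plain pip syntax; whether you need a !pip or %pip prefix depends on your environment):

pip install s3fs==0.5.1 fsspec==0.8.3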


Be careful when changing the versions of s3fs and fsspec, in case your code ends up in a worse state than before. Upgrading all three related packages to their latest versions also resolves the mismatch:

pip install --upgrade botocore
pip install --upgrade s3fs
pip install --upgrade fsspec
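After reinstalling, restart the notebook kernel so the new packages are picked up, then run a quick sanity check (both libraries expose a __version__ attribute):

import fsspec
import s3fs

# If the installed versions are compatible, the s3fs import no longer raises ImportError.
print("fsspec", fsspec.__version__)
print("s3fs", s3fs.__version__)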