Use AWS_PROFILE in pandas.read_parquet


I'm testing this locally where I have a ~/.aws/config file.

My ~/.aws/config looks something like:

[profile a] 
...
[profile b]
...

I also have an AWS_PROFILE environment variable set to "a".

I would like to use pandas to read a file that is accessible with profile b.

I am able to access it through s3fs by doing:

import pandas as pd
import s3fs

# download the object locally with profile b, then read it
fs = s3fs.S3FileSystem(profile="b")
fs.get("BUCKET/FILE.parquet", "FILE.parquet")
pd.read_parquet("FILE.parquet")

However, if I try to pass the profile to pd.read_parquet via storage_options, I get PermissionError: Forbidden.

pd.read_parquet(
    "s3://BUCKET/FILE.parquet",
    storage_options={"profile": "b"},
)

Full traceback below:

Traceback (most recent call last):
  File "/home/ray/local/bin/anaconda3/envs/main/lib/python3.8/site-packages/s3fs/core.py", line 233, in _call_s3
    out = await method(**additional_kwargs)
  File "/home/ray/local/bin/anaconda3/envs/main/lib/python3.8/site-packages/aiobotocore/client.py", line 154, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/ray/local/bin/anaconda3/envs/main/lib/python3.8/site-packages/pandas/io/parquet.py", line 459, in read_parquet
    return impl.read(
  File "/home/ray/local/bin/anaconda3/envs/main/lib/python3.8/site-packages/pandas/io/parquet.py", line 221, in read
    return self.api.parquet.read_table(
  File "/home/ray/local/bin/anaconda3/envs/main/lib/python3.8/site-packages/pyarrow/parquet.py", line 1672, in read_table
    dataset = _ParquetDatasetV2(
  File "/home/ray/local/bin/anaconda3/envs/main/lib/python3.8/site-packages/pyarrow/parquet.py", line 1504, in __init__
    if filesystem.get_file_info(path_or_paths).is_file:
  File "pyarrow/_fs.pyx", line 438, in pyarrow._fs.FileSystem.get_file_info
  File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/_fs.pyx", line 1004, in pyarrow._fs._cb_get_file_info
  File "/home/ray/local/bin/anaconda3/envs/main/lib/python3.8/site-packages/pyarrow/fs.py", line 226, in get_file_info
    info = self.fs.info(path)
  File "/home/ray/local/bin/anaconda3/envs/main/lib/python3.8/site-packages/fsspec/asyn.py", line 72, in wrapper
    return sync(self.loop, func, *args, **kwargs)
  File "/home/ray/local/bin/anaconda3/envs/main/lib/python3.8/site-packages/fsspec/asyn.py", line 53, in sync
    raise result[0]
  File "/home/ray/local/bin/anaconda3/envs/main/lib/python3.8/site-packages/fsspec/asyn.py", line 20, in _runner
    result[0] = await coro
  File "/home/ray/local/bin/anaconda3/envs/main/lib/python3.8/site-packages/s3fs/core.py", line 911, in _info
    out = await self._call_s3(
  File "/home/ray/local/bin/anaconda3/envs/main/lib/python3.8/site-packages/s3fs/core.py", line 252, in _call_s3
    raise translate_boto_error(err)
PermissionError: Forbidden

Note: there is an older, somewhat related question, but it didn't help: How to read parquet file from s3 using dask with specific AWS profile

1 answer below

You just need to add the following argument to the function:

storage_options=dict(profile='your_profile_name')

Hence the read statement is:

pd.read_parquet("s3://your_bucket", storage_options=dict(profile='your_profile_name'))