AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables not read for S3/Zarr access #3259

@alexandre-borowczyk

Description

Hi all,

Thanks for the great work put into supporting Zarr and Google Cloud! I'm running into an issue and am uncertain whether it's an improper setup on my end or a bug in the library. I hope to get some help working through it.

Version: netCDF 4.10.0-development (commit 871bbc3)
Linux Ubuntu 24.04.1
gcc: 13.3.0
g++: 13.3.0
GNU Make 4.3
autoconf (GNU Autoconf) 2.71
automake (GNU automake) 1.16.5
curl 8.5.0
Python 3.12.3

Build Configuration:

nc-config --all
This netCDF 4.10.0-development has been built with the following features: 
  --cc                -> gcc
  --cflags            -> -I/usr/local/include -I/usr/include/hdf5/serial
  --libs              -> -L/usr/local/lib -lnetcdf
  --static            -> -lhdf5_hl -lhdf5 -lm -lz -lcrypto -lzip -lsz -lbz2 -lzstd -lblosc -lxml2 -lcurl 
  --has-dap           -> yes
  --has-dap2          -> yes
  --has-dap4          -> yes
  --has-nc2           -> yes
  --has-nc4           -> yes
  --has-hdf5          -> yes
  --has-hdf4          -> no
  --has-logging       -> yes
  --has-pnetcdf       -> no
  --has-szlib         -> yes
  --has-cdf5          -> yes
  --has-parallel4     -> no
  --has-parallel      -> no
  --has-nczarr        -> yes
  --has-zstd          -> yes
  --has-benchmarks    -> no
  --has-multifilters  -> yes
  --has-stdfilters    -> bz2 deflate szip blosc zstd
  --has-quantize      -> yes
  --prefix            -> /usr/local
  --includedir        -> /usr/local/include
  --libdir            -> /usr/local/lib
  --plugindir         -> /usr/local/hdf5/lib/plugin
  --plugin-searchpath -> /usr/local/hdf5/lib/plugin:/usr/local/hdf5/lib/plugin
  --version           -> netCDF 4.10.0-development
  --build-system      -> autotools

Issue Description:

AWS credentials supplied via the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are not being used when accessing Google Cloud Storage resources via the gs3:// protocol with NCZarr. The credentials are only picked up when stored in the ~/.aws/credentials file.

Steps to Reproduce:

  1. Set up valid AWS/GCS HMAC credentials in ~/.aws/credentials:

     [default]
     aws_access_key_id = GOOG1E...
     aws_secret_access_key = ...

  2. Test access to the Zarr dataset (works):

     ncdump -h gs3://bucket-name/path/to/dataset.zarr#mode=zarr,s3,consolidated

  3. Remove the credentials file:

     rm -rf ~/.aws/

  4. Export the same credentials as environment variables:

     export AWS_ACCESS_KEY_ID="GOOG1E..."
     export AWS_SECRET_ACCESS_KEY="..."

  5. Test access again (fails):

     WARN: Could not open file: /home/user/.aws/config
     WARN: Could not open file: /home/user/.aws/credentials
     WARN: AWS config file not loaded
     ncdump: gs3://bucket-name/path/to/dataset.zarr#mode=zarr,s3,consolidated: NetCDF: Authorization failure

The credentials can be verified to work correctly via curl:

curl --aws-sigv4 "aws:amz:auto:s3" \
     --user "$AWS_ACCESS_KEY_ID:$AWS_SECRET_ACCESS_KEY" \
     "https://storage.googleapis.com/bucket-name/?list-type=2"
# Returns valid XML listing
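Since access through the credentials file works while the environment variables are ignored, a temporary workaround is to regenerate ~/.aws/credentials from the environment before running ncdump. This is only a sketch of a stopgap, not a fix for the library behavior; it assumes (as the warnings above suggest) that netCDF reads the default profile from ~/.aws/credentials.

```shell
# Workaround sketch: write the env-var credentials into the file netCDF does read.
# Assumes AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are already exported.
mkdir -p ~/.aws
cat > ~/.aws/credentials <<EOF
[default]
aws_access_key_id = ${AWS_ACCESS_KEY_ID}
aws_secret_access_key = ${AWS_SECRET_ACCESS_KEY}
EOF
# Restrict permissions, matching what the AWS tooling expects for this file.
chmod 600 ~/.aws/credentials
```

Obviously this defeats the point of environment-variable configuration (e.g. in containers or CI where writing a credentials file is undesirable), so it only papers over the issue.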
