datacube.utils.aws.configure_s3_access
- datacube.utils.aws.configure_s3_access(profile=None, region_name='auto', aws_unsigned=False, requester_pays=False, cloud_defaults=True, client=None, **gdal_opts)
Set up credentials for S3 bucket access, or configure public access.
This function obtains credentials for S3 access and passes them on to processing threads, whether local or on a Dask cluster.
Note
If credentials are STS-based they will eventually expire. This case is currently not handled well: reads will eventually start failing and will never recover.
- Parameters:
profile – AWS profile name to use
region_name – Default region_name to use if not configured for a given/default AWS profile
aws_unsigned – If True, don't bother with credentials when reading from S3
requester_pays – Needed when accessing requester-pays buckets
cloud_defaults – Assume files are in a cloud-native format, i.e. without side-car files. This disables looking for side-car files, which makes things faster but won't work for files that do have side-car files with extra metadata.
client – Dask distributed dask.Client instance; if supplied, apply settings on the Dask cluster rather than locally
gdal_opts – Any other options to pass to GDAL environment setup
- Returns:
Credentials object, or None if aws_unsigned=True
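
A minimal usage sketch based on the signature above; the profile name, region, and scheduler address are hypothetical placeholders:

```python
from datacube.utils.aws import configure_s3_access

# Public data: skip credentials entirely (returns None).
configure_s3_access(aws_unsigned=True)

# Credentialed access via a named AWS profile ("prod" is a placeholder);
# returns the credentials object passed on to processing threads.
creds = configure_s3_access(profile="prod", region_name="us-west-2")

# On a Dask cluster: pass the distributed client so the same GDAL/S3
# settings are applied on the cluster rather than locally.
from dask.distributed import Client

client = Client("tcp://scheduler:8786")  # placeholder scheduler address
configure_s3_access(aws_unsigned=True, client=client)
```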