Cannot open a deltatable in S3 using AWS_PROFILE based credentials from a local machine #855
Comments
The underlying object store implementation does not plan on supporting this. It doesn't depend on the official AWS crates and doesn't want to re-implement something that complex, which is reasonable (apache/arrow-rs#2178). But perhaps for the Python bindings we could have optional integration with boto3? The implementation would be something like: if you don't pass in credentials explicitly, resolve them through boto3:

from boto3 import Session

session = Session()
credentials = session.get_credentials()

(source) Does that seem reasonable?
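A rough sketch of what that optional boto3 fallback could look like on the Python side, assuming it only kicks in when the caller has not already supplied credentials in storage_options (the helper name _resolve_credentials_via_boto3 is hypothetical, not existing API):

def _resolve_credentials_via_boto3(storage_options: dict) -> dict:
    # Hypothetical helper: fall back to boto3's credential chain (profiles, SSO,
    # env vars) only when the caller has not supplied explicit AWS credentials.
    if "AWS_ACCESS_KEY_ID" in storage_options:
        return storage_options
    try:
        from boto3 import Session
    except ImportError:
        return storage_options  # boto3 stays an optional dependency
    credentials = Session().get_credentials()
    if credentials is None:
        return storage_options
    frozen = credentials.get_frozen_credentials()
    resolved = dict(storage_options)
    resolved["AWS_ACCESS_KEY_ID"] = frozen.access_key
    resolved["AWS_SECRET_ACCESS_KEY"] = frozen.secret_key
    if frozen.token:
        resolved["AWS_SESSION_TOKEN"] = frozen.token
    return resolved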
I really think we should have a native SSO implementation since it's such a common development workflow. Doing it in Python through boto3 is a reasonable short-term workaround for Python users, so I think we should do that. It would be even better if we could do this in the Rust core. We have had many AWS auth issues reported since we switched to objectstore-rs; it looks like its auth implementation is a bit broken :*( @wouove have you tried downgrading to version 0.5.x?
Hi all, thank you for your quick replies :) This is what I ran:

from deltalake import DeltaTable
from boto3 import Session

session = Session()
credentials = session.get_credentials()
current_credentials = credentials.get_frozen_credentials()

uri = "s3://bucket/key"
storage_options = {}
storage_options["AWS_ACCESS_KEY_ID"] = current_credentials.access_key
storage_options["AWS_SECRET_ACCESS_KEY"] = current_credentials.secret_key
storage_options["AWS_SESSION_TOKEN"] = current_credentials.token

dt = DeltaTable(uri, storage_options=storage_options)
print(dt.version())

@houqp Downgrading to 0.5.x did not work, I got
Proposed upstream workaround: apache/arrow-rs#2891. TBC I would still recommend aws-vault over using this feature, but this functionality may serve as an optional escape valve.
# Description

- Add support for the `AWS_PROFILE` environment variable for AWS S3
- Bump `object_store` to version `0.5.2`

# Related Issue(s)

- relates to apache/arrow-rs#3229
- relates to apache/arrow-rs#2891
- closes #855
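Assuming the environment-variable route described in that PR, opening a table backed by an SSO profile should then reduce to something like this sketch (the profile name and bucket URI are placeholders):

import os
from deltalake import DeltaTable

# Assumes `aws sso login --profile my-profile` has already been run;
# "my-profile" and the bucket URI are placeholders.
os.environ["AWS_PROFILE"] = "my-profile"

dt = DeltaTable("s3://bucket/key")
print(dt.version())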
Environment
Delta-rs version: 0.6.1
Binding: Python (3.10.3)
Environment:
Bug
What happened:
While trying to open a Delta Lake table from S3, I got the following error:
I realise this bug is a lot like the one reported in issue 854. Still, I'd like to report it separately, as the hardware and authentication flow are different in this issue.
What you expected to happen:
Credentials should be accessible from the AWS profile, which is set via the environment variable AWS_PROFILE = "Your profile". The credentials are set for that profile once an AWS session is started through SSO. I expected the code to use this profile to obtain the AWS credentials. When using any boto3 method to interact with AWS APIs from Python, the credentials are taken from the profile automatically. Hence, I expected the script to finish and simply print the latest version of the table.
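For comparison, a small sketch of the boto3 behaviour described above, assuming an active SSO session for the profile (profile name, bucket, and key are placeholders):

import os
import boto3

# "Your profile" is the placeholder profile name used above; normally it would
# simply be exported in the shell before starting Python.
os.environ["AWS_PROFILE"] = "Your profile"

# boto3 resolves credentials from the profile (including SSO) on its own,
# so this call works without passing any keys explicitly.
s3 = boto3.client("s3")
print(s3.list_objects_v2(Bucket="bucket", Prefix="key"))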
How to reproduce it:
Code:
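A minimal sketch of code that reproduces this, assuming only the SSO profile is configured and no static keys are set (the bucket URI is a placeholder):

from deltalake import DeltaTable

# Assumes AWS_PROFILE="Your profile" is exported and `aws sso login` has been
# run for it; no AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY are set.
uri = "s3://bucket/key"  # placeholder

dt = DeltaTable(uri)  # on delta-rs 0.6.1 this fails: the profile credentials are not picked up
print(dt.version())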
More details:
The only workaround I found is the one in issue 854, which is kinda ugly.