Support for cross-account materials #12
Some questions so we understand this scenario better
Thanks for that. The plugin is built with the assumption that the AWS keys are provided via environment variables. If creating an IAM user (with the existing policies) is an option, you could use that user's keys as the environment variables. I'm not sure what the recommended practice is here: one IAM user with cross-account access, or one IAM user per account. HTH.
I'm assuming you're using the AWS SDK with this plugin, which means it knows about IAM instance roles and how to make use of them (i.e. you don't need to set these variables; they're available to the instance/SDK the code is running on). And yes, it's absolutely possible for an instance in one account to access resources in another. However, adding a material on the server fails if the bucket is only available in another account. That's what this bug is about :) If it's still unclear please let me know, I'll try to paint a better picture (literally perhaps). ❤️
Guess we need to add the ability for the Go server to assume a role and access a bucket in another account: http://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/securitytoken/model/AssumeRoleRequest.html This will be needed both to add the package and to poll for changes. It's an interesting use case to support, but this may not be something we can develop and test given our setup and priorities. We would, of course, welcome PRs for the same. We'll keep this issue open.
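For reference, cross-account `AssumeRole` requires the target role (in the bucket's account) to carry a trust policy allowing the caller's account to assume it. A minimal sketch, with a hypothetical account ID:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111111111111:root" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```

The Go server would then call STS with the role's ARN and use the temporary credentials it gets back for all S3 calls against the foreign bucket.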
Actually, you don't need to assume a role to access external resources. Bucket policies allow for cross-account access without needing any credentials in the account being accessed: http://docs.aws.amazon.com/AmazonS3/latest/dev/example-walkthroughs-managing-access-example2.html The policy mentioned in the example is attached to the EC2 instance running Go.CD via an instance profile, i.e. you already have all the necessary rights to access the bucket in another account. However, the current method of checking for bucket availability/access in the materials dialog doesn't account for buckets that don't exist in the same account. I don't know how the check works, but it's the one component that prevents me from adding a material I should actually have access to.
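The walkthrough linked above boils down to a bucket policy like the following, attached to the bucket in the owning account; the account ID and bucket name here are hypothetical placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111111111111:root" },
      "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-artifacts-bucket",
        "arn:aws:s3:::example-artifacts-bucket/*"
      ]
    }
  ]
}
```

With this in place, any principal in the trusted account whose own IAM policy also grants these actions can read, write, and list the bucket directly, with no role assumption involved.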
We have a similar scenario where the bucket is in Account A, but Go servers/agents are in either Account A or Account B. We have a bucket policy in Account A that permits the IAM role used by servers in Account B full access to the bucket, but we are currently forced to use access/secret keys rather than the IAM role, because objects uploaded from Account B are inaccessible from Account A. We believe this can be fixed by adding the "Bucket Owner - full access" ACL at the time of upload, but have not yet tested that. We hope to test it later this week and, if it works, will reply here.
@manojlds @ashwanthkumar @moritzheiber What we needed to do to make it work is add the statement setting this ACL at upload time. Assuming setting this ACL is safe for all users of the plugins (I believe it should be, as it is the default ACL if you only had one account anyway), we can issue a PR with this change.
For me this was less a question of policies (I know this works; IAM/bucket policies can do that) and more one of UI/integration. How would I add such a bucket as a material in the UI? It's not possible currently because (I'm just guessing) the code isn't trying to access the bucket in S3 directly, but rather lists all available buckets in the same account.
@moritzheiber Yes, when you define the repository, the Check Connection fails (Check Connection does list buckets, and only those in the current account are visible). Perhaps the check can be performed a different way (e.g. using client.doesBucketExist() rather than listing buckets).

The above does not prevent one from using buckets cross-account, however. Even though the connection check fails, you can still define materials (the Check Package on Add Material does work). As mentioned, we are now using a cross-account setup with these plugins at GCM.

The issue we encountered is that there is seemingly no way to enforce via policy that S3 ACLs are set on new objects. So if someone in Account A uploads an object into a bucket in Account B without setting the ACL at upload time, Account B cannot read metadata for or download that object, even though it owns the bucket. This is resolved with the change described above.
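For completeness: a commonly used bucket-policy pattern doesn't set the ACL for the uploader (consistent with the observation above that policy can't do that), but it can reject uploads that omit it, so non-compliant objects never land in the bucket. A sketch of such a statement, with a hypothetical bucket name:

```json
{
  "Effect": "Deny",
  "Principal": "*",
  "Action": "s3:PutObject",
  "Resource": "arn:aws:s3:::example-artifacts-bucket/*",
  "Condition": {
    "StringNotEquals": {
      "s3:x-amz-acl": "bucket-owner-full-control"
    }
  }
}
```

Uploaders then have to pass the `bucket-owner-full-control` canned ACL explicitly, which is exactly the ACL change proposed for the plugin.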
@haimizrael That's good news, I'll give it another try tomorrow. As far as I can remember, I was never able to add the plugin as a material repository, and thus also never able to reference any S3 sources/destinations in my pipelines.
Check connection now uses listObjects. Hope that addresses most of the requirements for this issue.
We are using, as is considered best practice, multiple accounts for our Go.CD infrastructure. This also means our agents run in different accounts for different environments:
We've decided to put all of our S3 artifacts into Account B. Fetching/pushing artifacts is done using scripts and the AWS CLI/SDK. That's exactly where I wanted this plugin to come in. Unfortunately, I can't even get past adding our S3 artifact bucket as a material, since it's not in the same account as the server itself.
Now, we could obviously just move said bucket over to Account A, but then the agents would most likely (I haven't checked the code) not be able to push/pull to/from it, which is essentially the same problem as described before with the server and material repository.
It would be beneficial to somehow account for this use case.