
Support for cross-account materials #12

Closed
moritzheiber opened this issue Aug 8, 2015 · 12 comments

Comments

@moritzheiber

We are using multiple accounts for our Go.CD infrastructure, as is considered best practice. This also means our agents are running in different accounts for different environments:

  • Account A: Go.CD server
  • Account B: Go.CD agents for account B
  • Account C: Go.CD agents for account C

We've decided to put all of our S3 artifacts into Account B. Fetching/pushing artifacts is done using scripts and the AWS CLI/SDK. That's exactly where I wanted this plugin to come in. Unfortunately, I can't even get past adding our S3 artifact bucket as a material, since it's not in the same account as the server itself.

Now, we could just move said bucket over to Account A obviously, but then the agents would most likely (haven't checked the code) not be able to push/pull to/from it, essentially the same problem as described before with the server + material repository.

It would be beneficial to somehow account for this use case.

@ashwanthkumar
Member

Some questions so we can understand this scenario better:

  • How are the agents talking to the server now? Is it over the internet (HTTPS)?
  • Can you please explain how your current setup using the AWS CLI/SDK works across accounts? Does it use mfacli.sh?

@moritzheiber
Author

  • Our agents are in their own VPCs, peered with the Go.CD server's VPC, so they're talking to the server over a private network, without any access from the outside.
  • The agents are allowed to upload to a certain bucket in one account using a bucket policy and IAM policies associated with their instance profile in their respective accounts.
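
For readers unfamiliar with this pattern, a cross-account bucket policy along these lines is what makes the second point work. The account IDs, bucket name, and role name below are placeholders, not values from this thread; this is only a sketch of the general shape:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowAgentRoleFromOtherAccount",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::AGENT_ACCOUNT_ID:role/gocd-agent-role"
      },
      "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::my-artifact-bucket",
        "arn:aws:s3:::my-artifact-bucket/*"
      ]
    }
  ]
}
```

The agent instances additionally need a matching IAM policy on their instance profile in their own account allowing the same S3 actions on that bucket.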

@ashwanthkumar
Member

Thanks for that. The plugin is built with an assumption that the AWS keys (AWS_SECRET_ACCESS_KEY and AWS_ACCESS_KEY_ID) are available as environment variables on the machines (server and the agents).

If creating an IAM user (with the existing policies) is an option, then you could use those keys as the environment variables. I'm not sure what the exact best practice is - whether to create one IAM user with cross-account access or one IAM user per account. HTH.

@moritzheiber
Author

I'm assuming you're using the AWS SDK with this plugin. Which means it knows about IAM instance roles and how to make use of them (i.e. you don't need to set these variables, they're available to the instance/SDK the code is running on).

And yes, it's absolutely possible to have an instance from one account access resources in another. However, adding a material on the server fails if the bucket is only available in another account. That's what this bug is about :)

If it's still unclear please let me know, I'll try to paint a better picture (literally perhaps). ❤️

@manojlds
Member

Guess we need to add the ability for the Go server to assume role and access bucket in another account - http://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/securitytoken/model/AssumeRoleRequest.html

This will be needed to add the package and to poll for changes.

It is an interesting use case to support, but this may not be something we can develop and test given our setup and priorities. We would, of course, welcome PRs for the same. We'll keep this issue open.
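
For reference, the assume-role flow linked above would look roughly like this with the AWS SDK for Java v1. The role ARN and session name are placeholders, and the surrounding error handling is omitted; this is a sketch of the approach, not the plugin's actual code:

```java
import com.amazonaws.auth.BasicSessionCredentials;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClient;
import com.amazonaws.services.securitytoken.model.AssumeRoleRequest;
import com.amazonaws.services.securitytoken.model.Credentials;

public class CrossAccountS3Sketch {
    // Assume a role in the bucket-owning account, then build an S3 client
    // from the temporary credentials STS hands back.
    public static AmazonS3Client s3ForBucketAccount(String roleArn) {
        AWSSecurityTokenServiceClient sts = new AWSSecurityTokenServiceClient();
        Credentials creds = sts.assumeRole(new AssumeRoleRequest()
                .withRoleArn(roleArn)
                .withRoleSessionName("gocd-s3-poller"))
                .getCredentials();
        return new AmazonS3Client(new BasicSessionCredentials(
                creds.getAccessKeyId(),
                creds.getSecretAccessKey(),
                creds.getSessionToken()));
    }
}
```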

@moritzheiber
Author

Actually, you don't need to assume a role to access external resources. Bucket policies allow for cross-account access without any credentials from the account being accessed:

http://docs.aws.amazon.com/AmazonS3/latest/dev/example-walkthroughs-managing-access-example2.html

The policy mentioned in the example is attached to the EC2 instance running Go.CD using an instance profile, i.e. you already have all the necessary rights to access the bucket on another account.

However, the current method of checking for bucket availability/access in the materials dialog doesn't account for buckets that don't exist in the same account. I don't know how the check works, but it's the one component that prevents me from adding a material that I should actually have access to.

@haimizrael
Contributor

We have a similar scenario where the bucket is in Account A, but Go servers / agents are either in Account A or Account B. We have a bucket policy in Account A that permits the IAM role used by servers in Account B full access to the bucket, but are currently forced to use Access/Secret keys rather than the IAM role, due to the fact that objects uploaded from Account B are inaccessible from Account A.

We believe this can be fixed by adding the ACL "Bucket Owner - full control" at the time of upload, but have not yet tested that. We're hoping to test it later this week and, if it works, will reply here.

@haimizrael
Contributor

@manojlds @ashwanthkumar @moritzheiber
We were able to successfully set up a cross-account scenario (as explained in the comment above).

What we needed to do to make it work is add the statement `putObjectRequest.setCannedAcl(CannedAccessControlList.BucketOwnerFullControl);` in the method `public void put(PutObjectRequest putObjectRequest)` in `S3ArtifactStore`, so that artifacts uploaded from accounts other than the one owning the bucket can be accessed from the account owning the bucket.

Assuming setting this ACL is safe for all users of the plugins (I believe it should be, as it is the default ACL if you only had one account anyway), we can issue a PR with this change.
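
For concreteness, the change described above would look roughly like this. The class wrapper and `client` field here are illustrative scaffolding, not the plugin's real class; only the `setCannedAcl` line is the actual proposed change:

```java
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.CannedAccessControlList;
import com.amazonaws.services.s3.model.PutObjectRequest;

public class S3ArtifactStoreSketch {
    private final AmazonS3Client client;

    public S3ArtifactStoreSketch(AmazonS3Client client) {
        this.client = client;
    }

    public void put(PutObjectRequest putObjectRequest) {
        // Grant the bucket-owning account full control over the uploaded object,
        // so cross-account uploads stay readable by the bucket owner. This matches
        // the effective default when uploader and owner are the same account.
        putObjectRequest.setCannedAcl(CannedAccessControlList.BucketOwnerFullControl);
        client.putObject(putObjectRequest);
    }
}
```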

@moritzheiber
Author

For me this was less a question of policies (I know this works; IAM/bucket policies can do that) and more one of UI/integration. How would I add such a bucket as a material in the UI? It's not currently possible, as (I'm just guessing) the code isn't trying to access the bucket in S3 directly but rather lists all available buckets in the same account.

@haimizrael
Contributor

@moritzheiber Yes, when you define the repository, the Check Connection fails (the Check Connection does list buckets, and only those in the current account are available). Perhaps the check can be performed a different way (perhaps using `client.doesBucketExist()` rather than listing buckets).
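
A sketch of that alternative check, assuming a client and the configured bucket name are in scope: `doesBucketExist` issues a request against the named bucket itself, so it can succeed for a reachable cross-account bucket, unlike `listBuckets`, which only returns buckets owned by the calling account.

```java
import com.amazonaws.services.s3.AmazonS3Client;

public class ConnectionCheckSketch {
    // Probe the one configured bucket directly instead of listing all
    // buckets in the caller's account.
    public static boolean checkConnection(AmazonS3Client client, String bucket) {
        return client.doesBucketExist(bucket);
    }
}
```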

The above does not prevent one from using buckets cross-account, however. Even though the connection check fails, you can still define materials (the Check Package on Add Material does work). As mentioned, we are now using a cross-account setup with these plugins at GCM.

The issue we encountered is that there is seemingly no way to enforce that S3 ACLs be set on new S3 objects using policy. So if someone in Account A uploads an object into a bucket in Account B without setting the ACL at the time of upload, Account B cannot read metadata for or download that object, even though it owns the bucket. This is resolved with the change described above.

@moritzheiber
Author

@haimizrael That's good news, I'll give it another try tomorrow. As far as I can remember, I was never able to add the plugin as a material repository, and thus also never able to reference any S3 sources/destinations in my pipelines.

@manojlds
Member

manojlds commented Sep 7, 2017

Check connection now uses `listObjects`. Hope that addresses most of the requirements for this issue.
