Thank you for your interest in contributing to our project. Whether it's a bug report, new feature, correction, or additional documentation, we greatly value feedback and contributions from our community.
Please read through this document before submitting any issues or pull requests to ensure we have all the necessary information to effectively respond to your bug report or contribution.
We welcome you to use the GitHub issue tracker to report bugs or suggest features.
When filing an issue, please check existing open, or recently closed, issues to make sure somebody else hasn't already reported the issue. Please try to include as much information as you can. Details like these are incredibly useful:
- A reproducible test case or series of steps
- The version of our code being used
- Any modifications you've made relevant to the bug
- Anything unusual about your environment or deployment
- Testing
  - Unit test added (prefer not to modify an existing test; otherwise, it's probably a breaking change)
  - Integration test added (if adding a new pattern or making a significant update to an existing pattern)
- Docs
  - README: README and/or documentation topic updated
  - Design: for significant features, design document added to the `design` folder
- Title and Description
  - Change type: title prefixed with `fix` or `feat` and the module name in parens, which will appear in the changelog
  - Title: use lower case and do not end with a period
  - Breaking?: last paragraph: "BREAKING CHANGE: <describe what changed + link for details>"
  - Issues: indicate issues fixed via "Fixes #xxx" or "Closes #xxx"
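The title conventions above can be checked mechanically before you open the PR. A minimal sketch (the module name and title below are made up for illustration):

```shell
# Check a hypothetical PR title against the conventions above:
# lowercase, change-type prefix with module name in parens, no trailing period.
title="fix(aws-lambda-sqs): grant send permissions to the function role"
if echo "$title" | grep -Eq '^(feat|fix|refactor|chore)\([a-z0-9-]+\): [a-z].*[^.]$'; then
  echo "title ok"
else
  echo "title violates conventions"
fi
```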
If there isn't one already, open an issue describing what you intend to contribute. It's useful to communicate in advance, because sometimes someone is already working in this area, and it may be worth collaborating with them instead of duplicating effort.
If you are proposing a new Solutions Construct, the best way to do this is to create the full README.md document for the Construct in advance (defining all interfaces, the minimal deployment scenario, the architecture diagram, etc.). This will give us all the information we need to provide feedback, and the document will live on as documentation (saving you that effort later). Not every group of CDK L2 objects is a Solutions Construct - you will want to follow our design guidelines.
Once the design is finalized, you can re-purpose this PR for the implementation, or open a new PR to that end.
Good AWS Solutions Constructs have the following characteristics:
- Multi-service: The goal of AWS Solutions Constructs is to weave multiple services together in a well-architected way.
- Minimal (if any) Business Logic: AWS Solutions Constructs should be applicable to all businesses and workloads as much as possible so that they are...
- Reusable across multiple use-cases: We would rather have a small library of Constructs that are wildly popular with customers rather than a huge library of Constructs that customers find irrelevant.
- Well Architected: AWS Solutions Constructs should be secure, reliable, scalable and cost efficient.
Now it's time to work your magic. Here are some guidelines:
- Coding style (abbreviated):
- In general, follow the style of the code around you
- 2 space indentation
- 120 characters wide
- ATX style headings in markdown (e.g. `## H2 heading`)
- Every change requires a unit test
- If you change APIs, make sure to update the module's README file
- Try to maintain a single feature/bugfix per pull request. It's okay to introduce a few housekeeping changes along the way, but try to avoid conflating multiple features. Eventually all of these are going to go into a single commit, so you can use that to frame your scope.
- If your change introduces a new construct, take a look at our aws-apigateway-lambda Construct for an explanation of the L3 patterns we use. Feel free to start your contribution by copying and pasting files from that project, then editing and renaming them as appropriate - it might be easier to get started that way.
- To ensure CDKv2 compatibility of all the Solutions Constructs, please ensure the code meets the following guidelines:
- Import statement for `Construct` is standalone, for example, `import { Construct } from '@aws-cdk/core';` instead of `import { Construct, App, Aws } from '@aws-cdk/core';`
- Check to make sure the usage of `Construct` in the code is also standalone, for example, `export class IotToSqs extends Construct` instead of `export class IotToSqs extends cdk.Construct`
- Core classes are imported from `@aws-cdk/core` only, for example, `import { Duration } from "@aws-cdk/core";` instead of `import { Duration } from "@aws-cdk/core/lib/duration";`
- DO NOT USE deprecated APIs; they will not build in CDKv2. For example, using the `statistic?` attribute of the `@aws-cdk/aws-cloudwatch.Alarm` Construct Props will fail to build in CDKv2
- DO NOT USE experimental modules; they will not build in CDKv2. For example, avoid the experimental L2 constructs for HTTP or WebSocket APIs, as they will fail to build in CDKv2
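These checks can be approximated with a quick grep before opening a PR. A rough sketch (the file contents below are a made-up sample, not project code, and the patterns are illustrative rather than exhaustive):

```shell
# Write a sample source file containing patterns that would break CDKv2
# (this sample is illustrative only).
cat > /tmp/sample.ts <<'EOF'
import { Construct, App } from '@aws-cdk/core';
import { Duration } from "@aws-cdk/core/lib/duration";
export class IotToSqs extends cdk.Construct {}
EOF

# Bundled Construct import (should be standalone)
grep -nE "import \{[^}]*Construct[^}]*,[^}]*\} from '@aws-cdk/core'" /tmp/sample.ts
# Usage of cdk.Construct instead of bare Construct
grep -nE 'extends cdk\.Construct' /tmp/sample.ts
# Deep import from @aws-cdk/core internals
grep -n '@aws-cdk/core/lib/' /tmp/sample.ts
```

Each command prints the offending line and its line number; no output for a pattern means that check passed.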
If you are introducing a new feature, such as a new pattern, make sure to include your coverage report directory path in the `sonar-project.properties` file.
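For example, a hypothetical entry for a new pattern's coverage report might look like the following (the module path below is invented; mirror the existing entries in `sonar-project.properties` for the exact convention this repo uses):

```properties
# Illustrative only - follow the pattern of the existing entries
sonar.javascript.lcov.reportPaths = source/patterns/@aws-solutions-constructs/my-new-construct/coverage/lcov.info
```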
Integration tests compare the CDK synth output of a full stack using a construct to a previously generated expected template. In so doing, they:

- Act as a regression detector, by running `cdk synth` on the integration test and comparing it against the `*.expected.json` file. This highlights how a change affects the synthesized stacks.
- Allow for a way to verify that the stacks are still valid CloudFormation templates, as part of an intrusive change. This is done by running `yarn integ`, which runs `cdk deploy` across all of the integration tests in that package. Remember to set up AWS credentials before doing this.
- Provide a method to validate that constructs deploy successfully. While a successful CloudFormation deployment does not mean that the construct functions correctly, it does protect against problems introduced by drift in the CDK or the services themselves.
If you are working on a new feature that is using previously unused CloudFormation resource types, or involves configuring resource types across services, you need to write integration tests that use these resource types or features.
To the extent possible, include a section (like below) in the integration test file that specifies how the successfully deployed stack can be verified for correctness. Correctness here implies that the resources have been set up correctly. The steps here are usually AWS CLI commands but they need not be.
/*
* Stack verification steps:
* * <step-1>
* * <step-2>
*/
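For instance, a filled-in version for a construct that deploys an SQS queue might read as follows (the commands and placeholders are hypothetical):

```
/*
 * Stack verification steps:
 * * Send a message: aws sqs send-message --queue-url <queue-url-from-stack-output> --message-body "test"
 * * Confirm it arrived: aws sqs receive-message --queue-url <queue-url-from-stack-output>
 */
```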
Examples:
Each integration test generates a `.expected.json` file by actually deploying the construct and extracting the template from the CloudFormation stack. Once you've written your integration test, follow these steps to generate these files:

- In the Docker build container, go to the folder for the construct you are working on (the folder with the `package.json` file). The Docker build container must be initialized and `allow-partial-builds.sh` run.
- Configure the CLI within the Docker container using `aws configure`. You will need an access key with enough privileges to launch everything in your stack and call CloudFormation; admin access is probably the surest way to get this.
- Run `npm run build && npm run integ`. The code will be compiled, each integration test stack will be deployed, the template gathered from CloudFormation as the expected result, and the stack destroyed. You will see `integ.your-test-name.expected.json` files appear in the project for each test.
The standard `npm run build+lint+test` command will compare the `cdk synth` output against the `.expected.json` file. The Solutions Constructs team runs `npm run integ` in each construct periodically to guard against drift and ensure each construct still deploys.

NOTE: Running `npm run integ` will launch a stack including the construct for each integration test, and will delete the stack after collecting the CloudFormation template. While the stack will only be around for a few seconds, during this time your account will be billed for the resources. Some tests may leave behind an S3 bucket; you should check for and clean these up after running this step.
Create a commit with the proposed changes:
- Commit title and message (and PR title and description) must adhere to Conventional Commits:
  - The title must begin with `feat(module): title`, `fix(module): title`, `refactor(module): title` or `chore(module): title`.
  - Title should be lowercase.
  - No period at the end of the title.
- Commit message should describe motivation. Think about your code reviewers and what information they need in order to understand what you did. If it's a big commit (hopefully not), try to provide some good entry points so it will be easier to follow.
- Commit message should indicate which issues are fixed: `fixes #<issue>` or `closes #<issue>`.
- Shout out to collaborators.
- If not obvious (i.e. from unit tests), describe how you verified that your change works.
- If this commit includes breaking changes, they must be listed at the end in the following format (notice how multiple breaking changes should be formatted):

      BREAKING CHANGE: Description of what broke and how to achieve this behavior now
      * **module-name:** Another breaking change
      * **module-name:** Yet another breaking change
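Putting these rules together, a complete commit message might look like this (the module, description, and issue number are invented for illustration):

```shell
# Print a sample commit message that follows the conventions above.
cat <<'EOF'
feat(aws-lambda-sqs): support existing queues

Allows callers to pass an existing queue instead of always creating
a new one, so the construct can integrate with resources managed
outside the stack.

closes #123

BREAKING CHANGE: a new queue is no longer created when an existing
queue is supplied
EOF
```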
- Push to a GitHub fork
- Submit a pull request on GitHub.
- Please follow the PR checklist written above. We trust our contributors to self-check, and this helps that process!
- Discuss review comments and iterate until you get at least one “Approve”. When iterating, push new commits to the same branch. Usually all these are going to be squashed when you merge to master. The commit messages should be hints for you when you finalize your merge commit message.
- Make sure to update the PR title/description if things change. The PR title/description are going to be used as the commit title/message and will appear in the CHANGELOG, so maintain them all the way throughout the process.
- Make sure your PR builds successfully (we have CodeBuild set up to automatically build all PRs)
The CodeBuild runs through the following build steps:
- Content scanning using the Viperlight utility, a security, vulnerability and general risk highlighting tool. The source code for the utility is located here. It uses `.viperlightignore` to override any false alarms.
- Build/validate/package all the constructs in the library
- Scan the CloudFormation templates generated by integration tests using [cfn_nag](https://github.com/stelligent/cfn_nag)
- Once approved and tested, a maintainer will squash-merge to master and will use your PR title/description as the commit message.
GitHub provides additional documentation on forking a repository and creating a pull request.
All `package.json` files in this repo use a stable marker version of `0.0.0`. This means that when you declare dependencies, you should always use `0.0.0`. This makes it easier for us to bump a new version and also reduces the chance of merge conflicts after a new version is released.
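For example, a dependency declaration in a construct's `package.json` would use the marker like this (the package name below is illustrative):

```json
{
  "dependencies": {
    "@aws-solutions-constructs/core": "0.0.0"
  }
}
```

The align-version scripts described below replace `0.0.0` with the real release version at build time.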
Additional scripts that take part in the versioning mechanism:

- `deployment/v2/get-sc-version.js` can be used to obtain the actual version of the repo. You can use it either from JavaScript code via `require('./deployment/v2/get-sc-version')` or from a shell script via `node -p "require('./deployment/v2/get-sc-version')"`.
- `deployment/v2/get-version-placeholder.js` returns `0.0.0` and is used to DRY the version marker.
- `deployment/v2/align-version.sh` and `deployment/v2/align-version.js` are used to align all `package.json` files in the repo to the official version. This script is invoked from `build-patterns.sh`: first before the build process, to replace the marker version (`0.0.0`) with the release version (e.g. `2.13.0`), and then a second time at the end of the build process, to revert the versions from the release version (e.g. `2.13.0`) back to the marker version (`0.0.0`).
$ cd <root-of-aws-solutions-constructs-repo>
$ docker run -u root --rm --net=host -it -v $PWD:$PWD -w $PWD public.ecr.aws/jsii/superchain:1-bookworm-slim
# The build-patterns.sh command can take a long time; be sure to allocate enough resources in the Docker dashboard
# (6 CPUs is good)
docker$ ./deployment/v2/build-patterns.sh
# At this point the container is configured and ready to work on.
# To work on a specific construct, execute the Partial Build steps below
First run a clean Full Build before doing the partial build (the full build installs all the tools required to build the library). Once you've initialized the Docker container by running a full build, you can build and test individual constructs by following the steps below.
$ cd <root-of-aws-solutions-constructs-repo>
$ docker run -u root --rm --net=host -it -v $PWD:$PWD -w $PWD public.ecr.aws/jsii/superchain:1-bookworm-slim
docker$ source ./deployment/v2/allow-partial-builds.sh
docker$ cd my-module
docker$ npm run build+lint+test
This project has adopted the Amazon Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.
If you discover a potential security issue in this project we ask that you notify AWS/Amazon Security via our vulnerability reporting page. Please do not create a public github issue.
See the LICENSE file for our project's licensing. We will ask you to confirm the licensing of your contribution.
We may ask you to sign a Contributor License Agreement (CLA) for larger changes.