sourcefuse/terraform-aws-arc-bootstrap

Repo for managing the Bootstrap (DynamoDB + S3) Terraform module.


Introduction


A backend defines where Terraform stores its state data files. Terraform relies on persistent state to track the resources it manages, and a remote backend allows multiple people to collaborate on the same set of infrastructure resources. This module creates and configures an S3 bucket backend along with a DynamoDB lock table for storing Terraform state files. As a best practice, use this module to create the backend for your Terraform projects.




Prerequisites

Before using this module, ensure you have the following:

  • AWS credentials configured.
  • Terraform installed.
  • A working knowledge of Terraform.
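
For example, one quick way to check the first two items from a shell (the profile name is a placeholder, and the AWS CLI call is optional):

export AWS_PROFILE=your-profile   # or set AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY
aws sts get-caller-identity       # optional: confirm the credentials resolve
terraform -version                # confirm Terraform is installed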

Getting Started

  1. Define the Module

First, define a Terraform module: a directory containing Terraform configuration files. Within this directory, declare input variables in variables.tf and output values in outputs.tf. The following illustrates an example directory structure:

bootstrap/
|-- main.tf
|-- variables.tf
|-- outputs.tf
  2. Define Input Variables

In variables.tf, declare the variables the module requires, and assign their values either as defaults or in a *.tfvars file.
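
A minimal sketch of what this might look like for the two required inputs (names and values here are illustrative):

# variables.tf
variable "bucket_name" {
  description = "Name of the S3 bucket that will store Terraform state"
  type        = string
}

variable "dynamodb_name" {
  description = "Name of the DynamoDB table used for state locking"
  type        = string
}

# terraform.tfvars
bucket_name   = "example-terraform-state-bucket"
dynamodb_name = "example-terraform-lock"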

  3. Use the Module in Your Main Configuration

In your main Terraform configuration file (e.g., main.tf), call the module, specifying its source and version. For example:
module "bootstrap" {
  source  = "sourcefuse/arc-bootstrap/aws"
  # version = "x.x.x"  # we recommend pinning the module to a specific version

  bucket_name              = var.bucket_name
  dynamodb_name            = var.dynamodb_name
}
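
The module also accepts a number of optional inputs (see the Inputs table below). A slightly fuller call might look like this, with illustrative values:

module "bootstrap" {
  source  = "sourcefuse/arc-bootstrap/aws"
  # version = "x.x.x"  # we recommend pinning the module to a specific version

  bucket_name   = var.bucket_name
  dynamodb_name = var.dynamodb_name

  # optional inputs -- example values only
  enable_versioning     = true
  enable_bucket_logging = false

  tags = {
    Environment = "dev"
  }
}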
  4. Output Values

In your configuration's outputs.tf file, you can define output values that expose the module's outputs for use elsewhere. For example:

output "bucket_id" {
  value = module.bootstrap.bucket_id
}

output "bucket_name" {
  value = module.bootstrap.bucket_name
}
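
After an apply, these values can be read back on the command line with terraform output:

terraform output bucket_id
terraform output bucket_name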
  5. Execute Terraform Commands

After defining your main configuration, navigate to the directory containing your Terraform files and run the following commands:

terraform init
terraform apply
  6. Review and Confirm

Terraform will display a plan showing the changes it intends to make. Review the plan and confirm by typing 'yes' when prompted.
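
The confirmation prompt looks roughly like this (the plan summary above it will vary):

Do you want to perform these actions?
  Terraform will perform the actions described above.
  Only 'yes' will be accepted to approve.

  Enter a value: yes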

  7. Migrate Local State to the Backend

After the initial Terraform apply, add a backend block to migrate the Terraform state to the S3 bucket:

terraform {
  required_version = ">= 1.4"

  backend "s3" {
    region         = "us-east-1"
    key            = "terraform-bootstrap/terraform.tfstate"
    bucket         = "terraformbucketexample"
    dynamodb_table = "terraform-lock"
    encrypt        = true
  }
}

Then run terraform init to initialize the new backend (if Terraform reports that the backend configuration has changed, re-run it as terraform init -migrate-state):

Initializing modules...

Initializing the backend...
Do you want to migrate all workspaces to "s3"?
  Both the existing "local" backend and the newly configured "s3" backend
  support workspaces. When migrating between backends, Terraform will copy
  all workspaces (with the same names). THIS WILL OVERWRITE any conflicting
  states in the destination.

  Terraform initialization doesn't currently migrate only select workspaces.
  If you want to migrate a select number of workspaces, you must manually
  pull and push those states.

  If you answer "yes", Terraform will migrate all states. If you answer
  "no", Terraform will abort.

The local state has now been migrated to the new backend, and it is now safe to remove the local terraform.tfstate file.

Requirements

Name Version
terraform >= 1.4, < 2.0.0
aws >= 4.0, < 6.0.0
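
To pin the same constraints in your own root module, the corresponding block might look like the following sketch, based on the table above:

terraform {
  required_version = ">= 1.4, < 2.0.0"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = ">= 4.0, < 6.0.0"
    }
  }
}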

Providers

Name Version
aws 5.58.0

Modules

No modules.

Resources

Name Type
aws_dynamodb_table.terraform_state_lock resource
aws_s3_bucket.private resource
aws_s3_bucket_acl.this resource
aws_s3_bucket_analytics_configuration.private_analytics_config resource
aws_s3_bucket_cors_configuration.this resource
aws_s3_bucket_inventory.inventory resource
aws_s3_bucket_lifecycle_configuration.this resource
aws_s3_bucket_logging.this resource
aws_s3_bucket_ownership_controls.this resource
aws_s3_bucket_public_access_block.public_access_block resource
aws_s3_bucket_server_side_encryption_configuration.example resource
aws_s3_bucket_versioning.this resource
aws_caller_identity.current data source
aws_iam_policy_document.policy data source
aws_partition.current data source

Inputs

Name Description Type Default Required
abort_incomplete_multipart_upload_days Specifies the number of days after initiating a multipart upload when the multipart upload must be completed. number 14 no
bucket_key_enabled Whether or not to use Amazon S3 Bucket Keys for SSE-KMS. bool false no
bucket_name The name of the bucket. string n/a yes
cors_rules List of maps containing rules for Cross-Origin Resource Sharing. list(any) [] no
dynamo_kms_master_key_id The ID of the AWS KMS customer master key (CMK) used to encrypt the DynamoDB table. string null no
dynamodb_hash_key The attribute to use as the hash (partition) key. string "LockID" no
dynamodb_name The name of the table, this needs to be unique within a region. string n/a yes
enable_analytics Enables storage class analytics on the bucket. bool true no
enable_bucket_force_destroy A boolean that indicates all objects (including any locked objects) should be deleted from the bucket so that the bucket can be destroyed without error. bool false no
enable_bucket_inventory If set to true, Bucket Inventory will be enabled. bool false no
enable_bucket_logging Enable bucket activity logging. bool false no
enable_dynamodb_point_in_time_recovery Whether to enable point-in-time recovery - note that it can take up to 10 minutes to enable for new tables. bool true no
enable_s3_public_access_block Bool for toggling whether the s3 public access block resource should be enabled. bool true no
enable_versioning Enable versioning. Once you version-enable a bucket, it can never return to an unversioned state. bool true no
expiration Specifies a period for the objects' expiration. list(any) [{ "expired_object_delete_marker": true }] no
inventory_bucket_format The format for the inventory file. Default is ORC. Options are ORC or CSV. string "ORC" no
kms_master_key_id The AWS KMS master key ID used for the SSE-KMS encryption. string "" no
logging_bucket_name The S3 bucket to send S3 access logs. string "" no
logging_bucket_target_prefix To specify a key prefix for log objects. string "" no
mfa_delete Whether to enable MFA delete on bucket versioning. bool false no
noncurrent_version_expiration Number of days until non-current version of object expires number 365 no
noncurrent_version_transitions Non-current version transition blocks. list(any) [{ "days": 30, "storage_class": "STANDARD_IA" }] no
schedule_frequency The S3 bucket inventory frequency. Defaults to Weekly. Options are 'Weekly' or 'Daily'. string "Weekly" no
sse_algorithm The server-side encryption algorithm to use. Valid values are AES256 and aws:kms string "AES256" no
tags A mapping of tags to assign to the bucket. map(string) { "Module": "terraform-aws-arc-bootstrap", "TerraformManaged": "true" } no
transitions Current version transition blocks list(any) [] no

Outputs

Name Description
bucket_arn Bucket's ARN
bucket_id Bucket's ID
bucket_name Bucket's Name
dynamodb_arn DynamoDB's ARN
dynamodb_id DynamoDB's ID
dynamodb_name DynamoDB's Name

Development

Versioning

When contributing or committing, specify the type of change in your commit message: major, minor, or patch.

For example:

git commit -m "your commit message #major"

The keyword determines how the release version is bumped. If you omit it, the change is treated as a patch by default and the version is bumped accordingly.
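
The other keywords work the same way; the commit messages below are only illustrative:

git commit -m "add an optional bucket inventory input #minor"
git commit -m "fix a typo in a variable description #patch"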

Prerequisites

Configurations

  • Configure pre-commit hooks
pre-commit install
  • Execute pre-commit
pre-commit run -a

Authors

This project is authored by:

  • SourceFuse ARC Team
