
chore: Terraform code change suggestions #4

Closed · wants to merge 4 commits
Conversation

@bryantbiggs commented Apr 25, 2022

  • Adds a standard .gitignore for Terraform projects, plus ignores zip files and retains example.terraform.tfvars for reference (a typical layout is sketched below)
  • Adds a standard .pre-commit-config.yaml; you can read more about it at https://github.com/antonbabenko/pre-commit-terraform. If you want to run these checks on pull requests, you can simply copy this file into the .github/workflows/ directory and it will work as is
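A minimal sketch of such a .gitignore, covering the usual Terraform patterns plus the zip and tfvars handling described above (exact contents may differ from the PR):

```
**/.terraform/*
*.tfstate
*.tfstate.*
crash.log

# packaged Lambda artifacts
*.zip

# local variable definitions stay out of git, but keep the reference copy
*.tfvars
!example.terraform.tfvars
```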

Since I know this is mirroring the CloudFormation equivalent, I avoided comments related to logic and usage. For example, typically I would comment on tagging all resources, more extensive use of variables instead of hardcoded values, etc.

Also, I only looked at the API Gateway service for now; many of these changes would apply to the other areas of the codebase as well.

See additional comments on PR sections below

@@ -1,48 +0,0 @@
{

The files in dev-resources/ have been replaced with a combination of the variables.tf file and terraform.tfvars. terraform.tfvars is used somewhat like a .env file in that it's not checked in and is gitignored, but a reference file has been provided to copy and paste from. Any variable declared in variables.tf can be set in terraform.tfvars and will automatically be picked up when running `terraform plan` or `terraform apply`.
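A minimal sketch of the pattern, using the lambda_s3_bucket input as an example (the description and default here are illustrative, not necessarily those in the PR):

```hcl
# variables.tf - declare the input with a type and description
variable "lambda_s3_bucket" {
  description = "Name of the S3 bucket holding the packaged Lambda function"
  type        = string
  default     = null
}
```

```hcl
# terraform.tfvars (gitignored; copied from example.terraform.tfvars)
lambda_s3_bucket = "my-artifact-bucket"
```

Values set in terraform.tfvars are loaded automatically, so no -var flags are needed on the command line.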

@@ -0,0 +1,78 @@
# Instance

## Local Development & Testing

Just a brief explanation of the local setup and using a *.tfvars file for local dev/testing

Note that this example may create resources which cost money. Run `terraform destroy` when you no longer need these resources.

<!-- BEGINNING OF PRE-COMMIT-TERRAFORM DOCS HOOK -->

These lines are tied to the terraform_docs pre-commit hook. Anything between the opening and closing markers is automatically updated when you run `pre-commit run -a`.

@@ -0,0 +1,147 @@
################################################################################

User preference - you can split up by files, by simple stanzas, or both. This is just the format I use, but please feel free to use whatever makes the most sense for yourselves. The key is to logically group resources and keep supporting resources (data, locals, etc.) as close to where they are used as possible; otherwise, flipping back and forth between multiple files becomes a hassle. Again, completely user preference.
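A minimal sketch of the grouping style described, with a supporting data source kept next to the resource that uses it (the resources here are illustrative):

```hcl
################################################################################
# API Gateway
################################################################################

# Supporting data source lives right next to the resource that consumes it
data "aws_caller_identity" "current" {}

resource "aws_apigatewayv2_api" "lambda" {
  name          = "serverless-lambda-gw-${data.aws_caller_identity.current.account_id}"
  protocol_type = "HTTP"
}
```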

handler   = var.lambda_handler
s3_bucket = var.use_local_function ? null : var.lambda_s3_bucket
s3_key    = var.use_local_function ? null : var.lambda_s3_key
filename  = var.use_local_function ? data.archive_file.function[0].output_path : null

Because the resource's API allows either filename or s3_bucket/s3_key, we add a boolean flag variable to determine which route we intend to use.
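A sketch of that flag variable (the description and default are illustrative):

```hcl
variable "use_local_function" {
  description = "Whether to package the Lambda function from a local source directory instead of pulling an existing object from S3"
  type        = bool
  default     = true
}
```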

@@ -0,0 +1,24 @@
exports.handler = async (event, context) => {

Inlining the file works as well - however, you'll typically see it externalized so that when working and developing locally, you can set up a directory for the language used. For example, in this directory users would set up things like .eslintrc or other local dev tools to support their workflow, and then just import the code into the Terraform section using either file() or templatefile() if variables need to be injected.
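For the zip-based route referenced in the diff (data.archive_file.function), a minimal sketch might look like this, assuming a hypothetical src/ directory for the externalized function code:

```hcl
# Package the externalized source so it can be passed to `filename`
data "archive_file" "function" {
  count = var.use_local_function ? 1 : 0

  type        = "zip"
  source_dir  = "${path.module}/src"
  output_path = "${path.module}/function.zip"
}
```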


resource "aws_apigatewayv2_stage" "lambda" {
api_id = aws_apigatewayv2_api.lambda.id
################################################################################

main.tf in a module typically holds the actual source code for the resources (e.g. https://github.com/bryantbiggs/terraform-aws-eks/blob/master/main.tf)

main.tf in a project, though, will generally hold generic information that is common throughout the project: the remote backend settings, common local variables, and data sources that are used throughout the project rather than in any one specific resource or location.
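A minimal sketch of what a project-level main.tf might contain (the backend bucket, key, and region are placeholders, not taken from this PR):

```hcl
terraform {
  required_version = ">= 1.0"

  # Remote backend settings shared by the whole project
  backend "s3" {
    bucket = "my-terraform-state-bucket"
    key    = "apigw-lambda-svc/terraform.tfstate"
    region = "us-east-1"
  }
}

# Data sources and locals used across the project, not tied to one resource
data "aws_caller_identity" "current" {}

locals {
  account_id = data.aws_caller_identity.current.account_id
}
```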

@@ -1,9 +1,11 @@
# tflint-ignore: terraform_naming_convention - matching case of CloudFormation equivalent for testing purposes

These are the ignore rules I mentioned - without them, these lines would show up as tflint errors. Adding a bit of an explanation helps our future selves, as well as users, know why we are deviating from the norm.

@@ -0,0 +1,59 @@
variable "aws_region" {

Correct me if I am wrong, but I believe the variables defined here should align 1:1 with the variables in the schema.yaml file, correct? I did not update them all completely as I wasn't sure

service_name = "terraform-test-service"

codebuild_deployments = {
  foo = {}

In the code where this variable is used, I've added try(each.value.name, each.key), which means: try to use the name attribute if it's present (see the next block, where bar will be used via this route); if it's not present, fall back to the map's key, which here is foo. A sketch of the pattern is below.
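A sketch of the pattern, adapted into a locals block for brevity (the variable shape and values are illustrative):

```hcl
variable "codebuild_deployments" {
  type = any
  default = {
    foo = {}                           # no name attribute -> the key "foo" is used
    bar = { name = "custom-bar-name" } # explicit name attribute wins
  }
}

# Resolve each deployment's effective name: prefer `name`, fall back to the map key
locals {
  deployment_names = {
    for key, value in var.codebuild_deployments :
    key => try(value.name, key)
  }
}
```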

################################################################################

locals {
environment_account_ids = split(",", var.environment_account_ids)

A good example of local variable usage:

  1. It's close to where it's used, so you don't have to go far to understand what's what
  2. It handles some transformation/processing before the value is used, since we cannot pass it in the form we would have liked (here we split a comma-separated string to get a list of strings instead of simply passing in a list of strings, since CloudFormation doesn't allow that, I *assume*); a hypothetical usage is sketched below
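A hypothetical illustration of consuming the split list (not necessarily how this project uses it, and the aws_s3_bucket.function reference is illustrative), e.g. granting the environment accounts read access in a bucket policy document:

```hcl
data "aws_iam_policy_document" "function_bucket" {
  statement {
    sid     = "CrossAccountRead"
    actions = ["s3:GetObject"]

    principals {
      type        = "AWS"
      identifiers = [for id in local.environment_account_ids : "arn:aws:iam::${id}:root"]
    }

    resources = ["${aws_s3_bucket.function.arn}/*"]
  }
}
```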


locals {
# Lookup map that converts shorthand syntax to expanded form CodePipeline accepts
buildspec_runtime = {

I assume this map + lookup is used because the Lambda runtime needs converting to its CodeBuild equivalent?
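If that assumption holds, the shape would be something along these lines (the runtime keys, versions, and the lambda_runtime variable are purely illustrative):

```hcl
variable "lambda_runtime" {
  type    = string
  default = "nodejs14.x"
}

locals {
  # Hypothetical shorthand-to-expanded lookup
  buildspec_runtime = {
    "nodejs14.x" = { nodejs = "14" }
    "python3.9"  = { python = "3.9" }
  }

  runtime_versions = lookup(local.buildspec_runtime, var.lambda_runtime, {})
}
```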


resource "aws_s3_bucket_policy" "function_bucket_policy" {

A few things on naming:

  1. We don't need to re-state what type of resource it is, so function_bucket_policy becomes just function_bucket since the resource type is already aws_s3_bucket_policy. It avoids that redundancy
  2. The actual name should be short and descriptive. Here I made them quite generic to keep them short, but when you have quite a few of the same resource, being descriptive is key while avoiding verbosity. For example, a good name might be get_orders_by_id or health_check, etc. (see the before/after sketch after this list)
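A before/after sketch of the first point (the arguments are illustrative):

```hcl
# Before: the name repeats the resource type
resource "aws_s3_bucket_policy" "function_bucket_policy" {
  bucket = aws_s3_bucket.function.id
  policy = data.aws_iam_policy_document.function_bucket.json
}

# After: short and non-redundant
resource "aws_s3_bucket_policy" "function_bucket" {
  bucket = aws_s3_bucket.function.id
  policy = data.aws_iam_policy_document.function_bucket.json
}
```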


resource "aws_s3_bucket_policy" "function_bucket_policy" {
  count  = var.pipeline.inputs.environment_account_ids != "" ? 1 : 0
  policy = data.aws_iam_policy_document.function_bucket_policy_document.json

When using a data source, I usually name the data source the same as the resource it feeds. The only exception is assume role policy documents; in that case I simply append _assume to the end of the name. This keeps things logically organized.
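A sketch of the exception (the role itself is illustrative):

```hcl
# Assume role policy documents get an `_assume` suffix
data "aws_iam_policy_document" "lambda_assume" {
  statement {
    actions = ["sts:AssumeRole"]

    principals {
      type        = "Service"
      identifiers = ["lambda.amazonaws.com"]
    }
  }
}

resource "aws_iam_role" "lambda" {
  name               = "lambda"
  assume_role_policy = data.aws_iam_policy_document.lambda_assume.json
}
```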



resource "aws_codebuild_project" "deploy_project" {
for_each = { for instance in var.service_instances : instance.name => instance }

Due to some shortcomings in how for_each is evaluated, it's always best to use static keys for the maps being iterated over. This will work fine provided the name is always a static string, but if someone were to use aws_<some_resource>.foo.id for the name attribute here, it would throw an ugly, vague error like the one in hashicorp/terraform#4149. A sketch of the distinction is below.
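A sketch of the distinction, using terraform_data so the example stays self-contained (the variable shape is illustrative):

```hcl
variable "service_instances" {
  type = list(object({
    name = string
  }))
  default = [
    { name = "dev" },
    { name = "prod" },
  ]
}

# Safe: the keys come from static strings in configuration,
# so Terraform can compute the for_each keys at plan time
resource "terraform_data" "deploy_project" {
  for_each = { for instance in var.service_instances : instance.name => instance }

  input = each.value.name
}

# Risky: if `name` were populated from another resource's attribute
# (e.g. aws_s3_bucket.foo.id), the keys would be unknown until apply and
# Terraform would fail with a vague "Invalid for_each argument" error
```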

mikewrighton added a commit that referenced this pull request Apr 29, 2022
Add apigw-lambda-svc.
This pull request was closed.