When working in an organization, developers' permissions are usually limited. A popular approach is to use an AWS IAM role to delegate access to resources across AWS accounts: you share resources in one account with users in a different account. By setting up cross-account access this way, you don't have to create individual IAM users in each account, and users don't have to sign out of one account and sign in to another in order to access resources in different AWS accounts. This project demonstrates how to develop a coffee shop RESTful API using API Gateway, Lambda and DynamoDB, and how to deploy the API with Terraform (https://www.terraform.io/) and the Serverless Framework (https://www.serverless.com/).
- Configure your AWS profile through the AWS CLI.
aws configure
Your AWS default profile should be granted permission to assume the role `arn:aws:iam::${accountId}:role/${IAM_ROLE}`. This role should have managed policies attached for S3 (deployment), API Gateway (Lambda execution), Lambda (Lambda execution), DynamoDB (Lambda execution) and CloudWatch (Lambda execution), plus an inline `iam:AttachRolePolicy` policy to attach the managed policies to the Lambda execution role. If you encounter an error such as `InvalidClientTokenId: The security token included in the request is invalid.`, the access key and secret access key of your default AWS profile are probably incorrect. - You can also set the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in the terminal/command prompt instead of using your default AWS profile.
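A profile that assumes such a role can also be configured in `~/.aws/config`; the profile name, account id and role name below are placeholders, not values from this project:

```
# ~/.aws/config -- placeholder account id and role name
[profile coffee-deploy]
role_arn       = arn:aws:iam::123456789012:role/YourDeployRole
source_profile = default
region         = us-east-1
```

With this in place, the AWS CLI and SDKs call STS to obtain temporary credentials for the role whenever the `coffee-deploy` profile is used.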
- ESLint (https://marketplace.visualstudio.com/items?itemName=dbaeumer.vscode-eslint). This project uses ESLint for code quality checking. Please install this extension in your code editor.
- AWS CLI : https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html
- Docker (https://www.docker.com/get-started) : This project uses Docker to run a local DynamoDB image.
- Terraform : https://learn.hashicorp.com/tutorials/terraform/install-cli?in=terraform/aws-get-started
- Serverless : https://www.serverless.com/framework/docs/getting-started
- `src/api/**` : Lambda function code.
- `src/authorizer/**` : Lambda authorizer code.
- `src/layer/**/**` : Custom node modules that are used as Lambda layers. Any custom node module directory should include a `package.json` and a default export object in `index.js`. You can create a module with `npm run create-layer-module ${path_to_your_module}` or remove any module with `npm run remove-layer-module ${path_to_your_module}`. E.g. if you run `npm run create-layer-module mymodule/m1 mymodule/m2`, `src/layer/mymodule/m1` and `src/layer/mymodule/m2` would be created.
- `layer/nodejs/node_modules` : Generated Lambda layer which would be deployed. This folder should not be committed.
- `terraform/**` : Terraform deployment directory. Run `terraform init` to initialize.
- `test/**/**` : Lambda code test cases.
- `coverage/**` : Generated test coverage and reports. This folder should not be committed.
- Clone project repository.
git clone https://github.com/johnnyn2/aws-apigw-lambda-dynamodb-restv2-sls-tf.git
- Locate the project.
cd aws-apigw-lambda-dynamodb-restv2-sls-tf
- Check out a development branch.
git checkout -b develop_${YOUR_NAME}
- Install dependencies.
npm install
- Start local dynamodb.
docker-compose up
- Create tables.
npm run create-table-local
- Start local development using serverless framework.
npm run dev
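Under the hood, a script like `create-table-local` typically points the AWS SDK at the local DynamoDB endpoint started by `docker-compose`. The sketch below assumes the default DynamoDB Local port 8000 and an invented `coffee` table; the real table definitions live in this project's scripts:

```javascript
// Hypothetical table definition for a local "coffee" table.
// Table name, key schema and billing mode are assumptions.
const coffeeTableParams = {
  TableName: 'coffee',
  AttributeDefinitions: [{ AttributeName: 'origin', AttributeType: 'S' }],
  KeySchema: [{ AttributeName: 'origin', KeyType: 'HASH' }],
  BillingMode: 'PAY_PER_REQUEST',
};

// A create-table script would pass these params to a client bound to
// the local endpoint, e.g. with aws-sdk v2:
//   new AWS.DynamoDB({ endpoint: 'http://localhost:8000', region: 'local' })
//     .createTable(coffeeTableParams).promise();
```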
- This project uses ESLint for code quality checking, Prettier for code formatting and commitlint (https://github.com/conventional-changelog/commitlint) for commit message linting. Please follow the rules from Conventional Commits 1.0.0 (https://www.conventionalcommits.org/en/v1.0.0/) to write your commit messages.
- Git hooks - husky and lint-staged: you can only successfully commit to the repository if the commit passes code linting, commit linting and the test cases. Error messages will be shown in the terminal/command prompt if your commit fails any check.
- Create a merge request targeting the develop branch, add the "WIP" tag and assign it to yourself. Link any open issue that is going to be solved by your merge request in the merge request description section.
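For reference, commit messages that satisfy Conventional Commits 1.0.0 follow the `type(scope): description` shape; these examples are invented for illustration:

```
feat(api): add GetCoffeeByOrigin endpoint
fix(layer): handle empty origin in coffee lookup
docs(readme): document local DynamoDB setup
```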
- Remove the "WIP" tag and assign the merge request to any maintainer when the change is ready.
- GetCoffeeByOrigin and PutCoffee APIs and their associated test cases are provided as demos. You can add further APIs as you wish.
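For orientation, a GetCoffeeByOrigin handler in `src/api` might look roughly like the sketch below. The table name, key and the injected DocumentClient are assumptions for illustration, not this project's exact code; injecting the client keeps the handler unit-testable without a real DynamoDB:

```javascript
// Hypothetical shape of a GetCoffeeByOrigin handler (src/api).
// docClient is an aws-sdk v2 DocumentClient-like object injected so
// tests can pass a stub; table/key names are assumptions.
const getCoffeeByOrigin = (docClient) => async (event) => {
  const { origin } = event.pathParameters;
  const result = await docClient
    .get({ TableName: 'coffee', Key: { origin } })
    .promise();
  if (!result.Item) {
    return { statusCode: 404, body: JSON.stringify({ message: 'coffee not found' }) };
  }
  return { statusCode: 200, body: JSON.stringify(result.Item) };
};
```

In a jest test, `docClient` can be replaced by a stub whose `get(...).promise()` resolves to a canned `Item`.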
- Testing framework : jest (https://jestjs.io/) - Testing with DynamoDB (https://jestjs.io/docs/dynamodb)
- Run all tests
npm run test
- Run an individual test
npm run test -- ${path_of_your_test_file}
- Under the /terraform directory, initialize a working directory containing Terraform configuration files.
terraform init
- Create Terraform speculative plan.
npm run tf-plan:${stage}
- If step 2 succeeds, you can execute the actions proposed in the Terraform plan.
npm run tf-apply:${stage}
Options:
- stage : dev | staging | prod
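To give a feel for what the `terraform/` directory contains, one Lambda function might be declared roughly as follows; the resource names, handler path, runtime and references here are assumptions, not this project's actual configuration:

```
# Hypothetical sketch -- names, handler and paths are assumptions.
resource "aws_lambda_function" "get_coffee_by_origin" {
  function_name = "get-coffee-by-origin-${var.stage}"
  handler       = "src/api/getCoffeeByOrigin.handler"
  runtime       = "nodejs14.x"
  role          = aws_iam_role.lambda_exec.arn
  filename      = data.archive_file.lambda_zip.output_path
}
```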
- Under root directory, deploy the API.
npm run sls-deploy:${stage}
Options:
- stage : dev | staging | prod
- You can use either Terraform or the Serverless Framework to deploy the resources. However, only the Serverless Framework supports Lambda development in a local environment.
- Why put environment variables in scripts such as deploy.js and offline.js rather than in a .env/.sh/.bat file?
  - Because the only way for the Lambda function to access DynamoDB in the cloud is to source temporary credentials or a profile from the role. We can only achieve this through AWS STS, either by setting the temporary credentials as environment variables in the Lambda execution environment (i.e. the Serverless Framework environment), or by setting a global source profile (https://docs.aws.amazon.com/sdkref/latest/guide/setting-global-source_profile.html) together with the environment variable AWS_SDK_LOAD_CONFIG=1 when we need to invoke our API locally. However, it is more annoying for every developer to configure an additional AWS profile for credential sourcing than to execute a Node.js script.
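The STS pattern described above might be implemented by a small helper inside a script such as deploy.js; the function name and shape below are assumptions, but the `Credentials` field names match what STS AssumeRole returns:

```javascript
// Hypothetical helper: map the Credentials object returned by an AWS
// STS AssumeRole call to the environment variables the AWS SDK reads.
function credentialsToEnv({ AccessKeyId, SecretAccessKey, SessionToken }) {
  return {
    AWS_ACCESS_KEY_ID: AccessKeyId,
    AWS_SECRET_ACCESS_KEY: SecretAccessKey,
    AWS_SESSION_TOKEN: SessionToken,
  };
}

// A deploy script would call sts.assumeRole(...), then expose the
// temporary credentials to child processes with something like:
//   Object.assign(process.env, credentialsToEnv(response.Credentials));
```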