If you find this repo useful as a reference, please give it a star. Thanks!
We will deploy a React application on AWS Elastic Kubernetes Service (EKS), using GitHub Actions for the CI/CD pipeline and Terraform to provision the infrastructure. An EC2 instance will serve as the self-hosted runner for GitHub Actions. We will integrate SonarQube for code analysis and Trivy to scan our Docker images, and we will integrate Slack to receive build/deployment notifications.
Youtube Video Tutorial: https://www.youtube.com/live/HkGMxxjBt8g?si=PzSJEGwPrJmn8yli
You should have basic knowledge of AWS services, Docker, Kubernetes, and GitHub Actions.
1. Create an AWS EC2 Instance and an IAM Role
2. Add a Self Hosted Runner To AWS EC2
3. Docker Installation and Running SonarQube Container
4. Integrate SonarQube with GitHub Actions
5. Installation of tools (Java JDK, Terraform, Trivy, Kubectl, Node.js, NPM, AWS CLI)
6. Provision AWS EKS With Terraform
7. Dockerhub and Trivy Image Scan Setup
8. Deploy Application(image) to AWS EKS
9. Integrate Slack Notifications
10. Running the Final/Complete GitHub Actions Workflow
11. Delete the infrastructure (To Avoid Extra Billing, if you are just using it for learning Purposes)
Navigate to the AWS Console.
Search for IAM and open it.
Click Roles.
Click Create role.
Select AWS service as the trusted entity, then under the use case choose EC2 and click Next.
On the permissions page, use the search field to find and add permissions policies.
Add these four policies:
- AmazonEC2FullAccess
- AmazonS3FullAccess
- AmazonEKSClusterPolicy
- AdministratorAccess
Click Next, then click the Role name field.
Type cicd-jenkins
Click Create role.
Note: we will attach this IAM role during AWS EC2 instance creation.
To launch an AWS EC2 instance running Ubuntu 22.04 from the AWS Management Console, sign in to your AWS account and open the EC2 dashboard. Click "Launch Instances", select "Ubuntu 22.04" as the AMI, and choose "t2.medium" as the instance type. Configure instance details, storage, tags, and security group settings according to your requirements, and attach the previously created IAM role under the advanced details. Review the settings, create or select a key pair for secure access, and launch the instance. Once it is running, connect to it over SSH with that key pair. (See the image below.)
Now Go to GitHub Repository and click on Settings -> Actions -> Runners
Click on New self-hosted runner
Now select Linux and Architecture X64
Use the below commands to add a self-hosted runner
Note: the commands in the picture are tied to my account; use the commands shown on your own GitHub self-hosted runner page.
Now SSH into your AWS EC2 instance.
Then paste and run these commands.
mkdir actions-runner && cd actions-runner
The command "mkdir actions-runner && cd actions-runner" creates a new directory named "actions-runner" in the current working directory and then immediately switches into it, so the commands that follow run inside the new directory.
Download the latest runner package
curl -o actions-runner-linux-x64-2.311.0.tar.gz -L https://github.com/actions/runner/releases/download/v2.311.0/actions-runner-linux-x64-2.311.0.tar.gz
Validate the hash.
echo "29fc8cf2dab4c195bb147384e7e2c94cfd4d4022c793b346a6175435265aa278 actions-runner-linux-x64-2.311.0.tar.gz" | shasum -a 256 -c
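The checksum line above can look opaque the first time you see it: it feeds "<expected-hash>  <filename>" into the checker, which recomputes the file's SHA-256 hash and compares the two. A minimal local sketch of the same pattern, using a throwaway file and coreutils' sha256sum (GitHub's snippet uses shasum, which behaves the same way):

```shell
# Demo of "pipe expected hash into a checker" verification,
# using a throwaway file instead of the real runner tarball.
tmpdir=$(mktemp -d)
cd "$tmpdir"
echo "hello runner" > demo.tar.gz                 # stand-in for the downloaded package
sum=$(sha256sum demo.tar.gz | awk '{print $1}')   # compute the expected checksum
# Verify: prints "demo.tar.gz: OK" and exits 0 on a match, non-zero on a mismatch
echo "$sum  demo.tar.gz" | sha256sum -c
```

A non-zero exit on mismatch is what protects you from a corrupted or tampered download.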
Now Extract the installer
tar xzf ./actions-runner-linux-x64-2.311.0.tar.gz
Create the runner and start the configuration experience
./config.sh --url https://github.com/codewithmuh/react-aws-eks-github-actions --token AMFXNTP3IVE6IAZSWO3ZEGDFT2QV6
If you provide multiple labels, separate them with commas.
The last step, run it!
./run.sh
Let's stop the runner for now.
Press Ctrl+C to stop it.
Connect to your instance using SSH or PuTTY (whichever method you use). If you are already connected, skip this.
Run these commands
sudo apt-get update
sudo apt install docker.io -y
sudo usermod -aG docker ubuntu
newgrp docker
sudo chmod 777 /var/run/docker.sock  # very permissive; fine for this demo, but on a real host rely on the docker group instead
Now we will pull the SonarQube Docker image and run the SonarQube container.
Note: make sure to open port 9000 in your instance's security group.
docker run -d --name sonar -p 9000:9000 sonarqube:lts-community
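SonarQube takes a minute or two to initialize before the web UI on port 9000 starts answering. If you want to wait for it from the shell rather than refreshing the browser, a small polling helper works; wait_for_url here is a hypothetical convenience function, not part of the tutorial, and it assumes curl is installed:

```shell
# Hypothetical helper: poll a URL until it answers, or give up after N tries.
wait_for_url() {
  url=$1
  tries=${2:-30}     # default: 30 attempts, 2 seconds apart
  i=0
  until curl -sf -o /dev/null "$url"; do
    i=$((i + 1))
    if [ "$i" -ge "$tries" ]; then
      echo "gave up waiting for $url"
      return 1
    fi
    sleep 2
  done
  echo "$url is up"
}

# Usage on the EC2 instance (substitute your instance's public IP):
# wait_for_url "http://<EC2-PUBLIC-IP>:9000" && echo "SonarQube is ready"
```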
Now copy the public IP address of your EC2 instance and open it in a browser on port 9000:
<EC2-PUBLIC-IP:9000>
Now log in with the default credentials:
login: admin
password: admin
Now update your SonarQube password.
This is the Sonarqube dashboard.
Integrating SonarQube with GitHub Actions lets you automatically analyze your code for quality and security as part of your continuous integration pipeline. We already have SonarQube up and running.
Now, on the SonarQube dashboard, click on Manually.
On the next page, provide the name of your project and a branch name, then click the Set Up button.
On the next page, click on "With GitHub Actions".
This will provide an overview of the project and provide some instructions to integrate.
Now let's open your GitHub repository.
Click on Settings. (If you are using my repo, make sure you have forked it.)
Click on Secrets and variables, then click on Actions.
It will open a page; click on New repository secret.
Now Go to your Sonarqube dashboard
Copy SONAR_TOKEN and click on Generate Token
Click on Generate
Let's copy the Token and add it to GitHub secrets
Now go back to GitHub, paste the name for the secret, and paste the token:
Name: SONAR_TOKEN
Secret: Paste Your Token and click on Add secret
Now go back to the Sonarqube Dashboard
Copy the Name and Value
Now go to GitHub, paste them the same way, and click on Add secret.
Our SonarQube secrets are added and you can see them.
Go to the SonarQube dashboard and click on Continue.
Now create the workflow for your project. In my case I am using React.js, which is why I select Other.
It now generates a workflow for the project.
Note: Make sure to use your files for this Section.
Go back to GitHub. click on Add file and then create a new file
Go back to the SonarQube dashboard and copy the file name and content.
Add it in GitHub like this. (See the image.)
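For reference, the file SonarQube asks you to create here is a sonar-project.properties at the repository root. A minimal sketch, assuming the project key you chose on the dashboard was react-aws-eks-github-actions; copy the exact key from your own SonarQube page rather than this guess:

```properties
# Must match the project key shown on your SonarQube dashboard
sonar.projectKey=react-aws-eks-github-actions
```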
Let's add our workflow
To do that click on Add file and then click on Create a new file
Here is the file name
.github/workflows/sonar.yml #you can use any name I am using sonar.yml
Copy content and add it to the file
name: Sonar Code Review Workflow
on:
  push:
    branches:
      - main
jobs:
  build:
    name: Build
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0  # Shallow clones should be disabled for a better relevancy of analysis
      - uses: sonarsource/sonarqube-scan-action@master
        env:
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
          SONAR_HOST_URL: ${{ secrets.SONAR_HOST_URL }}
      # If you wish to fail your job when the Quality Gate is red, uncomment the
      # following lines. This would typically be used to fail a deployment.
      # - uses: sonarsource/sonarqube-quality-gate-action@master
      #   timeout-minutes: 5
      #   env:
      #     SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
Click on commit changes
Now the workflow is created.
Start the GitHub Actions runner again on the EC2 instance by running these commands:
cd actions-runner
./run.sh
Click on Actions now
The workflow has started automatically.
Let's click on Build and see what steps are involved.
Click on Run sonarsource; you can do this after the build completes.
Build complete.
Go to the SonarQube dashboard, click on Projects, and you can see the analysis.
If you want to see the full report, click on Issues.
Use the script below to automate the installation of these tools.
Create the script on your AWS EC2 instance:
vim run.sh
Copy in the content given below:
#!/bin/bash
sudo apt update -y
sudo mkdir -p /etc/apt/keyrings
sudo wget -O /etc/apt/keyrings/adoptium.asc https://packages.adoptium.net/artifactory/api/gpg/key/public
echo "deb [signed-by=/etc/apt/keyrings/adoptium.asc] https://packages.adoptium.net/artifactory/deb $(awk -F= '/^VERSION_CODENAME/{print$2}' /etc/os-release) main" | sudo tee /etc/apt/sources.list.d/adoptium.list
sudo apt update -y
sudo apt install temurin-17-jdk -y
/usr/bin/java --version
# Install Trivy
sudo apt-get install wget apt-transport-https gnupg lsb-release -y
wget -qO - https://aquasecurity.github.io/trivy-repo/deb/public.key | gpg --dearmor | sudo tee /usr/share/keyrings/trivy.gpg > /dev/null
echo "deb [signed-by=/usr/share/keyrings/trivy.gpg] https://aquasecurity.github.io/trivy-repo/deb $(lsb_release -sc) main" | sudo tee -a /etc/apt/sources.list.d/trivy.list
sudo apt-get update
sudo apt-get install trivy -y
# Install Terraform
sudo apt install wget -y
wget -O- https://apt.releases.hashicorp.com/gpg | sudo gpg --dearmor -o /usr/share/keyrings/hashicorp-archive-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/hashicorp-archive-keyring.gpg] https://apt.releases.hashicorp.com $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/hashicorp.list
sudo apt update && sudo apt install terraform -y
# Install AWS CLI
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
sudo apt-get install unzip -y
unzip awscliv2.zip
sudo ./aws/install
# Install kubectl
sudo apt update
sudo apt install curl -y
curl -LO https://dl.k8s.io/release/$(curl -L -s https://dl.k8s.io/release/stable.txt)/bin/linux/amd64/kubectl
sudo install -o root -g root -m 0755 kubectl /usr/local/bin/kubectl
kubectl version --client
# Install Node.js 16 and npm
curl -fsSL https://deb.nodesource.com/gpgkey/nodesource.gpg.key | sudo gpg --dearmor -o /usr/share/keyrings/nodesource-archive-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/nodesource-archive-keyring.gpg] https://deb.nodesource.com/node_16.x $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/nodesource.list # use your release codename instead of hard-coding focal
sudo apt update
sudo apt install -y nodejs
Now Run this script:
chmod +x run.sh
./run.sh
Now check whether these tools are installed by checking their versions:
kubectl version --client
aws --version
java --version
trivy --version
terraform --version
node -v
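The version checks above can also be rolled into a single pass. This check_tools helper is a hypothetical convenience, not something the tutorial requires:

```shell
# Hypothetical helper: report which of the required tools are on PATH.
check_tools() {
  rc=0
  for tool in "$@"; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "ok: $tool"
    else
      echo "MISSING: $tool"
      rc=1
    fi
  done
  return $rc   # non-zero exit status if anything was missing
}

# Usage:
# check_tools kubectl aws java trivy terraform node
```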
Note: before starting part 06, make sure Terraform and the AWS CLI are installed and your AWS credentials are configured on the instance. You can see my article to get the AWS and Terraform installation and configuration done.
Now let's clone the repo:
git clone https://github.com/codewithmuh/react-aws-eks-github-actions.git
cd react-aws-eks-github-actions
cd terraform-eks
This changes your directory to the terraform-eks files.
Now change the S3 bucket in the backend file to your own bucket. (You can create an S3 bucket in AWS S3.)
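The backend file referred to here tells Terraform where to store its state. A sketch of what such a file typically looks like; the bucket name, key, and region below are placeholders, and the actual file in the repo may differ:

```hcl
terraform {
  backend "s3" {
    bucket = "your-terraform-state-bucket"  # replace with the S3 bucket you created
    key    = "eks/terraform.tfstate"        # path of the state file inside the bucket
    region = "us-west-1"                    # region where the bucket lives
  }
}
```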
Now initialize Terraform.
terraform init
Now validate the configurations and syntax of all files.
terraform validate
Now Plan and apply your infrastructure.
terraform plan
terraform apply
It can take up to 10 minutes to create your AWS EKS cluster.
You can check progress in the AWS EKS console.
Also check the node group EC2 instances in the EC2 dashboard.
Now you have to create a personal access token for your Docker Hub account.
Go to Docker Hub and click on your profile --> Account settings --> Security --> New access token.
Copy this token.
Add the token to your GitHub Actions secrets as DOCKERHUB_TOKEN.
Also add another secret, DOCKERHUB_USERNAME, containing your Docker Hub username.
Now create a new workflow with the name build.yaml. Make sure to replace the username and image name with yours.
name: Code Build Workflow
on:
  workflow_run:
    workflows:
      - Sonar Code Review Workflow
    types:
      - completed
jobs:
  build:
    name: Build
    runs-on: self-hosted
    steps:
      - name: Checkout code
        uses: actions/checkout@v2
      - name: Docker build and push
        run: |
          docker build -t react-aws-eks-github-actions .
          docker tag react-aws-eks-github-actions codewithmuh/react-aws-eks-github-actions:latest
          docker login -u ${{ secrets.DOCKERHUB_USERNAME }} -p ${{ secrets.DOCKERHUB_TOKEN }}
          docker push codewithmuh/react-aws-eks-github-actions:latest
        env:
          DOCKER_CLI_ACI: 1
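For context on what the docker build step is building: the repository ships its own Dockerfile, but a typical multi-stage Dockerfile for a React app served on port 3000 (matching the docker run -p 3000:3000 test later on) looks roughly like this sketch:

```dockerfile
# Illustrative sketch only -- use the Dockerfile that ships with the repo.
# Stage 1: build the static React bundle
FROM node:16-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build

# Stage 2: serve the built bundle on port 3000
FROM node:16-alpine
WORKDIR /app
RUN npm install -g serve          # tiny static file server
COPY --from=build /app/build ./build
EXPOSE 3000
CMD ["serve", "-s", "build", "-l", "3000"]
```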
Now you can check that the image was pushed to your Docker Hub account.
Next, add another workflow for the Docker image scan, named trivy.yml. Use the content below, but make sure to replace the image with your own.
name: Trivy Image Scan Workflow
on:
  workflow_run:
    workflows:
      - Code Build Workflow
    types:
      - completed
jobs:
  build:
    name: Docker Image Scan
    runs-on: self-hosted
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v2
      - name: Pull the Docker image
        run: docker pull codewithmuh/react-aws-eks-github-actions:latest
      - name: Trivy image scan
        run: trivy image codewithmuh/react-aws-eks-github-actions:latest
Here is the output of the build.
Now add these lines to the build.yaml steps, so we can test the image on our AWS EC2 instance.
      - name: Pull the Docker image on AWS EC2 for testing
        run: docker pull codewithmuh/react-aws-eks-github-actions:latest
      - name: Stop and remove existing container
        run: |
          docker stop react-aws-eks-github-actions || true
          docker rm react-aws-eks-github-actions || true
      - name: Run the container on AWS EC2 for testing
        run: docker run -d --name react-aws-eks-github-actions -p 3000:3000 codewithmuh/react-aws-eks-github-actions:latest
When the build is completed, visit the website in your browser.
Note: make sure port 3000 is open in your EC2 security group.
"Your_EC2_IP:3000"
If it's not working, try to pull the image on your own system and check for errors. Fix them, then run the GitHub Actions build again.
Now create our final deploy.yml file to deploy to AWS EKS.
And paste this content there. (Do not commit yet; commit it after part 09.)
Note: make sure to replace the cluster name and region name with your own.
name: Deploy To EKS
on:
  workflow_run:
    workflows:
      - Code Build Workflow
    types:
      - completed
jobs:
  build:
    name: Deploy
    runs-on: self-hosted
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v2
      - name: Pull the Docker image
        run: docker pull codewithmuh/react-aws-eks-github-actions:latest
      - name: Update kubeconfig
        run: aws eks --region us-west-1 update-kubeconfig --name EKS_cluster_codewithmuh
      - name: Deploy to EKS
        run: kubectl apply -f deployment-service.yml
This updates the kubeconfig so that kubectl can talk to the Amazon EKS cluster with your cluster name in your region, and then deploys the Kubernetes resources defined in the deployment-service.yml file to the cluster using kubectl apply.
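The repository provides its own deployment-service.yml. For orientation, here is a minimal sketch of what such a manifest looks like; the image and container port match the earlier steps, and every other value is illustrative:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: react-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: react-app
  template:
    metadata:
      labels:
        app: react-app
    spec:
      containers:
        - name: react-app
          image: codewithmuh/react-aws-eks-github-actions:latest
          ports:
            - containerPort: 3000
---
apiVersion: v1
kind: Service
metadata:
  name: react-app-service
spec:
  type: LoadBalancer          # provisions an external load balancer on EKS
  selector:
    app: react-app
  ports:
    - port: 80
      targetPort: 3000
```

A Service of type LoadBalancer is what gives the deployment the external IP you will open in the browser at the end.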
Now go to Slack and create a new channel for notifications.
Click on your Slack workspace name --> Settings & administration --> Manage apps.
It will open a new tab; select Build.
Now Click on Create an app
Select from scratch
Provide a name for the app and select workspace and create
Select Incoming webhooks
Now Set incoming webhooks to on
Click on Add New webhook to workspace
Select the channel you created for notifications and click Allow.
It will generate a webhook URL; copy it.
Now come back to GitHub and click on Settings.
Go to Secrets --> Actions --> New repository secret and add the webhook URL as SLACK_WEBHOOK_URL.
Add the code below to the deploy.yml workflow and commit; the workflow will start.
      - name: Send a Slack Notification
        if: always()
        uses: act10ns/slack@v1
        with:
          status: ${{ job.status }}
          steps: ${{ toJson(steps) }}
          channel: '#git'
        env:
          SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
This step sends a Slack notification. It uses the act10ns/slack action and is configured to run "always," which means it runs regardless of the job status. It sends the notification to the specified Slack channel using the webhook URL stored in secrets.
If you get an error here, try configuring the AWS CLI on the EC2 instance to resolve it.
Now our build is completed.
And here is the Slack notification.
Let's go back to the EC2 SSH connection.
Run this command:
kubectl get all
Open the port in the security group for the node group instance.
After that, copy the external IP of the service and paste it into the browser.
Here is the Output: The website is Live.
Part 10: Delete the infrastructure (To Avoid Extra Billing, if you are just using it for learning Purposes)
To destroy, follow these steps:
- In deploy.yml, comment out the line run: kubectl apply -f deployment-service.yml and add the line run: kubectl delete -f deployment-service.yml in its place. In effect you swap the two lines; this deletes the containers and the Kubernetes deployment.
- Stop the self-hosted runner.
- To delete the EKS cluster:
cd /home/ubuntu
cd Project_root_folder
cd terraform-eks
terraform destroy --auto-approve
Then delete the Docker Hub token. Once the cluster is destroyed, delete the EC2 instance and the IAM role.
Special thanks to codewithmuh for creating this incredible Devops Project and simplifying the CI/CD process.