Welcome to the Cloud Computing repository! This repository contains assignments and projects related to cloud computing technologies and practices. Below is a brief overview of the contents of this repository:
- 🐳 A1: Docker Assignment
- 📂 A2: Compute, Network & Security
- 📂 A3: Serverless
- 📂 K8s: Kubernetes
- 📂 TermAssignment: PDF-Image Parsing Microservice
In the A1 folder, you will find the Docker assignment. This assignment focuses on building and orchestrating Docker containers to create a simple microservice architecture.
By completing this assignment, I gained knowledge and skills in the following areas:
- 🛠 Setting up Docker and building containers
- 📦 Creating Dockerfiles and managing Docker images
- 🌐 Communicating between containers using Docker networks
- 📝 Using JSON for data interchange
- 🚀 Developing microservice architectures with Docker Compose
Build two simple web app containers that communicate with each other through a Docker network. The first container serves as an orchestrator and gatekeeper, while the second container performs calculations based on input data.
The first container's role is to receive JSON input, validate it, and pass it to the second container for processing. Here are the tasks for Container 1:
- Listen on port 6000 for JSON input via an HTTP POST request to "/calculate".
- Validate the input JSON to ensure a file name was provided.
- Verify that the file exists.
- Send the file and product parameters to Container 2.
- Return the response from Container 2.
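As a sketch, Container 1's tasks above could look like the following, using only the Python standard library. The Container 2 address (`http://container2:7000/sum`) and the error-message wording are assumptions, not values required by the assignment:

```python
import json
import os
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Assumed address of Container 2 on the shared Docker network.
CONTAINER2_URL = "http://container2:7000/sum"

def validate(payload):
    """Return an error dict for an invalid request, or None if it is OK."""
    if not isinstance(payload, dict) or not payload.get("file"):
        return {"file": None, "error": "Invalid JSON input."}
    if not os.path.isfile(payload["file"]):
        return {"file": payload["file"], "error": "File not found."}
    return None

class Gatekeeper(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/calculate":
            self.send_error(404)
            return
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        try:
            payload = json.loads(body)
        except json.JSONDecodeError:
            payload = None
        error = validate(payload)
        if error:
            self._reply(400, error)
            return
        # Forward the validated request to Container 2 and relay its answer.
        req = urllib.request.Request(
            CONTAINER2_URL, data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"})
        try:
            with urllib.request.urlopen(req) as resp:
                self._reply(resp.status, json.loads(resp.read()))
        except urllib.error.HTTPError as exc:
            self._reply(exc.code, json.loads(exc.read()))

    def _reply(self, status, obj):
        data = json.dumps(obj).encode()
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(data)

def run():
    """Serve on port 6000, as required by the task list above."""
    HTTPServer(("0.0.0.0", 6000), Gatekeeper).serve_forever()
```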
The second container's role is to perform calculations based on the input received from Container 1. Here are the tasks for Container 2:
- Mount the host machine's current directory ('.') as a Docker volume.
- Listen on a defined endpoint/port for calculate requests.
- Load the file into memory.
- Parse the CSV file.
- Calculate the sum of all rows matching the given product parameter.
- Return the sum in JSON format or an error if the file is not a proper CSV file.
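Container 2's parsing and summing steps might be sketched as below. The CSV layout (a `product` column and an integer `amount` column) and the error message are assumptions:

```python
import csv
import io

def sum_for_product(csv_text, product):
    """Sum the 'amount' column over rows whose 'product' matches."""
    try:
        reader = csv.DictReader(io.StringIO(csv_text))
        if reader.fieldnames is None or "product" not in reader.fieldnames:
            return {"error": "Input file is not in CSV format."}
        total = sum(int(row["amount"]) for row in reader
                    if row["product"] == product)
    except (ValueError, KeyError, TypeError, csv.Error):
        # Non-numeric amounts or missing columns are treated as a bad CSV.
        return {"error": "Input file is not in CSV format."}
    return {"sum": total}
```

For example, `sum_for_product("product,amount\nwheat,10\nrice,5\nwheat,3", "wheat")` returns `{"sum": 13}`.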
In the A2 folder, you will find assignments related to compute, network, and security in cloud computing, focusing on AWS Elastic Compute (EC2) instances and Virtual Private Clouds (VPCs).
By completing this assignment, I gained knowledge and skills in the following areas:
- 🚀 Launching AWS EC2 instances
- 🔗 Connecting to EC2 instances and provisioning them for web applications
- 🛡️ Implementing secure architectures with Virtual Private Clouds (VPCs)
- 🏗️ Implementing VPCs on AWS
- 🌐 Deploying public-facing services within a VPC
- 🔒 Deploying private services within a VPC
- 💻 Working with AWS libraries for operations
- 📝 Building REST APIs and working with JSON arrays
Build a web application using any language or framework, deployed on an EC2 instance behind a VPC.
The application running on EC2 will be public-facing and accessible through a public IP or Elastic IP. It will listen on the following endpoints:
POST /store-products
: Receive and parse a JSON body, connect to an AWS RDS database server running on a private subnet inside your VPC, insert records into the products table, and return appropriate status codes.

GET /list-products
: Connect to the AWS RDS database and return a list of all products.
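The database logic behind these two endpoints could be sketched as follows, with `sqlite3` standing in for the RDS server; in the actual deployment the connection would target the RDS endpoint in the private subnet, and the `products` schema shown here is an assumption:

```python
import sqlite3

def connect(path=":memory:"):
    """Open the database and ensure the products table exists (schema assumed)."""
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE IF NOT EXISTS products "
                 "(name TEXT, price TEXT, availability INTEGER)")
    return conn

def store_products(conn, payload):
    """POST /store-products: insert each record, return an HTTP-style status."""
    products = payload.get("products") if isinstance(payload, dict) else None
    if not isinstance(products, list):
        return 400  # malformed JSON body
    try:
        rows = [(p["name"], p["price"], p["availability"]) for p in products]
    except (TypeError, KeyError):
        return 400  # a record is missing a required field
    conn.executemany(
        "INSERT INTO products (name, price, availability) VALUES (?, ?, ?)", rows)
    conn.commit()
    return 201

def list_products(conn):
    """GET /list-products: return every stored product."""
    rows = conn.execute("SELECT name, price, availability FROM products")
    return [{"name": n, "price": p, "availability": a} for n, p, a in rows]
```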
In the A3 folder, you will find an assignment related to serverless computing using AWS Lambda, Step Functions, and API Gateway.
By completing this assignment, I gained knowledge and skills in the following areas:
- 🌟 Understanding the benefits of serverless computing
- 🧩 Implementing finite state machines using AWS Step Functions
- 🌐 Building serverless APIs with AWS API Gateway
- 🔐 Basic understanding of common encryption algorithms
Build REST API entry points using serverless compute mechanisms. Here's an overview of the process:
- Create a State Machine configured with API Gateway.
- The State Machine will evaluate input and select an option.
- Based on this option, a Lambda function will perform a hashing operation.
- The Lambda Function will trigger a POST request with the result to a different endpoint.
- You can query the grade of your recent test submission by sending a POST request.
- Endpoint for State Machine:
/hashing/select
- Endpoint to Start Process:
/serverless/start
- Endpoint to Receive State Machine Input:
/hashing/select
- Endpoint to Receive Result:
/serverless/end
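The hashing step of the Lambda function might look like the sketch below. The event shape (an `action` field carrying the option the state machine selected, and a `value` field to hash) and the supported algorithms are assumptions:

```python
import hashlib

# Assumed mapping from the state machine's selected option to an algorithm.
ALGORITHMS = {"md5": hashlib.md5, "sha256": hashlib.sha256}

def hash_value(action, value):
    """Hash 'value' with the algorithm chosen by 'action'."""
    algo = ALGORITHMS.get(action)
    if algo is None:
        raise ValueError(f"Unsupported hashing action: {action}")
    return algo(value.encode()).hexdigest()

def lambda_handler(event, context):
    """Entry point invoked from the state machine's Choice branch."""
    try:
        digest = hash_value(event["action"], event["value"])
    except (KeyError, ValueError) as exc:
        return {"statusCode": 400, "body": str(exc)}
    # In the assignment, the result would then be POSTed to /serverless/end
    # (e.g. with urllib.request) before returning.
    return {"statusCode": 200, "body": digest}
```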
In the K8s folder, you will find an assignment related to building a cloud-native CI/CD pipeline and deploying workloads to Google Kubernetes Engine (GKE).
By completing this assignment, I gained knowledge and skills in the following areas:
- 🐳 Containerizing applications using Docker
- 🚀 Building CI/CD pipelines using GCP tools
- ☸️ Understanding Kubernetes concepts and deploying applications on GKE clusters
- 💾 Attaching persistent volumes to GKE clusters and accessing data
- 🛠️ Using Kubernetes tools (e.g., kubectl) to interact with containers and diagnose cluster issues
- 🔧 Implementing application update strategies in GKE
- 🌐 Building REST APIs
Build two simple microservices in the programming language of your choice that can interact with each other; the services developed in Assignment 1 can be reused with minor changes. To deploy them on GCP, you will create a CI/CD pipeline that deploys the services to GKE.
The role of the first container is to store files to a persistent volume in GKE and serve as a gatekeeper to calculate products from the stored file. It must:
- Be deployed as a service in GKE to communicate with the Internet.
- Have access to a persistent volume in GKE to store and retrieve files.
- Communicate with Container 2.
- Validate the input JSON request.
- Send the "file" parameter to Container 2 to calculate the product and return the response.
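Container 1's validate-and-store behavior could be sketched as below; the volume mount path `/data` and the JSON field names are assumptions:

```python
import os

PV_DIR = "/data"  # assumed mount path of the GKE PersistentVolumeClaim

def store_file(payload, base_dir=PV_DIR):
    """Validate the JSON body and write 'data' to 'file' on the volume."""
    if not isinstance(payload, dict) or not payload.get("file"):
        return {"file": None, "error": "Invalid JSON input."}
    path = os.path.join(base_dir, payload["file"])
    with open(path, "w") as fh:
        fh.write(payload.get("data", ""))
    return {"file": payload["file"], "message": "Success."}
```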
The role of Container 2 is to listen on a defined endpoint, calculate the total product, and return the result in the appropriate JSON format. It must:
- Have access to the persistent volume in GKE.
- Interact with Container 1.
- Calculate the total product.
- Return the total in JSON format or an error if the file is not a proper CSV.
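The product calculation mirrors the sum from Assignment 1; here is a sketch, again assuming a CSV with `product` and integer `amount` columns:

```python
import csv
import io
import math

def product_total(csv_text, product):
    """Multiply the 'amount' values of rows whose 'product' matches."""
    try:
        reader = csv.DictReader(io.StringIO(csv_text))
        if reader.fieldnames is None or "product" not in reader.fieldnames:
            return {"error": "Input file is not in CSV format."}
        amounts = [int(row["amount"]) for row in reader
                   if row["product"] == product]
    except (ValueError, KeyError, TypeError, csv.Error):
        return {"error": "Input file is not in CSV format."}
    return {"total": math.prod(amounts)}
```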
- Sumit Savaliya (sumit.savaliya@dal.ca)