This project is my submission for the DevOps & Software Testing workshop at INSAT.
We were required to come up with a project, implement its Unit Tests, Integration Tests and E2E tests, then implement a CI/CD pipeline.
The project is a simple Spring Boot application that communicates with an AWS S3 bucket.
Through `POST /uploadImage`, an image that you embed in the body of the request under the `image` key will be resized and saved in an S3 bucket.
Through `GET /getImage/{image}`, `image` being the name of the image that you sent, you can get your picture back.
The project requires two environment variables, `AWS_ACCESS_KEY` and `AWS_SECRET_KEY`, to be able to connect to the S3 bucket.
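To make the two endpoints more concrete, here is a minimal sketch of what the controller could look like. This is an illustration only: the controller class name, the `getImage` helper, and the response bodies are assumptions, and the real code lives in `src`.

```java
import java.awt.image.BufferedImage;
import java.io.IOException;
import javax.imageio.ImageIO;

import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

// Sketch only; the actual controller in src may differ.
@RestController
public class ImageController {

    private final ImageServiceS3 imageServiceS3;

    public ImageController(ImageServiceS3 imageServiceS3) {
        this.imageServiceS3 = imageServiceS3;
    }

    // POST /uploadImage: the file arrives under the "image" key,
    // is resized and stored in the S3 bucket.
    @PostMapping("/uploadImage")
    public ResponseEntity<String> uploadImage(@RequestParam("image") MultipartFile image) throws IOException {
        BufferedImage bufferedImage = ImageIO.read(image.getInputStream());
        imageServiceS3.uploadImage(image.getOriginalFilename(), bufferedImage);
        return ResponseEntity.ok("Image uploaded");
    }

    // GET /getImage/{image}: returns the previously uploaded (resized) picture.
    // getImage(...) returning the raw bytes is an assumption for this sketch.
    @GetMapping(value = "/getImage/{image}", produces = MediaType.IMAGE_JPEG_VALUE)
    public ResponseEntity<byte[]> getImage(@PathVariable("image") String image) throws IOException {
        return ResponseEntity.ok(imageServiceS3.getImage(image));
    }
}
```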
The project contains:
- `.github`: pipelines.
- `assets`: assets necessary for the README.
- `e2e`: E2E test.
- `infra`: simple IaC using Terraform.
- `src`: Spring Boot code and tests.
- `images`, `resizedimages` and `retrive`: directories that the project requires.
- `pom.xml`: the XML file that contains information about the project and configuration details used by Maven to build the project.
Click here to go to the Unit Tests.
To mock the calls to S3, I used Mockito, a framework that allows the creation of test double objects (mock objects) in automated unit tests for the purpose of test-driven development (TDD) or behavior-driven development (BDD).
```java
@Mock
AmazonS3 s3;

@Rule
public MockitoRule rule = MockitoJUnit.rule();

@Autowired
@Mock
private ImageServiceS3 imageServiceS3;

@Before
public void setUp() {
    MockitoAnnotations.openMocks(this);
    ReflectionTestUtils.setField(imageServiceS3, // inject into this object
            "property",                          // assign to this field
            "value");                            // object to be injected
}

@Test
public void uploadFileFromMultipartFileTestCase() throws IOException {
    Mockito.when(s3.putObject(anyString(), anyString(), anyString())).thenReturn(new PutObjectResult()); // Mock call to S3
    Mockito.when(imageServiceS3.getS3()).thenReturn(s3);
    BufferedImage bufferedImage = ImageIO.read(Paths.get(resourcePath + "/test_image.jpg").toFile());
    Mockito.when(imageServiceS3.uploadImage(uploadedFileName, bufferedImage)).thenCallRealMethod(); // Call real method
    // rest of the test
}
```
In this part, we're going to test the interfaces, meaning everything related to the communication with S3 buckets.
For that, we are going to test retrieval and upload against a test bucket, which is just another bucket not related to the one we use in the application.
You can find the tests here.
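For illustration, such an integration test might look like the sketch below. The test bucket name, the region, and the test class are assumptions here; the actual tests in the repository may be structured differently.

```java
import static org.junit.Assert.assertEquals;

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import org.junit.Test;

public class S3IntegrationTestSketch {

    // Hypothetical test bucket, separate from the one used by the application.
    private static final String TEST_BUCKET = "image-resizer-test-bucket";

    // Real S3 client built from the same environment variables the project uses.
    private final AmazonS3 s3 = AmazonS3ClientBuilder.standard()
            .withRegion("eu-west-1") // assumed region
            .withCredentials(new AWSStaticCredentialsProvider(
                    new BasicAWSCredentials(
                            System.getenv("AWS_ACCESS_KEY"),
                            System.getenv("AWS_SECRET_KEY"))))
            .build();

    @Test
    public void uploadAndRetrieveFromTestBucket() {
        // Upload a small object, read it back, then remove it again.
        s3.putObject(TEST_BUCKET, "integration-test.txt", "hello");
        assertEquals("hello", s3.getObjectAsString(TEST_BUCKET, "integration-test.txt"));
        s3.deleteObject(TEST_BUCKET, "integration-test.txt");
    }
}
```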
As for the E2E test, since we only have a backend project, I opted for a Python script that tests the main workflow.
The script does the following:
- Send `POST /uploadImage` with our test image.
- Check that the response status is 200 and the response message is clear.
- Send `GET /getImage/testImage`.
- Check that the image is retrieved correctly and that it has been resized.
- Remove the image from S3 to clear everything up.
To clear everything up, we use this method:

```python
# Remove the temporary file and the uploaded object from S3
def clean_up(s3):
    os.remove(TEMP_FILEPATH)
    bucket = s3.Bucket(BUCKET_NAME)   # s3 is a boto3 S3 resource
    bucket.Object(FILENAME).delete()  # delete the test image from the bucket
```
As you can see in the Dockerfile, I opted for a multi-stage build.
- Stage one:
  - Copy `pom.xml`;
  - Install Maven dependencies;
  - Copy `./src`;
  - Build the `.jar` file.
- Stage two:
  - Make the directories necessary for the project;
  - Copy the `.jar` file from stage one;
  - Expose the port and announce the env variables;
  - Run the `.jar` file.
First, we need to prepare the EC2 instance.
For that, you can find in the `infra` directory the Terraform code to provision the EC2 instance, prepare the Security Groups, and install Docker on the instance.
For all of this to work, we need to do the following:
- Prepare our RSA keys with `ssh-keygen -t rsa -m PEM` and put them in the `./infra/keys` directory;
- Add the values in the `terraform.tfvars` file;
- Go to terraform.io and prepare the workspace;
- Generate an API access token in Terraform Cloud;
- Add the private key, the content of `terraform.tfvars`, and the API access token to the GitHub secrets.
Here's an example of `terraform.tfvars`:

```hcl
aws-region     = "AWS region"
aws-access-key = "AWS access key"
aws-secret-key = "AWS secret key"
ec2-public-key = "The public key generated"
```
And we're all set.
The CI/CD pipeline goes through the following stages:
- Tests: run the Unit Tests and Integration Tests.
- Build and Release: build the Docker image and push it to Docker Hub.
- E2E Tests: run the Python script.
- Run the IaC: apply the changes in the Terraform code.
- Deploy:
  - SSH into the EC2 instance.
  - Kill the Docker container that's currently running and remove it.
  - Pull the new image.
  - Run the new image.
A simple pipeline to run the Unit Tests and Integration Tests.