Cache Docker Image #147

Merged
merged 1 commit into develop from cache-docker-image on Feb 12, 2021
Conversation

pezholio
Contributor

This checks for an existing cache when the Docker image or the dependencies have not changed. If a cache is available, we download and restore it. If not, we run the tests, then save and upload the cached image for next time.

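For context, here is a hedged sketch of the check-and-restore half of such a workflow. The step names, the actions/cache version, and the cache key are assumptions for illustration; only the cache-docker step id and the /tmp/docker-save snapshot path come from the step quoted below.

    # Sketch only: names and key are assumed, not copied from this PR's diff.
    - name: Check for a cached Docker image
      id: cache-docker
      uses: actions/cache@v2
      with:
        path: /tmp/docker-save
        # Assumed key: reuse the cache only while the Dockerfile and the
        # dependency lock file are unchanged.
        key: docker-${{ hashFiles('Dockerfile', 'Gemfile.lock') }}

    - name: Restore the cached image, if one was found
      if: steps.cache-docker.outputs.cache-hit == 'true'
      run: docker load -i /tmp/docker-save/snapshot.tar

The step under review, quoted next, handles the save half: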
    run: >
      mkdir -p /tmp/docker-save &&
      docker save app:test -o /tmp/docker-save/snapshot.tar &&
      ls -lh /tmp/docker-save
    if: always() && steps.cache-docker.outputs.cache-hit != 'true'
Contributor
Do we want to update the cache even if there's a test failure? Feels like it would be better to limit the cache to images that pass.

Contributor

I think there may be value in sticking with the initial approach.

I'm imagining the scenario where a pull request adds a new dependency to one of the earlier Docker layers. The CI job fails due to the application tests, which are sourced from one of the last Docker layers. Given this PR is likely to generate another CI run that only changes the application layer, there is value in having a cached dependency layer on hand.
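To make the layering concrete, here is a hypothetical Rails Dockerfile along these lines; the contents are assumed for illustration and are not taken from this repository.

    FROM ruby:2.7

    WORKDIR /app

    # Dependency layers: invalidated only when the Gemfile or lock file
    # changes, so they stay cached across application-only pushes.
    COPY Gemfile Gemfile.lock ./
    RUN bundle install

    # Application layer: invalidated by any code change, including the
    # follow-up push that fixes the failing tests.
    COPY . .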

Contributor Author

Yep, that's exactly the scenario I envisaged

Contributor

I guess I was worried about broken stuff sitting in the cache and causing bugs (but I guess we should trust Docker). Is the cache isolated to a branch? Will a cache from a branch replace the main cache (and therefore mean everyone not on that branch will have slower builds)? Don't feel super strongly either way, though.

Base automatically changed from main to develop January 17, 2021 19:02
Contributor

@tahb left a comment

Build time starts at 5m 56s and drops to 2m 31s. Thank you for adding this!

@pezholio merged commit 2cece86 into develop on Feb 12, 2021
@pezholio deleted the cache-docker-image branch on February 12, 2021 at 14:23
tahb added a commit to DFE-Digital/buy-for-your-school that referenced this pull request Apr 27, 2021
This work is being copied across from our Rails Template. It was added after this project was created [1].

We are already using Docker Compose in CI on this project. However, for context, there is another pending change on the Rails Template [2] that includes a screenshot of how caching can behave.

The approach here is that we create a shared area on disk called /tmp/docker-save and load it in at the start of every test run. This makes a past Docker image for this app available to the build. Docker can then use that image and its layers to optimise the build steps, opting for the cached version rather than rebuilding from scratch.
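A rough sketch of that flow is below; the docker load path matches the snapshot area described above, but the build command and the --cache-from/app:test usage are assumptions rather than something taken from this commit.

    - name: Load the previously saved image, if a snapshot exists
      run: docker load -i /tmp/docker-save/snapshot.tar || echo "No snapshot to load"

    - name: Build, allowing Docker to reuse layers from the loaded image
      run: docker build --cache-from app:test -t app:test .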

If the cache is not hit (meaning the Dockerfile or gem.lock file changed), a new image is added to the cache at the end of the run. In that situation the build will still take as long as it does now, but this should happen much less frequently.

[1] dxw/rails-template#147
[2] dxw/rails-template#213