
# Cache Buildkite Plugin

A Buildkite plugin to restore and save directories by cache keys. For example, use the checksum of a `.resolved` or `.lock` file to restore and save built dependencies between independent builds, not just between the jobs of a single build.

## Restore & Save Caches

```yml
steps:
  - plugins:
    - danthorpe/cache#v1.0.0:
        cache_key: "v1-cache-{{ checksum 'Podfile.lock' }}"
        paths: [ "Pods/", "Rome/" ]
```

## Cache Key Templates

The cache key is a string which supports a crude template system. Currently `checksum` is the only supported command, used as in the example above. In that case the cache key is determined by running a checksum (actually `sha1sum`) over the `Podfile.lock` file and prepending `v1-cache-` to the result.
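As a rough illustration of that expansion (a minimal sketch in shell, not the plugin's actual template code), the key above could be produced like this, assuming `Podfile.lock` is in the working directory:

```bash
# Illustrative sketch only; the plugin's real template handling may differ.
digest="$(sha1sum Podfile.lock | awk '{ print $1 }')"

# Substitute the template expression with the digest,
# yielding something like "v1-cache-<40-character sha1 digest>".
cache_key="$(echo "v1-cache-{{ checksum 'Podfile.lock' }}" \
  | sed "s/{{ checksum 'Podfile.lock' }}/${digest}/")"
echo "$cache_key"
```

Because the digest only changes when `Podfile.lock` changes, the same key (and therefore the same cache) is reused until your dependencies change.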

## S3 Storage

This plugin uses AWS S3 sync to cache the paths into a bucket, configured via environment variables set on your agent.

```bash
export BUILDKITE_PLUGIN_CACHE_S3_BUCKET_NAME="my-unique-s3-bucket-name"
export BUILDKITE_PLUGIN_CACHE_S3_PROFILE="my-s3-profile"
```

The paths are synced with Amazon S3 into your bucket under a structure of `organization-slug/pipeline-slug/cache_key`, as determined by the Buildkite environment variables.
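As an approximation of what happens under the hood (the plugin's exact flags and prefix handling may differ), the save and restore steps amount to an `aws s3 sync` between each cached path and the keyed prefix:

```bash
# Approximation only; CACHE_KEY is assumed to be the already-expanded cache key.
PREFIX="s3://${BUILDKITE_PLUGIN_CACHE_S3_BUCKET_NAME}/${BUILDKITE_ORGANIZATION_SLUG}/${BUILDKITE_PIPELINE_SLUG}/${CACHE_KEY}"

# Save: upload a cached path into the keyed prefix.
aws s3 sync --profile "${BUILDKITE_PLUGIN_CACHE_S3_PROFILE}" "Pods/" "${PREFIX}/Pods/"

# Restore: download the keyed prefix back into the working directory.
aws s3 sync --profile "${BUILDKITE_PLUGIN_CACHE_S3_PROFILE}" "${PREFIX}/Pods/" "Pods/"
```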

## Rsync Storage

You can also use rsync to store your files via the `rsync_storage` config parameter. If this is set, it is used as the destination parameter of an `rsync -az` command.

```yml
steps:
  - plugins:
    - danthorpe/cache#v1.0.0:
        rsync_storage: '/tmp/buildkite-cache'
        cache_key: "v1-cache-{{ checksum 'Podfile.lock' }}"
        paths: [ "Pods/", "Rome/" ]
```

The paths are synced to `rsync_storage/cache_key/path`. This is useful for maintaining a local cache directory; although this cache is not shared between servers, it can be reused by different agents/builds on the same machine.
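Again as an illustrative approximation (not the plugin's literal code), the save and restore steps for the example above look roughly like:

```bash
# Approximation only; CACHE_KEY is assumed to be the already-expanded cache key.
DEST="/tmp/buildkite-cache/${CACHE_KEY}"
mkdir -p "${DEST}/Pods"

# Save: archive-mode, compressed copy of a cached path into the keyed directory.
rsync -az "Pods/" "${DEST}/Pods/"

# Restore: copy the cached contents back into the working directory.
rsync -az "${DEST}/Pods/" "Pods/"
```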