Commit 56d6064: update README with marketing copy
mattwwarren committed May 23, 2024 (parent commit 5a6fe3e)
2 changed files, 39 additions and 14 deletions
![Logo for Config Catalyst by CloudTruth](https://github.com/cloudtruth/config-catalyst/blob/main/docs/img/repo-logo.png?raw=true)

# Config Catalyst

![CI](https://github.com/cloudtruth/config-catalyst/actions/workflows/ci.yaml/badge.svg)
![Codecov](https://img.shields.io/codecov/c/github/cloudtruth/config-catalyst)
![Docker pulls](https://img.shields.io/docker/pulls/cloudtruth/config-catalyst)
![version](https://img.shields.io/docker/v/cloudtruth/config-catalyst)

Config Catalyst automatically converts static config files in your repos into parameterized templates.
It's the easiest way to "pay down config tech debt" with a single command.
We ❤️ feedback, [bugs](https://github.com/cloudtruth/config-catalyst/issues/new), and [enhancement suggestions](https://github.com/cloudtruth/config-catalyst/issues/new). We also have a #config-catalyst channel [on our Discord](https://discord.com/invite/eBZXm9Tzr7).

# Motivation
Config Catalyst exists to solve this problem:
"I need to take this weird network config that Bob (who left five years ago) roughed out by hand and turn it into a crisp little YAML template with parameterized variables."
Static, hard-coded config is a form of "tech debt" many teams want to eliminate, but the "pay it down" process is tedious and time-consuming.

# How it works
Config Catalyst works on a local copy of your repo folder structure.
1. It scans your repos and finds all the configuration files (JSON, YAML, dotenv, HCL, tfvars, etc.).
1. It automatically creates parameterized templates, with the extracted parameters and secrets linked to the parameter/secret store.
1. Result: parameter and secret changes are reflected in every config file that uses them.

You don't need to build all the plumbing anymore. Config file changes are independent of param/secret variable changes.
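As a rough illustration of that transformation (this is not the tool's actual code, and the `{{ cloudtruth.parameters.NAME }}` placeholder syntax here is an assumption), a static dotenv file becomes a template plus a set of extracted parameter values:

```python
def parameterize_dotenv(text: str) -> tuple[str, dict[str, str]]:
    """Hypothetical sketch: turn KEY=value lines into a template with
    placeholders, returning the template and the extracted values."""
    template_lines = []
    params: dict[str, str] = {}
    for line in text.splitlines():
        if "=" in line and not line.lstrip().startswith("#"):
            key, value = line.split("=", 1)
            params[key] = value
            # Placeholder syntax is assumed for illustration only.
            template_lines.append(f"{key}={{{{ cloudtruth.parameters.{key} }}}}")
        else:
            template_lines.append(line)  # comments and blanks pass through
    return "\n".join(template_lines), params
```

The extracted values then live in the parameter/secret store, and the template is what lands back in the repo.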

Need to manage values across multiple environments with one template? No problem!

Bonus: the variables and secrets can optionally be synced to Azure Key Vault, AWS Secrets Manager, AWS Parameter Store, or HashiCorp Vault.

# Features and supported file types

**Current file type support includes**:
* JSON
* YAML
* dotenv
* tfvars
* variables.tf

**Not yet supported (but planned)**
* ini
* pkl

# Installation
This utility is distributed as a Docker container and can be pulled from `cloudtruth/config-catalyst` on Docker Hub.

Clone your repo(s) to local disk to allow Config Catalyst to find the supported file types.

You will need a CloudTruth [API token](https://app.cloudtruth.io/organization/api) to complete the process.

Use your existing account or create a free [CloudTruth account](https://app.cloudtruth.io/signup) to get an API token.

# Usage

## Processing a directory tree (the easy method)
You can feed a directory of files into the `walk-directories` command, which will find all files matching the supplied types and parse them into CloudTruth config formats. If you supply your `CLOUDTRUTH_API_KEY` via Docker, the data will be uploaded to your CloudTruth account.
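Conceptually, the directory walk is a filtered tree traversal. A minimal sketch (the extension map and function name are illustrative, not the CLI's actual internals):

```python
import os

# Illustrative extension map; the real tool also matches tfvars,
# variables.tf, and other supported types.
SUPPORTED = {".json": "json", ".yaml": "yaml", ".yml": "yaml", ".env": "dotenv"}

def find_config_files(root: str) -> list[tuple[str, str]]:
    """Return (path, detected_type) for every supported config file under root."""
    found = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1]
            if ext in SUPPORTED:
                found.append((os.path.join(dirpath, name), SUPPORTED[ext]))
    return sorted(found)
```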

```
docker run --rm -it -e CLOUDTRUTH_API_KEY="myverysecureS3CR3T!!" -v ${PWD}/files
```

# Advanced Usage
These examples break down the directory walking method into its individual components.

## Processing several files
An example of how to process several .env files and create values for each environment
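The idea behind per-environment values can be sketched as merging several dotenv-style dicts into one shared parameter list plus an environment-to-value map (all names here are hypothetical, not the tool's API):

```python
def merge_environments(envs: dict[str, dict[str, str]]) -> tuple[list[str], dict[str, dict[str, str]]]:
    """Given {env_name: {KEY: value}}, return the shared parameter names
    and, for each parameter, its value in every environment."""
    all_keys = sorted({k for values in envs.values() for k in values})
    per_param = {
        key: {env: values.get(key, "") for env, values in envs.items()}
        for key in all_keys
    }
    return all_keys, per_param
```

One template is produced for all environments; only the value map differs per environment.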
Issues, pull requests, and discussions are welcomed. Please vote for any issues

See `dynamic_importer.processors` and the subclasses within for examples of the current design. TL;DR - if you can convert the source into a dict, `BaseProcessor._traverse_data` should handle most of the heavy lifting.
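A toy version of that dict-traversal idea (purely illustrative; see the actual `BaseProcessor._traverse_data` for the real behavior):

```python
def traverse(data, path=""):
    """Recursively yield (dotted_path, leaf_value) pairs from a nested
    structure, the shape a processor works with once a source file has
    been parsed into a dict."""
    if isinstance(data, dict):
        for key, value in data.items():
            yield from traverse(value, f"{path}.{key}" if path else key)
    elif isinstance(data, list):
        for i, value in enumerate(data):
            yield from traverse(value, f"{path}[{i}]")
    else:
        yield path, data
```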

# Further reading
CloudTruth [documentation](https://docs.cloudtruth.com/)
Binary file added: docs/img/repo-logo.png