
Go Trader

Code Grade | Code style: black

  • Golang lambda app that triggers buy signals alongside a Python ML model

(Figure: Bollinger Bands sample run)

Above, we can see a sample run on historical Bitcoin prices.

Data

  1. wget http://api.bitcoincharts.com/v1/csv/localbtcUSD.csv.gz
  • This data is used in notebooks/testing_bollinger_bands_and_ts_models.ipynb (a loading sketch is shown after this list)
  2. Manually scrape data from https://coinmarketcap.com/
  • This data is more recent, up to 09-04-2021, and is used in notebooks/bollinger_bands_and_coinbase_data.ipynb
  3. Download SPY data from here
  4. Coin Market Cap
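
For reference, below is a minimal sketch of loading the bitcoincharts dump with pandas. It assumes the usual bitcoincharts layout of three unnamed columns (unix timestamp, price, volume); adjust if the file differs.

```python
# Minimal loading sketch -- assumes the bitcoincharts dump has no header row and
# three columns: unix timestamp, trade price (USD), and trade volume.
import pandas as pd

trades = pd.read_csv(
    "localbtcUSD.csv.gz",
    names=["timestamp", "price", "volume"],
    compression="gzip",
)
trades["timestamp"] = pd.to_datetime(trades["timestamp"], unit="s")

# Resample tick data to daily closes for band/time-series experiments.
daily_close = trades.set_index("timestamp")["price"].resample("D").last()
```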

Adding a new column of data

  1. Add the new column into the CoinPricePredictor
  2. Update self.ml_train_cols to include this column (see the sketch after this list)
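
A minimal sketch of what those two steps might look like is below. Only the CoinPricePredictor name and the self.ml_train_cols attribute come from this repo; the surrounding structure and the example column are illustrative assumptions.

```python
# Hypothetical sketch of wiring a new feature column into the predictor.
# CoinPricePredictor and ml_train_cols are real names from this project;
# everything else here is assumed for illustration.
class CoinPricePredictor:
    def __init__(self, df):
        self.df = df
        # 1. Add the new column alongside the existing features.
        self.df["spy_close"] = self._load_spy_close()  # hypothetical helper
        # 2. Update the training columns so the models actually use it.
        self.ml_train_cols = [
            "open", "high", "low", "close", "volume",
            "spy_close",  # <-- the new column
        ]

    def _load_spy_close(self):
        # Placeholder: join in the external data source here.
        ...
```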

Architecture

(Figure: architecture diagram)

  1. The Go app handles connecting to the FTX exchange, pulling data down from and pushing data up to S3, adding the new data, launching the Python program, and executing orders.
  2. The Python program trains the ML models, builds the Bollinger Bands, predicts whether to enter/exit trades, and returns current trade information to the Go app (see the sketch below).
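
To make the Python side's job concrete, here is a hedged sketch of the core Bollinger Band logic: compute the bands from closing prices and derive a simple enter/exit signal. The window, band width, and function names are illustrative, not the repo's exact implementation.

```python
# Illustrative Bollinger Band signal -- not this repo's exact implementation.
import pandas as pd

def bollinger_bands(close: pd.Series, window: int = 20, num_std: float = 2.0) -> pd.DataFrame:
    """Rolling mean plus/minus num_std rolling standard deviations."""
    mid = close.rolling(window).mean()
    std = close.rolling(window).std()
    return pd.DataFrame({
        "middle": mid,
        "upper": mid + num_std * std,
        "lower": mid - num_std * std,
    })

def trade_signal(close: pd.Series) -> str:
    """Toy rule: enter below the lower band, exit above the upper band."""
    bands = bollinger_bands(close)
    last = close.iloc[-1]
    if last < bands["lower"].iloc[-1]:
        return "enter"
    if last > bands["upper"].iloc[-1]:
        return "exit"
    return "hold"
```

In the real pipeline, the ML models and these bands together drive the enter/exit decision that is returned to the Go app.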

Data

  • All data is stored in S3 in s3://go-trader/data

Models

  • Stored in S3 in s3://go-trader/models
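
If you need to pull these artifacts down outside of the Makefile targets, a boto3 sketch along these lines works; the object key shown is a placeholder, since the real keys under data/ and models/ aren't documented here.

```python
# Hedged sketch: list and download objects from the go-trader bucket with boto3.
# The example key is a placeholder -- the actual object names are not documented here.
import boto3

s3 = boto3.client("s3")
BUCKET = "go-trader"

# List everything stored under models/
for obj in s3.list_objects_v2(Bucket=BUCKET, Prefix="models/").get("Contents", []):
    print(obj["Key"])

# Download one artifact (placeholder key) into the local tmp/ directory.
s3.download_file(BUCKET, "models/example_model.pkl", "tmp/example_model.pkl")
```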

Deployment

Infrastructure

  • All contained in the terraform/ directory. This project uses tfswitch to change between Terraform versions.
  1. terraform plan
  2. terraform apply

Helpful Terraform article for VPC Lambdas here.

S3 bucket: go-trader

CI/CD

Via GitHub Actions

  • Use act to test locally
  • Install act: brew install act
  • Run: act

Deploy

  1. Make any config changes and then run make upload_configs
  2. Build and push the Docker image: ./scripts/build.sh
  3. If you need to update any env vars, use the scripts/set_ssm.sh script with the name and value. This includes new FTX env vars for different accounts.
  4. Update the Docker image for the Lambda if you've built a new one:
  • aws lambda update-function-code --function-name go-trader-function --image-uri $(aws lambda get-function --function-name go-trader-function | jq -r '.Code.ImageUri')
  5. Wait for the next Lambda run!

Adding a new coin

  1. Add new data to the Google Sheet here. I like using Coin Market Cap (https://coinmarketcap.com/) for historical data.
  2. Download this data and put it into the tmp/ directory. The tmp directory is used for the AWS lambda. If you are running experiments, you can put the experiment data into the data/ folder.
  3. Run simulations in the notebooks/training_from_python_code.ipynb Jupyter notebook to see how the default config would work. For new coins, we should use tbt and btc as the additional_dfs argument.
    1. This notebook works the same way as the inference code, so it uses the training_configs directory to store the results for each coin. You'll need to add new configs for each coin, or reset the configs to zero for new training runs. You will also need to add a new <coin>_all_predictions.csv that includes dates from the early data used for the historical predictions.
    2. After you have these simulations, use them to seed the stacking predictions from the notebook notebooks/combine_historical_predictions_with_current.ipynb
    3. Also, make sure you upload this new data after downloading the current state with make download_configs_and_data
  4. Set up a new FTX account, generate an API key for this account, create AWS SSM params for these values, and set them as env vars in ssm_store.go. Also add these to the env_vars.sh file.
  5. Add these env vars using scripts/set_ssm.sh. ./scripts/set_ssm.sh FTX_KEY <your-api-key>
  6. Update the main.go file to create a new FTX client for this coin
  7. Update the main.py file for this new coin, as well as predict_price_movements.py. Grep around for btc to see what code to update.
  8. Download the current configs: make download_configs. We're going to be re-uploading the state and want the latest snapshot of reality.
  9. Create new *.yml files under tmp/ (here). This includes the three coin-specific files (actions_to_take, trading_state_config, won_and_lost) and updating the constants.yml file. A hedged loading sketch appears after this list.
  10. Create a new EventBridge trigger in the main.tf file.
  11. Upload the configs: make upload_configs
  12. Upload the dataset: make upload_data
  13. Build the Lambda image: ./scripts/build.sh
  14. Update the Lambda with the new image: aws lambda update-function-code --function-name go-trader-function --image-uri $(aws lambda get-function --function-name go-trader-function | jq -r '.Code.ImageUri'). NB: be sure to update the various checks that look for the correct input coins.
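
The contents of the coin-specific files aren't spelled out above, so the sketch below only shows the plumbing: loading the three per-coin YAML files plus constants.yml from tmp/. The file-naming scheme and any keys inside are assumptions for illustration.

```python
# Hedged sketch of reading the per-coin config files from step 9.
# The three file types and constants.yml come from the README; the exact
# file-naming convention and their contents are assumed here.
from pathlib import Path
import yaml

def load_coin_configs(coin: str, config_dir: str = "tmp") -> dict:
    base = Path(config_dir)
    files = {
        "actions_to_take": base / f"actions_to_take_{coin}.yml",            # assumed naming
        "trading_state_config": base / f"trading_state_config_{coin}.yml",  # assumed naming
        "won_and_lost": base / f"won_and_lost_{coin}.yml",                  # assumed naming
        "constants": base / "constants.yml",
    }
    return {name: yaml.safe_load(path.read_text()) for name, path in files.items()}

configs = load_coin_configs("btc")
print(list(configs))
```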

Testing

  1. New Python code
  • make run_python
  2. New Golang code (you need to build the Go binary and run it as if it were a Lambda)
  • docker run --rm -v "$PWD":/go/src/handler lambci/lambda:build-go1.x sh -c 'go build app/src/main.go'
  • docker run --rm -e ON_LOCAL=true -v "$HOME"/.aws:/home/sbx_user1051/.aws:ro -v "$PWD":/var/task lambci/lambda:go1.x main '{"coinToPredict": "btc"}'
    • You can change coinToPredict to any of the currently supported coins
  • To test the full app, you need to build the Docker image, update the Lambda, and test the Lambda

Performance

  • View the different model performance results here
  • You can check on the current performance by calling the FTX API and parsing the trades made
    • python scripts/generate_performance_numbers_from_ftx.py
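
As a rough illustration of what "parsing the trades made" boils down to, here is a hedged sketch that computes realized PnL from a list of already-parsed fills. It does not call the FTX API, and the field names are assumptions rather than FTX's schema.

```python
# Hedged sketch: simple performance number from already-parsed fills.
# Field names (side, size, price, fee) are assumptions, not FTX's schema.
from typing import Iterable

def realized_pnl(fills: Iterable[dict]) -> float:
    """Net USD flow across fills: sells add cash, buys spend it, fees subtract."""
    pnl = 0.0
    for fill in fills:
        notional = fill["size"] * fill["price"]
        pnl += notional if fill["side"] == "sell" else -notional
        pnl -= fill.get("fee", 0.0)
    return pnl

example_fills = [
    {"side": "buy", "size": 0.1, "price": 40_000.0, "fee": 2.0},
    {"side": "sell", "size": 0.1, "price": 42_000.0, "fee": 2.1},
]
print(realized_pnl(example_fills))  # 195.9
```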