This repository has been archived by the owner on Feb 8, 2023. It is now read-only.

Add AWS helper module that uploads saved files to S3 #35

Closed

Conversation

islamhamdi

Description:

Adds an AWS helper module that takes its configuration from command-line parameters (AWS key, AWS secret, and S3 bucket) and uploads each file, once saved, to a custom path inside the S3 bucket (e.g. https://s3.console.aws.amazon.com/s3/buckets/rabbitmq-backups-rabbitio/?region=us-west-2&tab=overview).
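As a rough sketch of the wiring this describes (the names objectKey, Uploader, and uploadSaved are hypothetical; a real implementation would wrap s3manager.Uploader from aws-sdk-go rather than the stub used here):

```go
package main

import (
	"fmt"
	"path"
)

// objectKey builds the S3 key under which a saved message file is stored.
func objectKey(prefix, filename string) string {
	return path.Join(prefix, filename)
}

// Uploader abstracts the S3 client so the wiring can be exercised without
// AWS credentials. A real implementation would wrap aws-sdk-go's
// s3manager.Uploader.
type Uploader interface {
	Upload(bucket, key string, body []byte) error
}

// uploadSaved pushes one saved file's contents into the configured bucket.
func uploadSaved(u Uploader, bucket, prefix, filename string, body []byte) error {
	return u.Upload(bucket, objectKey(prefix, filename), body)
}

// logUploader is a stand-in client that only logs what it would upload.
type logUploader struct{}

func (logUploader) Upload(bucket, key string, _ []byte) error {
	fmt.Printf("would upload s3://%s/%s\n", bucket, key)
	return nil
}

func main() {
	_ = uploadSaved(logUploader{}, "rabbitmq-backups-rabbitio",
		"staging", "msg-0001.rio", []byte("payload"))
}
```

Keeping the upload behind an interface also makes it straightforward to test the save-then-upload path without network access.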

Testing:

  • Ran go build -a, which produces the rabbitio binary

  • Ran ./rabbitio out -e staging -q test-rabbitio -u amqp://<username>:<password>@rabbitmq-staging.ucoachapp.com:5672 --awsKey <aws_key> --awsSecret <aws_secret> --s3Bucket rabbitmq-backups-rabbitio

[Screenshot: Screen Shot 2020-09-24 at 5 04 20 PM]

  • Messages uploaded successfully to S3:
    [Screenshot: Screen Shot 2020-09-24 at 5 21 58 PM]

@islamhamdi
Author

Hi @vorce and @stiangrindvoll, I thought about adding S3 upload support as a contribution to the library. We (Insidetrack: https://www.insidetrack.org/) found it useful to upload the messages to an S3 bucket in addition to saving them to files on disk: we run rabbitio commands from inside other Docker containers, and transferring the files out manually is painful due to permissions. I'll work next on packaging rabbitio as a Docker container.

@stiangrindvoll
Collaborator

stiangrindvoll commented Sep 29, 2020

Hi @islamhamdi Thank you for your interest in this project!

I wonder if the use case you are looking for is better served by Benthos? Please check it out.

Otherwise I like the idea!
Would it be possible to make an s3 sub-command under out, put the S3 options there, and make it fail if they are not provided?

Is it a requirement for you to save to disk, meaning inside the Docker container? Or do you mount a volume for this? I'm just wondering whether it makes sense to make saving to disk optional when storing to S3, although keeping it adds extra safety.
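The fail-if-not-provided behavior suggested above is what cobra gives a sub-command via MarkFlagRequired. A dependency-free sketch of that intended behavior (s3Options and validate are hypothetical names; the flag names mirror the PR's --awsKey/--awsSecret/--s3Bucket):

```go
package main

import (
	"errors"
	"fmt"
)

// s3Options holds the flags a hypothetical `out s3` sub-command would take.
type s3Options struct {
	AWSKey    string
	AWSSecret string
	Bucket    string
}

// validate fails fast when any required S3 option is missing, mirroring
// what cobra's MarkFlagRequired would enforce on the sub-command.
func (o s3Options) validate() error {
	if o.AWSKey == "" || o.AWSSecret == "" || o.Bucket == "" {
		return errors.New("s3: --awsKey, --awsSecret and --s3Bucket are all required")
	}
	return nil
}

func main() {
	// Missing credentials should produce a validation error.
	err := s3Options{Bucket: "rabbitmq-backups-rabbitio"}.validate()
	fmt.Println(err != nil)
}
```

Scoping the flags to the sub-command also keeps the plain out command free of AWS-specific options.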

@islamhamdi
Author

@stiangrindvoll Thanks for the suggestion about Benthos, I'll take a look!

Alright, I'll take a look at adding an s3 sub-command under out, and most probably add the same later under in as well, consuming the files from S3 and pushing them into RabbitMQ. I'll send the updated PR.

Our use case arises when RabbitMQ is flooded with messages by some bad logic or script. We then need "somewhere" that has AWS credentials, has Elixir (our services run on Elixir), and can connect to RabbitMQ through the VPN. The idea is to wrap all that logic in a binary like rabbitio instead of an ad-hoc script. Previously we consumed the messages and uploaded them to S3 from inside one of our services' Docker containers, which is not durable (pods can be terminated in k8s, and the container isn't tied to a volume).

@islamhamdi
Author

Hi @stiangrindvoll, I looked into running both the main command (out) and its sub-command (s3) in one invocation, but this doesn't seem to be doable in cobra yet (spf13/cobra#726). What do you think? Shall we keep it as it is, or make s3 a separate command that calls outCmd.Execute()?
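One alternative to calling outCmd.Execute() from the s3 command would be to factor the shared logic into a plain function that both commands call, with s3 adding an upload step afterwards. A minimal sketch under that assumption (runOut and runOutS3 are hypothetical stand-ins, not the project's real functions):

```go
package main

import "fmt"

// runOut stands in for the existing `out` command's logic; here it just
// returns the filenames it "saved". The signature is hypothetical.
func runOut() []string {
	return []string{"msg-0001.rio", "msg-0002.rio"}
}

// runOutS3 sketches the `s3` variant: rather than invoking outCmd.Execute()
// (awkward under cobra, see spf13/cobra#726), both commands share runOut,
// and the s3 path uploads each saved file afterwards.
func runOutS3(upload func(filename string) error) error {
	for _, f := range runOut() {
		if err := upload(f); err != nil {
			return err
		}
	}
	return nil
}

func main() {
	_ = runOutS3(func(f string) error {
		fmt.Println("upload", f)
		return nil
	})
}
```

This sidesteps the cobra limitation entirely, since neither command needs to execute the other.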

@islamhamdi
Author

Ping @stiangrindvoll any updates on the above comment?

@islamhamdi islamhamdi closed this Oct 12, 2020