
Remove the need for AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY to be set with Digital Ocean deployment #1344

Merged — 3 commits merged into main from feat-do-set-aws-key on Jul 1, 2022

Conversation

costrouc (Member)

Closes #1343

Changes introduced in this PR:

AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY will now be set automatically from SPACES_ACCESS_KEY_ID and SPACES_SECRET_ACCESS_KEY.
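
For illustration only, a minimal sketch of how such a mapping could be done with a context manager (the function name and exact behavior here are assumptions, not the actual code; the real helper added by this PR lives in qhub/utils.py):

```python
import contextlib
import os


@contextlib.contextmanager
def do_spaces_credentials_as_aws():
    """Hypothetical sketch: expose DigitalOcean Spaces credentials through the
    AWS_* variables that the S3-compatible Terraform state backend expects."""
    saved = {
        key: os.environ.get(key)
        for key in ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY")
    }
    try:
        os.environ["AWS_ACCESS_KEY_ID"] = os.environ["SPACES_ACCESS_KEY_ID"]
        os.environ["AWS_SECRET_ACCESS_KEY"] = os.environ["SPACES_SECRET_ACCESS_KEY"]
        yield
    finally:
        # Restore (or unset) whatever was there before entering the context.
        for key, value in saved.items():
            if value is None:
                os.environ.pop(key, None)
            else:
                os.environ[key] = value
```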

Types of changes

What types of changes does your PR introduce?

Put an x in the boxes that apply

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds a feature)
  • Breaking change (fix or feature that would cause existing features to not work as expected)
  • Documentation Update
  • Code style update (formatting, renaming)
  • Refactoring (no functional changes, no API changes)
  • Build related changes
  • Other (please describe):

Testing

Requires testing

  • Yes
  • No

Further comments (optional)

If this is a relatively large or complex change, kick off the discussion by explaining why you chose the solution you did, what alternatives you considered, and so on.

@costrouc force-pushed the feat-do-set-aws-key branch 4 times, most recently from 5480c09 to d0519db, on June 23, 2022 13:39
Review comment on qhub/utils.py (outdated, resolved)
@costrouc (Member Author)

@viniciusdc this will allow you to remove the need for setting those environment variables for the Digital Ocean deployment.

@viniciusdc (Contributor) left a comment


LGTM, thanks for updating this @costrouc

@viniciusdc (Contributor)

@costrouc may I ask you to do a DO render over this branch to validate that everything works? Since the CI environments here already have all the necessary keys, we might be unable to catch some exceptions.

@costrouc added this to the Release v0.4.3 milestone on Jun 28, 2022
@trallard added the provider: AWS and type: maintenance 🛠 Day-to-day maintenance tasks labels on Jun 30, 2022
@viniciusdc added the needs: tests ✅ This contribution is missing tests label on Jun 30, 2022
@viniciusdc (Contributor)

I will be testing this on DO today.

Review comment on qhub/utils.py (outdated, resolved)
@viniciusdc (Contributor)

Tested with a remote DO deployment and no AWS credentials; all working!

@viniciusdc merged commit 1686b3a into main on Jul 1, 2022
viniciusdc added a commit that referenced this pull request on Jul 1, 2022:
…EY to be set with Digital Ocean deployment (#1344)" (this reverts commit 1686b3a).
viniciusdc added a commit that referenced this pull request on Jul 1, 2022:
…EY to be set with Digital Ocean deployment (#1344)" (#1355) (this reverts commit 1686b3a).
@viniciusdc (Contributor) commented Jul 1, 2022

Hi @costrouc, sorry for the messy revert there. It seems that the destroy command failed with missing environment variables -- I tried adding a quick workaround, but the error persisted, so I reverted the PR.

@viniciusdc (Contributor)

To fix the destroy part, we just need to add the new Terraform state context manager in gather_stage_outputs and destroy_stages of destroy.py.
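
A rough sketch of what that could look like, assuming the credential context manager from the earlier sketch (or its real equivalent in qhub/utils.py) is the one being applied; the function signatures and bodies below are placeholders, not the actual destroy.py code:

```python
from contextlib import nullcontext

# Placeholder standing in for the real Terraform state / credentials context
# manager from qhub/utils.py, so this sketch runs on its own.
do_spaces_credentials_as_aws = nullcontext


def gather_stage_outputs(config):
    # Wrap the Terraform state reads so the DO Spaces (S3-compatible) backend
    # can authenticate without the user exporting AWS_* variables.
    with do_spaces_credentials_as_aws():
        return {}  # placeholder: collect each stage's terraform outputs here


def destroy_stages(stage_outputs, config):
    # Apply the same wrapping while destroying each stage.
    with do_spaces_credentials_as_aws():
        pass  # placeholder: run terraform destroy per stage here
```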
