This repository has been archived by the owner on Jan 9, 2023. It is now read-only.

Vault data disks not mounted #697

Closed
MattiasGees opened this issue Jan 22, 2019 · 1 comment · Fixed by #698
Labels: kind/bug (Categorizes issue or PR as related to a bug.)

Comments

@MattiasGees (Member):

Is this a BUG REPORT or FEATURE REQUEST?:

/kind bug

What happened:
Vault clusters do not mount or use their data EBS volume. The volume is created in AWS but is never mounted on the vault instance.

What you expected to happen:
Two disks mounted on every instance in the vault cluster.

How to reproduce it (as minimally and precisely as possible):
tarmak init
tarmak apply

Anything else we need to know?:

Environment:

  • Kubernetes version (use kubectl version): 1.12
  • Cloud provider or hardware configuration: AWS
@jetstack-bot added the kind/bug label on Jan 22, 2019.
@simonswine (Contributor):

Sounds like a regression of #93; can you investigate, @JoshVanL? I was able to reproduce this with an existing config.

/assign @JoshVanL

4 participants