ssh key not attached to ec2 worker node #227
Comments
The current behaviour is like this: By default, You can supply a custom key path, e.g. However, we should discuss whether having set @richardcase @karinnainiguez @kschumy @christopherhein, what are your thoughts on this?
Thanks @errordeveloper, but please let's always update the online documentation.
@errordeveloper I don't necessarily think that by specifying This may be a crazy idea, but could we:
This tripped me up as well; I was just following the examples on the eksctl homepage. Also, I feel the help output might need a bit more information, such as noting that it's a boolean value.
Sounds like a good idea to me! We should also update the docs (per @kylesloan's comment, it sounds like docs are a little misleading at the moment).
I am not sure, perhaps eventually we could do this, but it's not very easy for an existing nodegroup, as nodes won't just start using a key you've added. I think we should allow adding SSH for separate nodegroups, when we have support for more than one. Also, we should tackle #148 first :)
This default behaviour can be euphemistically described as "misleading". It should be obvious that I need SSH access when specifying an SSH key.
As we have a default value, and SSH is not enabled by default, we will have to trigger enabling SSH when a value is set that is not the same as the default. What I mean is:
Alternatively, as @richardcase suggested, we could at least warn in case 3 that access won't be enabled.
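The enumerated cases were stripped from the comment above, but the decision rule being discussed can be sketched roughly as follows. This is a minimal illustrative sketch in Python, not eksctl's actual implementation (which is in Go); the function name, the flag shape, and the default key path are assumptions based on the discussion:

```python
import warnings

# Assumed default key path, mirroring eksctl's documented default.
DEFAULT_SSH_KEY_PATH = "~/.ssh/id_rsa.pub"

def resolve_ssh_access(ssh_key_path, allow_ssh):
    """Decide whether SSH should be enabled for a nodegroup.

    Cases as discussed in the thread:
    1. no key supplied, SSH not requested  -> disabled
    2. SSH explicitly requested            -> enabled (default or custom key)
    3. custom key supplied without the SSH flag -> enable implicitly,
       since supplying a key signals the user wants SSH access
    """
    if allow_ssh:
        return True, ssh_key_path or DEFAULT_SSH_KEY_PATH
    if ssh_key_path and ssh_key_path != DEFAULT_SSH_KEY_PATH:
        # Alternative design (per @richardcase): only warn here that
        # access won't be enabled, instead of enabling it implicitly.
        warnings.warn("an SSH key was supplied; enabling SSH access")
        return True, ssh_key_path
    return False, None
```

With this shape, a non-default key (case 3) no longer silently does nothing: it either enables access or at least surfaces a warning.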
@richardcase I think that's not as useful now, as one can add a new nodegroup that has SSH access when they need to (e.g. to debug an issue with a custom AMI, or to install/run some ad-hoc software on their nodes).
This was fixed via #657.
Here is my hack to add the SSH key after EKS has been provisioned: https://gist.github.com/seva-ramin/8efe3fd0c3d29c448e915b9acd6db079
What happened?
eksctl does not attach the requested ssh key to the worker nodes.
What you expected to happen?
eksctl correctly attaches the requested ssh key to the worker nodes.
How to reproduce it?
```
eksctl -v5 create cluster --name=eks-cluster --region=eu-west-1 --ssh-public-key=eks
```
Versions
Please paste in the output of these commands: