Auto generate API keys for a jupyter service lifetime #3526
Comments
@newton1985 what do you think about this? would you like to add a suggestion?

@mguidon I wrote down the idea you proposed last evening. Am I missing something? Do you have other suggestions?

@pcrespov - this seems like a very natural and convenient feature. But I do have two questions - 1) I recall that a user can generate multiple key pairs for a given deployment, so in the case that there is more than one pair, how do we distinguish? Also, 2) a user may be logged into one platform, but would like to launch jobs to a different platform via the API. Is there a way to retain this functionality as well?

@newton1985 very good points! thx!

This has been implemented.
Running the `osparc` python client inside osparc should not require any extra authorization effort, since you are already logged in to the platform. When a jupyter service opens, it should include `OSPARC_API_KEY` and `OSPARC_API_SECRET` as environment variables so that the `osparc` client is authenticated out of the box. This code would then work right away:
Moreover, the client could include this as default configuration, so there is no need to add this step in the notebook anymore.
The API keys will be auto-generated when the jupyter service starts and will remain valid for the lifetime of the service, i.e. a restart will produce new API keys.
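The lifetime-scoped generation step could be sketched as follows (a hypothetical helper, not the actual osparc-simcore implementation; it only illustrates that every service start mints a fresh, unpredictable key pair):

```python
import secrets


def generate_service_api_key_pair(service_id: str) -> dict:
    """Mint a fresh API key/secret pair for one service lifetime.

    Called on jupyter-service startup; the pair is injected into the
    container environment and revoked when the service stops, so a
    restart always yields new credentials.
    """
    return {
        # display name ties the pair to the service instance for the UI
        "display_name": f"auto-{service_id}",
        "api_key": f"key_{secrets.token_urlsafe(16)}",
        "api_secret": secrets.token_urlsafe(32),
    }
```

Because the pair carries a display name derived from the service instance, multiple auto-generated pairs (e.g. from several running jupyter services) stay distinguishable from user-created ones, which also touches on question 1) raised above.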
This functionality should be only for jupyter services (so far, we do not have a "formal" way to distinguish what is a jupyter service).
We might introduce it as an option in the UI to activate this feature, or should it be enabled by default?