Authentication and service account plan for Pipeline + Kubeflow #374
Currently, following the Pipeline deployment instructions, we create the cluster with the cloud-platform scope. Without any further configuration, the pipeline and any derived Argo jobs run under the default service account (the default Compute Engine service account). This unrestricted setup was acceptable in the past for development and testing purposes, but as the project goes public, I think we should change the service account setup to a managed one.
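For context, a quick way to see which identity a workload actually runs under is to query the GCE/GKE metadata server from inside a pod. A minimal sketch using the standard metadata endpoints (the `requests` dependency and running inside a pod on the cluster are assumptions, not part of this issue):

```python
import requests

# The GCE/GKE metadata server exposes the identity a workload runs under.
# On a default-configured cluster this returns the default Compute Engine
# service account (PROJECT_NUMBER-compute@developer.gserviceaccount.com).
METADATA_URL = (
    "http://metadata.google.internal/computeMetadata/v1/"
    "instance/service-accounts/default/email"
)

resp = requests.get(METADATA_URL, headers={"Metadata-Flavor": "Google"})
resp.raise_for_status()
print("Running as:", resp.text)

# The scopes come from the node's --scopes setting; with cloud-platform,
# this identity is effectively allowed to call any GCP API.
scopes = requests.get(
    METADATA_URL.replace("/email", "/scopes"),
    headers={"Metadata-Flavor": "Google"},
)
print("Granted scopes:", scopes.text)
```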
Kubeflow uses different service accounts for different roles.
For ML Pipeline, the Argo job, as well as any derived workload such as a TFJob, should ideally run under the kf-user service account in order to access GCP APIs. Achieving this requires TODO items at several points in the pipeline.
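As one reference point for the authoring-side piece of this work: the KFP SDK of this era shipped a `kfp.gcp.use_gcp_secret` helper that mounts a Kubernetes secret holding a service-account key into a step and points `GOOGLE_APPLICATION_CREDENTIALS` at it — presumably the "gcp credential" pattern the comments below refer to. A minimal sketch, assuming a secret named `user-gcp-sa` exists in the cluster (the image, arguments, and pipeline name are hypothetical placeholders):

```python
import kfp.dsl as dsl
from kfp import gcp


@dsl.pipeline(
    name="sa-demo",
    description="Run a step with an explicit GCP service account credential.",
)
def sa_demo_pipeline():
    # A placeholder step that needs to call GCP APIs.
    train = dsl.ContainerOp(
        name="train",
        image="gcr.io/my-project/trainer:latest",
        arguments=["--job-dir", "gs://my-bucket/output"],
    )
    # Mount the 'user-gcp-sa' secret so the step authenticates as that
    # service account instead of the node's default Compute Engine account.
    train.apply(gcp.use_gcp_secret("user-gcp-sa"))
```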
@IronPan what is the remaining work here?

@Ark-kun how is the work to migrate the samples to use the GCP credential going?

The samples have been updated to have the right permissions now.

HumairAK pushed a commit to red-hat-data-services/data-science-pipelines that referenced this issue on Mar 11, 2024.