Merge pull request #541 from hbelmiro/issue-498
Added samples for v2 and fixed v1 existing samples
HumairAK committed Feb 2, 2024
2 parents 5af48e5 + b9de476 commit fb34c43
Showing 30 changed files with 453 additions and 37 deletions.
26 changes: 13 additions & 13 deletions README.md
@@ -182,14 +182,14 @@ components, your DB and Storage configurations, and so forth. For now, we'll use
inspect this sample resource to see other configurable options.

```bash
-cd ${WORKING_DIR}/config/samples
+cd ${WORKING_DIR}/config/samples/v2/dspa-simple
kustomize build . | oc -n ${DSP_Namespace} apply -f -
```

> Note: the sample CR used here deploys a minio instance so DSP may work out of the box;
> this is unsupported in production environments, and we recommend providing your own
> object storage connection details via spec.objectStorage.externalStorage.
-> See ${WORKING_DIR}/config/samples/dspa_simple_external_storage.yaml for an example.
+> See ${WORKING_DIR}/config/samples/v2/external-object-storage/dspa.yaml for an example.
Confirm all pods reach ready state by running:

@@ -207,7 +207,7 @@ we deployed a `DataSciencePipelinesApplication` resource named `sample`. We can
```bash
DSP_Namespace_2=test-ds-project-2
oc new-project ${DSP_Namespace_2}
-cd ${WORKING_DIR}/config/samples
+cd ${WORKING_DIR}/config/samples/v2/dspa-simple
kustomize build . | oc -n ${DSP_Namespace_2} apply -f -
```

@@ -220,7 +220,7 @@ you can simply investigate and deploy the following path:
```bash
DSP_Namespace_3=test-ds-project-3
oc new-project ${DSP_Namespace_3}
-cd ${WORKING_DIR}/config/samples/custom-configs
+cd ${WORKING_DIR}/config/samples/v2/custom-configs
kustomize build . | oc -n ${DSP_Namespace_3} apply -f -
```

@@ -232,16 +232,16 @@ These can be configured by the end user as needed.
### Deploy a DSP with external Object Storage

To specify a custom Object Storage (for example, an AWS S3 bucket), you will need to provide DSPO with your S3 credentials in
-the form of a k8s `Secret`; see an example of such a secret in `config/samples/external-object-storage/storage-creds.yaml`.
+the form of a k8s `Secret`; see an example of such a secret in `config/samples/v2/external-object-storage/storage-creds.yaml`.

DSPO can deploy a DSPA instance and use this S3 bucket for storing its metadata and pipeline artifacts. A sample
-configuration for a DSPA that does this is found in `config/samples/external-object-storage`, you can update this as
+configuration for a DSPA that does this is found in `config/samples/v2/external-object-storage`, you can update this as
needed, and deploy this DSPA by running the following:

```bash
DSP_Namespace_3=test-ds-project-4
oc new-project ${DSP_Namespace_3}
-cd ${WORKING_DIR}/config/samples/external-object-storage
+cd ${WORKING_DIR}/config/samples/v2/external-object-storage
kustomize build . | oc -n ${DSP_Namespace_3} apply -f -
```
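For reference, an `externalStorage` configuration in a DSPA looks roughly like the following. This is a sketch, not the full schema: the `host` and `bucket` values are placeholders, and the exact set of `externalStorage` fields should be checked against the DSPA CRD; the `s3CredentialsSecret` keys mirror those used in `config/samples/v2/external-object-storage/storage-creds.yaml`.

```yaml
# Hypothetical values -- replace host/bucket with your own S3 endpoint and bucket.
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
  name: sample
spec:
  objectStorage:
    externalStorage:
      host: s3.amazonaws.com
      bucket: my-dsp-artifacts
      s3CredentialsSecret:
        secretName: mlpipeline-minio-artifact
        accessKey: AWS_ACCESS_KEY_ID
        secretKey: AWS_SECRET_ACCESS_KEY
```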

@@ -264,7 +264,7 @@ To understand how these components interact with each other please refer to the
## Deploying Optional Components

### MariaDB
-To deploy a standalone MariaDB metadata database (rather than providing your own database connection details), simply add a `mariaDB` item under `spec.database` in your DSPA definition with a `deploy` key set to `true`. All other fields are defaultable/optional; see [All Fields DSPA Example](./config/samples/dspa_all_fields.yaml) for full details. Note that this component is mutually exclusive with externally-provided databases (defined by `spec.database.externalDB`).
+To deploy a standalone MariaDB metadata database (rather than providing your own database connection details), simply add a `mariaDB` item under `spec.database` in your DSPA definition with a `deploy` key set to `true`. All other fields are defaultable/optional; see [All Fields DSPA Example](config/samples/v2/dspa-all-fields/dspa_all_fields.yaml) for full details. Note that this component is mutually exclusive with externally-provided databases (defined by `spec.database.externalDB`).

```
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
@@ -280,7 +280,7 @@ spec:
```
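The collapsed snippet above reduces to a minimal DSPA of roughly this shape (a sketch; field placement follows the `config/samples/v2/custom-configs/dspa.yaml` sample later in this diff):

```yaml
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
  name: sample
spec:
  database:
    mariaDB:
      # deploy is the only key strictly needed here; all other
      # mariaDB fields (image, pvcSize, resources, ...) default.
      deploy: true
```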

### Minio
-To deploy a Minio Object Storage component (rather than providing your own object storage connection details), simply add a `minio` item under `spec.objectStorage` in your DSPA definition with an `image` key set to a valid minio component container image. All other fields are defaultable/optional; see [All Fields DSPA Example](./config/samples/dspa_all_fields.yaml) for full details. Note that this component is mutually exclusive with externally-provided object stores (defined by `spec.objectStorage.externalStorage`).
+To deploy a Minio Object Storage component (rather than providing your own object storage connection details), simply add a `minio` item under `spec.objectStorage` in your DSPA definition with an `image` key set to a valid minio component container image. All other fields are defaultable/optional; see [All Fields DSPA Example](config/samples/v2/dspa-all-fields/dspa_all_fields.yaml) for full details. Note that this component is mutually exclusive with externally-provided object stores (defined by `spec.objectStorage.externalStorage`).

```
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
@@ -297,7 +297,7 @@ spec:
```
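A minimal sketch of such a DSPA; the image reference is taken from the `config/samples/v2/custom-configs/dspa.yaml` sample later in this diff:

```yaml
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
  name: sample
spec:
  objectStorage:
    minio:
      deploy: true
      # image is required for the minio component; other fields default.
      image: quay.io/opendatahub/minio:RELEASE.2019-08-14T20-37-41Z-license-compliance
```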

### ML Pipelines UI
-To deploy the standalone DS Pipelines UI component, simply add a `spec.mlpipelineUI` item to your DSPA with an `image` key set to a valid UI component container image. All other fields are defaultable/optional; see [All Fields DSPA Example](./config/samples/dspa_all_fields.yaml) for full details.
+To deploy the standalone DS Pipelines UI component, simply add a `spec.mlpipelineUI` item to your DSPA with an `image` key set to a valid UI component container image. All other fields are defaultable/optional; see [All Fields DSPA Example](config/samples/v2/dspa-all-fields/dspa_all_fields.yaml) for full details.

```
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
@@ -314,7 +314,7 @@ spec:
```


### ML Metadata
-To deploy the ML Metadata artifact lineage/metadata component, simply add a `spec.mlmd` item to your DSPA with `deploy` set to `true`. All other fields are defaultable/optional; see [All Fields DSPA Example](./config/samples/dspa_all_fields.yaml) for full details.
+To deploy the ML Metadata artifact lineage/metadata component, simply add a `spec.mlmd` item to your DSPA with `deploy` set to `true`. All other fields are defaultable/optional; see [All Fields DSPA Example](config/samples/v2/dspa-all-fields/dspa_all_fields.yaml) for full details.

```
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
```
@@ -643,6 +643,6 @@ Refer to this [repo][kubeflow-pipelines-examples] to see examples of different p
[thirdparty-images]: https://github.com/opendatahub-io/data-science-pipelines/tree/master/third-party
[pre-commit-installation]: https://pre-commit.com/
[kubebuilder-docs]: https://book.kubebuilder.io/
-[dspa-yaml]: https://github.com/opendatahub-io/data-science-pipelines-operator/blob/main/config/samples/dspa_all_fields.yaml#L77
-[sample-yaml]: https://github.com/opendatahub-io/data-science-pipelines-operator/blob/main/config/samples/dspa_simple.yaml
+[dspa-yaml]: https://github.com/opendatahub-io/data-science-pipelines-operator/blob/main/config/samples/v2/dspa-all-fields/dspa_all_fields.yaml#L77
+[sample-yaml]: https://github.com/opendatahub-io/data-science-pipelines-operator/blob/main/config/samples/v2/dspa-simple/dspa_simple.yaml
[kubeflow-pipelines-examples]: https://github.com/rh-datascience-and-edge-practice/kubeflow-pipelines-examples
21 changes: 0 additions & 21 deletions config/samples/dspa_simple_external_storage.yaml

This file was deleted.

File renamed without changes.
File renamed without changes.
10 changes: 10 additions & 0 deletions config/samples/v1/custom-configs/kustomization.yaml
@@ -0,0 +1,10 @@
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
# mlpipeline-minio-artifact.yaml is required due to https://issues.redhat.com/browse/RHOAIENG-1720
- mlpipeline-minio-artifact.yaml
- dspa.yaml
- db-creds.yaml
- artifact_script.yaml
- storage-creds.yaml
- ui-configmap.yaml
14 changes: 14 additions & 0 deletions config/samples/v1/custom-configs/mlpipeline-minio-artifact.yaml
@@ -0,0 +1,14 @@
apiVersion: v1
kind: Secret
metadata:
  name: mlpipeline-minio-artifact
  labels:
    opendatahub.io/dashboard: 'true'
    opendatahub.io/managed: 'true'
  annotations:
    opendatahub.io/connection-type: s3
    openshift.io/display-name: Minio Data Connection
data:
  accesskey: QUtJQUlPU0ZPRE5ON0VYQU1QTEU=
  secretkey: d0phbHJYVXRuRkVNSS9LN01ERU5HL2JQeFJmaUNZRVhBTVBMRUtFWQ==
type: Opaque
File renamed without changes.
File renamed without changes.
@@ -6,7 +6,7 @@ spec:
   # One of minio or externalStorage must be specified for objectStorage
   # This example illustrates minimal deployment with minio
   # This is NOT supported and should be used for dev testing/experimentation only.
-  # See dspa_simple_external_storage.yaml for an example with external connection.
+  # See external-object-storage/dspa.yaml for an example with external connection.
   objectStorage:
     minio:
       # Image field is required
File renamed without changes.
10 changes: 10 additions & 0 deletions config/samples/v2/custom-configs/db-creds.yaml
@@ -0,0 +1,10 @@
kind: Secret
apiVersion: v1
metadata:
  name: testdbsecret
  labels:
    app: mariadb-sample
    component: data-science-pipelines
stringData:
  password: "testingpassword"
type: Opaque
89 changes: 89 additions & 0 deletions config/samples/v2/custom-configs/dspa.yaml
@@ -0,0 +1,89 @@
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
  name: sample
spec:
  dspVersion: v2
  apiServer:
    deploy: true
    image: gcr.io/ml-pipeline/api-server:2.0.2
    stripEOF: true
    terminateStatus: Cancelled
    trackArtifacts: true
    dbConfigConMaxLifetimeSec: 120
    collectMetrics: true
    autoUpdatePipelineDefaultVersion: true
    resources:
      requests:
        cpu: 250m
        memory: 500Mi
      limits:
        cpu: 500m
        memory: 1Gi
  persistenceAgent:
    deploy: true
    image: gcr.io/ml-pipeline/persistenceagent:2.0.2
    numWorkers: 2
    resources:
      requests:
        cpu: 120m
        memory: 500Mi
      limits:
        cpu: 250m
        memory: 1Gi
  scheduledWorkflow:
    deploy: true
    image: gcr.io/ml-pipeline/scheduledworkflow:2.0.2
    cronScheduleTimezone: UTC
    resources:
      requests:
        cpu: 120m
        memory: 100Mi
      limits:
        cpu: 250m
        memory: 250Mi
  mlpipelineUI:
    deploy: true
    image: gcr.io/ml-pipeline/frontend:2.0.2
    resources:
      limits:
        cpu: 100m
        memory: 256Mi
      requests:
        cpu: 100m
        memory: 256Mi
    configMap: custom-ui-configmap
  database:
    mariaDB:
      deploy: true
      image: registry.redhat.io/rhel8/mariadb-103:1-188
      username: mlpipeline
      pipelineDBName: randomDBName
      pvcSize: 10Gi
      resources:
        requests:
          cpu: 300m
          memory: 800Mi
        limits:
          cpu: "1"
          memory: 1Gi
      passwordSecret:
        name: testdbsecret
        key: password
  objectStorage:
    minio:
      deploy: true
      image: quay.io/opendatahub/minio:RELEASE.2019-08-14T20-37-41Z-license-compliance
      bucket: mlpipeline
      pvcSize: 10Gi
      resources:
        requests:
          cpu: 200m
          memory: 100Mi
        limits:
          cpu: 250m
          memory: 1Gi
      s3CredentialsSecret:
        secretName: mlpipeline-minio-artifact
        accessKey: AWS_ACCESS_KEY_ID
        secretKey: AWS_SECRET_ACCESS_KEY
@@ -3,6 +3,5 @@ kind: Kustomization
 resources:
   - dspa.yaml
   - db-creds.yaml
-  - artifact_script.yaml
   - storage-creds.yaml
   - ui-configmap.yaml
17 changes: 17 additions & 0 deletions config/samples/v2/custom-configs/storage-creds.yaml
@@ -0,0 +1,17 @@
apiVersion: v1
kind: Secret
metadata:
  name: mlpipeline-minio-artifact
  labels:
    opendatahub.io/dashboard: 'true'
    opendatahub.io/managed: 'true'
  annotations:
    opendatahub.io/connection-type: s3
    openshift.io/display-name: Minio Data Connection
data:
  AWS_ACCESS_KEY_ID: QUtJQUlPU0ZPRE5ON0VYQU1QTEU=
  AWS_SECRET_ACCESS_KEY: d0phbHJYVXRuRkVNSS9LN01ERU5HL2JQeFJmaUNZRVhBTVBMRUtFWQ==
  # The following keys are needed while https://github.com/kubeflow/pipelines/issues/9689 is open
  accesskey: QUtJQUlPU0ZPRE5ON0VYQU1QTEU=
  secretkey: d0phbHJYVXRuRkVNSS9LN01ERU5HL2JQeFJmaUNZRVhBTVBMRUtFWQ==
type: Opaque
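The base64 `data` values in this secret decode to the well-known AWS documentation example credential pair, not real keys. A sketch of how such values can be produced for your own credentials:

```shell
# Base64-encode S3 credentials for the Secret's `data` section.
# These are the AWS documentation example strings, not real keys.
ACCESS_KEY='AKIAIOSFODNN7EXAMPLE'
SECRET_KEY='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY'
printf '%s' "$ACCESS_KEY" | base64   # -> QUtJQUlPU0ZPRE5ON0VYQU1QTEU=
printf '%s' "$SECRET_KEY" | base64   # -> d0phbHJYVXRuRkVNSS9LN01ERU5HL2JQeFJmaUNZRVhBTVBMRUtFWQ==
```

Using `printf '%s'` rather than `echo` avoids accidentally encoding a trailing newline into the credential.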
11 changes: 11 additions & 0 deletions config/samples/v2/custom-configs/ui-configmap.yaml
@@ -0,0 +1,11 @@
apiVersion: v1
data:
  viewer-pod-template.json: |-
    {
      "spec": {
        "serviceAccountName": "ds-pipelines-viewer-sample"
      }
    }
kind: ConfigMap
metadata:
  name: custom-ui-configmap