
Dataset operator CrashLoopBackOff #362

Closed
vmtuan12 opened this issue May 23, 2024 · 4 comments · Fixed by #363
Labels: bug (Something isn't working), dataset (Issues related to the Dataset and DatasetInternal Operator), question (Further information is requested)

Comments

@vmtuan12

I installed Datashim in my K8s cluster for MinIO, using the command:
kubectl apply -f https://raw.githubusercontent.com/datashim-io/datashim/master/release-tools/manifests/dlf.yaml

But the dataset-operator pod keeps having status CrashLoopBackOff. Here is the pod's log:
[screenshot: dataset-operator pod log]

I still do not know why this happens, or how to fix it.
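
For reference, the pod status and the log above can typically be retrieved with commands along these lines (a sketch, assuming the default dlf namespace that dlf.yaml installs into):

# list the Datashim pods and their status
kubectl get pods -n dlf

# fetch the logs of the dataset-operator pod (use the actual pod name from the previous command)
kubectl logs -n dlf <dataset-operator-pod-name>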

vmtuan12 added the question label on May 23, 2024
@srikumar003
Collaborator

@vmtuan12 you have spotted a bug, but it was triggered by a mistake in the Dataset specification you provided. The spec did not contain the required secret-name field (see the example below):

apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
  name: example-dataset
spec:
  local:
    type: "COS"
    secret-name: "{SECRET_NAME}" #see s3-secrets.yaml for an example
    secret-namespace: "{SECRET_NAMESPACE}" #optional if the secret is in the same ns as dataset
    endpoint: "{S3_SERVICE_URL}"
    bucket: "{BUCKET_NAME}"
    readonly: "true" # default is false
    region: "" #it can be empty
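
For reference, the s3-secrets.yaml mentioned above defines a plain Kubernetes Secret. A minimal sketch, assuming the accessKeyID/secretAccessKey keys that appear later in this thread (the name my-s3-secret is only a placeholder), could look like this:

apiVersion: v1
kind: Secret
metadata:
  name: my-s3-secret   # referenced by secret-name in the Dataset spec
  namespace: default   # referenced by secret-namespace if it differs from the Dataset's namespace
stringData:
  accessKeyID: "{ACCESS_KEY_ID}"
  secretAccessKey: "{SECRET_ACCESS_KEY}"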

srikumar003 added the bug and dataset labels on May 23, 2024
@vmtuan12
Author

Can you tell me where the s3-secrets.yaml is? I used this configuration from the documentation:
[screenshot: Dataset configuration from the documentation]

@srikumar003
Collaborator

@vmtuan12 An example of specifying S3 credentials as a K8s secret is provided here. We recommend using this over the default example.

Having said that, your configuration should work, so check if there are any missing parameters, or paste a sanitised version here so that we can take a look at it.

Thanks for raising the issue. I have issued a patch for the original bug, which could close this issue. Please reopen with new comments.

@vmtuan12
Author

Thanks! I have fixed that, and the dataset-operator pod is now running, but I have another problem. My PVC is stuck in Pending status, and its description looks like the image below. Is it telling me that the bucket is invalid, or something like that?
[screenshot: description of the Pending PVC]
My Dataset YAML is:

apiVersion: v1
kind: Secret
metadata:
  name: s3secret
stringData:
  accessKeyID: "minioadmin"
  secretAccessKey: "minioadmin"
---
apiVersion: datashim.io/v1alpha1
kind: Dataset
metadata:
  name: example-dataset
spec:
  local:
    type: "COS"
    secret-name: "s3secret"
    secret-namespace: "dlf"
    endpoint: "http://<IP>:<port>"
    bucket: "/home/test-ds"
    region: ""
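
For reference, the Pending PVC and its events (as shown in the screenshot above) can typically be inspected with commands like these (a sketch, assuming the Dataset, and hence the PVC, is named example-dataset and was created in the current namespace):

# show the PVC created for the Dataset, including its events
kubectl describe pvc example-dataset

# check the Dataset resource itself for status details
kubectl describe dataset example-dataset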
