Added pvcviewer-controller rock #154

Merged Nov 29, 2024 (10 commits)

Conversation

BON4 (Contributor) commented Nov 22, 2024

Description


  • Added rockcraft.yaml
  • Added tests
  • Added tox.ini

This is a re-opened request for this PR.

Logs

Tested using pvcviewer-operator with metadata.yaml modified: replaced "docker.io/kubeflownotebookswg/pvcviewer-controller:v1.9.0" with "vyurchenko581/pvcviewer-controller:1.9.0".
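For reference, the metadata.yaml change amounts to swapping the image reference in the charm's OCI resource. A hedged sketch (the resource name and surrounding structure are assumptions based on typical charm metadata; only the two image strings come from this PR):

```yaml
# Sketch of the metadata.yaml edit described above. The resource name
# and field layout are illustrative assumptions, not copied from the repo.
resources:
  oci-image:
    type: oci-image
    description: OCI image for pvcviewer-controller
    # previously: docker.io/kubeflownotebookswg/pvcviewer-controller:v1.9.0
    upstream-source: vyurchenko581/pvcviewer-controller:1.9.0
```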

Juju status after the pvcviewer-operator model is deployed:

Model     Controller          Cloud/Region        Version  SLA          Timestamp
kubeflow  microk8s-localhost  microk8s/localhost  3.4.6    unsupported  11:54:49+02:00

App                 Version  Status   Scale  Charm               Channel         Rev  Address         Exposed  Message
grafana-agent-k8s   0.32.1   waiting      1  grafana-agent-k8s   latest/stable    45  10.152.183.201  no       installing agent
istio-gateway                active       1  istio-gateway       latest/edge    1287  10.152.183.83   no       
istio-pilot                  active       1  istio-pilot         latest/edge    1235  10.152.183.191  no       
pvcviewer-operator           active       1  pvcviewer-operator                    0  10.152.183.131  no       

Unit                   Workload  Agent  Address      Ports  Message
grafana-agent-k8s/0*   blocked   idle   10.1.34.152         grafana-dashboards-provider: off, grafana-cloud-config: off
istio-gateway/0*       active    idle   10.1.34.187         
istio-pilot/0*         active    idle   10.1.34.180         
pvcviewer-operator/0*  active    idle   10.1.34.183         

Juju status after all tests passed:

Model     Controller          Cloud/Region        Version  SLA          Timestamp
kubeflow  microk8s-localhost  microk8s/localhost  3.4.6    unsupported  11:56:09+02:00

App                Version  Status   Scale  Charm              Channel         Rev  Address         Exposed  Message
grafana-agent-k8s  0.32.1   waiting      1  grafana-agent-k8s  latest/stable    45  10.152.183.201  no       installing agent
istio-gateway               active       1  istio-gateway      latest/edge    1287  10.152.183.83   no       
istio-pilot                 active       1  istio-pilot        latest/edge    1235  10.152.183.191  no       

Unit                  Workload  Agent  Address      Ports  Message
grafana-agent-k8s/0*  blocked   idle   10.1.34.152         logging-consumer: off, grafana-cloud-config: off
istio-gateway/0*      active    idle   10.1.34.187         
istio-pilot/0*        active    idle   10.1.34.180         

Logs of the passing integration tests, run with tox -vve integration -- --model kubeflow --keep-models:

integration: 107 D clear env temp folder /home/bon/repos/pvcviewer-operator/.tox/integration/tmp [tox/tox_env/api.py:311]
integration: 117 I find interpreter for spec PythonSpec(path=/usr/bin/python3) [virtualenv/discovery/builtin.py:58]
integration: 117 I proposed PythonInfo(spec=CPython3.10.12.final.0-64, exe=/usr/bin/python3, platform=linux, version='3.10.12 (main, Sep 11 2024, 15:47:36) [GCC 11.4.0]', encoding_fs_io=utf-8-utf-8) [virtualenv/discovery/builtin.py:65]
integration: 117 D accepted PythonInfo(spec=CPython3.10.12.final.0-64, exe=/usr/bin/python3, platform=linux, version='3.10.12 (main, Sep 11 2024, 15:47:36) [GCC 11.4.0]', encoding_fs_io=utf-8-utf-8) [virtualenv/discovery/builtin.py:67]
integration: 119 D filesystem is case-sensitive [virtualenv/info.py:25]
integration: 148 W commands[0]> pytest -vv --tb native --asyncio-mode=auto /home/bon/repos/pvcviewer-operator/tests/integration --log-cli-level=INFO -s --model kubeflow --keep-models [tox/tox_env/api.py:425]
============================================================================================= test session starts ==============================================================================================
platform linux -- Python 3.10.12, pytest-8.2.2, pluggy-1.5.0 -- /home/bon/repos/pvcviewer-operator/.tox/integration/bin/python
cachedir: .tox/integration/.pytest_cache
rootdir: /home/bon/repos/pvcviewer-operator
configfile: pyproject.toml
plugins: operator-0.35.0, anyio-4.4.0, asyncio-0.21.2
asyncio: mode=auto
collected 6 items                                                                                                                                                                                              

tests/integration/test_charm.py::test_build_and_deploy 
------------------------------------------------------------------------------------------------ live log setup ------------------------------------------------------------------------------------------------
INFO     pytest_operator.plugin:plugin.py:734 Connecting to existing model microk8s-localhost:kubeflow on unspecified cloud
------------------------------------------------------------------------------------------------ live log call -------------------------------------------------------------------------------------------------
INFO     juju.model:model.py:2097 Deploying ch:amd64/focal/istio-pilot-1235
INFO     juju.model:model.py:2097 Deploying ch:amd64/focal/istio-gateway-1287
INFO     juju.model:model.py:2971 Waiting for model:
  istio-pilot/0 [allocating] waiting: installing agent
  istio-gateway/0 [allocating] waiting: installing agent
INFO     juju.model:model.py:2971 Waiting for model:
  istio-pilot/0 [allocating] waiting: agent initialising
  istio-gateway/0 [allocating] waiting: agent initialising
INFO     juju.model:model.py:2971 Waiting for model:
  istio-pilot/0 [allocating] waiting: agent initialising
  istio-gateway/0 [idle] waiting: List of <ops.model.Relation istio-pilot:1> versions not found for apps: istio-pilot
INFO     juju.model:model.py:2971 Waiting for model:
  istio-pilot/0 [idle] active: 
  istio-gateway/0 [idle] active: 
INFO     pytest_operator.plugin:plugin.py:575 Using tmp_path: /home/bon/repos/pvcviewer-operator/.tox/integration/tmp/pytest/kubeflow0
INFO     pytest_operator.plugin:plugin.py:1083 Building charm pvcviewer-operator
INFO     pytest_operator.plugin:plugin.py:1088 Built charm pvcviewer-operator in 84.37s
INFO     juju.model:model.py:2097 Deploying local:pvcviewer-operator-0
INFO     juju.model:model.py:2971 Waiting for model:
  pvcviewer-operator/0 [allocating] waiting: installing agent
INFO     juju.model:model.py:2971 Waiting for model:
  pvcviewer-operator/0 [executing] active: 
INFO     juju.model:model.py:2971 Waiting for model:
  pvcviewer-operator/0 [idle] active: 
INFO     charmed_kubeflow_chisme.testing.cos_integration:cos_integration.py:78 deploying grafana-agent-k8s from latest/stable channel
INFO     juju.model:model.py:2097 Deploying ch:amd64/jammy/grafana-agent-k8s-45
INFO     charmed_kubeflow_chisme.testing.cos_integration:cos_integration.py:82 Adding relation: pvcviewer-operator:grafana-dashboard and grafana-agent-k8s:grafana-dashboards-consumer
INFO     charmed_kubeflow_chisme.testing.cos_integration:cos_integration.py:95 Adding relation: pvcviewer-operator:metrics-endpoint and grafana-agent-k8s:metrics-endpoint
INFO     charmed_kubeflow_chisme.testing.cos_integration:cos_integration.py:108 Adding relation: pvcviewer-operator:logging and grafana-agent-k8s:logging-provider
INFO     juju.model:model.py:2971 Waiting for model:
  grafana-agent-k8s/0 [allocating] waiting: installing agent
INFO     juju.model:model.py:2971 Waiting for model:
  grafana-agent-k8s/0 [allocating] waiting: agent initialising
INFO     juju.model:model.py:2971 Waiting for model:
  grafana-agent-k8s/0 [executing] blocked: grafana-dashboards-provider: off, grafana-cloud-config: off
INFO     juju.model:model.py:2971 Waiting for model:
  grafana-agent-k8s/0 [idle] blocked: grafana-dashboards-provider: off, grafana-cloud-config: off
INFO     juju.model:model.py:2971 Waiting for model:
  grafana-agent-k8s/0 [idle] blocked: grafana-dashboards-provider: off, grafana-cloud-config: off
PASSED
tests/integration/test_charm.py::test_metrics_enpoint 
------------------------------------------------------------------------------------------------ live log call -------------------------------------------------------------------------------------------------
INFO     charmed_kubeflow_chisme.testing.cos_integration:cos_integration.py:225 found relations [<Relation id=4 grafana-agent-k8s:metrics-endpoint pvcviewer-operator:metrics-endpoint>] for pvcviewer-operator:metrics-endpoint
INFO     charmed_kubeflow_chisme.testing.cos_integration:cos_integration.py:371 running cmd `relation-get --format=yaml -r 4 --app - pvcviewer-operator` on unit pvcviewer-operator/0
INFO     charmed_kubeflow_chisme.testing.cos_integration:cos_integration.py:371 running cmd `curl -m 5 -sS localhost:12345/agent/api/v1/metrics/targets` on unit grafana-agent-k8s/0
PASSED
tests/integration/test_charm.py::test_logging 
------------------------------------------------------------------------------------------------ live log call -------------------------------------------------------------------------------------------------
INFO     charmed_kubeflow_chisme.testing.cos_integration:cos_integration.py:225 found relations [<Relation id=5 pvcviewer-operator:logging grafana-agent-k8s:logging-provider>] for pvcviewer-operator:logging
INFO     charmed_kubeflow_chisme.testing.cos_integration:cos_integration.py:371 running cmd `relation-get --format=yaml -r 5 - grafana-agent-k8s/0` on unit grafana-agent-k8s/0
PASSED
tests/integration/test_charm.py::test_alert_rules 
------------------------------------------------------------------------------------------------ live log call -------------------------------------------------------------------------------------------------
INFO     test_charm:test_charm.py:161 found alert_rules: {'KubeflowServiceDown', 'KubeflowServiceIsNotStable'}
INFO     charmed_kubeflow_chisme.testing.cos_integration:cos_integration.py:225 found relations [<Relation id=4 grafana-agent-k8s:metrics-endpoint pvcviewer-operator:metrics-endpoint>] for pvcviewer-operator:metrics-endpoint
INFO     charmed_kubeflow_chisme.testing.cos_integration:cos_integration.py:371 running cmd `relation-get --format=yaml -r 4 --app - pvcviewer-operator` on unit pvcviewer-operator/0
PASSED
tests/integration/test_charm.py::test_pvcviewer_example 
------------------------------------------------------------------------------------------------ live log call -------------------------------------------------------------------------------------------------
INFO     httpx:_client.py:1026 HTTP Request: PATCH https://127.0.0.1:16443/api/v1/namespaces/kubeflow-user-example-com?fieldManager=pvcviewer-operator "HTTP/1.1 200 OK"
INFO     httpx:_client.py:1026 HTTP Request: PATCH https://127.0.0.1:16443/apis/kubeflow.org/v1alpha1/namespaces/kubeflow-user-example-com/pvcviewers/pvcviewer-sample?fieldManager=pvcviewer-operator "HTTP/1.1 201 Created"
INFO     httpx:_client.py:1026 HTTP Request: PATCH https://127.0.0.1:16443/api/v1/namespaces/kubeflow-user-example-com/persistentvolumeclaims/pvcviewer-sample?fieldManager=pvcviewer-operator "HTTP/1.1 200 OK"
INFO     httpx:_client.py:1026 HTTP Request: GET https://127.0.0.1:16443/api/v1/namespaces/kubeflow/services/istio-ingressgateway-workload "HTTP/1.1 200 OK"
INFO     httpx:_client.py:1026 HTTP Request: PATCH https://127.0.0.1:16443/api/v1/namespaces/kubeflow-user-example-com?fieldManager=pvcviewer-operator "HTTP/1.1 200 OK"
INFO     httpx:_client.py:1026 HTTP Request: PATCH https://127.0.0.1:16443/apis/kubeflow.org/v1alpha1/namespaces/kubeflow-user-example-com/pvcviewers/pvcviewer-sample?fieldManager=pvcviewer-operator "HTTP/1.1 200 OK"
INFO     httpx:_client.py:1026 HTTP Request: PATCH https://127.0.0.1:16443/api/v1/namespaces/kubeflow-user-example-com/persistentvolumeclaims/pvcviewer-sample?fieldManager=pvcviewer-operator "HTTP/1.1 200 OK"
INFO     httpx:_client.py:1026 HTTP Request: GET https://127.0.0.1:16443/api/v1/namespaces/kubeflow/services/istio-ingressgateway-workload "HTTP/1.1 200 OK"
PASSED
tests/integration/test_charm.py::test_remove_deletes_virtual_service 
------------------------------------------------------------------------------------------------ live log call -------------------------------------------------------------------------------------------------
INFO     httpx:_client.py:1026 HTTP Request: GET https://127.0.0.1:16443/api/v1/namespaces/kubeflow/services/istio-ingressgateway-workload "HTTP/1.1 200 OK"
PASSED
---------------------------------------------------------------------------------------------- live log teardown -----------------------------------------------------------------------------------------------
INFO     pytest_operator.plugin:plugin.py:862 Model status:

Model     Controller          Cloud/Region        Version  SLA          Timestamp
kubeflow  microk8s-localhost  microk8s/localhost  3.4.6    unsupported  11:55:46+02:00

App                Version  Status   Scale  Charm              Channel         Rev  Address         Exposed  Message
grafana-agent-k8s  0.32.1   waiting      1  grafana-agent-k8s  latest/stable    45  10.152.183.201  no       installing agent
istio-gateway               active       1  istio-gateway      latest/edge    1287  10.152.183.83   no       
istio-pilot                 active       1  istio-pilot        latest/edge    1235  10.152.183.191  no       

Unit                  Workload  Agent  Address      Ports  Message
grafana-agent-k8s/0*  blocked   idle   10.1.34.152         logging-consumer: off, grafana-cloud-config: off
istio-gateway/0*      active    idle   10.1.34.187         
istio-pilot/0*        active    idle   10.1.34.180         

INFO     pytest_operator.plugin:plugin.py:868 Juju error logs:


INFO     pytest_operator.plugin:plugin.py:947 Forgetting model main...


======================================================================================== 6 passed in 432.71s (0:07:12) =========================================================================================
integration: 433711 I exit 0 (433.56 seconds) /home/bon/repos/pvcviewer-operator> pytest -vv --tb native --asyncio-mode=auto /home/bon/repos/pvcviewer-operator/tests/integration --log-cli-level=INFO -s --model kubeflow --keep-models pid=928362 [tox/execute/api.py:280]
  integration: OK (433.60=setup[0.04]+cmd[433.56] seconds)
  congratulations :) (433.65 seconds)

Logs from the pvcviewer-operator container that uses the pvcviewer-controller rock, via kubectl logs -n kubeflow pvcviewer-operator-0 -c pvcviewer-operator:

2024-11-22T09:52:13.162Z [pebble] HTTP API server listening on ":38813".
2024-11-22T09:52:13.162Z [pebble] Started daemon.
2024-11-22T09:52:25.819Z [pebble] POST /v1/files 11.652546ms 200
2024-11-22T09:52:25.831Z [pebble] POST /v1/files 10.921374ms 200
2024-11-22T09:52:25.844Z [pebble] POST /v1/files 10.90358ms 200
2024-11-22T09:52:25.852Z [pebble] GET /v1/plan?format=yaml 385.128µs 200
2024-11-22T09:52:25.853Z [pebble] POST /v1/layers 93.898µs 200
2024-11-22T09:52:25.865Z [pebble] POST /v1/services 10.955067ms 202
2024-11-22T09:52:25.876Z [pebble] Service "pvcviewer-operator" starting: /manager --leader-elect
2024-11-22T09:52:25.891Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	controller-runtime.builder	Registering a mutating webhook	{"GVK": "kubeflow.org/v1alpha1, Kind=PVCViewer", "path": "/mutate-kubeflow-org-v1alpha1-pvcviewer"}
2024-11-22T09:52:25.891Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	controller-runtime.webhook	Registering webhook	{"path": "/mutate-kubeflow-org-v1alpha1-pvcviewer"}
2024-11-22T09:52:25.891Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	controller-runtime.builder	Registering a validating webhook	{"GVK": "kubeflow.org/v1alpha1, Kind=PVCViewer", "path": "/validate-kubeflow-org-v1alpha1-pvcviewer"}
2024-11-22T09:52:25.891Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	controller-runtime.webhook	Registering webhook	{"path": "/validate-kubeflow-org-v1alpha1-pvcviewer"}
2024-11-22T09:52:25.891Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	setup	starting manager
2024-11-22T09:52:25.891Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	controller-runtime.metrics	Starting metrics server
2024-11-22T09:52:25.891Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	starting server	{"kind": "health probe", "addr": "[::]:8081"}
2024-11-22T09:52:25.891Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	controller-runtime.metrics	Serving metrics server	{"bindAddress": ":8080", "secure": false}
2024-11-22T09:52:25.891Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	controller-runtime.webhook	Starting webhook server
2024-11-22T09:52:25.891Z [pvcviewer-operator] I1122 09:52:25.891543      15 leaderelection.go:250] attempting to acquire leader lease kubeflow/57a72bdf.kubeflow.org...
2024-11-22T09:52:25.891Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	controller-runtime.certwatcher	Updated current TLS certificate
2024-11-22T09:52:25.891Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	controller-runtime.webhook	Serving webhook server	{"host": "", "port": 9443}
2024-11-22T09:52:25.891Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	controller-runtime.certwatcher	Starting certificate watcher
2024-11-22T09:52:25.908Z [pvcviewer-operator] I1122 09:52:25.908161      15 leaderelection.go:260] successfully acquired lease kubeflow/57a72bdf.kubeflow.org
2024-11-22T09:52:25.908Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	Starting EventSource	{"controller": "pvcviewer", "controllerGroup": "kubeflow.org", "controllerKind": "PVCViewer", "source": "kind source: *v1alpha1.PVCViewer"}
2024-11-22T09:52:25.908Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	Starting EventSource	{"controller": "pvcviewer", "controllerGroup": "kubeflow.org", "controllerKind": "PVCViewer", "source": "kind source: *v1.Deployment"}
2024-11-22T09:52:25.908Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	Starting EventSource	{"controller": "pvcviewer", "controllerGroup": "kubeflow.org", "controllerKind": "PVCViewer", "source": "kind source: *v1.Service"}
2024-11-22T09:52:25.908Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	Starting EventSource	{"controller": "pvcviewer", "controllerGroup": "kubeflow.org", "controllerKind": "PVCViewer", "source": "kind source: *unstructured.Unstructured"}
2024-11-22T09:52:25.908Z [pvcviewer-operator] 2024-11-22T09:52:25Z	INFO	Starting Controller	{"controller": "pvcviewer", "controllerGroup": "kubeflow.org", "controllerKind": "PVCViewer"}
2024-11-22T09:52:25.908Z [pvcviewer-operator] 2024-11-22T09:52:25Z	DEBUG	events	pvcviewer-operator-0_152f3599-793d-4030-b163-8675f851e447 became leader	{"type": "Normal", "object": {"kind":"Lease","namespace":"kubeflow","name":"57a72bdf.kubeflow.org","uid":"43002001-5dd4-469e-a784-d7d8c291ab7a","apiVersion":"coordination.k8s.io/v1","resourceVersion":"727722"}, "reason": "LeaderElection"}
2024-11-22T09:52:26.014Z [pvcviewer-operator] 2024-11-22T09:52:26Z	INFO	Starting workers	{"controller": "pvcviewer", "controllerGroup": "kubeflow.org", "controllerKind": "PVCViewer", "worker count": 1}
2024-11-22T09:52:26.888Z [pebble] GET /v1/changes/1/wait?timeout=4.000s 1.022860607s 200
2024-11-22T09:52:26.896Z [pebble] GET /v1/services 117.412µs 200
2024-11-22T09:52:29.912Z [pebble] GET /v1/services 37.461µs 200
2024-11-22T09:52:34.264Z [pvcviewer-operator] 2024-11-22T09:52:34Z	DEBUG	controller-runtime.certwatcher	certificate event	{"event": "REMOVE        \"/tmp/k8s-webhook-server/serving-certs/tls.key\""}
2024-11-22T09:52:34.265Z [pvcviewer-operator] 2024-11-22T09:52:34Z	INFO	controller-runtime.certwatcher	Updated current TLS certificate
2024-11-22T09:52:34.270Z [pebble] POST /v1/files 11.421529ms 200
2024-11-22T09:52:34.277Z [pvcviewer-operator] 2024-11-22T09:52:34Z	DEBUG	controller-runtime.certwatcher	certificate event	{"event": "REMOVE        \"/tmp/k8s-webhook-server/serving-certs/tls.crt\""}
2024-11-22T09:52:34.277Z [pvcviewer-operator] 2024-11-22T09:52:34Z	INFO	controller-runtime.certwatcher	Updated current TLS certificate
2024-11-22T09:52:34.282Z [pebble] POST /v1/files 10.904753ms 200
2024-11-22T09:52:34.294Z [pebble] POST /v1/files 11.082498ms 200
2024-11-22T09:52:34.300Z [pebble] GET /v1/plan?format=yaml 129.886µs 200
2024-11-22T09:52:34.307Z [pebble] GET /v1/services 38.623µs 200
2024-11-22T09:52:37.321Z [pebble] GET /v1/services 31.419µs 200
2024-11-22T09:52:41.559Z [pvcviewer-operator] 2024-11-22T09:52:41Z	DEBUG	controller-runtime.certwatcher	certificate event	{"event": "REMOVE        \"/tmp/k8s-webhook-server/serving-certs/tls.key\""}
2024-11-22T09:52:41.559Z [pvcviewer-operator] 2024-11-22T09:52:41Z	INFO	controller-runtime.certwatcher	Updated current TLS certificate
2024-11-22T09:52:41.564Z [pebble] POST /v1/files 11.025101ms 200
2024-11-22T09:52:41.571Z [pvcviewer-operator] 2024-11-22T09:52:41Z	DEBUG	controller-runtime.certwatcher	certificate event	{"event": "REMOVE        \"/tmp/k8s-webhook-server/serving-certs/tls.crt\""}
2024-11-22T09:52:41.572Z [pvcviewer-operator] 2024-11-22T09:52:41Z	INFO	controller-runtime.certwatcher	Updated current TLS certificate
2024-11-22T09:52:41.577Z [pebble] POST /v1/files 10.835481ms 200
2024-11-22T09:52:41.589Z [pebble] POST /v1/files 10.911165ms 200
2024-11-22T09:52:41.595Z [pebble] GET /v1/plan?format=yaml 95.751µs 200
2024-11-22T09:52:41.603Z [pebble] GET /v1/services 29.306µs 200
2024-11-22T09:52:43.227Z [pebble] GET /v1/notices?timeout=30s 30.000962196s 200
2024-11-22T09:52:44.578Z [pebble] GET /v1/services 29.486µs 200
2024-11-22T09:52:48.890Z [pvcviewer-operator] 2024-11-22T09:52:48Z	DEBUG	controller-runtime.certwatcher	certificate event	{"event": "REMOVE        \"/tmp/k8s-webhook-server/serving-certs/tls.key\""}
2024-11-22T09:52:48.891Z [pvcviewer-operator] 2024-11-22T09:52:48Z	INFO	controller-runtime.certwatcher	Updated current TLS certificate
2024-11-22T09:52:48.896Z [pebble] POST /v1/files 10.85554ms 200
2024-11-22T09:52:48.903Z [pvcviewer-operator] 2024-11-22T09:52:48Z	DEBUG	controller-runtime.certwatcher	certificate event	{"event": "REMOVE        \"/tmp/k8s-webhook-server/serving-certs/tls.crt\""}
2024-11-22T09:52:48.903Z [pvcviewer-operator] 2024-11-22T09:52:48Z	INFO	controller-runtime.certwatcher	Updated current TLS certificate
2024-11-22T09:52:48.908Z [pebble] POST /v1/files 10.864136ms 200
2024-11-22T09:52:48.921Z [pebble] POST /v1/files 10.839209ms 200
2024-11-22T09:52:48.928Z [pebble] GET /v1/plan?format=yaml 96.553µs 200
2024-11-22T09:52:48.936Z [pebble] GET /v1/services 36.97µs 200
2024-11-22T09:52:51.892Z [pebble] GET /v1/services 31.9µs 200
2024-11-22T09:53:13.229Z [pebble] GET /v1/notices?timeout=30s 30.000830743s 200
2024-11-22T09:53:43.231Z [pebble] GET /v1/notices?timeout=30s 30.002512475s 200
2024-11-22T09:54:13.233Z [pebble] GET /v1/notices?timeout=30s 30.000911932s 200
2024-11-22T09:54:15.558Z [pebble] GET /v1/plan?format=yaml 147.719µs 200
2024-11-22T09:54:15.564Z [pebble] POST /v1/layers 4.530904ms 200
2024-11-22T09:54:16.566Z [pebble] Cannot flush logs to target "grafana-agent-k8s/0": Post "http://grafana-agent-k8s-0.grafana-agent-k8s-endpoints.kubeflow.svc.cluster.local:3500/loki/api/v1/push": dial tcp 10.1.34.152:3500: connect: connection refused
2024-11-22T09:54:43.236Z [pebble] GET /v1/notices?timeout=30s 30.003352058s 200
2024-11-22T09:55:13.239Z [pebble] GET /v1/notices?timeout=30s 30.002733129s 200

@BON4 BON4 marked this pull request as ready for review November 25, 2024 13:30
@BON4 BON4 requested a review from a team as a code owner November 25, 2024 13:30
DnPlas (Collaborator) left a comment


Thanks @BON4! I reviewed this PR by building the rock and running a container with the image generated from it, but I had trouble finding the /manager binary. As you can see from this quick test, it is not at the path where it is supposed to be:

ubuntu@ip-172-31-23-177:~/kubeflow-rocks$ docker run --rm -ti --entrypoint /bin/bash pvcview:0.1
_daemon_@b122f874a0fc:/$ /manager --health-probe-bind-address=:8081 --metrics-bind-address=:8080 --leader-elect
bash: /manager: No such file or directory
_daemon_@b122f874a0fc:/$ ls -la /manager
ls: cannot access '/manager': No such file or directory
_daemon_@b122f874a0fc:/$ ls -la /bin/ | grep manager
_daemon_@b122f874a0fc:/$

You will also see this error in the tests.

I think that using the Go plugin is causing this issue, as the binary will be installed in ${CRAFT_PART_INSTALL}/bin instead of /manager, according to the docs.

Usually we avoid using the Go plugin because of that, and instead use override-build. You can use this as a reference.
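The override-build approach DnPlas suggests might look roughly like this in rockcraft.yaml. This is only a sketch under assumptions: the part name, source location, Go toolchain channel, and main-package path are all illustrative, not taken from the actual repository:

```yaml
# Hypothetical sketch of the override-build alternative to the Go plugin.
# Part name, source, toolchain channel, and package path are assumptions.
parts:
  pvcviewer-controller:
    plugin: nil
    source: .
    build-snaps:
      - go/1.21/stable        # assumed Go toolchain version
    override-build: |
      # Build the manager binary directly into the part's install dir,
      # so it ends up at /manager in the rock instead of the Go
      # plugin's default ${CRAFT_PART_INSTALL}/bin location.
      go build -o ${CRAFT_PART_INSTALL}/manager ./cmd/main.go
```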

Also, it seems like there is an overlap between this PR and #130. May I ask you to close the other one if it is not needed any more?

BON4 (Contributor, Author) commented Nov 27, 2024

@DnPlas Thank you for your review and suggestions. The issue was identified in this line. The Go plugin builds the binary with a default name, which in this case is pvc-viewer rather than pvcviewer-operator. As a result, no files were moved to / during the organize step.

I did test the rock before pushing the changes, but I overlooked the fact that Docker caches images. The tests passed locally because they were using a pre-refactored version of the rock.

I opted to use the Go plugin without overriding any steps here, as there are no third-party files involved. The only requirement for the final OCI image is the executable itself.

Is it a common convention to avoid using the Go plugin? Do you think it could still be suitable for my specific case?

Regarding PR #130: it will be closed. I mentioned it in this PR's description so that no comments will be lost.
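The fix BON4 describes, keeping the Go plugin and using organize, can be sketched as follows. Only the default binary name (pvc-viewer, per the comment above) and the destination (/manager) come from this conversation; the part name and remaining fields are assumptions:

```yaml
# Illustrative sketch of the Go-plugin approach with an organize step.
# Part name, source, and toolchain channel are assumptions.
parts:
  pvcviewer-controller:
    plugin: go
    source: .
    build-snaps:
      - go/1.21/stable        # assumed Go toolchain version
    organize:
      # The Go plugin installs the binary under bin/ with the module's
      # default name (pvc-viewer here); organize relocates it so it
      # appears at /manager in the final OCI image.
      bin/pvc-viewer: manager
```

The earlier failure matches this mapping: organize was keyed on a name the Go plugin never produced, so nothing was moved to / and /manager did not exist in the image.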

DnPlas (Collaborator) left a comment


Thanks @BON4 !

I tested this PR by:

  1. Building the rock -> Success
  2. Running the image in a container and looking for the binary -> Success
ubuntu@ip-172-31-23-177:~/pvcviewer-operator$ docker run --rm -ti --entrypoint /bin/bash pvcview:0.2
_daemon_@95ca93b7b916:/$ ls /manager
/manager
  3. Saving the image in the microk8s local registry and using it for deploying the pvcviewer-operator -> Success
juju deploy pvcviewer-operator --channel latest/edge --trust --resource oci-image=pvcview:0.2

# kubectl logs from the container

2024-11-29T15:37:06.321Z [pvcviewer-operator] 2024-11-29T15:37:06Z	INFO	setup	starting manager
2024-11-29T15:37:06.321Z [pvcviewer-operator] 2024-11-29T15:37:06Z	INFO	controller-runtime.metrics	Starting metrics server
2024-11-29T15:37:06.321Z [pvcviewer-operator] 2024-11-29T15:37:06Z	INFO	controller-runtime.metrics	Serving metrics server	{"bindAddress": ":8080", "secure": false}
2024-11-29T15:37:06.321Z [pvcviewer-operator] 2024-11-29T15:37:06Z	INFO	starting server	{"kind": "health probe", "addr": "[::]:8081"}
2024-11-29T15:37:06.321Z [pvcviewer-operator] 2024-11-29T15:37:06Z	INFO	controller-runtime.webhook	Starting webhook server
2024-11-29T15:37:06.321Z [pvcviewer-operator] I1129 15:37:06.321862      31 leaderelection.go:250] attempting to acquire leader lease kubeflow/57a72bdf.kubeflow.org...
2024-11-29T15:37:06.322Z [pvcviewer-operator] 2024-11-29T15:37:06Z	INFO	controller-runtime.certwatcher	Updated current TLS certificate
2024-11-29T15:37:06.322Z [pvcviewer-operator] 2024-11-29T15:37:06Z	INFO	controller-runtime.webhook	Serving webhook server	{"host": "", "port": 9443}
2024-11-29T15:37:06.322Z [pvcviewer-operator] 2024-11-29T15:37:06Z	INFO	controller-runtime.certwatcher	Starting certificate watcher
2024-11-29T15:37:21.504Z [pvcviewer-operator] I1129 15:37:21.504512      31 leaderelection.go:260] successfully acquired lease kubeflow/57a72bdf.kubeflow.org
2024-11-29T15:37:21.504Z [pvcviewer-operator] 2024-11-29T15:37:21Z	DEBUG	events	pvcviewer-operator-0_4c1fe7cb-7827-4f3a-b895-67f66d771a08 became leader	{"type": "Normal", "object": {"kind":"Lease","namespace":"kubeflow","name":"57a72bdf.kubeflow.org","uid":"ad6a044f-ab99-4246-8883-bc7db1ed9afa","apiVersion":"coordination.k8s.io/v1","resourceVersion":"2413"}, "reason": "LeaderElection"}
2024-11-29T15:37:21.504Z [pvcviewer-operator] 2024-11-29T15:37:21Z	INFO	Starting EventSource	{"controller": "pvcviewer", "controllerGroup": "kubeflow.org", "controllerKind": "PVCViewer", "source": "kind source: *v1alpha1.PVCViewer"}
2024-11-29T15:37:21.505Z [pvcviewer-operator] 2024-11-29T15:37:21Z	INFO	Starting EventSource	{"controller": "pvcviewer", "controllerGroup": "kubeflow.org", "controllerKind": "PVCViewer", "source": "kind source: *v1.Deployment"}
2024-11-29T15:37:21.505Z [pvcviewer-operator] 2024-11-29T15:37:21Z	INFO	Starting EventSource	{"controller": "pvcviewer", "controllerGroup": "kubeflow.org", "controllerKind": "PVCViewer", "source": "kind source: *v1.Service"}
2024-11-29T15:37:21.505Z [pvcviewer-operator] 2024-11-29T15:37:21Z	INFO	Starting EventSource	{"controller": "pvcviewer", "controllerGroup": "kubeflow.org", "controllerKind": "PVCViewer", "source": "kind source: *unstructured.Unstructured"}
2024-11-29T15:37:21.505Z [pvcviewer-operator] 2024-11-29T15:37:21Z	INFO	Starting Controller	{"controller": "pvcviewer", "controllerGroup": "kubeflow.org", "controllerKind": "PVCViewer"}
2024-11-29T15:37:21.612Z [pvcviewer-operator] 2024-11-29T15:37:21Z	INFO	Starting workers	{"controller": "pvcviewer", "controllerGroup": "kubeflow.org", "controllerKind": "PVCViewer", "worker count": 1}
2024-11-29T15:37:34.407Z [pebble] GET /v1/notices?timeout=30s 30.001618932s 200
2024-11-29T15:38:04.408Z [pebble] GET /v1/notices?timeout=30s 30.000534485s 200

To answer your questions:

I opted to use the Go plugin without overriding any steps here, as there are no third-party files involved. The only requirement for the final OCI image is the executable itself.

The way you have defined it is good, actually.
