Common labels: {"host_id":"ceacb99587e34bcc840bc7a7cc0d4453","ingress":"ingress.logs.journald","job_type":"daemon","labels_host_id":"ceacb99587e34bcc840bc7a7cc0d4453","labels_ingress":"ingress.logs.journald","labels_job_type":"daemon","labels_used_grok":"TsLevelMsg","namespace":"NotDefined","stack":"NotDefined","task":"NotDefined","task_group":"NotDefined","used_grok":"TsLevelMsg"}
Line limit: "20000 (8587 returned)"
Total bytes processed: "6.45 MB"

2023-05-11T11:39:27+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="718.575µs"
2023-05-11T11:39:27+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:39:26+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:39:26+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=6.787234ms
2023-05-11T11:39:26+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:57832
2023-05-11T11:39:26+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="336.856µs"
2023-05-11T11:39:25+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:39:25+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:39:25+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="372.242µs"
2023-05-11T11:39:24+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="352.507µs"
2023-05-11T11:39:23+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff check="service: \"grafana-agent-ready\" check" task=group-grafana-agent time_limit=40s
2023-05-11T11:39:23+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff check="service: \"grafana-agent-health\" check" task=group-grafana-agent time_limit=20s
2023-05-11T11:39:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="448.012µs"
2023-05-11T11:39:22+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner.task_hook.script_checks: tasklet executing: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres
2023-05-11T11:39:21+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.348549ms
2023-05-11T11:39:21+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="466.075µs"
2023-05-11T11:39:20+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:39:20+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="335.184µs"
2023-05-11T11:39:19+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="438.176µs"
2023-05-11T11:39:19+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 error="dial tcp 10.21.21.42:24915: connect: connection refused"
2023-05-11T11:39:19+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2
2023-05-11T11:39:18+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="471.818µs"
2023-05-11T11:39:17+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="334.035µs"
2023-05-11T11:39:17+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=11.037510995s
2023-05-11T11:39:17+02:00 [nomad.service 💻 worker-01] [👀] http: Authenticated request: id=alloc:27dfb19c-1e44-2e49-a689-0a4e369f7bd2 method=GET url="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms"
2023-05-11T11:39:17+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/security@default.global) no new data (index was the same)
2023-05-11T11:39:17+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/security@default.global) marking successful data response
2023-05-11T11:39:17+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/security@default.global): GET /v1/var/nomad/jobs/security?index=10462&stale=true&wait=1m0s
2023-05-11T11:39:17+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/security@default.global) successful contact, resetting retries
2023-05-11T11:39:17+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/security@default.global): returned "nomad/jobs/security"
2023-05-11T11:39:17+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=10.732728488s
2023-05-11T11:39:17+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=1m3.729829669s
2023-05-11T11:39:17+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:39:16+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:39:16+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=4.249775ms
2023-05-11T11:39:16+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.050386ms
2023-05-11T11:39:16+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:34854
2023-05-11T11:39:15+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:39:15+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:39:15+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=19.387191781s
2023-05-11T11:39:15+02:00 [nomad.service 💻 worker-01] [👀] http: Authenticated request: id=alloc:a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 method=GET url="/v1/var/nomad/jobs/minio?index=10463&namespace=default&stale=&wait=60000ms"
2023-05-11T11:39:15+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=17.408129991s
2023-05-11T11:39:15+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/minio@default.global) no new data (index was the same)
2023-05-11T11:39:15+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/minio@default.global): returned "nomad/jobs/minio"
2023-05-11T11:39:15+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/minio?index=10463&namespace=default&stale=&wait=60000ms" duration=1m1.108265036s
2023-05-11T11:39:15+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/minio@default.global): GET /v1/var/nomad/jobs/minio?index=10463&stale=true&wait=1m0s
2023-05-11T11:39:15+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/minio@default.global) marking successful data response
2023-05-11T11:39:15+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/minio@default.global) successful contact, resetting retries
2023-05-11T11:39:15+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="632.029µs"
2023-05-11T11:39:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/namespaces?index=1 duration=5m5.098544222s
2023-05-11T11:39:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="346.344µs"
2023-05-11T11:39:13+02:00 [nomad.service 💻 worker-01] [👀] http: Authenticated request: id=alloc:1022bc44-6bd1-c8c5-62c5-4166c31f7afc method=GET url="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms"
2023-05-11T11:39:13+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=18.264990515s
2023-05-11T11:39:13+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=14.126030954s
2023-05-11T11:39:13+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/security@default.global): GET /v1/var/nomad/jobs/security?index=10462&stale=true&wait=1m0s
2023-05-11T11:39:13+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/security@default.global) no new data (index was the same)
2023-05-11T11:39:13+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/security@default.global) successful contact, resetting retries
2023-05-11T11:39:13+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/security@default.global) marking successful data response
2023-05-11T11:39:13+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/security@default.global): returned "nomad/jobs/security"
2023-05-11T11:39:13+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=1m1.181491432s
2023-05-11T11:39:13+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="349.964µs"
2023-05-11T11:39:12+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="651.417µs"
2023-05-11T11:39:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner.task_hook.script_checks: tasklet executing: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres
2023-05-11T11:39:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.151415ms
2023-05-11T11:39:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="323.469µs"
2023-05-11T11:39:10+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:39:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="403.54µs"
2023-05-11T11:39:09+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 error="dial tcp 10.21.21.42:24915: connect: connection refused"
2023-05-11T11:39:09+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2
2023-05-11T11:39:09+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="347.873µs"
2023-05-11T11:39:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="327.719µs"
2023-05-11T11:39:07+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:39:06+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="464.041µs"
2023-05-11T11:39:06+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:39:06+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.365842ms
2023-05-11T11:39:06+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:33972
2023-05-11T11:39:05+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:39:05+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:39:05+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="399.907µs"
2023-05-11T11:39:04+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="549.641µs"
2023-05-11T11:39:03+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=16.060345074s
2023-05-11T11:39:03+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="345.337µs"
2023-05-11T11:39:03+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: execute sync: reason=operations
2023-05-11T11:39:03+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: Restart requested: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres failure=true event="1683797943066396498 - Restart Signaled"
2023-05-11T11:39:03+02:00 [nomad.service 💻 worker-01] [👀] watch.checks: now watching check: alloc_i=857749e0-52fe-92ee-7bef-fafbe67605ee task=group-keycloak-postgres check=keycloak_postgres_ping
2023-05-11T11:39:03+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: commit sync operations: ops="<1, 1, 0, 0>"
2023-05-11T11:39:03+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: restarting due to unhealthy check: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee check=keycloak_postgres_ping task=group-keycloak-postgres
2023-05-11T11:39:03+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished alloc task restart hook: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee name=group_services end="2023-05-11 09:39:03.066740647 +0000 UTC m=+362.340787223" duration="307.53µs"
2023-05-11T11:39:03+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running alloc task restart hook: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee name=group_services start="2023-05-11 09:39:03.066433116 +0000 UTC m=+362.340479693"
2023-05-11T11:39:03+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: Restart requested: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres failure=true event="1683797943066396498 - Restart Signaled"
2023-05-11T11:39:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="403.071µs"
2023-05-11T11:39:02+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner.task_hook.script_checks: tasklet executing: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres
2023-05-11T11:39:01+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.644423ms
2023-05-11T11:39:01+02:00 [nomad.service 💻 master-01] [👀] consul.sync: execute sync: reason=periodic
2023-05-11T11:39:01+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="451.292µs"
2023-05-11T11:39:00+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:39:00+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="400.214µs"
2023-05-11T11:38:59+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="355.469µs"
2023-05-11T11:38:59+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 error="dial tcp 10.21.21.42:24915: connect: connection refused"
2023-05-11T11:38:59+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2
2023-05-11T11:38:58+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): GET /v1/health/service/mimir?index=14250&stale=true&wait=1m0s
2023-05-11T11:38:58+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) no new data (index was the same)
2023-05-11T11:38:58+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): returned 1 results after filtering
2023-05-11T11:38:58+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) successful contact, resetting retries
2023-05-11T11:38:58+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): returned 1 results
2023-05-11T11:38:58+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) marking successful data response
2023-05-11T11:38:58+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=6.163512ms
2023-05-11T11:38:58+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) marking successful data response
2023-05-11T11:38:58+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): returned 1 results after filtering
2023-05-11T11:38:58+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): returned 1 results
2023-05-11T11:38:58+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): GET /v1/health/service/mimir?index=14250&stale=true&wait=1m0s
2023-05-11T11:38:58+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) successful contact, resetting retries
2023-05-11T11:38:58+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) no new data (index was the same)
2023-05-11T11:38:57+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="602.819µs"
2023-05-11T11:38:57+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:38:57+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:38:56+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:38:56+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) successful contact, resetting retries
2023-05-11T11:38:56+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): returned 1 results
2023-05-11T11:38:56+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) marking successful data response
2023-05-11T11:38:56+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): returned 1 results after filtering
2023-05-11T11:38:56+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) no new data (index was the same)
2023-05-11T11:38:56+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): GET /v1/health/service/mimir?index=14250&stale=true&wait=1m0s
2023-05-11T11:38:56+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.469374ms
2023-05-11T11:38:56+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="816.724µs"
2023-05-11T11:38:56+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:39760
2023-05-11T11:38:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:38:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:38:55+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="727.308µs"
2023-05-11T11:38:54+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="625.592µs"
2023-05-11T11:38:53+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="645.766µs"
2023-05-11T11:38:53+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: restarting due to unhealthy check: alloc_id=54969951-d541-ae97-922a-7db38096bae5 check="service: \"nats-prometheus-exporter\" check" task=group-nats
2023-05-11T11:38:53+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished alloc task restart hook: alloc_id=54969951-d541-ae97-922a-7db38096bae5 name=group_services end="2023-05-11 09:38:53.03682251 +0000 UTC m=+352.310869086" duration="191.281µs"
2023-05-11T11:38:53+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: commit sync operations: ops="<2, 2, 0, 0>"
2023-05-11T11:38:53+02:00 [nomad.service 💻 worker-01] [👀] watch.checks: now watching check: alloc_i=54969951-d541-ae97-922a-7db38096bae5 task=group-nats check="service: \"nats-prometheus-exporter\" check"
2023-05-11T11:38:53+02:00 [nomad.service 💻 worker-01] [👀] watch.checks: now watching check: alloc_i=54969951-d541-ae97-922a-7db38096bae5 task=group-nats check="service: \"nats\" check"
2023-05-11T11:38:53+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running alloc task restart hook: alloc_id=54969951-d541-ae97-922a-7db38096bae5 name=group_services start="2023-05-11 09:38:53.036631229 +0000 UTC m=+352.310677805"
2023-05-11T11:38:53+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: execute sync: reason=operations
2023-05-11T11:38:53+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: Restart requested: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats failure=true event="1683797933036601312 - Restart Signaled"
2023-05-11T11:38:53+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: Restart requested: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter failure=true event="1683797933036601312 - Restart Signaled"
2023-05-11T11:38:53+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:38:52+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: execute sync: reason=periodic
2023-05-11T11:38:52+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner.task_hook.script_checks: tasklet executing: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres
2023-05-11T11:38:52+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="466.419ยตs" 2023-05-11T11:38:51+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.662799ms 2023-05-11T11:38:50+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="346.328ยตs" 2023-05-11T11:38:50+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:38:49+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="499.204ยตs" 2023-05-11T11:38:49+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 2023-05-11T11:38:49+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 error="dial tcp 10.21.21.42:24915: connect: connection refused" 2023-05-11T11:38:49+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=14.811629166s 2023-05-11T11:38:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="370.949ยตs" 2023-05-11T11:38:47+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="397.419ยตs" 2023-05-11T11:38:47+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04 2023-05-11T11:38:46+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5 2023-05-11T11:38:46+02:00 [nomad.service ๐Ÿ’ป master-01] 
[๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="424.123ยตs" 2023-05-11T11:38:46+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.733031ms 2023-05-11T11:38:46+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad: memberlist: Stream connection from=127.0.0.1:50218 2023-05-11T11:38:45+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b 2023-05-11T11:38:45+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:38:45+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="387.22ยตs" 2023-05-11T11:38:44+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="362.123ยตs" 2023-05-11T11:38:43+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="356.246ยตs" 2023-05-11T11:38:43+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee check=keycloak_postgres_ping task=group-keycloak-postgres time_limit=20s 2023-05-11T11:38:43+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=54969951-d541-ae97-922a-7db38096bae5 check="service: \"nats-prometheus-exporter\" check" task=group-nats time_limit=10s 2023-05-11T11:38:43+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] watch.checks: check became unhealthy. 
Will restart if check doesn't become healthy: alloc_id=54969951-d541-ae97-922a-7db38096bae5 check="service: \"nats\" check" task=group-nats time_limit=20s 2023-05-11T11:38:42+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="487.046ยตs" 2023-05-11T11:38:42+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner.task_hook.script_checks: tasklet executing: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres 2023-05-11T11:38:41+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.732713ms 2023-05-11T11:38:41+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="660.016ยตs" 2023-05-11T11:38:40+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:38:40+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration=1.321016ms 2023-05-11T11:38:39+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 2023-05-11T11:38:39+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 error="dial tcp 10.21.21.42:24915: connect: connection refused" 2023-05-11T11:38:39+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="545.416ยตs" 2023-05-11T11:38:38+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="409.934ยตs" 2023-05-11T11:38:38+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 
removed=0 updated=9 ignored=13 errors=0
2023-05-11T11:38:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373
2023-05-11T11:38:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428
2023-05-11T11:38:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428
2023-05-11T11:38:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370
2023-05-11T11:38:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373
2023-05-11T11:38:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384
2023-05-11T11:38:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370
2023-05-11T11:38:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406
2023-05-11T11:38:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442
2023-05-11T11:38:38+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=11.082601334s
2023-05-11T11:38:38+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15497 total=22 pulled=9 filtered=13
2023-05-11T11:38:38+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=13
2023-05-11T11:38:38+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=19.429832434s
2023-05-11T11:38:37+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce client_status=running desired_status=""
2023-05-11T11:38:37+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.runner_hook.alloc_health_watcher: health set: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce healthy=true
2023-05-11T11:38:37+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:38:36+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="495.189µs"
2023-05-11T11:38:36+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:38:36+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.07287ms
2023-05-11T11:38:36+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:50004
2023-05-11T11:38:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:38:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:38:35+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="456.093µs"
2023-05-11T11:38:34+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="358.321µs"
2023-05-11T11:38:33+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="390.498µs"
2023-05-11T11:38:32+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="795.725µs"
2023-05-11T11:38:32+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner.task_hook.script_checks: tasklet executing: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres
2023-05-11T11:38:31+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.719541ms
2023-05-11T11:38:31+02:00 [nomad.service 💻 master-01] [👀] consul.sync: execute sync: reason=periodic
2023-05-11T11:38:31+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=17.052536037s
2023-05-11T11:38:31+02:00 [nomad.service 💻 worker-01] [👀] http: Authenticated request: id=alloc:bd7464de-fa72-736b-e57c-6782cc7d7202 method=GET url="/v1/var/nomad/jobs/observability?index=3271&namespace=default&stale=&wait=60000ms"
2023-05-11T11:38:31+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=14.073338931s
2023-05-11T11:38:31+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/observability@default.global) successful contact, resetting retries
2023-05-11T11:38:31+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/observability?index=3271&namespace=default&stale=&wait=60000ms" duration=1m2.74768483s
2023-05-11T11:38:31+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/observability@default.global) marking successful data response
2023-05-11T11:38:31+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/observability@default.global): returned "nomad/jobs/observability"
2023-05-11T11:38:31+02:00
[nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/observability@default.global) no new data (index was the same)
2023-05-11T11:38:31+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/observability@default.global): GET /v1/var/nomad/jobs/observability?index=3271&stale=true&wait=1m0s
2023-05-11T11:38:31+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="408.03µs"
2023-05-11T11:38:30+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:38:30+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="427.38µs"
2023-05-11T11:38:30+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=17.855529689s
2023-05-11T11:38:30+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=16.826322272s
2023-05-11T11:38:30+02:00 [nomad.service 💻 worker-01] [👀] http: Authenticated request: id=alloc:87cd4d30-a263-1598-0a57-72046f840473 method=GET url="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms"
2023-05-11T11:38:30+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/security@default.global) successful contact, resetting retries
2023-05-11T11:38:30+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/security@default.global) marking successful data response
2023-05-11T11:38:30+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/security@default.global): GET /v1/var/nomad/jobs/security?index=10462&stale=true&wait=1m0s
2023-05-11T11:38:30+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/security@default.global) no new data (index was the same)
2023-05-11T11:38:30+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/security@default.global): returned "nomad/jobs/security"
2023-05-11T11:38:30+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=1m1.633766451s
2023-05-11T11:38:29+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="382.175µs"
2023-05-11T11:38:29+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 error="dial tcp 10.21.21.42:24915: connect: connection refused"
2023-05-11T11:38:29+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2
2023-05-11T11:38:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="477.279µs"
2023-05-11T11:38:27+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-16e4d8f1cfc1e34115fb1c5777fbdf5da5a1fec1
2023-05-11T11:38:27+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=16.353822622s
2023-05-11T11:38:27+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="484.803µs"
2023-05-11T11:38:27+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:38:26+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:38:26+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=3.118096ms
2023-05-11T11:38:26+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="596.755µs"
2023-05-11T11:38:26+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:45118
2023-05-11T11:38:25+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:38:25+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:38:25+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="797.559µs"
2023-05-11T11:38:24+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="379.312µs"
2023-05-11T11:38:23+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=2.367981ms
2023-05-11T11:38:22+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished alloc task restart hook: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff name=group_services end="2023-05-11 09:38:22.948098052 +0000 UTC m=+322.222144628" duration="135.645µs"
2023-05-11T11:38:22+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: commit sync operations: ops="<2, 2, 0, 0>"
2023-05-11T11:38:22+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: execute sync: reason=operations
2023-05-11T11:38:22+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: Restart requested: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent failure=true event="1683797902947923360 - Restart Signaled"
2023-05-11T11:38:22+02:00 [nomad.service 💻 worker-01] [👀] watch.checks: now watching check: alloc_i=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=group-grafana-agent check="service: \"grafana-agent-ready\" check"
2023-05-11T11:38:22+02:00 [nomad.service 💻 worker-01] [👀] watch.checks: now watching check:
alloc_i=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=group-grafana-agent check="service: \"grafana-agent-health\" check"
2023-05-11T11:38:22+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: restarting due to unhealthy check: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff check="service: \"grafana-agent-health\" check" task=group-grafana-agent
2023-05-11T11:38:22+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running alloc task restart hook: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff name=group_services start="2023-05-11 09:38:22.947962407 +0000 UTC m=+322.222008983"
2023-05-11T11:38:22+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner.task_hook.script_checks: tasklet executing: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres
2023-05-11T11:38:21+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.749215ms
2023-05-11T11:38:21+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="591.739µs"
2023-05-11T11:38:20+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:38:20+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="726.951µs"
2023-05-11T11:38:19+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/namespaces?index=1 duration=5m7.796332484s
2023-05-11T11:38:19+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="620.097µs"
2023-05-11T11:38:19+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2
2023-05-11T11:38:19+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 error="dial tcp 10.21.21.42:24915: connect: connection refused"
2023-05-11T11:38:19+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/namespaces duration="166.274µs"
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="492.597µs"
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/namespaces duration="221.269µs"
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/jobs?meta=true duration="220.315µs"
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:34678: EOF
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:34680: EOF
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:34676: EOF
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=POST path=/v1/search/fuzzy duration="286.137µs"
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/members duration="204.661µs"
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/acl/token/self duration=1.188899ms
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/operator/license duration="4.09µs"
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/acl/token/self error="RPC Error:: 400,ACL support disabled" code=400
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/regions duration="213.612µs"
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="491.638µs"
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:34636: EOF
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:34628: EOF
2023-05-11T11:38:18+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:34620: EOF
2023-05-11T11:38:17+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-16e4d8f1cfc1e34115fb1c5777fbdf5da5a1fec1
2023-05-11T11:38:17+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="376.443µs"
2023-05-11T11:38:17+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:43618: EOF
2023-05-11T11:38:17+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/jobs?meta=true duration="196.902µs"
2023-05-11T11:38:17+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/namespaces duration="251.431µs"
2023-05-11T11:38:17+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:38:16+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:38:16+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="574.708µs"
2023-05-11T11:38:16+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.392412ms
2023-05-11T11:38:16+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:48928
2023-05-11T11:38:15+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:38:15+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:38:15+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="477.734µs"
2023-05-11T11:38:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.290871ms
2023-05-11T11:38:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.487405ms
2023-05-11T11:38:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="397.607µs"
2023-05-11T11:38:14+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=13.022260075s
2023-05-11T11:38:14+02:00 [nomad.service 💻 worker-01] [👀] http: Authenticated request: id=alloc:a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 method=GET url="/v1/var/nomad/jobs/minio?index=10463&namespace=default&stale=&wait=60000ms"
2023-05-11T11:38:14+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=12.767824088s
2023-05-11T11:38:14+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/minio@default.global) no new data (index was the same)
2023-05-11T11:38:14+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/minio@default.global) successful contact, resetting retries
2023-05-11T11:38:14+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/minio?index=10463&namespace=default&stale=&wait=60000ms" duration=1m1.561936338s
2023-05-11T11:38:14+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/minio@default.global): GET /v1/var/nomad/jobs/minio?index=10463&stale=true&wait=1m0s
2023-05-11T11:38:14+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/minio@default.global) marking successful data response
2023-05-11T11:38:14+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/minio@default.global): returned "nomad/jobs/minio"
2023-05-11T11:38:14+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:43612: EOF
2023-05-11T11:38:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/evaluations?index=15490 duration="604.865µs"
2023-05-11T11:38:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations?index=15493 duration="570.599µs"
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] http: Authenticated request: id=alloc:27dfb19c-1e44-2e49-a689-0a4e369f7bd2 method=GET url="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms"
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=12.597835682s
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/security@default.global): GET /v1/var/nomad/jobs/security?index=10462&stale=true&wait=1m0s
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/security@default.global) no new data (index was the same)
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/security@default.global) successful contact, resetting retries
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/security@default.global): returned "nomad/jobs/security"
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=1m1.830250302s
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/security@default.global) marking successful data response
2023-05-11T11:38:13+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="340.392µs"
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=13 errors=0
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=17.96798758s
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15496 total=22 pulled=9 filtered=13
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=13
2023-05-11T11:38:13+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=15.128924722s
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: Restart requested: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres failure=true event="1683797892921405575 - Restart Signaled"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running alloc task restart hook: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee name=group_services start="2023-05-11 09:38:12.921435038 +0000 UTC m=+312.195481607"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: restarting due to unhealthy check: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee check=keycloak_postgres_ping task=group-keycloak-postgres
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: Restart requested: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres failure=true event="1683797892921405575 - Restart Signaled"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] watch.checks: now watching check: alloc_i=857749e0-52fe-92ee-7bef-fafbe67605ee task=group-keycloak-postgres check=keycloak_postgres_ping
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: commit sync operations: ops="<1, 1, 0, 0>"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: execute sync: reason=operations
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished alloc task restart hook: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee name=group_services end="2023-05-11 09:38:12.921805625 +0000 UTC m=+312.195852201" duration="370.594µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=script_checks end="2023-05-11 09:38:12.886344276 +0000 UTC m=+312.160390851" duration="181.552µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier end="2023-05-11 09:38:12.886464897 +0000 UTC m=+312.160511477" duration=1.447312ms
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=task_services start="2023-05-11 09:38:12.886024857 +0000 UTC m=+312.160071431"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=task_services end="2023-05-11 09:38:12.886149095 +0000 UTC m=+312.160195674" duration="124.243µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=script_checks start="2023-05-11 09:38:12.886162725 +0000 UTC m=+312.160209299"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce done=false
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_coordinator: state transition: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce from=poststart to=wait_alloc
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce client_status=running desired_status=""
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=stats_hook start="2023-05-11 09:38:12.885338314 +0000 UTC m=+312.159384901"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_coordinator: state transition: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce from=main to=poststart
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hooks: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier start="2023-05-11 09:38:12.885017589 +0000 UTC m=+312.159064165"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=stats_hook end="2023-05-11 09:38:12.885572004 +0000 UTC m=+312.159618580" duration="233.679µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: setting task state: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier state=running
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier type=Started msg="Task started by client" failed=false
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞]
client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:38:12.881Z
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker.docker_logger.stdio: waiting for stdio data: driver=docker
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker network=unix @module=docker_logger address=/tmp/plugin2675591233 timestamp=2023-05-11T09:38:12.879Z
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=14261
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=469143893b35552ab0c7dc598d8440ee8ffb4ffc53487f7ac030717a5391a843
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=13 errors=0
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=10.832374487s
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15495 total=22 pulled=9 filtered=13
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=13
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=17.66732061s
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=469143893b35552ab0c7dc598d8440ee8ffb4ffc53487f7ac030717a5391a843
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=logunifier memory=2147483648 memory_reservation=67108864 cpu_shares=100 cpu_quota=0 cpu_period=0
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: no docker log driver provided, defaulting to plugin config: driver=docker task_name=logunifier
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=logunifier labels="map[com.github.logunifier.application.name:logunifier com.github.logunifier.application.pattern.key:tslevelmsg com.github.logunifier.application.strip.ansi:true com.github.logunifier.application.version:0.1.1 com.hashicorp.nomad.alloc_id:5f8ff55c-60f6-b98d-d49f-bac37cf70bce com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:logunifier com.hashicorp.nomad.task_name:logunifier]"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=logunifier network_mode=container:10c4027a9b228a9ce97aa69294e474bd6b62692a9528c2dde0c8b71411d9b5bc
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/suikast42/logunifier:0.1.1 image_id=sha256:129dde9892f1f049c30b2e1e1b21bdfc9db6ac58ca81de1ed4d67c22085a197e references=1
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=logunifier
container_name=logunifier-5f8ff55c-60f6-b98d-d49f-bac37cf70bce 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=logunifier binds="[]string{\"/opt/services/core/nomad/data/alloc/5f8ff55c-60f6-b98d-d49f-bac37cf70bce/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/5f8ff55c-60f6-b98d-d49f-bac37cf70bce/logunifier/local:/local\", \"/opt/services/core/nomad/data/alloc/5f8ff55c-60f6-b98d-d49f-bac37cf70bce/logunifier/secrets:/secrets\"}" 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: binding volumes: driver=docker task_name=logunifier volumes=["/opt/services/core/nomad/data/alloc/5f8ff55c-60f6-b98d-d49f-bac37cf70bce/alloc:/alloc", "/opt/services/core/nomad/data/alloc/5f8ff55c-60f6-b98d-d49f-bac37cf70bce/logunifier/local:/local", "/opt/services/core/nomad/data/alloc/5f8ff55c-60f6-b98d-d49f-bac37cf70bce/logunifier/secrets:/secrets"] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=script_checks end="2023-05-11 09:38:12.718186853 +0000 UTC m=+311.992233427" duration="423.903ยตs" 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: waiting for cgroup to exist for: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier allocID=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task="&{logunifier docker map[args:[-loglevel debug -natsServers nats.service.consul:4222 -lokiServers loki.service.consul:9005] image:registry.cloud.private/suikast42/logunifier:0.1.1 labels:[map[com.github.logunifier.application.name:logunifier com.github.logunifier.application.pattern.key:tslevelmsg com.github.logunifier.application.strip.ansi:true com.github.logunifier.application.version:0.1.1]] ports:[health]] map[] [] [] [] [] 0xc0020822a0 0xc004115e00 map[] 5s 0xc002f15b90 [] false 0s [] [] }" 
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hooks: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier end="2023-05-11 09:38:12.718242761 +0000 UTC m=+311.992289337" duration=42.256088ms
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=script_checks start="2023-05-11 09:38:12.717762944 +0000 UTC m=+311.991809524"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=api end="2023-05-11 09:38:12.717700574 +0000 UTC m=+311.991747152" duration="894.901µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=devices start="2023-05-11 09:38:12.716111933 +0000 UTC m=+311.990158513"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=artifacts end="2023-05-11 09:38:12.716050517 +0000 UTC m=+311.990097091" duration="590.511µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=api start="2023-05-11 09:38:12.716805672 +0000 UTC m=+311.990852251"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=devices end="2023-05-11 09:38:12.71674861 +0000 UTC m=+311.990795184" duration="636.671µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=volumes end="2023-05-11 09:38:12.715398233 +0000 UTC m=+311.989444808" duration="512.33µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=artifacts start="2023-05-11 09:38:12.715460001 +0000 UTC m=+311.989506580"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=dispatch_payload end="2023-05-11 09:38:12.714792267 +0000 UTC m=+311.988838841" duration="861.774µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=volumes start="2023-05-11 09:38:12.714885899 +0000 UTC m=+311.988932478"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=logmon end="2023-05-11 09:38:12.713824376 +0000 UTC m=+311.987870953" duration=34.053479ms
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=dispatch_payload start="2023-05-11 09:38:12.713930482 +0000 UTC m=+311.987977067"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier @module=logmon path=/opt/services/core/nomad/data/alloc/5f8ff55c-60f6-b98d-d49f-bac37cf70bce/alloc/logs/.logunifier.stderr.fifo timestamp=2023-05-11T09:38:12.712Z
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier @module=logmon path=/opt/services/core/nomad/data/alloc/5f8ff55c-60f6-b98d-d49f-bac37cf70bce/alloc/logs/.logunifier.stdout.fifo timestamp=2023-05-11T09:38:12.712Z
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner.task_hook.logmon.stdio: waiting for stdio data: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier version=2
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier @module=logmon address=/tmp/plugin1819783386 network=unix timestamp=2023-05-11T09:38:12.710Z
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=1 deregistered_services=0 registered_checks=0 deregistered_checks=0
2023-05-11T11:38:12+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-5f8ff55c-60f6-b98d-d49f-bac37cf70bce-group-logunifier-logunifier-health-health
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier path=/usr/local/bin/nomad pid=14208
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier path=/usr/local/bin/nomad
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=identity end="2023-05-11 09:38:12.679718973 +0000 UTC m=+311.953765548" duration="704.859µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=logmon start="2023-05-11 09:38:12.679770895 +0000 UTC m=+311.953817474"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=identity start="2023-05-11 09:38:12.679014111 +0000 UTC m=+311.953060689"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=task_dir end="2023-05-11 09:38:12.67894613 +0000 UTC m=+311.952992703" duration=2.059508ms
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce done=false
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce client_status=pending desired_status=""
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: must register service: id=_nomad-task-5f8ff55c-60f6-b98d-d49f-bac37cf70bce-group-logunifier-logunifier-health-health exists=false reason=operations
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce client_status=pending desired_status=""
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=task_dir start="2023-05-11 09:38:12.676886614 +0000 UTC m=+311.950933195"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=validate end="2023-05-11 09:38:12.676838645 +0000 UTC m=+311.950885220" duration="809.878µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier name=validate start="2023-05-11 09:38:12.676028764 +0000 UTC m=+311.950075342"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier type="Task Setup" msg="Building Task Directory" failed=false
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: execute sync: reason=operations
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=consul_http_socket start="2023-05-11 09:38:12.675716664 +0000 UTC m=+311.949763240"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=group_services start="2023-05-11 09:38:12.675595355 +0000 UTC m=+311.949641925"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_coordinator: state transition: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce from=prestart to=main
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce done=false
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=consul_grpc_socket start="2023-05-11 09:38:12.675704281 +0000 UTC m=+311.949750856"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_coordinator: state transition: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce from=init to=prestart
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] watch.checks: now watching check: alloc_i=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=group-logunifier check="service: \"logunifier-health\" check"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=checks_hook start="2023-05-11 09:38:12.675738281 +0000 UTC m=+311.949784856"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=csi_hook start="2023-05-11 09:38:12.675726351 +0000 UTC m=+311.949772919"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.runner_hook: received result from CNI: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce result="{\"Interfaces\":{\"eth0\":{\"IPConfigs\":[{\"IP\":\"172.26.66.72\",\"Gateway\":\"172.26.64.1\"}],\"Mac\":\"f2:db:2f:fa:7a:aa\",\"Sandbox\":\"/var/run/docker/netns/e700f25b9a05\"},\"nomad\":{\"IPConfigs\":null,\"Mac\":\"b6:77:25:d7:7f:65\",\"Sandbox\":\"\"},\"vethde37e5d2\":{\"IPConfigs\":null,\"Mac\":\"16:2e:7d:8b:74:d3\",\"Sandbox\":\"\"}},\"DNS\":[{}],\"Routes\":[{\"dst\":\"0.0.0.0/0\"}]}"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=consul_http_socket end="2023-05-11 09:38:12.675721283 +0000 UTC m=+311.949767853" duration="4.613µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hooks: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce end="2023-05-11 09:38:12.675749377 +0000 UTC m=+311.949795947" duration=773.580063ms
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=csi_hook end="2023-05-11 09:38:12.675731692 +0000 UTC m=+311.949778261" duration="5.342µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hooks: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier start="2023-05-11 09:38:12.675986675 +0000 UTC m=+311.950033249"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=consul_grpc_socket end="2023-05-11 09:38:12.675711221 +0000 UTC m=+311.949757791" duration="6.935µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=group_services end="2023-05-11 09:38:12.675693786 +0000 UTC m=+311.949740361" duration="98.436µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=checks_hook end="2023-05-11 09:38:12.675744644 +0000 UTC m=+311.949791214" duration="6.358µs"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: commit sync operations: ops="<1, 1, 0, 0>"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=network end="2023-05-11 09:38:12.675579833 +0000 UTC m=+311.949626411" duration=741.909367ms
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=10.266109363s
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] http: Authenticated request: id=alloc:1022bc44-6bd1-c8c5-62c5-4166c31f7afc method=GET url="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=16.642060962s
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/security@default.global) successful contact, resetting retries
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/security@default.global): GET /v1/var/nomad/jobs/security?index=10462&stale=true&wait=1m0s
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=1m0.079387574s
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/security@default.global) no new data (index was the same)
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/security@default.global): returned "nomad/jobs/security"
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/security@default.global) marking successful data response
2023-05-11T11:38:12+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=8.259113ms
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner.task_hook.script_checks: tasklet executing: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dad53d62-2b73-6995-03d3-6c225a2fb549]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_273102c3-f542-e0a3-3eae-d80c6ec50c74]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-keycloak-7ea6f631-b991-3c18-18f8-d4a88c779de0]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ebfef72d-5162-582e-79d2-714b5a5854a9]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/keycloak-7ea6f631-b991-3c18-18f8-d4a88c779de0]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7ea6f631-b991-3c18-18f8-d4a88c779de0]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d16da4e2-3e1e-fc6f-864c-a5f40ea080ef]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nats-prometheus-exporter-d16da4e2-3e1e-fc6f-864c-a5f40ea080ef]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-security-postgres-ebfef72d-5162-582e-79d2-714b5a5854a9]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/traefik-4eee21ed-4cb6-1319-8401-19107f29b34b]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/grafana-303fbdec-6d2a-bb10-4a8e-a6c0dad17a05]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nats-d16da4e2-3e1e-fc6f-864c-a5f40ea080ef]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/forwardauth-8678d135-46d3-ef22-2581-818e5ec07333]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e6d2193c-f1b2-7888-cac9-0207318d1ad2]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/mimir-e6d2193c-f1b2-7888-cac9-0207318d1ad2]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8678d135-46d3-ef22-2581-818e5ec07333]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_83234bd8-8193-89ab-6234-59d2fb689df2]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/tempo-1df99288-8878-cc8c-5048-da8fa9ab2ac1]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0c17f752-dbd0-0697-a3e6-0f53fd8e9e6b]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b962fb2f-9cfc-0ee5-c8e0-13cb5ae7eef4]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/loki-f9449c12-3984-3c99-55eb-606389ce2624]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/logunifier-dad53d62-2b73-6995-03d3-6c225a2fb549]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_303fbdec-6d2a-bb10-4a8e-a6c0dad17a05]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f9449c12-3984-3c99-55eb-606389ce2624]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/minio-273102c3-f542-e0a3-3eae-d80c6ec50c74]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/security_postgres-ebfef72d-5162-582e-79d2-714b5a5854a9]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1df99288-8878-cc8c-5048-da8fa9ab2ac1]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-grafana-303fbdec-6d2a-bb10-4a8e-a6c0dad17a05]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/grafana-agent-83234bd8-8193-89ab-6234-59d2fb689df2]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fe182175-94df-c3a2-5033-f4863d6daf59]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_65078e15-e522-a18d-9cd8-c84b61d04080]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_795d0184-24eb-9a8d-2125-9da13758ed2a]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0ebfdace-f1f3-5f9c-8b58-e5975abaf36b]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a850ddb2-d6d9-37bc-7d4e-617f82f436a6]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_edb8f187-0b2a-f981-0de8-8a482860261f]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8f79622e-0821-d1dc-fa63-db1705ca0c74]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d2928440-57bd-9f0d-be52-db209648a8b7]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_729ec73e-7c5d-a7f4-6bbe-ba89e1c82c48]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8958df2c-747b-937d-fda6-9de0502129a3]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_29b5878e-b982-304a-661d-27e4346863cb]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3e6dc8cb-c970-c304-f6a4-a91bd1ae4f4c]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_aff7cd30-fe31-a88e-15d9-87c1d9c99e5c]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_778d37b8-b473-7ae4-1aeb-5c89c8aafdb2]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fb14d5de-dcda-5120-191e-6db17613d746]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_108c13c9-3484-3eec-2469-18eb5d5bcc9f]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_afba5d27-f527-a783-8a1f-9a007e0dcf1f]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e3cd1900-5492-239d-07f1-a5f1163701db]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2767f1f6-303d-39e8-7386-5bc11447ee70]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_95892491-2a91-03ad-131b-87fe8c898a05]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0a6c32ef-65d7-113b-e95f-0d4fcbe5b17e]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0c9e5807-351d-1412-6a76-dad70f1a2aee]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9256b9c1-e86a-c54e-76c9-19b554d3590a]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ac62d1a8-321f-3c9f-bca0-f66d4ee6051a]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2550e92d-7eea-a28c-8df3-8ac8921a618c]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6b659b2f-9582-a3ee-3921-1f921693fad7]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_88b47db0-ff4e-8503-7a1a-5602041fcd91]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_bb573a46-ace6-432e-e4b1-3a6f5a9802d4]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_23a7476b-2d56-d8af-d7d2-be2dd74a819b]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9764c731-33c4-f84c-0bac-acf982471ec6]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e1697492-e775-22a5-0fdd-e6016b11d065]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_815b8320-f88f-14f8-a12c-2ad2e7c81af8]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2e27123c-e020-cca6-d17f-227a9204992f]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_000e4e5e-9756-1007-de31-5e01ddfd9afc]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6b90ef66-8558-63c4-b4ae-28a46aff796c]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0d16b376-9f86-997f-d7c1-6ce409c639f7]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d914a6cb-df4c-c1a2-72c4-01fa1270a96a]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0667520a-191e-2abc-16af-3b665c6b0a8a]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8b796690-2887-eeea-8678-0ec52778b89d]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_293c489e-6f64-7df6-181d-8d4bd3cc6dc7]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_83dd82ef-357a-e09a-5f68-c4f83f5d3022]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a84dbceb-e563-6f63-78a3-a60434903fc4]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dfac93eb-d5ef-016d-e1b1-391bf3f6b98e]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_37ec705e-dd7e-66a9-cb09-925a608d00db]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f66b4793-86ac-5933-935d-753d07bbe2f1]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_578ba772-e064-722a-a608-dcd67e4f0d69]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_25437077-cb41-bc3e-8448-39bb4fc3a1b9]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_98060fb3-c42f-2555-1181-7a8c2ccdf5e3]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dd98d116-01c5-bc43-1d57-e93333e5c5fb]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d9fe8ae4-5f8f-ea8c-01ce-032da8d61c02]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d07aefbb-0892-0f71-277f-62b800275169]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_475586af-fe17-46be-548f-30bb43ad8c67]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c7bb4f9b-baef-303f-b0fa-6b64918cbb34]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_230c4069-3739-55ae-acf5-e00ad49a78d0]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ac36bbc3-943b-46eb-85ca-55e5ab0da25a]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a0b1e2d4-d14b-8ef4-c8f8-e0e077da1120]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5db518fa-7beb-128d-b446-19ecc1ed26e7]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a68e3b40-11a8-92c8-5e79-f4e5958c2cd9]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f856633a-575f-54e5-c7f6-c90d5ea6f20b]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_33410af5-f1eb-d8c0-0f95-18eed2fc0545]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1c936c4a-1073-d460-a67e-b7ecc9193d82]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a503b091-ec04-4d76-7053-83d537d6d591]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_10d3203e-fb17-25f5-0443-9dc14bee4353]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b55f2d5e-2db6-b3c2-0406-c99a6086799c]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e0f2a367-7d42-c7f2-54ea-40a11bfd0c76]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a03f35f1-5783-1872-e534-e8182154d3a0]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_094169c6-7f71-8797-2811-d28b34f98cbc]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_38e26055-c0c5-c599-ce16-47b8ac4a6b9f]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_db869c37-317e-290d-53f6-6fe4e1203f9f]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_53ac7e54-c453-1470-617d-705c82a9d303]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d76e8314-e43c-b6fc-ffeb-a4a16f92ed30]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5f709788-d95a-3f64-a2d9-a5ba9b541c67]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7b173beb-1432-9c14-f9f7-c327fb91f092]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_acdc0594-ed2a-2dfd-d480-ad3fa90f0c14]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a59a5bb9-4564-eb81-3c8f-4cd28b8bfefb]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_00abd284-a436-a771-7932-2588a00861bb]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5ab17d03-a694-611a-4c30-fe42fa77d611]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f269f1e0-3257-65c9-251f-cfb8263f7178]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1e90aad6-28ad-fc45-fbf7-37e45ea84120]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_70753ccb-16bc-c1b1-f3ac-3a4def318638]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0aaf5216-6aca-175e-5690-e24589f6b339]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ad09d1f3-ea27-a0a4-a909-a03433289f68]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c754856f-9e64-7585-4e96-aea4177f666d]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9d1dd65b-e0e8-f50a-b5a4-501fcc32dffa]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7c66d139-6193-eec9-7f1f-bb66615746be]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4be86bf6-ec41-3afe-28b8-390999baf6ca]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3f2ee033-a760-0227-0c16-00efd8830dae]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_cdaf15cd-9c50-e527-3d84-4ca24fe957d0]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2d9a275c-c00f-97de-5355-7211f16b700b]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_62fe668f-b911-d1b2-fe29-65c45e4132fd]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_33b79af9-85f0-b75c-5f82-4d67c4de5a1e]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1a604a30-09cb-dd55-49a9-51661b063017]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker
names=[/nomad_init_69f54715-9420-d012-cf73-aca5df54e002] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a44fc66c-ddee-a1a9-b338-f5d1c62dcec1] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_395f79bb-f3aa-730c-b2a8-d3eacbad0b1e] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8a26522c-7343-fd3f-ecb6-e7a8207afef3] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_689aa6d5-4263-33ed-bd9d-6b995cab3269] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_60f70a58-4ada-2bd8-3a59-0d7496e78e3c] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dfaa3956-7472-60cf-7e26-24a8b746f22e] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0f894843-0f10-5d66-e833-a2a02c7d0d6d] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6bfbafe7-6eb7-d3e7-f611-ad21bde726e1] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8dc8d24e-db95-9ab8-4100-5ff73fd0107f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dff8d527-e218-0ea0-aa27-b5e7e7192b49] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_af43521c-3db6-cdd6-b4a9-7ac7d6eb06df] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c29afea1-c6d0-e407-a4f5-4eb92385ca32] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_aea4eb53-7bfa-7d7c-e516-bac0b392d2cc] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fff8efd3-199c-9fc5-f043-a2daea286333] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7d870d5b-8579-b2de-24c8-93d37eaee260] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f7fb25b5-c57c-f328-5492-982192b038e3] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4a73890a-df6f-c603-d0c4-72fada95a363] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_eb7d66d3-d0ed-d3f2-99c7-82425a05fc94] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2c7dabe8-496b-1b80-2b3b-fad6530c94be] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f8b65a43-697f-5ace-c12e-57699129054f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d7cee0be-360d-3028-cfe7-1749771765e7] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_a37c5dc0-676b-40b6-4ec5-d88db5c5104a] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_25e88201-39fa-8d38-f654-b4ac5332328f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_51934506-a27c-fce0-0046-1cb998e8fc8a] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_232196fb-c36e-a126-5765-816d8f277f5b] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0d969d14-48a2-3794-c8c9-0a5ff6ca6397] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4caca5c4-5782-7020-3b9c-319bb6fb8d6f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_022aa122-779f-2169-ceec-50158e708783] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_96fc6978-1b1e-e4ca-768f-806373b8ed61] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_392b1f76-301b-85ba-a56a-659fa7c68b5b] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6e1e8ce7-3002-4409-160c-853735b87aad] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_58a6d31f-e3af-ee39-636e-b56cf3535c35] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_30dfa12f-0b80-5592-04e7-a4e42593df05] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2f87b450-8e20-757e-c6f8-02352961a64f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_10e8a5dc-cae6-ca39-6e75-ccc24fbb5793] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8010ea2f-c1d4-95df-3c60-c986afcf820d] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_90186d8f-0c98-0f3a-1a20-f526ed171d42] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e222c9d8-611d-1158-3bd1-5cd575e3bfae] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1c8555ba-1133-2e30-c23f-a1ed2f3a52cd] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9f464894-eacb-9913-c641-a29461c38cc8] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c10a90bb-5cbc-5809-13ea-aacae5ca3ae0] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_66731feb-b41a-e45c-8786-63f8f308d1c6] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7c2979cd-c640-aee6-e178-449b7f874c61] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_ebf9b42c-3684-c50b-d777-98758305caa2] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d180bfaf-7db9-6d0c-f06e-d9f8fa4b0d08] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e4a2a0f0-ce1a-8dd4-909b-8d493b7b3147] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ac66cfa7-0a1d-1e31-2d90-2926b3573291] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_781552f3-602d-39aa-b2af-c18f114d768f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_76f5d304-f03e-56a5-03a9-4b6f1ffade8b] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ea18f060-4b36-58c4-a1b0-804bc0b2c4a8] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_113f1fc1-3b11-332d-13a7-453709d6aac0] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_282695f7-4eda-6d20-e6df-960a05c641fa] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/security_postgres-7482c5fb-31aa-9a2e-ae65-03b48382616d] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e2de0ad3-36e0-c7be-a60a-7aeccc465072] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_7b0a58a6-2274-4708-4288-dcee1491567b] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1db5c222-3edd-5cf8-40e0-c4e8dff94230] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b0bc336e-3fe5-259c-3af5-62ee25fe8880] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b345211d-47ba-4b7f-cd61-bc3dd1e25047] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6d413e3e-9258-cfbe-9570-147272d499ec] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6379ab71-b1ef-8a23-d223-65f2b1e1f67b] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_44b945dd-d512-8604-6209-1a337950815f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_18488429-5e8d-8fdf-8263-9ff8198ba613] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_48da3f5d-7216-93a2-def9-e94bc908f612] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_39f5e534-1fd5-fccd-a32e-c685e853a88e] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0240e90c-8b34-989c-e88f-289eb9aa3349] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_b3f39155-7204-f2bf-49a9-edbf0fcaa6f4] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_50eb5136-e840-f32a-98fb-44a2030fc7db] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1a5d4961-4f69-11d7-5c6a-f1173533a8da] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_41bbbe03-e95c-ff4a-e63d-42a01d25b68f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_99ee7671-099a-b141-199e-7eabb093bdff] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5c072300-990b-ec8e-3601-53d5a5c1a695] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-security-postgres-7482c5fb-31aa-9a2e-ae65-03b48382616d] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_63327bdf-7c68-5c6a-1729-29d45fedf215] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d72501c2-107e-16f2-a426-1bc6a1f1b393] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/await-for-keycloak-113f1fc1-3b11-332d-13a7-453709d6aac0] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ed8ae133-d03d-b420-9ae5-75b474a3956f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_2e3fb29a-a589-a7c0-412a-7da320727211] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f567b5c5-c5e3-0a70-822e-1e9da955d602] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_41b9d251-ee3c-d626-89b2-b7f89e4df0fb] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_83b8ebe5-10cd-3595-e5c9-4f32bf9a0aea] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_94ca89a3-e84e-34d4-dccd-8df84ef01e62] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/mimir-7402ab24-7a72-daa7-18e8-791c7fef8f1f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a93c54c3-3cf5-127e-c6d2-bb3fe7ec6a9e] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8139a273-35af-1267-555c-90af7f1b171d] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/await-for-keycloak-3372a577-0d74-8c27-cd77-bd38c6aba0e3] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_702b6438-6595-5532-4c19-aa6af75c6a97] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_18b05f18-14bd-730e-716a-9dc14e22c1a9] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_d0c76bf1-d720-811f-6d49-a0c51e021385] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c7c0acf5-af3e-6071-5a04-dac053d8d528] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-keycloak-7b0a58a6-2274-4708-4288-dcee1491567b] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dad6f08d-63f9-981e-0710-f606f16b1612] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b9b30315-d632-5298-de37-f84d88930d30] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7f515eb6-5425-326a-eb88-11177c6983e7] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7204c6f8-b9f0-e7ed-306b-b399af5f4902] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_40e835b5-7ee9-e829-79da-4a86171d644d] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/logunifier-3d488e11-cda7-c934-acc4-58513c8fa9fc] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/traefik-fe8e888f-c9b8-4899-22c7-3e5299e6a286] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f25898d6-1b9b-fa45-6d54-e4eb24679da6] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_4b9fd3bc-2afa-468c-f449-b0a22dbc83a9] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7402ab24-7a72-daa7-18e8-791c7fef8f1f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fa80e84e-7ab6-c17a-80f4-0d2dbad5f124] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/await-for-security-postgres-7b0a58a6-2274-4708-4288-dcee1491567b] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_72352633-73de-befd-2546-8f194631491b] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_883add62-5382-573f-01a4-74ccdd730433] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d43f469a-4e69-6c5b-1ec6-8c5f0cc0a6c1] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/loki-a4a139ac-114d-d326-46d9-15f4eda159bc] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7482c5fb-31aa-9a2e-ae65-03b48382616d] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e0c6a4b5-a4ac-3ca7-b7c3-e59986aef734] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/tempo-1b74c940-fcb1-69e1-6d67-0b63054f266c] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_401e0cdd-f517-3b40-94d7-81b2f92e2e46] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_791a2e74-2587-1b51-8973-3237ad797a81] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_63541a26-e4e1-b0e0-43e7-63f512c38704] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8fc6b74b-9f30-7f05-61ab-2c924a3b8098] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ef11d95f-70b0-c7c6-bc36-14168c070c72] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b546beb3-03b8-6d1e-5043-668d3eda5687] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_24bc8713-ad7d-041a-fe16-cada508034d2] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1a16cbe1-f824-2af9-80ee-09842a8a8241] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b5ea3aa9-e526-8c29-a60d-7795d09509ac] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_115fc8b3-d9fa-010d-1405-a2e0c7fa1b5e] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4866d91a-2ec9-04b4-38d0-64f463f2ae12] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_6cc03836-41d7-fd6b-7681-79da7c0e3e29] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a540a62f-1446-270f-c043-051226c5dff7] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f0a21535-449e-3f2c-3607-c38cdae7df35] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ae8402f0-024f-f584-5f52-49c3a9faa0ae] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_01a65f63-28e6-8105-5beb-1d4f0d295757] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0d28e4db-8bc5-4422-54b5-69150fa7108a] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_bae6e060-b7ad-12c1-490f-a6a99ec6038f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9ba5f11e-f068-a873-b0d8-4aab2225ee73] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_42f56e72-cd27-d498-8ee0-9f61ddd62bd3] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1d7ad2fe-9254-a983-8f30-3474d14985a9] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4be69159-89ef-a308-568d-b1f562aa3b82] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_a0bb855f-ec2d-f607-fcad-4e8d1b651540] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c9db02e1-a294-f637-9c2c-4ed146c46db5] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5fc105f0-5a7e-408d-3e3c-e3366b5ac222] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_27d401ae-ccac-5357-9800-f3b93a2bbf8a] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_08893a28-b5e6-ac4b-8a80-da7dda48d71c] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7c4eb1af-0b87-2a6e-4201-2afba9d75bc2] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9f38e15f-9ef3-095d-5920-6c3f5d5f4154] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_451b867b-b426-9d6b-1402-3b5b3bb8dc10] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_faffa699-9a3f-f235-00d0-8222bd2704db] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4e0eb0a5-1e92-b0ef-5626-6baee78a0096] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a496910b-e46c-2a50-94d9-b9cccade6da3] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_1d400e59-1686-5416-26ed-e328c33657ce] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f30a2d53-ad5c-2680-6145-5fcca77d0806] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e6c18dc9-765a-269e-ddd4-6541faaa71fc] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d5435bc0-09c5-8116-8e85-b6dc6de92dcb] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_210c5f9d-f522-4018-44a3-a35a5be973d0] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f36080f5-de47-4f3a-43e2-f2cbd77471fe] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7cbb5e24-5969-999b-9fbb-df11e6023463] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_562d515c-f5f3-6318-5cb0-62a6cc8a3f52] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_92c3fc44-45a1-7e5c-d1e7-fa0936e305d3] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_976be946-07a4-010b-bd1b-8b24f97b77ef] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_06167f7a-9fc6-e391-c32e-04f1c9be9d74] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_2236ba16-24ec-4cae-a904-6e2ec00c2235] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_42d8020c-ab01-d852-bf4d-3f8b818fec88] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b77e81cb-1c47-924e-de6c-178aa424d548] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ce740085-603e-8c99-f95f-6bd3a046125c] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_055db48f-1569-e610-bf66-855290350cc2] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0162d7ac-15c0-ee4c-84e1-acb0ec062a70] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_85fde91b-9eda-e1e7-bbcd-50ca082eb50b] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d884e696-31a8-7a7f-a3f0-2f3115571b25] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e0571cc3-5d44-db64-4574-aabbd14c7e11] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_43216db8-e085-1598-2de0-33d011c7a694] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3af77b66-c371-f94e-a6c4-1a714e550ea5] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_ee692e00-1d1f-5224-ee51-d131f95cc6a1] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6dcf4bfa-0af8-24fa-b4ca-f51cf59bc72a] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a959690f-4ca6-b16b-2ef7-af85f846e39e] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_31167ba1-9760-26f3-9b81-f6a26e1cdee3] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4a634af1-b6b5-7399-72ed-8df147121301] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_81c2c252-0fd8-13bf-2a2c-86e11f7f7c2f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f99620d3-5066-702b-1bdd-785063bd8458] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_245b7a62-98e2-a367-fdea-79bbae4351b8] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5836b290-9722-d43c-69e3-0ca326679dc2] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9e970bd1-1db4-8a3c-0012-af615216a289] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_54a42993-1455-1101-f925-7b44cd21e63a] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_c4763cc3-c845-d35c-3851-6aef16578419] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_67d482ae-3889-8323-7eac-18928a2e2b79] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_80d103b4-f1b5-cc84-6bbf-3d918c2e7a1f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_47ae54eb-9927-1223-3927-e04326bae278] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d0d5d8f2-f0fa-3d69-f582-584918264a2b] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_66e7e264-304b-aef4-a77e-c53e504d7c41] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_56fa7fb8-f6b7-bba4-0176-23f4fc2c53a3] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_51ead1db-8c4c-a83d-7530-31b0d7d642b0] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_32561c34-1040-b3d2-dbb1-f70e3ae87139] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_faeb2629-97cd-4c98-5dde-f8daf8bfb9a4] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e6e7f5f6-420c-d67d-914d-fcb14f5ccb82] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_2369851b-4d5c-486a-01b6-b3be16cf1dd7] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_363a1b3e-f1dd-475f-0835-3361a71e3959] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1115fa0b-5239-07ff-fd6a-360806092c0f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7a06fc09-a648-27d4-21ba-a106ae8a67a9] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1a72b497-e2be-fdb7-67b8-b4da3300e99e] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e0c39273-62f1-45b4-bc64-555e01b80a75] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0fe726d0-4672-85b9-4eda-a856d61b94b3] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0e5811e2-d255-c4d9-bf36-5fd969595349] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4ce37b28-01e7-3e0d-609a-86b12b47dfa8] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d0104afc-2307-d89f-85b4-d5bda88c5d75] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fd07f4b0-448a-89f3-e776-042884a954b6] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_2a33e7e3-89c7-eff7-640a-26d44fef367c] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_76fe058b-0dd7-4b9b-232c-5e75ffd01069] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ca1198a8-f040-1f2c-f390-ff919c577efa] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_68c263df-6161-b2db-758e-f0db602bef34] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6022c83a-faa7-0544-d9f9-d04827d0e938] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f1f2976d-c36a-35d4-fa00-7c3f8252f292] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_42d0351a-ded8-99c2-7e3b-9142fb79a495] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_496351af-34dc-7b83-f449-323049b051b3] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b2ef4cd5-3443-645b-9bbd-d962b7273c5a] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_274e1659-46f7-b45b-1b70-10889078595c] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_459d1eb6-22c4-53c0-49f1-171a5d754fd7] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_b81968a4-1c3c-a446-5958-9f4abc7a1203] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1ff419ee-9614-de46-aeee-ed5d68d0cca3] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5976fd18-1ef3-b888-83ae-3315150cfca0] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9f4fda85-3686-00a6-6d40-c08d2ef781d4] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c3694fc9-5c13-0c7c-ce5a-1cd873f7b090] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_abe6958e-ded5-09e2-b468-4445ea93d505] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_630a54e1-e69e-7110-a27e-220de6cba009] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_94773357-3cc2-d331-0413-db39d3a0a0fe] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6a29b206-61b1-8943-4fd4-3d774b903927] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3bd03adf-59fd-d7f7-014c-f702435b095c] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a21a59e6-1a0c-bab8-aee9-7738c527f613] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_a9654f51-b825-964e-6c7d-c0b47c8db273] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2548a0e3-5a69-6b6d-f666-16da44615de5] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_314ca142-d66e-f476-2c45-dee23cd0fbaf] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_50da53e6-08cc-42d9-0a27-a7fc199bc5a7] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f59c5dad-ff2c-36bf-93f7-9c036319ade4] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e2d67b88-674c-43e4-cfbb-b8f21d179eba] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f0b7d139-e088-ffee-1e58-d79bb65d4e51] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_147e6cb9-a6a4-df14-f8fe-bebf80283b7d] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7347a644-3888-21ab-85d8-cc16a92b71e8] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_851d0e31-30a2-2387-7f39-5f22619f8228] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_602b3647-be8a-e4ac-eeff-8d7a19cb0c93] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_f052b173-56d7-c4f2-9fe8-e4b8d2a9d5ac] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1094fd09-d64d-c2e7-0c7f-0aa42cf78f70] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fdde9248-8a81-60d2-e98c-10bb13859b27] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5a8a1024-181c-c2fd-36a8-6a16f3bc378f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d82a5e0e-c5ca-64af-c3b0-49a0cb7b8473] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ba89033d-381e-380f-a313-8e67732bd8e6] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f6ad86a9-22a0-728d-f2e1-9af135f52ed7] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1c365ee2-a710-1b93-5489-d0c66d4cdd22] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_acfec9f6-3133-31a4-078f-40ebb2f39f8e] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8f01e4ba-aaed-521b-9a45-82e5967eaacc] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_df369a2f-6e29-de39-1db1-ed4e3f522024] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_2e71392f-868f-aa7e-3785-5ac5d7df25d8] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e3bc3462-39f9-edc1-53dd-7c2b7cd5da3d] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6b5a6b71-5250-8673-f593-aa6ee4638065] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7bf40c9b-bede-9649-a6ab-22a452a78fc0] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d28a14be-5c9d-8542-037e-9c6f94fac918] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_16fc5971-df89-6f70-dbb2-fd4d1aa02b7c] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ffa03404-35c9-544f-61e4-384f3072e448] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f9acabbb-ea2e-d896-3bb8-164c69f3f7bb] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f6c4f7ed-a56e-d4f1-5e51-26fd37bbaf47] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e053ed0c-b0d0-a10c-706e-cad9c3894ba3] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0b226dd2-157d-a28a-c795-9029e48ec15b] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_1dea44e1-d487-8e23-882e-41764384910c] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_39f0010c-215b-705d-90cd-e44ad95abb3b] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a2007bca-0063-ded6-2156-5b85e78439ff] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_68f21a10-ad1f-2fad-3da3-04ceb007fd12] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2b545b55-a879-2a61-c48d-2ad8d3d6fdfd] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_05b44026-b546-6787-3560-5a85067e7e21] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f6ca3778-e595-7aac-c2f3-74760a353791] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7ce0a768-4980-be18-f5a0-d5be0db3fdc1] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9526fbc7-3610-5fe1-e724-a58f5d8c2252] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_87357443-30c3-d5eb-2316-d6be690569f9] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_57b76880-d518-ad46-f61c-3f5a55832697] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_437c77a9-d009-363c-a310-e84018d92267] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2b614f06-bae3-c190-2282-3e6223b7695f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fbe4f379-1123-03c7-fe5b-8b149fb2e820] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3dbdeafa-ed09-723a-cbba-af0d2327332f] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6ea2d588-0158-03da-4cff-ce09bf17f39a] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d8865f9e-41e6-11ca-9028-2cac33fca873] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1812823e-1ee3-18fc-f949-53008ee41c21] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c7d91bc1-6104-053d-a3c3-3b0db6144971] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_df58adfc-14cb-4f61-0276-28174b269aa1] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5834c11d-dedf-2c3f-2f1e-be556fb85bf7] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d2acea1d-937a-44e9-9bf6-9d5e7a6c12e9] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_fe444a04-7a6d-ea8c-15ff-615bd8b72403] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ca9ac364-9d2b-7599-e303-b0cdd6a722ce] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7c586be6-1f73-d38e-4f9c-e1cc0425b55c] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5c3a3083-fa05-b95c-7070-40b6a46bb240] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4434bda8-193f-f9cd-1c83-9d33bbe8458b] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9a6dc408-e798-0eff-e5ff-540a8dd61ad5] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f230141a-1c86-26b5-523e-a228f361dcfa] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_825ff2df-fbdc-029b-9838-6bc708458c70] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7b8cfe16-c275-e351-03bd-26f57c25ea03] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_72986a87-83b2-5673-9767-e73aeb0cd4d1] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7740c5b7-a332-8a5a-eb7a-d63462803226] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_200f68d9-e9d7-82be-5be6-b5b77ca86dd5] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f243cd25-2d58-e4c6-77be-263d5637bb73] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d9d5571d-c958-4fe8-c343-2cc42f538303] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2836b024-a589-985c-0807-40e98016c865] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a804a117-5551-f327-5208-5000ee6bfb5a] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_14890f6d-8985-170b-f811-da5760a854c0] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a388182f-c115-1cb7-507f-a5a42e8cf46b] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8d6a65b7-c873-747f-3d33-2e3bc676886a] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ecc21062-ca4c-0d9e-29f8-7afc9392a289] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d0c7bbb9-44fd-b90e-ef1b-51f611c554d6] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_defe8e1b-55d9-c971-0063-5ca8e75c638d] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_65bed301-410d-8c09-85dc-256b48f45d20] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_aa65a4b0-8af8-e682-b93a-250f8ba52762] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_00014660-de8f-45ad-7205-27cf2fc40766] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4972ae37-0cd4-78a9-c529-5599b017947c] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_59587176-1988-7701-e2e2-d7108c65c9a1] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8decf8fe-2f57-a52e-e09d-f1cc175e4127] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a9a80500-d699-abf6-5232-9c404c76c177] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_362ea24f-432e-a901-819e-ae87c85b56b1] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fe3d20c4-1010-3934-193b-d30a30796e51] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b6b529bb-d3e2-6754-09da-843990e2cfdd] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4c1c6bcd-7f8a-c4ed-4e7b-8b77943bcd86] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_ba54ee9f-1ea8-8d7a-97f4-c4bbb8a100cc] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a04f9617-e186-e57d-9d39-58fadc1884a9] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_76ebdad2-ea21-607e-cf2f-31d7a5f52c27] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b35fc10f-6c49-05ef-5fbd-54e1d34a8eb9] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e2a406c0-4c8b-3ffc-6949-4fbcce959db5] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_bffa0040-dc3d-2500-fb74-3da2566daaa0] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ce6fae0f-a797-6072-4fb0-bd9a657b3b36] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5f61822e-ae64-73bd-a123-0d9c1cb664ee] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6276dcd1-b0ac-44a2-0f55-1775300b7745] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_62a893f3-952f-600f-4c1f-1b870e2f955a] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_586342d7-6208-0207-a6e0-e064ae0ac3b5] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_94c6920c-00fc-23c3-c8cb-640119555801] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9aaf1bc7-2914-f827-5915-08fc1f7bd026] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5b5ec3f2-86bf-8243-aac6-14ae5a917fc1] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d91b81f5-3de4-6ff1-fb28-04e8261292fd] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8e63a3b8-fea4-00e8-6528-5c82237253b3] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1fa606ce-7e0c-fbb9-fedf-ecadeca615f3] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1ec67236-1fcc-8371-b6bc-c6b4ed14880e] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c87421f7-d728-d8ae-1a40-759fd46f1d65] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1880d484-d075-0c21-7927-1cb6ba25c4cf] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3ed4cf72-29c8-f403-5abe-d8494413b0e1] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1d42f5cb-82ae-0546-c63d-05072aa6e6f9] 2023-05-11T11:38:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_b56ec4e1-3925-2be5-30c9-92e24d3762de]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5363c786-e762-4a48-4c75-ec2bbb0f2f61]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c171b5fc-54cd-7e22-3024-f3d5687cf6ec]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_33ed6753-c1cc-1774-e8b5-54613a9a276c]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4cdcba1a-474a-3066-72b8-1f89f2e8e92a]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7ff86af4-8662-81e3-5438-5f07e592fdf9]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f25b1973-2baf-1b77-9ea7-1e241d692354]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0610f998-5930-1541-7be6-d59ec2e47229]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d98fc096-da2b-f179-f9a6-de4a11cc4c16]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f6683b92-e22b-03f8-a8a6-975ebc47518f]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_07e6f9bd-969e-72ba-4947-f8552f47c5e7]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_822821b2-b293-2ea7-0b3c-34f77dc63476]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2a2868e4-18b3-5fb2-3740-28939a6b385c]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dbdcddfa-ce19-ac32-dd7e-dcca9d5884f9]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_38c2572b-24c0-65ba-43a6-5489504a32f8]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/grafana-bd7464de-fa72-736b-e57c-6782cc7d7202]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6a550518-6fe6-1d6c-758e-b8d79b4a7f91]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-grafana-bd7464de-fa72-736b-e57c-6782cc7d7202]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3d074f19-a056-6a32-78be-833cf8cafeed]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_96e6dcbf-7d61-6ca1-2df1-b6eb54bd7ade]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_907c3a55-036a-5569-003b-04839e1fc6c8]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3f553197-72d8-5935-7960-15619519b1fc]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e9a342a8-67cb-b747-1c58-839ea5d53d3b]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_bd7464de-fa72-736b-e57c-6782cc7d7202]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2d549ab9-a7a8-65ac-1697-dd440dd0e3d7]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_82b10536-6a1c-644a-c3bc-3e2259536c58]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/traefik-d9d9c594-c60d-e002-6448-2a9b8b5fa6ec]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/mimir-e9a342a8-67cb-b747-1c58-839ea5d53d3b]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/loki-2d549ab9-a7a8-65ac-1697-dd440dd0e3d7]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4d92967c-5996-752c-1cac-6f079b2c8099]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/minio-a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b9bd1537-0bae-8c11-41b3-437a4c21df29]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nats-4d92967c-5996-752c-1cac-6f079b2c8099]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9a3ae9f7-2ed3-c25c-12d9-d792452841d8]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nats-prometheus-exporter-4d92967c-5996-752c-1cac-6f079b2c8099]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/grafana-agent-9a3ae9f7-2ed3-c25c-12d9-d792452841d8]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_87cd4d30-a263-1598-0a57-72046f840473]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/tempo-b9bd1537-0bae-8c11-41b3-437a4c21df29]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1022bc44-6bd1-c8c5-62c5-4166c31f7afc]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/keycloak_postgres-27dfb19c-1e44-2e49-a689-0a4e369f7bd2]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-keycloak-postgres-27dfb19c-1e44-2e49-a689-0a4e369f7bd2]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_27dfb19c-1e44-2e49-a689-0a4e369f7bd2]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/keycloak-1022bc44-6bd1-c8c5-62c5-4166c31f7afc]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-keycloak-1022bc44-6bd1-c8c5-62c5-4166c31f7afc]
2023-05-11T11:38:12+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/forwardauth-87cd4d30-a263-1598-0a57-72046f840473]
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.391664ms
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=13.084372972s
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=11
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=alloc_health_watcher start="2023-05-11 09:38:11.933528731 +0000 UTC m=+311.207575305"
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=network start="2023-05-11 09:38:11.933670472 +0000 UTC m=+311.207717044"
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=migrate_disk end="2023-05-11 09:38:11.933461221 +0000 UTC m=+311.207507796" duration=2.319001ms
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=alloc_health_watcher end="2023-05-11 09:38:11.933656522 +0000 UTC m=+311.207703108" duration="127.803µs"
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.runner_hook.alloc_health_watcher: watching: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce deadline="2023-05-11 09:43:11.9335415 +0000 UTC m=+611.207588077" checks=true min_healthy_time=10s
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:43606: EOF
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=await_previous_allocations end="2023-05-11 09:38:11.931033639 +0000 UTC m=+311.205080213" duration=1.782711ms
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=14.356588445s
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=migrate_disk start="2023-05-11 09:38:11.931142216 +0000 UTC m=+311.205188795"
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_migrator: waiting for remote previous alloc to terminate: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce previous_alloc=1845d837-285b-1d3c-86b9-16c47274106e
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=cgroup end="2023-05-11 09:38:11.929207389 +0000 UTC m=+311.203253971" duration=26.694643ms
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=await_previous_allocations start="2023-05-11 09:38:11.929250918 +0000 UTC m=+311.203297502"
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_migrator: waiting for remote previous alloc to terminate: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce previous_alloc=1845d837-285b-1d3c-86b9-16c47274106e
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=2.316353ms
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.559229ms
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations?index=15490 duration="552.181µs"
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hooks: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce start="2023-05-11 09:38:11.902169301 +0000 UTC m=+311.176215884"
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=alloc_dir end="2023-05-11 09:38:11.902464255 +0000 UTC m=+311.176510836" duration="225.865µs"
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=cgroup start="2023-05-11 09:38:11.902512751 +0000 UTC m=+311.176559328"
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.cpuset.v2: add allocation: name=observability.logunifier[0] id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce name=alloc_dir start="2023-05-11 09:38:11.902238386 +0000 UTC m=+311.176284971"
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=1 removed=0 updated=9 ignored=12 errors=0
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce task=logunifier type=Received msg="Task received by client" failed=false
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_coordinator: state transition: alloc_id=5f8ff55c-60f6-b98d-d49f-bac37cf70bce from=init to=init
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=10.915048007s
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15493 total=22 pulled=10 filtered=12
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=1 removed=0 updated=9 ignored=12
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=12 errors=0
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=15.966431454s
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15490 total=21 pulled=9 filtered=12
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=12
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "logunifier": (place 1) (inplace 0) (destructive 0) (stop 1) (migrate 0) (ignore 0) (canary 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=c8b908c8-32ae-b314-e642-c6478aee4e9c type=service namespace=default job_id=observability node_id=e2eb7460-2bca-ac62-5c53-999281062667 triggered_by=node-update
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 1) (destructive 0) (inplace 0) (stop 1) (disconnect 0) (reconnect 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=c8b908c8-32ae-b314-e642-c6478aee4e9c job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=c8b908c8-32ae-b314-e642-c6478aee4e9c
2023-05-11T11:38:11+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=11.639361841s
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/evaluations?index=15485 duration=5.585918ms
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] nomad: evaluating plan: plan="(eval c8b908c8, job observability, NodeUpdates: (node[e2eb7460] (1845d837 stop/evict)), NodeAllocations: (node[36d1fc65] (5f8ff55c observability.logunifier[0] run)))"
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=c8b908c8-32ae-b314-e642-c6478aee4e9c job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=de373f48-0d3a-6e76-44af-4945530e2687 type=service namespace=default job_id=security node_id=e2eb7460-2bca-ac62-5c53-999281062667 triggered_by=node-update
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=c8b908c8-32ae-b314-e642-c6478aee4e9c type=service namespace=default job_id=observability node_id=e2eb7460-2bca-ac62-5c53-999281062667 triggered_by=node-update
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=c8b908c8-32ae-b314-e642-c6478aee4e9c job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=de373f48-0d3a-6e76-44af-4945530e2687 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=de373f48-0d3a-6e76-44af-4945530e2687 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=de373f48-0d3a-6e76-44af-4945530e2687 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=de373f48-0d3a-6e76-44af-4945530e2687 type=service namespace=default job_id=security node_id=e2eb7460-2bca-ac62-5c53-999281062667 triggered_by=node-update
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=f4d43290-1d94-3564-06aa-2eee740fdb1c type=service namespace=default job_id=security node_id=f652ee64-d508-464f-bfb5-d1a36ac8f3d9 triggered_by=node-update
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations?index=15478 duration=26.934086037s
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=f4d43290-1d94-3564-06aa-2eee740fdb1c
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=f4d43290-1d94-3564-06aa-2eee740fdb1c job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/summary?index=15474 duration=37.050748056s
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=f4d43290-1d94-3564-06aa-2eee740fdb1c type=service namespace=default job_id=security node_id=f652ee64-d508-464f-bfb5-d1a36ac8f3d9 triggered_by=node-update
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=f4d43290-1d94-3564-06aa-2eee740fdb1c job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] nomad: evaluating plan: plan="(eval f4d43290, job security, NodeUpdates: (node[f652ee64] (57e22169 stop/evict))(node[e2eb7460] (e5fea251 stop/evict)))"
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=f4d43290-1d94-3564-06aa-2eee740fdb1c job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=0b0dd917-087f-201f-502d-c4d20c7c8476 type=system namespace=default job_id=ingress node_id=e2eb7460-2bca-ac62-5c53-999281062667 triggered_by=node-update
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker.system_sched: setting eval status: eval_id=0b0dd917-087f-201f-502d-c4d20c7c8476 job_id=ingress namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker.system_sched.binpack: NewBinPackIterator created: eval_id=0b0dd917-087f-201f-502d-c4d20c7c8476 job_id=ingress namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker.system_sched: reconciled current state with desired state: eval_id=0b0dd917-087f-201f-502d-c4d20c7c8476 job_id=ingress namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 results="allocs: (place 0) (update 0) (migrate 0) (stop 0) (ignore 1) (lost 0) (disconnecting 0) (reconnecting 0)"
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=0b0dd917-087f-201f-502d-c4d20c7c8476 type=system namespace=default job_id=ingress node_id=e2eb7460-2bca-ac62-5c53-999281062667 triggered_by=node-update
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=43794fb8-78a5-1145-5e2c-c00a26455fd8
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=43794fb8-78a5-1145-5e2c-c00a26455fd8 type=system namespace=default job_id=ingress node_id=f652ee64-d508-464f-bfb5-d1a36ac8f3d9 triggered_by=node-update
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker.system_sched: setting eval status: eval_id=43794fb8-78a5-1145-5e2c-c00a26455fd8 job_id=ingress namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker.system_sched: reconciled current state with desired state: eval_id=43794fb8-78a5-1145-5e2c-c00a26455fd8 job_id=ingress namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 results="allocs: (place 0) (update 0) (migrate 0) (stop 0) (ignore 1) (lost 2) (disconnecting 0) (reconnecting 0)"
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker.system_sched.binpack: NewBinPackIterator created: eval_id=43794fb8-78a5-1145-5e2c-c00a26455fd8 job_id=ingress namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=43794fb8-78a5-1145-5e2c-c00a26455fd8 type=system namespace=default job_id=ingress node_id=f652ee64-d508-464f-bfb5-d1a36ac8f3d9 triggered_by=node-update
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] nomad: evaluating plan: plan="(eval 43794fb8, job ingress, NodeUpdates: (node[f652ee64] (52348e2b stop/evict))(node[e2eb7460] (75709f55 stop/evict)))"
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/evaluations?index=15481 duration=23.677542803s
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] core.sched: job GC scanning before cutoff index: index=4 job_gc_threshold=4h0m0s
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=0ed323bd-714b-49a5-2de3-47f31ee908cb type=_core namespace=- job_id=job-gc node_id="" triggered_by=scheduled
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] core.sched: CSI plugin GC scanning before cutoff index: index=4 csi_plugin_gc_threshold=1h0m0s
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=8b6836e4-18b8-02ea-0293-d91cab5172ba type=_core namespace=- job_id=csi-plugin-gc node_id="" triggered_by=scheduled
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=8b6836e4-18b8-02ea-0293-d91cab5172ba type=_core namespace=- job_id=csi-plugin-gc node_id="" triggered_by=scheduled
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=0ed323bd-714b-49a5-2de3-47f31ee908cb type=_core namespace=- job_id=job-gc node_id="" triggered_by=scheduled
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] core.sched: garbage collecting unclaimed CSI volume claims: eval.JobID=csi-volume-claim-gc
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=29769bda-90f5-7216-ee3d-12c93561ee6d type=_core namespace=- job_id=node-gc node_id="" triggered_by=scheduled
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=29769bda-90f5-7216-ee3d-12c93561ee6d type=_core namespace=- job_id=node-gc node_id="" triggered_by=scheduled
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [⚠] nomad.heartbeat: node TTL expired: node_id=f652ee64-d508-464f-bfb5-d1a36ac8f3d9
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=0333d6ec-c2a0-9326-0f5f-432e615e2535 type=_core namespace=- job_id=deployment-gc node_id="" triggered_by=scheduled
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [⚠] nomad.heartbeat: node TTL expired: node_id=e2eb7460-2bca-ac62-5c53-999281062667
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=dce3616a-9fa7-c6f8-fece-efbc379bfbb3 type=_core namespace=- job_id=local-token-expired-gc node_id="" triggered_by=scheduled
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=eade5460-16e7-5a83-7e54-dc27f4525acb type=_core namespace=- job_id=csi-volume-claim-gc node_id="" triggered_by=scheduled
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=eade5460-16e7-5a83-7e54-dc27f4525acb type=_core namespace=- job_id=csi-volume-claim-gc node_id="" triggered_by=scheduled
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=0333d6ec-c2a0-9326-0f5f-432e615e2535 type=_core namespace=- job_id=deployment-gc node_id="" triggered_by=scheduled
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] core.sched: node GC scanning before cutoff index: index=4 node_gc_threshold=24h0m0s
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] core.sched: deployment GC scanning before cutoff index: index=4 deployment_gc_threshold=1h0m0s
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=71a73f95-c8fe-081b-e9d1-c50d05c96be7 type=_core namespace=- job_id=eval-gc node_id="" triggered_by=scheduled
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=71a73f95-c8fe-081b-e9d1-c50d05c96be7 type=_core namespace=- job_id=eval-gc node_id="" triggered_by=scheduled
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] core.sched: eval GC scanning before cutoff index: index=4 batch_eval_gc_threshold=24h0m0s
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] core.sched: eval GC scanning before cutoff index: index=4 eval_gc_threshold=1h0m0s
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=dce3616a-9fa7-c6f8-fece-efbc379bfbb3 type=_core namespace=- job_id=local-token-expired-gc node_id="" triggered_by=scheduled
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] core.sched: CSI volume claim GC scanning before cutoff index: index=9056 csi_volume_claim_gc_threshold=5m0s
2023-05-11T11:38:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=1.078898ms
2023-05-11T11:38:10+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:38:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.662434ms
2023-05-11T11:38:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.046548ms
2023-05-11T11:38:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.507177ms
2023-05-11T11:38:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="457.887µs"
2023-05-11T11:38:10+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:38:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="373.252µs"
2023-05-11T11:38:09+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2
2023-05-11T11:38:09+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 error="dial tcp 10.21.21.42:24915: connect: connection refused"
2023-05-11T11:38:09+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="322.898µs"
2023-05-11T11:38:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="364.206µs"
2023-05-11T11:38:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.555843ms
2023-05-11T11:38:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.706395ms
2023-05-11T11:38:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=2.149776ms
2023-05-11T11:38:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration=2.304411ms
2023-05-11T11:38:07+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:38:07+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:38:06+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="733.778µs"
2023-05-11T11:38:06+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:38:06+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.409346ms
2023-05-11T11:38:06+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:43858
2023-05-11T11:38:05+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:38:05+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:38:05+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="385.083µs"
2023-05-11T11:38:05+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.551946ms
2023-05-11T11:38:05+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.02755ms
2023-05-11T11:38:05+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.36617ms
2023-05-11T11:38:05+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="499.199µs"
2023-05-11T11:38:05+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:38:04+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="383.747µs"
2023-05-11T11:38:03+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="357.713µs"
2023-05-11T11:38:02+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff check="service: \"grafana-agent-ready\" check" task=group-grafana-agent time_limit=40s
2023-05-11T11:38:02+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff check="service: \"grafana-agent-health\" check" task=group-grafana-agent time_limit=20s
2023-05-11T11:38:02+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=15.003880155s
2023-05-11T11:38:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="450.117µs"
2023-05-11T11:38:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration="957.884µs"
2023-05-11T11:38:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.15202ms
2023-05-11T11:38:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=3.807492ms
2023-05-11T11:38:02+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:38:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="606.955µs"
2023-05-11T11:38:02+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner.task_hook.script_checks: tasklet executing: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres
2023-05-11T11:38:01+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.393327ms
2023-05-11T11:38:01+02:00 [nomad.service 💻 master-01] [👀] consul.sync: execute sync: reason=periodic
2023-05-11T11:38:01+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="501.425µs"
2023-05-11T11:38:00+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:38:00+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="369.775µs"
2023-05-11T11:38:00+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.077795ms
2023-05-11T11:38:00+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.129183ms
2023-05-11T11:38:00+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.414664ms
2023-05-11T11:38:00+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="756.452µs"
2023-05-11T11:38:00+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:59+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="354.231µs"
2023-05-11T11:37:59+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 error="dial tcp 10.21.21.42:24915: connect: connection refused"
2023-05-11T11:37:59+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2
2023-05-11T11:37:58+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) no new data (index was the same)
2023-05-11T11:37:58+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) marking successful data response
2023-05-11T11:37:58+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) successful contact, resetting retries
2023-05-11T11:37:58+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): GET /v1/health/service/mimir?index=14250&stale=true&wait=1m0s
2023-05-11T11:37:58+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): returned 1 results after filtering
2023-05-11T11:37:58+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): returned 1 results
2023-05-11T11:37:58+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="336.032µs"
2023-05-11T11:37:58+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): GET /v1/health/service/mimir?index=14250&stale=true&wait=1m0s
2023-05-11T11:37:58+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): returned 1 results after filtering
2023-05-11T11:37:58+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) successful contact, resetting retries
2023-05-11T11:37:58+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) no new data (index was the same)
2023-05-11T11:37:58+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) marking successful data response
2023-05-11T11:37:58+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): returned 1 results
2023-05-11T11:37:57+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=7.238316ms
2023-05-11T11:37:57+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=3.782663ms
2023-05-11T11:37:57+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.327818ms
2023-05-11T11:37:57+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:57+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="782.236µs"
2023-05-11T11:37:57+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="848.591µs"
2023-05-11T11:37:57+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:37:56+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:37:56+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) no new data (index was the same)
2023-05-11T11:37:56+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): returned 1 results
2023-05-11T11:37:56+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) marking successful data response
2023-05-11T11:37:56+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) health.service(mimir|passing) successful contact, resetting retries
2023-05-11T11:37:56+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): returned 1 results after filtering
2023-05-11T11:37:56+02:00 [nomad.service 💻 worker-01] [👀] agent: health.service(mimir|passing): GET /v1/health/service/mimir?index=14250&stale=true&wait=1m0s
2023-05-11T11:37:56+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.127142ms
2023-05-11T11:37:56+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="679.961µs"
2023-05-11T11:37:56+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:59390
2023-05-11T11:37:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:37:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:37:55+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="739.061µs"
2023-05-11T11:37:55+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.448529ms
2023-05-11T11:37:55+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.537415ms
2023-05-11T11:37:55+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.344088ms
2023-05-11T11:37:55+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="546.255µs"
2023-05-11T11:37:55+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:54+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="445.901µs"
2023-05-11T11:37:53+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="320.194µs"
2023-05-11T11:37:52+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee check=keycloak_postgres_ping task=group-keycloak-postgres time_limit=20s
2023-05-11T11:37:52+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.075459ms
2023-05-11T11:37:52+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.09331ms
2023-05-11T11:37:52+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.829856ms
2023-05-11T11:37:52+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="389.867µs"
2023-05-11T11:37:52+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:52+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner.task_hook.script_checks: tasklet executing: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres
2023-05-11T11:37:52+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="420.181µs"
2023-05-11T11:37:51+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.537646ms
2023-05-11T11:37:50+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="490.399µs"
2023-05-11T11:37:50+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:37:50+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.704175ms
2023-05-11T11:37:50+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.70376ms
2023-05-11T11:37:50+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:56934: EOF
2023-05-11T11:37:50+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:50+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="405.779µs"
2023-05-11T11:37:50+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.109304ms
2023-05-11T11:37:49+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=4.820534ms
2023-05-11T11:37:49+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2
2023-05-11T11:37:49+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 error="dial tcp 10.21.21.42:24915: connect: connection refused"
2023-05-11T11:37:48+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="461.796µs"
2023-05-11T11:37:47+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="626.003µs"
2023-05-11T11:37:47+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:42528: EOF
2023-05-11T11:37:47+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.991534ms
2023-05-11T11:37:47+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:42520: EOF
2023-05-11T11:37:47+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.585149ms
2023-05-11T11:37:47+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:42514: EOF
2023-05-11T11:37:47+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="529.462µs"
2023-05-11T11:37:47+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:47+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.425039ms
2023-05-11T11:37:47+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:37:46+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="366.646µs"
2023-05-11T11:37:46+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.756191ms
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:52064
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/evaluations?index=15479 duration="363.472µs"
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=1e312fb7-c33e-5368-bdfc-5ba01d1f10e8 type=service namespace=default job_id=security node_id="" triggered_by=deployment-watcher
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=1e312fb7-c33e-5368-bdfc-5ba01d1f10e8 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security?index=15459 duration=33.710356294s
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [🐞] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=1e312fb7-c33e-5368-bdfc-5ba01d1f10e8
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/deployment?index=15478 duration=1.248666375s
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [✅] | Deployment Update for ID "89024955-4251-b075-2135-dd298aaabc72": Status "successful"; Description "Deployment completed successfully"
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/evaluations?index=15477 duration=4.847179813s
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=1e312fb7-c33e-5368-bdfc-5ba01d1f10e8 type=service namespace=default job_id=security node_id="" triggered_by=deployment-watcher
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [👀] nomad: evaluating plan: plan="(eval 1e312fb7, job security, DeploymentUpdates: (89024955 successful))"
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=1e312fb7-c33e-5368-bdfc-5ba01d1f10e8 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:37:46+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=1e312fb7-c33e-5368-bdfc-5ba01d1f10e8 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:37:45+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:37:45+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:37:45+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="771.126µs"
2023-05-11T11:37:44+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:42484: EOF
2023-05-11T11:37:44+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.647499ms
2023-05-11T11:37:44+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:42460: EOF
2023-05-11T11:37:44+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.933971ms
2023-05-11T11:37:44+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:42462: EOF
2023-05-11T11:37:44+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="371µs"
2023-05-11T11:37:44+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:44+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.218056ms
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=12 errors=0
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=17.758258632s
2023-05-11T11:37:44+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations?index=15475 duration=6.837519402s
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15478 total=21 pulled=9 filtered=12
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=12
2023-05-11T11:37:44+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/deployment?index=15475 duration=6.845146928s
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=12.777645616s
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 client_status=running desired_status=""
2023-05-11T11:37:44+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.runner_hook.alloc_health_watcher: health set: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 healthy=true
2023-05-11T11:37:44+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="479.221µs"
2023-05-11T11:37:43+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=2.930174ms
2023-05-11T11:37:43+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:42452: EOF
2023-05-11T11:37:43+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=2.2369ms
2023-05-11T11:37:43+02:00 [nomad.service 💻 master-01] [🐞] http: request complete:
method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.570367ms 2023-05-11T11:37:43+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="863.52ยตs" 2023-05-11T11:37:43+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404 2023-05-11T11:37:43+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.898357ms 2023-05-11T11:37:43+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check missed TTL, is now critical: check=_nomad-check-716e088c4a623de12ff9c906dfc61990e48aeb5a 2023-05-11T11:37:42+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: Restart requested: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats failure=true event="1683797862842133739 - Restart Signaled" 2023-05-11T11:37:42+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: running alloc task restart hook: alloc_id=54969951-d541-ae97-922a-7db38096bae5 name=group_services start="2023-05-11 09:37:42.842170309 +0000 UTC m=+282.116216885" 2023-05-11T11:37:42+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: Restart requested: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter failure=true event="1683797862842133739 - Restart Signaled" 2023-05-11T11:37:42+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] consul.sync: commit sync operations: ops="<2, 2, 0, 0>" 2023-05-11T11:37:42+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: finished alloc task restart hook: alloc_id=54969951-d541-ae97-922a-7db38096bae5 name=group_services end="2023-05-11 09:37:42.842421784 +0000 UTC m=+282.116468362" duration="251.477ยตs" 2023-05-11T11:37:42+02:00 
[nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] watch.checks: now watching check: alloc_i=54969951-d541-ae97-922a-7db38096bae5 task=group-nats check="service: \"nats\" check" 2023-05-11T11:37:42+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] watch.checks: restarting due to unhealthy check: alloc_id=54969951-d541-ae97-922a-7db38096bae5 check="service: \"nats-prometheus-exporter\" check" task=group-nats 2023-05-11T11:37:42+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] watch.checks: now watching check: alloc_i=54969951-d541-ae97-922a-7db38096bae5 task=group-nats check="service: \"nats-prometheus-exporter\" check" 2023-05-11T11:37:42+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] consul.sync: execute sync: reason=operations 2023-05-11T11:37:42+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="357.693ยตs" 2023-05-11T11:37:42+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner.task_hook.script_checks: tasklet executing: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres 2023-05-11T11:37:41+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.898775ms 2023-05-11T11:37:41+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="332.869ยตs" 2023-05-11T11:37:40+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:37:40+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.347587ms 2023-05-11T11:37:40+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=2.138748ms 2023-05-11T11:37:40+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request failed: method=GET 
path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404 2023-05-11T11:37:40+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="401.26ยตs" 2023-05-11T11:37:40+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.09237ms 2023-05-11T11:37:40+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="349.369ยตs" 2023-05-11T11:37:39+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security/evaluations?index=15476 duration="214.668ยตs" 2023-05-11T11:37:39+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue 2023-05-11T11:37:39+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval="" 2023-05-11T11:37:39+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=dd48e9f0-fce6-39f5-42cb-553d53e25774 type=service namespace=default job_id=security node_id="" triggered_by=deployment-watcher 2023-05-11T11:37:39+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:37:39+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results= 2023-05-11T11:37:39+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:37:39+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0) 
2023-05-11T11:37:39+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:37:39+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:37:39+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=dd48e9f0-fce6-39f5-42cb-553d53e25774 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:37:39+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:37:39+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:37:39+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=dd48e9f0-fce6-39f5-42cb-553d53e25774 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:37:39+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=dd48e9f0-fce6-39f5-42cb-553d53e25774 type=service namespace=default job_id=security node_id="" triggered_by=deployment-watcher
2023-05-11T11:37:39+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=dd48e9f0-fce6-39f5-42cb-553d53e25774 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:37:39+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/evaluations?index=15470 duration=13.460960557s
2023-05-11T11:37:39+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2
2023-05-11T11:37:39+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 error="dial tcp 10.21.21.42:24915: connect: connection refused"
2023-05-11T11:37:39+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="415.531µs"
2023-05-11T11:37:38+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="339.302µs"
2023-05-11T11:37:38+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.770715ms
2023-05-11T11:37:38+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:42432: EOF
2023-05-11T11:37:38+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="569.418µs"
2023-05-11T11:37:38+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:38+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:42424: EOF
2023-05-11T11:37:38+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.071526ms
2023-05-11T11:37:38+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.161912ms
2023-05-11T11:37:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406
2023-05-11T11:37:38+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=12 errors=0
2023-05-11T11:37:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373
2023-05-11T11:37:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428
2023-05-11T11:37:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373
2023-05-11T11:37:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384
2023-05-11T11:37:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428
2023-05-11T11:37:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370
2023-05-11T11:37:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370
2023-05-11T11:37:38+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=14.100398168s
2023-05-11T11:37:38+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442
2023-05-11T11:37:38+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations?index=15474 duration=3.223333849s
2023-05-11T11:37:38+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/deployment?index=15468 duration=15.463585474s
2023-05-11T11:37:38+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=12
2023-05-11T11:37:38+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15475 total=21 pulled=9 filtered=12
2023-05-11T11:37:38+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=16.738113178s
2023-05-11T11:37:37+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc client_status=running desired_status=""
2023-05-11T11:37:37+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.runner_hook.alloc_health_watcher: health set: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc healthy=true
2023-05-11T11:37:37+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=7.944665ms
2023-05-11T11:37:37+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=2.428063ms
2023-05-11T11:37:37+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="828.06µs"
2023-05-11T11:37:37+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:37+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.574313ms
2023-05-11T11:37:37+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:37:37+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="670.925µs"
2023-05-11T11:37:36+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:37:36+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.950488ms
2023-05-11T11:37:36+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:36484
2023-05-11T11:37:35+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="898.968µs"
2023-05-11T11:37:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:37:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:37:34+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="641.367µs"
2023-05-11T11:37:34+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=2.192446ms
2023-05-11T11:37:34+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:34+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="772.293µs"
2023-05-11T11:37:34+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.135464ms
2023-05-11T11:37:34+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.911971ms
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=12 errors=0
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=10.49628369s
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=12
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15474 total=21 pulled=9 filtered=12
2023-05-11T11:37:34+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations?index=15473 duration=2.145945468s
2023-05-11T11:37:34+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/summary?index=15473 duration=2.109204938s
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=11.730138659s
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=script_checks end="2023-05-11 09:37:34.692896987 +0000 UTC m=+273.966943556" duration="73.267µs"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=stats_hook start="2023-05-11 09:37:34.692552175 +0000 UTC m=+273.966598754"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=stats_hook end="2023-05-11 09:37:34.692616079 +0000 UTC m=+273.966662656" duration="63.902µs"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=template start="2023-05-11 09:37:34.692625682 +0000 UTC m=+273.966672251"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=task_services start="2023-05-11 09:37:34.692665234 +0000 UTC m=+273.966711804"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=template end="2023-05-11 09:37:34.692656061 +0000 UTC m=+273.966702638" duration="30.387µs"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth end="2023-05-11 09:37:34.69290597 +0000 UTC m=+273.966952539" duration="389.345µs"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 client_status=running desired_status=""
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth start="2023-05-11 09:37:34.69251662 +0000 UTC m=+273.966563194"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 done=false
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=task_services end="2023-05-11 09:37:34.692801775 +0000 UTC m=+273.966848353" duration="136.549µs"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=script_checks start="2023-05-11 09:37:34.692823719 +0000 UTC m=+273.966870289"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth type=Started msg="Task started by client" failed=false
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: setting task state: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth state=running
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:37:34.690Z
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker.docker_logger.stdio: waiting for stdio data: driver=docker
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker network=unix @module=docker_logger address=/tmp/plugin741601722 timestamp=2023-05-11T09:37:34.688Z
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=13906
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=f71a85e2b27db0e4c65d4d5045ca2cbb748bbbda1cd83874c3a00768f5ed7a26
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=f71a85e2b27db0e4c65d4d5045ca2cbb748bbbda1cd83874c3a00768f5ed7a26
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=forwardauth network_mode=container:64e1e510b0787e191845d6e3dc75d2ef07b762138520ea4b0f2308509d18a026
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=forwardauth memory=34359738368 memory_reservation=268435456 cpu_shares=500 cpu_quota=0 cpu_period=0
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=forwardauth binds="[]string{\"/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/forwardauth/local:/local\", \"/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/forwardauth/secrets:/secrets\"}"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: binding volumes: driver=docker task_name=forwardauth volumes=["/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/alloc:/alloc", "/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/forwardauth/local:/local", "/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/forwardauth/secrets:/secrets"]
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: no docker log driver provided, defaulting to plugin config: driver=docker task_name=forwardauth
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=forwardauth container_name=forwardauth-87cd4d30-a263-1598-0a57-72046f840473
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=forwardauth labels="map[com.github.logunifier.application.name:mesosphere com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:3.1.0 com.hashicorp.nomad.alloc_id:87cd4d30-a263-1598-0a57-72046f840473 com.hashicorp.nomad.job_id:security com.hashicorp.nomad.job_name:security com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:keycloak-ingress com.hashicorp.nomad.task_name:forwardauth]"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=registry.cloud.private/mesosphere/traefik-forward-auth:3.1.0
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/mesosphere/traefik-forward-auth:3.1.0 image_id=sha256:116b3d1ffd4c7a8c4e26f20789e3dd84d9b5555afecab14063d48bdce8a7e468 references=1
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=volumes start="2023-05-11 09:37:34.537178951 +0000 UTC m=+273.811225529"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=volumes end="2023-05-11 09:37:34.53720068 +0000 UTC m=+273.811247254" duration="21.725µs"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: waiting for cgroup to exist for: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth allocID=87cd4d30-a263-1598-0a57-72046f840473 task="&{forwardauth docker map[image:registry.cloud.private/mesosphere/traefik-forward-auth:3.1.0 labels:[map[com.github.logunifier.application.name:mesosphere com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:3.1.0]] ports:[auth]] map[CLIENT_ID:ingress ENCRYPTION_KEY:45659373957778734945638459467936 LIFETIME:60 LOG_LEVEL:warn PROVIDER_URI:https://security.cloud.private/realms/nomadder SECRET:9e7d7b0776f032e3a1996272c2fe22d2] [] [0xc0009e1500] [] [] 0xc0036c3920 0xc002fb39b0 map[] 5s 0xc002cf2a68 [] false 0s [0xc0021c1100] [] }"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=template end="2023-05-11 09:37:34.537350207 +0000 UTC m=+273.811396777" duration="5.625µs"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth @module=logmon path=/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/alloc/logs/.forwardauth.stderr.fifo timestamp=2023-05-11T09:37:34.536Z
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=script_checks start="2023-05-11 09:37:34.537384411 +0000 UTC m=+273.811430988"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth path=/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/alloc/logs/.forwardauth.stdout.fifo @module=logmon timestamp=2023-05-11T09:37:34.536Z
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: skipping done prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=artifacts
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth end="2023-05-11 09:37:34.537419744 +0000 UTC m=+273.811466318" duration=1.41297ms
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: skipping done prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=dispatch_payload
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=script_checks end="2023-05-11 09:37:34.537395373 +0000 UTC m=+273.811441950" duration="10.962µs"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=logmon end="2023-05-11 09:37:34.5370355 +0000 UTC m=+273.811082079" duration="728.527µs"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=api end="2023-05-11 09:37:34.537310306 +0000 UTC m=+273.811356877" duration="5.794µs"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=api start="2023-05-11 09:37:34.537304507 +0000 UTC m=+273.811351083"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=template start="2023-05-11 09:37:34.537344573 +0000 UTC m=+273.811391152"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: skipping done prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=devices
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: skipping done prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=validate
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=logmon start="2023-05-11 09:37:34.536306976 +0000 UTC m=+273.810353552"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=identity start="2023-05-11 09:37:34.536251658 +0000 UTC m=+273.810298235"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=task_dir start="2023-05-11 09:37:34.536136294 +0000 UTC m=+273.810182873"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth start="2023-05-11 09:37:34.536006773 +0000 UTC m=+273.810053348"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=identity end="2023-05-11 09:37:34.536262667 +0000 UTC m=+273.810309241" duration="11.006µs"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=task_dir end="2023-05-11 09:37:34.536146053 +0000 UTC m=+273.810192622" duration="9.749µs"
2023-05-11T11:37:34+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth
2023-05-11T11:37:33+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="330.304µs"
2023-05-11T11:37:33+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.221457ms
2023-05-11T11:37:33+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.629526ms
2023-05-11T11:37:33+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="731.779µs"
2023-05-11T11:37:33+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:32+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=54969951-d541-ae97-922a-7db38096bae5 check="service: \"nats\" check" task=group-nats time_limit=20s
2023-05-11T11:37:32+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=54969951-d541-ae97-922a-7db38096bae5 check="service: \"nats-prometheus-exporter\" check" task=group-nats time_limit=10s
2023-05-11T11:37:32+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="323.862µs"
2023-05-11T11:37:32+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner.task_hook.script_checks: tasklet executing: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres
2023-05-11T11:37:31+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.646602ms
2023-05-11T11:37:31+02:00 [nomad.service 💻 master-01] [👀] consul.sync: execute sync: reason=periodic
2023-05-11T11:37:31+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="484.687µs"
2023-05-11T11:37:30+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:37:30+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:54244: EOF
2023-05-11T11:37:30+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/summary?index=15472 duration="232.299µs"
2023-05-11T11:37:30+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="790.635µs"
2023-05-11T11:37:30+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:30+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=3.137912ms
2023-05-11T11:37:30+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=2.167956ms
2023-05-11T11:37:30+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=1.013103ms
2023-05-11T11:37:30+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations?index=15472 duration="436.014µs"
2023-05-11T11:37:29+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="343.71µs"
2023-05-11T11:37:29+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2
2023-05-11T11:37:29+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 error="dial tcp 10.21.21.42:24915: connect: connection refused"
2023-05-11T11:37:29+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=12 errors=0
2023-05-11T11:37:29+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370
2023-05-11T11:37:29+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406
2023-05-11T11:37:29+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384
2023-05-11T11:37:29+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370
2023-05-11T11:37:29+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373
2023-05-11T11:37:29+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428
2023-05-11T11:37:29+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442
2023-05-11T11:37:29+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428
2023-05-11T11:37:29+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=11.077878565s
2023-05-11T11:37:29+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373
2023-05-11T11:37:29+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15473 total=21 pulled=9 filtered=12
2023-05-11T11:37:29+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=12
2023-05-11T11:37:29+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=12.584930615s
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=12.47694857s
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] http: Authenticated request: id=alloc:bd7464de-fa72-736b-e57c-6782cc7d7202 method=GET url="/v1/var/nomad/jobs/observability?index=3271&namespace=default&stale=&wait=60000ms"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/observability@default.global) successful contact, resetting retries
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=15.953103796s
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/observability@default.global): GET /v1/var/nomad/jobs/observability?index=3271&stale=true&wait=1m0s
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/observability@default.global) no new data (index was the same)
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] agent: nomad.var.block(nomad/jobs/observability@default.global): returned "nomad/jobs/observability"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] agent: (view) nomad.var.block(nomad/jobs/observability@default.global) marking successful data response
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/observability?index=3271&namespace=default&stale=&wait=60000ms" duration=1m3.60415911s
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:54236: EOF
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:54240: EOF
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=2.958466ms
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=3.588802ms
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="968.799µs"
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.556355ms
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=12 errors=0
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=18.22607347s
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=12
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15472 total=21 pulled=9 filtered=12
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations?index=15471 duration=239.133047ms
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/summary?index=15471 duration=199.514148ms
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=12.381516373s
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 client_status=pending desired_status=""
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 done=false
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished exited hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth end="2023-05-11 09:37:28.845350382 +0000 UTC m=+268.119396957" duration="47.868µs"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:116b3d1ffd4c7a8c4e26f20789e3dd84d9b5555afecab14063d48bdce8a7e468 references=0
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running exited hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=stats_hook start="2023-05-11 09:37:28.845321707 +0000 UTC m=+268.119368284"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished exited hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=stats_hook end="2023-05-11 09:37:28.845329251 +0000 UTC m=+268.119375825" duration="7.541µs"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: restarting task: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth reason="Restart within policy" delay=5.688450662s
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running exited hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=task_services start="2023-05-11 09:37:28.845338826 +0000 UTC m=+268.119385402"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running exited hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth start="2023-05-11 09:37:28.845302515 +0000 UTC m=+268.119349089"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: setting task state: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth state=pending
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished exited hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=task_services end="2023-05-11 09:37:28.845343808 +0000 UTC m=+268.119390383" duration="4.981µs"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth type=Restarting msg="Task restarting in 5.688450662s" failed=false
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=13794
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 done=false
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 client_status=running desired_status=""
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth type=Terminated msg="Exit Code: 1, Exit Message: \"Docker container exited with non-zero exit code: 1\"" failed=false
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:54220: EOF
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:54224: EOF
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=template start="2023-05-11 09:37:28.691093174 +0000 UTC m=+267.965139748"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=script_checks start="2023-05-11 09:37:28.691191267 +0000 UTC m=+267.965237836"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth end="2023-05-11 09:37:28.691237371 +0000 UTC m=+267.965283946" duration="409.179µs"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 client_status=running desired_status=""
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=template end="2023-05-11 09:37:28.691147653 +0000 UTC m=+267.965194232" duration="54.484µs"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=task_services end="2023-05-11 09:37:28.691184137 +0000 UTC m=+267.965230716" duration="28.444µs"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=script_checks end="2023-05-11 09:37:28.691232117 +0000 UTC m=+267.965278694" duration="40.858µs"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 done=false
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_coordinator: state transition: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 from=main to=poststart
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=task_services start="2023-05-11 09:37:28.691155703 +0000 UTC m=+267.965202272"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=stats_hook end="2023-05-11 09:37:28.691018337 +0000 UTC m=+267.965064911" duration="109.042µs"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_coordinator: state transition: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 from=poststart to=wait_alloc
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth start="2023-05-11 09:37:28.690828183 +0000 UTC m=+267.964874767"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=stats_hook start="2023-05-11 09:37:28.69090929 +0000 UTC m=+267.964955869"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: setting task state: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth state=running
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth type=Started msg="Task started by client" failed=false
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=4.540776ms
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker.docker_logger.stdio: waiting for stdio data: driver=docker
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:37:28.688Z
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin972546717 network=unix timestamp=2023-05-11T09:37:28.686Z
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="469.311µs"
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.751456ms
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=12 errors=0
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=17.383599143s
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=12
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15471 total=21 pulled=9 filtered=12
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=13794
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=6ecf88aa4261f55281889c46e01e7008cdb5575759f2bacdeeddab9c72cf3fbf
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/summary?index=15466 duration=12.975010463s
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations?index=15468 duration=6.034082087s
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=19.625886684s
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=6ecf88aa4261f55281889c46e01e7008cdb5575759f2bacdeeddab9c72cf3fbf
2023-05-11T11:37:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="394.884µs"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=forwardauth labels="map[com.github.logunifier.application.name:mesosphere com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:3.1.0 com.hashicorp.nomad.alloc_id:87cd4d30-a263-1598-0a57-72046f840473 com.hashicorp.nomad.job_id:security com.hashicorp.nomad.job_name:security com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:keycloak-ingress com.hashicorp.nomad.task_name:forwardauth]"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: binding volumes: driver=docker task_name=forwardauth volumes=["/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/alloc:/alloc", "/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/forwardauth/local:/local", "/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/forwardauth/secrets:/secrets"]
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=forwardauth network_mode=container:64e1e510b0787e191845d6e3dc75d2ef07b762138520ea4b0f2308509d18a026
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: no docker log driver provided, defaulting to plugin config: driver=docker task_name=forwardauth
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=forwardauth binds="[]string{\"/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/forwardauth/local:/local\", \"/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/forwardauth/secrets:/secrets\"}"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/mesosphere/traefik-forward-auth:3.1.0 image_id=sha256:116b3d1ffd4c7a8c4e26f20789e3dd84d9b5555afecab14063d48bdce8a7e468 references=1
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=forwardauth memory=34359738368 memory_reservation=268435456 cpu_shares=500 cpu_quota=0 cpu_period=0
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=forwardauth container_name=forwardauth-87cd4d30-a263-1598-0a57-72046f840473
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth end="2023-05-11 09:37:28.526112848 +0000 UTC m=+267.800159421" duration=40.738871ms
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=script_checks end="2023-05-11 09:37:28.526078739 +0000 UTC m=+267.800125314" duration="524.831µs"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: waiting for cgroup to exist for: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth allocID=87cd4d30-a263-1598-0a57-72046f840473 task="&{forwardauth docker map[image:registry.cloud.private/mesosphere/traefik-forward-auth:3.1.0 labels:[map[com.github.logunifier.application.name:mesosphere com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:3.1.0]] ports:[auth]] map[CLIENT_ID:ingress ENCRYPTION_KEY:45659373957778734945638459467936 LIFETIME:60 LOG_LEVEL:warn PROVIDER_URI:https://security.cloud.private/realms/nomadder SECRET:9e7d7b0776f032e3a1996272c2fe22d2] [] [0xc0009e1500] [] [] 0xc0036c3920 0xc002fb39b0 map[] 5s 0xc002cf2a68 [] false 0s [0xc0021c1100] [] }"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=12.141064568s
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] http: Authenticated request: id=alloc:87cd4d30-a263-1598-0a57-72046f840473 method=GET url="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=script_checks start="2023-05-11 09:37:28.525553902 +0000 UTC m=+267.799600483"
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=template end="2023-05-11 09:37:28.525482263 +0000 UTC m=+267.799528838" duration=5.923164ms
2023-05-11T11:37:28+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) nomad.var.block(nomad/jobs/security@default.global) is still needed 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/forwardauth/secrets/env.vars" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/forwardauth/secrets/env.vars" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: nomad.var.block(nomad/jobs/security@default.global): returned "nomad/jobs/security" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: nomad.var.block(nomad/jobs/security@default.global): GET /v1/var/nomad/jobs/security?index=10462&stale=true&wait=1m0s 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) starting fetch 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) marking successful data response 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) receiving dependency nomad.var.block(nomad/jobs/security@default.global) 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) received data 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) successful contact, resetting retries 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=17.816194019s 
2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template 550a891296c5b27beb51ed9e1b2f00e5 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) all templates rendered 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=18.403769035s 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path="/v1/var/nomad/jobs/security?namespace=default&stale=&wait=60000ms" duration=1.113951ms 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] http: Authenticated request: id=alloc:87cd4d30-a263-1598-0a57-72046f840473 method=GET url="/v1/var/nomad/jobs/security?namespace=default&stale=&wait=60000ms" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (watcher) adding nomad.var.block(nomad/jobs/security@default.global) 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) starting fetch 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) was not watching 1 dependencies 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) add used dependency nomad.var.block(nomad/jobs/security@default.global) to missing since isLeader but do not have a watcher 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing data for 1 dependencies 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: nomad.var.block(nomad/jobs/security@default.global): GET /v1/var/nomad/jobs/security?stale=true&wait=1m0s 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing dependency: nomad.var.block(nomad/jobs/security@default.global) 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (watcher) nomad.var.block(nomad/jobs/security@default.global) starting 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] 
[๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template 550a891296c5b27beb51ed9e1b2f00e5 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating watcher 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) running initial templates 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command"
:[],"CommandTimeout":30000000000,"Contents":" {{- with nomadVar \"nomad/jobs/security\" -}}\n CLIENT_SECRET = {{.keycloak_ingress_secret}}\n {{- end -}}\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/forwardauth/secrets/env.vars","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"{{","RightDelim":"}}","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/forwardauth"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempt
s":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000} 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) starting 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating new runner (dry: false, once: false) 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=api end="2023-05-11 09:37:28.519506576 +0000 UTC m=+267.793553152" duration="723.208ยตs" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=template start="2023-05-11 09:37:28.519559094 +0000 UTC m=+267.793605674" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=devices end="2023-05-11 09:37:28.518727148 +0000 UTC m=+267.792773725" duration="776.702ยตs" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=api start="2023-05-11 09:37:28.518783364 +0000 UTC m=+267.792829944" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=devices start="2023-05-11 09:37:28.517950446 +0000 UTC m=+267.791997023" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=artifacts start="2023-05-11 09:37:28.517170516 +0000 UTC m=+267.791217095" 
2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=volumes end="2023-05-11 09:37:28.517111598 +0000 UTC m=+267.791158173" duration="811.348ยตs" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=artifacts end="2023-05-11 09:37:28.517884401 +0000 UTC m=+267.791930981" duration="713.886ยตs" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=volumes start="2023-05-11 09:37:28.516300246 +0000 UTC m=+267.790346825" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=dispatch_payload end="2023-05-11 09:37:28.516241218 +0000 UTC m=+267.790287792" duration="667.699ยตs" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=dispatch_payload start="2023-05-11 09:37:28.515573514 +0000 UTC m=+267.789620093" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=logmon end="2023-05-11 09:37:28.515501387 +0000 UTC m=+267.789547962" duration=27.158985ms 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth @module=logmon path=/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/alloc/logs/.forwardauth.stderr.fifo 
timestamp=2023-05-11T09:37:28.514Z 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth @module=logmon path=/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/alloc/logs/.forwardauth.stdout.fifo timestamp=2023-05-11T09:37:28.514Z 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner.task_hook.logmon.stdio: waiting for stdio data: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth @module=logmon address=/tmp/plugin950233625 network=unix timestamp=2023-05-11T09:37:28.513Z 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth version=2 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=identity end="2023-05-11 09:37:28.488271677 +0000 UTC m=+267.762318250" duration="556.363ยตs" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"] 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth path=/usr/local/bin/nomad pid=13744 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: 
alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=logmon start="2023-05-11 09:37:28.488342398 +0000 UTC m=+267.762388977" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth path=/usr/local/bin/nomad 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=identity start="2023-05-11 09:37:28.487715309 +0000 UTC m=+267.761761887" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=task_dir end="2023-05-11 09:37:28.487643481 +0000 UTC m=+267.761690056" duration=1.404566ms 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: handling task state update: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 done=false 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=task_dir start="2023-05-11 09:37:28.486238911 +0000 UTC m=+267.760285490" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth type="Task Setup" msg="Building Task Directory" failed=false 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=validate end="2023-05-11 09:37:28.486169817 +0000 UTC m=+267.760216391" duration="718.876ยตs" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: sending updated alloc: 
alloc_id=87cd4d30-a263-1598-0a57-72046f840473 client_status=pending desired_status="" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: sending updated alloc: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 client_status=pending desired_status="" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth start="2023-05-11 09:37:28.485373974 +0000 UTC m=+267.759420550" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: handling task state update: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 done=false 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_coordinator: state transition: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 from=prestart to=main 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth name=validate start="2023-05-11 09:37:28.485450937 +0000 UTC m=+267.759497515" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running exited hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak start="2023-05-11 09:37:28.484332687 +0000 UTC m=+267.758379262" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished exited hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak end="2023-05-11 09:37:28.484387911 +0000 UTC m=+267.758434480" duration="55.218ยตs" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: not restarting 
task: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak reason="Restart unnecessary as task terminated successfully" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running exited hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=stats_hook start="2023-05-11 09:37:28.484354191 +0000 UTC m=+267.758400766" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished exited hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=task_services end="2023-05-11 09:37:28.484382373 +0000 UTC m=+267.758428943" duration="6.07ยตs" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:7cfbbec8963d8f13e6c70416d6592e1cc10f47a348131290a55d43c3acab3fb9 references=0 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished exited hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=stats_hook end="2023-05-11 09:37:28.484362758 +0000 UTC m=+267.758409328" duration="8.562ยตs" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running exited hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=task_services start="2023-05-11 09:37:28.484376304 +0000 UTC m=+267.758422873" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: setting task state: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak state=dead 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad 
pid=13469 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: sending updated alloc: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 client_status=running desired_status="" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: handling task state update: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 done=false 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:37:28+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak type=Terminated msg="Exit Code: 0" failed=false 2023-05-11T11:37:27+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=service:_nomad-task-1022bc44-6bd1-c8c5-62c5-4166c31f7afc-group-keycloak-keycloak-8080-sidecar-proxy:1 2023-05-11T11:37:27+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=_nomad-check-b642dad76eb99166b806b6f6fdb93e5a535b1bdd 2023-05-11T11:37:27+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] http: http: TLS handshake error from 10.21.21.42:52490: EOF 2023-05-11T11:37:27+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.795956ms 2023-05-11T11:37:27+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="581.892ยตs" 2023-05-11T11:37:27+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404 2023-05-11T11:37:27+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET 
path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.274651ms 2023-05-11T11:37:27+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.46717ms 2023-05-11T11:37:27+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="410.203ยตs" 2023-05-11T11:37:27+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04 2023-05-11T11:37:27+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=_nomad-check-c6654054fecb60c2c6f4d9631a926c1c2d889962 2023-05-11T11:37:26+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5 2023-05-11T11:37:26+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.207584ms 2023-05-11T11:37:26+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=16.221175595s 2023-05-11T11:37:26+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) no new data (index was the same) 2023-05-11T11:37:26+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) marking successful data response 2023-05-11T11:37:26+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=1m2.414674233s 2023-05-11T11:37:26+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: nomad.var.block(nomad/jobs/security@default.global): returned "nomad/jobs/security" 2023-05-11T11:37:26+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="345.288ยตs" 2023-05-11T11:37:26+02:00 
[nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad: memberlist: Stream connection from=127.0.0.1:42324 2023-05-11T11:37:25+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b 2023-05-11T11:37:25+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:37:25+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="346.193ยตs" 2023-05-11T11:37:25+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] http: http: TLS handshake error from 10.21.21.42:52478: EOF 2023-05-11T11:37:25+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=10.078535ms 2023-05-11T11:37:25+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration=5.511103ms 2023-05-11T11:37:25+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404 2023-05-11T11:37:25+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=2.963128ms 2023-05-11T11:37:25+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.971764ms 2023-05-11T11:37:24+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=_nomad-check-c6bf6764df379854a80dff2c692a9667f13b0dde 2023-05-11T11:37:24+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="615.002ยตs" 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: 
method=GET path=/v1/job/security/evaluations?index=15469 duration="528.755ยตs" 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=ff1961c4-7500-c093-1a2d-586b96309c4b type=service namespace=default job_id=security node_id="" triggered_by=deployment-watcher 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval="" 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0) 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results= 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker.service_sched: reconciled current state with desired state: eval_id=ff1961c4-7500-c093-1a2d-586b96309c4b job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป 
master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker.service_sched: setting eval status: eval_id=ff1961c4-7500-c093-1a2d-586b96309c4b job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=ff1961c4-7500-c093-1a2d-586b96309c4b type=service namespace=default job_id=security node_id="" triggered_by=deployment-watcher 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker.service_sched.binpack: NewBinPackIterator created: eval_id=ff1961c4-7500-c093-1a2d-586b96309c4b job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security/evaluations?index=15460 duration=11.424320988s 2023-05-11T11:37:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="661.903ยตs" 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: Restart requested: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres failure=true event="1683797842786943300 - Restart Signaled" 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] consul.sync: commit sync operations: ops="<1, 1, 0, 0>" 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] watch.checks: now watching check: alloc_i=857749e0-52fe-92ee-7bef-fafbe67605ee task=group-keycloak-postgres check=keycloak_postgres_ping 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: Restart requested: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee 
task=keycloak_postgres failure=true event="1683797842786943300 - Restart Signaled" 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] consul.sync: execute sync: reason=operations 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: finished alloc task restart hook: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee name=group_services end="2023-05-11 09:37:22.787151774 +0000 UTC m=+262.061198350" duration="174.883ยตs" 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] watch.checks: restarting due to unhealthy check: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee check=keycloak_postgres_ping task=group-keycloak-postgres 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: running alloc task restart hook: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee name=group_services start="2023-05-11 09:37:22.78697689 +0000 UTC m=+262.061023467" 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration=1.034364ms 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] http: http: TLS handshake error from 10.21.21.42:52466: EOF 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.417558ms 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.296073ms 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats 
duration=1.561746ms 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=9 ignored=12 errors=0 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384 
2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=11.581779951s 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=9 ignored=12 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15468 total=21 pulled=9 filtered=12 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security/allocations?index=15467 duration=6.06596775s 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security/deployment?index=15459 duration=10.933958179s 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=18.111290303s 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.runner_hook.alloc_health_watcher: health set: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 healthy=true 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: sending updated alloc: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 client_status=running desired_status="" 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner.task_hook.script_checks: tasklet executing: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="365.452ยตs" 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.259281ms 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป master-01] 
[๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="589.317ยตs" 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.236196ms 2023-05-11T11:37:22+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.394935ms 2023-05-11T11:37:21+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.882104ms 2023-05-11T11:37:21+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="466.1ยตs" 2023-05-11T11:37:20+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:37:19+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="781.649ยตs" 2023-05-11T11:37:19+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.64814ms 2023-05-11T11:37:19+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="559.672ยตs" 2023-05-11T11:37:19+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404 2023-05-11T11:37:19+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.808386ms 2023-05-11T11:37:19+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET 
path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=2.53744ms 2023-05-11T11:37:19+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 2023-05-11T11:37:19+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 error="dial tcp 10.21.21.42:24915: connect: connection refused" 2023-05-11T11:37:18+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration=1.529694ms 2023-05-11T11:37:17+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-b642dad76eb99166b806b6f6fdb93e5a535b1bdd 2023-05-11T11:37:17+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="796.156ยตs" 2023-05-11T11:37:17+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=service:_nomad-task-1022bc44-6bd1-c8c5-62c5-4166c31f7afc-group-keycloak-keycloak-8080-sidecar-proxy:2 2023-05-11T11:37:17+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04 2023-05-11T11:37:17+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404 2023-05-11T11:37:17+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] http: http: TLS handshake error from 10.21.21.42:54182: EOF 2023-05-11T11:37:17+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration=1.804086ms 2023-05-11T11:37:17+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET 
path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.964756ms 2023-05-11T11:37:17+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=2.253478ms 2023-05-11T11:37:17+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=2.447036ms 2023-05-11T11:37:17+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-c6654054fecb60c2c6f4d9631a926c1c2d889962 2023-05-11T11:37:16+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5 2023-05-11T11:37:16+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration=1.042904ms 2023-05-11T11:37:16+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.161722ms 2023-05-11T11:37:16+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad: memberlist: Stream connection from=127.0.0.1:49212 2023-05-11T11:37:15+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b 2023-05-11T11:37:15+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:37:15+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="593.728ยตs" 2023-05-11T11:37:14+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-c6bf6764df379854a80dff2c692a9667f13b0dde 2023-05-11T11:37:14+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="369.677ยตs" 2023-05-11T11:37:14+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] 
http: http: TLS handshake error from 10.21.21.42:54140: EOF 2023-05-11T11:37:14+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404 2023-05-11T11:37:14+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="693.325ยตs" 2023-05-11T11:37:14+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] http: http: TLS handshake error from 10.21.21.42:54126: EOF 2023-05-11T11:37:14+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=2.106765ms 2023-05-11T11:37:14+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] http: http: TLS handshake error from 10.21.21.42:54116: read tcp 10.21.21.41:4646->10.21.21.42:54116: read: connection reset by peer 2023-05-11T11:37:14+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/87cd4d30-a263-1598-0a57-72046f840473/stats duration=1.301748ms 2023-05-11T11:37:14+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=1.658611ms 2023-05-11T11:37:14+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security/allocations?index=15465 duration="600.215ยตs" 2023-05-11T11:37:14+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=service:_nomad-task-27dfb19c-1e44-2e49-a689-0a4e369f7bd2-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 2023-05-11T11:37:13+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security/summary?index=15463 duration="190.536ยตs" 2023-05-11T11:37:13+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self 
duration="496.135ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=9 ignored=12 errors=0 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442 
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=15.682910727s 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15467 total=21 pulled=9 filtered=12 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=9 ignored=12 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=11.906885091s 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] http: Authenticated request: id=alloc:a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 method=GET url="/v1/var/nomad/jobs/minio?index=10463&namespace=default&stale=&wait=60000ms" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=17.774474458s 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=10.035756976s 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: nomad.var.block(nomad/jobs/minio@default.global): GET /v1/var/nomad/jobs/minio?index=10463&stale=true&wait=1m0s 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/minio@default.global) no new data (index was the same) 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path="/v1/var/nomad/jobs/minio?index=10463&namespace=default&stale=&wait=60000ms" duration=1m2.046210196s 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/minio@default.global) marking successful data response 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: nomad.var.block(nomad/jobs/minio@default.global): returned "nomad/jobs/minio" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/minio@default.global) successful contact, resetting retries 2023-05-11T11:37:12+02:00 [nomad.service 
๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=9 ignored=12 errors=0 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next 
heartbeat: period=15.733756163s 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15466 total=21 pulled=9 filtered=12 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=9 ignored=12 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=12.072969057s 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=template end="2023-05-11 09:37:12.656344579 +0000 UTC m=+251.930391160" duration="102.522ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: sending updated alloc: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc client_status=running desired_status="" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=stats_hook end="2023-05-11 09:37:12.656195643 +0000 UTC m=+251.930242220" duration="156.578ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_coordinator: state transition: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc from=poststart to=wait_alloc 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=task_services start="2023-05-11 09:37:12.656357759 +0000 UTC m=+251.930404331" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: handling task state update: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc done=false 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=stats_hook 
start="2023-05-11 09:37:12.656039061 +0000 UTC m=+251.930085642" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=script_checks start="2023-05-11 09:37:12.6564273 +0000 UTC m=+251.930473874" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=script_checks end="2023-05-11 09:37:12.656537497 +0000 UTC m=+251.930584080" duration="110.206ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak end="2023-05-11 09:37:12.656549664 +0000 UTC m=+251.930596236" duration="637.206ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=task_services end="2023-05-11 09:37:12.656418578 +0000 UTC m=+251.930465159" duration="60.828ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_coordinator: state transition: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc from=main to=poststart 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=template start="2023-05-11 09:37:12.656242056 +0000 UTC m=+251.930288638" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak start="2023-05-11 09:37:12.655912456 +0000 UTC m=+251.929959030" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: setting task state: 
alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak state=running 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak type=Started msg="Task started by client" failed=false 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:37:12.652Z 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker address=/tmp/plugin152051485 network=unix @module=docker_logger timestamp=2023-05-11T09:37:12.650Z 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker.docker_logger.stdio: waiting for stdio data: driver=docker 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=13570 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=447ff170e31c05006371236b06fed397af0607316221a5813d350c1f07ecedf2 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] http: http: TLS handshake error from 10.21.21.42:54108: EOF 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] 
http: http: TLS handshake error from 10.21.21.42:54102: EOF 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="493.331ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/stats duration=1.44761ms 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/stats duration=2.374997ms 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=447ff170e31c05006371236b06fed397af0607316221a5813d350c1f07ecedf2 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security/allocations?index=15461 duration="355.809ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/stack/core/keycloak:21.1.1.0 image_id=sha256:c3f97ccc75b7ce6b444f7a9d1093acfdd6a78038a63e2993cf34f9d2cc409e36 references=1 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=keycloak memory=34359738368 memory_reservation=2147483648 cpu_shares=500 cpu_quota=0 cpu_period=0 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=keycloak container_name=keycloak-1022bc44-6bd1-c8c5-62c5-4166c31f7afc 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] 
client.driver_mgr.docker: binding directories: driver=docker task_name=keycloak binds="[]string{\"/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/keycloak/local:/local\", \"/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/keycloak/secrets:/secrets\"}" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: binding volumes: driver=docker task_name=keycloak volumes=["/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/alloc:/alloc", "/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/keycloak/local:/local", "/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/keycloak/secrets:/secrets"] 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=keycloak network_mode=container:60ce164cee8b69f91549c431446ae963eb248a954ae5f584c285044c4ed23c40 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: no docker log driver provided, defaulting to plugin config: driver=docker task_name=keycloak 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=keycloak labels="map[com.github.logunifier.application.name:keycloak com.github.logunifier.application.pattern.key:tslevelmsg com.github.logunifier.application.version:21.1.1.0 com.hashicorp.nomad.alloc_id:1022bc44-6bd1-c8c5-62c5-4166c31f7afc com.hashicorp.nomad.job_id:security com.hashicorp.nomad.job_name:security com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:keycloak com.hashicorp.nomad.task_name:keycloak]" 2023-05-11T11:37:12+02:00 [nomad.service 
๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=script_checks end="2023-05-11 09:37:12.490816572 +0000 UTC m=+251.764863146" duration="568.181ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=script_checks start="2023-05-11 09:37:12.490248385 +0000 UTC m=+251.764294965" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: waiting for cgroup to exist for: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak allocID=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task="&{keycloak docker map[args:[start --import-realm --optimized] image:registry.cloud.private/stack/core/keycloak:21.1.1.0 labels:[map[com.github.logunifier.application.name:keycloak com.github.logunifier.application.pattern.key:tslevelmsg com.github.logunifier.application.version:21.1.1.0]] ports:[ui]] map[KC_DB:postgres KC_DB_SCHEMA:keycloak KC_DB_URL_HOST:${NOMAD_UPSTREAM_IP_keycloak_postgres} KC_DB_URL_PORT:${NOMAD_UPSTREAM_PORT_keycloak_postgres} KC_DB_USERNAME:keycloak KC_HEALTH_ENABLED:true KC_HOSTNAME:security.cloud.private KC_HOSTNAME_STRICT_HTTPS:false KC_HTTP_ENABLED:true KC_PROXY:edge KEYCLOAK_ADMIN:admin] [] [0xc0012c6f00] [] [] 0xc000a44660 0xc0022b45d0 map[] 5s 0xc002f14d50 [] false 0s [] [] }" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=template end="2023-05-11 09:37:12.490150013 +0000 UTC m=+251.764196591" duration=5.859282ms 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=15.222304305s 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hooks: 
alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak end="2023-05-11 09:37:12.490872934 +0000 UTC m=+251.764919508" duration=40.824995ms 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) nomad.var.block(nomad/jobs/security@default.global) is still needed 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] http: Authenticated request: id=alloc:1022bc44-6bd1-c8c5-62c5-4166c31f7afc method=GET url="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) all templates rendered 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/keycloak/secrets/env.vars" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/keycloak/secrets/env.vars" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=15.38647561s 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path="/v1/var/nomad/jobs/security?namespace=default&stale=&wait=60000ms" duration="917.679ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: nomad.var.block(nomad/jobs/security@default.global): GET /v1/var/nomad/jobs/security?index=10462&stale=true&wait=1m0s 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template 7382522149c1fa0ee539c4acc2b323ba 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) 
nomad.var.block(nomad/jobs/security@default.global) received data 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) marking successful data response 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) starting fetch 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) successful contact, resetting retries 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) receiving dependency nomad.var.block(nomad/jobs/security@default.global) 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: nomad.var.block(nomad/jobs/security@default.global): returned "nomad/jobs/security" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=13.797234775s 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] http: Authenticated request: id=alloc:1022bc44-6bd1-c8c5-62c5-4166c31f7afc method=GET url="/v1/var/nomad/jobs/security?namespace=default&stale=&wait=60000ms" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) add used dependency nomad.var.block(nomad/jobs/security@default.global) to missing since isLeader but do not have a watcher 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (watcher) nomad.var.block(nomad/jobs/security@default.global) starting 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) starting fetch 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing dependency: 
nomad.var.block(nomad/jobs/security@default.global) 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: nomad.var.block(nomad/jobs/security@default.global): GET /v1/var/nomad/jobs/security?stale=true&wait=1m0s 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) was not watching 1 dependencies 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (watcher) adding nomad.var.block(nomad/jobs/security@default.global) 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing data for 1 dependencies 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) starting 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template 7382522149c1fa0ee539c4acc2b323ba 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating watcher 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) final config: 
{"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":" {{- with nomadVar \"nomad/jobs/security\" -}}\n KEYCLOAK_ADMIN_PASSWORD = {{.keycloak_password}}\n KC_DB_PASSWORD = {{.keycloak_db_password}}\n KC_NOMADDER_CLIENT_SECRET = {{.keycloak_ingress_secret}}\n KC_NOMADDER_CLIENT_SECRET_GRAFANA = {{.keycloak_secret_observability_grafana}}\n {{- end 
-}}\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/keycloak/secrets/env.vars","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"{{","RightDelim":"}}","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/keycloak"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWait
Time":60000000000} 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) running initial templates 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=api end="2023-05-11 09:37:12.484148573 +0000 UTC m=+251.758195149" duration="599.333ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating new runner (dry: false, once: false) 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=template start="2023-05-11 09:37:12.484290718 +0000 UTC m=+251.758337309" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=devices end="2023-05-11 09:37:12.483476988 +0000 UTC m=+251.757523563" duration="484.932ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=api start="2023-05-11 09:37:12.483549236 +0000 UTC m=+251.757595816" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=devices start="2023-05-11 09:37:12.482992051 +0000 UTC m=+251.757038631" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=artifacts start="2023-05-11 09:37:12.482407578 +0000 UTC m=+251.756454157" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] 
client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=volumes end="2023-05-11 09:37:12.482338094 +0000 UTC m=+251.756384671" duration="534.823ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=artifacts end="2023-05-11 09:37:12.482919576 +0000 UTC m=+251.756966150" duration="511.993ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=dispatch_payload start="2023-05-11 09:37:12.481016784 +0000 UTC m=+251.755063364" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=volumes start="2023-05-11 09:37:12.481803263 +0000 UTC m=+251.755849848" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=dispatch_payload end="2023-05-11 09:37:12.48170693 +0000 UTC m=+251.755753510" duration="690.146ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=logmon end="2023-05-11 09:37:12.480890632 +0000 UTC m=+251.754937209" duration=27.921751ms 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak @module=logmon path=/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/alloc/logs/.keycloak.stdout.fifo timestamp=2023-05-11T09:37:12.478Z 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] 
[๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak @module=logmon path=/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/alloc/logs/.keycloak.stderr.fifo timestamp=2023-05-11T09:37:12.478Z 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak @module=logmon address=/tmp/plugin928500427 network=unix timestamp=2023-05-11T09:37:12.476Z 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner.task_hook.logmon.stdio: waiting for stdio data: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak version=2 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak path=/usr/local/bin/nomad 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak path=/usr/local/bin/nomad pid=13515 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"] 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=identity start="2023-05-11 09:37:12.45234754 +0000 UTC m=+251.726394119" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป 
worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=logmon start="2023-05-11 09:37:12.452968879 +0000 UTC m=+251.727015458" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=identity end="2023-05-11 09:37:12.452899136 +0000 UTC m=+251.726945721" duration="551.602ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=task_dir end="2023-05-11 09:37:12.452271949 +0000 UTC m=+251.726318523" duration=1.36235ms 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: sending updated alloc: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc client_status=running desired_status="" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: handling task state update: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc done=false 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak start="2023-05-11 09:37:12.450047938 +0000 UTC m=+251.724094513" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak type="Task Setup" msg="Building Task Directory" failed=false 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak 
name=validate start="2023-05-11 09:37:12.450212469 +0000 UTC m=+251.724259045" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=validate end="2023-05-11 09:37:12.450822673 +0000 UTC m=+251.724869249" duration="610.204ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak name=task_dir start="2023-05-11 09:37:12.450909594 +0000 UTC m=+251.724956173" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: sending updated alloc: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc client_status=running desired_status="" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: handling task state update: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc done=false 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: setting task state: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres state=dead 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished exited hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=task_services end="2023-05-11 09:37:12.449042645 +0000 UTC m=+251.723089215" duration="5.733ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: not restarting task: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres reason="Restart unnecessary as task terminated successfully" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_coordinator: state transition: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc from=prestart to=main 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] 
client.alloc_runner.task_runner: finished exited hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres end="2023-05-11 09:37:12.449048442 +0000 UTC m=+251.723095017" duration="132.318ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished exited hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=stats_hook end="2023-05-11 09:37:12.449022844 +0000 UTC m=+251.723069420" duration="85.288ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running exited hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=task_services start="2023-05-11 09:37:12.449036907 +0000 UTC m=+251.723083482" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running exited hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres start="2023-05-11 09:37:12.448916125 +0000 UTC m=+251.722962699" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running exited hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=stats_hook start="2023-05-11 09:37:12.448937563 +0000 UTC m=+251.722984132" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:7cfbbec8963d8f13e6c70416d6592e1cc10f47a348131290a55d43c3acab3fb9 references=1 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=13468 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: handling task state update: 
alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc done=false 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: sending updated alloc: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc client_status=running desired_status="" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres type=Terminated msg="Exit Code: 0" failed=false 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=9 ignored=12 errors=0 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc 
update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=14.651048577s 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15465 total=21 pulled=9 filtered=12 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=9 ignored=12 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=12.056335215s 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak end="2023-05-11 09:37:12.407600394 +0000 UTC m=+251.681646973" duration="464.537ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=task_services start="2023-05-11 09:37:12.407426974 +0000 UTC m=+251.681473544" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=stats_hook end="2023-05-11 09:37:12.407404716 +0000 UTC m=+251.681451297" 
duration="121.339ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: handling task state update: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 done=false 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=task_services end="2023-05-11 09:37:12.40746386 +0000 UTC m=+251.681510439" duration="36.895ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=script_checks end="2023-05-11 09:37:12.407526047 +0000 UTC m=+251.681572628" duration="49.317ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak start="2023-05-11 09:37:12.407135859 +0000 UTC m=+251.681182436" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=stats_hook start="2023-05-11 09:37:12.40728336 +0000 UTC m=+251.681329958" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=script_checks start="2023-05-11 09:37:12.407476733 +0000 UTC m=+251.681523311" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: sending updated alloc: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 client_status=running desired_status="" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak type=Started msg="Task started by client" 
failed=false 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: setting task state: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak state=running 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres end="2023-05-11 09:37:12.403772222 +0000 UTC m=+251.677818802" duration="779.494ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=script_checks end="2023-05-11 09:37:12.403736895 +0000 UTC m=+251.677783474" duration="240.385ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=script_checks start="2023-05-11 09:37:12.403496511 +0000 UTC m=+251.677543089" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: handling task state update: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc done=false 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=task_services start="2023-05-11 09:37:12.403299327 +0000 UTC m=+251.677345915" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=stats_hook start="2023-05-11 09:37:12.40305852 +0000 UTC m=+251.677105101" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc 
task=await-for-keycloak-postgres name=stats_hook end="2023-05-11 09:37:12.403207565 +0000 UTC m=+251.677254143" duration="149.042ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: sending updated alloc: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc client_status=running desired_status="" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=task_services end="2023-05-11 09:37:12.40342624 +0000 UTC m=+251.677472822" duration="126.907ยตs" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres type=Started msg="Task started by client" failed=false 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: setting task state: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres state=running 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres start="2023-05-11 09:37:12.402992734 +0000 UTC m=+251.677039308" 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:37:12.396Z 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:37:12.397Z 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker.docker_logger.stdio: waiting for stdio data: driver=docker 2023-05-11T11:37:12+02:00 
[nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker.docker_logger.stdio: waiting for stdio data: driver=docker 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin2826275756 network=unix timestamp=2023-05-11T09:37:12.394Z 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker network=unix @module=docker_logger address=/tmp/plugin1658978929 timestamp=2023-05-11T09:37:12.391Z 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration=1.248183ms 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=13469 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=ebd9b861a80e271335a077a2495a90d6032cc72c9d68ef204baaf7e3eecb2e94 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=13468
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=12dc8e6496c65864e08ad66de131a6cc6523da0e21c634460529765ad4d3d367
2023-05-11T11:37:12+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=service:_nomad-task-27dfb19c-1e44-2e49-a689-0a4e369f7bd2-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:1
2023-05-11T11:37:12+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=_nomad-check-6b5c913b6ca6013a964c0cd040901fdffb6c7df5
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=ebd9b861a80e271335a077a2495a90d6032cc72c9d68ef204baaf7e3eecb2e94
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=12dc8e6496c65864e08ad66de131a6cc6523da0e21c634460529765ad4d3d367
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=stats_hook start="2023-05-11 09:37:12.222052346 +0000 UTC m=+251.496098925"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_coordinator: state transition: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 from=poststart to=wait_alloc
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=script_checks end="2023-05-11 09:37:12.222438702 +0000 UTC m=+251.496485281" duration="99.516ยตs"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres start="2023-05-11 09:37:12.222001907 +0000 UTC m=+251.496048482"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=task_services start="2023-05-11 09:37:12.222288041 +0000 UTC m=+251.496334621"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=template start="2023-05-11 09:37:12.222234697 +0000 UTC m=+251.496281278"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=script_checks start="2023-05-11 09:37:12.222339194 +0000 UTC m=+251.496385765"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner.task_hook.script_checks: tasklet executing: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=stats_hook end="2023-05-11 09:37:12.22221541 +0000 UTC m=+251.496261986" duration="163.061ยตs"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: sending updated alloc: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 client_status=running desired_status=""
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=task_services end="2023-05-11 09:37:12.222330121 +0000 UTC m=+251.496376701" duration="42.08ยตs"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=template end="2023-05-11 09:37:12.222277995 +0000 UTC m=+251.496324575" duration="43.297ยตs"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_coordinator: state transition: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 from=main to=poststart
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: handling task state update: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 done=false
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres end="2023-05-11 09:37:12.222447684 +0000 UTC m=+251.496494253" duration="445.771ยตs"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres type=Started msg="Task started by client" failed=false
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: setting task state: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres state=running
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:37:12.218Z
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker.docker_logger.stdio: waiting for stdio data: driver=docker
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin1972935584 network=unix timestamp=2023-05-11T09:37:12.216Z
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=await-for-keycloak network_mode=container:64e1e510b0787e191845d6e3dc75d2ef07b762138520ea4b0f2308509d18a026
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=await-for-keycloak-postgres labels="map[com.hashicorp.nomad.alloc_id:1022bc44-6bd1-c8c5-62c5-4166c31f7afc com.hashicorp.nomad.job_id:security com.hashicorp.nomad.job_name:security com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:keycloak com.hashicorp.nomad.task_name:await-for-keycloak-postgres]"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=await-for-keycloak-postgres container_name=await-for-keycloak-postgres-1022bc44-6bd1-c8c5-62c5-4166c31f7afc
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=await-for-keycloak labels="map[com.hashicorp.nomad.alloc_id:87cd4d30-a263-1598-0a57-72046f840473 com.hashicorp.nomad.job_id:security com.hashicorp.nomad.job_name:security com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:keycloak-ingress com.hashicorp.nomad.task_name:await-for-keycloak]"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=await-for-keycloak container_name=await-for-keycloak-87cd4d30-a263-1598-0a57-72046f840473
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container startup command: driver=docker task_name=await-for-keycloak command="sh -c echo -n 'Waiting for service keycloak'; until nslookup keycloak.service.consul 2>&1 >/dev/null; do echo '.'; sleep 2; done"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: binding volumes: driver=docker task_name=await-for-keycloak-postgres volumes=["/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/alloc:/alloc", "/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/await-for-keycloak-postgres/local:/local", "/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/await-for-keycloak-postgres/secrets:/secrets"]
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=await-for-keycloak-postgres binds="[]string{\"/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/await-for-keycloak-postgres/local:/local\", \"/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/await-for-keycloak-postgres/secrets:/secrets\"}"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/busybox:1.36 image_id=sha256:7cfbbec8963d8f13e6c70416d6592e1cc10f47a348131290a55d43c3acab3fb9 references=2
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: binding volumes: driver=docker task_name=await-for-keycloak volumes=["/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/alloc:/alloc", "/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/await-for-keycloak/local:/local", "/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/await-for-keycloak/secrets:/secrets"]
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=await-for-keycloak-postgres memory=1073741824 memory_reservation=134217728 cpu_shares=200 cpu_quota=0 cpu_period=0
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=await-for-keycloak binds="[]string{\"/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/await-for-keycloak/local:/local\", \"/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/await-for-keycloak/secrets:/secrets\"}"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: no docker log driver provided, defaulting to plugin config: driver=docker task_name=await-for-keycloak-postgres
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: no docker log driver provided, defaulting to plugin config: driver=docker task_name=await-for-keycloak
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container startup command: driver=docker task_name=await-for-keycloak-postgres command="sh -c echo -n 'Waiting for service keycloak-postgres'; until nslookup keycloak-postgres.service.consul 2>&1 >/dev/null; do echo '.'; sleep 2; done"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=await-for-keycloak-postgres network_mode=container:60ce164cee8b69f91549c431446ae963eb248a954ae5f584c285044c4ed23c40
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/busybox:1.36 image_id=sha256:7cfbbec8963d8f13e6c70416d6592e1cc10f47a348131290a55d43c3acab3fb9 references=1
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=await-for-keycloak memory=1073741824 memory_reservation=134217728 cpu_shares=200 cpu_quota=0 cpu_period=0
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: docker pull succeeded: driver=docker image_ref=registry.cloud.private/busybox:1.36
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=13330
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=9 ignored=12 errors=0
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=af508eef5709a4970d9e5b4eaedaa6dbaf8961c3a7c008e58d27577a40c4ac4f
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=17.479315324s
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15464 total=21 pulled=9 filtered=12
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=9 ignored=12
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=17.794353677s
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=af508eef5709a4970d9e5b4eaedaa6dbaf8961c3a7c008e58d27577a40c4ac4f
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=keycloak_postgres container_name=keycloak_postgres-27dfb19c-1e44-2e49-a689-0a4e369f7bd2
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: no docker log driver provided, defaulting to plugin config: driver=docker task_name=keycloak_postgres
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=keycloak_postgres network_mode=container:f97917aecbc609cb170d8f89386f9b960d402341fcfa8d0737558c0b62d00600
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=keycloak_postgres labels="map[com.hashicorp.nomad.alloc_id:27dfb19c-1e44-2e49-a689-0a4e369f7bd2 com.hashicorp.nomad.job_id:security com.hashicorp.nomad.job_name:security com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:keycloak-postgres com.hashicorp.nomad.task_name:keycloak_postgres]"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=keycloak_postgres memory=34359738368 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=keycloak_postgres binds="[]string{\"/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/keycloak_postgres/local:/local\", \"/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/keycloak_postgres/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/keycloak_postgres/local/initddb.sql:/docker-entrypoint-initdb.d/initddb.sql\"}"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/postgres:14.5 image_id=sha256:cefd1c9e490c8b581d834d878081cf64c133df1f9f443c5e5f8d94fbd7c7a1d4 references=1
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=registry.cloud.private/postgres:14.5
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: binding volumes: driver=docker task_name=keycloak_postgres volumes=["/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/alloc:/alloc", "/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/keycloak_postgres/local:/local", "/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/keycloak_postgres/secrets:/secrets", "/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/keycloak_postgres/local/initddb.sql:/docker-entrypoint-initdb.d/initddb.sql"]
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: waiting for cgroup to exist for: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres allocID=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task="&{keycloak_postgres docker map[image:registry.cloud.private/postgres:14.5 ports:[db] volumes:[local/initddb.sql:/docker-entrypoint-initdb.d/initddb.sql]] map[PGDATA:/var/lib/postgresql/data/pgdata PGUSER:keycloak POSTGRES_DB:keycloak POSTGRES_INITDB_ARGS:--encoding=UTF8 POSTGRES_USER:keycloak] [] [0xc0009e1080 0xc0009e1140] [] [] 0xc0036c2ea0 0xc002fb2c60 map[] 5s 0xc000d0b770 [] false 0s [0xc0021c0e00] [] }"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres end="2023-05-11 09:37:12.00797799 +0000 UTC m=+251.282024568" duration=56.788974ms
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=template end="2023-05-11 09:37:12.007204868 +0000 UTC m=+251.281251442" duration=10.218806ms
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=script_checks start="2023-05-11 09:37:12.007291522 +0000 UTC m=+251.281338102"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=script_checks end="2023-05-11 09:37:12.007933542 +0000 UTC m=+251.281980124" duration="642.022ยตs"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) all templates rendered
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) nomad.var.block(nomad/jobs/security@default.global) is still needed
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/keycloak_postgres/local/initddb.sql"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/keycloak_postgres/secrets/env.vars"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template 3d38a7b72b56e929ec016a5e38bef38f
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/keycloak_postgres/secrets/env.vars"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=14.763755947s
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] http: Authenticated request: id=alloc:27dfb19c-1e44-2e49-a689-0a4e369f7bd2 method=GET url="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) received data
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: nomad.var.block(nomad/jobs/security@default.global): returned "nomad/jobs/security"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path="/v1/var/nomad/jobs/security?namespace=default&stale=&wait=60000ms" duration="962.493ยตs"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: nomad.var.block(nomad/jobs/security@default.global): GET /v1/var/nomad/jobs/security?index=10462&stale=true&wait=1m0s
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) starting fetch
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template 6e2d32cd4b2988cfe8f68f510d993549
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) receiving dependency nomad.var.block(nomad/jobs/security@default.global)
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) marking successful data response
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) successful contact, resetting retries
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] http: Authenticated request: id=alloc:27dfb19c-1e44-2e49-a689-0a4e369f7bd2 method=GET url="/v1/var/nomad/jobs/security?namespace=default&stale=&wait=60000ms"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=15.477902924s
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing dependency: nomad.var.block(nomad/jobs/security@default.global)
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (view) nomad.var.block(nomad/jobs/security@default.global) starting fetch
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template 3d38a7b72b56e929ec016a5e38bef38f
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (watcher) adding nomad.var.block(nomad/jobs/security@default.global)
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: (watcher) nomad.var.block(nomad/jobs/security@default.global) starting
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] agent: nomad.var.block(nomad/jobs/security@default.global): GET /v1/var/nomad/jobs/security?stale=true&wait=1m0s
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/keycloak_postgres/local/initddb.sql"
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing data for 1 dependencies
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) was not watching 1 dependencies
2023-05-11T11:37:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) add used dependency nomad.var.block(nomad/jobs/security@default.global) to missing since isLeader but do not have a watcher
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) running initial templates
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) starting
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template 6e2d32cd4b2988cfe8f68f510d993549
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating watcher
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":" CREATE SCHEMA IF NOT EXISTS keycloak;\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/keycloak_postgres/local/initddb.sql","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"{{","RightDelim":"}}","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/keycloak_postgres"},{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":" {{- with nomadVar \"nomad/jobs/security\" -}}\n POSTGRES_PASSWORD = {{.keycloak_db_password}}\n {{- end -}}\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/keycloak_postgres/secrets/env.vars","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"{{","RightDelim":"}}","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/keycloak_postgres"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000}
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/keycloak_postgres/local/initddb.sql"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating new runner (dry: false, once: false)
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=api end="2023-05-11 09:37:11.996867386 +0000 UTC m=+251.270913961" duration="914.913ยตs"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=template start="2023-05-11 09:37:11.996986054 +0000 UTC m=+251.271032636"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=api start="2023-05-11 09:37:11.995952467 +0000 UTC m=+251.269999048"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=devices end="2023-05-11 09:37:11.995829608 +0000 UTC m=+251.269876191" duration=1.383036ms
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=artifacts end="2023-05-11 09:37:11.994351693 +0000 UTC m=+251.268398276" duration="884.09ยตs"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=devices start="2023-05-11 09:37:11.994446577 +0000 UTC m=+251.268493155"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=volumes end="2023-05-11 09:37:11.993285213 +0000 UTC m=+251.267331792" duration="652.223ยตs"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=artifacts start="2023-05-11 09:37:11.993467603 +0000 UTC m=+251.267514186"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=dispatch_payload end="2023-05-11 09:37:11.992555053 +0000 UTC m=+251.266601627" duration="623.068ยตs"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=volumes start="2023-05-11 09:37:11.99263299 +0000 UTC m=+251.266679569"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=logmon end="2023-05-11 09:37:11.991837866 +0000 UTC m=+251.265884449" duration=36.957193ms
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=dispatch_payload start="2023-05-11 09:37:11.99193198 +0000 UTC m=+251.265978559"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres @module=logmon path=/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/alloc/logs/.keycloak_postgres.stdout.fifo timestamp=2023-05-11T09:37:11.990Z
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres @module=logmon path=/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/alloc/logs/.keycloak_postgres.stderr.fifo timestamp=2023-05-11T09:37:11.990Z
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner.task_hook.logmon.stdio: waiting for stdio data: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres version=2
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres @module=logmon address=/tmp/plugin3233796807 network=unix timestamp=2023-05-11T09:37:11.988Z
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.988598ms
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres path=/usr/local/bin/nomad pid=13270
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres path=/usr/local/bin/nomad
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=identity end="2023-05-11 09:37:11.954776064 +0000 UTC m=+251.228822647" duration="712.659ยตs"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=logmon start="2023-05-11 09:37:11.954880675 +0000 UTC m=+251.228927256"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=identity start="2023-05-11 09:37:11.954063405 +0000 UTC m=+251.228109988"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=task_dir end="2023-05-11 09:37:11.953900416 +0000 UTC m=+251.227946994" duration=1.862134ms
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres type="Task Setup" msg="Building Task Directory" failed=false
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: handling task state update: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 done=false
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=task_dir start="2023-05-11 09:37:11.952038278 +0000 UTC m=+251.226084860"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: sending updated alloc: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 client_status=running desired_status=""
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: sending updated alloc: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 client_status=running desired_status=""
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=validate end="2023-05-11 09:37:11.951937761 +0000 UTC m=+251.225984343" duration="673.651ยตs"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres start="2023-05-11 09:37:11.951189013 +0000 UTC m=+251.225235594"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=script_checks end="2023-05-11 09:37:11.951020148 +0000 UTC m=+251.225066730" duration="86.915ยตs"
2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_coordinator: state transition: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 from=prestart to=main 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres name=validate start="2023-05-11 09:37:11.951264108 +0000 UTC m=+251.225310692" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: handling task state update: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 done=false 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres end="2023-05-11 09:37:11.951032795 +0000 UTC m=+251.225079374" duration="315.646ยตs" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=task_services end="2023-05-11 09:37:11.950924226 +0000 UTC m=+251.224970805" duration="41.608ยตs" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=script_checks start="2023-05-11 09:37:11.950933242 +0000 UTC m=+251.224979815" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=task_services start="2023-05-11 09:37:11.95088262 +0000 
UTC m=+251.224929197" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=stats_hook start="2023-05-11 09:37:11.95077609 +0000 UTC m=+251.224822669" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=stats_hook end="2023-05-11 09:37:11.950866519 +0000 UTC m=+251.224913099" duration="90.43ยตs" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running poststart hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres start="2023-05-11 09:37:11.950717145 +0000 UTC m=+251.224763728" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres type=Started msg="Task started by client" failed=false 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: setting task state: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres state=running 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:37:11.948Z 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker.docker_logger.stdio: waiting for stdio data: driver=docker 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin338668771 network=unix timestamp=2023-05-11T09:37:11.946Z 2023-05-11T11:37:11+02:00 
[nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=13257 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=439d7de4fcb74bf1c62a664d98d7673b8ed6b5f84bfab990933125eb925d0a3a 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=439d7de4fcb74bf1c62a664d98d7673b8ed6b5f84bfab990933125eb925d0a3a 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=connect-proxy-keycloak-postgres network_mode=container:f97917aecbc609cb170d8f89386f9b960d402341fcfa8d0737558c0b62d00600 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=envoyproxy/envoy:v1.25.1 image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a references=3 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: no docker log driver provided, defaulting to plugin config: driver=docker task_name=connect-proxy-keycloak-postgres 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=connect-proxy-keycloak-postgres 
binds="[]string{\"/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/connect-proxy-keycloak-postgres/local:/local\", \"/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/connect-proxy-keycloak-postgres/secrets:/secrets\"}" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: binding volumes: driver=docker task_name=connect-proxy-keycloak-postgres volumes=["/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/alloc:/alloc", "/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/connect-proxy-keycloak-postgres/local:/local", "/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/connect-proxy-keycloak-postgres/secrets:/secrets"] 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=connect-proxy-keycloak-postgres labels="map[com.github.logunifier.application.pattern.key:envoy com.hashicorp.nomad.alloc_id:27dfb19c-1e44-2e49-a689-0a4e369f7bd2 com.hashicorp.nomad.job_id:security com.hashicorp.nomad.job_name:security com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:keycloak-postgres com.hashicorp.nomad.task_name:connect-proxy-keycloak-postgres]" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=connect-proxy-keycloak-postgres container_name=connect-proxy-keycloak-postgres-27dfb19c-1e44-2e49-a689-0a4e369f7bd2 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=connect-proxy-keycloak-postgres memory=314572800 memory_reservation=0 cpu_shares=100 cpu_quota=0 
cpu_period=0 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres end="2023-05-11 09:37:11.76789154 +0000 UTC m=+251.041938113" duration=223.019105ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=script_checks end="2023-05-11 09:37:11.767841551 +0000 UTC m=+251.041888126" duration="757.499ยตs" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=script_checks start="2023-05-11 09:37:11.767084047 +0000 UTC m=+251.041130627" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: waiting for cgroup to exist for: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres allocID=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task="&{connect-proxy-keycloak-postgres docker map[args:[-c ${NOMAD_SECRETS_DIR}/envoy_bootstrap.json -l ${meta.connect.log_level} --concurrency ${meta.connect.proxy_concurrency} --disable-hot-restart] image:envoyproxy/envoy:v1.25.1 labels:[map[com.github.logunifier.application.pattern.key:envoy]]] map[] [] [] [${attr.consul.version} semver >= 1.8.0 ${attr.consul.grpc} > 0] [] 0xc0036c2f00 0xc002fb2d50 0xc0009eff38 map[] 5s 0xc000d0b7e8 [] false 0s [] [] connect-proxy:keycloak-postgres }" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=envoy_bootstrap end="2023-05-11 09:37:11.766943888 +0000 UTC m=+251.040990466" duration=157.090189ms 2023-05-11T11:37:11+02:00 
[consul.service ๐Ÿ’ป worker-01] [โŒ] agent.http: Request error: method=GET url=/v1/acl/token/self from=127.0.0.1:59062 error="ACL support disabled" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=9 ignored=12 errors=0 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: 
AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=17.310788807s 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15463 total=21 pulled=9 filtered=12 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=9 ignored=12 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security/summary?index=15459 duration=59.378334ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client: next heartbeat: period=11.547832157s 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: check for SI token for task: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres task=connect-proxy-keycloak-postgres exists=false 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: bootstrapping envoy: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres namespace="" proxy_id=_nomad-task-27dfb19c-1e44-2e49-a689-0a4e369f7bd2-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy service=keycloak-postgres gateway="" bootstrap_file=/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/connect-proxy-keycloak-postgres/secrets/envoy_bootstrap.json grpc_addr=unix://alloc/tmp/consul_grpc.sock admin_bind=127.0.0.2:19001 ready_bind=127.0.0.1:19101 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: no SI token to load: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres task=connect-proxy-keycloak-postgres 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป 
worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=envoy_bootstrap start="2023-05-11 09:37:11.609853696 +0000 UTC m=+250.883900277" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: bootstrapping Consul connect-proxy: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres task=connect-proxy-keycloak-postgres service=keycloak-postgres 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=envoy_version end="2023-05-11 09:37:11.609753479 +0000 UTC m=+250.883800053" duration=2.983495ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner.task_hook.envoy_version: setting task envoy image: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres image=envoyproxy/envoy:v1.25.1 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres path=/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/connect-proxy-keycloak-postgres/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/connect-proxy-keycloak-postgres/secrets/api.sock: bind: invalid argument" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=api end="2023-05-11 09:37:11.60669843 +0000 UTC m=+250.880745012" duration="729.37ยตs" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] 
client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=envoy_version start="2023-05-11 09:37:11.606769979 +0000 UTC m=+250.880816558" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=devices end="2023-05-11 09:37:11.605891688 +0000 UTC m=+250.879938269" duration="645.779ยตs" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=api start="2023-05-11 09:37:11.605969058 +0000 UTC m=+250.880015642" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=devices start="2023-05-11 09:37:11.605245911 +0000 UTC m=+250.879292490" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=artifacts end="2023-05-11 09:37:11.605189779 +0000 UTC m=+250.879236354" duration="568.551ยตs" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=volumes end="2023-05-11 09:37:11.604563049 +0000 UTC m=+250.878609625" duration="638.825ยตs" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=artifacts start="2023-05-11 09:37:11.604621223 +0000 UTC m=+250.878667803" 2023-05-11T11:37:11+02:00 
[nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=dispatch_payload start="2023-05-11 09:37:11.603188293 +0000 UTC m=+250.877234896" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=volumes start="2023-05-11 09:37:11.603924221 +0000 UTC m=+250.877970800" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=dispatch_payload end="2023-05-11 09:37:11.603853135 +0000 UTC m=+250.877899718" duration="664.822ยตs" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=logmon end="2023-05-11 09:37:11.603084897 +0000 UTC m=+250.877131471" duration=52.539353ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner.task_hook.logmon.stdio: waiting for stdio data: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres path=/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/alloc/logs/.connect-proxy-keycloak-postgres.stdout.fifo @module=logmon timestamp=2023-05-11T09:37:11.600Z 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 
task=connect-proxy-keycloak-postgres @module=logmon path=/opt/services/core/nomad/data/alloc/27dfb19c-1e44-2e49-a689-0a4e369f7bd2/alloc/logs/.connect-proxy-keycloak-postgres.stderr.fifo timestamp=2023-05-11T09:37:11.600Z 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres version=2 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres @module=logmon address=/tmp/plugin188649827 network=unix timestamp=2023-05-11T09:37:11.597Z 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] consul.sync: sync complete: registered_services=1 deregistered_services=0 registered_checks=0 deregistered_checks=0 2023-05-11T11:37:11+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-27dfb19c-1e44-2e49-a689-0a4e369f7bd2-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy 2023-05-11T11:37:11+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-27dfb19c-1e44-2e49-a689-0a4e369f7bd2-group-keycloak-postgres-keycloak-postgres-5432 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres path=/usr/local/bin/nomad pid=13182 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres path=/usr/local/bin/nomad 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: 
alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"] 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=identity end="2023-05-11 09:37:11.550437722 +0000 UTC m=+250.824484298" duration=1.263548ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=logmon start="2023-05-11 09:37:11.55054553 +0000 UTC m=+250.824592118" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=identity start="2023-05-11 09:37:11.549174171 +0000 UTC m=+250.823220750" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=task_dir end="2023-05-11 09:37:11.549091999 +0000 UTC m=+250.823138575" duration=2.331206ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: sending updated alloc: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 client_status=pending desired_status="" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] consul.sync: must register service: id=_nomad-task-27dfb19c-1e44-2e49-a689-0a4e369f7bd2-group-keycloak-postgres-keycloak-postgres-5432 exists=false reason=operations 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: handling task state update: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 done=false 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] 
client.alloc_runner.task_runner: finished prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=validate end="2023-05-11 09:37:11.546647742 +0000 UTC m=+250.820694320" duration=1.702843ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=task_dir start="2023-05-11 09:37:11.546760789 +0000 UTC m=+250.820807369" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres type="Task Setup" msg="Building Task Directory" failed=false 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: finished pre-run hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 name=csi_hook end="2023-05-11 09:37:11.544735563 +0000 UTC m=+250.818782139" duration="56.356ยตs" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: sending updated alloc: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 client_status=pending desired_status="" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: finished pre-run hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 name=network end="2023-05-11 09:37:11.544432529 +0000 UTC m=+250.818479106" duration=1.062182242s 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] watch.checks: now watching check: alloc_i=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=group-keycloak-postgres check=keycloak_postgres_ping 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner: finished pre-run hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 name=consul_grpc_socket end="2023-05-11 09:37:11.544662103 +0000 UTC m=+250.818708679" duration="68.24ยตs" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] 
client.alloc_runner.task_runner: running prestart hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres start="2023-05-11 09:37:11.544872432 +0000 UTC m=+250.818919008"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_coordinator: state transition: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 from=init to=prestart
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: execute sync: reason=operations
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 name=checks_hook start="2023-05-11 09:37:11.544748302 +0000 UTC m=+250.818794881"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.runner_hook: received result from CNI: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 result="{\"Interfaces\":{\"eth0\":{\"IPConfigs\":[{\"IP\":\"172.26.66.71\",\"Gateway\":\"172.26.64.1\"}],\"Mac\":\"32:75:56:87:d2:5c\",\"Sandbox\":\"/var/run/docker/netns/dbfab78c8128\"},\"nomad\":{\"IPConfigs\":null,\"Mac\":\"b6:77:25:d7:7f:65\",\"Sandbox\":\"\"},\"veth0f1c5d73\":{\"IPConfigs\":null,\"Mac\":\"f6:2d:e0:21:64:df\",\"Sandbox\":\"\"}},\"DNS\":[{}],\"Routes\":[{\"dst\":\"0.0.0.0/0\"}]}"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 name=group_services start="2023-05-11 09:37:11.544450841 +0000 UTC m=+250.818497412"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 name=consul_http_socket end="2023-05-11 09:37:11.544673915 +0000 UTC m=+250.818720485" duration="4.721µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: commit sync operations: ops="<1, 1, 0, 0>"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 name=checks_hook end="2023-05-11 09:37:11.544758726 +0000 UTC m=+250.818805309" duration="10.428µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hooks: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 end="2023-05-11 09:37:11.544769462 +0000 UTC m=+250.818816042" duration=1.109612071s
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 client_status=pending desired_status=""
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 name=consul_http_socket start="2023-05-11 09:37:11.544669194 +0000 UTC m=+250.818715764"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 name=csi_hook start="2023-05-11 09:37:11.544679214 +0000 UTC m=+250.818725783"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 name=group_services end="2023-05-11 09:37:11.544584964 +0000 UTC m=+250.818631539" duration="134.127µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 done=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres name=validate start="2023-05-11 09:37:11.544944898 +0000 UTC m=+250.818991477"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 done=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 name=consul_grpc_socket start="2023-05-11 09:37:11.544593866 +0000 UTC m=+250.818640439"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=12 errors=0
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=15.817920153s
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15462 total=21 pulled=9 filtered=12
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=12
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=19.040312072s
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=task_services start="2023-05-11 09:37:11.446325905 +0000 UTC m=+250.720372482"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=script_checks end="2023-05-11 09:37:11.446539605 +0000 UTC m=+250.720586188" duration="151.256µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc client_status=running desired_status=""
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=task_services end="2023-05-11 09:37:11.446376995 +0000 UTC m=+250.720423575" duration="51.093µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=script_checks start="2023-05-11 09:37:11.446388353 +0000 UTC m=+250.720434932"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=stats_hook end="2023-05-11 09:37:11.446307927 +0000 UTC m=+250.720354509" duration="87.213µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak start="2023-05-11 09:37:11.446146053 +0000 UTC m=+250.720192629"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running poststart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=stats_hook start="2023-05-11 09:37:11.446220705 +0000 UTC m=+250.720267296"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc done=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished poststart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak end="2023-05-11 09:37:11.446623494 +0000 UTC m=+250.720670069" duration="477.44µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: setting task state: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak state=running
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak type=Started msg="Task started by client" failed=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:37:11.442Z
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker.docker_logger.stdio: waiting for stdio data: driver=docker
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker network=unix @module=docker_logger address=/tmp/plugin2133435657 timestamp=2023-05-11T09:37:11.441Z
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=9876c30082db8e30863019c34659465feb8d1324c8c5f87f6068d5cc778b1818
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=13133
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 client_status=pending desired_status=""
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 done=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak type=Driver msg="Downloading image" failed=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: emitting event: driver=docker event="&{87cd4d30-a263-1598-0a57-72046f840473/await-for-keycloak/896a3e36 await-for-keycloak 87cd4d30-a263-1598-0a57-72046f840473 2023-05-11 09:37:11.354707793 +0000 UTC m=+250.628754370 Downloading image map[image:registry.cloud.private/busybox:1.36] }"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr: task event received: driver=docker event="&{87cd4d30-a263-1598-0a57-72046f840473/await-for-keycloak/896a3e36 await-for-keycloak 87cd4d30-a263-1598-0a57-72046f840473 2023-05-11 09:37:11.354707793 +0000 UTC m=+250.628754370 Downloading image map[image:registry.cloud.private/busybox:1.36] }"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=api end="2023-05-11 09:37:11.353003846 +0000 UTC m=+250.627050420" duration="658.03µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=script_checks end="2023-05-11 09:37:11.353596122 +0000 UTC m=+250.627642698" duration="539.814µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak end="2023-05-11 09:37:11.353642597 +0000 UTC m=+250.627689171" duration=65.388905ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=script_checks start="2023-05-11 09:37:11.353056304 +0000 UTC m=+250.627102884"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: waiting for cgroup to exist for: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak allocID=87cd4d30-a263-1598-0a57-72046f840473 task="&{await-for-keycloak docker map[args:[-c echo -n 'Waiting for service keycloak'; until nslookup keycloak.service.consul 2>&1 >/dev/null; do echo '.'; sleep 2; done] command:sh image:registry.cloud.private/busybox:1.36] map[] [] [] [] [] 0xc0036c38c0 0xc002fb38f0 0xc000f88528 map[] 5s 0xc002cf2a50 [] false 0s [] [] }"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=devices end="2023-05-11 09:37:11.352285779 +0000 UTC m=+250.626332353" duration="415.214µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=api start="2023-05-11 09:37:11.35234581 +0000 UTC m=+250.626392390"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak path=/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/await-for-keycloak/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/await-for-keycloak/secrets/api.sock: bind: invalid argument"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=artifacts end="2023-05-11 09:37:11.351819922 +0000 UTC m=+250.625866498" duration="527.781µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=artifacts start="2023-05-11 09:37:11.351292139 +0000 UTC m=+250.625338717"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=volumes end="2023-05-11 09:37:11.351229839 +0000 UTC m=+250.625276419" duration="585.565µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=devices start="2023-05-11 09:37:11.35187056 +0000 UTC m=+250.625917139"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=volumes start="2023-05-11 09:37:11.350644275 +0000 UTC m=+250.624690854"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=dispatch_payload end="2023-05-11 09:37:11.350584986 +0000 UTC m=+250.624631561" duration="602.239µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=dispatch_payload start="2023-05-11 09:37:11.349982741 +0000 UTC m=+250.624029322"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=logmon end="2023-05-11 09:37:11.349893058 +0000 UTC m=+250.623939640" duration=54.317026ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak path=/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/alloc/logs/.await-for-keycloak.stderr.fifo @module=logmon timestamp=2023-05-11T09:37:11.348Z
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak @module=logmon path=/opt/services/core/nomad/data/alloc/87cd4d30-a263-1598-0a57-72046f840473/alloc/logs/.await-for-keycloak.stdout.fifo timestamp=2023-05-11T09:37:11.347Z
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner.task_hook.logmon.stdio: waiting for stdio data: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak @module=logmon address=/tmp/plugin4257362822 network=unix timestamp=2023-05-11T09:37:11.344Z
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak version=2
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=1 deregistered_services=0 registered_checks=0 deregistered_checks=0
2023-05-11T11:37:11+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-87cd4d30-a263-1598-0a57-72046f840473-group-keycloak-ingress-forwardauth-auth
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak path=/usr/local/bin/nomad
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak path=/usr/local/bin/nomad pid=13023
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=identity end="2023-05-11 09:37:11.295320598 +0000 UTC m=+250.569367172" duration=2.19325ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=logmon start="2023-05-11 09:37:11.295576033 +0000 UTC m=+250.569622614"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=identity start="2023-05-11 09:37:11.293127343 +0000 UTC m=+250.567173922"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=task_dir end="2023-05-11 09:37:11.292791954 +0000 UTC m=+250.566838553" duration=3.452001ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: must register service: id=_nomad-task-87cd4d30-a263-1598-0a57-72046f840473-group-keycloak-ingress-forwardauth-auth exists=false reason=operations
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 done=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 client_status=pending desired_status=""
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak type="Task Setup" msg="Building Task Directory" failed=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=task_dir start="2023-05-11 09:37:11.28933997 +0000 UTC m=+250.563386552"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=validate end="2023-05-11 09:37:11.289241052 +0000 UTC m=+250.563287633" duration="879.901µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="494.224µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 end="2023-05-11 09:37:11.288108121 +0000 UTC m=+250.562154702" duration=854.878262ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: execute sync: reason=operations
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 name=checks_hook start="2023-05-11 09:37:11.288090796 +0000 UTC m=+250.562137377"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hooks: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak start="2023-05-11 09:37:11.288253696 +0000 UTC m=+250.562300266"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 done=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 client_status=pending desired_status=""
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 name=csi_hook start="2023-05-11 09:37:11.288072881 +0000 UTC m=+250.562119455"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 name=consul_http_socket end="2023-05-11 09:37:11.288009597 +0000 UTC m=+250.562056193" duration="7.565µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 client_status=pending desired_status=""
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 done=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 name=csi_hook end="2023-05-11 09:37:11.288082537 +0000 UTC m=+250.562129114" duration="9.659µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_coordinator: state transition: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 from=init to=prestart
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 name=checks_hook end="2023-05-11 09:37:11.288100632 +0000 UTC m=+250.562147214" duration="9.837µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak name=validate start="2023-05-11 09:37:11.288361153 +0000 UTC m=+250.562407732"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 name=consul_http_socket start="2023-05-11 09:37:11.288002058 +0000 UTC m=+250.562048628"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 name=network end="2023-05-11 09:37:11.287645293 +0000 UTC m=+250.561691887" duration=822.604385ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 name=group_services end="2023-05-11 09:37:11.287910522 +0000 UTC m=+250.561957106" duration="244.037µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 name=group_services start="2023-05-11 09:37:11.287666492 +0000 UTC m=+250.561713069"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.runner_hook: received result from CNI: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 result="{\"Interfaces\":{\"eth0\":{\"IPConfigs\":[{\"IP\":\"172.26.66.70\",\"Gateway\":\"172.26.64.1\"}],\"Mac\":\"2e:1d:b6:c2:63:54\",\"Sandbox\":\"/var/run/docker/netns/b4529492179a\"},\"nomad\":{\"IPConfigs\":null,\"Mac\":\"b6:77:25:d7:7f:65\",\"Sandbox\":\"\"},\"vetheebb19b8\":{\"IPConfigs\":null,\"Mac\":\"62:a5:b4:69:02:95\",\"Sandbox\":\"\"}},\"DNS\":[{}],\"Routes\":[{\"dst\":\"0.0.0.0/0\"}]}"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 name=consul_grpc_socket end="2023-05-11 09:37:11.287993139 +0000 UTC m=+250.562039712" duration="10.97µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 name=consul_grpc_socket start="2023-05-11 09:37:11.287982155 +0000 UTC m=+250.562028742"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: commit sync operations: ops="<1, 0, 0, 0>"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=9876c30082db8e30863019c34659465feb8d1324c8c5f87f6068d5cc778b1818
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=12 errors=0
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc modify_index=15370
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=connect-proxy-keycloak memory=314572800 memory_reservation=0 cpu_shares=100 cpu_quota=0 cpu_period=0
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: no docker log driver provided, defaulting to plugin config: driver=docker task_name=connect-proxy-keycloak
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=envoyproxy/envoy:v1.25.1 image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a references=2
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=connect-proxy-keycloak container_name=connect-proxy-keycloak-1022bc44-6bd1-c8c5-62c5-4166c31f7afc
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=connect-proxy-keycloak labels="map[com.github.logunifier.application.pattern.key:envoy com.hashicorp.nomad.alloc_id:1022bc44-6bd1-c8c5-62c5-4166c31f7afc com.hashicorp.nomad.job_id:security com.hashicorp.nomad.job_name:security com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:keycloak com.hashicorp.nomad.task_name:connect-proxy-keycloak]"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=connect-proxy-keycloak network_mode=container:60ce164cee8b69f91549c431446ae963eb248a954ae5f584c285044c4ed23c40
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: binding volumes: driver=docker task_name=connect-proxy-keycloak volumes=["/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/alloc:/alloc", "/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/connect-proxy-keycloak/local:/local", "/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/connect-proxy-keycloak/secrets:/secrets"]
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=connect-proxy-keycloak binds="[]string{\"/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/connect-proxy-keycloak/local:/local\", \"/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/connect-proxy-keycloak/secrets:/secrets\"}"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 modify_index=15384
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak end="2023-05-11 09:37:11.241290896 +0000 UTC m=+250.515337470" duration=170.583066ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: waiting for cgroup to exist for: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak allocID=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task="&{connect-proxy-keycloak docker map[args:[-c ${NOMAD_SECRETS_DIR}/envoy_bootstrap.json -l ${meta.connect.log_level} --concurrency ${meta.connect.proxy_concurrency} --disable-hot-restart] image:envoyproxy/envoy:v1.25.1 labels:[map[com.github.logunifier.application.pattern.key:envoy]]] map[] [] [] [${attr.consul.version} semver >= 1.8.0 ${attr.consul.grpc} > 0] [] 0xc000a446c0 0xc0022b46f0 0xc000c9cf30 map[] 5s 0xc002f14dc8 [] false 0s [] [] connect-proxy:keycloak }"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=script_checks end="2023-05-11 09:37:11.241242181 +0000 UTC m=+250.515288758" duration=1.892952ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="563.101µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 modify_index=15428
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=envoy_bootstrap end="2023-05-11 09:37:11.239204757 +0000 UTC m=+250.513251339" duration=105.387973ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=script_checks start="2023-05-11 09:37:11.239349217 +0000 UTC m=+250.513395806"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 modify_index=15442
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=54969951-d541-ae97-922a-7db38096bae5 modify_index=15406
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=434f71a9-4b50-8512-effc-5858456f87be modify_index=15370
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 modify_index=15373
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff modify_index=15428
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=18.714319087s
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: AllocRunner has terminated, skipping alloc update: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d modify_index=15373
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15461 total=21 pulled=9 filtered=12
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=12
2023-05-11T11:37:11+02:00 [consul.service 💻 worker-01] [❌] agent.http: Request error: method=GET url=/v1/acl/token/self from=127.0.0.1:59050 error="ACL support disabled"
2023-05-11T11:37:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations?index=15459 duration=713.199406ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client: next heartbeat: period=11.625467157s
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc done=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc client_status=pending desired_status=""
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: emitting event: driver=docker event="&{1022bc44-6bd1-c8c5-62c5-4166c31f7afc/await-for-keycloak-postgres/3af7486f await-for-keycloak-postgres 1022bc44-6bd1-c8c5-62c5-4166c31f7afc 2023-05-11 09:37:11.142391235 +0000 UTC m=+250.416437815 Downloading image map[image:registry.cloud.private/busybox:1.36] }"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr: task event received: driver=docker event="&{1022bc44-6bd1-c8c5-62c5-4166c31f7afc/await-for-keycloak-postgres/3af7486f await-for-keycloak-postgres 1022bc44-6bd1-c8c5-62c5-4166c31f7afc 2023-05-11 09:37:11.142391235 +0000 UTC m=+250.416437815 Downloading image map[image:registry.cloud.private/busybox:1.36] }"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres type=Driver msg="Downloading image" failed=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=script_checks end="2023-05-11 09:37:11.140537771 +0000 UTC m=+250.414584350" duration="859.768µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres end="2023-05-11 09:37:11.140697709 +0000 UTC m=+250.414744284" duration=70.452009ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: waiting for cgroup to exist for: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres allocID=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task="&{await-for-keycloak-postgres docker map[args:[-c echo -n 'Waiting for service keycloak-postgres'; until nslookup keycloak-postgres.service.consul 2>&1 >/dev/null; do echo '.'; sleep 2; done] command:sh image:registry.cloud.private/busybox:1.36] map[] [] [] [] [] 0xc000a44600 0xc0022b4480 0xc000c9ce28 map[] 5s 0xc002f14cf0 [] false 0s [] [] }"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=script_checks start="2023-05-11 09:37:11.139678 +0000 UTC m=+250.413724582"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=api end="2023-05-11 09:37:11.139336061 +0000 UTC m=+250.413382641" duration=1.325551ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=api start="2023-05-11 09:37:11.13801051 +0000 UTC m=+250.412057090"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres path=/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/await-for-keycloak-postgres/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/await-for-keycloak-postgres/secrets/api.sock: bind: invalid argument"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=devices end="2023-05-11 09:37:11.137845416 +0000 UTC m=+250.411891992" duration="815.475µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=devices start="2023-05-11 09:37:11.137029938 +0000 UTC m=+250.411076517"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=artifacts start="2023-05-11 09:37:11.136184948 +0000 UTC m=+250.410231527"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=volumes end="2023-05-11 09:37:11.13603142 +0000 UTC m=+250.410077994" duration="893.956µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc
task=await-for-keycloak-postgres name=artifacts end="2023-05-11 09:37:11.136870217 +0000 UTC m=+250.410916792" duration="685.265ยตs" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=volumes start="2023-05-11 09:37:11.135137457 +0000 UTC m=+250.409184038" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=dispatch_payload end="2023-05-11 09:37:11.134939077 +0000 UTC m=+250.408985656" duration=1.81997ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: no SI token to load: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak task=connect-proxy-keycloak 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: bootstrapping envoy: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak namespace="" proxy_id=_nomad-task-1022bc44-6bd1-c8c5-62c5-4166c31f7afc-group-keycloak-keycloak-8080-sidecar-proxy service=keycloak gateway="" bootstrap_file=/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/connect-proxy-keycloak/secrets/envoy_bootstrap.json grpc_addr=unix://alloc/tmp/consul_grpc.sock admin_bind=127.0.0.2:19002 ready_bind=127.0.0.1:19102 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: check for SI token for task: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak task=connect-proxy-keycloak exists=false 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc 
task=connect-proxy-keycloak name=envoy_bootstrap start="2023-05-11 09:37:11.133816787 +0000 UTC m=+250.407863366" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: bootstrapping Consul connect-proxy: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak task=connect-proxy-keycloak service=keycloak 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=envoy_version end="2023-05-11 09:37:11.133714265 +0000 UTC m=+250.407760839" duration=3.336247ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=dispatch_payload start="2023-05-11 09:37:11.133119107 +0000 UTC m=+250.407165686" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner.task_hook.envoy_version: setting task envoy image: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak image=envoyproxy/envoy:v1.25.1 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=logmon end="2023-05-11 09:37:11.132943128 +0000 UTC m=+250.406989702" duration=55.044169ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres path=/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/alloc/logs/.await-for-keycloak-postgres.stdout.fifo @module=logmon timestamp=2023-05-11T09:37:11.131Z 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] 
client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres @module=logmon path=/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/alloc/logs/.await-for-keycloak-postgres.stderr.fifo timestamp=2023-05-11T09:37:11.131Z 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=envoy_version start="2023-05-11 09:37:11.130378012 +0000 UTC m=+250.404424592" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=api end="2023-05-11 09:37:11.130168747 +0000 UTC m=+250.404215325" duration=1.393978ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak path=/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/connect-proxy-keycloak/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/connect-proxy-keycloak/secrets/api.sock: bind: invalid argument" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=api start="2023-05-11 09:37:11.128774768 +0000 UTC m=+250.402821347" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=devices end="2023-05-11 09:37:11.128618169 +0000 UTC m=+250.402664744" duration=2.214408ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป 
worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner.task_hook.logmon.stdio: waiting for stdio data: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=devices start="2023-05-11 09:37:11.126403757 +0000 UTC m=+250.400450336" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres @module=logmon address=/tmp/plugin1512723149 network=unix timestamp=2023-05-11T09:37:11.126Z 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=artifacts end="2023-05-11 09:37:11.126199657 +0000 UTC m=+250.400246234" duration=1.437232ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres version=2 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=artifacts start="2023-05-11 09:37:11.124762421 +0000 UTC m=+250.398809002" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=volumes end="2023-05-11 09:37:11.124468427 +0000 UTC m=+250.398515006" duration=1.086966ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc 
task=connect-proxy-keycloak name=volumes start="2023-05-11 09:37:11.12338146 +0000 UTC m=+250.397428040" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=dispatch_payload end="2023-05-11 09:37:11.123170888 +0000 UTC m=+250.397217470" duration=2.342954ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=dispatch_payload start="2023-05-11 09:37:11.120827918 +0000 UTC m=+250.394874516" 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=logmon end="2023-05-11 09:37:11.120699454 +0000 UTC m=+250.394746034" duration=42.292225ms 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak @module=logmon path=/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/alloc/logs/.connect-proxy-keycloak.stderr.fifo timestamp=2023-05-11T09:37:11.118Z 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak @module=logmon path=/opt/services/core/nomad/data/alloc/1022bc44-6bd1-c8c5-62c5-4166c31f7afc/alloc/logs/.connect-proxy-keycloak.stdout.fifo timestamp=2023-05-11T09:37:11.118Z 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.alloc_runner.task_runner.task_hook.logmon.stdio: waiting for stdio data: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak 2023-05-11T11:37:11+02:00 
[nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak @module=logmon address=/tmp/plugin411442571 network=unix timestamp=2023-05-11T09:37:11.116Z 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak version=2 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] consul.sync: sync complete: registered_services=1 deregistered_services=0 registered_checks=0 deregistered_checks=0 2023-05-11T11:37:11+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-1022bc44-6bd1-c8c5-62c5-4166c31f7afc-group-keycloak-keycloak-8080 2023-05-11T11:37:11+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-1022bc44-6bd1-c8c5-62c5-4166c31f7afc-group-keycloak-keycloak-8080-sidecar-proxy 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres path=/usr/local/bin/nomad pid=12909 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres path=/usr/local/bin/nomad 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak path=/usr/local/bin/nomad 2023-05-11T11:37:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak path=/usr/local/bin/nomad pid=12910 
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=identity end="2023-05-11 09:37:11.078251183 +0000 UTC m=+250.352297762" duration=1.627464ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc client_status=pending desired_status=""
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=logmon start="2023-05-11 09:37:11.07840722 +0000 UTC m=+250.352453809"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=logmon start="2023-05-11 09:37:11.077898954 +0000 UTC m=+250.351945533"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=identity end="2023-05-11 09:37:11.077314832 +0000 UTC m=+250.351361411" duration=1.697584ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=identity start="2023-05-11 09:37:11.076623719 +0000 UTC m=+250.350670298"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc done=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=task_dir end="2023-05-11 09:37:11.076537326 +0000 UTC m=+250.350583901" duration=2.944149ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc client_status=pending desired_status=""
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=task_dir end="2023-05-11 09:37:11.075459842 +0000 UTC m=+250.349506422" duration=3.756274ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=identity start="2023-05-11 09:37:11.075617251 +0000 UTC m=+250.349663827"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=task_dir start="2023-05-11 09:37:11.073593174 +0000 UTC m=+250.347639752"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=validate end="2023-05-11 09:37:11.073526666 +0000 UTC m=+250.347573241" duration=2.043227ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak type="Task Setup" msg="Building Task Directory" failed=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc done=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: must register service: id=_nomad-task-1022bc44-6bd1-c8c5-62c5-4166c31f7afc-group-keycloak-keycloak-8080 exists=false reason=operations
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: finished prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=validate end="2023-05-11 09:37:11.071638993 +0000 UTC m=+250.345685567" duration=1.316901ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres type="Task Setup" msg="Building Task Directory" failed=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak name=validate start="2023-05-11 09:37:11.071483429 +0000 UTC m=+250.345530014"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=task_dir start="2023-05-11 09:37:11.071703569 +0000 UTC m=+250.345750148"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc end="2023-05-11 09:37:11.070053759 +0000 UTC m=+250.344100332" duration=639.05651ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc name=checks_hook end="2023-05-11 09:37:11.070046794 +0000 UTC m=+250.344093366" duration="9.535µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc done=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: handling task state update: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc done=false
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres start="2023-05-11 09:37:11.070245696 +0000 UTC m=+250.344292275"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc client_status=pending desired_status=""
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_coordinator: state transition: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc from=init to=prestart
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres name=validate start="2023-05-11 09:37:11.070322084 +0000 UTC m=+250.344368666"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc name=checks_hook start="2023-05-11 09:37:11.070037252 +0000 UTC m=+250.344083831"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc name=csi_hook start="2023-05-11 09:37:11.07001875 +0000 UTC m=+250.344065322"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc name=consul_http_socket end="2023-05-11 09:37:11.07000658 +0000 UTC m=+250.344053159" duration="23.237µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: sending updated alloc: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc client_status=pending desired_status=""
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc name=csi_hook end="2023-05-11 09:37:11.07002847 +0000 UTC m=+250.344075049" duration="9.727µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner.task_runner: running prestart hooks: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak start="2023-05-11 09:37:11.070707812 +0000 UTC m=+250.344754404"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc name=consul_http_socket start="2023-05-11 09:37:11.06998334 +0000 UTC m=+250.344029922"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] watch.checks: now watching check: alloc_i=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=group-keycloak check=ready
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: execute sync: reason=operations
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] watch.checks: now watching check: alloc_i=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=group-keycloak check=health
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc name=consul_grpc_socket end="2023-05-11 09:37:11.069511667 +0000 UTC m=+250.343558241" duration="781.764µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] consul.sync: commit sync operations: ops="<1, 3, 0, 0>"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc name=group_services end="2023-05-11 09:37:11.068718414 +0000 UTC m=+250.342764991" duration="157.609µs"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.runner_hook: received result from CNI: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc result="{\"Interfaces\":{\"eth0\":{\"IPConfigs\":[{\"IP\":\"172.26.66.69\",\"Gateway\":\"172.26.64.1\"}],\"Mac\":\"02:fc:c5:a7:d0:d1\",\"Sandbox\":\"/var/run/docker/netns/97c791d597c8\"},\"nomad\":{\"IPConfigs\":null,\"Mac\":\"b6:77:25:d7:7f:65\",\"Sandbox\":\"\"},\"veth7fb8a220\":{\"IPConfigs\":null,\"Mac\":\"76:d6:af:cc:09:02\",\"Sandbox\":\"\"}},\"DNS\":[{}],\"Routes\":[{\"dst\":\"0.0.0.0/0\"}]}"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc name=group_services start="2023-05-11 09:37:11.068560807 +0000 UTC m=+250.342607382"
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: finished pre-run hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc name=network end="2023-05-11 09:37:11.068543949 +0000 UTC m=+250.342590523" duration=613.229271ms
2023-05-11T11:37:11+02:00 [nomad.service 💻 worker-01] [👀] client.alloc_runner: running pre-run hook: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc name=consul_grpc_socket start="2023-05-11 09:37:11.068729898 +0000 UTC m=+250.342776477"
2023-05-11T11:37:10+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_303fbdec-6d2a-bb10-4a8e-a6c0dad17a05]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_230c4069-3739-55ae-acf5-e00ad49a78d0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d2928440-57bd-9f0d-be52-db209648a8b7]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0c17f752-dbd0-0697-a3e6-0f53fd8e9e6b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2c7dabe8-496b-1b80-2b3b-fad6530c94be]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/security_postgres-ebfef72d-5162-582e-79d2-714b5a5854a9]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_395f79bb-f3aa-730c-b2a8-d3eacbad0b1e]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-grafana-303fbdec-6d2a-bb10-4a8e-a6c0dad17a05]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_815b8320-f88f-14f8-a12c-2ad2e7c81af8]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/keycloak-7ea6f631-b991-3c18-18f8-d4a88c779de0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ebfef72d-5162-582e-79d2-714b5a5854a9]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c754856f-9e64-7585-4e96-aea4177f666d]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_98060fb3-c42f-2555-1181-7a8c2ccdf5e3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nats-d16da4e2-3e1e-fc6f-864c-a5f40ea080ef]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e222c9d8-611d-1158-3bd1-5cd575e3bfae]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8a26522c-7343-fd3f-ecb6-e7a8207afef3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_33410af5-f1eb-d8c0-0f95-18eed2fc0545]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f7fb25b5-c57c-f328-5492-982192b038e3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3f2ee033-a760-0227-0c16-00efd8830dae]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nats-prometheus-exporter-d16da4e2-3e1e-fc6f-864c-a5f40ea080ef]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_273102c3-f542-e0a3-3eae-d80c6ec50c74]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_83234bd8-8193-89ab-6234-59d2fb689df2]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-keycloak-7ea6f631-b991-3c18-18f8-d4a88c779de0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/grafana-agent-83234bd8-8193-89ab-6234-59d2fb689df2]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dd98d116-01c5-bc43-1d57-e93333e5c5fb]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7ea6f631-b991-3c18-18f8-d4a88c779de0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_578ba772-e064-722a-a608-dcd67e4f0d69]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e6d2193c-f1b2-7888-cac9-0207318d1ad2]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8958df2c-747b-937d-fda6-9de0502129a3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7c66d139-6193-eec9-7f1f-bb66615746be]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0a6c32ef-65d7-113b-e95f-0d4fcbe5b17e]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/mimir-e6d2193c-f1b2-7888-cac9-0207318d1ad2]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_53ac7e54-c453-1470-617d-705c82a9d303]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-security-postgres-ebfef72d-5162-582e-79d2-714b5a5854a9]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d914a6cb-df4c-c1a2-72c4-01fa1270a96a]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d07aefbb-0892-0f71-277f-62b800275169]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/forwardauth-8678d135-46d3-ef22-2581-818e5ec07333]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_62fe668f-b911-d1b2-fe29-65c45e4132fd]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b962fb2f-9cfc-0ee5-c8e0-13cb5ae7eef4]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_778d37b8-b473-7ae4-1aeb-5c89c8aafdb2]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8dc8d24e-db95-9ab8-4100-5ff73fd0107f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f66b4793-86ac-5933-935d-753d07bbe2f1]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1a604a30-09cb-dd55-49a9-51661b063017]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1df99288-8878-cc8c-5048-da8fa9ab2ac1]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f9449c12-3984-3c99-55eb-606389ce2624]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_00abd284-a436-a771-7932-2588a00861bb]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/loki-f9449c12-3984-3c99-55eb-606389ce2624]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2550e92d-7eea-a28c-8df3-8ac8921a618c]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2d9a275c-c00f-97de-5355-7211f16b700b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0aaf5216-6aca-175e-5690-e24589f6b339]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d16da4e2-3e1e-fc6f-864c-a5f40ea080ef]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/minio-273102c3-f542-e0a3-3eae-d80c6ec50c74]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1c8555ba-1133-2e30-c23f-a1ed2f3a52cd]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_29b5878e-b982-304a-661d-27e4346863cb]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed
container: driver=docker names=[/nomad_init_d180bfaf-7db9-6d0c-f06e-d9f8fa4b0d08] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/grafana-303fbdec-6d2a-bb10-4a8e-a6c0dad17a05] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_689aa6d5-4263-33ed-bd9d-6b995cab3269] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/logunifier-dad53d62-2b73-6995-03d3-6c225a2fb549] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2e27123c-e020-cca6-d17f-227a9204992f] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9256b9c1-e86a-c54e-76c9-19b554d3590a] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_95892491-2a91-03ad-131b-87fe8c898a05] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ebf9b42c-3684-c50b-d777-98758305caa2] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/traefik-4eee21ed-4cb6-1319-8401-19107f29b34b] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b55f2d5e-2db6-b3c2-0406-c99a6086799c] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dad53d62-2b73-6995-03d3-6c225a2fb549] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_10d3203e-fb17-25f5-0443-9dc14bee4353] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_293c489e-6f64-7df6-181d-8d4bd3cc6dc7] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d9fe8ae4-5f8f-ea8c-01ce-032da8d61c02] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/tempo-1df99288-8878-cc8c-5048-da8fa9ab2ac1] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4caca5c4-5782-7020-3b9c-319bb6fb8d6f] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_51934506-a27c-fce0-0046-1cb998e8fc8a] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8678d135-46d3-ef22-2581-818e5ec07333] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_094169c6-7f71-8797-2811-d28b34f98cbc] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_cdaf15cd-9c50-e527-3d84-4ca24fe957d0] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1c936c4a-1073-d460-a67e-b7ecc9193d82] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b3f39155-7204-f2bf-49a9-edbf0fcaa6f4] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_d76e8314-e43c-b6fc-ffeb-a4a16f92ed30] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ad09d1f3-ea27-a0a4-a909-a03433289f68] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dad6f08d-63f9-981e-0710-f606f16b1612] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b546beb3-03b8-6d1e-5043-668d3eda5687] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1a5d4961-4f69-11d7-5c6a-f1173533a8da] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_25437077-cb41-bc3e-8448-39bb4fc3a1b9] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9f464894-eacb-9913-c641-a29461c38cc8] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_24bc8713-ad7d-041a-fe16-cada508034d2] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_aff7cd30-fe31-a88e-15d9-87c1d9c99e5c] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a59a5bb9-4564-eb81-3c8f-4cd28b8bfefb] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6b90ef66-8558-63c4-b4ae-28a46aff796c] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_fb14d5de-dcda-5120-191e-6db17613d746] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_83b8ebe5-10cd-3595-e5c9-4f32bf9a0aea] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_76f5d304-f03e-56a5-03a9-4b6f1ffade8b] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4be86bf6-ec41-3afe-28b8-390999baf6ca] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e1697492-e775-22a5-0fdd-e6016b11d065] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0d16b376-9f86-997f-d7c1-6ce409c639f7] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e0f2a367-7d42-c7f2-54ea-40a11bfd0c76] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d43f469a-4e69-6c5b-1ec6-8c5f0cc0a6c1] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_38e26055-c0c5-c599-ce16-47b8ac4a6b9f] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_781552f3-602d-39aa-b2af-c18f114d768f] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_eb7d66d3-d0ed-d3f2-99c7-82425a05fc94] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_5ab17d03-a694-611a-4c30-fe42fa77d611] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c10a90bb-5cbc-5809-13ea-aacae5ca3ae0] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a44fc66c-ddee-a1a9-b338-f5d1c62dcec1] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b9b30315-d632-5298-de37-f84d88930d30] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5f709788-d95a-3f64-a2d9-a5ba9b541c67] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_33b79af9-85f0-b75c-5f82-4d67c4de5a1e] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ac36bbc3-943b-46eb-85ca-55e5ab0da25a] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b5ea3aa9-e526-8c29-a60d-7795d09509ac] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c29afea1-c6d0-e407-a4f5-4eb92385ca32] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4a73890a-df6f-c603-d0c4-72fada95a363] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1db5c222-3edd-5cf8-40e0-c4e8dff94230] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_115fc8b3-d9fa-010d-1405-a2e0c7fa1b5e] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_db869c37-317e-290d-53f6-6fe4e1203f9f] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1a16cbe1-f824-2af9-80ee-09842a8a8241] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7d870d5b-8579-b2de-24c8-93d37eaee260] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7c2979cd-c640-aee6-e178-449b7f874c61] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a93c54c3-3cf5-127e-c6d2-bb3fe7ec6a9e] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_88b47db0-ff4e-8503-7a1a-5602041fcd91] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8139a273-35af-1267-555c-90af7f1b171d] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a68e3b40-11a8-92c8-5e79-f4e5958c2cd9] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2767f1f6-303d-39e8-7386-5bc11447ee70] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dfac93eb-d5ef-016d-e1b1-391bf3f6b98e] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_8f79622e-0821-d1dc-fa63-db1705ca0c74] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f25898d6-1b9b-fa45-6d54-e4eb24679da6] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_70753ccb-16bc-c1b1-f3ac-3a4def318638] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6cc03836-41d7-fd6b-7681-79da7c0e3e29] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_acdc0594-ed2a-2dfd-d480-ad3fa90f0c14] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6d413e3e-9258-cfbe-9570-147272d499ec] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_af43521c-3db6-cdd6-b4a9-7ac7d6eb06df] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_63327bdf-7c68-5c6a-1729-29d45fedf215] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f856633a-575f-54e5-c7f6-c90d5ea6f20b] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_39f5e534-1fd5-fccd-a32e-c685e853a88e] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2e3fb29a-a589-a7c0-412a-7da320727211] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_232196fb-c36e-a126-5765-816d8f277f5b] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4866d91a-2ec9-04b4-38d0-64f463f2ae12] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0c9e5807-351d-1412-6a76-dad70f1a2aee] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_000e4e5e-9756-1007-de31-5e01ddfd9afc] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1e90aad6-28ad-fc45-fbf7-37e45ea84120] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fa80e84e-7ab6-c17a-80f4-0d2dbad5f124] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_99ee7671-099a-b141-199e-7eabb093bdff] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2f87b450-8e20-757e-c6f8-02352961a64f] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7f515eb6-5425-326a-eb88-11177c6983e7] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_392b1f76-301b-85ba-a56a-659fa7c68b5b] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_883add62-5382-573f-01a4-74ccdd730433] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_83dd82ef-357a-e09a-5f68-c4f83f5d3022] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_90186d8f-0c98-0f3a-1a20-f526ed171d42] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8fc6b74b-9f30-7f05-61ab-2c924a3b8098] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_30dfa12f-0b80-5592-04e7-a4e42593df05] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0667520a-191e-2abc-16af-3b665c6b0a8a] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_65078e15-e522-a18d-9cd8-c84b61d04080] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d72501c2-107e-16f2-a426-1bc6a1f1b393] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5db518fa-7beb-128d-b446-19ecc1ed26e7] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_108c13c9-3484-3eec-2469-18eb5d5bcc9f] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_aea4eb53-7bfa-7d7c-e516-bac0b392d2cc] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e2de0ad3-36e0-c7be-a60a-7aeccc465072] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_f8b65a43-697f-5ace-c12e-57699129054f] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6e1e8ce7-3002-4409-160c-853735b87aad] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_702b6438-6595-5532-4c19-aa6af75c6a97] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a540a62f-1446-270f-c043-051226c5dff7] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8b796690-2887-eeea-8678-0ec52778b89d] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_795d0184-24eb-9a8d-2125-9da13758ed2a] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_18b05f18-14bd-730e-716a-9dc14e22c1a9] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f567b5c5-c5e3-0a70-822e-1e9da955d602] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_40e835b5-7ee9-e829-79da-4a86171d644d] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/mimir-7402ab24-7a72-daa7-18e8-791c7fef8f1f] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4b9fd3bc-2afa-468c-f449-b0a22dbc83a9] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_0d969d14-48a2-3794-c8c9-0a5ff6ca6397] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_50eb5136-e840-f32a-98fb-44a2030fc7db] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_bb573a46-ace6-432e-e4b1-3a6f5a9802d4] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_401e0cdd-f517-3b40-94d7-81b2f92e2e46] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d7cee0be-360d-3028-cfe7-1749771765e7] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e0c6a4b5-a4ac-3ca7-b7c3-e59986aef734] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ef11d95f-70b0-c7c6-bc36-14168c070c72] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a03f35f1-5783-1872-e534-e8182154d3a0] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a84dbceb-e563-6f63-78a3-a60434903fc4] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a0b1e2d4-d14b-8ef4-c8f8-e0e077da1120] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ac66cfa7-0a1d-1e31-2d90-2926b3573291] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_fff8efd3-199c-9fc5-f043-a2daea286333] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_96fc6978-1b1e-e4ca-768f-806373b8ed61] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ac62d1a8-321f-3c9f-bca0-f66d4ee6051a] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_63541a26-e4e1-b0e0-43e7-63f512c38704] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fe182175-94df-c3a2-5033-f4863d6daf59] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_729ec73e-7c5d-a7f4-6bbe-ba89e1c82c48] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c7bb4f9b-baef-303f-b0fa-6b64918cbb34] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f0a21535-449e-3f2c-3607-c38cdae7df35] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_41b9d251-ee3c-d626-89b2-b7f89e4df0fb] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a37c5dc0-676b-40b6-4ec5-d88db5c5104a] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e3cd1900-5492-239d-07f1-a5f1163701db] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_41bbbe03-e95c-ff4a-e63d-42a01d25b68f] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_66731feb-b41a-e45c-8786-63f8f308d1c6] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e4a2a0f0-ce1a-8dd4-909b-8d493b7b3147] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6b659b2f-9582-a3ee-3921-1f921693fad7] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_afba5d27-f527-a783-8a1f-9a007e0dcf1f] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b0bc336e-3fe5-259c-3af5-62ee25fe8880] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_94ca89a3-e84e-34d4-dccd-8df84ef01e62] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8010ea2f-c1d4-95df-3c60-c986afcf820d] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_10e8a5dc-cae6-ca39-6e75-ccc24fbb5793] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6bfbafe7-6eb7-d3e7-f611-ad21bde726e1] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_60f70a58-4ada-2bd8-3a59-0d7496e78e3c] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_9764c731-33c4-f84c-0bac-acf982471ec6]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/await-for-keycloak-3372a577-0d74-8c27-cd77-bd38c6aba0e3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b345211d-47ba-4b7f-cd61-bc3dd1e25047]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_475586af-fe17-46be-548f-30bb43ad8c67]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_69f54715-9420-d012-cf73-aca5df54e002]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f269f1e0-3257-65c9-251f-cfb8263f7178]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7204c6f8-b9f0-e7ed-306b-b399af5f4902]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_58a6d31f-e3af-ee39-636e-b56cf3535c35]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_022aa122-779f-2169-ceec-50158e708783]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0f894843-0f10-5d66-e833-a2a02c7d0d6d]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dfaa3956-7472-60cf-7e26-24a8b746f22e]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a503b091-ec04-4d76-7053-83d537d6d591]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ea18f060-4b36-58c4-a1b0-804bc0b2c4a8]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_282695f7-4eda-6d20-e6df-960a05c641fa]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a850ddb2-d6d9-37bc-7d4e-617f82f436a6]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3e6dc8cb-c970-c304-f6a4-a91bd1ae4f4c]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ed8ae133-d03d-b420-9ae5-75b474a3956f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_37ec705e-dd7e-66a9-cb09-925a608d00db]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_edb8f187-0b2a-f981-0de8-8a482860261f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9d1dd65b-e0e8-f50a-b5a4-501fcc32dffa]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_25e88201-39fa-8d38-f654-b4ac5332328f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6379ab71-b1ef-8a23-d223-65f2b1e1f67b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c7c0acf5-af3e-6071-5a04-dac053d8d528]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_18488429-5e8d-8fdf-8263-9ff8198ba613]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7b173beb-1432-9c14-f9f7-c327fb91f092]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dff8d527-e218-0ea0-aa27-b5e7e7192b49]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_23a7476b-2d56-d8af-d7d2-be2dd74a819b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5c072300-990b-ec8e-3601-53d5a5c1a695]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0ebfdace-f1f3-5f9c-8b58-e5975abaf36b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0240e90c-8b34-989c-e88f-289eb9aa3349]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-keycloak-7b0a58a6-2274-4708-4288-dcee1491567b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/tempo-1b74c940-fcb1-69e1-6d67-0b63054f266c]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_562d515c-f5f3-6318-5cb0-62a6cc8a3f52]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_68c263df-6161-b2db-758e-f0db602bef34]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e0c39273-62f1-45b4-bc64-555e01b80a75]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_147e6cb9-a6a4-df14-f8fe-bebf80283b7d]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/await-for-security-postgres-7b0a58a6-2274-4708-4288-dcee1491567b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f59c5dad-ff2c-36bf-93f7-9c036319ade4]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_363a1b3e-f1dd-475f-0835-3361a71e3959]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1094fd09-d64d-c2e7-0c7f-0aa42cf78f70]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d0c76bf1-d720-811f-6d49-a0c51e021385]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_113f1fc1-3b11-332d-13a7-453709d6aac0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-security-postgres-7482c5fb-31aa-9a2e-ae65-03b48382616d]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4e0eb0a5-1e92-b0ef-5626-6baee78a0096]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5976fd18-1ef3-b888-83ae-3315150cfca0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d0d5d8f2-f0fa-3d69-f582-584918264a2b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0fe726d0-4672-85b9-4eda-a856d61b94b3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/traefik-fe8e888f-c9b8-4899-22c7-3e5299e6a286]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_80d103b4-f1b5-cc84-6bbf-3d918c2e7a1f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e0571cc3-5d44-db64-4574-aabbd14c7e11]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a21a59e6-1a0c-bab8-aee9-7738c527f613]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f99620d3-5066-702b-1bdd-785063bd8458]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ba89033d-381e-380f-a313-8e67732bd8e6]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_210c5f9d-f522-4018-44a3-a35a5be973d0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_055db48f-1569-e610-bf66-855290350cc2]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5a8a1024-181c-c2fd-36a8-6a16f3bc378f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_43216db8-e085-1598-2de0-33d011c7a694]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_851d0e31-30a2-2387-7f39-5f22619f8228]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_42d8020c-ab01-d852-bf4d-3f8b818fec88]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b81968a4-1c3c-a446-5958-9f4abc7a1203]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_48da3f5d-7216-93a2-def9-e94bc908f612]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0e5811e2-d255-c4d9-bf36-5fd969595349]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_08893a28-b5e6-ac4b-8a80-da7dda48d71c]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ee692e00-1d1f-5224-ee51-d131f95cc6a1]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_602b3647-be8a-e4ac-eeff-8d7a19cb0c93]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a959690f-4ca6-b16b-2ef7-af85f846e39e]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_56fa7fb8-f6b7-bba4-0176-23f4fc2c53a3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7347a644-3888-21ab-85d8-cc16a92b71e8]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d884e696-31a8-7a7f-a3f0-2f3115571b25]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_01a65f63-28e6-8105-5beb-1d4f0d295757]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4ce37b28-01e7-3e0d-609a-86b12b47dfa8]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a496910b-e46c-2a50-94d9-b9cccade6da3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/await-for-keycloak-113f1fc1-3b11-332d-13a7-453709d6aac0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_791a2e74-2587-1b51-8973-3237ad797a81]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_faeb2629-97cd-4c98-5dde-f8daf8bfb9a4]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_85fde91b-9eda-e1e7-bbcd-50ca082eb50b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f30a2d53-ad5c-2680-6145-5fcca77d0806]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_faffa699-9a3f-f235-00d0-8222bd2704db]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_51ead1db-8c4c-a83d-7530-31b0d7d642b0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_44b945dd-d512-8604-6209-1a337950815f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d0104afc-2307-d89f-85b4-d5bda88c5d75]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d5435bc0-09c5-8116-8e85-b6dc6de92dcb]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f1f2976d-c36a-35d4-fa00-7c3f8252f292]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9f38e15f-9ef3-095d-5920-6c3f5d5f4154]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ae8402f0-024f-f584-5f52-49c3a9faa0ae]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7b0a58a6-2274-4708-4288-dcee1491567b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_31167ba1-9760-26f3-9b81-f6a26e1cdee3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_42d0351a-ded8-99c2-7e3b-9142fb79a495]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ca1198a8-f040-1f2c-f390-ff919c577efa]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9f4fda85-3686-00a6-6d40-c08d2ef781d4]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6022c83a-faa7-0544-d9f9-d04827d0e938]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c3694fc9-5c13-0c7c-ce5a-1cd873f7b090]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_976be946-07a4-010b-bd1b-8b24f97b77ef]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f6ad86a9-22a0-728d-f2e1-9af135f52ed7]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7482c5fb-31aa-9a2e-ae65-03b48382616d]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_459d1eb6-22c4-53c0-49f1-171a5d754fd7]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7a06fc09-a648-27d4-21ba-a106ae8a67a9]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3bd03adf-59fd-d7f7-014c-f702435b095c]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0d28e4db-8bc5-4422-54b5-69150fa7108a]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_47ae54eb-9927-1223-3927-e04326bae278]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/loki-a4a139ac-114d-d326-46d9-15f4eda159bc]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a9654f51-b825-964e-6c7d-c0b47c8db273]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_630a54e1-e69e-7110-a27e-220de6cba009]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3af77b66-c371-f94e-a6c4-1a714e550ea5]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1d400e59-1686-5416-26ed-e328c33657ce]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_314ca142-d66e-f476-2c45-dee23cd0fbaf]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_76fe058b-0dd7-4b9b-232c-5e75ffd01069]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7c4eb1af-0b87-2a6e-4201-2afba9d75bc2]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_54a42993-1455-1101-f925-7b44cd21e63a]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1ff419ee-9614-de46-aeee-ed5d68d0cca3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1a72b497-e2be-fdb7-67b8-b4da3300e99e]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d82a5e0e-c5ca-64af-c3b0-49a0cb7b8473]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/logunifier-3d488e11-cda7-c934-acc4-58513c8fa9fc]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/security_postgres-7482c5fb-31aa-9a2e-ae65-03b48382616d]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_496351af-34dc-7b83-f449-323049b051b3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_42f56e72-cd27-d498-8ee0-9f61ddd62bd3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f0b7d139-e088-ffee-1e58-d79bb65d4e51]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e2d67b88-674c-43e4-cfbb-b8f21d179eba]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b2ef4cd5-3443-645b-9bbd-d962b7273c5a]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4a634af1-b6b5-7399-72ed-8df147121301]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6a29b206-61b1-8943-4fd4-3d774b903927]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fd07f4b0-448a-89f3-e776-042884a954b6]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2548a0e3-5a69-6b6d-f666-16da44615de5]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7402ab24-7a72-daa7-18e8-791c7fef8f1f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4be69159-89ef-a308-568d-b1f562aa3b82]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_06167f7a-9fc6-e391-c32e-04f1c9be9d74]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2369851b-4d5c-486a-01b6-b3be16cf1dd7]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_94773357-3cc2-d331-0413-db39d3a0a0fe]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_72352633-73de-befd-2546-8f194631491b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7b8cfe16-c275-e351-03bd-26f57c25ea03]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a9a80500-d699-abf6-5232-9c404c76c177]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_81c2c252-0fd8-13bf-2a2c-86e11f7f7c2f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_94c6920c-00fc-23c3-c8cb-640119555801]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f6ca3778-e595-7aac-c2f3-74760a353791]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f25b1973-2baf-1b77-9ea7-1e241d692354]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_59587176-1988-7701-e2e2-d7108c65c9a1]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a2007bca-0063-ded6-2156-5b85e78439ff]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_27d401ae-ccac-5357-9800-f3b93a2bbf8a]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6b5a6b71-5250-8673-f593-aa6ee4638065]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2836b024-a589-985c-0807-40e98016c865]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1ec67236-1fcc-8371-b6bc-c6b4ed14880e]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_abe6958e-ded5-09e2-b468-4445ea93d505]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_defe8e1b-55d9-c971-0063-5ca8e75c638d]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5836b290-9722-d43c-69e3-0ca326679dc2]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b35fc10f-6c49-05ef-5fbd-54e1d34a8eb9]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9ba5f11e-f068-a873-b0d8-4aab2225ee73]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2a33e7e3-89c7-eff7-640a-26d44fef367c]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4972ae37-0cd4-78a9-c529-5599b017947c]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_00014660-de8f-45ad-7205-27cf2fc40766]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d9d5571d-c958-4fe8-c343-2cc42f538303]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_65bed301-410d-8c09-85dc-256b48f45d20]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7c586be6-1f73-d38e-4f9c-e1cc0425b55c]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e6e7f5f6-420c-d67d-914d-fcb14f5ccb82]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fdde9248-8a81-60d2-e98c-10bb13859b27]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ffa03404-35c9-544f-61e4-384f3072e448]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2b614f06-bae3-c190-2282-3e6223b7695f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5c3a3083-fa05-b95c-7070-40b6a46bb240]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1d42f5cb-82ae-0546-c63d-05072aa6e6f9]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5fc105f0-5a7e-408d-3e3c-e3366b5ac222]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ca9ac364-9d2b-7599-e303-b0cdd6a722ce]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_aa65a4b0-8af8-e682-b93a-250f8ba52762]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_bffa0040-dc3d-2500-fb74-3da2566daaa0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f6c4f7ed-a56e-d4f1-5e51-26fd37bbaf47]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4434bda8-193f-f9cd-1c83-9d33bbe8458b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9a6dc408-e798-0eff-e5ff-540a8dd61ad5]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f052b173-56d7-c4f2-9fe8-e4b8d2a9d5ac]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c87421f7-d728-d8ae-1a40-759fd46f1d65]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8decf8fe-2f57-a52e-e09d-f1cc175e4127]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d2acea1d-937a-44e9-9bf6-9d5e7a6c12e9]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d8865f9e-41e6-11ca-9028-2cac33fca873]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a0bb855f-ec2d-f607-fcad-4e8d1b651540]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b77e81cb-1c47-924e-de6c-178aa424d548]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6ea2d588-0158-03da-4cff-ce09bf17f39a]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e3bc3462-39f9-edc1-53dd-7c2b7cd5da3d]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_87357443-30c3-d5eb-2316-d6be690569f9]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_825ff2df-fbdc-029b-9838-6bc708458c70]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fe444a04-7a6d-ea8c-15ff-615bd8b72403]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2e71392f-868f-aa7e-3785-5ac5d7df25d8]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7cbb5e24-5969-999b-9fbb-df11e6023463]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ecc21062-ca4c-0d9e-29f8-7afc9392a289]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f9acabbb-ea2e-d896-3bb8-164c69f3f7bb]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c7d91bc1-6104-053d-a3c3-3b0db6144971]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8e63a3b8-fea4-00e8-6528-5c82237253b3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_62a893f3-952f-600f-4c1f-1b870e2f955a]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2a2868e4-18b3-5fb2-3740-28939a6b385c]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3dbdeafa-ed09-723a-cbba-af0d2327332f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6276dcd1-b0ac-44a2-0f55-1775300b7745]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_66e7e264-304b-aef4-a77e-c53e504d7c41]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9e970bd1-1db4-8a3c-0012-af615216a289]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_451b867b-b426-9d6b-1402-3b5b3bb8dc10]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8f01e4ba-aaed-521b-9a45-82e5967eaacc]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_586342d7-6208-0207-a6e0-e064ae0ac3b5]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e6c18dc9-765a-269e-ddd4-6541faaa71fc]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_67d482ae-3889-8323-7eac-18928a2e2b79]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ba54ee9f-1ea8-8d7a-97f4-c4bbb8a100cc]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fbe4f379-1123-03c7-fe5b-8b149fb2e820]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d98fc096-da2b-f179-f9a6-de4a11cc4c16]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d91b81f5-3de4-6ff1-fb28-04e8261292fd]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_274e1659-46f7-b45b-1b70-10889078595c]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a804a117-5551-f327-5208-5000ee6bfb5a]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_76ebdad2-ea21-607e-cf2f-31d7a5f52c27]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c4763cc3-c845-d35c-3851-6aef16578419]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8d6a65b7-c873-747f-3d33-2e3bc676886a]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a04f9617-e186-e57d-9d39-58fadc1884a9]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6dcf4bfa-0af8-24fa-b4ca-f51cf59bc72a]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_df58adfc-14cb-4f61-0276-28174b269aa1]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5363c786-e762-4a48-4c75-ec2bbb0f2f61]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2b545b55-a879-2a61-c48d-2ad8d3d6fdfd]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1fa606ce-7e0c-fbb9-fedf-ecadeca615f3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0162d7ac-15c0-ee4c-84e1-acb0ec062a70]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0b226dd2-157d-a28a-c795-9029e48ec15b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_df369a2f-6e29-de39-1db1-ed4e3f522024]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5f61822e-ae64-73bd-a123-0d9c1cb664ee]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_14890f6d-8985-170b-f811-da5760a854c0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5834c11d-dedf-2c3f-2f1e-be556fb85bf7]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d0c7bbb9-44fd-b90e-ef1b-51f611c554d6]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_362ea24f-432e-a901-819e-ae87c85b56b1]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f243cd25-2d58-e4c6-77be-263d5637bb73]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_32561c34-1040-b3d2-dbb1-f70e3ae87139]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f6683b92-e22b-03f8-a8a6-975ebc47518f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_33ed6753-c1cc-1774-e8b5-54613a9a276c]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9526fbc7-3610-5fe1-e724-a58f5d8c2252]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c171b5fc-54cd-7e22-3024-f3d5687cf6ec]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0610f998-5930-1541-7be6-d59ec2e47229]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1880d484-d075-0c21-7927-1cb6ba25c4cf]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3ed4cf72-29c8-f403-5abe-d8494413b0e1]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1115fa0b-5239-07ff-fd6a-360806092c0f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4cdcba1a-474a-3066-72b8-1f89f2e8e92a]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_50da53e6-08cc-42d9-0a27-a7fc199bc5a7]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6a550518-6fe6-1d6c-758e-b8d79b4a7f91]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_38c2572b-24c0-65ba-43a6-5489504a32f8]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker
names=[/nomad_init_d28a14be-5c9d-8542-037e-9c6f94fac918] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1d7ad2fe-9254-a983-8f30-3474d14985a9] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ce6fae0f-a797-6072-4fb0-bd9a657b3b36] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7740c5b7-a332-8a5a-eb7a-d63462803226] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fe3d20c4-1010-3934-193b-d30a30796e51] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b6b529bb-d3e2-6754-09da-843990e2cfdd] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dbdcddfa-ce19-ac32-dd7e-dcca9d5884f9] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_39f0010c-215b-705d-90cd-e44ad95abb3b] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_57b76880-d518-ad46-f61c-3f5a55832697] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1812823e-1ee3-18fc-f949-53008ee41c21] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2236ba16-24ec-4cae-a904-6e2ec00c2235] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_f36080f5-de47-4f3a-43e2-f2cbd77471fe] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1c365ee2-a710-1b93-5489-d0c66d4cdd22] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4c1c6bcd-7f8a-c4ed-4e7b-8b77943bcd86] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c9db02e1-a294-f637-9c2c-4ed146c46db5] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e053ed0c-b0d0-a10c-706e-cad9c3894ba3] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1dea44e1-d487-8e23-882e-41764384910c] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_437c77a9-d009-363c-a310-e84018d92267] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_822821b2-b293-2ea7-0b3c-34f77dc63476] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b56ec4e1-3925-2be5-30c9-92e24d3762de] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_05b44026-b546-6787-3560-5a85067e7e21] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5b5ec3f2-86bf-8243-aac6-14ae5a917fc1] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_acfec9f6-3133-31a4-078f-40ebb2f39f8e] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_92c3fc44-45a1-7e5c-d1e7-fa0936e305d3] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e2a406c0-4c8b-3ffc-6949-4fbcce959db5] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f230141a-1c86-26b5-523e-a228f361dcfa] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7ff86af4-8662-81e3-5438-5f07e592fdf9] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_200f68d9-e9d7-82be-5be6-b5b77ca86dd5] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_72986a87-83b2-5673-9767-e73aeb0cd4d1] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_245b7a62-98e2-a367-fdea-79bbae4351b8] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7ce0a768-4980-be18-f5a0-d5be0db3fdc1] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_bae6e060-b7ad-12c1-490f-a6a99ec6038f] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_16fc5971-df89-6f70-dbb2-fd4d1aa02b7c] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_7bf40c9b-bede-9649-a6ab-22a452a78fc0] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a388182f-c115-1cb7-507f-a5a42e8cf46b] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_68f21a10-ad1f-2fad-3da3-04ceb007fd12] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_07e6f9bd-969e-72ba-4947-f8552f47c5e7] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9aaf1bc7-2914-f827-5915-08fc1f7bd026] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ce740085-603e-8c99-f95f-6bd3a046125c] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/grafana-agent-9a3ae9f7-2ed3-c25c-12d9-d792452841d8] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-grafana-bd7464de-fa72-736b-e57c-6782cc7d7202] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_bd7464de-fa72-736b-e57c-6782cc7d7202] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/traefik-d9d9c594-c60d-e002-6448-2a9b8b5fa6ec] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_82b10536-6a1c-644a-c3bc-3e2259536c58] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_3d074f19-a056-6a32-78be-833cf8cafeed] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_907c3a55-036a-5569-003b-04839e1fc6c8] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/grafana-bd7464de-fa72-736b-e57c-6782cc7d7202] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/loki-2d549ab9-a7a8-65ac-1697-dd440dd0e3d7] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nats-4d92967c-5996-752c-1cac-6f079b2c8099] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/minio-a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9a3ae9f7-2ed3-c25c-12d9-d792452841d8] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e9a342a8-67cb-b747-1c58-839ea5d53d3b] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2d549ab9-a7a8-65ac-1697-dd440dd0e3d7] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/tempo-b9bd1537-0bae-8c11-41b3-437a4c21df29] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/mimir-e9a342a8-67cb-b747-1c58-839ea5d53d3b] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3f553197-72d8-5935-7960-15619519b1fc] 
2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_96e6dcbf-7d61-6ca1-2df1-b6eb54bd7ade] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b9bd1537-0bae-8c11-41b3-437a4c21df29] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nats-prometheus-exporter-4d92967c-5996-752c-1cac-6f079b2c8099] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4d92967c-5996-752c-1cac-6f079b2c8099] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1df99288-8878-cc8c-5048-da8fa9ab2ac1] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/grafana-303fbdec-6d2a-bb10-4a8e-a6c0dad17a05] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/grafana-agent-83234bd8-8193-89ab-6234-59d2fb689df2] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/keycloak-7ea6f631-b991-3c18-18f8-d4a88c779de0] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/security_postgres-ebfef72d-5162-582e-79d2-714b5a5854a9] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_83234bd8-8193-89ab-6234-59d2fb689df2] 
2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_273102c3-f542-e0a3-3eae-d80c6ec50c74] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7ea6f631-b991-3c18-18f8-d4a88c779de0] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e6d2193c-f1b2-7888-cac9-0207318d1ad2] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/loki-f9449c12-3984-3c99-55eb-606389ce2624] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/mimir-e6d2193c-f1b2-7888-cac9-0207318d1ad2] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-keycloak-7ea6f631-b991-3c18-18f8-d4a88c779de0] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-security-postgres-ebfef72d-5162-582e-79d2-714b5a5854a9] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f9449c12-3984-3c99-55eb-606389ce2624] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_303fbdec-6d2a-bb10-4a8e-a6c0dad17a05] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d16da4e2-3e1e-fc6f-864c-a5f40ea080ef] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/forwardauth-8678d135-46d3-ef22-2581-818e5ec07333] 
2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/traefik-4eee21ed-4cb6-1319-8401-19107f29b34b] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dad53d62-2b73-6995-03d3-6c225a2fb549] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-grafana-303fbdec-6d2a-bb10-4a8e-a6c0dad17a05] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nats-d16da4e2-3e1e-fc6f-864c-a5f40ea080ef] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ebfef72d-5162-582e-79d2-714b5a5854a9] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8678d135-46d3-ef22-2581-818e5ec07333] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/tempo-1df99288-8878-cc8c-5048-da8fa9ab2ac1] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nats-prometheus-exporter-d16da4e2-3e1e-fc6f-864c-a5f40ea080ef] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/logunifier-dad53d62-2b73-6995-03d3-6c225a2fb549] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d914a6cb-df4c-c1a2-72c4-01fa1270a96a] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1c936c4a-1073-d460-a67e-b7ecc9193d82] 
2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0667520a-191e-2abc-16af-3b665c6b0a8a] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fe182175-94df-c3a2-5033-f4863d6daf59] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_230c4069-3739-55ae-acf5-e00ad49a78d0] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b962fb2f-9cfc-0ee5-c8e0-13cb5ae7eef4] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_815b8320-f88f-14f8-a12c-2ad2e7c81af8] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ac62d1a8-321f-3c9f-bca0-f66d4ee6051a] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_51934506-a27c-fce0-0046-1cb998e8fc8a] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c754856f-9e64-7585-4e96-aea4177f666d] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ac36bbc3-943b-46eb-85ca-55e5ab0da25a] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0aaf5216-6aca-175e-5690-e24589f6b339] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_eb7d66d3-d0ed-d3f2-99c7-82425a05fc94] 2023-05-11T11:37:10+02:00 
[nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d2928440-57bd-9f0d-be52-db209648a8b7] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_578ba772-e064-722a-a608-dcd67e4f0d69] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3e6dc8cb-c970-c304-f6a4-a91bd1ae4f4c] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8a26522c-7343-fd3f-ecb6-e7a8207afef3] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7c66d139-6193-eec9-7f1f-bb66615746be] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7c2979cd-c640-aee6-e178-449b7f874c61] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_795d0184-24eb-9a8d-2125-9da13758ed2a] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_62fe668f-b911-d1b2-fe29-65c45e4132fd] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_25e88201-39fa-8d38-f654-b4ac5332328f] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c29afea1-c6d0-e407-a4f5-4eb92385ca32] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_23a7476b-2d56-d8af-d7d2-be2dd74a819b] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] 
[๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f66b4793-86ac-5933-935d-753d07bbe2f1] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_395f79bb-f3aa-730c-b2a8-d3eacbad0b1e] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2767f1f6-303d-39e8-7386-5bc11447ee70] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e6d2193c-f1b2-7888-cac9-0207318d1ad2] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_90186d8f-0c98-0f3a-1a20-f526ed171d42] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ebfef72d-5162-582e-79d2-714b5a5854a9] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dd98d116-01c5-bc43-1d57-e93333e5c5fb] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1e90aad6-28ad-fc45-fbf7-37e45ea84120] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4a73890a-df6f-c603-d0c4-72fada95a363] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0c17f752-dbd0-0697-a3e6-0f53fd8e9e6b] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0d969d14-48a2-3794-c8c9-0a5ff6ca6397] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] 
client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f269f1e0-3257-65c9-251f-cfb8263f7178] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1a604a30-09cb-dd55-49a9-51661b063017] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_70753ccb-16bc-c1b1-f3ac-3a4def318638] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2d9a275c-c00f-97de-5355-7211f16b700b] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_66731feb-b41a-e45c-8786-63f8f308d1c6] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1df99288-8878-cc8c-5048-da8fa9ab2ac1] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d9fe8ae4-5f8f-ea8c-01ce-032da8d61c02] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_29b5878e-b982-304a-661d-27e4346863cb] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5ab17d03-a694-611a-4c30-fe42fa77d611] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_98060fb3-c42f-2555-1181-7a8c2ccdf5e3] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/minio-273102c3-f542-e0a3-3eae-d80c6ec50c74] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed 
container: driver=docker names=[/nomad_init_f856633a-575f-54e5-c7f6-c90d5ea6f20b] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_69f54715-9420-d012-cf73-aca5df54e002] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_108c13c9-3484-3eec-2469-18eb5d5bcc9f] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_293c489e-6f64-7df6-181d-8d4bd3cc6dc7] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_53ac7e54-c453-1470-617d-705c82a9d303] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_83dd82ef-357a-e09a-5f68-c4f83f5d3022] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a68e3b40-11a8-92c8-5e79-f4e5958c2cd9] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fff8efd3-199c-9fc5-f043-a2daea286333] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2e27123c-e020-cca6-d17f-227a9204992f] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_6bfbafe7-6eb7-d3e7-f611-ad21bde726e1] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a84dbceb-e563-6f63-78a3-a60434903fc4] 2023-05-11T11:37:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿ‘€] client.driver_mgr.docker: listed container: driver=docker 
names=[/nomad_init_8958df2c-747b-937d-fda6-9de0502129a3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8010ea2f-c1d4-95df-3c60-c986afcf820d]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e222c9d8-611d-1158-3bd1-5cd575e3bfae]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d76e8314-e43c-b6fc-ffeb-a4a16f92ed30]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_00abd284-a436-a771-7932-2588a00861bb]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_33b79af9-85f0-b75c-5f82-4d67c4de5a1e]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d07aefbb-0892-0f71-277f-62b800275169]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a44fc66c-ddee-a1a9-b338-f5d1c62dcec1]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_95892491-2a91-03ad-131b-87fe8c898a05]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_ebf9b42c-3684-c50b-d777-98758305caa2]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_1c8555ba-1133-2e30-c23f-a1ed2f3a52cd]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_db869c37-317e-290d-53f6-6fe4e1203f9f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_e4a2a0f0-ce1a-8dd4-909b-8d493b7b3147]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d180bfaf-7db9-6d0c-f06e-d9f8fa4b0d08]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_fb14d5de-dcda-5120-191e-6db17613d746]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_729ec73e-7c5d-a7f4-6bbe-ba89e1c82c48]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_094169c6-7f71-8797-2811-d28b34f98cbc]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_cdaf15cd-9c50-e527-3d84-4ca24fe957d0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9256b9c1-e86a-c54e-76c9-19b554d3590a]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b55f2d5e-2db6-b3c2-0406-c99a6086799c]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_65078e15-e522-a18d-9cd8-c84b61d04080]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f7fb25b5-c57c-f328-5492-982192b038e3]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_7ea6f631-b991-3c18-18f8-d4a88c779de0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/traefik-4eee21ed-4cb6-1319-8401-19107f29b34b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8dc8d24e-db95-9ab8-4100-5ff73fd0107f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_232196fb-c36e-a126-5765-816d8f277f5b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9f464894-eacb-9913-c641-a29461c38cc8]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8f79622e-0821-d1dc-fa63-db1705ca0c74]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9d1dd65b-e0e8-f50a-b5a4-501fcc32dffa]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_778d37b8-b473-7ae4-1aeb-5c89c8aafdb2]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a503b091-ec04-4d76-7053-83d537d6d591]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_9764c731-33c4-f84c-0bac-acf982471ec6]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2550e92d-7eea-a28c-8df3-8ac8921a618c]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_4caca5c4-5782-7020-3b9c-319bb6fb8d6f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3f2ee033-a760-0227-0c16-00efd8830dae]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_2c7dabe8-496b-1b80-2b3b-fad6530c94be]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_781552f3-602d-39aa-b2af-c18f114d768f]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_689aa6d5-4263-33ed-bd9d-6b995cab3269]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_10d3203e-fb17-25f5-0443-9dc14bee4353]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_c10a90bb-5cbc-5809-13ea-aacae5ca3ae0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0a6c32ef-65d7-113b-e95f-0d4fcbe5b17e]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_33410af5-f1eb-d8c0-0f95-18eed2fc0545]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/tempo-1df99288-8878-cc8c-5048-da8fa9ab2ac1]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nats-prometheus-exporter-d16da4e2-3e1e-fc6f-864c-a5f40ea080ef]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_303fbdec-6d2a-bb10-4a8e-a6c0dad17a05]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dad53d62-2b73-6995-03d3-6c225a2fb549]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-grafana-303fbdec-6d2a-bb10-4a8e-a6c0dad17a05]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/keycloak-7ea6f631-b991-3c18-18f8-d4a88c779de0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-keycloak-7ea6f631-b991-3c18-18f8-d4a88c779de0]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0a6c32ef-65d7-113b-e95f-0d4fcbe5b17e]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_dfaa3956-7472-60cf-7e26-24a8b746f22e]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d7cee0be-360d-3028-cfe7-1749771765e7]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/mimir-e6d2193c-f1b2-7888-cac9-0207318d1ad2]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_273102c3-f542-e0a3-3eae-d80c6ec50c74]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_3f2ee033-a760-0227-0c16-00efd8830dae]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/grafana-agent-83234bd8-8193-89ab-6234-59d2fb689df2]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_b962fb2f-9cfc-0ee5-c8e0-13cb5ae7eef4]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_aea4eb53-7bfa-7d7c-e516-bac0b392d2cc]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/minio-273102c3-f542-e0a3-3eae-d80c6ec50c74]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/security_postgres-ebfef72d-5162-582e-79d2-714b5a5854a9]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/forwardauth-8678d135-46d3-ef22-2581-818e5ec07333]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/grafana-303fbdec-6d2a-bb10-4a8e-a6c0dad17a05]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/connect-proxy-security-postgres-ebfef72d-5162-582e-79d2-714b5a5854a9]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_f9449c12-3984-3c99-55eb-606389ce2624]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_83234bd8-8193-89ab-6234-59d2fb689df2]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d07aefbb-0892-0f71-277f-62b800275169]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nats-d16da4e2-3e1e-fc6f-864c-a5f40ea080ef]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/loki-f9449c12-3984-3c99-55eb-606389ce2624]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_a850ddb2-d6d9-37bc-7d4e-617f82f436a6]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_95892491-2a91-03ad-131b-87fe8c898a05]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_0c17f752-dbd0-0697-a3e6-0f53fd8e9e6b]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_8678d135-46d3-ef22-2581-818e5ec07333]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/logunifier-dad53d62-2b73-6995-03d3-6c225a2fb549]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_5f709788-d95a-3f64-a2d9-a5ba9b541c67]
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [👀] client.driver_mgr.docker: listed container: driver=docker names=[/nomad_init_d16da4e2-3e1e-fc6f-864c-a5f40ea080ef]
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="442.864µs"
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=10
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=9
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=8
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/versions?diffs=true duration=4.597556ms
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=3 removed=0 updated=9 ignored=9 errors=0
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=connect-proxy-keycloak-postgres type=Received msg="Task received by client" failed=false
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=27dfb19c-1e44-2e49-a689-0a4e369f7bd2 task=keycloak_postgres type=Received msg="Task received by client" failed=false
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=forwardauth type=Received msg="Task received by client" failed=false
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=87cd4d30-a263-1598-0a57-72046f840473 task=await-for-keycloak type=Received msg="Task received by client" failed=false
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=keycloak type=Received msg="Task received by client" failed=false
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=connect-proxy-keycloak type=Received msg="Task received by client" failed=false
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=1022bc44-6bd1-c8c5-62c5-4166c31f7afc task=await-for-keycloak-postgres type=Received msg="Task received by client" failed=false
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security?index=15458 duration="319.651µs"
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=3 removed=0 updated=9 ignored=9
2023-05-11T11:37:10+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15459 total=21 pulled=12 filtered=9
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/evaluations?index=15458 duration="256.275µs"
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=12bdf712-8262-41dd-248b-427d4ea92684 type=service namespace=default job_id=security node_id="" triggered_by=job-register
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations?index=15456 duration=4.522086697s
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations?index=15456 duration=2.756156649s
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/deployment?index=15314 duration=10.546329472s
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/summary?index=15455 duration=754.856534ms
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/deployment?index=15314 duration=1m53.65717681s
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/deployment?index=15314 duration=1m53.544967941s
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/deployment?index=15314 duration=755.538628ms
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=12bdf712-8262-41dd-248b-427d4ea92684 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=12bdf712-8262-41dd-248b-427d4ea92684
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/summary?index=15455 duration=8.108080526s
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=POST path=/v1/job/security duration=13.458928ms
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/evaluations?index=15453 duration=4.506709939s
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security?index=15453 duration=2.720484816s
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security?index=15453 duration=6.472694626s
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/evaluations?index=15453 duration=2.738379204s
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [👀] nomad: evaluating plan: plan="(eval 12bdf712, job security, deploy 89024955, NodeAllocations: (node[36d1fc65] (27dfb19c security.keycloak-postgres[0] run) (87cd4d30 security.keycloak-ingress[0] run) (1022bc44 security.keycloak[0] run)))"
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 3) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak": (place 1) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 0) (canary 0)
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-ingress": (place 1) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 0) (canary 0)
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [✅] | Created Deployment: "89024955-4251-b075-2135-dd298aaabc72"
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-postgres": (place 1) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 0) (canary 0)
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=12bdf712-8262-41dd-248b-427d4ea92684 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=12bdf712-8262-41dd-248b-427d4ea92684 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=12bdf712-8262-41dd-248b-427d4ea92684 type=service namespace=default job_id=security node_id="" triggered_by=job-register
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [👀] nomad.job: job validate results: validator=validate warnings=[] error=
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [👀] nomad.job: job validate results: validator=memory_oversubscription warnings=[] error=
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [👀] nomad.job: job validate results: validator=vault warnings=[] error=
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [👀] nomad.job: job validate results: validator=connect warnings=[] error=
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [👀] nomad.job: job validate results: validator=namespace-constraint-check warnings=[] error=
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [👀] nomad.job: job validate results: validator=expose-check warnings=[] error=
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [👀] nomad.job: job mutate results: mutator=expose-check warnings=[] error=
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [👀] nomad.job: job mutate results: mutator=connect warnings=[] error=
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [👀] nomad.job: job mutate results: mutator=constraints warnings=[] error=
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [👀] nomad.job: job mutate results: mutator=canonicalize warnings=[] error=
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security duration="557.16µs"
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="791.936µs"
2023-05-11T11:37:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="995.89µs"
2023-05-11T11:37:09+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2
2023-05-11T11:37:09+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2
2023-05-11T11:37:09+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 error="dial tcp 10.21.21.42:24915: connect: connection refused"
2023-05-11T11:37:09+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="611.755µs"
2023-05-11T11:37:08+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="628.719µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:48706: EOF
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="963.977µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/node/f652ee64-d508-464f-bfb5-d1a36ac8f3d9 duration="519.914µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/node/36d1fc65-c097-97bc-18ac-079c1262ccfd duration="632.139µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/node/e2eb7460-2bca-ac62-5c53-999281062667 duration="492.422µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/namespaces duration="177.931µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:48704: EOF
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:48700: EOF
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/deployment duration="230.294µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/vars?prefix=nomad%2Fjobs%2Fsecurity" duration="407.135µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/summary?index=1 duration="220.276µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/deployment?index=1 duration="411.729µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:48660: EOF
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:48666: EOF
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?meta=true&namespace=default" duration="397.347µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/namespaces duration="219.876µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/evaluations duration=1.11768ms
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations duration="364.289µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=1.811818ms
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security duration="881.209µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:48624: EOF
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:48646: EOF
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:48636: EOF
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/members duration="555.384µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/operator/license duration="67.861µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=POST path=/v1/search/fuzzy duration="747.985µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/regions duration=2.141944ms
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/acl/token/self error="RPC Error:: 400,ACL support disabled" code=400
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/acl/token/self duration="734.452µs"
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:48614: EOF
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:48544: EOF
2023-05-11T11:37:07+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:37:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="421.548µs"
2023-05-11T11:37:06+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:37:06+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.4341ms
2023-05-11T11:37:06+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:06+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="351.231µs"
2023-05-11T11:37:06+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:52378
2023-05-11T11:37:05+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:37:05+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="332.268µs"
2023-05-11T11:37:05+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:37:04+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="330.098µs"
2023-05-11T11:37:03+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="730.669µs"
2023-05-11T11:37:03+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:37:03+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:48486: EOF
2023-05-11T11:37:03+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/evaluations?index=15451 duration="250.119µs"
2023-05-11T11:37:03+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations?index=15452 duration="449.732µs"
2023-05-11T11:37:03+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="520.776µs"
2023-05-11T11:37:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="431.083µs"
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: restarting due to unhealthy check: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff check="service: \"grafana-agent-health\" check" task=group-grafana-agent
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee check=keycloak_postgres_ping task=group-keycloak-postgres time_limit=20s
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=9 errors=0
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=9
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15456 total=18 pulled=9 filtered=9
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=7
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=9 errors=0
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15455 total=18 pulled=9 filtered=9
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=9
2023-05-11T11:37:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/summary?index=15381 duration=1m45.382466937s
2023-05-11T11:37:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/summary?index=15381 duration=1m45.540582113s
2023-05-11T11:37:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/summary?index=15381 duration=2.391761009s
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: task run loop exiting: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner.task_hook.logmon: plugin process exited: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres path=/usr/local/bin/nomad pid=8692
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin exited: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner: waiting for task to exit: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [✅] client.gc: marking allocation for GC: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.stdio: received EOF, stopping recv loop: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a references=1
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres type=Killed msg="Task successfully killed" failed=false
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=12252
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres type=Terminated msg="Exit Code: 0" failed=false
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: stopped container: container_id=6be1224a77982c6acd5892c6a273d0943101c25a56d2d0f111b78b9e98f6ff57 driver=docker
2023-05-11T11:37:02+02:00 [consul.service 💻 worker-01] [❌] agent.envoy: Error receiving new DeltaDiscoveryRequest; closing request channel: error="rpc error: code = Canceled desc = context canceled"
2023-05-11T11:37:02+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:1
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=9 errors=0
2023-05-11T11:37:02+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-716e088c4a623de12ff9c906dfc61990e48aeb5a
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres type=Killing msg="Sent interrupt. Waiting 5s before force killing" failed=false
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: task run loop exiting: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (watcher) stopping all views
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) received finish
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) stopping watcher
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) stopping
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres type=Killing msg="Sent interrupt. Waiting 5s before force killing" failed=false
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin exited: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner.task_hook.logmon: plugin process exited: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres path=/usr/local/bin/nomad pid=8868
2023-05-11T11:37:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres type=Killing msg="Sent interrupt. 
Waiting 5s before force killing" failed=false 2023-05-11T11:37:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.stdio: received EOF, stopping recv loop: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:37:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner: main tasks dead, destroying all sidecar tasks: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee 2023-05-11T11:37:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres type="Main Tasks Dead" msg="Main tasks in the group died" failed=false 2023-05-11T11:37:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=9 ignored=9 2023-05-11T11:37:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15454 total=18 pulled=9 filtered=9 2023-05-11T11:37:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres type=Killed msg="Task successfully killed" failed=false 2023-05-11T11:37:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:cefd1c9e490c8b581d834d878081cf64c133df1f9f443c5e5f8d94fbd7c7a1d4 references=0 2023-05-11T11:37:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:37:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=12254 2023-05-11T11:37:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: 
code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:37:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres type=Terminated msg="Exit Code: 0" failed=false 2023-05-11T11:37:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: stopped container: container_id=94cf9bd135e9f52310092fa921a1f3d6b36886ae36682434de17ae3ced6bb184 driver=docker 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.473422ms 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/857749e0-52fe-92ee-7bef-fafbe67605ee/stats duration=2.487711ms 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration=1.019044ms 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=10 ignored=8 errors=0 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres type=Killing msg="Sent interrupt. 
Waiting 5s before force killing" failed=false 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=10 ignored=8 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15452 total=18 pulled=10 filtered=8 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=90e4892a-534a-f6bd-c637-24f859378cbe type=service namespace=default job_id=security node_id="" triggered_by=job-deregister 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/allocation/8f3fb4a6-629a-7afd-a334-5580bf2d3374?index=15373 duration=1m37.943482953s 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval="" 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security?index=15451 duration="353.4ยตs" 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security/allocations?index=15450 duration=68.334794ms 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker.service_sched: setting eval status: eval_id=90e4892a-534a-f6bd-c637-24f859378cbe job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=90e4892a-534a-f6bd-c637-24f859378cbe 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET 
path=/v1/allocation/8f3fb4a6-629a-7afd-a334-5580bf2d3374?index=15373 duration=6.53671383s 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/allocation/57e22169-b9b1-7fc0-d7e9-615ff2623261?index=15373 duration=1m37.834620795s 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security?index=15121 duration=2.024740893s 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security?index=15121 duration=1m45.010471212s 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security/evaluations?index=15407 duration=1m47.163414482s 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security/evaluations?index=15443 duration=29.271735ms 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 1) (migrate 0) (ignore 0) (canary 0) 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 3) (disconnect 0) (reconnect 0) 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 1) (migrate 0) (ignore 0) (canary 0) 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 1) (migrate 0) (ignore 0) (canary 0) 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results= 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security?index=15121 duration=1m47.121752451s 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] 
worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker.service_sched: reconciled current state with desired state: eval_id=90e4892a-534a-f6bd-c637-24f859378cbe job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker.service_sched.binpack: NewBinPackIterator created: eval_id=90e4892a-534a-f6bd-c637-24f859378cbe job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=90e4892a-534a-f6bd-c637-24f859378cbe type=service namespace=default job_id=security node_id="" triggered_by=job-deregister 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad: evaluating plan: plan="(eval 90e4892a, job security, NodeUpdates: (node[f652ee64] (57e22169 stop/evict))(node[36d1fc65] (857749e0 stop/evict))(node[e2eb7460] (e5fea251 stop/evict)))" 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=DELETE path=/v1/job/security duration=1.688692ms 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path="/v1/jobs?meta=true&index=15445" duration=3.604376407s 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] consul.sync: execute sync: reason=periodic 2023-05-11T11:37:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="445.807ยตs" 2023-05-11T11:37:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: 
check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:37:00+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="372.353ยตs" 2023-05-11T11:37:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=_nomad-check-f2cfef6fca0c54054488693ca23684fc092ffa40 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] http: http: TLS handshake error from 10.21.21.42:48440: EOF 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/857749e0-52fe-92ee-7bef-fafbe67605ee/stats duration=1.170635ms 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="562.797ยตs" 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] http: http: TLS handshake error from 10.21.21.42:48426: EOF 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/857749e0-52fe-92ee-7bef-fafbe67605ee/stats duration=1.800849ms 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="710.041ยตs" 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] http: http: TLS handshake error from 10.21.21.42:48396: EOF 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET 
path=/v1/job/security/allocations?index=15419 duration="382.317ยตs" 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] http: http: TLS handshake error from 10.21.21.42:48384: EOF 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security/evaluations?index=15407 duration="264.779ยตs" 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/namespaces duration="219.576ยตs" 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path="/v1/jobs?meta=true&namespace=default" duration="265.453ยตs" 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security duration="341.899ยตs" 2023-05-11T11:36:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="327.417ยตs" 2023-05-11T11:36:58+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="340.681ยตs" 2023-05-11T11:36:58+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] http: http: TLS handshake error from 10.21.21.42:48352: EOF 2023-05-11T11:36:58+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/jobs?meta=true duration="426.884ยตs" 2023-05-11T11:36:58+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/namespaces duration="205.282ยตs" 2023-05-11T11:36:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration=2.620887ms 2023-05-11T11:36:57+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04 2023-05-11T11:36:56+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5 2023-05-11T11:36:56+02:00 [nomad.service ๐Ÿ’ป 
worker-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.451477ms 2023-05-11T11:36:56+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="647.508ยตs" 2023-05-11T11:36:56+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad: memberlist: Stream connection from=127.0.0.1:51472 2023-05-11T11:36:55+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b 2023-05-11T11:36:55+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:36:55+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="812.125ยตs" 2023-05-11T11:36:54+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="617.204ยตs" 2023-05-11T11:36:53+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="787.084ยตs" 2023-05-11T11:36:52+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="589.563ยตs" 2023-05-11T11:36:51+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.716891ms 2023-05-11T11:36:51+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="332.487ยตs" 2023-05-11T11:36:50+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:36:49+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="380.455ยตs" 2023-05-11T11:36:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="349.224ยตs" 2023-05-11T11:36:47+02:00 
[nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="317.472ยตs" 2023-05-11T11:36:47+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04 2023-05-11T11:36:46+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5 2023-05-11T11:36:46+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="350.456ยตs" 2023-05-11T11:36:46+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.686248ms 2023-05-11T11:36:46+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad: memberlist: Stream connection from=127.0.0.1:38942 2023-05-11T11:36:45+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b 2023-05-11T11:36:45+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:36:45+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="381.775ยตs" 2023-05-11T11:36:44+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="384.752ยตs" 2023-05-11T11:36:43+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="376.387ยตs" 2023-05-11T11:36:42+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff check="service: \"grafana-agent-ready\" check" task=group-grafana-agent time_limit=40s 2023-05-11T11:36:42+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] watch.checks: check became unhealthy. 
Will restart if check doesn't become healthy: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff check="service: \"grafana-agent-health\" check" task=group-grafana-agent time_limit=20s 2023-05-11T11:36:42+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="349.866ยตs" 2023-05-11T11:36:41+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.671347ms 2023-05-11T11:36:41+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="356.628ยตs" 2023-05-11T11:36:40+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:36:40+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="350.286ยตs" 2023-05-11T11:36:39+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration=1.610415ms 2023-05-11T11:36:38+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="724.263ยตs" 2023-05-11T11:36:37+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="692.941ยตs" 2023-05-11T11:36:37+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04 2023-05-11T11:36:36+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5 2023-05-11T11:36:36+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.368277ms 2023-05-11T11:36:36+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad: memberlist: Stream connection from=127.0.0.1:56030 2023-05-11T11:36:36+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request 
complete: method=GET path=/v1/agent/self duration="676.465ยตs" 2023-05-11T11:36:35+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b 2023-05-11T11:36:35+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:36:35+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=9 ignored=9 errors=0 2023-05-11T11:36:35+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=9 ignored=9 2023-05-11T11:36:35+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15450 total=18 pulled=9 filtered=9 2023-05-11T11:36:35+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="620.095ยตs" 2023-05-11T11:36:34+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="471.699ยตs" 2023-05-11T11:36:32+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="320.503ยตs" 2023-05-11T11:36:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] watch.checks: restarting due to unhealthy check: alloc_id=54969951-d541-ae97-922a-7db38096bae5 check="service: \"nats-prometheus-exporter\" check" task=group-nats 2023-05-11T11:36:31+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.691708ms 2023-05-11T11:36:31+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="375.023ยตs" 2023-05-11T11:36:31+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] consul.sync: execute sync: reason=periodic 2023-05-11T11:36:30+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=9 ignored=9 errors=0 
2023-05-11T11:36:30+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15449 total=18 pulled=9 filtered=9
2023-05-11T11:36:30+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=9
2023-05-11T11:36:30+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:36:30+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="323.037µs"
2023-05-11T11:36:29+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="406.024µs"
2023-05-11T11:36:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="347.913µs"
2023-05-11T11:36:27+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="320.876µs"
2023-05-11T11:36:27+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:36:26+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:36:26+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="346.872µs"
2023-05-11T11:36:26+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.4489ms
2023-05-11T11:36:26+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:60800
2023-05-11T11:36:25+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:36:25+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:36:25+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="341.597µs"
2023-05-11T11:36:25+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/observability?index=3271&namespace=default&stale=&wait=60000ms" duration=1m3.191366427s
2023-05-11T11:36:25+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-70aab0652a7bd0752dd874aea93591402b616533
2023-05-11T11:36:24+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:36:24+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="316.053µs"
2023-05-11T11:36:24+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=1m0.170671595s
2023-05-11T11:36:23+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="522.056µs"
2023-05-11T11:36:22+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=54969951-d541-ae97-922a-7db38096bae5 check="service: \"nats\" check" task=group-nats time_limit=20s
2023-05-11T11:36:22+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=54969951-d541-ae97-922a-7db38096bae5 check="service: \"nats-prometheus-exporter\" check" task=group-nats time_limit=10s
2023-05-11T11:36:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="341.801µs"
2023-05-11T11:36:21+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=4.630828ms
2023-05-11T11:36:21+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="818.739µs"
2023-05-11T11:36:20+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:36:20+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-b3d1f5b9ee3978cf6b8f0255a106a7688c82cab7
2023-05-11T11:36:20+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="752.534µs"
2023-05-11T11:36:19+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2
2023-05-11T11:36:19+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="715.179µs"
2023-05-11T11:36:18+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="728.07µs"
2023-05-11T11:36:17+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:36:16+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="687.747µs"
2023-05-11T11:36:16+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:36:16+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.556903ms
2023-05-11T11:36:16+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:54114
2023-05-11T11:36:16+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/minio?index=10463&namespace=default&stale=&wait=60000ms" duration=1m2.572153834s
2023-05-11T11:36:15+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:36:15+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="721.268µs"
2023-05-11T11:36:15+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:36:15+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=9 errors=0
2023-05-11T11:36:15+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15448 total=18 pulled=9 filtered=9
2023-05-11T11:36:15+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=9
2023-05-11T11:36:15+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-70aab0652a7bd0752dd874aea93591402b616533
2023-05-11T11:36:14+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:36:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="685.074µs"
2023-05-11T11:36:14+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-eb14e1331928ab27d2b8b775e25ee3c228dccd31
2023-05-11T11:36:13+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="805.319µs"
2023-05-11T11:36:13+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-859b3d99c092fd0ca87a2ba57fadf8e13163a73a
2023-05-11T11:36:12+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=1.010976ms
2023-05-11T11:36:12+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:1
2023-05-11T11:36:12+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-716e088c4a623de12ff9c906dfc61990e48aeb5a
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=9 errors=0
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=9
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15447 total=18 pulled=9 filtered=9
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres type=Started msg="Task started by client" failed=false
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:36:12.537Z
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres type=Started msg="Task started by client" failed=false
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:36:12.531Z
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin497444879 network=unix timestamp=2023-05-11T09:36:12.530Z
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker network=unix @module=docker_logger address=/tmp/plugin2272204851 timestamp=2023-05-11T09:36:12.528Z
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=12254
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=6be1224a77982c6acd5892c6a273d0943101c25a56d2d0f111b78b9e98f6ff57
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=12252
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=94cf9bd135e9f52310092fa921a1f3d6b36886ae36682434de17ae3ced6bb184
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=6be1224a77982c6acd5892c6a273d0943101c25a56d2d0f111b78b9e98f6ff57
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=94cf9bd135e9f52310092fa921a1f3d6b36886ae36682434de17ae3ced6bb184
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=keycloak_postgres container_name=keycloak_postgres-857749e0-52fe-92ee-7bef-fafbe67605ee
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=keycloak_postgres labels="map[com.hashicorp.nomad.alloc_id:857749e0-52fe-92ee-7bef-fafbe67605ee com.hashicorp.nomad.job_id:security com.hashicorp.nomad.job_name:security com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:keycloak-postgres com.hashicorp.nomad.task_name:keycloak_postgres]"
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/postgres:14.5 image_id=sha256:cefd1c9e490c8b581d834d878081cf64c133df1f9f443c5e5f8d94fbd7c7a1d4 references=1
2023-05-11T11:36:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: cancelling removal of container image: driver=docker 
image_name=registry.cloud.private/postgres:14.5 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=keycloak_postgres binds="[]string{\"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/keycloak_postgres/local:/local\", \"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/keycloak_postgres/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/keycloak_postgres/local/initddb.sql:/docker-entrypoint-initdb.d/initddb.sql\"}" 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=keycloak_postgres memory=34359738368 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=keycloak_postgres network_mode=container:5c2a6e2bed9928c93ccf1d654aac2b51225a12f01b4610592ea2563448aacf2a 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=connect-proxy-keycloak-postgres labels="map[com.github.logunifier.application.pattern.key:envoy com.hashicorp.nomad.alloc_id:857749e0-52fe-92ee-7bef-fafbe67605ee com.hashicorp.nomad.job_id:security com.hashicorp.nomad.job_name:security com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:keycloak-postgres com.hashicorp.nomad.task_name:connect-proxy-keycloak-postgres]" 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=connect-proxy-keycloak-postgres 
memory=314572800 memory_reservation=0 cpu_shares=100 cpu_quota=0 cpu_period=0 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=connect-proxy-keycloak-postgres container_name=connect-proxy-keycloak-postgres-857749e0-52fe-92ee-7bef-fafbe67605ee 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=connect-proxy-keycloak-postgres network_mode=container:5c2a6e2bed9928c93ccf1d654aac2b51225a12f01b4610592ea2563448aacf2a 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=envoyproxy/envoy:v1.25.1 image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a references=2 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=connect-proxy-keycloak-postgres binds="[]string{\"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/connect-proxy-keycloak-postgres/local:/local\", \"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/connect-proxy-keycloak-postgres/secrets:/secrets\"}" 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres @module=logmon path=/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/alloc/logs/.keycloak_postgres.stderr.fifo timestamp=2023-05-11T09:36:12.398Z 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres @module=logmon 
path=/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/alloc/logs/.keycloak_postgres.stdout.fifo timestamp=2023-05-11T09:36:12.398Z 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres path=/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/connect-proxy-keycloak-postgres/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/connect-proxy-keycloak-postgres/secrets/api.sock: bind: invalid argument" 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres @module=logmon path=/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/alloc/logs/.connect-proxy-keycloak-postgres.stdout.fifo timestamp=2023-05-11T09:36:12.396Z 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres @module=logmon path=/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/alloc/logs/.connect-proxy-keycloak-postgres.stderr.fifo timestamp=2023-05-11T09:36:12.396Z 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: restarting task: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres reason="" delay=0s 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: 
image id reference count decremented: driver=docker image_id=sha256:cefd1c9e490c8b581d834d878081cf64c133df1f9f443c5e5f8d94fbd7c7a1d4 references=0 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres type=Restarting msg="Task restarting in 0s" failed=false 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: restarting task: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres reason="" delay=0s 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres type=Restarting msg="Task restarting in 0s" failed=false 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a references=1 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=8828 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=8966 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] 
client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres type=Terminated msg="Exit Code: 0" failed=false 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres type=Terminated msg="Exit Code: 0" failed=false 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: stopped container: container_id=fa3b7bc7827c5ee18fd7ce567ee426f56f5bce66e44534bdcc0b5d9b1f568108 driver=docker 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: stopped container: container_id=3b4d212e87bebe3a8c28a21431efcec507575987069ec72cd9ea0edf04ee3cdc driver=docker 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request failed: method=PUT path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/restart?namespace=default error="No path to node" code=404 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=PUT path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/restart?namespace=default duration="822.495ยตs" 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path="/v1/job/security/deployments?all=true&namespace=default" duration="533.099ยตs" 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 
updated=9 ignored=9 errors=0 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=PUT path=/v1/client/allocation/857749e0-52fe-92ee-7bef-fafbe67605ee/restart?namespace=default duration=140.95824ms 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=9 ignored=9 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15446 total=18 pulled=9 filtered=9 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security/allocations?index=15419 duration=57.562294961s 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security/scale?index=15389 duration=55.431902885s 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/job/security/allocations?index=15419 duration=55.452509713s 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] consul.sync: sync complete: registered_services=1 deregistered_services=0 registered_checks=0 deregistered_checks=0 2023-05-11T11:36:12+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy 2023-05-11T11:36:12+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432 2023-05-11T11:36:12+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.envoy: Error receiving new DeltaDiscoveryRequest; closing request channel: error="rpc error: code = Canceled desc = context canceled" 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] consul.sync: sync complete: registered_services=0 deregistered_services=1 registered_checks=0 deregistered_checks=0 2023-05-11T11:36:12+02:00 
[consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Deregistered service: service=_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432 2023-05-11T11:36:12+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Deregistered service: service=_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres type="Restart Signaled" msg="User requested running tasks to restart" failed=false 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres type="Restart Signaled" msg="User requested running tasks to restart" failed=false 2023-05-11T11:36:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path="/v1/job/security/deployments?all=true&namespace=default" duration="178.747ยตs" 2023-05-11T11:36:11+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.737901ms 2023-05-11T11:36:11+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="490.422ยตs" 2023-05-11T11:36:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=9 ignored=9 errors=0 2023-05-11T11:36:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15445 total=18 pulled=9 filtered=9 2023-05-11T11:36:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=9 ignored=9 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] consul.sync: sync complete: registered_services=2 deregistered_services=0 registered_checks=0 deregistered_checks=0 
2023-05-11T11:36:10+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3-minio-minio-http 2023-05-11T11:36:10+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3-minio-minio-console-console 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 task=minio type=Started msg="Task started by client" failed=false 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:36:10.932Z 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin4031503232 network=unix timestamp=2023-05-11T09:36:10.929Z 2023-05-11T11:36:10+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=12074 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker 
container_id=745cad68ccbd67608b8b690be47a89a294223d2541efd3358097c05642d064ac 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=9 ignored=9 errors=0 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15444 total=18 pulled=9 filtered=9 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=9 ignored=9 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=745cad68ccbd67608b8b690be47a89a294223d2541efd3358097c05642d064ac 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/minio/minio:RELEASE.2023-04-28T18-11-17Z image_id=sha256:5ba81f3dad7fb4d608d375ec64cac33fcb196e0ed530be35e002177639b11d21 references=1 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=minio network_mode=container:1305933a306296634391b261c11335c2d5befaa15992a64eaff72b80722b1cea 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=minio memory=4294967296 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container startup command: driver=docker task_name=minio command="server /data --console-address :9001 --certs-dir /certs" 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=minio labels="map[com.hashicorp.nomad.alloc_id:a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 com.hashicorp.nomad.job_id:minio com.hashicorp.nomad.job_name:minio 
com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:minio com.hashicorp.nomad.task_name:minio]" 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=registry.cloud.private/minio/minio:RELEASE.2023-04-28T18-11-17Z 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=minio container_name=minio-a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=minio binds="[]string{\"/opt/services/core/nomad/data/alloc/a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3/minio/local:/local\", \"/opt/services/core/nomad/data/alloc/a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3/minio/secrets:/secrets\"}" 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3/minio/secrets/env.vars" 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template 124b5b35ead742dca1d9561857552116 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) receiving dependency nomad.var.block(nomad/jobs/minio@default.global) 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) all templates rendered 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3/minio/secrets/env.vars" 2023-05-11T11:36:10+02:00 
[nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) nomad.var.block(nomad/jobs/minio@default.global) is still needed 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path="/v1/var/nomad/jobs/minio?namespace=default&stale=&wait=60000ms" duration="826.911ยตs" 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (watcher) adding nomad.var.block(nomad/jobs/minio@default.global) 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) add used dependency nomad.var.block(nomad/jobs/minio@default.global) to missing since isLeader but do not have a watcher 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing dependency: nomad.var.block(nomad/jobs/minio@default.global) 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing data for 1 dependencies 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) was not watching 1 dependencies 2023-05-11T11:36:10+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) final config: 
{"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":" {{- with nomadVar \"nomad/jobs/minio\" -}}\n MINIO_ROOT_USER = {{.minio_root_user}}\n MINIO_ROOT_PASSWORD = {{.minio_root_password}}\n {{- end 
-}}\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3/minio/secrets/env.vars","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"{{","RightDelim":"}}","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3/minio"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":
60000000000}
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) running initial templates
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template 124b5b35ead742dca1d9561857552116
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) starting
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating watcher
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating new runner (dry: false, once: false)
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 task=minio @module=logmon path=/opt/services/core/nomad/data/alloc/a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3/alloc/logs/.minio.stdout.fifo timestamp=2023-05-11T09:36:10.764Z
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 task=minio @module=logmon path=/opt/services/core/nomad/data/alloc/a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3/alloc/logs/.minio.stderr.fifo timestamp=2023-05-11T09:36:10.764Z
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 task=minio version=2
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 task=minio @module=logmon address=/tmp/plugin4253698064 network=unix timestamp=2023-05-11T09:36:10.763Z
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 task=minio path=/usr/local/bin/nomad
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 task=minio path=/usr/local/bin/nomad pid=12020
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 task=minio path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 task=minio type="Task Setup" msg="Building Task Directory" failed=false
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.runner_hook: received result from CNI: alloc_id=a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 result="{\"Interfaces\":{\"eth0\":{\"IPConfigs\":[{\"IP\":\"172.26.66.68\",\"Gateway\":\"172.26.64.1\"}],\"Mac\":\"66:7b:fe:34:1c:56\",\"Sandbox\":\"/var/run/docker/netns/af27f53de5ca\"},\"nomad\":{\"IPConfigs\":null,\"Mac\":\"b6:77:25:d7:7f:65\",\"Sandbox\":\"\"},\"vethd6e23ab3\":{\"IPConfigs\":null,\"Mac\":\"8a:a8:39:1e:1d:f9\",\"Sandbox\":\"\"}},\"DNS\":[{}],\"Routes\":[{\"dst\":\"0.0.0.0/0\"}]}"
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 task=minio
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="395.581µs"
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=8
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_migrator: waiting for previous alloc to terminate: alloc_id=a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 previous_alloc=a37da363-9048-86d5-93e1-d6facf1490b1
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_migrator: waiting for previous alloc to terminate: alloc_id=a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 previous_alloc=a37da363-9048-86d5-93e1-d6facf1490b1
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=1 removed=0 updated=9 ignored=8 errors=0
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a3a463c6-cc7d-b968-3e20-d09dfc2ac6b3 task=minio type=Received msg="Task received by client" failed=false
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=1 removed=0 updated=9 ignored=8
2023-05-11T11:36:10+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15442 total=18 pulled=10 filtered=8
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=e37593a3-e0f2-a323-ed16-2d9728395bbe type=service namespace=default job_id=minio node_id="" triggered_by=alloc-failure
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=e37593a3-e0f2-a323-ed16-2d9728395bbe job_id=minio namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [🐞] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=e37593a3-e0f2-a323-ed16-2d9728395bbe
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "minio": (place 1) (inplace 0) (destructive 0) (stop 1) (migrate 0) (ignore 0) (canary 0)
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 1) (destructive 0) (inplace 0) (stop 1) (disconnect 0) (reconnect 0)
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=e37593a3-e0f2-a323-ed16-2d9728395bbe job_id=minio namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=e37593a3-e0f2-a323-ed16-2d9728395bbe job_id=minio namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [👀] nomad: evaluating plan: plan="(eval e37593a3, job minio, NodeUpdates: (node[36d1fc65] (a37da363 stop/evict)), NodeAllocations: (node[36d1fc65] (a3a463c6 minio.minio[0] run)))"
2023-05-11T11:36:10+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=e37593a3-e0f2-a323-ed16-2d9728395bbe type=service namespace=default job_id=minio node_id="" triggered_by=alloc-failure
2023-05-11T11:36:09+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="683.071µs"
2023-05-11T11:36:08+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="782.872µs"
2023-05-11T11:36:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="692.699µs"
2023-05-11T11:36:07+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:36:06+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:36:06+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.6283ms
2023-05-11T11:36:06+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="671.419µs"
2023-05-11T11:36:06+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:46356
2023-05-11T11:36:06+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-19895c3bd525cf9c3a506c1368b2a62e05311dbf
2023-05-11T11:36:05+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-50b7019c86255addd5dcd01d59711761ba228062
2023-05-11T11:36:05+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:36:05+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:36:05+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:36:05+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:36:05+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-2ad01ec58eea90b73b8f7910539272078c378498
2023-05-11T11:36:05+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="521.519µs"
2023-05-11T11:36:05+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-70aab0652a7bd0752dd874aea93591402b616533
2023-05-11T11:36:04+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:36:04+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="334.848µs"
2023-05-11T11:36:03+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-5b9fc6e1d9e4477d8b86b1980d5788f90b681884
2023-05-11T11:36:03+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745
2023-05-11T11:36:03+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="402.53µs"
2023-05-11T11:36:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="347.58µs"
2023-05-11T11:36:01+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.702363ms
2023-05-11T11:36:01+02:00 [nomad.service 💻 master-01] [👀] consul.sync: execute sync: reason=periodic
2023-05-11T11:36:01+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="318.759µs"
2023-05-11T11:36:00+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=8 errors=0
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=8
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15440 total=17 pulled=9 filtered=8
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=9 ignored=8 errors=0
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15438 total=17 pulled=9 filtered=8
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=9 ignored=8
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=8 ignored=9 errors=0
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=8 ignored=9
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15436 total=17 pulled=8 filtered=9
2023-05-11T11:36:00+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-f20bbef77e104d3e4619184819b0ce43a43d5663
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=7
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=2366c69d-1a0a-c83a-63e8-95520ec84bb4 type=service namespace=default job_id=minio node_id="" triggered_by=alloc-failure
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [🐞] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=2366c69d-1a0a-c83a-63e8-95520ec84bb4
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=2366c69d-1a0a-c83a-63e8-95520ec84bb4 job_id=minio namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [🐞] worker: created evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [👀] nomad: evaluating plan: plan="(eval 2366c69d, job minio, NodeAllocations: (node[36d1fc65] (a37da363 minio.minio[0] run)))"
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: found reschedulable allocs, followup eval created: eval_id=2366c69d-1a0a-c83a-63e8-95520ec84bb4 job_id=minio namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 followup_eval_id=e37593a3-e0f2-a323-ed16-2d9728395bbe
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "minio": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=2366c69d-1a0a-c83a-63e8-95520ec84bb4 job_id=minio namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=2366c69d-1a0a-c83a-63e8-95520ec84bb4 job_id=minio namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=2366c69d-1a0a-c83a-63e8-95520ec84bb4 type=service namespace=default job_id=minio node_id="" triggered_by=alloc-failure
2023-05-11T11:36:00+02:00 [nomad.service 💻 master-01] [🐞] nomad.client: adding evaluations for rescheduling failed allocations: num_evals=1
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: task run loop exiting: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] agent: (watcher) stopping all views
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner.task_hook.logmon: plugin process exited: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio path=/usr/local/bin/nomad pid=6641
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [✅] client.gc: marking allocation for GC: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) received finish
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin exited: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) stopping
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) stopping watcher
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.stdio: received EOF, stopping recv loop: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: not restarting task: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio reason="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\""
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:5ba81f3dad7fb4d608d375ec64cac33fcb196e0ed530be35e002177639b11d21 references=0
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-87544ecc56e51c7b778bce62ebcebb7ede77d2edb55a51a18761935953107a17.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio type="Not Restarting" msg="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" failed=true
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=10717
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio type=Terminated msg="Exit Code: 0" failed=false
2023-05-11T11:36:00+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: stopped container: container_id=87544ecc56e51c7b778bce62ebcebb7ede77d2edb55a51a18761935953107a17 driver=docker
2023-05-11T11:35:59+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="371.447µs"
2023-05-11T11:35:59+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-760a142ce99b087dda36d076210381837fccee9a
2023-05-11T11:35:59+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=8 ignored=9 errors=0
2023-05-11T11:35:59+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=8 ignored=9
2023-05-11T11:35:59+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15435 total=17 pulled=8 filtered=9
2023-05-11T11:35:59+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=0 deregistered_services=2 registered_checks=0 deregistered_checks=0
2023-05-11T11:35:59+02:00 [consul.service 💻 worker-01] [✅] agent: Deregistered service: service=_nomad-task-a37da363-9048-86d5-93e1-d6facf1490b1-minio-minio-console-console
2023-05-11T11:35:59+02:00 [consul.service 💻 worker-01] [✅] agent: Deregistered service: service=_nomad-task-a37da363-9048-86d5-93e1-d6facf1490b1-minio-minio-http
2023-05-11T11:35:59+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: restarting due to unhealthy check: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 check=minio-ready task=minio
2023-05-11T11:35:59+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio type="Restart Signaled" msg="healthcheck: check \"minio-ready\" unhealthy" failed=false
2023-05-11T11:35:59+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=8 ignored=9 errors=0
2023-05-11T11:35:59+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=8 ignored=9
2023-05-11T11:35:59+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15434 total=17 pulled=8 filtered=9
2023-05-11T11:35:58+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="328.449µs"
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=8 ignored=9 errors=0
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=8 ignored=9
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15433 total=17 pulled=8 filtered=9
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=b9bd1537-0bae-8c11-41b3-437a4c21df29 task=tempo type=Started msg="Task started by client" failed=false
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:35:58.759Z
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker address=/tmp/plugin2429080487 network=unix @module=docker_logger timestamp=2023-05-11T09:35:58.755Z
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=308318634a64409afcf206737ccd2e0d9dd263312fd1cc2c3a434e243dc1a1a5
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=11677
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=308318634a64409afcf206737ccd2e0d9dd263312fd1cc2c3a434e243dc1a1a5
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=tempo labels="map[com.github.logunifier.application.name:tempo com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:2.1.1 com.hashicorp.nomad.alloc_id:b9bd1537-0bae-8c11-41b3-437a4c21df29 com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:tempo com.hashicorp.nomad.task_name:tempo]"
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=tempo network_mode=container:01c62dde5e9110fe0a3a8aed5f44dd89d02f516e24092847d495273a7ba17db8
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=tempo container_name=tempo-b9bd1537-0bae-8c11-41b3-437a4c21df29
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=tempo binds="[]string{\"/opt/services/core/nomad/data/alloc/b9bd1537-0bae-8c11-41b3-437a4c21df29/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/b9bd1537-0bae-8c11-41b3-437a4c21df29/tempo/local:/local\", \"/opt/services/core/nomad/data/alloc/b9bd1537-0bae-8c11-41b3-437a4c21df29/tempo/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/b9bd1537-0bae-8c11-41b3-437a4c21df29/tempo/local/tempo.yaml:/config/tempo.yaml\"}"
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/grafana/tempo:2.1.1 image_id=sha256:04523525f28246daa9a499f2dd136f356806e965ecdab5b5dc9b4d3eddcff2de references=1
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=tempo memory=34359738368 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=registry.cloud.private/grafana/tempo:2.1.1
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) health.service(mimir|passing) is still needed
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) all templates rendered
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/b9bd1537-0bae-8c11-41b3-437a4c21df29/tempo/local/tempo.yaml"
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/b9bd1537-0bae-8c11-41b3-437a4c21df29/tempo/local/tempo.yaml"
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template 723104a980ccc69983d3c9e8709349f9
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) receiving dependency health.service(mimir|passing)
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) missing dependency: health.service(mimir|passing)
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) add used dependency health.service(mimir|passing) to missing since isLeader but do not have a watcher
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) running initial templates
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) missing data for 1 dependencies
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) was not watching 1 dependencies
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) starting
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (watcher) adding health.service(mimir|passing)
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template 723104a980ccc69983d3c9e8709349f9
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":"multitenancy_enabled: false\n\nserver:\n http_listen_port: 3200\n\ndistributor:\n receivers: # this configuration will listen on all ports and protocols that tempo is capable of.\n jaeger: # the receives all come from the OpenTelemetry collector. more configuration information can\n protocols: # be found there: https://github.com/open-telemetry/opentelemetry-collector/tree/main/receiver\n thrift_http: #\n grpc: # for a production deployment you should only enable the receivers you need!\n thrift_binary:\n thrift_compact:\n zipkin:\n otlp:\n protocols:\n http:\n grpc:\n opencensus:\n\ningester:\n trace_idle_period: 10s # the length of time after a trace has not received spans to consider it complete and flush it\n max_block_bytes: 1_000_000 # cut the head block when it hits this size or ...\n max_block_duration: 5m # this much time passes\n\ncompactor:\n compaction:\n compaction_window: 1h # blocks in this time window will be compacted together\n max_block_bytes: 100_000_000 # maximum size of compacted blocks\n block_retention: 24h # Duration to keep blocks 1d\n\nmetrics_generator:\n registry:\n external_labels:\n source: tempo\n cluster: nomadder1\n storage:\n path: /data/generator/wal\n remote_write:\n++- range service \"mimir\" ++\n - url: http://++.Name++.service.consul:++.Port++/api/v1/push\n send_exemplars: true\n headers:\n x-scope-orgid: 1\n++- end ++\n\nstorage:\n trace:\n backend: local # backend configuration to use\n block:\n bloom_filter_false_positive: .05 # bloom filter false positive rate. lower values create larger filters but fewer false positives\n wal:\n path: /data/wal # where to store the the wal locally\n local:\n path: /data/blocks\n pool:\n max_workers: 100 # worker pool determines the number of parallel requests to the object store backend\n queue_depth: 10000\n\nquery_frontend:\n search:\n # how to define year here ? define 5 years\n max_duration: 43800h\n\noverrides:\n metrics_generator_processors: [service-graphs, span-metrics]","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/b9bd1537-0bae-8c11-41b3-437a4c21df29/tempo/local/tempo.yaml","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"++","RightDelim":"++","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/b9bd1537-0bae-8c11-41b3-437a4c21df29/tempo"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000}
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating watcher
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating new runner (dry: false, once: false)
2023-05-11T11:35:58+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=b9bd1537-0bae-8c11-41b3-437a4c21df29 task=tempo @module=logmon path=/opt/services/core/nomad/data/alloc/b9bd1537-0bae-8c11-41b3-437a4c21df29/alloc/logs/.tempo.stderr.fifo timestamp=2023-05-11T09:35:58.571Z
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=b9bd1537-0bae-8c11-41b3-437a4c21df29 task=tempo @module=logmon path=/opt/services/core/nomad/data/alloc/b9bd1537-0bae-8c11-41b3-437a4c21df29/alloc/logs/.tempo.stdout.fifo timestamp=2023-05-11T09:35:58.570Z
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=b9bd1537-0bae-8c11-41b3-437a4c21df29 task=tempo @module=logmon address=/tmp/plugin2721173001 network=unix timestamp=2023-05-11T09:35:58.565Z
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=b9bd1537-0bae-8c11-41b3-437a4c21df29 task=tempo version=2
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=8 ignored=9 errors=0
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=5 deregistered_services=0 registered_checks=0 deregistered_checks=0
2023-05-11T11:35:58+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-b9bd1537-0bae-8c11-41b3-437a4c21df29-group-tempo-tempo-otlp-http-otlp_http
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=8 ignored=9
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15432 total=17 pulled=8 filtered=9
2023-05-11T11:35:58+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-b9bd1537-0bae-8c11-41b3-437a4c21df29-group-tempo-tempo-jaeger-jaeger
2023-05-11T11:35:58+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-b9bd1537-0bae-8c11-41b3-437a4c21df29-group-tempo-tempo-zipkin-zipkin
2023-05-11T11:35:58+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-b9bd1537-0bae-8c11-41b3-437a4c21df29-group-tempo-tempo-otlp-grpc-otlp_grpc
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=b9bd1537-0bae-8c11-41b3-437a4c21df29 task=tempo path=/usr/local/bin/nomad
2023-05-11T11:35:58+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-b9bd1537-0bae-8c11-41b3-437a4c21df29-group-tempo-tempo-tempo
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=b9bd1537-0bae-8c11-41b3-437a4c21df29 task=tempo path=/usr/local/bin/nomad pid=11620
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=b9bd1537-0bae-8c11-41b3-437a4c21df29 task=tempo path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=b9bd1537-0bae-8c11-41b3-437a4c21df29 task=tempo type="Task Setup" msg="Building Task Directory" failed=false
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=b9bd1537-0bae-8c11-41b3-437a4c21df29 task=tempo
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.runner_hook: received result from CNI: alloc_id=b9bd1537-0bae-8c11-41b3-437a4c21df29 result="{\"Interfaces\":{\"eth0\":{\"IPConfigs\":[{\"IP\":\"172.26.66.67\",\"Gateway\":\"172.26.64.1\"}],\"Mac\":\"9a:06:33:d2:a5:c3\",\"Sandbox\":\"/var/run/docker/netns/2b48f2685e95\"},\"nomad\":{\"IPConfigs\":null,\"Mac\":\"b6:77:25:d7:7f:65\",\"Sandbox\":\"\"},\"veth2dd3b0c1\":{\"IPConfigs\":null,\"Mac\":\"66:86:f1:aa:03:e4\",\"Sandbox\":\"\"}},\"DNS\":[{}],\"Routes\":[{\"dst\":\"0.0.0.0/0\"}]}"
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=9a3ae9f7-2ed3-c25c-12d9-d792452841d8 task=grafana-agent type=Started msg="Task started by client" failed=false
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:35:58.433Z
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker address=/tmp/plugin1581620523 network=unix @module=docker_logger timestamp=2023-05-11T09:35:58.431Z
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=11562
2023-05-11T11:35:58+02:00 [nomad.service 💻 worker-01] [🐞]
client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=b82e3d236cce24f85e06b057b67e5d550fe6f623252acc4e9a3a0174d1a1c5c1 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=b82e3d236cce24f85e06b057b67e5d550fe6f623252acc4e9a3a0174d1a1c5c1 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=8 ignored=9 errors=0 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15431 total=17 pulled=8 filtered=9 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=8 ignored=9 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/grafana/agent:v0.33.1 image_id=sha256:9833434074df83909804de7688db8f00ef71ef5fc00eb115d1e2e41d21ece5cf references=1 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=grafana-agent network_mode=container:8de4b3ce95d048f1710d786f18a9e79e6a21a22b56ea1ed830c1c75917501be5 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=grafana-agent memory=2147483648 memory_reservation=67108864 cpu_shares=100 cpu_quota=0 cpu_period=0 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: 
cancelling removal of container image: driver=docker image_name=registry.cloud.private/grafana/agent:v0.33.1 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=grafana-agent binds="[]string{\"/opt/services/core/nomad/data/alloc/9a3ae9f7-2ed3-c25c-12d9-d792452841d8/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/9a3ae9f7-2ed3-c25c-12d9-d792452841d8/grafana-agent/local:/local\", \"/opt/services/core/nomad/data/alloc/9a3ae9f7-2ed3-c25c-12d9-d792452841d8/grafana-agent/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/9a3ae9f7-2ed3-c25c-12d9-d792452841d8/grafana-agent/local/agent.yaml:/config/agent.yaml\"}" 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=grafana-agent container_name=grafana-agent-9a3ae9f7-2ed3-c25c-12d9-d792452841d8 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=grafana-agent labels="map[com.github.logunifier.application.name:grafana_agent com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:0.33.1 com.hashicorp.nomad.alloc_id:9a3ae9f7-2ed3-c25c-12d9-d792452841d8 com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:grafana-agent com.hashicorp.nomad.task_name:grafana-agent]" 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/9a3ae9f7-2ed3-c25c-12d9-d792452841d8/grafana-agent/local/agent.yaml" 2023-05-11T11:35:58+02:00 [nomad.service 
๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) all templates rendered 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) health.service(mimir|passing) is still needed 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/9a3ae9f7-2ed3-c25c-12d9-d792452841d8/grafana-agent/local/agent.yaml" 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template d3fbf984da4c8c04e753c284e9a12f26 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) receiving dependency health.service(mimir|passing) 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing dependency: health.service(mimir|passing) 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) add used dependency health.service(mimir|passing) to missing since isLeader but do not have a watcher 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template d3fbf984da4c8c04e753c284e9a12f26 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (watcher) adding health.service(mimir|passing) 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing data for 1 dependencies 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) was not watching 1 dependencies 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) running initial templates 
2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating watcher 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) starting 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":"server:\n log_level: info\n\nmetrics:\n wal_directory: \"/data/wal\"\n global:\n scrape_interval: 5s\n remote_write:\n++- range service \"mimir\" ++\n - url: http://++.Name++.service.consul:++.Port++/api/v1/push\n++- end ++\n configs:\n - name: integrations\n 
scrape_configs:\n - job_name: integrations/traefik\n scheme: http\n metrics_path: '/metrics'\n static_configs:\n - targets:\n - ingress.cloud.private:8081\n # grab all metric endpoints with stadanrd /metrics endpoint\n - job_name: \"integrations/consul_sd\"\n consul_sd_configs:\n - server: \"consul.service.consul:8501\"\n tags: [\"prometheus\"]\n tls_config:\n insecure_skip_verify: true\n ca_file: \"/certs/ca/ca.crt\"\n cert_file: \"/certs/consul/consul.pem\"\n key_file: \"/certs/consul/consul-key.pem\"\n datacenter: \"nomadder1\"\n scheme: https\n relabel_configs:\n - source_labels: [__meta_consul_node]\n target_label: instance\n - source_labels: [__meta_consul_service]\n target_label: service\n# - source_labels: [__meta_consul_tags]\n# separator: ','\n# regex: 'prometheus:([^=]+)=([^,]+)'\n# target_label: '$${1}'\n# replacement: '$${2}'\n - source_labels: [__meta_consul_tags]\n separator: ','\n regex: '.*,prometheus:server_id=([^,]+),.*'\n target_label: 'server_id'\n replacement: '$${1}'\n - source_labels: [__meta_consul_tags]\n separator: ','\n regex: '.*,prometheus:version=([^,]+),.*'\n target_label: 'version'\n replacement: '$${1}'\n - source_labels: ['__meta_consul_tags']\n target_label: 'labels'\n regex: '(.+)'\n replacement: '$${1}'\n action: 'keep'\n # - action: replace\n # replacement: '1'\n # target_label: 'test'\n metric_relabel_configs:\n - action: labeldrop\n regex: 'exported_.*'\n\n\n - job_name: \"integrations/consul_sd_minio\"\n metrics_path: \"/minio/v2/metrics/cluster\"\n consul_sd_configs:\n - server: \"consul.service.consul:8501\"\n tags: [\"prometheus_minio\"]\n tls_config:\n insecure_skip_verify: true\n ca_file: \"/certs/ca/ca.crt\"\n cert_file: \"/certs/consul/consul.pem\"\n key_file: \"/certs/consul/consul-key.pem\"\n datacenter: \"nomadder1\"\n scheme: https\n relabel_configs:\n - source_labels: [__meta_consul_node]\n target_label: instance\n - source_labels: [__meta_consul_service]\n target_label: service\n# - source_labels: 
[__meta_consul_tags]\n# separator: ','\n# regex: 'prometheus:([^=]+)=([^,]+)'\n# target_label: '$${1}'\n# replacement: '$${2}'\n - source_labels: [__meta_consul_tags]\n separator: ','\n regex: '.*,prometheus:server=([^,]+),.*'\n target_label: 'server'\n replacement: '$${1}'\n - source_labels: [__meta_consul_tags]\n separator: ','\n regex: '.*,prometheus:version=([^,]+),.*'\n target_label: 'version'\n replacement: '$${1}'\n - source_labels: ['__meta_consul_tags']\n target_label: 'labels'\n regex: '(.+)'\n replacement: '$${1}'\n action: 'keep'\n# - action: replace\n# replacement: '38'\n# target_label: 'test'\n metric_relabel_configs:\n - action: labeldrop\n regex: 'exported_.*'","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/9a3ae9f7-2ed3-c25c-12d9-d792452841d8/grafana-agent/local/agent.yaml","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"++","RightDelim":"++","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/9a3ae9f7-2ed3-c25c-12d9-d792452841d8/grafana-agent"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/sec
rets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000} 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating new runner (dry: false, once: false) 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=9a3ae9f7-2ed3-c25c-12d9-d792452841d8 task=grafana-agent @module=logmon path=/opt/services/core/nomad/data/alloc/9a3ae9f7-2ed3-c25c-12d9-d792452841d8/alloc/logs/.grafana-agent.stderr.fifo timestamp=2023-05-11T09:35:58.249Z 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=9a3ae9f7-2ed3-c25c-12d9-d792452841d8 task=grafana-agent @module=logmon path=/opt/services/core/nomad/data/alloc/9a3ae9f7-2ed3-c25c-12d9-d792452841d8/alloc/logs/.grafana-agent.stdout.fifo timestamp=2023-05-11T09:35:58.249Z 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=9a3ae9f7-2ed3-c25c-12d9-d792452841d8 task=grafana-agent version=2 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=9a3ae9f7-2ed3-c25c-12d9-d792452841d8 task=grafana-agent @module=logmon 
address=/tmp/plugin2425814785 network=unix timestamp=2023-05-11T09:35:58.247Z 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] consul.sync: sync complete: registered_services=2 deregistered_services=0 registered_checks=0 deregistered_checks=0 2023-05-11T11:35:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-9a3ae9f7-2ed3-c25c-12d9-d792452841d8-group-grafana-agent-grafana-agent-ready-server 2023-05-11T11:35:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-9a3ae9f7-2ed3-c25c-12d9-d792452841d8-group-grafana-agent-grafana-agent-health-server 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=9a3ae9f7-2ed3-c25c-12d9-d792452841d8 task=grafana-agent path=/usr/local/bin/nomad 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=9a3ae9f7-2ed3-c25c-12d9-d792452841d8 task=grafana-agent path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"] 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=9a3ae9f7-2ed3-c25c-12d9-d792452841d8 task=grafana-agent path=/usr/local/bin/nomad pid=11417 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=9a3ae9f7-2ed3-c25c-12d9-d792452841d8 task=grafana-agent type="Task Setup" msg="Building Task Directory" failed=false 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.runner_hook: received result from CNI: alloc_id=9a3ae9f7-2ed3-c25c-12d9-d792452841d8 
result="{\"Interfaces\":{\"eth0\":{\"IPConfigs\":[{\"IP\":\"172.26.66.66\",\"Gateway\":\"172.26.64.1\"}],\"Mac\":\"4a:88:c7:22:c5:a0\",\"Sandbox\":\"/var/run/docker/netns/539769fb44f1\"},\"nomad\":{\"IPConfigs\":null,\"Mac\":\"b6:77:25:d7:7f:65\",\"Sandbox\":\"\"},\"veth143a1ac9\":{\"IPConfigs\":null,\"Mac\":\"0e:d9:82:7c:61:7a\",\"Sandbox\":\"\"}},\"DNS\":[{}],\"Routes\":[{\"dst\":\"0.0.0.0/0\"}]}" 2023-05-11T11:35:58+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=9a3ae9f7-2ed3-c25c-12d9-d792452841d8 task=grafana-agent 2023-05-11T11:35:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="462.55ยตs" 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=8 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=7 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=9a3ae9f7-2ed3-c25c-12d9-d792452841d8 task=grafana-agent type=Received msg="Task received by client" failed=false 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=2 removed=0 updated=8 ignored=7 errors=0 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: 
alloc_id=b9bd1537-0bae-8c11-41b3-437a4c21df29 task=tempo type=Received msg="Task received by client" failed=false 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=2 removed=0 updated=8 ignored=7 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15428 total=17 pulled=10 filtered=7 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=e846d353-074d-9f2a-45bb-c6b5c56af0cd type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval="" 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) 
(ignore 1) (canary 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results= 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval="" 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker.service_sched.binpack: NewBinPackIterator created: eval_id=e846d353-074d-9f2a-45bb-c6b5c56af0cd job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker.service_sched: setting eval status: eval_id=e846d353-074d-9f2a-45bb-c6b5c56af0cd job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=b1829055-1a14-1f17-d428-13797dad7929 type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker.service_sched: reconciled current state with desired state: eval_id=e846d353-074d-9f2a-45bb-c6b5c56af0cd job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: 
worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=e846d353-074d-9f2a-45bb-c6b5c56af0cd type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker.service_sched: setting eval status: eval_id=b1829055-1a14-1f17-d428-13797dad7929 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=b1829055-1a14-1f17-d428-13797dad7929 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "tempo": (place 1) (inplace 0) (destructive 0) (stop 1) (migrate 0) (ignore 0) (canary 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana-agent": (place 1) (inplace 0) (destructive 0) (stop 1) (migrate 0) (ignore 0) (canary 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 2) (destructive 0) (inplace 0) (stop 
2) (disconnect 0) (reconnect 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results= 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad: evaluating plan: plan="(eval b1829055, job observability, NodeUpdates: (node[36d1fc65] (86dcc3e0 stop/evict) (a04015b3 stop/evict)), NodeAllocations: (node[36d1fc65] (b9bd1537 observability.tempo[0] run) (9a3ae9f7 observability.grafana-agent[0] run)))" 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker.service_sched: reconciled current state with desired state: eval_id=b1829055-1a14-1f17-d428-13797dad7929 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker.service_sched.binpack: NewBinPackIterator created: eval_id=b1829055-1a14-1f17-d428-13797dad7929 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=b1829055-1a14-1f17-d428-13797dad7929 type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft 2023-05-11T11:35:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft 
to=Scheduling
2023-05-11T11:35:57+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd
2023-05-11T11:35:57+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:35:56+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:1
2023-05-11T11:35:56+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:35:56+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="329.905µs"
2023-05-11T11:35:56+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.0766ms
2023-05-11T11:35:56+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:36844
2023-05-11T11:35:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:35:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:35:55+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="329.608µs"
2023-05-11T11:35:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:35:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused"
2023-05-11T11:35:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:35:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused"
2023-05-11T11:35:54+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:35:54+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=PUT path=/v1/client/allocation/57e22169-b9b1-7fc0-d7e9-615ff2623261/restart?namespace=default error="No path to node" code=404
2023-05-11T11:35:54+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=PUT path=/v1/client/allocation/57e22169-b9b1-7fc0-d7e9-615ff2623261/restart?namespace=default duration="361.551µs"
2023-05-11T11:35:54+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/job/security/deployments?all=true&namespace=default" duration="159.083µs"
2023-05-11T11:35:54+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/job/security/allocations?all=true&namespace=default" duration="247.003µs"
2023-05-11T11:35:54+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/job/security/versions?diffs=false&namespace=default" duration="855.473µs"
2023-05-11T11:35:54+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security?namespace=default duration="265.725µs"
2023-05-11T11:35:54+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/jobs?prefix=security duration="183.06µs"
2023-05-11T11:35:54+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="454.249µs"
2023-05-11T11:35:53+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="489.229µs"
2023-05-11T11:35:53+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused"
2023-05-11T11:35:53+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745
2023-05-11T11:35:52+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="365.657µs"
2023-05-11T11:35:51+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.703483ms
2023-05-11T11:35:51+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="333.978µs"
2023-05-11T11:35:50+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:35:50+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="321.612µs"
2023-05-11T11:35:49+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="470.394µs"
2023-05-11T11:35:48+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3
2023-05-11T11:35:48+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="326.385µs"
2023-05-11T11:35:48+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91
2023-05-11T11:35:48+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=6 ignored=9 errors=0
2023-05-11T11:35:48+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15427 total=15 pulled=6 filtered=9
2023-05-11T11:35:48+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=6 ignored=9
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=6
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=7
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=8 ignored=7 errors=0
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type=Killing msg="Sent interrupt. Waiting 5s before force killing" failed=false
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type=Killing msg="Sent interrupt. Waiting 5s before force killing" failed=false
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=8 ignored=7
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15425 total=15 pulled=8 filtered=7
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=6 ignored=9 errors=0
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15422 total=15 pulled=6 filtered=9
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=6 ignored=9
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=8dc7947d-4e68-8b3d-0fa6-8e074c04c2b9 type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=8dc7947d-4e68-8b3d-0fa6-8e074c04c2b9 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [🐞] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=8dc7947d-4e68-8b3d-0fa6-8e074c04c2b9
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [👀] nomad: evaluating plan: plan="(eval 8dc7947d, job observability, NodeAllocations: (node[36d1fc65] (86dcc3e0 observability.tempo[0] run) (a04015b3 observability.grafana-agent[0] run)))"
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: found reschedulable allocs, followup eval created: eval_id=8dc7947d-4e68-8b3d-0fa6-8e074c04c2b9 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 followup_eval_id=b1829055-1a14-1f17-d428-13797dad7929
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [🐞] worker: created evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [🐞] worker: created evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: found reschedulable allocs, followup eval created: eval_id=8dc7947d-4e68-8b3d-0fa6-8e074c04c2b9 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 followup_eval_id=e846d353-074d-9f2a-45bb-c6b5c56af0cd
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=8dc7947d-4e68-8b3d-0fa6-8e074c04c2b9 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=8dc7947d-4e68-8b3d-0fa6-8e074c04c2b9 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=8dc7947d-4e68-8b3d-0fa6-8e074c04c2b9 type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [🐞] nomad.client: adding evaluations for rescheduling failed allocations: num_evals=1
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: task run loop exiting: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) received finish
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) stopping
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin exited: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner.task_hook.logmon: plugin process exited: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo path=/usr/local/bin/nomad pid=6615
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.gc: marking allocation for GC: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) stopping watcher
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] agent: (watcher) stopping all views
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.stdio: received EOF, stopping recv loop: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.gc: marking allocation for GC: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin exited: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) stopping watcher
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] agent: (watcher) stopping all views
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner.task_hook.logmon: plugin process exited: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent path=/usr/local/bin/nomad pid=6621
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: task run loop exiting: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) stopping
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) received finish
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.stdio: received EOF, stopping recv loop: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type="Not Restarting" msg="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" failed=true
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:04523525f28246daa9a499f2dd136f356806e965ecdab5b5dc9b4d3eddcff2de references=0
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: not restarting task: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo reason="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\""
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: not restarting task: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent reason="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\""
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:9833434074df83909804de7688db8f00ef71ef5fc00eb115d1e2e41d21ece5cf references=0
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type="Not Restarting" msg="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" failed=true
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=10600
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=10585
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type=Terminated msg="Exit Code: 137, Exit Message: \"Docker container exited with non-zero exit code: 137\"" failed=false
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type=Terminated msg="Exit Code: 137, Exit Message: \"Docker container exited with non-zero exit code: 137\"" failed=false
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: stopped container: container_id=b66288be6bae0aa5594dfbbaed40277c48874bbef35aaa3a4e937774ce38118f driver=docker
2023-05-11T11:35:47+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: stopped container: container_id=c155a213a1563be2deb96a5d558719f7410118f3c3961144c6b55adc34145d55 driver=docker
2023-05-11T11:35:47+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd
2023-05-11T11:35:47+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="802.121µs"
2023-05-11T11:35:47+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:35:46+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:35:46+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.286938ms
2023-05-11T11:35:46+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="480.489µs"
2023-05-11T11:35:46+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:49156
2023-05-11T11:35:46+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="616.953µs"
2023-05-11T11:35:45+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:35:45+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:35:45+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:35:45+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused"
2023-05-11T11:35:45+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:35:45+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused"
2023-05-11T11:35:45+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="711.78µs"
2023-05-11T11:35:44+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:35:44+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="544.513µs"
2023-05-11T11:35:44+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=1.292585ms
2023-05-11T11:35:43+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused"
2023-05-11T11:35:43+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745
2023-05-11T11:35:42+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="694.683µs"
2023-05-11T11:35:42+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=6 ignored=9 errors=0
2023-05-11T11:35:42+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15421 total=15 pulled=6 filtered=9
2023-05-11T11:35:42+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=6 ignored=9
2023-05-11T11:35:42+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:35964: EOF
2023-05-11T11:35:42+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/allocation/7a915e46-a1f6-7969-6957-5e29cf7a40fa duration=1.041308ms
2023-05-11T11:35:42+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/allocation/7a915e46-a1f6-7969-6957-5e29cf7a40fa error="alloc not found" code=404
2023-05-11T11:35:42+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/allocation/2cd3d2bc-2c20-2aca-df8f-770f3883b52e duration="770.273µs"
2023-05-11T11:35:42+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/allocation/2cd3d2bc-2c20-2aca-df8f-770f3883b52e error="alloc not found" code=404
2023-05-11T11:35:42+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type="Restart Signaled" msg="healthcheck: check \"health\" unhealthy" failed=false
2023-05-11T11:35:42+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: restarting due to unhealthy check: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 check=health task=group-tempo
2023-05-11T11:35:42+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type="Restart Signaled" msg="healthcheck: check \"service: \\"grafana-agent-health\\" check\" unhealthy" failed=false
2023-05-11T11:35:42+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: restarting due to unhealthy check: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff check="service: \"grafana-agent-health\" check" task=group-grafana-agent
2023-05-11T11:35:42+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="501.095µs"
2023-05-11T11:35:41+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.295316ms
2023-05-11T11:35:41+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="360.239µs"
2023-05-11T11:35:40+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:35:40+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="356.428µs"
2023-05-11T11:35:40+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="220µs"
2023-05-11T11:35:39+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="373.631µs"
2023-05-11T11:35:39+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 check=minio-ready task=minio time_limit=20s
2023-05-11T11:35:39+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 check="service: \"minio-console\" check" task=minio time_limit=20s
2023-05-11T11:35:38+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="322.243µs"
2023-05-11T11:35:38+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3
2023-05-11T11:35:38+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="590.591µs"
2023-05-11T11:35:38+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91
2023-05-11T11:35:37+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="713.771µs"
2023-05-11T11:35:37+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd
2023-05-11T11:35:37+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:35:36+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:35:36+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="414.942µs"
2023-05-11T11:35:36+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.796968ms
2023-05-11T11:35:36+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="218.031µs"
2023-05-11T11:35:36+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security duration="260.101µs"
2023-05-11T11:35:36+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:33076
2023-05-11T11:35:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:35:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:35:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:35:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused"
2023-05-11T11:35:35+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="346.16µs"
2023-05-11T11:35:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:35:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused"
2023-05-11T11:35:34+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:35:34+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="242.07µs"
2023-05-11T11:35:34+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="353.367µs"
2023-05-11T11:35:33+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745
2023-05-11T11:35:33+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused"
2023-05-11T11:35:33+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="495.12µs"
2023-05-11T11:35:32+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="235.839µs"
2023-05-11T11:35:32+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="340.771µs"
2023-05-11T11:35:31+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.455924ms
2023-05-11T11:35:31+02:00 [nomad.service 💻 master-01] [👀] consul.sync: execute sync: reason=periodic
2023-05-11T11:35:31+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="313.758µs"
2023-05-11T11:35:30+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:35:30+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="216.097µs"
2023-05-11T11:35:30+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="314.329µs"
2023-05-11T11:35:29+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="341.95µs"
2023-05-11T11:35:28+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3
2023-05-11T11:35:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="202.385µs"
2023-05-11T11:35:28+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91
2023-05-11T11:35:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="341.007µs"
2023-05-11T11:35:27+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd
2023-05-11T11:35:27+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:35:26+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="393.758µs"
2023-05-11T11:35:26+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:35:26+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.557355ms
2023-05-11T11:35:26+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="280.392µs"
2023-05-11T11:35:26+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:40552
2023-05-11T11:35:25+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:35:25+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=3.737672ms
2023-05-11T11:35:25+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:35:25+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused"
2023-05-11T11:35:25+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:35:25+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused"
2023-05-11T11:35:25+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:35:24+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:35:24+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="610.395µs"
2023-05-11T11:35:24+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="563.149µs"
2023-05-11T11:35:23+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=1m2.9868059s
2023-05-11T11:35:23+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="583.104µs"
2023-05-11T11:35:23+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745
2023-05-11T11:35:23+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused"
2023-05-11T11:35:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="661.2µs"
2023-05-11T11:35:22+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff check="service: \"grafana-agent-ready\" check" task=group-grafana-agent time_limit=40s
2023-05-11T11:35:22+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff check="service: \"grafana-agent-health\" check" task=group-grafana-agent time_limit=20s
2023-05-11T11:35:22+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: restarting due to unhealthy check: alloc_id=54969951-d541-ae97-922a-7db38096bae5 check="service: \"nats-prometheus-exporter\" check" task=group-nats
2023-05-11T11:35:22+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 check=health task=group-tempo time_limit=20s
2023-05-11T11:35:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="644.474µs"
2023-05-11T11:35:22+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/observability?index=3271&namespace=default&stale=&wait=60000ms" duration=1m0.668508509s
2023-05-11T11:35:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/allocation/57e22169-b9b1-7fc0-d7e9-615ff2623261?index=1 duration="762.463µs"
2023-05-11T11:35:22+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:46636: EOF
2023-05-11T11:35:22+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/allocation/7a915e46-a1f6-7969-6957-5e29cf7a40fa error="alloc not found" code=404
2023-05-11T11:35:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/allocation/7a915e46-a1f6-7969-6957-5e29cf7a40fa duration="372.41µs"
2023-05-11T11:35:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/allocation/2cd3d2bc-2c20-2aca-df8f-770f3883b52e duration=1.288569ms
2023-05-11T11:35:22+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/allocation/2cd3d2bc-2c20-2aca-df8f-770f3883b52e error="alloc not found" code=404
2023-05-11T11:35:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/allocation/8f3fb4a6-629a-7afd-a334-5580bf2d3374?index=1 duration="595.911µs"
2023-05-11T11:35:21+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.497462ms
2023-05-11T11:35:21+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="314.219µs"
2023-05-11T11:35:20+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:35:20+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="385.529µs"
2023-05-11T11:35:20+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="256.598µs"
2023-05-11T11:35:19+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="466.95µs"
2023-05-11T11:35:18+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3
2023-05-11T11:35:18+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="342.371µs"
2023-05-11T11:35:18+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="198.53µs"
2023-05-11T11:35:18+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical:
check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91
2023-05-11T11:35:17+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="484.32µs"
2023-05-11T11:35:17+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd
2023-05-11T11:35:17+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:35:16+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:43214: EOF
2023-05-11T11:35:16+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:35:16+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:43210: EOF
2023-05-11T11:35:16+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/scale duration="621.854µs"
2023-05-11T11:35:16+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations duration="706.475µs"
2023-05-11T11:35:16+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/namespaces duration=1.912959ms
2023-05-11T11:35:16+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?meta=true&namespace=default" duration=1.622212ms
2023-05-11T11:35:16+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security duration="598.944µs"
2023-05-11T11:35:16+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.78884ms
2023-05-11T11:35:16+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="554.472µs"
2023-05-11T11:35:16+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="709.645µs"
2023-05-11T11:35:16+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:54312
2023-05-11T11:35:15+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:35:15+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:35:15+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused"
2023-05-11T11:35:15+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:35:15+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused"
2023-05-11T11:35:15+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:35:15+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="664.288µs"
2023-05-11T11:35:14+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:43196: EOF
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/857749e0-52fe-92ee-7bef-fafbe67605ee/stats duration=2.257349ms
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats duration="874.604µs"
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/client/allocation/e5fea251-9d66-4cf4-2090-83ae30046fb1/stats error="No path to node" code=404
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/node/36d1fc65-c097-97bc-18ac-079c1262ccfd duration="734µs"
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/node/e2eb7460-2bca-ac62-5c53-999281062667 duration="450.008µs"
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/node/f652ee64-d508-464f-bfb5-d1a36ac8f3d9 duration="603.825µs"
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:43184: EOF
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/deployment duration="165.507µs"
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:43178: EOF
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/vars?prefix=nomad%2Fjobs%2Fsecurity" duration="606.439µs"
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/summary?index=1 duration="343.495µs"
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/deployment?index=1 duration="236.712µs"
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:43122: EOF
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:43126: EOF
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:43120: EOF
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/namespaces duration="215.001µs"
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?meta=true&namespace=default" duration="261.786µs"
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/evaluations duration="351.017µs"
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security/allocations duration="475.926µs"
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/job/security duration="374.562µs"
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="224.327µs"
2023-05-11T11:35:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="324.098µs"
2023-05-11T11:35:13+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/minio?index=10463&namespace=default&stale=&wait=60000ms" duration=1m1.090688766s
2023-05-11T11:35:13+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused"
2023-05-11T11:35:13+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745
2023-05-11T11:35:13+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="345.683µs"
2023-05-11T11:35:12+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy.
Will restart if check doesn't become healthy: alloc_id=54969951-d541-ae97-922a-7db38096bae5 check="service: \"nats-prometheus-exporter\" check" task=group-nats time_limit=10s
2023-05-11T11:35:12+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=54969951-d541-ae97-922a-7db38096bae5 check="service: \"nats\" check" task=group-nats time_limit=20s
2023-05-11T11:35:12+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration=1.716139ms
2023-05-11T11:35:12+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="656.25µs"
2023-05-11T11:35:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.616876ms
2023-05-11T11:35:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?meta=true&index=15356" duration="522.427µs"
2023-05-11T11:35:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="728.242µs"
2023-05-11T11:35:10+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:35:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="550.599µs"
2023-05-11T11:35:09+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="700.429µs"
2023-05-11T11:35:08+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="755.483µs"
2023-05-11T11:35:08+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3
2023-05-11T11:35:08+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="482.2µs"
2023-05-11T11:35:08+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91
2023-05-11T11:35:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="590.944µs"
2023-05-11T11:35:07+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd
2023-05-11T11:35:07+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:35:06+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:35:06+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="327.994µs"
2023-05-11T11:35:06+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.367922ms
2023-05-11T11:35:06+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="221.577µs"
2023-05-11T11:35:06+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:46920
2023-05-11T11:35:05+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:35:05+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:35:05+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="535.097µs"
2023-05-11T11:35:05+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:35:05+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused"
2023-05-11T11:35:05+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused"
2023-05-11T11:35:05+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:35:04+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:35:04+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="326.439µs"
2023-05-11T11:35:04+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="229.703µs"
2023-05-11T11:35:03+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="369.343µs"
2023-05-11T11:35:03+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745
2023-05-11T11:35:03+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused"
2023-05-11T11:35:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="319.8µs"
2023-05-11T11:35:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="237.027µs"
2023-05-11T11:35:01+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.606134ms
2023-05-11T11:35:01+02:00 [nomad.service 💻 master-01] [👀] consul.sync: execute sync: reason=periodic
2023-05-11T11:35:01+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="350.394µs"
2023-05-11T11:35:01+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/job/security/allocations?all=true&namespace=default" duration=2.327769ms
2023-05-11T11:35:00+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:35:00+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="598.737µs"
2023-05-11T11:34:59+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="631.058µs"
2023-05-11T11:34:59+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/job/security/allocations?all=true&namespace=default" duration="562.762µs"
2023-05-11T11:34:58+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3
2023-05-11T11:34:58+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="674.382µs"
2023-05-11T11:34:58+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91
2023-05-11T11:34:57+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd
2023-05-11T11:34:57+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=2.75209ms
2023-05-11T11:34:57+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:34:57+02:00
[nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/job/security/allocations?all=true&namespace=default" duration="899.714µs"
2023-05-11T11:34:56+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:34:56+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.55324ms
2023-05-11T11:34:56+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:58062
2023-05-11T11:34:56+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="328.59µs"
2023-05-11T11:34:55+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="215.073µs"
2023-05-11T11:34:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:34:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:34:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused"
2023-05-11T11:34:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:34:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:34:55+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused"
2023-05-11T11:34:54+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="341.235µs"
2023-05-11T11:34:54+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:34:53+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?namespace=%2A" duration="213.38µs"
2023-05-11T11:34:53+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="353.907µs"
2023-05-11T11:34:53+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused"
2023-05-11T11:34:53+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745
2023-05-11T11:34:52+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="586.148µs"
2023-05-11T11:34:51+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.528937ms
2023-05-11T11:34:51+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="347.524µs"
2023-05-11T11:34:50+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:34:50+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="333.45µs"
2023-05-11T11:34:49+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="432.615µs"
2023-05-11T11:34:49+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=6 ignored=9 errors=0
2023-05-11T11:34:49+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=6 ignored=9
2023-05-11T11:34:49+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15419 total=15 pulled=6 filtered=9
2023-05-11T11:34:48+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3
2023-05-11T11:34:48+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="357.5µs"
2023-05-11T11:34:48+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91
2023-05-11T11:34:47+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="409.694µs"
2023-05-11T11:34:47+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd
2023-05-11T11:34:47+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:34:46+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:34:46+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.524942ms
2023-05-11T11:34:46+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=3.841833ms
2023-05-11T11:34:46+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:50884
2023-05-11T11:34:45+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:34:45+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:34:45+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused"
2023-05-11T11:34:45+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:34:45+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused"
2023-05-11T11:34:45+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:34:45+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="687.633µs"
2023-05-11T11:34:44+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:34:44+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="699.886µs"
2023-05-11T11:34:43+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused"
2023-05-11T11:34:43+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745
2023-05-11T11:34:43+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="748.179µs"
2023-05-11T11:34:41+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=3.764725ms
2023-05-11T11:34:41+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self
duration=2.369431ms
2023-05-11T11:34:40+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="618.079µs"
2023-05-11T11:34:40+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:34:39+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="587.499µs"
2023-05-11T11:34:39+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: canceling restart because check became healthy: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 check=health task=group-loki
2023-05-11T11:34:39+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=6 ignored=9 errors=0
2023-05-11T11:34:39+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=6 ignored=9
2023-05-11T11:34:39+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15418 total=15 pulled=6 filtered=9
2023-05-11T11:34:39+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=2 deregistered_services=0 registered_checks=0 deregistered_checks=0
2023-05-11T11:34:39+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-a37da363-9048-86d5-93e1-d6facf1490b1-minio-minio-http
2023-05-11T11:34:39+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-a37da363-9048-86d5-93e1-d6facf1490b1-minio-minio-console-console
2023-05-11T11:34:39+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio type=Started msg="Task started by client" failed=false
2023-05-11T11:34:39+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:34:39.057Z
2023-05-11T11:34:39+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin3603184325 network=unix timestamp=2023-05-11T09:34:39.055Z
2023-05-11T11:34:39+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:34:39+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=10717
2023-05-11T11:34:39+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:34:39+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:34:39+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=87544ecc56e51c7b778bce62ebcebb7ede77d2edb55a51a18761935953107a17
2023-05-11T11:34:38+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=87544ecc56e51c7b778bce62ebcebb7ede77d2edb55a51a18761935953107a17
2023-05-11T11:34:38+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=minio container_name=minio-a37da363-9048-86d5-93e1-d6facf1490b1
2023-05-11T11:34:38+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=minio labels="map[com.hashicorp.nomad.alloc_id:a37da363-9048-86d5-93e1-d6facf1490b1 com.hashicorp.nomad.job_id:minio com.hashicorp.nomad.job_name:minio com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:minio com.hashicorp.nomad.task_name:minio]"
2023-05-11T11:34:38+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=minio binds="[]string{\"/opt/services/core/nomad/data/alloc/a37da363-9048-86d5-93e1-d6facf1490b1/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/a37da363-9048-86d5-93e1-d6facf1490b1/minio/local:/local\", \"/opt/services/core/nomad/data/alloc/a37da363-9048-86d5-93e1-d6facf1490b1/minio/secrets:/secrets\"}"
2023-05-11T11:34:38+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/minio/minio:RELEASE.2023-04-28T18-11-17Z image_id=sha256:5ba81f3dad7fb4d608d375ec64cac33fcb196e0ed530be35e002177639b11d21 references=1
2023-05-11T11:34:38+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=minio network_mode=container:8fd2dea07fb2c7b0e687c8c4abecf5c81db4d9c16eb9b72e035d1b495d302652
2023-05-11T11:34:38+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=minio memory=4294967296 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0
2023-05-11T11:34:38+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container startup command: driver=docker task_name=minio command="server /data --console-address :9001 --certs-dir /certs"
2023-05-11T11:34:38+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=registry.cloud.private/minio/minio:RELEASE.2023-04-28T18-11-17Z
2023-05-11T11:34:38+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio @module=logmon path=/opt/services/core/nomad/data/alloc/a37da363-9048-86d5-93e1-d6facf1490b1/alloc/logs/.minio.stderr.fifo timestamp=2023-05-11T09:34:38.854Z
2023-05-11T11:34:38+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio @module=logmon path=/opt/services/core/nomad/data/alloc/a37da363-9048-86d5-93e1-d6facf1490b1/alloc/logs/.minio.stdout.fifo timestamp=2023-05-11T09:34:38.854Z
2023-05-11T11:34:38+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio
2023-05-11T11:34:38+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-a995356b87d403bba9e4d5c8d4334eb75abc1c70
2023-05-11T11:34:38+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="606.994µs"
2023-05-11T11:34:38+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=6 ignored=9 errors=0
2023-05-11T11:34:38+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=6 ignored=9
2023-05-11T11:34:38+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15417 total=15 pulled=6 filtered=9
2023-05-11T11:34:37+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="499.124µs"
2023-05-11T11:34:37+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:34:36+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:34:36+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="505.314µs"
2023-05-11T11:34:36+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.098397ms
2023-05-11T11:34:36+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:55682
2023-05-11T11:34:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:34:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:34:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:34:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused"
2023-05-11T11:34:35+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="484.916µs"
2023-05-11T11:34:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused"
2023-05-11T11:34:35+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:34:34+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:34:34+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="615.459µs"
2023-05-11T11:34:33+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=6 ignored=9 errors=0
2023-05-11T11:34:33+02:00 [nomad.service 💻 worker-01] [🐞] client:
allocation updates: added=0 removed=0 updated=6 ignored=9 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15416 total=15 pulled=6 filtered=9 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=6 ignored=9 errors=0 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=6 ignored=9 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15415 total=15 pulled=6 filtered=9 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type=Started msg="Task started by client" failed=false 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:34:33.542Z 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin2364851827 network=unix timestamp=2023-05-11T09:34:33.539Z 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type=Started msg="Task started by client" failed=false 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:34:33.529Z 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] 
client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin3088852503 network=unix timestamp=2023-05-11T09:34:33.527Z 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=10600 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=b66288be6bae0aa5594dfbbaed40277c48874bbef35aaa3a4e937774ce38118f 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=c155a213a1563be2deb96a5d558719f7410118f3c3961144c6b55adc34145d55 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=10585 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="365.748ยตs" 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป 
worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=b66288be6bae0aa5594dfbbaed40277c48874bbef35aaa3a4e937774ce38118f 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/grafana/tempo:2.1.1 image_id=sha256:04523525f28246daa9a499f2dd136f356806e965ecdab5b5dc9b4d3eddcff2de references=1 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=tempo network_mode=container:17f8d4850acbf1afd975993a1d243d3ad4eb270abe7bf87f5c9d2069d0e3e1ea 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=tempo memory=34359738368 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=tempo labels="map[com.github.logunifier.application.name:tempo com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:2.1.1 com.hashicorp.nomad.alloc_id:86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:tempo com.hashicorp.nomad.task_name:tempo]" 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=registry.cloud.private/grafana/tempo:2.1.1 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=tempo 
binds="[]string{\"/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/tempo/local:/local\", \"/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/tempo/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/tempo/local/tempo.yaml:/config/tempo.yaml\"}" 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=tempo container_name=tempo-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo @module=logmon path=/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/alloc/logs/.tempo.stderr.fifo timestamp=2023-05-11T09:34:33.389Z 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo @module=logmon path=/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/alloc/logs/.tempo.stdout.fifo timestamp=2023-05-11T09:34:33.389Z 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo 2023-05-11T11:34:33+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 2023-05-11T11:34:33+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused" 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] 
client.driver_mgr.docker: created container: driver=docker container_id=c155a213a1563be2deb96a5d558719f7410118f3c3961144c6b55adc34145d55 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=grafana-agent container_name=grafana-agent-a04015b3-dc90-7f18-8bfd-c1cf7bc37eff 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=grafana-agent memory=2147483648 memory_reservation=67108864 cpu_shares=100 cpu_quota=0 cpu_period=0 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=grafana-agent network_mode=container:e2f503486651f498010ccfecea17ff3e5b17e0ac02020372de2eacae60403d04 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=grafana-agent binds="[]string{\"/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/grafana-agent/local:/local\", \"/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/grafana-agent/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/grafana-agent/local/agent.yaml:/config/agent.yaml\"}" 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=grafana-agent labels="map[com.github.logunifier.application.name:grafana_agent com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:0.33.1 com.hashicorp.nomad.alloc_id:a04015b3-dc90-7f18-8bfd-c1cf7bc37eff com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default 
com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:grafana-agent com.hashicorp.nomad.task_name:grafana-agent]" 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=registry.cloud.private/grafana/agent:v0.33.1 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/grafana/agent:v0.33.1 image_id=sha256:9833434074df83909804de7688db8f00ef71ef5fc00eb115d1e2e41d21ece5cf references=1 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent @module=logmon path=/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/alloc/logs/.grafana-agent.stderr.fifo timestamp=2023-05-11T09:34:33.357Z 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent @module=logmon path=/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/alloc/logs/.grafana-agent.stdout.fifo timestamp=2023-05-11T09:34:33.356Z 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=6 ignored=9 errors=0 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15414 total=15 pulled=6 filtered=9 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation 
updates: added=0 removed=0 updated=6 ignored=9 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:5ba81f3dad7fb4d608d375ec64cac33fcb196e0ed530be35e002177639b11d21 references=0 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio type=Restarting msg="Task restarting in 5.850956292s" failed=false 2023-05-11T11:34:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: restarting task: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio reason="Restart within policy" delay=5.850956292s 2023-05-11T11:34:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:34:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=7658 2023-05-11T11:34:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:34:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio type=Terminated msg="Exit Code: 0" failed=false 2023-05-11T11:34:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: stopped container: container_id=547d7e83b2604951f51a6692156f9ff3f680c1fed5107911e46620c056210788 driver=docker 2023-05-11T11:34:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=6 ignored=9 errors=0 2023-05-11T11:34:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15413 total=15 pulled=6 filtered=9 2023-05-11T11:34:32+02:00 
[nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=6 ignored=9 2023-05-11T11:34:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] consul.sync: sync complete: registered_services=0 deregistered_services=2 registered_checks=0 deregistered_checks=0 2023-05-11T11:34:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Deregistered service: service=_nomad-task-a37da363-9048-86d5-93e1-d6facf1490b1-minio-minio-http 2023-05-11T11:34:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Deregistered service: service=_nomad-task-a37da363-9048-86d5-93e1-d6facf1490b1-minio-minio-console-console 2023-05-11T11:34:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio type="Restart Signaled" msg="healthcheck: check \"minio-ready\" unhealthy" failed=false 2023-05-11T11:34:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] watch.checks: restarting due to unhealthy check: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 check=minio-ready task=minio 2023-05-11T11:34:32+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="359.172ยตs" 2023-05-11T11:34:31+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=7.86193ms 2023-05-11T11:34:31+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=_nomad-check-3e667fb12f8585b4515bd5b571a2693af6e10509 2023-05-11T11:34:31+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] consul.sync: execute sync: reason=periodic 2023-05-11T11:34:31+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="661.087ยตs" 2023-05-11T11:34:30+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:34:30+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: 
method=GET path=/v1/agent/self duration="680.663ยตs" 2023-05-11T11:34:30+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd 2023-05-11T11:34:29+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="718.634ยตs" 2023-05-11T11:34:28+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-a995356b87d403bba9e4d5c8d4334eb75abc1c70 2023-05-11T11:34:28+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path="/v1/jobs?meta=true&index=15406" duration="633.222ยตs" 2023-05-11T11:34:28+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=_nomad-check-585e0fa8b45a978488bdb9d7972f10ba290f9e97 2023-05-11T11:34:28+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="686.617ยตs" 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=6 ignored=9 errors=0 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=6 ignored=9 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15412 total=15 pulled=6 filtered=9 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: restarting task: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo reason="Restart within policy" delay=5.850956292s 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type=Restarting msg="Task restarting in 5.850956292s" failed=false 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker 
image_id=sha256:04523525f28246daa9a499f2dd136f356806e965ecdab5b5dc9b4d3eddcff2de references=0 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=9788 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type=Terminated msg="Exit Code: 137, Exit Message: \"Docker container exited with non-zero exit code: 137\"" failed=false 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: stopped container: container_id=59659a67b276c5c640ec0b4d272b8f1f7238a66fea0fed4b912700b2a311b2dd driver=docker 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type=Restarting msg="Task restarting in 5.850956292s" failed=false 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: restarting task: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent reason="Restart within policy" delay=5.850956292s 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:9833434074df83909804de7688db8f00ef71ef5fc00eb115d1e2e41d21ece5cf references=0 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:34:27+02:00 [nomad.service 
๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=9540 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type=Terminated msg="Exit Code: 137, Exit Message: \"Docker container exited with non-zero exit code: 137\"" failed=false 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: stopped container: container_id=d4ce39238d03d8ca97a7a33b56329e696ed2fbe84b50b269e79c4bff78fb95c1 driver=docker 2023-05-11T11:34:27+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04 2023-05-11T11:34:27+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="396.986ยตs" 2023-05-11T11:34:26+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5 2023-05-11T11:34:26+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91 2023-05-11T11:34:26+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3 2023-05-11T11:34:26+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.46725ms 2023-05-11T11:34:26+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad: memberlist: Stream connection from=127.0.0.1:33876 2023-05-11T11:34:25+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET 
path=/v1/agent/self duration="357.701ยตs" 2023-05-11T11:34:25+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b 2023-05-11T11:34:25+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:34:25+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 2023-05-11T11:34:25+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused" 2023-05-11T11:34:25+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused" 2023-05-11T11:34:25+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f 2023-05-11T11:34:24+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea 2023-05-11T11:34:24+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="312.843ยตs" 2023-05-11T11:34:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] watch.checks: check became unhealthy. 
Will restart if check doesn't become healthy: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 check=health task=group-loki time_limit=20s 2023-05-11T11:34:24+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=_nomad-check-f2cfef6fca0c54054488693ca23684fc092ffa40 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="370.6ยตs" 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=6 ignored=9 errors=0 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=6 ignored=9 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15411 total=15 pulled=6 filtered=9 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=6 ignored=9 errors=0 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=6 ignored=9 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15410 total=15 pulled=6 filtered=9 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats-prometheus-exporter type=Started msg="Task started by client" failed=false 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:34:23.496Z 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin1568756249 network=unix timestamp=2023-05-11T09:34:23.495Z 2023-05-11T11:34:23+02:00 [nomad.service 
๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=10357 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=115e3832c9d37d6727e523ea06498b6343388b7a12b668f94dcd72d03794519d 2023-05-11T11:34:23+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused" 2023-05-11T11:34:23+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=115e3832c9d37d6727e523ea06498b6343388b7a12b668f94dcd72d03794519d 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=nats-prometheus-exporter binds="[]string{\"/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/nats-prometheus-exporter/local:/local\", \"/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/nats-prometheus-exporter/secrets:/secrets\"}" 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] 
client.driver_mgr.docker: applied labels on the container: driver=docker task_name=nats-prometheus-exporter labels="map[com.github.logunifier.application.name:prometheus-nats-exporter com.github.logunifier.application.pattern.key:tslevelmsg com.github.logunifier.application.version:0.11.0.0 com.hashicorp.nomad.alloc_id:4d92967c-5996-752c-1cac-6f079b2c8099 com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:nats com.hashicorp.nomad.task_name:nats-prometheus-exporter]" 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=nats-prometheus-exporter memory=314572800 memory_reservation=0 cpu_shares=100 cpu_quota=0 cpu_period=0 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=nats-prometheus-exporter network_mode=container:6eeb1f91e3c0a14b3a24502cb2c68e7f2d5471b327e4d46b1cf9bfe6ebb4169f 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=nats-prometheus-exporter container_name=nats-prometheus-exporter-4d92967c-5996-752c-1cac-6f079b2c8099 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=registry.cloud.private/natsio/prometheus-nats-exporter:0.11.0 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/natsio/prometheus-nats-exporter:0.11.0 image_id=sha256:e5358311d02ae05b73d37045f1ce747a2088c015d4458aa90221d4f04f71ed07 references=1 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป 
worker-01] [โš ] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats-prometheus-exporter path=/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/nats-prometheus-exporter/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/nats-prometheus-exporter/secrets/api.sock: bind: invalid argument" 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats-prometheus-exporter @module=logmon path=/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/alloc/logs/.nats-prometheus-exporter.stderr.fifo timestamp=2023-05-11T09:34:23.350Z 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats-prometheus-exporter @module=logmon path=/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/alloc/logs/.nats-prometheus-exporter.stdout.fifo timestamp=2023-05-11T09:34:23.349Z 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats-prometheus-exporter @module=logmon address=/tmp/plugin656198134 network=unix timestamp=2023-05-11T09:34:23.348Z 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats-prometheus-exporter version=2 2023-05-11T11:34:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats-prometheus-exporter path=/usr/local/bin/nomad pid=10299 
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats-prometheus-exporter path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats-prometheus-exporter path=/usr/local/bin/nomad
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats-prometheus-exporter type="Task Setup" msg="Building Task Directory" failed=false
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats-prometheus-exporter
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats type=Started msg="Task started by client" failed=false
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:34:23.319Z
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin182107622 network=unix timestamp=2023-05-11T09:34:23.316Z
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=10286
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=6 ignored=9 errors=0
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=89cfc46dc07ba28191443322cee55d9e9afeb3fdfb02182b95793f1dabb3656d
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=6 ignored=9
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15409 total=15 pulled=6 filtered=9
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=89cfc46dc07ba28191443322cee55d9e9afeb3fdfb02182b95793f1dabb3656d
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=nats binds="[]string{\"/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/nats/local:/local\", \"/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/nats/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/nats/local/nats.conf:/config/nats.conf\"}"
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/nats:2.9.16-alpine image_id=sha256:657fde4007c4b6834917360e99c6d1d2aba8008f86063236cf1fafb1ac022404 references=1
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=registry.cloud.private/nats:2.9.16-alpine
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=nats container_name=nats-4d92967c-5996-752c-1cac-6f079b2c8099
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=nats memory=34359738368 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=nats labels="map[com.github.logunifier.application.name:nats com.github.logunifier.application.pattern.key:tslevelmsg com.github.logunifier.application.version:2.9.16 com.hashicorp.nomad.alloc_id:4d92967c-5996-752c-1cac-6f079b2c8099 com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:nats com.hashicorp.nomad.task_name:nats]"
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=nats network_mode=container:6eeb1f91e3c0a14b3a24502cb2c68e7f2d5471b327e4d46b1cf9bfe6ebb4169f
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) all templates rendered
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/nats/local/nats.conf"
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 0 dependencies
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":"# Client port of ++ env \"NOMAD_PORT_client\" ++ on all interfaces\nport: ++ env \"NOMAD_PORT_client\" ++\n\n# HTTP monitoring port\nmonitor_port: ++ env \"NOMAD_PORT_http\" ++\nserver_name: \"++ env \"NOMAD_ALLOC_NAME\" ++\"\n#If true enable protocol trace log messages. Excludes the system account.\ntrace: false\n#If true enable protocol trace log messages. Includes the system account.\ntrace_verbose: false\n#if true enable debug log messages\ndebug: false\nhttp_port: ++ env \"NOMAD_PORT_http\" ++\n#http: nats.service.consul:++ env \"NOMAD_PORT_http\" ++\n\njetstream {\n store_dir: /data/jetstream\n\n # 1GB\n max_memory_store: 2G\n\n # 10GB\n max_file_store: 10G\n}\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/nats/local/nats.conf","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"++","RightDelim":"++","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/nats"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000}
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template 8010a1378d33c582bd83aa917c8fbe05
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) starting
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating watcher
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/nats/local/nats.conf"
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) running initial templates
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating new runner (dry: false, once: false)
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats @module=logmon path=/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/alloc/logs/.nats.stdout.fifo timestamp=2023-05-11T09:34:23.168Z
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats @module=logmon path=/opt/services/core/nomad/data/alloc/4d92967c-5996-752c-1cac-6f079b2c8099/alloc/logs/.nats.stderr.fifo timestamp=2023-05-11T09:34:23.168Z
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats version=2
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats @module=logmon address=/tmp/plugin1205918543 network=unix timestamp=2023-05-11T09:34:23.165Z
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=2 deregistered_services=0 registered_checks=0 deregistered_checks=0
2023-05-11T11:34:23+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-4d92967c-5996-752c-1cac-6f079b2c8099-group-nats-nats-prometheus-exporter-prometheus-exporter
2023-05-11T11:34:23+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-4d92967c-5996-752c-1cac-6f079b2c8099-group-nats-nats-client
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats path=/usr/local/bin/nomad pid=10228
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats path=/usr/local/bin/nomad
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats type="Task Setup" msg="Building Task Directory" failed=false
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats
2023-05-11T11:34:23+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.runner_hook: received result from CNI: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 result="{\"Interfaces\":{\"eth0\":{\"IPConfigs\":[{\"IP\":\"172.26.66.65\",\"Gateway\":\"172.26.64.1\"}],\"Mac\":\"66:0e:5c:e6:1a:dd\",\"Sandbox\":\"/var/run/docker/netns/ad9224d4affb\"},\"nomad\":{\"IPConfigs\":null,\"Mac\":\"b6:77:25:d7:7f:65\",\"Sandbox\":\"\"},\"vethe860b880\":{\"IPConfigs\":null,\"Mac\":\"46:9b:78:ba:f3:3d\",\"Sandbox\":\"\"}},\"DNS\":[{}],\"Routes\":[{\"dst\":\"0.0.0.0/0\"}]}"
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="353.039µs"
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=6 ignored=9 errors=0
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=6 ignored=9
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15408 total=15 pulled=6 filtered=9
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=8
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_migrator: waiting for previous alloc to terminate: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 previous_alloc=54969951-d541-ae97-922a-7db38096bae5
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_migrator: waiting for previous alloc to terminate: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 previous_alloc=54969951-d541-ae97-922a-7db38096bae5
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=b7a7e205-a1b7-170e-9e2a-b99702e04664 type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?meta=true&index=15401" duration=3.069868671s
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats type=Received msg="Task received by client" failed=false
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=1 removed=0 updated=6 ignored=8 errors=0
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=b7a7e205-a1b7-170e-9e2a-b99702e04664
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=b7a7e205-a1b7-170e-9e2a-b99702e04664 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path="/v1/jobs?meta=true&index=15401" duration=208.621723ms
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=4d92967c-5996-752c-1cac-6f079b2c8099 task=nats-prometheus-exporter type=Received msg="Task received by client" failed=false
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "nats": (place 1) (inplace 0) (destructive 0) (stop 1) (migrate 0) (ignore 0) (canary 0)
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 1) (destructive 0) (inplace 0) (stop 1) (disconnect 0) (reconnect 0)
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [👀] nomad: evaluating plan: plan="(eval b7a7e205, job observability, NodeUpdates: (node[36d1fc65] (54969951 stop/evict)), NodeAllocations: (node[36d1fc65] (4d92967c observability.nats[0] run)))"
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=b7a7e205-a1b7-170e-9e2a-b99702e04664 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=b7a7e205-a1b7-170e-9e2a-b99702e04664 type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=b7a7e205-a1b7-170e-9e2a-b99702e04664 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=1 removed=0 updated=6 ignored=8
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15406 total=15 pulled=7 filtered=8
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=5 deregistered_services=0 registered_checks=0 deregistered_checks=0
2023-05-11T11:34:22+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075-group-tempo-tempo-otlp-grpc-otlp_grpc
2023-05-11T11:34:22+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075-group-tempo-tempo-tempo
2023-05-11T11:34:22+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075-group-tempo-tempo-otlp-http-otlp_http
2023-05-11T11:34:22+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075-group-tempo-tempo-jaeger-jaeger
2023-05-11T11:34:22+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075-group-tempo-tempo-zipkin-zipkin
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=0 deregistered_services=5 registered_checks=0 deregistered_checks=0
2023-05-11T11:34:22+02:00 [consul.service 💻 worker-01] [✅] agent: Deregistered service: service=_nomad-task-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075-group-tempo-tempo-zipkin-zipkin
2023-05-11T11:34:22+02:00 [consul.service 💻 worker-01] [✅] agent: Deregistered service: service=_nomad-task-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075-group-tempo-tempo-tempo
2023-05-11T11:34:22+02:00 [consul.service 💻 worker-01] [✅] agent: Deregistered service: service=_nomad-task-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075-group-tempo-tempo-otlp-http-otlp_http
2023-05-11T11:34:22+02:00 [consul.service 💻 worker-01] [✅] agent: Deregistered service: service=_nomad-task-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075-group-tempo-tempo-otlp-grpc-otlp_grpc
2023-05-11T11:34:22+02:00 [consul.service 💻 worker-01] [✅] agent: Deregistered service: service=_nomad-task-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075-group-tempo-tempo-jaeger-jaeger
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=2 deregistered_services=0 registered_checks=0 deregistered_checks=0
2023-05-11T11:34:22+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-a04015b3-dc90-7f18-8bfd-c1cf7bc37eff-group-grafana-agent-grafana-agent-ready-server
2023-05-11T11:34:22+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-a04015b3-dc90-7f18-8bfd-c1cf7bc37eff-group-grafana-agent-grafana-agent-health-server
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=0 deregistered_services=2 registered_checks=0 deregistered_checks=0
2023-05-11T11:34:22+02:00 [consul.service 💻 worker-01] [✅] agent: Deregistered service: service=_nomad-task-a04015b3-dc90-7f18-8bfd-c1cf7bc37eff-group-grafana-agent-grafana-agent-health-server
2023-05-11T11:34:22+02:00 [consul.service 💻 worker-01] [✅] agent: Deregistered service: service=_nomad-task-a04015b3-dc90-7f18-8bfd-c1cf7bc37eff-group-grafana-agent-grafana-agent-ready-server
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type="Restart Signaled" msg="healthcheck: check \"health\" unhealthy" failed=false
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: restarting due to unhealthy check: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff check="service: \"grafana-agent-health\" check" task=group-grafana-agent
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type="Restart Signaled" msg="healthcheck: check \"service: \\"grafana-agent-health\\" check\" unhealthy" failed=false
2023-05-11T11:34:22+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: restarting due to unhealthy check: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 check=health task=group-tempo
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/namespaces duration="214.331µs"
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=1.589137ms
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/namespaces duration="739.166µs"
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/jobs?meta=true duration="560.781µs"
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:52452: EOF
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:52454: EOF
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:52458: EOF
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/members duration="369.233µs"
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/operator/license duration="3.736µs"
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/acl/token/self duration="205.274µs"
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/acl/token/self error="RPC Error:: 400,ACL support disabled" code=400
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=POST path=/v1/search/fuzzy duration=2.132233ms
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/regions duration="163.447µs"
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:52414: EOF
2023-05-11T11:34:22+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:52412: EOF
2023-05-11T11:34:21+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=4.864614ms
2023-05-11T11:34:21+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="520.757µs"
2023-05-11T11:34:21+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/observability?index=3271&namespace=default&stale=&wait=60000ms" duration=1m0.903576321s
2023-05-11T11:34:20+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=1m0.037509967s
2023-05-11T11:34:20+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:34:20+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="368.848µs"
2023-05-11T11:34:20+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused"
2023-05-11T11:34:20+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745
2023-05-11T11:34:20+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd
2023-05-11T11:34:19+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:34:19+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="336.221µs"
2023-05-11T11:34:18+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-a995356b87d403bba9e4d5c8d4334eb75abc1c70
2023-05-11T11:34:18+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="329.139µs"
2023-05-11T11:34:18+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused"
2023-05-11T11:34:18+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:34:18+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:34:18+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused"
2023-05-11T11:34:17+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="498.159µs"
2023-05-11T11:34:17+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:34:16+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91
2023-05-11T11:34:16+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3
2023-05-11T11:34:16+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.660352ms
2023-05-11T11:34:16+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="312.566µs"
2023-05-11T11:34:16+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:54204
2023-05-11T11:34:15+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:34:15+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="346.671µs"
2023-05-11T11:34:14+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=1m3.132310569s
2023-05-11T11:34:14+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/observability?index=3271&namespace=default&stale=&wait=60000ms" duration=1m2.86097801s
2023-05-11T11:34:14+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=1m3.149959579s
2023-05-11T11:34:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="329.882µs"
2023-05-11T11:34:13+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:34:13+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="438.374µs"
2023-05-11T11:34:13+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=6 ignored=8 errors=0
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=6 ignored=8
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15405 total=14 pulled=6 filtered=8
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=6 ignored=8 errors=0
2023-05-11T11:34:12+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:34:12+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:34:12+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=1b85bcf9-25ca-5ecc-9ebb-461cc731d32b type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure
2023-05-11T11:34:12+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=1b85bcf9-25ca-5ecc-9ebb-461cc731d32b job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:34:12+02:00 [nomad.service 💻 master-01] [🐞] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=1b85bcf9-25ca-5ecc-9ebb-461cc731d32b
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=6 ignored=8
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15403 total=14 pulled=6 filtered=8
2023-05-11T11:34:12+02:00 [nomad.service 💻 master-01] [👀] nomad: evaluating plan: plan="(eval 1b85bcf9, job observability, NodeAllocations: (node[36d1fc65] (54969951 observability.nats[0] run)))"
2023-05-11T11:34:12+02:00 [nomad.service 💻 master-01] [🐞] worker: created evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:34:12+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: found reschedulable allocs, followup eval created: eval_id=1b85bcf9-25ca-5ecc-9ebb-461cc731d32b job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 followup_eval_id=b7a7e205-a1b7-170e-9e2a-b99702e04664
2023-05-11T11:34:12+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:34:12+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:34:12+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "loki": (place 0)
(inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results= 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker.service_sched: reconciled current state with desired state: eval_id=1b85bcf9-25ca-5ecc-9ebb-461cc731d32b job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=5 ignored=9 errors=0 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker.service_sched.binpack: NewBinPackIterator created: eval_id=1b85bcf9-25ca-5ecc-9ebb-461cc731d32b job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: 
worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=1b85bcf9-25ca-5ecc-9ebb-461cc731d32b type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path="/v1/jobs?meta=true&index=15399" duration=3.317404211s 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.client: adding evaluations for rescheduling failed allocations: num_evals=1 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15401 total=14 pulled=5 filtered=9 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=5 ignored=9 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=7 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner: waiting for task to exit: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.gc: marking allocation for GC: alloc_id=54969951-d541-ae97-922a-7db38096bae5 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: task run loop exiting: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner.task_hook.logmon: plugin process exited: 
alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter path=/usr/local/bin/nomad pid=6707 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin exited: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.stdio: received EOF, stopping recv loop: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin exited: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (watcher) stopping all views 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner.task_hook.logmon: plugin process exited: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats path=/usr/local/bin/nomad pid=6705 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) received finish 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) stopping watcher 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter type=Killing msg="Sent interrupt. 
Waiting 5s before force killing" failed=false 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) stopping 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: task run loop exiting: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.stdio: received EOF, stopping recv loop: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats type="Not Restarting" msg="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" failed=true 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: not restarting task: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats reason="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:657fde4007c4b6834917360e99c6d1d2aba8008f86063236cf1fafb1ac022404 references=0 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats type=Killing msg="Sent interrupt. 
Waiting 5s before force killing" failed=false 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats type="Sibling Task Failed" msg="Task's sibling \"nats-prometheus-exporter\" failed" failed=false 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner: task failure, destroying all tasks: alloc_id=54969951-d541-ae97-922a-7db38096bae5 failed_task=nats-prometheus-exporter 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=5 ignored=9 errors=0 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter type="Not Restarting" msg="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" failed=true 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:e5358311d02ae05b73d37045f1ce747a2088c015d4458aa90221d4f04f71ed07 references=0 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: not restarting task: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter reason="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=7343 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:34:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats type=Terminated msg="Exit Code: 1, Exit Message: \"Docker container exited with non-zero exit code: 1\"" failed=false
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=7346
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=5 ignored=9
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: stopped container: container_id=bfc44881a19d03c35e243d4c1d576e2118d4ed5afbdd93ae38837eeb3ccc8e6d driver=docker
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15400 total=14 pulled=5 filtered=9
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter type=Terminated msg="Exit Code: 2, Exit Message: \"Docker container exited with non-zero exit code: 2\"" failed=false
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: stopped container: container_id=17e4c15a57cdff24937d392d474774a758f32704c69f65772e6fde3b859ab34a driver=docker
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/minio?index=10463&namespace=default&stale=&wait=60000ms" duration=1m1.10603955s
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=2 deregistered_services=0 registered_checks=0 deregistered_checks=0
2023-05-11T11:34:12+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-54969951-d541-ae97-922a-7db38096bae5-group-nats-nats-prometheus-exporter-prometheus-exporter
2023-05-11T11:34:12+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-54969951-d541-ae97-922a-7db38096bae5-group-nats-nats-client
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=0 deregistered_services=2 registered_checks=0 deregistered_checks=0
2023-05-11T11:34:12+02:00 [consul.service 💻 worker-01] [✅] agent: Deregistered service: service=_nomad-task-54969951-d541-ae97-922a-7db38096bae5-group-nats-nats-prometheus-exporter-prometheus-exporter
2023-05-11T11:34:12+02:00 [consul.service 💻 worker-01] [✅] agent: Deregistered service: service=_nomad-task-54969951-d541-ae97-922a-7db38096bae5-group-nats-nats-client
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter type="Restart Signaled" msg="healthcheck: check \"service: \\"nats-prometheus-exporter\\" check\" unhealthy" failed=false
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats type="Restart Signaled" msg="healthcheck: check \"service: \\"nats-prometheus-exporter\\" check\" unhealthy" failed=false
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 check="service: \"minio-console\" check" task=minio time_limit=20s
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: restarting due to unhealthy check: alloc_id=54969951-d541-ae97-922a-7db38096bae5 check="service: \"nats-prometheus-exporter\" check" task=group-nats
2023-05-11T11:34:12+02:00 [nomad.service 💻 worker-01] [🐞] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 check=minio-ready task=minio time_limit=20s
2023-05-11T11:34:12+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="398.942µs"
2023-05-11T11:34:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=3.200183ms
2023-05-11T11:34:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=1.94825ms
2023-05-11T11:34:11+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:34:10+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused"
2023-05-11T11:34:10+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745
2023-05-11T11:34:10+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd
2023-05-11T11:34:09+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="647.374µs"
2023-05-11T11:34:09+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:34:09+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/namespaces duration="429.003µs"
2023-05-11T11:34:09+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="911.827µs"
2023-05-11T11:34:09+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/namespaces duration="219.448µs"
2023-05-11T11:34:09+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/jobs?meta=true duration="929.578µs"
2023-05-11T11:34:09+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:47348: EOF
2023-05-11T11:34:09+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:47338: EOF
2023-05-11T11:34:09+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=POST path=/v1/search/fuzzy duration="276.524µs"
2023-05-11T11:34:09+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/members duration="134.859µs"
2023-05-11T11:34:09+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/operator/license duration="4.064µs"
2023-05-11T11:34:09+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/acl/token/self error="RPC Error:: 400,ACL support disabled" code=400
2023-05-11T11:34:09+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/acl/token/self duration="314.9µs"
2023-05-11T11:34:09+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/regions duration="153.901µs"
2023-05-11T11:34:09+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:47302: EOF
2023-05-11T11:34:08+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="440.359µs"
2023-05-11T11:34:08+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-a995356b87d403bba9e4d5c8d4334eb75abc1c70
2023-05-11T11:34:08+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused"
2023-05-11T11:34:08+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:34:08+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused"
2023-05-11T11:34:08+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:34:08+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:34:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="317.354µs"
2023-05-11T11:34:07+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=5 ignored=9 errors=0
2023-05-11T11:34:07+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=5 ignored=9
2023-05-11T11:34:07+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15399 total=14 pulled=5 filtered=9
2023-05-11T11:34:07+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=5 ignored=9 errors=0
2023-05-11T11:34:07+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15398 total=14 pulled=5 filtered=9
2023-05-11T11:34:07+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=5 ignored=9
2023-05-11T11:34:07+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type=Started msg="Task started by client" failed=false
2023-05-11T11:34:07+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:34:07.093Z
2023-05-11T11:34:07+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker address=/tmp/plugin933749528 network=unix @module=docker_logger timestamp=2023-05-11T09:34:07.091Z
2023-05-11T11:34:07+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:34:07+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:34:07+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=9788
2023-05-11T11:34:07+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=59659a67b276c5c640ec0b4d272b8f1f7238a66fea0fed4b912700b2a311b2dd
2023-05-11T11:34:07+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=59659a67b276c5c640ec0b4d272b8f1f7238a66fea0fed4b912700b2a311b2dd
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=tempo labels="map[com.github.logunifier.application.name:tempo com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:2.1.1 com.hashicorp.nomad.alloc_id:86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:tempo com.hashicorp.nomad.task_name:tempo]"
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=tempo binds="[]string{\"/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/tempo/local:/local\", \"/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/tempo/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/tempo/local/tempo.yaml:/config/tempo.yaml\"}"
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=tempo container_name=tempo-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=tempo network_mode=container:17f8d4850acbf1afd975993a1d243d3ad4eb270abe7bf87f5c9d2069d0e3e1ea
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=registry.cloud.private/grafana/tempo:2.1.1
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=tempo memory=34359738368 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/grafana/tempo:2.1.1 image_id=sha256:04523525f28246daa9a499f2dd136f356806e965ecdab5b5dc9b4d3eddcff2de references=1
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo @module=logmon path=/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/alloc/logs/.tempo.stdout.fifo timestamp=2023-05-11T09:34:06.942Z
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo @module=logmon path=/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/alloc/logs/.tempo.stderr.fifo timestamp=2023-05-11T09:34:06.942Z
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:04523525f28246daa9a499f2dd136f356806e965ecdab5b5dc9b4d3eddcff2de references=0
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type=Restarting msg="Task restarting in 0s" failed=false
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: restarting task: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo reason="" delay=0s
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6876
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type=Terminated msg="Exit Code: 137, Exit Message: \"Docker container exited with non-zero exit code: 137\"" failed=false
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: stopped container: container_id=0a967a4563724f517f86e1426d5ff4edb2562122a25d5e29166b789859d5c54c driver=docker
2023-05-11T11:34:06+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91
2023-05-11T11:34:06+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="485.994µs"
2023-05-11T11:34:06+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3
2023-05-11T11:34:06+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.820968ms
2023-05-11T11:34:06+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:36400
2023-05-11T11:34:05+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="335.998µs"
2023-05-11T11:34:04+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="414.832µs"
2023-05-11T11:34:03+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="482.915µs"
2023-05-11T11:34:03+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:34:03+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=5 ignored=9 errors=0
2023-05-11T11:34:03+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=5 ignored=9
2023-05-11T11:34:03+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15396 total=14 pulled=5 filtered=9
2023-05-11T11:34:03+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki type=Started msg="Task started by client" failed=false
2023-05-11T11:34:03+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:34:03.099Z
2023-05-11T11:34:03+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:34:03+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:34:03+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin3871120323 network=unix timestamp=2023-05-11T09:34:03.097Z
2023-05-11T11:34:03+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:34:03+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:34:03+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:34:03+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=9690
2023-05-11T11:34:03+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=9500e2fcf39726039eae8b8e67b2608b28d385b12dfe713e4abddc0b85682838
2023-05-11T11:34:03+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=5 ignored=9 errors=0
2023-05-11T11:34:03+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15395 total=14 pulled=5 filtered=9
2023-05-11T11:34:03+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=5 ignored=9
2023-05-11T11:34:02+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=9500e2fcf39726039eae8b8e67b2608b28d385b12dfe713e4abddc0b85682838
2023-05-11T11:34:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/grafana/loki:2.8.2 image_id=sha256:2162690d5e710430db23773f44d4e68408e89f21fe55e24a1a6c627812d90d40 references=1
2023-05-11T11:34:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=loki network_mode=container:e4463b034889663c02f8ba8c2487f9070bcc3fad3353795039860d255a1388df
2023-05-11T11:34:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=loki container_name=loki-2d549ab9-a7a8-65ac-1697-dd440dd0e3d7
2023-05-11T11:34:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=loki binds="[]string{\"/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/loki/local:/local\", \"/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/loki/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/loki/local/loki.yaml:/config/loki.yaml\"}"
2023-05-11T11:34:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=registry.cloud.private/grafana/loki:2.8.2
2023-05-11T11:34:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=loki labels="map[com.github.logunifier.application.name:loki com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:2.8.2 com.hashicorp.nomad.alloc_id:2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:loki com.hashicorp.nomad.task_name:loki]"
2023-05-11T11:34:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=loki memory=34359738368 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0
2023-05-11T11:34:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki @module=logmon path=/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/alloc/logs/.loki.stderr.fifo timestamp=2023-05-11T09:34:02.947Z
2023-05-11T11:34:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad:
opening fifo: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki @module=logmon path=/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/alloc/logs/.loki.stdout.fifo timestamp=2023-05-11T09:34:02.947Z 2023-05-11T11:34:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki type=Restarting msg="Task restarting in 0s" failed=false 2023-05-11T11:34:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki 2023-05-11T11:34:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: restarting task: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki reason="" delay=0s 2023-05-11T11:34:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:2162690d5e710430db23773f44d4e68408e89f21fe55e24a1a6c627812d90d40 references=0 2023-05-11T11:34:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:34:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=9295 2023-05-11T11:34:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:34:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki type=Terminated msg="Exit Code: 137, Exit Message: \"Docker container exited with non-zero exit code: 137\"" failed=false 2023-05-11T11:34:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: stopped 
container: container_id=735bade88f42ce91f71c7d74c52b3bb4f2494adc00ff495e686877e7c630c6ba driver=docker 2023-05-11T11:34:02+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="358.636ยตs" 2023-05-11T11:34:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff check="service: \"grafana-agent-health\" check" task=group-grafana-agent time_limit=20s 2023-05-11T11:34:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=54969951-d541-ae97-922a-7db38096bae5 check="service: \"nats-prometheus-exporter\" check" task=group-nats time_limit=10s 2023-05-11T11:34:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=54969951-d541-ae97-922a-7db38096bae5 check="service: \"nats\" check" task=group-nats time_limit=20s 2023-05-11T11:34:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] watch.checks: check became unhealthy. Will restart if check doesn't become healthy: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff check="service: \"grafana-agent-ready\" check" task=group-grafana-agent time_limit=40s 2023-05-11T11:34:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] watch.checks: check became unhealthy. 
Will restart if check doesn't become healthy: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 check=health task=group-tempo time_limit=20s 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=5 ignored=9 errors=0 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.141961ms 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=5 ignored=9 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15394 total=14 pulled=5 filtered=9 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] consul.sync: execute sync: reason=periodic 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type="Restart Signaled" msg="Template with change_mode restart re-rendered" failed=false 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="447.233ยตs" 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=5 ignored=9 errors=0 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=5 ignored=9 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15393 total=14 pulled=5 filtered=9 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=5 ignored=9 errors=0 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15392 total=14 pulled=5 filtered=9 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 
removed=0 updated=5 ignored=9 2023-05-11T11:34:01+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type=Started msg="Task started by client" failed=false 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:34:01.016Z 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:34:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin1548796737 network=unix timestamp=2023-05-11T09:34:01.013Z 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=9540 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=d4ce39238d03d8ca97a7a33b56329e696ed2fbe84b50b269e79c4bff78fb95c1 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=d4ce39238d03d8ca97a7a33b56329e696ed2fbe84b50b269e79c4bff78fb95c1 
2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=grafana-agent binds="[]string{\"/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/grafana-agent/local:/local\", \"/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/grafana-agent/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/grafana-agent/local/agent.yaml:/config/agent.yaml\"}" 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=grafana-agent labels="map[com.github.logunifier.application.name:grafana_agent com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:0.33.1 com.hashicorp.nomad.alloc_id:a04015b3-dc90-7f18-8bfd-c1cf7bc37eff com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:grafana-agent com.hashicorp.nomad.task_name:grafana-agent]" 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=grafana-agent container_name=grafana-agent-a04015b3-dc90-7f18-8bfd-c1cf7bc37eff 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=grafana-agent network_mode=container:e2f503486651f498010ccfecea17ff3e5b17e0ac02020372de2eacae60403d04 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=registry.cloud.private/grafana/agent:v0.33.1 
2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=grafana-agent memory=2147483648 memory_reservation=67108864 cpu_shares=100 cpu_quota=0 cpu_period=0 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/grafana/agent:v0.33.1 image_id=sha256:9833434074df83909804de7688db8f00ef71ef5fc00eb115d1e2e41d21ece5cf references=1 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent path=/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/alloc/logs/.grafana-agent.stderr.fifo @module=logmon timestamp=2023-05-11T09:34:00.867Z 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent @module=logmon path=/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/alloc/logs/.grafana-agent.stdout.fifo timestamp=2023-05-11T09:34:00.867Z 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:9833434074df83909804de7688db8f00ef71ef5fc00eb115d1e2e41d21ece5cf references=0 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: restarting task: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent reason="" delay=0s 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: 
Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type=Restarting msg="Task restarting in 0s" failed=false 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6813 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type=Terminated msg="Exit Code: 0" failed=false 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: stopped container: container_id=ea256c425b611c926e06e03a46ea9c20225dc9850ad129c8f61ceef7784189f2 driver=docker 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=5 ignored=9 errors=0 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=5 ignored=9 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15391 total=14 pulled=5 filtered=9 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type="Restart Signaled" msg="Template with change_mode restart re-rendered" failed=false 2023-05-11T11:34:00+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="393.666ยตs" 2023-05-11T11:34:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now 
critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 2023-05-11T11:34:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused" 2023-05-11T11:34:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd 2023-05-11T11:33:59+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5 2023-05-11T11:33:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="377.357ยตs" 2023-05-11T11:33:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=_nomad-check-a995356b87d403bba9e4d5c8d4334eb75abc1c70 2023-05-11T11:33:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-a995356b87d403bba9e4d5c8d4334eb75abc1c70 2023-05-11T11:33:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 2023-05-11T11:33:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused" 2023-05-11T11:33:58+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="569.505ยตs" 2023-05-11T11:33:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused" 2023-05-11T11:33:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f 2023-05-11T11:33:58+02:00 [consul.service ๐Ÿ’ป 
worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:33:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=5 ignored=9 errors=0 2023-05-11T11:33:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15390 total=14 pulled=5 filtered=9 2023-05-11T11:33:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=5 ignored=9 2023-05-11T11:33:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki type="Restart Signaled" msg="Template with change_mode restart re-rendered" failed=false 2023-05-11T11:33:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="480.338ยตs" 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) all templates rendered 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) all templates rendered 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) health.service(mimir|passing) is still needed 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) health.service(mimir|passing) is still needed 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) all templates rendered 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) rendered 
"(dynamic)" => "/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/tempo/local/tempo.yaml" 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/loki/local/loki.yaml" 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/grafana-agent/local/agent.yaml" 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) health.service(mimir|passing) is still needed 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/tempo/local/tempo.yaml" 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/loki/local/loki.yaml" 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/grafana-agent/local/agent.yaml" 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template ae4a8f4564353ce5e970f7f7d1c6a2da 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) receiving dependency health.service(mimir|passing) 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] 
agent: (runner) receiving dependency health.service(mimir|passing) 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template d3fbf984da4c8c04e753c284e9a12f26 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template 723104a980ccc69983d3c9e8709349f9 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) receiving dependency health.service(mimir|passing) 2023-05-11T11:33:56+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=_nomad-check-4d973a2a83cdc2d4fa18d396cc085397b0d13ffd 2023-05-11T11:33:56+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91 2023-05-11T11:33:56+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.714526ms 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad: memberlist: Stream connection from=127.0.0.1:58766 2023-05-11T11:33:56+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="343.948ยตs" 2023-05-11T11:33:55+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="379.701ยตs" 2023-05-11T11:33:54+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="390.588ยตs" 2023-05-11T11:33:53+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea 2023-05-11T11:33:53+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: 
check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b 2023-05-11T11:33:53+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:33:52+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="340.391ยตs" 2023-05-11T11:33:51+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.107906ms 2023-05-11T11:33:51+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="326.646ยตs" 2023-05-11T11:33:51+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04 2023-05-11T11:33:50+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="307.558ยตs" 2023-05-11T11:33:50+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 2023-05-11T11:33:50+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused" 2023-05-11T11:33:50+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd 2023-05-11T11:33:49+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5 2023-05-11T11:33:49+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="327.28ยตs" 2023-05-11T11:33:48+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=_nomad-check-a995356b87d403bba9e4d5c8d4334eb75abc1c70 2023-05-11T11:33:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request 
complete: method=GET path=/v1/agent/self duration="374.004ยตs" 2023-05-11T11:33:48+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 2023-05-11T11:33:48+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused" 2023-05-11T11:33:48+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f 2023-05-11T11:33:48+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused" 2023-05-11T11:33:48+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:33:47+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="431.281ยตs" 2023-05-11T11:33:46+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-4d973a2a83cdc2d4fa18d396cc085397b0d13ffd 2023-05-11T11:33:46+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91 2023-05-11T11:33:46+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3 2023-05-11T11:33:46+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="351.263ยตs" 2023-05-11T11:33:46+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.756967ms 2023-05-11T11:33:46+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad: memberlist: Stream connection from=127.0.0.1:43686 
2023-05-11T11:33:45+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="377.424ยตs" 2023-05-11T11:33:44+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration=3.354793ms 2023-05-11T11:33:43+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea 2023-05-11T11:33:43+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="590.695ยตs" 2023-05-11T11:33:43+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b 2023-05-11T11:33:43+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:33:42+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="734.328ยตs" 2023-05-11T11:33:41+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.965295ms 2023-05-11T11:33:41+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=5 ignored=9 errors=0 2023-05-11T11:33:41+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=5 ignored=9 2023-05-11T11:33:41+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15389 total=14 pulled=5 filtered=9 2023-05-11T11:33:41+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="633.479ยตs" 2023-05-11T11:33:41+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04 2023-05-11T11:33:40+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: 
check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745
2023-05-11T11:33:40+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused"
2023-05-11T11:33:40+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd
2023-05-11T11:33:40+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="655.266µs"
2023-05-11T11:33:39+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:33:38+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=2.430334ms
2023-05-11T11:33:38+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-a995356b87d403bba9e4d5c8d4334eb75abc1c70
2023-05-11T11:33:38+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused"
2023-05-11T11:33:38+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:33:38+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused"
2023-05-11T11:33:38+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:33:38+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:33:37+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="341.286µs"
2023-05-11T11:33:36+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-4d973a2a83cdc2d4fa18d396cc085397b0d13ffd
2023-05-11T11:33:36+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91
2023-05-11T11:33:36+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="334.466µs"
2023-05-11T11:33:36+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=5 ignored=9 errors=0
2023-05-11T11:33:36+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15388 total=14 pulled=5 filtered=9
2023-05-11T11:33:36+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=5 ignored=9
2023-05-11T11:33:36+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3
2023-05-11T11:33:36+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.476605ms
2023-05-11T11:33:36+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:37282
2023-05-11T11:33:35+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="361.18µs"
2023-05-11T11:33:34+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="453.002µs"
2023-05-11T11:33:33+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:33:33+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="449.22µs"
2023-05-11T11:33:33+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:33:33+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:33:32+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="424.717µs"
2023-05-11T11:33:31+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.587403ms
2023-05-11T11:33:31+02:00 [nomad.service 💻 master-01] [👀] consul.sync: execute sync: reason=periodic
2023-05-11T11:33:31+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-716e088c4a623de12ff9c906dfc61990e48aeb5a
2023-05-11T11:33:31+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:1
2023-05-11T11:33:31+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=1.428914ms
2023-05-11T11:33:31+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:33:30+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745
2023-05-11T11:33:30+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused"
2023-05-11T11:33:30+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd
2023-05-11T11:33:30+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="352.389µs"
2023-05-11T11:33:29+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:33:28+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="526.251µs"
2023-05-11T11:33:28+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-a995356b87d403bba9e4d5c8d4334eb75abc1c70
2023-05-11T11:33:28+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused"
2023-05-11T11:33:28+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:33:28+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused"
2023-05-11T11:33:28+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:33:28+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:33:27+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=4.231068ms
2023-05-11T11:33:26+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-4d973a2a83cdc2d4fa18d396cc085397b0d13ffd
2023-05-11T11:33:26+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91
2023-05-11T11:33:26+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3
2023-05-11T11:33:26+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=4.084932ms
2023-05-11T11:33:26+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=6.442556ms
2023-05-11T11:33:26+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=service:_nomad-task-bd7464de-fa72-736b-e57c-6782cc7d7202-group-grafana-grafana-3000-sidecar-proxy:1
2023-05-11T11:33:26+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-a1de24d6ed90478d5e35018b5e35744d97819639
2023-05-11T11:33:26+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:43862
2023-05-11T11:33:25+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=4.189808ms
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=5 ignored=9 errors=0
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=5 ignored=9
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15387 total=14 pulled=5 filtered=9
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki type=Started msg="Task started by client" failed=false
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:24.609Z
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin679068547 network=unix timestamp=2023-05-11T09:33:24.607Z
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=9295
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=735bade88f42ce91f71c7d74c52b3bb4f2494adc00ff495e686877e7c630c6ba
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=5 ignored=9 errors=0
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15386 total=14 pulled=5 filtered=9
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=5 ignored=9
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=735bade88f42ce91f71c7d74c52b3bb4f2494adc00ff495e686877e7c630c6ba
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=loki labels="map[com.github.logunifier.application.name:loki com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:2.8.2 com.hashicorp.nomad.alloc_id:2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:loki com.hashicorp.nomad.task_name:loki]"
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=loki container_name=loki-2d549ab9-a7a8-65ac-1697-dd440dd0e3d7
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=loki binds="[]string{\"/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/loki/local:/local\", \"/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/loki/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/loki/local/loki.yaml:/config/loki.yaml\"}"
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=registry.cloud.private/grafana/loki:2.8.2
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=loki memory=34359738368 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=loki network_mode=container:e4463b034889663c02f8ba8c2487f9070bcc3fad3353795039860d255a1388df
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/grafana/loki:2.8.2 image_id=sha256:2162690d5e710430db23773f44d4e68408e89f21fe55e24a1a6c627812d90d40 references=1
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) health.service(mimir|passing) is still needed
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) all templates rendered
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/loki/local/loki.yaml"
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/loki/local/loki.yaml"
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template ae4a8f4564353ce5e970f7f7d1c6a2da
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) receiving dependency health.service(mimir|passing)
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) missing data for 1 dependencies
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) was not watching 1 dependencies
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) add used dependency health.service(mimir|passing) to missing since isLeader but do not have a watcher
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) missing dependency: health.service(mimir|passing)
2023-05-11T11:33:24+02:00 [nomad.service 💻 worker-01] [🐞] agent: (watcher) adding health.service(mimir|passing)
2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":"auth_enabled: false\n\nserver:\n #default 3100\n http_listen_port: 3100\n #default 9005\n #grpc_listen_port: 9005\n # Max gRPC message size that can be received\n # CLI flag: -server.grpc-max-recv-msg-size-bytes\n #default 4194304 -\u003e 4MB\n grpc_server_max_recv_msg_size: 419430400\n\n # Max gRPC message size that can be sent\n # CLI flag: -server.grpc-max-send-msg-size-bytes\n #default 4194304 -\u003e 4MB\n grpc_server_max_send_msg_size: 419430400\n\n # Limit on the number of concurrent streams for gRPC calls (0 = unlimited)\n # CLI 
flag: -server.grpc-max-concurrent-streams\n grpc_server_max_concurrent_streams: 100\n\n # Log only messages with the given severity or above. Supported values [debug,\n # info, warn, error]\n # CLI flag: -log.level\n log_level: \"warn\"\ningester:\n wal:\n enabled: true\n dir: /data/wal\n lifecycler:\n address: 127.0.0.1\n ring:\n kvstore:\n store: memberlist\n replication_factor: 1\n final_sleep: 0s\n chunk_idle_period: 5m\n chunk_retain_period: 30s\n chunk_encoding: snappy\n\nruler:\n evaluation_interval : 1m\n poll_interval: 1m\n storage:\n type: local\n local:\n directory: /data/rules\n rule_path: /data/scratch\n++- range $index, $service := service \"mimir\" -++\n++- if eq $index 0 ++\n alertmanager_url: http://++$service.Name++.service.consul:++ $service.Port ++/alertmanager\n++- end ++\n++- end ++\n\n ring:\n kvstore:\n store: memberlist\n enable_api: true\n enable_alertmanager_v2: true\n\ncompactor:\n working_directory: /data/retention\n shared_store: filesystem\n compaction_interval: 10m\n retention_enabled: true\n retention_delete_delay: 2h\n retention_delete_worker_count: 150\n\nschema_config:\n configs:\n - from: 2023-03-01\n store: boltdb-shipper\n object_store: filesystem\n schema: v12\n index:\n prefix: index_\n period: 24h\n\nstorage_config:\n boltdb_shipper:\n active_index_directory: /data/index\n cache_location: /data/index-cache\n shared_store: filesystem\n filesystem:\n directory: /data/chunks\n index_queries_cache_config:\n enable_fifocache: false\n embedded_cache:\n max_size_mb: 4096\n enabled: true\nquerier:\n multi_tenant_queries_enabled: false\n max_concurrent: 4096\n query_store_only: false\n\nquery_scheduler:\n max_outstanding_requests_per_tenant: 10000\n\nquery_range:\n cache_results: true\n results_cache:\n cache:\n enable_fifocache: false\n embedded_cache:\n enabled: true\n\nchunk_store_config:\n chunk_cache_config:\n enable_fifocache: false\n embedded_cache:\n max_size_mb: 4096\n enabled: true\n write_dedupe_cache_config:\n 
enable_fifocache: false\n embedded_cache:\n max_size_mb: 4096\n enabled: true\n\ndistributor:\n ring:\n kvstore:\n store: memberlist\n\ntable_manager:\n retention_deletes_enabled: true\n retention_period: 24h\n\nlimits_config:\n ingestion_rate_mb: 64\n ingestion_burst_size_mb: 8\n max_label_name_length: 4096\n max_label_value_length: 8092\n enforce_metric_name: false\n # Loki will reject any log lines that have already been processed and will not index them again\n reject_old_samples: false\n # 5y\n reject_old_samples_max_age: 43800h\n # The limit to length of chunk store queries. 0 to disable.\n # 5y\n max_query_length: 43800h\n # Maximum number of log entries that will be returned for a query.\n max_entries_limit_per_query: 20000\n # Limit the maximum of unique series that is returned by a metric query.\n max_query_series: 100000\n # Maximum number of queries that will be scheduled in parallel by the frontend.\n max_query_parallelism: 64\n split_queries_by_interval: 24h\n # Alter the log line timestamp during ingestion when the timestamp is the same as the\n # previous entry for the same stream. When enabled, if a log line in a push request has\n # the same timestamp as the previous line for the same stream, one nanosecond is added\n # to the log line. 
This will preserve the received order of log lines with the exact\n # same timestamp when they are queried, by slightly altering their stored timestamp.\n # NOTE: This is imperfect, because Loki accepts out of order writes, and another push\n # request for the same stream could contain duplicate timestamps to existing\n # entries and they will not be incremented.\n # CLI flag: -validation.increment-duplicate-timestamps\n increment_duplicate_timestamp: true\n #Log data retention for all\n retention_period: 24h\n # Comment this out for fine grained retention\n# retention_stream:\n# - selector: '{namespace=\"dev\"}'\n# priority: 1\n# period: 24h\n # Comment this out for having overrides\n# per_tenant_override_config: /etc/overrides.yaml","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/loki/local/loki.yaml","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"++","RightDelim":"++","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/loki"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"
","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000} 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) starting 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) running initial templates 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template ae4a8f4564353ce5e970f7f7d1c6a2da 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating watcher 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating new runner (dry: false, once: false) 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki path=/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/alloc/logs/.loki.stderr.fifo @module=logmon timestamp=2023-05-11T09:33:24.369Z 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki @module=logmon 
path=/opt/services/core/nomad/data/alloc/2d549ab9-a7a8-65ac-1697-dd440dd0e3d7/alloc/logs/.loki.stdout.fifo timestamp=2023-05-11T09:33:24.368Z 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki address=/tmp/plugin3669077197 network=unix @module=logmon timestamp=2023-05-11T09:33:24.366Z 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki version=2 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] consul.sync: sync complete: registered_services=1 deregistered_services=0 registered_checks=0 deregistered_checks=0 2023-05-11T11:33:24+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-2d549ab9-a7a8-65ac-1697-dd440dd0e3d7-group-loki-loki-http 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki path=/usr/local/bin/nomad 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki path=/usr/local/bin/nomad pid=9237 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"] 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki type="Task Setup" msg="Building Task Directory" failed=false 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start 
condition has been met, proceeding: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.runner_hook: received result from CNI: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 result="{\"Interfaces\":{\"eth0\":{\"IPConfigs\":[{\"IP\":\"172.26.66.64\",\"Gateway\":\"172.26.64.1\"}],\"Mac\":\"be:ff:5a:8b:b8:85\",\"Sandbox\":\"/var/run/docker/netns/7ae16aa66c30\"},\"nomad\":{\"IPConfigs\":null,\"Mac\":\"b6:77:25:d7:7f:65\",\"Sandbox\":\"\"},\"vethe9e3726a\":{\"IPConfigs\":null,\"Mac\":\"d6:21:75:a6:ae:5b\",\"Sandbox\":\"\"}},\"DNS\":[{}],\"Routes\":[{\"dst\":\"0.0.0.0/0\"}]}" 2023-05-11T11:33:24+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="566.502ยตs" 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=8 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=1 removed=0 updated=5 ignored=8 errors=0 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=2d549ab9-a7a8-65ac-1697-dd440dd0e3d7 task=loki type=Received msg="Task received by client" failed=false 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=c8803169-3052-ff6f-4027-786f5675356f type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: 
updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval="" 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15384 total=14 pulled=6 filtered=8 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=1 removed=0 updated=5 ignored=8 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker.service_sched: setting eval status: eval_id=c8803169-3052-ff6f-4027-786f5675356f job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=c8803169-3052-ff6f-4027-786f5675356f 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "loki": (place 1) (inplace 0) (destructive 0) (stop 1) (migrate 0) (ignore 0) (canary 0) 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results= 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 1) (destructive 0) (inplace 0) (stop 1) (disconnect 0) (reconnect 0) 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana-agent": 
(place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=c8803169-3052-ff6f-4027-786f5675356f type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker.service_sched: reconciled current state with desired state: eval_id=c8803169-3052-ff6f-4027-786f5675356f job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad: evaluating plan: plan="(eval c8803169, job observability, NodeUpdates: (node[36d1fc65] (dcd787cd stop/evict)), NodeAllocations: (node[36d1fc65] (2d549ab9 observability.loki[0] run)))" 2023-05-11T11:33:23+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker.service_sched.binpack: NewBinPackIterator created: eval_id=c8803169-3052-ff6f-4027-786f5675356f job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread 2023-05-11T11:33:23+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea 2023-05-11T11:33:23+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: 
check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b 2023-05-11T11:33:23+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:33:22+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="471.495ยตs" 2023-05-11T11:33:22+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=service:_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=3.427803ms 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="664.799ยตs" 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=4 ignored=9 errors=0 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=4 ignored=9 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15383 total=13 pulled=4 filtered=9 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres type=Started msg="Task started by client" failed=false 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:21.326Z 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin 
address: driver=docker @module=docker_logger address=/tmp/plugin3303450961 network=unix timestamp=2023-05-11T09:33:21.321Z 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=8966 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=3b4d212e87bebe3a8c28a21431efcec507575987069ec72cd9ea0edf04ee3cdc 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=4 ignored=9 errors=0 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15382 total=13 pulled=4 filtered=9 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=4 ignored=9 2023-05-11T11:33:21+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=3b4d212e87bebe3a8c28a21431efcec507575987069ec72cd9ea0edf04ee3cdc 2023-05-11T11:33:21+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=grafana type=Started msg="Task started by client" failed=false 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection 
initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:20.985Z 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=4 ignored=9 errors=0 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin1658287858 network=unix timestamp=2023-05-11T09:33:20.974Z 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=4 ignored=9 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15381 total=13 pulled=4 filtered=9 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=keycloak_postgres labels="map[com.hashicorp.nomad.alloc_id:857749e0-52fe-92ee-7bef-fafbe67605ee com.hashicorp.nomad.job_id:security com.hashicorp.nomad.job_name:security com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:keycloak-postgres com.hashicorp.nomad.task_name:keycloak_postgres]" 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=keycloak_postgres network_mode=container:5c2a6e2bed9928c93ccf1d654aac2b51225a12f01b4610592ea2563448aacf2a 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=keycloak_postgres container_name=keycloak_postgres-857749e0-52fe-92ee-7bef-fafbe67605ee 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป 
worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=keycloak_postgres binds="[]string{\"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/keycloak_postgres/local:/local\", \"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/keycloak_postgres/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/keycloak_postgres/local/initddb.sql:/docker-entrypoint-initdb.d/initddb.sql\"}" 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/postgres:14.5 image_id=sha256:cefd1c9e490c8b581d834d878081cf64c133df1f9f443c5e5f8d94fbd7c7a1d4 references=1 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=keycloak_postgres memory=34359738368 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=8901 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=806ddc5ee28f676d5bd1ee347e536e2fe8f6b2903faf9b11400dcf35ec3b483f 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) rendering "(dynamic)" => 
"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/keycloak_postgres/secrets/env.vars" 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template 3d38a7b72b56e929ec016a5e38bef38f 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) all templates rendered 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/keycloak_postgres/local/initddb.sql" 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) nomad.var.block(nomad/jobs/security@default.global) is still needed 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/keycloak_postgres/secrets/env.vars" 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) receiving dependency nomad.var.block(nomad/jobs/security@default.global) 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template 6e2d32cd4b2988cfe8f68f510d993549 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path="/v1/var/nomad/jobs/security?namespace=default&stale=&wait=60000ms" duration=1.629861ms 2023-05-11T11:33:20+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=service:_nomad-task-bd7464de-fa72-736b-e57c-6782cc7d7202-group-grafana-grafana-3000-sidecar-proxy:2 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and 
updating dependencies 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing dependency: nomad.var.block(nomad/jobs/security@default.global) 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (watcher) adding nomad.var.block(nomad/jobs/security@default.global) 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing data for 1 dependencies 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) was not watching 1 dependencies 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) add used dependency nomad.var.block(nomad/jobs/security@default.global) to missing since isLeader but do not have a watcher 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/keycloak_postgres/local/initddb.sql" 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template 3d38a7b72b56e929ec016a5e38bef38f 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template 6e2d32cd4b2988cfe8f68f510d993549 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) running initial templates 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/keycloak_postgres/local/initddb.sql" 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) starting 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) final config: 
{"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":" CREATE SCHEMA IF NOT EXISTS 
keycloak;\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/keycloak_postgres/local/initddb.sql","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"{{","RightDelim":"}}","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/keycloak_postgres"},{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":" {{- with nomadVar \"nomad/jobs/security\" -}}\n POSTGRES_PASSWORD = {{.keycloak_db_password}}\n {{- end -}}\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/keycloak_postgres/secrets/env.vars","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"{{","RightDelim":"}}","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/keycloak_postgres"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxI
dleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000} 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating watcher 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating new runner (dry: false, once: false) 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres @module=logmon path=/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/alloc/logs/.keycloak_postgres.stdout.fifo timestamp=2023-05-11T09:33:20.843Z 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres @module=logmon path=/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/alloc/logs/.keycloak_postgres.stderr.fifo timestamp=2023-05-11T09:33:20.843Z 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: using 
plugin: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres version=2 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres address=/tmp/plugin670948133 network=unix @module=logmon timestamp=2023-05-11T09:33:20.837Z 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres path=/usr/local/bin/nomad pid=8868 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres path=/usr/local/bin/nomad 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"] 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres type="Task Setup" msg="Building Task Directory" failed=false 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres type=Started msg="Task started by client" failed=false 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker 
@module=docker_logger timestamp=2023-05-11T09:33:20.756Z 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker address=/tmp/plugin1334273263 network=unix @module=docker_logger timestamp=2023-05-11T09:33:20.751Z 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=4 ignored=9 errors=0 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=4 ignored=9 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15380 total=13 pulled=4 filtered=9 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client: node registration complete 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: state changed, updating node and re-registering 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=8828 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=fa3b7bc7827c5ee18fd7ce567ee426f56f5bce66e44534bdcc0b5d9b1f568108 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="490.818ยตs" 2023-05-11T11:33:20+02:00 [nomad.service 
๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=806ddc5ee28f676d5bd1ee347e536e2fe8f6b2903faf9b11400dcf35ec3b483f 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=grafana binds="[]string{\"/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/grafana/local:/local\", \"/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/grafana/secrets:/secrets\"}" 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=grafana container_name=grafana-bd7464de-fa72-736b-e57c-6782cc7d7202 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=grafana labels="map[com.github.logunifier.application.name:grafana com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:9.5.1.0 com.hashicorp.nomad.alloc_id:bd7464de-fa72-736b-e57c-6782cc7d7202 com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:grafana com.hashicorp.nomad.task_name:grafana]" 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=grafana memory=4294967296 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=grafana network_mode=container:600ce6982bc7fb12eb4a01364ba3cbaf1e5449027f1893db6e5abdb303e659ee 
2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/stack/observability/grafana:9.5.1.0 image_id=sha256:7e35787ef40f6cf939d161c814f18b91c0162bf61e1ad2ddda37156e77bda016 references=1 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/grafana/secrets/env.vars" 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template aa33fb80064c17acc1b58a55de5165e2 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/grafana/secrets/env.vars" 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) all templates rendered 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) nomad.var.block(nomad/jobs/observability@default.global) is still needed 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path="/v1/var/nomad/jobs/observability?namespace=default&stale=&wait=60000ms" duration=2.559672ms 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) receiving dependency nomad.var.block(nomad/jobs/observability@default.global) 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (watcher) adding nomad.var.block(nomad/jobs/observability@default.global) 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) was not 
watching 1 dependencies 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) add used dependency nomad.var.block(nomad/jobs/observability@default.global) to missing since isLeader but do not have a watcher 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing data for 1 dependencies 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) running initial templates 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template aa33fb80064c17acc1b58a55de5165e2 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing dependency: nomad.var.block(nomad/jobs/observability@default.global) 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating watcher 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) starting 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) final config: 
{"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":" {{ with nomadVar \"nomad/jobs/observability\" }}\n GF_AUTH_GENERIC_OAUTH_CLIENT_SECRET = {{.keycloak_secret_observability_grafana}}\n {{ end 
}}\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/grafana/secrets/env.vars","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"{{","RightDelim":"}}","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/grafana"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTim
e":60000000000} 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating new runner (dry: false, once: false) 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=grafana @module=logmon path=/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/alloc/logs/.grafana.stderr.fifo timestamp=2023-05-11T09:33:20.583Z 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=grafana @module=logmon path=/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/alloc/logs/.grafana.stdout.fifo timestamp=2023-05-11T09:33:20.583Z 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=grafana network=unix @module=logmon address=/tmp/plugin76971813 timestamp=2023-05-11T09:33:20.574Z 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=grafana version=2 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=fa3b7bc7827c5ee18fd7ce567ee426f56f5bce66e44534bdcc0b5d9b1f568108 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=grafana path=/usr/local/bin/nomad 2023-05-11T11:33:20+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=grafana path=/usr/local/bin/nomad pid=8775 
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=grafana path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=grafana type="Task Setup" msg="Building Task Directory" failed=false
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=grafana
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=connect-proxy-grafana type=Started msg="Task started by client" failed=false
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:20.489Z
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker address=/tmp/plugin1064035812 network=unix @module=docker_logger timestamp=2023-05-11T09:33:20.485Z
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=4 ignored=9 errors=0
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=connect-proxy-keycloak-postgres container_name=connect-proxy-keycloak-postgres-857749e0-52fe-92ee-7bef-fafbe67605ee
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=connect-proxy-keycloak-postgres network_mode=container:5c2a6e2bed9928c93ccf1d654aac2b51225a12f01b4610592ea2563448aacf2a
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=connect-proxy-keycloak-postgres binds="[]string{\"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/connect-proxy-keycloak-postgres/local:/local\", \"/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/connect-proxy-keycloak-postgres/secrets:/secrets\"}"
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=connect-proxy-keycloak-postgres memory=314572800 memory_reservation=0 cpu_shares=100 cpu_quota=0 cpu_period=0
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=connect-proxy-keycloak-postgres labels="map[com.github.logunifier.application.pattern.key:envoy com.hashicorp.nomad.alloc_id:857749e0-52fe-92ee-7bef-fafbe67605ee com.hashicorp.nomad.job_id:security com.hashicorp.nomad.job_name:security com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:keycloak-postgres com.hashicorp.nomad.task_name:connect-proxy-keycloak-postgres]"
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=envoyproxy/envoy:v1.25.1 image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a references=2
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=4 ignored=9
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15378 total=13 pulled=4 filtered=9
2023-05-11T11:33:20+02:00 [consul.service 💻 worker-01] [❌] agent.http: Request error: method=GET url=/v1/acl/token/self from=127.0.0.1:43276 error="ACL support disabled"
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=a4f0463a7c6ab563cbf8ef74cbb854fc6dea0f445c35b5eee0ef35a902d67da2
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=8760
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: bootstrapping envoy: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres namespace="" proxy_id=_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy service=keycloak-postgres gateway="" bootstrap_file=/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/connect-proxy-keycloak-postgres/secrets/envoy_bootstrap.json grpc_addr=unix://alloc/tmp/consul_grpc.sock admin_bind=127.0.0.2:19001 ready_bind=127.0.0.1:19101
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: check for SI token for task: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres task=connect-proxy-keycloak-postgres exists=false
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: bootstrapping Consul connect-proxy: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres task=connect-proxy-keycloak-postgres service=keycloak-postgres
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres path=/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/connect-proxy-keycloak-postgres/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/connect-proxy-keycloak-postgres/secrets/api.sock: bind: invalid argument"
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres @module=logmon path=/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/alloc/logs/.connect-proxy-keycloak-postgres.stdout.fifo timestamp=2023-05-11T09:33:20.294Z
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres @module=logmon path=/opt/services/core/nomad/data/alloc/857749e0-52fe-92ee-7bef-fafbe67605ee/alloc/logs/.connect-proxy-keycloak-postgres.stderr.fifo timestamp=2023-05-11T09:33:20.295Z
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres @module=logmon address=/tmp/plugin3591322366 network=unix timestamp=2023-05-11T09:33:20.285Z
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres version=2
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=a4f0463a7c6ab563cbf8ef74cbb854fc6dea0f445c35b5eee0ef35a902d67da2
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=1 deregistered_services=0 registered_checks=0 deregistered_checks=0
2023-05-11T11:33:20+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy
2023-05-11T11:33:20+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-857749e0-52fe-92ee-7bef-fafbe67605ee-group-keycloak-postgres-keycloak-postgres-5432
2023-05-11T11:33:20+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745
2023-05-11T11:33:20+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused"
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres path=/usr/local/bin/nomad pid=8692
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres path=/usr/local/bin/nomad
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres type="Task Setup" msg="Building Task Directory" failed=false
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.runner_hook: received result from CNI: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee result="{\"Interfaces\":{\"eth0\":{\"IPConfigs\":[{\"IP\":\"172.26.66.63\",\"Gateway\":\"172.26.64.1\"}],\"Mac\":\"ee:56:67:99:73:0f\",\"Sandbox\":\"/var/run/docker/netns/5431854b9e01\"},\"nomad\":{\"IPConfigs\":null,\"Mac\":\"b6:77:25:d7:7f:65\",\"Sandbox\":\"\"},\"veth59f1db79\":{\"IPConfigs\":null,\"Mac\":\"ca:cd:b9:79:12:fa\",\"Sandbox\":\"\"}},\"DNS\":[{}],\"Routes\":[{\"dst\":\"0.0.0.0/0\"}]}"
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=4 ignored=9 errors=0
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=4 ignored=9
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15377 total=13 pulled=4 filtered=9
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=connect-proxy-grafana container_name=connect-proxy-grafana-bd7464de-fa72-736b-e57c-6782cc7d7202
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=connect-proxy-grafana memory=314572800 memory_reservation=0 cpu_shares=100 cpu_quota=0 cpu_period=0
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=connect-proxy-grafana labels="map[com.github.logunifier.application.pattern.key:envoy com.hashicorp.nomad.alloc_id:bd7464de-fa72-736b-e57c-6782cc7d7202 com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:grafana com.hashicorp.nomad.task_name:connect-proxy-grafana]"
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=connect-proxy-grafana binds="[]string{\"/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/connect-proxy-grafana/local:/local\", \"/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/connect-proxy-grafana/secrets:/secrets\"}"
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=envoyproxy/envoy:v1.25.1 image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a references=1
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=connect-proxy-grafana network_mode=container:600ce6982bc7fb12eb4a01364ba3cbaf1e5449027f1893db6e5abdb303e659ee
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=envoyproxy/envoy:v1.25.1
2023-05-11T11:33:20+02:00 [consul.service 💻 worker-01] [❌] agent.http: Request error: method=GET url=/v1/acl/token/self from=127.0.0.1:43272 error="ACL support disabled"
2023-05-11T11:33:20+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: check for SI token for task: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=connect-proxy-grafana task=connect-proxy-grafana exists=false
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: bootstrapping envoy: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=connect-proxy-grafana namespace="" proxy_id=_nomad-task-bd7464de-fa72-736b-e57c-6782cc7d7202-group-grafana-grafana-3000-sidecar-proxy service=grafana gateway="" bootstrap_file=/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/connect-proxy-grafana/secrets/envoy_bootstrap.json grpc_addr=unix://alloc/tmp/consul_grpc.sock admin_bind=127.0.0.2:19001 ready_bind=127.0.0.1:19101
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: bootstrapping Consul connect-proxy: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=connect-proxy-grafana task=connect-proxy-grafana service=grafana
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=connect-proxy-grafana path=/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/connect-proxy-grafana/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/connect-proxy-grafana/secrets/api.sock: bind: invalid argument"
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=e9a342a8-67cb-b747-1c58-839ea5d53d3b task=mimir type=Started msg="Task started by client" failed=false
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:20.047Z
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=connect-proxy-grafana @module=logmon path=/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/alloc/logs/.connect-proxy-grafana.stdout.fifo timestamp=2023-05-11T09:33:20.044Z
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=connect-proxy-grafana @module=logmon path=/opt/services/core/nomad/data/alloc/bd7464de-fa72-736b-e57c-6782cc7d7202/alloc/logs/.connect-proxy-grafana.stderr.fifo timestamp=2023-05-11T09:33:20.044Z
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin4229262119 network=unix timestamp=2023-05-11T09:33:20.043Z
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=connect-proxy-grafana version=2
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=connect-proxy-grafana @module=logmon address=/tmp/plugin605659869 network=unix timestamp=2023-05-11T09:33:20.041Z
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=1 deregistered_services=0 registered_checks=0 deregistered_checks=0
2023-05-11T11:33:20+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-bd7464de-fa72-736b-e57c-6782cc7d7202-group-grafana-grafana-3000-sidecar-proxy
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=8569
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:33:20+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=559b1cfffe0e06b6de1a936f8edffda5bf93adc014251b1bc54abd06e86b878c
2023-05-11T11:33:19+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-bd7464de-fa72-736b-e57c-6782cc7d7202-group-grafana-grafana-3000
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=connect-proxy-grafana path=/usr/local/bin/nomad pid=8546
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=connect-proxy-grafana path=/usr/local/bin/nomad
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=connect-proxy-grafana path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=connect-proxy-grafana type="Task Setup" msg="Building Task Directory" failed=false
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=connect-proxy-grafana
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.runner_hook: received result from CNI: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 result="{\"Interfaces\":{\"eth0\":{\"IPConfigs\":[{\"IP\":\"172.26.66.62\",\"Gateway\":\"172.26.64.1\"}],\"Mac\":\"fa:94:25:4c:bf:cb\",\"Sandbox\":\"/var/run/docker/netns/50fd5bc62441\"},\"nomad\":{\"IPConfigs\":null,\"Mac\":\"b6:77:25:d7:7f:65\",\"Sandbox\":\"\"},\"veth1e79e3c6\":{\"IPConfigs\":null,\"Mac\":\"8e:2f:20:b9:a8:63\",\"Sandbox\":\"\"}},\"DNS\":[{}],\"Routes\":[{\"dst\":\"0.0.0.0/0\"}]}"
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=4 ignored=9 errors=0
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=4 ignored=9
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15376 total=13 pulled=4 filtered=9
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=559b1cfffe0e06b6de1a936f8edffda5bf93adc014251b1bc54abd06e86b878c
2023-05-11T11:33:19+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=mimir container_name=mimir-e9a342a8-67cb-b747-1c58-839ea5d53d3b
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=mimir network_mode=container:0cce76aba66e350407168dc7bd91c25ebc13c9cd7441f94b345fa363c6f33eba
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=mimir labels="map[com.github.logunifier.application.name:mimir com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:2.8.0 com.hashicorp.nomad.alloc_id:e9a342a8-67cb-b747-1c58-839ea5d53d3b com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:mimir com.hashicorp.nomad.task_name:mimir]"
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=mimir binds="[]string{\"/opt/services/core/nomad/data/alloc/e9a342a8-67cb-b747-1c58-839ea5d53d3b/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/e9a342a8-67cb-b747-1c58-839ea5d53d3b/mimir/local:/local\", \"/opt/services/core/nomad/data/alloc/e9a342a8-67cb-b747-1c58-839ea5d53d3b/mimir/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/e9a342a8-67cb-b747-1c58-839ea5d53d3b/mimir/local/mimir.yml:/config/mimir.yaml\"}"
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/grafana/mimir:2.8.0 image_id=sha256:32637dc0e216f8623d88603773acd1bebf3864e8d9ade74453616aaeb2899f4a references=1
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=registry.cloud.private/grafana/mimir:2.8.0
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=mimir memory=34359738368 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/e9a342a8-67cb-b747-1c58-839ea5d53d3b/mimir/local/mimir.yml"
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 0 dependencies
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) all templates rendered
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) starting
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/e9a342a8-67cb-b747-1c58-839ea5d53d3b/mimir/local/mimir.yml"
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template b1554d65cfdf1fc13ab7265841148372
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) running initial templates
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating watcher
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":"\n# Test ++ env \"NOMAD_ALLOC_NAME\" ++\n# Do not use this configuration in production.\n# It is for demonstration purposes only.\n\n# Run Mimir in single process mode, with all components running in 1 process.\ntarget: all,alertmanager,overrides-exporter\n# Disable tendency support.\nmultitenancy_enabled: false\n\nserver:\n http_listen_port: 9009\n log_level: debug\n # Configure the server to allow messages up to 100MB.\n grpc_server_max_recv_msg_size: 104857600\n grpc_server_max_send_msg_size: 104857600\n grpc_server_max_concurrent_streams: 1000\n\nblocks_storage:\n backend: filesystem\n bucket_store:\n sync_dir: /data/tsdb-sync\n #ignore_blocks_within: 10h # default 10h\n filesystem:\n dir: /data/blocks\n tsdb:\n dir: /data/tsdb\n # Note that changing this requires changes to some other parameters like\n # -querier.query-store-after,\n # -querier.query-ingesters-within and\n # -blocks-storage.bucket-store.ignore-blocks-within.\n # retention_period: 24h # default 24h\nquerier:\n # query_ingesters_within: 13h # default 13h\n #query_store_after: 12h #default 12h\nruler_storage:\n backend: filesystem\n filesystem:\n dir: /data/rules\n\nalertmanager_storage:\n backend: filesystem\n filesystem:\n dir: /data/alarms\n\nfrontend:\n grpc_client_config:\n grpc_compression: snappy\n\nfrontend_worker:\n grpc_client_config:\n grpc_compression: snappy\n\ningester_client:\n grpc_client_config:\n grpc_compression: snappy\n\nquery_scheduler:\n grpc_client_config:\n grpc_compression: snappy\n\nalertmanager:\n data_dir: /data/alertmanager\n# retention: 120h\n sharding_ring:\n replication_factor: 1\n alertmanager_client:\n grpc_compression: snappy\n\nruler:\n query_frontend:\n grpc_client_config:\n grpc_compression: snappy\n\ncompactor:\n# compaction_interval: 1h # default 1h\n# deletion_delay: 12h # default 12h\n max_closing_blocks_concurrency: 2\n max_opening_blocks_concurrency: 4\n symbols_flushers_concurrency: 4\n data_dir: /data/compactor\n sharding_ring:\n kvstore:\n store: memberlist\n\n\ningester:\n ring:\n replication_factor: 1\n\nstore_gateway:\n sharding_ring:\n replication_factor: 1\n\nlimits:\n # Limit queries to 5 years. You can override this on a per-tenant basis.\n max_total_query_length: 43680h\n max_label_names_per_series: 42\n # Allow ingestion of out-of-order samples up to 2 hours since the latest received sample for the series.\n out_of_order_time_window: 1d\n # delete old blocks from long-term storage.\n # Delete from storage metrics data older than 1d.\n compactor_blocks_retention_period: 1d\n ingestion_rate: 100000","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/e9a342a8-67cb-b747-1c58-839ea5d53d3b/mimir/local/mimir.yml","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"++","RightDelim":"++","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/e9a342a8-67cb-b747-1c58-839ea5d53d3b/mimir"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000}
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating new runner (dry: false, once: false)
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=e9a342a8-67cb-b747-1c58-839ea5d53d3b task=mimir @module=logmon path=/opt/services/core/nomad/data/alloc/e9a342a8-67cb-b747-1c58-839ea5d53d3b/alloc/logs/.mimir.stdout.fifo timestamp=2023-05-11T09:33:19.829Z
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=e9a342a8-67cb-b747-1c58-839ea5d53d3b task=mimir @module=logmon path=/opt/services/core/nomad/data/alloc/e9a342a8-67cb-b747-1c58-839ea5d53d3b/alloc/logs/.mimir.stderr.fifo timestamp=2023-05-11T09:33:19.830Z
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=e9a342a8-67cb-b747-1c58-839ea5d53d3b task=mimir version=2
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=e9a342a8-67cb-b747-1c58-839ea5d53d3b task=mimir address=/tmp/plugin3710024115 network=unix @module=logmon timestamp=2023-05-11T09:33:19.826Z
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=1 deregistered_services=0 registered_checks=0 deregistered_checks=0
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=e9a342a8-67cb-b747-1c58-839ea5d53d3b task=mimir path=/usr/local/bin/nomad
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=e9a342a8-67cb-b747-1c58-839ea5d53d3b task=mimir path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=e9a342a8-67cb-b747-1c58-839ea5d53d3b task=mimir path=/usr/local/bin/nomad pid=8402
2023-05-11T11:33:19+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-e9a342a8-67cb-b747-1c58-839ea5d53d3b-group-mimir-mimir-api
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=e9a342a8-67cb-b747-1c58-839ea5d53d3b task=mimir type="Task Setup" msg="Building Task Directory" failed=false
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=e9a342a8-67cb-b747-1c58-839ea5d53d3b task=mimir
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.runner_hook: received result from CNI: alloc_id=e9a342a8-67cb-b747-1c58-839ea5d53d3b result="{\"Interfaces\":{\"eth0\":{\"IPConfigs\":[{\"IP\":\"172.26.66.61\",\"Gateway\":\"172.26.64.1\"}],\"Mac\":\"86:03:59:c6:c9:86\",\"Sandbox\":\"/var/run/docker/netns/7a6a75ce4888\"},\"nomad\":{\"IPConfigs\":null,\"Mac\":\"b6:77:25:d7:7f:65\",\"Sandbox\":\"\"},\"veth7647c7d7\":{\"IPConfigs\":null,\"Mac\":\"92:fe:53:21:a9:c5\",\"Sandbox\":\"\"}},\"DNS\":[{}],\"Routes\":[{\"dst\":\"0.0.0.0/0\"}]}"
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="609.787µs"
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=7
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_migrator: waiting for previous alloc to terminate: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee previous_alloc=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_migrator: waiting for previous alloc to terminate: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee previous_alloc=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=6
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_migrator: waiting for previous alloc to terminate: alloc_id=e9a342a8-67cb-b747-1c58-839ea5d53d3b previous_alloc=24c845fb-fff0-3707-fe56-95075c5f28dc
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_migrator: waiting for previous alloc to terminate: alloc_id=e9a342a8-67cb-b747-1c58-839ea5d53d3b previous_alloc=24c845fb-fff0-3707-fe56-95075c5f28dc
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=5
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_migrator: waiting for previous alloc to terminate: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 previous_alloc=434f71a9-4b50-8512-effc-5858456f87be
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_migrator: waiting for previous alloc to terminate: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 previous_alloc=434f71a9-4b50-8512-effc-5858456f87be
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=1 removed=0 updated=4 ignored=8 errors=0
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=connect-proxy-keycloak-postgres type=Received msg="Task received by client" failed=false
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=857749e0-52fe-92ee-7bef-fafbe67605ee task=keycloak_postgres type=Received msg="Task received by client" failed=false
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15373 total=13 pulled=7 filtered=6
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=1 removed=0 updated=4 ignored=8
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=e9a342a8-67cb-b747-1c58-839ea5d53d3b task=mimir type=Received msg="Task received by client" failed=false
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=2 removed=0 updated=4 ignored=6 errors=0
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=connect-proxy-grafana type=Received msg="Task received by client" failed=false
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=bd7464de-fa72-736b-e57c-6782cc7d7202 task=grafana type=Received msg="Task received by client" failed=false
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15370 total=12 pulled=6 filtered=6
2023-05-11T11:33:19+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=2 removed=0 updated=4 ignored=6
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=f1aa7cec-a641-9a90-c558-d132cfac8d2e type=service namespace=default job_id=security node_id="" triggered_by=alloc-failure
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=f1aa7cec-a641-9a90-c558-d132cfac8d2e job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=f1aa7cec-a641-9a90-c558-d132cfac8d2e job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=f1aa7cec-a641-9a90-c558-d132cfac8d2e type=service namespace=default job_id=security node_id="" triggered_by=alloc-failure
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=f1aa7cec-a641-9a90-c558-d132cfac8d2e job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=2bf3485e-2620-3436-0939-3081bc294002 type=service namespace=default job_id=security node_id="" triggered_by=alloc-failure
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=2bf3485e-2620-3436-0939-3081bc294002
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=2bf3485e-2620-3436-0939-3081bc294002 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] nomad: evaluating plan: plan="(eval 2bf3485e, job security, NodeUpdates: (node[36d1fc65] (a1aad1ed stop/evict) (8f3fb4a6 stop/evict)), NodeAllocations: (node[36d1fc65] (857749e0 security.keycloak-postgres[0] run))(node[f652ee64] (57e22169 security.keycloak[0] run)))"
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak": (place 1) (inplace 0) (destructive 0) (stop 1) (migrate 0) (ignore 0) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 2) (destructive 0) (inplace 0) (stop 2) (disconnect 0) (reconnect 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-postgres": (place 1) (inplace 0) (destructive 0) (stop 1) (migrate 0) (ignore 0) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=2bf3485e-2620-3436-0939-3081bc294002 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=2bf3485e-2620-3436-0939-3081bc294002 type=service namespace=default job_id=security node_id="" triggered_by=alloc-failure
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=2bf3485e-2620-3436-0939-3081bc294002 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=33406d97-1bd8-8441-dd20-4db3e06ee954 type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=33406d97-1bd8-8441-dd20-4db3e06ee954 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=33406d97-1bd8-8441-dd20-4db3e06ee954 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=33406d97-1bd8-8441-dd20-4db3e06ee954 type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=33406d97-1bd8-8441-dd20-4db3e06ee954 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=c4d5c467-5391-f78e-3cdf-b67a1642ef8e type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=c4d5c467-5391-f78e-3cdf-b67a1642ef8e job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=c4d5c467-5391-f78e-3cdf-b67a1642ef8e
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "mimir": (place 1) (inplace 0) (destructive 0) (stop 1) (migrate 0) (ignore 0) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana": (place 1) (inplace 0) (destructive 0) (stop 1) (migrate 0) (ignore 0) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 2) (destructive 0) (inplace 0) (stop 2) (disconnect 0) (reconnect 0)
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=c4d5c467-5391-f78e-3cdf-b67a1642ef8e type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=c4d5c467-5391-f78e-3cdf-b67a1642ef8e job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] nomad: evaluating plan: plan="(eval c4d5c467, job observability, NodeUpdates: (node[36d1fc65] (24c845fb stop/evict) (434f71a9 stop/evict)), NodeAllocations: (node[36d1fc65] (e9a342a8 observability.mimir[0] run) (bd7464de observability.grafana[0] run)))"
2023-05-11T11:33:19+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=c4d5c467-5391-f78e-3cdf-b67a1642ef8e job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:33:18+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:33:18+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused"
2023-05-11T11:33:18+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="475.972µs"
2023-05-11T11:33:18+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused"
2023-05-11T11:33:18+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:33:18+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:33:17+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=3.556834ms
2023-05-11T11:33:16+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91
2023-05-11T11:33:16+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3
2023-05-11T11:33:16+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.454711ms
2023-05-11T11:33:16+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=3.629444ms
2023-05-11T11:33:16+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:43746
2023-05-11T11:33:15+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="922.81µs"
2023-05-11T11:33:14+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=3.783563ms
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=4 ignored=6 errors=0
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15369 total=10 pulled=4 filtered=6
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=4 ignored=6
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=4
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=0 deregistered_services=1 registered_checks=0 deregistered_checks=0
2023-05-11T11:33:13+02:00 [consul.service 💻 worker-01] [✅] agent: Deregistered service: service=_nomad-task-dcd787cd-aa19-0c02-2a4c-676ab7dd0520-group-loki-loki-http
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki type=Killing msg="Sent interrupt. Waiting 5s before force killing" failed=false
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=5 ignored=5 errors=0
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15367 total=10 pulled=5 filtered=5
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=5 ignored=5
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=4 ignored=6 errors=0
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=4 ignored=6
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15365 total=10 pulled=4 filtered=6
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=d80b728a-f942-5dc2-e6c4-1fa557b19a64 type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=d80b728a-f942-5dc2-e6c4-1fa557b19a64 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=d80b728a-f942-5dc2-e6c4-1fa557b19a64
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [👀] nomad: evaluating plan: plan="(eval d80b728a, job observability, NodeAllocations: (node[36d1fc65] (dcd787cd observability.loki[0] run)))"
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: found reschedulable allocs, followup eval created: eval_id=d80b728a-f942-5dc2-e6c4-1fa557b19a64 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 followup_eval_id=c8803169-3052-ff6f-4027-786f5675356f
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker: created evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=d80b728a-f942-5dc2-e6c4-1fa557b19a64 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=d80b728a-f942-5dc2-e6c4-1fa557b19a64 type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=d80b728a-f942-5dc2-e6c4-1fa557b19a64 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] nomad.client: adding evaluations for rescheduling failed allocations: num_evals=1
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) received finish
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin exited: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) stopping
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner.task_hook.logmon: plugin process exited: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki path=/usr/local/bin/nomad pid=6674
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] agent: (watcher) stopping all views
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [✅] client.gc: marking allocation for GC: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: task run loop exiting: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) stopping watcher
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.stdio: received EOF, stopping recv loop: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki type="Not Restarting" msg="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" failed=true
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:2162690d5e710430db23773f44d4e68408e89f21fe55e24a1a6c627812d90d40 references=0
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: not restarting task: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki reason="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\""
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=7955
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki type=Terminated msg="Exit Code: 1, Exit Message: \"Docker container exited with non-zero exit code: 1\"" failed=false
2023-05-11T11:33:13+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-7c33dfaf21ce09baf344a19ef5b57003ed6cef9e0840838f95b63683eb295cf2.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=4 ignored=6 errors=0
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=4 ignored=6
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15363 total=10 pulled=4 filtered=6
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates applied: added=0 removed=0 updated=2 ignored=8 errors=0
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=d9fae19c-45e6-6062-0c9a-94f00d5b3696 type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client: allocation updates: added=0 removed=0 updated=2 ignored=8
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞] client: updated allocations: index=15359 total=10 pulled=2 filtered=8
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=d9fae19c-45e6-6062-0c9a-94f00d5b3696
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=d9fae19c-45e6-6062-0c9a-94f00d5b3696 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [👀] nomad: evaluating plan: plan="(eval d9fae19c, job observability, NodeAllocations: (node[36d1fc65] (24c845fb observability.mimir[0] run) (434f71a9 observability.grafana[0] run)))"
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: found reschedulable allocs, followup eval created: eval_id=d9fae19c-45e6-6062-0c9a-94f00d5b3696 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 followup_eval_id=c4d5c467-5391-f78e-3cdf-b67a1642ef8e
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker: created evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: found reschedulable allocs, followup eval created: eval_id=d9fae19c-45e6-6062-0c9a-94f00d5b3696 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 followup_eval_id=33406d97-1bd8-8441-dd20-4db3e06ee954
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker: created evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=d9fae19c-45e6-6062-0c9a-94f00d5b3696 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=d9fae19c-45e6-6062-0c9a-94f00d5b3696 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=d9fae19c-45e6-6062-0c9a-94f00d5b3696 type=service namespace=default job_id=observability node_id="" triggered_by=alloc-failure
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=711f2e2e-bd4f-549f-0739-d78e4cca7da0 type=service namespace=default job_id=security node_id="" triggered_by=alloc-failure
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=711f2e2e-bd4f-549f-0739-d78e4cca7da0 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:33:13+02:00 [nomad.service 💻 master-01] [🐞] worker: submitted plan for evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=711f2e2e-bd4f-549f-0739-d78e4cca7da0
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki type=Started msg="Task started by client" failed=false
2023-05-11T11:33:13+02:00 [nomad.service 💻 worker-01] [🐞]
client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:13.382Z 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15356 total=10 pulled=0 filtered=10 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=0 ignored=10 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=0 ignored=10 errors=0 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin803486342 network=unix timestamp=2023-05-11T09:33:13.379Z 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad: evaluating plan: plan="(eval 711f2e2e, job security, NodeAllocations: (node[36d1fc65] (8f3fb4a6 security.keycloak[0] run) (a1aad1ed security.keycloak-postgres[0] run)))" 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker.service_sched: found reschedulable allocs, followup eval created: eval_id=711f2e2e-bd4f-549f-0739-d78e4cca7da0 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 followup_eval_id=f1aa7cec-a641-9a90-c558-d132cfac8d2e 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: created evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval="" 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker.service_sched: found reschedulable allocs, followup eval created: eval_id=711f2e2e-bd4f-549f-0739-d78e4cca7da0 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 followup_eval_id=2bf3485e-2620-3436-0939-3081bc294002 
2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: created evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval="" 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results= 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0) 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path="/v1/jobs?meta=true&index=15327" duration=1.209623333s 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=711f2e2e-bd4f-549f-0739-d78e4cca7da0 type=service namespace=default job_id=security node_id="" triggered_by=alloc-failure 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker.service_sched.binpack: NewBinPackIterator created: eval_id=711f2e2e-bd4f-549f-0739-d78e4cca7da0 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker.service_sched: reconciled current state with desired state: 
eval_id=711f2e2e-bd4f-549f-0739-d78e4cca7da0 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.client: adding evaluations for rescheduling failed allocations: num_evals=2 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=7c33dfaf21ce09baf344a19ef5b57003ed6cef9e0840838f95b63683eb295cf2 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=7955 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=5 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=6 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=7c33dfaf21ce09baf344a19ef5b57003ed6cef9e0840838f95b63683eb295cf2 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count 
decremented: driver=docker image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=7 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=loki memory=34359738368 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=loki container_name=loki-dcd787cd-aa19-0c02-2a4c-676ab7dd0520 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=loki network_mode=container:5f689cac61874e6b55a015b07f723f9bce8e57a55f5fbc94d41131bb77e8b546 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=loki binds="[]string{\"/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/loki/local:/local\", \"/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/loki/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/loki/local/loki.yaml:/config/loki.yaml\"}" 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=loki labels="map[com.github.logunifier.application.name:loki com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:2.8.2 com.hashicorp.nomad.alloc_id:dcd787cd-aa19-0c02-2a4c-676ab7dd0520 com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 
com.hashicorp.nomad.task_group_name:loki com.hashicorp.nomad.task_name:loki]" 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: cancelling removal of container image: driver=docker image_name=registry.cloud.private/grafana/loki:2.8.2 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/grafana/loki:2.8.2 image_id=sha256:2162690d5e710430db23773f44d4e68408e89f21fe55e24a1a6c627812d90d40 references=1 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki @module=logmon path=/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/alloc/logs/.loki.stderr.fifo timestamp=2023-05-11T09:33:13.196Z 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki path=/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/alloc/logs/.loki.stdout.fifo @module=logmon timestamp=2023-05-11T09:33:13.195Z 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki 2023-05-11T11:33:13+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b 2023-05-11T11:33:13+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.gc: marking allocation for GC: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] 
client.alloc_runner.task_runner: task run loop exiting: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner: waiting for task to exit: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin exited: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner.task_hook.logmon: plugin process exited: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres path=/usr/local/bin/nomad pid=6622 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.stdio: received EOF, stopping recv loop: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.gc: marking allocation for GC: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: task run loop exiting: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner.task_hook.logmon: plugin process exited: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak path=/usr/local/bin/nomad pid=6663 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner: waiting for task to exit: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin exited: 
alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) stopping 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (watcher) stopping all views 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: task run loop exiting: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner.task_hook.logmon: plugin process exited: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres path=/usr/local/bin/nomad pid=6627 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) received finish 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin exited: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) stopping watcher 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres type=Killing msg="Sent interrupt. 
Waiting 5s before force killing" failed=false 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.stdio: received EOF, stopping recv loop: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.stdio: received EOF, stopping recv loop: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook.logmon.nomad: timed out waiting for read-side of process output pipe to close: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres @module=logmon timestamp=2023-05-11T09:33:13.033Z 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook.logmon.nomad: timed out waiting for read-side of process output pipe to close: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres @module=logmon timestamp=2023-05-11T09:33:13.033Z 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (watcher) stopping all views 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) stopping watcher 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin exited: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) stopping 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak type=Killing msg="Sent interrupt. 
Waiting 5s before force killing" failed=false 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) received finish 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: task run loop exiting: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner.task_hook.logmon: plugin process exited: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak path=/usr/local/bin/nomad pid=6667 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook.logmon.nomad: timed out waiting for read-side of process output pipe to close: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak @module=logmon timestamp=2023-05-11T09:33:13.029Z 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.stdio: received EOF, stopping recv loop: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook.logmon.nomad: timed out waiting for read-side of process output pipe to close: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak @module=logmon timestamp=2023-05-11T09:33:13.029Z 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.stdio: received EOF, stopping recv loop: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner: waiting for task to exit: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: task 
run loop exiting: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin exited: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner.task_hook.logmon: plugin process exited: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana path=/usr/local/bin/nomad pid=6628 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.gc: marking allocation for GC: alloc_id=434f71a9-4b50-8512-effc-5858456f87be 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) stopping watcher 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana type=Killing msg="Sent interrupt. Waiting 5s before force killing" failed=false 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) received finish 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: task run loop exiting: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner.task_hook.logmon: plugin process exited: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana path=/usr/local/bin/nomad pid=6637 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin exited: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (watcher) stopping all views 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) stopping 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] 
client.alloc_runner.task_runner.task_hook.logmon.stdio: received EOF, stopping recv loop: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook.logmon.nomad: timed out waiting for read-side of process output pipe to close: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana @module=logmon timestamp=2023-05-11T09:33:13.016Z 2023-05-11T11:33:13+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook.logmon.nomad: timed out waiting for read-side of process output pipe to close: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana @module=logmon timestamp=2023-05-11T09:33:13.016Z 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration=5.460373ms 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=b4f65730-d7a1-d275-daa0-ee938e535b2b from=WaitingToDequeue to=Paused 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=3f4aa19f-32ab-9278-9c9a-42b8f845a391 from=WaitingToDequeue to=Paused 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/namespaces duration="417.348ยตs" 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] consul.sync: sync complete: registered_services=2 deregistered_services=0 registered_checks=0 deregistered_checks=0 2023-05-11T11:33:12+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-a37da363-9048-86d5-93e1-d6facf1490b1-minio-minio-console-console 2023-05-11T11:33:12+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-a37da363-9048-86d5-93e1-d6facf1490b1-minio-minio-http 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป 
worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio type=Started msg="Task started by client" failed=false 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:12.108Z 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker address=/tmp/plugin2622050451 network=unix @module=docker_logger timestamp=2023-05-11T09:33:12.105Z 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=7658 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=547d7e83b2604951f51a6692156f9ff3f680c1fed5107911e46620c056210788 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=PUT path=/v1/node/36d1fc65-c097-97bc-18ac-079c1262ccfd/eligibility duration=1.594538ms 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path=/v1/node/36d1fc65-c097-97bc-18ac-079c1262ccfd duration=2.816783ms 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET 
path=/v1/nodes?prefix=36d1fc65-c097-97bc-18ac-079c1262ccfd duration=2.976034971s 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates: added=0 removed=0 updated=0 ignored=10 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: allocation updates applied: added=0 removed=0 updated=0 ignored=10 errors=0 2023-05-11T11:33:12+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client: updated allocations: index=15343 total=10 pulled=0 filtered=10 2023-05-11T11:33:11+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration=1.093329ms 2023-05-11T11:33:11+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=9.878309ms 2023-05-11T11:33:11+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/namespaces duration=1.776816ms 2023-05-11T11:33:11+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/jobs?meta=true duration="645.707ยตs" 2023-05-11T11:33:11+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=POST path=/v1/search/fuzzy duration=1.044719876s 2023-05-11T11:33:11+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=547d7e83b2604951f51a6692156f9ff3f680c1fed5107911e46620c056210788 2023-05-11T11:33:11+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue 2023-05-11T11:33:11+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval="" 2023-05-11T11:33:11+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=07891dd4-4f70-f4a0-ce4d-5440c0c79d17 type=service namespace=default job_id=observability 
node_id=36d1fc65-c097-97bc-18ac-079c1262ccfd triggered_by=node-update
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=07891dd4-4f70-f4a0-ce4d-5440c0c79d17 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=07891dd4-4f70-f4a0-ce4d-5440c0c79d17 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=07891dd4-4f70-f4a0-ce4d-5440c0c79d17 job_id=observability namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=07891dd4-4f70-f4a0-ce4d-5440c0c79d17 type=service namespace=default job_id=observability node_id=36d1fc65-c097-97bc-18ac-079c1262ccfd triggered_by=node-update
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=3f440711-87a5-6581-0e26-47bdf9ca0820 type=service namespace=default job_id=minio node_id=36d1fc65-c097-97bc-18ac-079c1262ccfd triggered_by=node-update
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "minio": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Scheduling to=WaitingToDequeue
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=3f440711-87a5-6581-0e26-47bdf9ca0820 type=service namespace=default job_id=minio node_id=36d1fc65-c097-97bc-18ac-079c1262ccfd triggered_by=node-update
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=3f440711-87a5-6581-0e26-47bdf9ca0820 job_id=minio namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=3f440711-87a5-6581-0e26-47bdf9ca0820 job_id=minio namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=3f440711-87a5-6581-0e26-47bdf9ca0820 job_id=minio namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=45287a62-e384-ff13-ef41-26be6f20c432 type=service namespace=default job_id=security node_id=36d1fc65-c097-97bc-18ac-079c1262ccfd triggered_by=node-update
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval=""
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: reconciled current state with desired state: eval_id=45287a62-e384-ff13-ef41-26be6f20c432 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker.service_sched: setting eval status: eval_id=45287a62-e384-ff13-ef41-26be6f20c432 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 status=complete
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 eval_id=45287a62-e384-ff13-ef41-26be6f20c432 type=service namespace=default job_id=security node_id=36d1fc65-c097-97bc-18ac-079c1262ccfd triggered_by=node-update
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=45287a62-e384-ff13-ef41-26be6f20c432 job_id=security namespace=default worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 algorithm=spread
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingForRaft to=Scheduling
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] client: evaluations triggered by node registration: num_evals=4
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [✅] client: node registration complete
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd from=Scheduling to=Paused
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker: ack evaluation: worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd eval_id=660dce64-5ce1-c051-c96e-18f6083f7cc9 type=system namespace=default job_id=ingress node_id=36d1fc65-c097-97bc-18ac-079c1262ccfd triggered_by=node-update
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker: updated evaluation: worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd eval=""
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker.system_sched: reconciled current state with desired state: eval_id=660dce64-5ce1-c051-c96e-18f6083f7cc9 job_id=ingress namespace=default worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd results="allocs: (place 0) (update 0) (migrate 0) (stop 0) (ignore 3) (lost 0) (disconnecting 0) (reconnecting 0)"
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker.system_sched: setting eval status: eval_id=660dce64-5ce1-c051-c96e-18f6083f7cc9 job_id=ingress namespace=default worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd status=complete
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker.system_sched.binpack: NewBinPackIterator created: eval_id=660dce64-5ce1-c051-c96e-18f6083f7cc9 job_id=ingress namespace=default worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd algorithm=spread
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd from=WaitingForRaft to=Scheduling
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] worker: dequeued evaluation: worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd eval_id=660dce64-5ce1-c051-c96e-18f6083f7cc9 type=system namespace=default job_id=ingress node_id=36d1fc65-c097-97bc-18ac-079c1262ccfd triggered_by=node-update
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=0 deregistered_services=2 registered_checks=0 deregistered_checks=0
2023-05-11T11:33:11+02:00 [consul.service 💻 worker-01] [✅] agent: Deregistered service: service=_nomad-task-a37da363-9048-86d5-93e1-d6facf1490b1-minio-minio-http
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container startup command: driver=docker task_name=minio command="server /data --console-address :9001 --certs-dir /certs"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=minio network_mode=container:8fd2dea07fb2c7b0e687c8c4abecf5c81db4d9c16eb9b72e035d1b495d302652
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=minio container_name=minio-a37da363-9048-86d5-93e1-d6facf1490b1
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=minio labels="map[com.hashicorp.nomad.alloc_id:a37da363-9048-86d5-93e1-d6facf1490b1 com.hashicorp.nomad.job_id:minio com.hashicorp.nomad.job_name:minio com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:minio com.hashicorp.nomad.task_name:minio]"
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=6.442613ms
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=minio memory=4294967296 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=minio binds="[]string{\"/opt/services/core/nomad/data/alloc/a37da363-9048-86d5-93e1-d6facf1490b1/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/a37da363-9048-86d5-93e1-d6facf1490b1/minio/local:/local\", \"/opt/services/core/nomad/data/alloc/a37da363-9048-86d5-93e1-d6facf1490b1/minio/secrets:/secrets\"}"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/minio/minio:RELEASE.2023-04-28T18-11-17Z image_id=sha256:5ba81f3dad7fb4d608d375ec64cac33fcb196e0ed530be35e002177639b11d21 references=1
2023-05-11T11:33:11+02:00 [consul.service 💻 worker-01] [✅] agent: Deregistered service: service=_nomad-task-a37da363-9048-86d5-93e1-d6facf1490b1-minio-minio-console-console
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/grafana/secrets/env.vars"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) all templates rendered
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) nomad.var.block(nomad/jobs/observability@default.global) is still needed
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio type=Restarting msg="Task restarting in 0s" failed=false
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: restarting task: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio reason="" delay=0s
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [⚠] client.driver_mgr.docker: RemoveImage on non-referenced counted image id: driver=docker image_id=sha256:5ba81f3dad7fb4d608d375ec64cac33fcb196e0ed530be35e002177639b11d21
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/grafana/secrets/env.vars"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) receiving dependency nomad.var.block(nomad/jobs/observability@default.global)
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template aa33fb80064c17acc1b58a55de5165e2
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] nomad.drain.job_watcher: getting job allocs at index: index=15343
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] nomad: blocked evals status modified: paused=false
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] nomad.drain.job_watcher: getting job allocs at index: index=1
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] nomad: eval broker status modified: paused=false
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] nomad.drain.job_watcher: retrieved allocs for draining jobs: num_allocs=0 index=15343
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] nomad.autopilot: state update routine is now running
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] nomad.autopilot: autopilot is now running
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio type=Terminated msg="Exit Code: 0" failed=false
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.stats_hook: failed to start stats collection for task with unrecoverable error: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio error="container stopped"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: attempted to stop an not-running container: container_id=857157e7d59654c13670dcefbad1f5b8803b839808a4cd1444ac26c4086d4284 driver=docker
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio type="Restart Signaled" msg="Template with change_mode restart re-rendered" failed=false
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) nomad.var.block(nomad/jobs/security@default.global) is still needed
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/8f3fb4a6-629a-7afd-a334-5580bf2d3374/keycloak/secrets/env.vars"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) all templates rendered
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/8f3fb4a6-629a-7afd-a334-5580bf2d3374/keycloak/secrets/env.vars"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) receiving dependency nomad.var.block(nomad/jobs/security@default.global)
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template 7382522149c1fa0ee539c4acc2b323ba
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed worker status: worker_id=3f4aa19f-32ab-9278-9c9a-42b8f845a391 from=Started to=Pausing
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed worker status: worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd from=Started to=Pausing
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] worker: changed worker status: worker_id=b4f65730-d7a1-d275-daa0-ee938e535b2b from=Started to=Pausing
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=275566f2-1745-48d5-3c1d-eda320c72c4b job_id=security namespace=default algorithm=spread
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] nomad.fsm.service_sched: setting eval status: eval_id=51eb2f95-831a-feff-7092-ec0204ed6d2a job_id=observability namespace=default status=complete
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=275566f2-1745-48d5-3c1d-eda320c72c4b job_id=security namespace=default
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] nomad.fsm.service_sched: setting eval status: eval_id=275566f2-1745-48d5-3c1d-eda320c72c4b job_id=security namespace=default status=complete
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] nomad.fsm.service_sched: setting eval status: eval_id=cae1ba9c-57e6-c9de-4528-9980d7fb47e0 job_id=minio namespace=default status=complete
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=51eb2f95-831a-feff-7092-ec0204ed6d2a job_id=observability namespace=default
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=51eb2f95-831a-feff-7092-ec0204ed6d2a job_id=observability namespace=default algorithm=spread
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "minio": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] nomad.fsm.system_sched.binpack: NewBinPackIterator created: eval_id=76e93ac6-19a5-0c71-39a9-e012822e0a2a job_id=ingress namespace=default algorithm=spread
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=cae1ba9c-57e6-c9de-4528-9980d7fb47e0 job_id=minio namespace=default
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] nomad.fsm.system_sched: setting eval status: eval_id=76e93ac6-19a5-0c71-39a9-e012822e0a2a job_id=ingress namespace=default status=complete
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [🐞] nomad.fsm.system_sched: reconciled current state with desired state: eval_id=76e93ac6-19a5-0c71-39a9-e012822e0a2a job_id=ingress namespace=default results="allocs: (place 0) (update 0) (migrate 0) (stop 0) (ignore 2) (lost 0) (disconnecting 0) (reconnecting 0)"
2023-05-11T11:33:11+02:00 [nomad.service 💻 master-01] [👀] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=cae1ba9c-57e6-c9de-4528-9980d7fb47e0 job_id=minio namespace=default algorithm=spread
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" duration=1.141413ms
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" duration=1.804019ms
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-857157e7d59654c13670dcefbad1f5b8803b839808a4cd1444ac26c4086d4284.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) all templates rendered
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) nomad.var.block(nomad/jobs/minio@default.global) is still needed
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/a37da363-9048-86d5-93e1-d6facf1490b1/minio/secrets/env.vars"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/a37da363-9048-86d5-93e1-d6facf1490b1/minio/secrets/env.vars"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template 124b5b35ead742dca1d9561857552116
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) receiving dependency nomad.var.block(nomad/jobs/minio@default.global)
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/keycloak_postgres/secrets/env.vars"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) all templates rendered
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) nomad.var.block(nomad/jobs/security@default.global) is still needed
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) receiving dependency nomad.var.block(nomad/jobs/security@default.global)
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/keycloak_postgres/local/initddb.sql"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template 6e2d32cd4b2988cfe8f68f510d993549
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template 3d38a7b72b56e929ec016a5e38bef38f
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/keycloak_postgres/secrets/env.vars"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" duration="825.332µs"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" duration=3.821764ms
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:11+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:11+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:33872: EOF
2023-05-11T11:33:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/members duration=1.140014ms
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/operator/license duration="4.058µs"
2023-05-11T11:33:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/acl/token/self duration="949.016µs"
2023-05-11T11:33:10+02:00 [nomad.service 💻 master-01] [🐞] http: request failed: method=GET path=/v1/acl/token/self error="RPC Error:: 400,ACL support disabled" code=400
2023-05-11T11:33:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/regions duration="203.707µs"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="857.68µs"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-857157e7d59654c13670dcefbad1f5b8803b839808a4cd1444ac26c4086d4284.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745
2023-05-11T11:33:10+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=_nomad-check-d3a0a820c835edfd96b2b113480c431d255b8840
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:10+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:09+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:09+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:09+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed
to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to 
authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 
/sys/fs/cgroup/nomad.slice/docker-857157e7d59654c13670dcefbad1f5b8803b839808a4cd1444ac26c4086d4284.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] consul.sync: sync complete: registered_services=0 deregistered_services=1 registered_checks=0 deregistered_checks=0 2023-05-11T11:33:09+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Deregistered service: service=_nomad-task-24c845fb-fff0-3707-fe56-95075c5f28dc-group-mimir-mimir-api 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=8 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service 
๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="608.744ยตs" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="607.289ยตs" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET 
url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) stopping 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) received finish 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: RemoveImage on non-referenced counted image id: driver=docker image_id=sha256:cefd1c9e490c8b581d834d878081cf64c133df1f9f443c5e5f8d94fbd7c7a1d4 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (watcher) stopping all views 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.gc: marking allocation for GC: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) stopping watcher 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: task run loop exiting: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin exited: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner.task_hook.logmon: plugin process exited: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir path=/usr/local/bin/nomad pid=6616 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.stdio: received EOF, stopping recv loop: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] 
client.driver_mgr.docker: RemoveImage on non-referenced counted image id: driver=docker image_id=sha256:c3f97ccc75b7ce6b444f7a9d1093acfdd6a78038a63e2993cf34f9d2cc409e36 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:32637dc0e216f8623d88603773acd1bebf3864e8d9ade74453616aaeb2899f4a references=0 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: not restarting task: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir reason="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir type="Not Restarting" msg="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" failed=true 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] consul.sync: sync complete: registered_services=0 deregistered_services=1 registered_checks=0 deregistered_checks=0 2023-05-11T11:33:09+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Deregistered service: service=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: RemoveImage on non-referenced counted image id: driver=docker image_id=sha256:7e35787ef40f6cf939d161c814f18b91c0162bf61e1ad2ddda37156e77bda016 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=7345 2023-05-11T11:33:09+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Deregistered service: service=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] 
client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir type=Terminated msg="Exit Code: 1, Exit Message: \"Docker container exited with non-zero exit code: 1\"" failed=false 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: attempted to stop an not-running container: container_id=131627b48ae95a652e4260c0b86e03392e264deb5e3d371f8e4e0d3704eb0e7e driver=docker 2023-05-11T11:33:09+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres type=Terminated msg="Exit Code: 0" failed=false 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres type=Killing msg="Sent interrupt. 
Waiting 5s before force killing" failed=false 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres type="Sibling Task Failed" msg="Task's sibling \"connect-proxy-keycloak-postgres\" failed" failed=false 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner: task failure, destroying all tasks: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d failed_task=connect-proxy-keycloak-postgres 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] consul.sync: sync complete: registered_services=0 deregistered_services=1 registered_checks=0 deregistered_checks=0 2023-05-11T11:33:08+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Deregistered service: service=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: task run loop exiting: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=await-for-keycloak-postgres 2023-05-11T11:33:08+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Deregistered service: service=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak type=Terminated msg="Exit Code: 143, Exit Message: \"Docker container exited with non-zero exit code: 143\"" failed=false 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a references=0 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: not restarting task: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d 
task=connect-proxy-keycloak-postgres reason="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: attempted to stop an not-running container: container_id=8cc6bed2d8af2b99e583850d84bb5480133079fc6e3f2203fd2aaa5e2153145a driver=docker 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres type="Not Restarting" msg="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" failed=true 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=await-for-keycloak-postgres type=Killing msg="Sent interrupt. Waiting 5s before force killing" failed=false 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] consul.sync: sync complete: registered_services=0 deregistered_services=1 registered_checks=0 deregistered_checks=0 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak type=Killing msg="Sent interrupt. 
Waiting 5s before force killing" failed=false 2023-05-11T11:33:08+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Deregistered service: service=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner: task failure, destroying all tasks: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 failed_task=connect-proxy-keycloak 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak type="Sibling Task Failed" msg="Task's sibling \"connect-proxy-keycloak\" failed" failed=false 2023-05-11T11:33:08+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Deregistered service: service=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak type="Not Restarting" msg="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" failed=true 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: not restarting task: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak reason="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a references=1 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana type=Terminated msg="Exit Code: 0" failed=false 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: attempted to stop an 
not-running container: container_id=d349b73f4a9622cd0802ed03da77e1883b8fbbff19e82ba911cc5e0003bd373d driver=docker 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana type=Killing msg="Sent interrupt. Waiting 5s before force killing" failed=false 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=7252 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner: task failure, destroying all tasks: alloc_id=434f71a9-4b50-8512-effc-5858456f87be failed_task=connect-proxy-grafana 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana type="Sibling Task Failed" msg="Task's sibling \"connect-proxy-grafana\" failed" failed=false 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres type=Terminated msg="Exit Code: 1, Exit Message: \"Docker container exited with non-zero exit code: 1\"" failed=false 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: not restarting task: 
alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana reason="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a references=2 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana type="Not Restarting" msg="Exceeded allowed attempts 1 in interval 1h0m0s and mode is \"fail\"" failed=true 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=7309 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: read unix @->/tmp/plugin981231429: read: connection reset by peer" 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak type=Terminated msg="Exit Code: 1, Exit Message: \"Docker container exited with non-zero exit code: 1\"" failed=false 
2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=7266 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana type=Terminated msg="Exit Code: 1, Exit Message: \"Docker container exited with non-zero exit code: 1\"" failed=false 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:08+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET 
url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [consul.service 💻 master-01] [✅] agent: Synced check: check=_nomad-check-c9bab43dc01aaa4f2fb76ff56f6bc875a603b652
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir type=Started msg="Task started by client" failed=false
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:08.548Z
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin4140515417 network=unix timestamp=2023-05-11T09:33:08.544Z
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=1 deregistered_services=0 registered_checks=0 deregistered_checks=0
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats type=Started msg="Task started by client" failed=false
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:08.512Z
2023-05-11T11:33:08+02:00 [consul.service 💻 worker-01] [✅] agent: Synced service: service=_nomad-task-d9d9c594-c60d-e002-6448-2a9b8b5fa6ec-traefik-traefik-
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin3677397115 network=unix timestamp=2023-05-11T09:33:08.508Z
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter type=Started msg="Task started by client" failed=false
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:08.493Z
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin2828463621 network=unix timestamp=2023-05-11T09:33:08.490Z
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik type=Started msg="Task started by client" failed=false
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:08.483Z
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker address=/tmp/plugin1668767597 network=unix @module=docker_logger timestamp=2023-05-11T09:33:08.480Z
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:33:08+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5 error="dial tcp 10.21.21.42:9411: connect: connection refused"
2023-05-11T11:33:08+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-8caff4c82182deed92be7cb585067c0b09756bf5
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak type=Started msg="Task started by client" failed=false
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:08.461Z
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker address=/tmp/plugin981231429 network=unix @module=docker_logger timestamp=2023-05-11T09:33:08.457Z
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=7345
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=7346
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=17e4c15a57cdff24937d392d474774a758f32704c69f65772e6fde3b859ab34a
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=46cca2d1242a1f7e9ddd9f8e8177abe902defd6b3f3da5be9b6e10b5290bf033
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=7343
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=bfc44881a19d03c35e243d4c1d576e2118d4ed5afbdd93ae38837eeb3ccc8e6d
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=7337
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=a5bd876343749ed54b8f75520f403cc4b950b1fdbf20aeff9636d5f2316bb04a
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana type=Started msg="Task started by client" failed=false
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:08.412Z
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=7309
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=6d0a4302046f294d146ddab5e767bff43577ae9f39deb3ab5cc162a4f76b5213
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin2505957083 network=unix timestamp=2023-05-11T09:33:08.409Z
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres type=Started msg="Task started by client" failed=false
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:08.401Z
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-d349b73f4a9622cd0802ed03da77e1883b8fbbff19e82ba911cc5e0003bd373d.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-857157e7d59654c13670dcefbad1f5b8803b839808a4cd1444ac26c4086d4284.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-8cc6bed2d8af2b99e583850d84bb5480133079fc6e3f2203fd2aaa5e2153145a.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-131627b48ae95a652e4260c0b86e03392e264deb5e3d371f8e4e0d3704eb0e7e.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin1668951041 network=unix timestamp=2023-05-11T09:33:08.396Z
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=7266
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=45b96b73067a0a0de16a11d86ebf5cd314df18f5fac40c9b2f721d48fa272165
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=7252
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: started container: driver=docker container_id=f8809b2781385070381cd122fc045cfddfef529f1ad138d72129a71374beed3b
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-e3ae79199f9b053996c6e377cee17e7f89628f2e
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=17e4c15a57cdff24937d392d474774a758f32704c69f65772e6fde3b859ab34a
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=bfc44881a19d03c35e243d4c1d576e2118d4ed5afbdd93ae38837eeb3ccc8e6d
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=46cca2d1242a1f7e9ddd9f8e8177abe902defd6b3f3da5be9b6e10b5290bf033
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=6d0a4302046f294d146ddab5e767bff43577ae9f39deb3ab5cc162a4f76b5213
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=nats labels="map[com.github.logunifier.application.name:nats com.github.logunifier.application.pattern.key:tslevelmsg com.github.logunifier.application.version:2.9.16 com.hashicorp.nomad.alloc_id:54969951-d541-ae97-922a-7db38096bae5 com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:nats com.hashicorp.nomad.task_name:nats]"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=nats container_name=nats-54969951-d541-ae97-922a-7db38096bae5
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=nats network_mode=container:80ce0535afbf7deeeed5b8adafb77e08e63f7bc9df8574a8065abbb07d47f1e8
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/nats:2.9.16-alpine image_id=sha256:657fde4007c4b6834917360e99c6d1d2aba8008f86063236cf1fafb1ac022404 references=1
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=nats binds="[]string{\"/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/nats/local:/local\", \"/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/nats/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/nats/local/nats.conf:/config/nats.conf\"}"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=nats memory=34359738368 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=nats-prometheus-exporter binds="[]string{\"/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/nats-prometheus-exporter/local:/local\", \"/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/nats-prometheus-exporter/secrets:/secrets\"}"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=nats-prometheus-exporter container_name=nats-prometheus-exporter-54969951-d541-ae97-922a-7db38096bae5
2023-05-11T11:33:08+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=nats-prometheus-exporter network_mode=container:80ce0535afbf7deeeed5b8adafb77e08e63f7bc9df8574a8065abbb07d47f1e8
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/natsio/prometheus-nats-exporter:0.11.0 image_id=sha256:e5358311d02ae05b73d37045f1ce747a2088c015d4458aa90221d4f04f71ed07 references=1
2023-05-11T11:33:08+02:00 [consul.service 💻 worker-01] [⚠] agent: Check socket connection failed: check=_nomad-check-f8811becea279166dcf7a6bbcd9d2b5aea72e20f error="dial tcp 10.21.21.42:4317: connect: connection refused"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=nats-prometheus-exporter memory=314572800 memory_reservation=0 cpu_shares=100 cpu_quota=0 cpu_period=0
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=nats-prometheus-exporter labels="map[com.github.logunifier.application.name:prometheus-nats-exporter com.github.logunifier.application.pattern.key:tslevelmsg com.github.logunifier.application.version:0.11.0.0 com.hashicorp.nomad.alloc_id:54969951-d541-ae97-922a-7db38096bae5 com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:nats com.hashicorp.nomad.task_name:nats-prometheus-exporter]"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter path=/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/nats-prometheus-exporter/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/nats-prometheus-exporter/secrets/api.sock: bind: invalid argument"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=mimir memory=34359738368 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=mimir network_mode=container:0679e85fefe7840b4a0c9f548664b360ac3eb47d27a6a06b395a4033e8a4e727
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=mimir binds="[]string{\"/opt/services/core/nomad/data/alloc/24c845fb-fff0-3707-fe56-95075c5f28dc/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/24c845fb-fff0-3707-fe56-95075c5f28dc/mimir/local:/local\", \"/opt/services/core/nomad/data/alloc/24c845fb-fff0-3707-fe56-95075c5f28dc/mimir/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/24c845fb-fff0-3707-fe56-95075c5f28dc/mimir/local/mimir.yml:/config/mimir.yaml\"}"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=mimir labels="map[com.github.logunifier.application.name:mimir com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:2.8.0 com.hashicorp.nomad.alloc_id:24c845fb-fff0-3707-fe56-95075c5f28dc com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:mimir com.hashicorp.nomad.task_name:mimir]"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=mimir container_name=mimir-24c845fb-fff0-3707-fe56-95075c5f28dc
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/grafana/mimir:2.8.0 image_id=sha256:32637dc0e216f8623d88603773acd1bebf3864e8d9ade74453616aaeb2899f4a references=1
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=envoyproxy/envoy:v1.25.1 image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a references=3
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=connect-proxy-keycloak memory=314572800 memory_reservation=0 cpu_shares=100 cpu_quota=0 cpu_period=0
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=connect-proxy-keycloak binds="[]string{\"/opt/services/core/nomad/data/alloc/8f3fb4a6-629a-7afd-a334-5580bf2d3374/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/8f3fb4a6-629a-7afd-a334-5580bf2d3374/connect-proxy-keycloak/local:/local\", \"/opt/services/core/nomad/data/alloc/8f3fb4a6-629a-7afd-a334-5580bf2d3374/connect-proxy-keycloak/secrets:/secrets\"}"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=connect-proxy-keycloak labels="map[com.github.logunifier.application.pattern.key:envoy com.hashicorp.nomad.alloc_id:8f3fb4a6-629a-7afd-a334-5580bf2d3374 com.hashicorp.nomad.job_id:security com.hashicorp.nomad.job_name:security com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:keycloak com.hashicorp.nomad.task_name:connect-proxy-keycloak]"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=connect-proxy-keycloak network_mode=container:f758cbee8e54f1d079007d17e546d977fd64c5cd1ff8b13e13dd44b1badb3367
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=connect-proxy-keycloak container_name=connect-proxy-keycloak-8f3fb4a6-629a-7afd-a334-5580bf2d3374
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak path=/opt/services/core/nomad/data/alloc/8f3fb4a6-629a-7afd-a334-5580bf2d3374/connect-proxy-keycloak/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/8f3fb4a6-629a-7afd-a334-5580bf2d3374/connect-proxy-keycloak/secrets/api.sock: bind: invalid argument"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=45b96b73067a0a0de16a11d86ebf5cd314df18f5fac40c9b2f721d48fa272165
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=f8809b2781385070381cd122fc045cfddfef529f1ad138d72129a71374beed3b
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=envoyproxy/envoy:v1.25.1 image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a references=2
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=connect-proxy-grafana memory=314572800 memory_reservation=0 cpu_shares=100 cpu_quota=0 cpu_period=0
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=connect-proxy-grafana container_name=connect-proxy-grafana-434f71a9-4b50-8512-effc-5858456f87be
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=connect-proxy-grafana binds="[]string{\"/opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/connect-proxy-grafana/local:/local\", \"/opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/connect-proxy-grafana/secrets:/secrets\"}"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=connect-proxy-grafana network_mode=container:4f69f417c113884459aa1f913366b5885b4e98c92884520555b378e2beb8c6e9
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=connect-proxy-grafana labels="map[com.github.logunifier.application.pattern.key:envoy com.hashicorp.nomad.alloc_id:434f71a9-4b50-8512-effc-5858456f87be com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:grafana com.hashicorp.nomad.task_name:connect-proxy-grafana]"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana path=/opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/connect-proxy-grafana/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/connect-proxy-grafana/secrets/api.sock: bind: invalid argument"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker: created container: driver=docker container_id=a5bd876343749ed54b8f75520f403cc4b950b1fdbf20aeff9636d5f2316bb04a
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=connect-proxy-keycloak-postgres binds="[]string{\"/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/connect-proxy-keycloak-postgres/local:/local\", \"/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/connect-proxy-keycloak-postgres/secrets:/secrets\"}"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=connect-proxy-keycloak-postgres memory=314572800 memory_reservation=0 cpu_shares=100 cpu_quota=0 cpu_period=0
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=envoyproxy/envoy:v1.25.1 image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a references=1
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=connect-proxy-keycloak-postgres network_mode=container:137b9338e91ed234323c060041588513e2ac0cb3371e09276ab4d305ba26a213
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=connect-proxy-keycloak-postgres labels="map[com.github.logunifier.application.pattern.key:envoy com.hashicorp.nomad.alloc_id:a1aad1ed-27cd-b3ad-a9a0-075a42fac82d com.hashicorp.nomad.job_id:security com.hashicorp.nomad.job_name:security com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:keycloak-postgres com.hashicorp.nomad.task_name:connect-proxy-keycloak-postgres]"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=connect-proxy-keycloak-postgres container_name=connect-proxy-keycloak-postgres-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres path=/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/connect-proxy-keycloak-postgres/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/connect-proxy-keycloak-postgres/secrets/api.sock: bind: invalid argument"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=traefik labels="map[com.github.logunifier.application.name:traefik com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:2.10.1 com.hashicorp.nomad.alloc_id:d9d9c594-c60d-e002-6448-2a9b8b5fa6ec com.hashicorp.nomad.job_id:ingress com.hashicorp.nomad.job_name:ingress com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:traefik com.hashicorp.nomad.task_name:traefik]"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: allocated static port: driver=docker task_name=traefik ip=10.21.21.42 port=80 label=http
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: binding directories: driver=docker task_name=traefik binds="[]string{\"/opt/services/core/nomad/data/alloc/d9d9c594-c60d-e002-6448-2a9b8b5fa6ec/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/d9d9c594-c60d-e002-6448-2a9b8b5fa6ec/traefik/local:/local\", \"/opt/services/core/nomad/data/alloc/d9d9c594-c60d-e002-6448-2a9b8b5fa6ec/traefik/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/d9d9c594-c60d-e002-6448-2a9b8b5fa6ec/traefik/local/traefik.toml:/etc/traefik/traefik.toml\", \"/opt/services/core/nomad/data/alloc/d9d9c594-c60d-e002-6448-2a9b8b5fa6ec/traefik/local/certconfig.toml:/etc/traefik/certconfig.toml\"}"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=10.21.21.41:5000/traefik:v2.10.1 image_id=sha256:63d7224eb30e1c2f2976e0043c6cb6f4f7f9bebb820253dc65998bbd747fcc85 references=1
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: configured resources: driver=docker task_name=traefik memory=4294967296 memory_reservation=134217728 cpu_shares=100 cpu_quota=0 cpu_period=0
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: exposed port: driver=docker task_name=traefik port=443 label=https
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: setting container name: driver=docker task_name=traefik container_name=traefik-d9d9c594-c60d-e002-6448-2a9b8b5fa6ec
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: allocated static port: driver=docker task_name=traefik ip=10.21.21.42 port=443 label=https
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: exposed port: driver=docker task_name=traefik port=80 label=http
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:08+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:07+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:07+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:07+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=2.678086ms
2023-05-11T11:33:07+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:07+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: Not ready to serve consistent reads" rpc=Node.List server=10.21.21.41:4647
2023-05-11T11:33:07+02:00 [nomad.service 💻 worker-01] [❌] http: request failed: method=GET
path=/v1/nodes?prefix=36d1fc65-c097-97bc-18ac-079c1262ccfd error="rpc error: Not ready to serve consistent reads" code=500 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: Not ready to serve consistent reads" rpc=Node.List server=10.21.21.41:4647 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: request complete: method=GET path=/v1/nodes?prefix=36d1fc65-c097-97bc-18ac-079c1262ccfd duration=5.03762022s 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-491b4cb2a89cbc30d59ab6a91103d17b0182c6e0 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET 
url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to 
authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-857157e7d59654c13670dcefbad1f5b8803b839808a4cd1444ac26c4086d4284.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-131627b48ae95a652e4260c0b86e03392e264deb5e3d371f8e4e0d3704eb0e7e.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-d349b73f4a9622cd0802ed03da77e1883b8fbbff19e82ba911cc5e0003bd373d.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 
/sys/fs/cgroup/nomad.slice/docker-8cc6bed2d8af2b99e583850d84bb5480133079fc6e3f2203fd2aaa5e2153145a.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: restarting task: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki reason="Restart within policy" delay=5.850956292s 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki type=Restarting msg="Task restarting in 5.850956292s" failed=false 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image id reference count decremented: driver=docker image_id=sha256:2162690d5e710430db23773f44d4e68408e89f21fe55e24a1a6c627812d90d40 references=0 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6956 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki type=Terminated msg="Exit Code: 1, Exit Message: \"Docker container exited with non-zero exit code: 1\"" failed=false 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client: error updating allocations: error="rpc error: Not ready to serve consistent reads" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: Not ready to serve 
consistent reads" rpc=Node.UpdateAlloc server=10.21.21.41:4647 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: Not ready to serve consistent reads" rpc=Node.UpdateAlloc server=10.21.21.41:4647 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: 
worker_id=b4f65730-d7a1-d275-daa0-ee938e535b2b from=Backoff to=WaitingToDequeue 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=b4f65730-d7a1-d275-daa0-ee938e535b2b from=WaitingToDequeue to=Backoff 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: Not ready to serve consistent reads" rpc=Node.GetClientAllocs server=10.21.21.41:4647 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client: error querying node allocations: error="rpc error: Not ready to serve consistent reads" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: Not ready to serve consistent reads" rpc=Node.GetClientAllocs server=10.21.21.41:4647 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=3f4aa19f-32ab-9278-9c9a-42b8f845a391 from=Backoff to=WaitingToDequeue 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: 
alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki type=Started msg="Task started by client" failed=false 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:07.044Z 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin2169704878 network=unix timestamp=2023-05-11T09:33:07.042Z 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=3f4aa19f-32ab-9278-9c9a-42b8f845a391 from=WaitingToDequeue to=Backoff 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Backoff to=WaitingToDequeue 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=f1080078600309a13fda57d4937c9f4a5bbaed1110f29b093b258c565c611e45 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:07+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6956 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=WaitingToDequeue 
to=Backoff 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: Not ready to serve consistent reads" rpc=Node.Register server=10.21.21.41:4647 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: Not ready to serve consistent reads" rpc=Node.Register server=10.21.21.41:4647 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client: error registering: error="rpc error: Not ready to serve consistent reads" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd from=Backoff to=WaitingToDequeue 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd from=WaitingToDequeue to=Backoff 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=f1080078600309a13fda57d4937c9f4a5bbaed1110f29b093b258c565c611e45 2023-05-11T11:33:06+02:00 
[nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=loki container_name=loki-dcd787cd-aa19-0c02-2a4c-676ab7dd0520 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/grafana/loki:2.8.2 image_id=sha256:2162690d5e710430db23773f44d4e68408e89f21fe55e24a1a6c627812d90d40 references=1 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=loki binds="[]string{\"/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/loki/local:/local\", \"/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/loki/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/loki/local/loki.yaml:/config/loki.yaml\"}" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=loki memory=34359738368 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: 
driver=docker task_name=loki labels="map[com.github.logunifier.application.name:loki com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:2.8.2 com.hashicorp.nomad.alloc_id:dcd787cd-aa19-0c02-2a4c-676ab7dd0520 com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:loki com.hashicorp.nomad.task_name:loki]" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=loki network_mode=container:5f689cac61874e6b55a015b07f723f9bce8e57a55f5fbc94d41131bb77e8b546 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: RemoveImage on non-referenced counted image id: driver=docker image_id=sha256:2162690d5e710430db23773f44d4e68408e89f21fe55e24a1a6c627812d90d40 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: restarting task: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki reason="" delay=0s 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki type=Restarting msg="Task restarting in 0s" failed=false 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.stats_hook: failed to start stats 
collection for task with unrecoverable error: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki error="container stopped" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki type=Terminated msg="Exit Code: 0" failed=false 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: attempted to stop an not-running container: container_id=f0e5c607a3af6576205f6532af9f937016a8dee10babf1c8585b51b449a7a814 driver=docker 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki type="Restart Signaled" msg="Template with change_mode restart re-rendered" failed=false 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration=5.449107ms 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:06+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d45710de0cca1e80642f67df321da7ece955acfd
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.585313ms
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-f0e5c607a3af6576205f6532af9f937016a8dee10babf1c8585b51b449a7a814.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-131627b48ae95a652e4260c0b86e03392e264deb5e3d371f8e4e0d3704eb0e7e.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-d349b73f4a9622cd0802ed03da77e1883b8fbbff19e82ba911cc5e0003bd373d.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-8cc6bed2d8af2b99e583850d84bb5480133079fc6e3f2203fd2aaa5e2153145a.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-857157e7d59654c13670dcefbad1f5b8803b839808a4cd1444ac26c4086d4284.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [consul.service 💻 master-01] [✅] agent: Synced check: check=_nomad-check-b7d4b44b6996d12c68ac89cc6783c9bd6e55ce3e
2023-05-11T11:33:06+02:00 [nomad.service 💻 master-01] [🐞] nomad: memberlist: Stream connection from=127.0.0.1:33642
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:06+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:05+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:33:05+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:05+02:00 [nomad.service 💻 master-01] [✅] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:05+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:05+02:00 [nomad.service 💻 master-01] [✅] results=
2023-05-11T11:33:05+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:05+02:00 [nomad.service 💻 master-01] [✅]
2023-05-11T11:33:05+02:00 [nomad.service 💻 master-01] [🐞] nomad.fsm.service_sched: setting eval status: eval_id=7eb5a5ee-6d7c-a1a9-06a3-9f4cec235a6b job_id=security namespace=default status=complete
2023-05-11T11:33:05+02:00 [nomad.service 💻 master-01] [👀] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=7eb5a5ee-6d7c-a1a9-06a3-9f4cec235a6b job_id=security namespace=default algorithm=spread
2023-05-11T11:33:05+02:00 [nomad.service 💻 master-01] [🐞] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=7eb5a5ee-6d7c-a1a9-06a3-9f4cec235a6b job_id=security namespace=default
2023-05-11T11:33:05+02:00 [nomad.service 💻 master-01] [🐞] nomad.fsm.service_sched: setting eval status: eval_id=00857472-a4cf-270f-288a-ea9c95df18e5 job_id=observability namespace=default status=complete
2023-05-11T11:33:05+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:05+02:00 [nomad.service 💻 master-01] [✅] | Desired Changes for "tempo":
(place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results= 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "minio": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results= 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=724294b2-6195-f511-ab41-572830b968d1 job_id=minio namespace=default status=complete 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: 
eval_id=00857472-a4cf-270f-288a-ea9c95df18e5 job_id=observability namespace=default 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=00857472-a4cf-270f-288a-ea9c95df18e5 job_id=observability namespace=default algorithm=spread 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=724294b2-6195-f511-ab41-572830b968d1 job_id=minio namespace=default 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=724294b2-6195-f511-ab41-572830b968d1 job_id=minio namespace=default algorithm=spread 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.system_sched: setting eval status: eval_id=da5ff823-a915-07e8-a71c-56cfed719f5a job_id=ingress namespace=default status=complete 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.system_sched: reconciled current state with desired state: eval_id=da5ff823-a915-07e8-a71c-56cfed719f5a job_id=ingress namespace=default results="allocs: (place 0) (update 0) (migrate 0) (stop 0) (ignore 1) (lost 0) (disconnecting 0) (reconnecting 0)" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.system_sched.binpack: NewBinPackIterator created: eval_id=da5ff823-a915-07e8-a71c-56cfed719f5a job_id=ingress namespace=default algorithm=spread 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API 
request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "whoami": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 3) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results= 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results= 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) 
(canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results= 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "minio": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0) 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results= 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=d8bc7a1c-0c89-96c3-8195-94ce48ebe37d job_id=minio namespace=default status=complete 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=545220ee-c250-917e-b33e-fcabb6c08ffd job_id=security namespace=default algorithm=spread 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.system_sched: setting eval status: eval_id=f644b65b-3431-9727-b6c4-55e30da71b04 job_id=ingress 
namespace=default status=complete 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=545220ee-c250-917e-b33e-fcabb6c08ffd job_id=security namespace=default status=complete 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=d8bc7a1c-0c89-96c3-8195-94ce48ebe37d job_id=minio namespace=default 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=54af94e1-333d-5d20-b925-c38237361798 job_id=observability namespace=default algorithm=spread 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=d8bc7a1c-0c89-96c3-8195-94ce48ebe37d job_id=minio namespace=default algorithm=spread 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=54af94e1-333d-5d20-b925-c38237361798 job_id=observability namespace=default status=complete 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=545220ee-c250-917e-b33e-fcabb6c08ffd job_id=security namespace=default 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=7f74673e-a339-f5bf-a461-1fc054045b78 job_id=whoami namespace=default algorithm=spread 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=7f74673e-a339-f5bf-a461-1fc054045b78 job_id=whoami namespace=default status=complete 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=7f74673e-a339-f5bf-a461-1fc054045b78 job_id=whoami namespace=default 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป 
master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=54af94e1-333d-5d20-b925-c38237361798 job_id=observability namespace=default 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.system_sched.binpack: NewBinPackIterator created: eval_id=f644b65b-3431-9727-b6c4-55e30da71b04 job_id=ingress namespace=default algorithm=spread 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.system_sched: reconciled current state with desired state: eval_id=f644b65b-3431-9727-b6c4-55e30da71b04 job_id=ingress namespace=default results="allocs: (place 0) (update 0) (migrate 0) (stop 0) (ignore 3) (lost 0) (disconnecting 0) (reconnecting 0)" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration=5.663568ms 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker 
error="openat2 /sys/fs/cgroup/nomad.slice/docker-f0e5c607a3af6576205f6532af9f937016a8dee10babf1c8585b51b449a7a814.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-857157e7d59654c13670dcefbad1f5b8803b839808a4cd1444ac26c4086d4284.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-131627b48ae95a652e4260c0b86e03392e264deb5e3d371f8e4e0d3704eb0e7e.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-d349b73f4a9622cd0802ed03da77e1883b8fbbff19e82ba911cc5e0003bd373d.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-8cc6bed2d8af2b99e583850d84bb5480133079fc6e3f2203fd2aaa5e2153145a.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana type=Template msg="Missing: nomad.var.block(nomad/jobs/observability@default.global)" failed=false 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 
2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak type=Template msg="Missing: nomad.var.block(nomad/jobs/security@default.global)" failed=false 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio type=Template msg="Missing: nomad.var.block(nomad/jobs/minio@default.global)" failed=false 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres type=Template msg="Missing: nomad.var.block(nomad/jobs/security@default.global)" failed=false 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-9831358df74f11262eea00f389597318c98843e3 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 
[nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:05+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET 
url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=_nomad-check-3e667fb12f8585b4515bd5b571a2693af6e10509 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-19f2f8c6d48ee2052e823941c97579f7b24fa995 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET 
url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type=Started msg="Task started by client" failed=false 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:04.506Z 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin2993812290 network=unix timestamp=2023-05-11T09:33:04.504Z 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] 
http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=0a967a4563724f517f86e1426d5ff4edb2562122a25d5e29166b789859d5c54c 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6876 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created 
container: driver=docker container_id=0a967a4563724f517f86e1426d5ff4edb2562122a25d5e29166b789859d5c54c 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-f0e5c607a3af6576205f6532af9f937016a8dee10babf1c8585b51b449a7a814.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-857157e7d59654c13670dcefbad1f5b8803b839808a4cd1444ac26c4086d4284.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-131627b48ae95a652e4260c0b86e03392e264deb5e3d371f8e4e0d3704eb0e7e.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-8cc6bed2d8af2b99e583850d84bb5480133079fc6e3f2203fd2aaa5e2153145a.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-d349b73f4a9622cd0802ed03da77e1883b8fbbff19e82ba911cc5e0003bd373d.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=tempo memory=34359738368 memory_reservation=536870912 cpu_shares=500 cpu_quota=0 cpu_period=0 
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=tempo binds="[]string{\"/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/tempo/local:/local\", \"/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/tempo/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/tempo/local/tempo.yaml:/config/tempo.yaml\"}" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=tempo network_mode=container:17f8d4850acbf1afd975993a1d243d3ad4eb270abe7bf87f5c9d2069d0e3e1ea 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=tempo container_name=tempo-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=tempo labels="map[com.github.logunifier.application.name:tempo com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:2.1.1 com.hashicorp.nomad.alloc_id:86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:tempo com.hashicorp.nomad.task_name:tempo]" 2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/grafana/tempo:2.1.1 image_id=sha256:04523525f28246daa9a499f2dd136f356806e965ecdab5b5dc9b4d3eddcff2de references=1 
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type=Restarting msg="Task restarting in 0s" failed=false
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: RemoveImage on non-referenced counted image id: driver=docker image_id=sha256:04523525f28246daa9a499f2dd136f356806e965ecdab5b5dc9b4d3eddcff2de
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: restarting task: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo reason="" delay=0s
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.stats_hook: failed to start stats collection for task with unrecoverable error: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo error="container stopped"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type=Terminated msg="Exit Code: 2, Exit Message: \"Docker container exited with non-zero exit code: 2\"" failed=false
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: attempted to stop an not-running container: container_id=6de871bedc13520fe8bde867c62bc48c9b1e11667d753aee6549130c2aa57b4c driver=docker
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type="Restart Signaled" msg="Template with change_mode restart re-rendered" failed=false
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration=5.546882ms
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type=Started msg="Task started by client" failed=false
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:04.146Z
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker address=/tmp/plugin2123016458 network=unix @module=docker_logger timestamp=2023-05-11T09:33:04.142Z
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:33:04+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=service:_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy:1
2023-05-11T11:33:04+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=service:_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy:1 error="dial tcp 10.21.21.42:25496: connect: connection refused"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6813
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: started container: driver=docker container_id=ea256c425b611c926e06e03a46ea9c20225dc9850ad129c8f61ceef7784189f2
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "whoami": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 3) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results=
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…]
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…]
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…]
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results=
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "minio": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results=
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results=
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…]
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=e731af41-e05e-ce96-6c5a-335ce7c5240e job_id=security namespace=default status=complete
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=ab9dcbad-96f0-34a2-d07c-8cedf1613e65 job_id=whoami namespace=default status=complete
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=3e8fd000-cbfc-7efd-dd81-57dae4517d49 job_id=observability namespace=default algorithm=spread
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=e731af41-e05e-ce96-6c5a-335ce7c5240e job_id=security namespace=default
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=1b2ee9a9-7c6e-d77f-08a0-3b30e86415c5 job_id=minio namespace=default
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=3e8fd000-cbfc-7efd-dd81-57dae4517d49 job_id=observability namespace=default status=complete
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=ab9dcbad-96f0-34a2-d07c-8cedf1613e65 job_id=whoami namespace=default algorithm=spread
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=1b2ee9a9-7c6e-d77f-08a0-3b30e86415c5 job_id=minio namespace=default status=complete
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=3e8fd000-cbfc-7efd-dd81-57dae4517d49 job_id=observability namespace=default
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=e731af41-e05e-ce96-6c5a-335ce7c5240e job_id=security namespace=default algorithm=spread
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=ab9dcbad-96f0-34a2-d07c-8cedf1613e65 job_id=whoami namespace=default
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results=
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…]
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "whoami": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 3) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results=
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…]
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…]
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "minio": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results=
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…]
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results=
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=144fad24-4657-523e-9703-92ab917036ae job_id=whoami namespace=default algorithm=spread
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=c4984170-fabc-0abc-3328-c6fd08df2e82 job_id=observability namespace=default
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=421d0029-8ed3-377d-2e79-c512ba0625d3 job_id=security namespace=default
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=144fad24-4657-523e-9703-92ab917036ae job_id=whoami namespace=default status=complete
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.system_sched: setting eval status: eval_id=1311716e-eff9-1a22-6585-b40191a29733 job_id=ingress namespace=default status=complete
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.system_sched.binpack: NewBinPackIterator created: eval_id=1311716e-eff9-1a22-6585-b40191a29733 job_id=ingress namespace=default algorithm=spread
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=956f6fbc-586d-d565-dac1-1f399c6efd02 job_id=minio namespace=default algorithm=spread
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=c4984170-fabc-0abc-3328-c6fd08df2e82 job_id=observability namespace=default algorithm=spread
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=421d0029-8ed3-377d-2e79-c512ba0625d3 job_id=security namespace=default algorithm=spread
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=c4984170-fabc-0abc-3328-c6fd08df2e82 job_id=observability namespace=default status=complete
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=144fad24-4657-523e-9703-92ab917036ae job_id=whoami namespace=default
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=956f6fbc-586d-d565-dac1-1f399c6efd02 job_id=minio namespace=default status=complete
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=956f6fbc-586d-d565-dac1-1f399c6efd02 job_id=minio namespace=default
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.system_sched: reconciled current state with desired state: eval_id=1311716e-eff9-1a22-6585-b40191a29733 job_id=ingress namespace=default results="allocs: (place 0) (update 0) (migrate 0) (stop 0) (ignore 3) (lost 0) (disconnecting 0) (reconnecting 0)"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.system_sched: reconciled current state with desired state: eval_id=e7046d56-7bd5-2f65-1445-3a896cd6084d job_id=ingress namespace=default results="allocs: (place 0) (update 0) (migrate 0) (stop 0) (ignore 3) (lost 0) (disconnecting 0) (reconnecting 0)"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=421d0029-8ed3-377d-2e79-c512ba0625d3 job_id=security namespace=default status=complete
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.system_sched: setting eval status: eval_id=e7046d56-7bd5-2f65-1445-3a896cd6084d job_id=ingress namespace=default status=complete
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=1b2ee9a9-7c6e-d77f-08a0-3b30e86415c5 job_id=minio namespace=default algorithm=spread
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.system_sched.binpack: NewBinPackIterator created: eval_id=e7046d56-7bd5-2f65-1445-3a896cd6084d job_id=ingress namespace=default algorithm=spread
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=ae642172-e5bf-7d50-f7bf-16956db27d9e job_id=whoami namespace=default status=complete
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…]
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 1) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results=
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "whoami": (place 0) (inplace 0) (destructive 1) (stop 0) (migrate 0) (ignore 2) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results=
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…]
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…]
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results=
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=72262b97-1351-2d41-3578-9c7716ad8ba6 job_id=security namespace=default
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=72262b97-1351-2d41-3578-9c7716ad8ba6 job_id=security namespace=default status=complete
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=72262b97-1351-2d41-3578-9c7716ad8ba6 job_id=security namespace=default algorithm=spread
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=ae642172-e5bf-7d50-f7bf-16956db27d9e job_id=whoami namespace=default
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=f6003ab1-05dc-beaf-64ac-fb31675a1318 job_id=observability namespace=default status=complete
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=ae642172-e5bf-7d50-f7bf-16956db27d9e job_id=whoami namespace=default algorithm=spread
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=f6003ab1-05dc-beaf-64ac-fb31675a1318 job_id=observability namespace=default
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…]
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results=
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "minio": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=75a11aeb-925e-7a27-ed75-5813b410b6d4 job_id=minio namespace=default status=complete
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=f6003ab1-05dc-beaf-64ac-fb31675a1318 job_id=observability namespace=default algorithm=spread
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=75a11aeb-925e-7a27-ed75-5813b410b6d4 job_id=minio namespace=default algorithm=spread
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.system_sched: setting eval status: eval_id=378978fd-4c93-333f-8e3b-19de2a12aff9 job_id=ingress namespace=default status=complete
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.system_sched.binpack: NewBinPackIterator created: eval_id=378978fd-4c93-333f-8e3b-19de2a12aff9 job_id=ingress namespace=default algorithm=spread
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.system_sched: reconciled current state with desired state: eval_id=378978fd-4c93-333f-8e3b-19de2a12aff9 job_id=ingress namespace=default results="allocs: (place 0) (update 0) (migrate 0) (stop 0) (ignore 3) (lost 0) (disconnecting 0) (reconnecting 0)"
2023-05-11T11:33:04+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=75a11aeb-925e-7a27-ed75-5813b410b6d4 job_id=minio namespace=default
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker: created container: driver=docker container_id=ea256c425b611c926e06e03a46ea9c20225dc9850ad129c8f61ceef7784189f2
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-ingress": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "keycloak-postgres": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…]
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results=
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results=
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "nats": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "logunifier": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "loki": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "mimir": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…]
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "tempo": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "grafana-agent": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=919f2ea1-92f3-ff6d-188a-d8cd1c24b49e job_id=observability namespace=default status=complete
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=722b6296-1ad8-fe4f-3a4d-d66c4a9c6ddb job_id=security namespace=default algorithm=spread
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=722b6296-1ad8-fe4f-3a4d-d66c4a9c6ddb job_id=security namespace=default
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=722b6296-1ad8-fe4f-3a4d-d66c4a9c6ddb job_id=security namespace=default status=complete
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…]
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Desired Changes for "minio": (place 0) (inplace 0) (destructive 0) (stop 0) (migrate 0) (ignore 1) (canary 0)
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] results=
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] | Total changes: (place 0) (destructive 0) (inplace 0) (stop 0) (disconnect 0) (reconnect 0)
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=919f2ea1-92f3-ff6d-188a-d8cd1c24b49e job_id=observability namespace=default algorithm=spread
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: setting eval status: eval_id=8c0d1d09-060b-cce7-8c2c-572b239aa14a job_id=minio namespace=default status=complete
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=919f2ea1-92f3-ff6d-188a-d8cd1c24b49e job_id=observability namespace=default
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.system_sched: setting eval status: eval_id=7cfa7750-d36b-a5c2-ce39-5dee2e6baecd job_id=ingress namespace=default status=complete
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.service_sched: reconciled current state with desired state: eval_id=8c0d1d09-060b-cce7-8c2c-572b239aa14a job_id=minio namespace=default
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=8c0d1d09-060b-cce7-8c2c-572b239aa14a job_id=minio namespace=default algorithm=spread
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad.fsm.system_sched: reconciled current state with desired state: eval_id=7cfa7750-d36b-a5c2-ce39-5dee2e6baecd job_id=ingress namespace=default results="allocs: (place 0) (update 0) (migrate 0) (stop 0) (ignore 3) (lost 0) (disconnecting 0) (reconnecting 0)"
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.system_sched.binpack: NewBinPackIterator created: eval_id=7cfa7750-d36b-a5c2-ce39-5dee2e6baecd job_id=ingress namespace=default algorithm=spread
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configuring network mode for task group: driver=docker task_name=grafana-agent network_mode=container:e2f503486651f498010ccfecea17ff3e5b17e0ac02020372de2eacae60403d04
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: setting container name: driver=docker task_name=grafana-agent container_name=grafana-agent-a04015b3-dc90-7f18-8bfd-c1cf7bc37eff
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: applied labels on the container: driver=docker task_name=grafana-agent labels="map[com.github.logunifier.application.name:grafana_agent com.github.logunifier.application.pattern.key:logfmt com.github.logunifier.application.version:0.33.1 com.hashicorp.nomad.alloc_id:a04015b3-dc90-7f18-8bfd-c1cf7bc37eff com.hashicorp.nomad.job_id:observability com.hashicorp.nomad.job_name:observability com.hashicorp.nomad.namespace:default com.hashicorp.nomad.node_id:36d1fc65-c097-97bc-18ac-079c1262ccfd com.hashicorp.nomad.node_name:worker-01 com.hashicorp.nomad.task_group_name:grafana-agent com.hashicorp.nomad.task_name:grafana-agent]"
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: configured resources: driver=docker task_name=grafana-agent memory=2147483648 memory_reservation=67108864 cpu_shares=100 cpu_quota=0 cpu_period=0
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: binding directories: driver=docker task_name=grafana-agent binds="[]string{\"/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/alloc:/alloc\", \"/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/grafana-agent/local:/local\", \"/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/grafana-agent/secrets:/secrets\", \"/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/grafana-agent/local/agent.yaml:/config/agent.yaml\"}"
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/grafana/agent:v0.33.1 image_id=sha256:9833434074df83909804de7688db8f00ef71ef5fc00eb115d1e2e41d21ece5cf references=1
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: RemoveImage on non-referenced counted image id: driver=docker image_id=sha256:9833434074df83909804de7688db8f00ef71ef5fc00eb115d1e2e41d21ece5cf
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: restarting task: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent reason="" delay=0s
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type=Restarting msg="Task restarting in 0s" failed=false
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.stats_hook: failed to start stats collection for task with unrecoverable error: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent error="container stopped"
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type=Terminated msg="Exit Code: 2, Exit Message: \"Docker container exited with non-zero exit code: 2\"" failed=false
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: attempted to stop an not-running container: container_id=308b9182a0cf2cfb0f8a3b4cc593fa8cb4d8518989d4cc5cd61b7a09809f1563 driver=docker
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type="Restart Signaled" msg="Template with change_mode restart re-rendered" failed=false
2023-05-11T11:33:03+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=service:_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy:1 error="dial tcp 10.21.21.42:24296: connect: connection refused"
2023-05-11T11:33:03+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=service:_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy:1
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 master-01] [🐞] nomad.raft: vote granted: from=a1c8b791-e2b8-7606-19bd-988341a75d1b term=11 tally=1
2023-05-11T11:33:03+02:00 [nomad.service 💻 master-01] [✅] nomad: cluster leadership acquired
2023-05-11T11:33:03+02:00 [nomad.service 💻 master-01] [✅] nomad.raft: entering leader state: leader="Node at 10.21.21.41:4647 [Leader]"
2023-05-11T11:33:03+02:00 [nomad.service 💻 master-01] [🐞] nomad.raft: calculated votes needed: needed=1 term=11
2023-05-11T11:33:03+02:00 [nomad.service 💻 master-01] [✅] nomad.raft: election won: term=11 tally=1
2023-05-11T11:33:03+02:00 [nomad.service 💻 master-01] [⚠] nomad.raft: heartbeat timeout reached, starting election: last-leader-addr= last-leader-id=
2023-05-11T11:33:03+02:00 [nomad.service 💻 master-01] [✅] nomad.raft: entering candidate state: node="Node at 10.21.21.41:4647 [Candidate]" term=11
2023-05-11T11:33:03+02:00 [nomad.service 💻 master-01] [🐞] nomad.raft: voting for self: term=11 id=a1c8b791-e2b8-7606-19bd-988341a75d1b
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-3afd330277073ed9c7a777aec9a396da1e13ff91
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-d1dd8e51a64b75a5a697b63d281784927c3e61ea
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-9d0c7746fe5a6fac7981a073c371ce7c48faf5fb
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-f0e5c607a3af6576205f6532af9f937016a8dee10babf1c8585b51b449a7a814.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-6de871bedc13520fe8bde867c62bc48c9b1e11667d753aee6549130c2aa57b4c.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-d349b73f4a9622cd0802ed03da77e1883b8fbbff19e82ba911cc5e0003bd373d.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-857157e7d59654c13670dcefbad1f5b8803b839808a4cd1444ac26c4086d4284.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-308b9182a0cf2cfb0f8a3b4cc593fa8cb4d8518989d4cc5cd61b7a09809f1563.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-8cc6bed2d8af2b99e583850d84bb5480133079fc6e3f2203fd2aaa5e2153145a.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-131627b48ae95a652e4260c0b86e03392e264deb5e3d371f8e4e0d3704eb0e7e.scope/cpuset.cpus: no such file or directory"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-c24c12a7dc29441f77e7c9c88de14a1f21eee99b
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-83dd2ae30b209fba7eb8536144989cad6a457788
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:03+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="402.545µs"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=1.299179ms
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [consul.service 💻 worker-01] [⚠] agent: Check is now critical: check=_nomad-check-6efbe1816a24cb8fcf96d53a9c697c70495afc8c
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="470.108µs"
2023-05-11T11:33:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="401.276µs"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=service:_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy:1
2023-05-11T11:33:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="791.018µs"
2023-05-11T11:33:02+02:00 [consul.service 💻 worker-01] [✅] agent: Synced check: check=service:_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy:2
2023-05-11T11:33:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration=21.979499ms
2023-05-11T11:33:02+02:00 [nomad.service 💻 master-01] [🐞] http: request complete: method=GET path=/v1/agent/self duration="630.393µs"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 master-01] [🐞] nomad.event_broker: requested index no longer in buffer: requsted=15346 closest=0
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 master-01] [🐞] nomad.event_broker: requested index no longer in buffer: requsted=15343 closest=0
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 master-01] [🐞] nomad.event_broker: requested index no longer in buffer: requsted=15342 closest=0
2023-05-11T11:33:02+02:00 [nomad.service 💻 master-01] [🐞] nomad.event_broker: requested index no longer in buffer: requsted=15314 closest=0
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: restarting task: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter reason="Restart within policy" delay=5.850956292s
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter type=Restarting msg="Task restarting in 5.850956292s" failed=false
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats type=Restarting msg="Task restarting in 5.850956292s" failed=false
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: restarting task: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats reason="Restart within policy" delay=5.850956292s
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.driver_mgr.docker: RemoveImage on non-referenced counted image id: driver=docker image_id=sha256:e5358311d02ae05b73d37045f1ce747a2088c015d4458aa90221d4f04f71ed07
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.driver_mgr.docker: RemoveImage on non-referenced counted image id: driver=docker image_id=sha256:657fde4007c4b6834917360e99c6d1d2aba8008f86063236cf1fafb1ac022404
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter type=Terminated msg="Exit Code: 2, Exit Message: \"Docker container exited with non-zero exit code: 2\"" failed=false
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.stats_hook: failed to start stats collection for task with unrecoverable error: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter error="container stopped"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats type=Terminated msg="Exit Code: 1, Exit Message: \"Docker container exited with non-zero exit code: 1\"" failed=false
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.stats_hook: failed to start stats collection for task with unrecoverable error: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats error="container stopped"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/loki/local/loki.yaml"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) all templates rendered
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) all templates rendered
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) all templates rendered
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter path=/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/nats-prometheus-exporter/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/nats-prometheus-exporter/secrets/api.sock: bind: invalid argument"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/grafana-agent/local/agent.yaml"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) health.service(mimir|passing) is still needed
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) health.service(mimir|passing) is still needed
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 0 dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/nats/local/nats.conf"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 master-01] [🐞] nomad.event_broker: requested index no longer in buffer: requsted=15314 closest=0
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/loki/local/loki.yaml"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) starting
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) running initial templates
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template ae4a8f4564353ce5e970f7f7d1c6a2da
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating watcher
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":"# Client port of ++ env \"NOMAD_PORT_client\" ++ on all interfaces\nport: ++ env \"NOMAD_PORT_client\" ++\n\n# HTTP monitoring port\nmonitor_port: ++ env \"NOMAD_PORT_http\" ++\nserver_name: \"++ env \"NOMAD_ALLOC_NAME\" ++\"\n#If true enable protocol trace log messages. Excludes the system account.\ntrace: false\n#If true enable protocol trace log messages. Includes the system account.\ntrace_verbose: false\n#if true enable debug log messages\ndebug: false\nhttp_port: ++ env \"NOMAD_PORT_http\" ++\n#http: nats.service.consul:++ env \"NOMAD_PORT_http\" ++\n\njetstream {\n store_dir: /data/jetstream\n\n # 1GB\n max_memory_store: 2G\n\n # 10GB\n max_file_store: 10G\n}\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/nats/local/nats.conf","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"++","RightDelim":"++","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/nats"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000}
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) receiving dependency health.service(mimir|passing)
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/grafana-agent/local/agent.yaml"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template 8010a1378d33c582bd83aa917c8fbe05
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template d3fbf984da4c8c04e753c284e9a12f26
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) receiving dependency health.service(mimir|passing)
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter @module=logmon path=/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/alloc/logs/.nats-prometheus-exporter.stdout.fifo timestamp=2023-05-11T09:33:02.336Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating new runner (dry: false, once: false)
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter @module=logmon path=/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/alloc/logs/.nats-prometheus-exporter.stderr.fifo timestamp=2023-05-11T09:33:02.336Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats path=/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/alloc/logs/.nats.stderr.fifo @module=logmon timestamp=2023-05-11T09:33:02.333Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats @module=logmon path=/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/alloc/logs/.nats.stdout.fifo timestamp=2023-05-11T09:33:02.333Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats address=/tmp/plugin3338324910 network=unix @module=logmon timestamp=2023-05-11T09:33:02.329Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats version=2
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter version=2
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.driver_mgr.docker: RemoveImage on non-referenced counted image id: driver=docker image_id=sha256:32637dc0e216f8623d88603773acd1bebf3864e8d9ade74453616aaeb2899f4a
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir type=Restarting msg="Task restarting in 5.850956292s" failed=false
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter network=unix @module=logmon address=/tmp/plugin1884751575 timestamp=2023-05-11T09:33:02.326Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: restarting task: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir reason="Restart within policy" delay=5.850956292s
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: restarting task: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak reason="Restart within policy" delay=5.850956292s
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak type=Restarting msg="Task restarting in 5.850956292s" failed=false
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.driver_mgr.docker: RemoveImage on non-referenced counted image id: driver=docker image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) health.service(mimir|passing) is still needed
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) was not watching 1 dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (watcher) adding health.service(mimir|passing)
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) missing data for 1 dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) add used dependency health.service(mimir|passing) to missing since isLeader but do not have a watcher
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) rendered "(dynamic)" => "/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/tempo/local/tempo.yaml"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) all templates rendered
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) missing dependency: health.service(mimir|passing)
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) running initial templates
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":
0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":"auth_enabled: false\n\nserver:\n #default 3100\n http_listen_port: 3100\n #default 9005\n #grpc_listen_port: 9005\n # Max gRPC message size that can be received\n # CLI flag: -server.grpc-max-recv-msg-size-bytes\n #default 4194304 -\u003e 4MB\n grpc_server_max_recv_msg_size: 419430400\n\n # Max gRPC message size that can be sent\n # CLI flag: -server.grpc-max-send-msg-size-bytes\n #default 4194304 -\u003e 4MB\n grpc_server_max_send_msg_size: 419430400\n\n # Limit on the number of concurrent streams for gRPC calls (0 = unlimited)\n # CLI flag: -server.grpc-max-concurrent-streams\n grpc_server_max_concurrent_streams: 100\n\n # Log only messages with the given severity or above. 
Supported values [debug,\n # info, warn, error]\n # CLI flag: -log.level\n log_level: \"warn\"\ningester:\n wal:\n enabled: true\n dir: /data/wal\n lifecycler:\n address: 127.0.0.1\n ring:\n kvstore:\n store: memberlist\n replication_factor: 1\n final_sleep: 0s\n chunk_idle_period: 5m\n chunk_retain_period: 30s\n chunk_encoding: snappy\n\nruler:\n evaluation_interval : 1m\n poll_interval: 1m\n storage:\n type: local\n local:\n directory: /data/rules\n rule_path: /data/scratch\n++- range $index, $service := service \"mimir\" -++\n++- if eq $index 0 ++\n alertmanager_url: http://++$service.Name++.service.consul:++ $service.Port ++/alertmanager\n++- end ++\n++- end ++\n\n ring:\n kvstore:\n store: memberlist\n enable_api: true\n enable_alertmanager_v2: true\n\ncompactor:\n working_directory: /data/retention\n shared_store: filesystem\n compaction_interval: 10m\n retention_enabled: true\n retention_delete_delay: 2h\n retention_delete_worker_count: 150\n\nschema_config:\n configs:\n - from: 2023-03-01\n store: boltdb-shipper\n object_store: filesystem\n schema: v12\n index:\n prefix: index_\n period: 24h\n\nstorage_config:\n boltdb_shipper:\n active_index_directory: /data/index\n cache_location: /data/index-cache\n shared_store: filesystem\n filesystem:\n directory: /data/chunks\n index_queries_cache_config:\n enable_fifocache: false\n embedded_cache:\n max_size_mb: 4096\n enabled: true\nquerier:\n multi_tenant_queries_enabled: false\n max_concurrent: 4096\n query_store_only: false\n\nquery_scheduler:\n max_outstanding_requests_per_tenant: 10000\n\nquery_range:\n cache_results: true\n results_cache:\n cache:\n enable_fifocache: false\n embedded_cache:\n enabled: true\n\nchunk_store_config:\n chunk_cache_config:\n enable_fifocache: false\n embedded_cache:\n max_size_mb: 4096\n enabled: true\n write_dedupe_cache_config:\n enable_fifocache: false\n embedded_cache:\n max_size_mb: 4096\n enabled: true\n\ndistributor:\n ring:\n kvstore:\n store: memberlist\n\ntable_manager:\n 
retention_deletes_enabled: true\n retention_period: 24h\n\nlimits_config:\n ingestion_rate_mb: 64\n ingestion_burst_size_mb: 8\n max_label_name_length: 4096\n max_label_value_length: 8092\n enforce_metric_name: false\n # Loki will reject any log lines that have already been processed and will not index them again\n reject_old_samples: false\n # 5y\n reject_old_samples_max_age: 43800h\n # The limit to length of chunk store queries. 0 to disable.\n # 5y\n max_query_length: 43800h\n # Maximum number of log entries that will be returned for a query.\n max_entries_limit_per_query: 20000\n # Limit the maximum of unique series that is returned by a metric query.\n max_query_series: 100000\n # Maximum number of queries that will be scheduled in parallel by the frontend.\n max_query_parallelism: 64\n split_queries_by_interval: 24h\n # Alter the log line timestamp during ingestion when the timestamp is the same as the\n # previous entry for the same stream. When enabled, if a log line in a push request has\n # the same timestamp as the previous line for the same stream, one nanosecond is added\n # to the log line. 
This will preserve the received order of log lines with the exact\n # same timestamp when they are queried, by slightly altering their stored timestamp.\n # NOTE: This is imperfect, because Loki accepts out of order writes, and another push\n # request for the same stream could contain duplicate timestamps to existing\n # entries and they will not be incremented.\n # CLI flag: -validation.increment-duplicate-timestamps\n increment_duplicate_timestamp: true\n #Log data retention for all\n retention_period: 24h\n # Comment this out for fine grained retention\n# retention_stream:\n# - selector: '{namespace=\"dev\"}'\n# priority: 1\n# period: 24h\n # Comment this out for having overrides\n# per_tenant_override_config: /etc/overrides.yaml","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/loki/local/loki.yaml","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"++","RightDelim":"++","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/loki"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"
","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000} 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) starting 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template ae4a8f4564353ce5e970f7f7d1c6a2da 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating watcher 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating new runner (dry: false, once: false) 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/tempo/local/tempo.yaml" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) receiving dependency health.service(mimir|passing) 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template 723104a980ccc69983d3c9e8709349f9 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET 
url="/v1/var/nomad/jobs/observability?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating watcher 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) starting 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) add used dependency health.service(mimir|passing) to missing since isLeader but do not have a watcher 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) running initial templates 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing dependency: health.service(mimir|passing) 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) was not watching 1 dependencies 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) final config: 
{"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":"server:\n log_level: info\n\nmetrics:\n wal_directory: \"/data/wal\"\n global:\n scrape_interval: 5s\n remote_write:\n++- range service \"mimir\" ++\n - url: http://++.Name++.service.consul:++.Port++/api/v1/push\n++- end ++\n configs:\n - name: integrations\n scrape_configs:\n - job_name: integrations/traefik\n scheme: http\n metrics_path: '/metrics'\n static_configs:\n - targets:\n - ingress.cloud.private:8081\n # grab all metric endpoints with standard /metrics endpoint\n - job_name: \"integrations/consul_sd\"\n consul_sd_configs:\n - server: \"consul.service.consul:8501\"\n tags: [\"prometheus\"]\n tls_config:\n 
insecure_skip_verify: true\n ca_file: \"/certs/ca/ca.crt\"\n cert_file: \"/certs/consul/consul.pem\"\n key_file: \"/certs/consul/consul-key.pem\"\n datacenter: \"nomadder1\"\n scheme: https\n relabel_configs:\n - source_labels: [__meta_consul_node]\n target_label: instance\n - source_labels: [__meta_consul_service]\n target_label: service\n# - source_labels: [__meta_consul_tags]\n# separator: ','\n# regex: 'prometheus:([^=]+)=([^,]+)'\n# target_label: '$${1}'\n# replacement: '$${2}'\n - source_labels: [__meta_consul_tags]\n separator: ','\n regex: '.*,prometheus:server_id=([^,]+),.*'\n target_label: 'server_id'\n replacement: '$${1}'\n - source_labels: [__meta_consul_tags]\n separator: ','\n regex: '.*,prometheus:version=([^,]+),.*'\n target_label: 'version'\n replacement: '$${1}'\n - source_labels: ['__meta_consul_tags']\n target_label: 'labels'\n regex: '(.+)'\n replacement: '$${1}'\n action: 'keep'\n # - action: replace\n # replacement: '1'\n # target_label: 'test'\n metric_relabel_configs:\n - action: labeldrop\n regex: 'exported_.*'\n\n\n - job_name: \"integrations/consul_sd_minio\"\n metrics_path: \"/minio/v2/metrics/cluster\"\n consul_sd_configs:\n - server: \"consul.service.consul:8501\"\n tags: [\"prometheus_minio\"]\n tls_config:\n insecure_skip_verify: true\n ca_file: \"/certs/ca/ca.crt\"\n cert_file: \"/certs/consul/consul.pem\"\n key_file: \"/certs/consul/consul-key.pem\"\n datacenter: \"nomadder1\"\n scheme: https\n relabel_configs:\n - source_labels: [__meta_consul_node]\n target_label: instance\n - source_labels: [__meta_consul_service]\n target_label: service\n# - source_labels: [__meta_consul_tags]\n# separator: ','\n# regex: 'prometheus:([^=]+)=([^,]+)'\n# target_label: '$${1}'\n# replacement: '$${2}'\n - source_labels: [__meta_consul_tags]\n separator: ','\n regex: '.*,prometheus:server=([^,]+),.*'\n target_label: 'server'\n replacement: '$${1}'\n - source_labels: [__meta_consul_tags]\n separator: ','\n regex: 
'.*,prometheus:version=([^,]+),.*'\n target_label: 'version'\n replacement: '$${1}'\n - source_labels: ['__meta_consul_tags']\n target_label: 'labels'\n regex: '(.+)'\n replacement: '$${1}'\n action: 'keep'\n# - action: replace\n# replacement: '38'\n# target_label: 'test'\n metric_relabel_configs:\n - action: labeldrop\n regex: 'exported_.*'","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/grafana-agent/local/agent.yaml","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"++","RightDelim":"++","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/grafana-agent"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAliv
e":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000} 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing data for 1 dependencies 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (watcher) adding health.service(mimir|passing) 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template d3fbf984da4c8c04e753c284e9a12f26 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/observability?namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating new runner (dry: false, once: false) 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.stats_hook: failed to start stats collection for task with unrecoverable error: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak error="container stopped" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak type=Terminated msg="Exit Code: 0" failed=false 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki @module=logmon path=/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/alloc/logs/.loki.stderr.fifo timestamp=2023-05-11T09:33:02.304Z 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] 
client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki @module=logmon path=/opt/services/core/nomad/data/alloc/dcd787cd-aa19-0c02-2a4c-676ab7dd0520/alloc/logs/.loki.stdout.fifo timestamp=2023-05-11T09:33:02.304Z 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir type=Terminated msg="Exit Code: 0" failed=false 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.stats_hook: failed to start stats collection for task with unrecoverable error: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir error="container stopped" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) checking template aa33fb80064c17acc1b58a55de5165e2 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (watcher) adding nomad.var.block(nomad/jobs/observability@default.global) 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) add used dependency nomad.var.block(nomad/jobs/observability@default.global) to missing since isLeader but do not have a watcher 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing data for 1 dependencies 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) was not watching 1 dependencies 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) running initial templates 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) missing dependency: 
nomad.var.block(nomad/jobs/observability@default.global) 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating watcher 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki @module=logmon address=/tmp/plugin1753412588 network=unix timestamp=2023-05-11T09:33:02.300Z 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) starting 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":" {{ with nomadVar 
\"nomad/jobs/observability\" }}\n GF_AUTH_GENERIC_OAUTH_CLIENT_SECRET = {{.keycloak_secret_observability_grafana}}\n {{ end }}\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/grafana/secrets/env.vars","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"{{","RightDelim":"}}","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/grafana"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBack
off":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000} 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki version=2 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: (runner) creating new runner (dry: false, once: false) 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-d349b73f4a9622cd0802ed03da77e1883b8fbbff19e82ba911cc5e0003bd373d.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-131627b48ae95a652e4260c0b86e03392e264deb5e3d371f8e4e0d3704eb0e7e.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-6de871bedc13520fe8bde867c62bc48c9b1e11667d753aee6549130c2aa57b4c.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-93c95083b6df6f65e9fe9beb5d974433ec223dbecfba0ff7502a301a1b05e222.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-f0e5c607a3af6576205f6532af9f937016a8dee10babf1c8585b51b449a7a814.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: failed to copy 
cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-857157e7d59654c13670dcefbad1f5b8803b839808a4cd1444ac26c4086d4284.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-ee0d11b627be159638e2a3e528eeafcc16308f93414a6951079e450b54fcc57b.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-8cc6bed2d8af2b99e583850d84bb5480133079fc6e3f2203fd2aaa5e2153145a.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-5d453d599db84f3ea1c033841424a6436387473b87b27b7b4cdbae08da08ebb6.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-4d3f7a0d50fa71d5a298944b3d3f138515713e83935dd8a5fcc3bf38e4f84e85.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker: failed to copy cpuset: driver=docker error="openat2 /sys/fs/cgroup/nomad.slice/docker-308b9182a0cf2cfb0f8a3b4cc593fa8cb4d8518989d4cc5cd61b7a09809f1563.scope/cpuset.cpus: no such file or directory" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) all templates rendered 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 0 dependencies 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠]
client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak path=/opt/services/core/nomad/data/alloc/8f3fb4a6-629a-7afd-a334-5580bf2d3374/connect-proxy-keycloak/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/8f3fb4a6-629a-7afd-a334-5580bf2d3374/connect-proxy-keycloak/secrets/api.sock: bind: invalid argument" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana @module=logmon path=/opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/alloc/logs/.grafana.stderr.fifo timestamp=2023-05-11T09:33:02.296Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/24c845fb-fff0-3707-fe56-95075c5f28dc/mimir/local/mimir.yml" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana path=/opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/alloc/logs/.grafana.stdout.fifo @module=logmon timestamp=2023-05-11T09:33:02.296Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent @module=logmon path=/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/alloc/logs/.grafana-agent.stdout.fifo timestamp=2023-05-11T09:33:02.295Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) running initial templates 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating watcher 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking
template b1554d65cfdf1fc13ab7265841148372 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":"\n# Test ++ env \"NOMAD_ALLOC_NAME\" ++\n# Do not use this configuration in production.\n# It is for demonstration purposes only.\n\n# Run Mimir in single process mode, with all components running in 1 process.\ntarget: all,alertmanager,overrides-exporter\n# Disable tendency support.\nmultitenancy_enabled: false\n\nserver:\n http_listen_port: 9009\n log_level: debug\n # Configure the server to allow messages up to 100MB.\n grpc_server_max_recv_msg_size: 104857600\n grpc_server_max_send_msg_size: 
104857600\n grpc_server_max_concurrent_streams: 1000\n\nblocks_storage:\n backend: filesystem\n bucket_store:\n sync_dir: /data/tsdb-sync\n #ignore_blocks_within: 10h # default 10h\n filesystem:\n dir: /data/blocks\n tsdb:\n dir: /data/tsdb\n # Note that changing this requires changes to some other parameters like\n # -querier.query-store-after,\n # -querier.query-ingesters-within and\n # -blocks-storage.bucket-store.ignore-blocks-within.\n # retention_period: 24h # default 24h\nquerier:\n # query_ingesters_within: 13h # default 13h\n #query_store_after: 12h #default 12h\nruler_storage:\n backend: filesystem\n filesystem:\n dir: /data/rules\n\nalertmanager_storage:\n backend: filesystem\n filesystem:\n dir: /data/alarms\n\nfrontend:\n grpc_client_config:\n grpc_compression: snappy\n\nfrontend_worker:\n grpc_client_config:\n grpc_compression: snappy\n\ningester_client:\n grpc_client_config:\n grpc_compression: snappy\n\nquery_scheduler:\n grpc_client_config:\n grpc_compression: snappy\n\nalertmanager:\n data_dir: /data/alertmanager\n# retention: 120h\n sharding_ring:\n replication_factor: 1\n alertmanager_client:\n grpc_compression: snappy\n\nruler:\n query_frontend:\n grpc_client_config:\n grpc_compression: snappy\n\ncompactor:\n# compaction_interval: 1h # default 1h\n# deletion_delay: 12h # default 12h\n max_closing_blocks_concurrency: 2\n max_opening_blocks_concurrency: 4\n symbols_flushers_concurrency: 4\n data_dir: /data/compactor\n sharding_ring:\n kvstore:\n store: memberlist\n\n\ningester:\n ring:\n replication_factor: 1\n\nstore_gateway:\n sharding_ring:\n replication_factor: 1\n\nlimits:\n # Limit queries to 5 years. 
You can override this on a per-tenant basis.\n max_total_query_length: 43680h\n max_label_names_per_series: 42\n # Allow ingestion of out-of-order samples up to 2 hours since the latest received sample for the series.\n out_of_order_time_window: 1d\n # delete old blocks from long-term storage.\n # Delete from storage metrics data older than 1d.\n compactor_blocks_retention_period: 1d\n ingestion_rate: 100000","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/24c845fb-fff0-3707-fe56-95075c5f28dc/mimir/local/mimir.yml","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"++","RightDelim":"++","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/24c845fb-fff0-3707-fe56-95075c5f28dc/mimir"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPasswo
rd":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000} 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent path=/opt/services/core/nomad/data/alloc/a04015b3-dc90-7f18-8bfd-c1cf7bc37eff/alloc/logs/.grafana-agent.stderr.fifo @module=logmon timestamp=2023-05-11T09:33:02.295Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) starting 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak @module=logmon path=/opt/services/core/nomad/data/alloc/8f3fb4a6-629a-7afd-a334-5580bf2d3374/alloc/logs/.connect-proxy-keycloak.stderr.fifo timestamp=2023-05-11T09:33:02.294Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak path=/opt/services/core/nomad/data/alloc/8f3fb4a6-629a-7afd-a334-5580bf2d3374/alloc/logs/.connect-proxy-keycloak.stdout.fifo @module=logmon timestamp=2023-05-11T09:33:02.294Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating new runner (dry: false, once: false) 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.driver_mgr.docker: RemoveImage on non-referenced counted image id:
driver=docker image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana type=Restarting msg="Task restarting in 5.850956292s" failed=false 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent @module=logmon address=/tmp/plugin2895059317 network=unix timestamp=2023-05-11T09:33:02.292Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: restarting task: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana reason="Restart within policy" delay=5.850956292s 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent version=2 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana address=/tmp/plugin2628121178 network=unix @module=logmon timestamp=2023-05-11T09:33:02.290Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana version=2 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir @module=logmon path=/opt/services/core/nomad/data/alloc/24c845fb-fff0-3707-fe56-95075c5f28dc/alloc/logs/.mimir.stderr.fifo timestamp=2023-05-11T09:33:02.288Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅]
client.alloc_runner.task_runner: restarting task: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres reason="Restart within policy" delay=5.850956292s 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.driver_mgr.docker: RemoveImage on non-referenced counted image id: driver=docker image_id=sha256:2ad34861f5b0fedb86cf65d1adc23f0f9d135b1272b64e7da8c8530ec9ff830a 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir @module=logmon path=/opt/services/core/nomad/data/alloc/24c845fb-fff0-3707-fe56-95075c5f28dc/alloc/logs/.mimir.stdout.fifo timestamp=2023-05-11T09:33:02.288Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres type=Restarting msg="Task restarting in 5.850956292s" failed=false 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak address=/tmp/plugin1810121401 network=unix @module=logmon timestamp=2023-05-11T09:33:02.287Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak version=2 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞]
agent: (watcher) adding health.service(mimir|passing) 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) running initial templates 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":"multitenancy_enabled: false\n\nserver:\n http_listen_port: 3200\n\ndistributor:\n receivers: # this configuration will listen on all ports and protocols that tempo is capable of.\n jaeger: # the receives all come from the OpenTelemetry collector. 
more configuration information can\n protocols: # be found there: https://github.com/open-telemetry/opentelemetry-collector/tree/main/receiver\n thrift_http: #\n grpc: # for a production deployment you should only enable the receivers you need!\n thrift_binary:\n thrift_compact:\n zipkin:\n otlp:\n protocols:\n http:\n grpc:\n opencensus:\n\ningester:\n trace_idle_period: 10s # the length of time after a trace has not received spans to consider it complete and flush it\n max_block_bytes: 1_000_000 # cut the head block when it hits this size or ...\n max_block_duration: 5m # this much time passes\n\ncompactor:\n compaction:\n compaction_window: 1h # blocks in this time window will be compacted together\n max_block_bytes: 100_000_000 # maximum size of compacted blocks\n block_retention: 24h # Duration to keep blocks 1d\n\nmetrics_generator:\n registry:\n external_labels:\n source: tempo\n cluster: nomadder1\n storage:\n path: /data/generator/wal\n remote_write:\n++- range service \"mimir\" ++\n - url: http://++.Name++.service.consul:++.Port++/api/v1/push\n send_exemplars: true\n headers:\n x-scope-orgid: 1\n++- end ++\n\nstorage:\n trace:\n backend: local # backend configuration to use\n block:\n bloom_filter_false_positive: .05 # bloom filter false positive rate. lower values create larger filters but fewer false positives\n wal:\n path: /data/wal # where to store the the wal locally\n local:\n path: /data/blocks\n pool:\n max_workers: 100 # worker pool determines the number of parallel requests to the object store backend\n queue_depth: 10000\n\nquery_frontend:\n search:\n # how to define year here ? 
define 5 years\n max_duration: 43800h\n\noverrides:\n metrics_generator_processors: [service-graphs, span-metrics]","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/tempo/local/tempo.yaml","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"++","RightDelim":"++","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/tempo"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"
Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000} 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating watcher 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) missing data for 1 dependencies 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) add used dependency health.service(mimir|passing) to missing since isLeader but do not have a watcher 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template 723104a980ccc69983d3c9e8709349f9 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) was not watching 1 dependencies 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) missing dependency: health.service(mimir|passing) 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) starting 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir version=2 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir @module=logmon address=/tmp/plugin3773294230 network=unix timestamp=2023-05-11T09:33:02.276Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating new runner (dry: false, once: false) 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (watcher) adding
nomad.var.block(nomad/jobs/security@default.global) 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) add used dependency nomad.var.block(nomad/jobs/security@default.global) to missing since isLeader but do not have a watcher 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) missing data for 1 dependencies 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) missing dependency: nomad.var.block(nomad/jobs/security@default.global) 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) was not watching 1 dependencies 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) starting 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating watcher 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) final config: 
{"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":" {{- with nomadVar \"nomad/jobs/security\" -}}\n KEYCLOAK_ADMIN_PASSWORD = {{.keycloak_password}}\n KC_DB_PASSWORD = {{.keycloak_db_password}}\n KC_NOMADDER_CLIENT_SECRET = {{.keycloak_ingress_secret}}\n KC_NOMADDER_CLIENT_SECRET_GRAFANA = {{.keycloak_secret_observability_grafana}}\n {{- end 
-}}\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/8f3fb4a6-629a-7afd-a334-5580bf2d3374/keycloak/secrets/env.vars","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"{{","RightDelim":"}}","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/8f3fb4a6-629a-7afd-a334-5580bf2d3374/keycloak"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWait
Time":60000000000} 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template 7382522149c1fa0ee539c4acc2b323ba 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) running initial templates 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana type=Terminated msg="Exit Code: 0" failed=false 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo path=/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/alloc/logs/.tempo.stderr.fifo @module=logmon timestamp=2023-05-11T09:33:02.274Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo @module=logmon path=/opt/services/core/nomad/data/alloc/86dcc3e0-5f12-7d37-85c4-1d9b6c82c075/alloc/logs/.tempo.stdout.fifo timestamp=2023-05-11T09:33:02.274Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating new runner (dry: false, once: false) 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.stats_hook: failed to start stats collection for task with unrecoverable error: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana error="container stopped" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak @module=logmon path=/opt/services/core/nomad/data/alloc/8f3fb4a6-629a-7afd-a334-5580bf2d3374/alloc/logs/.keycloak.stderr.fifo timestamp=2023-05-11T09:33:02.271Z 2023-05-11T11:33:02+02:00 [nomad.service 💻
worker-01] [⚠] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana path=/opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/connect-proxy-grafana/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/connect-proxy-grafana/secrets/api.sock: bind: invalid argument" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak @module=logmon path=/opt/services/core/nomad/data/alloc/8f3fb4a6-629a-7afd-a334-5580bf2d3374/alloc/logs/.keycloak.stdout.fifo timestamp=2023-05-11T09:33:02.271Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.stats_hook: failed to start stats collection for task with unrecoverable error: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres error="container stopped" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres type=Terminated msg="Exit Code: 0" failed=false 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/minio?namespace=default&stale=&wait=60000ms" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) missing dependency: nomad.var.block(nomad/jobs/minio@default.global) 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) was not watching 1 dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) add used dependency nomad.var.block(nomad/jobs/minio@default.global) to missing since isLeader but do not have a watcher 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres path=/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/connect-proxy-keycloak-postgres/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/connect-proxy-keycloak-postgres/secrets/api.sock: bind: invalid argument" 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) missing data for 1 dependencies 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (watcher) adding nomad.var.block(nomad/jobs/minio@default.global) 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) starting 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak @module=logmon address=/tmp/plugin494176607 network=unix timestamp=2023-05-11T09:33:02.265Z 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template 124b5b35ead742dca1d9561857552116 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) running initial templates 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞]
client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak version=2 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating watcher 2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":" {{- with nomadVar \"nomad/jobs/minio\" -}}\n MINIO_ROOT_USER = {{.minio_root_user}}\n MINIO_ROOT_PASSWORD = {{.minio_root_password}}\n {{- end 
-}}\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/a37da363-9048-86d5-93e1-d6facf1490b1/minio/secrets/env.vars","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"{{","RightDelim":"}}","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/a37da363-9048-86d5-93e1-d6facf1490b1/minio"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000}
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres @module=logmon path=/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/alloc/logs/.connect-proxy-keycloak-postgres.stdout.fifo timestamp=2023-05-11T09:33:02.263Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating new runner (dry: false, once: false)
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres @module=logmon path=/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/alloc/logs/.connect-proxy-keycloak-postgres.stderr.fifo timestamp=2023-05-11T09:33:02.263Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana @module=logmon path=/opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/alloc/logs/.connect-proxy-grafana.stderr.fifo timestamp=2023-05-11T09:33:02.262Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana path=/opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/alloc/logs/.connect-proxy-grafana.stdout.fifo @module=logmon timestamp=2023-05-11T09:33:02.262Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres @module=logmon address=/tmp/plugin3562993167 network=unix timestamp=2023-05-11T09:33:02.261Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres version=2
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana version=2
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] consul.sync: sync complete: registered_services=0 deregistered_services=1 registered_checks=0 deregistered_checks=0
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana network=unix @module=logmon address=/tmp/plugin1008516101 timestamp=2023-05-11T09:33:02.259Z
2023-05-11T11:33:02+02:00 [consul.service 💻 worker-01] [✅] agent: Deregistered service: service=_nomad-task-d9d9c594-c60d-e002-6448-2a9b8b5fa6ec-traefik-traefik-
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio @module=logmon path=/opt/services/core/nomad/data/alloc/a37da363-9048-86d5-93e1-d6facf1490b1/alloc/logs/.minio.stdout.fifo timestamp=2023-05-11T09:33:02.256Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio @module=logmon path=/opt/services/core/nomad/data/alloc/a37da363-9048-86d5-93e1-d6facf1490b1/alloc/logs/.minio.stderr.fifo timestamp=2023-05-11T09:33:02.257Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio address=/tmp/plugin3514079268 network=unix @module=logmon timestamp=2023-05-11T09:33:02.254Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio version=2
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: restarting task: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik reason="Restart within policy" delay=5.850956292s
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik type=Restarting msg="Task restarting in 5.850956292s" failed=false
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?namespace=default&stale=&wait=60000ms"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.driver_mgr.docker: RemoveImage on non-referenced counted image id: driver=docker image_id=sha256:63d7224eb30e1c2f2976e0043c6cb6f4f7f9bebb820253dc65998bbd747fcc85
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats path=/usr/local/bin/nomad
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats path=/usr/local/bin/nomad pid=6705
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter path=/usr/local/bin/nomad
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter path=/usr/local/bin/nomad pid=6707
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) was not watching 1 dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (watcher) adding nomad.var.block(nomad/jobs/security@default.global)
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) missing data for 1 dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 1 dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) missing dependency: nomad.var.block(nomad/jobs/security@default.global)
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) add used dependency nomad.var.block(nomad/jobs/security@default.global) to missing since isLeader but do not have a watcher
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template 3d38a7b72b56e929ec016a5e38bef38f
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook: failed to reattach to logmon process: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter error="Reattachment process not found"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating watcher
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":" CREATE SCHEMA IF NOT EXISTS keycloak;\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/keycloak_postgres/local/initddb.sql","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"{{","RightDelim":"}}","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/keycloak_postgres"},{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":" {{- with nomadVar \"nomad/jobs/security\" -}}\n POSTGRES_PASSWORD = {{.keycloak_db_password}}\n {{- end -}}\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/keycloak_postgres/secrets/env.vars","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"{{","RightDelim":"}}","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/keycloak_postgres"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000}
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/keycloak_postgres/local/initddb.sql"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook: failed to reattach to logmon process: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats error="Reattachment process not found"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) running initial templates
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) starting
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template 6e2d32cd4b2988cfe8f68f510d993549
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating new runner (dry: false, once: false)
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres @module=logmon path=/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/alloc/logs/.keycloak_postgres.stderr.fifo timestamp=2023-05-11T09:33:02.234Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres @module=logmon path=/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/alloc/logs/.keycloak_postgres.stdout.fifo timestamp=2023-05-11T09:33:02.233Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres network=unix @module=logmon address=/tmp/plugin3912229585 timestamp=2023-05-11T09:33:02.231Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres version=2
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo @module=logmon address=/tmp/plugin3532372146 network=unix timestamp=2023-05-11T09:33:02.224Z
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo version=2
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki path=/usr/local/bin/nomad
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki path=/usr/local/bin/nomad pid=6674
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook: failed to reattach to logmon process: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki error="Reattachment process not found"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik type=Terminated msg="Exit Code: 0" failed=false
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.stats_hook: failed to start stats collection for task with unrecoverable error: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik error="container stopped"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak path=/usr/local/bin/nomad pid=6667
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak path=/usr/local/bin/nomad
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak path=/usr/local/bin/nomad
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak path=/usr/local/bin/nomad pid=6663
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook: failed to reattach to logmon process: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak error="Reattachment process not found"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook: failed to reattach to logmon process: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak error="Reattachment process not found"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) all templates rendered
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/d9d9c594-c60d-e002-6448-2a9b8b5fa6ec/traefik/local/traefik.toml"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) watching 0 dependencies
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template d24b27ea3f27f2740c3e274ec4f939a9
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/d9d9c594-c60d-e002-6448-2a9b8b5fa6ec/traefik/local/certconfig.toml"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio path=/usr/local/bin/nomad
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio path=/usr/local/bin/nomad pid=6641
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating watcher
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) checking template 2d035bb7463e642bb2acc3bd9f422ce6
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana path=/usr/local/bin/nomad
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) running initial templates
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) initiating run
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) starting
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana path=/usr/local/bin/nomad
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana path=/usr/local/bin/nomad pid=6628
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana path=/usr/local/bin/nomad pid=6637
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":"[http.serversTransports]\n[http.serversTransports.default]\n insecureSkipVerify = false\n rootCAs = [\"/etc/opt/certs/ca/ca.crt\",\"/etc/opt/certs/ca/cluster-ca.crt\"]\n [[http.serversTransports.default.certificates]]\n certFile = \"/etc/opt/certs/ingress/nomad-ingress.pem\"\n keyFile = \"/etc/opt/certs/ingress/nomad-ingress-key.pem\"\n\n[tls.stores]\n [tls.stores.default]\n [tls.stores.default.defaultCertificate]\n certFile = \"/etc/opt/certs/ingress/nomad-ingress.pem\"\n keyFile = \"/etc/opt/certs/ingress/nomad-ingress-key.pem\"\n\n[[tls.certificates]]\n certFile = \"/etc/opt/certs/ingress/nomad-ingress.pem\"\n keyFile = \"/etc/opt/certs/ingress/nomad-ingress-key.pem\"\n stores = [\"default\"]\n\n\n\n[http.services]\n# Service to nomad\n [http.services.nomad.loadBalancer]\n serversTransport = \"default\"\n [[http.services.nomad.loadBalancer.servers]]\n url = \"https://10.21.21.41:4646\"\n# Service to consul\n [http.services.consul.loadBalancer]\n serversTransport = \"default\"\n [[http.services.consul.loadBalancer.servers]]\n url = \"https://10.21.21.41:8501\"\n\n# Service to vault\n [http.services.vault.loadBalancer]\n serversTransport = \"default\"\n [[http.services.vault.loadBalancer.servers]]\n url = \"https://10.21.21.41:8200\"\n\n# Service to nexus ui\n [http.services.nexus-ui.loadBalancer]\n serversTransport = \"default\"\n [[http.services.nexus-ui.loadBalancer.servers]]\n url = \"http://10.21.21.41:5002\"\n\n# Service to nexus push\n# [http.services.nexus-push.loadBalancer]\n# serversTransport = \"default\"\n# [[http.services.nexus-push.loadBalancer.servers]]\n# url = \"http://10.21.21.41:5001\"\n\n # Service to nexus pull\n [http.services.nexus-pull.loadBalancer]\n serversTransport = \"default\"\n [[http.services.nexus-pull.loadBalancer.servers]]\n url = \"http://10.21.21.41:5000\"\n\n[http.routers]\n# Route to consul ui\n [http.routers.consul]\n entryPoints = [\"https\"]\n rule = \"Host(`consul.cloud.private`) \"\n service = \"consul\"\n # will terminate the TLS request\n # [http.routers.consul.tls]\n [[http.routers.consul.tls.domains]]\n # main = \"cloud.private\"\n sans = [\"consul.cloud.private\"]\n\n# Route to nomad ui\n [http.routers.nomad]\n entryPoints = [\"https\"]\n rule = \"Host(`nomad.cloud.private`) \"\n service = \"nomad\"\n [[http.routers.nomad.tls.domains]]\n #main = \"cloud.private\"\n sans = [\"nomad.cloud.private\"]\n\n# Route to vault ui\n [http.routers.vault]\n entryPoints = [\"https\"]\n rule = \"Host(`vault.cloud.private`) \"\n service = \"vault\"\n [[http.routers.vault.tls.domains]]\n sans = [\"vault.cloud.private\"]\n\n# Route to nexus ui\n [http.routers.nexus-ui]\n entryPoints = [\"https\"]\n rule = \"Host(`nexus.cloud.private`) \"\n service = \"nexus-ui\"\n [[http.routers.nexus-ui.tls.domains]]\n sans = [\"nexus.cloud.private\"]\n\n# Route to nexus pull\n [http.routers.nexus-pull]\n entryPoints = [\"https\"]\n # rule = \"Host(`registry.cloud.private`) \u0026\u0026 Method(`GET`,`HEAD`)\"\n rule = \"Host(`registry.cloud.private`)\"\n service = \"nexus-pull\"\n [[http.routers.nexus-pull.tls.domains]]\n sans = [\"registry.cloud.private\"]\n\n# Route to nexus push\n# [http.routers.nexus-push]\n# entryPoints = [\"https\"]\n# rule = \"Host(`registry.cloud.private`) \u0026\u0026 Method(`POST`,`PUT`,`DELETE`,`PATCH`)\"\n# service = \"nexus-push\"\n# [[http.routers.nexus-push.tls.domains]]\n# sans = [\"registry.cloud.private\"]\n\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/d9d9c594-c60d-e002-6448-2a9b8b5fa6ec/traefik/local/certconfig.toml","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"++","RightDelim":"++","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/d9d9c594-c60d-e002-6448-2a9b8b5fa6ec/traefik"},{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":"[entryPoints]\n [entryPoints.http]\n address = \":80\"\n [entryPoints.http.http.redirections]\n [entryPoints.http.http.redirections.entryPoint]\n to = \"https\"\n scheme = \"https\"\n [entryPoints.https]\n address = \":443\"\n\n [entryPoints.traefik]\n # The default port 8080 is used by cdvisor\n address = \":8081\"\n# TCP / UDP over one port\n# [entryPoints.tcpep]\n# address = \":3179\"\n# [entryPoints.udpep]\n# address = \":3179/udp\"\n# [entryPoints.streaming]\n# address = \":1704/udp\"\n[api]\n dashboard = true\n insecure = false\n debug = false\n[providers]\n [providers.file]\n filename = \"/etc/traefik/certconfig.toml\"\n debugLogGeneratedTemplate = true\n watch = true\n\n# Enable Consul Catalog configuration backend.\n[providers.consulCatalog]\n prefix = \"traefik\"\n exposedByDefault = false\n connectAware = true\n connectByDefault = false\n watch = true\n # applied if no traefik.http.routers.{name-of-your-choice}.rule tag found\n defaultRule = \"Host(`{{ .Name }}.cloud.private`)\"\n\n [providers.consulCatalog.endpoint]\n address = \"127.0.0.1:8501\"\n scheme = \"https\"\n\n\n[providers.consulCatalog.endpoint.tls]\n ca = \"/etc/opt/certs/ca/cluster-ca.crt\"\n cert = \"/etc/opt/certs/ingress/nomad-ingress.pem\"\n key = \"/etc/opt/certs/ingress/nomad-ingress-key.pem\"\n\n[tracing]\n# [tracing.zipkin]\n# httpEndpoint = \"http://tempo-zipkin.service.consul:9411/api/v2/spans\"\n# sameSpan = true\n# id128Bit = true\n [tracing.jaeger]\n samplingServerURL = \"http://tempo-jaeger.service.consul:14268/sampling\"\n propagation = \"b3\"\n gen128Bit = true\n [tracing.jaeger.collector]\n endpoint = \"http://tempo-jaeger.service.consul:14268/api/traces?format=jaeger.thrift\"\n\n[metrics]\n [metrics.prometheus]\n buckets = [0.1,0.3,1.2,5.0,7.5,9.5,9.9]\n addEntryPointsLabels = true\n addRoutersLabels = true\n addServicesLabels = true\n[log]\n level = \"DEBUG\"\n# format = \"json\"\n\n[accessLog]\n filePath = \"/logs/access.log\"\n #format = \"json\"\n [accessLog.fields]\n defaultMode = \"keep\"\n\n [accessLog.fields.headers]\n defaultMode = \"keep\"\n\n\n\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/d9d9c594-c60d-e002-6448-2a9b8b5fa6ec/traefik/local/traefik.toml","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"++","RightDelim":"++","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/d9d9c594-c60d-e002-6448-2a9b8b5fa6ec/traefik"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":13,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000}
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook: failed to reattach to logmon process: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio error="Reattachment process not found"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [✅] agent: (runner) creating new runner (dry: false, once: false)
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres path=/usr/local/bin/nomad
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook: failed to reattach to logmon process: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana error="Reattachment process not found"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres path=/usr/local/bin/nomad pid=6627
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [⚠] client.alloc_runner.task_runner.task_hook: failed to reattach to logmon process: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana error="Reattachment process not found"
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"]
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres path=/usr/local/bin/nomad pid=6622
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres path=/usr/local/bin/nomad
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent path=/usr/local/bin/nomad pid=6621
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent path=/usr/local/bin/nomad
2023-05-11T11:33:02+02:00 [nomad.service 💻 worker-01] [🐞] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres path=/usr/local/bin/nomad
args=["/usr/local/bin/nomad", "logmon"] 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook: failed to reattach to logmon process: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres error="Reattachment process not found" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"] 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook: failed to reattach to logmon process: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres error="Reattachment process not found" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir path=/usr/local/bin/nomad pid=6616 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir path=/usr/local/bin/nomad 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik @module=logmon 
path=/opt/services/core/nomad/data/alloc/d9d9c594-c60d-e002-6448-2a9b8b5fa6ec/alloc/logs/.traefik.stderr.fifo timestamp=2023-05-11T09:33:02.177Z 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik @module=logmon path=/opt/services/core/nomad/data/alloc/d9d9c594-c60d-e002-6448-2a9b8b5fa6ec/alloc/logs/.traefik.stdout.fifo timestamp=2023-05-11T09:33:02.177Z 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo path=/usr/local/bin/nomad 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo path=/usr/local/bin/nomad pid=6615 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: using plugin: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik version=2 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon.nomad: plugin address: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik @module=logmon address=/tmp/plugin355988543 network=unix timestamp=2023-05-11T09:33:02.167Z 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook: failed to reattach to logmon process: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent error="Reattachment process not found" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"] 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] 
client.alloc_runner.task_runner.task_hook: failed to reattach to logmon process: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir error="Reattachment process not found" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"] 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"] 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook: failed to reattach to logmon process: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo error="Reattachment process not found" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=9 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker 
image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=8 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: plugin started: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik path=/usr/local/bin/nomad pid=6606 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: waiting for RPC address: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik path=/usr/local/bin/nomad 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner.task_hook.logmon: starting plugin: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "logmon"] 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook: failed to reattach to logmon process: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik error="Reattachment process not found" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=7 2023-05-11T11:33:02+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-9fc7562140ecf776e3eaeb1112921d14e1d0690d 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 
image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=6 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=5 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=4 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=3 2023-05-11T11:33:02+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=service:_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy:2 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=2 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=1 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.647192ms 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป 
worker-01] [๐Ÿž] consul.sync: sync complete: registered_services=1 deregistered_services=0 registered_checks=0 deregistered_checks=0 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent.joiner: retry join completed: initial_servers=1 agent_mode=client 2023-05-11T11:33:02+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-client-oznp2sij7uyvjzllfof24rqlxb4bzh6p 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6591 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent.joiner: starting retry join: servers=10.21.21.41 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ==> Nomad agent started! Log data will stream in below: 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] Client: true 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] Bind Addrs: HTTP: [0.0.0.0:4646] 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ==> Nomad agent configuration: 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] Log Level: TRACE 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] Server: false 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] Region: global (DC: nomadder1) 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] Version: 1.5.5 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] Advertise Addrs: HTTP: 10.21.21.42:4646 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: UI is enabled 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: read unix 
@->/tmp/plugin2238270531: read: connection reset by peer" 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] http: UI is enabled 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=e17fb10a-adf0-ae4e-6ed5-c8a793f7ebbf 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=e185bfa4-98a6-2766-e712-587ef4a0b68f 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client: started client: node_id=36d1fc65-c097-97bc-18ac-079c1262ccfd 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] agent: not registering Nomad HTTPS Health Check because verify_https_client enabled 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:02.007Z 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:02+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin2238270531 network=unix timestamp=2023-05-11T09:33:02.004Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6577 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker 
path=/usr/local/bin/nomad pid=6591 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: failed to reattach to docker logger process: driver=docker error="failed to reattach to docker logger process: Reattachment process not found" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=dcd787cd-aa19-0c02-2a4c-676ab7dd0520 task=loki type=Received msg="Task received by client" failed=false 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=da0ceb1c-7a71-b283-0cd3-ea0c053812f8 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=dc9ea193-4d13-55a8-1497-d3844c7a6638 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:01.975Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin183584073 network=unix timestamp=2023-05-11T09:33:01.974Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: 
plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6567 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: failed to reattach to docker logger process: driver=docker error="failed to reattach to docker logger process: Reattachment process not found" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6577 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=d9d9c594-c60d-e002-6448-2a9b8b5fa6ec task=traefik type=Received msg="Task received by client" failed=false 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=bae52d8e-b717-fc04-7fe0-39936ee07aa8 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=bd1d14fe-df1d-2e43-94b9-5ae06e61d0ea 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=bd92110a-953d-43b4-32f4-ca8f0d477ef5 2023-05-11T11:33:01+02:00 
[nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=c1479f8a-0d9a-0ffc-6c8d-740e0d39033f 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=d249b424-daa3-28ea-0577-630b71624fb1 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=d3018707-3b3b-f3e3-bb95-496dce1bdb8a 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=cb540957-2ac9-39cc-01c5-9c9292bcc7e6 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=ba088217-2b50-6342-5074-10181350b99e 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=d0d64d02-a5b9-4586-3fbe-caa90329d7fb 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=ba77c9e3-f147-63b8-8a97-13763c201dcf 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=c61055e7-ca07-1ef6-0837-ed4d19dc93bb 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=bf137908-f72f-7067-6b3b-5d9b5f57d73b 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=ad022ac8-dbef-7875-ee6c-8444281817c7 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:01.937Z 2023-05-11T11:33:01+02:00 [nomad.service 
๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=a5a37687-3d09-229a-29ad-2d7380993475 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=af65a769-def8-2f1c-2e0a-bc3bb791a1d7 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=b065547f-7914-dace-28af-f29dc587c537 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=acd92bbc-cae8-9e08-26ed-96b31ea082fc 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin216006939 network=unix timestamp=2023-05-11T09:33:01.936Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6567 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6557 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: failed to reattach to docker logger process: driver=docker error="failed to reattach to docker logger process: Reattachment process not found" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] 
client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=a37da363-9048-86d5-93e1-d6facf1490b1 task=minio type=Received msg="Task received by client" failed=false 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:01.905Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker address=/tmp/plugin2197702264 network=unix @module=docker_logger timestamp=2023-05-11T09:33:01.904Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6543 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] 
client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6557 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: failed to reattach to docker logger process: driver=docker error="failed to reattach to docker logger process: Reattachment process not found" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:01.874Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] agent.joiner: retry join completed: initial_servers=1 agent_mode=server 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin1551535158 network=unix timestamp=2023-05-11T09:33:01.872Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad: memberlist: Stream connection from=10.21.21.41:43200 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿž] nomad: memberlist: Initiating push/pull sync with: 10.21.21.41:4648 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] Region: global (DC: nomadder1) 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] Version: 1.5.5 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] Bind Addrs: HTTP: [0.0.0.0:4646]; RPC: 0.0.0.0:4647; Serf: 0.0.0.0:4648 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] ==> Nomad agent started! 
Log data will stream in below:
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [✅] Client: false
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [✅] Server: true
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [✅] Advertise Addrs: HTTP: 10.21.21.41:4646; RPC: 10.21.21.41:4647; Serf: 10.21.21.41:4648
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [✅] ==> Nomad agent configuration:
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [✅] Log Level: TRACE
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [✅] agent.joiner: starting retry join: servers=10.21.21.41
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [🐞] http: UI is enabled
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [👀] consul.sync: execute sync: reason=operations
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [⚠] nomad: serf: Failed to re-join any previously known node
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [🐞] nomad.keyring.replicator: starting encryption key replication
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed worker status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Starting to=Started
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [🐞] worker: running: worker_id=b4f65730-d7a1-d275-daa0-ee938e535b2b
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [🐞] worker: running: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=3f4aa19f-32ab-9278-9c9a-42b8f845a391 from=UnknownStatus to=Running
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [🐞] worker: running: worker_id=3f4aa19f-32ab-9278-9c9a-42b8f845a391
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [✅] nomad: adding server: server="master-01.global (Addr: 10.21.21.41:4647) (DC: nomadder1)"
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=Running to=WaitingToDequeue
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=3f4aa19f-32ab-9278-9c9a-42b8f845a391 from=Running to=WaitingToDequeue
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed worker status: worker_id=3f4aa19f-32ab-9278-9c9a-42b8f845a391 from=Starting to=Started
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 from=UnknownStatus to=Running
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=b4f65730-d7a1-d275-daa0-ee938e535b2b from=UnknownStatus to=Running
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [👀] consul.sync: commit sync operations: ops="<3, 2, 0, 0>"
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=b4f65730-d7a1-d275-daa0-ee938e535b2b from=Running to=WaitingToDequeue
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed worker status: worker_id=b4f65730-d7a1-d275-daa0-ee938e535b2b from=Starting to=Started
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [✅] nomad: serf: EventMemberJoin: master-01.global 10.21.21.41
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [🐞] nomad: started scheduling worker: id=b4f65730-d7a1-d275-daa0-ee938e535b2b index=1 of=4
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [🐞] nomad: started scheduling worker: id=3f4aa19f-32ab-9278-9c9a-42b8f845a391 index=2 of=4
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd from=UnknownStatus to=Running
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed worker status: worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd from=Starting to=Started
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [🐞] nomad: started scheduling worker: id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd index=3 of=4
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [🐞] worker: running: worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [✅] nomad: started scheduling worker(s): num_workers=4 schedulers=["service", "batch", "system", "sysbatch", "_core"]
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [⚠] agent: not registering Nomad HTTPS Health Check because verify_https_client enabled
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=be4f3909-4ba3-bb81-a30e-439e0dc7d4cd from=Running to=WaitingToDequeue
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [🐞] nomad: started scheduling worker: id=c6a5280c-1c28-6698-48be-ddb345a3a6f0 index=4 of=4
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [✅] nomad: starting scheduling worker(s): num_workers=4 schedulers=["service", "batch", "system", "sysbatch", "_core"]
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [✅] nomad.raft: entering follower state: follower="Node at 10.21.21.41:4647 [Follower]" leader-address= leader-id=
2023-05-11T11:33:01+02:00 [nomad.service 💻 master-01] [✅] nomad.raft: initial configuration: index=1 servers="[{Suffrage:Voter ID:a1c8b791-e2b8-7606-19bd-988341a75d1b Address:10.21.21.41:4647}]"
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6543
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [⚠] client.driver_mgr.docker: failed to reattach to docker logger process: driver=docker error="failed to reattach to docker logger process: Reattachment process not found"
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6528
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres type=Received msg="Task received by client" failed=false
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=keycloak_postgres type=Received msg="Task received by client" failed=false
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:01.835Z
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin218009496 network=unix timestamp=2023-05-11T09:33:01.833Z
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6509
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [⚠] client.driver_mgr.docker: failed to reattach to docker logger process: driver=docker error="failed to reattach to docker logger process: Reattachment process not found"
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6528
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [✅] client.alloc_runner.task_runner: Task event: alloc_id=a04015b3-dc90-7f18-8bfd-c1cf7bc37eff task=grafana-agent type=Received msg="Task received by client" failed=false
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [⚠] client: found an alloc without any local state, skipping restore: alloc_id=96bd861f-1c6d-6f22-8492-fdbb32013944
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [⚠] client: found an alloc without any local state, skipping restore: alloc_id=92f15fd0-3f1b-fcb4-ac6f-0ff3599f7d2c
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [⚠] client: found an alloc without any local state, skipping restore: alloc_id=8f9cc0d0-b538-bbb1-b51d-642f08f483aa
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:01.781Z
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin2326967399 network=unix timestamp=2023-05-11T09:33:01.778Z
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [✅] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6496
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF"
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"]
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6509
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [⚠] client.driver_mgr.docker: failed to reattach to docker logger process: driver=docker error="failed to reattach to docker logger process: Reattachment process not found"
2023-05-11T11:33:01+02:00 [nomad.service 💻 worker-01] [🐞] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad
2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:01.736Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker address=/tmp/plugin1421919878 network=unix @module=docker_logger timestamp=2023-05-11T09:33:01.734Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6496 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: failed to reattach to docker logger process: driver=docker error="failed to reattach to docker logger process: Reattachment process not found" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6485 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=connect-proxy-keycloak type=Received msg="Task received by client" failed=false 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] 
[๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: read unix @->/tmp/plugin101337908: read: connection reset by peer" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=keycloak type=Received msg="Task received by client" failed=false 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=8f3fb4a6-629a-7afd-a334-5580bf2d3374 task=await-for-keycloak-postgres type=Received msg="Task received by client" failed=false 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:01.691Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=89cbff61-bc9d-b0d3-3122-6d8c45446ec2 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin101337908 network=unix timestamp=2023-05-11T09:33:01.689Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6473 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:01+02:00 
[nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6485 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: failed to reattach to docker logger process: driver=docker error="failed to reattach to docker logger process: Reattachment process not found" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: read unix @->/tmp/plugin4179613657: read: connection reset by peer" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=86dcc3e0-5f12-7d37-85c4-1d9b6c82c075 task=tempo type=Received msg="Task received by client" failed=false 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=5549db2e-90d8-4b45-aca4-9cf0225f4440 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=698f6ca4-4b82-afd4-0a1d-d02ab5ec5052 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=62e0cb44-9b5b-0768-a446-b38185ed6a12 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=5ab98b09-da12-c159-86ea-9151325c1456 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger 
timestamp=2023-05-11T09:33:01.653Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker network=unix @module=docker_logger address=/tmp/plugin4179613657 timestamp=2023-05-11T09:33:01.652Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6458 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6473 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: failed to reattach to docker logger process: driver=docker error="failed to reattach to docker logger process: Reattachment process not found" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:01.615Z 2023-05-11T11:33:01+02:00 
[nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin1184554221 network=unix timestamp=2023-05-11T09:33:01.613Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: failed to reattach to docker logger process: driver=docker error="failed to reattach to docker logger process: Reattachment process not found" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6458 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6444 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats type=Received msg="Task received by client" failed=false 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task 
event: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter type=Received msg="Task received by client" failed=false 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=53bfe1a2-7bc0-cc6a-7afa-91eaa6cf89d8 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=53f0afc0-ef0b-cc85-e09d-d7ccb130242e 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=50a55d0e-9d85-19e6-2fdf-8a47a8786aea 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:01.576Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin3317504932 network=unix timestamp=2023-05-11T09:33:01.573Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: failed to reattach to docker logger process: driver=docker error="failed to reattach to docker logger process: Reattachment process not found" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker 
path=/usr/local/bin/nomad pid=6430 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6444 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: read unix @->/tmp/plugin1115473686: read: connection reset by peer" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:01.540Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin1115473686 network=unix timestamp=2023-05-11T09:33:01.538Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:01+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=service:_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:1 2023-05-11T11:33:01+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced check: check=service:_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:2 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6430 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC 
address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.driver_mgr.docker.docker_logger: plugin process exited: driver=docker path=/usr/local/bin/nomad pid=6417 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: failed to reattach to docker logger process: driver=docker error="failed to reattach to docker logger process: Reattachment process not found" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin exited: driver=docker 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.stdio: received EOF, stopping recv loop: driver=docker err="rpc error: code = Unavailable desc = error reading from server: EOF" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana type=Received msg="Task received by client" failed=false 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=grafana type=Received msg="Task received by client" failed=false 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=300ff738-559c-69c9-a33b-fc42e5938d3e 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=2aab7f60-5bad-ab5d-2065-5a4cb8c6eacc 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: using client connection initialized from 
environment: driver=docker @module=docker_logger timestamp=2023-05-11T09:33:01.502Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: using plugin: driver=docker version=2 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger.nomad: plugin address: driver=docker @module=docker_logger address=/tmp/plugin1157167383 network=unix timestamp=2023-05-11T09:33:01.499Z 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: waiting for RPC address: driver=docker path=/usr/local/bin/nomad 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: plugin started: driver=docker path=/usr/local/bin/nomad pid=6417 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.driver_mgr.docker: failed to reattach to docker logger process: driver=docker error="failed to reattach to docker logger process: Reattachment process not found" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr.docker.docker_logger: starting plugin: driver=docker path=/usr/local/bin/nomad args=["/usr/local/bin/nomad", "docker_logger"] 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.alloc_runner.task_runner: Task event: alloc_id=24c845fb-fff0-3707-fe56-95075c5f28dc task=mimir type=Received msg="Task received by client" failed=false 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=062871b4-f7d8-849b-883f-d1e0200be207 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=11661c6f-d11e-554a-c71a-1b010cf6b40a 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=1b8a9295-dea2-2e19-c553-e02632f753a4 
2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=226ddb6f-a04a-a45b-4ec1-56d97cd62d31 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=1d133514-b67c-3420-fa12-95a7f77d098f 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client: found an alloc without any local state, skipping restore: alloc_id=07860884-b699-1463-1929-14a35b20929c 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr: initial driver fingerprint: driver=docker health=healthy description=Healthy 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr: detected drivers: drivers="map[healthy:[exec docker] undetected:[raw_exec qemu java]]" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.plugin: finished plugin manager initial fingerprint: plugin-type=driver 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.consul: discovered following servers: servers=[10.21.21.41:4647] 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.server_mgr: new server list: new_servers=[10.21.21.41:4647] old_servers=[] 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.consul: bootstrap contacting Consul DCs: consul_dcs=["nomadder1"] 2023-05-11T11:33:01+02:00 [consul.service ๐Ÿ’ป master-01] [โœ…] agent.server.serf.lan: serf: Re-joined to previously known node: worker-01: 10.21.21.42:8301 2023-05-11T11:33:01+02:00 [consul.service ๐Ÿ’ป master-01] [โœ…] agent.server.serf.lan: serf: Attempting re-join to previously known node: worker-01: 10.21.21.42:8301 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr: initial driver fingerprint: driver=exec health=healthy description=Healthy 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr: initial driver 
fingerprint: driver=java health=undetected description="" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.plugin: finished plugin manager initial fingerprint: plugin-type=device 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.plugin: waiting on plugin manager initial fingerprint: plugin-type=device 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr: initial driver fingerprint: driver=qemu health=undetected description="" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.plugin: waiting on plugin manager initial fingerprint: plugin-type=driver 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.driver_mgr: initial driver fingerprint: driver=raw_exec health=undetected description=disabled 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.plugin: starting plugin manager: plugin-type=device 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.device_mgr: exiting since there are no device plugins 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.plugin: starting plugin manager: plugin-type=driver 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr: detected fingerprints: node_attrs=["arch", "bridge", "cgroup", "consul", "cpu", "host", "network", "nomad", "plugins_cni", "signal", "storage"] 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr.env_digitalocean: failed to request metadata: attribute=region error="Get \"http://169.254.169.254/metadata/v1/region\": dial tcp 169.254.169.254:80: connect: connection refused" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.plugin: starting plugin manager: plugin-type=csi 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr.env_gce: could not read value for attribute: attribute=machine-type error="Get 
\"http://169.254.169.254/computeMetadata/v1/instance/machine-type\": dial tcp 169.254.169.254:80: connect: connection refused" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr.env_gce: error querying GCE Metadata URL, skipping 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr.env_azure: could not read value for attribute: attribute=compute/azEnvironment error="Get \"http://169.254.169.254/metadata/instance/compute/azEnvironment?api-version=2019-06-04&format=text\": dial tcp 169.254.169.254:80: connect: connection refused" 2023-05-11T11:33:01+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr: fingerprinting periodically: fingerprinter=vault initial_period=15s 2023-05-11T11:33:01+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-59629ee0b6023dbefc279efc79db5da8e0b58b04 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr.network: unable to parse link speed: path=/sys/class/net/docker0/speed device=docker0 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr.network: link speed could not be detected, falling back to default speed: interface=docker0 mbits=1000 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.fingerprint_mgr.network: unable to parse speed: path=/usr/sbin/ethtool device=docker0 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr.network: link speed could not be detected, falling back to default speed: interface=lo mbits=1000 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.fingerprint_mgr.network: unable to parse speed: path=/usr/sbin/ethtool device=lo 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr.network: unable to read link speed: path=/sys/class/net/lo/speed device=lo 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] 
client.fingerprint_mgr.network: link speed detected: interface=eth0 mbits=1000 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr.network: detected interface IP: interface=eth0 IP=192.168.68.141 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=_nomad-check-d3a0a820c835edfd96b2b113480c431d255b8840 error="dial tcp 10.21.21.42:80: connect: connection refused" 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-d3a0a820c835edfd96b2b113480c431d255b8840 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr.cpu: detected CPU model: name="11th Gen Intel(R) Core(TM) i5-11500H @ 2.90GHz" 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr.cpu: client configuration reserves these cores for node: cores=[] 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr.cpu: set of reservable cores available for tasks: cores=[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11] 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr.cpu: detected CPU frequency: mhz=2918 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr.cpu: detected CPU core count: EXTRA_VALUE_AT_END=12 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.fingerprint_mgr.consul: consul agent is available 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr: fingerprinting periodically: fingerprinter=consul initial_period=15s 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client.fingerprint_mgr.cgroup: cgroups are available 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr: CNI config dir is not set or does not exist, skipping: cni_config_dir=/opt/cni/config 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] 
[๐Ÿž] client.fingerprint_mgr: fingerprinting periodically: fingerprinter=cgroup initial_period=15s 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.fingerprint_mgr: built-in fingerprints: fingerprinters=["arch", "bridge", "cgroup", "cni", "consul", "cpu", "host", "landlock", "memory", "network", "nomad", "plugins_cni", "signal", "storage", "vault", "env_aws", "env_gce", "env_azure", "env_digitalocean"] 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client: using alloc directory: alloc_dir=/opt/services/core/nomad/data/alloc 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] client.cpuset.v2: initializing with: cores=0-11 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client: using dynamic ports: min=20000 max=32000 reserved="" 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] client: using state directory: state_dir=/opt/services/core/nomad/data/client 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: detected plugin: name=java type=driver plugin_version=0.1.0 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: detected plugin: name=raw_exec type=driver plugin_version=0.1.0 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: detected plugin: name=exec type=driver plugin_version=0.1.0 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: detected plugin: name=docker type=driver plugin_version=0.1.0 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] agent: detected plugin: name=qemu type=driver plugin_version=0.1.0 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent.plugin_loader.docker: using client connection initialized from environment: plugin_dir=/opt/services/core/nomad/data/plugins 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [๐Ÿž] agent.plugin_loader.docker: using client connection initialized from environment: plugin_dir=/opt/services/core/nomad/data/plugins 
2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] agent.plugin_loader: skipping external plugins since plugin_dir doesn't exist: plugin_dir=/opt/services/core/nomad/data/plugins 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ==> Starting Nomad agent... 2023-05-11T11:33:00+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ==> Loaded configuration from /etc/nomad.d/client.hcl, /etc/nomad.d/nomad_volume_stack_core_keycloak_postgres_volume.hcl, /etc/nomad.d/nomad_volume_stack_core_minio_volume.hcl, /etc/nomad.d/nomad_volume_stack_observability_grafana_agent_volume.hcl, /etc/nomad.d/nomad_volume_stack_observability_grafana_volume.hcl, /etc/nomad.d/nomad_volume_stack_observability_loki_volume.hcl, /etc/nomad.d/nomad_volume_stack_observability_mimir_volume.hcl, /etc/nomad.d/nomad_volume_stack_observability_nats_volume.hcl, /etc/nomad.d/nomad_volume_stack_observability_tempo_volume.hcl 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-a04015b3-dc90-7f18-8bfd-c1cf7bc37eff-group-grafana-agent-grafana-agent-ready-server 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-24c845fb-fff0-3707-fe56-95075c5f28dc-group-mimir-mimir-api 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-54969951-d541-ae97-922a-7db38096bae5-group-nats-nats-client 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432 
2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-d9d9c594-c60d-e002-6448-2a9b8b5fa6ec-traefik-traefik- 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075-group-tempo-tempo-tempo 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075-group-tempo-tempo-jaeger-jaeger 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-a04015b3-dc90-7f18-8bfd-c1cf7bc37eff-group-grafana-agent-grafana-agent-health-server 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-dcd787cd-aa19-0c02-2a4c-676ab7dd0520-group-loki-loki-http 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-a37da363-9048-86d5-93e1-d6facf1490b1-minio-minio-console-console 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075-group-tempo-tempo-zipkin-zipkin 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075-group-tempo-tempo-otlp-grpc-otlp_grpc 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-86dcc3e0-5f12-7d37-85c4-1d9b6c82c075-group-tempo-tempo-otlp-http-otlp_http 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป 
worker-01] [โœ…] agent: Synced service: service=_nomad-task-a37da363-9048-86d5-93e1-d6facf1490b1-minio-minio-http 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced service: service=_nomad-task-54969951-d541-ae97-922a-7db38096bae5-group-nats-nats-prometheus-exporter-prometheus-exporter 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Synced node info 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 error="dial tcp 10.21.21.42:4318: connect: connection refused" 2023-05-11T11:33:00+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-7850eac4d44b9973bac08e173ca6196df1c64745 2023-05-11T11:32:59+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=_nomad-check-3517b2043f72a9d165cfb59c6dd0cf084220e4a5 2023-05-11T11:32:59+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] nomad.raft: snapshot restore progress: id=6-9047-1683653626275 last-index=9047 last-term=6 size-in-bytes=7208431 read-bytes=7208431 percent-complete="100.00%" 2023-05-11T11:32:59+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] nomad.raft: restored from snapshot: id=6-9047-1683653626275 last-index=9047 last-term=6 size-in-bytes=7208431 2023-05-11T11:32:59+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=service:_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:1 2023-05-11T11:32:59+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=service:_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy:1 error="dial tcp 
10.21.21.42:22453: connect: connection refused" 2023-05-11T11:32:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] consul.sync: Consul supports TLSSkipVerify 2023-05-11T11:32:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] consul.sync: execute sync: reason=periodic 2023-05-11T11:32:59+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] consul.sync: able to contact Consul 2023-05-11T11:32:59+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] nomad.raft: starting restore from snapshot: id=6-9047-1683653626275 last-index=9047 last-term=6 size-in-bytes=7208431 2023-05-11T11:32:59+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] nomad: setting up raft bolt store: no_freelist_sync=false 2023-05-11T11:32:59+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] ==> Starting Nomad agent... 2023-05-11T11:32:59+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] ==> WARNING: Bootstrap mode enabled! Potentially unsafe operation. 2023-05-11T11:32:59+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] WARNING: keyring exists but -encrypt given, using keyring 2023-05-11T11:32:59+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] ==> Loaded configuration from /etc/nomad.d/server.hcl 2023-05-11T11:32:59+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check is now critical: check=service:_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy:1 2023-05-11T11:32:59+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent: Check socket connection failed: check=service:_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy:1 error="dial tcp 10.21.21.42:24296: connect: connection refused" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-leaf error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy 
proxy=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy service_id=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy id=leaf error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy service_id=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy id=mesh error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=compiled-discovery-chain error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-leaf error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=intention-match error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=config-entry error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-leaf error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป 
worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy service_id=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy id=intentions error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=config-entry error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=config-entry error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy service_id=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy id=intentions error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy 
proxy=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy service_id=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy id=mesh error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=intention-match error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=intention-match error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy service_id=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy id=discovery-chain:keycloak-postgres error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=config-entry error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy 
service_id=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy id=mesh error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent.client: adding server: server="master-01 (Addr: tcp/10.21.21.41:8300) (DC: nomadder1)" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: (LAN) joined: number_of_nodes=1 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy service_id=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy id=intentions error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=intention-match error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy 
service_id=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy id=leaf error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy service_id=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy id=mesh error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy service_id=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy id=intentions error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=config-entry error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy service_id=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy id=mesh error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy service_id=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy id=leaf error="error filling agent cache: No known 
Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=intention-match error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent: Join cluster completed. Synced with initial agents: cluster=LAN num_agents=1 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] agent.client.serf.lan: serf: EventMemberJoin: master-01 10.21.21.41 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy service_id=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy id=intentions error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-leaf error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy service_id=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy id=intentions error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers 
available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy service_id=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy id=leaf error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy service_id=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy id=mesh error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy service_id=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy id=intentions error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] 
agent.cache: handling error in Cache.Notify: cache-type=connect-ca-leaf error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy service_id=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy id=intentions error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=intention-match error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=compiled-discovery-chain error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy service_id=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy id=discovery-chain:keycloak-postgres error="error filling agent cache: No 
known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy service_id=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy id=leaf error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-leaf error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy service_id=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy id=intentions error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers 
available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy service_id=_nomad-task-8f3fb4a6-629a-7afd-a334-5580bf2d3374-group-keycloak-keycloak-8080-sidecar-proxy id=mesh error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy service_id=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy id=leaf error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.router.manager: No servers available 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy service_id=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy id=mesh error="error filling agent cache: No known Consul servers" 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=intention-match error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: cache-type=intention-match error="No known Consul servers" index=0 2023-05-11T11:32:58+02:00 [consul.service ๐Ÿ’ป worker-01] [โš ] agent.cache: handling error in Cache.Notify: 
cache-type=config-entry error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy service_id=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy id=mesh error="error filling agent cache: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy service_id=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy id=intentions error="error filling agent cache: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-leaf error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=intention-match error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=config-entry error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=intention-match error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=config-entry error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy service_id=_nomad-task-434f71a9-4b50-8512-effc-5858456f87be-group-grafana-grafana-3000-sidecar-proxy id=leaf error="error filling agent cache: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [✅] agent: Consul agent running!
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=config-entry error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=config-entry error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy service_id=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy id=mesh error="error filling agent cache: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent.anti_entropy: failed to sync remote state: error="No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent.proxycfg: Failed to handle update from watch: kind=connect-proxy proxy=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy service_id=_nomad-task-a1aad1ed-27cd-b3ad-a9a0-075a42fac82d-group-keycloak-postgres-keycloak-postgres-5432-sidecar-proxy id=leaf error="error filling agent cache: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [✅] agent: (LAN) joining: lan_addresses=["10.21.21.41"]
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [✅] agent: Joining cluster...: cluster=LAN
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [✅] agent: Started gRPC listeners: port_name=grpc_tls address=127.0.0.1:8503 network=tcp
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [✅] agent: Retry join is supported for the following discovery methods: cluster=LAN discovery_methods="aliyun aws azure digitalocean gce hcp k8s linode mdns os packet scaleway softlayer tencentcloud triton vsphere"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [✅] agent: started state syncer
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-leaf error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [✅] agent: Starting server: address=[::]:8501 network=tcp protocol=https
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [✅] agent: Started DNS server: address=0.0.0.0:8600 network=tcp
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [✅] agent: Started DNS server: address=0.0.0.0:8600 network=udp
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent: error handling service update: error="error watching service config: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=resolved-service-config error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=resolved-service-config error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent: error handling service update: error="error watching service config: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent: error handling service update: error="error watching service config: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent: error handling service update: error="error watching service config: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent: error handling service update: error="error watching service config: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=resolved-service-config error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent: error handling service update: error="error watching service config: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=resolved-service-config error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent: error handling service update: error="error watching service config: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=resolved-service-config error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=resolved-service-config error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=resolved-service-config error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent: error handling service update: error="error watching service config: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=resolved-service-config error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent: error handling service update: error="error watching service config: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=resolved-service-config error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=resolved-service-config error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent: error handling service update: error="error watching service config: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent: error handling service update: error="error watching service config: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [❌] agent: error handling service update: error="error watching service config: No known Consul servers"
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=resolved-service-config error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=resolved-service-config error="No known Consul servers" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [✅] agent.auto_config: auto-config started
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.router.manager: No servers available
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="No known Consul servers" index=9
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [✅] agent.router: Initializing LAN area manager
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [✅] agent.client.serf.lan: serf: EventMemberJoin: worker-01 10.21.21.42
2023-05-11T11:32:58+02:00 [consul.service 💻 worker-01] [✅] agent.auto_config: automatically upgraded to TLS
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.server: member joined, marking health alive: member=worker-01 partition=default
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.server.serf.lan: serf: EventMemberJoin: worker-01 10.21.21.42
2023-05-11T11:32:58+02:00 [nomad.service 💻 worker-01] [✅] Waiting for consul.service.consul 8501 to open...
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.server: deregistering member: member=worker-02 partition=default reason=reaped
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.server: deregistering member: member=worker-03 partition=default reason=reaped
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent: Synced check: check=_nomad-check-c9bab43dc01aaa4f2fb76ff56f6bc875a603b652
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent: Synced check: check=_nomad-check-b7d4b44b6996d12c68ac89cc6783c9bd6e55ce3e
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.leader: started routine: routine="CA signing expiration metric"
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.leader: started routine: routine="CA root pruning"
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.server: deregistering member: member=worker-01 partition=default reason=reaped
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.leader: started routine: routine="streaming peering resources"
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.leader: started routine: routine="intermediate cert renew watch"
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.leader: started routine: routine="virtual IP version check"
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.leader: started routine: routine="metrics for streaming peering resources"
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.leader: stopped routine: routine="virtual IP version check"
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.leader: started routine: routine="federation state pruning"
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.leader: started routine: routine="CA root expiration metric"
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.leader: stopping routine: routine="virtual IP version check"
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.leader: started routine: routine="config entry controllers"
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] connect.ca: initialized primary datacenter CA from existing CARoot with provider: provider=consul
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.leader: started routine: routine="peering deferred deletion"
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.leader: started routine: routine="federation state anti-entropy"
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent: Synced node info
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.server.autopilot: reconciliation now enabled
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [❌] agent.server.cert-manager: failed to handle cache update event: error="leaf cert watch returned an error: CA is uninitialized and unable to sign certificates yet: provider is nil"
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-leaf error="CA is uninitialized and unable to sign certificates yet: provider is nil" index=0
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [⚠] agent: Check socket connection failed: check=_nomad-check-c9bab43dc01aaa4f2fb76ff56f6bc875a603b652 error="dial tcp 0.0.0.0:4647: connect: connection refused"
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [⚠] agent: Check is now critical: check=_nomad-check-c9bab43dc01aaa4f2fb76ff56f6bc875a603b652
2023-05-11T11:32:58+02:00 [consul.service 💻 master-01] [✅] agent.server.serf.lan: serf: Attempting re-join to previously known node: worker-03: 10.21.21.44:8301
2023-05-11T11:32:57+02:00 [consul.service 💻 master-01] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-leaf error="CA is uninitialized and unable to sign certificates yet: provider is nil" index=0
2023-05-11T11:32:57+02:00 [consul.service 💻 master-01] [❌] agent.server.cert-manager: failed to handle cache update event: error="leaf cert watch returned an error: CA is uninitialized and unable to sign certificates yet: provider is nil"
2023-05-11T11:32:57+02:00 [consul.service 💻 master-01] [✅] agent.server: cluster leadership acquired
2023-05-11T11:32:57+02:00 [consul.service 💻 master-01] [✅] agent.server: New leader elected: payload=master-01
2023-05-11T11:32:57+02:00 [consul.service 💻 master-01] [✅] agent.server.raft: entering leader state: leader="Node at 10.21.21.41:8300 [Leader]"
2023-05-11T11:32:57+02:00 [consul.service 💻 master-01] [✅] agent.server.raft: election won: term=8 tally=1
2023-05-11T11:32:57+02:00 [consul.service 💻 master-01] [⚠] agent.server.raft: heartbeat timeout reached, starting election: last-leader-addr= last-leader-id=
2023-05-11T11:32:57+02:00 [consul.service 💻 master-01] [✅] agent.server.raft: entering candidate state: node="Node at 10.21.21.41:8300 [Candidate]" term=8
2023-05-11T11:32:56+02:00 [consul.service 💻 master-01] [⚠] agent: Check socket connection failed: check=_nomad-check-b7d4b44b6996d12c68ac89cc6783c9bd6e55ce3e error="dial tcp 0.0.0.0:4648: connect: connection refused"
2023-05-11T11:32:56+02:00 [consul.service 💻 master-01] [⚠] agent: Check is now critical: check=_nomad-check-b7d4b44b6996d12c68ac89cc6783c9bd6e55ce3e
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent: Join cluster completed. Synced with initial agents: cluster=LAN num_agents=1
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent: (LAN) joined: number_of_nodes=1
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent: Joining cluster...: cluster=LAN
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent: Started gRPC listeners: port_name=grpc_tls address=127.0.0.1:8503 network=tcp
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent: (LAN) joining: lan_addresses=["10.21.21.41"]
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent: Retry join is supported for the following discovery methods: cluster=LAN discovery_methods="aliyun aws azure digitalocean gce hcp k8s linode mdns os packet scaleway softlayer tencentcloud triton vsphere"
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent: Started DNS server: address=0.0.0.0:8600 network=udp
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent: Starting server: address=[::]:8501 network=tcp protocol=https
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent: started state syncer
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent: Started DNS server: address=0.0.0.0:8600 network=tcp
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent: Consul agent running!
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent.server.cert-manager: initialized server certificate management
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent.server: Handled event for server in area: event=member-join server=master-01.nomadder1 area=wan
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent.server: Adding LAN server: server="master-01 (Addr: tcp/10.21.21.41:8300) (DC: nomadder1)"
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent.server.serf.lan: serf: Attempting re-join to previously known node: worker-02: 10.21.21.43:8301
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent.server.serf.lan: serf: EventMemberJoin: master-01 10.21.21.41
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent.server.autopilot: reconciliation now disabled
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent.router: Initializing LAN area manager
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent.server.serf.wan: serf: EventMemberJoin: master-01.nomadder1 10.21.21.41
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [⚠] agent.server.serf.wan: serf: Failed to re-join any previously known node
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent.server.raft: initial configuration: index=1 servers="[{Suffrage:Voter ID:fa9b4d05-7575-ab9a-1093-da63dba1180f Address:10.21.21.41:8300}]"
2023-05-11T11:32:55+02:00 [consul.service 💻 master-01] [✅] agent.server.raft: entering follower state: follower="Node at 10.21.21.41:8300 [Follower]" leader-address= leader-id=
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [⚠] agent.auto_config: BootstrapExpect is set to 1; this is the same as Bootstrap mode.
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [⚠] agent.auto_config: bootstrap = true: do not enable unless necessary
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [✅] Client Addr: [0.0.0.0] (HTTP: -1, HTTPS: 8501, gRPC: -1, gRPC-TLS: 8503, DNS: 8600)
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [✅] Node ID: 'fa9b4d05-7575-ab9a-1093-da63dba1180f'
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [✅] Cluster Addr: 10.21.21.41 (LAN: 8301, WAN: 8302)
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [✅] Server: true (Bootstrap: true)
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [✅] Build Date: '2023-03-30 17:51:19 +0000 UTC'
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [✅] Node name: 'master-01'
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [✅] ==> Log data will now stream in as it occurs:
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [✅] Auto-Encrypt-TLS: true
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [✅] HTTPS TLS: Verify Incoming: true, Verify Outgoing: true, Min Version: TLSv1_2
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [✅] gRPC TLS: Verify Incoming: true, Min Version: TLSv1_2
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [✅] Datacenter: 'nomadder1' (Segment: '')
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [✅] Version: '1.15.2'
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [✅] Gossip Encryption: true
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [✅] Internal RPC TLS: Verify Incoming: true, Verify Outgoing: true (Verify Hostname: true), Min Version: TLSv1_2
2023-05-11T11:32:54+02:00 [consul.service 💻 master-01] [✅] ==> Starting Consul agent...
2023-05-11T11:32:53+02:00 [consul.service 💻 master-01] [⚠] agent: bootstrap = true: do not enable unless necessary
2023-05-11T11:32:53+02:00 [consul.service 💻 master-01] [⚠] agent: BootstrapExpect is set to 1; this is the same as Bootstrap mode.
2023-05-11T11:32:52+02:00 [nomad.service 💻 worker-01] [✅] Waiting for consul.service.consul 8501 to open...
2023-05-11T11:32:49+02:00 [consul.service 💻 worker-01] [❌] agent.auto_config: AutoEncrypt.Sign RPC failed: addr=10.21.21.41:8300 error="rpcinsecure: error establishing connection: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:32:49+02:00 [consul.service 💻 worker-01] [❌] agent.auto_config: No servers successfully responded to the auto-encrypt request
2023-05-11T11:32:46+02:00 [nomad.service 💻 worker-01] [✅] Waiting for consul.service.consul 8501 to open...
2023-05-11T11:32:44+02:00 [consul.service 💻 worker-01] [❌] agent.auto_config: AutoEncrypt.Sign RPC failed: addr=10.21.21.41:8300 error="rpcinsecure: error establishing connection: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:32:44+02:00 [consul.service 💻 worker-01] [❌] agent.auto_config: No servers successfully responded to the auto-encrypt request
2023-05-11T11:32:42+02:00 [consul.service 💻 worker-01] [❌] agent.auto_config: No servers successfully responded to the auto-encrypt request
2023-05-11T11:32:42+02:00 [consul.service 💻 worker-01] [❌] agent.auto_config: AutoEncrypt.Sign RPC failed: addr=10.21.21.41:8300 error="rpcinsecure: error establishing connection: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [❌] agent.auto_config: AutoEncrypt.Sign RPC failed: addr=10.21.21.41:8300 error="rpcinsecure: error establishing connection: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [❌] agent.auto_config: No servers successfully responded to the auto-encrypt request
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [❌] agent.auto_config: AutoEncrypt.Sign RPC failed: addr=10.21.21.41:8300 error="rpcinsecure: error establishing connection: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [❌] agent.auto_config: No servers successfully responded to the auto-encrypt request
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [✅] Node name: 'worker-01'
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [✅] ==> Starting Consul agent...
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [✅] Gossip Encryption: true
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [✅] Cluster Addr: 10.21.21.42 (LAN: 8301, WAN: 8302)
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [✅] Internal RPC TLS: Verify Incoming: false, Verify Outgoing: true (Verify Hostname: true), Min Version: TLSv1_2
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [✅] Client Addr: [0.0.0.0] (HTTP: -1, HTTPS: 8501, gRPC: -1, gRPC-TLS: 8503, DNS: 8600)
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [✅] gRPC TLS: Verify Incoming: false, Min Version: TLSv1_2
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [✅] Server: false (Bootstrap: false)
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [✅] ==> Log data will now stream in as it occurs:
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [✅] HTTPS TLS: Verify Incoming: false, Verify Outgoing: true, Min Version: TLSv1_2
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [✅] Datacenter: 'nomadder1' (Segment: '')
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [✅] Auto-Encrypt-TLS: true
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [✅] Node ID: '79c8efa4-209d-a1c9-87a8-f18ac6963bdb'
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [✅] Version: '1.15.2'
2023-05-11T11:32:41+02:00 [consul.service 💻 worker-01] [✅] Build Date: '2023-03-30 17:51:19 +0000 UTC'
2023-05-11T11:32:40+02:00 [nomad.service 💻 worker-01] [✅] Waiting for consul.service.consul 8501 to open...
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=DiscoveryChain.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=DiscoveryChain.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=DiscoveryChain.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=DiscoveryChain.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [✅] }. Err: connection error: desc = "transport: Error while dialing dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [✅] "BalancerAttributes": null,
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [✅] "Attributes": null,
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [✅] "Type": 0,
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [✅] "ServerName": "master-01",
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [✅] "Addr": "nomadder1-10.21.21.41:8300",
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [✅] "Metadata": null
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:35+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig
server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:35+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-02] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service 
๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client: error discovering nomad servers: error="client.consul: unable to query Consul datacenters: Unexpected response code: 500 (rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused)" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Catalog.ListDatacenters server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.http: Request error: method=GET url=/v1/catalog/datacenters from=127.0.0.1:43126 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 
10.21.21.41:4647: connect: connection refused" rpc=Node.UpdateStatus server=10.21.21.41:4647 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client: error heartbeating. retrying: error="failed to update status: rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" period=1.328483435s 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=Node.UpdateStatus server=10.21.21.41:4647 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 
2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-02] [โŒ] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: 
failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ) (retry attempt 4 after "2s") 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] agent: (view) nomad.var.block(nomad/jobs/security@default.global): Unexpected response code: 500 (Server error authenticating request 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] http: error authenticating built API request: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" url="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" method=GET 2023-05-11T11:31:34+02:00 
[nomad.service ๐Ÿ’ป worker-01] [โœ…] ) (retry attempt 4 after "2s") 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] agent: (view) nomad.var.block(nomad/jobs/observability@default.global): Unexpected response code: 500 (Server error authenticating request 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] http: error authenticating built API request: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" url="/v1/var/nomad/jobs/observability?index=3271&namespace=default&stale=&wait=60000ms" method=GET 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ) (retry attempt 4 after "2s") 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] agent: (view) nomad.var.block(nomad/jobs/security@default.global): Unexpected response code: 500 (Server error authenticating request 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] http: error authenticating built API request: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" url="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" method=GET 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:34+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: 
error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-02] [โŒ] agent.client: RPC failed to server: method=Catalog.ListDatacenters server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:34+02:00 [consul.service ๐Ÿ’ป worker-02] [โŒ] agent.http: Request error: method=GET url=/v1/catalog/datacenters from=127.0.0.1:56036 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ) (retry attempt 4 after "2s") 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] agent: (view) nomad.var.block(nomad/jobs/minio@default.global): Unexpected response code: 500 (Server error authenticating request 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] http: error authenticating built API request: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" url="/v1/var/nomad/jobs/minio?index=10463&namespace=default&stale=&wait=60000ms" method=GET 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:33+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc 
error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:33+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ) (retry attempt 3 after "1s") 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] agent: (view) nomad.var.block(nomad/jobs/security@default.global): Unexpected response code: 500 (Server error authenticating request 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] http: error authenticating built API request: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" url="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" method=GET 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ) (retry attempt 3 after "1s") 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] agent: (view) nomad.var.block(nomad/jobs/observability@default.global): Unexpected response code: 500 (Server error authenticating request 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI 
server=10.21.21.41:4647 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] http: error authenticating built API request: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" url="/v1/var/nomad/jobs/observability?index=3271&namespace=default&stale=&wait=60000ms" method=GET 2023-05-11T11:31:33+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] "Metadata": null 2023-05-11T11:31:33+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] "Type": 0, 2023-05-11T11:31:33+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] }. Err: connection error: desc = "transport: Error while dialing dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:33+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] "Addr": "nomadder1-10.21.21.41:8300", 2023-05-11T11:31:33+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] "BalancerAttributes": null, 2023-05-11T11:31:33+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] "ServerName": "master-01", 2023-05-11T11:31:33+02:00 [consul.service ๐Ÿ’ป worker-01] [โœ…] "Attributes": null, 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ) (retry attempt 3 after "1s") 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] agent: (view) nomad.var.block(nomad/jobs/security@default.global): Unexpected response code: 500 (Server error authenticating request 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] http: error authenticating built API request: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" url="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" method=GET 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing 
RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:33+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ) (retry attempt 3 after "1s") 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] agent: (view) nomad.var.block(nomad/jobs/minio@default.global): Unexpected response code: 500 (Server error authenticating request 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] http: error authenticating built API request: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" url="/v1/var/nomad/jobs/minio?index=10463&namespace=default&stale=&wait=60000ms" method=GET 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting 
client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ) (retry attempt 2 after "500ms") 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] agent: (view) nomad.var.block(nomad/jobs/security@default.global): Unexpected response code: 500 (Server error authenticating request 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] http: error authenticating built API request: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" url="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" method=GET 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ) (retry attempt 2 after "500ms") 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 
10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] http: error authenticating built API request: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" url="/v1/var/nomad/jobs/observability?index=3271&namespace=default&stale=&wait=60000ms" method=GET 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] agent: (view) nomad.var.block(nomad/jobs/observability@default.global): Unexpected response code: 500 (Server error authenticating request 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ) (retry attempt 2 after "500ms") 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] agent: (view) nomad.var.block(nomad/jobs/security@default.global): Unexpected response code: 500 (Server error authenticating request 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] http: error authenticating built API request: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" url="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" method=GET 
2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:32+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.673971ms 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 
2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection" 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection" 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Health.ChecksInState server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.http: Request error: method=GET url=/v1/health/state/any?index=14159 from=127.0.0.1:57714 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused" 2023-05-11T11:31:32+02:00 [consul.service ๐Ÿ’ป worker-01] [โŒ] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc 
error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [✅] ) (retry attempt 2 after "500ms")
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] http: error authenticating built API request: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" url="/v1/var/nomad/jobs/minio?index=10463&namespace=default&stale=&wait=60000ms" method=GET
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [⚠] agent: (view) nomad.var.block(nomad/jobs/minio@default.global): Unexpected response code: 500 (Server error authenticating request
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=ACL.WhoAmI server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [⚠] agent: (view) nomad.var.block(nomad/jobs/security@default.global): Unexpected response code: 500 (rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused) (retry attempt 1 after "250ms")
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] http: request failed: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" code=500
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] http: request failed: method=GET path="/v1/var/nomad/jobs/observability?index=3271&namespace=default&stale=&wait=60000ms" error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" code=500
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [⚠] agent: (view) nomad.var.block(nomad/jobs/observability@default.global): Unexpected response code: 500 (rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused) (retry attempt 1 after "250ms")
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=29.808255565s
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=Catalog.ListServices server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.http: Request error: method=GET url=/v1/catalog/services?index=14159 from=127.0.0.1:57714 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [⚠] agent: (view) nomad.var.block(nomad/jobs/security@default.global): Unexpected response code: 500 (rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused) (retry attempt 1 after "250ms")
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] http: request failed: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" code=500
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=Node.GetClientAllocs server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client: error querying node allocations: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=DiscoveryChain.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=Node.GetClientAllocs server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=DiscoveryChain.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=DiscoveryChain.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=DiscoveryChain.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.http: Request error: method=GET url=/v1/agent/connect/ca/roots?index=9 from=127.0.0.1:57714 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [⚠] agent: (view) nomad.var.block(nomad/jobs/minio@default.global): Unexpected response code: 500 (rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused) (retry attempt 1 after "250ms")
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] http: request failed: method=GET path="/v1/var/nomad/jobs/minio?index=10463&namespace=default&stale=&wait=60000ms" error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" code=500
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: EOF" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: EOF" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: EOF" rpc=Node.GetClientAllocs server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: EOF" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: EOF" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [✅] "Metadata": null
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [✅] "Addr": "nomadder1-10.21.21.41:8300",
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [✅] "ServerName": "master-01",
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [✅] "BalancerAttributes": null,
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [✅] "Type": 0,
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [✅] "Attributes": null,
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [✅] }. Err: connection error: desc = "transport: Error while dialing dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.rpcclient.health: subscribe call failed: err="rpc error: code = Unavailable desc = error reading from server: EOF" failure_count=1 key=mimir topic=ServiceHealth
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [⚠] agent: (view) health.service(mimir|passing): Unexpected response code: 500 (subscription closed by server, server is shutting down) (retry attempt 1 after "250ms")
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [⚠] agent: (view) health.service(mimir|passing): Unexpected response code: 500 (subscription closed by server, server is shutting down) (retry attempt 1 after "250ms")
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.http: Request error: method=GET url="/v1/health/service/mimir?index=14071&passing=1&stale=&wait=60000ms" from=127.0.0.1:43158 error="rpc error: code = Unknown desc = subscription closed by server, server is shutting down"
2023-05-11T11:31:32+02:00 [nomad.service 💻 worker-01] [⚠] agent: (view) health.service(mimir|passing): Unexpected response code: 500 (subscription closed by server, server is shutting down) (retry attempt 1 after "250ms")
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.rpcclient.health: subscribe call failed: err="rpc error: code = Unknown desc = subscription closed by server, server is shutting down" failure_count=1 key=mimir topic=ServiceHealth
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.http: Request error: method=GET url="/v1/health/service/mimir?index=14071&passing=1&stale=&wait=60000ms" from=127.0.0.1:43142 error="rpc error: code = Unknown desc = subscription closed by server, server is shutting down"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-01] [❌] agent.http: Request error: method=GET url="/v1/health/service/mimir?index=14071&passing=1&stale=&wait=60000ms" from=127.0.0.1:43160 error="rpc error: code = Unknown desc = subscription closed by server, server is shutting down"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=DiscoveryChain.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=DiscoveryChain.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=Intention.Match server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=DiscoveryChain.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=ConnectCA.Roots server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: dial tcp ->10.21.21.41:8300: connect: connection refused"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="rpc error getting client: failed to get conn: read tcp 10.21.21.43:44906->10.21.21.41:8300: read: connection reset by peer" index=9
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=Health.ServiceNodes server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.http: Request error: method=GET url=/v1/agent/connect/ca/roots?index=9 from=127.0.0.1:60948 error="rpc error getting client: failed to get conn: read tcp 10.21.21.43:44906->10.21.21.41:8300: read: connection reset by peer"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=DiscoveryChain.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=ConfigEntry.Get server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [⚠] agent.cache: handling error in Cache.Notify: cache-type=connect-ca-root error="rpc error getting client: failed to get conn: read tcp 10.21.21.43:44906->10.21.21.41:8300: read: connection reset by peer" index=9
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.http: Request error: method=GET url=/v1/health/state/any?index=14159 from=127.0.0.1:60948 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=Health.ChecksInState server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.client: RPC failed to server: method=Catalog.ListServices server=10.21.21.41:8300 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:32+02:00 [consul.service 💻 worker-02] [❌] agent.http: Request error: method=GET url=/v1/catalog/services?index=14159 from=127.0.0.1:60948 error="rpc error getting client: failed to get conn: rpc error: lead thread didn't get connection"
2023-05-11T11:31:31+02:00 [consul.service 💻 worker-02] [⚠] agent.client: Retrying RPC to server: method=ConfigEntry.ResolveServiceConfig server=10.21.21.41:8300 error="rpc error making call: EOF"
2023-05-11T11:31:22+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration="880.9µs"
2023-05-11T11:31:16+02:00 [nomad.service 💻 master-01] [👀] consul.sync: execute sync: reason=periodic
2023-05-11T11:31:12+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.196466ms
2023-05-11T11:31:04+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:36524: EOF
2023-05-11T11:31:04+02:00 [nomad.service 💻 master-01] [👀] http: http: TLS handshake error from 10.21.21.42:36510: EOF
2023-05-11T11:31:02+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=1m1.412965388s
2023-05-11T11:31:02+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration="932.548µs"
2023-05-11T11:30:52+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.141221ms
2023-05-11T11:30:46+02:00 [nomad.service 💻 master-01] [👀] consul.sync: execute sync: reason=periodic
2023-05-11T11:30:42+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.408942ms
2023-05-11T11:30:32+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.142135ms
2023-05-11T11:30:22+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.571192ms
2023-05-11T11:30:16+02:00 [nomad.service 💻 master-01] [👀] consul.sync: execute sync: reason=periodic
2023-05-11T11:30:12+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=3.153618ms
2023-05-11T11:30:11+02:00 [nomad.service 💻 worker-02] [🐞] client: state changed, updating node and re-registering
2023-05-11T11:30:02+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=2.429988ms
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=Scheduling to=WaitingToDequeue
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=WaitingForRaft to=Scheduling
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=Scheduling to=WaitingToDequeue
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=48ec8a5b-0533-6d27-d668-3ae727920592 job_id=observability namespace=default worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a algorithm=spread
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=af5c7bc0-1d03-3a56-8237-c19ea015b837 from=Scheduling to=Paused
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=WaitingForRaft to=Scheduling
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=76dc7fc9-003f-8c36-6ae3-8163553b14e9 job_id=minio namespace=default worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a algorithm=spread
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=Scheduling to=WaitingToDequeue
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=af5c7bc0-1d03-3a56-8237-c19ea015b837 from=WaitingForRaft to=Scheduling
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=8b83211f-50be-5427-0b57-daeaa4f2f7bf job_id=security namespace=default worker_id=af5c7bc0-1d03-3a56-8237-c19ea015b837 algorithm=spread
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker.system_sched.binpack: NewBinPackIterator created: eval_id=3cd5f01d-5756-c795-fa72-587aea82acf8 job_id=ingress namespace=default worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a algorithm=spread
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=af5c7bc0-1d03-3a56-8237-c19ea015b837 from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=WaitingForRaft to=Scheduling
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] client: allocation updates applied: added=0 removed=0 updated=0 ignored=3 errors=0
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] client: allocation updates: added=0 removed=0 updated=0 ignored=3
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] client: updated allocations: index=15336 total=3 pulled=0 filtered=3
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=Scheduling to=WaitingToDequeue
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=13e92ba2-0c6d-4871-b782-167003097c02 job_id=observability namespace=default worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a algorithm=spread
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=WaitingForRaft to=Scheduling
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] client: allocation updates applied: added=0 removed=0 updated=0 ignored=3 errors=0
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] client: allocation updates: added=0 removed=0 updated=0 ignored=3
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] client: updated allocations: index=15312 total=3 pulled=0 filtered=3
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=90866fea-4262-56df-02b1-04bf068f154a from=Scheduling to=Paused
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=1beee303-f51c-6fd7-1fb4-4b1a085ed8fd from=Scheduling to=Paused
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker.service_sched.binpack: NewBinPackIterator created: eval_id=e75453a9-7180-ca03-d29f-feed4dcd6ac7 job_id=security namespace=default worker_id=90866fea-4262-56df-02b1-04bf068f154a algorithm=spread
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=90866fea-4262-56df-02b1-04bf068f154a from=WaitingForRaft to=Scheduling
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=90866fea-4262-56df-02b1-04bf068f154a from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=1beee303-f51c-6fd7-1fb4-4b1a085ed8fd from=WaitingToDequeue to=WaitingForRaft
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=1beee303-f51c-6fd7-1fb4-4b1a085ed8fd from=WaitingForRaft to=Scheduling
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker.system_sched.binpack: NewBinPackIterator created: eval_id=31bafa26-9d17-35ae-059c-744e11fadf8f job_id=ingress namespace=default worker_id=1beee303-f51c-6fd7-1fb4-4b1a085ed8fd algorithm=spread
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=PUT path=/v1/node/e2eb7460-2bca-ac62-5c53-999281062667/eligibility duration=2.241591ms
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/node/e2eb7460-2bca-ac62-5c53-999281062667 duration=2.484613ms
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/nodes?prefix=e2eb7460-2bca-ac62-5c53-999281062667 duration=1.003869514s
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] client: evaluations triggered by node registration: num_evals=3
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] nomad.drain.job_watcher: getting job allocs at index: index=15327
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] nomad.drain.job_watcher: retrieved allocs for draining jobs: num_allocs=0 index=15327
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] nomad.drain.job_watcher: getting job allocs at index: index=1
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [✅] nomad: blocked evals status modified: paused=false
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [✅] nomad: eval broker status modified: paused=false
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed worker status: worker_id=af5c7bc0-1d03-3a56-8237-c19ea015b837 from=Started to=Pausing
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed worker status: worker_id=1beee303-f51c-6fd7-1fb4-4b1a085ed8fd from=Started to=Pausing
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] worker: changed worker status: worker_id=90866fea-4262-56df-02b1-04bf068f154a from=Started to=Pausing
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=5d8e26b8-ab80-c34a-ba11-c7116e9c076b job_id=security namespace=default algorithm=spread
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=a8d986ce-4c43-e59b-457e-fcc8bd667af2 job_id=observability namespace=default algorithm=spread
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=2277c100-b623-7c17-b88b-70f065d25e17 job_id=minio namespace=default algorithm=spread
2023-05-11T11:30:01+02:00 [nomad.service 💻 master-01] [👀] nomad.fsm.system_sched.binpack: NewBinPackIterator created: eval_id=1e5742ff-357e-0195-0cb2-301cdfef2d6f job_id=ingress namespace=default algorithm=spread
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] agent: (runner) all templates rendered
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] agent: (runner) watching 1 dependencies
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] agent: (runner) nomad.var.block(nomad/jobs/security@default.global) is still needed
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] agent: (runner) diffing and updating dependencies
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/e5fea251-9d66-4cf4-2090-83ae30046fb1/forwardauth/secrets/env.vars"
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] agent: (runner) receiving dependency nomad.var.block(nomad/jobs/security@default.global)
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] agent: (runner) initiating run
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] agent: (runner) checking template 550a891296c5b27beb51ed9e1b2f00e5
2023-05-11T11:30:01+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" duration=1.891883ms
2023-05-11T11:30:00+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:30:00+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:30:00+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:30:00+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:30:00+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/agent/self duration="956.564µs"
2023-05-11T11:30:00+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:30:00+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:30:00+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:30:00+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:30:00+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:29:59+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:29:59+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:29:59+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:29:59+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:29:59+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:29:59+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/nodes?prefix=e2eb7460-2bca-ac62-5c53-999281062667 duration=5.110327035s
2023-05-11T11:29:59+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:29:59+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:29:59+02:00 [nomad.service 💻 worker-01] [❌] http: request failed: method=GET path=/v1/nodes?prefix=36d1fc65-c097-97bc-18ac-079c1262ccfd error="rpc error: Not ready to serve consistent reads" code=500
2023-05-11T11:29:59+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: Not ready to serve consistent reads" rpc=Node.List server=10.21.21.41:4647
2023-05-11T11:29:59+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: Not ready to serve consistent reads" rpc=Node.List server=10.21.21.41:4647
2023-05-11T11:29:59+02:00 [nomad.service 💻 worker-02] [🐞] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms"
2023-05-11T11:29:59+02:00 [nomad.service 💻 worker-02] [🐞]
http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:58+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:58+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:58+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:58+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:58+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:58+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:58+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:58+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:58+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:57+02:00 
[nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client: error querying node allocations: error="rpc error: Not ready to serve consistent reads" 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: Not ready to serve consistent reads" rpc=Node.GetClientAllocs server=10.21.21.41:4647 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: Not ready to serve consistent reads" rpc=Node.GetClientAllocs server=10.21.21.41:4647 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: Not ready to serve consistent reads" rpc=Node.Register 
server=10.21.21.41:4647 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: Not ready to serve consistent reads" rpc=Node.Register server=10.21.21.41:4647 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client: error registering: error="rpc error: Not ready to serve consistent reads" 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=1beee303-f51c-6fd7-1fb4-4b1a085ed8fd from=Backoff to=WaitingToDequeue 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=af5c7bc0-1d03-3a56-8237-c19ea015b837 from=Backoff to=WaitingToDequeue 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=90866fea-4262-56df-02b1-04bf068f154a from=Backoff to=WaitingToDequeue 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=Backoff to=WaitingToDequeue 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=1beee303-f51c-6fd7-1fb4-4b1a085ed8fd from=WaitingToDequeue to=Backoff 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=af5c7bc0-1d03-3a56-8237-c19ea015b837 from=WaitingToDequeue to=Backoff 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: 
method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=90866fea-4262-56df-02b1-04bf068f154a from=WaitingToDequeue to=Backoff 2023-05-11T11:29:57+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=WaitingToDequeue to=Backoff 2023-05-11T11:29:56+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:56+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:56+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:56+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:56+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:56+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:56+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:56+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET 
url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:56+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:55+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:55+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:55+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:55+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:55+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:55+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:55+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:55+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:55+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to 
authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:54+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:54+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:54+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:54+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:54+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:54+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:54+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration="726.638ยตs" 2023-05-11T11:29:54+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:54+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:54+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: 
method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:53+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:53+02:00 [consul.service ๐Ÿ’ป master-01] [โœ…] agent: Synced check: check=_nomad-check-b7d4b44b6996d12c68ac89cc6783c9bd6e55ce3e 2023-05-11T11:29:53+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:53+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:53+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:53+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:53+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:53+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:53+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: request complete: method=GET path=/v1/nodes?prefix=e2eb7460-2bca-ac62-5c53-999281062667 duration=5.086399607s 2023-05-11T11:29:53+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET 
url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] http: request failed: method=GET path=/v1/nodes?prefix=36d1fc65-c097-97bc-18ac-079c1262ccfd error="rpc error: Not ready to serve consistent reads" code=500 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: Not ready to serve consistent reads" rpc=Node.List server=10.21.21.41:4647 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: Not ready to serve consistent reads" rpc=Node.List server=10.21.21.41:4647 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: Not ready to serve consistent reads" rpc=Node.UpdateAlloc server=10.21.21.41:4647 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: Not ready to serve consistent reads" rpc=Node.UpdateAlloc server=10.21.21.41:4647 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client: error updating allocations: error="rpc error: Not ready to serve consistent reads" 2023-05-11T11:29:52+02:00 [nomad.service 
๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=12.056318ms 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: Not ready to serve consistent reads" rpc=Node.GetClientAllocs server=10.21.21.41:4647 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client: error querying node allocations: error="rpc error: Not ready to serve consistent reads" 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: Not ready to serve consistent reads" rpc=Node.GetClientAllocs server=10.21.21.41:4647 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client: error registering: error="rpc error: Not ready to serve consistent reads" 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to 
server which is not safe to automatically retry: error="rpc error: Not ready to serve consistent reads" rpc=Node.Register server=10.21.21.41:4647 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-01] [โŒ] client.rpc: error performing RPC to server: error="rpc error: Not ready to serve consistent reads" rpc=Node.Register server=10.21.21.41:4647 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=af5c7bc0-1d03-3a56-8237-c19ea015b837 from=Backoff to=WaitingToDequeue 2023-05-11T11:29:52+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=af5c7bc0-1d03-3a56-8237-c19ea015b837 from=WaitingToDequeue to=Backoff 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=1beee303-f51c-6fd7-1fb4-4b1a085ed8fd from=Backoff to=WaitingToDequeue 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=1beee303-f51c-6fd7-1fb4-4b1a085ed8fd from=WaitingToDequeue to=Backoff 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=90866fea-4262-56df-02b1-04bf068f154a from=Backoff to=WaitingToDequeue 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=90866fea-4262-56df-02b1-04bf068f154a 
from=WaitingToDequeue to=Backoff 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=Backoff to=WaitingToDequeue 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=WaitingToDequeue to=Backoff 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:51+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET 
url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=7dbf7ea4-f6a4-4a66-226d-9c61ff5dbc75 job_id=security namespace=default algorithm=spread 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=3d9ff0ae-1e91-fb96-cb9b-e48a5186f604 job_id=observability namespace=default algorithm=spread 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=778906ca-520c-70f3-cdcb-401f739f2bb8 job_id=minio namespace=default algorithm=spread 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.system_sched.binpack: NewBinPackIterator created: eval_id=0922c9bd-08a7-48bb-7087-6a3cd38be5c0 job_id=ingress namespace=default algorithm=spread 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=28040bc5-7be3-1e8e-8941-4a81120cefc5 job_id=whoami namespace=default algorithm=spread 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=f3a733ac-3b52-7b28-970d-25a6b1e38e24 job_id=security namespace=default algorithm=spread 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=70bb6d12-6854-c488-dc19-ee3e6a129e5d job_id=observability namespace=default algorithm=spread 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: 
eval_id=7e534dfc-1bc3-dee7-dcba-605180c775f5 job_id=minio namespace=default algorithm=spread 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.system_sched.binpack: NewBinPackIterator created: eval_id=f094e9d1-538e-839a-7711-f18da786e971 job_id=ingress namespace=default algorithm=spread 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:50+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:49+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] 
http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:49+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:49+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:49+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:49+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:49+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:49+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:49+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:49+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:48+02:00 
[nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=e116abee-6ebc-8d8d-6635-8bf03406bc86 job_id=security namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=6a4aae57-44a1-9c2a-9297-0ca4899dd84c job_id=whoami namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.system_sched.binpack: NewBinPackIterator created: eval_id=80ca00c3-41d7-716a-ce39-76eb33d73c65 job_id=ingress namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=d356ab59-780d-bd9c-650c-5806dd714a2e job_id=observability namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=17e18550-7924-92f9-bbee-e1c06e9b9a64 job_id=minio namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=f6e74385-cfc4-dcc1-aa84-b3292b2cd451 job_id=whoami namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=b9045623-2cdb-94e9-91cd-47f408ce1409 job_id=security namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=c51fc4b3-4d2d-3d59-478a-bbb335b6a4cc job_id=observability namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.system_sched.binpack: NewBinPackIterator created: eval_id=1987ac26-5765-9d82-2fa8-e686b3ad782e job_id=ingress namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 
[nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=88a50bc5-00c7-ae11-91b5-3b52b018038b job_id=minio namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=1c65ba0d-e798-9bf3-c919-806e2251290b job_id=whoami namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=f40a2ff5-5213-f1b8-d04a-21a6a426a646 job_id=security namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=06248c5a-6b86-c4db-6412-db9dcadaf6c8 job_id=observability namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=42212452-209b-4471-897a-258269b46630 job_id=minio namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.system_sched.binpack: NewBinPackIterator created: eval_id=526dcad9-48fa-98fa-1f3c-33dc542833d0 job_id=ingress namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [consul.service ๐Ÿ’ป master-01] [โœ…] agent: Synced check: check=_nomad-check-c9bab43dc01aaa4f2fb76ff56f6bc875a603b652 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: 
eval_id=e0f0a6bf-0074-31cd-55c9-a7e37c294d58 job_id=security namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=b3976ed0-3e9b-5472-3133-2f0b47ec5fa5 job_id=observability namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.service_sched.binpack: NewBinPackIterator created: eval_id=b0607fc1-9608-4857-e97c-c77083cb3ec6 job_id=minio namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [๐Ÿ‘€] nomad.fsm.system_sched.binpack: NewBinPackIterator created: eval_id=6108897b-5495-d9e8-7819-df15a625e4be job_id=ingress namespace=default algorithm=spread 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] nomad: cluster leadership acquired 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] nomad.raft: election won: term=10 tally=1 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] nomad.raft: entering leader state: leader="Node at 10.21.21.41:4647 [Leader]" 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [โœ…] nomad.raft: entering candidate state: node="Node at 10.21.21.41:4647 [Candidate]" term=10 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป master-01] [โš ] nomad.raft: heartbeat timeout reached, starting election: last-leader-addr= last-leader-id= 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET 
url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:48+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: request complete: method=GET path=/v1/agent/self duration=1.064807ms 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=a1aad1ed-27cd-b3ad-a9a0-075a42fac82d task=connect-proxy-keycloak-postgres path=/opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/connect-proxy-keycloak-postgres/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/a1aad1ed-27cd-b3ad-a9a0-075a42fac82d/connect-proxy-keycloak-postgres/secrets/api.sock: bind: invalid argument" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=54969951-d541-ae97-922a-7db38096bae5 task=nats-prometheus-exporter path=/opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/nats-prometheus-exporter/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/54969951-d541-ae97-922a-7db38096bae5/nats-prometheus-exporter/secrets/api.sock: bind: invalid argument" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET 
url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-01] [โš ] client.alloc_runner.task_runner.task_hook.api: error creating task api socket: alloc_id=434f71a9-4b50-8512-effc-5858456f87be task=connect-proxy-grafana path=/opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/connect-proxy-grafana/secrets/api.sock error="listen unix /opt/services/core/nomad/data/alloc/434f71a9-4b50-8512-effc-5858456f87be/connect-proxy-grafana/secrets/api.sock: bind: invalid argument" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?index=1&namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) all templates rendered 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) watching 0 dependencies 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: Failed to authenticated Task API request: method=GET url="/v1/var/nomad/jobs/security?namespace=default&stale=&wait=60000ms" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/75709f55-c2fe-4966-1f85-9084745e16dc/traefik/local/traefik.toml" 
2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) rendering "(dynamic)" => "/opt/services/core/nomad/data/alloc/75709f55-c2fe-4966-1f85-9084745e16dc/traefik/local/certconfig.toml" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) checking template d24b27ea3f27f2740c3e274ec4f939a9 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) was not watching 1 dependencies 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) missing data for 1 dependencies 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) add used dependency nomad.var.block(nomad/jobs/security@default.global) to missing since isLeader but do not have a watcher 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) diffing and updating dependencies 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) missing dependency: nomad.var.block(nomad/jobs/security@default.global) 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) watching 1 dependencies 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (watcher) adding nomad.var.block(nomad/jobs/security@default.global) 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) checking template 550a891296c5b27beb51ed9e1b2f00e5 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) running initial templates 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) final config: 
{"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":5,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":" {{- with nomadVar \"nomad/jobs/security\" -}}\n CLIENT_SECRET = {{.keycloak_ingress_secret}}\n {{- end 
-}}\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/e5fea251-9d66-4cf4-2090-83ae30046fb1/forwardauth/secrets/env.vars","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"{{","RightDelim":"}}","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/e5fea251-9d66-4cf4-2090-83ae30046fb1/forwardauth"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":5,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":5,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQuery
WaitTime":60000000000} 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) checking template 2d035bb7463e642bb2acc3bd9f422ce6 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) running initial templates 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) initiating run 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] agent: (runner) final config: {"Consul":{"Address":"127.0.0.1:8501","Namespace":"","Auth":{"Enabled":false,"Username":""},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"/usr/local/share/ca-certificates/cloudlocal/cluster-ca-bundle.pem","CaPath":"","Cert":"/etc/opt/certs/consul/consul.pem","Enabled":true,"Key":"/etc/opt/certs/consul/consul-key.pem","ServerName":"","Verify":true},"Token":"","TokenFile":"","Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":5,"TLSHandshakeTimeout":10000000000}},"Dedup":{"Enabled":false,"MaxStale":2000000000,"Prefix":"consul-template/dedup/","TTL":15000000000,"BlockQueryWaitTime":60000000000},"DefaultDelims":{"Left":null,"Right":null},"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":0},"KillSignal":2,"LogLevel":"WARN","FileLog":{"LogFilePath":"","LogRotateBytes":0,"LogRotateDuration":86400000000000,"LogRotateMaxFiles":0},"MaxStale":2000000000,"PidFile":"","ReloadSignal":1,"Syslog":{"Enabled":false,"Facility":"LOCAL0","Name":"consul-template"},"Templates":[{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":"[http.serversTransports]\n[http.serversTransports.default]\n insecureSkipVerify = false\n rootCAs = [\"/etc/opt/certs/ca/ca.crt\",\"/etc/opt/certs/ca/cluster-ca.crt\"]\n 
[[http.serversTransports.default.certificates]]\n certFile = \"/etc/opt/certs/ingress/nomad-ingress.pem\"\n keyFile = \"/etc/opt/certs/ingress/nomad-ingress-key.pem\"\n\n[tls.stores]\n [tls.stores.default]\n [tls.stores.default.defaultCertificate]\n certFile = \"/etc/opt/certs/ingress/nomad-ingress.pem\"\n keyFile = \"/etc/opt/certs/ingress/nomad-ingress-key.pem\"\n\n[[tls.certificates]]\n certFile = \"/etc/opt/certs/ingress/nomad-ingress.pem\"\n keyFile = \"/etc/opt/certs/ingress/nomad-ingress-key.pem\"\n stores = [\"default\"]\n\n\n\n[http.services]\n# Service to nomad\n [http.services.nomad.loadBalancer]\n serversTransport = \"default\"\n [[http.services.nomad.loadBalancer.servers]]\n url = \"https://10.21.21.41:4646\"\n# Service to consul\n [http.services.consul.loadBalancer]\n serversTransport = \"default\"\n [[http.services.consul.loadBalancer.servers]]\n url = \"https://10.21.21.41:8501\"\n\n# Service to vault\n [http.services.vault.loadBalancer]\n serversTransport = \"default\"\n [[http.services.vault.loadBalancer.servers]]\n url = \"https://10.21.21.41:8200\"\n\n# Service to nexus ui\n [http.services.nexus-ui.loadBalancer]\n serversTransport = \"default\"\n [[http.services.nexus-ui.loadBalancer.servers]]\n url = \"http://10.21.21.41:5002\"\n\n# Service to nexus push\n# [http.services.nexus-push.loadBalancer]\n# serversTransport = \"default\"\n# [[http.services.nexus-push.loadBalancer.servers]]\n# url = \"http://10.21.21.41:5001\"\n\n # Service to nexus pull\n [http.services.nexus-pull.loadBalancer]\n serversTransport = \"default\"\n [[http.services.nexus-pull.loadBalancer.servers]]\n url = \"http://10.21.21.41:5000\"\n\n[http.routers]\n# Route to consul ui\n [http.routers.consul]\n entryPoints = [\"https\"]\n rule = \"Host(`consul.cloud.private`) \"\n service = \"consul\"\n # will terminate the TLS request\n # [http.routers.consul.tls]\n [[http.routers.consul.tls.domains]]\n # main = \"cloud.private\"\n sans = [\"consul.cloud.private\"]\n\n# Route to nomad 
ui\n [http.routers.nomad]\n entryPoints = [\"https\"]\n rule = \"Host(`nomad.cloud.private`) \"\n service = \"nomad\"\n [[http.routers.nomad.tls.domains]]\n #main = \"cloud.private\"\n sans = [\"nomad.cloud.private\"]\n\n# Route to vault ui\n [http.routers.vault]\n entryPoints = [\"https\"]\n rule = \"Host(`vault.cloud.private`) \"\n service = \"vault\"\n [[http.routers.vault.tls.domains]]\n sans = [\"vault.cloud.private\"]\n\n# Route to nexus ui\n [http.routers.nexus-ui]\n entryPoints = [\"https\"]\n rule = \"Host(`nexus.cloud.private`) \"\n service = \"nexus-ui\"\n [[http.routers.nexus-ui.tls.domains]]\n sans = [\"nexus.cloud.private\"]\n\n# Route to nexus pull\n [http.routers.nexus-pull]\n entryPoints = [\"https\"]\n # rule = \"Host(`registry.cloud.private`) \u0026\u0026 Method(`GET`,`HEAD`)\"\n rule = \"Host(`registry.cloud.private`)\"\n service = \"nexus-pull\"\n [[http.routers.nexus-pull.tls.domains]]\n sans = [\"registry.cloud.private\"]\n\n# Route to nexus push\n# [http.routers.nexus-push]\n# entryPoints = [\"https\"]\n# rule = \"Host(`registry.cloud.private`) \u0026\u0026 Method(`POST`,`PUT`,`DELETE`,`PATCH`)\"\n# service = \"nexus-push\"\n# [[http.routers.nexus-push.tls.domains]]\n# sans = 
[\"registry.cloud.private\"]\n\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/75709f55-c2fe-4966-1f85-9084745e16dc/traefik/local/certconfig.toml","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"++","RightDelim":"++","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/75709f55-c2fe-4966-1f85-9084745e16dc/traefik"},{"Backup":false,"Command":[],"CommandTimeout":30000000000,"Contents":"[entryPoints]\n [entryPoints.http]\n address = \":80\"\n [entryPoints.http.http.redirections]\n [entryPoints.http.http.redirections.entryPoint]\n to = \"https\"\n scheme = \"https\"\n [entryPoints.https]\n address = \":443\"\n\n [entryPoints.traefik]\n # The default port 8080 is used by cdvisor\n address = \":8081\"\n# TCP / UDP over one port\n# [entryPoints.tcpep]\n# address = \":3179\"\n# [entryPoints.udpep]\n# address = \":3179/udp\"\n# [entryPoints.streaming]\n# address = \":1704/udp\"\n[api]\n dashboard = true\n insecure = false\n debug = false\n[providers]\n [providers.file]\n filename = \"/etc/traefik/certconfig.toml\"\n debugLogGeneratedTemplate = true\n watch = true\n\n# Enable Consul Catalog configuration backend.\n[providers.consulCatalog]\n prefix = \"traefik\"\n exposedByDefault = false\n connectAware = true\n connectByDefault = false\n watch = true\n # applied if no traefik.http.routers.{name-of-your-choice}.rule tag found\n defaultRule = \"Host(`{{ .Name }}.cloud.private`)\"\n\n [providers.consulCatalog.endpoint]\n address = \"127.0.0.1:8501\"\n scheme = \"https\"\n\n\n[providers.consulCatalog.endpoint.tls]\n ca = \"/etc/opt/certs/ca/cluster-ca.crt\"\n cert = 
\"/etc/opt/certs/ingress/nomad-ingress.pem\"\n key = \"/etc/opt/certs/ingress/nomad-ingress-key.pem\"\n\n[tracing]\n# [tracing.zipkin]\n# httpEndpoint = \"http://tempo-zipkin.service.consul:9411/api/v2/spans\"\n# sameSpan = true\n# id128Bit = true\n [tracing.jaeger]\n samplingServerURL = \"http://tempo-jaeger.service.consul:14268/sampling\"\n propagation = \"b3\"\n gen128Bit = true\n [tracing.jaeger.collector]\n endpoint = \"http://tempo-jaeger.service.consul:14268/api/traces?format=jaeger.thrift\"\n\n[metrics]\n [metrics.prometheus]\n buckets = [0.1,0.3,1.2,5.0,7.5,9.5,9.9]\n addEntryPointsLabels = true\n addRoutersLabels = true\n addServicesLabels = true\n[log]\n level = \"DEBUG\"\n# format = \"json\"\n\n[accessLog]\n filePath = \"/logs/access.log\"\n #format = \"json\"\n [accessLog.fields]\n defaultMode = \"keep\"\n\n [accessLog.fields.headers]\n defaultMode = \"keep\"\n\n\n\n","CreateDestDirs":true,"Destination":"/opt/services/core/nomad/data/alloc/75709f55-c2fe-4966-1f85-9084745e16dc/traefik/local/traefik.toml","ErrMissingKey":false,"ErrFatal":true,"Exec":{"Command":[],"Enabled":false,"Env":{"Denylist":[],"Custom":[],"Pristine":false,"Allowlist":[]},"KillSignal":2,"KillTimeout":30000000000,"ReloadSignal":null,"Splay":0,"Timeout":30000000000},"Perms":420,"User":null,"Uid":null,"Group":null,"Gid":null,"Source":"","Wait":{"Enabled":false,"Min":0,"Max":0},"LeftDelim":"++","RightDelim":"++","FunctionDenylist":["plugin","writeToFile"],"SandboxPath":"/opt/services/core/nomad/data/alloc/75709f55-c2fe-4966-1f85-9084745e16dc/traefik"}],"TemplateErrFatal":null,"Vault":{"Address":"","Enabled":false,"Namespace":"","RenewToken":false,"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true},"SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":true,"Key":"","ServerName":"","Verify":true},"Transport":{"CustomDialer":null,"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxI
dleConnsPerHost":5,"TLSHandshakeTimeout":10000000000},"UnwrapToken":false,"DefaultLeaseDuration":300000000000,"LeaseRenewalThreshold":0.9,"K8SAuthRoleName":"","K8SServiceAccountTokenPath":"/run/secrets/kubernetes.io/serviceaccount/token","K8SServiceAccountToken":"","K8SServiceMountPath":"kubernetes"},"Nomad":{"Address":"","Enabled":true,"Namespace":"default","SSL":{"CaCert":"","CaPath":"","Cert":"","Enabled":false,"Key":"","ServerName":"","Verify":true},"AuthUsername":"","AuthPassword":"","Transport":{"CustomDialer":{},"DialKeepAlive":30000000000,"DialTimeout":30000000000,"DisableKeepAlives":false,"IdleConnTimeout":90000000000,"MaxIdleConns":100,"MaxIdleConnsPerHost":5,"TLSHandshakeTimeout":10000000000},"Retry":{"Attempts":12,"Backoff":250000000,"MaxBackoff":60000000000,"Enabled":true}},"Wait":{"Enabled":false,"Min":0,"Max":0},"Once":false,"ParseOnly":false,"BlockQueryWaitTime":60000000000} 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=e5fea251-9d66-4cf4-2090-83ae30046fb1 task=forwardauth 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=1845d837-285b-1d3c-86b9-16c47274106e task=logunifier 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=2 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.driver_mgr.docker: image reference count incremented: driver=docker image_name=registry.cloud.private/google_containers/pause-amd64:3.2 image_id=sha256:80d28bedfe5dec59da9ebf8e6260224ac9008ab5c11dbbe16ee3ba3e4439ac2c references=1 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] 
client.alloc_runner.task_runner: lifecycle start condition has been met, proceeding: alloc_id=75709f55-c2fe-4966-1f85-9084745e16dc task=traefik 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] consul.sync: sync complete: registered_services=1 deregistered_services=0 registered_checks=0 deregistered_checks=0 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: UI is enabled 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] http: UI is enabled 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.driver_mgr: initial driver fingerprint: driver=docker health=healthy description=Healthy 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.driver_mgr: detected drivers: drivers="map[healthy:[exec docker] undetected:[raw_exec qemu java]]" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.plugin: finished plugin manager initial fingerprint: plugin-type=driver 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] Region: global (DC: nomadder1) 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] Client: true 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] Version: 1.5.5 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] Advertise Addrs: HTTP: 10.21.21.42:4646 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] Bind Addrs: HTTP: [0.0.0.0:4646] 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ==> Nomad agent started! 
Log data will stream in below: 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] ==> Nomad agent configuration: 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] Log Level: TRACE 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-01] [โœ…] Server: false 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.server_mgr: new server list: new_servers=[10.21.21.41:4647] old_servers=[] 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.consul: bootstrap contacting Consul DCs: consul_dcs=["nomadder1"] 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.driver_mgr: initial driver fingerprint: driver=java health=undetected description="" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.plugin: waiting on plugin manager initial fingerprint: plugin-type=device 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.plugin: finished plugin manager initial fingerprint: plugin-type=device 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.plugin: waiting on plugin manager initial fingerprint: plugin-type=driver 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.driver_mgr: initial driver fingerprint: driver=raw_exec health=undetected description=disabled 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.driver_mgr: initial driver fingerprint: driver=qemu health=undetected description="" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.driver_mgr: initial driver fingerprint: driver=exec health=healthy description=Healthy 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.device_mgr: exiting since there are no device plugins 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.fingerprint_mgr: detected fingerprints: node_attrs=["arch", "bridge", "cgroup", "consul", "cpu", "host", "network", "nomad", "plugins_cni", "signal", "storage"] 
2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.fingerprint_mgr.env_digitalocean: failed to request metadata: attribute=region error="Get \"http://169.254.169.254/metadata/v1/region\": dial tcp 169.254.169.254:80: connect: connection refused" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.fingerprint_mgr.env_azure: could not read value for attribute: attribute=compute/azEnvironment error="Get \"http://169.254.169.254/metadata/instance/compute/azEnvironment?api-version=2019-06-04&format=text\": dial tcp 169.254.169.254:80: connect: connection refused" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.fingerprint_mgr.env_gce: error querying GCE Metadata URL, skipping 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.fingerprint_mgr.env_gce: could not read value for attribute: attribute=machine-type error="Get \"http://169.254.169.254/computeMetadata/v1/instance/machine-type\": dial tcp 169.254.169.254:80: connect: connection refused" 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.fingerprint_mgr: fingerprinting periodically: fingerprinter=vault initial_period=15s 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.fingerprint_mgr.network: unable to parse link speed: path=/sys/class/net/docker0/speed device=docker0 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.fingerprint_mgr.network: link speed could not be detected, falling back to default speed: interface=docker0 mbits=1000 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.fingerprint_mgr.network: link speed could not be detected, falling back to default speed: interface=lo mbits=1000 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.fingerprint_mgr.network: unable to read link speed: path=/sys/class/net/lo/speed device=lo 2023-05-11T11:29:47+02:00 [nomad.service ๐Ÿ’ป worker-02] [๐Ÿž] client.fingerprint_mgr.network: 
detected interface IP: interface=eth0 IP=192.168.68.148
2023-05-11T11:29:47+02:00 [nomad.service 💻 worker-02] [🐞] client.fingerprint_mgr.network: link speed detected: interface=eth0 mbits=1000
2023-05-11T11:29:47+02:00 [nomad.service 💻 worker-02] [🐞] client.fingerprint_mgr.cpu: set of reservable cores available for tasks: cores=[0, 1, 2, 3]
2023-05-11T11:29:47+02:00 [nomad.service 💻 worker-02] [🐞] client.fingerprint_mgr.cpu: detected CPU model: name="11th Gen Intel(R) Core(TM) i5-11500H @ 2.90GHz"
2023-05-11T11:29:47+02:00 [nomad.service 💻 worker-02] [🐞] client.fingerprint_mgr: fingerprinting periodically: fingerprinter=consul initial_period=15s
2023-05-11T11:29:47+02:00 [nomad.service 💻 worker-02] [🐞] client.fingerprint_mgr.cpu: client configuration reserves these cores for node: cores=[]
2023-05-11T11:29:47+02:00 [nomad.service 💻 worker-02] [🐞] client.fingerprint_mgr.cpu: detected CPU core count: EXTRA_VALUE_AT_END=4
2023-05-11T11:29:47+02:00 [nomad.service 💻 worker-02] [🐞] client.fingerprint_mgr.cpu: detected CPU frequency: mhz=2918
2023-05-11T11:29:46+02:00 [nomad.service 💻 worker-02] [🐞] client.fingerprint_mgr: CNI config dir is not set or does not exist, skipping: cni_config_dir=/opt/cni/config
2023-05-11T11:29:46+02:00 [nomad.service 💻 worker-02] [🐞] client.fingerprint_mgr: fingerprinting periodically: fingerprinter=cgroup initial_period=15s
2023-05-11T11:29:46+02:00 [nomad.service 💻 worker-02] [🐞] client.fingerprint_mgr: built-in fingerprints: fingerprinters=["arch", "bridge", "cgroup", "cni", "consul", "cpu", "host", "landlock", "memory", "network", "nomad", "plugins_cni", "signal", "storage", "vault", "env_gce", "env_azure", "env_digitalocean", "env_aws"]
2023-05-11T11:29:46+02:00 [nomad.service 💻 worker-02] [🐞] client.cpuset.v2: initializing with: cores=0-3
2023-05-11T11:29:46+02:00 [nomad.service 💻 worker-02] [🐞] agent.plugin_loader.docker: using client connection initialized from environment: plugin_dir=/opt/services/core/nomad/data/plugins
2023-05-11T11:29:46+02:00 [nomad.service 💻 worker-02] [🐞] agent.plugin_loader.docker: using client connection initialized from environment: plugin_dir=/opt/services/core/nomad/data/plugins
2023-05-11T11:29:46+02:00 [consul.service 💻 master-01] [✅] agent: Synced check: check=_nomad-check-b7d4b44b6996d12c68ac89cc6783c9bd6e55ce3e
2023-05-11T11:29:46+02:00 [consul.service 💻 master-01] [✅] agent: Synced check: check=_nomad-check-c9bab43dc01aaa4f2fb76ff56f6bc875a603b652
2023-05-11T11:29:46+02:00 [consul.service 💻 master-01] [✅] agent: Synced service: service=_nomad-server-7kmxk6dg7fdyvni3umhkdmkyc6fdgdpy
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [✅] agent.joiner: retry join completed: initial_servers=1 agent_mode=server
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] consul.sync: must register service: id=_nomad-server-7kmxk6dg7fdyvni3umhkdmkyc6fdgdpy exists=false reason=operations
2023-05-11T11:29:46+02:00 [consul.service 💻 master-01] [✅] agent: Synced service: service=_nomad-server-hxceepjymnvvp3n5q2reh2z7haxck53w
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] consul.sync: must register service: id=_nomad-server-hxceepjymnvvp3n5q2reh2z7haxck53w exists=false reason=operations
2023-05-11T11:29:46+02:00 [consul.service 💻 master-01] [✅] agent: Synced service: service=_nomad-server-ldxixdelv4s3r3bxigi63pjg6evjh5n5
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [✅] agent.joiner: starting retry join: servers=10.21.21.41
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] consul.sync: must register service: id=_nomad-server-ldxixdelv4s3r3bxigi63pjg6evjh5n5 exists=false reason=operations
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=af5c7bc0-1d03-3a56-8237-c19ea015b837 from=Running to=WaitingToDequeue
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] worker: changed worker status: worker_id=af5c7bc0-1d03-3a56-8237-c19ea015b837 from=Starting to=Started
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=UnknownStatus to=Running
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=Running to=WaitingToDequeue
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=90866fea-4262-56df-02b1-04bf068f154a from=UnknownStatus to=Running
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=1beee303-f51c-6fd7-1fb4-4b1a085ed8fd from=UnknownStatus to=Running
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] consul.sync: commit sync operations: ops="<3, 2, 0, 0>"
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [✅] nomad: starting scheduling worker(s): num_workers=4 schedulers=["service", "batch", "system", "sysbatch", "_core"]
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=1beee303-f51c-6fd7-1fb4-4b1a085ed8fd from=Running to=WaitingToDequeue
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] worker: changed worker status: worker_id=18f9af9c-6c70-575a-8e70-d7bf8c5fbb9a from=Starting to=Started
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] consul.sync: execute sync: reason=operations
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=90866fea-4262-56df-02b1-04bf068f154a from=Running to=WaitingToDequeue
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [✅] nomad: started scheduling worker(s): num_workers=4 schedulers=["service", "batch", "system", "sysbatch", "_core"]
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [✅] nomad: adding server: server="master-01.global (Addr: 10.21.21.41:4647) (DC: nomadder1)"
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] worker: changed workload status: worker_id=af5c7bc0-1d03-3a56-8237-c19ea015b837 from=UnknownStatus to=Running
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [⚠] nomad: serf: Failed to re-join any previously known node
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [⚠] agent: not registering Nomad HTTPS Health Check because verify_https_client enabled
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [✅] nomad: serf: EventMemberJoin: master-01.global 10.21.21.41
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] worker: changed worker status: worker_id=1beee303-f51c-6fd7-1fb4-4b1a085ed8fd from=Starting to=Started
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] worker: changed worker status: worker_id=90866fea-4262-56df-02b1-04bf068f154a from=Starting to=Started
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [✅] nomad.raft: initial configuration: index=1 servers="[{Suffrage:Voter ID:a1c8b791-e2b8-7606-19bd-988341a75d1b Address:10.21.21.41:4647}]"
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [✅] nomad.raft: entering follower state: follower="Node at 10.21.21.41:4647 [Follower]" leader-address= leader-id=
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [✅] nomad.raft: snapshot restore progress: id=6-9047-1683653626275 last-index=9047 last-term=6 size-in-bytes=7208431 read-bytes=7208431 percent-complete="100.00%"
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [✅] nomad.raft: restored from snapshot: id=6-9047-1683653626275 last-index=9047 last-term=6 size-in-bytes=7208431
2023-05-11T11:29:46+02:00 [nomad.service 💻 worker-01] [✅] ==> Starting Nomad agent...
2023-05-11T11:29:46+02:00 [nomad.service 💻 worker-01] [✅] ==> Loaded configuration from /etc/nomad.d/client.hcl, /etc/nomad.d/nomad_volume_stack_core_keycloak_postgres_volume.hcl, /etc/nomad.d/nomad_volume_stack_core_minio_volume.hcl, /etc/nomad.d/nomad_volume_stack_observability_grafana_agent_volume.hcl, /etc/nomad.d/nomad_volume_stack_observability_grafana_volume.hcl, /etc/nomad.d/nomad_volume_stack_observability_loki_volume.hcl, /etc/nomad.d/nomad_volume_stack_observability_mimir_volume.hcl, /etc/nomad.d/nomad_volume_stack_observability_nats_volume.hcl, /etc/nomad.d/nomad_volume_stack_observability_tempo_volume.hcl
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] consul.sync: execute sync: reason=periodic
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] consul.sync: Consul supports TLSSkipVerify
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [👀] consul.sync: able to contact Consul
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [✅] nomad.raft: starting restore from snapshot: id=6-9047-1683653626275 last-index=9047 last-term=6 size-in-bytes=7208431
2023-05-11T11:29:46+02:00 [nomad.service 💻 master-01] [✅] nomad: setting up raft bolt store: no_freelist_sync=false
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-02] [🐞] http: shutting down http server
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-02] [🐞] client.server_mgr: shutting down
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-02] [🐞] client.driver_mgr.docker: error collecting stats from container: container_id=6b78f11d560f802a5c7c7c7d52435abb2a92eca1bd9f04373368b4a070fa1e06 driver=docker error="context canceled"
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-02] [🐞] client.driver_mgr.docker: error collecting stats from container: container_id=f1b7effc0a5d4ef3609b27686c9ea26ad4b27cf9f3493b54445241edabdd9a0e driver=docker error="context canceled"
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-02] [🐞] client.driver_mgr.docker: error collecting stats from container: container_id=6458fd63083226fabaafd02ba4144e7f37d55de1473fde3cb647004cb10b04a8 driver=docker error="context canceled"
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/nodes?prefix=e2eb7460-2bca-ac62-5c53-999281062667 duration=1.880837ms
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/agent/self duration="461.87µs"
2023-05-11T11:29:45+02:00 [consul.service 💻 worker-01] [❌] agent.envoy: Error receiving new DeltaDiscoveryRequest; closing request channel: error="rpc error: code = Canceled desc = context canceled"
2023-05-11T11:29:45+02:00 [consul.service 💻 worker-01] [❌] agent.envoy: Error receiving new DeltaDiscoveryRequest; closing request channel: error="rpc error: code = Canceled desc = context canceled"
2023-05-11T11:29:45+02:00 [consul.service 💻 worker-01] [❌] agent.envoy: Error receiving new DeltaDiscoveryRequest; closing request channel: error="rpc error: code = Canceled desc = context canceled"
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-01] [❌] http: request failed: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" error="rpc error: EOF" code=500
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-01] [❌] http: request failed: method=GET path="/v1/var/nomad/jobs/observability?index=3271&namespace=default&stale=&wait=60000ms" error="rpc error: EOF" code=500
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-01] [❌] http: request failed: method=GET path="/v1/var/nomad/jobs/minio?index=10463&namespace=default&stale=&wait=60000ms" error="rpc error: EOF" code=500
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-01] [❌] http: request failed: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" error="rpc error: EOF" code=500
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=28.923256595s
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-01] [✅] ==> Caught signal: interrupt
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-01] [✅] ==> Gracefully shutting down agent...
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-01] [✅] Error toggling drain mode: Unexpected response code: 500 (rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused)
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server which is not safe to automatically retry: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=Node.List server=10.21.21.41:4647
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" rpc=Node.List server=10.21.21.41:4647
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-01] [❌] http: request failed: method=GET path=/v1/nodes?prefix=36d1fc65-c097-97bc-18ac-079c1262ccfd error="rpc error: failed to get conn: dial tcp 10.21.21.41:4647: connect: connection refused" code=500
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: EOF" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: EOF" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: EOF" rpc=Node.GetClientAllocs server=10.21.21.41:4647
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: EOF" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:29:45+02:00 [nomad.service 💻 worker-01] [❌] client.rpc: error performing RPC to server: error="rpc error: EOF" rpc=Variables.Read server=10.21.21.41:4647
2023-05-11T11:29:45+02:00 [nomad.service 💻 master-01] [✅] agent: shutdown complete
2023-05-11T11:29:45+02:00 [consul.service 💻 master-01] [✅] agent: Deregistered service: service=_nomad-server-ldxixdelv4s3r3bxigi63pjg6evjh5n5
2023-05-11T11:29:45+02:00 [consul.service 💻 master-01] [✅] agent: Deregistered service: service=_nomad-server-7kmxk6dg7fdyvni3umhkdmkyc6fdgdpy
2023-05-11T11:29:45+02:00 [consul.service 💻 master-01] [✅] agent: Deregistered service: service=_nomad-server-hxceepjymnvvp3n5q2reh2z7haxck53w
2023-05-11T11:29:45+02:00 [nomad.service 💻 master-01] [⚠] nomad: serf: Shutdown without a Leave
2023-05-11T11:29:45+02:00 [nomad.service 💻 master-01] [✅] nomad: cluster leadership lost
2023-05-11T11:29:45+02:00 [nomad.service 💻 master-01] [✅] nomad: shutting down server
2023-05-11T11:29:45+02:00 [nomad.service 💻 master-01] [✅] agent: requesting shutdown
2023-05-11T11:29:42+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.382485ms
2023-05-11T11:29:32+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.61148ms
2023-05-11T11:29:22+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.884091ms
2023-05-11T11:29:20+02:00 [consul.service 💻 master-01] [✅] agent.server: member joined, marking health alive: member=worker-03 partition=default
2023-05-11T11:29:20+02:00 [consul.service 💻 master-01] [✅] agent.server.serf.lan: serf: EventMemberJoin: worker-03 10.21.21.44
2023-05-11T11:29:16+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=1m3.481706061s
2023-05-11T11:29:12+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration="915.4µs"
2023-05-11T11:29:02+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.416873ms
2023-05-11T11:28:52+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration="938.252µs"
2023-05-11T11:28:42+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=3.887481ms
2023-05-11T11:28:32+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.344898ms
2023-05-11T11:28:22+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration="966.959µs"
2023-05-11T11:28:12+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path="/v1/var/nomad/jobs/security?index=10462&namespace=default&stale=&wait=60000ms" duration=1m3.230393111s
2023-05-11T11:28:12+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=3.86916ms
2023-05-11T11:28:02+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration="928.443µs"
2023-05-11T11:27:52+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration=1.123312ms
2023-05-11T11:27:42+02:00 [nomad.service 💻 worker-02] [🐞] http: request complete: method=GET path=/v1/metrics?format=prometheus duration="902.656µs"