
Support Consul Connect with Consul TLS enabled #6711

Closed
crizstian opened this issue Nov 15, 2019 · 2 comments
Labels
stage/duplicate theme/consul/connect Consul Connect integration

Comments


If filing a bug please include the following:

Nomad version

Output from nomad version
Nomad v0.10.1 (0d4e5d949fe073c47a947ea36bfef31a3c49224f)

Operating system and Environment details

Linux

Issue

    2019-11-15T19:06:09.319Z [ERROR] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: error creating bootstrap configuration for Connect proxy sidecar: alloc_id=6bfe4026-66a9-efdc-aa7e-8fbdbe0286ee task=connect-proxy-booking-api error="exit status 1" stderr="==> Failed looking up sidecar proxy info for _nomad-task-6bfe4026-66a9-efdc-aa7e-8fbdbe0286ee-group-booking-api-booking-api-3002: Unexpected response code: 400 (Client sent an HTTP request to an HTTPS server.
)
"
    2019-11-15T19:06:09.319Z [ERROR] client.alloc_runner.task_runner: prestart failed: alloc_id=6bfe4026-66a9-efdc-aa7e-8fbdbe0286ee task=connect-proxy-booking-api error="prestart hook "envoy_bootstrap" failed: error creating bootstrap configuration for Connect proxy sidecar: exit status 1"
    2019-11-15T19:06:09.319Z [INFO ] client.alloc_runner.task_runner: restarting task: alloc_id=6bfe4026-66a9-efdc-aa7e-8fbdbe0286ee task=connect-proxy-booking-api reason="Restart within policy" delay=18.131309709s

Reproduction steps

Run a Consul cluster with TLS enabled
Run a Nomad cluster with TLS enabled
Deploy a job that uses Nomad's native Consul Connect integration (a config sketch follows below)
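
For reference, a minimal sketch of the kind of TLS-enabled Consul agent configuration involved here (not taken from the issue; the certificate paths and port numbers are hypothetical):

# Consul agent configuration (HCL) with TLS enabled.
# Paths and ports are illustrative only.
verify_incoming        = true
verify_outgoing        = true
verify_server_hostname = true

ca_file   = "/etc/consul.d/tls/consul-ca.pem"
cert_file = "/etc/consul.d/tls/agent.pem"
key_file  = "/etc/consul.d/tls/agent-key.pem"

ports {
  http  = -1    # plaintext API disabled; a client speaking plain HTTP to the
                # HTTPS port gets the 400 "HTTP request to an HTTPS server" seen below
  https = 8501
  grpc  = 8502  # gRPC is required for Envoy xDS when using Connect
}

connect {
  enabled = true
}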

Job file (if appropriate)

e.g.

job "cinemas" {

  datacenters = ["dc1-ncv"]
  region      = "dc1-region"
  type        = "service"

  group "payment-api" {
    count = 1

    task "payment-api" {
      driver = "docker"
      config {
        image = "crizstian/payment-service-go:v0.3-tls"
      }

      env {
        DB_SERVERS   = "mongodb1.service.consul:27017,mongodb2.service.consul:27018,mongodb3.service.consul:27019"
        SERVICE_PORT = "3000"
        CONSUL_IP    = "172.20.20.11"
      }

      resources {
        cpu    = 50
        memory = 50
      }
    }

    network {
      mode = "bridge"
    }

    service {
      name = "payment-api"
      port = "3000"

      connect {
        sidecar_service {}
      }
    }
  }
}

Nomad logs (if appropriate)

If possible please post relevant logs in the issue

    2019-11-15T19:05:31.120Z [ERROR] client.alloc_runner.task_runner: prestart failed: alloc_id=549e5fbf-1ec9-2686-83bc-01ca43c4d3cc task=connect-proxy-booking-api error="prestart hook "envoy_bootstrap" failed: error creating bootstrap configuration for Connect proxy sidecar: exit status 1"
    2019-11-15T19:05:31.120Z [INFO ] client.alloc_runner.task_runner: not restarting task: alloc_id=549e5fbf-1ec9-2686-83bc-01ca43c4d3cc task=connect-proxy-booking-api reason="Exceeded allowed attempts 2 in interval 30m0s and mode is "fail""
    2019-11-15T19:05:31.248Z [ERROR] client.alloc_runner.task_runner.task_hook.stats_hook: failed to start stats collection for task with unrecoverable error: alloc_id=549e5fbf-1ec9-2686-83bc-01ca43c4d3cc task=booking-api error="container stopped"
    2019-11-15T19:05:31.249Z [ERROR] client.driver_mgr.docker.docker_logger.nomad: reading plugin stderr: driver=docker error="read |0: file already closed"
    2019-11-15T19:05:33.031Z [WARN ] client.alloc_runner.task_runner.task_hook.logmon.nomad: timed out waiting for read-side of process output pipe to close: alloc_id=77a389a8-a964-2bd2-71d8-b8ee0bd00626 task=connect-proxy-notification-api @module=logmon timestamp=2019-11-15T19:05:33.031Z
    2019-11-15T19:05:33.031Z [WARN ] client.alloc_runner.task_runner.task_hook.logmon.nomad: timed out waiting for read-side of process output pipe to close: alloc_id=77a389a8-a964-2bd2-71d8-b8ee0bd00626 task=connect-proxy-notification-api @module=logmon timestamp=2019-11-15T19:05:33.031Z
    2019-11-15T19:05:33.032Z [ERROR] client.alloc_runner.task_runner.task_hook.logmon.nomad: reading plugin stderr: alloc_id=77a389a8-a964-2bd2-71d8-b8ee0bd00626 task=connect-proxy-notification-api error="read |0: file already closed"
    2019-11-15T19:05:33.033Z [INFO ] client.gc: marking allocation for GC: alloc_id=77a389a8-a964-2bd2-71d8-b8ee0bd00626
    2019-11-15T19:05:34.048Z [WARN ] client.alloc_runner.task_runner.task_hook.logmon.nomad: timed out waiting for read-side of process output pipe to close: alloc_id=9baf76a9-b5b0-01fa-e9ff-43a03fe8bc6b task=connect-proxy-payment-api @module=logmon timestamp=2019-11-15T19:05:34.048Z
    2019-11-15T19:05:34.048Z [WARN ] client.alloc_runner.task_runner.task_hook.logmon.nomad: timed out waiting for read-side of process output pipe to close: alloc_id=9baf76a9-b5b0-01fa-e9ff-43a03fe8bc6b task=connect-proxy-payment-api @module=logmon timestamp=2019-11-15T19:05:34.048Z
    2019-11-15T19:05:34.050Z [ERROR] client.alloc_runner.task_runner.task_hook.logmon.nomad: reading plugin stderr: alloc_id=9baf76a9-b5b0-01fa-e9ff-43a03fe8bc6b task=connect-proxy-payment-api error="read |0: file already closed"
    2019-11-15T19:05:34.051Z [INFO ] client.gc: marking allocation for GC: alloc_id=9baf76a9-b5b0-01fa-e9ff-43a03fe8bc6b
    2019-11-15T19:05:34.269Z [WARN ] consul.sync: failed to update services in Consul: error="Unexpected response code: 500 (Unknown service "_nomad-task-9baf76a9-b5b0-01fa-e9ff-43a03fe8bc6b-group-payment-api-payment-api-3000-sidecar-proxy")"
    2019-11-15T19:05:35.126Z [WARN ] client.alloc_runner.task_runner.task_hook.logmon.nomad: timed out waiting for read-side of process output pipe to close: alloc_id=549e5fbf-1ec9-2686-83bc-01ca43c4d3cc task=connect-proxy-booking-api @module=logmon timestamp=2019-11-15T19:05:35.126Z
    2019-11-15T19:05:35.127Z [WARN ] client.alloc_runner.task_runner.task_hook.logmon.nomad: timed out waiting for read-side of process output pipe to close: alloc_id=549e5fbf-1ec9-2686-83bc-01ca43c4d3cc task=connect-proxy-booking-api @module=logmon timestamp=2019-11-15T19:05:35.126Z
    2019-11-15T19:05:35.129Z [ERROR] client.alloc_runner.task_runner.task_hook.logmon.nomad: reading plugin stderr: alloc_id=549e5fbf-1ec9-2686-83bc-01ca43c4d3cc task=connect-proxy-booking-api error="read |0: file already closed"
    2019-11-15T19:05:35.131Z [INFO ] client.gc: marking allocation for GC: alloc_id=549e5fbf-1ec9-2686-83bc-01ca43c4d3cc
    2019-11-15T19:05:35.270Z [INFO ] consul.sync: successfully updated services in Consul
    2019-11-15T19:06:01.136Z [INFO ] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=d47ee67e-c1e1-703e-7c8f-29eb0f3046a2 task=connect-proxy-payment-api path=/var/nomad/data/alloc/d47ee67e-c1e1-703e-7c8f-29eb0f3046a2/alloc/logs/.connect-proxy-payment-api.stdout.fifo @module=logmon timestamp=2019-11-15T19:06:01.136Z
    2019-11-15T19:06:01.136Z [INFO ] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=d47ee67e-c1e1-703e-7c8f-29eb0f3046a2 task=connect-proxy-payment-api @module=logmon path=/var/nomad/data/alloc/d47ee67e-c1e1-703e-7c8f-29eb0f3046a2/alloc/logs/.connect-proxy-payment-api.stderr.fifo timestamp=2019-11-15T19:06:01.136Z
    2019-11-15T19:06:01.137Z [INFO ] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=d47ee67e-c1e1-703e-7c8f-29eb0f3046a2 task=payment-api @module=logmon path=/var/nomad/data/alloc/d47ee67e-c1e1-703e-7c8f-29eb0f3046a2/alloc/logs/.payment-api.stdout.fifo timestamp=2019-11-15T19:06:01.137Z
    2019-11-15T19:06:01.137Z [INFO ] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=d47ee67e-c1e1-703e-7c8f-29eb0f3046a2 task=payment-api @module=logmon path=/var/nomad/data/alloc/d47ee67e-c1e1-703e-7c8f-29eb0f3046a2/alloc/logs/.payment-api.stderr.fifo timestamp=2019-11-15T19:06:01.137Z
    2019-11-15T19:06:01.172Z [INFO ] client.driver_mgr.docker: created container: driver=docker container_id=87ce3c4d5f62d01b0bcffa643ebda8fcd29cb586a2e3715185b8462c41a1b4fc
    2019-11-15T19:06:01.473Z [INFO ] client.driver_mgr.docker: started container: driver=docker container_id=87ce3c4d5f62d01b0bcffa643ebda8fcd29cb586a2e3715185b8462c41a1b4fc
    2019-11-15T19:06:02.109Z [INFO ] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=39915ca5-e226-edb8-7f97-144ab4b9057b task=notification-api @module=logmon path=/var/nomad/data/alloc/39915ca5-e226-edb8-7f97-144ab4b9057b/alloc/logs/.notification-api.stdout.fifo timestamp=2019-11-15T19:06:02.109Z
    2019-11-15T19:06:02.109Z [INFO ] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=39915ca5-e226-edb8-7f97-144ab4b9057b task=notification-api @module=logmon path=/var/nomad/data/alloc/39915ca5-e226-edb8-7f97-144ab4b9057b/alloc/logs/.notification-api.stderr.fifo timestamp=2019-11-15T19:06:02.109Z
    2019-11-15T19:06:02.113Z [INFO ] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=39915ca5-e226-edb8-7f97-144ab4b9057b task=connect-proxy-notification-api @module=logmon path=/var/nomad/data/alloc/39915ca5-e226-edb8-7f97-144ab4b9057b/alloc/logs/.connect-proxy-notification-api.stdout.fifo timestamp=2019-11-15T19:06:02.113Z
    2019-11-15T19:06:02.113Z [INFO ] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=39915ca5-e226-edb8-7f97-144ab4b9057b task=connect-proxy-notification-api path=/var/nomad/data/alloc/39915ca5-e226-edb8-7f97-144ab4b9057b/alloc/logs/.connect-proxy-notification-api.stderr.fifo @module=logmon timestamp=2019-11-15T19:06:02.113Z
    2019-11-15T19:06:02.152Z [INFO ] client.driver_mgr.docker: created container: driver=docker container_id=bfcd4cd82c5ba9c500a4dfc75f3ab3193c9eb38caa2f7ef58f724b6f65f01b19
    2019-11-15T19:06:02.433Z [INFO ] client.driver_mgr.docker: started container: driver=docker container_id=bfcd4cd82c5ba9c500a4dfc75f3ab3193c9eb38caa2f7ef58f724b6f65f01b19
    2019-11-15T19:06:03.114Z [INFO ] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=6bfe4026-66a9-efdc-aa7e-8fbdbe0286ee task=booking-api @module=logmon path=/var/nomad/data/alloc/6bfe4026-66a9-efdc-aa7e-8fbdbe0286ee/alloc/logs/.booking-api.stdout.fifo timestamp=2019-11-15T19:06:03.114Z
    2019-11-15T19:06:03.114Z [INFO ] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=6bfe4026-66a9-efdc-aa7e-8fbdbe0286ee task=booking-api @module=logmon path=/var/nomad/data/alloc/6bfe4026-66a9-efdc-aa7e-8fbdbe0286ee/alloc/logs/.booking-api.stderr.fifo timestamp=2019-11-15T19:06:03.114Z
    2019-11-15T19:06:03.126Z [INFO ] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=6bfe4026-66a9-efdc-aa7e-8fbdbe0286ee task=connect-proxy-booking-api @module=logmon path=/var/nomad/data/alloc/6bfe4026-66a9-efdc-aa7e-8fbdbe0286ee/alloc/logs/.connect-proxy-booking-api.stdout.fifo timestamp=2019-11-15T19:06:03.126Z
    2019-11-15T19:06:03.126Z [INFO ] client.alloc_runner.task_runner.task_hook.logmon.nomad: opening fifo: alloc_id=6bfe4026-66a9-efdc-aa7e-8fbdbe0286ee task=connect-proxy-booking-api @module=logmon path=/var/nomad/data/alloc/6bfe4026-66a9-efdc-aa7e-8fbdbe0286ee/alloc/logs/.connect-proxy-booking-api.stderr.fifo timestamp=2019-11-15T19:06:03.126Z
    2019-11-15T19:06:03.153Z [INFO ] client.driver_mgr.docker: created container: driver=docker container_id=9dda4394f776ded9647d0ff780db2157fe7e76ed93c8afcf43561ca0a7dea261
    2019-11-15T19:06:03.431Z [INFO ] client.driver_mgr.docker: started container: driver=docker container_id=9dda4394f776ded9647d0ff780db2157fe7e76ed93c8afcf43561ca0a7dea261
    2019-11-15T19:06:07.316Z [ERROR] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: error creating bootstrap configuration for Connect proxy sidecar: alloc_id=d47ee67e-c1e1-703e-7c8f-29eb0f3046a2 task=connect-proxy-payment-api error="exit status 1" stderr="==> Failed looking up sidecar proxy info for _nomad-task-d47ee67e-c1e1-703e-7c8f-29eb0f3046a2-group-payment-api-payment-api-3000: Unexpected response code: 400 (Client sent an HTTP request to an HTTPS server.
)
"
    2019-11-15T19:06:07.316Z [ERROR] client.alloc_runner.task_runner: prestart failed: alloc_id=d47ee67e-c1e1-703e-7c8f-29eb0f3046a2 task=connect-proxy-payment-api error="prestart hook "envoy_bootstrap" failed: error creating bootstrap configuration for Connect proxy sidecar: exit status 1"
    2019-11-15T19:06:07.316Z [INFO ] client.alloc_runner.task_runner: restarting task: alloc_id=d47ee67e-c1e1-703e-7c8f-29eb0f3046a2 task=connect-proxy-payment-api reason="Restart within policy" delay=16.307852288s
    2019-11-15T19:06:08.291Z [ERROR] client.alloc_runner.task_runner.task_hook.envoy_bootstrap: error creating bootstrap configuration for Connect proxy sidecar: alloc_id=39915ca5-e226-edb8-7f97-144ab4b9057b task=connect-proxy-notification-api error="exit status 1" stderr="==> Failed looking up sidecar proxy info for _nomad-task-39915ca5-e226-edb8-7f97-144ab4b9057b-group-notification-api-notification-api-3001: Unexpected response code: 400 (Client sent an HTTP request to an HTTPS server.
)
"
    2019-11-15T19:06:08.292Z [ERROR] client.alloc_runner.task_runner: prestart failed: alloc_id=39915ca5-e226-edb8-7f97-144ab4b9057b task=connect-proxy-notification-api error="prestart hook "envoy_bootstrap" failed: error creating bootstrap configuration for Connect proxy sidecar: exit status 1"
    2019-11-15T19:06:08.292Z [INFO ] client.alloc_runner.task_runner: restarting task: alloc_id=39915ca5-e226-edb8-7f97-144ab4b9057b task=connect-proxy-notification-api reason="Restart within policy" delay=16.307852288s
tgross (Member) commented Nov 15, 2019

Hi @crizstian!

We have a report of this in #6594 and an issue tracking the testing for it in #6502, so I'm going to close this one as a duplicate. Thanks for reporting it!

@tgross tgross added theme/consul/connect Consul Connect integration stage/duplicate labels Nov 15, 2019
@tgross tgross closed this as completed Nov 15, 2019
shoenig added a commit that referenced this issue Apr 2, 2020
Fixes #6594 #6711 #6714 #7567

e2e testing is still TBD in #6502

Before, we only passed the Nomad agent's configured Consul HTTP
address onto the `consul connect envoy ...` bootstrap command.
This meant any Consul setup with TLS enabled would not work with
Nomad's Connect integration.

This change now sets CLI args and environment variables to configure the TLS
options used when communicating with Consul during the Envoy bootstrap, as
described in
https://www.consul.io/docs/commands/connect/envoy.html#usage
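
For context (not part of the commit message), the Consul TLS settings that the fix forwards to the Envoy bootstrap come from the Nomad agent's consul stanza. A minimal sketch, with a hypothetical address and certificate paths:

# Nomad agent configuration (HCL): consul stanza pointing at a TLS-enabled Consul agent.
# The address and file paths below are illustrative only.
consul {
  address    = "127.0.0.1:8501"   # Consul HTTPS port
  ssl        = true
  verify_ssl = true
  ca_file    = "/etc/consul.d/tls/consul-ca.pem"
  cert_file  = "/etc/consul.d/tls/cli.pem"
  key_file   = "/etc/consul.d/tls/cli-key.pem"
}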

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Nov 16, 2022