
created sidecar fails with "gRPC config stream closed" #6714

Closed
pellepelster opened this issue Nov 16, 2019 · 2 comments
Labels
stage/duplicate theme/consul/connect Consul Connect integration

Comments

@pellepelster

Nomad version

0.10.1

Operating system and Environment details

Debian GNU Linux 9

Issue

Following the guide at https://www.hashicorp.com/blog/consul-connect-integration-in-hashicorp-nomad/, the sidecar services created in Consul show failing health checks:

dial tcp 127.0.0.1:20956: connect: connection refused

Reproduction steps

Set up Nomad with Consul integration and run the job file below.

Job file (if appropriate)

job "countdash" {

  datacenters = ["integration-test"]

  group "api" {

    network {
      mode = "bridge"
    }

    service {
      name = "count-api"
      port = "9001"

      connect {
        sidecar_service {
        }
        sidecar_task {
          meta {
            sidecar_log_level = "debug"
          }
        }
      }
    }

    task "web" {
      driver = "docker"
      config {
        image = "hashicorpnomad/counter-api:v1"
      }
    }
  }

  group "dashboard" {
    network {
      mode = "bridge"
      port "http" {
        static = 9002
        to = 9002
      }
    }

    service {
      name = "count-dashboard"
      port = "9002"

      connect {
        sidecar_task {
          meta {
            sidecar_log_level = "debug"
          }
        }
        sidecar_service {
          proxy {
            upstreams {
              destination_name = "count-api"
              local_bind_port = 8080
            }
          }
        }
      }

    }

    task "dashboard" {
      driver = "docker"

      env {
        COUNTING_SERVICE_URL = "http://${NOMAD_UPSTREAM_ADDR_count_api}"
      }
      config {
        image = "hashicorpnomad/counter-dashboard:v1"
      }
    }
  }
}
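For reference, the job above can be submitted and inspected with the standard Nomad CLI (the filename `countdash.nomad` is an assumption; `<alloc-id>` is a placeholder for an allocation ID from the status output):

```shell
# Save the job file as countdash.nomad (name is arbitrary), then submit it
nomad job run countdash.nomad

# List the job's allocations, then fetch the sidecar task's logs
nomad job status countdash
nomad alloc logs <alloc-id> connect-proxy-count-api
```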

Log for the 'connect-proxy-count-api' task

[2019-11-16 11:36:07.624][1][info][main] [source/server/server.cc:238] initializing epoch 0 (hot restart version=11.104)
[2019-11-16 11:36:07.624][1][info][main] [source/server/server.cc:240] statically linked extensions:
[2019-11-16 11:36:07.624][1][info][main] [source/server/server.cc:242]   access_loggers: envoy.file_access_log,envoy.http_grpc_access_log
[2019-11-16 11:36:07.624][1][info][main] [source/server/server.cc:245]   filters.http: envoy.buffer,envoy.cors,envoy.csrf,envoy.ext_authz,envoy.fault,envoy.filters.http.dynamic_forward_proxy,envoy.filters.http.grpc_http1_reverse_bridge,envoy.filters.http.header_to_metadata,envoy.filters.http.jwt_authn,envoy.filters.http.original_src,envoy.filters.http.rbac,envoy.filters.http.tap,envoy.grpc_http1_bridge,envoy.grpc_json_transcoder,envoy.grpc_web,envoy.gzip,envoy.health_check,envoy.http_dynamo_filter,envoy.ip_tagging,envoy.lua,envoy.rate_limit,envoy.router,envoy.squash
[2019-11-16 11:36:07.624][1][info][main] [source/server/server.cc:248]   filters.listener: envoy.listener.original_dst,envoy.listener.original_src,envoy.listener.proxy_protocol,envoy.listener.tls_inspector
[2019-11-16 11:36:07.624][1][info][main] [source/server/server.cc:251]   filters.network: envoy.client_ssl_auth,envoy.echo,envoy.ext_authz,envoy.filters.network.dubbo_proxy,envoy.filters.network.mysql_proxy,envoy.filters.network.rbac,envoy.filters.network.sni_cluster,envoy.filters.network.thrift_proxy,envoy.filters.network.zookeeper_proxy,envoy.http_connection_manager,envoy.mongo_proxy,envoy.ratelimit,envoy.redis_proxy,envoy.tcp_proxy
[2019-11-16 11:36:07.624][1][info][main] [source/server/server.cc:253]   stat_sinks: envoy.dog_statsd,envoy.metrics_service,envoy.stat_sinks.hystrix,envoy.statsd
[2019-11-16 11:36:07.624][1][info][main] [source/server/server.cc:255]   tracers: envoy.dynamic.ot,envoy.lightstep,envoy.tracers.datadog,envoy.tracers.opencensus,envoy.zipkin
[2019-11-16 11:36:07.624][1][info][main] [source/server/server.cc:258]   transport_sockets.downstream: envoy.transport_sockets.alts,envoy.transport_sockets.tap,raw_buffer,tls
[2019-11-16 11:36:07.624][1][info][main] [source/server/server.cc:261]   transport_sockets.upstream: envoy.transport_sockets.alts,envoy.transport_sockets.tap,raw_buffer,tls
[2019-11-16 11:36:07.624][1][info][main] [source/server/server.cc:267] buffer implementation: old (libevent)
[2019-11-16 11:36:07.629][1][warning][misc] [source/common/protobuf/utility.cc:199] Using deprecated option 'envoy.api.v2.Cluster.hosts' from file cds.proto. This configuration will be removed from Envoy soon. Please see https://www.envoyproxy.io/docs/envoy/latest/intro/deprecated for details.
[2019-11-16 11:36:07.636][1][info][main] [source/server/server.cc:322] admin address: 127.0.0.1:19000
[2019-11-16 11:36:07.640][1][info][main] [source/server/server.cc:432] runtime: layers:
  - name: base
    static_layer:
      {}
  - name: admin
    admin_layer:
      {}
[2019-11-16 11:36:07.640][1][warning][runtime] [source/common/runtime/runtime_impl.cc:497] Skipping unsupported runtime layer: name: "base"
static_layer {
}

[2019-11-16 11:36:07.640][1][info][config] [source/server/configuration_impl.cc:61] loading 0 static secret(s)
[2019-11-16 11:36:07.640][1][info][config] [source/server/configuration_impl.cc:67] loading 1 cluster(s)
[2019-11-16 11:36:07.648][1][info][upstream] [source/common/upstream/cluster_manager_impl.cc:144] cm init: initializing cds
[2019-11-16 11:36:07.650][1][info][config] [source/server/configuration_impl.cc:71] loading 0 listener(s)
[2019-11-16 11:36:07.652][1][info][config] [source/server/configuration_impl.cc:96] loading tracing configuration
[2019-11-16 11:36:07.652][1][info][config] [source/server/configuration_impl.cc:116] loading stats sink configuration
[2019-11-16 11:36:07.652][1][info][main] [source/server/server.cc:516] starting main dispatch loop
[2019-11-16 11:36:07.653][1][warning][config] [bazel-out/k8-opt/bin/source/common/config/_virtual_includes/grpc_stream_lib/common/config/grpc_stream.h:87] gRPC config stream closed: 14, upstream connect error or disconnect/reset before headers. reset reason: connection termination
[2019-11-16 11:36:07.653][1][info][upstream] [source/common/upstream/cluster_manager_impl.cc:148] cm init: all clusters initialized
[2019-11-16 11:36:07.653][1][info][main] [source/server/server.cc:500] all clusters initialized. initializing init manager
[2019-11-16 11:36:07.872][1][warning][config] [bazel-out/k8-opt/bin/source/common/config/_virtual_includes/grpc_stream_lib/common/config/grpc_stream.h:87] gRPC config stream closed: 14, upstream connect error or disconnect/reset before headers. reset reason: connection termination
[2019-11-16 11:36:07.872][1][info][config] [source/server/listener_manager_impl.cc:761] all dependencies initialized. starting workers
[2019-11-16 11:36:08.080][1][warning][config] [bazel-out/k8-opt/bin/source/common/config/_virtual_includes/grpc_stream_lib/common/config/grpc_stream.h:87] gRPC config stream closed: 14, upstream connect error or disconnect/reset before headers. reset reason: connection termination
[the same "gRPC config stream closed: 14" warning repeats at increasing backoff intervals through 11:42:05]
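Envoy's admin API (bound to 127.0.0.1:19000 per the startup log above) can confirm that no configuration was ever delivered over the failing xDS stream; a sketch of that check, assuming it is run from inside the sidecar's network namespace:

```shell
# /listeners and /clusters are standard Envoy admin endpoints.
# An empty listener list here is consistent with the xDS stream
# closing before any config was delivered.
curl -s localhost:19000/listeners

# local_agent is the bootstrap cluster Consul generates for the
# agent's gRPC endpoint; its connect-failure stats indicate whether
# Envoy can reach Consul at all.
curl -s localhost:19000/clusters | grep -i local_agent
```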

Consul Log

  2019/11/16 12:43:54 [WARN] agent: Check "service:_nomad-task-69974f76-1782-3c84-8970-3c1b72ab6f54-group-dashboard-count-dashboard-9002-sidecar-proxy:1" socket connection failed: dial tcp 127.0
  2019/11/16 12:43:59 [WARN] agent: Check "service:_nomad-task-f01929ea-67f1-8cdc-d809-0306808f0f6d-group-api-count-api-9001-sidecar-proxy:1" socket connection failed: dial tcp 127.0.0.1:31765: 
  [both checks continue to fail every 10 seconds through 12:44:54]

Nomad Config

log_level = "DEBUG"
name = "controller-0"
datacenter = "integration-test"

server {
    enabled          = true
    bootstrap_expect = 3
}

client {
    enabled = true
    meta {
        role = "controller"
    }
}

advertise {
  http = "xxx"
  rpc  = "xxx"
  serf = "xxx"
}

plugin "raw_exec" {
  config {
    enabled = true
  }
}

consul {
    address = "127.0.0.1:8500"
    auto_advertise = true
    server_auto_join = true
    client_auto_join = true
    token = "xxx"
}

tls {
    http = true
    rpc  = true

    ca_file   = "/xxx/ca.cert.pem"
    cert_file = "/xxx/controller-0.cert.pem"
    key_file  = "/xxx/controller-0.key.pem"

    verify_server_hostname = true
    verify_https_client    = false
}

autopilot {
    cleanup_dead_servers = true
    last_contact_threshold = "200ms"
    max_trailing_logs = 250
    server_stabilization_time = "10s"
    enable_redundancy_zones = false
    disable_upgrade_migration = false
    enable_custom_upgrades = false
}

Consul config

{
  "datacenter": "integration-test",
  "server": true,
  "ui": true,
  "advertise_addr": "xxx",
  "client_addr": "xxx",
  "leave_on_terminate": true,
  "retry_join": ["xxx","xxx","xxx"],
  "bootstrap_expect": 3,
  "ca_file": "/xxx/ca.cert.pem",
  "cert_file": "/xxx/controller-0.cert.pem",
  "key_file": "/xxx/controller-0.key.pem",
  "verify_incoming": true,
  "verify_incoming_https": false,
  "verify_outgoing": true,
  "addresses": {
    "https": "0.0.0.0",
    "http": "0.0.0.0",
    "dns": "127.0.0.1",
    "grpc": "127.0.0.1"
  },
  "ports": {
    "dns": 8553,
    "http": 8500,
    "https": 8501,
    "grpc": 8502
  },
  "encrypt": "xxx",
  "acl": {
    "enabled": true,
    "default_policy": "deny",
    "down_policy": "deny",
    "tokens": {
      "master": "xxx",
      "agent": "xxx"
    }
  },
  "connect": {
    "enabled": true
  }
}
@tgross added the theme/consul/connect (Consul Connect integration) and stage/duplicate labels on Nov 18, 2019

tgross commented Nov 18, 2019

Hi @pellepelster! It looks like you're using Consul with TLS?

We have a report of this in #6594 and an issue to work through the testing for it in #6502. So I'm going to close this one as a duplicate but thanks for reporting it!

@tgross closed this as completed on Nov 18, 2019
shoenig added a commit that referenced this issue Apr 2, 2020
Fixes #6594 #6711 #6714 #7567

e2e testing is still TBD in #6502

Before, we only passed the Nomad agent's configured Consul HTTP
address onto the `consul connect envoy ...` bootstrap command.
This meant any Consul setup with TLS enabled would not work with
Nomad's Connect integration.

This change now sets CLI args and Environment Variables for
configuring TLS options for communicating with Consul when doing
the envoy bootstrap, as described in
https://www.consul.io/docs/commands/connect/envoy.html#usage
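For anyone who hit this before the fix landed, `consul connect envoy` reads the standard Consul CLI environment variables described in the linked usage docs, so the TLS settings could be supplied that way. A sketch of the relevant variables (paths and token are placeholders matching the redacted configs above):

```shell
# Standard Consul CLI/env settings the envoy bootstrap consults
export CONSUL_HTTP_ADDR=https://127.0.0.1:8501
export CONSUL_HTTP_SSL=true
export CONSUL_CACERT=/xxx/ca.cert.pem
export CONSUL_HTTP_TOKEN=xxx
export CONSUL_GRPC_ADDR=127.0.0.1:8502
```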
@github-actions

I'm going to lock this issue because it has been closed for 120 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Nov 16, 2022