Move the deployment YAML to kustomize #2088

Closed
jpeach opened this issue Jan 8, 2020 · 6 comments
Labels
area/deployment: Issues or PRs related to deployment tooling or infrastructure.
lifecycle/stale: Denotes an issue or PR has remained open with no activity and has become stale.

Comments

@jpeach
Contributor

jpeach commented Jan 8, 2020

The current Contour deployment YAML builds from the fragments in examples/contour and doesn't give any help to operators that need to modify the deployment in any way.

If we move this across to a kustomize configuration, we have the opportunity to be able to modify the YAML for different deployment targets (e.g. AWS, GCP, Kind), as well as generate the same consolidated YAML file that we document for the Contour quickstart. Other people that deploy Contour may be able to consume the kustomize configuration if we structure it correctly.
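For illustration, a per-target overlay under this kind of scheme might look roughly like the sketch below. The paths and file names are hypothetical, not the actual Contour layout:

# overlays/aws/kustomization.yaml (hypothetical layout, for illustration only)
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - ../../base                 # the shared Contour deployment
patchesStrategicMerge:
  - envoy-service-elb.yaml     # AWS-specific tweaks to the Envoy Service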

Pros of kustomize:

  • Gives some deployment flexibility we don't have today
  • Built in to kubectl

Cons of kustomize:

  • Documentation leaves something to be desired
  • Limited configuration flexibility (e.g. hard or impossible to remove components)

Proof of concept: jpeach@1c575c7

Related to #1190, #2050

@jpeach
Contributor Author

jpeach commented Jan 10, 2020

One problem that just bit me is that kustomize and kubectl kustomize releases are not aligned. This means it is possible to use syntax that kustomize will accept but kubectl won't. I'm currently in this situation with fieldRef expressions, and I have:

$ kubectl version
Client Version: version.Info{Major:"1", Minor:"17", GitVersion:"v1.17.0", GitCommit:"70132b0f130acc0bed193d9ba59dd186f0e634cf", GitTreeState:"clean", BuildDate:"2019-12-13T11:52:32Z", GoVersion:"go1.13.4", Compiler:"gc", Platform:"darwin/amd64"}
Server Version: version.Info{Major:"1", Minor:"16+", GitVersion:"v1.16.0-gke.20", GitCommit:"d324c1db214acfc1ff3d543767f33feab3f4dcaa", GitTreeState:"clean", BuildDate:"2019-11-26T20:51:21Z", GoVersion:"go1.12.11b4", Compiler:"gc", Platform:"linux/amd64"}
$ kustomize version
{Version:3.5.3 GitCommit:5ba90fe5ef1dc4599e359edd41d1d0e6373b247d BuildDate:2019-12-18T03:07:49+00:00 GoOs:darwin GoArch:amd64}
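For context, the sketch below shows the sort of fieldref-based var that standalone kustomize accepts; whether the much older kustomize vendored into kubectl handles it is exactly the mismatch described above. The names are illustrative, not the actual expressions from the proof of concept:

# Illustrative only: a var whose value is read from another object's field
vars:
  - name: CONTOUR_NAMESPACE
    objref:
      apiVersion: v1
      kind: Service
      name: contour
    fieldref:
      fieldpath: metadata.namespace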

@stevesloka
Member

My first thought was to default to using the version of kustomize in kubectl, but there's no way to know what version of kubectl a user has.

I think we just need to pick a version of kustomize and document it.

@jpeach
Contributor Author

jpeach commented Jan 10, 2020 via email

@stefanprodan
Contributor

This would be very helpful. Contour could expose a remote base for each release branch so that patching could target a specific release; see an example here: https://github.com/stefanprodan/eks-contour-ingress/tree/master/contour
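For illustration, consuming such a remote base might look roughly like this (the path and ref below are hypothetical; adjust them to whatever layout Contour publishes):

# kustomization.yaml: patching against a Contour release branch used as a remote base
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - github.com/projectcontour/contour//examples/contour?ref=release-1.x
patchesStrategicMerge:
  - my-envoy-service.yaml   # local patch applied on top of the remote base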

@jpeach jpeach added the area/deployment label Feb 9, 2020
jpeach added a commit to jpeach/contour that referenced this issue Apr 27, 2020
Move the example deployment to Kustomize. This breaks the YAML documents
in the example deployment into 4 components located in `config/components`
- types, contour, envoy and certgen. These are all included in the default
deployments, but operators have the option of creating deployments that
don't include all the components.

Deployments to various Kubernetes infrastructure are in the `deployment`
directory. The base deployment pulls in all the components and sets the
namespace to `projectcontour`. The `kind` deployment updates the Envoy
Daemonset to use a `NodePort` service, and the `aws` deployment enables
TCP load balancing with PROXY protocol support. No special options are
needed for `gke` as far as I know, but it is included for completeness.

The traditional quickstart YAML is now located at `config/quickstart.yaml`
and is just a rendering of the base deployment. The netlify redirect can't
be updated until after a release because it points to a release branch.

This updates projectcontour#855, projectcontour#1190, projectcontour#2088, projectcontour#2544.

Signed-off-by: James Peach <jpeach@vmware.com>
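Based on the layout this commit message describes, the base deployment might look roughly like the sketch below (file and directory names are hypothetical and may not match the actual tree):

# deployment/base/kustomization.yaml (hypothetical sketch of the layout described above)
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
namespace: projectcontour
resources:
  - ../../config/components/types
  - ../../config/components/contour
  - ../../config/components/envoy
  - ../../config/components/certgen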
jpeach added a commit to jpeach/contour that referenced this issue Apr 27, 2020
Move the example deployment to Kustomize. This requires the `kustomize`
tool, since the version of Kustomize vendored in `kubectl apply -k`
is too old to support it.

The YAML documents in the example deployment are broken into 4 components
located in `config/components` - types, contour, envoy and certgen. These
are all included in the default deployments, but operators have the
option of creating deployments that don't include all the components.
The `types-v1` component contains the Contour CRDs suitable for Kubernetes
1.16 or later.

Deployments to various Kubernetes infrastructure are in the `deployment`
directory. The base deployment pulls in all the components and sets the
namespace to `projectcontour`. The `kind` deployment updates the Envoy
Daemonset to use a `NodePort` service, and the `aws` deployment enables
TCP load balancing with PROXY protocol support. No special options are
needed for `gke` as far as I know, but it is included for completeness.

The traditional quickstart YAML is now located at `config/quickstart.yaml`
and is just a rendering of the base deployment. The netlify redirect can't
be updated until after a release because it points to a release branch.

This updates projectcontour#855, projectcontour#1190, projectcontour#2088, projectcontour#2544.

Signed-off-by: James Peach <jpeach@vmware.com>
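As a rough illustration of the `kind` deployment described above, the overlay could switch the Envoy service to NodePort along these lines (file names are hypothetical):

# deployment/kind/kustomization.yaml (hypothetical sketch)
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - ../base
patchesStrategicMerge:
  - envoy-nodeport.yaml

# envoy-nodeport.yaml: patch the Envoy Service type for kind clusters
apiVersion: v1
kind: Service
metadata:
  name: envoy
  namespace: projectcontour
spec:
  type: NodePort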
github-actions bot commented

The Contour project currently lacks enough contributors to adequately respond to all Issues.

This bot triages Issues according to the following rules:

  • After 60d of inactivity, lifecycle/stale is applied
  • After 30d of inactivity since lifecycle/stale was applied, the Issue is closed

You can:

  • Mark this Issue as fresh by commenting
  • Close this Issue
  • Offer to help out with triage

Please send feedback to the #contour channel in the Kubernetes Slack

@github-actions github-actions bot added the lifecycle/stale label Apr 17, 2024
github-actions bot commented

The Contour project currently lacks enough contributors to adequately respond to all Issues.

This bot triages Issues according to the following rules:

  • After 60d of inactivity, lifecycle/stale is applied
  • After 30d of inactivity since lifecycle/stale was applied, the Issue is closed

You can:

  • Mark this Issue as fresh by commenting
  • Close this Issue
  • Offer to help out with triage

Please send feedback to the #contour channel in the Kubernetes Slack

@github-actions github-actions bot closed this as not planned May 21, 2024