
Throttling request warnings #1228

Closed
lblackstone opened this issue Jul 31, 2020 · 2 comments
Labels
kind/bug (Some behavior is incorrect or out of spec), resolution/fixed (This issue was fixed)

Comments

@lblackstone
Member

Problem description

While investigating #1222, I noticed that installing cert-manager produces throttling request warnings. I expect this is because the provider submits many API requests in parallel.

Errors & Logs

I0731 16:16:42.552853   90306 request.go:621] Throttling request took 1.156452855s, request: POST:https://kubernetes.docker.internal:6443/apis/rbac.authorization.k8s.io/v1/clusterroles
I0731 16:16:52.755191   90306 request.go:621] Throttling request took 4.591436804s, request: GET:https://kubernetes.docker.internal:6443/apis/rbac.authorization.k8s.io/v1beta1/namespaces/kube-system/roles/cert-manager-cainjector:leaderelection

Affected product version(s)

Reproducing the issue

import * as k8s from "@pulumi/kubernetes";

const certManager = new k8s.yaml.ConfigFile("cert-manager", {
    file: "https://github.com/jetstack/cert-manager/releases/download/v0.16.0/cert-manager.yaml",
});

Suggestions for a fix

I see a couple possibilities:

  1. Suppress these warnings and assume that the request will eventually succeed after retrying.
  2. Attempt to avoid throttling by limiting concurrency on resource updates (see the note below).
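
As a user-side workaround for option 2, concurrency can already be limited from the CLI: the Pulumi CLI's --parallel flag caps how many resource operations run at once, e.g. pulumi up --parallel 4. Lower values trade deployment speed for fewer concurrent API requests, which should reduce the client-side throttling.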
@ninja-

ninja- commented Aug 4, 2020

I would suggest sorting resources so that CRDs are always applied first...
Or, if it's webhooks hanging the deploy, waiting for the webhook pod to be up and adding some logic to sort the order based on that too - not sure.

@lblackstone lblackstone added kind/bug Some behavior is incorrect or out of spec resolution/fixed This issue was fixed labels Dec 21, 2022
@lblackstone
Member Author

This can be fixed by configuring the Kubernetes client settings appropriately; the default client-side rate limits are sometimes too low for stacks with many resources.
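
For reference, a minimal sketch of what that could look like in the repro program above, assuming the provider version in use exposes the kubeClientSettings option with qps and burst fields (the values shown are arbitrary and only raise the client-side rate limits):

import * as k8s from "@pulumi/kubernetes";

// Raise the Kubernetes client's rate limits so a large manifest like
// cert-manager does not trip client-side request throttling.
// kubeClientSettings and its qps/burst fields are assumed to be available
// in the provider version in use; the values are illustrative.
const provider = new k8s.Provider("k8s", {
    kubeClientSettings: {
        qps: 50,
        burst: 100,
    },
});

const certManager = new k8s.yaml.ConfigFile("cert-manager", {
    file: "https://github.com/jetstack/cert-manager/releases/download/v0.16.0/cert-manager.yaml",
}, { provider });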
