We see non-JSON logs in our karpenter output #3816

Closed
cep21 opened this issue Apr 26, 2023 · 2 comments
Labels
question Further information is requested

Comments

cep21 commented Apr 26, 2023

Version

Karpenter controller image: public.ecr.aws/karpenter/controller:v0.27.0

Kubernetes version: v1.25.8-eks-ec5523e

Expected Behavior

I expect all Karpenter logs to be JSON if I turn on JSON log output.

This is a problem because our monitoring solution expects JSON-formatted logs and reports non-JSON logs as errors.

Actual Behavior

Almost all logs are JSON, but a few look like this:

I0425 19:35:40.606552       1 trace.go:219] Trace[638853588]: "DeltaFIFO Pop Process" ID:kube-system/aws-node-hnn54,Depth:50,Reason:slow event handlers blocking the queue (25-Apr-2023 19:35:40.435) (total time: 171ms):
Trace[638853588]: [171.165311ms] [171.165311ms] END

Steps to Reproduce the Problem

Turn on JSON logging (logEncoding: json) and run Karpenter.

Resource Specs and Logs

helm get values karpenter
USER-SUPPLIED VALUES:
logEncoding: json
serviceAccount:
  annotations:
    eks.amazonaws.com/role-arn: REDACTED
settings:
  aws:
    clusterEndpoint: REDACTED
    clusterName: REDACTED
    defaultInstanceProfile: REDACTED
    interruptionQueueName: karpenter-interrupts-REDACTED

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment
cep21 added the bug (Something isn't working) label Apr 26, 2023
jonathan-innis (Contributor) commented Apr 26, 2023

I believe kubernetes-sigs/karpenter#297 should fix this by setting the output of all the global loggers to be consistent. This change hasn't been released yet, but once it is, I believe this should resolve your issue.
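
For reference, the stray lines in the report appear to be klog output: client-go's DeltaFIFO trace is emitted through klog, which writes plain text to stderr regardless of the zap encoder Karpenter is configured with. Below is a minimal sketch of the general technique of routing klog through a JSON-encoding zap logger, assuming k8s.io/klog/v2, go.uber.org/zap, and github.com/go-logr/zapr; it is only an illustration of the idea, not necessarily what kubernetes-sigs/karpenter#297 implements.

package main

import (
	"github.com/go-logr/zapr"
	"go.uber.org/zap"
	"k8s.io/klog/v2"
)

func main() {
	// Build a JSON-encoding zap logger (roughly what logEncoding: json selects).
	zapLog, err := zap.NewProduction() // zap's production config uses the JSON encoder
	if err != nil {
		panic(err)
	}
	defer zapLog.Sync()

	// Route klog (used by client-go, the source of the DeltaFIFO trace lines)
	// through the same structured logger so its output is emitted as JSON too.
	klog.SetLogger(zapr.NewLogger(zapLog))

	// Any klog call now comes out of zap as JSON instead of plain text on stderr.
	klog.Info("this message is JSON-encoded")
}

With something like this in place, trace output like the sample above would be wrapped in the same JSON stream the monitoring pipeline already parses.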

jonathan-innis added the question (Further information is requested) label and removed the bug (Something isn't working) label Apr 26, 2023
njtran (Contributor) commented May 8, 2023

Going to close this as the PR has been merged. If the next release that includes this doesn't solve it for you, please re-open.

njtran closed this as completed May 8, 2023