
Failing test: Serverless Observability API Integration Tests.x-pack/test_serverless/api_integration/test_suites/common/security/user_profiles·ts - serverless common API security/user_profiles route access internal update #165391

Closed
kibanamachine opened this issue Aug 31, 2023 · 23 comments · Fixed by #165516
Labels
failed-test (A test failure on a tracked branch, potentially flaky-test)
Team:Security (Team focused on: Auth, Users, Roles, Spaces, Audit Logging, and more!)

Comments

@kibanamachine (Contributor) commented Aug 31, 2023

A test failed on a tracked branch

Error: socket hang up
    at connResetException (node:internal/errors:720:14)
    at TLSSocket.socketOnEnd (node:_http_client:525:23)
    at TLSSocket.emit (node:events:526:35)
    at endReadableNT (node:internal/streams/readable:1359:12)
    at processTicksAndRejections (node:internal/process/task_queues:82:21) {
  code: 'ECONNRESET',
  response: undefined
}

First failure: CI Build - main
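For reference, Node's HTTP client raises "socket hang up" with code ECONNRESET when the server (or something in between, such as a proxy) closes the connection before any response bytes are written. A minimal standalone sketch that reproduces the same error outside the test suite (plain Node http here; the failing request goes over TLS, but the mechanism is identical):

```ts
import http from 'node:http';
import type { AddressInfo } from 'node:net';

// A server that accepts the request and then destroys the socket without
// writing any response; the client sees "socket hang up" / ECONNRESET.
const server = http.createServer((req) => {
  req.socket.destroy();
});

server.listen(0, () => {
  const { port } = server.address() as AddressInfo;
  http.get(`http://127.0.0.1:${port}/`).on('error', (err) => {
    console.error(err); // Error: socket hang up, code: 'ECONNRESET'
    server.close();
  });
});
```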

kibanamachine added the failed-test label on Aug 31, 2023
botelastic bot added the needs-team (Issues missing a team label) label on Aug 31, 2023
@kibanamachine (Contributor, Author) posted 18 follow-up comments, each reporting:

New failure: CI Build - main
mistic added the Team:Security label on Sep 1, 2023
@elasticmachine (Contributor) commented:

Pinging @elastic/kibana-security (Team:Security)

botelastic bot removed the needs-team label on Sep 1, 2023
mistic added a commit that referenced this issue Sep 1, 2023
@mistic (Member) commented Sep 1, 2023

Skipped.

main: 1158ab5

@azasypkin (Member) commented:

[00:02:07]             └- ✖ fail: serverless common API security/user_profiles route access internal update
[00:02:07]             │      Error: socket hang up
[00:02:07]             │       at connResetException (node:internal/errors:720:14)
[00:02:07]             │       at TLSSocket.socketOnEnd (node:_http_client:525:23)
[00:02:07]             │       at TLSSocket.emit (node:events:526:35)
[00:02:07]             │       at endReadableNT (node:internal/streams/readable:1359:12)
[00:02:07]             │       at processTicksAndRejections (node:internal/process/task_queues:82:21)

@mistic is this the only test that is affected? Looking at the error, it seems that our testing infra isn't healthy in general…

@mistic (Member) commented Sep 1, 2023

@azasypkin according to the stats, this one has been consistently flaky over the past 24 hours; I don't see the same pattern for other tests at the moment. The Error: socket hang up is probably just hiding the real cause of the flakiness. Everything is fine with Buildkite, GitHub, and GCP, as well as with our worker metrics.
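One way to confirm that while debugging would be to retry only on ECONNRESET, so that a genuine server-side failure surfaces instead of being masked by the reset. A minimal sketch; the helper is hypothetical, and the route and service names in the usage comment are assumptions, not code from the test file:

```ts
// Hypothetical debugging helper (not from the Kibana repo): retry an async
// request only when it fails with a transient ECONNRESET.
async function retryOnConnReset<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if ((err as NodeJS.ErrnoException)?.code !== 'ECONNRESET') {
        throw err; // not a transient reset, so surface the real cause immediately
      }
    }
  }
  throw lastError;
}

// Illustrative usage with a supertest-style call (route and service names are
// assumptions for this sketch):
// const response = await retryOnConnReset(() =>
//   supertestWithoutAuth
//     .post('/internal/security/user_profile/_data')
//     .set(internalRequestHeader)
//     .send({ key: 'value' })
// );
```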

@kibanamachine (Contributor, Author) commented:

New failure: CI Build - main
