[CI] AutodetectMemoryLimitIT testManyDistinctOverFields failing #105347
Labels: blocker · :ml (Machine learning) · Team:ML (Meta label for the ML team) · >test-failure (Triaged test failures from CI)
Comments
DaveCTurner added the :ml (Machine learning) and >test-failure (Triaged test failures from CI) labels on Feb 9, 2024.
Pinging @elastic/ml-core (Team:ML)
droberts195 commented: The first failure was on 3rd November last year. The PR that most likely caused it is elastic/ml-cpp#2585, which was merged on 11th October. That leaves a fair gap before the first failure, but given how sporadic the failures have been, it's not hard to believe. The margin by which the failure exceeds the expected upper bound is tiny, so I'll just increase the bound a bit.
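For readers unfamiliar with the pattern being discussed, here is a minimal sketch of what such a memory-bound assertion can look like, and of the kind of change described above. All class names, method names, and byte values below are hypothetical illustrations, not the actual code; the real assertion lives in AutodetectMemoryLimitIT#testManyDistinctOverFields. The sketch assumes Hamcrest matchers, which Elasticsearch tests commonly use.

```java
// Hypothetical sketch of a model-memory bound check; names and values
// are illustrative only, not taken from AutodetectMemoryLimitIT.
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.greaterThan;
import static org.hamcrest.Matchers.lessThan;

public class MemoryBoundSketch {

    static final long LOWER_BOUND_BYTES = 110_000_000L; // hypothetical

    // The fix described in this thread raises the upper bound very
    // slightly, because elastic/ml-cpp#2585 plus the test's randomness
    // could push memory a tiny percentage over the old limit.
    static final long UPPER_BOUND_BYTES = 120_500_000L; // was, say, 120_000_000L

    static void assertModelMemory(long modelBytes) {
        // Fail if the job's reported model memory falls outside the
        // expected range for this workload.
        assertThat(modelBytes, greaterThan(LOWER_BOUND_BYTES));
        assertThat(modelBytes, lessThan(UPPER_BOUND_BYTES));
    }

    public static void main(String[] args) {
        assertModelMemory(115_000_000L); // passes with the hypothetical bounds
    }
}
```

Because the measured memory depends on randomized test data, a hard-coded upper bound like this can fail sporadically when the true distribution sits very close to the limit, which matches the failure pattern reported in this issue.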
droberts195 added a commit to droberts195/elasticsearch that referenced this issue on Feb 22, 2024:
It seems that the changes of elastic/ml-cpp#2585 combined with the randomness of the test could cause it to fail very occasionally, and by a tiny percentage over the expected upper bound. This change reenables the test by very slightly increasing the upper bound. Fixes elastic#105347
droberts195 added a commit that referenced this issue on Feb 22, 2024:
It seems that the changes of elastic/ml-cpp#2585 combined with the randomness of the test could cause it to fail very occasionally, and by a tiny percentage over the expected upper bound. This change reenables the test by very slightly increasing the upper bound. Fixes #105347
droberts195 added a commit to droberts195/elasticsearch that referenced this issue on Feb 22, 2024:
…105727) It seems that the changes of elastic/ml-cpp#2585 combined with the randomness of the test could cause it to fail very occasionally, and by a tiny percentage over the expected upper bound. This change reenables the test by very slightly increasing the upper bound. Fixes elastic#105347
elasticsearchmachine pushed a commit that referenced this issue on Feb 22, 2024:
…#105734) It seems that the changes of elastic/ml-cpp#2585 combined with the randomness of the test could cause it to fail very occasionally, and by a tiny percentage over the expected upper bound. This change reenables the test by very slightly increasing the upper bound. Fixes #105347
Original report:
Has failed like this a couple of times in the last 90d.
Build scan: https://gradle-enterprise.elastic.co/s/vowhqrir6w5zo/tests/:x-pack:plugin:ml:qa:native-multi-node-tests:javaRestTest/org.elasticsearch.xpack.ml.integration.AutodetectMemoryLimitIT/testManyDistinctOverFields
Reproduction line:
Applicable branches: main
Reproduces locally?: Didn't try
Failure history: Failure dashboard for org.elasticsearch.xpack.ml.integration.AutodetectMemoryLimitIT#testManyDistinctOverFields
Failure excerpt: