Fix flaky ml test by increasing timeout #9523
Conversation
The ML tests sometimes fail on Travis or Jenkins, but when looking at the output files, all expected entries are normally there. Since the tests pass most of the time, I assume this is a timeout issue. This change increases the timeout to 60s.
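The change described above amounts to giving a polling wait a larger deadline. As a rough illustration (the helper name, parameters, and defaults here are assumptions for the sketch, not the project's actual test API), a typical wait-until helper used by system tests looks like this, with the timeout raised to 60 seconds:

```python
import time


def wait_until(cond, max_timeout=60, poll_interval=0.5, name="condition"):
    """Poll `cond` until it returns True or `max_timeout` seconds elapse.

    Raises TimeoutError if the condition never becomes true, which is
    what shows up as a flaky test failure when the deadline is too short.
    """
    deadline = time.monotonic() + max_timeout
    while time.monotonic() < deadline:
        if cond():
            return
        time.sleep(poll_interval)
    raise TimeoutError(
        "Timed out waiting for {} after {}s".format(name, max_timeout)
    )
```

With a slow CI worker, the condition eventually holds but only after the old, shorter deadline has passed; raising `max_timeout` to 60s gives the ML jobs enough headroom without changing what the test asserts.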
jenkins, test this
This should fix #8359
I'd like to know why this takes so long, but I guess this is fine for now to reduce flakiness.
@jsoriano me too. My assumption is that Travis simply does not have enough memory available for ML, so everything is really slow. I'll keep the related issue open for now so we can keep tracking it.
(cherry picked from commit 220cdee)