
The PreCommit Python ML tests with ML deps installed job is flaky #31285

Closed
github-actions bot opened this issue May 14, 2024 · 6 comments · Fixed by #32432

Comments

@github-actions
Contributor

The PreCommit Python ML tests with ML deps installed job is failing over 50% of the time.
Please visit https://github.com/apache/beam/actions/workflows/beam_PreCommit_Python_ML.yml?query=is%3Afailure+branch%3Amaster to see all failed workflow runs.
See also Grafana statistics: http://metrics.beam.apache.org/d/CTYdoxP4z/ga-post-commits-status?orgId=1&viewPanel=7&var-Workflow=PreCommit%20Python%20ML%20tests%20with%20ML%20deps%20installed

@liferoad
Collaborator

Looks good with 10 runs.

@github-actions github-actions bot added this to the 2.57.0 Release milestone May 18, 2024
@github-actions github-actions bot reopened this May 26, 2024
@github-actions
Contributor Author

Reopening since the workflow is still flaky

@liferoad liferoad self-assigned this May 27, 2024
@liferoad
Collaborator

#31407 should fix this.

@liferoad
Collaborator

Runs are green now:

(screenshot: green workflow runs)

@github-actions github-actions bot reopened this Sep 10, 2024
@github-actions
Contributor Author

Reopening since the workflow is still flaky

@liferoad
Collaborator

The error "xgboost 1.7.6 is not supported on this platform" occurred on py3.8, py3.9, py3.11, and py3.12.
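
One common way to keep a suite green when an optional dependency like xgboost cannot be installed (or is unsupported) on a given platform or Python version is to guard the test module with `pytest.importorskip` and a version check. This is only a minimal sketch; the module name and the version pin below are illustrative assumptions, not the actual change made in the Beam repository:

```python
# Sketch: skip xgboost-dependent tests when the package is missing or
# too old, instead of letting the whole PreCommit job fail.
# (Illustrative only; not Beam's actual fix.)
import pytest

# Skip the entire module if xgboost cannot be imported on this platform.
xgboost = pytest.importorskip("xgboost")


def test_xgboost_version_guard():
    # Skip (rather than fail) when the installed xgboost is older than
    # the version the test was written against.
    major, minor = (int(p) for p in xgboost.__version__.split(".")[:2])
    if (major, minor) < (1, 7):
        pytest.skip("requires xgboost>=1.7")
    assert xgboost.__version__
```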
