
Does parallel=True work during training? #2

Open
ribas591 opened this issue Jul 24, 2024 · 3 comments
Labels
enhancement New feature or request

Comments

@ribas591

Hello there!

Thank you for an interesting project!

I wanted to try the project out on my own dataset.
Unfortunately, training is very slow, and despite setting the flag, all of the training appears to run single-threaded.
For example, here is the execution time on a tiny fraction of the dataset with the flag in use (CPU times: user 15.2 s, sys: 29.5 s, total: 44.6 s) and without it (CPU times: user 15.8 s, sys: 30.1 s, total: 45.8 s). They are more or less the same.
Is this expected, or am I doing something wrong?
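(Side note on the measurement: total CPU time alone does not distinguish serial from parallel execution, because a parallel run burns roughly the same total CPU seconds, just spread across cores. Comparing each run's CPU time against its wall-clock time is a more telling check. A minimal, library-independent sketch of that check, where `busy()` is a hypothetical stand-in for the actual training call:)

```python
import time

def busy(n: int) -> int:
    """CPU-bound stand-in for a training run (sum of squares)."""
    total = 0
    for i in range(n):
        total += i * i
    return total

wall_start = time.perf_counter()   # wall-clock time
cpu_start = time.process_time()    # CPU time across all threads of this process
busy(2_000_000)
wall = time.perf_counter() - wall_start
cpu = time.process_time() - cpu_start

# In a single-threaded run, CPU time roughly equals wall time (ratio ~1).
# If the work genuinely used k cores, cpu / wall would approach k.
print(f"wall={wall:.3f}s cpu={cpu:.3f}s ratio={cpu / wall:.2f}")
```

So rather than comparing the CPU totals of the two runs, a ratio near 1 in the `parallel=True` run would confirm it is effectively single-threaded.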

@deadsoul44
Contributor

Hello, parallelism has little effect currently. A new version will be released in a couple of days with improved parallelism.

@deadsoul44 deadsoul44 added the enhancement New feature or request label Jul 24, 2024
@ribas591
Author

That's great to hear. I'll be happy to test the new version when it's ready.

@deadsoul44
Contributor

Version 0.3.8 has been released with multi-output support, but no parallelism improvement yet. Parallelism needs more effort than expected because of rayon. See: rayon-rs/rayon#1184

I am trying to find a solution with or without rayon.

