Filter impossible packages from the pool #9620
Conversation
I plan to let someone look this over and decide where it could go, and if it does go somewhere I will then address the lint failures. Tests should pass except where I used incompatible syntax.
Some additional timing/memory measurements for the above. With the Pool Optimiser PR (#9261): a great benefit, as it flattens many of the packages into a small subset. As you can see, far fewer rules are generated.
With this PR: a slight improvement, as it completely removes many packages that don't meet the requirements of the remaining fixed packages. It does not flatten the still-potentially-valid candidates into a subset, but it does reduce the rule count even further regardless.
With both merged together: the best of both. The package set is now really small, containing only packages that could be modified to meet the requested version update, and within that set, where constraints are identical across versions, the best candidates are pre-determined.
Worth noting that the Pool Optimiser PR will greatly improve even installations, whereas this one only improves updates, since it needs fixed packages to work from. Overall it should make projects like Dependabot, which run updates continuously, much quicker.
Does this PR provide any benefit compared to #9261? If not, I'd rather focus the effort on the pool optimizer PR.
@stof It does provide a benefit, in lower memory and speed, at least in my local tests. The size of that benefit is not as substantial though, and it only really impacts updates. I did wonder if it could be the second optimisation after #9261 adds the optimiser, assuming it is viable and no issues are found with it. I would agree that #9261 should be prioritised over this one, as it impacts install AND update and is a significant improvement as it is. Maybe this one for later?
@Seldaek Thank you 🙏
@@ -75,7 +75,7 @@ public function optimize(Request $request, Pool $pool)
 {
     $this->prepare($request, $pool);

-    $optimizedPool = $this->optimizeByIdenticalDependencies($pool);
+    $optimizedPool = $this->optimizeByIdenticalDependencies($request, $pool);
I think the implementation looks good from a logical standpoint. Nitpick: I would just revisit the way it's implemented here. I designed it so that you would list multiple independent optimization steps here like so:
$pool = $this->optimizeByA($pool);
$pool = $this->optimizeByB($pool);
// etc
And the current one is not related to "optimizing by identical dependencies", so it should IMHO go into its own step.
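The structure being described — each optimization is an independent pass that takes a pool and returns a (possibly smaller) pool — can be sketched generically. The Python below is purely illustrative: the toy pass names and the tuple-based pool model are assumptions for the sketch, not Composer's actual classes or passes.

```python
# Illustrative sketch: a pool modeled as a list of (name, version) tuples,
# optimized by a chain of independent passes. Pass names are hypothetical.

def optimize(pool, passes):
    # Each pass takes a pool and returns a (possibly smaller) pool,
    # so passes can be added, removed, or reordered independently.
    for optimization_pass in passes:
        pool = optimization_pass(pool)
    return pool

def drop_duplicates(pool):
    # Keep the first occurrence of each (name, version) pair.
    return list(dict.fromkeys(pool))

def drop_pre_releases(pool):
    # Toy rule: treat any version containing "-" as a pre-release.
    return [pkg for pkg in pool if "-" not in pkg[1]]

pool = [("a/b", "1.0.0"), ("a/b", "1.0.0"), ("a/b", "2.0.0-beta1")]
print(optimize(pool, [drop_duplicates, drop_pre_releases]))
# -> [('a/b', '1.0.0')]
```

This keeps each step self-contained, which is the design goal mentioned above.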
Yeah, I see that, but then IMO the way it creates a new pool in every optimization pass is perhaps not so ideal. Anyway, I'll have another look later.
OK, f4e3121 fixes it, but I moved the pool cleanup pass outside of optimizeBy* so that it's only done once. Otherwise keeping track of all the removed packages etc. gets even messier.
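The shape of that fix — passes only mark packages for removal, and a single cleanup rebuilds the pool once at the end — might look roughly like the following. This is a hypothetical Python sketch under assumed names, not the actual code in f4e3121.

```python
# Pool modeled as {package id -> (name, version)}. Each pass returns a set
# of ids to remove; the pool itself is only rebuilt once, at the end.

def mark_legacy_versions(pool, removed):
    # Toy pass: mark pre-1.0 versions for removal.
    return {pid for pid, (name, ver) in pool.items()
            if ver.startswith("0.") and pid not in removed}

def mark_duplicates(pool, removed):
    seen, marks = set(), set()
    for pid, pkg in pool.items():
        if pid in removed:
            continue
        if pkg in seen:
            marks.add(pid)
        seen.add(pkg)
    return marks

def optimize(pool):
    removed = set()
    for mark_pass in (mark_legacy_versions, mark_duplicates):
        removed |= mark_pass(pool, removed)
    # One cleanup at the end instead of allocating a new pool per pass.
    return {pid: pkg for pid, pkg in pool.items() if pid not in removed}

pool = {1: ("a/b", "0.9.0"), 2: ("a/b", "1.0.0"), 3: ("a/b", "1.0.0")}
print(optimize(pool))  # -> {2: ('a/b', '1.0.0')}
```

The trade-off is that later passes must consult the `removed` set, which is the bookkeeping described as "messier" above when done per pass.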
Thanks @driskell! Just FYI, the other one probably will not make it into 2.2 as I'm still a bit unsure about the edge cases, but hopefully we can get it in at some point.
Composer 2.2 contains a bug/over-optimization which causes installs with changed but compatible requirements to error out on "conflicting" requirements. This commit puts a work-around in place which still uses caching for the Composer install as much as possible. It is recommended to revert this commit if/when the Composer bug has been fixed and a new version has been released. Ref:
* Bug report: composer/composer#10394
* Probable cause: composer/composer#9620
* Similar fix in YoastSEO: Yoast/wordpress-seo@b399942
I have noticed that updates of some Drupal packages can be fairly slow. This seems to affect projects to varying degrees. One module I found it reproduces with is drupal/block_permissions. It appears to be due to the fact that it requires drupal/core ~8, and in its latest version drupal/core ^8 || ^9. When processing an update of the module, this causes the drupal/core package to be downloaded and added to the pool in every version from 8.0.0 all the way to 8.9, and then from 9.0 all the way to 9.2, plus all the Symfony packages depended on in between those versions. This then generates a significant number of conflict rules.
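To see why the rule count explodes: as a rough model, every pair of mutually incompatible candidate versions of the same package can yield a conflict rule, so the count grows roughly quadratically with the number of versions in the pool. This is a back-of-envelope illustration, not Composer's exact rule-generation logic:

```python
def pairwise_conflicts(n_versions):
    # One rule per unordered pair of same-name candidates
    # (worst-case model where every pair conflicts).
    return n_versions * (n_versions - 1) // 2

# Dozens of drupal/core versions pulled in (8.0.0-8.9.x plus 9.0-9.2.x),
# before even counting their Symfony dependencies:
print(pairwise_conflicts(60))  # -> 1770
```

Multiply that across drupal/core and every Symfony package dragged in alongside it, and rule counts in the hundreds of thousands become plausible.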
To reproduce, use the composer.json at https://github.com/drupal/recommended-project/blob/8.9.x/composer.json and change the requirement on block_permissions to "^1.0".
As you can see, nearly three quarters of a million rules. In fact, on the project I'm actually having these issues with, there are around 990,000, the majority of them conflict rules.
Specifically, this means we can't use Dependabot as effectively as we'd hoped, as each check of a module for updates takes a significant amount of time. The PR in #9619 reduces the time a large amount, but it is still slow.
This PR aims to reduce the rule count, similar to the Pool Optimisation in #9261 (maybe this should be a second optimisation rule that can be disabled?), but targeting updates only, and hopefully speed up Dependabot and general updates. It does this by taking the list of packages that are still fixed and filtering the pool against their requirements. This removes all versions of drupal/core in the pool that do not meet the requirement of the still-fixed packages from composer.json, reducing the package count substantially.
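The core idea — drop every pool candidate that cannot satisfy a constraint coming from a still-fixed (locked) package — can be sketched as follows. This is a deliberately simplified model: the string-based `satisfies` check is a stand-in for real semver constraint matching, and the data shapes are assumptions for the sketch, not Composer's actual Pool/Request API.

```python
def satisfies(version, constraint):
    # Extremely simplified stand-in for semver constraint matching:
    # "^8 || ^9" here just means "major version is 8 or 9".
    major = version.split(".")[0]
    return major in constraint.lstrip("^").split(" || ^")

def filter_pool(pool, fixed_requirements):
    # fixed_requirements: {package name -> constraint}, collected from
    # the packages that stay locked during this partial update.
    return [
        (name, version) for name, version in pool
        if all(satisfies(version, constraint)
               for req_name, constraint in fixed_requirements.items()
               if req_name == name)
    ]

pool = [("drupal/core", "8.9.20"), ("drupal/core", "9.2.0"),
        ("drupal/core", "7.80.0")]
fixed = {"drupal/core": "^8 || ^9"}  # requirement from a locked package
print(filter_pool(pool, fixed))
# drupal/core 7.x is dropped; the 8.x and 9.x candidates remain
```

Every candidate removed here never becomes a literal in the solver, so the conflict rules it would have participated in are never generated at all.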
I do not believe problem output should be impacted: if a package did come in needing a different version of something, that package would be unlocked by -w / -W if needed to allow the recalculation. Though I may have missed something, so it's worth some eyes.

For me it reduces the time by a good 5 seconds for the above scenario. It also significantly reduces the memory usage, to about a tenth of what it was.
When combined with the PR in #9619 it reduces timings and memory further.
CC @Toflar as this was based on my investigation of the Drupal bits in the Pool Optimiser PR #9261.
(Background: I'm trying to make Dependabot faster so we can use it on Drupal effectively. GitHub's own runner hits the 45-minute timeout for us; running it ourselves takes over an hour, about 75 minutes. With the above two PRs the full run takes less than 5 minutes, with significantly less memory, bandwidth and CO2 😉 ... if it turns out it is working, that is!)