
Add back cmake command-line cache on unix #86764

Closed · 6 commits

Conversation

sbomer
Member

@sbomer sbomer commented May 25, 2023

This was deleted in #64370 because a canceled configure would leave the cache file behind, causing subsequent invocations to start building on top of the incomplete configure.

I'm adding it back, along with a fix to write the cache file only after configure completes, as suggested in #64370 (comment). I made the same fix to the Windows logic.

This saves 1-2s during incremental builds on my machine. Contributes to #47022.
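The mechanism can be sketched as a small shell script (the file name `cmake_cmd_cache.txt`, the demo directory, and the cmake arguments here are illustrative, not the actual contents of the eng/ scripts): skip configure when the exact command line matches the cached one, and write the cache only after configure succeeds, so an interrupted configure is retried next time.

```shell
#!/bin/sh
# Sketch of the command-line cache described above; names and flags are
# hypothetical, not copied from the repository's build scripts.
build_dir="${TMPDIR:-/tmp}/cmake-cmd-cache-demo"
cache_file="$build_dir/cmake_cmd_cache.txt"
cmake_cmd="cmake -S . -B $build_dir -DCMAKE_BUILD_TYPE=Debug"

mkdir -p "$build_dir"
rm -f "$cache_file"   # start the demo from a clean state

run_configure() {
    if [ -f "$cache_file" ] && [ "$(cat "$cache_file")" = "$cmake_cmd" ]; then
        echo "skipped"
    else
        echo "configured"
        # A real script would invoke $cmake_cmd here. The cache is written
        # only AFTER configure succeeds, which is the fix in this PR: a
        # cancelled configure leaves no cache, so it re-runs next time.
        printf '%s' "$cmake_cmd" > "$cache_file"
    fi
}

first=$(run_configure)    # no cache yet -> configures
second=$(run_configure)   # identical command line -> skipped
echo "$first $second"
```

Running it prints `configured skipped`: the second invocation sees an identical command line and skips the configure step.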

@ghost ghost assigned sbomer May 25, 2023
@dotnet-issue-labeler dotnet-issue-labeler bot added the needs-area-label An area label is needed to ensure this gets routed to the appropriate area owners label May 25, 2023
@sbomer sbomer requested a review from janvorli May 25, 2023 17:19
@janvorli
Member

Besides the interrupted build, there was another case that made this caching painful for me: adding or modifying feature-detection checks in configure.cmake. Such changes require cmake to be re-run, so the caching prevented them from having any effect, and I had to manually delete the artifacts/obj directory each time I changed that file. In fact, with the caching in place, editing anything in the cmake files has no effect unless you delete artifacts/obj. That might be unexpected to people who update their local state from GitHub without cleaning the artifacts folder.
Due to the issues I've mentioned, the 1-2 second gain in build time doesn't seem worth it to me, but if we really want to make this change, then I'd like to at least have a command-line option to disable the caching.
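The feature-detection checks mentioned above look roughly like this (a generic example, not a literal excerpt from the repository's configure.cmake); their results are computed at configure time and baked into generated headers, so skipping configure means edits to them silently have no effect:

```cmake
include(CheckSymbolExists)
include(CheckIncludeFiles)

# These results are cached at configure time; changing or adding a check
# only takes effect when cmake actually re-runs.
check_symbol_exists(pthread_setname_np "pthread.h" HAVE_PTHREAD_SETNAME_NP)
check_include_files("sys/epoll.h" HAVE_SYS_EPOLL_H)

configure_file(config.h.in config.h)
```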

@sbomer
Member Author

sbomer commented May 25, 2023

@janvorli thanks for sharing your thoughts. My goal is to have a mode where incremental builds don't re-run configure, to match the typical cmake flow of running configure when you change the build definitions, then only running the build in the inner loop. The few seconds are significant in relation to the ~5s goal mentioned in #47022.

We could do this by requiring an opt-in to get the fast innerloop build via -skipconfigure - but that requires running two commands, first to configure, then to get a fast incremental build (and remembering to pass an extra flag). I suspect many runtime devs would not end up benefiting from it. I would also like the behavior to be consistent with our Windows scripts, where we do use the cache file currently.

Another option is to split configure out into a separate step that must be run explicitly before build - but I don't think runtime devs will have an appetite for this.

So the next best option seems to be making the default experience use a cache, with an opt-out as you suggested. We have a -configureonly flag that should bypass the cache (it's broken on unix, but I can fix it). That would require running two commands (-configureonly, followed by a normal build). Does that work for your use case, or is it important to have a flag that skips the cache and then builds, in one command?

@am11
Member

am11 commented May 25, 2023

This saves a second for one use case and adds a 15-minute penalty for others, where there is no option but to delete the artifacts directory and rebuild. cmake has its own caching mechanism, which everyone uses and which should come naturally. I don't think this is the right place to save 1-2 seconds.

You can save time by deduplicating and deleting unused or unnecessary introspection checks. There's plenty to clean up on the real side of things.

@vargaz
Contributor

vargaz commented May 25, 2023

I agree that the default build should be safe; if somebody wants a shortcut build that is faster but can sometimes fail, there should be some kind of option for that.

@janvorli
Member

After thinking about it a bit more, I now feel more strongly that the default should be not to use the cache. We can add a command-line option to enable the caching for the cases where we care about the extra second or two.
The reason is the danger that the cache can cause people to get incorrect builds after syncing from GitHub or making changes to the cmake build files. That is quite non-obvious; I would expect the whole build to be incremental, including cmake file changes. I really dislike cases where I make a change, it doesn't seem to work, and I have to spend time debugging and investigating what's wrong, only to figure out that the build didn't pick up my changes. Or, even worse, creating a build for testing at a customer that ends up not having the intended fixes.

@teo-tsirpanis teo-tsirpanis added area-Infrastructure and removed needs-area-label An area label is needed to ensure this gets routed to the appropriate area owners labels May 25, 2023
@ghost

ghost commented May 25, 2023

Tagging subscribers to this area: @dotnet/runtime-infrastructure
See info in area-owners.md if you want to be subscribed.


@sbomer
Member Author

sbomer commented May 31, 2023

I experimented a bit, and it looks like cmake has logic to re-run configure during the build (make or ninja) if it detects changes to the cmake files. For example, if I introduce a fatal error into some cmake file, it re-configures and fails the build even with the caching. Does that address these concerns, or are there scenarios where it's still a problem? If so, I'd like to understand the issue better.
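For context, this works because cmake writes a regeneration rule into the build files it generates. A cmake-generated build.ninja, for example, contains an edge roughly like the following (paraphrased; exact rule names and paths vary by cmake version), so `ninja` re-runs cmake whenever any listed cmake file is newer than the generated build files:

```
# Paraphrased from a cmake-generated build.ninja; details are illustrative.
rule RERUN_CMAKE
  command = /usr/bin/cmake --regenerate-during-build -S /src -B /src/build

build build.ninja: RERUN_CMAKE | ../CMakeLists.txt ../configure.cmake
```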

@janvorli
Member

janvorli commented Jun 1, 2023

@sbomer I had thought that your change skipped cmake completely when the command line was detected to be unchanged. That would prevent the cmake logic you've mentioned from kicking in, which is exactly the concern I had. Am I missing something, then?

@vargaz
Contributor

vargaz commented Jun 1, 2023

The generated makefiles/ninja files check the cmake files for updates. It can pick up changes to the files themselves, but not environment changes, etc.
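A generic illustration of why environment changes are invisible to this mechanism (the variable name is made up): values pulled from `$ENV{...}` are read once, at configure time, and stored in the cmake cache, so nothing in the generated build files depends on the current shell environment:

```cmake
# Read once, at configure time. Exporting a different value in the shell
# later does not make make/ninja re-run cmake, because no cmake *file*
# changed; only re-running configure (or deleting the cache) picks it up.
set(DEMO_EXTRA_FLAGS "$ENV{DEMO_EXTRA_FLAGS}" CACHE STRING "Extra compiler flags")
message(STATUS "Extra flags: ${DEMO_EXTRA_FLAGS}")
```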

@akoeplinger
Member

The generated makefiles/ninja files check the cmake files for updates. It can pick up changes to the files themselves, but not environment changes, etc.

I think that re-running cmake configure also doesn't pick up environment changes so adding the cache file should have the same behavior.
I guess we could try it for a few weeks and revert it if someone runs into a case we didn't consider?

@am11
Member

am11 commented Jun 19, 2023

I think that re-running cmake configure also doesn't pick up environment changes so adding the cache file should have the same behavior.

It does. If you install a missing dependency and rerun the build command, it works with what we have in main, but not with this hand-rolled cache.

@akoeplinger
Member

akoeplinger commented Jun 19, 2023

@am11 that's a different scenario: in that case the configure step fails before we write the cache file (with this PR), so it would be run again the next time.

@am11
Member

am11 commented Jun 19, 2023

that's a different scenario,

This PR reverts my PR, which fixed exactly that scenario. Also see #65533 for other issues with this kind of custom caching.

@stephentoub
Member

@sbomer, this hasn't been touched in a year. Are you still working on it? Should it be closed?

@sbomer
Member Author

sbomer commented Jul 22, 2024

We can close it.

@sbomer sbomer closed this Jul 22, 2024
@github-actions github-actions bot locked and limited conversation to collaborators Aug 22, 2024