
inference: stop re-converging worlds after optimization #38820

Merged
vtjnash merged 1 commit into master from jn/infer-worlds on Dec 14, 2020
Conversation

vtjnash (Sponsor Member) commented Dec 10, 2020

The validity did not change, so we should not need to update it. This
also ensures we copy over all result information earlier, so we can
destroy the InferenceState slightly sooner, and it yields slightly cleaner data flow.

This is preparatory work for further reorganization of how optimization choices are ordered (#38231).
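For context, here is a minimal sketch of the idea (simplified types defined locally for illustration, not the compiler's actual implementation in `Core.Compiler`): an inference result carries a range of world ages for which it is valid, and a result whose range already extends to `typemax(UInt)` is valid in every current and future world, so optimization cannot change its validity and there is nothing to re-converge.

```julia
# Illustrative only: a world range records the span of "world ages"
# for which an inference result is known to be valid.
struct WorldRange
    min_world::UInt
    max_world::UInt
end

Base.last(wr::WorldRange) = wr.max_world

# A result valid through typemax(UInt) stays valid in every future world,
# so its validity needs no update after optimization.
is_fully_valid(valid_worlds::WorldRange) = last(valid_worlds) == typemax(UInt)

@assert is_fully_valid(WorldRange(UInt(1), typemax(UInt)))
@assert !is_fully_valid(WorldRange(UInt(1), UInt(100)))
```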

vtjnash requested a review from Keno on December 10, 2020 at 20:44
Keno (Member) commented Dec 12, 2020

LGTM in general, but I had earlier tried a change like this and saw measurable regressions in sysimg build time. Does that happen here also?

```diff
 end
-if last(valid_worlds) == typemax(UInt)
+if doopt && last(valid_worlds) == typemax(UInt)
```
Keno (Member) commented on the diff:

Why not move this up into the previous loop at this point?

vtjnash (Sponsor Member, Author) replied:

This step might require taking a lock (since we're updating a few different caches, and those updates all need to happen at once), so I like having them appear separately, though it's not essential.
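A hypothetical sketch of the pattern being described (the names and types are illustrative, not the compiler's actual code): the point is that all the cache writes are published together under a single lock, which is easier to see when the step stays separate from the per-result loop.

```julia
# Illustrative only: publish one result into several caches atomically.
const cache_lock = ReentrantLock()

function publish_result!(caches::Vector{<:AbstractDict}, key, result)
    lock(cache_lock) do
        for cache in caches
            cache[key] = result  # all updates become visible together
        end
    end
end

# Usage: both caches observe the new entry at the same time.
local_cache = Dict{Symbol,Any}()
global_cache = Dict{Symbol,Any}()
publish_result!([local_cache, global_cache], :f, "inferred result")
```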

vtjnash (Sponsor Member, Author) commented Dec 12, 2020

I instrumented this to show that, while building the system image, it never made a difference to the world ages we cached. It also seems to run at about the same speed.

Keno merged commit 8c01444 into master on Dec 14, 2020
Keno deleted the jn/infer-worlds branch on December 14, 2020
aviatesk added a commit to aviatesk/JET.jl that referenced this pull request Dec 21, 2020
aviatesk added a commit to aviatesk/JET.jl that referenced this pull request Dec 21, 2020
KristofferC pushed a commit that referenced this pull request Jan 21, 2021
vtjnash added a commit that referenced this pull request Jan 21, 2021
KristofferC added the "backport 1.6" label on Jan 21, 2021
KristofferC mentioned this pull request on Jan 21, 2021
KristofferC removed the "backport 1.6" label on Feb 1, 2021
KristofferC pushed a commit that referenced this pull request Feb 1, 2021
ElOceanografo pushed a commit to ElOceanografo/julia that referenced this pull request May 4, 2021
staticfloat pushed a commit that referenced this pull request Dec 23, 2022