Cache lookup of matching methods between inference and inlining #34339

Merged: 1 commit, merged into JuliaLang:master on Mar 10, 2020

Conversation

@yhls (Contributor) commented Jan 10, 2020

This reduces time spent in METHOD_MATCH by roughly 40%, as measured with `#define ENABLE_TIMINGS` in `options.h`. I found that attempting to look up methods in the cache during inference, rather than only adding to it at that point, gave slightly better performance.

We save ~2% of the time compiling the sysimage, for which METHOD_MATCH currently takes ~5% of the time. We save about 7% of the time for `julia -e "using Plots; using DataFrames; display(plot(rand(10), rand(10)))"`, which currently spends ~20% of its time in INFERENCE (which includes inlining) and ~15% in METHOD_MATCH.
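To make the caching pattern concrete, here is a minimal, self-contained sketch; `MatchCache`, `cached_matching_methods`, and `toy_lookup` are illustrative names, and the `lookup` argument stands in for the compiler's actual `_methods_by_ftype` call, whose real signature is not reproduced here.

```julia
# Sketch only (not the PR's actual code): cache method-match results keyed by the
# call signature type, together with the world-age bounds (min_valid, max_valid)
# for which the matches remain valid, so inlining can reuse inference's lookups.
const MatchCache = IdDict{Any, Tuple{Any, UInt, UInt}}

# `lookup` stands in for the real method search; it must return
# `(matches, min_valid, max_valid)` for the given signature type.
function cached_matching_methods(lookup, cache::MatchCache, @nospecialize(atype))
    haskey(cache, atype) && return cache[atype]
    result = lookup(atype)::Tuple{Any, UInt, UInt}
    cache[atype] = result
    return result
end

# Toy usage: the first call performs the search and stores it; the second call,
# e.g. issued later during inlining, hits the cache for the same signature type.
cache = MatchCache()
toy_lookup(sig) = (collect(methods(sin)), typemin(UInt), typemax(UInt))
cached_matching_methods(toy_lookup, cache, Tuple{typeof(sin), Float64})
cached_matching_methods(toy_lookup, cache, Tuple{typeof(sin), Float64})  # cache hit
```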

@JeffBezanson added the compiler:latency (Compiler latency) label on Jan 10, 2020
@yhls force-pushed the yhls/cache-matching-methods branch from 1400c1e to e67b3e5 on January 15, 2020
@yhls force-pushed the yhls/cache-matching-methods branch 2 times, most recently from 1610660 to 5b97352, on February 26, 2020
@@ -41,6 +41,10 @@ mutable struct InferenceState
inferred::Bool
dont_work_on_me::Bool

# cached results of calling `_methods_by_ftype`, including `min_valid` and
# `max_valid`, to be used in inlining
matching_methods_cache::IdDict{DataType, Tuple{Any, UInt, UInt}}
A Member left a review comment on this line, suggesting a change:

-matching_methods_cache::IdDict{DataType, Tuple{Any, UInt, UInt}}
+matching_methods_cache::IdDict{Any, Tuple{Any, UInt, UInt}}

@vtjnash what do you think?

Commit message:

The results of calling `_methods_by_ftype` in inference are now kept in
`InferenceState` objects, forwarded to `OptimizationState`s, and used in
inlining.
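As a rough picture of that flow, here is a toy sketch; the struct names and single fields are placeholders, and the real `InferenceState` and `OptimizationState` carry many more fields.

```julia
# Sketch only: the cache lives on the inference-side state, and the very same
# dictionary (no copy) is handed to the optimization-side state, so inlining
# sees every entry that inference already populated.
struct ToyInferenceState
    matching_methods_cache::IdDict{Any, Tuple{Any, UInt, UInt}}
    ToyInferenceState() = new(IdDict{Any, Tuple{Any, UInt, UInt}}())
end

struct ToyOptimizationState
    matching_methods_cache::IdDict{Any, Tuple{Any, UInt, UInt}}
end

# Forwarding: build the optimizer state from the inference frame's cache.
ToyOptimizationState(frame::ToyInferenceState) =
    ToyOptimizationState(frame.matching_methods_cache)
```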
@yhls force-pushed the yhls/cache-matching-methods branch from 5b97352 to 2253e6f on March 6, 2020
@yhls (Contributor, Author) commented Mar 6, 2020

By the latest measurements, we save about 8% of the time for `julia -e "using Plots; using DataFrames; display(plot(rand(10), rand(10)))"` overall, and the display part by itself is 2% faster.

@JeffBezanson added the compiler:inference (Type inference) label on Mar 10, 2020
@JeffBezanson merged commit 4085873 into JuliaLang:master on Mar 10, 2020
ravibitsgoa pushed a commit to ravibitsgoa/julia that referenced this pull request on Apr 9, 2020

KristofferC pushed a commit that referenced this pull request on Apr 11, 2020
Labels: compiler:inference (Type inference), compiler:latency (Compiler latency)

2 participants