I heard you like inferring types but don't like global state... #21677
Conversation
Cool! Does this do anything to avoid using a very large amount of stack space?
base/inference.jl (Outdated)

```julia
const nactive = Array{Int,0}()
nactive[] = 0
const workq = Vector{InferenceState}() # set of InferenceState objects that can make immediate progress
```

None! There is none! :)
:-)
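As an aside for readers of the deleted lines above: a zero-dimensional `Array` is a common Julia idiom for a mutable global counter, because a `const` global binding can't be reassigned while the array's contents can. A minimal sketch of the idiom (using an illustrative name, not the deleted `nactive` itself):

```julia
# A `const` global can't be rebound, but a 0-dimensional array's single cell
# is mutable, so it serves as a type-stable global counter -- the idiom the
# deleted `nactive` code used.
const counter = Array{Int,0}()  # uninitialized 0-d array: one Int cell
counter[] = 0                   # indexing with no indices reads/writes the cell
counter[] += 1
```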
```julia
        end
    end

    # limit argument type size growth
    if mightlimitlength || mightlimitdepth
        # TODO: FIXME: this heuristic depends on non-local state making type-inference unpredictable
```
Does this change fix this issue?
Unfortunately not, but this PR does enable a code rewrite to fix the issue. This is also relevant to your other question:

> Does this do anything to avoid using a very large amount of stack space?

In short, nope. @vtjnash believes that the future rewrite of the call recursion limit code here will help with stack space usage.
base/inference.jl (Outdated)

```julia
    return frame
end

function in_egal(item, collection)
```
Already exists in this file; called `contains_is`.
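For context on that naming note: an identity-based membership test differs from plain `in`, which compares elements with `==`. A minimal sketch of what a `contains_is`-style helper does (illustrative, not copied from Base):

```julia
# Membership by object identity (`===`) rather than `==`, mirroring the
# behavior of the proposed `in_egal` / the existing `contains_is`.
function in_egal(item, collection)
    for x in collection
        x === item && return true
    end
    return false
end
```

For example, `in_egal(1, Any[1.0])` is false even though `1 == 1.0`, because `===` distinguishes an `Int` from a `Float64`.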
Force-pushed from ad3534e to c260e1c.
Looks like Line 150 in 970a742
That would be crazy! I removed the now-invalid reference to the work queue (and kept the rest of that tiny module), so I guess we'll see what happens. Let's see if this can make it through a whole BaseBenchmarks run: @nanosoldier
Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels
This seems to be going so smoothly, it's almost scary.
Could anybody hazard a guess as to what's causing the
It seems like something is failing to drop a gc-reference (https://ci.appveyor.com/project/JuliaLang/julia/build/1.0.13610/job/iex595jdaqb6w9og#L2826)
base/inference.jl (Outdated)

```diff
@@ -296,10 +305,7 @@ end
 #### current global inference state ####
```
Seriously though, would be nice to just delete this section :)
base/inference.jl (Outdated)

```julia
        end
    end

function resolve_call_cycle!(linfo::MethodInstance, parent::InferenceState)
```
This function could use a comment. Not obvious to me what `resolve` means in this context.
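To make the intent concrete for other readers, here is my understanding of "resolve" in a hedged toy form: detect that `linfo` is already being inferred somewhere up the caller chain, which signals a call cycle. The stand-in types and names below are invented for illustration, not the real `InferenceState` code:

```julia
# Walk the parent links of an in-progress frame, looking for one that is
# already inferring the same method instance; returning it signals a cycle.
struct Frame
    linfo::Symbol                  # stand-in for a MethodInstance
    parent::Union{Frame, Nothing}  # caller frame, or nothing at the root
end

function find_in_callers(linfo::Symbol, frame::Union{Frame, Nothing})
    while frame !== nothing
        frame.linfo === linfo && return frame  # cycle found
        frame = frame.parent
    end
    return nothing  # no cycle: safe to start a fresh frame
end
```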
base/inference.jl (Outdated)

```julia
    return nothing
function merge_call_chain!(parent::InferenceState, ancestor::InferenceState, child::InferenceState)
    # add backedge of parent <- child
    # then add all backeges of parent <- parent.parent
```
backedges
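For the curious, the shape of what that merge does, in a hedged toy form with invented names (the real code operates on `InferenceState` frames):

```julia
# Toy model: walking from `parent` up to `ancestor`, record at each frame a
# backedge to the frame that called into it, linking the whole chain into one
# cycle for later re-propagation of inferred types.
mutable struct Node
    id::Symbol
    parent::Union{Node, Nothing}
    backedges::Vector{Symbol}
end
Node(id, parent=nothing) = Node(id, parent, Symbol[])

function merge_chain!(parent, ancestor, child)
    while true
        push!(parent.backedges, child.id)  # backedge: parent <- child
        parent === ancestor && break
        child = parent
        parent = parent.parent
    end
end
```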
base/inference.jl (Outdated)

```julia
    src = ccall(:jl_uncompress_ast, Any, (Any, Any), li.def, li.def.source)
function InferenceState(linfo::MethodInstance,
                        optimize::Bool, cached::Bool, params::InferenceParams)
    # prepare an InferenceState object for infering lambda
```
inferring
base/sysimg.jl (Outdated)

```diff
@@ -73,6 +72,9 @@ include("refpointer.jl")
 include("checked.jl")
 importall .Checked

+# buggy handling of ispure in type-inference means this should be after re-defining the basic operations that they might try to call
```
line wrap comment
We could still use devdocs explaining what a backedge is (general comment and lingering todo from the last rewrite; not necessary for this PR to solve).
Made the requested clean-ups and added a comment describing
AFAIK, a backedge is a reference from one frame to another (within a call graph) that facilitates the reverse-propagation of type information. I'll leave the actual documentation to Jameson, though I imagine that could be done in a separate PR.
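In the meantime, here's my hedged mental model as a toy (names and tables invented for illustration, not Base internals): a backedge records which callers must be revisited when a callee's inferred type widens.

```julia
# Toy backedge table: when `f`'s inferred return type widens, return the
# callers recorded against `f` so a driver could re-infer them.
function widen!(results::Dict{Symbol,Any}, backedges::Dict{Symbol,Vector{Symbol}},
                f::Symbol, t::Type)
    old = get(results, f, Union{})       # Union{} is the bottom type
    newt = Union{old, t}
    newt === old && return Symbol[]      # no change: nothing to revisit
    results[f] = newt
    return get(backedges, f, Symbol[])   # frames reached by backedges from f
end
```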
There are several different types. An
Will this one still make it into v0.6-final? (it fixes some problems I'm having with intermittent (i.e. unit tests failing
Can you contribute base tests that were fixed by this?
Removes the global work queue, which allows increasing the precision of cycle detection and resolution, and decreases the need for the threading synchronization lock surrounding inference. Updates `inInference` flag usage to be merely a hint for `jl_type_infer` not to bother trying to infer a method (helps avoid accidental infinite recursion over inferring type inference). Enables inferring inference. Ref #21677. (cherry picked from commit 5847317)
...so @vtjnash and I removed the work queue by refactoring inference to run recursively.
I'm not sure yet how far this will make it through CI. @vtjnash will probably be able to answer any algorithmic questions better than I can; this is my first foray into the dangerous world of type inference. I mainly sat there like a monkey poking at the keyboard.
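To illustrate (very loosely, with a toy static call graph rather than anything from `inference.jl`) what "run recursively" means versus the old queue:

```julia
# Toy driver: "infer" each function by recursing into its callees first,
# using an `active` set to cut call cycles -- no global work queue needed.
function infer_recursive!(graph::Dict{Symbol,Vector{Symbol}}, f::Symbol,
                          active::Set{Symbol}, done::Vector{Symbol})
    (f in active || f in done) && return done  # cycle, or already finished
    push!(active, f)
    for callee in get(graph, f, Symbol[])
        infer_recursive!(graph, callee, active, done)
    end
    delete!(active, f)
    push!(done, f)  # callees are finished before their callers
    return done
end
```

The old approach instead pushed `InferenceState` frames onto a global `workq` and looped until it drained, which is what made cycle bookkeeping non-local.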
Besides reducing dependence on global state, this PR provides the following goodies:
This PR enables the following future work:
@vtjnash is sad that he has to write a new blog post now.