MethodError no method matching convert(::Type{AssertionError}, ::String) #24951

Closed
staticfloat opened this issue Dec 6, 2017 · 19 comments · Fixed by #27568
Labels: bug (Indicates an unexpected problem or unintended behavior), compiler:inference (Type inference)

Comments

@staticfloat (Member)

After including my Giant Julia Codebase (TM), I get a MethodError when I try to define a docstring:

julia> include("libcough/libcough.jl")
ERROR: LoadError: LoadError: LoadError: MethodError: no method matching convert(::Type{AssertionError}, ::String)
Closest candidates are:
  convert(::Type{Any}, ::ANY) at essentials.jl:28
  convert(::Type{T}, ::T) where T at essentials.jl:29
Stacktrace:
 [1] include_from_node1(::String) at ./loading.jl:576
 [2] include(::String) at ./sysimg.jl:14
 [3] include_from_node1(::String) at ./loading.jl:576
 [4] include(::String) at ./sysimg.jl:14
 [5] include_from_node1(::String) at ./loading.jl:576
 [6] include(::String) at ./sysimg.jl:14
while loading /app/libcough/models/FluxCommon.jl, in expression starting on line 51
while loading /app/libcough/models.jl, in expression starting on line 17
while loading /app/libcough/libcough.jl, in expression starting on line 7

If I take the docstring I am trying to define and paste it into the same REPL session right after this error, I get the same failure, but only the first time:

julia> """
           train_epoch!(model::FluxModel, x, y, other junk...)

       Trains a `Flux` model for a single epoch, printing out average loss and
       accuracy, with a given batch size, printing out progress, etc...
       """
       function train_epoch!() return 1.0; end
ERROR: MethodError: no method matching convert(::Type{AssertionError}, ::String)
Closest candidates are:
  convert(::Type{Any}, ::ANY) at essentials.jl:28
  convert(::Type{T}, ::T) where T at essentials.jl:29

julia> """
           train_epoch!(model::FluxModel, x, y, other junk...)

       Trains a `Flux` model for a single epoch, printing out average loss and
       accuracy, with a given batch size, printing out progress, etc...
       """
       function train_epoch!() return 1.0; end
train_epoch!

This suggests to me that something is getting messed up inside some internal docstring-parsing method, but I don't know how to begin debugging it. This is running on the latest release-0.6:

Julia Version 0.6.2-pre.50
Commit 70b0055be8 (2017-11-30 05:15 UTC)
Platform Info:
  OS: Linux (x86_64-pc-linux-gnu)
  CPU: Intel(R) Core(TM) i5-7600K CPU @ 3.80GHz
  WORD_SIZE: 64
  BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Prescott)
  LAPACK: libopenblas64_
  LIBM: libopenlibm
  LLVM: libLLVM-3.9.1 (ORCJIT, broadwell)
@staticfloat added the bug label on Dec 6, 2017
@staticfloat (Member Author)

It is perhaps helpful to record here that this doesn't happen anymore if I don't import DocOpt (one of my many dependencies).

@ararslan (Member) commented Dec 6, 2017

If I remember correctly, there was a commit in either the 0.6.1 or 0.6.2 backports that was causing this and was caught by PkgEval, but it was reverted before the backports branch was merged into the release branch. Odd that you're still seeing it on the release branch... Do you also see it with the 0.6.1 binary?

@staticfloat (Member Author)

This happens in 0.6.0, 0.6.1, and 0.6.2-pre.50.

@StefanKarpinski (Member)

Why should this conversion exist?

@staticfloat (Member Author)

I'm honestly not sure; I'm having a hard time even tracking down where such a conversion is requested. Because I don't get a backtrace from the error, it's not obvious to me where this is coming from. Somewhere within the docsystem perhaps?

@fredrikekre (Member)

@staticfloat (Member Author) commented Dec 11, 2017

Running this inside of julia-debug gives the following:

julia-debug: /src/julia-release-0.6/src/gf.c:152: jl_specializations_get_linfo: Assertion `linfo->min_world <= sf->min_world && linfo->max_world >= sf->max_world' failed.

signal (6): Aborted
while loading /app/libcough/models/FluxCommon.jl, in expression starting on line 51
raise at /build/glibc-6V9RKT/glibc-2.19/signal/../nptl/sysdeps/unix/sysv/linux/raise.c:56
abort at /build/glibc-6V9RKT/glibc-2.19/stdlib/abort.c:89
__assert_fail_base at /build/glibc-6V9RKT/glibc-2.19/assert/assert.c:92
__assert_fail at /build/glibc-6V9RKT/glibc-2.19/assert/assert.c:101
jl_specializations_get_linfo at /src/julia-release-0.6/src/gf.c:152
code_for_method at ./inference.jl:2423
unknown function (ip: 0x7f596d6e5955)
jl_call_fptr_internal at /src/julia-release-0.6/src/julia_internal.h:339
jl_call_method_internal at /src/julia-release-0.6/src/julia_internal.h:358
jl_apply_generic at /src/julia-release-0.6/src/gf.c:1926
typeinf_edge at ./inference.jl:2510
unknown function (ip: 0x7f596d6e563a)
jl_call_fptr_internal at /src/julia-release-0.6/src/julia_internal.h:339
jl_call_method_internal at /src/julia-release-0.6/src/julia_internal.h:358
jl_apply_generic at /src/julia-release-0.6/src/gf.c:1926
abstract_call_gf_by_type at ./inference.jl:1420
unknown function (ip: 0x7f596d6e2a56)
jl_call_fptr_internal at /src/julia-release-0.6/src/julia_internal.h:339
jl_call_method_internal at /src/julia-release-0.6/src/julia_internal.h:358
jl_apply_generic at /src/julia-release-0.6/src/gf.c:1926
...

Full stack trace here.

@staticfloat (Member Author) commented Dec 16, 2017

@vtjnash I've got gdb open, trying to figure out why this assert would be failing inside of julia-debug. Right now, my `linfo` says that `linfo->min_world` is 1, but `linfo->max_world` is 0. This seems like a problem and should never happen, right?

For comparison, `sf->min_world` is 22377, and `sf->max_world` is 22378.
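
For reference, a minimal Julia restatement of the C assertion quoted above (illustrative only, not code from the Julia sources) shows why these values trip it:

    # The specialization's world range must cover the requested range.
    covers(linfo_min, linfo_max, sf_min, sf_max) =
        linfo_min <= sf_min && linfo_max >= sf_max

    covers(1, 0, 22377, 22378)  # false: [1, 0] is an empty range, so the assert fires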

@staticfloat (Member Author)

Looks like the method specialization being looked at is within PyCall.jl:

(gdb) p (char *)jl_symbol_name(m->name)
$32 = 0x7fffecc26910 "<="
(gdb) p (char *)jl_symbol_name(m->file)
$33 = 0x7fffece1ff30 "/root/.julia/v0.6/PyCall/src/pyoperators.jl"

@staticfloat (Member Author)

My best guess as to why this is happening is that we're hitting this line for a method and then using it afterward. To be honest, I don't really understand what's going on here.

@iamed2 (Contributor) commented Jun 4, 2018

This is happening to us and it is a massive problem. Here is an example of what one of our researchers experienced:

               _
   _       _ _(_)_     |  A fresh approach to technical computing
  (_)     | (_) (_)    |  Documentation: https://docs.julialang.org
   _ _   _| |_  __ _   |  Type "?help" for help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 0.6.3 (2018-05-28 20:20 UTC)
 _/ |\__'_|_|_|\__'_|  |  Official http://julialang.org/ release
|__/                   |  x86_64-apple-darwin14.5.0

julia> using GPForecasting
[warn | HelloBatch]: Could not load resource file
ERROR: LoadError: LoadError: MethodError: no method matching convert(::Type{AssertionError}, ::String)
Closest candidates are:
  convert(::Type{Any}, ::ANY) at essentials.jl:28
  convert(::Type{T}, ::T) where T at essentials.jl:29
Stacktrace:
 [1] include_from_node1(::String) at ./loading.jl:576
 [2] include(::String) at ./sysimg.jl:14
 [3] include_from_node1(::String) at ./loading.jl:576
 [4] eval(::Module, ::Any) at ./boot.jl:235
 [5] _require(::Symbol) at ./loading.jl:490
 [6] require(::Symbol) at ./loading.jl:405
while loading /Users/ericperim/GIT_Repos/GPForecasting.jl/src/datahandling.jl, in expression starting on line 5
while loading /Users/ericperim/GIT_Repos/GPForecasting.jl/src/GPForecasting.jl, in expression starting on line 86

julia> exit()
MethodError(Core.Inference.convert, (AssertionError, "invalid age range update"), 0x0000000000000ac5)

This shows up with LoadError stack traces pointing all over the place; it seems unrelated to any particular line of our code. Unfortunately, the package is private, so we can't share it.

Here are all the packages REQUIREd:

Distances 0.6.0
LineSearches 3.2.5
Optim 0.14.1
ModelAnalysis 0.0.0  # private
PyCall 1.15.0
Memento 0.5.0
Nabla 0.1.0
FDM 0.1.0
IterTools 0.2.1
ArgParse 0.5.0
DataFrames 0.11.5

@ararslan (Member) commented Jun 4, 2018

If someone can figure out how to fix this before 0.7 is released, I'll do a 0.6.4 to include the fix.

@Keno (Member) commented Jun 7, 2018

Has either of you checked whether #26514 fixes this bug?

@iamed2 (Contributor) commented Jun 7, 2018

I have not

@iamed2 (Contributor) commented Jun 7, 2018

I have now tried it; it does not fix the bug.

However, this error does appear differently on my system:

               _
   _       _ _(_)_     |  A fresh approach to technical computing
  (_)     | (_) (_)    |  Documentation: https://docs.julialang.org
   _ _   _| |_  __ _   |  Type "?help" for help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 0.6.4-pre.1 (2018-06-07 15:56 UTC)
 _/ |\__'_|_|_|\__'_|  |  ed/world-age-cherrypick/58769d3116 (fork: 1 commits, 7 days)
|__/                   |  x86_64-apple-darwin17.6.0

julia> using GPForecasting
INFO: Recompiling stale cache file /Users/ericdavies/.playground/share/invenia/packages/lib/v0.6/Nabla.ji for module Nabla.
INFO: Recompiling stale cache file /Users/ericdavies/.playground/share/invenia/packages/lib/v0.6/LineSearches.ji for module LineSearches.
INFO: Recompiling stale cache file /Users/ericdavies/.playground/share/invenia/packages/lib/v0.6/Optim.ji for module Optim.
INFO: Recompiling stale cache file /Users/ericdavies/.playground/share/invenia/packages/lib/v0.6/PyCall.ji for module PyCall.
INFO: Recompiling stale cache file /Users/ericdavies/.playground/share/invenia/packages/lib/v0.6/Memento.ji for module Memento.
INFO: Recompiling stale cache file /Users/ericdavies/.playground/share/invenia/packages/lib/v0.6/DataFrames.ji for module DataFrames.
INFO: Recompiling stale cache file /Users/ericdavies/.playground/share/invenia/packages/lib/v0.6/Distances.ji for module Distances.
INFO: Recompiling stale cache file /Users/ericdavies/.playground/share/invenia/packages/lib/v0.6/ModelAnalysis.ji for module ModelAnalysis.
WARNING: config(args...; kwargs...) is deprecated, use config!(args...; kwargs...) instead.
Stacktrace:
 [1] depwarn(::String, ::Symbol) at ./deprecated.jl:70
 [2] #config#82(::Array{Any,1}, ::Function, ::String, ::Vararg{String,N} where N) at ./deprecated.jl:57
 [3] (::Memento.#kw##config)(::Array{Any,1}, ::Memento.#config, ::String, ::Vararg{String,N} where N) at ./<missing>:0
 [4] include_from_node1(::String) at ./loading.jl:576
 [5] include(::String) at ./sysimg.jl:14
 [6] anonymous at ./<missing>:2
 [7] eval(::Module, ::Any) at ./boot.jl:235
 [8] process_options(::Base.JLOptions) at ./client.jl:286
 [9] _start() at ./client.jl:371
while loading /Users/ericdavies/.playground/share/invenia/packages/v0.6/HelloBatch/src/HelloBatch.jl, in expression starting on line 24
[warn | HelloBatch]: Could not load resource file
[warn | HelloBatch]: Could not load resource file
ERROR: LoadError: LoadError: MethodError: no method matching convert(::Type{AssertionError}, ::String)
Closest candidates are:
  convert(::Type{Any}, ::ANY) at essentials.jl:28
  convert(::Type{T}, ::T) where T at essentials.jl:29
Stacktrace:
 [1] include_from_node1(::String) at ./loading.jl:576
 [2] include(::String) at ./sysimg.jl:14
 [3] include_from_node1(::String) at ./loading.jl:576
 [4] eval(::Module, ::Any) at ./boot.jl:235
 [5] _require(::Symbol) at ./loading.jl:490
 [6] require(::Symbol) at ./loading.jl:405
while loading /Users/ericdavies/.playground/share/invenia/packages/v0.6/GPForecasting/src/datahandling.jl, in expression starting on line 5
while loading /Users/ericdavies/.playground/share/invenia/packages/v0.6/GPForecasting/src/GPForecasting.jl, in expression starting on line 86

julia> quit()
ERROR: MethodError: no method matching convert(::Type{AssertionError}, ::String)
Closest candidates are:
  convert(::Type{Any}, ::ANY) at essentials.jl:28
  convert(::Type{T}, ::T) where T at essentials.jl:29
Stacktrace:
 [1] AssertionError(::String) at ./coreimg.jl:14
 [2] update_valid_age!(::UInt64, ::UInt64, ::Core.Inference.InferenceState) at ./inference.jl:2353
 [3] add_backedge!(::Core.MethodInstance, ::Core.Inference.InferenceState) at ./inference.jl:2366
 [4] abstract_call_gf_by_type(::Any, ::Any, ::Core.Inference.InferenceState) at ./inference.jl:1421
 [5] abstract_call(::Any, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1897
 [6] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1927
 [7] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [8] (::Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState})(::Expr) at ./<missing>:0
 [9] next(::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}, ::Int64) at ./generator.jl:45
 [10] copy!(::Array{Any,1}, ::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}) at ./abstractarray.jl:573
 [11] _collect(::Type{Any}, ::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}, ::Core.Inference.HasShape) at ./array.jl:391
 [12] collect(::Type{Any}, ::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}) at ./array.jl:388
 [13] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1901
 [14] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [15] typeinf_work(::Core.Inference.InferenceState) at ./inference.jl:2722
 [16] typeinf(::Core.Inference.InferenceState) at ./inference.jl:2787
 [17] typeinf_edge(::Method, ::Any, ::SimpleVector, ::Core.Inference.InferenceState) at ./inference.jl:2535
 [18] abstract_call_gf_by_type(::Any, ::Any, ::Core.Inference.InferenceState) at ./inference.jl:1420
 [19] abstract_call(::Any, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1897
 [20] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1927
 [21] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [22] (::Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState})(::Expr) at ./<missing>:0
 [23] next(::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}, ::Int64) at ./generator.jl:45
 [24] copy!(::Array{Any,1}, ::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}) at ./abstractarray.jl:573
 [25] _collect(::Type{Any}, ::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}, ::Core.Inference.HasShape) at ./array.jl:391
 [26] collect(::Type{Any}, ::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}) at ./array.jl:388
 [27] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1901
 [28] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [29] typeinf_work(::Core.Inference.InferenceState) at ./inference.jl:2722
 [30] typeinf(::Core.Inference.InferenceState) at ./inference.jl:2787
 [31] typeinf_edge(::Method, ::Any, ::SimpleVector, ::Core.Inference.InferenceState) at ./inference.jl:2535
 [32] abstract_call_gf_by_type(::Any, ::Any, ::Core.Inference.InferenceState) at ./inference.jl:1420
 [33] abstract_call(::Any, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1897
 [34] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1927
 [35] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [36] (::Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState})(::Expr) at ./<missing>:0
 [37] next(::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}, ::Int64) at ./generator.jl:45
 [38] copy!(::Array{Any,1}, ::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}) at ./abstractarray.jl:573
 [39] _collect(::Type{Any}, ::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}, ::Core.Inference.HasShape) at ./array.jl:391
 [40] collect(::Type{Any}, ::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}) at ./array.jl:388
 [41] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1901
 [42] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [43] typeinf_work(::Core.Inference.InferenceState) at ./inference.jl:2722
 [44] typeinf(::Core.Inference.InferenceState) at ./inference.jl:2787
 [45] typeinf_edge(::Method, ::Any, ::SimpleVector, ::Core.Inference.InferenceState) at ./inference.jl:2535
 [46] abstract_call_gf_by_type(::Any, ::Any, ::Core.Inference.InferenceState) at ./inference.jl:1420
 [47] abstract_call(::Any, ::Tuple{}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1897
 [48] abstract_apply(::Any, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1561
 [49] abstract_call(::Any, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1689
 [50] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1927
 [51] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [52] typeinf_work(::Core.Inference.InferenceState) at ./inference.jl:2722
 [53] typeinf(::Core.Inference.InferenceState) at ./inference.jl:2787
 [54] typeinf_edge(::Method, ::Any, ::SimpleVector, ::Core.Inference.InferenceState) at ./inference.jl:2535
 [55] abstract_call_gf_by_type(::Any, ::Any, ::Core.Inference.InferenceState) at ./inference.jl:1420
 [56] abstract_call(::Any, ::Tuple{}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1897
 [57] abstract_apply(::Any, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1561
 [58] abstract_call(::Any, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1689
 [59] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1927
 [60] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [61] typeinf_work(::Core.Inference.InferenceState) at ./inference.jl:2722
 [62] typeinf(::Core.Inference.InferenceState) at ./inference.jl:2787
 [63] typeinf_edge(::Method, ::Any, ::SimpleVector, ::Core.Inference.InferenceState) at ./inference.jl:2535
 [64] abstract_call_gf_by_type(::Any, ::Any, ::Core.Inference.InferenceState) at ./inference.jl:1420
 [65] abstract_call(::Any, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1897
 [66] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1927
 [67] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [68] (::Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState})(::Expr) at ./<missing>:0
 [69] next(::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}, ::Int64) at ./generator.jl:45
 [70] copy!(::Array{Any,1}, ::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}) at ./abstractarray.jl:573
 [71] _collect(::Type{Any}, ::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}, ::Core.Inference.HasShape) at ./array.jl:391
 [72] collect(::Type{Any}, ::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}) at ./array.jl:388
 [73] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1901
 [74] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [75] typeinf_work(::Core.Inference.InferenceState) at ./inference.jl:2722
 [76] typeinf(::Core.Inference.InferenceState) at ./inference.jl:2787
 [77] typeinf_edge(::Method, ::Any, ::SimpleVector, ::Core.Inference.InferenceState) at ./inference.jl:2535
 [78] abstract_call_gf_by_type(::Any, ::Any, ::Core.Inference.InferenceState) at ./inference.jl:1420
 [79] abstract_call(::Any, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1897
 [80] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1927
 [81] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [82] typeinf_work(::Core.Inference.InferenceState) at ./inference.jl:2722
 [83] typeinf(::Core.Inference.InferenceState) at ./inference.jl:2787
 [84] typeinf_edge(::Method, ::Any, ::SimpleVector, ::Core.Inference.InferenceState) at ./inference.jl:2535
 [85] abstract_call_gf_by_type(::Any, ::Any, ::Core.Inference.InferenceState) at ./inference.jl:1420
 [86] abstract_call(::Any, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1897
 [87] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1927
 [88] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [89] abstract_interpret(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:2076
 [90] typeinf_work(::Core.Inference.InferenceState) at ./inference.jl:2669
 [91] typeinf(::Core.Inference.InferenceState) at ./inference.jl:2787
 [92] typeinf_edge(::Method, ::Any, ::SimpleVector, ::Core.Inference.InferenceState) at ./inference.jl:2535
 [93] abstract_call_gf_by_type(::Any, ::Any, ::Core.Inference.InferenceState) at ./inference.jl:1420
 [94] abstract_call(::Any, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1897
 [95] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1927
 [96] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [97] abstract_interpret(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:2076
 [98] typeinf_work(::Core.Inference.InferenceState) at ./inference.jl:2669
 [99] typeinf(::Core.Inference.InferenceState) at ./inference.jl:2787
 [100] typeinf_edge(::Method, ::Any, ::SimpleVector, ::Core.Inference.InferenceState) at ./inference.jl:2535
 [101] abstract_call_gf_by_type(::Any, ::Any, ::Core.Inference.InferenceState) at ./inference.jl:1420
 [102] abstract_call(::Any, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1897
 [103] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1927
 [104] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [105] (::Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState})(::Expr) at ./<missing>:0
 [106] next(::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}, ::Int64) at ./generator.jl:45
 [107] copy!(::Array{Any,1}, ::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}) at ./abstractarray.jl:573
 [108] _collect(::Type{Any}, ::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}, ::Core.Inference.HasShape) at ./array.jl:391
 [109] collect(::Type{Any}, ::Core.Inference.Generator{Array{Any,1},Core.Inference.##191#192{Array{Any,1},Core.Inference.InferenceState}}) at ./array.jl:388
 [110] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1901
 [111] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [112] typeinf_work(::Core.Inference.InferenceState) at ./inference.jl:2722
 [113] typeinf(::Core.Inference.InferenceState) at ./inference.jl:2787
 [114] typeinf_edge(::Method, ::Any, ::SimpleVector, ::Core.Inference.InferenceState) at ./inference.jl:2535
 [115] abstract_call_gf_by_type(::Any, ::Any, ::Core.Inference.InferenceState) at ./inference.jl:1420
 [116] abstract_call(::Any, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1897
 [117] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1927
 [118] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [119] typeinf_work(::Core.Inference.InferenceState) at ./inference.jl:2722
 [120] typeinf(::Core.Inference.InferenceState) at ./inference.jl:2787
 [121] typeinf_edge(::Method, ::Any, ::SimpleVector, ::Core.Inference.InferenceState) at ./inference.jl:2535
 [122] abstract_call_gf_by_type(::Any, ::Any, ::Core.Inference.InferenceState) at ./inference.jl:1420
 [123] abstract_call(::Any, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1897
 [124] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1927
 [125] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [126] typeinf_work(::Core.Inference.InferenceState) at ./inference.jl:2722
 [127] typeinf(::Core.Inference.InferenceState) at ./inference.jl:2787
 [128] typeinf_edge(::Method, ::Any, ::SimpleVector, ::Core.Inference.InferenceState) at ./inference.jl:2535
 [129] abstract_call_gf_by_type(::Any, ::Any, ::Core.Inference.InferenceState) at ./inference.jl:1420
 [130] abstract_call(::Any, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1897
 [131] abstract_eval_call(::Expr, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1927
 [132] abstract_eval(::Any, ::Array{Any,1}, ::Core.Inference.InferenceState) at ./inference.jl:1950
 [133] typeinf_work(::Core.Inference.InferenceState) at ./inference.jl:2722
 [134] typeinf(::Core.Inference.InferenceState) at ./inference.jl:2787
 [135] typeinf_frame(::Core.MethodInstance, ::Bool, ::Bool, ::Core.Inference.InferenceParams) at ./inference.jl:2504
 [136] typeinf_code(::Core.MethodInstance, ::Bool, ::Bool, ::Core.Inference.InferenceParams) at ./inference.jl:2583
 [137] typeinf_ext(::Core.MethodInstance, ::UInt64) at ./inference.jl:2622
MethodError(Core.Inference.convert, (AssertionError, "invalid age range update"), 0x0000000000000ac5)

It also does not appear with `--compilecache=no`.
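
The trace above bottoms out in the world-age validity check inside inference (update_valid_age!, message "invalid age range update"). A rough sketch of the kind of check involved follows; the struct and function bodies are illustrative stand-ins, not the actual Core.Inference source:

    # Illustrative sketch only. A failing @assert with a string message calls
    # AssertionError(msg); in the stripped-down Core.Inference image the trace
    # suggests that call routes through convert, which has no matching String
    # method there, so the assertion failure surfaces as the MethodError above.
    mutable struct FrameSketch
        min_valid::UInt
        max_valid::UInt
    end

    function update_valid_age_sketch!(min_valid::UInt, max_valid::UInt, frame::FrameSketch)
        frame.min_valid = max(frame.min_valid, min_valid)
        frame.max_valid = min(frame.max_valid, max_valid)
        @assert frame.min_valid <= frame.max_valid "invalid age range update"
        return nothing
    end

    # A corrupted (empty) incoming world range makes the intersection empty:
    update_valid_age_sketch!(UInt(1), UInt(0), FrameSketch(UInt(22377), UInt(22378)))  # assertion fails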

@JeffBezanson added the compiler:inference label on Jun 7, 2018
@JeffBezanson added this to the 1.0.x milestone on Jun 7, 2018
Keno added a commit that referenced this issue on Jun 14, 2018
This possibly fixes #24951 (or at least the test case by iamed2).
We believe the original code here meant to say either:

    ((jl_typemap_entry_t*)v)->min_world = ((jl_typemap_entry_t*)v)->max_world + 1;

or

    ((jl_typemap_entry_t*)v)->max_world = ((jl_typemap_entry_t*)v)->min_world - 1;

i.e. set the range of applicable worlds to be empty. What happened instead
was that the typemap entry that was supposed to be deleted became valid
for one particular world and that world only. Thus any code running in that
particular world may try to access the deleted typemap entry (or add a backedge
to it), causing either incorrect behavior or the assertion failure noted
in the issue. One additional complication is that these world ages are being
deserialized, i.e. they may be larger than the currently possible max world age.
This makes the failure slightly more likely to happen, since the current process
may work its way up to that world age and execute some code.

In any case, there's not much value in keeping the deserialized max or min
world around, so just mark them as 1:0, as we do for other deleted entries.

Co-authored-by: Jameson Nash <jameson@juliacomputing.com>
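
In other words (a small illustrative check, not code from the patch): an entry with an empty world range is never applicable, while the buggy single-world range keeps it live in exactly one world.

    # Illustrative only: a typemap entry applies in world `w` when
    # min_world <= w <= max_world.
    applies(min_world, max_world, w) = min_world <= w <= max_world

    applies(1, 0, 7)  # false: empty range, never applicable (the intended "deleted" state)
    applies(7, 7, 7)  # true:  still applicable in exactly world 7 (the bug described above)
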
vtjnash pushed a commit that referenced this issue on Jun 14, 2018 (same commit message as above).
vtjnash added a commit that referenced this issue on Jun 14, 2018 (same commit message as above).
Keno added a commit that referenced this issue on Jun 16, 2018 (same commit message as above).
@staticfloat (Member Author)

Unfortunately I can no longer trigger this; my codebase has evolved too much. :(

@cossio (Contributor) commented Jun 16, 2018

@Keno This is happening in Julia v0.6.3. Reopen?

@KristofferC (Member)

No, this is fixed; it just needs to be backported.

ararslan pushed a commit that referenced this issue on Jun 17, 2018 (same commit message as above, with the following note):

NOTE: This backported commit EXCLUDES additional assertions made by
vtjnash.

(Cherry-picked from commit d9b10f0)
@cossio (Contributor) commented Jun 19, 2018

In case someone else is running into this bug with Julia v0.6.3: a quick and dirty way to get your code running is to start Julia with the option `julia --compilecache=no`. Thanks to @omus.

This should be resolved soon in v0.6.4.
