
Inference results lost after multiple codegen invocations #22290

Closed
maleadt opened this issue Jun 8, 2017 · 3 comments
Labels
compiler:codegen Generation of LLVM IR and native code compiler:inference Type inference

Comments


maleadt commented Jun 8, 2017

#21677 seems to have caused a regression in irgen idempotence (verified by bisect):

julia> exec_dummy() = return nothing
exec_dummy (generic function with 1 method)

julia> @code_llvm exec_dummy()

; Function Attrs: sspstrong
define void @julia_exec_dummy_66504() #0 !dbg !5 {
top:
  ret void
}

julia> @code_llvm exec_dummy()

; Function Attrs: sspstrong
define void @julia_exec_dummy_66528() #0 !dbg !5 {
top:
  ret void
}

julia> # third time's a charm
julia> @code_llvm exec_dummy()

; Function Attrs: sspstrong
define i8** @japi1_exec_dummy_66528(i8**, i8***, i32) #0 !dbg !5 {
top:
  %3 = alloca i8***, align 8
  store volatile i8*** %1, i8**** %3, align 8
  ret i8** inttoptr (i64 140093546754056 to i8**)
}

The japi1 wrapper is emitted because, by the third emission, specsig is false, since src->inferred is false as well. This breaks CUDAnative, where we can only handle specialized julia_* signatures. (Note that we use _dump_function with CodegenParams(cached=false) over there.)
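For reference, the idempotence property at stake can be checked programmatically. Below is a minimal sketch using only the standard `InteractiveUtils` tooling; the `llvm_ir` helper is made up for illustration. On a build with this regression, the assertions would fail on the third emission (a `japi1_*` wrapper appears); on a fixed build they should pass.

```julia
using InteractiveUtils

# Capture the generated LLVM IR for a zero-argument function as a string.
llvm_ir(f) = sprint(io -> code_llvm(io, f, Tuple{}))

exec_dummy() = nothing

# Emit three times; each emission should keep the specialized
# julia_* calling convention rather than degrading to a japi1_* wrapper.
irs = [llvm_ir(exec_dummy) for _ in 1:3]
@assert all(occursin("julia_exec_dummy", ir) for ir in irs)
@assert !any(occursin("japi1", ir) for ir in irs)
```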

Sorry for not doing much debugging here myself, but I'm short on time.
cc @vtjnash @jrevels

@maleadt maleadt added compiler:codegen Generation of LLVM IR and native code compiler:inference Type inference labels Jun 8, 2017

vtjnash commented Jun 8, 2017

diff --git a/src/gf.c b/src/gf.c
index 3be37fc..c732014 100644
--- a/src/gf.c
+++ b/src/gf.c
@@ -262,7 +262,7 @@ jl_code_info_t *jl_type_infer(jl_method_instance_t **pli, size_t world, int forc
     li->inInference = 1;
     jl_svec_t *linfo_src_rettype = (jl_svec_t*)jl_apply_with_saved_exception_state(fargs, 3, 0);
     ptls->world_age = last_age;
-    assert((jl_is_method(li->def.method) || li->inInference == 0) && "inference failed on a toplevel expr");
+    li->inInference = 0;
 
     jl_code_info_t *src = NULL;
     if (jl_is_svec(linfo_src_rettype) && jl_svec_len(linfo_src_rettype) == 3 &&


maleadt commented Jun 9, 2017

Great, thanks! Is this good to go as-is? 9ebc8f5


vtjnash commented Jun 9, 2017

Yes
