
Reduce codegen lock scope #46836

Merged
merged 2 commits into from
Dec 13, 2022
Conversation

pchintalapudi (Member):

Depends on #46825 for type inference lock removal.

@pchintalapudi pchintalapudi force-pushed the pc/typeinf-remove branch 2 times, most recently from bc88b43 to 5e1936d Compare November 22, 2022 15:12
@giordano giordano added the compiler:codegen Generation of LLVM IR and native code label Nov 22, 2022
Base automatically changed from pc/typeinf-remove to master November 23, 2022 22:11
@pchintalapudi pchintalapudi marked this pull request as ready for review December 8, 2022 17:51
@pchintalapudi pchintalapudi requested a review from vtjnash December 8, 2022 17:52
auto decls = jl_emit_code(m, mi, src, jlrettype, output);
JL_UNLOCK(&jl_codegen_lock); // Might GC

Function *F = NULL;
if (m) {
Reviewer (Member):

This check seems wrong now that m is unconditionally created. How do we indicate failure from jl_emit_code now?

pchintalapudi (Member, Author):

jl_emit_code resets the module if it errors, so the check is still valid.
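The contract described here can be sketched as follows. This is a minimal illustration, not the real codegen types: `Module`, `ModuleHandle`, and `emit_code` are hypothetical stand-ins, assuming only that `jl_emit_code` empties the module handle on its error path so the caller's `if (m)` check distinguishes success from failure.

```cpp
#include <cassert>
#include <memory>
#include <string>

// Hypothetical stand-ins for illustration only -- not the real
// Julia codegen types.
struct Module { std::string name; };
using ModuleHandle = std::unique_ptr<Module>;

struct Decls { bool ok = false; };

// Sketch of the contract described above: on the error path,
// emit_code resets (empties) the module handle it was given, so the
// caller's `if (m)` null check still distinguishes success from failure.
Decls emit_code(ModuleHandle &m, bool should_fail) {
    if (should_fail) {
        m.reset();          // error: hand back an empty module handle
        return Decls{};
    }
    return Decls{true};     // success: m keeps its module
}
```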

@@ -267,7 +267,6 @@ void *jl_create_native_impl(jl_array_t *methods, LLVMOrcThreadSafeModuleRef llvm
jl_method_instance_t *mi = NULL;
jl_code_info_t *src = NULL;
JL_GC_PUSH1(&src);
JL_LOCK(&jl_codegen_lock);
Reviewer (Member):

You have inverted the ordering of jl_codegen_lock and the ThreadSafeModule lock. Can you update the devdocs on locks to reflect that? Is that ordering even valid?

pchintalapudi (Member, Author):

The context lock is taken in short spurts while setting up the TSM, and none of that setup takes the codegen lock. We then release the context lock before attempting to take the codegen lock, and reacquire the context lock inside it during the jl_codegen_params_t constructor. Since the context lock is never held while the codegen lock is being acquired, every thread observes the same order (codegen lock before context lock), so no lock-order inversion occurs.
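The acquisition pattern described above can be sketched as follows. This is a simplified illustration under stated assumptions, not the actual Julia implementation: `context_lock`, `codegen_lock`, and `setup_and_emit` are hypothetical names standing in for the ThreadSafeContext lock, jl_codegen_lock, and the setup/emit path.

```cpp
#include <mutex>
#include <string>
#include <vector>

// Hypothetical stand-ins for the two locks discussed above.
std::mutex context_lock;          // guards the LLVM ThreadSafeContext
std::mutex codegen_lock;          // guards global codegen state
std::vector<std::string> events;  // records the acquisition order

void setup_and_emit() {
    // Phase 1: context lock in short spurts; codegen lock NOT held.
    {
        std::lock_guard<std::mutex> ctx(context_lock);
        events.push_back("ctx-spurt");  // set up the ThreadSafeModule
    }   // context lock released before touching the codegen lock

    // Phase 2: acquire the codegen lock while holding no other lock.
    std::lock_guard<std::mutex> cg(codegen_lock);
    events.push_back("codegen");

    // Phase 3: reacquire the context lock *inside* the codegen lock
    // (as the params constructor does). Every thread thus observes the
    // same order, codegen_lock -> context_lock: no lock-order inversion.
    std::lock_guard<std::mutex> ctx(context_lock);
    events.push_back("ctx-nested");
}
```

The key point is that the codegen lock is only ever acquired while no context lock is held, which keeps the global lock order consistent.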

Reviewer (Member):

Ah, okay, I assumed clone.getContext was taking the lock, not the jl_codegen_params_t constructor.

@pchintalapudi pchintalapudi merged commit 09a6ff8 into master Dec 13, 2022
@pchintalapudi pchintalapudi deleted the pc/codegen-lock-1 branch December 13, 2022 00:07
KristofferC pushed a commit that referenced this pull request Dec 27, 2022
(cherry picked from commit 09a6ff8)
@KristofferC KristofferC added backport 1.9 Change should be backported to release-1.9 and removed backport 1.9 Change should be backported to release-1.9 labels Dec 27, 2022