hygiene: Ensure uniqueness of `SyntaxContextData`s
#130324
base: master
Conversation
I've added many asserts; I'll change them to debug asserts if they affect performance.
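For context on that trade-off, a tiny hedged sketch (a hypothetical helper, not code from this PR): `assert!` stays enabled in release builds and can cost perf on hot paths, while `debug_assert!` compiles away outside debug builds, which is the planned fallback if the checks regress the benchmarks.

```rust
// Hypothetical hot-path helper, not code from this PR.
fn push_unique(table: &mut Vec<u32>, value: u32) -> usize {
    // Always-on check: also costs a scan in release builds.
    assert!(!table.contains(&value), "duplicate entry {value}");
    // Debug-only variant: compiled out in release builds, which is
    // the fallback if the always-on assert regresses performance.
    debug_assert!(!table.contains(&value), "duplicate entry {value}");
    table.push(value);
    table.len() - 1
}

fn main() {
    let mut table = Vec::new();
    assert_eq!(push_unique(&mut table, 7), 0);
    assert_eq!(push_unique(&mut table, 8), 1);
}
```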
hygiene: Ensure uniqueness of `SyntaxContextData`s

`SyntaxContextData`s are basically interned, with `SyntaxContext`s working as keys, so they are supposed to be unique. However, duplicate `SyntaxContextData`s can currently be created during decoding from metadata or the incremental cache. This PR fixes that.

cc rust-lang#129827 (comment)
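To make the invariant concrete, here is a minimal sketch of the intended interning discipline - the types and names (`CtxtData`, `Interner`) are hypothetical placeholders, not rustc's actual `SyntaxContext` machinery: equal data must always resolve to the same index, including when an entry arrives from metadata or the incremental cache.

```rust
use std::collections::HashMap;

// Hypothetical stand-ins for `SyntaxContext` / `SyntaxContextData`.
#[derive(Clone, PartialEq, Eq, Hash, Debug)]
struct CtxtData {
    outer_expn: u32,
    transparency: u8,
}

#[derive(Default)]
struct Interner {
    // `SyntaxContext`-like index -> data.
    entries: Vec<CtxtData>,
    // Reverse map enforcing uniqueness: equal data must map to equal indices.
    dedup: HashMap<CtxtData, u32>,
}

impl Interner {
    fn intern(&mut self, data: CtxtData) -> u32 {
        if let Some(&idx) = self.dedup.get(&data) {
            // An equal entry already exists (e.g. it was built or decoded
            // earlier): reuse it instead of allocating a duplicate slot.
            return idx;
        }
        let idx = self.entries.len() as u32;
        self.entries.push(data.clone());
        self.dedup.insert(data, idx);
        idx
    }
}

fn main() {
    let mut interner = Interner::default();
    let a = interner.intern(CtxtData { outer_expn: 1, transparency: 0 });
    // "Decoding" equal data again must yield the same index, not a new slot.
    let b = interner.intern(CtxtData { outer_expn: 1, transparency: 0 });
    assert_eq!(a, b);
    assert_eq!(interner.entries.len(), 1);
}
```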
☀️ Try build successful - checks-actions
Finished benchmarking commit (b517457): comparison URL.

Overall result: ❌✅ regressions and improvements - ACTION NEEDED

Benchmarking this pull request likely means that it is perf-sensitive, so we're automatically marking it as not fit for rolling up. While you can manually mark this PR as fit for rollup, we strongly recommend not doing so since this PR may lead to changes in compiler perf.

Next Steps: If you can justify the regressions found in this try perf run, please indicate this with @bors rollup=never

Instruction count
This is a highly reliable metric that was used to determine the overall result at the top of this comment.
Max RSS (memory usage)
Results (secondary 1.0%)
This is a less reliable metric that may be of interest but was not used to determine the overall result at the top of this comment.

Cycles
Results (secondary -12.6%)
This is a less reliable metric that may be of interest but was not used to determine the overall result at the top of this comment.

Binary size
Results (primary -0.5%, secondary -0.8%)
This is a less reliable metric that may be of interest but was not used to determine the overall result at the top of this comment.
Bootstrap: 756.444s -> 757.208s (0.10%)
I'm not super fond of the "hopefully" rhetoric...
Right now they are not a tree because the …
FIXME: The holes left by decoder break the logic assigning …
Ah, there's one more thing - not all contexts come from the decoder (during incremental compilation, at least). Many contexts come from the freshly redone compilation (which typically runs before incremental decoding starts), and those then need to "unify" with equivalent contexts coming from decoding - that's where the duplicates were coming from before this PR. So even if all decoding is done in the proper order, you can still decode a context that is equivalent to one of the freshly built ones, but you don't know that until you decode it and compare. Maybe if #129827 eliminates recursion we'll be able to avoid reserving …
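A hedged sketch of that unification under a simplified model - the names (`HygieneData`, `decode`, `remap`) are placeholders loosely modeled on rustc's hygiene tables, not its actual API: decoded entries carry an on-disk id, and only once their data is fully decoded can they be compared against, and unified with, the contexts already built by the fresh part of the compilation.

```rust
use std::collections::HashMap;

#[derive(Clone, PartialEq, Eq, Hash, Debug)]
struct CtxtData { outer_expn: u32, transparency: u8 }

#[derive(Default)]
struct HygieneData {
    entries: Vec<CtxtData>,        // local table: fresh + decoded contexts
    dedup: HashMap<CtxtData, u32>, // data -> local index (uniqueness)
    remap: HashMap<u32, u32>,      // on-disk id -> local index
}

impl HygieneData {
    // Contexts built by the fresh (re-run) part of the compilation.
    fn alloc_fresh(&mut self, data: CtxtData) -> u32 {
        self.intern(data)
    }

    // A context decoded from the incremental cache: only after its data is
    // fully decoded can we tell whether an equivalent fresh context already
    // exists; if so, unify with it instead of adding a duplicate slot.
    fn decode(&mut self, on_disk_id: u32, data: CtxtData) -> u32 {
        let local = self.intern(data);
        self.remap.insert(on_disk_id, local);
        local
    }

    fn intern(&mut self, data: CtxtData) -> u32 {
        if let Some(&idx) = self.dedup.get(&data) {
            return idx;
        }
        let idx = self.entries.len() as u32;
        self.entries.push(data.clone());
        self.dedup.insert(data, idx);
        idx
    }
}

fn main() {
    let mut hd = HygieneData::default();
    // Fresh compilation builds this context first...
    let fresh = hd.alloc_fresh(CtxtData { outer_expn: 3, transparency: 1 });
    // ...then the incremental cache decodes an equivalent one under id 42.
    let decoded = hd.decode(42, CtxtData { outer_expn: 3, transparency: 1 });
    assert_eq!(fresh, decoded);    // unified, no duplicate slot
    assert_eq!(hd.remap[&42], fresh);
}
```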