
Out of memory when trying to analyze big auto-generated files #13807

Open
jedel1043 opened this issue Dec 21, 2022 · 8 comments
Labels
C-support Category: support questions

Comments

@jedel1043

jedel1043 commented Dec 21, 2022

Follow-up of #13788.

Rust analyzer no longer panics when trying to parse big files. However, now my system OOMs (16 GB of RAM) while trying to analyze the generated files.

EDIT:
This only happens after triggering a cargo check on save. I tested cargo check alone and it only consumes 3+ GB of RAM, so the problem seems to be in R-A.

To reproduce

git clone https://github.com/jedel1043/ra-bug.git && cd ra-bug

Depending on the base memory consumption, the used memory should increase by 8+ GB after opening the project and triggering a cargo check. For some high-end systems this should not be a problem, but on my system the standalone repo bumps used memory to approx. 15.8 GB, and trying to use the crate directly in a mid-sized workspace project OOMs outright.

rust-analyzer version: 0.4.1326-standalone

rustc version: rustc 1.66.0 (69f9c33d7 2022-12-12)

cc @Manishearth

@Manishearth
Member

As mentioned in that thread, I recognize that databake is a bit of an edge case: it would be nice if there were a cfg attribute recognized and applied by rust-analyzer, so that such code could be structured to not load a ton of other files when rust-analyzer is looking at it.

@Veykril
Member

Veykril commented Dec 21, 2022

That's odd, I am not running OOM on Windows (and I don't see why r-a should OOM on that input). The text and parse queries only take up ~400 MB of memory for me (checked by using the clear database command):

   253mb FileTextQuery
   138mb ParseQuery

@jedel1043
Author

@Veykril Ah, forgot to mention that it only happens after you trigger a cargo check.

@jedel1043 jedel1043 changed the title Out of memory when trying to analyzer big auto-generated files Out of memory when trying to analyze big auto-generated files Dec 21, 2022
@Veykril
Member

Veykril commented Dec 21, 2022

Oh, if the generated code produces a ton of warnings, I can see this happening: r-a currently creates a number of allocations quadratic in the amount of check diagnostics, IIRC.

@jedel1043
Author

That's probably the reason why it crashes on the big project. I noticed that the generated code triggered a lot of `error: hidden lifetime parameters in types are deprecated` diagnostics with our set of enabled lints.

cc @Manishearth

@Manishearth
Member

> quadratic in the amount of check diagnostics iirc

..... how? 😄

@Veykril
Member

Veykril commented Dec 21, 2022

IIRC, this Arc::make_mut currently always causes a clone of the map, and it's called for each diagnostic:

`let check_fixes = Arc::make_mut(&mut self.check_fixes);`

Though I don't think we keep those copies in memory all at once, so maybe that's not the problem here.

@lnicola
Member

lnicola commented Dec 21, 2022

What about the for loop above it? I think we clear it every so often, but we might end up with a lot of them before passing the whole bunch to the client.
