
Memory leak when running all unit tests #12311

Open
core-ai-bot opened this issue Aug 31, 2021 · 8 comments

@core-ai-bot
Member

Issue by gruehle
Monday Jun 10, 2013 at 22:49 GMT
Originally opened as adobe/brackets#4185


Steps to repro

  1. Open the unit test window.
  2. Select "All" test suites
  3. Select "All" tests
  4. Wait a long time

Results
Brackets gets really sluggish and uses up tons of memory.

Expected
Should be able to run all tests without running out of memory.


Comment by redmunds
Tuesday Jun 11, 2013 at 00:01 GMT


I think this thread in the requirejs forum describes the problem we're having and offers a solution: https://groups.google.com/forum/?fromgroups#!topic/requirejs/DtMnHdKlcVE


Comment by njx
Thursday Jun 13, 2013 at 22:09 GMT


Reviewed. Medium priority, nominating for sprint 27. @redmunds - would you like to look into this further (since you already looked into it a bit :))? If not, feel free to reassign.


Comment by redmunds
Tuesday Jun 18, 2013 at 01:53 GMT


I think what we need to do is to unload (or "undefine") unit test modules after they are run.

The undef() method is part of requirejs (http://requirejs.org/docs/api.html#undef). A major caveat: "it will not remove the module from other modules that are already defined and got a handle on that module as a dependency when they executed", but that shouldn't affect unit test modules since they are independent.

Currently, unit test module require statements are hard-coded, so I think we need to dynamically load/unload each unit test module every time the tests are run. This means we'll have to refactor the code, since the Jasmine Spec Runner dialog is populated from module info.
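To illustrate the load/run/undefine cycle described above, here is a minimal, self-contained sketch using a toy module registry (all names here are hypothetical stand-ins, not the Brackets code or the real RequireJS internals). The `undef` function mirrors the documented behavior of `requirejs.undef()`: it drops the registry's own reference so the module can be garbage collected, but it cannot remove handles that other already-loaded modules captured.

```javascript
// Toy module registry standing in for RequireJS (illustrative only).
const registry = new Map();

function define(id, factory) {
    registry.set(id, { factory, exports: null });
}

function requireModule(id) {
    const entry = registry.get(id);
    if (!entry) throw new Error("unknown module: " + id);
    if (entry.exports === null) entry.exports = entry.factory();
    return entry.exports;
}

function undef(id) {
    // Drop the registry's reference so the module can be GC'd.
    // Like requirejs.undef(), this does NOT affect references that
    // other modules already captured as dependencies.
    registry.delete(id);
}

// Usage: load a spec module, run it, then undefine it.
define("spec/ExampleSpec", () => ({ run: () => "ran" }));
const result = requireModule("spec/ExampleSpec").run();
undef("spec/ExampleSpec");
console.log(result, registry.has("spec/ExampleSpec")); // → ran false
```

Since unit test modules are independent (nothing else holds a handle on them), undefining them after each run should let the spec code and its closures be reclaimed.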


Comment by jasonsanjose
Wednesday Jun 26, 2013 at 00:01 GMT


I'm starting to think we're leaking memory due to closures. Randy's fix in #4313 addresses some of those issues around setup and teardown of a spec. There's a lot more work to do though.
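The closure-leak pattern mentioned here can be sketched in a few lines (a hypothetical spec shape, not the actual code from #4313): a `beforeEach` closure captures a heavy object, and if the suite object outlives the run, so does everything the closure references. The usual fix is to null out captured references in teardown.

```javascript
// Hypothetical illustration of a spec-closure leak and its fix:
// release captured references during teardown so GC can reclaim them.
function makeSuite() {
    let testWindow = null; // would be a heavy DOM window in Brackets

    function beforeEach() {
        // Setup captures a large object via the closure.
        testWindow = { doc: new Array(1000).fill("node") };
    }

    function afterEach() {
        // Without this, the closure keeps testWindow alive for as long
        // as the suite object itself is retained.
        testWindow = null;
    }

    return { beforeEach, afterEach, peek: () => testWindow };
}

const suite = makeSuite();
suite.beforeEach();
console.log(suite.peek() !== null); // → true
suite.afterEach();
console.log(suite.peek());          // → null
```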


Comment by jasonsanjose
Wednesday Jun 26, 2013 at 01:23 GMT


I'm pretty certain now that we're leaking memory for every new brackets window we open.

Ignoring the integration tests (which are integration tests because they open a new window), if I simply use the devtools timeline to measure memory while I open and then close a series of windows via Debug > New Window, I see the Document and DOM Node counts climb up and up. GC helps some, but long after the windows are closed the Documents and DOM Nodes are still around.

See devtools graph:

[screenshot: devtools memory timeline, "memory leak"]


Comment by jasonsanjose
Wednesday Jun 26, 2013 at 01:29 GMT


I merged @redmunds' pull request. But the bulk of this will need to be done in another sprint. Nominating sprint 28.


Comment by njx
Monday Jul 22, 2013 at 18:34 GMT


Moving out to sprint 29 - seems like we have higher priority unit test failures to worry about.


Comment by njx
Friday Aug 02, 2013 at 17:16 GMT


Removing from sprint since the main issue it was causing (#4547) is now much alleviated by the collapsing of multiple unit test windows into one (thanks to @TomMalbran as well as @jasonsanjose).
