
.ntvs_analysis.dat size and mem use #88

Closed
objelke opened this issue Apr 30, 2015 · 34 comments

Comments

@objelke

objelke commented Apr 30, 2015

Node: v 0.12.2
VS 2012 and 2013, both Ultimate
NTVS 1.0

On larger projects, memory use becomes a significant issue. I suspect it's related to the analyze function; the .ntvs_analysis.dat files for individual projects range from a few MB to >400MB.

The advice to use AnalysisIgnoredDirectories to exclude modules has not resulted in any measurable improvement. The .dat file sizes also stayed constant, even after playing with reference paths etc.
(Note: every project reload after a change takes ~10 minutes until the new .dat is completed, so it's hard to test many scenarios.)

Combining multiple projects in a single solution had to be abandoned at an early stage (sadly): the VS process quickly approached 4 GB of memory use and then often crashed. If it survived, VS became unusable (many seconds between possible keystrokes, often pausing 20+ seconds before VS could be used again).

But even when just loading individual projects, load time is a drag: 10+ minutes for a single project on slower/older laptops, and many minutes even on fast machines (SSD, 16 GB RAM, etc.).

Disabling IntelliSense may be an option, but that would defeat the purpose of using VS for this in the first place.

Once it's all loaded, editing performance is actually quite all right.
Launching the debugger takes a bit of time but is survivable.
While debugging, though, every keystroke is followed by painfully slow execution.

The projects are admittedly complex, a typical dependency list:

"dependencies": {
"bcrypt-nodejs": "0.0.3",
"bluebird": "^2.9.14",
"chai": "^2.1.2",
"chai-as-promised": "^4.3.0",
"expiring-lru-cache": "^2.1.0",
"jsonschema": "^1.0.1",
"knex": "^0.7.6",
"lodash": "^3.6.0",
"moment": "^2.9.0",
"monitor": "^0.6.10",
"mysql": "^2.5.5",
"node-cache": "^1.1.0",
"node-uuid": "^1.4.3",
"nodemailer": "^1.3.2",
"nodemailer-smtp-transport": "^1.0.2",
"passport": "^0.2.1",
"passport-http": "^0.2.2",
"remove": "^0.1.5",
"restify": "^3.0.1",
"simple-lru-cache": "0.0.1",
"when": "^3.7.2"
},

(needless to say, every dependency has a significant node_modules subfolder itself, and so forth...)

Looking for advice:

  • any useful action which could be taken without giving up on intellisense?

On VS2012 vs VS2013 - reverted to using 2012; 2013's behavior was similar but overall worse - longer delays, more frequent crashes.

More specifics:

This maps, via file system links, 4 complete shared projects (with their node_modules) into the node_modules tree of the main project - that certainly can contribute to trouble.

Disabling analysis for those isn't an option - is there any "smarter" way to include shared libs in the Visual Studio project, reducing memory use?

Final words: not complaining here about a free piece of software - when it works (on smaller projects) it's a truly excellent integration. Just hoping we can keep using it for a growing technology stack.

@mousetraps
Contributor

Sorry to hear about the perf issue.... thanks for reporting. #44 includes some steps to help us diagnose the issue (as well as known workarounds). Any chance you could provide that info?

Thanks in advance!

@objelke
Author

objelke commented May 11, 2015

Disabling IntelliSense reduced the loading time dramatically and the VS memory footprint significantly (and therefore the frequent VS GC pauses that cause delays in the editor).

Example: mem use before consistently ~2.5GB, after disabling intellisense ~1.5GB.

Unfortunately, it is not an option for us to use it this way - IntelliSense is probably the #1 reason to use VS with the Node plugin.

For this:

If IntelliSense is to blame, try disabling IntelliSense for a specific package if you think that may be causing the issue. See Ignoring Directories for Analysis for full details.....
<<

I could not make this work, gave up after several hours of trying. Nothing I did seemed to have any effect.

No Resharper/McAfee etc. installed.

If it does not occur with a blank new project, please provide us with your entire project if possible.
<<
Unfortunately, not possible - too many dependencies on other private projects etc.

Complete dependency list of the main project and its imported private projects:

"dependencies": {
"bcrypt-nodejs": "0.0.3",
"expiring-lru-cache": "^2.1.0",
"jsonschema": "^1.0.1",
"knex": "^0.7.6",
"moment": "^2.9.0",
"monitor": "^0.6.10",
"mysql": "^2.5.5",
"node-cache": "^1.1.0",
"node-uuid": "^1.4.3",
"nodemailer": "^1.3.2",
"nodemailer-smtp-transport": "^1.0.2",
"passport": "^0.2.1",
"passport-http": "^0.2.2",
"remove": "^0.1.5",
"request-promise": "^0.4.2",
"restify": "^3.0.1",
"simple-lru-cache": "0.0.1",
"when": "^3.7.2",
"base64url": "^1.0.4",
"bluebird": "^2.9.21",
"bunyan": "^1.3.4",
"chai": "^2.3.0",
"chai-as-promised": "^5.0.0",
"cuint": "^0.2.0",
"int64": "0.0.5",
"lodash": "^3.8.0",
"node-forge": "^0.6.21",
"node-localstorage": "^0.5.0",
"node-rsa": "^0.2.23",
"stream-buffers": "^2.1.0",
"mongoose": "^4.0.1",
"getmac": "^1.0.7",
"node-int64": "^0.4.0",
"socket.io": "^1.3.5",
"socket.io-client": "^1.3.5",
"socket.io-stream": "^0.6.1",
"node-rest-client": "^1.4.4"
}

Created an empty project, just requiring all of these. That alone did not suffice to reproduce the problem (150 MB analysis.dat file, 1 GB RAM footprint of VS).

But: many of these dependencies are in reality sub-dependencies of other private tool projects, leading to a fair amount of duplication.

So, next step:

  • made 2 new folders in node_modules (heavylib1, heavylib2) and copied the entire existing node_modules tree into them (plus the package.json as listed above), tripling the size of the dependencies.

Which led to a tripling of the analysis.dat file, a dramatic increase in memory use (~2.9 GB), and a solid 30 minutes of analysis time before the analysis.dat was updated - so that is a fairly exact replication of the issues we have with our projects, maybe a bit exaggerated.

So the load (memory and analysis.dat file size) is purely a function of the total size of the JS files that are part of the project, as expected. Which naturally runs into issues with larger projects.

There is no file hashing / skipping of redundant files involved in the analysis, I believe? That seems to be the only way to stretch the limits a bit.
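
To illustrate what I mean by hashing - just a rough sketch from me, not actual NTVS code - the analyzer could key results by a content hash, so the tenth byte-identical copy of a module buried under nested node_modules folders is only analyzed once:

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Security.Cryptography;

    // Sketch only: cache analysis results by file-content hash so byte-identical
    // copies of a module (duplicated across nested node_modules trees) are analyzed once.
    class AnalysisResultCache {
        private readonly Dictionary<string, object> _byHash = new Dictionary<string, object>();

        public object GetOrAnalyze(string path, Func<string, object> analyze) {
            string hash;
            using (var sha = SHA1.Create())
            using (var stream = File.OpenRead(path)) {
                hash = BitConverter.ToString(sha.ComputeHash(stream));
            }
            object result;
            if (!_byHash.TryGetValue(hash, out result)) {
                result = analyze(path);   // the expensive parse/analysis runs once per unique content
                _byHash[hash] = result;
            }
            return result;
        }
    }

No idea how well this would fit the real analyzer (results probably aren't path-independent), just sketching the general direction.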

@kant2002
Contributor

Adding just the "web-component-tester" dev dependency immediately creates a 200 MB .ntvs_analysis.dat file and increases the VS working set to 1460 MB.

I do not have ReSharper.
I don't have an antivirus.
IntelliSense is enabled.

@kant2002
Contributor

I took one more step and tested with the latest source code. Same problem: high memory usage and a lot of CPU.

One thing to notice: web-component-tester, when installed on Windows, installs with errors due to long paths. This leads to the following observations:

  1. The code generates a huge number of PathTooLongException exceptions. This also increases CPU usage.
  2. This specific component has a lot of nested modules, and they are also analyzed for IntelliSense. @objelke is right in his observation that memory usage and the size of .ntvs_analysis.dat are related to the count of analyzed JS files.

Since I'm mostly not interested in the deeply nested projects which are dependencies of my dependencies, I tried to modify the source code so that overly deep npm modules are excluded from the analysis.
When I go 5 levels deep (level 1 being the main project), I manage to reduce the memory footprint from 1.46 GB to 966 MB, but the size of .ntvs_analysis.dat stays almost the same as before the change: 210 MB.
When I stop analyzing at 3 levels deep, memory usage drops to 600 MB and the .dat file is 74 MB.

But when I close and reopen the solution, the file size grows back to 200 MB, so maybe nothing changed, or maybe I didn't implement something correctly.

CPU usage is also still high during loading.
From these observations, and from working under the debugger, it looks like the high CPU usage comes from the unhandled PathTooLongExceptions.

Those are my observations so far.

@objelke
Author

objelke commented May 24, 2015

Excellent idea, trying to patch & check.

Another approach (not working for all cases, but just to test):
in Nodejs\Product\Nodejs\NodejsProjectNode.cs, around line 488, change the IncludeNodejsFile method to the following (experimental code of course, please do not comment on that part):

    // Counts occurrences of substr within mstr.
    internal int substrcount(string mstr, string substr) {
        return (mstr.Length - mstr.Replace(substr, "").Length) / substr.Length;
    }

    internal bool IncludeNodejsFile(NodejsFileNode fileNode) {
        var url = fileNode.Url;
        if (CommonUtils.IsSubpathOf(_intermediateOutputPath, fileNode.Url)) {
            return false;
        }

        // Exclude anything nested more than one node_modules level deep.
        if (substrcount(url, "node_modules") > 1) {
            return false;
        }

        foreach (var path in _analysisIgnoredDirs) {
            if (url.IndexOf(path, 0, StringComparison.OrdinalIgnoreCase) != -1) {
                return false;
            }
        }
        if (new FileInfo(fileNode.Url).Length > _maxFileSize) {
            // skip obviously generated files...
            return false;
        }
        return true;
    }

This excludes any path with more than one 'node_modules' folder.
Too special to be a generic solution but fits my scenario perfectly.
And, I truly have a responsive, fast, environment back. Even after multiple restarts, the analysis.dat has not grown any more.

So something like expanding analysisIgnoredDirs with regex matching could solve all issues.
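
Rough sketch of what I mean - note that _analysisIgnoredRegexes is purely hypothetical, no such setting exists today - the loop in IncludeNodejsFile could test compiled regexes instead of plain substrings:

    // Hypothetical regex-based variant of the _analysisIgnoredDirs check.
    // Requires: using System.Text.RegularExpressions;
    // _analysisIgnoredRegexes would have to be read from project settings (does not exist yet).
    private readonly List<Regex> _analysisIgnoredRegexes = new List<Regex> {
        // e.g. ignore anything nested more than one node_modules level deep
        new Regex(@"node_modules\\.*\\node_modules\\", RegexOptions.IgnoreCase | RegexOptions.Compiled)
    };

    internal bool IsIgnoredForAnalysis(string url) {
        foreach (var pattern in _analysisIgnoredRegexes) {
            if (pattern.IsMatch(url)) {
                return true;
            }
        }
        return false;
    }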

Thanks for the idea @kant2002

@mousetraps
Contributor

That's a very interesting idea... @DinoV what do you think about using this as a way to scale down IntelliSense?

@DinoV
Contributor

DinoV commented May 28, 2015

I think this makes sense, maybe as both a per-project and a global option? At least per-project. But it seems like a reasonable way to dial the analysis. There may be some losses on the edges but the greater the depth allowed the less significant those losses are likely to be.

We could also run AnalysisDriver with a similar change to get an idea of how it affects top-level modules. If there were a value where it seemed to have little impact on a lot of packages other than analysis time, we could set that as the default.

@kant2002
Contributor

I could make a pull request with some of the changes that led to these results.

@kant2002
Contributor

Just in case this could be helpful, I ran the original code with AnalysisDriver:
AnalysisDriver.exe /package:web-component-tester /no_cleanup /install_missing
Output is
180469 ms 6585 files 194 MB parser WS, 189 MB parser GC Mem, 461 MB WS, 420MB GC Mem, 16 ( 303) completions

When analyzing 3 module levels deep (3 node_modules in the relative path), the output is:
24499 ms 1918 files 53 MB parser WS, 45 MB parser GC Mem, 110 MB WS, 91MB GC Mem, 16 ( 303) completions

When analyzing 2 module levels deep (2 node_modules in the relative path), the output is:
9441 ms 765 files 25 MB parser WS, 21 MB parser GC Mem, 42 MB WS, 36MB GC Mem, 16 ( 303) completions

@objelke
Author

objelke commented May 28, 2015

The folder depth within my own code is deeper than the folder depth within node_modules, but less explosive.
So if "depth" were the sole determinant, I'd be excluding my own code instead of unneeded sub-dependencies in node_modules.
And I believe this is not an exceptional case.

Unless the config value counted depth within node_modules only.

@objelke
Author

objelke commented May 28, 2015

ah - never mind, I see the pull request handles this well.

@kant2002
Contributor

@objelke please take a look at my analysis.

Depth      Time (ms)   Files count   Avg. analysis time (ms/file)
No limit   180 469     6585          27.40
4          73 282      4157          17.63
3          24 499      1918          12.77
2          9 441       765           12.34

From what I can see, as you said initially, analysis depends on the count of files. Above some threshold the average per-file analysis time increases. Could you try to project based on your subset of data?

@objelke
Author

objelke commented May 28, 2015

More scary numbers:

Depth      Time (ms)   File count   Avg. time (ms/file)
No limit   754 502     12930        58
3          361 639     8624         41
2          125 403     5287         24
1          36 033      2561         14

As we can see, there is no golden "depth" value. For me, 2 is already out of control.
(*edited: removed parser WS data as there is no direct correlation to accumulated file size)

To ensure we mean the same by "Depth", in your pull request:
if (nestedModulesCount > "Depth")
is what I listed as "Depth".

@objelke
Author

objelke commented May 28, 2015

Full output, in case the other values are of interest:

no limit:
754502 ms 12930 files 683 MB parser WS, 617 MB parser GC Mem, 1271 MB WS, 1141 MB GC Mem, 0 ( 0) completions
limit 3:
361639 ms 8624 files 587 MB parser WS, 508 MB parser GC Mem, 1084 MB WS, 923 MB GC Mem, 0 ( 0) completions
limit 2:
125403 ms 5287 files 445 MB parser WS, 370 MB parser GC Mem, 768 MB WS, 648 MB GC Mem, 0 ( 0) completions
limit 1:
36033 ms 2561 files 348 MB parser WS, 273 MB parser GC Mem, 541 MB WS, 447 MB GC Mem, 0 ( 0) completions

@kant2002
Contributor

You are right, the numbers are more scary, but they fit the same picture. Let's combine our values.

Depth      Time (ms)   Files count   Avg. time (ms/file)   Avg. mem (KB/file)
No limit   180 469     6 585         27.40                 30.16
4          73 282      4 157         17.63                 31.28
3          24 499      1 918         12.77                 28.30
2          9 441       765           12.34                 33.46
No limit   754 502     12 930        58.35                 54.09
3          361 639     8 624         41.93                 69.69
2          125 403     5 287         23.71                 86.19
1          36 033      2 561         14.07                 139.15

Which, if we sort by file count, gives us the following picture:

Depth      Time (ms)   Files count   Avg. time (ms/file)   Avg. mem (KB/file)
No limit   754 502     12 930        58.35                 54.09
3          361 639     8 624         41.93                 69.69
No limit   180 469     6 585         27.40                 30.16
2          125 403     5 287         23.71                 86.19
4          73 282      4 157         17.63                 31.28
1          36 033      2 561         14.07                 139.15
3          24 499      1 918         12.77                 28.30
2          9 441       765           12.34                 33.46

Based on this I could make the following guesses:

  1. Time depends on the count of files analyzed - some polynomial dependency, quadratic or similar. (For example, going from 2 561 files to 12 930 files, a ~5× increase, raises the average per-file time from 14 ms to 58 ms, roughly another 4×, which is consistent with total time growing roughly quadratically.)
  2. Parser memory usage most likely depends on the amount of code in the files.

This leads to the conclusion that this fix is not enough, and the parser should be further improved to be more efficient.
For example, don't analyze the whole tree, but only the files which are actually used. Not clear how to do that yet.
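
Very rough sketch of what "only the files which are used" could look like - getRequiredPaths here is a hypothetical callback, real require() resolution is of course much messier:

    using System;
    using System.Collections.Generic;

    static class ReachabilitySketch {
        // Walk the require() graph starting from the project's own entry points,
        // so files deep inside node_modules that nothing actually requires are never analyzed.
        public static HashSet<string> CollectReachableFiles(
                IEnumerable<string> entryPoints,
                Func<string, IEnumerable<string>> getRequiredPaths) {
            var reachable = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
            var queue = new Queue<string>(entryPoints);
            while (queue.Count > 0) {
                var file = queue.Dequeue();
                if (!reachable.Add(file)) {
                    continue;               // already visited
                }
                foreach (var dep in getRequiredPaths(file)) {
                    queue.Enqueue(dep);
                }
            }
            return reachable;
        }
    }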

@objelke
Author

objelke commented May 28, 2015

This leads to the conclusion that this fix is not enough, and the parser should be further improved to be more efficient.<<

yes, makes perfect sense.

However, let me point out that the developers in our company have been truly happy users for almost a week now with this simple fix (just limit to 1) - the environment went from 100% unusable to 100% usable.

So if I may recommend: including a low-effort solution as described here can help tremendously, despite perhaps not being perfect.

All that said in case the true solution would require a large engineering effort.

@mousetraps
Contributor

I really like the direction of this change, but like any analysis change, no fix is going to be perfect given the wide variety of patterns out there. So a couple of ideas in that vein:

  • let's test package_list.txt (these are the top npm packages) - this is a more comparable metric than any specific project, and is the list we should use to choose a reasonable default.
  • beyond that, we should surface a configurable option in project properties and also in the IntelliSense settings. Perhaps this can tie into Medium-power IntelliSense mode #138 as well, when it comes to general settings. We can also include it as an option in project properties that will override the default when someone is working with multiple projects and would like to configure it on a more fine-grained scale.

mousetraps referenced this issue in kant2002/nodejstools May 29, 2015
Limit depth analysis to relative paths which contain only 2 node_modules. This should be sufficient to capture:
a) the API of the root package;
b) the API of the dependencies (1 node_modules) - this is obviously a must to analyze;
c) the API of sub-dependencies (2 node_modules) - this is also a must, since a lot of packages are wrappers/glue for another library. For example, using gulp-typescript requires that gulp-typescript be parsed (level b) and typescript be parsed (level c), since typescript provides some config options for gulp-typescript.
@kant2002
Contributor

Now working on creating a configuration page for my fix. The next direction from my observations is to eliminate the huge number of exceptions being generated. There are two common types of exceptions:

  • System.IO.PathTooLongException (an extremely high number)
  • Microsoft.NodejsTools.Parsing.RecoveryTokenException

Without the PathTooLongExceptions the debugger runs much faster, and CPU usage most likely drops a bit.

mousetraps added a commit that referenced this issue Jun 26, 2015
Fix #138 Add medium level of IntelliSense / limit to the analysis depth (Related to #88).


#138 added a Medium level of IntelliSense. This level is currently the same as the Full level, but limits the analysis depth to 2 nested modules. Full mode now has a limit of 4 modules depth, which I'm almost sure is practical enough, but it could be increased if needed. The Low level analyzes 1 level deep.
@mousetraps
Contributor

Some improvements in this area are available in the latest dev build

@objelke
Author

objelke commented Jul 31, 2015

Just some feedback on 1.1 RC on this matter:

Certainly much better than 1.0 - but unfortunately still a real pain (compared to my brute-force "level 1" hack). Loading time of larger solutions with a completed analysis: 2-3+ minutes; without one (after deleting the .dat): much longer, 15+ minutes.

Intellisense settings are at "quick".

Tried to force the depth ("NestedModulesLimit") to 1 via the registry - but no improvement from that (not sure I got it right - I couldn't see a difference).

From looking at the source code, it seems the following key should be respected for VS2013:
HKEY_CURRENT_USER\Software\Microsoft\NodejsTools\12.0\Analysis\Project, [DWORD] NestedModulesLimit = 1

  • but it didn't seem to matter (a quick way to at least confirm the value is set where I put it is sketched below).
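
For my own sanity I checked that the value is at least present where I think it should be - small console snippet; to be clear, whether NTVS actually reads this key is exactly the part I'm unsure about:

    using System;
    using Microsoft.Win32;

    class CheckNestedModulesLimit {
        static void Main() {
            // Key path taken from the comment above; this only confirms the value exists there.
            using (var key = Registry.CurrentUser.OpenSubKey(
                    @"Software\Microsoft\NodejsTools\12.0\Analysis\Project")) {
                Console.WriteLine(key == null
                    ? "key not found"
                    : "NestedModulesLimit = " + (key.GetValue("NestedModulesLimit") ?? "(not set)"));
            }
        }
    }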

Will compile my own version again and dig deeper - just posting this here in case somebody has a "but you need to..." moment which could help.

@objelke
Author

objelke commented Aug 4, 2015

A few days in now - I can confirm that the behavior on 1.1 RC is still unusable for our environment:
2.5+ GB VS memory use, stalling in the editor, etc.

What is a bit confusing about this: .ntvs_analysis.dat is about 230 MB - which is not so horrible (compared to earlier) - scratching my head a bit - perhaps something else was changed?

Don't have time to dive into this now (I'd need to re-apply my patch to the current source code and test; all very time-consuming) - but still wanted to report that something is still a bit off here.

Not sure what the 1.1 release plan is - will check deeper next week, maybe there is still time for changes?

@mousetraps modified the milestone: Future (Sep 24, 2015)
@jimschubert

I'm also having issues with this extension's memory usage.

In VS2015 Enterprise, I have four projects in a solution:

  • web api 2
  • C# assembly
  • C# NUnit Test project
  • Node.js project

My VS memory ranges from 1.8GB to 2.5GB. I have ReSharper Ultimate, and I originally thought that was the issue. I've disabled, suspended, and uninstalled ReSharper. ReSharper is definitely not the issue. I can unload the Node.js project from my solution and memory drops to around 450MB.

My .ntvs_analysis.dat is also around 230MB.

The Node.js project is just the yeoman template from generator-angular with a handful of small directives added.

@jimschubert

I dug a little deeper into this and it looks like I've resolved my memory issues.

The problem appears to be that nodejstools is hitting MAX_PATH for modules installed with npm 2.x. Although I'd installed npm 3 with npm install -g npm, that installs to %AppData%\npm\node_modules\npm and not to C:\Program Files\nodejs\node_modules\npm, which is what gets resolved by GetPathToNodeExecutableFromEnvironment.

To resolve my memory issues, I deleted .ntvs_analysis.dat and the node_modules directory (had to manually rename folders to bypass MAX_PATH issues with the recycling bin). I then copied npm 3.x from %AppData%\npm\node_modules\npm into C:\Program Files\nodejs\node_modules\npm.

Memory usage of Visual Studio is now steadily 628MB, which is slightly higher than the 450MB without the Node.js project loaded, but way less than the 2.5GB that was causing VS to crash.

There's an NPM_PREFIX_NPM_CLI_JS environment variable available in npm.cmd from the node 4.0.0 installer. Rather than copy npm 3.x over npm 2.x, I could probably have defined the environment variable, but I'd rather not have any question about the version of npm being executed (e.g. if a tool hardcoded the path to npm-cli.js under Program Files). Also, I don't recall seeing NPM_PREFIX_NPM_CLI_JS in npm.cmd for node 0.12.x.

@mousetraps
Contributor

@jimschubert fyi you can check the version of npm being executed by running .npm -v in the interactive window.
https://github.com/Microsoft/nodejstools/wiki/Npm-Integration#npm-in-the-nodejs-interactive-window

Regarding the MAX_PATH memory concerns: probably a red herring, since we would ignore those files altogether, so there's no reason they would make things more memory intensive. It's more likely due to the fact that dependencies are deduplicated by default with npm v3, so we don't have to analyze as much stuff (see point 1 in #233 (comment)).

@kant2002
Contributor

Regarding memory usage: I have ~1.5 GB used by VS + NTVS. I took a dump and ran !dumpheap -stat.

Here are the objects which use the most memory:

.....
1928ad6c   126633     37023412 Microsoft.NodejsTools.Analysis.AnalysisSetDetails.AnalysisHashSet+Bucket[]
19282a60  1425602     39916856 Microsoft.NodejsTools.Analysis.Analyzer.ReferenceableDependencyInfo
713625fc  1040452    115138140 System.String
005a5c98   714625    437264154      Free
Total 23027987 objects

Please note that String takes most of the memory.

I looked up which strings are held in memory; here are some examples:

!do 53d62a98 
Name:        System.String
String:      # core-util-is

The `util.is*` functions introduced in Node v0.12.
.....
String:      readme
....
String:      readmeFilename
....
String:      Microsoft.JSON.Core.Schema.IJSONSchemaSelector
....
String:      #########\node_modules\rimraf
...
String:      tap test/*.js
......
String:      stream not writable

Here are a couple of interesting thoughts:

  • The documentation could take some memory. What's the point of storing it in memory? It's small and could be read from disk.
  • Parts of package.json could be duplicated because the strings are not interned (?) - see the small demo below.
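
Small demo of the interning point (nothing NTVS-specific, just illustrating the idea): repeated strings such as package.json property names could share a single instance instead of thousands of duplicates.

    using System;

    class InternDemo {
        static void Main() {
            // Two strings built at runtime with the same content are distinct objects...
            string a = new string("readmeFilename".ToCharArray());
            string b = new string("readmeFilename".ToCharArray());
            Console.WriteLine(ReferenceEquals(a, b));                                 // False
            // ...but interning collapses them onto one shared instance.
            Console.WriteLine(ReferenceEquals(string.Intern(a), string.Intern(b)));   // True
        }
    }

The trade-off is that interned strings live for the lifetime of the process, so it only makes sense for values that really repeat a lot (property names, module names).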

@zagros

zagros commented Oct 30, 2015

I am having the same issue.

With NTVS (both the Oct 22 Dev release, and the stable on the main branch), I can't get VS to load my project. It freezes VS completely when I open any node.js project.

I have npm 2.14.7, and VS 2013 with Web Essentials.

On machines with VS 2015 it works for some reason (same project). This happens only on one of the machines.

Also, the .ntvs_analysis.dat file grows rapidly to ~93 MB and then stops. (Before installing the Oct 22 dev release mentioned in your posts above, I had the stable RC installed and this file only grew to ~65 MB.) Both versions failed to open the project.

Given that the project hasn't changed, it looks like the traversal of the tree now gets further and yet stops again. Question for you: is the large file size because you traverse the tree multiple times, and some sort of multiple recursion is happening, causing a stack overflow?

Side note: if I uninstall node.js and open Visual Studio, the project opens (this is while NTVS is installed). Just an observation.

@gallak87

gallak87 commented Jan 3, 2016

Using vs 2013 on Win 10 x64.
NPM 4.1.1 and NodeJS Tools 1.1 RC2.1 for VS2013

I've also had this issue for a while - every time I open a Node.js project it freezes, although my memory usage doesn't go past 500 MB (maybe because it's the only project in the solution).

I found a really strange way to keep it from freezing, though. Before opening the solution, I open Tools > NodeJS Tools > Diagnostic Info, then close it, then open the NodeJS Interactive window and let it finish loading. Then I open the solution and it doesn't freeze. I just did this 3 separate times, and that exact sequence keeps it from freezing. Hope this helps.

@stod0021

Still a major issue... 2.6 GB of RAM usage for Visual Studio alone is unacceptable.

@noopole

noopole commented Apr 14, 2016

Thank you @jimschubert: I deleted node_modules and upgraded npm from v2 to v3, and now it works just fine (RAM went from 2 GB to 500 MB).
😃

@mjbvz
Contributor

mjbvz commented Apr 14, 2016

I've created a troubleshooting page to document some possible workarounds for high memory usage: https://github.com/Microsoft/nodejstools/wiki/Troubleshooting#high-memory-usage-workarounds

If you're on the NTVS 1.2 Alpha, try switching to the ES6 Preview IntelliSense as a first step (Tools -> Options -> Text Editor -> Node.js -> IntelliSense). Our homegrown static analysis engine is usually to blame for high memory usage, and ES6 Preview IntelliSense uses a more modern and efficient language service developed by the TypeScript team. We will still try to fix leaks and crashes in our static analysis engine, of course, but will probably not invest in fixing some of its fundamental memory consumption problems.

@kant2002
Contributor

I have the same issue with NTVS 1.2 Alpha + ES6 Preview.

[screenshot: VS memory usage]

Memory usage in Task Manager:
1.4 GB private working set
1.7 GB working set

@mjbvz What could be the issue?

@mjbvz
Contributor

mjbvz commented Apr 18, 2016

@kant2002 That seems high, but the raw numbers are not that useful since they do not tell us how much of that is NTVS. Depending on VS configuration, what other extensions are installed, and what you are doing or have been doing in Visual Studio, this may be expected.

If none of the other solutions have worked and there is no easy repro of the problem that we can investigate, try building the source and using the VS memory profiling tools to look at what is consuming all that memory. #815 will improve this somewhat, but unless you have something like thirty projects in a solution, each with super complex dependency trees, I don't imagine that alone will drop memory usage by more than 20% or so.

@kant2002
Contributor

@mjbvz My personal issue is with strings first (that's on the released Alpha 1.2):

180a5b60  2880663     34567956 Newtonsoft.Json.Linq.JProperty+JPropertyList
1809a160   858601     35325660 Newtonsoft.Json.Linq.JToken[]
180a30b0   777607     52877276 Newtonsoft.Json.Linq.JObject
711d0764  1045063     54842936 System.Int32[]
18095a58   664961     77272764 System.Collections.Generic.Dictionary`2+Entry[[System.String, mscorlib],[Newtonsoft.Json.Linq.JToken, Newtonsoft.Json]][]
02685e68   881705     94813262      Free
180a38fc  2631037    115765628 Newtonsoft.Json.Linq.JValue
180a3d38  2880663    184362432 Newtonsoft.Json.Linq.JProperty
711ce918  7756903    706724622 System.String

I suspect that this was not resolved by #815, but maybe I'm wrong. I will try to find time to build NTVS from source and let you know if the issue is still there. I see that there have been a lot of good changes in NTVS since the last release, so I'll give it a shot.

@mjbvz
Contributor

mjbvz commented Jun 23, 2016

Please try downloading and using NTVS 1.2 Beta. We've made some significant performance improvements to the product, and the analyzer specifically.

We now use the TypeScript analysis engine by default, instead of our old static analysis engine. This means we no longer need .ntvs_analysis.dat files, and the analyzer should be much more stable and performant. It should also pick up new JavaScript features and fixes much more quickly. We've also made some general memory reduction improvements and fixed a number of leaks.

Please give 1.2 Beta a try, report any issues you run into, and feel free to submit any general feedback. Thanks.
