Out of memory crash in 0.0.6 #4
Comments
If it runs out of memory with a reasonable set of Markdown files, there are some things I can do to try to fix that, but if the only issue here is that it is trying to process huge swaths of other files, that's not a use case I feel is important.
Thanks for your reply @DavidAnson, I missed the new rule. Speaking of this trap I fell into, WDYT of emitting a warning of some sort if the number of files to process becomes ridiculously large?

Looking at how Prettier handles formatting and linting, there's also scope for refactoring the app so that it emits to stdout on every processed file and does not hold too much in the memory heap. This way, the tool will be more fault-tolerant and will emit the output as it goes, which will be useful for users. If I started seeing messages like "error in file temp/abc, error in temp/xyz" in my stdout, I'd press Ctrl+C without waiting for the tool to crash.

Another random thought: how about checking for
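The streaming idea suggested here can be sketched independently of the tool itself. In this sketch, `lintFile` is a hypothetical stand-in for the real per-file lint step (not markdownlint-cli2's actual API); the point is the difference in heap behavior between accumulating all results and emitting them as they are produced:

```javascript
// `lintFile` is a hypothetical stand-in for the real per-file lint step;
// it returns an array of error strings for one file.
function lintFile(name) {
  return name.includes("bad") ? [`${name}: MD000 example error`] : [];
}

// Buffered: every result stays on the heap until all files are done,
// so memory grows with the total error count before anything is printed.
function lintBuffered(files) {
  const all = [];
  for (const f of files) all.push(...lintFile(f));
  return all;
}

// Streaming: emit each file's errors immediately and keep only a counter,
// so the user sees problems as they happen and can Ctrl+C early.
function lintStreaming(files, write = console.log) {
  let errorCount = 0;
  for (const f of files) {
    for (const err of lintFile(f)) {
      write(err);
      errorCount++;
    }
  }
  return errorCount;
}
```

Both variants report the same errors; only the peak memory and the time-to-first-output differ.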
Streaming output would not work with the current implementation of output filters. But maybe it could write some status to stdout by default. Probably just a summary of inputs, then the file count, then a count of errors. With that in place, it should be pretty clear when too many files are being looked at. I'd rather not introduce a whole warning system just for that one scenario that should only ever happen once to people who are migrating.
I've added progress output. It can be turned off. Example output:
Thanks for your reply @DavidAnson! I tried 0.0.8 and generally managed to get a fast result without crashing. Two observations:
Since the above observations are not related to the out-of-memory crash, feel free to close this issue 🙂
Great, thank you, I will close this! By the way, I made a couple of other refactorings intended to avoid unnecessary memory pressure. They will be in the next release.

Regarding your point about progress, note that the kind of progress used here is not the fancy kind that updates the same line in the console or anything like that. It just outputs three lines of plain text at the relevant parts of execution. And they can be turned off entirely if wanted.

Regarding double negation in the ignores property, I'm afraid that might get confusing. Remember that ignores are applied after the original globs from the command line, so they cannot add back in files that were excluded there. They could add back in files that were excluded by the ignore pattern itself, but I'm afraid trying to explain and understand that might be difficult. I'm going to wait and see if this becomes an issue for users.
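The ordering described here (command-line globs select first, ignores subtract afterwards) can be modeled with a toy matcher. This is a sketch only: it uses plain regular expressions instead of markdownlint-cli2's actual glob engine, and the `!`-prefix negation convention is borrowed from gitignore-style syntax for illustration:

```javascript
// Toy model of "globs select, ignores subtract". Because ignores run after
// the command-line globs, a negated ignore ("!pattern") can only restore
// files that the ignores removed — never files the original globs excluded.
function applyGlobsAndIgnores(allFiles, globRegex, ignores) {
  // Step 1: the command-line globs pick the candidate set.
  let selected = allFiles.filter((f) => globRegex.test(f));
  // Step 2: ignores subtract from (or, when negated, add back within) that set.
  for (const pattern of ignores) {
    const negated = pattern.startsWith("!");
    const re = new RegExp(negated ? pattern.slice(1) : pattern);
    if (negated) {
      // A "!" pattern can only reach files that ALSO match the original globs.
      const restored = allFiles.filter((f) => globRegex.test(f) && re.test(f));
      selected = [...new Set([...selected, ...restored])];
    } else {
      selected = selected.filter((f) => !re.test(f));
    }
  }
  return selected;
}
```

For example, with globs matching only `*.md`, a negated ignore can restore an ignored `docs/b.md`, but no ignore pattern can ever pull in `c.txt` — which is the subtlety the comment above calls potentially confusing.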
Commit referenced above: 3404146
This issue is a follow-up to igorshubovych/markdownlint-cli#108.

Trying `markdownlint-cli2 "**/*"` in a project that has a large folder in `.markdownlintignore` results in this output in about 5 minutes:

Repro steps are the same as in igorshubovych/markdownlint-cli#108 (`markdownlint-cli2 "**/*"`), except that the ignored folder is even bigger (354K files distributed across about 6K subfolders).