Unable to download beyond 900 chapters - OutOfMemoryError #58
Comments
I seem to have fixed it by downloading and using 64-bit Java, from here: https://www.java.com/en/download/manual.jsp Maybe you could make 64-bit Java one of the prerequisites for using the program? Anyway, thanks again for your work. It makes my military life so much better.
Thanks for the info! I'm not sure what the default heap size on 32-bit is, but it seems to be pretty low. You can manually increase it with the -Xms / -Xmx parameters (java -Xms1g -Xmx2g -jar NovelGrabber...), so I don't think 64-bit is a requirement just yet. If a novel takes more than 4 GB of memory (the 32-bit limit), I'll seriously need to reevaluate my memory management.
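For anyone wanting to confirm that the -Xmx flag actually took effect, here is a minimal Java sketch (Runtime.maxMemory() is standard Java; the class name is just for illustration):

// Prints the JVM's maximum heap size so you can verify the -Xmx setting.
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.printf("Max heap: %d MB%n", maxBytes / (1024 * 1024));
    }
}

On a 32-bit JVM the value reported here is typically only a few hundred MB by default, which would match the behaviour described in this issue.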
For reference, Library of Heaven's Path and Gate of God are the two novels I tried that failed to go over the 930-chapter mark.
Tested it with 50 MB. Ran out of memory after 160 chapters / 800 pages without images. I'm going to take a look at where I can optimize it.
On my system, it seems to use about half a GB at first while grabbing, slowly increasing to just over a GB once it is complete.
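The grabber's internals aren't shown in this thread, so the following is only a sketch of a common pattern for keeping memory flat: extract the chapter content from each parsed jsoup Document immediately and retain only the resulting string, rather than holding every Document until the book is assembled. The class name, method name, and CSS selector below are hypothetical, not NovelGrabber's actual API.

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;

// Sketch: fetch one chapter, keep only the extracted content string,
// and let the full parsed Document become eligible for garbage collection.
public class ChapterFetchSketch {
    static String fetchChapterHtml(String url, String contentSelector) throws java.io.IOException {
        Document doc = Jsoup.connect(url).get();     // full parsed DOM, potentially large
        return doc.select(contentSelector).html();   // keep only the part the e-book needs
    }
}

If memory still climbs to over a GB after the grab completes, that would suggest something is holding references to per-chapter data until the file is written.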
What changed from before, when I could download 4000+ chapters without issue?
Tried to download some novels on novelfull; when I get to chapter 900-1000ish, it just stops.
Ran it with the console and got the following error:
Exception in thread "pool-2-thread-1" java.lang.OutOfMemoryError: Java heap space
at org.jsoup.parser.CharacterReader.<init>(CharacterReader.java:36)
at org.jsoup.parser.CharacterReader.<init>(CharacterReader.java:41)
at org.jsoup.parser.TreeBuilder.initialiseParse(TreeBuilder.java:38)
at org.jsoup.parser.HtmlTreeBuilder.initialiseParse(HtmlTreeBuilder.java:65)
at org.jsoup.parser.TreeBuilder.parse(TreeBuilder.java:46)
at org.jsoup.parser.Parser.parseInput(Parser.java:35)
at org.jsoup.helper.DataUtil.parseInputStream(DataUtil.java:175)
at org.jsoup.helper.HttpConnection$Response.parse(HttpConnection.java:835)
at org.jsoup.helper.HttpConnection.get(HttpConnection.java:287)
at grabber.scripts.ChapterContentScripts.defaults(ChapterContentScripts.java:222)
at grabber.scripts.ChapterContentScripts.fetchContent(ChapterContentScripts.java:45)
at grabber.Chapter.saveChapter(Chapter.java:48)
at grabber.Novel.downloadChapters(Novel.java:137)
at gui.GUI.lambda$null$2(GUI.java:332)
at gui.GUI$$Lambda$67/20323159.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
It seems that Java ran out of memory. Is there a way to allocate more RAM to the program? Thanks for your work.
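The trace shows the heap filling up inside jsoup's parser while fetching a chapter, which suggests memory accumulated over the preceding chapters rather than in one big allocation. A quick way to watch that growth is to log heap usage as chapters are saved; a minimal sketch using only standard Runtime calls (where this gets called from is up to the grabber, so the hook point is hypothetical):

// Sketch: log approximate heap usage, e.g. once per downloaded chapter,
// to see whether memory grows steadily during a long grab.
public class HeapLogger {
    public static void logHeapUsage(int chapterNumber) {
        Runtime rt = Runtime.getRuntime();
        long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
        long maxMb = rt.maxMemory() / (1024 * 1024);
        System.out.println("Chapter " + chapterNumber + ": heap " + usedMb + " / " + maxMb + " MB");
    }
}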