
MemoryError and duplicate subs #39

Open
shinji257 opened this issue Jul 26, 2017 · 4 comments
Comments

@shinji257
Contributor

shinji257 commented Jul 26, 2017

Currently there is an issue where, if the connection stalls for some reason, the Python process's memory usage can inflate. It keeps growing until either the connection recovers or the system throws a MemoryError. I'm attempting to resolve this by catching the error during the download so it gets handled.

In current testing it catches the exception at most twice in the same run. The first time, it attempts to reset progress and restart; there is a flush in the progress function, so theoretically that should release the excess memory. If the exception happens a second time, it gives up: the exception is still caught, but instead of restarting progress it breaks execution.
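A rough sketch of that catch-twice logic, with `fetch` and `reset` as stand-ins for the project's actual download and progress-reset routines (illustrative names only, not the real API):

```python
def download_with_memory_guard(fetch, reset):
    """Catch MemoryError at most twice in the same run: reset progress
    and restart on the first one, give up on the second.

    Sketch only; fetch/reset stand in for the real download and
    progress-reset functions in the project.
    """
    errors = 0
    while True:
        try:
            fetch()
            return True  # download completed
        except MemoryError:
            errors += 1
            if errors >= 2:
                return False  # second failure: break execution
            reset()  # first failure: reset progress and retry
```

The key point is that the second MemoryError exits the loop instead of retrying forever.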

There is another issue (found while retrieving Web Ghost Pipopa Episode 39) where a show may have a duplicate subtitle entry. It appears that when a typo is corrected, a new entry is added and the old one is never removed. I was able to see both versions, and there was a single-character difference that looks like a typo fix.

Currently, when downloading/decrypting, it ends up with the most recent revision, but the mkv gets a duplicate entry, and because of this a file-not-found error is thrown at the end during cleanup.

I've added exception catches for WindowsError and OSError during cleanup so that it can move on (it still prints a cleaner version of the error for review), and during the mkvmerge step it checks whether the file is already in the command line being built. If it is, it just continues to the next iteration and avoids adding a duplicate into the mkv.
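A minimal sketch of both fixes under assumed function names (the real code paths in the project differ):

```python
import os

def cleanup_temp_files(paths):
    """Tolerate files that are already gone (e.g. a duplicate subtitle
    entry that was only written to disk once): catch OSError and move on.
    On Python 2 on Windows, WindowsError is a subclass of OSError,
    so catching OSError covers both."""
    for path in paths:
        try:
            os.remove(path)
        except OSError as err:
            # Print a cleaner version of the error for review, then continue.
            print("cleanup: skipping %s (%s)" % (path, err))

def build_mkvmerge_command(output, subtitle_files):
    """Skip subtitle files already present in the command line being
    built, so the mkv does not get a duplicate track."""
    cmd = ["mkvmerge", "-o", output]
    for sub in subtitle_files:
        if sub in cmd:
            continue  # already added; avoid a duplicate entry
        cmd.append(sub)
    return cmd
```

With a duplicate subtitle entry in the input list, the second occurrence is simply skipped, so mkvmerge only ever sees each file once.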

This part has been fully tested on the test episode in question. It should presumably also work for multi-language setups, although I don't know of any to test against.

@shinji257
Contributor Author

I also got an ssl.SSLError for a timeout. Since timeouts are already handled, I want to try to handle this one directly as well. I don't know what the error code is, though, so I'm hoping to capture it if it happens again.
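One way to capture the unknown error details on the next occurrence is to catch the exception and report its `args` before deciding how to handle it. A hedged sketch, where `action` stands in for the real download call:

```python
import ssl

def run_and_report_ssl_error(action):
    """Run the download step; if an ssl.SSLError escapes, return its
    args so the specific timeout code can be identified later.

    'action' is an illustrative stand-in for the real download call.
    """
    try:
        action()
        return None
    except ssl.SSLError as err:
        return err.args
```

Once the actual code/message is known, the generic catch can be narrowed to re-raise anything that is not the timeout case.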

@jaw20
Owner

jaw20 commented Nov 2, 2017

Have you been able to fix the memory leak?

@shinji257
Contributor Author

No. I tried to clear it up but it still showed up every now and again. When I tried to set up a "catch and restart" loop it didn't actually clear the memory and got stuck in an infinite loop. Eventually, I gave up on the issue.

@alzamer2
Collaborator

alzamer2 commented Nov 3, 2017

@shinji257 I don't know if it's helpful, but I have been rewriting the code.
Can you check whether the memory leak happens in the new code?
It's not complete, but it can begin downloading:
https://github.com/alzamer2/Crunchyroll-XML-Decoder/tree/code_overhaul
