
Multi range incrementing crashes website #24

Open
placeholder10 opened this issue Dec 21, 2024 · 1 comment

@placeholder10

Description

I have tried multi-incrementing with ranges.
I followed the instructions in your help guide, but it seems like I am missing something.

I would like to find the other photos this listing has. I noticed only the first 5 and the last 4 digits change.
This is why I want to do multi-range incrementing: to treat each of those parts of the number as its own range.

This is the URL:
https://img.jofogas.hu/hdimages/E_mozgo_elektromos_kerekpar_elado__851232547195571.jpg
This is what I tried:
https://img.jofogas.hu/hdimages/E_mozgo_elektromos_kerekpar_elado__[85123-99999]254719[0000-9999].jpg
I selected the first bracketed part and clicked the multi-increment button, then did the same with the second one. When I click Accept, the site crashes.
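
To illustrate what I mean, here is a rough sketch (not the extension's code; `expandRanges` is just a hypothetical helper) of how a pattern with two bracketed ranges multiplies out:

```ts
// Illustrative only: a sketch of what a pattern like
// ...__[85123-99999]254719[0000-9999].jpg implies, not the extension's algorithm.

// Hypothetical helper: expand every "[start-end]" token in a URL pattern,
// preserving zero-padding based on the written width of the start bound.
function expandRanges(pattern: string): string[] {
  const match = pattern.match(/\[(\d+)-(\d+)\]/);
  if (!match) return [pattern];            // no ranges left: this is a final URL
  const [token, startStr, endStr] = match;
  const width = startStr.length;
  const results: string[] = [];
  for (let n = parseInt(startStr, 10); n <= parseInt(endStr, 10); n++) {
    const value = String(n).padStart(width, "0");
    // Recurse so a later range multiplies with this one.
    results.push(...expandRanges(pattern.replace(token, value)));
  }
  return results;
}

// A tiny example stays manageable (3 * 3 = 9 URLs)...
console.log(expandRanges("https://example.com/img_[1-3]x[7-9].jpg").length); // 9

// ...but the pattern from this issue would multiply 14,877 * 10,000 values,
// which is far too many URLs to build eagerly like this.
```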

URL

https://img.jofogas.hu/hdimages/E_mozgo_elektromos_kerekpar_elado__851232547195571.jpg

Version

6.1

Browser

Chrome 131.0.6778.205

OS

Windows 10

Device

PC

@sixcious
Owner

Hi placeholder10, thank you for opening this issue! 💜

Yes, unfortunately that's just way too large a multi range for the browser and system to handle, so it will hang for a bit and then throw an Out of Memory error. What you're basically asking for is to multi-increment 14,877 * 10,000 = ~149,000,000 URLs (14,877 values in the first range times 10,000 in the second), which is a huge amount of text to store in memory.
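
For a sense of scale, here's a back-of-the-envelope sketch (the bytes-per-character figure is an assumption, not something measured in the extension):

```ts
// Rough estimate of how much string data the attempted pattern would generate.
const firstRange = 99999 - 85123 + 1;   // 14,877 values in [85123-99999]
const secondRange = 9999 - 0 + 1;       // 10,000 values in [0000-9999]
const totalUrls = firstRange * secondRange;   // 148,770,000 URL combinations

const urlLength =
  "https://img.jofogas.hu/hdimages/E_mozgo_elektromos_kerekpar_elado__851232547195571.jpg"
    .length;
// Assume ~2 bytes per character as a rough upper bound; actual engine storage varies.
const approxBytes = totalUrls * urlLength * 2;

console.log(totalUrls.toLocaleString());                  // e.g. "148,770,000"
console.log((approxBytes / 1024 ** 3).toFixed(1), "GiB"); // roughly 24 GiB under these assumptions
```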

Multi increment was really designed for more reasonable URL ranges that are broken up into smaller, directory-level parts, like https://example.com/files/1/1/001.jpg.

You bring up a good point, and I should probably add some validation/warning text for cases like this. I just need to come up with a good "warning limit," since I believe it depends on your available memory, which varies.

Can you try smaller ranges? Something in the tens of thousands of URLs total should be OK based on my testing on an 8 GB RAM device. Your second range is already at 10,000 values, so the first range would need to be a lot smaller. Though I can't imagine you'll actually want to increment even that many times.
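
As a rough sketch only (the `batches` helper below is hypothetical, not a feature of the extension), splitting the first range into batches of 5 would keep each run around 5 * 10,000 = 50,000 URLs, in line with the "tens of thousands" guidance, though you'd end up with ~3,000 separate runs:

```ts
// Hypothetical sketch: split a numeric range into fixed-size sub-ranges that
// could each be incremented in a separate, smaller run.
function* batches(start: number, end: number, size: number): Generator<[number, number]> {
  for (let lo = start; lo <= end; lo += size) {
    yield [lo, Math.min(lo + size - 1, end)];
  }
}

const subRanges = [...batches(85123, 99999, 5)];
console.log(subRanges.length); // 2976 sub-ranges of ~50,000 URLs each when combined with [0000-9999]
console.log(subRanges[0]);     // [ 85123, 85127 ]
```

That many runs isn't practical either, which is really the point: narrowing the ranges themselves is the better fix.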

Please let me know if I can provide any more assistance. I think you might need to find ways to break the problem down even further so it's something more manageable to solve.

@sixcious sixcious self-assigned this Dec 21, 2024