From here:
"Is it possible to have the scraper not completely quit and exit when the limit is reached, but offer an option to wait a bit and then hit something to continue (make it the non-default button). Would be nice to be able to queue everything up and then just periodically hit continue instead of starting over.
The issue I run into is if I queue up say 200, and it does 45 and quits, but 5 of those didn't auto-scrape when I restart it is going to re-attempt those 5 using up some of my searches. I'd like to be able to continue the queued up job after waiting. As of now, I have to do no more then 40-50 at a time which is brutal"