It would be really nifty to be able to run this from the CLI as a command and have it write the JSON to stdout so it could be used as part of a larger tool. For example:
scraply --execute /my-macro
I would imagine this being useful when looking to crawl for things other than HTML documents or using in shell scripts. For example, you could use Scraply to build up a JSON object describing your data on an image hosting website, and then pass that to another command that would download all your images. In that case you'd want to run the script as part of a larger process (or manually as a script) and may not want to worry about binding to a port, making sure the server terminates, and securing the endpoint.
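The image-hosting pipeline described above could look roughly like this. This is a hypothetical sketch: it assumes the proposed `--execute` flag exists and that the macro emits a JSON object with an `images` array of URLs (neither is confirmed by scraply itself), and it uses `jq` to pull the URLs out for a downloader.

```shell
# Hypothetical pipeline: scraply --execute /my-macro would print JSON like
#   {"images":["https://example.com/a.jpg","https://example.com/b.jpg"]}
# Here we simulate that output with a literal string so the pipeline runs as-is.
json='{"images":["https://example.com/a.jpg","https://example.com/b.jpg"]}'

# Extract one URL per line with jq; a real run would start with:
#   scraply --execute /my-macro | jq -r '.images[]' | xargs -n1 curl -LO
echo "$json" | jq -r '.images[]'
```

The point is that with stdout output, scraply composes with standard tools (`jq`, `xargs`, `curl`) instead of requiring a client to poll an HTTP endpoint.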
I know it has been ~two years since you posted this suggestion, but I wanted to let you know that scraply is now just a simple CSS-selector scraper with a very simple CLI interface.