TLDR: I am looking for a way to display full stack traces when an error occurs during the scraping process. Is there a configuration option or a method in the API that allows for this? If not, I believe this would be a valuable feature to consider for future updates. I have read the How to analyze and fix errors when scraping a website guide and the Best practices when writing scrapers guide (#Error handling), and I also looked into the Configuration docs, but I can't figure out how to display stack traces. Instead, I found this:
Beyond my surprise that a tool made for developers (and with great DX, by the way ❤️) treats stack traces this way, hiding stack traces and swallowing errors is generally considered an anti-pattern in software development. It makes it significantly harder to identify the root cause of issues and can lead to a lot of wasted time and frustration. I strongly believe that developers using Crawlee should have the option to see full stack traces when an error occurs. If there is a way to enable this that I've missed, I'd appreciate your guidance. If not, I urge you to reconsider this design decision and provide a way to display full stack traces in a future update. In the meantime, any suggestions or workarounds for accessing stack traces would be greatly appreciated.
Replies: 2 comments · 3 replies
Stack traces are hidden only when the request will be retried; you should get the full stack trace with the final error message (once all retries fail). Or are we talking about something else?
You can force the stack traces on each retry as well via the CRAWLEE_VERBOSE_LOG env var.
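For illustration, here is a minimal sketch (assuming a standard crawlee project with CheerioCrawler; the log wording is just an example, not Crawlee's built-in output) that combines the env var with the crawler's errorHandler and failedRequestHandler hooks to print error.stack on every attempt yourself:

```ts
import { CheerioCrawler } from 'crawlee';

// Same effect as running with CRAWLEE_VERBOSE_LOG=1 in the shell:
// enable verbose logging before the crawler starts.
process.env.CRAWLEE_VERBOSE_LOG = '1';

const crawler = new CheerioCrawler({
    maxRequestRetries: 3,
    async requestHandler({ request, $ }) {
        // ... scraping logic using $ that may throw ...
    },
    // Runs when the request handler throws and the request will be retried.
    errorHandler({ log, request }, error) {
        log.error(`Attempt failed for ${request.url}\n${error.stack ?? String(error)}`);
    },
    // Runs once all retries are exhausted; this is also where the full
    // stack trace shows up in the default logs.
    failedRequestHandler({ log, request }, error) {
        log.error(`Giving up on ${request.url}\n${error.stack ?? String(error)}`);
    },
});

await crawler.run(['https://crawlee.dev']);
```

If you don't want to touch the code, setting the variable in the shell (e.g. `CRAWLEE_VERBOSE_LOG=1 node crawler.js`) should have the same effect.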