2024-06-16 13:11:01 [scrapy.utils.log] INFO: Scrapy 2.11.0 started (bot: gazette)
2024-06-16 13:11:01 [scrapy.utils.log] INFO: Versions: lxml 4.9.3.0, libxml2 2.10.3, cssselect 1.2.0, parsel 1.8.1, w3lib 2.1.2, Twisted 22.10.0, Python 3.10.7 (main, May 29 2023, 13:51:48) [GCC 12.2.0], pyOpenSSL 23.2.0 (OpenSSL 3.1.3 19 Sep 2023), cryptography 41.0.4, Platform Linux-5.19.0-46-generic-x86_64-with-glibc2.36
2024-06-16 13:11:01 [pr_araucaria] INFO: Collecting data from 2024-05-24 to 2024-06-16.
2024-06-16 13:11:01 [scrapy.addons] INFO: Enabled addons:
[]
2024-06-16 13:11:01 [py.warnings] WARNING: /home/marcos/Documentos/querido-diario/venv/lib/python3.10/site-packages/scrapy/utils/request.py:254: ScrapyDeprecationWarning: '2.6' is a deprecated value for the 'REQUEST_FINGERPRINTER_IMPLEMENTATION' setting. It is also the default value. In other words, it is normal to get this warning if you have not defined a value for the 'REQUEST_FINGERPRINTER_IMPLEMENTATION' setting. This is so for backward compatibility reasons, but it will change in a future version of Scrapy. See the documentation of the 'REQUEST_FINGERPRINTER_IMPLEMENTATION' setting for information on how to handle this deprecation.
  return cls(crawler)
2024-06-16 13:11:01 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.epollreactor.EPollReactor
2024-06-16 13:11:01 [scrapy.extensions.telnet] INFO: Telnet Password: f46e324efa8d29f3
2024-06-16 13:11:01 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.feedexport.FeedExporter',
 'scrapy.extensions.logstats.LogStats',
 'spidermon.contrib.scrapy.extensions.Spidermon',
 'gazette.extensions.StatsPersist']
2024-06-16 13:11:01 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'gazette',
 'COMMANDS_MODULE': 'gazette.commands',
 'DOWNLOAD_TIMEOUT': 360,
 'FILES_STORE_S3_ACL': 'public-read',
 'LOG_FILE': 'log_pr_araucaria.txt',
 'NEWSPIDER_MODULE': 'gazette.spiders',
 'SPIDER_MODULES': ['gazette.spiders'],
 'TEMPLATES_DIR': 'templates',
 'USER_AGENT': 'Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:108.0) '
               'Gecko/20100101 Firefox/108.0'}
2024-06-16 13:11:01 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy_zyte_smartproxy.ZyteSmartProxyMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2024-06-16 13:11:01 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2024-06-16 13:11:01 [scrapy.middleware] INFO: Enabled item pipelines:
['gazette.pipelines.GazetteDateFilteringPipeline',
 'gazette.pipelines.DefaultValuesPipeline',
 'gazette.pipelines.QueridoDiarioFilesPipeline',
 'spidermon.contrib.scrapy.pipelines.ItemValidationPipeline',
 'gazette.pipelines.SQLDatabasePipeline']
2024-06-16 13:11:01 [scrapy.core.engine] INFO: Spider opened
2024-06-16 13:11:01 [gazette.database.models] INFO: Populating 'querido_diario_spider' table - Please wait!
2024-06-16 13:11:01 [gazette.database.models] INFO: Populating 'querido_diario_spider' table - Done!
2024-06-16 13:11:01 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2024-06-16 13:11:01 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2024-06-16 13:11:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2024-06-16 13:11:02 [tzlocal] DEBUG: /etc/timezone found, contents: America/Sao_Paulo
2024-06-16 13:11:02 [tzlocal] DEBUG: /etc/localtime found
2024-06-16 13:11:02 [tzlocal] DEBUG: 2 found: {'/etc/timezone': 'America/Sao_Paulo', '/etc/localtime is a symlink to': 'America/Sao_Paulo'}
2024-06-16 13:11:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "/home/marcos/Documentos/querido-diario/venv/lib/python3.10/site-packages/scrapy/utils/defer.py", line 279, in iter_errback
    yield next(it)
  File "/home/marcos/Documentos/querido-diario/venv/lib/python3.10/site-packages/scrapy/utils/python.py", line 350, in __next__
    return next(self.data)
  File "/home/marcos/Documentos/querido-diario/venv/lib/python3.10/site-packages/scrapy/utils/python.py", line 350, in __next__
    return next(self.data)
  File "/home/marcos/Documentos/querido-diario/venv/lib/python3.10/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/home/marcos/Documentos/querido-diario/venv/lib/python3.10/site-packages/scrapy/spidermiddlewares/offsite.py", line 28, in <genexpr>
    return (r for r in result or () if self._filter(r, spider))
  File "/home/marcos/Documentos/querido-diario/venv/lib/python3.10/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/home/marcos/Documentos/querido-diario/venv/lib/python3.10/site-packages/scrapy/spidermiddlewares/referer.py", line 352, in <genexpr>
    return (self._set_referer(r, response) for r in result or ())
  File "/home/marcos/Documentos/querido-diario/venv/lib/python3.10/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/home/marcos/Documentos/querido-diario/venv/lib/python3.10/site-packages/scrapy/spidermiddlewares/urllength.py", line 27, in <genexpr>
    return (r for r in result or () if self._filter(r, spider))
  File "/home/marcos/Documentos/querido-diario/venv/lib/python3.10/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/home/marcos/Documentos/querido-diario/venv/lib/python3.10/site-packages/scrapy/spidermiddlewares/depth.py", line 31, in <genexpr>
    return (r for r in result or () if self._filter(r, response, spider))
  File "/home/marcos/Documentos/querido-diario/venv/lib/python3.10/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/home/marcos/Documentos/querido-diario/data_collection/gazette/spiders/base/atende_v2.py", line 54, in parse
    download_url = item.css("button::attr(data-link)")[-1].get()
  File "/home/marcos/Documentos/querido-diario/venv/lib/python3.10/site-packages/parsel/selector.py", line 143, in __getitem__
    o = super().__getitem__(pos)
IndexError: list index out of range
2024-06-16 13:11:02 [scrapy.core.engine] INFO: Closing spider (finished)
2024-06-16 13:11:02 [scrapy.extensions.feedexport] INFO: Stored csv feed (0 items) in: pr_araucaria.csv
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] ------------------------------ MONITORS ------------------------------
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] Comparison Between Executions/Days without gazettes... FAIL
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] Requests/Items Ratio/Ratio of requests over items scraped count... OK
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] Error Count Monitor/test_stat_monitor... FAIL
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] Finish Reason Monitor/Should have the expected finished reason(s)... OK
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] Item Validation Monitor/test_stat_monitor... SKIPPED (Unable to find 'spidermon/validation/fields/errors' in job stats.)
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] ----------------------------------------------------------------------
2024-06-16 13:11:02 [pr_araucaria] ERROR: [Spidermon] ======================================================================
FAIL: Comparison Between Executions/Days without gazettes
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/marcos/Documentos/querido-diario/data_collection/gazette/monitors.py", line 69, in test_days_without_gazettes
    self.assertNotEqual(
AssertionError: 0 == 0 : No gazettes scraped in the last 7 days.
2024-06-16 13:11:02 [pr_araucaria] ERROR: [Spidermon] ======================================================================
FAIL: Error Count Monitor/test_stat_monitor
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/marcos/Documentos/querido-diario/venv/lib/python3.10/site-packages/spidermon/contrib/scrapy/monitors/base.py", line 225, in test_stat_monitor
    assertion_method(
AssertionError: Expecting 'log_count/ERROR' to be '<=' to '0.0'. Current value: '1'
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] 5 monitors in 0.003s
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] FAILED (failures=2, skipped=1)
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] -------------------------- FINISHED ACTIONS --------------------------
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] ----------------------------------------------------------------------
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] 0 actions in 0.000s
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] OK
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] --------------------------- PASSED ACTIONS ---------------------------
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] ----------------------------------------------------------------------
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] 0 actions in 0.000s
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] OK
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] --------------------------- FAILED ACTIONS ---------------------------
2024-06-16 13:11:02 [spidermon.contrib.actions.discord] INFO: *pr_araucaria* finished
- Finish time: *2024-06-16 16:11:02.864758+00:00*
- Gazettes scraped: *0*
- 🔥 2 failures 🔥
===== FAILURES =====
Comparison Between Executions/Days without gazettes:
0 == 0 : No gazettes scraped in the last 7 days.
Error Count Monitor/test_stat_monitor:
Expecting 'log_count/ERROR' to be '<=' to '0.0'. Current value: '1'
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] CustomSendDiscordMessage... OK
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] ----------------------------------------------------------------------
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] 1 action in 0.000s
2024-06-16 13:11:02 [pr_araucaria] INFO: [Spidermon] OK
2024-06-16 13:11:02 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 451,
 'downloader/request_count': 1,
 'downloader/request_method_count/GET': 1,
 'downloader/response_bytes': 2341,
 'downloader/response_count': 1,
 'downloader/response_status_count/200': 1,
 'elapsed_time_seconds': 1.24313,
 'feedexport/success_count/FileFeedStorage': 1,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2024, 6, 16, 16, 11, 2, 864758, tzinfo=datetime.timezone.utc),
 'httpcompression/response_bytes': 15325,
 'httpcompression/response_count': 1,
 'log_count/DEBUG': 5,
 'log_count/ERROR': 3,
 'log_count/INFO': 36,
 'log_count/WARNING': 1,
 'memusage/max': 123478016,
 'memusage/startup': 123478016,
 'response_received_count': 1,
 'scheduler/dequeued': 1,
 'scheduler/dequeued/memory': 1,
 'scheduler/enqueued': 1,
 'scheduler/enqueued/memory': 1,
 'spider_exceptions/IndexError': 1,
 'spidermon/validation/validators': 1,
 'spidermon/validation/validators/item/jsonschema': True,
 'start_time': datetime.datetime(2024, 6, 16, 16, 11, 1, 621628, tzinfo=datetime.timezone.utc)}
2024-06-16 13:11:02 [scrapy.core.engine] INFO: Spider closed (finished)
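
Editor's note: the root cause above is `item.css("button::attr(data-link)")[-1].get()` in atende_v2.py raising IndexError when the selector matches no buttons for a gazette entry. A minimal sketch of the failure mode and a defensive pattern for it, assuming nothing about the real spider beyond the traceback (`last_download_url` is a hypothetical helper; the plain list stands in for parsel's SelectorList, which also raises IndexError on `[-1]` when empty):

```python
# "buttons" stands in for the result of item.css("button::attr(data-link)");
# the crawled page evidently rendered an entry with no download button.
def last_download_url(buttons):
    # buttons[-1] raises IndexError on an empty result, so guard first
    # and let the caller skip the entry instead of killing the parse.
    if not buttons:
        return None
    return buttons[-1]

assert last_download_url([]) is None                     # the crashing case
assert last_download_url(["a.pdf", "b.pdf"]) == "b.pdf"  # the happy path
```

With a guard like this, an entry without a download button is skipped (or logged) rather than aborting the callback, so one malformed entry no longer zeroes out the whole run.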