• LogParser v0.8.2, last updated at 2026-03-26 15:12:12, http://scrapyd-0:6800/logs/sourcing_v2/bca.uk/task_51_2026-03-26T14_00_00.json

PROJECT (sourcing_v2), SPIDER (bca.uk)

  • project: sourcing_v2
    spider: bca.uk
    job: task_51_2026-03-26T14_00_00
    first_log_time: 2026-03-26 14:00:10
    latest_log_time: 2026-03-26 15:12:06
    runtime: 1:11:56
    crawled_pages: 1610
    scraped_items: 3400
    shutdown_reason: N/A
    finish_reason: finished
    log_critical_count: 0
    log_error_count: 166
    log_warning_count: 423
    log_redirect_count: 16154
    log_retry_count: 0
    log_ignore_count: 0
    latest_crawl:
    latest_scrape:
    latest_log:
    current_time:
    latest_item: N/A
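    The header links a per-job stats JSON at the Scrapyd endpoint, so the counters above can also be read programmatically. A minimal sketch, assuming the JSON uses the same field names as the table (`crawled_pages`, `log_error_count`) and that the `scrapyd-0:6800` host is only reachable from inside the cluster:

    ```python
    import json
    from urllib.request import urlopen

    # Stats JSON for this job, as linked in the report header.
    STATS_URL = ("http://scrapyd-0:6800/logs/sourcing_v2/bca.uk/"
                 "task_51_2026-03-26T14_00_00.json")

    def fetch_stats(url: str = STATS_URL) -> dict:
        """Download the LogParser stats document for one job."""
        with urlopen(url) as resp:
            return json.load(resp)

    def error_rate(stats: dict) -> float:
        """Errors per crawled page, using the counter names shown in the table."""
        pages = stats.get("crawled_pages") or 0
        return (stats.get("log_error_count") or 0) / pages if pages else 0.0
    ```

    For this run, `error_rate` would come out to 166 / 1610, roughly one error per ten crawled pages.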

    • error_logs (last 10 of 166 shown)

      2026-03-26 14:45:14 [crawlers.middlewares.monitoring_spider_middleware] ERROR: 'NL23%20XMG'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 36, in process_spider_output
          item._scrape_type = spider.identifier_scrape_type[identifier]
                              ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^
      KeyError: 'NL23%20XMG'
      2026-03-26 14:45:14 [crawlers.middlewares.monitoring_spider_middleware] ERROR: 'NL23%20XMG'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 36, in process_spider_output
          item._scrape_type = spider.identifier_scrape_type[identifier]
                              ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^
      KeyError: 'NL23%20XMG'
      2026-03-26 14:45:14 [crawlers.middlewares.monitoring_spider_middleware] ERROR: 'NL23%20XMG'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 36, in process_spider_output
          item._scrape_type = spider.identifier_scrape_type[identifier]
                              ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^
      KeyError: 'NL23%20XMG'
      2026-03-26 14:45:14 [scrapy.core.scraper] ERROR: Spider error processing <GET https://bcamediaprod.blob.core.windows.net/private/pdfs/InspectionBase/GB/NL23XMG/42774667?sv=2020-08-04&st=2026-03-26T14%3A45%3A11Z&se=2026-03-26T15%3A10%3A11Z&sr=b&sp=r&sig=WY1XJE9DYFLjuamSv%2FwEdEiviehcA3Lode3Fzg66W84%3D> (referer: https://www.bca.co.uk/)
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/defer.py", line 295, in aiter_errback
          yield await it.__anext__()
                ^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy_zyte_api/_middlewares.py", line 206, in process_spider_output_async
          async for item_or_request in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 36, in process_spider_output
          item._scrape_type = spider.identifier_scrape_type[identifier]
                              ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^
      KeyError: 'NL23%20XMG'
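      All of these tracebacks bottom out in the same place: `report_download_middleware.py` line 36, where `spider.identifier_scrape_type[identifier]` raises `KeyError`. The failing key `'NL23%20XMG'` is the percent-encoded form of `'NL23 XMG'`, while the PDF URL in the scraper error uses `NL23XMG` with no space, which suggests the mapping was populated with a decoded or space-free key and the lookup is done with the raw URL-encoded one. A minimal sketch of a normalizing lookup, under that assumption (the `identifier_scrape_type` contents and the `"inspection_report"` value here are hypothetical):

      ```python
      from urllib.parse import unquote

      def normalize_identifier(raw: str) -> str:
          """Decode percent-escapes and drop spaces so 'NL23%20XMG',
          'NL23 XMG' and 'NL23XMG' all map to the same key."""
          return unquote(raw).replace(" ", "").upper()

      # Hypothetical mapping as the spider might hold it, keyed on the
      # normalized registration.
      identifier_scrape_type = {"NL23XMG": "inspection_report"}

      # .get() instead of [] keeps a still-unknown identifier from raising
      # and aborting the rest of the middleware chain.
      raw_identifier = "NL23%20XMG"  # value seen in the KeyError above
      scrape_type = identifier_scrape_type.get(normalize_identifier(raw_identifier))
      ```

      Normalizing at both the point where the dict is built and the point where it is read would make the two sides agree regardless of which form the site hands back.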
      2026-03-26 14:45:15 [crawlers.middlewares.monitoring_spider_middleware] ERROR: 'MK21%20YWO'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 36, in process_spider_output
          item._scrape_type = spider.identifier_scrape_type[identifier]
                              ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^
      KeyError: 'MK21%20YWO'
      2026-03-26 14:45:15 [crawlers.middlewares.monitoring_spider_middleware] ERROR: 'MK21%20YWO'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 36, in process_spider_output
          item._scrape_type = spider.identifier_scrape_type[identifier]
                              ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^
      KeyError: 'MK21%20YWO'
      2026-03-26 14:45:15 [crawlers.middlewares.monitoring_spider_middleware] ERROR: 'MK21%20YWO'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 36, in process_spider_output
          item._scrape_type = spider.identifier_scrape_type[identifier]
                              ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^
      KeyError: 'MK21%20YWO'
      2026-03-26 14:45:15 [crawlers.middlewares.monitoring_spider_middleware] ERROR: 'MK21%20YWO'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 36, in process_spider_output
          item._scrape_type = spider.identifier_scrape_type[identifier]
                              ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^
      KeyError: 'MK21%20YWO'
      2026-03-26 14:45:15 [crawlers.middlewares.monitoring_spider_middleware] ERROR: 'MK21%20YWO'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 36, in process_spider_output
          item._scrape_type = spider.identifier_scrape_type[identifier]
                              ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^
      KeyError: 'MK21%20YWO'
      2026-03-26 14:45:15 [scrapy.core.scraper] ERROR: Spider error processing <GET https://bcamediaprod.blob.core.windows.net/private/pdfs/InspectionBase/GB/MK21YWO/42636009?sv=2020-08-04&st=2026-03-26T14%3A45%3A12Z&se=2026-03-26T15%3A10%3A12Z&sr=b&sp=r&sig=M%2FLjtwqZ%2Bxc17b60%2FKWwOzly3nNve%2BJXnqp5mPjOTsI%3D> (referer: https://www.bca.co.uk/)
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/defer.py", line 295, in aiter_errback
          yield await it.__anext__()
                ^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy_zyte_api/_middlewares.py", line 206, in process_spider_output_async
          async for item_or_request in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 36, in process_spider_output
          item._scrape_type = spider.identifier_scrape_type[identifier]
                              ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^
      KeyError: 'MK21%20YWO'
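The KeyError above points at a key-normalisation mismatch: the failing identifier `'MK21%20YWO'` is URL-encoded (an escaped space), while the keys in `spider.identifier_scrape_type` are presumably stored in decoded form. A minimal sketch of a tolerant lookup, assuming exactly that mismatch — `lookup_scrape_type` and its `default` parameter are hypothetical names for illustration, not part of the crawler:

```python
from urllib.parse import unquote

def lookup_scrape_type(identifier_scrape_type, identifier, default=None):
    """Look up a scrape type, tolerating URL-encoded identifiers.

    'MK21%20YWO' and 'MK21 YWO' are treated as the same key.
    """
    if identifier in identifier_scrape_type:
        return identifier_scrape_type[identifier]
    # Fall back to the percent-decoded form before giving up.
    return identifier_scrape_type.get(unquote(identifier), default)

# Example: the table was keyed on the decoded registration.
table = {"MK21 YWO": "report"}
print(lookup_scrape_type(table, "MK21%20YWO"))  # -> "report"
```

Normalising at both write and read time (always storing and looking up the decoded form) would remove the need for the fallback entirely.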
    • warning_logs
      last 10 of 423

      2026-03-26 15:10:49 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=137&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 15:11:19 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=138&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 15:11:19 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=138&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 15:11:19 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=138&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 15:11:30 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=139&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 15:11:30 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=139&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 15:11:30 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=139&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 15:11:52 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=140&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 15:11:53 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=140&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 15:11:53 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=140&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
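The warnings above come from scrapy-zyte-api: when a request spells out the Zyte API `customHttpRequestHeaders` parameter by hand, that parameter overrides whatever `Request.headers` carries, so the plugin asks that headers be declared once on the Request and translated automatically. The parameter's shape is a list of name/value objects; a small sketch of that translation, with the helper name being an assumption rather than the plugin's API:

```python
def to_custom_http_request_headers(headers):
    """Convert a plain headers mapping into the Zyte API
    customHttpRequestHeaders shape: a list of {"name", "value"} dicts.

    This mirrors the translation scrapy-zyte-api performs from
    Request.headers, which is why also defining the parameter by hand
    only duplicates (and overrides) what the Request already carries.
    """
    return [{"name": k, "value": v} for k, v in headers.items()]

print(to_custom_http_request_headers({"Accept": "application/json"}))
# -> [{'name': 'Accept', 'value': 'application/json'}]
```

Removing the hand-written parameter from the spider's request meta and keeping only `Request(headers=...)` should silence these warnings.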



    • redirect_logs
      last 10 of 16154

      2026-03-26 14:47:03 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/LA22OSL/695903177/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=695903177>
      2026-03-26 14:47:03 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/SX67ECJ/694561537/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=694561537>
      2026-03-26 14:47:03 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/PO67AZA/694304924/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=694304924>
      2026-03-26 14:47:03 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/NX11EFC/694332834/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=694332834>
      2026-03-26 14:47:03 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/BK22ENX/697150536/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=697150536>
      2026-03-26 14:47:04 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/GJ66TWK/694304566/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=694304566>
      2026-03-26 14:47:06 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/PO67AZA/694304910/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=694304910>
      2026-03-26 14:47:06 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/NX11EFC/694332830/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=694332830>
      2026-03-26 14:47:06 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/SX67ECJ/694561535/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=694561535>
      2026-03-26 14:47:07 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/GJ66TWK/694304557/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=694304557>
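At 16154 entries, these per-request redirect lines dominate the log. If the www1.bcaimage.com to bcamediaprod.blob.core.windows.net hop is an expected CDN redirect, one option is to raise the level of just that middleware's logger rather than the whole crawl; a sketch, assuming it is applied in the project's settings or startup code:

```python
import logging

# Raise the threshold only for the redirect middleware's logger, so its
# per-request "Redirecting (302)" DEBUG lines are dropped while DEBUG
# output from every other component stays available.
logging.getLogger("scrapy.downloadermiddlewares.redirect").setLevel(logging.INFO)
```

The logger name matches the bracketed component in the log lines above, which is how Scrapy names its per-module loggers.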


    • scrapy_version

      2.11.2
    • telnet_console

      127.0.0.1:6024
    • telnet_password

      803a0709ad6c6dfe
    • latest_crawl

      2026-03-26 15:12:02 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=140&sort=MostRecentlyAdded> (referer: https://www.bca.co.uk/search) ['zyte-api']
    • latest_scrape

      2026-03-26 15:12:01 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.bca.co.uk/api/search?q=&pageSize=100&page=138&sort=MostRecentlyAdded>
    • latest_stat

      2026-03-26 15:11:11 [scrapy.extensions.logstats] INFO: Crawled 1606 pages (at 3 pages/min), scraped 3319 items (at 100 items/min)
    • Head

      2026-03-26 14:00:10 [scrapy.utils.log] INFO: Scrapy 2.11.2 started (bot: SourcingV2)
      2026-03-26 14:00:10 [scrapy.utils.log] INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.2 (main, Mar 13 2023, 12:18:29) [GCC 12.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-5.15.0-1098-azure-x86_64-with-glibc2.36
      2026-03-26 14:00:10 [bca.uk] INFO: Starting spider bca.uk
      2026-03-26 14:00:10 [azure.identity._credentials.environment] INFO: Incomplete environment configuration for EnvironmentCredential. These variables are set: AZURE_CLIENT_ID, AZURE_TENANT_ID
      2026-03-26 14:00:10 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): login.microsoftonline.com:443
      2026-03-26 14:00:10 [urllib3.connectionpool] DEBUG: https://login.microsoftonline.com:443 "POST /8ea908c1-4e85-4692-bc3f-3646b9b40891/oauth2/v2.0/token HTTP/1.1" 200 2118
      2026-03-26 14:00:10 [azure.identity._credentials.chained] INFO: DefaultAzureCredential acquired a token from WorkloadIdentityCredential
      2026-03-26 14:00:10 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): alxsourcingstorageprod.table.core.windows.net:443
      2026-03-26 14:00:10 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 14:00:10 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /SpiderData()?$filter=PartitionKey%20eq%20%27BCAUk%27%20and%20RowKey%20eq%20%27cookies%27 HTTP/1.1" 200 None
      2026-03-26 14:00:10 [bca.uk] INFO: No cached cookies found, will perform fresh login
      2026-03-26 14:00:10 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 14:00:10 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /SpiderData()?$filter=PartitionKey%20eq%20%27BCAUk%27%20and%20RowKey%20eq%20%27auth_token%27 HTTP/1.1" 200 None
      2026-03-26 14:00:10 [bca.uk] INFO: Loaded cached Auth0 token from Azure Tables (length=1492)
      2026-03-26 14:00:10 [scrapy.addons] INFO: Enabled addons:
      []
      2026-03-26 14:00:10 [asyncio] DEBUG: Using selector: EpollSelector
      2026-03-26 14:00:10 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.asyncioreactor.AsyncioSelectorReactor
      2026-03-26 14:00:10 [scrapy.utils.log] DEBUG: Using asyncio event loop: asyncio.unix_events._UnixSelectorEventLoop
      2026-03-26 14:00:10 [scrapy.extensions.telnet] INFO: Telnet Password: 803a0709ad6c6dfe
      2026-03-26 14:00:10 [scrapy.middleware] INFO: Enabled extensions:
      ['scrapy.extensions.corestats.CoreStats',
       'scrapy.extensions.telnet.TelnetConsole',
       'scrapy.extensions.memusage.MemoryUsage',
       'scrapy.extensions.feedexport.FeedExporter',
       'scrapy.extensions.logstats.LogStats',
       'scrapy.extensions.closespider.CloseSpider']
      2026-03-26 14:00:10 [scrapy.crawler] INFO: Overridden settings:
      {'BOT_NAME': 'SourcingV2',
       'CLOSESPIDER_TIMEOUT': 7200,
       'DOWNLOAD_MAXSIZE': 52428800,
       'DOWNLOAD_WARNSIZE': 10485760,
       'FEED_EXPORT_ENCODING': 'utf-8',
       'LOG_FILE': '/var/log/scrapyd/logs/sourcing_v2/bca.uk/task_51_2026-03-26T14_00_00.log',
       'LOG_FORMATTER': 'crawlers.log_formatter.SourcingLogFormatter',
       'MEMUSAGE_LIMIT_MB': 2048,
       'MEMUSAGE_WARNING_MB': 1536,
       'NEWSPIDER_MODULE': 'spiders',
       'REQUEST_FINGERPRINTER_CLASS': 'scrapy_zyte_api.ScrapyZyteAPIRequestFingerprinter',
       'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
       'SPIDER_MODULES': ['spiders', 'auth_check'],
       'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor',
       'USER_AGENT': ''}
      2026-03-26 14:00:10 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2026-03-26 14:00:10 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2026-03-26 14:00:10 [scrapy.middleware] INFO: Enabled downloader middlewares:
      ['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
       'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
       'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
       'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
       'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPIDownloaderMiddleware',
       'scrapy.downloadermiddlewares.retry.RetryMiddleware',
       'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
       'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
       'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
       'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
       'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
       'scrapy.downloadermiddlewares.stats.DownloaderStats']
      2026-03-26 14:00:10 [crawlers.middlewares.id_gen_middleware] INFO: Setting up IdGenerationMiddleware
      2026-03-26 14:00:10 [scrapy.middleware] INFO: Enabled spider middlewares:
      ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPISpiderMiddleware',
       'crawlers.middlewares.monitoring_spider_middleware.MonitoringSpiderMiddleware',
       'scrapy.spidermiddlewares.referer.RefererMiddleware',
       'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
       'scrapy.spidermiddlewares.depth.DepthMiddleware',
       'crawlers.middlewares.photo_download_middleware.PhotoDownloadMiddleware',
       'crawlers.middlewares.report_download_middleware.ReportDownloadMiddleware',
       'crawlers.middlewares.id_gen_middleware.IdGenMiddleware']
      2026-03-26 14:00:11 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 14:00:11 [crawlers.pipelines.translation_pipeline] INFO: Loading translations for language: auto
      2026-03-26 14:00:11 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /Translations()?$filter=PartitionKey%20eq%20%27auto%27%20and%20RowKey%20eq%20%27auto%27 HTTP/1.1" 200 None
      2026-03-26 14:00:11 [crawlers.pipelines.item_rules_pipeline] INFO: Setting up ItemRules Pipeline
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_location_for_country.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_cars_from_auction_title.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_country.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_fr.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_photos.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_from_info.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_not_allowed.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: not_operable_from_info.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_models_not_allowed.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_title.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: imported_cars.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_currency.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_mileage.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_auction_title.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_country_of_origin.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_pt.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: electric_cars.json
      2026-03-26 14:00:11 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_color.json
      2026-03-26 14:00:11 [crawlers.pipelines.post_to_api] INFO: Setting up PostToApi Pipeline pointing to https://api.app.infinit.cc/api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing
      2026-03-26 14:00:11 [scrapy.middleware] INFO: Enabled item pipelines:
      ['crawlers.pipelines.translation_pipeline.TranslationPipeline',
       'crawlers.pipelines.item_rules_pipeline.ItemRulesPipeline',
       'crawlers.pipelines.post_to_api.PostToApiPipeline']
      2026-03-26 14:00:11 [scrapy.core.engine] INFO: Spider opened
      2026-03-26 14:00:11 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      2026-03-26 14:00:11 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6024
    • Tail

      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='KY65%2520RPV') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='L40%2520YAW') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LB18%2520UUG') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LB59%2520NWT') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LC17%2520FTZ') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LD16%2520XPT') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LF66%2520BNB') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LK15%2520ZFH') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LK63%2520ZJO') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LL66%2520AJV') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LL66%2520JUH') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LM06%2520BJZ') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LM22%2520HKY') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LP62%2520UWM') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LP65%2520EKV') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LP66%2520VJU') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LR11%2520EZU') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LR17%2520FPE') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LR64%2520PKY') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LS12%2520MSV') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LV11%2520YGW') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LX14%2520JKY') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LY04%2520RGV') HTTP/1.1" 204 0
      2026-03-26 15:12:04 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LY11%2520BFL') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='M222%2520DMH') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='MA17%2520AKP') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='MA21%2520XKD') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='MA62%2520HYN') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='MA66%2520FVE') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='MF64%2520EZH') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='MJ12%2520HYC') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='MJ14%2520XPE') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='MJ68%2520LZP') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='MT65%2520XKL') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='MW14%2520LWR') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='NA62%2520FUB') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='NA63%2520TPU') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='NG14%2520SGX') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='NJ12%2520PNK') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='NL11%2520YVS') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='NL12%2520UWJ') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='NL17%2520YYF') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='NX17%2520VXZ') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='OE12%2520RYB') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='PA04%2520TOR') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='PJ11%2520UDS') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='PK07%2520LXH') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='PK64%2520VZN') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='PK66%2520FZC') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='PO12%2520XPG') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='PX67%2520WUK') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='RJ57%2520AFN') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='RJ58%2520XNY') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='RO18%2520ATK') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='RO57%2520HFD') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='RX65%2520XOW') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='RX66%2520XBV') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='RX67%2520XVM') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SA15%2520FBK') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SC13%2520FVJ') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SD61%2520XYL') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SD64%2520HHM') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SH14%2520NKC') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SJ64%2520ZVE') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SK64%2520NZX') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SK69%2520UGX') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SL13%2520LAA') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SL68%2520HHK') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SO08%2520VPK') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SP10%2520YKN') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SR66%2520NVN') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SW13%2520MXS') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SW16%2520TYY') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SW18%2520SOE') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SW65%2520USM') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VE14%2520EHR') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VE62%2520YWC') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VE67%2520AEM') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VK51%2520KAJ') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VK60%2520FDG') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VK64%2520HSJ') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VK65%2520HGU') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VN16%2520YSJ') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VN17%2520LTO') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VO10%2520DSZ') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VO59%2520MTE') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VO60%2520EGX') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VX16%2520FJN') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WF64%2520ONV') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WF70%2520GBU') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WG13%2520TOJ') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WG15%2520WHP') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WH65%2520TYT') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WJ12%2520HSX') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WJ64%2520WVW') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WJ65%2520YHA') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WJZ%25206356') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WM15%2520NME') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WM66%2520XER') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WM68%2520ELH') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WN18%2520AZU') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WNZ%25204272') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WO16%2520WZC') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WO67%2520DZT') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WP67%2520NDJ') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WR15%2520YJC') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WR64%2520VBG') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WU65%2520UTC') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WV18%2520NVC') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WV55%2520VLW') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YA15%2520VFB') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YA63%2520OHG') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YB10%2520VVC') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YB65%2520YKO') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YC12%2520WGJ') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YE15%2520LYY') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YF57%2520KSV') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YG13%2520YFE') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YG14%2520XSA') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YG64%2520XPB') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YJ12%2520MHV') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YJ15%2520BMT') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YK57%2520WXZ') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YK66%2520NEA') HTTP/1.1" 204 0
      2026-03-26 15:12:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YK66%2520ZFE') HTTP/1.1" 204 0
      2026-03-26 15:12:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YM18%2520ZZP') HTTP/1.1" 204 0
      2026-03-26 15:12:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YN66%2520YVA') HTTP/1.1" 204 0
      2026-03-26 15:12:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YO15%2520LRX') HTTP/1.1" 204 0
      2026-03-26 15:12:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YS61%2520EDX') HTTP/1.1" 204 0
      2026-03-26 15:12:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YT64%2520XWF') HTTP/1.1" 204 0
      2026-03-26 15:12:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YX14%2520KVB') HTTP/1.1" 204 0
      2026-03-26 15:12:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YY15%2520JVG') HTTP/1.1" 204 0
      2026-03-26 15:12:06 [scrapy.extensions.feedexport] INFO: Stored jsonlines feed (3400 items) in: file:///var/lib/scrapyd/items/sourcing_v2/bca.uk/task_51_2026-03-26T14_00_00.jl
      2026-03-26 15:12:06 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
      {'downloader/exception_count': 327,
       'downloader/exception_type_count/scrapy.core.downloader.handlers.http11.TunnelError': 327,
       'downloader/request_bytes': 27050793,
       'downloader/request_count': 32494,
       'downloader/request_method_count/GET': 32494,
       'downloader/response_bytes': 1357615211,
       'downloader/response_count': 32167,
       'downloader/response_status_count/200': 16013,
       'downloader/response_status_count/302': 16154,
       'elapsed_time_seconds': 4314.869048,
       'feedexport/success_count/FileFeedStorage': 1,
       'finish_reason': 'finished',
       'finish_time': datetime.datetime(2026, 3, 26, 15, 12, 6, 53529, tzinfo=datetime.timezone.utc),
       'item_dropped_count': 566,
       'item_dropped_reasons_count/DropItem': 566,
       'item_scraped_count': 3400,
       'log_count/DEBUG': 206304,
       'log_count/ERROR': 166,
       'log_count/INFO': 168852,
       'log_count/WARNING': 423,
       'memusage/max': 228548608,
       'memusage/startup': 150372352,
       'photo_download_count': 14403,
       'pipeline/dropped_expired': 566,
       'playwright/context_count': 2,
       'playwright/context_count/max_concurrent': 1,
       'playwright/context_count/persistent/False': 2,
       'playwright/context_count/remote/False': 2,
       'playwright/page_count': 0,
       'request_depth_max': 140,
       'response_received_count': 1610,
       'scheduler/dequeued': 32494,
       'scheduler/dequeued/memory': 32494,
       'scheduler/enqueued': 32494,
       'scheduler/enqueued/memory': 32494,
       'scrape_type/new': 1487,
       'scrape_type/price_update': 2484,
       'scrape_type/skipped': 9565,
       'scrapy-zyte-api/429': 0,
       'scrapy-zyte-api/attempts': 145,
       'scrapy-zyte-api/error_ratio': 0.027586206896551724,
       'scrapy-zyte-api/errors': 4,
       "scrapy-zyte-api/exception_types/<class 'aiohttp.client_exceptions.ClientConnectorError'>": 4,
       'scrapy-zyte-api/fatal_errors': 0,
       'scrapy-zyte-api/mean_connection_seconds': 13.233658774419041,
       'scrapy-zyte-api/mean_response_seconds': 13.791029262468744,
       'scrapy-zyte-api/processed': 141,
       'scrapy-zyte-api/request_args/actions': 1,
       'scrapy-zyte-api/request_args/browserHtml': 1,
       'scrapy-zyte-api/request_args/customHttpRequestHeaders': 140,
       'scrapy-zyte-api/request_args/experimental.requestCookies': 140,
       'scrapy-zyte-api/request_args/experimental.responseCookies': 141,
       'scrapy-zyte-api/request_args/httpResponseBody': 140,
       'scrapy-zyte-api/request_args/httpResponseHeaders': 140,
       'scrapy-zyte-api/request_args/sessionContext': 1,
       'scrapy-zyte-api/request_args/url': 141,
       'scrapy-zyte-api/status_codes/0': 4,
       'scrapy-zyte-api/status_codes/200': 141,
       'scrapy-zyte-api/success': 141,
       'scrapy-zyte-api/success_ratio': 1.0,
       'scrapy-zyte-api/throttle_ratio': 0.0,
       'source/items_encountered': 13889,
       'spider_exceptions/KeyError': 21,
       'spider_exceptions/ValueError': 5,
       'start_time': datetime.datetime(2026, 3, 26, 14, 0, 11, 184481, tzinfo=datetime.timezone.utc)}
      2026-03-26 15:12:06 [scrapy.core.engine] INFO: Spider closed (finished)
    • Log

      /1/log/utf8/sourcing_v2/bca.uk/task_51_2026-03-26T14_00_00/?job_finished=True

    • Source

      http://scrapyd-0:6800/logs/sourcing_v2/bca.uk/task_51_2026-03-26T14_00_00.log

  • sourcelog
    last_update_time: 2026-03-26 15:12:06
    last_update_timestamp: 1774537926
    downloader/exception_count: 327
    downloader/exception_type_count/scrapy.core.downloader.handlers.http11.TunnelError: 327
    downloader/request_bytes: 27050793
    downloader/request_count: 32494
    downloader/request_method_count/GET: 32494
    downloader/response_bytes: 1357615211
    downloader/response_count: 32167
    downloader/response_status_count/200: 16013
    downloader/response_status_count/302: 16154
    elapsed_time_seconds: 4314.869048
    feedexport/success_count/FileFeedStorage: 1
    finish_reason: finished
    finish_time: datetime.datetime(2026, 3, 26, 15, 12, 6, 53529, tzinfo=datetime.timezone.utc)
    item_dropped_count: 566
    item_dropped_reasons_count/DropItem: 566
    item_scraped_count: 3400
    log_count/DEBUG: 206304
    log_count/ERROR: 166
    log_count/INFO: 168852
    log_count/WARNING: 423
    memusage/max: 228548608
    memusage/startup: 150372352
    photo_download_count: 14403
    pipeline/dropped_expired: 566
    playwright/context_count: 2
    playwright/context_count/max_concurrent: 1
    playwright/context_count/persistent/False: 2
    playwright/context_count/remote/False: 2
    playwright/page_count: 0
    request_depth_max: 140
    response_received_count: 1610
    scheduler/dequeued: 32494
    scheduler/dequeued/memory: 32494
    scheduler/enqueued: 32494
    scheduler/enqueued/memory: 32494
    scrape_type/new: 1487
    scrape_type/price_update: 2484
    scrape_type/skipped: 9565
    scrapy-zyte-api/429: 0
    scrapy-zyte-api/attempts: 145
    scrapy-zyte-api/error_ratio: 0.027586206896551724
    scrapy-zyte-api/errors: 4
    scrapy-zyte-api/exception_types/<class 'aiohttp.client_exceptions.ClientConnectorError'>: 4
    scrapy-zyte-api/fatal_errors: 0
    scrapy-zyte-api/mean_connection_seconds: 13.233658774419041
    scrapy-zyte-api/mean_response_seconds: 13.791029262468744
    scrapy-zyte-api/processed: 141
    scrapy-zyte-api/request_args/actions: 1
    scrapy-zyte-api/request_args/browserHtml: 1
    scrapy-zyte-api/request_args/customHttpRequestHeaders: 140
    scrapy-zyte-api/request_args/experimental.requestCookies: 140
    scrapy-zyte-api/request_args/experimental.responseCookies: 141
    scrapy-zyte-api/request_args/httpResponseBody: 140
    scrapy-zyte-api/request_args/httpResponseHeaders: 140
    scrapy-zyte-api/request_args/sessionContext: 1
    scrapy-zyte-api/request_args/url: 141
    scrapy-zyte-api/status_codes/0: 4
    scrapy-zyte-api/status_codes/200: 141
    scrapy-zyte-api/success: 141
    scrapy-zyte-api/success_ratio: 1.0
    scrapy-zyte-api/throttle_ratio: 0.0
    source/items_encountered: 13889
    spider_exceptions/KeyError: 21
    spider_exceptions/ValueError: 5
    start_time: datetime.datetime(2026, 3, 26, 14, 0, 11, 184481, tzinfo=datetime.timezone.utc)