• LogParser v0.8.2, last updated at 2026-03-26 06:21:46, http://scrapyd-0:6800/logs/sourcing_v2/bca.uk/task_51_2026-03-26T06_00_00.json

PROJECT (sourcing_v2), SPIDER (bca.uk)

  • project: sourcing_v2
    spider: bca.uk
    job: task_51_2026-03-26T06_00_00
    first_log_time: 2026-03-26 06:00:15
    latest_log_time: 2026-03-26 06:21:44
    runtime: 0:21:29
    crawled_pages: 369
    scraped_items: 818
    shutdown_reason: N/A
    finish_reason: finished
    log_critical_count: 0
    log_error_count: 8
    log_warning_count: 396
    log_redirect_count: 2593
    log_retry_count: 0
    log_ignore_count: 0
    latest_crawl: 2026-03-26 06:21:43
    latest_scrape: 2026-03-26 06:20:19
    latest_log: 2026-03-26 06:21:44
    current_time: 2026-03-26 06:21:46
    latest_item: N/A

    • error_logs
      8 in total

      2026-03-26 06:03:02 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field year is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
      2026-03-26 06:03:02 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field year is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
      2026-03-26 06:03:02 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field year is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
      2026-03-26 06:03:02 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field year is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
      2026-03-26 06:03:02 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field year is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
      2026-03-26 06:03:02 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field year is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
      2026-03-26 06:03:02 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field year is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
      2026-03-26 06:03:02 [scrapy.core.scraper] ERROR: Spider error processing <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=10&sort=MostRecentlyAdded> (referer: https://www.bca.co.uk/search)
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/defer.py", line 295, in aiter_errback
          yield await it.__anext__()
                ^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy_zyte_api/_middlewares.py", line 206, in process_spider_output_async
          async for item_or_request in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
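
      Note: all eight errors record a single underlying failure at 06:03:02. _extract_listing_item (bcauk.py:804) assigns a missing year to the item, and __setitem__ in crawlers/items.py:116 rejects None for non-nullable fields; monitoring_spider_middleware.process_spider_exception (line 73) then re-raises, so the same ValueError is logged once per spider-middleware layer plus once by scrapy.core.scraper. Below is a minimal sketch of the nullable-field pattern and a possible guard; the field names other than year and the guard itself are assumptions, not the project's actual code:

          import scrapy

          class ListingItem(scrapy.Item):
              # Sketch of the convention implied by the traceback: items.py:116
              # raises when a non-nullable field is assigned None.
              year = scrapy.Field(nullable=False)
              colour = scrapy.Field(nullable=True)  # illustrative field

              def __setitem__(self, key, value):
                  if value is None and not self.fields[key].get("nullable", True):
                      raise ValueError(f"Field {key} is not nullable")
                  super().__setitem__(key, value)

          # Hypothetical guard for the spider: drop listings without a year up
          # front instead of letting the assignment raise mid-chain.
          def _extract_listing_item(self, listing):
              year = listing.get("year")
              if year is None:
                  self.logger.warning("Skipping listing without year: %s", listing.get("id"))
                  return None
              item = ListingItem()
              item["year"] = year
              return item
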
    • warning_logs
      last 10 of 396

      2026-03-26 06:20:53 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=129&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 06:20:55 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=130&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 06:20:57 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=130&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 06:20:57 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=130&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 06:21:00 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=131&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 06:21:01 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=131&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 06:21:01 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=131&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 06:21:04 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=132&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 06:21:06 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=132&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 06:21:06 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=132&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
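
      Note: all 396 warnings are the same scrapy-zyte-api message: the requests set the Zyte API customHttpRequestHeaders parameter directly, and it silently overrides whatever is in Request.headers. The fix the plugin suggests is to put the headers on Request.headers and let its automatic mapping generate customHttpRequestHeaders. A sketch, assuming the headers are currently injected via the zyte_api_automap meta key (the log does not show which meta key is used):

          import scrapy

          def start_requests(self):  # sketch of a spider method
              url = "https://www.bca.co.uk/api/search?q=&pageSize=100&page=1&sort=MostRecentlyAdded"
              # Likely trigger (assumption): headers duplicated by hand into the
              # Zyte API parameter, shadowing Request.headers:
              #   meta={"zyte_api_automap": {"customHttpRequestHeaders": [
              #       {"name": "Accept", "value": "application/json"}]}}
              # What the warning asks for instead: plain Request.headers, which
              # scrapy-zyte-api translates itself.
              yield scrapy.Request(url, headers={"Accept": "application/json"})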

    • redirect_logs
      last 10 of 2593

      2026-03-26 06:13:26 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/EO12HMK/697211421/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=697211421>
      2026-03-26 06:13:26 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/ML62WWV/697098081/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=697098081>
      2026-03-26 06:13:27 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/FV61VWL/697210827/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=697210827>
      2026-03-26 06:13:27 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/ML62WWV/697098091/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=697098091>
      2026-03-26 06:13:27 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/FV61VWL/697210831/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=697210831>
      2026-03-26 06:13:27 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/ML62WWV/697098106/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=697098106>
      2026-03-26 06:13:29 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/ML62WWV/697098122/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=697098122>
      2026-03-26 06:13:30 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/ML62WWV/697098139/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=697098139>
      2026-03-26 06:13:32 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/ML62WWV/697098157/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=697098157>
      2026-03-26 06:13:33 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/ML62WWV/697098170/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=697098170>
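
      Note: every entry follows the same shape: a www1.bcaimage.com/Document URL 302s to an Azure blob URL of the form /public/images/vehicle/GB/{registration}/{docId}/{width}. If the spider already knows the registration and docId, building the final URL directly would save one round trip per image. The pattern below is inferred from the redirect targets above and may not hold for every document type:

          def blob_image_url(reg: str, doc_id: int, width: int = 600) -> str:
              # Hypothetical shortcut based on the observed 302 targets.
              return (
                  "https://bcamediaprod.blob.core.windows.net"
                  f"/public/images/vehicle/GB/{reg}/{doc_id}/{width}"
              )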

    • scrapy_version

      2.11.2
    • telnet_console

      127.0.0.1:6025
    • telnet_password

      3a401693128ea775
    • latest_crawl

      2026-03-26 06:21:43 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=132&sort=MostRecentlyAdded> (referer: https://www.bca.co.uk/search) ['zyte-api']
    • latest_scrape

      2026-03-26 06:20:19 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.bca.co.uk/api/search?q=&pageSize=100&page=122&sort=MostRecentlyAdded>
    • latest_stat

      2026-03-26 06:21:16 [scrapy.extensions.logstats] INFO: Crawled 368 pages (at 10 pages/min), scraped 818 items (at 3 items/min)
    • Head

      2026-03-26 06:00:15 [scrapy.utils.log] INFO: Scrapy 2.11.2 started (bot: SourcingV2)
      2026-03-26 06:00:15 [scrapy.utils.log] INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.2 (main, Mar 13 2023, 12:18:29) [GCC 12.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-5.15.0-1098-azure-x86_64-with-glibc2.36
      2026-03-26 06:00:15 [bca.uk] INFO: Starting spider bca.uk
      2026-03-26 06:00:15 [azure.identity._credentials.environment] INFO: Incomplete environment configuration for EnvironmentCredential. These variables are set: AZURE_TENANT_ID, AZURE_CLIENT_ID
      2026-03-26 06:00:15 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): login.microsoftonline.com:443
      2026-03-26 06:00:15 [urllib3.connectionpool] DEBUG: https://login.microsoftonline.com:443 "POST /8ea908c1-4e85-4692-bc3f-3646b9b40891/oauth2/v2.0/token HTTP/1.1" 200 2118
      2026-03-26 06:00:15 [azure.identity._credentials.chained] INFO: DefaultAzureCredential acquired a token from WorkloadIdentityCredential
      2026-03-26 06:00:15 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): alxsourcingstorageprod.table.core.windows.net:443
      2026-03-26 06:00:15 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:00:15 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /SpiderData()?$filter=PartitionKey%20eq%20%27BCAUk%27%20and%20RowKey%20eq%20%27cookies%27 HTTP/1.1" 200 None
      2026-03-26 06:00:15 [bca.uk] INFO: No cached cookies found, will perform fresh login
      2026-03-26 06:00:15 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:00:15 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /SpiderData()?$filter=PartitionKey%20eq%20%27BCAUk%27%20and%20RowKey%20eq%20%27auth_token%27 HTTP/1.1" 200 None
      2026-03-26 06:00:15 [bca.uk] INFO: Loaded cached Auth0 token from Azure Tables (length=1492)
      2026-03-26 06:00:15 [scrapy.addons] INFO: Enabled addons:
      []
      2026-03-26 06:00:15 [asyncio] DEBUG: Using selector: EpollSelector
      2026-03-26 06:00:15 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.asyncioreactor.AsyncioSelectorReactor
      2026-03-26 06:00:15 [scrapy.utils.log] DEBUG: Using asyncio event loop: asyncio.unix_events._UnixSelectorEventLoop
      2026-03-26 06:00:15 [scrapy.extensions.telnet] INFO: Telnet Password: 3a401693128ea775
      2026-03-26 06:00:16 [scrapy.middleware] INFO: Enabled extensions:
      ['scrapy.extensions.corestats.CoreStats',
       'scrapy.extensions.telnet.TelnetConsole',
       'scrapy.extensions.memusage.MemoryUsage',
       'scrapy.extensions.feedexport.FeedExporter',
       'scrapy.extensions.logstats.LogStats',
       'scrapy.extensions.closespider.CloseSpider']
      2026-03-26 06:00:16 [scrapy.crawler] INFO: Overridden settings:
      {'BOT_NAME': 'SourcingV2',
       'CLOSESPIDER_TIMEOUT': 7200,
       'DOWNLOAD_MAXSIZE': 52428800,
       'DOWNLOAD_WARNSIZE': 10485760,
       'FEED_EXPORT_ENCODING': 'utf-8',
       'LOG_FILE': '/var/log/scrapyd/logs/sourcing_v2/bca.uk/task_51_2026-03-26T06_00_00.log',
       'LOG_FORMATTER': 'crawlers.log_formatter.SourcingLogFormatter',
       'MEMUSAGE_LIMIT_MB': 2048,
       'MEMUSAGE_WARNING_MB': 1536,
       'NEWSPIDER_MODULE': 'spiders',
       'REQUEST_FINGERPRINTER_CLASS': 'scrapy_zyte_api.ScrapyZyteAPIRequestFingerprinter',
       'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
       'SPIDER_MODULES': ['spiders', 'auth_check'],
       'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor',
       'USER_AGENT': ''}
      2026-03-26 06:00:16 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2026-03-26 06:00:16 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2026-03-26 06:00:16 [scrapy.middleware] INFO: Enabled downloader middlewares:
      ['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
       'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
       'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
       'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
       'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPIDownloaderMiddleware',
       'scrapy.downloadermiddlewares.retry.RetryMiddleware',
       'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
       'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
       'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
       'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
       'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
       'scrapy.downloadermiddlewares.stats.DownloaderStats']
      2026-03-26 06:00:16 [crawlers.middlewares.id_gen_middleware] INFO: Setting up IdGenerationMiddleware
      2026-03-26 06:00:16 [scrapy.middleware] INFO: Enabled spider middlewares:
      ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPISpiderMiddleware',
       'crawlers.middlewares.monitoring_spider_middleware.MonitoringSpiderMiddleware',
       'scrapy.spidermiddlewares.referer.RefererMiddleware',
       'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
       'scrapy.spidermiddlewares.depth.DepthMiddleware',
       'crawlers.middlewares.photo_download_middleware.PhotoDownloadMiddleware',
       'crawlers.middlewares.report_download_middleware.ReportDownloadMiddleware',
       'crawlers.middlewares.id_gen_middleware.IdGenMiddleware']
      2026-03-26 06:00:16 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:00:16 [crawlers.pipelines.translation_pipeline] INFO: Loading translations for language: auto
      2026-03-26 06:00:16 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /Translations()?$filter=PartitionKey%20eq%20%27auto%27%20and%20RowKey%20eq%20%27auto%27 HTTP/1.1" 200 None
      2026-03-26 06:00:16 [crawlers.pipelines.item_rules_pipeline] INFO: Setting up ItemRules Pipeline
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_location_for_country.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_cars_from_auction_title.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_country.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_fr.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_photos.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_from_info.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_not_allowed.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: not_operable_from_info.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_models_not_allowed.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_title.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: imported_cars.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_currency.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_mileage.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_auction_title.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_country_of_origin.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_pt.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: electric_cars.json
      2026-03-26 06:00:16 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_color.json
      2026-03-26 06:00:16 [crawlers.pipelines.post_to_api] INFO: Setting up PostToApi Pipeline pointing to https://api.app.infinit.cc/api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing
      2026-03-26 06:00:16 [scrapy.middleware] INFO: Enabled item pipelines:
      ['crawlers.pipelines.translation_pipeline.TranslationPipeline',
       'crawlers.pipelines.item_rules_pipeline.ItemRulesPipeline',
       'crawlers.pipelines.post_to_api.PostToApiPipeline']
      2026-03-26 06:00:16 [scrapy.core.engine] INFO: Spider opened
      2026-03-26 06:00:16 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      2026-03-26 06:00:16 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6025
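
      Note: the spider-middleware list above explains the duplicated tracebacks under error_logs. MonitoringSpiderMiddleware sits near the top of the chain, and its process_spider_exception logs and then re-raises (monitoring_spider_middleware.py:73), so one exception surfaces once per layer it crosses. A sketch of that pattern, not the project's actual code:

          class MonitoringSpiderMiddleware:
              def process_spider_exception(self, response, exception, spider):
                  spider.logger.error(str(exception))
                  # Re-raising makes each outer middleware report the same
                  # exception again; returning None instead would pass it along
                  # the chain and leave a single ERROR from scrapy.core.scraper.
                  raise exception
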
    • Tail

      2026-03-26 06:21:05 [bca.uk] INFO: Scrape type for WT72%20DGU: 0
      2026-03-26 06:21:05 [bca.uk] INFO: Found listing with ID: LC72%20ZYL
      2026-03-26 06:21:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27bca.uk%27%20and%20RowKey%20eq%20%27LC72%2520ZYL%27 HTTP/1.1" 200 None
      2026-03-26 06:21:05 [bca.uk] INFO: Scrape type for LC72%20ZYL: 0
      2026-03-26 06:21:05 [bca.uk] INFO: Found listing with ID: RGZ%204784
      2026-03-26 06:21:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27bca.uk%27%20and%20RowKey%20eq%20%27RGZ%25204784%27 HTTP/1.1" 200 None
      2026-03-26 06:21:05 [bca.uk] INFO: Scrape type for RGZ%204784: 0
      2026-03-26 06:21:05 [bca.uk] INFO: Found listing with ID: LB73%20UEY
      2026-03-26 06:21:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27bca.uk%27%20and%20RowKey%20eq%20%27LB73%2520UEY%27 HTTP/1.1" 200 None
      2026-03-26 06:21:05 [bca.uk] INFO: Scrape type for LB73%20UEY: 0
      2026-03-26 06:21:05 [bca.uk] INFO: Found listing with ID: DE72%20RXK
      2026-03-26 06:21:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27bca.uk%27%20and%20RowKey%20eq%20%27DE72%2520RXK%27 HTTP/1.1" 200 None
      2026-03-26 06:21:05 [bca.uk] INFO: Scrape type for DE72%20RXK: 0
      2026-03-26 06:21:05 [bca.uk] INFO: Found listing with ID: DF72%20AEE
      2026-03-26 06:21:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27bca.uk%27%20and%20RowKey%20eq%20%27DF72%2520AEE%27 HTTP/1.1" 200 None
      2026-03-26 06:21:05 [bca.uk] INFO: Scrape type for DF72%20AEE: 0
      2026-03-26 06:21:05 [bca.uk] INFO: Found listing with ID: LB73%20YXK
      2026-03-26 06:21:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:05 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27bca.uk%27%20and%20RowKey%20eq%20%27LB73%2520YXK%27 HTTP/1.1" 200 None
      2026-03-26 06:21:05 [bca.uk] INFO: Scrape type for LB73%20YXK: 0
      2026-03-26 06:21:05 [bca.uk] INFO: Found listing with ID: OE72%20HHA
      2026-03-26 06:21:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27bca.uk%27%20and%20RowKey%20eq%20%27OE72%2520HHA%27 HTTP/1.1" 200 None
      2026-03-26 06:21:06 [bca.uk] INFO: Scrape type for OE72%20HHA: 0
      2026-03-26 06:21:06 [bca.uk] INFO: Found listing with ID: OV23%20HZC
      2026-03-26 06:21:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27bca.uk%27%20and%20RowKey%20eq%20%27OV23%2520HZC%27 HTTP/1.1" 200 None
      2026-03-26 06:21:06 [bca.uk] INFO: Scrape type for OV23%20HZC: 0
      2026-03-26 06:21:06 [bca.uk] INFO: Found listing with ID: WV24%20ZRE
      2026-03-26 06:21:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27bca.uk%27%20and%20RowKey%20eq%20%27WV24%2520ZRE%27 HTTP/1.1" 200 None
      2026-03-26 06:21:06 [bca.uk] INFO: Scrape type for WV24%20ZRE: 0
      2026-03-26 06:21:06 [bca.uk] INFO: Found listing with ID: YY23%20VFN
      2026-03-26 06:21:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27bca.uk%27%20and%20RowKey%20eq%20%27YY23%2520VFN%27 HTTP/1.1" 200 None
      2026-03-26 06:21:06 [bca.uk] INFO: Scrape type for YY23%20VFN: 0
      2026-03-26 06:21:06 [bca.uk] INFO: Found listing with ID: DA72%20ZZH
      2026-03-26 06:21:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27bca.uk%27%20and%20RowKey%20eq%20%27DA72%2520ZZH%27 HTTP/1.1" 200 None
      2026-03-26 06:21:06 [bca.uk] INFO: Scrape type for DA72%20ZZH: 0
      2026-03-26 06:21:06 [bca.uk] INFO: Found listing with ID: BA19%20YBM
      2026-03-26 06:21:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27bca.uk%27%20and%20RowKey%20eq%20%27BA19%2520YBM%27 HTTP/1.1" 200 None
      2026-03-26 06:21:06 [bca.uk] INFO: Scrape type for BA19%20YBM: 0
      2026-03-26 06:21:06 [bca.uk] INFO: Found listing with ID: FE12%20LRZ
      2026-03-26 06:21:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:06 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27bca.uk%27%20and%20RowKey%20eq%20%27FE12%2520LRZ%27 HTTP/1.1" 200 None
      2026-03-26 06:21:06 [bca.uk] INFO: Scrape type for FE12%20LRZ: 0
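      The repeating triplet above (a POST /Tables answered with 409 Conflict, a point query on ScrapedListings, then a "Scrape type" decision) is the per-listing dedup lookup. The 409 means the table already exists, so the create call is a no-op on every iteration, and the row keys are stored URL-encoded ("LC72%20ZYL"), which is why the OData filter shows the doubly encoded "LC72%2520ZYL". A minimal sketch of the lookup, assuming azure-data-tables and a connection string that the log does not show:

        from azure.data.tables import TableServiceClient
        from azure.core.exceptions import ResourceExistsError

        # Hypothetical reconstruction; the real pipeline code is not in this log.
        service = TableServiceClient.from_connection_string(conn_str="<connection string>")
        table = service.get_table_client("ScrapedListings")

        def lookup(listing_id: str) -> list:
            try:
                table.create_table()  # "POST /Tables"; 409 -> table already exists
            except ResourceExistsError:
                pass
            # Row keys carry a literal %20, so the HTTP trace shows them
            # doubly encoded as %2520.
            flt = f"PartitionKey eq 'bca.uk' and RowKey eq '{listing_id}'"
            return list(table.query_entities(flt))

      Creating the table once at startup (e.g. with TableServiceClient.create_table_if_not_exists) would drop the extra 409 round trip before every query.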
      2026-03-26 06:21:06 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=132&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-26 06:21:06 [zyte_api._retry] DEBUG: Starting call to 'zyte_api._async.AsyncZyteAPI.get.<locals>.request', this is the 1st time calling it.
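      The scrapy_zyte_api WARNING above flags a request that sets customHttpRequestHeaders through the zyte_api meta parameter, overriding Request.headers. A sketch of the suggested change, assuming a header value not visible in the log and automap enabled:

        import scrapy

        def build_search_request(url: str) -> scrapy.Request:
            # Assumed current form, which triggers the WARNING:
            #   meta={"zyte_api_automap": {"customHttpRequestHeaders": [
            #       {"name": "Accept", "value": "application/json"}]}}
            # Suggested form: plain Request.headers; with automap enabled,
            # scrapy-zyte-api translates them to customHttpRequestHeaders.
            return scrapy.Request(url, headers={"Accept": "application/json"})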
      2026-03-26 06:21:16 [scrapy.extensions.logstats] INFO: Crawled 368 pages (at 10 pages/min), scraped 818 items (at 3 items/min)
      2026-03-26 06:21:16 [scrapy.extensions.memusage] INFO: Peak memory usage is 198MiB
      2026-03-26 06:21:43 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=132&sort=MostRecentlyAdded> (referer: https://www.bca.co.uk/search) ['zyte-api']
      2026-03-26 06:21:43 [bca.uk] INFO: Total items found: 0
      2026-03-26 06:21:43 [bca.uk] INFO: No more items to scrape after page {self.next_page}
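      The literal {self.next_page} in the line above indicates the spider logs a plain string where an f-string was presumably intended; inside the spider callback the fix is one character:

        # As logged: plain string, placeholder printed verbatim.
        self.logger.info("No more items to scrape after page {self.next_page}")
        # Presumed intent: the f prefix interpolates the attribute.
        self.logger.info(f"No more items to scrape after page {self.next_page}")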
      2026-03-26 06:21:43 [scrapy.core.engine] INFO: Closing spider (finished)
      2026-03-26 06:21:43 [bca.uk] INFO: bca.uk Crawl ended with reason finished, scrape types: {<ScrapeType.NEW: 1>: 0, <ScrapeType.NEW_DUPLICATE_ID: 4>: 0, <ScrapeType.PRICE_UPDATE: 2>: 2, <ScrapeType.AUCTION_UPDATE: 3>: 0, <ScrapeType.SKIPPED: 0>: 982, <ScrapeType.BATCH_SKIPPED: 5>: 0}
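      The reprs in the closing summary pin down the scrape-type enum; reconstructed here with names and values exactly as logged (its module location is not shown):

        from enum import IntEnum

        class ScrapeType(IntEnum):
            SKIPPED = 0            # 982 this run
            NEW = 1                # 0
            PRICE_UPDATE = 2       # 2
            AUCTION_UPDATE = 3     # 0
            NEW_DUPLICATE_ID = 4   # 0
            BATCH_SKIPPED = 5      # 0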
      2026-03-26 06:21:43 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:43 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27bca.uk%27%20and%20RowKey%20eq%20%27G706%2520XFH%27 HTTP/1.1" 200 None
      2026-03-26 06:21:43 [bca.uk] INFO: Saving data for G706%20XFH: {'created_time': 1774506103.877689}
      2026-03-26 06:21:43 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:43 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "PATCH /ScrapedListings(PartitionKey='bca.uk',RowKey='G706%2520XFH') HTTP/1.1" 204 0
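      The "Saving data" line resolves to the PATCH above: a merge-mode entity update that returns 204 No Content on success. A sketch, reusing the table client from the lookup sketch earlier:

        from azure.data.tables import UpdateMode

        # Merge update; azure-data-tables sends this as a PATCH request.
        table.update_entity(
            {"PartitionKey": "bca.uk", "RowKey": "G706%20XFH",
             "created_time": 1774506103.877689},
            mode=UpdateMode.MERGE,
        )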
      2026-03-26 06:21:43 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27bca.uk%27%20and%20last_price_update_time%20lt%201774074103 HTTP/1.1" 200 None
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='AU11%2520VZJ') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='AY63%2520DJE') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='BK23%2520AVD') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='BN13%2520VPO') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='BV23%2520EFJ') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='CE61%2520TGF') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='CJ22%2520OGT') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='DE61%2520WWK') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='DF71%2520NJK') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='DK14%2520NVG') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='DL73%2520KOH') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='DP15%2520RHA') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='DT73%2520BGY') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='DU59%2520MFF') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='DY24%2520PPU') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='DY70%2520FDU') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='EA64%2520NPX') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='EJ11%2520ZPE') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='EO11%2520LCG') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='EO63%2520UDW') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='ET10%2520AYX') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='EX11%2520YJV') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='FA15%2520HNZ') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='FN22%2520RZU') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='GF71%2520DJV') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='GN15%2520WAO') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='GO05%2520AMM') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='GU13%2520PGY') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='GV63%2520BLX') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='GX15%2520SYU') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='HG12%2520EHC') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='HJ60%2520VMP') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='HJ61%2520YEC') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='HN10%2520YNX') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='HN15%2520ZZX') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='HV58%2520ZDZ') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='JA02%2520AHA') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='KL19%2520ZGO') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='KS22%2520LPL') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='LV64%2520UJK') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='MJ24%2520VYE') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='MM21%2520HHO') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='MW59%2520XHS') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='MX16%2520WNA') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='NL18%2520NYP') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='OE16%2520CXS') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='PN12%2520NLX') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='PO22%2520WWN') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='RA17%2520ONJ') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='RA66%2520HUP') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='S13%2520MLU') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SA14%2520ABX') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SC65%2520DJU') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SF63%2520FCP') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SM72%2520EYJ') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SO14%2520KLD') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='ST21%2520KZV') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='ST64%2520UCY') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SV62%2520OHA') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='T700%2520JDN') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VE15%2520LRO') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VO17%2520UPB') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WG73%2520OFZ') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WN19%2520OPS') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WU60%2520OFO') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YC69%2520OPP') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YJ66%2520ANS') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YO22%2520UFW') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YR12%2520YOJ') HTTP/1.1" 204 0
      2026-03-26 06:21:44 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YY58%2520FXB') HTTP/1.1" 204 0
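      The filter last_price_update_time lt 1774074103 followed by this run of DELETEs is a retention sweep: the cutoff sits 432001 s before the job's last update timestamp (1774506104), i.e. a five-day window (5 * 86400 = 432000), so listings with no price update in the last five days are purged. A sketch under the same assumptions as the lookup code above:

        import time

        RETENTION_SECONDS = 5 * 86400  # window derived from the logged cutoff

        cutoff = int(time.time()) - RETENTION_SECONDS
        stale = table.query_entities(
            f"PartitionKey eq 'bca.uk' and last_price_update_time lt {cutoff}"
        )
        for entity in stale:
            # One delete_entity call per DELETE request in the trace above.
            table.delete_entity(partition_key=entity["PartitionKey"],
                                row_key=entity["RowKey"])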
      2026-03-26 06:21:44 [scrapy.extensions.feedexport] INFO: Stored jsonlines feed (818 items) in: file:///var/lib/scrapyd/items/sourcing_v2/bca.uk/task_51_2026-03-26T06_00_00.jl
      2026-03-26 06:21:44 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
      {'downloader/request_bytes': 4659157,
       'downloader/request_count': 5330,
       'downloader/request_method_count/GET': 5330,
       'downloader/response_bytes': 327327302,
       'downloader/response_count': 5330,
       'downloader/response_status_count/200': 2737,
       'downloader/response_status_count/302': 2593,
       'elapsed_time_seconds': 1288.391487,
       'feedexport/success_count/FileFeedStorage': 1,
       'finish_reason': 'finished',
       'finish_time': datetime.datetime(2026, 3, 26, 6, 21, 44, 826798, tzinfo=datetime.timezone.utc),
       'item_scraped_count': 818,
       'log_count/DEBUG': 57263,
       'log_count/ERROR': 8,
       'log_count/INFO': 51177,
       'log_count/WARNING': 396,
       'memusage/max': 207994880,
       'memusage/startup': 150388736,
       'photo_download_count': 2368,
       'playwright/context_count': 2,
       'playwright/context_count/max_concurrent': 1,
       'playwright/context_count/persistent/False': 2,
       'playwright/context_count/remote/False': 2,
       'playwright/page_count': 0,
       'request_depth_max': 132,
       'response_received_count': 369,
       'scheduler/dequeued': 5330,
       'scheduler/dequeued/memory': 5330,
       'scheduler/enqueued': 5330,
       'scheduler/enqueued/memory': 5330,
       'scrape_type/new': 248,
       'scrape_type/price_update': 571,
       'scrape_type/skipped': 12220,
       'scrapy-zyte-api/429': 0,
       'scrapy-zyte-api/attempts': 133,
       'scrapy-zyte-api/error_ratio': 0.0,
       'scrapy-zyte-api/errors': 0,
       'scrapy-zyte-api/fatal_errors': 0,
       'scrapy-zyte-api/mean_connection_seconds': 5.927900648136672,
       'scrapy-zyte-api/mean_response_seconds': 6.388770721516665,
       'scrapy-zyte-api/processed': 133,
       'scrapy-zyte-api/request_args/actions': 1,
       'scrapy-zyte-api/request_args/browserHtml': 1,
       'scrapy-zyte-api/request_args/customHttpRequestHeaders': 132,
       'scrapy-zyte-api/request_args/experimental.requestCookies': 132,
       'scrapy-zyte-api/request_args/experimental.responseCookies': 133,
       'scrapy-zyte-api/request_args/httpResponseBody': 132,
       'scrapy-zyte-api/request_args/httpResponseHeaders': 132,
       'scrapy-zyte-api/request_args/sessionContext': 1,
       'scrapy-zyte-api/request_args/url': 133,
       'scrapy-zyte-api/status_codes/200': 133,
       'scrapy-zyte-api/success': 133,
       'scrapy-zyte-api/success_ratio': 1.0,
       'scrapy-zyte-api/throttle_ratio': 0.0,
       'source/items_encountered': 13099,
       'spider_exceptions/ValueError': 1,
       'start_time': datetime.datetime(2026, 3, 26, 6, 0, 16, 435311, tzinfo=datetime.timezone.utc)}
      2026-03-26 06:21:44 [scrapy.core.engine] INFO: Spider closed (finished)
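      A few figures in the stats dump cross-check cleanly: the 200 and 302 counts account for every downloader response, and the whole-run averages come out well above the final logstats sample, which covers only the last minute:

        resp_200, resp_302, resp_total = 2737, 2593, 5330
        assert resp_200 + resp_302 == resp_total  # every response accounted for

        minutes = 1288.391487 / 60
        print(round(369 / minutes, 1))  # ~17.2 pages/min average over 369 pages
        print(round(818 / minutes, 1))  # ~38.1 items/min average over 818 items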
      
    • Log

      /1/log/utf8/sourcing_v2/bca.uk/task_51_2026-03-26T06_00_00/?job_finished=True

    • Source

      http://scrapyd-0:6800/logs/sourcing_v2/bca.uk/task_51_2026-03-26T06_00_00.log

  • sourcelog
    last_update_time: 2026-03-26 06:21:44
    last_update_timestamp: 1774506104
    downloader/request_bytes: 4659157
    downloader/request_count: 5330
    downloader/request_method_count/GET: 5330
    downloader/response_bytes: 327327302
    downloader/response_count: 5330
    downloader/response_status_count/200: 2737
    downloader/response_status_count/302: 2593
    elapsed_time_seconds: 1288.391487
    feedexport/success_count/FileFeedStorage: 1
    finish_reason: finished
    finish_time: datetime.datetime(2026, 3, 26, 6, 21, 44, 826798, tzinfo=datetime.timezone.utc)
    item_scraped_count: 818
    log_count/DEBUG: 57263
    log_count/ERROR: 8
    log_count/INFO: 51177
    log_count/WARNING: 396
    memusage/max: 207994880
    memusage/startup: 150388736
    photo_download_count: 2368
    playwright/context_count: 2
    playwright/context_count/max_concurrent: 1
    playwright/context_count/persistent/False: 2
    playwright/context_count/remote/False: 2
    playwright/page_count: 0
    request_depth_max: 132
    response_received_count: 369
    scheduler/dequeued: 5330
    scheduler/dequeued/memory: 5330
    scheduler/enqueued: 5330
    scheduler/enqueued/memory: 5330
    scrape_type/new: 248
    scrape_type/price_update: 571
    scrape_type/skipped: 12220
    scrapy-zyte-api/429: 0
    scrapy-zyte-api/attempts: 133
    scrapy-zyte-api/error_ratio: 0.0
    scrapy-zyte-api/errors: 0
    scrapy-zyte-api/fatal_errors: 0
    scrapy-zyte-api/mean_connection_seconds: 5.927900648136672
    scrapy-zyte-api/mean_response_seconds: 6.388770721516665
    scrapy-zyte-api/processed: 133
    scrapy-zyte-api/request_args/actions: 1
    scrapy-zyte-api/request_args/browserHtml: 1
    scrapy-zyte-api/request_args/customHttpRequestHeaders: 132
    scrapy-zyte-api/request_args/experimental.requestCookies: 132
    scrapy-zyte-api/request_args/experimental.responseCookies: 133
    scrapy-zyte-api/request_args/httpResponseBody: 132
    scrapy-zyte-api/request_args/httpResponseHeaders: 132
    scrapy-zyte-api/request_args/sessionContext: 1
    scrapy-zyte-api/request_args/url: 133
    scrapy-zyte-api/status_codes/200: 133
    scrapy-zyte-api/success: 133
    scrapy-zyte-api/success_ratio: 1.0
    scrapy-zyte-api/throttle_ratio: 0.0
    source/items_encountered: 13099
    spider_exceptions/ValueError: 1
    start_time: datetime.datetime(2026, 3, 26, 6, 0, 16, 435311, tzinfo=datetime.timezone.utc)