• LogParser v0.8.2, last updated at 2026-03-28 06:35:17, http://scrapyd-0:6800/logs/sourcing_v2/bca.uk/task_51_2026-03-28T06_00_00.json

PROJECT (sourcing_v2), SPIDER (bca.uk)

  • project: sourcing_v2
    spider: bca.uk
    job: task_51_2026-03-28T06_00_00
    first_log_time: 2026-03-28 06:00:14
    latest_log_time: 2026-03-28 06:35:08
    runtime: 0:34:54
    crawled_pages: 347
    scraped_items: 2988
    shutdown_reason: N/A
    finish_reason: finished
    log_critical_count: 0
    log_error_count: 10
    log_warning_count: 438
    log_redirect_count: 2258
    log_retry_count: 0
    log_ignore_count: 0
    latest_crawl:
    latest_scrape:
    latest_log:
    current_time:
    latest_item: N/A

    • error_logs (10 in total)

      2026-03-28 06:01:14 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field year is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
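      The `ValueError` above originates in `crawlers/items.py`, whose `__setitem__` rejects `None` for fields marked non-nullable. A minimal sketch of that pattern, assuming a simple name-to-flag schema (the `FIELDS` mapping and field names here are illustrative, not the project's actual definitions):

      ```python
      class ListingItem(dict):
          # Hypothetical schema: field name -> nullable flag. The real project
          # presumably derives this from per-field metadata on its item class.
          FIELDS = {"year": False, "make": True}

          def __setitem__(self, key, value):
              if key not in self.FIELDS:
                  raise KeyError(f"Unknown field {key}")
              if value is None and not self.FIELDS[key]:
                  # Mirrors the error raised at crawlers/items.py line 116.
                  raise ValueError(f"Field {key} is not nullable")
              super().__setitem__(key, value)
      ```

      Under a scheme like this, any extraction path that yields `year = None` raises before the item ever reaches a pipeline, which is what the tracebacks show propagating up through the spider-middleware chain.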
      2026-03-28 06:01:14 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field year is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
      2026-03-28 06:01:14 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field year is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
      2026-03-28 06:01:14 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field year is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
      2026-03-28 06:01:14 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field year is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
      2026-03-28 06:01:15 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field year is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
      2026-03-28 06:01:15 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field year is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
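      All of these tracebacks fail at the same assignment (`item["year"] = year` at `bcauk.py:804`), so the root cause is an extraction path that returns `None` for `year`. One defensive option is to validate before assignment and drop the listing rather than raise; `extract_year` and `set_year_or_skip` below are hypothetical helpers sketching that guard, not the spider's actual code:

      ```python
      import logging

      logger = logging.getLogger(__name__)

      def extract_year(raw):
          """Hypothetical helper: parse a leading 4-digit year, or return None."""
          try:
              year = int(str(raw).strip()[:4])
          except (TypeError, ValueError):
              return None
          # Reject values outside a plausible model-year range.
          return year if 1900 <= year <= 2100 else None

      def set_year_or_skip(item, raw):
          """Assign year only when valid; return whether the item should be kept."""
          year = extract_year(raw)
          if year is None:
              logger.warning("Dropping listing with unparseable year: %r", raw)
              return False
          item["year"] = year
          return True
      ```

      Whether dropping the listing (versus failing loudly, as the current code does) is acceptable depends on how the downstream consumers treat missing records.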
      2026-03-28 06:01:15 [scrapy.core.scraper] ERROR: Spider error processing <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=1&sort=MostRecentlyAdded> (referer: https://www.bca.co.uk/search)
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/defer.py", line 295, in aiter_errback
          yield await it.__anext__()
                ^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy_zyte_api/_middlewares.py", line 206, in process_spider_output_async
          async for item_or_request in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
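The traceback shows `_extract_listing_item` (bcauk.py:804) assigning `year` even when extraction produced `None`, which the item's `__setitem__` rejects. A minimal defensive sketch of the pattern, with a toy stand-in for `crawlers.items` (all names here are hypothetical; the real extraction logic is not shown in the log):

```python
class ListingItem(dict):
    """Toy stand-in for crawlers.items: rejects None for non-nullable fields,
    mirroring the ValueError seen in the traceback above."""
    NON_NULLABLE = {"year"}

    def __setitem__(self, key, value):
        if key in self.NON_NULLABLE and value is None:
            raise ValueError(f"Field {key} is not nullable")
        super().__setitem__(key, value)


def extract_year(raw):
    """Hypothetical extraction helper: return an int year, or None if unparseable."""
    try:
        year = int(str(raw).strip())
    except (TypeError, ValueError):
        return None
    return year if 1900 <= year <= 2100 else None


def build_item(raw_year):
    """Skip the non-nullable assignment instead of letting __setitem__ raise,
    so a missing year drops the listing rather than erroring mid-parse."""
    year = extract_year(raw_year)
    if year is None:
        return None  # caller can log and skip this listing
    item = ListingItem()
    item["year"] = year
    return item
```

With a guard like this, the ten `Field year is not nullable` errors would surface as skipped listings instead of spider-middleware exceptions.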
      2026-03-28 06:14:39 [crawlers.pipelines.post_to_api] ERROR: bca.uk, item_id bd770c48-f883-5517-b128-d164cb57cd7f: Failed to post item to https://api.app.infinit.cc/api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing: 422 Client Error: Unprocessable Entity for url: https://api.app.infinit.cc/api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing
       | Status Code: 422
       | Response Content: {"ValidationErrors":[{"PropertyName":"Color","ErrorMessage":"color value can only be expressed in letters","ErrorCode":0}],"WasSuccessful":false}
       | Response Headers: {'Content-Type': 'application/json', 'Date': 'Sat, 28 Mar 2026 06:14:38 GMT', 'Request-Context': 'appId=cid-v1:1a14ebe8-38cd-4629-ab2d-40684250fa5b', 'Server': 'Kestrel', 'Strict-Transport-Security': 'max-age=31536000; includeSubDomains; preload', 'Transfer-Encoding': 'chunked'}
      
      2026-03-28 06:14:51 [crawlers.pipelines.post_to_api] ERROR: bca.uk, item_id fcf660a7-993b-56ef-b878-05e41cdaa0b9: Failed to post item to https://api.app.infinit.cc/api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing: 422 Client Error: Unprocessable Entity for url: https://api.app.infinit.cc/api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing
       | Status Code: 422
       | Response Content: {"ValidationErrors":[{"PropertyName":"Color","ErrorMessage":"color value can only be expressed in letters","ErrorCode":0}],"WasSuccessful":false}
       | Response Headers: {'Content-Type': 'application/json', 'Date': 'Sat, 28 Mar 2026 06:14:50 GMT', 'Request-Context': 'appId=cid-v1:1a14ebe8-38cd-4629-ab2d-40684250fa5b', 'Server': 'Kestrel', 'Strict-Transport-Security': 'max-age=31536000; includeSubDomains; preload', 'Transfer-Encoding': 'chunked'}
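Both 422 responses reject the `Color` property with "color value can only be expressed in letters", so the scraped color field evidently contains digits or punctuation. A hedged sketch of a pre-post normalization step (hypothetical helper; the actual field mapping in `post_to_api` is not shown in the log):

```python
import re

def normalize_color(color):
    """Keep only letters and spaces so the value passes the API's
    'color value can only be expressed in letters' validation.
    Returns None when nothing letter-like survives, letting the
    pipeline drop or default the field instead of posting a 422."""
    if not color:
        return None
    cleaned = re.sub(r"[^A-Za-z ]+", " ", color)
    cleaned = " ".join(cleaned.split())  # collapse runs of spaces
    return cleaned or None
```

Running scraped values through this before the `AddListing` POST would turn, e.g., `"Grey/Silver"` into `"Grey Silver"` rather than triggering the validation error.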
      
    • warning_logs
      last 10 of 438

      2026-03-28 06:33:34 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=143&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-28 06:33:56 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=144&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-28 06:33:57 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=144&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-28 06:33:57 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=144&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-28 06:34:47 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=145&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-28 06:34:47 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=145&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-28 06:34:47 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=145&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-28 06:34:55 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=146&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-28 06:34:55 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=146&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-28 06:34:55 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=146&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
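All 438 warnings are the same scrapy-zyte-api complaint: the spider sets the Zyte API `customHttpRequestHeaders` parameter directly, which overrides `Request.headers`. The fix the warning suggests is to pass headers only via `Request.headers` and let scrapy-zyte-api map them. A sketch with plain dicts standing in for `scrapy.Request` arguments (the spider's actual request-building code is not shown in the log):

```python
def build_request_kwargs(url, headers):
    """Keep one source of truth for headers.

    Putting {"zyte_api": {"customHttpRequestHeaders": [...]}} in meta
    overrides Request.headers and triggers the warning above. Setting
    Request.headers and relying on automatic mapping avoids it.
    (zyte_api_automap is a real scrapy-zyte-api meta key; whether this
    spider needs it explicitly depends on its settings.)
    """
    return {
        "url": url,
        "headers": headers,                  # scrapy-zyte-api translates these
        "meta": {"zyte_api_automap": True},  # no customHttpRequestHeaders here
    }
```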


    • redirect_logs
      last 10 of 2258

      2026-03-28 06:15:33 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/PIJ2496/694074080/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=694074080>
      2026-03-28 06:15:33 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/NX12JBU/698553603/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=698553603>
      2026-03-28 06:15:33 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/YS13XFM/698598640/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=698598640>
      2026-03-28 06:15:33 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/T500MUL/698445012/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=698445012>
      2026-03-28 06:15:33 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/HG22NFU/697135793/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=697135793>
      2026-03-28 06:15:33 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/HG22NFU/697135782/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=697135782>
      2026-03-28 06:15:33 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/SD62JTV/698477775/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=698477775>
      2026-03-28 06:15:33 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/T500MUL/698445008/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=698445008>
      2026-03-28 06:15:33 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/NX12JBU/698553582/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=698553582>
      2026-03-28 06:15:34 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/YS13XFM/698598620/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=698598620>


    • scrapy_version

      2.11.2
    • telnet_console

      127.0.0.1:6025
    • telnet_password

      ec29a0d83f3a0ee1
    • latest_crawl

      2026-03-28 06:34:59 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=146&sort=MostRecentlyAdded> (referer: https://www.bca.co.uk/search) ['zyte-api']
    • latest_scrape

      2026-03-28 06:34:58 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.bca.co.uk/api/search?q=&pageSize=100&page=144&sort=MostRecentlyAdded>
    • latest_stat

      2026-03-28 06:34:15 [scrapy.extensions.logstats] INFO: Crawled 344 pages (at 2 pages/min), scraped 2908 items (at 131 items/min)
    • Head

      2026-03-28 06:00:14 [scrapy.utils.log] INFO: Scrapy 2.11.2 started (bot: SourcingV2)
      2026-03-28 06:00:14 [scrapy.utils.log] INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.2 (main, Mar 13 2023, 12:18:29) [GCC 12.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-5.15.0-1098-azure-x86_64-with-glibc2.36
      2026-03-28 06:00:14 [bca.uk] INFO: Starting spider bca.uk
      2026-03-28 06:00:14 [azure.identity._credentials.environment] INFO: Incomplete environment configuration for EnvironmentCredential. These variables are set: AZURE_CLIENT_ID, AZURE_TENANT_ID
      2026-03-28 06:00:14 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): login.microsoftonline.com:443
      2026-03-28 06:00:14 [urllib3.connectionpool] DEBUG: https://login.microsoftonline.com:443 "POST /8ea908c1-4e85-4692-bc3f-3646b9b40891/oauth2/v2.0/token HTTP/1.1" 200 2123
      2026-03-28 06:00:14 [azure.identity._credentials.chained] INFO: DefaultAzureCredential acquired a token from WorkloadIdentityCredential
      2026-03-28 06:00:14 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): alxsourcingstorageprod.table.core.windows.net:443
      2026-03-28 06:00:14 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-28 06:00:14 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /SpiderData()?$filter=PartitionKey%20eq%20%27BCAUk%27%20and%20RowKey%20eq%20%27cookies%27 HTTP/1.1" 200 None
      2026-03-28 06:00:14 [bca.uk] INFO: No cached cookies found, will perform fresh login
      2026-03-28 06:00:14 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-28 06:00:14 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /SpiderData()?$filter=PartitionKey%20eq%20%27BCAUk%27%20and%20RowKey%20eq%20%27auth_token%27 HTTP/1.1" 200 None
      2026-03-28 06:00:14 [bca.uk] INFO: Loaded cached Auth0 token from Azure Tables (length=1492)
      2026-03-28 06:00:14 [scrapy.addons] INFO: Enabled addons:
      []
      2026-03-28 06:00:14 [asyncio] DEBUG: Using selector: EpollSelector
      2026-03-28 06:00:14 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.asyncioreactor.AsyncioSelectorReactor
      2026-03-28 06:00:14 [scrapy.utils.log] DEBUG: Using asyncio event loop: asyncio.unix_events._UnixSelectorEventLoop
      2026-03-28 06:00:14 [scrapy.extensions.telnet] INFO: Telnet Password: ec29a0d83f3a0ee1
      2026-03-28 06:00:15 [scrapy.middleware] INFO: Enabled extensions:
      ['scrapy.extensions.corestats.CoreStats',
       'scrapy.extensions.telnet.TelnetConsole',
       'scrapy.extensions.memusage.MemoryUsage',
       'scrapy.extensions.feedexport.FeedExporter',
       'scrapy.extensions.logstats.LogStats',
       'scrapy.extensions.closespider.CloseSpider']
      2026-03-28 06:00:15 [scrapy.crawler] INFO: Overridden settings:
      {'BOT_NAME': 'SourcingV2',
       'CLOSESPIDER_TIMEOUT': 7200,
       'DOWNLOAD_MAXSIZE': 52428800,
       'DOWNLOAD_WARNSIZE': 10485760,
       'FEED_EXPORT_ENCODING': 'utf-8',
       'LOG_FILE': '/var/log/scrapyd/logs/sourcing_v2/bca.uk/task_51_2026-03-28T06_00_00.log',
       'LOG_FORMATTER': 'crawlers.log_formatter.SourcingLogFormatter',
       'MEMUSAGE_LIMIT_MB': 2048,
       'MEMUSAGE_WARNING_MB': 1536,
       'NEWSPIDER_MODULE': 'spiders',
       'REQUEST_FINGERPRINTER_CLASS': 'scrapy_zyte_api.ScrapyZyteAPIRequestFingerprinter',
       'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
       'SPIDER_MODULES': ['spiders', 'auth_check'],
       'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor',
       'USER_AGENT': ''}
      2026-03-28 06:00:15 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2026-03-28 06:00:15 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2026-03-28 06:00:15 [scrapy.middleware] INFO: Enabled downloader middlewares:
      ['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
       'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
       'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
       'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
       'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPIDownloaderMiddleware',
       'scrapy.downloadermiddlewares.retry.RetryMiddleware',
       'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
       'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
       'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
       'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
       'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
       'scrapy.downloadermiddlewares.stats.DownloaderStats']
      2026-03-28 06:00:15 [crawlers.middlewares.id_gen_middleware] INFO: Setting up IdGenerationMiddleware
      2026-03-28 06:00:15 [scrapy.middleware] INFO: Enabled spider middlewares:
      ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPISpiderMiddleware',
       'crawlers.middlewares.monitoring_spider_middleware.MonitoringSpiderMiddleware',
       'scrapy.spidermiddlewares.referer.RefererMiddleware',
       'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
       'scrapy.spidermiddlewares.depth.DepthMiddleware',
       'crawlers.middlewares.photo_download_middleware.PhotoDownloadMiddleware',
       'crawlers.middlewares.report_download_middleware.ReportDownloadMiddleware',
       'crawlers.middlewares.id_gen_middleware.IdGenMiddleware']
      2026-03-28 06:00:15 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-28 06:00:15 [crawlers.pipelines.translation_pipeline] INFO: Loading translations for language: auto
      2026-03-28 06:00:15 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /Translations()?$filter=PartitionKey%20eq%20%27auto%27%20and%20RowKey%20eq%20%27auto%27 HTTP/1.1" 200 None
      2026-03-28 06:00:15 [crawlers.pipelines.item_rules_pipeline] INFO: Setting up ItemRules Pipeline
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_location_for_country.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_cars_from_auction_title.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_country.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_fr.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_photos.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_from_info.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_not_allowed.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: not_operable_from_info.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_models_not_allowed.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_title.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: imported_cars.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_currency.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_mileage.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_auction_title.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_country_of_origin.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_pt.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: electric_cars.json
      2026-03-28 06:00:15 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_color.json
      2026-03-28 06:00:15 [crawlers.pipelines.post_to_api] INFO: Setting up PostToApi Pipeline pointing to https://api.app.infinit.cc/api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing
      2026-03-28 06:00:15 [scrapy.middleware] INFO: Enabled item pipelines:
      ['crawlers.pipelines.translation_pipeline.TranslationPipeline',
       'crawlers.pipelines.item_rules_pipeline.ItemRulesPipeline',
       'crawlers.pipelines.post_to_api.PostToApiPipeline']
      2026-03-28 06:00:15 [scrapy.core.engine] INFO: Spider opened
      2026-03-28 06:00:15 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      2026-03-28 06:00:15 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6025
    • Tail

      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VF19%2520CXP') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VGZ%25203493') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VGZ%25208613') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VIG%25201703') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VK09%2520KFF') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VK19%2520GMU') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VN14%2520OYR') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VN16%2520XEC') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VN18%2520BHO') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VN21%2520OSD') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VN67%2520LWG') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VN70%2520YME') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VN72%2520MZY') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VN72%2520RJV') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VO23%2520UKB') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VO59%2520PLZ') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VO59%2520VDV') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VO71%2520UMD') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VU68%2520ETR') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VX18%2520OFD') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WA21%2520VUG') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WA22%2520VTD') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WA22%2520XYG') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WA68%2520JNF') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WD19%2520SKV') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WD65%2520CWA') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WD68%2520XTT') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WF19%2520YHY') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WF24%2520OGM') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WG17%2520BLK') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WH16%2520OGA') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WJ09%2520FWG') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WJ24%2520KJN') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WJ67%2520YJU') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WK13%2520JDU') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WK60%2520UCF') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WM09%2520UDY') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WM14%2520CDZ') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WM71%2520KUG') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WM72%2520CMF') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WN70%2520EUX') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WO65%2520CGY') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WO72%2520CKF') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WP17%2520UTK') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WP19%2520BWF') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WR59%2520WDU') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WR63%2520XCU') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WR69%2520GZC') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WT15%2520CYH') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WT18%2520VFD') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WU14%2520GPF') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WU23%2520EWW') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WV14%2520NJU') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WV22%2520XYK') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WV72%2520XXC') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WX68%2520LKA') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='XHZ%25205288') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YA15%2520UYY') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YA72%2520FWX') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YB14%2520EXL') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YB15%2520UVG') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YB19%2520LDV') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YB21%2520NMM') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YB21%2520XRK') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YB67%2520EZC') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YB71%2520HGO') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YB71%2520MKP') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YC21%2520OJV') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YD68%2520TOK') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YD72%2520HCY') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YD73%2520HBK') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YE17%2520LXS') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YE22%2520NNT') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YE74%2520EHP') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YF13%2520TBY') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YF20%2520HVB') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YF59%2520LTN') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YF61%2520SVJ') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YG18%2520KXP') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YG19%2520DNE') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YG72%2520UHZ') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YG72%2520ZSR') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YH15%2520NZT') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YH18%2520WMA') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YH21%2520HCL') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YH24%2520OPY') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YH66%2520HFO') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YH67%2520HPY') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YH71%2520OSP') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YJ21%2520EME') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YJ63%2520MRU') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YK11%2520XAM') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YK58%2520XDG') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YK64%2520ONG') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YK69%2520KWR') HTTP/1.1" 204 0
      2026-03-28 06:35:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YK69%2520PXP') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YL14%2520GKA') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YL22%2520OMW') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YL74%2520DSY') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YM19%2520GWZ') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YM19%2520YKT') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YM72%2520FNR') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YM73%2520MXZ') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YN19%2520HML') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YO15%2520RCU') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YO21%2520KLL') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YO24%2520FNZ') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YO72%2520EPL') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YP13%2520GCV') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YP20%2520CHY') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YP22%2520HCU') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YP22%2520XAE') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YP24%2520UGY') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YP69%2520UPV') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YP70%2520WZY') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YR08%2520FYY') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YR20%2520TUW') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YR22%2520BVJ') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YR70%2520DTZ') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YS11%2520XFY') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YS20%2520KNJ') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YS71%2520MWJ') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YS72%2520VXC') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YS72%2520XDY') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YT13%2520TNZ') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YT22%2520ZXJ') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YT23%2520ZYG') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YT66%2520OTL') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YT68%2520GKP') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YT70%2520FLK') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YX13%2520UCS') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YX15%2520VTN') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YX16%2520WMT') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YX59%2520UKB') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YX62%2520OGN') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YX73%2520HJE') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YY06%2520MXE') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YY16%2520OKX') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YY70%2520PXJ') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YY72%2520RZZ') HTTP/1.1" 204 0
      2026-03-28 06:35:08 [scrapy.extensions.feedexport] INFO: Stored jsonlines feed (2988 items) in: file:///var/lib/scrapyd/items/sourcing_v2/bca.uk/task_51_2026-03-28T06_00_00.jl
      2026-03-28 06:35:08 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
      {'downloader/request_bytes': 4196436,
       'downloader/request_count': 4677,
       'downloader/request_method_count/GET': 4677,
       'downloader/response_bytes': 309609207,
       'downloader/response_count': 4677,
       'downloader/response_status_count/200': 2419,
       'downloader/response_status_count/302': 2258,
       'elapsed_time_seconds': 2092.982467,
       'feedexport/success_count/FileFeedStorage': 1,
       'finish_reason': 'finished',
       'finish_time': datetime.datetime(2026, 3, 28, 6, 35, 8, 440489, tzinfo=datetime.timezone.utc),
       'item_scraped_count': 2988,
       'log_count/DEBUG': 72873,
       'log_count/ERROR': 10,
       'log_count/INFO': 69140,
       'log_count/WARNING': 438,
       'memusage/max': 223485952,
       'memusage/startup': 148643840,
       'photo_download_count': 2072,
       'playwright/context_count': 2,
       'playwright/context_count/max_concurrent': 1,
       'playwright/context_count/persistent/False': 2,
       'playwright/context_count/remote/False': 2,
       'playwright/page_count': 0,
       'request_depth_max': 146,
       'response_received_count': 347,
       'scheduler/dequeued': 4677,
       'scheduler/dequeued/memory': 4677,
       'scheduler/enqueued': 4677,
       'scheduler/enqueued/memory': 4677,
       'scrape_type/new': 222,
       'scrape_type/price_update': 2767,
       'scrape_type/skipped': 11413,
       'scrapy-zyte-api/429': 0,
       'scrapy-zyte-api/attempts': 147,
       'scrapy-zyte-api/error_ratio': 0.0,
       'scrapy-zyte-api/errors': 0,
       'scrapy-zyte-api/fatal_errors': 0,
       'scrapy-zyte-api/mean_connection_seconds': 8.83776559138379,
       'scrapy-zyte-api/mean_response_seconds': 9.480584007111338,
       'scrapy-zyte-api/processed': 147,
       'scrapy-zyte-api/request_args/actions': 1,
       'scrapy-zyte-api/request_args/browserHtml': 1,
       'scrapy-zyte-api/request_args/customHttpRequestHeaders': 146,
       'scrapy-zyte-api/request_args/experimental.requestCookies': 146,
       'scrapy-zyte-api/request_args/experimental.responseCookies': 147,
       'scrapy-zyte-api/request_args/httpResponseBody': 146,
       'scrapy-zyte-api/request_args/httpResponseHeaders': 146,
       'scrapy-zyte-api/request_args/sessionContext': 1,
       'scrapy-zyte-api/request_args/url': 147,
       'scrapy-zyte-api/status_codes/200': 147,
       'scrapy-zyte-api/success': 147,
       'scrapy-zyte-api/success_ratio': 1.0,
       'scrapy-zyte-api/throttle_ratio': 0.0,
       'source/items_encountered': 14415,
       'spider_exceptions/ValueError': 1,
       'start_time': datetime.datetime(2026, 3, 28, 6, 0, 15, 458022, tzinfo=datetime.timezone.utc)}
      2026-03-28 06:35:08 [scrapy.core.engine] INFO: Spider closed (finished)
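The stats dump above is internally consistent and supports a few quick derived numbers. A minimal sketch, using only fields copied verbatim from the dump (the `stats` dict below is a hand-picked subset, not the full dump):

```python
# A few fields copied verbatim from the Scrapy stats dump above.
stats = {
    "downloader/request_count": 4677,
    "downloader/response_status_count/200": 2419,
    "downloader/response_status_count/302": 2258,
    "elapsed_time_seconds": 2092.982467,
    "item_scraped_count": 2988,
}

# Every downloader request resolved to either a 200 or a 302: the two
# status counts sum exactly to the request count.
assert (stats["downloader/response_status_count/200"]
        + stats["downloader/response_status_count/302"]
        == stats["downloader/request_count"])

# Roughly half the traffic was redirects, matching the
# log_redirect_count of 2258 in the summary table.
redirect_ratio = (stats["downloader/response_status_count/302"]
                  / stats["downloader/request_count"])

# Throughput over the ~35-minute run.
items_per_minute = (stats["item_scraped_count"] * 60
                    / stats["elapsed_time_seconds"])

print(f"redirect ratio {redirect_ratio:.1%}, ~{items_per_minute:.0f} items/min")
```

Note that `scrape_type/new` (222) plus `scrape_type/price_update` (2767) is 2989, one more than `item_scraped_count` (2988), which is plausibly the item lost to the single `spider_exceptions/ValueError` recorded above.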
      
    • Log

      /1/log/utf8/sourcing_v2/bca.uk/task_51_2026-03-28T06_00_00/?job_finished=True

    • Source

      http://scrapyd-0:6800/logs/sourcing_v2/bca.uk/task_51_2026-03-28T06_00_00.log

  • sourcelog
    last_update_time: 2026-03-28 06:35:08
    last_update_timestamp: 1774679708
    downloader/request_bytes: 4196436
    downloader/request_count: 4677
    downloader/request_method_count/GET: 4677
    downloader/response_bytes: 309609207
    downloader/response_count: 4677
    downloader/response_status_count/200: 2419
    downloader/response_status_count/302: 2258
    elapsed_time_seconds: 2092.982467
    feedexport/success_count/FileFeedStorage: 1
    finish_reason: finished
    finish_time: datetime.datetime(2026, 3, 28, 6, 35, 8, 440489, tzinfo=datetime.timezone.utc)
    item_scraped_count: 2988
    log_count/DEBUG: 72873
    log_count/ERROR: 10
    log_count/INFO: 69140
    log_count/WARNING: 438
    memusage/max: 223485952
    memusage/startup: 148643840
    photo_download_count: 2072
    playwright/context_count: 2
    playwright/context_count/max_concurrent: 1
    playwright/context_count/persistent/False: 2
    playwright/context_count/remote/False: 2
    playwright/page_count: 0
    request_depth_max: 146
    response_received_count: 347
    scheduler/dequeued: 4677
    scheduler/dequeued/memory: 4677
    scheduler/enqueued: 4677
    scheduler/enqueued/memory: 4677
    scrape_type/new: 222
    scrape_type/price_update: 2767
    scrape_type/skipped: 11413
    scrapy-zyte-api/429: 0
    scrapy-zyte-api/attempts: 147
    scrapy-zyte-api/error_ratio: 0.0
    scrapy-zyte-api/errors: 0
    scrapy-zyte-api/fatal_errors: 0
    scrapy-zyte-api/mean_connection_seconds: 8.83776559138379
    scrapy-zyte-api/mean_response_seconds: 9.480584007111338
    scrapy-zyte-api/processed: 147
    scrapy-zyte-api/request_args/actions: 1
    scrapy-zyte-api/request_args/browserHtml: 1
    scrapy-zyte-api/request_args/customHttpRequestHeaders: 146
    scrapy-zyte-api/request_args/experimental.requestCookies: 146
    scrapy-zyte-api/request_args/experimental.responseCookies: 147
    scrapy-zyte-api/request_args/httpResponseBody: 146
    scrapy-zyte-api/request_args/httpResponseHeaders: 146
    scrapy-zyte-api/request_args/sessionContext: 1
    scrapy-zyte-api/request_args/url: 147
    scrapy-zyte-api/status_codes/200: 147
    scrapy-zyte-api/success: 147
    scrapy-zyte-api/success_ratio: 1.0
    scrapy-zyte-api/throttle_ratio: 0.0
    source/items_encountered: 14415
    spider_exceptions/ValueError: 1
    start_time: datetime.datetime(2026, 3, 28, 6, 0, 15, 458022, tzinfo=datetime.timezone.utc)
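The RowKeys in the long DELETE run earlier in the log are double percent-encoded UK registration plates: the space in the plate was encoded to %20 once, and the % of that %20 was then encoded again (to %25) when the request URL was built, giving %2520. A small sketch of recovering the plate; `row_key` is copied from one of the log lines above, and the two-level decoding is an observation about this particular log, not a documented contract of the storage client:

```python
from urllib.parse import unquote

# RowKey as it appears in the DELETE request URL in the log.
row_key = "WM09%2520UDY"

once = unquote(row_key)   # first pass: %25 -> %, giving "WM09%20UDY"
plate = unquote(once)     # second pass: %20 -> space, giving "WM09 UDY"

print(plate)  # -> WM09 UDY
```

The same two-pass `unquote` recovers every RowKey in the run, including the older-format plate `XHZ%25205288` (`XHZ 5288`).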