• LogParser v0.8.2, last updated at 2026-03-27 15:01:57, http://scrapyd-0:6800/logs/sourcing_v2/bca.uk/task_51_2026-03-27T14_00_00.json

PROJECT (sourcing_v2), SPIDER (bca.uk)

  • project              sourcing_v2
    spider               bca.uk
    job                  task_51_2026-03-27T14_00_00
    first_log_time       2026-03-27 14:00:09
    latest_log_time      2026-03-27 15:01:49
    runtime              1:01:40
    crawled_pages        1632
    scraped_items        3721
    shutdown_reason      N/A
    finish_reason        finished
    log_critical_count   0
    log_error_count      208
    log_warning_count    432
    log_redirect_count   16114
    log_retry_count      0
    log_ignore_count     0
    latest_crawl
    latest_scrape
    latest_log
    current_time
    latest_item          N/A
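
    The same job stats are exposed as JSON at the endpoint shown in the report
    header. Below is a minimal sketch of polling them programmatically; the URL
    is taken verbatim from the header, while the response schema (key names
    mirroring the fields in this table) is an assumption:

      import json
      from urllib.request import urlopen

      # Stats endpoint copied from the report header; the key names used
      # below are assumed to match the table fields above.
      STATS_URL = ("http://scrapyd-0:6800/logs/sourcing_v2/bca.uk/"
                   "task_51_2026-03-27T14_00_00.json")

      with urlopen(STATS_URL, timeout=10) as resp:
          stats = json.load(resp)

      # Example health check: flag the job when any errors were logged.
      if stats.get("log_error_count", 0) > 0:
          print(f"job logged {stats['log_error_count']} errors")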

    • error_logs
      last 10 of 208

      2026-03-27 14:34:40 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field year is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
      2026-03-27 14:34:40 [scrapy.core.scraper] ERROR: Spider error processing <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=33&sort=MostRecentlyAdded> (referer: https://www.bca.co.uk/search)
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/defer.py", line 295, in aiter_errback
          yield await it.__anext__()
                ^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy_zyte_api/_middlewares.py", line 206, in process_spider_output_async
          async for item_or_request in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 804, in _extract_listing_item
          item["year"] = year
          ~~~~^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 116, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field year is not nullable
      2026-03-27 14:40:52 [crawlers.middlewares.monitoring_spider_middleware] ERROR: expected string or bytes-like object, got 'NoneType'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 774, in _extract_listing_item
          year = self._extract_year(detail_data)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 872, in _extract_year
          match = re.match(r"(\d{4})", plate)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/lib/python3.11/re/__init__.py", line 166, in match
          return _compile(pattern, flags).match(string)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      TypeError: expected string or bytes-like object, got 'NoneType'
      2026-03-27 14:40:52 [crawlers.middlewares.monitoring_spider_middleware] ERROR: expected string or bytes-like object, got 'NoneType'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 774, in _extract_listing_item
          year = self._extract_year(detail_data)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 872, in _extract_year
          match = re.match(r"(\d{4})", plate)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/lib/python3.11/re/__init__.py", line 166, in match
          return _compile(pattern, flags).match(string)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      TypeError: expected string or bytes-like object, got 'NoneType'
      2026-03-27 14:40:53 [crawlers.middlewares.monitoring_spider_middleware] ERROR: expected string or bytes-like object, got 'NoneType'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 774, in _extract_listing_item
          year = self._extract_year(detail_data)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 872, in _extract_year
          match = re.match(r"(\d{4})", plate)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/lib/python3.11/re/__init__.py", line 166, in match
          return _compile(pattern, flags).match(string)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      TypeError: expected string or bytes-like object, got 'NoneType'
      2026-03-27 14:40:53 [crawlers.middlewares.monitoring_spider_middleware] ERROR: expected string or bytes-like object, got 'NoneType'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 774, in _extract_listing_item
          year = self._extract_year(detail_data)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 872, in _extract_year
          match = re.match(r"(\d{4})", plate)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/lib/python3.11/re/__init__.py", line 166, in match
          return _compile(pattern, flags).match(string)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      TypeError: expected string or bytes-like object, got 'NoneType'
      2026-03-27 14:40:53 [crawlers.middlewares.monitoring_spider_middleware] ERROR: expected string or bytes-like object, got 'NoneType'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 774, in _extract_listing_item
          year = self._extract_year(detail_data)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 872, in _extract_year
          match = re.match(r"(\d{4})", plate)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/lib/python3.11/re/__init__.py", line 166, in match
          return _compile(pattern, flags).match(string)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      TypeError: expected string or bytes-like object, got 'NoneType'
      2026-03-27 14:40:53 [crawlers.middlewares.monitoring_spider_middleware] ERROR: expected string or bytes-like object, got 'NoneType'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 774, in _extract_listing_item
          year = self._extract_year(detail_data)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 872, in _extract_year
          match = re.match(r"(\d{4})", plate)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/lib/python3.11/re/__init__.py", line 166, in match
          return _compile(pattern, flags).match(string)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      TypeError: expected string or bytes-like object, got 'NoneType'
      2026-03-27 14:40:53 [crawlers.middlewares.monitoring_spider_middleware] ERROR: expected string or bytes-like object, got 'NoneType'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 774, in _extract_listing_item
          year = self._extract_year(detail_data)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 872, in _extract_year
          match = re.match(r"(\d{4})", plate)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/lib/python3.11/re/__init__.py", line 166, in match
          return _compile(pattern, flags).match(string)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      TypeError: expected string or bytes-like object, got 'NoneType'
      2026-03-27 14:40:53 [scrapy.core.scraper] ERROR: Spider error processing <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=53&sort=MostRecentlyAdded> (referer: https://www.bca.co.uk/search)
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/defer.py", line 295, in aiter_errback
          yield await it.__anext__()
                ^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy_zyte_api/_middlewares.py", line 206, in process_spider_output_async
          async for item_or_request in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 735, in parse_listings
          yield from self._process_listing_item(item_to_scrape)
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 478, in _process_listing_item
          item = self._extract_listing_item(item_to_scrape)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 774, in _extract_listing_item
          year = self._extract_year(detail_data)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/bcauk.py", line 872, in _extract_year
          match = re.match(r"(\d{4})", plate)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/lib/python3.11/re/__init__.py", line 166, in match
          return _compile(pattern, flags).match(string)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      TypeError: expected string or bytes-like object, got 'NoneType'
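
      The two signatures above are linked: _extract_year (bcauk.py, line 872)
      passes plate straight into re.match(r"(\d{4})", plate) and raises
      TypeError whenever a listing has no plate, while the earlier ValueError
      shows a None year reaching the non-nullable year field guarded at
      items.py line 116. A None-safe sketch follows; the helper names mirror
      the traceback, but everything around them is assumed:

        import re
        from typing import Optional

        YEAR_RE = re.compile(r"(\d{4})")

        def extract_year(plate: Optional[str]) -> Optional[int]:
            # Guards the TypeError seen at bcauk.py:872 by tolerating a
            # missing plate instead of handing None to re.match.
            if not plate:
                return None
            match = YEAR_RE.match(plate)
            return int(match.group(1)) if match else None

        def set_year(item: dict, plate: Optional[str]) -> bool:
            # Only assign when a year was found, so the non-nullable
            # check in items.py:116 never sees None.
            year = extract_year(plate)
            if year is None:
                return False  # caller can drop or flag the listing
            item["year"] = year
            return True
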
    • warning_logs
      last 10 of 432

      2026-03-27 15:00:34 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=141&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-27 15:00:46 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=142&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-27 15:00:47 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=142&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-27 15:00:47 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=142&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-27 15:00:59 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=143&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-27 15:00:59 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=143&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-27 15:00:59 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=143&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-27 15:01:18 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=144&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-27 15:01:18 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=144&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
      2026-03-27 15:01:18 [scrapy_zyte_api._params] WARNING: Request <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=144&sort=MostRecentlyAdded> defines the Zyte API customHttpRequestHeaders parameter, overriding Request.headers. Use Request.headers instead.
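
      The visible warnings are all the same scrapy-zyte-api message: requests set the customHttpRequestHeaders parameter in the zyte_api meta, which overrides whatever Request.headers carries. A minimal sketch of the change the warning asks for (spider class, header value, and callback body are illustrative):

          import scrapy

          class BcaUkSpider(scrapy.Spider):
              name = "bca.uk"

              def start_requests(self):
                  # Set headers on the Request itself; scrapy-zyte-api maps them to
                  # customHttpRequestHeaders, so there is no need to put that
                  # parameter in meta={"zyte_api": {...}}, which wins over
                  # Request.headers and triggers the warning above.
                  yield scrapy.Request(
                      "https://www.bca.co.uk/api/search?q=&pageSize=100&page=1&sort=MostRecentlyAdded",
                      headers={"Accept-Language": "en-GB"},  # illustrative value
                      callback=self.parse,
                  )

              def parse(self, response):
                  pass  # listing parsing elided
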

    • redirect_logs
      last 10 of 16114

      2026-03-27 14:44:28 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/WV71TBZ/696710457/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=696710457>
      2026-03-27 14:44:28 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/RO72TMV/696990177/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=696990177>
      2026-03-27 14:44:28 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/LD23OCN/696106757/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=696106757>
      2026-03-27 14:44:28 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/YF23DFE/697495138/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&Reg=YF23DFE-GB&grp=public&obl=hht3,Manheim&minwidth=600&width=600&default=5>
      2026-03-27 14:44:28 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/YM22MJX/695826183/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=695826183>
      2026-03-27 14:44:29 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/WV71TBZ/696710451/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=696710451>
      2026-03-27 14:44:29 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/LS73MJV/696214301/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=696214301>
      2026-03-27 14:44:29 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/LD23OCN/696106752/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=696106752>
      2026-03-27 14:44:29 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/LD23OCN/696106747/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=696106747>
      2026-03-27 14:44:29 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://bcamediaprod.blob.core.windows.net/public/images/vehicle/GB/RO72TMV/696990163/600> from <GET https://www1.bcaimage.com/Document?DocType=VehicleImage&width=600&docId=696990163>
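
      Every image request shown bounces through a 302 from www1.bcaimage.com to Azure blob storage, so the 16114 redirects mean most photo downloads cost two downloader requests. If the hops ever need auditing, RedirectMiddleware keeps the chain in request meta; a minimal sketch (parse_image is a hypothetical spider callback):

          def parse_image(self, response):
              # RedirectMiddleware records each intermediate URL under the
              # redirect_urls meta key; response.url is the final blob URL.
              hops = response.meta.get("redirect_urls", [])
              if hops:
                  self.logger.debug("%d redirect(s): %s -> %s",
                                    len(hops), hops[0], response.url)
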

    • scrapy_version

      2.11.2
    • telnet_console

      127.0.0.1:6024
    • telnet_password

      c5147042a508a892
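
      While the job is running (the console goes away once the spider closes), the telnet console above gives live access to the crawler; the default username is scrapy and the password is the one logged at startup. For example:

          $ telnet 127.0.0.1 6024
          Username: scrapy
          Password: c5147042a508a892
          >>> stats.get_stats()   # live view of the counters dumped under Crawler.stats
          >>> est()               # one-shot engine status report
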
    • latest_crawl

      2026-03-27 15:01:38 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://www.bca.co.uk/api/search?q=&pageSize=100&page=144&sort=MostRecentlyAdded> (referer: https://www.bca.co.uk/search) ['zyte-api']
    • latest_scrape

      2026-03-27 15:01:43 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.bca.co.uk/api/search?q=&pageSize=100&page=143&sort=MostRecentlyAdded>
    • latest_stat

      2026-03-27 15:01:12 [scrapy.extensions.logstats] INFO: Crawled 1630 pages (at 3 pages/min), scraped 3666 items (at 74 items/min)
    • Head

      2026-03-27 14:00:09 [scrapy.utils.log] INFO: Scrapy 2.11.2 started (bot: SourcingV2)
      2026-03-27 14:00:09 [scrapy.utils.log] INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.2 (main, Mar 13 2023, 12:18:29) [GCC 12.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-5.15.0-1098-azure-x86_64-with-glibc2.36
      2026-03-27 14:00:09 [bca.uk] INFO: Starting spider bca.uk
      2026-03-27 14:00:09 [azure.identity._credentials.environment] INFO: Incomplete environment configuration for EnvironmentCredential. These variables are set: AZURE_TENANT_ID, AZURE_CLIENT_ID
      2026-03-27 14:00:09 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): login.microsoftonline.com:443
      2026-03-27 14:00:10 [urllib3.connectionpool] DEBUG: https://login.microsoftonline.com:443 "POST /8ea908c1-4e85-4692-bc3f-3646b9b40891/oauth2/v2.0/token HTTP/1.1" 200 2113
      2026-03-27 14:00:10 [azure.identity._credentials.chained] INFO: DefaultAzureCredential acquired a token from WorkloadIdentityCredential
      2026-03-27 14:00:10 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): alxsourcingstorageprod.table.core.windows.net:443
      2026-03-27 14:00:10 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-27 14:00:10 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /SpiderData()?$filter=PartitionKey%20eq%20%27BCAUk%27%20and%20RowKey%20eq%20%27cookies%27 HTTP/1.1" 200 None
      2026-03-27 14:00:10 [bca.uk] INFO: No cached cookies found, will perform fresh login
      2026-03-27 14:00:10 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-27 14:00:10 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /SpiderData()?$filter=PartitionKey%20eq%20%27BCAUk%27%20and%20RowKey%20eq%20%27auth_token%27 HTTP/1.1" 200 None
      2026-03-27 14:00:10 [bca.uk] INFO: Loaded cached Auth0 token from Azure Tables (length=1492)
      2026-03-27 14:00:10 [scrapy.addons] INFO: Enabled addons:
      []
      2026-03-27 14:00:10 [asyncio] DEBUG: Using selector: EpollSelector
      2026-03-27 14:00:10 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.asyncioreactor.AsyncioSelectorReactor
      2026-03-27 14:00:10 [scrapy.utils.log] DEBUG: Using asyncio event loop: asyncio.unix_events._UnixSelectorEventLoop
      2026-03-27 14:00:10 [scrapy.extensions.telnet] INFO: Telnet Password: c5147042a508a892
      2026-03-27 14:00:10 [scrapy.middleware] INFO: Enabled extensions:
      ['scrapy.extensions.corestats.CoreStats',
       'scrapy.extensions.telnet.TelnetConsole',
       'scrapy.extensions.memusage.MemoryUsage',
       'scrapy.extensions.feedexport.FeedExporter',
       'scrapy.extensions.logstats.LogStats',
       'scrapy.extensions.closespider.CloseSpider']
      2026-03-27 14:00:10 [scrapy.crawler] INFO: Overridden settings:
      {'BOT_NAME': 'SourcingV2',
       'CLOSESPIDER_TIMEOUT': 7200,
       'DOWNLOAD_MAXSIZE': 52428800,
       'DOWNLOAD_WARNSIZE': 10485760,
       'FEED_EXPORT_ENCODING': 'utf-8',
       'LOG_FILE': '/var/log/scrapyd/logs/sourcing_v2/bca.uk/task_51_2026-03-27T14_00_00.log',
       'LOG_FORMATTER': 'crawlers.log_formatter.SourcingLogFormatter',
       'MEMUSAGE_LIMIT_MB': 2048,
       'MEMUSAGE_WARNING_MB': 1536,
       'NEWSPIDER_MODULE': 'spiders',
       'REQUEST_FINGERPRINTER_CLASS': 'scrapy_zyte_api.ScrapyZyteAPIRequestFingerprinter',
       'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
       'SPIDER_MODULES': ['spiders', 'auth_check'],
       'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor',
       'USER_AGENT': ''}
      2026-03-27 14:00:10 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2026-03-27 14:00:10 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2026-03-27 14:00:10 [scrapy.middleware] INFO: Enabled downloader middlewares:
      ['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
       'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
       'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
       'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
       'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPIDownloaderMiddleware',
       'scrapy.downloadermiddlewares.retry.RetryMiddleware',
       'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
       'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
       'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
       'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
       'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
       'scrapy.downloadermiddlewares.stats.DownloaderStats']
      2026-03-27 14:00:10 [crawlers.middlewares.id_gen_middleware] INFO: Setting up IdGenerationMiddleware
      2026-03-27 14:00:10 [scrapy.middleware] INFO: Enabled spider middlewares:
      ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPISpiderMiddleware',
       'crawlers.middlewares.monitoring_spider_middleware.MonitoringSpiderMiddleware',
       'scrapy.spidermiddlewares.referer.RefererMiddleware',
       'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
       'scrapy.spidermiddlewares.depth.DepthMiddleware',
       'crawlers.middlewares.photo_download_middleware.PhotoDownloadMiddleware',
       'crawlers.middlewares.report_download_middleware.ReportDownloadMiddleware',
       'crawlers.middlewares.id_gen_middleware.IdGenMiddleware']
      2026-03-27 14:00:10 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-03-27 14:00:10 [crawlers.pipelines.translation_pipeline] INFO: Loading translations for language: auto
      2026-03-27 14:00:10 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /Translations()?$filter=PartitionKey%20eq%20%27auto%27%20and%20RowKey%20eq%20%27auto%27 HTTP/1.1" 200 None
      2026-03-27 14:00:10 [crawlers.pipelines.item_rules_pipeline] INFO: Setting up ItemRules Pipeline
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_location_for_country.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_cars_from_auction_title.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_country.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_fr.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_photos.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_from_info.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_not_allowed.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: not_operable_from_info.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_models_not_allowed.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_title.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: imported_cars.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_currency.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_mileage.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_auction_title.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_country_of_origin.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_pt.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: electric_cars.json
      2026-03-27 14:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_color.json
      2026-03-27 14:00:10 [crawlers.pipelines.post_to_api] INFO: Setting up PostToApi Pipeline pointing to https://api.app.infinit.cc/api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing
      2026-03-27 14:00:10 [scrapy.middleware] INFO: Enabled item pipelines:
      ['crawlers.pipelines.translation_pipeline.TranslationPipeline',
       'crawlers.pipelines.item_rules_pipeline.ItemRulesPipeline',
       'crawlers.pipelines.post_to_api.PostToApiPipeline']
      2026-03-27 14:00:10 [scrapy.core.engine] INFO: Spider opened
      2026-03-27 14:00:10 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      2026-03-27 14:00:10 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6024
    • Tail

      2026-03-27 15:01:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SA20%2520DND') HTTP/1.1" 204 0
      2026-03-27 15:01:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SA22%2520FXB') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SA23%2520DYO') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SA59%2520HPN') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SA59%2520POH') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SB08%2520XMY') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SB14%2520VRY') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SB59%2520LRY') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SB61%2520NUC') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SB68%2520BFY') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SB69%2520GSY') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SC13%2520BZK') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SC16%2520GVD') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SD17%2520XZO') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SD20%2520HXZ') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SF66%2520YTH') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SG67%2520PFY') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SH10%2520NXG') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SH12%2520BRF') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SH61%2520ZGC') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SK22%2520AWH') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SK54%2520RLU') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SK59%2520LDY') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SL14%2520WPX') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SL67%2520FUM') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SL68%2520FXF') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SM10%2520KKO') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SN14%2520OWM') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SN72%2520HNC') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SP15%2520OKC') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SP17%2520ZGJ') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SP63%2520VTG') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SR22%2520HBA') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SR70%2520GXD') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='ST20%2520CDY') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='ST69%2520ZBP') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='SW57%2520CXB') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='TGZ%25206090') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VA18%2520JPU') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VA72%2520TVO') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VE72%2520NXK') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VEZ%25204065') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VK11%2520JUY') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VK11%2520RWE') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VK18%2520JWJ') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VK62%2520XRM') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VK65%2520LUZ') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VN61%2520UUS') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VO61%2520JUY') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VX14%2520MVH') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='VX63%2520BFA') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WD19%2520VSC') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WD69%2520MXF') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WF16%2520LZO') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WF21%2520YZM') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WF58%2520YRK') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WF59%2520AOZ') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WG10%2520WLK') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WG20%2520UJX') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WG65%2520XNK') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WG68%2520FZJ') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WH69%2520HRG') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WJ02%2520VGC') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WJ67%2520DBY') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WK60%2520SRZ') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WL12%2520PSO') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WM17%2520LPV') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WN06%2520UPC') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WN17%2520UTH') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WN60%2520XSR') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WN66%2520VGY') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WO73%2520UDZ') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WP09%2520OLG') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WP11%2520LLZ') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WP16%2520DZF') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WP52%2520DDJ') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WR59%2520FZE') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WU61%2520UTS') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WV11%2520HZC') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WV18%2520NZW') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WV20%2520DVA') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WV58%2520DVA') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WV60%2520WTL') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WV61%2520NXT') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WV64%2520VMW') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WV67%2520PYD') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WV69%2520UGL') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WX63%2520OYB') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='WX63%2520ZKK') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YA15%2520HPZ') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YB19%2520CZY') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YB59%2520OVU') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YB67%2520KXH') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YC61%2520XNF') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YC68%2520KAM') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YD14%2520WKO') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YD21%2520ZGU') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YD60%2520ANV') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YE70%2520VYL') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YF14%2520SYS') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YF59%2520JBE') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YG05%2520PHA') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YG06%2520XMK') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YG22%2520XUV') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YH23%2520UOK') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YH61%2520EXU') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YJ21%2520NBN') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YJ23%2520VFR') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YJZ%25206764') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YK06%2520ZBD') HTTP/1.1" 204 0
      2026-03-27 15:01:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YK09%2520HRX') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YK09%2520XZU') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YL66%2520KXH') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YM17%2520HLV') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YO66%2520LNY') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YP62%2520XWK') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YR02%2520WVH') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YR19%2520RJG') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YR74%2520XKC') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YS06%2520FHA') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YS15%2520EEJ') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YS20%2520DZJ') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YS67%2520OMZ') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YT13%2520NGX') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YT18%2520GRK') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YT19%2520FEW') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YX05%2520YTJ') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YX21%2520JZL') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YX22%2520NLP') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YX64%2520YRU') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YY21%2520VKU') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YY70%2520LTO') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='bca.uk',RowKey='YY74%2520HGL') HTTP/1.1" 204 0
      2026-03-27 15:01:49 [scrapy.extensions.feedexport] INFO: Stored jsonlines feed (3721 items) in: file:///var/lib/scrapyd/items/sourcing_v2/bca.uk/task_51_2026-03-27T14_00_00.jl
      2026-03-27 15:01:49 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
      {'downloader/exception_count': 475,
       'downloader/exception_type_count/scrapy.core.downloader.handlers.http11.TunnelError': 475,
       'downloader/request_bytes': 27746040,
       'downloader/request_count': 32833,
       'downloader/request_method_count/GET': 32833,
       'downloader/response_bytes': 1381086660,
       'downloader/response_count': 32358,
       'downloader/response_status_count/200': 16244,
       'downloader/response_status_count/302': 16114,
       'elapsed_time_seconds': 3698.336023,
       'feedexport/success_count/FileFeedStorage': 1,
       'finish_reason': 'finished',
       'finish_time': datetime.datetime(2026, 3, 27, 15, 1, 49, 174236, tzinfo=datetime.timezone.utc),
       'item_dropped_count': 649,
       'item_dropped_reasons_count/DropItem': 649,
       'item_scraped_count': 3721,
       'log_count/DEBUG': 209467,
       'log_count/ERROR': 208,
       'log_count/INFO': 172114,
       'log_count/WARNING': 432,
       'memusage/max': 230334464,
       'memusage/startup': 150654976,
       'photo_download_count': 14612,
       'pipeline/dropped_expired': 649,
       'playwright/context_count': 2,
       'playwright/context_count/max_concurrent': 1,
       'playwright/context_count/persistent/False': 2,
       'playwright/context_count/remote/False': 2,
       'playwright/page_count': 0,
       'request_depth_max': 144,
       'response_received_count': 1632,
       'scheduler/dequeued': 32833,
       'scheduler/dequeued/memory': 32833,
       'scheduler/enqueued': 32833,
       'scheduler/enqueued/memory': 32833,
       'scrape_type/new': 1527,
       'scrape_type/price_update': 2851,
       'scrape_type/skipped': 9318,
       'scrapy-zyte-api/429': 0,
       'scrapy-zyte-api/attempts': 145,
       'scrapy-zyte-api/error_ratio': 0.0,
       'scrapy-zyte-api/errors': 0,
       'scrapy-zyte-api/fatal_errors': 0,
       'scrapy-zyte-api/mean_connection_seconds': 12.077912474988867,
       'scrapy-zyte-api/mean_response_seconds': 12.66070118425735,
       'scrapy-zyte-api/processed': 145,
       'scrapy-zyte-api/request_args/actions': 1,
       'scrapy-zyte-api/request_args/browserHtml': 1,
       'scrapy-zyte-api/request_args/customHttpRequestHeaders': 144,
       'scrapy-zyte-api/request_args/experimental.requestCookies': 144,
       'scrapy-zyte-api/request_args/experimental.responseCookies': 145,
       'scrapy-zyte-api/request_args/httpResponseBody': 144,
       'scrapy-zyte-api/request_args/httpResponseHeaders': 144,
       'scrapy-zyte-api/request_args/sessionContext': 1,
       'scrapy-zyte-api/request_args/url': 145,
       'scrapy-zyte-api/status_codes/200': 145,
       'scrapy-zyte-api/success': 145,
       'scrapy-zyte-api/success_ratio': 1.0,
       'scrapy-zyte-api/throttle_ratio': 0.0,
       'source/items_encountered': 14284,
       'spider_exceptions/KeyError': 24,
       'spider_exceptions/TypeError': 1,
       'spider_exceptions/ValueError': 7,
       'start_time': datetime.datetime(2026, 3, 27, 14, 0, 10, 838213, tzinfo=datetime.timezone.utc)}
      2026-03-27 15:01:49 [scrapy.core.engine] INFO: Spider closed (finished)
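
      The long run of DELETE calls above is post-crawl cleanup removing plates from the ScrapedListings table one entity per request. A minimal sketch of the equivalent azure-data-tables call (the credential choice mirrors the DefaultAzureCredential login in Head; the row key is one example from the tail):

          from azure.data.tables import TableClient
          from azure.identity import DefaultAzureCredential

          client = TableClient(
              endpoint="https://alxsourcingstorageprod.table.core.windows.net",
              table_name="ScrapedListings",
              credential=DefaultAzureCredential(),
          )

          # Row keys contain a literal "%20", which urllib3 logs double-encoded
          # as %2520; each delete_entity call produces one DELETE line above.
          client.delete_entity(partition_key="bca.uk", row_key="SA20%20DND")
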
    • Log

      /1/log/utf8/sourcing_v2/bca.uk/task_51_2026-03-27T14_00_00/?job_finished=True

    • Source

      http://scrapyd-0:6800/logs/sourcing_v2/bca.uk/task_51_2026-03-27T14_00_00.log

  • sourcelog
    last_update_time: 2026-03-27 15:01:49
    last_update_timestamp: 1774623709
    downloader/exception_count: 475
    downloader/exception_type_count/scrapy.core.downloader.handlers.http11.TunnelError: 475
    downloader/request_bytes: 27746040
    downloader/request_count: 32833
    downloader/request_method_count/GET: 32833
    downloader/response_bytes: 1381086660
    downloader/response_count: 32358
    downloader/response_status_count/200: 16244
    downloader/response_status_count/302: 16114
    elapsed_time_seconds: 3698.336023
    feedexport/success_count/FileFeedStorage: 1
    finish_reason: finished
    finish_time: datetime.datetime(2026, 3, 27, 15, 1, 49, 174236, tzinfo=datetime.timezone.utc)
    item_dropped_count: 649
    item_dropped_reasons_count/DropItem: 649
    item_scraped_count: 3721
    log_count/DEBUG: 209467
    log_count/ERROR: 208
    log_count/INFO: 172114
    log_count/WARNING: 432
    memusage/max: 230334464
    memusage/startup: 150654976
    photo_download_count: 14612
    pipeline/dropped_expired: 649
    playwright/context_count: 2
    playwright/context_count/max_concurrent: 1
    playwright/context_count/persistent/False: 2
    playwright/context_count/remote/False: 2
    playwright/page_count: 0
    request_depth_max: 144
    response_received_count: 1632
    scheduler/dequeued: 32833
    scheduler/dequeued/memory: 32833
    scheduler/enqueued: 32833
    scheduler/enqueued/memory: 32833
    scrape_type/new: 1527
    scrape_type/price_update: 2851
    scrape_type/skipped: 9318
    scrapy-zyte-api/429: 0
    scrapy-zyte-api/attempts: 145
    scrapy-zyte-api/error_ratio: 0.0
    scrapy-zyte-api/errors: 0
    scrapy-zyte-api/fatal_errors: 0
    scrapy-zyte-api/mean_connection_seconds: 12.077912474988867
    scrapy-zyte-api/mean_response_seconds: 12.66070118425735
    scrapy-zyte-api/processed: 145
    scrapy-zyte-api/request_args/actions: 1
    scrapy-zyte-api/request_args/browserHtml: 1
    scrapy-zyte-api/request_args/customHttpRequestHeaders: 144
    scrapy-zyte-api/request_args/experimental.requestCookies: 144
    scrapy-zyte-api/request_args/experimental.responseCookies: 145
    scrapy-zyte-api/request_args/httpResponseBody: 144
    scrapy-zyte-api/request_args/httpResponseHeaders: 144
    scrapy-zyte-api/request_args/sessionContext: 1
    scrapy-zyte-api/request_args/url: 145
    scrapy-zyte-api/status_codes/200: 145
    scrapy-zyte-api/success: 145
    scrapy-zyte-api/success_ratio: 1.0
    scrapy-zyte-api/throttle_ratio: 0.0
    source/items_encountered: 14284
    spider_exceptions/KeyError: 24
    spider_exceptions/TypeError: 1
    spider_exceptions/ValueError: 7
    start_time: datetime.datetime(2026, 3, 27, 14, 0, 10, 838213, tzinfo=datetime.timezone.utc)