• LogParser v0.8.2, last updated at 2025-12-01 06:00:58, http://scrapyd-2:6800/logs/sourcing_v2/manheim.gb/task_37_2025-12-01T06_00_02.json

PROJECT (sourcing_v2), SPIDER (manheim.gb)

  • project: sourcing_v2
    spider: manheim.gb
    job: task_37_2025-12-01T06_00_02
    first_log_time: 2025-12-01 06:00:09
    latest_log_time: 2025-12-01 06:00:48
    runtime: 0:00:39
    crawled_pages: 2
    scraped_items: 0
    shutdown_reason: N/A
    finish_reason: finished
    log_critical_count: 0
    log_error_count: 8
    log_warning_count: 0
    log_redirect_count: 0
    log_retry_count: 0
    log_ignore_count: 0
    latest_crawl:
    latest_scrape:
    latest_log:
    current_time:
    latest_item: N/A

    • error_logs
      8 in total

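      All eight errors below share one root cause: crawlers/items.py (line 115) raises ValueError when a non-nullable field is assigned None, and the spider's _extract_listing_item sets item["odometer"] to a missing value. The following is a minimal sketch of what such a guard could look like — the class name, the NON_NULLABLE set, and the second field are assumptions inferred from the traceback, not the project's actual code:

      ```python
      # Hypothetical reconstruction of the non-nullable field guard implied by
      # the traceback (crawlers/items.py, line 115). Names are inferred.

      class ListingItem(dict):
          # Fields that must never be set to None (assumed set).
          NON_NULLABLE = {"odometer"}

          def __setitem__(self, key, value):
              # Reject None for protected fields; mirrors the logged message.
              if key in self.NON_NULLABLE and value is None:
                  raise ValueError(f"Field {key} is not nullable")
              super().__setitem__(key, value)


      item = ListingItem()
      item["make"] = None          # a nullable field: accepted
      try:
          item["odometer"] = None  # reproduces the logged error
      except ValueError as e:
          print(e)                 # Field odometer is not nullable
      ```

      Under this reading, a likely remediation would be for parse_listings to validate or skip listings whose odometer is absent before assignment, rather than letting the item class raise mid-pipeline.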
      2025-12-01 06:00:41 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field odometer is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/manheim.py", line 201, in parse_listings
          scraped_item = self._extract_listing_item(item, expiration_date)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/manheim.py", line 281, in _extract_listing_item
          item["odometer"] = odometer
          ~~~~^^^^^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 115, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field odometer is not nullable
      2025-12-01 06:00:42 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field odometer is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/manheim.py", line 201, in parse_listings
          scraped_item = self._extract_listing_item(item, expiration_date)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/manheim.py", line 281, in _extract_listing_item
          item["odometer"] = odometer
          ~~~~^^^^^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 115, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field odometer is not nullable
      2025-12-01 06:00:42 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field odometer is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/manheim.py", line 201, in parse_listings
          scraped_item = self._extract_listing_item(item, expiration_date)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/manheim.py", line 281, in _extract_listing_item
          item["odometer"] = odometer
          ~~~~^^^^^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 115, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field odometer is not nullable
      2025-12-01 06:00:42 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field odometer is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/manheim.py", line 201, in parse_listings
          scraped_item = self._extract_listing_item(item, expiration_date)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/manheim.py", line 281, in _extract_listing_item
          item["odometer"] = odometer
          ~~~~^^^^^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 115, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field odometer is not nullable
      2025-12-01 06:00:42 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field odometer is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/manheim.py", line 201, in parse_listings
          scraped_item = self._extract_listing_item(item, expiration_date)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/manheim.py", line 281, in _extract_listing_item
          item["odometer"] = odometer
          ~~~~^^^^^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 115, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field odometer is not nullable
      2025-12-01 06:00:42 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field odometer is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/manheim.py", line 201, in parse_listings
          scraped_item = self._extract_listing_item(item, expiration_date)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/manheim.py", line 281, in _extract_listing_item
          item["odometer"] = odometer
          ~~~~^^^^^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 115, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field odometer is not nullable
      2025-12-01 06:00:42 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Field odometer is not nullable
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/manheim.py", line 201, in parse_listings
          scraped_item = self._extract_listing_item(item, expiration_date)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/manheim.py", line 281, in _extract_listing_item
          item["odometer"] = odometer
          ~~~~^^^^^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 115, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field odometer is not nullable
      2025-12-01 06:00:42 [scrapy.core.scraper] ERROR: Spider error processing <POST https://www.manheim.co.uk/search/refine> (referer: https://www.manheim.co.uk/search)
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/defer.py", line 295, in aiter_errback
          yield await it.__anext__()
                ^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy_zyte_api/_middlewares.py", line 206, in process_spider_output_async
          async for item_or_request in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/manheim.py", line 201, in parse_listings
          scraped_item = self._extract_listing_item(item, expiration_date)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/manheim.py", line 281, in _extract_listing_item
          item["odometer"] = odometer
          ~~~~^^^^^^^^^^^^
        File "/usr/src/app/crawlers/items.py", line 115, in __setitem__
          raise ValueError(f"Field {key} is not nullable")
      ValueError: Field odometer is not nullable
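
A minimal sketch of the failure mode above: an item class whose `__setitem__` rejects `None` for required fields, plus a defensive extraction that skips the assignment when the value is missing. The `ListingItem` class, the `NON_NULLABLE_FIELDS` set, and `extract_odometer` are illustrative assumptions — the real `crawlers/items.py` and spider code are not shown in this log.

```python
class ListingItem(dict):
    """Hypothetical stand-in for the item defined in crawlers/items.py."""

    NON_NULLABLE_FIELDS = {"odometer"}  # assumed; the actual set is unknown

    def __setitem__(self, key, value):
        # Reject None for required fields, mirroring the ValueError
        # raised at crawlers/items.py line 115 in the traceback.
        if value is None and key in self.NON_NULLABLE_FIELDS:
            raise ValueError(f"Field {key} is not nullable")
        super().__setitem__(key, value)


def extract_odometer(raw):
    """Defensive parse: return an int, or None when the source value is absent."""
    try:
        return int(str(raw).replace(",", "").strip())
    except (TypeError, ValueError):
        return None


item = ListingItem()
odometer = extract_odometer("12,345")
if odometer is not None:
    # Guard before assigning, so a listing with no mileage is skipped
    # (or routed to a filter rule) instead of aborting the parse callback.
    item["odometer"] = odometer
```

Under this reading, the eight errors mean eight listings returned no parsable odometer value, and each one aborted `parse_listings` mid-iteration; checking for `None` before the assignment (or relaxing the field) would keep the remaining listings flowing.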

      INFO

      DEBUG

    • scrapy_version

      2.11.2
    • telnet_console

      127.0.0.1:6023
    • telnet_password

      2bef58345b08d6ef
    • latest_crawl

      2025-12-01 06:00:41 [scrapy.core.engine] DEBUG: Crawled (200) <POST https://www.manheim.co.uk/search/refine> (referer: https://www.manheim.co.uk/search) ['zyte-api']
    • latest_stat

      2025-12-01 06:00:09 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
    • Head

      2025-12-01 06:00:09 [scrapy.utils.log] INFO: Scrapy 2.11.2 started (bot: SourcingV2)
      2025-12-01 06:00:09 [scrapy.utils.log] INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.2 (main, Mar 13 2023, 12:18:29) [GCC 12.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-5.15.0-1098-azure-x86_64-with-glibc2.36
      2025-12-01 06:00:09 [manheim.gb] INFO: Starting spider manheim.gb
      2025-12-01 06:00:09 [scrapy.addons] INFO: Enabled addons:
      []
      2025-12-01 06:00:09 [asyncio] DEBUG: Using selector: EpollSelector
      2025-12-01 06:00:09 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.asyncioreactor.AsyncioSelectorReactor
      2025-12-01 06:00:09 [scrapy.utils.log] DEBUG: Using asyncio event loop: asyncio.unix_events._UnixSelectorEventLoop
      2025-12-01 06:00:09 [scrapy.extensions.telnet] INFO: Telnet Password: 2bef58345b08d6ef
      2025-12-01 06:00:09 [scrapy.middleware] INFO: Enabled extensions:
      ['scrapy.extensions.corestats.CoreStats',
       'scrapy.extensions.telnet.TelnetConsole',
       'scrapy.extensions.memusage.MemoryUsage',
       'scrapy.extensions.feedexport.FeedExporter',
       'scrapy.extensions.logstats.LogStats',
       'scrapy.extensions.closespider.CloseSpider']
      2025-12-01 06:00:09 [scrapy.crawler] INFO: Overridden settings:
      {'BOT_NAME': 'SourcingV2',
       'CLOSESPIDER_TIMEOUT': 7200,
       'FEED_EXPORT_ENCODING': 'utf-8',
       'LOG_FILE': '/var/log/scrapyd/logs/sourcing_v2/manheim.gb/task_37_2025-12-01T06_00_02.log',
       'LOG_FORMATTER': 'crawlers.log_formatter.SourcingLogFormatter',
       'NEWSPIDER_MODULE': 'spiders',
       'REQUEST_FINGERPRINTER_CLASS': 'scrapy_zyte_api.ScrapyZyteAPIRequestFingerprinter',
       'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
       'SPIDER_MODULES': ['spiders', 'auth_check'],
       'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor',
       'USER_AGENT': ''}
      2025-12-01 06:00:09 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2025-12-01 06:00:09 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2025-12-01 06:00:09 [scrapy.middleware] INFO: Enabled downloader middlewares:
      ['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
       'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
       'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
       'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
       'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPIDownloaderMiddleware',
       'scrapy.downloadermiddlewares.retry.RetryMiddleware',
       'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
       'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
       'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
       'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
       'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
       'scrapy.downloadermiddlewares.stats.DownloaderStats']
      2025-12-01 06:00:09 [crawlers.middlewares.id_gen_middleware] INFO: Setting up IdGenerationMiddleware
      2025-12-01 06:00:09 [scrapy.middleware] INFO: Enabled spider middlewares:
      ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPISpiderMiddleware',
       'crawlers.middlewares.monitoring_spider_middleware.MonitoringSpiderMiddleware',
       'scrapy.spidermiddlewares.referer.RefererMiddleware',
       'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
       'scrapy.spidermiddlewares.depth.DepthMiddleware',
       'crawlers.middlewares.photo_download_middleware.PhotoDownloadMiddleware',
       'crawlers.middlewares.report_download_middleware.ReportDownloadMiddleware',
       'crawlers.middlewares.id_gen_middleware.IdGenMiddleware']
      2025-12-01 06:00:09 [azure.identity._credentials.environment] INFO: Incomplete environment configuration for EnvironmentCredential. These variables are set: AZURE_TENANT_ID, AZURE_CLIENT_ID
      2025-12-01 06:00:09 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): login.microsoftonline.com:443
      2025-12-01 06:00:09 [urllib3.connectionpool] DEBUG: https://login.microsoftonline.com:443 "POST /8ea908c1-4e85-4692-bc3f-3646b9b40891/oauth2/v2.0/token HTTP/1.1" 200 1971
      2025-12-01 06:00:09 [azure.identity._credentials.chained] INFO: DefaultAzureCredential acquired a token from WorkloadIdentityCredential
      2025-12-01 06:00:09 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): alxsourcingstorageprod.table.core.windows.net:443
      2025-12-01 06:00:09 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2025-12-01 06:00:09 [crawlers.pipelines.translation_pipeline] INFO: Loading translations for language: auto
      2025-12-01 06:00:09 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /Translations()?$filter=PartitionKey%20eq%20%27auto%27%20and%20RowKey%20eq%20%27auto%27 HTTP/1.1" 200 None
      2025-12-01 06:00:09 [crawlers.pipelines.item_rules_pipeline] INFO: Setting up ItemRules Pipeline
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: low_mileage_for_country.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_mileage.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_location_for_country.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_cars_from_auction_title.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_country.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_fr.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_photos.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_from_info.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_not_allowed.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: not_operable_from_info.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: cars_too_new_for_country.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_price_for_currency.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_models_not_allowed.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_title.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: imported_cars.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_currency.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_mileage_for_country.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_mileage.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_auction_title.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_country_of_origin.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_pt.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: electric_cars.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: registration_date_old.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_mileage_for_make.json
      2025-12-01 06:00:09 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_color.json
      2025-12-01 06:00:09 [crawlers.pipelines.post_to_api] INFO: Setting up PostToApi Pipeline pointing to https://api.app.infinit.cc/api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing
      2025-12-01 06:00:09 [scrapy.middleware] INFO: Enabled item pipelines:
      ['crawlers.pipelines.translation_pipeline.TranslationPipeline',
       'crawlers.pipelines.item_rules_pipeline.ItemRulesPipeline',
       'crawlers.pipelines.post_to_api.PostToApiPipeline']
      2025-12-01 06:00:09 [scrapy.core.engine] INFO: Spider opened
      2025-12-01 06:00:09 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      2025-12-01 06:00:09 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
      2025-12-01 06:00:09 [scrapy-playwright] INFO: Starting download handler
      2025-12-01 06:00:09 [scrapy-playwright] INFO: Starting download handler
      2025-12-01 06:00:14 [scrapy-playwright] INFO: Launching browser firefox
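
The error_logs traceback passes through `monitoring_spider_middleware.py` line 73 (`raise exception`) once per enabled spider middleware, which is why the same two frames repeat for referer, urllength, depth, photo-download, report-download, and id-gen hops. A plausible sketch of that pattern — a spider middleware that records the failure and re-raises so the rest of the chain, and finally Scrapy, still sees it. The class body here is an assumption; the real `monitoring_spider_middleware.py` is not shown in this log.

```python
import logging

logger = logging.getLogger(__name__)


class MonitoringSpiderMiddleware:
    """Hypothetical sketch of the re-raise pattern seen in the traceback."""

    def process_spider_exception(self, response, exception, spider):
        # Record the failure for monitoring, then propagate it unchanged.
        # Returning None (or an iterable) would swallow the exception;
        # re-raising hands it to the next middleware in the chain.
        logger.error("Spider exception: %s", exception)
        raise exception
```

Because each middleware's exception hook re-raises rather than handling, the original `ValueError` surfaces intact at every level, producing the long but repetitive traceback above.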
    • Tail

      2025-12-01 06:00:46 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3942709') HTTP/1.1" 204 0
      2025-12-01 06:00:46 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3942970') HTTP/1.1" 204 0
      2025-12-01 06:00:46 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3942984') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3943021') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3943054') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3943141') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3943349') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3943410') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3943413') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3943419') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3943424') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3943434') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3943544') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3943548') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3943567') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3943582') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3943680') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3943701') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3944163') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3944292') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3944358') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3944370') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3944583') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3944672') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3944691') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3944692') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3944699') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3944792') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3944933') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3944996') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3945038') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3945081') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3945083') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3945113') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3945219') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3945309') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3945579') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3945606') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3945999') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3946011') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3946020') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3946022') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3946026') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3946220') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3946432') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3946438') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3946756') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3946809') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3946897') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3946924') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3947081') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3947111') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3947328') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3947429') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3947750') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3947778') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3947903') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3947905') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3947906') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3947912') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3947992') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948019') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948139') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948186') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948245') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948256') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948289') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948341') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948344') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948348') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948363') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948366') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948367') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948372') HTTP/1.1" 204 0
      2025-12-01 06:00:47 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948426') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948628') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948639') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948810') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948879') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948881') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948935') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3948997') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3949029') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3949154') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3949256') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3949344') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3949372') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3949531') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3949584') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3949753') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3949893') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3949908') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3949913') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3950136') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3950150') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3950193') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3950501') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3950750') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3950751') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3950752') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3950926') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3950932') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3950951') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951046') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951056') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951087') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951151') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951154') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951155') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951159') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951182') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951190') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951191') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951193') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951194') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951195') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951196') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951197') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951198') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951199') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951200') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951201') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951202') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951205') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951213') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951214') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951225') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='manheim.gb',RowKey='3951262') HTTP/1.1" 204 0
      2025-12-01 06:00:48 [scrapy.extensions.feedexport] INFO: Stored jsonlines feed (0 items) in: file:///var/lib/scrapyd/items/sourcing_v2/manheim.gb/task_37_2025-12-01T06_00_02.jl
      2025-12-01 06:00:48 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
      {'downloader/request_bytes': 1566,
       'downloader/request_count': 2,
       'downloader/request_method_count/GET': 1,
       'downloader/request_method_count/POST': 1,
       'downloader/response_bytes': 638949,
       'downloader/response_count': 2,
       'downloader/response_status_count/200': 2,
       'elapsed_time_seconds': 39.066208,
       'feedexport/success_count/FileFeedStorage': 1,
       'finish_reason': 'finished',
       'finish_time': datetime.datetime(2025, 12, 1, 6, 0, 48, 774459, tzinfo=datetime.timezone.utc),
       'log_count/DEBUG': 563,
       'log_count/ERROR': 8,
       'log_count/INFO': 67,
       'memusage/max': 127590400,
       'memusage/startup': 127590400,
       'playwright/context_count': 1,
       'playwright/context_count/max_concurrent': 1,
       'playwright/context_count/persistent/False': 1,
       'playwright/context_count/remote/False': 1,
       'playwright/page_count': 1,
       'playwright/page_count/max_concurrent': 1,
       'playwright/request_count': 85,
       'playwright/request_count/aborted': 47,
       'playwright/request_count/method/GET': 82,
       'playwright/request_count/method/POST': 3,
       'playwright/request_count/navigation': 4,
       'playwright/request_count/resource_type/document': 4,
       'playwright/request_count/resource_type/font': 8,
       'playwright/request_count/resource_type/image': 41,
       'playwright/request_count/resource_type/script': 22,
       'playwright/request_count/resource_type/stylesheet': 8,
       'playwright/request_count/resource_type/xhr': 2,
       'playwright/response_count': 38,
       'playwright/response_count/method/GET': 35,
       'playwright/response_count/method/POST': 3,
       'playwright/response_count/resource_type/document': 4,
       'playwright/response_count/resource_type/font': 8,
       'playwright/response_count/resource_type/script': 16,
       'playwright/response_count/resource_type/stylesheet': 8,
       'playwright/response_count/resource_type/xhr': 2,
       'request_depth_max': 1,
       'response_received_count': 2,
       'scheduler/dequeued': 2,
       'scheduler/dequeued/memory': 2,
       'scheduler/enqueued': 2,
       'scheduler/enqueued/memory': 2,
       'scrapy-zyte-api/429': 0,
       'scrapy-zyte-api/attempts': 1,
       'scrapy-zyte-api/error_ratio': 0.0,
       'scrapy-zyte-api/errors': 0,
       'scrapy-zyte-api/fatal_errors': 0,
       'scrapy-zyte-api/mean_connection_seconds': 1.9594592639477924,
       'scrapy-zyte-api/mean_response_seconds': 2.1958086569793522,
       'scrapy-zyte-api/processed': 1,
       'scrapy-zyte-api/request_args/customHttpRequestHeaders': 1,
       'scrapy-zyte-api/request_args/experimental.requestCookies': 1,
       'scrapy-zyte-api/request_args/experimental.responseCookies': 1,
       'scrapy-zyte-api/request_args/httpRequestBody': 1,
       'scrapy-zyte-api/request_args/httpRequestMethod': 1,
       'scrapy-zyte-api/request_args/httpResponseBody': 1,
       'scrapy-zyte-api/request_args/httpResponseHeaders': 1,
       'scrapy-zyte-api/request_args/url': 1,
       'scrapy-zyte-api/status_codes/200': 1,
       'scrapy-zyte-api/success': 1,
       'scrapy-zyte-api/success_ratio': 1.0,
       'scrapy-zyte-api/throttle_ratio': 0.0,
       'spider_exceptions/ValueError': 1,
       'start_time': datetime.datetime(2025, 12, 1, 6, 0, 9, 708251, tzinfo=datetime.timezone.utc)}
      2025-12-01 06:00:48 [scrapy.core.engine] INFO: Spider closed (finished)
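
      The stats dump above can be checked programmatically. Below is a minimal health-check sketch; the dict literal copies a few values straight from the dump (`item_scraped_count` is absent there, which corresponds to the 0 scraped items reported in the header), and the flag names in `job_health` are illustrative, not part of the crawler.

      ```python
      # Subset of the dumped Scrapy stats, copied from the log above.
      stats = {
          "downloader/request_count": 2,
          "log_count/ERROR": 8,
          "item_scraped_count": 0,  # key absent from the dump, i.e. zero items
          "playwright/request_count": 85,
          "playwright/request_count/aborted": 47,
          "spider_exceptions/ValueError": 1,
          "finish_reason": "finished",
      }

      def job_health(stats: dict) -> dict:
          """Derive a few red flags from raw Scrapy stats."""
          aborted_ratio = (
              stats.get("playwright/request_count/aborted", 0)
              / max(stats.get("playwright/request_count", 1), 1)
          )
          return {
              "finished_cleanly": stats.get("finish_reason") == "finished",
              "has_errors": stats.get("log_count/ERROR", 0) > 0,
              "scraped_nothing": stats.get("item_scraped_count", 0) == 0,
              "playwright_aborted_ratio": round(aborted_ratio, 2),
          }

      health = job_health(stats)
      ```

      For this run the check would report a job that finished cleanly yet logged 8 errors and scraped 0 items, i.e. exactly the kind of "finished but broken" job that `finish_reason` alone does not surface.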
      
    • Log

      /2/log/utf8/sourcing_v2/manheim.gb/task_37_2025-12-01T06_00_02/?job_finished=True

    • Source

      http://scrapyd-2:6800/logs/sourcing_v2/manheim.gb/task_37_2025-12-01T06_00_02.log

  • sourcelog
    last_update_time: 2025-12-01 06:00:48
    last_update_timestamp: 1764568848
    downloader/request_bytes: 1566
    downloader/request_count: 2
    downloader/request_method_count/GET: 1
    downloader/request_method_count/POST: 1
    downloader/response_bytes: 638949
    downloader/response_count: 2
    downloader/response_status_count/200: 2
    elapsed_time_seconds: 39.066208
    feedexport/success_count/FileFeedStorage: 1
    finish_reason: finished
    finish_time: datetime.datetime(2025, 12, 1, 6, 0, 48, 774459, tzinfo=datetime.timezone.utc)
    log_count/DEBUG: 563
    log_count/ERROR: 8
    log_count/INFO: 67
    memusage/max: 127590400
    memusage/startup: 127590400
    playwright/context_count: 1
    playwright/context_count/max_concurrent: 1
    playwright/context_count/persistent/False: 1
    playwright/context_count/remote/False: 1
    playwright/page_count: 1
    playwright/page_count/max_concurrent: 1
    playwright/request_count: 85
    playwright/request_count/aborted: 47
    playwright/request_count/method/GET: 82
    playwright/request_count/method/POST: 3
    playwright/request_count/navigation: 4
    playwright/request_count/resource_type/document: 4
    playwright/request_count/resource_type/font: 8
    playwright/request_count/resource_type/image: 41
    playwright/request_count/resource_type/script: 22
    playwright/request_count/resource_type/stylesheet: 8
    playwright/request_count/resource_type/xhr: 2
    playwright/response_count: 38
    playwright/response_count/method/GET: 35
    playwright/response_count/method/POST: 3
    playwright/response_count/resource_type/document: 4
    playwright/response_count/resource_type/font: 8
    playwright/response_count/resource_type/script: 16
    playwright/response_count/resource_type/stylesheet: 8
    playwright/response_count/resource_type/xhr: 2
    request_depth_max: 1
    response_received_count: 2
    scheduler/dequeued: 2
    scheduler/dequeued/memory: 2
    scheduler/enqueued: 2
    scheduler/enqueued/memory: 2
    scrapy-zyte-api/429: 0
    scrapy-zyte-api/attempts: 1
    scrapy-zyte-api/error_ratio: 0.0
    scrapy-zyte-api/errors: 0
    scrapy-zyte-api/fatal_errors: 0
    scrapy-zyte-api/mean_connection_seconds: 1.9594592639477924
    scrapy-zyte-api/mean_response_seconds: 2.1958086569793522
    scrapy-zyte-api/processed: 1
    scrapy-zyte-api/request_args/customHttpRequestHeaders: 1
    scrapy-zyte-api/request_args/experimental.requestCookies: 1
    scrapy-zyte-api/request_args/experimental.responseCookies: 1
    scrapy-zyte-api/request_args/httpRequestBody: 1
    scrapy-zyte-api/request_args/httpRequestMethod: 1
    scrapy-zyte-api/request_args/httpResponseBody: 1
    scrapy-zyte-api/request_args/httpResponseHeaders: 1
    scrapy-zyte-api/request_args/url: 1
    scrapy-zyte-api/status_codes/200: 1
    scrapy-zyte-api/success: 1
    scrapy-zyte-api/success_ratio: 1.0
    scrapy-zyte-api/throttle_ratio: 0.0
    spider_exceptions/ValueError: 1
    start_time: datetime.datetime(2025, 12, 1, 6, 0, 9, 708251, tzinfo=datetime.timezone.utc)
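
    The sourcelog block reports both a human-readable last_update_time and a Unix last_update_timestamp; the two should agree. A quick consistency check, assuming the displayed time is UTC (as the tz-aware finish_time in the stats suggests):

    ```python
    from datetime import datetime, timezone

    # Values copied from the sourcelog block above.
    last_update_time = datetime(2025, 12, 1, 6, 0, 48, tzinfo=timezone.utc)
    last_update_timestamp = 1764568848

    # .timestamp() on a tz-aware datetime converts to seconds since the epoch.
    assert int(last_update_time.timestamp()) == last_update_timestamp
    ```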