• LogParser v0.8.2, last updated at 2026-01-27 17:25:52, http://scrapyd-3:6800/logs/sourcing_v2/auto1.fr/task_31_2026-01-27T17_00_00.json

PROJECT (sourcing_v2), SPIDER (auto1.fr)

  • project: sourcing_v2
    spider: auto1.fr
    job: task_31_2026-01-27T17_00_00
    first_log_time: 2026-01-27 17:00:06
    latest_log_time: 2026-01-27 17:25:50
    runtime: 0:25:44
    crawled_pages: 1391
    scraped_items: 292
    shutdown_reason: N/A
    finish_reason: finished
    log_critical_count: 0
    log_error_count: 8
    log_warning_count: 0
    log_redirect_count: 0
    log_retry_count: 0
    log_ignore_count: 0
    latest_crawl:
    latest_scrape:
    latest_log:
    current_time:
    latest_item: N/A
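    • stats_json

      The figures above are also exposed as JSON at the URL in the report header. A minimal sketch for reading them programmatically, assuming the scrapyd-3 host is reachable from where the script runs and that the LogParser build in use keeps the key names shown in this report:

      # Sketch: fetch this job's LogParser stats as JSON (key names assumed
      # to match the fields listed above).
      import requests

      url = ("http://scrapyd-3:6800/logs/sourcing_v2/auto1.fr/"
             "task_31_2026-01-27T17_00_00.json")
      stats = requests.get(url, timeout=10).json()
      print(stats.get("crawled_pages"), stats.get("scraped_items"))
      print(stats.get("log_error_count"))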
    • error_logs
      8 in total

      2026-01-27 17:08:19 [crawlers.middlewares.monitoring_spider_middleware] ERROR: invalid literal for int() with base 10: 'other'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/auto1.py", line 226, in parse_api_search
          item = self._listing_from_search_result(car, channel, car_stock_number)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/auto1.py", line 336, in _listing_from_search_result
          item["doors_number"] = int(car.get("doors"))
                                 ^^^^^^^^^^^^^^^^^^^^^
      ValueError: invalid literal for int() with base 10: 'other'
      2026-01-27 17:08:19 [crawlers.middlewares.monitoring_spider_middleware] ERROR: invalid literal for int() with base 10: 'other'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/auto1.py", line 226, in parse_api_search
          item = self._listing_from_search_result(car, channel, car_stock_number)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/auto1.py", line 336, in _listing_from_search_result
          item["doors_number"] = int(car.get("doors"))
                                 ^^^^^^^^^^^^^^^^^^^^^
      ValueError: invalid literal for int() with base 10: 'other'
      2026-01-27 17:08:19 [crawlers.middlewares.monitoring_spider_middleware] ERROR: invalid literal for int() with base 10: 'other'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/auto1.py", line 226, in parse_api_search
          item = self._listing_from_search_result(car, channel, car_stock_number)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/auto1.py", line 336, in _listing_from_search_result
          item["doors_number"] = int(car.get("doors"))
                                 ^^^^^^^^^^^^^^^^^^^^^
      ValueError: invalid literal for int() with base 10: 'other'
      2026-01-27 17:08:19 [crawlers.middlewares.monitoring_spider_middleware] ERROR: invalid literal for int() with base 10: 'other'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/auto1.py", line 226, in parse_api_search
          item = self._listing_from_search_result(car, channel, car_stock_number)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/auto1.py", line 336, in _listing_from_search_result
          item["doors_number"] = int(car.get("doors"))
                                 ^^^^^^^^^^^^^^^^^^^^^
      ValueError: invalid literal for int() with base 10: 'other'
      2026-01-27 17:08:20 [crawlers.middlewares.monitoring_spider_middleware] ERROR: invalid literal for int() with base 10: 'other'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/auto1.py", line 226, in parse_api_search
          item = self._listing_from_search_result(car, channel, car_stock_number)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/auto1.py", line 336, in _listing_from_search_result
          item["doors_number"] = int(car.get("doors"))
                                 ^^^^^^^^^^^^^^^^^^^^^
      ValueError: invalid literal for int() with base 10: 'other'
      2026-01-27 17:08:20 [crawlers.middlewares.monitoring_spider_middleware] ERROR: invalid literal for int() with base 10: 'other'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/auto1.py", line 226, in parse_api_search
          item = self._listing_from_search_result(car, channel, car_stock_number)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/auto1.py", line 336, in _listing_from_search_result
          item["doors_number"] = int(car.get("doors"))
                                 ^^^^^^^^^^^^^^^^^^^^^
      ValueError: invalid literal for int() with base 10: 'other'
      2026-01-27 17:08:20 [crawlers.middlewares.monitoring_spider_middleware] ERROR: invalid literal for int() with base 10: 'other'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/auto1.py", line 226, in parse_api_search
          item = self._listing_from_search_result(car, channel, car_stock_number)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/auto1.py", line 336, in _listing_from_search_result
          item["doors_number"] = int(car.get("doors"))
                                 ^^^^^^^^^^^^^^^^^^^^^
      ValueError: invalid literal for int() with base 10: 'other'
      2026-01-27 17:08:20 [scrapy.core.scraper] ERROR: Spider error processing <POST https://www.auto1.com/v1/car-search/cars/search/cdb8adcc-312b-4040-b112-4a289f3f9b07> (referer: https://www.auto1.com/v1/car-search/cars/search/cdb8adcc-312b-4040-b112-4a289f3f9b07)
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/defer.py", line 295, in aiter_errback
          yield await it.__anext__()
                ^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy_zyte_api/_middlewares.py", line 206, in process_spider_output_async
          async for item_or_request in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/auto1.py", line 226, in parse_api_search
          item = self._listing_from_search_result(car, channel, car_stock_number)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/auto1.py", line 336, in _listing_from_search_result
          item["doors_number"] = int(car.get("doors"))
                                 ^^^^^^^^^^^^^^^^^^^^^
      ValueError: invalid literal for int() with base 10: 'other'
      2026-01-27 17:25:50 [asyncio] ERROR: Unclosed client session
      client_session: <aiohttp.client.ClientSession object at 0x7f1b1061f810>
      2026-01-27 17:25:50 [asyncio] ERROR: Unclosed client session
      client_session: <aiohttp.client.ClientSession object at 0x7f1b1043cc10>
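    • error_notes

      All eight spider errors above stem from the same ValueError: the auto1 search API returned the string 'other' in the doors field, and _listing_from_search_result (auto1.py, line 336) passes it straight to int(). The entry repeats once per spider-middleware hop because crawlers.middlewares.monitoring_spider_middleware logs the exception in process_spider_exception and then re-raises it, so each enclosing middleware in the chain reports it again. A minimal defensive sketch for the parsing step (an illustration only, not the project's actual fix; the helper name and the None fallback are assumptions):

      # Sketch: tolerate non-numeric door values such as 'other' instead of
      # letting int() raise and lose the whole search result.
      def parse_int_or_none(raw):
          try:
              return int(raw)
          except (TypeError, ValueError):
              return None

      # Sketch replacement for the failing line in _listing_from_search_result:
      item["doors_number"] = parse_int_or_none(car.get("doors"))

      The two 'Unclosed client session' messages at shutdown point to aiohttp ClientSession objects that were never closed before the event loop stopped. A hedged sketch of closing such a session when the spider finishes, assuming project code keeps the session as an attribute on the spider (the attribute name is illustrative) and a Scrapy version that accepts coroutine signal handlers (2.0+, satisfied by the 2.11.2 seen below):

      # Sketch: close an aiohttp session in the spider's closed() hook so
      # asyncio does not warn about unclosed client sessions at shutdown.
      import aiohttp
      import scrapy

      class Auto1Spider(scrapy.Spider):
          name = "auto1.fr"

          async def closed(self, reason):
              session = getattr(self, "api_session", None)  # assumed attribute
              if isinstance(session, aiohttp.ClientSession) and not session.closed:
                  await session.close()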

    • scrapy_version

      2.11.2
    • telnet_console

      127.0.0.1:6023
    • telnet_password

      86293d4ea249fcf2
    • latest_duplicate

      2026-01-27 17:00:18 [scrapy.dupefilters] DEBUG: Filtered duplicate request: <GET https://www.auto1.com/v1/car-details-view/JE80286/cdb8adcc-312b-4040-b112-4a289f3f9b07> - no more duplicates will be shown (see DUPEFILTER_DEBUG to show all duplicates)
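
      As the message notes, only the first filtered duplicate is logged by default; the DUPEFILTER_DEBUG setting it points to makes the dupefilter log every filtered request. A one-line settings sketch (placed wherever the project keeps its Scrapy settings):

      # Sketch: log every filtered duplicate request, not just the first one.
      DUPEFILTER_DEBUG = True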
    • latest_crawl

      2026-01-27 17:25:43 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://img-pa.auto1.com/img0f/6e/0f6efda318fa334805bcf7bd29b6da3e/pa/YT46166_d8a52a92aecf4869dc993b368a8ecdb1.png> (referer: https://www.auto1.com/)
    • latest_scrape

      2026-01-27 17:25:48 [scrapy.core.scraper] DEBUG: Scraped from <200 https://img-pa.auto1.com/img0f/6e/0f6efda318fa334805bcf7bd29b6da3e/pa/YT46166_d8a52a92aecf4869dc993b368a8ecdb1.png>
    • latest_stat

      2026-01-27 17:25:07 [scrapy.extensions.logstats] INFO: Crawled 1388 pages (at 10 pages/min), scraped 290 items (at 3 items/min)
    • Head

      2026-01-27 17:00:06 [scrapy.utils.log] INFO: Scrapy 2.11.2 started (bot: SourcingV2)
      2026-01-27 17:00:06 [scrapy.utils.log] INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.2 (main, Mar 13 2023, 12:18:29) [GCC 12.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-5.15.0-1098-azure-x86_64-with-glibc2.36
      2026-01-27 17:00:06 [auto1.fr] INFO: Starting spider auto1.fr
      2026-01-27 17:00:06 [scrapy.addons] INFO: Enabled addons:
      []
      2026-01-27 17:00:06 [asyncio] DEBUG: Using selector: EpollSelector
      2026-01-27 17:00:06 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.asyncioreactor.AsyncioSelectorReactor
      2026-01-27 17:00:06 [scrapy.utils.log] DEBUG: Using asyncio event loop: asyncio.unix_events._UnixSelectorEventLoop
      2026-01-27 17:00:06 [scrapy.extensions.telnet] INFO: Telnet Password: 86293d4ea249fcf2
      2026-01-27 17:00:06 [scrapy.middleware] INFO: Enabled extensions:
      ['scrapy.extensions.corestats.CoreStats',
       'scrapy.extensions.telnet.TelnetConsole',
       'scrapy.extensions.memusage.MemoryUsage',
       'scrapy.extensions.feedexport.FeedExporter',
       'scrapy.extensions.logstats.LogStats',
       'scrapy.extensions.closespider.CloseSpider']
      2026-01-27 17:00:06 [scrapy.crawler] INFO: Overridden settings:
      {'BOT_NAME': 'SourcingV2',
       'CLOSESPIDER_TIMEOUT': 7200,
       'FEED_EXPORT_ENCODING': 'utf-8',
       'LOG_FILE': '/var/log/scrapyd/logs/sourcing_v2/auto1.fr/task_31_2026-01-27T17_00_00.log',
       'LOG_FORMATTER': 'crawlers.log_formatter.SourcingLogFormatter',
       'NEWSPIDER_MODULE': 'spiders',
       'REQUEST_FINGERPRINTER_CLASS': 'scrapy_zyte_api.ScrapyZyteAPIRequestFingerprinter',
       'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
       'SPIDER_MODULES': ['spiders', 'auth_check'],
       'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor',
       'USER_AGENT': ''}
      2026-01-27 17:00:06 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2026-01-27 17:00:06 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2026-01-27 17:00:06 [scrapy.middleware] INFO: Enabled downloader middlewares:
      ['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
       'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
       'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
       'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
       'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPIDownloaderMiddleware',
       'scrapy.downloadermiddlewares.retry.RetryMiddleware',
       'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
       'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
       'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
       'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
       'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
       'scrapy.downloadermiddlewares.stats.DownloaderStats']
      2026-01-27 17:00:06 [crawlers.middlewares.id_gen_middleware] INFO: Setting up IdGenerationMiddleware
      2026-01-27 17:00:06 [scrapy.middleware] INFO: Enabled spider middlewares:
      ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPISpiderMiddleware',
       'crawlers.middlewares.monitoring_spider_middleware.MonitoringSpiderMiddleware',
       'scrapy.spidermiddlewares.referer.RefererMiddleware',
       'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
       'scrapy.spidermiddlewares.depth.DepthMiddleware',
       'crawlers.middlewares.photo_download_middleware.PhotoDownloadMiddleware',
       'crawlers.middlewares.report_download_middleware.ReportDownloadMiddleware',
       'crawlers.middlewares.id_gen_middleware.IdGenMiddleware']
      2026-01-27 17:00:07 [azure.identity._credentials.environment] INFO: Incomplete environment configuration for EnvironmentCredential. These variables are set: AZURE_CLIENT_ID, AZURE_TENANT_ID
      2026-01-27 17:00:07 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): login.microsoftonline.com:443
      2026-01-27 17:00:07 [urllib3.connectionpool] DEBUG: https://login.microsoftonline.com:443 "POST /8ea908c1-4e85-4692-bc3f-3646b9b40891/oauth2/v2.0/token HTTP/1.1" 200 2001
      2026-01-27 17:00:07 [azure.identity._credentials.chained] INFO: DefaultAzureCredential acquired a token from WorkloadIdentityCredential
      2026-01-27 17:00:07 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): alxsourcingstorageprod.table.core.windows.net:443
      2026-01-27 17:00:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-01-27 17:00:07 [crawlers.pipelines.translation_pipeline] INFO: Loading translations for language: auto
      2026-01-27 17:00:07 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /Translations()?$filter=PartitionKey%20eq%20%27auto%27%20and%20RowKey%20eq%20%27auto%27 HTTP/1.1" 200 None
      2026-01-27 17:00:07 [crawlers.pipelines.item_rules_pipeline] INFO: Setting up ItemRules Pipeline
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: low_mileage_for_country.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_mileage.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_location_for_country.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_cars_from_auction_title.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_country.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_fr.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_photos.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_from_info.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_not_allowed.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: not_operable_from_info.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: cars_too_new_for_country.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_price_for_currency.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_models_not_allowed.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_title.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: imported_cars.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_currency.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_mileage_for_country.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_mileage.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_auction_title.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_country_of_origin.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_pt.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: electric_cars.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: registration_date_old.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_mileage_for_make.json
      2026-01-27 17:00:07 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_color.json
      2026-01-27 17:00:07 [crawlers.pipelines.post_to_api] INFO: Setting up PostToApi Pipeline pointing to https://api.app.infinit.cc/api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing
      2026-01-27 17:00:07 [scrapy.middleware] INFO: Enabled item pipelines:
      ['crawlers.pipelines.translation_pipeline.TranslationPipeline',
       'crawlers.pipelines.item_rules_pipeline.ItemRulesPipeline',
       'crawlers.pipelines.post_to_api.PostToApiPipeline']
      2026-01-27 17:00:07 [scrapy.core.engine] INFO: Spider opened
      2026-01-27 17:00:07 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      2026-01-27 17:00:07 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
      2026-01-27 17:00:07 [scrapy-playwright] INFO: Starting download handler
      2026-01-27 17:00:07 [scrapy-playwright] INFO: Starting download handler
      2026-01-27 17:00:12 [zyte_api._retry] DEBUG: Starting call to 'zyte_api._async.AsyncZyteAPI.get.<locals>.request', this is the 1st time calling it.
    • Tail

      2026-01-27 17:25:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='CX24944') HTTP/1.1" 204 0
      2026-01-27 17:25:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='CZ71369') HTTP/1.1" 204 0
      2026-01-27 17:25:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='DA76883') HTTP/1.1" 204 0
      2026-01-27 17:25:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='DC26621') HTTP/1.1" 204 0
      2026-01-27 17:25:48 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='DE74551') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='DF28688') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='DG22885') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='DG76599') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='DG87105') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='DJ95926') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='DK33828') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='DP34492') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='DR11636') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='DY89660') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='EA26262') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='EF47158') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='EH64312') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='EN77753') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='ER48331') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='FB58010') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='FC29889') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='FD68554') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='FJ70386') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='FK08094') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='FT96295') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='FV32795') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='FV45595') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='GB51965') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='GH59417') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='GK86587') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='GN12835') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='GR63211') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='GT97216') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='GU24641') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='GU84031') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='HB74151') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='HC02477') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='HL08264') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='JA31400') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='JD57935') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='JF52733') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='JH69404') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='JJ33572') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='JN53316') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='JR28426') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='JX30299') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='JY15235') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='JY43883') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='KA91806') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='KE61647') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='KK58978') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='KM62242') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='KN08520') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='KN09669') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='KP18383') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='KU87382') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='KV69153') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='KZ18752') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='LE51320') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='LG11399') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='LK78611') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='LM98221') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='MJ57193') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='MJ88763') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='MP03922') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='MP71004') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='MT22652') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='MW81631') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='NF98293') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='NG22672') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='NN46751') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='NT02050') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='PA86969') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='PN12875') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='PP56854') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='PT36972') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='PX63392') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='RC06341') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='RE87298') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='RH90312') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='RJ14139') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='RS07385') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='RU38583') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='RY56744') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='SG47247') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='SL48973') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='SN17914') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='SU00836') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='SY97954') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='TG39799') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='TG52256') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='TJ65302') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='TL67788') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='TR79275') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='TV64888') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='TX15971') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='UF22188') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='UG65414') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='UL45229') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='UM70541') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='UP36954') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='UZ63623') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='VC05108') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='VD09967') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='VH02409') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='VL02976') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='VN36315') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='WD91609') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='WF89975') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='WH46926') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='WS32539') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='WX60162') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='WX85238') HTTP/1.1" 204 0
      2026-01-27 17:25:49 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='WZ04297') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='XC40325') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='XE43588') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='XE48625') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='XK43933') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='XK76571') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='XN96347') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='XR74380') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='XT97718') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='XU89386') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='XV03883') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='XV80427') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='XY30281') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='XZ73836') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='YA67284') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='YD73523') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='YF99481') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='YJ73584') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='YJ92364') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='YS50796') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='YU47350') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='YX41033') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='ZE40507') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='ZM77762') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='ZN64975') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='ZW80131') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='ZX35116') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.fr',RowKey='ZX46142') HTTP/1.1" 204 0
      2026-01-27 17:25:50 [scrapy.extensions.feedexport] INFO: Stored jsonlines feed (292 items) in: file:///var/lib/scrapyd/items/sourcing_v2/auto1.fr/task_31_2026-01-27T17_00_00.jl
      2026-01-27 17:25:50 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
      {'downloader/request_bytes': 6051487,
       'downloader/request_count': 3479,
       'downloader/request_method_count/GET': 3430,
       'downloader/request_method_count/POST': 49,
       'downloader/response_bytes': 495902462,
       'downloader/response_count': 3479,
       'downloader/response_status_count/200': 3479,
       'dupefilter/filtered': 5,
       'elapsed_time_seconds': 1542.820793,
       'feedexport/success_count/FileFeedStorage': 1,
       'finish_reason': 'finished',
       'finish_time': datetime.datetime(2026, 1, 27, 17, 25, 50, 268811, tzinfo=datetime.timezone.utc),
       'item_dropped_count': 378,
       'item_dropped_reasons_count/DropItem': 378,
       'item_scraped_count': 292,
       'log_count/DEBUG': 38755,
       'log_count/ERROR': 8,
       'log_count/INFO': 32339,
       'memusage/max': 201392128,
       'memusage/startup': 127016960,
       'photo_download_count': 2088,
       'request_depth_max': 51,
       'response_received_count': 1391,
       'scheduler/dequeued': 3479,
       'scheduler/dequeued/memory': 3479,
       'scheduler/enqueued': 3479,
       'scheduler/enqueued/memory': 3479,
       'scrapy-zyte-api/429': 0,
       'scrapy-zyte-api/attempts': 1408,
       'scrapy-zyte-api/error_ratio': 0.012073863636363636,
       'scrapy-zyte-api/errors': 17,
       "scrapy-zyte-api/exception_types/<class 'aiohttp.client_exceptions.ClientConnectorError'>": 17,
       'scrapy-zyte-api/fatal_errors': 0,
       'scrapy-zyte-api/mean_connection_seconds': 4.7834250520189014,
       'scrapy-zyte-api/mean_response_seconds': 4.803912077516202,
       'scrapy-zyte-api/processed': 1391,
       'scrapy-zyte-api/request_args/customHttpRequestHeaders': 1391,
       'scrapy-zyte-api/request_args/experimental.requestCookies': 1390,
       'scrapy-zyte-api/request_args/experimental.responseCookies': 1391,
       'scrapy-zyte-api/request_args/httpRequestBody': 49,
       'scrapy-zyte-api/request_args/httpRequestMethod': 49,
       'scrapy-zyte-api/request_args/httpResponseBody': 1391,
       'scrapy-zyte-api/request_args/httpResponseHeaders': 1391,
       'scrapy-zyte-api/request_args/sessionContext': 1391,
       'scrapy-zyte-api/request_args/url': 1391,
       'scrapy-zyte-api/status_codes/0': 17,
       'scrapy-zyte-api/status_codes/200': 1391,
       'scrapy-zyte-api/success': 1391,
       'scrapy-zyte-api/success_ratio': 1.0,
       'scrapy-zyte-api/throttle_ratio': 0.0,
       'spider_exceptions/ValueError': 1,
       'start_time': datetime.datetime(2026, 1, 27, 17, 0, 7, 448018, tzinfo=datetime.timezone.utc)}
      2026-01-27 17:25:50 [scrapy.core.engine] INFO: Spider closed (finished)
      2026-01-27 17:25:50 [asyncio] ERROR: Unclosed client session
      client_session: <aiohttp.client.ClientSession object at 0x7f1b1061f810>
      2026-01-27 17:25:50 [asyncio] ERROR: Unclosed client session
      client_session: <aiohttp.client.ClientSession object at 0x7f1b1043cc10>
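      The two "Unclosed client session" ERRORs above mean an aiohttp ClientSession created during the crawl was still open when the process shut down. A minimal sketch of closing such a session on spider close is shown below; the spider class name and the http_session attribute are assumptions for illustration, not names taken from this project.

      # Hypothetical sketch: close an aiohttp ClientSession when the spider
      # closes so the "Unclosed client session" errors above are not emitted.
      import scrapy
      from scrapy import signals


      class Auto1Spider(scrapy.Spider):
          name = "auto1.fr"

          @classmethod
          def from_crawler(cls, crawler, *args, **kwargs):
              spider = super().from_crawler(crawler, *args, **kwargs)
              crawler.signals.connect(spider.spider_closed, signal=signals.spider_closed)
              return spider

          async def spider_closed(self, spider):
              # spider_closed accepts deferred/async handlers, so the session
              # close should complete before the reactor shuts down.
              if getattr(self, "http_session", None) is not None:
                  await self.http_session.close()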
    • Log

      /3/log/utf8/sourcing_v2/auto1.fr/task_31_2026-01-27T17_00_00/?job_finished=True

    • Source

      http://scrapyd-3:6800/logs/sourcing_v2/auto1.fr/task_31_2026-01-27T17_00_00.log
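      The Source entry is the raw Scrapyd log for this job. A minimal sketch of pulling it for offline analysis, assuming scrapyd-3:6800 is reachable from wherever the script runs:

      # Download the raw job log listed under "Source" and tally lines per level.
      from collections import Counter
      import requests

      LOG_URL = "http://scrapyd-3:6800/logs/sourcing_v2/auto1.fr/task_31_2026-01-27T17_00_00.log"

      resp = requests.get(LOG_URL, timeout=30)
      resp.raise_for_status()

      levels = Counter()
      for line in resp.text.splitlines():
          for level in ("DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"):
              if f"] {level}:" in line:
                  levels[level] += 1
                  break

      print(levels)  # should roughly match the log_count/* stats in this report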

  • sourcelog
    last_update_time: 2026-01-27 17:25:50
    last_update_timestamp: 1769534750
    downloader/request_bytes: 6051487
    downloader/request_count: 3479
    downloader/request_method_count/GET: 3430
    downloader/request_method_count/POST: 49
    downloader/response_bytes: 495902462
    downloader/response_count: 3479
    downloader/response_status_count/200: 3479
    dupefilter/filtered: 5
    elapsed_time_seconds: 1542.820793
    feedexport/success_count/FileFeedStorage: 1
    finish_reason: finished
    finish_time: datetime.datetime(2026, 1, 27, 17, 25, 50, 268811, tzinfo=datetime.timezone.utc)
    item_dropped_count: 378
    item_dropped_reasons_count/DropItem: 378
    item_scraped_count: 292
    log_count/DEBUG: 38755
    log_count/ERROR: 8
    log_count/INFO: 32339
    memusage/max: 201392128
    memusage/startup: 127016960
    photo_download_count: 2088
    request_depth_max: 51
    response_received_count: 1391
    scheduler/dequeued: 3479
    scheduler/dequeued/memory: 3479
    scheduler/enqueued: 3479
    scheduler/enqueued/memory: 3479
    scrapy-zyte-api/429: 0
    scrapy-zyte-api/attempts: 1408
    scrapy-zyte-api/error_ratio: 0.012073863636363636
    scrapy-zyte-api/errors: 17
    scrapy-zyte-api/exception_types/<class 'aiohttp.client_exceptions.ClientConnectorError'>: 17
    scrapy-zyte-api/fatal_errors: 0
    scrapy-zyte-api/mean_connection_seconds: 4.7834250520189014
    scrapy-zyte-api/mean_response_seconds: 4.803912077516202
    scrapy-zyte-api/processed: 1391
    scrapy-zyte-api/request_args/customHttpRequestHeaders: 1391
    scrapy-zyte-api/request_args/experimental.requestCookies: 1390
    scrapy-zyte-api/request_args/experimental.responseCookies: 1391
    scrapy-zyte-api/request_args/httpRequestBody: 49
    scrapy-zyte-api/request_args/httpRequestMethod: 49
    scrapy-zyte-api/request_args/httpResponseBody: 1391
    scrapy-zyte-api/request_args/httpResponseHeaders: 1391
    scrapy-zyte-api/request_args/sessionContext: 1391
    scrapy-zyte-api/request_args/url: 1391
    scrapy-zyte-api/status_codes/0: 17
    scrapy-zyte-api/status_codes/200: 1391
    scrapy-zyte-api/success: 1391
    scrapy-zyte-api/success_ratio: 1.0
    scrapy-zyte-api/throttle_ratio: 0.0
    spider_exceptions/ValueError: 1
    start_time: datetime.datetime(2026, 1, 27, 17, 0, 7, 448018, tzinfo=datetime.timezone.utc)
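    A few of the derived Zyte API figures can be re-checked from the raw counters above, which is a quick sanity check when comparing runs. The snippet below uses only values listed in this report; the exact ratio definitions are plausible derivations that reproduce the logged numbers, not confirmed plugin internals.

    # Re-derive the Zyte API ratios from the counters above.
    attempts = 1408      # scrapy-zyte-api/attempts
    errors = 17          # scrapy-zyte-api/errors
    processed = 1391     # scrapy-zyte-api/processed
    successes = 1391     # scrapy-zyte-api/success

    assert processed + errors == attempts          # 1391 + 17 == 1408

    print(errors / attempts)        # 0.012073863636363636, matches error_ratio
    print(successes / processed)    # 1.0, consistent with success_ratio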