• LogParser v0.8.2, last updated at 2025-12-05 17:11:47, http://scrapyd-1:6800/logs/sourcing_v2/auto1.es/task_25_2025-12-05T17_00_00.json

PROJECT (sourcing_v2), SPIDER (auto1.es)

  • project: sourcing_v2
    spider: auto1.es
    job: task_25_2025-12-05T17_00_00
    first_log_time: 2025-12-05 17:00:07
    latest_log_time: 2025-12-05 17:11:46
    runtime: 0:11:39
    crawled_pages: 786
    scraped_items: 234
    shutdown_reason: N/A
    finish_reason: finished
    log_critical_count: 0
    log_error_count: 10
    log_warning_count: 0
    log_redirect_count: 0
    log_retry_count: 0
    log_ignore_count: 0
    latest_crawl:
    latest_scrape:
    latest_log:
    current_time:
    latest_item: N/A

    • error_logs
      10 in total

      2025-12-05 17:01:47 [crawlers.middlewares.monitoring_spider_middleware] ERROR: invalid literal for int() with base 10: 'other'
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/auto1.py", line 226, in parse_api_search
          item = self._listing_from_search_result(car, channel, car_stock_number)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/auto1.py", line 336, in _listing_from_search_result
          item["doors_number"] = int(car.get("doors"))
                                 ^^^^^^^^^^^^^^^^^^^^^
      ValueError: invalid literal for int() with base 10: 'other'
      2025-12-05 17:01:48 [crawlers.middlewares.monitoring_spider_middleware] ERROR: invalid literal for int() with base 10: 'other'
      (The same ValueError is logged five more times at 17:01:48 by monitoring_spider_middleware. Every traceback ends in the same two spider frames, auto1.py line 226 in parse_api_search and line 336 in _listing_from_search_result; only the surrounding Scrapy middleware frames differ, so the duplicate tracebacks are omitted here.)
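      All of the ValueErrors above come from the same call at auto1.py line 336, where the API value for "doors" ('other') is passed straight to int(). A minimal defensive sketch of one possible fix, assuming a helper is acceptable here (the name parse_int and the None default are assumptions, not the project's actual code):

      ```python
      def parse_int(value, default=None):
          """Coerce an API field to int; return `default` instead of
          raising when the value is missing or non-numeric ('other')."""
          try:
              return int(value)
          except (TypeError, ValueError):
              return default

      # The failing assignment could then read:
      # item["doors_number"] = parse_int(car.get("doors"))

      print(parse_int("4"))      # numeric string parses normally -> 4
      print(parse_int("other"))  # non-numeric -> None
      print(parse_int(None))     # missing field -> None
      ```

      Returning None rather than raising keeps the item in the pipeline with an unknown door count instead of dropping the whole listing, which matches the symptom here (scraped_items lower than expected while the spider otherwise finishes cleanly).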
      2025-12-05 17:01:48 [scrapy.core.scraper] ERROR: Spider error processing <POST https://www.auto1.com/v1/car-search/cars/search/cdb8adcc-312b-4040-b112-4a289f3f9b07> (referer: https://www.auto1.com/v1/car-search/cars/search/cdb8adcc-312b-4040-b112-4a289f3f9b07)
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/defer.py", line 295, in aiter_errback
          yield await it.__anext__()
                ^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
          return await self.data.__anext__()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
          async for o in as_async_generator(it):
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
          async for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy_zyte_api/_middlewares.py", line 206, in process_spider_output_async
          async for item_or_request in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
          async for r in result or ():
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
          async for r in iterable:
        File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
          async for item in result:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
          for r in it:
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
          exception_result = self._process_spider_exception(
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
          for r in iterable:
        File "/usr/src/app/crawlers/spiders/auto1.py", line 226, in parse_api_search
          item = self._listing_from_search_result(car, channel, car_stock_number)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/spiders/auto1.py", line 336, in _listing_from_search_result
          item["doors_number"] = int(car.get("doors"))
                                 ^^^^^^^^^^^^^^^^^^^^^
      ValueError: invalid literal for int() with base 10: 'other'
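
The root cause sits at the very bottom of the chained traceback: `_listing_from_search_result` (auto1.py line 336) calls `int(car.get("doors"))` directly, so any non-numeric value in the `doors` field — `'other'` here, or a missing key yielding `None` — raises and aborts the whole spider-middleware chain for that result. A defensive coercion along these lines would tolerate the bad value (a sketch; `parse_doors` is an illustrative helper, not part of the codebase):

```python
def parse_doors(value):
    """Coerce the raw `doors` field to an int, tolerating the
    non-numeric markers the marketplace sends (e.g. 'other')
    as well as a missing key (None)."""
    try:
        return int(value)
    except (TypeError, ValueError):
        return None  # leave doors_number unset instead of crashing
```

With this, the listing would simply carry no `doors_number` instead of being lost along with a ten-middleware traceback in the log.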
      2025-12-05 17:03:16 [crawlers.pipelines.post_to_api] ERROR: auto1.es, item_id e46e4a1e-f163-596d-80cc-620c747a598a: Failed to post item to https://api.app.infinit.cc/api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing: 422 Client Error: Unprocessable Entity for url: https://api.app.infinit.cc/api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing
       | Status Code: 422
       | Response Content: {"ValidationErrors":[{"PropertyName":"","ErrorMessage":"Images required for new listing","ErrorCode":4}],"WasSuccessful":false}
       | Response Headers: {'Content-Type': 'application/json', 'Date': 'Fri, 05 Dec 2025 17:03:15 GMT', 'Request-Context': 'appId=cid-v1:1a14ebe8-38cd-4629-ab2d-40684250fa5b', 'Server': 'Kestrel', 'Strict-Transport-Security': 'max-age=31536000; includeSubDomains; preload', 'Transfer-Encoding': 'chunked'}
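
The 422 above is a server-side validation failure: the `AddListing` command rejects a new listing whose payload carries no images. Catching this client-side before the POST would save the round trip and keep the error count down; a minimal pre-check mirroring the API's message (the check itself is hypothetical — only the error text and the `original_photo_count` field are taken from this log):

```python
def validate_new_listing(item):
    """Return API-style validation messages for a listing dict so
    invalid items can be dropped before posting. Illustrative check,
    mirroring the 'Images required for new listing' 422 seen above."""
    errors = []
    if not item.get("original_photo_count"):
        errors.append("Images required for new listing")
    return errors
```

A pipeline placed before `PostToApiPipeline` could call this and drop (or re-queue for photo download) any item that returns a non-empty list.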
      
      2025-12-05 17:11:46 [asyncio] ERROR: Unclosed client session
      client_session: <aiohttp.client.ClientSession object at 0x7f6690816590>
      2025-12-05 17:11:46 [asyncio] ERROR: Unclosed client session
      client_session: <aiohttp.client.ClientSession object at 0x7f66905804d0>
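
These two asyncio errors mean two `aiohttp.ClientSession` objects were created during the run but never had `close()` awaited before the event loop shut down. The usual remedy is to open the session as an async context manager, or to close it explicitly in the spider's close hook. A stdlib-only sketch of the lifecycle, with a stub class standing in for the real aiohttp session:

```python
import asyncio

class SessionStub:
    """Stand-in for aiohttp.ClientSession: the real class is likewise
    an async context manager whose __aexit__ awaits close()."""
    def __init__(self):
        self.closed = False

    async def close(self):
        self.closed = True

    async def __aenter__(self):
        return self

    async def __aexit__(self, *exc):
        await self.close()

async def run():
    # `async with` guarantees close() runs even on exceptions, so no
    # "Unclosed client session" warning is emitted at shutdown.
    async with SessionStub() as session:
        pass
    return session

session = asyncio.run(run())
```

If the session must outlive a single coroutine (e.g. one per middleware), the equivalent fix is to keep a reference and `await session.close()` from the spider's `closed()` callback.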

    • scrapy_version

      2.11.2
    • telnet_console

      127.0.0.1:6023
    • telnet_password

      2128e64668004002
    • latest_duplicate

      2025-12-05 17:00:19 [scrapy.dupefilters] DEBUG: Filtered duplicate request: <GET https://www.auto1.com/v1/car-details-view/YT08561/cdb8adcc-312b-4040-b112-4a289f3f9b07> - no more duplicates will be shown (see DUPEFILTER_DEBUG to show all duplicates)
    • latest_crawl

      2025-12-05 17:11:44 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://www.auto1.com/en/app/merchant/car/EB37326> (referer: https://www.auto1.com/v1/car-details-view/EB37326/cdb8adcc-312b-4040-b112-4a289f3f9b07) ['zyte-api']
    • latest_scrape

      2025-12-05 17:11:45 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.auto1.com/en/app/merchant/car/EB37326>
    • latest_stat

      2025-12-05 17:11:08 [scrapy.extensions.logstats] INFO: Crawled 784 pages (at 0 pages/min), scraped 233 items (at 0 items/min)
    • Head

      2025-12-05 17:00:07 [scrapy.utils.log] INFO: Scrapy 2.11.2 started (bot: SourcingV2)
      2025-12-05 17:00:07 [scrapy.utils.log] INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.2 (main, Mar 13 2023, 12:18:29) [GCC 12.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-5.15.0-1098-azure-x86_64-with-glibc2.36
      2025-12-05 17:00:07 [auto1.es] INFO: Starting spider auto1.es
      2025-12-05 17:00:07 [scrapy.addons] INFO: Enabled addons:
      []
      2025-12-05 17:00:07 [asyncio] DEBUG: Using selector: EpollSelector
      2025-12-05 17:00:07 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.asyncioreactor.AsyncioSelectorReactor
      2025-12-05 17:00:07 [scrapy.utils.log] DEBUG: Using asyncio event loop: asyncio.unix_events._UnixSelectorEventLoop
      2025-12-05 17:00:07 [scrapy.extensions.telnet] INFO: Telnet Password: 2128e64668004002
      2025-12-05 17:00:07 [scrapy.middleware] INFO: Enabled extensions:
      ['scrapy.extensions.corestats.CoreStats',
       'scrapy.extensions.telnet.TelnetConsole',
       'scrapy.extensions.memusage.MemoryUsage',
       'scrapy.extensions.feedexport.FeedExporter',
       'scrapy.extensions.logstats.LogStats',
       'scrapy.extensions.closespider.CloseSpider']
      2025-12-05 17:00:07 [scrapy.crawler] INFO: Overridden settings:
      {'BOT_NAME': 'SourcingV2',
       'CLOSESPIDER_TIMEOUT': 7200,
       'FEED_EXPORT_ENCODING': 'utf-8',
       'LOG_FILE': '/var/log/scrapyd/logs/sourcing_v2/auto1.es/task_25_2025-12-05T17_00_00.log',
       'LOG_FORMATTER': 'crawlers.log_formatter.SourcingLogFormatter',
       'NEWSPIDER_MODULE': 'spiders',
       'REQUEST_FINGERPRINTER_CLASS': 'scrapy_zyte_api.ScrapyZyteAPIRequestFingerprinter',
       'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
       'SPIDER_MODULES': ['spiders', 'auth_check'],
       'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor',
       'USER_AGENT': ''}
      2025-12-05 17:00:07 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2025-12-05 17:00:07 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2025-12-05 17:00:07 [scrapy.middleware] INFO: Enabled downloader middlewares:
      ['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
       'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
       'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
       'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
       'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPIDownloaderMiddleware',
       'scrapy.downloadermiddlewares.retry.RetryMiddleware',
       'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
       'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
       'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
       'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
       'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
       'scrapy.downloadermiddlewares.stats.DownloaderStats']
      2025-12-05 17:00:07 [crawlers.middlewares.id_gen_middleware] INFO: Setting up IdGenerationMiddleware
      2025-12-05 17:00:07 [scrapy.middleware] INFO: Enabled spider middlewares:
      ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPISpiderMiddleware',
       'crawlers.middlewares.monitoring_spider_middleware.MonitoringSpiderMiddleware',
       'scrapy.spidermiddlewares.referer.RefererMiddleware',
       'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
       'scrapy.spidermiddlewares.depth.DepthMiddleware',
       'crawlers.middlewares.photo_download_middleware.PhotoDownloadMiddleware',
       'crawlers.middlewares.report_download_middleware.ReportDownloadMiddleware',
       'crawlers.middlewares.id_gen_middleware.IdGenMiddleware']
      2025-12-05 17:00:07 [azure.identity._credentials.environment] INFO: Incomplete environment configuration for EnvironmentCredential. These variables are set: AZURE_CLIENT_ID, AZURE_TENANT_ID
      2025-12-05 17:00:07 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): login.microsoftonline.com:443
      2025-12-05 17:00:07 [urllib3.connectionpool] DEBUG: https://login.microsoftonline.com:443 "POST /8ea908c1-4e85-4692-bc3f-3646b9b40891/oauth2/v2.0/token HTTP/1.1" 200 1965
      2025-12-05 17:00:07 [azure.identity._credentials.chained] INFO: DefaultAzureCredential acquired a token from WorkloadIdentityCredential
      2025-12-05 17:00:07 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): alxsourcingstorageprod.table.core.windows.net:443
      2025-12-05 17:00:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2025-12-05 17:00:08 [crawlers.pipelines.translation_pipeline] INFO: Loading translations for language: auto
      2025-12-05 17:00:08 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /Translations()?$filter=PartitionKey%20eq%20%27auto%27%20and%20RowKey%20eq%20%27auto%27 HTTP/1.1" 200 None
      2025-12-05 17:00:08 [crawlers.pipelines.item_rules_pipeline] INFO: Setting up ItemRules Pipeline
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: low_mileage_for_country.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_mileage.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_location_for_country.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_cars_from_auction_title.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_country.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_fr.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_photos.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_from_info.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_not_allowed.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: not_operable_from_info.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: cars_too_new_for_country.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_price_for_currency.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_models_not_allowed.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_title.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: imported_cars.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_currency.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_mileage_for_country.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_mileage.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_auction_title.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_country_of_origin.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_pt.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: electric_cars.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: registration_date_old.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_mileage_for_make.json
      2025-12-05 17:00:08 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_color.json
      2025-12-05 17:00:08 [crawlers.pipelines.post_to_api] INFO: Setting up PostToApi Pipeline pointing to https://api.app.infinit.cc/api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing
      2025-12-05 17:00:08 [scrapy.middleware] INFO: Enabled item pipelines:
      ['crawlers.pipelines.translation_pipeline.TranslationPipeline',
       'crawlers.pipelines.item_rules_pipeline.ItemRulesPipeline',
       'crawlers.pipelines.post_to_api.PostToApiPipeline']
      2025-12-05 17:00:08 [scrapy.core.engine] INFO: Spider opened
      2025-12-05 17:00:08 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      2025-12-05 17:00:08 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
      2025-12-05 17:00:08 [scrapy-playwright] INFO: Starting download handler
      2025-12-05 17:00:08 [scrapy-playwright] INFO: Starting download handler
      
    • Tail

       'price_includes_vat': True,
       'registration_date': '2020-09-29',
       'seats_number': 2,
       'seller_name': None,
       'title': 'Peugeot Partner 1.5 Blue-HDi Pro L1',
       'transmission': 'manual',
       'trim': 'Pro L1',
       'year': 2020}
      2025-12-05 17:11:37 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2025-12-05 17:11:37 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27auto1.es%27%20and%20RowKey%20eq%20%27WT96929%27 HTTP/1.1" 200 None
      2025-12-05 17:11:37 [auto1.es] INFO: Saving data for WT96929: {'auction_closing_time': 1765039970.0, 'created_time': 1764003789.143743, 'last_price_update_time': 1764954697.423731}
      2025-12-05 17:11:37 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      
      2025-12-05 17:11:37 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "PATCH /ScrapedListings(PartitionKey='auto1.es',RowKey='WT96929') HTTP/1.1" 204 0
      2025-12-05 17:11:43 [zyte_api._retry] DEBUG: Starting call to 'zyte_api._async.AsyncZyteAPI.get.<locals>.request', this is the 2nd time calling it.
      2025-12-05 17:11:44 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://www.auto1.com/en/app/merchant/car/EB37326> (referer: https://www.auto1.com/v1/car-details-view/EB37326/cdb8adcc-312b-4040-b112-4a289f3f9b07) ['zyte-api']
      2025-12-05 17:11:44 [crawlers.middlewares.id_gen_middleware] INFO: Generated ID for item: 1348c345-3ff8-5adc-b6d5-280ce7707f98 with identifier: EB37326
      2025-12-05 17:11:44 [crawlers.middlewares.report_download_middleware] INFO: Skipping car_expert_report download for item 1348c345-3ff8-5adc-b6d5-280ce7707f98 with scrape type 2
      2025-12-05 17:11:44 [crawlers.middlewares.photo_download_middleware] INFO: Skipping photo download for item 1348c345-3ff8-5adc-b6d5-280ce7707f98 with scrape type 2
      2025-12-05 17:11:44 [crawlers.middlewares.monitoring_spider_middleware] INFO: Spider: auto1.es, Processed item id: 1348c345-3ff8-5adc-b6d5-280ce7707f98, identifier: EB37326
      2025-12-05 17:11:44 [crawlers.pipelines.translation_pipeline] INFO: Spider: auto1.es, Translating item: 1348c345-3ff8-5adc-b6d5-280ce7707f98 with identifier: EB37326
      2025-12-05 17:11:44 [crawlers.pipelines.item_rules_pipeline] INFO: Spider: auto1.es, Applying rules to item: 1348c345-3ff8-5adc-b6d5-280ce7707f98 with identifier: EB37326
      2025-12-05 17:11:44 [crawlers.pipelines.post_to_api] INFO: Spider: auto1.es, Posting item: 1348c345-3ff8-5adc-b6d5-280ce7707f98 with identifier: EB37326 to the API
      2025-12-05 17:11:44 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): api.app.infinit.cc:443
      2025-12-05 17:11:44 [urllib3.connectionpool] DEBUG: https://api.app.infinit.cc:443 "POST /api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing HTTP/1.1" 200 None
      2025-12-05 17:11:45 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.auto1.com/en/app/merchant/car/EB37326>
      {'auction_title': '',
       'c_o_2_emission_value': 129,
       'category': None,
       'color': 'white',
       'currency': 'EUR',
       'doors_number': 5,
       'emission_standard': 'EURO 6',
       'engine': '2.0L',
       'engine_horse_power': 190,
       'expiration_date': '2025-12-06T16:09:45Z',
       'fuel_type': 'diesel',
       'id': '1348c345-3ff8-5adc-b6d5-280ce7707f98',
       'is_damaged': False,
       'is_operable': True,
       'is_vat_deductible': False,
       'is_vat_included': True,
       'link_to_web_offer': 'https://www.auto1.com/en/app/merchant/car/EB37326',
       'listing': 'auction',
       'location': 'ES, Calaf - Barcelona',
       'make': 'BMW',
       'model': 'X1',
       'odometer': 135795,
       'odometer_reading_unit': 'km',
       'origin_country_code': 'ES',
       'original_photo_count': 0,
       'price': 0,
       'price_includes_vat': True,
       'registration_date': '2016-04-27',
       'seats_number': 5,
       'seller_name': None,
       'title': 'BMW X1 xDrive 20d xLine',
       'transmission': 'automatic',
       'trim': 'xLine',
       'year': 2016}
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27auto1.es%27%20and%20RowKey%20eq%20%27EB37326%27 HTTP/1.1" 200 None
      2025-12-05 17:11:45 [auto1.es] INFO: Saving data for EB37326: {'auction_closing_time': 1765037385.0, 'created_time': 1762362157.506255, 'last_price_update_time': 1764954705.078421}
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "PATCH /ScrapedListings(PartitionKey='auto1.es',RowKey='EB37326') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [scrapy.core.engine] INFO: Closing spider (finished)
      2025-12-05 17:11:45 [auto1.es] INFO: auto1.es Crawl ended with reason finished, scrape types: {<ScrapeType.NEW: 1>: 0, <ScrapeType.NEW_DUPLICATE_ID: 4>: 0, <ScrapeType.PRICE_UPDATE: 2>: 0, <ScrapeType.AUCTION_UPDATE: 3>: 0, <ScrapeType.SKIPPED: 0>: 0}
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27auto1.es%27%20and%20RowKey%20eq%20%27NB20063%27 HTTP/1.1" 200 None
      2025-12-05 17:11:45 [auto1.es] INFO: Saving data for NB20063: {'created_time': 1764954705.212426}
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "PATCH /ScrapedListings(PartitionKey='auto1.es',RowKey='NB20063') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /ScrapedListings()?$filter=PartitionKey%20eq%20%27auto1.es%27%20and%20last_price_update_time%20lt%201764522705 HTTP/1.1" 200 None
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='AC30439') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='AG10532') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='AL55727') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='AP44308') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='AT16070') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='BT09002') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='BX24765') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='CJ58925') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='CL14168') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='CP04909') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='DB75337') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='DR10729') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='DT42441') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='EK23268') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='ES19217') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='ET95519') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='EX13755') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='FF37688') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='FK42893') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='FL81877') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='FP40332') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='GA20530') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='GC31041') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='GC73616') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='GE46645') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='GH14514') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='GL75649') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='HB93442') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='HF23928') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='HJ35368') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='HM09513') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='HN40792') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='HU22248') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='JL09805') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='JS90593') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='KP88799') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='KS24892') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='LA40008') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='MF42971') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='ML36471') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='NF93246') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='NU64303') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='PE23741') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='PJ54874') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='PM96586') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='PV45725') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='PX17411') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='RB88759') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='RP24593') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='SL68555') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='SM27151') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='SM85341') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='TB18792') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='TH28461') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='TJ53128') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='UB10043') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='UH34805') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='VC75336') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='VJ38426') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='VU70419') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='WT26775') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='XX68980') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='YH63177') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='YT78283') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='ZG28300') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='ZJ79064') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='ZL56858') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "DELETE /ScrapedListings(PartitionKey='auto1.es',RowKey='ZN95076') HTTP/1.1" 204 0
      2025-12-05 17:11:45 [scrapy.extensions.feedexport] INFO: Stored jsonlines feed (234 items) in: file:///var/lib/scrapyd/items/sourcing_v2/auto1.es/task_25_2025-12-05T17_00_00.jl
      2025-12-05 17:11:45 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
      {'downloader/request_bytes': 3246239,
       'downloader/request_count': 1875,
       'downloader/request_method_count/GET': 1857,
       'downloader/request_method_count/POST': 18,
       'downloader/response_bytes': 91331918,
       'downloader/response_count': 1875,
       'downloader/response_status_count/200': 1875,
       'dupefilter/filtered': 24,
       'elapsed_time_seconds': 697.725643,
       'feedexport/success_count/FileFeedStorage': 1,
       'finish_reason': 'finished',
       'finish_time': datetime.datetime(2025, 12, 5, 17, 11, 45, 890045, tzinfo=datetime.timezone.utc),
       'item_dropped_count': 149,
       'item_dropped_reasons_count/DropItem': 149,
       'item_scraped_count': 234,
       'log_count/DEBUG': 16635,
       'log_count/ERROR': 10,
       'log_count/INFO': 14963,
       'memusage/max': 164024320,
       'memusage/startup': 127336448,
       'photo_download_count': 1089,
       'request_depth_max': 20,
       'response_received_count': 786,
       'scheduler/dequeued': 1875,
       'scheduler/dequeued/memory': 1875,
       'scheduler/enqueued': 1875,
       'scheduler/enqueued/memory': 1875,
       'scrapy-zyte-api/429': 0,
       'scrapy-zyte-api/attempts': 788,
       'scrapy-zyte-api/error_ratio': 0.0025380710659898475,
       'scrapy-zyte-api/errors': 2,
       "scrapy-zyte-api/exception_types/<class 'aiohttp.client_exceptions.ClientConnectorError'>": 2,
       'scrapy-zyte-api/fatal_errors': 0,
       'scrapy-zyte-api/mean_connection_seconds': 4.6836694266786925,
       'scrapy-zyte-api/mean_response_seconds': 4.69176296722209,
       'scrapy-zyte-api/processed': 786,
       'scrapy-zyte-api/request_args/customHttpRequestHeaders': 786,
       'scrapy-zyte-api/request_args/experimental.requestCookies': 785,
       'scrapy-zyte-api/request_args/experimental.responseCookies': 786,
       'scrapy-zyte-api/request_args/httpRequestBody': 18,
       'scrapy-zyte-api/request_args/httpRequestMethod': 18,
       'scrapy-zyte-api/request_args/httpResponseBody': 786,
       'scrapy-zyte-api/request_args/httpResponseHeaders': 786,
       'scrapy-zyte-api/request_args/sessionContext': 786,
       'scrapy-zyte-api/request_args/url': 786,
       'scrapy-zyte-api/status_codes/0': 2,
       'scrapy-zyte-api/status_codes/200': 786,
       'scrapy-zyte-api/success': 786,
       'scrapy-zyte-api/success_ratio': 1.0,
       'scrapy-zyte-api/throttle_ratio': 0.0,
       'spider_exceptions/ValueError': 1,
       'start_time': datetime.datetime(2025, 12, 5, 17, 0, 8, 164402, tzinfo=datetime.timezone.utc)}
      2025-12-05 17:11:45 [scrapy.core.engine] INFO: Spider closed (finished)
      2025-12-05 17:11:46 [asyncio] ERROR: Unclosed client session
      client_session: <aiohttp.client.ClientSession object at 0x7f6690816590>
      2025-12-05 17:11:46 [asyncio] ERROR: Unclosed client session
      client_session: <aiohttp.client.ClientSession object at 0x7f66905804d0>
      
    • Log

      /1/log/utf8/sourcing_v2/auto1.es/task_25_2025-12-05T17_00_00/?job_finished=True

    • Source

      http://scrapyd-1:6800/logs/sourcing_v2/auto1.es/task_25_2025-12-05T17_00_00.log

  • sourcelog
    last_update_time: 2025-12-05 17:11:45
    last_update_timestamp: 1764954705
    downloader/request_bytes: 3246239
    downloader/request_count: 1875
    downloader/request_method_count/GET: 1857
    downloader/request_method_count/POST: 18
    downloader/response_bytes: 91331918
    downloader/response_count: 1875
    downloader/response_status_count/200: 1875
    dupefilter/filtered: 24
    elapsed_time_seconds: 697.725643
    feedexport/success_count/FileFeedStorage: 1
    finish_reason: finished
    finish_time: datetime.datetime(2025, 12, 5, 17, 11, 45, 890045, tzinfo=datetime.timezone.utc)
    item_dropped_count: 149
    item_dropped_reasons_count/DropItem: 149
    item_scraped_count: 234
    log_count/DEBUG: 16635
    log_count/ERROR: 10
    log_count/INFO: 14963
    memusage/max: 164024320
    memusage/startup: 127336448
    photo_download_count: 1089
    request_depth_max: 20
    response_received_count: 786
    scheduler/dequeued: 1875
    scheduler/dequeued/memory: 1875
    scheduler/enqueued: 1875
    scheduler/enqueued/memory: 1875
    scrapy-zyte-api/429: 0
    scrapy-zyte-api/attempts: 788
    scrapy-zyte-api/error_ratio: 0.0025380710659898475
    scrapy-zyte-api/errors: 2
    scrapy-zyte-api/exception_types/<class 'aiohttp.client_exceptions.ClientConnectorError'>: 2
    scrapy-zyte-api/fatal_errors: 0
    scrapy-zyte-api/mean_connection_seconds: 4.6836694266786925
    scrapy-zyte-api/mean_response_seconds: 4.69176296722209
    scrapy-zyte-api/processed: 786
    scrapy-zyte-api/request_args/customHttpRequestHeaders: 786
    scrapy-zyte-api/request_args/experimental.requestCookies: 785
    scrapy-zyte-api/request_args/experimental.responseCookies: 786
    scrapy-zyte-api/request_args/httpRequestBody: 18
    scrapy-zyte-api/request_args/httpRequestMethod: 18
    scrapy-zyte-api/request_args/httpResponseBody: 786
    scrapy-zyte-api/request_args/httpResponseHeaders: 786
    scrapy-zyte-api/request_args/sessionContext: 786
    scrapy-zyte-api/request_args/url: 786
    scrapy-zyte-api/status_codes/0: 2
    scrapy-zyte-api/status_codes/200: 786
    scrapy-zyte-api/success: 786
    scrapy-zyte-api/success_ratio: 1.0
    scrapy-zyte-api/throttle_ratio: 0.0
    spider_exceptions/ValueError: 1
    start_time: datetime.datetime(2025, 12, 5, 17, 0, 8, 164402, tzinfo=datetime.timezone.utc)