Jobs Parsed
  • project: sourcing_v2
    spider: carwow.gb
    job: task_45_2026-01-28T06_00_02
    first_log_time: 2026-01-28 06:00:09
    latest_log_time: 2026-01-29 12:38:10
    runtime: 1 day, 6:38:01
    crawled_pages: 937
    scraped_items: 0
    shutdown_reason: N/A
    finish_reason: N/A
    log_critical_count: 0
    log_error_count: 468
    log_warning_count: 7
    log_redirect_count: 0
    log_retry_count: 0
    log_ignore_count: 0
    latest_crawl:
    latest_scrape:
    latest_log:
    current_time:
    latest_item: N/A

    • error_logs
      last 10 of 468

      2026-01-28 06:44:28 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Ignoring non-200 response
      NoneType: None
      2026-01-28 06:44:28 [scrapy.core.scraper] ERROR: Spider error processing <GET https://dealers.carwow.co.uk/dealers/listings/11552350> (referer: https://dealers.carwow.co.uk/dealers/listings/filtered/stock)
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/dist-packages/twisted/internet/defer.py", line 1078, in _runCallbacks
          current.result = callback(  # type: ignore[misc]
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 301, in process_spider_exception
          return self._process_spider_exception(response, spider, _failure)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
          result = method(response=response, exception=exception, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
          raise exception
        File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 83, in _process_spider_input
          result = method(response=response, spider=spider)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/httperror.py", line 46, in process_spider_input
          raise HttpError(response, "Ignoring non-200 response")
      scrapy.spidermiddlewares.httperror.HttpError: Ignoring non-200 response
      2026-01-28 06:44:36 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Ignoring non-200 response
      2026-01-28 06:44:36 [scrapy.core.scraper] ERROR: Spider error processing <GET https://dealers.carwow.co.uk/dealers/listings/11624824> (referer: https://dealers.carwow.co.uk/dealers/listings/filtered/stock)
      2026-01-28 06:45:03 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Ignoring non-200 response
      2026-01-28 06:45:03 [scrapy.core.scraper] ERROR: Spider error processing <GET https://dealers.carwow.co.uk/dealers/listings/11617279> (referer: https://dealers.carwow.co.uk/dealers/listings/filtered/stock)
      2026-01-28 06:45:14 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Ignoring non-200 response
      2026-01-28 06:45:14 [scrapy.core.scraper] ERROR: Spider error processing <GET https://dealers.carwow.co.uk/dealers/listings/11580662> (referer: https://dealers.carwow.co.uk/dealers/listings/filtered/stock)
      2026-01-28 06:45:19 [crawlers.middlewares.monitoring_spider_middleware] ERROR: Ignoring non-200 response
      2026-01-28 06:45:19 [scrapy.core.scraper] ERROR: Spider error processing <GET https://dealers.carwow.co.uk/dealers/listings/11595746> (referer: https://dealers.carwow.co.uk/dealers/listings/filtered/stock)
      (each of the four entries above ended with the same traceback shown in full for the first entry, terminating in scrapy.spidermiddlewares.httperror.HttpError: Ignoring non-200 response)
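All 468 errors share one shape: HttpErrorMiddleware raises HttpError for a non-200 listing response, and monitoring_spider_middleware re-raises it. To see which listings are failing, the URLs can be pulled straight out of these log lines with the standard library; a minimal sketch (the regex and the `failing_urls` helper are illustrative, not part of the crawler):

```python
import re

# Matches the "[scrapy.core.scraper] ERROR: Spider error processing <GET …>"
# lines and captures the URL that triggered the HttpError.
ERROR_RE = re.compile(r"Spider error processing <GET (?P<url>[^>]+)>")

def failing_urls(log_lines):
    """Yield the URL from each scraper error line; other lines are skipped."""
    for line in log_lines:
        m = ERROR_RE.search(line)
        if m:
            yield m.group("url")

sample = [
    "2026-01-28 06:44:28 [scrapy.core.scraper] ERROR: Spider error processing "
    "<GET https://dealers.carwow.co.uk/dealers/listings/11552350> "
    "(referer: https://dealers.carwow.co.uk/dealers/listings/filtered/stock)",
]
print(list(failing_urls(sample)))
# → ['https://dealers.carwow.co.uk/dealers/listings/11552350']
```

Fed the full log, this yields the listing URLs behind all 468 errors, which can then be spot-checked for the actual non-200 status they return.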
    • warning_logs
      7 in total

      2026-01-28 06:39:05 [carwow.gb] WARNING: Failed to process listing listing_card_11554126: Page.wait_for_selector: Timeout 2000ms exceeded.
      Call log:
      waiting for locator("#listing_card_11554126") to be visible
        -   locator resolved to visible <turbo-frame complete="" loading="lazy" id="listing_card_1155…>…</turbo-frame>
      
      2026-01-28 06:45:02 [carwow.gb] WARNING: Failed to process listing listing_card_11548633: Page.wait_for_selector: Timeout 2000ms exceeded.
      Call log:
      waiting for locator("#listing_card_11548633") to be visible
        -   locator resolved to visible <turbo-frame complete="" loading="lazy" id="listing_card_1154…>…</turbo-frame>
      
      2026-01-28 06:45:10 [carwow.gb] WARNING: Failed to process listing listing_card_11607301: Page.wait_for_selector: Timeout 2000ms exceeded.
      Call log:
      waiting for locator("#listing_card_11607301") to be visible
        -   locator resolved to visible <turbo-frame complete="" loading="lazy" id="listing_card_1160…>…</turbo-frame>
      
      2026-01-28 06:45:12 [carwow.gb] WARNING: Failed to process listing listing_card_11635256: Page.wait_for_selector: Timeout 2000ms exceeded.
      Call log:
      waiting for locator("#listing_card_11635256") to be visible
        -   locator resolved to visible <turbo-frame complete="" loading="lazy" id="listing_card_1163…>…</turbo-frame>
      
      2026-01-28 06:45:15 [carwow.gb] WARNING: Failed to process listing listing_card_11620953: Page.wait_for_selector: Timeout 2000ms exceeded.
      Call log:
      waiting for locator("#listing_card_11620953") to be visible
        -   locator resolved to visible <turbo-frame complete="" loading="lazy" id="listing_card_1162…>…</turbo-frame>
      
      2026-01-28 06:45:17 [carwow.gb] WARNING: Failed to process listing listing_card_11609981: Page.wait_for_selector: Timeout 2000ms exceeded.
      Call log:
      waiting for locator("#listing_card_11609981") to be visible
        -   locator resolved to visible <turbo-frame complete="" loading="lazy" id="listing_card_1160…>…</turbo-frame>
      
      2026-01-28 06:45:19 [carwow.gb] WARNING: Failed to process listing listing_card_11627109: Page.wait_for_selector: Timeout 2000ms exceeded.
      Call log:
      waiting for locator("#listing_card_11627109") to be visible
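All seven warnings are Playwright `Page.wait_for_selector` timeouts (2000 ms) on listing-card turbo-frames. The affected listing IDs can be recovered from the warning lines so those listings can be re-queued or inspected; a minimal stdlib sketch (the `timed_out_listing_ids` helper is hypothetical, not part of the spider):

```python
import re

# The warnings follow the pattern
# "Failed to process listing listing_card_<id>: Page.wait_for_selector: …"
WARN_RE = re.compile(r"Failed to process listing listing_card_(?P<id>\d+)")

def timed_out_listing_ids(log_lines):
    """Return the numeric listing IDs from wait_for_selector timeout warnings."""
    return [m.group("id") for line in log_lines
            if (m := WARN_RE.search(line))]

sample = [
    "2026-01-28 06:39:05 [carwow.gb] WARNING: Failed to process listing "
    "listing_card_11554126: Page.wait_for_selector: Timeout 2000ms exceeded.",
]
print(timed_out_listing_ids(sample))  # → ['11554126']
```

Note the call logs show the locator had already resolved to a visible `<turbo-frame>`, so a slightly longer timeout, or waiting on the frame's content rather than the frame itself, might avoid these timeouts; that is a guess from the log, not a confirmed fix.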
      


    • scrapy_version

      2.11.2
    • telnet_console

      127.0.0.1:6023
    • telnet_password

      8ec0cde1b2d34a0d
    • latest_crawl

      2026-01-28 06:46:40 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://dealers.carwow.co.uk/dealers/listings/11592271> (referer: https://dealers.carwow.co.uk/dealers/listings/filtered/stock) ['zyte-api']
    • latest_stat

      2026-01-29 12:38:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
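The latest_stat line shows the crawl flat at 937 pages / 0 items. How long a job has been stalled can be measured directly from these periodic logstats lines; a stdlib sketch (the `stall_seconds` helper is illustrative, not part of the monitoring stack):

```python
import re
from datetime import datetime

# Matches the once-per-minute logstats lines, e.g.
# "2026-01-29 09:19:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages …"
LOGSTATS_RE = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}).*?Crawled (?P<pages>\d+) pages"
)

def stall_seconds(log_lines):
    """Seconds the crawled-pages counter has been flat at the end of the log."""
    entries = [
        (datetime.strptime(m.group("ts"), "%Y-%m-%d %H:%M:%S"),
         int(m.group("pages")))
        for line in log_lines
        if (m := LOGSTATS_RE.search(line))
    ]
    if not entries:
        return 0
    end_ts, end_pages = entries[-1]
    start_ts = end_ts
    # Walk backwards while the page counter stays at its final value.
    for ts, pages in reversed(entries):
        if pages != end_pages:
            break
        start_ts = ts
    return int((end_ts - start_ts).total_seconds())

sample = [
    "2026-01-29 09:19:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)",
    "2026-01-29 10:49:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)",
]
print(stall_seconds(sample))  # → 5400, i.e. flat for an hour and a half
```

A watchdog comparing this value against a threshold could flag jobs like this one, where CLOSESPIDER_TIMEOUT is set to 7200 s yet the runtime has passed a full day with no progress.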
    • Head

      2026-01-28 06:00:09 [scrapy.utils.log] INFO: Scrapy 2.11.2 started (bot: SourcingV2)
      2026-01-28 06:00:09 [scrapy.utils.log] INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.2 (main, Mar 13 2023, 12:18:29) [GCC 12.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-5.15.0-1098-azure-x86_64-with-glibc2.36
      2026-01-28 06:00:09 [carwow.gb] INFO: Starting spider carwow.gb
      2026-01-28 06:00:09 [scrapy.addons] INFO: Enabled addons:
      []
      2026-01-28 06:00:09 [asyncio] DEBUG: Using selector: EpollSelector
      2026-01-28 06:00:09 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.asyncioreactor.AsyncioSelectorReactor
      2026-01-28 06:00:09 [scrapy.utils.log] DEBUG: Using asyncio event loop: asyncio.unix_events._UnixSelectorEventLoop
      2026-01-28 06:00:09 [scrapy.extensions.telnet] INFO: Telnet Password: 8ec0cde1b2d34a0d
      2026-01-28 06:00:09 [scrapy.middleware] INFO: Enabled extensions:
      ['scrapy.extensions.corestats.CoreStats',
       'scrapy.extensions.telnet.TelnetConsole',
       'scrapy.extensions.memusage.MemoryUsage',
       'scrapy.extensions.feedexport.FeedExporter',
       'scrapy.extensions.logstats.LogStats',
       'scrapy.extensions.closespider.CloseSpider']
      2026-01-28 06:00:09 [scrapy.crawler] INFO: Overridden settings:
      {'BOT_NAME': 'SourcingV2',
       'CLOSESPIDER_TIMEOUT': 7200,
       'FEED_EXPORT_ENCODING': 'utf-8',
       'LOG_FILE': '/var/log/scrapyd/logs/sourcing_v2/carwow.gb/task_45_2026-01-28T06_00_02.log',
       'LOG_FORMATTER': 'crawlers.log_formatter.SourcingLogFormatter',
       'NEWSPIDER_MODULE': 'spiders',
       'REQUEST_FINGERPRINTER_CLASS': 'scrapy_zyte_api.ScrapyZyteAPIRequestFingerprinter',
       'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
       'SPIDER_MODULES': ['spiders', 'auth_check'],
       'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor',
       'USER_AGENT': ''}
      2026-01-28 06:00:09 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2026-01-28 06:00:09 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
      2026-01-28 06:00:09 [scrapy.middleware] INFO: Enabled downloader middlewares:
      ['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
       'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
       'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
       'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
       'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPIDownloaderMiddleware',
       'scrapy.downloadermiddlewares.retry.RetryMiddleware',
       'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
       'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
       'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
       'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
       'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
       'scrapy.downloadermiddlewares.stats.DownloaderStats']
      2026-01-28 06:00:09 [crawlers.middlewares.id_gen_middleware] INFO: Setting up IdGenerationMiddleware
      2026-01-28 06:00:09 [scrapy.middleware] INFO: Enabled spider middlewares:
      ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
       'scrapy_zyte_api.ScrapyZyteAPISpiderMiddleware',
       'crawlers.middlewares.monitoring_spider_middleware.MonitoringSpiderMiddleware',
       'scrapy.spidermiddlewares.referer.RefererMiddleware',
       'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
       'scrapy.spidermiddlewares.depth.DepthMiddleware',
       'crawlers.middlewares.photo_download_middleware.PhotoDownloadMiddleware',
       'crawlers.middlewares.report_download_middleware.ReportDownloadMiddleware',
       'crawlers.middlewares.id_gen_middleware.IdGenMiddleware']
      2026-01-28 06:00:09 [azure.identity._credentials.environment] INFO: Incomplete environment configuration for EnvironmentCredential. These variables are set: AZURE_CLIENT_ID, AZURE_TENANT_ID
      2026-01-28 06:00:09 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): login.microsoftonline.com:443
      2026-01-28 06:00:10 [urllib3.connectionpool] DEBUG: https://login.microsoftonline.com:443 "POST /8ea908c1-4e85-4692-bc3f-3646b9b40891/oauth2/v2.0/token HTTP/1.1" 200 2009
      2026-01-28 06:00:10 [azure.identity._credentials.chained] INFO: DefaultAzureCredential acquired a token from WorkloadIdentityCredential
      2026-01-28 06:00:10 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): alxsourcingstorageprod.table.core.windows.net:443
      2026-01-28 06:00:10 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "POST /Tables HTTP/1.1" 409 None
      2026-01-28 06:00:10 [crawlers.pipelines.translation_pipeline] INFO: Loading translations for language: auto
      2026-01-28 06:00:10 [urllib3.connectionpool] DEBUG: https://alxsourcingstorageprod.table.core.windows.net:443 "GET /Translations()?$filter=PartitionKey%20eq%20%27auto%27%20and%20RowKey%20eq%20%27auto%27 HTTP/1.1" 200 None
      2026-01-28 06:00:10 [crawlers.pipelines.item_rules_pipeline] INFO: Setting up ItemRules Pipeline
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: low_mileage_for_country.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_mileage.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_location_for_country.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_cars_from_auction_title.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_country.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_fr.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_photos.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_from_info.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_not_allowed.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: not_operable_from_info.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: cars_too_new_for_country.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_price_for_currency.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_models_not_allowed.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_title.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: imported_cars.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_currency.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_mileage_for_country.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_mileage.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_auction_title.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_country_of_origin.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_pt.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: electric_cars.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: registration_date_old.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: high_mileage_for_make.json
      2026-01-28 06:00:10 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_color.json
      2026-01-28 06:00:10 [crawlers.pipelines.post_to_api] INFO: Setting up PostToApi Pipeline pointing to https://api.app.infinit.cc/api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing
      2026-01-28 06:00:10 [scrapy.middleware] INFO: Enabled item pipelines:
      ['crawlers.pipelines.translation_pipeline.TranslationPipeline',
       'crawlers.pipelines.item_rules_pipeline.ItemRulesPipeline',
       'crawlers.pipelines.post_to_api.PostToApiPipeline']
      2026-01-28 06:00:10 [scrapy.core.engine] INFO: Spider opened
      2026-01-28 06:00:10 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      2026-01-28 06:00:10 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
      2026-01-28 06:00:10 [scrapy-playwright] INFO: Starting download handler
      2026-01-28 06:00:10 [scrapy-playwright] INFO: Starting download handler
      2026-01-28 06:00:15 [scrapy-playwright] INFO: Launching browser chromium
    • Tail

      2026-01-29 09:19:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)

      (the identical logstats line repeats once per minute, with only the timestamp advancing, through:)

      2026-01-29 10:49:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 10:50:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 10:51:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 10:52:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 10:53:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 10:54:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 10:55:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 10:56:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 10:57:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 10:58:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 10:59:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:00:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:01:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:02:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:03:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:04:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:05:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:06:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:07:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:08:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:09:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:10:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:11:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:12:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:13:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:14:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:15:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:16:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:17:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:18:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:19:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:20:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:21:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:22:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:23:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:24:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:25:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:26:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:27:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:28:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:29:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:30:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:31:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:32:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:33:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:34:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:35:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:36:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:37:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:38:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:39:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:40:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:41:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:42:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:43:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:44:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:45:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:46:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:47:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:48:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:49:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:50:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:51:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:52:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:53:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:54:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:55:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:56:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:57:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:58:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 11:59:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:00:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:01:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:02:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:03:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:04:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:05:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:06:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:07:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:08:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:09:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:10:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:11:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:12:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:13:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:14:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:15:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:16:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:17:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:18:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:19:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:20:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:21:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:22:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:23:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:24:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:25:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:26:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:27:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:28:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:29:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:30:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:31:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:32:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:33:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:34:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:35:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:36:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:37:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
      2026-01-29 12:38:10 [scrapy.extensions.logstats] INFO: Crawled 937 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
      
    • Log

      /2/log/utf8/sourcing_v2/carwow.gb/task_45_2026-01-28T06_00_02/?ui=mobile

    • Source

      http://scrapyd-2:6800/logs/sourcing_v2/carwow.gb/task_45_2026-01-28T06_00_02.log