Scrapy - NameError: global name 'logger' is not defined

Problem description

I am trying to tweak Scrapy's retry behaviour slightly by modifying the retry middleware. I use this middleware:

from scrapy.downloadermiddlewares.retry import RetryMiddleware


class Retry500Middleware(RetryMiddleware):

    def _retry(self, request, reason, spider):
        retries = request.meta.get('retry_times', 0) + 1

        if retries <= self.max_retry_times:
            logger.debug("Retrying %(request)s (failed %(retries)d times): %(reason)s",
                         {'request': request, 'retries': retries, 'reason': reason},
                         extra={'spider': spider})
            retryreq = request.copy()
            retryreq.meta['retry_times'] = retries
            retryreq.meta['download_timeout'] = 600
            retryreq.dont_filter = True
            retryreq.priority = request.priority + self.priority_adjust
            return retryreq
        else:
            logger.error("Gave up retrying %(request)s (failed %(retries)d times): %(reason)s",
                         {'request': request, 'retries': retries, 'reason': reason},
                         extra={'spider': spider})

Then I get this error:

Traceback (most recent call last):
  File "/usr/lib64/python2.7/site-packages/twisted/internet/defer.py", line 1128, in _inlineCallbacks
    result = g.send(result)
  File "/usr/lib/python2.7/site-packages/scrapy/core/downloader/middleware.py", line 53, in process_response
    spider=spider)
  File "/usr/lib/python2.7/site-packages/scrapy/downloadermiddlewares/retry.py", line 54, in process_response
    return self._retry(request, reason, spider) or response
  File "/home/<user_name>/<project_folder>/<project_name>/<project_name>/middlewares.py", line 48, in _retry
    logger.debug("Retrying %(request)s (failed %(retries)d times): %(reason)s",
NameError: global name 'logger' is not defined
2018-08-15 14:01:44 [scrapy.core.engine] INFO: Closing spider (finished)

I had used it on my machine before and the middleware worked fine. What should I do to avoid this error?

Tags: python, web-scraping, scrapy, screen-scraping, scrapy-middleware

Solution


In the end, I switched to this code instead:

import logging

logging.log(logging.ERROR,
            "Gave up retrying %(request)s (failed %(retries)d times): %(reason)s",
            {'request': request, 'retries': retries, 'reason': reason},
            extra={'spider': spider})
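Calling `logging.log()` on the root logger works, but the direct fix for the original NameError is to define a module-level logger at the top of middlewares.py, which is the pattern Scrapy's own retry middleware follows. A minimal sketch, using placeholder values for the request URL, retry count, and reason:

```python
import logging

# Module-level logger named after the current module -- this is the line
# the original middleware was missing, so the name 'logger' is defined.
logger = logging.getLogger(__name__)
logging.basicConfig(level=logging.DEBUG)

# Mapping-style %-formatting, as used in the middleware above; the values
# here are placeholders for illustration.
logger.error("Gave up retrying %(request)s (failed %(retries)d times): %(reason)s",
             {'request': 'http://example.com', 'retries': 3,
              'reason': '500 Internal Server Error'})
```

With the `logger = logging.getLogger(__name__)` line in place, the original `logger.debug(...)` and `logger.error(...)` calls in `_retry` work unchanged, and log records stay attributed to your middleware module instead of the root logger.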
