Certificate has expired and cannot be used with verify=True; requests.exceptions.SSLError certificate verify failed

Problem description

I am a complete beginner in Python and have learned basically everything from the internet, so please forgive me if I haven't grasped every concept correctly.

My problem is that I am trying to write a web scraper using requests and BeautifulSoup. After two days I started getting an error that the certificate has expired, and the same happens if I open the site directly - I can't even add it as an exception in my browser.

Here is my code:

import requests
from bs4 import BeautifulSoup

def project_spider(max_pages):
    global page
    page = 0
    while page < max_pages:
        page += 1
        url = 'https://hubbub.org/projects/?page=' + str(page)
        # Fetch the list page; verify=False skips certificate validation
        try:
            source_code = requests.get(url, allow_redirects=False, timeout=15, verify=False)
        except (requests.exceptions.RequestException, IOError):
            # Exception types must be grouped in a tuple, not chained with `or`;
            # skip to the next page since source_code is undefined here
            print('Failed to open url.')
            continue
        # Parse the page text
        plain_text = source_code.text
        soup = BeautifulSoup(plain_text, 'html.parser')
        # Find every div that holds a project card (the 'col...' grid classes)
        data = soup.findAll('div', attrs={'class': 'col-xs-12 col-sm-6 col-md-4 col-lg-3'})
        # For every div found
        for div in data:
            # Search each div for links (a tags)
            links = div.findAll('a', href=True)
            global names
            names = div.find('h4').contents[0]
            print(names)
            for a in links:
                global links2
                links2 = a['href']
                print(links2)
                get_single_item_data(links2)
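One pitfall worth flagging in the `except` clause: chaining exception classes with `or` (e.g. `except Exception or IOError:`) does not combine them. `or` returns its first truthy operand, so the expression evaluates to just the first class. To catch several exception types, they must be grouped in a tuple, as this small demonstration shows:

```python
# `or` between classes returns the first (truthy) operand, so this
# expression is just the class Exception -- no tuple is built.
combined = Exception or AttributeError or IOError
print(combined is Exception)  # True

# The correct way to catch several exception types is a tuple:
try:
    raise ConnectionError("simulated network failure")
except (AttributeError, ConnectionError, IOError) as exc:
    caught = type(exc).__name__

print(caught)  # ConnectionError
```

Since `Exception` is the base class of almost everything, `except Exception or ...:` happens to catch all of these anyway, but only by accident; with more specific classes the extra operands would be silently ignored.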

An expert would probably program this differently. However, I tried to fix it with verify=False and session(), and it doesn't work. I also tried to skip the page where it fails (5), but I couldn't. At this point I'm really desperate, because all I get is this error:

https://rabbitraisers.org/p/fantasticfloats/
Traceback (most recent call last):
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\connectionpool.py", line 600, in urlopen
    chunked=chunked)
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\connectionpool.py", line 343, in _make_request
    self._validate_conn(conn)
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\connectionpool.py", line 849, in _validate_conn
    conn.connect()
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\connection.py", line 356, in connect
    ssl_context=context)
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\util\ssl_.py", line 359, in ssl_wrap_socket
    return context.wrap_socket(sock, server_hostname=server_hostname)
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\ssl.py", line 412, in wrap_socket
    session=session
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\ssl.py", line 850, in _create
    self.do_handshake()
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\ssl.py", line 1108, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:1045)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\requests\adapters.py", line 445, in send
    timeout=timeout
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\connectionpool.py", line 638, in urlopen
    _stacktrace=sys.exc_info()[2])
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\util\retry.py", line 398, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='rabbitraisers.org', port=443): Max retries exceeded with url: /p/fantasticfloats/ (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:1045)')))

Tags: python, web-scraping, beautifulsoup, python-requests

Solution


Import this at the top of your source file:

from requests.packages.urllib3.exceptions import InsecureRequestWarning

Then put this as one of the first lines of your project_spider function:

requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
