HTTPS proxies do not work when sending requests

Problem description

I want to crawl some websites with Python and the BeautifulSoup library, and I want to route my requests through proxies. I came across a problem when setting up the proxies: whenever I set an HTTPS proxy it does not work, but when I set an HTTP proxy it works. Do you know what the problem is?

import requests

https_proxies = [{'https': 'https://60.191.11.241:3128'},
                 {'https': 'https://129.226.162.177:80'},
                 {'https': 'https://103.231.80.202:55443'},
                 {'https': 'https://167.172.171.115:34538'},
                 {'https': 'https://103.104.193.34:6000'},
                 {'https': 'https://200.60.4.238:999'}]

for proxy in https_proxies:
    try:
        r1 = requests.get("https://edition.cnn.com/", proxies=proxy, timeout=3)
        print(r1)
    except requests.exceptions.RequestException:
        print("BAD PROXY")

http_proxies = [{'http': 'http://194.169.164.14:34334'},
                {'http': 'http://46.167.238.109:5678'},
                {'http': 'http://201.59.102.35:5678'},
                {'http': 'http://31.25.243.40:9147'},
                {'http': 'http://178.162.202.44:1025'}]

for proxy in http_proxies:
    try:
        r1 = requests.get("https://edition.cnn.com/", proxies=proxy, timeout=3)
        print(r1)
    except requests.exceptions.RequestException:
        print("BAD PROXY")

I have tried more than 50 HTTPS proxies against 10 different websites, and it's always the same. These are the sites I used to get the proxies: https://geonode.com/free-proxy-list and https://free-proxy-list.net/
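One quick way to narrow this down is to test the same proxy address under both schemes, i.e. reaching the proxy over plain HTTP versus over TLS. A small sketch (the helper names `candidate_proxy_dicts` and `check` are hypothetical, not from the question):

```python
import requests

def candidate_proxy_dicts(host_port):
    """Build the two proxy configurations worth comparing for one address:
    talking to the proxy over plain HTTP vs. over TLS."""
    return [{'https': f'http://{host_port}'},
            {'https': f'https://{host_port}'}]

def check(host_port, url="https://edition.cnn.com/"):
    # Try the same proxy address both ways against an HTTPS target.
    for proxy in candidate_proxy_dicts(host_port):
        try:
            r = requests.get(url, proxies=proxy, timeout=3)
            print(proxy, "->", r.status_code)
        except requests.exceptions.RequestException as exc:
            print(proxy, "-> FAILED:", type(exc).__name__)
```

If the `http://` variant succeeds where the `https://` variant fails, the proxy itself is fine and only the scheme in the proxy URL was wrong.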

Tags: python, proxy, python-requests

Solution
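The most likely culprit is the URL scheme inside the proxy entries, not the proxies themselves. In the requests `proxies` dict, the key (`'http'` or `'https'`) selects which *target*-URL scheme the proxy is used for, while the scheme inside the proxy URL itself says how to connect to the proxy. Writing `'https://60.191.11.241:3128'` therefore asks requests to open a TLS connection to the proxy itself, which most free proxies do not accept (and which older requests/urllib3 versions may not fully support). Proxies that tunnel HTTPS traffic are normally still addressed over plain HTTP, so the entries should usually look like this (a sketch reusing an address from the question; whether any given free proxy is actually alive is not guaranteed):

```python
import requests

# Key 'https' = use this proxy for https:// *target* URLs.
# 'http://' inside the value = talk to the proxy itself over plain HTTP;
# the HTTPS traffic is then tunnelled through it via CONNECT.
proxy = {'https': 'http://60.191.11.241:3128'}

try:
    r = requests.get("https://edition.cnn.com/", proxies=proxy, timeout=3)
    print(r.status_code)
except requests.exceptions.RequestException as exc:
    print("BAD PROXY:", type(exc).__name__)
```

On the free-proxy listing sites, a "Yes" in the Https column typically means the proxy can tunnel HTTPS targets, not that you must connect to the proxy over TLS.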

