python - How to scrape table data from a slow-loading website
Problem description
I am trying to scrape table data from this site: https://fantasyfootball.telegraph.co.uk/premier-league/statscentre/
The goal is to get all the player data and store it in a dictionary.
I am using BeautifulSoup, and I can find the table in the HTML content, but the table body it returns is empty.
From reading other posts I gather this may be related to the site loading the table data only after the page itself has loaded, but I can't find a way around the problem.
My code is below:
from bs4 import BeautifulSoup
import requests

url = "https://fantasyfootball.telegraph.co.uk/premier-league/statscentre/"

# Make a GET request to fetch the raw HTML content
html_content = requests.get(url).text

# Parse the HTML content
soup = BeautifulSoup(html_content, "lxml")

# Find the player table within the page
player_table = soup.find("table", attrs={"class": "player-profile-content"})

print(player_table)
This is the result I get:
<table class="playerrow playlist" id="table-players">
<thead>
<tr class="table-head"></tr>
</thead>
<tbody></tbody>
</table>
The actual HTML on the site is very long, because they pack a lot of data into each <tr> and the <td> tags inside it, so I won't post it here unless someone asks. Suffice it to say there are several <td> cells in the header row and several <tr> rows in the body.
Solution
This script will print all the player stats (the data is loaded as JSON from a separate URL):
import ssl
import json
import requests
from urllib3 import poolmanager

# workaround to avoid SSL errors:
class TLSAdapter(requests.adapters.HTTPAdapter):
    def init_poolmanager(self, connections, maxsize, block=False):
        """Create and initialize the urllib3 PoolManager."""
        ctx = ssl.create_default_context()
        ctx.set_ciphers('DEFAULT@SECLEVEL=1')
        self.poolmanager = poolmanager.PoolManager(
            num_pools=connections,
            maxsize=maxsize,
            block=block,
            ssl_version=ssl.PROTOCOL_TLS,
            ssl_context=ctx)

url = 'https://fantasyfootball.telegraph.co.uk/premier-league/json/getstatsjson'

session = requests.session()
session.mount('https://', TLSAdapter())

data = session.get(url).json()

# uncomment this to print all data:
# print(json.dumps(data, indent=4))

for s in data['playerstats']:
    for k, v in s.items():
        print('{:<15} {}'.format(k, v))
    print('-' * 80)
Prints:
SUSPENSION None
WEEKPOINTS 0
TEAMCODE MCY
SXI 34
PLAYERNAME de Bruyne, K
FULLCLEAN -
SUBS 3
TEAMNAME Man City
MISSEDPEN 0
YELLOWCARD 3
CONCEED -
INJURY None
PLAYERFULLNAME Kevin de Bruyne
RATIO 40.7
PICKED 36
VALUE 5.6
POINTS 228
PARTCLEAN -
OWNGOAL 0
ASSISTS 30
GOALS 14
REDCARD 0
PENSAVE -
PLAYERID 3001
POS MID
--------------------------------------------------------------------------------
...and so on.
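Since the original goal was to store all the player data in a dictionary, the JSON rows can simply be re-keyed by player name. A minimal sketch, using a hand-made sample that mimics the shape of data['playerstats'] (only the de Bruyne values come from the real output above; the second row is invented for illustration):

```python
# Sample shaped like data['playerstats'] from the JSON endpoint;
# the 'Example, P' row is a made-up placeholder.
sample = {
    'playerstats': [
        {'PLAYERID': 3001, 'PLAYERNAME': 'de Bruyne, K',
         'TEAMNAME': 'Man City', 'POINTS': 228},
        {'PLAYERID': 9999, 'PLAYERNAME': 'Example, P',
         'TEAMNAME': 'Example FC', 'POINTS': 100},
    ]
}

# Key each stats record by PLAYERNAME for direct lookups
players = {p['PLAYERNAME']: p for p in sample['playerstats']}

print(players['de Bruyne, K']['POINTS'])  # 228
```

With the real response, replace `sample` with the `data` returned by `session.get(url).json()` above; if names may repeat, keying by `PLAYERID` is the safer choice.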