Python - BeautifulSoup - Passing individual URLs from a list to be scraped

Problem description

I am trying to collect the list of URLs on the page below:

https://sport-tv-guide.live/live/tennis

After collecting these URLs, I need to pass each one to a scraping function that scrapes and outputs the relevant match data.

If there is only one match on a given page, the data is output correctly, e.g. https://sport-tv-guide.live/live/darts (see the output below).

Correct output (screenshot)

The problem occurs when I use a page containing multiple links, e.g. https://sport-tv-guide.live/live/tennis. The URLs appear to be collected correctly (confirmed by printing them), but they do not seem to be passed on correctly to be scraped, because the script simply fails silently (see the output below).

URL output (screenshot)

The code is shown below:

import requests
from bs4 import BeautifulSoup

def makesoup(url):
    cookies = {'mycountries' : '101,28,3,102,42,10,18,4,2'}
    r = requests.post(url,  cookies=cookies)
    return BeautifulSoup(r.text,"lxml")
   
def linkscrape(links):
    baseurl = "https://sport-tv-guide.live"
    urllist = []
    
    for link in links:
        finalurl = (baseurl+ link['href'])
        urllist.append(finalurl)
        # print(finalurl)
        
    for singleurl in urllist:
        soup2=makesoup(url=singleurl)
        print(singleurl)
        g_data=soup2.find_all('div', {'class': 'main col-md-4 eventData'})
        for match in g_data:
            hometeam =  match.find('div', class_='cell40 text-center teamName1').text.strip()
            awayteam =  match.find('div', class_='cell40 text-center teamName2').text.strip()
            dateandtime = match.find('div', class_='timeInfo').text.strip()
            print("Match ; " + hometeam + "vs" +  awayteam) 
            print("Date and Time; ", dateandtime) 


            
def matches():
    soup=makesoup(url = "https://sport-tv-guide.live/live/tennis")
    linkscrape(links= soup.find_all('a', {'class': 'article flag',  'href' : True}))
    

I assume the problem is that when there are multiple URLs they are being passed as one big string rather than as individual URLs, but I am not sure how to get the script to pass each URL from the list, one at a time, to be scraped.
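
For reference, a minimal debugging sketch along these lines (it assumes being pasted into linkscrape right after urllist is filled) would confirm whether the collected URLs really are separate strings rather than one concatenated value:

# Debugging sketch - assumes it sits inside linkscrape, right after urllist is built.
print(type(urllist), len(urllist))   # expect <class 'list'> with one entry per match link
for singleurl in urllist:
    print(repr(singleurl))           # repr() makes any accidental concatenation obvious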

Thanks to anyone who can offer suggestions or help with this issue.

Tags: python, web-scraping, beautifulsoup

Solution


On analyzing the links, the two example links point to different pages with different layouts.

https://sport-tv-guide.live/live/tennis - when you collect all the links from this page, they point to a different page layout.

https://sport-tv-guide.live/live/darts - the links from this page point to this layout.

If you need to scrape the data from all the links on https://sport-tv-guide.live/live/tennis, you can use the script below.

import requests
from bs4 import BeautifulSoup

def makesoup(url):
    # The site filters listings by country via this cookie.
    cookies = {'mycountries' : '101,28,3,102,42,10,18,4,2'}
    print(url)  # show which URL is being fetched (these lines appear in the output below)
    r = requests.post(url, cookies=cookies)
    return BeautifulSoup(r.text, "lxml")

def linkscrape(links):
    baseurl = "https://sport-tv-guide.live"
    urllist = []

    # Build absolute URLs from the relative hrefs collected on the listing page.
    for link in links:
        finalurl = baseurl + link['href']
        urllist.append(finalurl)

    # Visit each event page individually and pull out the match details.
    for singleurl in urllist:
        soup2 = makesoup(url=singleurl)
        g_data = soup2.find('div', {'class': 'eventData'})
        try:
            # The first two divs carrying the 'row'/'mb-5' classes hold the team names.
            teams = g_data.find_all("div", class_=["row", "mb-5"])

            print("HomeTeam - {}".format(teams[0].find("div", class_="main col-md-8 col-wrap").text.strip()))
            print("AwayTeam - {}".format(teams[1].find("div", class_="main col-md-8 col-wrap").text.strip()))
            channelInfo = g_data.find("div", {"id": "channelInfo"})
            print("Time - {}".format(channelInfo.find("div", class_="time full").text.strip()))
            print("Date - {}".format(channelInfo.find("div", class_="date full").text.strip()))
        except Exception:
            # Some event pages use a different layout, so the lookups above fail for them.
            print("Data not found")

def matches():
    soup=makesoup(url = "https://sport-tv-guide.live/live/tennis")
    linkscrape(links=soup.find_all('a', {'class': 'article flag',  'href' : True}))

matches()

Note: I used try/except because the links obtained from the page do not all lead to the same layout.

Output:

https://sport-tv-guide.live/live/tennis
https://sport-tv-guide.live/event/live-tennis-national-tennis-centre-roehampton?uid=191007191100
Data not found
https://sport-tv-guide.live/event/bett1-aces-berlin/?uid=71916304
HomeTeam - Tommy Haas - Roberto Bautista-Agut
AwayTeam - Dominic Thiem - Jannik Sinner
Time - 11:15
Date - Sunday, 07-19-2020
https://sport-tv-guide.live/event/bett1-aces-berlin/?uid=71916307
HomeTeam - Tommy Haas - Roberto Bautista-Agut
AwayTeam - Dominic Thiem - Jannik Sinner
Time - 14:00
Date - Sunday, 07-19-2020
https://sport-tv-guide.live/event/bett1-aces-berlin/?uid=17207191605
HomeTeam - Tommy Haas - Roberto Bautista-Agut
AwayTeam - Dominic Thiem - Jannik Sinner
Time - 14:05
Date - Sunday, 07-19-2020
https://sport-tv-guide.live/event/world-teamtennis/?uid=161707191630102
Data not found
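
As a possible refinement of the try/except approach above, a hypothetical scrape_event helper could branch on which layout an event page actually uses, falling back to the selectors from the question's original code for the second layout. This is only a sketch under that assumption; the fallback selectors come from the question and have not been verified against every event page:

def scrape_event(soup2):
    # Sketch: dispatch on the page layout instead of relying on try/except.
    g_data = soup2.find('div', {'class': 'eventData'})
    channelInfo = g_data.find("div", {"id": "channelInfo"}) if g_data else None
    if channelInfo:
        # Layout handled by the answer above.
        teams = g_data.find_all("div", class_=["row", "mb-5"])
        print("HomeTeam - {}".format(teams[0].find("div", class_="main col-md-8 col-wrap").text.strip()))
        print("AwayTeam - {}".format(teams[1].find("div", class_="main col-md-8 col-wrap").text.strip()))
        print("Time - {}".format(channelInfo.find("div", class_="time full").text.strip()))
        print("Date - {}".format(channelInfo.find("div", class_="date full").text.strip()))
        return
    # Fallback: selectors from the question's original code (assumed to match the other layout).
    for match in soup2.find_all('div', {'class': 'main col-md-4 eventData'}):
        hometeam = match.find('div', class_='cell40 text-center teamName1').text.strip()
        awayteam = match.find('div', class_='cell40 text-center teamName2').text.strip()
        dateandtime = match.find('div', class_='timeInfo').text.strip()
        print("Match: {} vs {}".format(hometeam, awayteam))
        print("Date and Time: {}".format(dateandtime))

Inside linkscrape this would replace the try/except block with a single scrape_event(soup2) call; pages matching neither layout would simply print nothing.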
