Scraping Amazon reviews: can't exclude paid (Vine) reviews

Problem Description

I'm trying to collect the number of stars each reviewer gave the product. I've noticed that some reviewers are "Vine Voices", i.e. paid reviewers, and they rarely give 4 stars; it's mostly 5. I'd therefore like to exclude them.

My approach is to mark each review as "Paid" or "Not-paid" depending on whether it contains an element with the class "a-color-success a-text-bold".

I can't seem to get any "Paid" entries appended to the vine list. Why is that?

Only reviews written by a Vine Voice carry that tag; the ones without it should simply not get the "Paid" label.

import requests
from bs4 import BeautifulSoup
import pandas as pd
import time

headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.71 Safari/537.36'}

rating_list = [] 
date_list = []
vine = []

for num in range(1,12):
    url = "https://www.amazon.com/Jabra-Wireless-Noise-Canceling-Headphones-Built/product-reviews/B07RS8B5HV/ref=cm_cr_arp_d_paging_btm_next_2?ie=UTF8&reviewerType=all_reviews&pageNumber={}&sortBy=recent".format(num)

    r = requests.get(url, headers = headers)

    soup = BeautifulSoup(r.content, 'lxml')

    for ratings in soup.find_all("div", attrs={"data-hook": "review"}):     
        submission_date = ratings.find("span", {'data-hook':'review-date'}).text
        rating = ratings.find('i', attrs={"data-hook": "review-star-rating"}).text
        paid = ratings.find("span", attrs={"class": "a-color-success a-text-bold"})

        if paid in ratings:
             vine.append("Paid")
        else:
            vine.append("Not-paid")

            date_list.append(submission_date)
            rating_list.append(rating)

            data = {'Rating':rating_list, 'Date':date_list, "Paid":vine}
        time.sleep(2)

df = pd.DataFrame(data)
df["Date"] = pd.to_datetime(df["Date"])
df = df.sort_values(by="Date", ascending=False)
print(df)

This is what I get so far. Reviews 2 and 3 are from Vine Voices, but they are marked Not-paid when they should be Paid.

0    5.0 out of 5 stars 2019-09-18  Not-paid
1    4.0 out of 5 stars 2019-09-13  Not-paid
2    5.0 out of 5 stars 2019-09-12  Not-paid
3    5.0 out of 5 stars 2019-09-11  Not-paid
4    5.0 out of 5 stars 2019-09-10  Not-paid
...

Tags: python, html, web-scraping, beautifulsoup

Solution


You are comparing an element against an element, which is why it always falls through to the else branch. I changed it to check that the badge element is actually found and to compare text against text, and it works fine.
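Before the full script, a minimal sketch with simplified, hypothetical markup may help show the idea: find() returns the badge Tag when it is present and None otherwise, so a truthiness check on the element (or a text-to-text comparison) is enough to tell Vine reviews apart.

from bs4 import BeautifulSoup

# Hypothetical, simplified markup standing in for two Amazon reviews:
# the first carries the Vine badge span, the second does not.
html = '''
<div data-hook="review"><div class="a-row">
  <span class="a-color-success a-text-bold">Vine Customer Review of Free Product</span>
</div></div>
<div data-hook="review"><div class="a-row"></div></div>
'''

soup = BeautifulSoup(html, 'lxml')
for review in soup.find_all("div", attrs={"data-hook": "review"}):
    badge = review.find("span", attrs={"class": "a-color-success a-text-bold"})
    print("Paid" if badge else "Not-paid")   # prints: Paid, then Not-paid

The full, corrected script: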

import requests
from bs4 import BeautifulSoup
import pandas as pd
import time

headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.71 Safari/537.36'}

rating_list = []
date_list = []
vine = []

for num in range(1,12):
    url = "https://www.amazon.com/Jabra-Wireless-Noise-Canceling-Headphones-Built/product-reviews/B07RS8B5HV/ref=cm_cr_arp_d_paging_btm_next_2?ie=UTF8&reviewerType=all_reviews&pageNumber={}&sortBy=recent".format(num)

    r = requests.get(url, headers = headers)

    soup = BeautifulSoup(r.content, 'lxml')

    for ratings in soup.find_all("div", attrs={"data-hook": "review"}):
        submission_date = ratings.find("span", {'data-hook':'review-date'}).text
        rating = ratings.find('i', attrs={"data-hook": "review-star-rating"}).text
        # find() returns the Vine badge span if the review has one, or None otherwise
        paid = ratings.find("span", attrs={"class": "a-color-success a-text-bold"})

        if paid and paid.text in ratings.text:
            vine.append("Paid")
        else:
            vine.append("Not-paid")

        date_list.append(submission_date)
        rating_list.append(rating)

    # pause between result pages so the requests are not fired back-to-back
    time.sleep(2)

data = {'Rating': rating_list, 'Date': date_list, "Paid": vine}

df = pd.DataFrame(data)
df["Date"] = pd.to_datetime(df["Date"])
df = df.sort_values(by="Date", ascending=False)
print(df)

Output:

          Date      Paid              Rating
0   2019-09-18  Not-paid  5.0 out of 5 stars
1   2019-09-13  Not-paid  4.0 out of 5 stars
2   2019-09-12      Paid  5.0 out of 5 stars
3   2019-09-11      Paid  5.0 out of 5 stars
4   2019-09-10  Not-paid  5.0 out of 5 stars
5   2019-09-10  Not-paid  2.0 out of 5 stars
6   2019-09-10      Paid  5.0 out of 5 stars
7   2019-09-09      Paid  5.0 out of 5 stars
8   2019-09-09  Not-paid  2.0 out of 5 stars
9   2019-09-08      Paid  5.0 out of 5 stars
10  2019-09-05      Paid  5.0 out of 5 stars
11  2019-09-01  Not-paid  2.0 out of 5 stars
12  2019-08-31      Paid  5.0 out of 5 stars
13  2019-08-25      Paid  5.0 out of 5 stars
14  2019-08-24  Not-paid  4.0 out of 5 stars
15  2019-08-22  Not-paid  5.0 out of 5 stars
16  2019-08-21      Paid  5.0 out of 5 stars
17  2019-08-20  Not-paid  5.0 out of 5 stars
18  2019-08-20      Paid  5.0 out of 5 stars
19  2019-08-18      Paid  5.0 out of 5 stars
20  2019-08-17  Not-paid  5.0 out of 5 stars
21  2019-08-17  Not-paid  5.0 out of 5 stars
22  2019-08-14  Not-paid  4.0 out of 5 stars
23  2019-08-12      Paid  5.0 out of 5 stars
24  2019-08-05      Paid  5.0 out of 5 stars
25  2019-08-05      Paid  4.0 out of 5 stars
26  2019-08-04      Paid  5.0 out of 5 stars
27  2019-08-04      Paid  4.0 out of 5 stars
29  2019-08-03      Paid  5.0 out of 5 stars
28  2019-08-03      Paid  4.0 out of 5 stars
..         ...       ...                 ...
80  2019-07-08      Paid  5.0 out of 5 stars
81  2019-07-08      Paid  5.0 out of 5 stars
82  2019-07-08      Paid  5.0 out of 5 stars
85  2019-07-07      Paid  5.0 out of 5 stars
83  2019-07-07      Paid  5.0 out of 5 stars
84  2019-07-07      Paid  5.0 out of 5 stars
87  2019-07-06      Paid  5.0 out of 5 stars
86  2019-07-06      Paid  4.0 out of 5 stars
88  2019-07-05  Not-paid  4.0 out of 5 stars
89  2019-07-05      Paid  5.0 out of 5 stars
90  2019-07-05      Paid  5.0 out of 5 stars
91  2019-07-05      Paid  5.0 out of 5 stars
92  2019-07-04      Paid  5.0 out of 5 stars
93  2019-07-04      Paid  4.0 out of 5 stars
94  2019-07-04      Paid  5.0 out of 5 stars
95  2019-07-04      Paid  5.0 out of 5 stars
96  2019-07-04      Paid  5.0 out of 5 stars
98  2019-07-03  Not-paid  3.0 out of 5 stars
97  2019-07-03      Paid  5.0 out of 5 stars
99  2019-07-01      Paid  5.0 out of 5 stars
100 2019-07-01      Paid  3.0 out of 5 stars
101 2019-07-01      Paid  5.0 out of 5 stars
102 2019-06-30      Paid  5.0 out of 5 stars
103 2019-06-29      Paid  5.0 out of 5 stars
104 2019-06-29      Paid  5.0 out of 5 stars
105 2019-06-28  Not-paid  1.0 out of 5 stars
106 2019-06-27      Paid  4.0 out of 5 stars
107 2019-06-27      Paid  5.0 out of 5 stars
108 2019-06-26      Paid  5.0 out of 5 stars
109 2019-06-26      Paid  5.0 out of 5 stars

[110 rows x 3 columns]
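With the Paid column filled in correctly, the original goal of excluding the Vine reviews becomes a one-line pandas filter. A small, self-contained sketch (the tiny stand-in frame below is hypothetical; in the real script, df is the frame built above):

import pandas as pd

# Tiny stand-in for the frame built above (hypothetical values, same columns)
df = pd.DataFrame({'Rating': ['5.0 out of 5 stars', '4.0 out of 5 stars'],
                   'Date': pd.to_datetime(['2019-09-18', '2019-09-12']),
                   'Paid': ['Paid', 'Not-paid']})

# Drop the Vine ("Paid") rows, then optionally pull the numeric star value out of the text
df_unpaid = df[df["Paid"] == "Not-paid"].copy()
df_unpaid["Stars"] = df_unpaid["Rating"].str.extract(r"([\d.]+)", expand=False).astype(float)
print(df_unpaid)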
