How do I save scraped data to a CSV file?

Problem description

I'm new to Python, Selenium, and BeautifulSoup. I've seen a lot of tutorials online, but I'm confused. Please help me. Basically, this is my Python code:

    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.support import expected_conditions as EC
    from bs4 import BeautifulSoup as bs
    
    #import requests
    import time 
    #import csv
    
    passwordStr = '***'
    usernameStr='***'
    
    # path to the Firefox driver (geckodriver), not Chrome
    gecko_path = r'C:\Users\wana isa\geckodriver-v0.26.0-win64\geckodriver.exe'
    browser = webdriver.Firefox(executable_path=gecko_path)
    browser.get('http://*********/')
    
    wait = WebDriverWait(browser,10)
    
    
    # wait for transition then continue to fill items
    #time.sleep(2)
    password = wait.until(EC.presence_of_element_located((By.ID, 'txt_Password')))
    password.send_keys(passwordStr)
    username = wait.until(EC.presence_of_element_located((By.ID, 'txt_Username')))
    username.send_keys(usernameStr)
    
    signInButton = browser.find_element_by_id('button')
    signInButton.click()
    browser.get('http://******')
    
    
    # click() returns None, so there is nothing useful to assign
    browser.find_element_by_name('mainli_waninfo').click()
    browser.find_element_by_name('subli_bssinfo').click()
    browser.switch_to.frame(browser.find_element_by_id('frameContent'))
    
    html=browser.page_source
    soup=bs(html,'lxml')
    #print(soup.prettify())
    
    # Service Provisioning Status: this is the data I scrape and need to save to CSV
    spsList=['ONT  Registration Status','OLT Service Configuration Status','EMS Configuration Status','ACS Registration Status']
    sps_id=['td1_2','td2_2','td3_2','td4_2']
    for i in range(len(sps_id)):
        elemntValu = browser.find_element_by_id(sps_id[i]).text
        print(spsList[i] + " : " + elemntValu)
        
    browser.close()

This is the output:

[screenshot of the printed status lines]

I would really appreciate any help.

Tags: python, pandas, selenium, selenium-webdriver, beautifulsoup

Solution


Add this import to your code:

    import csv

Then replace your loop with the following. The `with` block closes the file automatically when it ends, so no explicit `close()` call is needed (the original `f.close()` referenced an undefined variable):

    with open('FileName.csv', 'w', newline='') as file:
        writer = csv.writer(file)
        for i in range(len(sps_id)):
            elemntValu = browser.find_element_by_id(sps_id[i]).text
            print(spsList[i] + " : " + elemntValu)
            writer.writerow([spsList[i], elemntValu])

    browser.close()
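Since the `browser.find_element_by_id` lookups only work against the target page, here is a minimal self-contained sketch of the same CSV-writing pattern. The `scraped` dict is a hypothetical stand-in for the values Selenium would return:

```python
import csv

# Hypothetical stand-ins for the status values Selenium would scrape.
scraped = {
    'ONT Registration Status': 'Registered',
    'OLT Service Configuration Status': 'Configured',
    'EMS Configuration Status': 'Configured',
    'ACS Registration Status': 'Registered',
}

# newline='' prevents blank rows between lines on Windows;
# the with-block closes the file automatically.
with open('FileName.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['Item', 'Status'])  # optional header row
    for name, value in scraped.items():
        writer.writerow([name, value])

# Read the file back to verify what was written.
with open('FileName.csv', newline='') as f:
    rows = list(csv.reader(f))
print(rows[0])    # ['Item', 'Status']
print(len(rows))  # 5
```

Since the question is also tagged pandas, building a `DataFrame` from the same data and calling `to_csv('FileName.csv', index=False)` would be an equivalent alternative.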
