Fixing "AttributeError: 'str' object has no attribute 'close'" when multithreading from a file?

Problem description

Getting AttributeError: 'str' object has no attribute 'close'

I have tried closing the file (the close calls are still in the code below) and have rewritten the code twice.

import urllib2
import csv
import lxml
from bs4 import BeautifulSoup
from multiprocessing.dummy import Pool  # This is a thread-based Pool
from multiprocessing import cpu_count
placeHolder = []

def crawlToCSV(URLrecord):
    try:
        OpenSomeSiteURL = urllib2.urlopen(URLrecord).read()
        Soup_SomeSite = BeautifulSoup(OpenSomeSiteURL, "lxml")
        OpenSomeSiteURL.close()
        tbodyTags = Soup_SomeSite.find("title")
    except urllib2.URLError:
        print (URLrecord.rstrip() + " -- Error")
        Soup_SomeSite = ""
        OpenSomeSiteURL = ""


if __name__ == "__main__":
    fileName = raw_input()
    pool = Pool(cpu_count() * 2)  # Creates a Pool with cpu_count * 2 threads.
    with open(fileName, "rb") as f:
        results = pool.map(crawlToCSV, f) # results is a list of all the placeHolder lists returned from each call to crawlToCSV
        f.close()

Also, sorry for my poor code; this is my first time using multithreading in Python and I have made a lot of mistakes.

Right now I am just trying to get it to run without erroring when it takes in the URLs.

  File "test.py", line 26, in <module>
    results = pool.map(crawlToCSV, f) # results is a list of all the placeHolder lists returned from each call to crawlToCSV
  File "/usr/lib/python2.7/multiprocessing/pool.py", line 253, in map
    return self.map_async(func, iterable, chunksize).get()
  File "/usr/lib/python2.7/multiprocessing/pool.py", line 572, in get
    raise self._value
AttributeError: 'str' object has no attribute 'close'

Tags: python, multithreading, beautifulsoup, urllib2, python-multiprocessing

Solution


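In crawlToCSV, OpenSomeSiteURL is the return value of urllib2.urlopen(URLrecord).read(), i.e. a plain str, so OpenSomeSiteURL.close() raises the AttributeError; the exception occurs inside the worker thread and is re-raised by pool.map() when it collects the results. Keep a reference to the response object and close that instead. Below is a minimal sketch of a corrected worker and main block, keeping the original names; returning the found tag from the worker is an assumption about what you want in results:

import urllib2
from bs4 import BeautifulSoup
from multiprocessing.dummy import Pool  # thread-based Pool
from multiprocessing import cpu_count

def crawlToCSV(URLrecord):
    try:
        # Keep the response object so it can be closed; .read() returns a str,
        # and a str has no close() method.
        response = urllib2.urlopen(URLrecord)
        html = response.read()
        response.close()
        Soup_SomeSite = BeautifulSoup(html, "lxml")
        return Soup_SomeSite.find("title")
    except urllib2.URLError:
        print (URLrecord.rstrip() + " -- Error")
        return None

if __name__ == "__main__":
    fileName = raw_input()
    pool = Pool(cpu_count() * 2)  # thread pool with cpu_count * 2 workers
    with open(fileName, "rb") as f:
        # The with-statement already closes f, so no explicit f.close() is needed.
        results = pool.map(crawlToCSV, f)

With this change, each entry in results is the <title> tag (or None on error) for the corresponding URL line. Alternatively, the urlopen() call can be wrapped in contextlib.closing() to get with-style cleanup of the response.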