AttributeError: '_io.BufferedReader' object has no attribute 'load'

Problem Description

I am trying to build a plant classification program in Python with TensorFlow, but I am running into the error mentioned in the title.

When I build and run the program, I get the following output:

    5539
    2021-06-02 12:33:37.608529: I tensorflow/stream_executor/platform/default/dso_loader.cc:53] Successfully opened dynamic library cudart64_110.dll
    Traceback (most recent call last):
      File "C:\Users\User\Desktop\AlphaTango\myclassifier.py", line 9, in <module>
        (feature, labels) = load_data()
      File "C:\Users\User\Desktop\AlphaTango\utils.py", line 46, in load_data
        data = pick.load(pick)
    AttributeError: '_io.BufferedReader' object has no attribute 'load'

I also get a red flag on line 9 (the line with "(feature, labels) = load_data()"), which is the same line the traceback points to with File "C:\Users\User\Desktop\AlphaTango\myclassifier.py", line 9, in <module>.

The code itself is quite long, but I will paste it below; hopefully it helps you pinpoint the problem and help me out:

    from utils import load_data
    
    import tensorflow as tf
    import numpy as np
    import matplotlib.pyplot as plt
    
    from sklearn.model_selection import train_test_split
    
    (feature, labels) = load_data()
    
    
    x_train, x_test, y_train, y_test = train_test_split(feature, labels, test_size = 0.1)
    
    categories = ['black-grass', 'charlock', 'cleavers', 'common chickweed', 'common wheat', 'fat hen', 'loose silky-bent', 'maize', 'scentless mayweed', 'shepherds purse', 'small-flowered cranesbill', 'sugar beet']
    
    
    input_layer = tf.keras.layers.Input([224, 224, 3])
    
    conv1=tf.kears.layers.Conv2D(filters= 32, kernel_size=(5, 5), padding='Same', 
        activation ='relu')(input_layer)
    
    pool1 = tf.keras.layers.MaxPooling2D(pool_size=(2, 2))(conv1)
    
    conv2 = tf.keras.layers.Conv2D(filters= 64, kernel_size=(3,3), padding='Same',
        activation='relu')(pool1)
    
    pool2 = tf.keras.layers.MaxPooling2D(pool_size(2,2), strides=(2,2))(conv2)
    
    conv3 = tf.keras.layers.Conv2D(filters = 96, kernel_size=(3,3),padding='Same',
        activation='relu')(pool2)
    pool3 = tf.keras.layers.MaxPooling2D(pool_size=(2,2), strides=(2,2))(conv3)
    
    
    conv4 = tf.keras.layers.Conv2D(filters = 96, kernel_size= (3,3),padding='Same',
        activation= 'relu')(pool3)
    
    pool4 = tf.keras.layers.MaxPooling2D(pool_size(2,2), strides=(2,2))(conv4)
    
    flt1 = tf.keras.layers.Flatten()(pool4)
    
    dn1 = tf.keras.layers.Dense(512, activation='relu')(flt1)
    out = tf.kera.layers.Dense(5, activation='softmax')(dn1)
    
    model = tf.keras.Model(input_layer, out)
    
    model.compile(optimizer='adam', loss='sparse_categorical_crossentropy',
        metrics=['accuracy'])
    
    model.fit(x_train, y_train, batch_size = 100, epochs =10)
    
    model.save('mymodel.h5')

Edit: Oh, one more thing; I was following this tutorial while building this: https://www.youtube.com/watch?v=POO1gdUJ7yE&t=995s

However, his code works and mine doesn't.

Edit 2: Also adding the code from the other file (utils.py):

    import os
    import numpy as np
    import matplotlib.pyplot as plt
    import cv2
    import pickle



    data_dir = './data/plants'

    categories = ['black-grass', 'charlock', 'cleavers', 'common chickweed', 'common wheat', 'fat hen', 'loose silky-bent', 'maize', 'scentless mayweed', 'shepherds purse', 'small-flowered cranesbill', 'sugar beet']

    data = []

    def make_data():
        for category in categories:
            path = os.path.join(data_dir, category)  # e.g. ./data/plants/black-grass
            label = categories.index(category)

            for img_name in os.listdir(path):
                image_path = os.path.join(path, img_name)
                image = cv2.imread(image_path)

                try:
                    image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
                    image = cv2.resize(image, (224, 224))

                    image = np.array(image, dtype=np.float32)

                    data.append([image, label])

                except Exception as e:
                    pass

        print(len(data))

        pik = open('data.pickle', 'wb')
        pickle.dump(data, pik)
        pik.close()


    make_data()

    def load_data():
        pick = open('data.pickle', 'rb')
        data = pick.load(pick)
        pick.close()

        np.random.shuffle(data)

        feature = []
        labels = []

        for img, label in data:
            feature.append(img)
            labels.append(label)

        feature = np.array(feature, dtype=np.float32)
        labels = np.array(labels)

        feature = feature/255.0

        return [feature, labels]

Tags: python, tensorflow, machine-learning, artificial-intelligence, image-classification

Solution


You made a typo here:

    pick = open('data.pickle', 'rb')
    data = pick.load(pick) # this line
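
The error message matches what is actually happening here: open('data.pickle', 'rb') returns an _io.BufferedReader, and that file object simply has no load method, so the attribute lookup fails before any unpickling starts. A minimal sketch that reproduces the same error (assuming any file named data.pickle exists in the working directory):

    # Hypothetical reproduction of the traceback above:
    # calling .load on the file handle itself, not on the pickle module.
    f = open('data.pickle', 'rb')  # f is an _io.BufferedReader
    f.load(f)                      # AttributeError: '_io.BufferedReader' object has no attribute 'load'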

Instead of loading the file with the pickle module, you used pick, which is just the file object. The correct version looks like this:

    pick = open('data.pickle', 'rb')
    data = pickle.load(pick)
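
For reference, this is a sketch of the whole load_data() from utils.py with that single line corrected; the only other change is opening the file in a with block so it gets closed even if unpickling fails. Everything else is copied from the code in the question.

    import pickle
    import numpy as np

    def load_data():
        # Deserialize with the pickle module, not with the file handle itself.
        with open('data.pickle', 'rb') as pick:
            data = pickle.load(pick)

        np.random.shuffle(data)

        feature = []
        labels = []
        for img, label in data:
            feature.append(img)
            labels.append(label)

        # Scale pixel values to [0, 1], as in the original code.
        feature = np.array(feature, dtype=np.float32) / 255.0
        labels = np.array(labels)

        return [feature, labels]

Note that after this fix, the training script as posted will still stop on its own typos: tf.kears and tf.kera should be tf.keras, and pool_size(2,2) in two of the MaxPooling2D calls is missing an equals sign (pool_size=(2,2)).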
