Standardizing pixel input data with many zeros

Problem description

I want to standardize my input data for a neural network.

The data looks like this:

data= np.array([[0,0,0,0,233,2,0,0,0],[0,0,0,23,50,2,0,0,0],[0,0,0,0,3,20,3,0,0]])

This is the function I use. It does not work because of the zeros.

def standardize(data): #dataframe
    _, c = data.shape
    data_standardized = data.copy(deep=True)
    for j in range(c):
        x = data_standardized.iloc[:, j]
        avg = x.mean()
        std = x.std()
        x_standardized = (x - avg) / std  # fails for all-zero columns: std is 0, so this is 0/0
        data_standardized.iloc[:, j] = x_standardized

    return data_standardized
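
To make the failure concrete: for an all-zero column the standard deviation is 0, so (x - avg) / std is 0/0, which pandas evaluates to NaN for every entry. A minimal demonstration of the failure (not a fix):

import pandas as pd

col = pd.Series([0, 0, 0], dtype=float)    # an all-zero pixel column
print(col.std())                           # 0.0 -> the divisor is zero
print((col - col.mean()) / col.std())      # every entry becomes NaN (0/0)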

Tags: python, python-3.x, neural-network, standardized

Solution


Index only the columns whose standard deviation is nonzero to avoid dividing by zero:

In [90]: data= np.array([[0,0,0,0,233,2,0,0,0],[0,0,0,23,50,2,0,0,0],[0,0,0,0,3,20,3,0,0]])

In [91]: new = np.zeros(data.shape)

In [92]: m = data.mean(0)

In [93]: std = data.std(0)

In [94]: r = data-m

In [95]: new[:,std.nonzero()] = r[:,std.nonzero()]/std[std.nonzero()]

In [96]: new
Out[96]: 
array([[ 0.        ,  0.        ,  0.        , -0.70710678,  1.3875163 ,
        -0.70710678, -0.70710678,  0.        ,  0.        ],
       [ 0.        ,  0.        ,  0.        ,  1.41421356, -0.45690609,
        -0.70710678, -0.70710678,  0.        ,  0.        ],
       [ 0.        ,  0.        ,  0.        , -0.70710678, -0.9306102 ,
         1.41421356,  1.41421356,  0.        ,  0.        ]])
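
The same result can be computed in one step with np.divide, using its out and where arguments so that columns with zero standard deviation simply keep the preallocated zeros (a sketch, not part of the session above):

import numpy as np

data = np.array([[0,0,0,0,233,2,0,0,0],[0,0,0,23,50,2,0,0,0],[0,0,0,0,3,20,3,0,0]])
m = data.mean(0)
std = data.std(0)
# Divide only where std != 0; all other positions keep the zeros from out.
new = np.divide(data - m, std, out=np.zeros(data.shape), where=std != 0)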

Or use sklearn.preprocessing.StandardScaler.
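
StandardScaler handles constant columns gracefully (their scale is set to 1), so all-zero columns come out as 0 rather than NaN. A minimal sketch, assuming scikit-learn is installed:

import numpy as np
from sklearn.preprocessing import StandardScaler

data = np.array([[0,0,0,0,233,2,0,0,0],[0,0,0,23,50,2,0,0,0],[0,0,0,0,3,20,3,0,0]])
scaler = StandardScaler()                   # zero mean, unit variance per column
standardized = scaler.fit_transform(data)   # constant columns are left at 0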


Your function, refactored:

import numpy as np
import pandas as pd

def standardize(data):  # data: pandas DataFrame
    data = data.values                 # work on the underlying NumPy array
    new = np.zeros(data.shape)         # all-zero columns simply stay 0
    m = data.mean(0)                   # per-column mean
    std = data.std(0)                  # per-column standard deviation
    r = data - m                       # center the data
    new[:, std.nonzero()] = r[:, std.nonzero()] / std[std.nonzero()]
    return pd.DataFrame(new)
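
A quick check of the refactored function on the sample data:

import numpy as np
import pandas as pd

data = np.array([[0,0,0,0,233,2,0,0,0],[0,0,0,23,50,2,0,0,0],[0,0,0,0,3,20,3,0,0]])
df = pd.DataFrame(data)
print(standardize(df))   # all-zero columns stay 0; the rest are standardized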
