Using the sklearn package for Gibbs sampling

Problem description

I am currently trying to use the sklearn package for the Bernoulli version of the restricted Boltzmann machine (RBM), but I don't understand how it works. The dataset I want to use it on is the MNIST dataset. The minimal code I am currently using is:

def rbm():
    #weights = np.zeros((20, 100, 784))
    #for j in range(0, epochs):
    rbm = BernoulliRBM(n_iter= 1, learning_rate = 0.01, n_components = 100, random_state=0, verbose=True)
    rbm.fit(bindigit_trn)
    gibbs(rbm.components_)
    weights = rbm.components_
    return weights

The error message I get is: "This inspection detects names that should resolve but don't. Due to dynamic dispatch and duck typing, this is possible in a limited but useful number of cases. Top-level and class-level items are supported better than instance items."

Can someone help a beginner out? What should I do?
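
(For reference, that message looks like an IDE inspection about an unresolved name: in scikit-learn, gibbs is a method of the fitted BernoulliRBM estimator, not a standalone function, so a bare gibbs(...) call does not resolve. A minimal, self-contained sketch of calling it; the random stand-in array below is only a placeholder for a binarized (n_samples, 784) MNIST matrix:

from sklearn.neural_network import BernoulliRBM
import numpy as np

# Sketch only: stand-in for a binarized (n_samples, 784) MNIST array
bindigit_trn = (np.random.RandomState(0).rand(200, 784) > 0.5).astype('float32')

rbm = BernoulliRBM(n_iter=1, learning_rate=0.01, n_components=100, random_state=0, verbose=True)
rbm.fit(bindigit_trn)

# gibbs() is called on the fitted estimator: it performs one visible -> hidden -> visible
# sampling step and returns a new visible configuration of the same shape.
v_new = rbm.gibbs(bindigit_trn)

)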

Tags: python, machine-learning, scikit-learn, deep-learning, rbm

Solution


You can try the following with the scikit-learn digits dataset:

from sklearn.neural_network import BernoulliRBM
from sklearn import datasets
import numpy as np
import matplotlib.pyplot as plt

def show_digits(im, title):
    plt.figure(figsize=(5,5))
    plt.gray()
    for i in range(im.shape[0]):
        plt.subplot(10,10,i+1)
        plt.imshow(np.reshape(im[i,:], (8,8)))
        plt.axis('off')
    plt.suptitle(title)
    plt.show()

def rbm():
    digits = datasets.load_digits()
    bindigit_trn = np.asarray(digits.data, 'float32')
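    # scale each image by its own maximum so that pixel values lie in [0, 1]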
    for i in range(len(bindigit_trn)):
        bindigit_trn[i,:] = bindigit_trn[i,:] / np.max(bindigit_trn[i,:])
    print(bindigit_trn.shape)
    # (1797, 64) => 1797 8x8 digits images
    digits = bindigit_trn[:100,:]
    print(digits.shape)
    # (100, 64) => 100 images
    show_digits(digits, 'original digits')
    rbm = BernoulliRBM(n_iter=10, learning_rate=0.1, n_components=10, random_state=0, verbose=True)
    rbm.fit(bindigit_trn)
    print(rbm.components_.shape)
    # (10, 64)
    digits_new = digits.copy()  # start the chain from the original digits (alternatively: rbm.components_.copy())
    # Gibbs sampling: each call to rbm.gibbs() performs one visible -> hidden -> visible sampling step
    for j in range(10000):
        for i in range(100):
            digits_new[i,:] = rbm.gibbs(digits_new[i,:])
    print(digits_new.shape)
    # (100, 64)
    show_digits(digits_new, 'sampled digits')
    weights = rbm.components_
    return weights

weights = rbm()
show_digits(weights, 'weights')

[Figure: original digits]

#[BernoulliRBM] Iteration 1, pseudo-likelihood = -25.85, time = 0.02s
#[BernoulliRBM] Iteration 2, pseudo-likelihood = -25.67, time = 0.02s
#[BernoulliRBM] Iteration 3, pseudo-likelihood = -25.45, time = 0.03s
#[BernoulliRBM] Iteration 4, pseudo-likelihood = -24.34, time = 0.02s
#[BernoulliRBM] Iteration 5, pseudo-likelihood = -23.41, time = 0.02s
#[BernoulliRBM] Iteration 6, pseudo-likelihood = -22.33, time = 0.02s
#[BernoulliRBM] Iteration 7, pseudo-likelihood = -21.88, time = 0.02s
#[BernoulliRBM] Iteration 8, pseudo-likelihood = -21.66, time = 0.02s
#[BernoulliRBM] Iteration 9, pseudo-likelihood = -21.74, time = 0.02s
#[BernoulliRBM] Iteration 10, pseudo-likelihood = -21.04, time = 0.02s

[Figure: sampled digits]

[Figure: learned weights]
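
The same recipe should carry over to the full 28x28 MNIST digits mentioned in the question; one way to load and binarize them first (a sketch, assuming a scikit-learn version recent enough to have fetch_openml and its as_frame argument) is:

from sklearn.datasets import fetch_openml
import numpy as np

# Sketch: fetch the 70000 x 784 MNIST images and scale pixel values to [0, 1]
mnist = fetch_openml('mnist_784', version=1, as_frame=False)
bindigit_trn = np.asarray(mnist.data, 'float32') / 255.0

# BernoulliRBM models binary (or [0, 1]-valued) visible units, so optionally binarize
bindigit_trn = (bindigit_trn > 0.5).astype('float32')

After that, rbm.fit(bindigit_trn) and the rbm.gibbs() loop above apply unchanged, with the images reshaped to 28x28 instead of 8x8 when plotting.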

