How to extract CNN activations using Keras?

Problem description

I want to use Keras to extract the CNN activations from the first fully connected layer. Caffe has this functionality, but I can't use that framework because I'm running into installation problems. I'm reading a research paper that uses these CNN activations, but the authors work with Caffe.

Is there a way to extract those CNN activations so that I can use them as the items of transactions in association-rule mining with the Apriori algorithm?

Of course, I first have to extract the k largest-magnitude CNN activations. Each image then becomes a transaction, and each activation becomes an item.
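The top-k step described here can be sketched with NumPy (the function name and the value of k are illustrative, not from the original post):

```python
import numpy as np

def top_k_items(activations, k=10):
    """Indices of the k largest-magnitude activations of one image.

    Each returned index can serve as an 'item' in that image's
    transaction for association-rule mining.
    """
    flat = np.asarray(activations).ravel()
    # argpartition selects the k largest by magnitude without a full sort
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    # order the chosen indices by descending magnitude
    return idx[np.argsort(-np.abs(flat[idx]))]

# One image's activation vector becomes a transaction of k items
vec = np.array([0.0, 0.5, 0.1, 0.9, 0.0, 0.3])
print(top_k_items(vec, k=3))  # [3 1 5]
```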

So far I have the following code:

from __future__ import print_function
import keras
from keras.datasets import mnist
from keras.layers import Dense, Flatten
from keras.layers import Conv2D, MaxPooling2D
from keras.models import Sequential
import matplotlib.pylab as plt

# MNIST images are 28x28 grayscale, with 10 digit classes
input_shape = (28, 28, 1)
num_classes = 10

model = Sequential()
model.add(Conv2D(32, kernel_size=(5, 5), strides=(1, 1),
                 activation='relu',
                 input_shape=input_shape))
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
model.add(Conv2D(64, (5, 5), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Flatten())
model.add(Dense(1000, activation='relu'))
model.add(Dense(num_classes, activation='softmax'))

model.compile(loss=keras.losses.categorical_crossentropy,
              optimizer=keras.optimizers.Adam(),
              metrics=['accuracy'])

Tags: python, keras, conv-neural-network, caffe, feature-extraction

Solution


The solution below uses TensorFlow Keras.

To access the activations, we first have to pass one or more images through the model; the activations then correspond to those images.

The code for preprocessing and passing an input image is shown below:

import os

import tensorflow as tf
from tensorflow.keras.preprocessing import image

Test_Dir = '/Deep_Learning_With_Python_Book/Dogs_Vs_Cats_Small/test/cats'
Image_File = os.path.join(Test_Dir, 'cat.1545.jpg')

# Load the image at the size the model expects
Image = image.load_img(Image_File, target_size=(150, 150))

# Convert to a NumPy array of shape (150, 150, 3)
Image_Tensor = image.img_to_array(Image)
print(Image_Tensor.shape)

# Add a batch dimension and rescale pixel values to [0, 1]
Image_Tensor = tf.expand_dims(Image_Tensor, axis=0)
Image_Tensor = Image_Tensor / 255.0

Once the model is defined, we can access the activations of any layer (here for the cats-vs-dogs dataset) using the code below:

from tensorflow.keras.models import Model

# Extract the outputs of every layer in the model
Model_Outputs = [layer.output for layer in model.layers]
# Create a Model that maps the original input to all layer outputs
Activation_Model = Model(model.input, Model_Outputs)
Activations = Activation_Model.predict(Image_Tensor)
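If only one layer's activations are needed, as in the original question, a sub-model for just that layer avoids collecting every output. A self-contained sketch with a toy model (the layer sizes are placeholders, not the cats-vs-dogs model above):

```python
import numpy as np
from tensorflow.keras.layers import Dense, Flatten, Input
from tensorflow.keras.models import Model, Sequential

# Toy stand-in for the real model; sizes are made up for illustration
model = Sequential([
    Input(shape=(8, 8, 1)),
    Flatten(),
    Dense(16, activation='relu'),    # first fully connected layer
    Dense(2, activation='softmax'),
])

# Sub-model whose only output is the first fully connected layer
dense_model = Model(model.input, model.layers[-2].output)

x = np.random.rand(1, 8, 8, 1).astype('float32')
print(dense_model.predict(x).shape)  # (1, 16)
```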

The output of the First Fully Connected Layer (for the cats-and-dogs data) is obtained with:

print('Shape of Activation of First Fully Connected Layer is', Activations[-2].shape)
print('------------------------------------------------------------------------------------------')
print('Activation of First Fully Connected Layer is', Activations[-2])

Its output is shown below:

Shape of Activation of First Fully Connected Layer is (1, 512)
------------------------------------------------------------------------------------------
Activation of First Fully Connected Layer is [[0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.02759874 0.         0.         0.         0.
  0.         0.         0.00079661 0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.04887392 0.         0.
  0.04422646 0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.01124999
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.00286965 0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.00027195 0.
  0.         0.02132209 0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.00511147 0.         0.         0.02347952 0.
  0.         0.         0.         0.         0.         0.
  0.02570331 0.         0.         0.         0.         0.03443285
  0.         0.         0.         0.         0.         0.
  0.         0.0068848  0.         0.         0.         0.
  0.         0.         0.         0.         0.00936454 0.
  0.00389365 0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.00152553 0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.09215052 0.         0.         0.0284613  0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.00198757 0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.02395868 0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.01150922 0.0119792
  0.         0.         0.         0.         0.         0.
  0.00775307 0.         0.         0.         0.         0.
  0.         0.         0.         0.01026413 0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.01522083 0.         0.00377031 0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.02235368 0.         0.         0.         0.
  0.         0.         0.         0.         0.00317057 0.
  0.         0.         0.         0.         0.         0.
  0.03029975 0.         0.         0.         0.         0.
  0.         0.         0.03843511 0.         0.         0.
  0.         0.         0.         0.         0.         0.02327696
  0.00557329 0.         0.02251234 0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.01655817 0.         0.
  0.         0.         0.         0.         0.00221658 0.
  0.         0.         0.         0.02087847 0.         0.
  0.         0.         0.02594821 0.         0.         0.
  0.         0.         0.01515464 0.         0.         0.
  0.         0.         0.         0.         0.00019883 0.
  0.         0.         0.         0.         0.         0.00213376
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.00237587
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.02521542 0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.00490679 0.         0.04504126 0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.        ]]
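Coming back to the original question: each image's item set (the indices of its k largest activations) can be encoded as the one-hot table that typical Apriori implementations expect. A minimal sketch with made-up transactions:

```python
# Made-up transactions: each inner list holds the indices of one
# image's k largest activations (k = 2 here, purely for illustration)
transactions = [[1, 3], [0, 4], [1, 4]]

# Universe of items: every activation index seen in any transaction
items = sorted({i for t in transactions for i in t})

# One boolean row per image, one column per item
onehot = [[i in t for i in items] for t in transactions]
print(items)   # [0, 1, 3, 4]
print(onehot)  # first row: [False, True, True, False]
```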

For more information, see Section 5.4.1, "Visualizing intermediate activations", in the book Deep Learning with Python by Francois Chollet, the creator of Keras.

Hope this helps. Happy learning!

