How to concatenate embeddings with variable-length inputs in Keras?

Problem description

Here is the network diagram I am working with; the data is tabular and structured.

[network architecture diagram]

On the left we have abilities, which are continuous features; on the right we can have "N" modifiers. Each modifier has a categorical modifier_type and some continuous statistics features.

If there is only one modifier, the following code works fine:

import keras.backend as K
from keras.models import Model
from keras.layers import Input, Embedding, concatenate
from keras.layers import Dense, GlobalMaxPooling1D, Reshape
from keras.optimizers import Adam

K.clear_session()

# Using embeddings for categorical features
modifier_type_embedding_in=[]
modifier_type_embedding_out=[]

# sample categorical features
categorical_features = ['modifier_type']

modifier_input_ = Input(shape=(1,), name='modifier_type_in')
# Let's assume 10 unique type of modifiers and let's have embedding dimension as 6
modifier_output_ = Embedding(input_dim=10, output_dim=6, name='modifier_type')(modifier_input_)
modifier_output_ = Reshape(target_shape=(6,))(modifier_output_)  

modifier_type_embedding_in.append(modifier_input_)
modifier_type_embedding_out.append(modifier_output_)

# sample continuous features
statistics = ['duration']
statistics_inputs =[Input(shape=(len(statistics),), name='statistics')] # Input(shape=(1,))

# sample continuous features
abilities = ['buyback_cost', 'cooldown', 'number_of_deaths', 'ability', 'teleport', 'team', 'level', 'max_mana', 'intelligence']
abilities_inputs=[Input(shape=(len(abilities),), name='abilities')] # Input(shape=(9,))

concat = concatenate(modifier_type_embedding_out + statistics_inputs)
FC_relu = Dense(128, activation='relu', name='fc_relu_1')(concat)
FC_relu = Dense(128, activation='relu', name='fc_relu_2')(FC_relu)
model = concatenate(abilities_inputs + [FC_relu])
model = Dense(64, activation='relu', name='fc_relu_3')(model)
model_out = Dense(1, activation='sigmoid', name='fc_sigmoid')(model)

model_in = abilities_inputs + modifier_type_embedding_in + statistics_inputs
model = Model(inputs=model_in, outputs=model_out)
model.compile(loss='binary_crossentropy', optimizer=Adam(lr=2e-05, decay=1e-3), metrics=['accuracy'])

[plot of the single-modifier model]
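For reference, the single-modifier model can be sanity-checked with dummy data along these lines (a minimal sketch; the array names and values are made up, and the arrays follow the input order of model_in):

import numpy as np

# hypothetical dummy batch of 4 samples, only to verify the wiring
abilities_batch  = np.random.rand(4, 9).astype('float32')   # 9 continuous ability features
modifier_batch   = np.random.randint(0, 10, size=(4, 1))    # one categorical modifier id per sample
statistics_batch = np.random.rand(4, 1).astype('float32')   # 1 continuous statistic per modifier

preds = model.predict([abilities_batch, modifier_batch, statistics_batch])
print(preds.shape)  # (4, 1)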

However, when I try to build the network for "N" modifiers, I get the error below. These are the changes I made to the code:

modifier_input_ = Input(shape=(None, 1,), name='modifier_type_in')


statistics_inputs =[Input(shape=(None, len(statistics),), name='statistics')] # Input(shape=(None, 1,))


FC_relu = Dense(128, activation='relu', name='fc_relu_2')(FC_relu)
max_pool = GlobalMaxPooling1D()(FC_relu)

model = concatenate(abilities_inputs + [max_pool])

This is what I get:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-3-7703088b1d24> in <module>
     22 abilities_inputs=[Input(shape=(len(abilities),), name='abilities')] # Input(shape=(9,))
     23 
---> 24 concat = concatenate(modifier_type_embedding_out + statistics_inputs)
     25 FC_relu = Dense(128, activation='relu', name='fc_relu_1')(concat)
     26 FC_relu = Dense(128, activation='relu', name='fc_relu_2')(FC_relu)

e:\Miniconda3\lib\site-packages\keras\layers\merge.py in concatenate(inputs, axis, **kwargs)
    647         A tensor, the concatenation of the inputs alongside axis `axis`.
    648     """
--> 649     return Concatenate(axis=axis, **kwargs)(inputs)
    650 
    651 

e:\Miniconda3\lib\site-packages\keras\engine\base_layer.py in __call__(self, inputs, **kwargs)
    423                                          'You can build it manually via: '
    424                                          '`layer.build(batch_input_shape)`')
--> 425                 self.build(unpack_singleton(input_shapes))
    426                 self.built = True
    427 

e:\Miniconda3\lib\site-packages\keras\layers\merge.py in build(self, input_shape)
    360                              'inputs with matching shapes '
    361                              'except for the concat axis. '
--> 362                              'Got inputs shapes: %s' % (input_shape))
    363 
    364     def _merge_function(self, inputs):

ValueError: A `Concatenate` layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(None, 6), (None, None, 1)]

How can I use an embedding layer in a neural network that is meant to accept variable-length input features?

Tags: machine-learning, keras, neural-network, deep-learning, embedding

Solution


The key changes relative to the single-modifier version are: declare the modifier input as Input(shape=(None,)) so the Embedding output has shape (batch, N, 6), drop the Reshape, give the statistics input the matching shape (None, 1) so it becomes (batch, N, 1), and add a GlobalMaxPooling1D after the shared Dense layers (which operate on the last axis, i.e. per modifier) so the variable-length modifier axis is collapsed before concatenating with the abilities. The full code:

import keras.backend as K
from keras.models import Model
from keras.layers import Input, Embedding, concatenate
from keras.layers import Dense, GlobalMaxPooling1D, Reshape
from keras.optimizers import Adam

K.clear_session()

# Using embeddings for categorical features
modifier_type_embedding_in=[]
modifier_type_embedding_out=[]

# sample categorical features
categorical_features = ['modifier_type']

modifier_input_ = Input(shape=(None,), name='modifier_type_in')
# Let's assume 10 unique type of modifiers and let's have embedding dimension as 6
modifier_output_ = Embedding(input_dim=10, output_dim=6, name='modifier_type')(modifier_input_)

modifier_type_embedding_in.append(modifier_input_)
modifier_type_embedding_out.append(modifier_output_)

# sample continuous features
statistics = ['duration']
statistics_inputs =[Input(shape=(None, len(statistics),), name='statistics')] # Input(shape=(None, 1,))

# sample continuous features
abilities = ['buyback_cost', 'cooldown', 'number_of_deaths', 'ability', 'teleport', 'team', 'level', 'max_mana', 'intelligence']
abilities_inputs=[Input(shape=(len(abilities),), name='abilities')] # Input(shape=(9,))

concat = concatenate(modifier_type_embedding_out + statistics_inputs)
FC_relu = Dense(128, activation='relu', name='fc_relu_1')(concat)
FC_relu = Dense(128, activation='relu', name='fc_relu_2')(FC_relu)
max_pool = GlobalMaxPooling1D()(FC_relu)

model = concatenate(abilities_inputs + [max_pool])
model = Dense(64, activation='relu', name='fc_relu_3')(model)
model_out = Dense(1, activation='sigmoid', name='fc_sigmoid')(model)

model_in = abilities_inputs + modifier_type_embedding_in + statistics_inputs
model = Model(inputs=model_in, outputs=model_out)
model.compile(loss='binary_crossentropy', optimizer=Adam(lr=2e-05, decay=1e-3), metrics=['accuracy'])

[plot of the variable-length model]
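With this model the number of modifiers may differ from batch to batch; within a single batch every sample must share the same N, so shorter samples would need padding. A minimal sketch with made-up data, again assuming the input order of model_in:

import numpy as np

# hypothetical batches with different numbers of modifiers
for n_modifiers in (3, 7):
    abilities_batch  = np.random.rand(4, 9).astype('float32')              # (batch, 9)
    modifier_batch   = np.random.randint(0, 10, size=(4, n_modifiers))     # (batch, N) modifier ids
    statistics_batch = np.random.rand(4, n_modifiers, 1).astype('float32') # (batch, N, 1)

    preds = model.predict([abilities_batch, modifier_batch, statistics_batch])
    print(n_modifiers, preds.shape)  # (4, 1) in both cases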

