Sentiment analysis with word embeddings using a Keras Embedding layer

Problem Description

I need some clarification on my model's results.

Here is my use case:

Below is some key information for understanding the model and my approach:

# Constants
NB_WORDS = 44000  # Parameter indicating the number of words we'll put in the dictionary
VAL_SIZE = 1000  # Size of the validation set
NB_START_EPOCHS = 10  # Number of epochs we usually start to train with
EPOCH_ITER = list(range(0,11)) # Epoch indices used for stepwise evaluation of the accuracy metrics over the 10 epochs
BATCH_SIZE = 512  # Size of the batches used in the mini-batch gradient descent
MAX_LEN = 267  # Maximum number of words in a sequence (review)
REV_DIM = 300 # Number of dimensions of the review word embeddings --> 300 is the most common choice (Mikolov et al., 2013)
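
For context, these constants only come into play once the raw reviews have been converted into padded integer sequences; that step is not shown in the post. Below is a minimal preprocessing sketch of how it is typically done with Keras' Tokenizer and pad_sequences; the variable names (train_texts, X_train, ...) are assumptions and not part of the original code.

# Assumed preprocessing sketch: turn raw review strings into padded integer
# sequences compatible with NB_WORDS and MAX_LEN used by the Embedding layer below.
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

train_texts = ["great movie, loved it", "terrible plot and bad acting"]  # placeholder data

tokenizer = Tokenizer(num_words=NB_WORDS)  # keep only the NB_WORDS most frequent words
tokenizer.fit_on_texts(train_texts)

sequences = tokenizer.texts_to_sequences(train_texts)  # lists of word indices per review
X_train = pad_sequences(sequences, maxlen=MAX_LEN)     # pad/truncate each review to MAX_LEN
# X_train has shape (num_reviews, MAX_LEN), the input format the Embedding layer expects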


# Modeling
from tensorflow.keras import models, layers

emb_model = models.Sequential()

emb_model.add(layers.Embedding(NB_WORDS, REV_DIM, input_length=MAX_LEN)) 
# The Embedding layer is the first hidden layer

"""
Embedding Layer (
input_length = no. of words in vocabularly;
output_dim = dimensionality; 
max_length = length of largest review
)
"""
                                                                            
emb_model.add(layers.Flatten()) 
# The Flatten layer reshapes the (MAX_LEN, REV_DIM) embedding output into a 1-D vector per review

emb_model.add(layers.Dense(2, activation='softmax'))
# Dense is the regular, fully connected neural network layer and the most frequently
# used one. It performs the operation: output = activation(dot(input, kernel) + bias)
# further see: https://www.tutorialspoint.com/keras/keras_dense_layer.htm
# The first argument defines the output size, in our case 2, i.e. positive or negative (0 or 1)

emb_model.summary()
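
The compile/fit step that produced the reported results is not shown in the post. The following is a minimal training sketch under the assumption that the labels are one-hot encoded (matching the Dense(2, activation='softmax') output) and that the last VAL_SIZE examples are held out for validation; X_train and y_train are assumed names, not from the original code.

# Assumed training sketch (not shown in the original post).
# 'categorical_crossentropy' matches the two-unit softmax output, provided the
# labels y_train are one-hot encoded (e.g. via tensorflow.keras.utils.to_categorical).
emb_model.compile(optimizer='rmsprop',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])

history = emb_model.fit(X_train[:-VAL_SIZE], y_train[:-VAL_SIZE],
                        epochs=NB_START_EPOCHS,
                        batch_size=BATCH_SIZE,
                        validation_data=(X_train[-VAL_SIZE:], y_train[-VAL_SIZE:]),
                        verbose=1)
# history.history then contains the per-epoch training/validation accuracy and loss,
# which can be plotted for the stepwise evaluation mentioned with EPOCH_ITER above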

I have already added some explanations of my own. But since I am a beginner, I really need more information/explanations/hints, especially on how and why I can improve my model.

Here are my results:


Tags: tensorflow, keras, sentiment-analysis, embedding, word-embedding

Solution

