Converting a TensorFlow LSTM to Keras

Problem description

I am trying to implement an LSTM layer in Keras based on a TensorFlow implementation (see the code below). It is part of a CRNN network for text recognition.

import tensorflow as tf
from tensorflow.contrib import rnn

def __sequence_label(self, inputdata):
    """
    Implement the sequence label part of the network
    :param inputdata: feature sequence of shape [batch, width, channels]
    :return: time-major logits and the raw per-timestep predictions
    """
    with tf.variable_scope('LSTMLayers'):
        # construct stacked bidirectional LSTM layer
        # forward direction cells
        fw_cell_list = [rnn.BasicLSTMCell(nh, forget_bias=1.0) for nh in [256, 256]]
        # backward direction cells
        bw_cell_list = [rnn.BasicLSTMCell(nh, forget_bias=1.0) for nh in [256, 256]]

        stack_lstm_layer, _, _ = rnn.stack_bidirectional_dynamic_rnn(
            fw_cell_list, bw_cell_list, inputdata, dtype=tf.float32)

        if self.phase.lower() == 'train':
            stack_lstm_layer = self.dropout(inputdata=stack_lstm_layer, keep_prob=0.5)

        [batch_s, _, hidden_nums] = inputdata.get_shape().as_list()  # [batch, width, 2*n_hidden]
        rnn_reshaped = tf.reshape(stack_lstm_layer, [-1, hidden_nums])  # [batch x width, 2*n_hidden]

        # affine projection onto the 37 character classes
        w = tf.Variable(tf.truncated_normal([hidden_nums, 37], stddev=0.1), name="w")
        logits = tf.matmul(rnn_reshaped, w)
        logits = tf.reshape(logits, [batch_s, -1, 37])

        raw_pred = tf.argmax(tf.nn.softmax(logits), axis=2, name='raw_prediction')

        # swap the batch and time axes (time-major output, e.g. for CTC)
        rnn_out = tf.transpose(logits, (1, 0, 2), name='transpose_time_major')  # [width, batch, n_classes]

    return rnn_out, raw_pred

Could anyone tell me what the corresponding layers should look like in Keras?
Thanks in advance :)

Tags: python, tensorflow, keras

Solution
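
Below is a minimal sketch of how the same stack could look in Keras. It assumes the model receives the CNN feature sequence directly, with shape (width, channels) per sample; input_channels and num_classes are placeholders chosen here for illustration, not values taken from the original network.

from keras.models import Sequential
from keras.layers import Bidirectional, LSTM, Dropout, Dense, Activation

input_channels = 512   # placeholder: number of channels coming out of the CNN
num_classes = 37       # as in the TF code (36 characters + blank)

model = Sequential()
# two stacked bidirectional LSTM layers, 256 units per direction,
# returning the full sequence so every timestep gets a prediction;
# Keras's default unit_forget_bias=True corresponds to forget_bias=1.0
model.add(Bidirectional(LSTM(256, return_sequences=True),
                        input_shape=(None, input_channels)))
model.add(Bidirectional(LSTM(256, return_sequences=True)))
# Dropout is only active during training, matching the phase check
model.add(Dropout(0.5))
# per-timestep affine projection onto the character classes
model.add(Dense(num_classes))
model.add(Activation('softmax'))

Note that this sketch applies the softmax inside the model, whereas the TF code keeps the logits for the time-major output; if you train with CTC loss you would typically omit the softmax (or use the logits) and handle the batch/time transpose in the loss function rather than in the layer stack.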

