Alternative to tf.contrib.seq2seq.prepare_attention() in TensorFlow 1.8

Problem description

AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'prepare_attention'

I know that prepare_attention() has been deprecated. What is the alternative? Please also specify the syntax.

The function I am using is:

def decoding_layer_train(encoder_state, dec_cell, dec_embed_input, sequence_length,
                         decoding_scope, output_fn, keep_prob, batch_size):
    '''Decode the training data'''
    attention_states = tf.zeros([batch_size, 1, dec_cell.output_size])

    att_keys, att_vals, att_score_fn, att_construct_fn = \
        tf.contrib.seq2seq.prepare_attention(attention_states,
                                             attention_option="bahdanau",
                                             num_units=dec_cell.output_size)

    train_decoder_fn = tf.contrib.seq2seq.attention_decoder_fn_train(encoder_state[0],
                                                                     att_keys,
                                                                     att_vals,
                                                                     att_score_fn,
                                                                     att_construct_fn,
                                                                     name="attn_dec_train")
    train_pred, _, _ = tf.contrib.seq2seq.dynamic_rnn_decoder(dec_cell,
                                                              train_decoder_fn,
                                                              dec_embed_input,
                                                              sequence_length,
                                                              scope=decoding_scope)
    train_pred_drop = tf.nn.dropout(train_pred, keep_prob)
    return output_fn(train_pred_drop)

Tags: python, tensorflow, deep-learning, chatbot, seq2seq

Solution
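prepare_attention(), attention_decoder_fn_train() and dynamic_rnn_decoder() were removed from tf.contrib.seq2seq; in TensorFlow 1.8 the replacement is the AttentionWrapper API: build an attention mechanism (e.g. BahdanauAttention), wrap the decoder cell in AttentionWrapper, and drive decoding with TrainingHelper, BasicDecoder and dynamic_decode. A minimal sketch of the rewritten training decoder follows. Note two signature changes I have assumed: the new API attends over real encoder outputs, so an encoder_outputs argument replaces the old zero-filled attention_states, and the output_fn callback is replaced by an output_layer (e.g. a tf.layers.Dense projection) passed to BasicDecoder.

```python
import tensorflow as tf

def decoding_layer_train(encoder_outputs, encoder_state, dec_cell, dec_embed_input,
                         sequence_length, output_layer, keep_prob, batch_size):
    '''Decode the training data with the TF 1.x AttentionWrapper API
    (sketch replacing the removed prepare_attention pipeline).'''
    # Bahdanau attention over the encoder outputs (the "memory").
    attention_mechanism = tf.contrib.seq2seq.BahdanauAttention(
        num_units=dec_cell.output_size,
        memory=encoder_outputs,
        memory_sequence_length=sequence_length)

    # Wrap the decoder cell so that every step attends over the memory.
    attn_cell = tf.contrib.seq2seq.AttentionWrapper(
        dec_cell, attention_mechanism,
        attention_layer_size=dec_cell.output_size)

    # Dropout on the decoder outputs, replacing the explicit tf.nn.dropout call.
    attn_cell = tf.contrib.rnn.DropoutWrapper(attn_cell,
                                              output_keep_prob=keep_prob)

    # Seed the wrapped cell's state with the encoder's final state.
    initial_state = attn_cell.zero_state(batch_size, tf.float32).clone(
        cell_state=encoder_state)

    # TrainingHelper feeds the ground-truth embeddings (teacher forcing).
    helper = tf.contrib.seq2seq.TrainingHelper(dec_embed_input, sequence_length)
    decoder = tf.contrib.seq2seq.BasicDecoder(attn_cell, helper, initial_state,
                                              output_layer=output_layer)
    outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(decoder)
    return outputs.rnn_output
```

If your encoder is a multi-layer or bidirectional RNN, encoder_state may need the same indexing (encoder_state[0]) or reshaping as in the original code so that it matches the state structure of dec_cell before the clone() call.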

