AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'prepare_attention'

Problem description

I am trying to run my code, and it throws the following error:

AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'prepare_attention'

I updated my TensorFlow version to 1.0.0, but the upgrade did not solve my problem. I have also searched Google for this error, but I have not found a working solution.
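For context, `prepare_attention` belongs to the very early `tf.contrib.seq2seq` API and was removed when that API was rewritten in later 1.x releases; `tf.contrib` does not exist at all in TensorFlow 2.x, and the `tensorflow_core` path in the traceback suggests a recent install where the attribute cannot be present. A quick way to confirm which API the installed build actually exposes is to probe for the attribute before building the graph; this is a minimal sketch, and the helper name `has_prepare_attention` is my own, not part of TensorFlow:

```python
def has_prepare_attention(tf_module):
    """Return True if the old TF 1.0 contrib attention API is available.

    `tf_module` is the imported tensorflow module. Returns False when
    tf.contrib, tf.contrib.seq2seq, or prepare_attention is missing,
    instead of raising AttributeError deep inside the model code.
    """
    contrib = getattr(tf_module, "contrib", None)
    seq2seq = getattr(contrib, "seq2seq", None)
    return hasattr(seq2seq, "prepare_attention")
```

Calling `has_prepare_attention(tf)` right after `import tensorflow as tf` returns `False` on any build where the traceback below would occur, which makes it clear the fix has to be either pinning the old TensorFlow version the tutorial code was written for or porting the attention code to the newer API.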

Here is the relevant part of the code; please take a look.

# Getting the training and test predictions

training_predictions, test_predictions = seq2seq_model(tf.reverse(inputs, [-1]),
                                                       targets,
                                                       keep_prob,
                                                       batch_size,
                                                       sequence_length,
                                                       len(answerswords2int),
                                                       len(questionswords2int),
                                                       encoding_embedding_size,
                                                       decoding_embedding_size,
                                                       rnn_size,
                                                       num_layers,
                                                       questionswords2int)

C:\Users\Maniech\Anaconda3\lib\site-packages\tensorflow_core\python\client\session.py:1750: UserWarning: An interactive session is already active. This can cause out-of-memory errors in some cases. You must explicitly call `InteractiveSession.close()` to release resources held by the other session(s).
  warnings.warn('An interactive session is already active. This can '
Traceback (most recent call last):

  File "<ipython-input-8-aecd893a8ef5>", line 37, in <module>
    questionswords2int)

  File "C:/Users/Maniech/Desktop/Deep NLP AZ/chatbot.py", line 292, in seq2seq_model
    batch_size)

  File "C:/Users/Maniech/Desktop/Deep NLP AZ/chatbot.py", line 258, in decoder_rnn
    batch_size)

  File "C:/Users/Maniech/Desktop/Deep NLP AZ/chatbot.py", line 201, in decode_training_set
    attention_keys, attention_values, attention_score_function, attention_construct_function = tf.contrib.seq2seq.prepare_attention(attention_states, attention_option = "bahdanau", num_units = decoder_cell.output_size)

AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'prepare_attention'

Any help is appreciated.

Tags: tensorflow, machine-learning, chatbot

Solution
