tensorflow - AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'prepare_attention'
Problem description
I am trying to run my code, and it throws the error described below:
AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'prepare_attention'
I updated my tensorflow version to 1.0.0, but the upgrade did not solve my problem. I have also searched Google for this error but did not find a proper solution.
Here is the relevant part of the code, please take a look.
# Getting the training and test predictions
training_predictions, test_predictions = seq2seq_model(tf.reverse(inputs, [-1]),
                                                       targets,
                                                       keep_prob,
                                                       batch_size,
                                                       sequence_length,
                                                       len(answerswords2int),
                                                       len(questionswords2int),
                                                       encoding_embedding_size,
                                                       decoding_embedding_size,
                                                       rnn_size,
                                                       num_layers,
                                                       questionswords2int)
C:\Users\Maniech\Anaconda3\lib\site-packages\tensorflow_core\python\client\session.py:1750: UserWarning: An interactive session is already active. This can cause out-of-memory errors in some cases. You must explicitly call `InteractiveSession.close()` to release resources held by the other session(s).
warnings.warn('An interactive session is already active. This can '
Traceback (most recent call last):
File "<ipython-input-8-aecd893a8ef5>", line 37, in <module>
questionswords2int)
File "C:/Users/Maniech/Desktop/Deep NLP AZ/chatbot.py", line 292, in seq2seq_model
batch_size)
File "C:/Users/Maniech/Desktop/Deep NLP AZ/chatbot.py", line 258, in decoder_rnn
batch_size)
File "C:/Users/Maniech/Desktop/Deep NLP AZ/chatbot.py", line 201, in decode_training_set
attention_keys, attention_values, attention_score_function, attention_construct_function = tf.contrib.seq2seq.prepare_attention(attention_states, attention_option = "bahdanau", num_units = decoder_cell.output_size)
AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'prepare_attention'
Any help is appreciated.
Solution
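Note that the traceback path (`tensorflow_core\python\client\session.py`) indicates a much newer TensorFlow is actually installed than 1.0.0. `prepare_attention` only existed in early TF 1.0; it was removed when the contrib seq2seq attention API was redesigned (around TF 1.2) in favor of `BahdanauAttention` plus `AttentionWrapper`, and `tf.contrib` was dropped entirely in TF 2.0. Below is a hedged sketch of the replacement, assuming a TF 1.x install where `tf.contrib` is still available; `decoder_cell`, `attention_states`, and `sequence_length` are the names from the asker's own code, and this is not runnable on TF 2.x:

```python
# Legacy-API sketch: requires TF 1.x, since tf.contrib was removed in TF 2.0.
import tensorflow as tf

# Replacement for:
#   tf.contrib.seq2seq.prepare_attention(attention_states,
#                                        attention_option="bahdanau",
#                                        num_units=decoder_cell.output_size)
attention_mechanism = tf.contrib.seq2seq.BahdanauAttention(
    num_units=decoder_cell.output_size,  # was the num_units argument
    memory=attention_states)             # was the first positional argument

# The attention keys/values and score/construct functions that
# prepare_attention used to return are now handled internally by
# wrapping the decoder cell:
decoder_cell = tf.contrib.seq2seq.AttentionWrapper(
    decoder_cell,
    attention_mechanism,
    attention_layer_size=decoder_cell.output_size)
```

The companion functions used elsewhere in such tutorials (`attention_decoder_fn_train`, `dynamic_rnn_decoder`, etc.) were removed at the same time and would likewise need porting to `TrainingHelper`, `BasicDecoder`, and `dynamic_decode`. Alternatively, if the goal is simply to run the original course code unchanged, downgrading to an actual TF 1.0.x install (rather than the newer version the traceback reveals) avoids the rewrite.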