Unexpected output during tensor normalization

Problem description

I want to normalize each row of a matrix, i.e. scale each row to unit L2 norm.

import tensorflow as tf
t = 2
dimension = 300
tf.reset_default_graph()
batch_size = 2
epsilon = 1e-8
train_labels = tf.placeholder(tf.int32, shape=[batch_size])
# Embedding table, initialized to all zeros.
out_embeddings = tf.Variable(tf.zeros([t, dimension], tf.float32))
embed_out = tf.nn.embedding_lookup(out_embeddings, train_labels)
# Row-wise squared L2 norm, with epsilon to avoid dividing by zero.
out_square = tf.reduce_sum(tf.square(embed_out), 1, keepdims=True) + epsilon
out_norm = tf.sqrt(out_square)
normalized_embed_out = embed_out / out_norm

with tf.Session(config=tf.ConfigProto(allow_soft_placement=True)) as session:
    tf.global_variables_initializer().run()
    batch_labels = [0, 1]
    square, norm, normalized_embed = session.run(
        [out_square, out_norm, normalized_embed_out],
        feed_dict={train_labels: batch_labels})
    print("square:", square)
    print("norm:", norm)

But to my surprise, this is the result I got:

square: [[1.e-08] [1.e-08]]
norm: [[1.e-04] [1.e-04]]

Can you help me understand why?

Tags: python, tensorflow

Solution
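
Nothing in the normalization itself is broken; the numbers follow directly from the initializer. out_embeddings is created with tf.zeros, so every row returned by tf.nn.embedding_lookup is all zeros. The row-wise sum of squares is therefore exactly 0, out_square is 0 + epsilon = 1e-8, and out_norm is sqrt(1e-8) = 1e-4. Dividing by that norm gives 0 / 1e-4, i.e. normalized_embed_out is still all zeros: the epsilon only keeps the division well-defined, it cannot turn a zero row into a unit vector.

To get meaningful unit-norm rows, initialize the embeddings with non-zero values (or normalize only after training has moved them away from zero). Below is a minimal sketch of the same graph, assuming TF 1.x; tf.random_uniform is a stand-in initializer chosen for illustration, not something from the original question:

import tensorflow as tf

tf.reset_default_graph()
t = 2
dimension = 300
batch_size = 2
epsilon = 1e-8

train_labels = tf.placeholder(tf.int32, shape=[batch_size])
# Non-zero initializer: an all-zero row has norm 0 and no direction to normalize.
out_embeddings = tf.Variable(
    tf.random_uniform([t, dimension], -1.0, 1.0, dtype=tf.float32))
embed_out = tf.nn.embedding_lookup(out_embeddings, train_labels)

# Manual row-wise L2 normalization, as in the question.
out_norm = tf.sqrt(tf.reduce_sum(tf.square(embed_out), 1, keepdims=True) + epsilon)
normalized_embed_out = embed_out / out_norm

with tf.Session() as session:
    tf.global_variables_initializer().run()
    # Check: the squared L2 norm of every normalized row should be ~1.
    row_norms = session.run(
        tf.reduce_sum(tf.square(normalized_embed_out), 1),
        feed_dict={train_labels: [0, 1]})
    print(row_norms)  # approximately [1. 1.]

For the common case, tf.nn.l2_normalize(embed_out, axis=1) performs the same sqrt-and-divide, with its own epsilon guard, in a single call.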

