ValueError: `tape` is required when a `Tensor` loss is passed

Problem description

About tf.

import numpy as np
import tensorflow as tf
from tensorflow import keras

x_train = [1,2,3]
y_train = [1,2,3]

W = tf.Variable(tf.random.normal([1]), name = 'weight')
b = tf.Variable(tf.random.normal([1]), name = 'bias')
hypothesis = W*x_train+b

optimizer = tf.optimizers.SGD(learning_rate=0.01)

train = tf.keras.optimizers.Adam().minimize(cost, var_list=[W, b])

When I run the last line of my code, I get the following error.

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-52-cd6e22f66d09> in <module>()
----> 1 train = tf.keras.optimizers.Adam().minimize(cost, var_list=[W, b])

1 frames
/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/optimizer_v2/optimizer_v2.py in _compute_gradients(self, loss, var_list, grad_loss, tape)
    530     # TODO(josh11b): Test that we handle weight decay in a reasonable way.
    531     if not callable(loss) and tape is None:
--> 532       raise ValueError("`tape` is required when a `Tensor` loss is passed.")
    533     tape = tape if tape is not None else backprop.GradientTape()
    534 

ValueError: `tape` is required when a `Tensor` loss is passed.

I know this is related to TensorFlow version 2, but I don't want to downgrade to version 1.

I'd like a solution that works in TensorFlow 2. Thanks.

Tags: tensorflow

Solution


Since you did not provide a cost function, I have added one. In TF2, `Optimizer.minimize` raises this `ValueError` when the loss is a plain `Tensor` and no `tape` is supplied; passing the loss as a callable instead lets the optimizer record the computation and take gradients itself. Here is the code:

import numpy as np
import tensorflow as tf
from tensorflow import keras

x_train = [1, 2, 3]
y_train = [1, 2, 3]

W = tf.Variable(tf.random.normal([1]), name='weight')
b = tf.Variable(tf.random.normal([1]), name='bias')

# Wrap the loss in a callable: minimize() can then build its own
# tape internally, so no explicit `tape` argument is needed.
@tf.function
def cost():
    y_model = W * x_train + b
    return tf.reduce_mean(tf.square(y_train - y_model))

# Passing the callable `cost` (not a Tensor) avoids the ValueError.
train = tf.keras.optimizers.Adam().minimize(cost, var_list=[W, b])

tf.print(W)
tf.print(b)
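If you prefer to keep the loss as a plain tensor computation, the other route the error message points at is to record the forward pass on a `tf.GradientTape` and apply the gradients yourself. A minimal sketch of that training loop (the SGD learning rate and the 2000-step count are my own choices for illustration, not part of the original answer):

```python
import tensorflow as tf

x_train = tf.constant([1.0, 2.0, 3.0])
y_train = tf.constant([1.0, 2.0, 3.0])

W = tf.Variable(tf.random.normal([1]), name='weight')
b = tf.Variable(tf.random.normal([1]), name='bias')

optimizer = tf.optimizers.SGD(learning_rate=0.01)

for step in range(2000):
    # Record the forward pass on a tape so gradients can be taken
    # from the Tensor-valued loss.
    with tf.GradientTape() as tape:
        hypothesis = W * x_train + b
        cost = tf.reduce_mean(tf.square(y_train - hypothesis))
    grads = tape.gradient(cost, [W, b])
    optimizer.apply_gradients(zip(grads, [W, b]))

tf.print(W)
tf.print(b)
```

Because the training data lie exactly on the line y = x, `W` should converge toward 1 and `b` toward 0 as the loop runs.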
