Gradient calculation in TensorFlow using GradientTape - getting an unexpected None value

Problem description

I am having trouble computing gradients in TensorFlow 1.15. I suspect it is something related to the context manager or the Keras session, but I am not sure. Here is the code I wrote:

import tensorflow as tf

def create_adversarial_pattern_CW(input_patch, input_label, target_label):
    # model and soft_dice_loss are defined elsewhere in the script.
    input_patch_T = tf.cast(input_patch, tf.float32)
    with tf.GradientTape() as tape:
        tape.watch(input_patch_T)  # cast produces a new tensor, so watch it explicitly
        patch_pred = model(input_patch_T)
        loss_input_label = soft_dice_loss(input_label, patch_pred[0])
        loss_target_label = soft_dice_loss(target_label, patch_pred[0])
        f = loss_input_label - loss_target_label
    f_grad = tape.gradient(f, input_patch_T)

    #-------------------------#
    print(type(f_grad))
    #-------------------------#

    f_grad_sign = tf.sign(f_grad)
    return f_grad_sign
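
For reference, tape.gradient(f, x) returns None whenever f is not connected to x on the tape, either because x was never watched or because the link was severed. A minimal, self-contained illustration of that behavior (not the question's code):

import tensorflow as tf
tf.compat.v1.enable_eager_execution()  # TF 1.15: tapes record eagerly executed ops

x = tf.constant(2.0)
with tf.GradientTape() as tape:
    tape.watch(x)
    y = tf.stop_gradient(x) * 3.0  # the tape sees no path from x to y

print(tape.gradient(y, x))  # None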


def DAG():
    # X, y, y_target, iters, and alpha are defined elsewhere;
    # K is the Keras backend (e.g. tensorflow.keras.backend).
    sess = K.get_session()
    with sess.as_default() as sess:
        adv_x_old = tf.cast(X, dtype=tf.float32)
        for i in range(iters):
            #-------------------------#
            # y_pred = model(adv_x_old) -> if I uncomment this line, the returned
            # f_grad is None; otherwise it works fine, but I need this line
            #-------------------------#
            perturbations = create_adversarial_pattern_CW(adv_x_old, y, y_target)
            adv_x_new = adv_x_old - alpha * perturbations
            adv_x_old = adv_x_new
        adv_patch_pred = model(adv_x_old)

To work around this, I tried wrapping the commented-out line as:

with tf.GradientTape() as tape:
    with tape.stop_recording():
        y_pred = model(adv_x_old)

But f_grad still comes back as None.
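
Note that stop_recording pauses the tape it belongs to, so anything computed inside the block is invisible to that tape; wrapping the extra model call this way creates a new, unrelated tape rather than protecting the one inside create_adversarial_pattern_CW. A minimal sketch of the semantics:

import tensorflow as tf
tf.compat.v1.enable_eager_execution()

x = tf.constant(3.0)
with tf.GradientTape(persistent=True) as tape:
    tape.watch(x)
    y = x * x                    # recorded: dy/dx = 2x
    with tape.stop_recording():
        z = x * x                # not recorded while the tape is paused

print(tape.gradient(y, x))  # tf.Tensor(6.0, shape=(), dtype=float32)
print(tape.gradient(z, x))  # None
del tape                    # free the persistent tape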

Tags: tensorflow, machine-learning, deep-learning, gradient, gradienttape

Solution
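
One direction worth trying, assuming the script runs in graph mode (which the K.get_session() / sess.as_default() usage suggests): build the gradient symbolically with tf.gradients instead of a tape, since tf.GradientTape is designed around eagerly executed operations. A sketch reusing the question's model and soft_dice_loss (both assumed to be defined as in the question):

import tensorflow as tf

def create_adversarial_pattern_CW_graph(input_patch, input_label, target_label):
    # Graph-mode variant (hypothetical name): tf.gradients builds the gradient
    # ops at graph-construction time, so no tape recording is involved.
    input_patch_T = tf.cast(input_patch, tf.float32)
    patch_pred = model(input_patch_T)
    loss_input_label = soft_dice_loss(input_label, patch_pred[0])
    loss_target_label = soft_dice_loss(target_label, patch_pred[0])
    f = loss_input_label - loss_target_label
    f_grad = tf.gradients(f, input_patch_T)[0]  # graph mode only; fails under eager
    return tf.sign(f_grad)

Alternatively, if eager execution is the intent, call tf.compat.v1.enable_eager_execution() once at program start and drop the K.get_session() block entirely; mixing the TF1 session workflow with GradientTape makes it easy to end up differentiating tensors the tape never recorded.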

