Combining add_loss with keras.losses in multioutput models using intermediate outputs

Problem Description

In a previous post (Keras multioutput custom loss with intermediate layers output) I discussed the problem I was having. In the end, it was solved this way:

def MyLoss(true1, true2, out1, out2, out3):
    loss1 = tf.keras.losses.someloss1(out1, true1)
    loss2 = tf.keras.losses.someloss2(out2, true2)
    loss3 = tf.keras.losses.someloss3(out2, out3)
    loss = loss1 + loss2 + loss3
    return loss

input1 = Input(shape=input1_shape)
input2 = Input(shape=input2_shape)

# do not take into account the notation, only the idea
output1 = Submodel1()([input1,input2]) 
output2 = Submodel2()(output1)
output3 = Submodel3()(output1)

true1 = Input(shape=true1shape)
true2 = Input(shape=true2shape)

model = Model([input1,input2,true1,true2], [output1,output2,output3])
model.add_loss(MyLoss(true1, true2, output1, output2, output3))
model.compile(optimizer='adam', loss=None)

model.fit(x=[input1, input2, true1, true2], y=None, epochs=n_epochs)

In that problem, all the losses I used were Keras losses (i.e. tf.keras.losses.someloss), but now I want to add a couple more losses and combine custom losses with Keras losses. That is, now I have this scheme:

[Image: diagram of the model with the two additional SSIM loss terms]

To add these two losses, which are SSIM losses, I have tried this:

def SSIMLoss(y_true, y_pred):
    return 1-tf.reduce_mean(tf.image.ssim(y_true, y_pred, 1.0))

def MyLoss(true1, true2, out1, out2, out3):
    loss1 = tf.keras.losses.someloss1(out1, true1)
    customloss1 = SSIMLoss(out1,true1)
    loss2 = tf.keras.losses.someloss2(out2, true2)
    loss3 = tf.keras.losses.someloss3(out2, out3)
    customloss2 = SSIMLoss(out2,out3)
    loss = loss1 + loss2 + loss3 + customloss1 + customloss2
    return loss

But I get this error:

OperatorNotAllowedInGraphError: using a `tf.Tensor` as a Python `bool` is not allowed in Graph execution. Use Eager execution or decorate this function with @tf.function.

I have tried decorating the function with @tf.function but I get this error:

_SymbolicException: Inputs to eager execution function cannot be Keras symbolic tensors, but found [<tf.Tensor 'input_43:0' shape=(None, 128, 128, 1) dtype=float32>, <tf.Tensor 'conv2d_109/Sigmoid:0' shape=(None, 128, 128, 1) dtype=float32>]

I have found this (https://github.com/tensorflow/tensorflow/issues/32127) about combining Keras losses with add_loss; maybe this is the problem, but I don't know how to fix it.

Tags: python, tensorflow, machine-learning, keras, deep-learning

Solution


I was able to reproduce your errors in TF 2.3. In TF 2.4 and nightly TF 2.6 there was no such issue: the model built, model.summary() worked, and training with .fit ran as expected; the only remaining error appeared when plotting the model. In any case, if eager mode is disabled, there is no issue in either TF 2.3 or TF 2.4.


Details

In TF 2.3, I can reproduce your issue exactly as reported. To resolve it, simply disable eager execution, as shown below.
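
Disabling eager execution is a single call; it must run right after importing TensorFlow and before any model is built (in the full example further down it appears commented out):

import tensorflow as tf

# Work around the add_loss error in TF 2.3 by running in graph mode.
tf.compat.v1.disable_eager_execution()
print(tf.executing_eagerly())  # False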

In TF 2.4 / TF nightly 2.6, I didn't need to disable eager mode. The model compiled fine and trained as expected. The only issue occurred when I tried to plot the model; it gave the following error:

tf.keras.utils.plot_model(model)
....
AttributeError: 'tensorflow.python.framework.ops.EagerTensor' object has no 
attribute '_keras_history'

This issue seems to be caused by the 1 - ... expression in the SSIMLoss method, or something similar. But again, disabling eager mode resolves it anyway. In general, though, it's better to upgrade to TF 2.4.
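
If you prefer to stay in eager mode and still want plot_model to work, one thing you could try is writing the subtraction with TensorFlow ops instead of the plain Python expression. This is only a sketch of that idea; I have not verified that it alone restores plot_model:

def SSIMLoss(y_true, y_pred):
    # Same loss as before, expressed purely with TF ops rather than 1 - x.
    return tf.math.subtract(1.0, tf.reduce_mean(tf.image.ssim(y_true, y_pred, 1.0)))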


Code Examples

Here I will show you a dummy example that is probably similar to your training pipeline. In this example, the model has one image input of shape (28, 28, 3), two target inputs that are used only in the loss, and three outputs of shape (28, 28, 3).

from tensorflow.keras.layers import *
from tensorflow.keras import Model 
import tensorflow as tf 
import numpy as np

# tf.compat.v1.disable_eager_execution()
print(tf.__version__)
print(tf.executing_eagerly())
2.4.1
True

Custom loss functions.

def SSIMLoss(y_true, y_pred):
    return 1 - tf.reduce_mean(tf.image.ssim(y_true, y_pred, 1.0))

# Combine built-in Keras losses (cosine similarity) with the custom SSIM loss.
def MyLoss(true1, true2, out1, out2, out3):
    loss1 = tf.keras.losses.cosine_similarity(out1, true1)
    loss2 = tf.keras.losses.cosine_similarity(out2, true2)
    loss3 = tf.keras.losses.cosine_similarity(out2, out3)
    customloss1 = SSIMLoss(true1, out1)
    customloss2 = SSIMLoss(out2, out3)

    loss = loss1 + loss2 + loss3 + customloss1 + customloss2
    return loss
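
Because eager execution is on here, you can sanity-check the combined loss on random tensors before wiring it into the model. The tensors below are placeholders I made up to match the shapes in this example:

# Quick eager check of the combined loss on dummy batches of shape (4, 28, 28, 3).
a = tf.random.uniform([4, 28, 28, 3])
b = tf.random.uniform([4, 28, 28, 3])
c = tf.random.uniform([4, 28, 28, 3])
# The cosine terms are per-pixel maps and the SSIM terms are scalars,
# so reduce everything to a single number for inspection.
print(float(tf.reduce_mean(MyLoss(a, b, a, b, c))))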

Data

imgA = tf.random.uniform([10, 28, 28, 3], minval=0, maxval=256)
tarA = np.random.randn(10, 28, 28, 3)
tarB = np.random.randn(10, 28, 28, 3)
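
One caveat (my note, not part of the original answer): tf.image.ssim is called with max_val=1.0, which assumes values in the [0, 1] range, while the dummy data above is not in that range. The model still trains, but if you want the SSIM terms to be meaningful you can rescale first:

# Optional: rescale the dummy data so SSIM with max_val=1.0 sees values in [0, 1).
imgA = imgA / 256.0
tarA = (tarA - tarA.min()) / (tarA.max() - tarA.min())
tarB = (tarB - tarB.min()) / (tarB.max() - tarB.min())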

Model

A model with one image input, two target inputs (consumed only by the loss), and three outputs.

input  = Input(shape=(28, 28, 3))
middle = Conv2D(16, kernel_size=(3,3), padding='same')(input)

outputA = Dense(3, activation='relu')(middle)
outputB = Dense(3, activation='selu')(middle)
outputC = Dense(3, activation='elu')(middle)

# Extra inputs that carry the ground-truth tensors; they are consumed only
# by the loss added below, not by any layer.
target_inputA = Input(shape=(28, 28, 3))
target_inputB = Input(shape=(28, 28, 3))

model = Model([input, target_inputA, target_inputB], 
              [outputA, outputB, outputC])

# Attach the combined objective as a model-level loss; compile() then uses loss=None.
model.add_loss(MyLoss(target_inputA, target_inputB, 
                      outputA, outputB, outputC))

# tf.keras.utils.plot_model(model) # disable eager mode 
model.summary()

Compile and Run

model.compile(optimizer='adam', loss=None)
model.fit([imgA, tarA, tarB], steps_per_epoch=5)

5/5 [==============================] - 2s 20ms/step - loss: 1.4338
<tensorflow.python.keras.callbacks.History at 0x7efde188d450>
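
Note that the trained model above still expects the two target inputs at call time. As a follow-up (my addition, not part of the original answer), you can build a separate inference model from the same layers that takes only the image input:

# Reuse the trained graph for prediction without the target inputs.
inference_model = Model(input, [outputA, outputB, outputC])
predA, predB, predC = inference_model.predict(imgA)
print(predA.shape, predB.shape, predC.shape)  # (10, 28, 28, 3) each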
