Logits issue when using DenseNet121 in TensorFlow 2.4

Problem description

I was trying to adapt the transfer learning and fine-tuning example (my Colab with GPU) to my own problem. The error does not happen when I use softmax in the last Dense layer, but with 'from_logits=True' it does. My images are JPGs, split into folders by class:

InvalidArgumentError                      Traceback (most recent call last)

<ipython-input-85-7ea61d5df8ec> in <module>()
----> 1 loss0, accuracy0, auc0, precision0, recall0  = model.evaluate(val_ds)

5 frames

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/training.py in evaluate(self, x, y, batch_size, verbose, sample_weight, steps, callbacks, max_queue_size, workers, use_multiprocessing, return_dict)
   1387             with trace.Trace('test', step_num=step, _r=1):
   1388               callbacks.on_test_batch_begin(step)
-> 1389               tmp_logs = self.test_function(iterator)
   1390               if data_handler.should_sync:
   1391                 context.async_wait()

/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/def_function.py in __call__(self, *args, **kwds)
    826     tracing_count = self.experimental_get_tracing_count()
    827     with trace.Trace(self._name) as tm:
--> 828       result = self._call(*args, **kwds)
    829       compiler = "xla" if self._experimental_compile else "nonXla"
    830       new_tracing_count = self.experimental_get_tracing_count()

/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/def_function.py in _call(self, *args, **kwds)
    893       # If we did not create any variables the trace we have is good enough.
    894       return self._concrete_stateful_fn._call_flat(
--> 895           filtered_flat_args, self._concrete_stateful_fn.captured_inputs)  # pylint: disable=protected-access
    896 
    897     def fn_with_cond(inner_args, inner_kwds, inner_filtered_flat_args):

/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/function.py in _call_flat(self, args, captured_inputs, cancellation_manager)
   1917       # No tape is watching; skip to running the function.
   1918       return self._build_call_outputs(self._inference_function.call(
-> 1919           ctx, args, cancellation_manager=cancellation_manager))
   1920     forward_backward = self._select_forward_and_backward_functions(
   1921         args,

/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/function.py in call(self, ctx, args, cancellation_manager)
    558               inputs=args,
    559               attrs=attrs,
--> 560               ctx=ctx)
    561         else:
    562           outputs = execute.execute_with_cancellation(

/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/execute.py in quick_execute(op_name, num_outputs, inputs, attrs, ctx, name)
     58     ctx.ensure_initialized()
     59     tensors = pywrap_tfe.TFE_Py_Execute(ctx._handle, device_name, op_name,
---> 60                                         inputs, attrs, num_outputs)
     61   except core._NotOkStatusException as e:
     62     if name is not None:

InvalidArgumentError: 2 root error(s) found.
  (0) Invalid argument:  assertion failed: [predictions must be >= 0] [Condition x >= y did not hold element-wise:] [x (Dense121/dense_1/BiasAdd:0) = ] [[0.853173912 1.97515857 0.608713508...]...] [y (Cast_4/x:0) = ] [0]
     [[{{node assert_greater_equal/Assert/AssertGuard/else/_1/assert_greater_equal/Assert/AssertGuard/Assert}}]]
  (1) Invalid argument:  assertion failed: [predictions must be >= 0] [Condition x >= y did not hold element-wise:] [x (Dense121/dense_1/BiasAdd:0) = ] [[0.853173912 1.97515857 0.608713508...]...] [y (Cast_4/x:0) = ] [0]
     [[{{node assert_greater_equal/Assert/AssertGuard/else/_1/assert_greater_equal/Assert/AssertGuard/Assert}}]]
     [[assert_greater_equal_2/Assert/AssertGuard/branch_executed/_65/_167]]
0 successful operations.
0 derived errors ignored. [Op:__inference_test_function_61870]

Function call stack:
test_function -> test_function
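The assertion in the traceback is raised by the confusion-matrix based metrics (AUC, Precision, Recall), which require every prediction to lie in [0, 1]. A minimal standalone snippet (not from the original notebook; the literal values are made up) that trips the same check:

```python
import tensorflow as tf

# AUC/Precision/Recall threshold their inputs as probabilities, so they
# assert that every prediction lies in [0, 1].
auc = tf.keras.metrics.AUC()
y_true = tf.constant([0.0, 1.0, 1.0])
# Raw logits from a Dense layer with no activation can be negative or > 1.
logits = tf.constant([-1.2, 0.85, 1.97])
# Raises InvalidArgumentError: "predictions must be >= 0".
auc.update_state(y_true, logits)
```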

I have tried several things to fix this, but nothing has worked so far.

Tags: tensorflow, google-colaboratory

Solution
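`from_logits=True` only tells the loss function that the model's output is raw logits; the metrics passed to `compile` (AUC, Precision, Recall) still receive that raw output directly, and they require values in [0, 1]. That is why evaluation works with a softmax in the last Dense layer but fails with logits. A straightforward fix in TensorFlow 2.4 is to keep the softmax (or a sigmoid, for binary/multi-label problems) in the model and use `from_logits=False` in the loss. A minimal sketch under those assumptions (`num_classes`, the input size, and the variable names are placeholders, not the original notebook's code):

```python
import tensorflow as tf

num_classes = 3                      # placeholder: number of class folders
IMG_SIZE = (224, 224)                # placeholder input resolution

base_model = tf.keras.applications.DenseNet121(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base_model.trainable = False         # frozen for the transfer-learning stage

inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = tf.keras.applications.densenet.preprocess_input(inputs)
x = base_model(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
# Keep the softmax in the model so every metric receives probabilities.
outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)

model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-3),
    # The model now outputs probabilities, so from_logits must be False.
    # Use SparseCategoricalCrossentropy if the dataset yields integer labels.
    loss=tf.keras.losses.CategoricalCrossentropy(from_logits=False),
    metrics=["accuracy",
             tf.keras.metrics.AUC(name="auc"),
             tf.keras.metrics.Precision(name="precision"),
             tf.keras.metrics.Recall(name="recall")])
```

If the logits output is kept for the loss, the metrics need probabilities some other way: newer TensorFlow releases let `tf.keras.metrics.AUC` take a `from_logits` argument, but `Precision` and `Recall` do not, so on TF 2.4 the simplest option is the in-model activation shown above.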

