Print all terms of a loss function in TensorFlow 2.0
I am defining a custom loss function, for example loss function = L1 loss + L2 loss. When I call model.fit_generator(), the overall loss is printed after each batch, but I want to see the individual values of the L1 loss and the L2 loss. How can I do that? I want to know the value of each term to understand their relative magnitudes.

tf.print(l1_loss, output_stream=sys.stdout) throws an exception: tensorflow.python.eager.core._FallbackException: This function does not handle the case of the path where all inputs are not already EagerTensors.

Even tf.print('---') only prints --- once at the beginning, not on every batch, and tf.keras.backend.print_tensor(l1_loss) prints nothing at all.
Without seeing your code, my guess is that you have not decorated your custom loss function with the @tf.function decorator.
import numpy as np
import tensorflow as tf

@tf.function  # <-- Be sure to use this decorator.
def custom_loss(y_true, y_pred):
    loss = tf.reduce_mean(tf.math.abs(y_pred - y_true))
    tf.print(loss)  # <-- Use tf.print() instead of print(). You can print any TF tensor in this function this way, not just 'loss'.
    return loss

model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(1, input_shape=[8]))
model.compile(loss=custom_loss, optimizer="sgd")

x_data = tf.data.Dataset.from_tensor_slices([np.ones(8)] * 100)
y_data = tf.data.Dataset.from_tensor_slices([np.ones(1)] * 100)
data = tf.data.Dataset.zip((x_data, y_data)).batch(2)

model.fit_generator(data, steps_per_epoch=10, epochs=2)
The output looks like the following; the printed numbers are the per-batch loss values.
Epoch 1/2
0.415590227
 1/10 [==>...........................] - ETA: 0s - loss: 0.4156
0.325590253
0.235590339
0.145590425
0.0555904508
0.034409523
0.0555904508
0.034409523
0.0555904508
0.034409523
10/10 [==============================] - 0s 11ms/step - loss: 0.1392
Epoch 2/2
0.0555904508
 1/10 [==>...........................] - ETA: 0s - loss: 0.0556
0.034409523
0.0555904508
0.034409523
0.0555904508
0.034409523
0.0555904508
0.034409523
0.0555904508
0.034409523
10/10 [==============================] - 0s 498us/step - loss: 0.0450
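Applied to the original question's composite loss, the same pattern lets you print each term with a label on every batch. The definitions below are a minimal sketch, assuming an unweighted sum of mean-absolute (L1) and mean-squared (L2) error:

```python
import tensorflow as tf

@tf.function
def composite_loss(y_true, y_pred):
    # The individual terms of the composite loss.
    l1_loss = tf.reduce_mean(tf.math.abs(y_pred - y_true))
    l2_loss = tf.reduce_mean(tf.math.square(y_pred - y_true))
    # tf.print() executes on every batch, even inside a compiled
    # tf.function, so both terms appear alongside Keras's progress output.
    tf.print("L1:", l1_loss, "L2:", l2_loss)
    return l1_loss + l2_loss

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=[8])])
model.compile(loss=composite_loss, optimizer="sgd")
```

Seeing both labeled values each batch makes it easy to judge the relative scale of the two terms and decide whether one of them needs a weighting factor.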