Keras: use Tensorboard with train_on_batch()
For the Keras functions fit() and fit_generator(), TensorBoard visualization can be enabled by passing a keras.callbacks.TensorBoard object to the function. For the train_on_batch() function, there is apparently no callback parameter available. Are there other options in Keras to create a TensorBoard in this case?
I think the only option at the moment is to use TensorFlow code directly. I found a way to create the TensorBoard log manually. A code example using Keras train_on_batch() could then look like this:
# before training: init the writer (for the TensorBoard log) and the model
writer = tf.summary.FileWriter(...)
model = ...

# train the model; train_on_batch returns the scalar loss
loss = model.train_on_batch(...)

# wrap the loss in a summary and write it to the log
summary = tf.Summary(value=[tf.Summary.Value(tag="loss",
                                             simple_value=loss)])
writer.add_summary(summary)
Note: for this example in TensorBoard you have to select Horizontal Axis "RELATIVE", because no step is passed with the summary.
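The "RELATIVE" workaround can be avoided by passing an explicit step with each summary. A minimal sketch, assuming the TF2 tf.summary API (the log directory and loop bounds are illustrative, and the loss value stands in for model.train_on_batch()):

```python
import tensorflow as tf

# Create a summary writer (TF2 API); one event file per run directory.
writer = tf.summary.create_file_writer("/tmp/tob_logs")

with writer.as_default():
    for step in range(3):
        loss = 1.0 / (step + 1)  # stand-in for model.train_on_batch(...)
        # Passing `step` lets TensorBoard use the normal "STEP" axis.
        tf.summary.scalar("loss", loss, step=step)

writer.flush()
```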
A possible way is to create the TensorBoard callback and drive it manually:
# This example shows how to use the keras TensorBoard callback
# with model.train_on_batch

import tensorflow.keras as keras

# Setup the model
model = keras.models.Sequential()
model.add(...)      # Add your layers
model.compile(...)  # Compile as usual

batch_size = 256

# Create the TensorBoard callback,
# which we will drive manually
tensorboard = keras.callbacks.TensorBoard(
    log_dir='/tmp/my_tf_logs',
    histogram_freq=0,
    batch_size=batch_size,
    write_graph=True,
    write_grads=True
)
tensorboard.set_model(model)

# Transform the train_on_batch return value
# into the dict expected by the on_epoch_end callback
def named_logs(model, logs):
    result = {}
    for l in zip(model.metrics_names, logs):
        result[l[0]] = l[1]
    return result

# Run training batches, notify TensorBoard after each batch
for batch_id in range(1000):
    x_train, y_train = create_training_data(batch_size)
    logs = model.train_on_batch(x_train, y_train)
    tensorboard.on_epoch_end(batch_id, named_logs(model, logs))

tensorboard.on_train_end(None)
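The named_logs helper simply pairs each entry of model.metrics_names with the matching value returned by train_on_batch. A quick pure-Python check of what it produces (the model is stubbed out and the metric values are made up):

```python
# Stand-in for a compiled Keras model: only metrics_names is needed here.
class FakeModel:
    metrics_names = ["loss", "acc"]

def named_logs(model, logs):
    # Equivalent to the loop above: zip names with values into a dict.
    return dict(zip(model.metrics_names, logs))

print(named_logs(FakeModel(), [0.25, 0.9]))  # → {'loss': 0.25, 'acc': 0.9}
```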