==Custom Usage==
If you're using a custom training loop (i.e. writing your own training step with <code>tf.GradientTape</code>), you'll need to set up TensorBoard logging manually.
First, create a <code>SummaryWriter</code>:
<syntaxhighlight lang="python">
import os

import tensorflow as tf

# args.checkpoint_dir comes from your argument parser; keeping the logs under it
# groups checkpoints and TensorBoard data for a run in one place.
train_log_dir = os.path.join(args.checkpoint_dir, "logs", "train")
train_summary_writer = tf.summary.create_file_writer(train_log_dir)
</syntaxhighlight>
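The snippets below also assume a checkpoint object that tracks the global training step. A minimal sketch of that assumed setup (not part of the original example):
<syntaxhighlight lang="python">
# Assumed setup: a tf.train.Checkpoint whose `step` variable is advanced once per batch,
# so summaries and saved checkpoints share the same step counter.
ckpt = tf.train.Checkpoint(step=tf.Variable(0, dtype=tf.int64))
</syntaxhighlight>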


===Scalars===
 
Add scalars using <code>tf.summary.scalar</code>:
<syntaxhighlight lang="python">
# m_loss is the loss tensor from the current training step;
# ckpt.step is the checkpoint's step counter, used as the x-axis in TensorBoard.
with train_summary_writer.as_default():
  tf.summary.scalar("training_loss", m_loss.numpy(), step=int(ckpt.step))
</syntaxhighlight>
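Putting it together, here is a minimal sketch of a custom training loop that logs the loss every step. <code>model</code>, <code>optimizer</code>, <code>loss_fn</code>, and <code>dataset</code> are placeholders for your own objects and are not defined in this article:
<syntaxhighlight lang="python">
for x_batch, y_batch in dataset:
    # Forward and backward pass under a gradient tape.
    with tf.GradientTape() as tape:
        predictions = model(x_batch, training=True)
        m_loss = loss_fn(y_batch, predictions)
    grads = tape.gradient(m_loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))

    # Advance the step counter and write the scalar summary.
    ckpt.step.assign_add(1)
    with train_summary_writer.as_default():
        tf.summary.scalar("training_loss", m_loss.numpy(), step=int(ckpt.step))
</syntaxhighlight>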


==Resources==
* [https://www.tensorflow.org/tensorboard/get_started Getting started with TensorBoard]