TensorBoard
Revision as of 13:14, 18 June 2020
Custom Usage
If you're using a custom training loop (e.g. with tf.GradientTape), then you'll need to set everything up manually.
First, create a SummaryWriter:
import os
import tensorflow as tf

train_log_dir = os.path.join(args.checkpoint_dir, "logs", "train")
train_summary_writer = tf.summary.create_file_writer(train_log_dir)
Scalars
Add scalars using tf.summary.scalar:
with train_summary_writer.as_default():
    tf.summary.scalar("training_loss", m_loss.numpy(), step=int(ckpt.step))
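Putting the pieces together, here is a minimal end-to-end sketch of a gradient-tape loop that logs a scalar loss each step. The toy model, data, and the temporary directory standing in for args.checkpoint_dir are illustrative assumptions, not from the original; the TensorBoard calls (tf.summary.create_file_writer, tf.summary.scalar) are the ones described above.

```python
import os
import tempfile

import tensorflow as tf

# Stand-in for args.checkpoint_dir (assumption for this sketch)
log_dir = os.path.join(tempfile.mkdtemp(), "logs", "train")
train_summary_writer = tf.summary.create_file_writer(log_dir)

# Toy model and data: fit a single Dense layer to y = 2x
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)
x = tf.constant([[1.0], [2.0], [3.0]])
y = 2.0 * x

for step in range(5):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(model(x) - y))
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))

    # Log the loss for this step so it shows up in TensorBoard
    with train_summary_writer.as_default():
        tf.summary.scalar("training_loss", loss, step=step)

train_summary_writer.flush()
```

After running, point TensorBoard at the log directory (`tensorboard --logdir <log_dir>`) and the training_loss curve appears under the Scalars tab. Event files are written to disk as `events.out.tfevents.*` inside the log directory.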