PyTorch: Difference between revisions

43 bytes added, 29 July 2020
Line 62:
writer = SummaryWriter(log_dir="./runs")


- writer.add_scalar("train_loss", loss_np, step)
+ # Calculate loss. Increment the step.
+ writer.add_scalar("train_loss", loss.item(), step)


# Optionally flush e.g. at checkpoints
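
For context, the surrounding training loop might look like the sketch below. The model, optimizer, data, and flush interval are illustrative assumptions, not taken from the article; only the SummaryWriter calls mirror the revised snippet.

import torch
from torch import nn
from torch.utils.tensorboard import SummaryWriter

# Hypothetical model, optimizer, and loss, purely for illustration.
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.MSELoss()

writer = SummaryWriter(log_dir="./runs")

step = 0
for epoch in range(5):
    for _ in range(100):  # stand-in for iterating over a DataLoader
        inputs = torch.randn(32, 10)
        targets = torch.randn(32, 1)

        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()

        # Calculate loss. Increment the step.
        writer.add_scalar("train_loss", loss.item(), step)
        step += 1

    # Optionally flush e.g. at checkpoints
    writer.flush()

writer.close()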