}}


===Batch Normalization===
Batch normalization creates several variables (beta, gamma, moving mean, and moving variance), and not all of them are stored in the trainable variables collection. Thus, you must explicitly list them when constructing your <code>tf.compat.v1.train.Saver</code>, or use <code>var_list=tf.compat.v1.global_variables()</code>.
See [https://www.tensorflow.org/api_docs/python/tf/compat/v1/layers/batch_normalization <code>tf.compat.v1.layers.batch_normalization</code>].
When training with batch normalization, you need to run the ops in the <code>tf.GraphKeys.UPDATE_OPS</code> collection in your session, or the moving mean and variance will not be updated. These variables do not contribute to the loss during training, so the optimizer will not update them on its own.


See [https://stackoverflow.com/questions/54186376/saving-tensorflow-model-that-uses-batchnorm SO: Saving TF model w/ batchnorm]
<syntaxhighlight lang="python">
# Collect the batchnorm update ops and group them with the train op,
# so one session.run() call performs both.
update_ops = tf.compat.v1.get_collection(tf.compat.v1.GraphKeys.UPDATE_OPS)
train_op = optimizer.minimize(loss)
train_op = tf.group([train_op] + update_ops)
</syntaxhighlight>
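To illustrate the saving point above, here is a minimal sketch (assuming TF2 with the v1 compat API and eager execution disabled; the layer shapes are arbitrary). A batchnorm layer creates four variables, but only beta and gamma are trainable, so a saver built from trainable variables alone would silently drop the moving statistics:

<syntaxhighlight lang="python">
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # assumption: running TF1-style graphs under TF2

x = tf.compat.v1.placeholder(tf.float32, [None, 4])
is_training = tf.compat.v1.placeholder(tf.bool)
y = tf.compat.v1.layers.batch_normalization(x, training=is_training)

# beta and gamma are trainable; moving_mean and moving_variance are not,
# so global_variables() is a strict superset of trainable_variables().
saver = tf.compat.v1.train.Saver(var_list=tf.compat.v1.global_variables())
</syntaxhighlight>

Saving with <code>var_list=tf.compat.v1.global_variables()</code> ensures the moving mean and variance are written to the checkpoint along with the trainable weights.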


==Estimators==