Commit b093728

Update README.md

1 parent 32b9c86 commit b093728

File tree

1 file changed: +0 -7 lines changed

README.md

Lines changed: 0 additions & 7 deletions
```diff
@@ -6,11 +6,4 @@ Run `python mnist_bn.py --phase=test` to test.
 
 It should achieve an accuracy of ~99.3% or higher on the test set.
 
-
-**The keys to using batch normalization in `slim`** are:
-
-1. **Set a proper decay rate for the BN layer.** Because a BN layer uses an EMA (exponential moving average) to approximate the population mean/variance, it takes some time to warm up, i.e. to get the EMA close to the real population mean/variance. The default decay rate is 0.999, which is rather high for our cute little MNIST dataset and needs ~1000 steps to get a good estimate. In my code, `decay` is set to 0.95, and it then learns the population statistics very quickly. However, a large value of `decay` does have its own advantage: it gathers information from more mini-batches and is therefore more stable.
-
-2. **Use `slim.learning.create_train_op` to create the train op instead of `tf.train.GradientDescentOptimizer(0.1).minimize(loss)` or the like!**
-
 I've added accuracy, cross_entropy and batch normalization parameters to the summaries. Use **tensorboard --logdir=/log** to explore the learning curve and parameter distributions!
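```

For reference, point 1 of the deleted advice in code form: a minimal sketch of setting the BN decay with `tf.contrib.slim` (TF 1.x). The layer shapes and scope names here are illustrative assumptions, not the actual `mnist_bn.py` code.

```python
import tensorflow as tf

slim = tf.contrib.slim

def model(images, is_training):
    # decay=0.95 lets the EMA of the population mean/variance warm up in
    # far fewer steps than the 0.999 default -- fine for a small dataset
    # like MNIST, at the cost of averaging over fewer mini-batches.
    batch_norm_params = {'decay': 0.95, 'is_training': is_training}
    with slim.arg_scope([slim.conv2d],
                        normalizer_fn=slim.batch_norm,
                        normalizer_params=batch_norm_params):
        net = slim.conv2d(images, 32, [5, 5], scope='conv1')
        net = slim.max_pool2d(net, [2, 2], scope='pool1')
        net = slim.flatten(net)
        logits = slim.fully_connected(net, 10, activation_fn=None,
                                      scope='logits')
    return logits
```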

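And point 2, the train op: `slim.learning.create_train_op` attaches the ops in `tf.GraphKeys.UPDATE_OPS` (where `slim.batch_norm` registers its moving-average updates) to the training step, whereas a bare `minimize()` silently skips them and the EMA never moves. Continuing the sketch above, assuming `loss` is the model's cross-entropy tensor:

```python
optimizer = tf.train.GradientDescentOptimizer(0.1)

# Preferred: create_train_op runs the UPDATE_OPS collection (the BN
# moving-average updates) as part of each training step.
train_op = slim.learning.create_train_op(loss, optimizer)

# Equivalent by hand, if you are not using slim.learning:
update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
with tf.control_dependencies(update_ops):
    train_op_manual = optimizer.minimize(loss)
```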
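The summaries mentioned in the kept README line could be wired up along these lines (TF 1.x `tf.summary` API; `accuracy` and `cross_entropy` are assumed tensor names, not verified against the repo):

```python
tf.summary.scalar('accuracy', accuracy)
tf.summary.scalar('cross_entropy', cross_entropy)
# Histogram every model variable, including the BN beta/gamma and the
# moving mean/variance, so their distributions show up in TensorBoard.
for var in slim.get_model_variables():
    tf.summary.histogram(var.op.name, var)

merged = tf.summary.merge_all()
writer = tf.summary.FileWriter('/log', tf.get_default_graph())
# Inspect with:  tensorboard --logdir=/log
```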