Training examples with __reproducible performance__.
__The word "reproduce" should always mean reproduce performance__.
With the magic of SGD, wrong deep learning code often appears to work, especially if you try it on toy datasets.
GitHub is full of such deep learning code that "implements" but does not "reproduce" methods.
See [Unawareness of Deep Learning Mistakes](https://medium.com/@ppwwyyxx/unawareness-of-deep-learning-mistakes-d5b5774da0ba).
We refuse toy examples.
Instead of showing you 10 arbitrary networks trained on toy datasets with random final performance,
tensorpack examples try to faithfully replicate experiments and performance in the paper,
so you're confident that they are correct.
...
These are all the toy examples in tensorpack. They are supposed to be just demos.
+ [An illustrative MNIST example with explanation of the framework](basics/mnist-convnet.py) (see the sketch after this list)
+ Tensorpack supports any symbolic library. See the same MNIST example written with [tf.layers](basics/mnist-tflayers.py), [tf-slim](basics/mnist-tfslim.py), and [with weights visualizations](basics/mnist-visualizations.py)
+ A tiny [Cifar ConvNet](basics/cifar-convnet.py) and [SVHN ConvNet](basics/svhn-digit-convnet.py)
+ If you've used Keras, check out [Keras+Tensorpack examples](keras)
+ [A boilerplate file to start with, for your own tasks](boilerplate.py)
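
To give a feel for the structure these demos share, here is a condensed, untested sketch of a tensorpack training script in the style of the MNIST example. The API names (`ModelDesc`, `TrainConfig`, `SimpleTrainer`, `launch_train_with_config`) follow recent tensorpack releases; the linked `basics/mnist-convnet.py` is the complete, explained version:

```python
# Condensed sketch only -- see basics/mnist-convnet.py for the real example.
import tensorflow as tf
from tensorpack import ModelDesc, SimpleTrainer, TrainConfig, launch_train_with_config
from tensorpack.dataflow import BatchData, dataset


class Model(ModelDesc):
    def inputs(self):
        # Declare the input signature of the training graph.
        return [tf.TensorSpec((None, 28, 28), tf.float32, 'input'),
                tf.TensorSpec((None,), tf.int32, 'label')]

    def build_graph(self, image, label):
        # Build the symbolic graph and return the cost to minimize.
        logits = tf.layers.dense(tf.layers.flatten(image), 10)
        return tf.losses.sparse_softmax_cross_entropy(label, logits)

    def optimizer(self):
        return tf.train.AdamOptimizer(1e-3)


if __name__ == '__main__':
    # dataset.Mnist is a DataFlow; BatchData groups datapoints into batches.
    df = BatchData(dataset.Mnist('train'), 128)
    launch_train_with_config(
        TrainConfig(model=Model(), dataflow=df, max_epoch=10),
        SimpleTrainer())
```

The real example adds callbacks, summaries, and validation on top of this skeleton; the sketch only shows the moving parts.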
To train, first decompress ImageNet data into [this structure](http://tensorpack.readthedocs.io/en/latest/modules/dataflow.dataset.html#tensorpack.dataflow.dataset.ILSVRC12), then:
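
A typical invocation might look like the following. The script name and flags here are assumptions for illustration; check the example's own `--help` for the real options:

```bash
# Hypothetical command -- script name and flags are assumptions;
# adjust the data path and GPU list for your setup.
./imagenet-resnet.py --data /path/to/ILSVRC12 --gpu 0,1,2,3 -d 50
```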