These are the only toy examples in tensorpack. They are supposed to be just demos.
+ [An illustrative MNIST example with explanation of the framework](basics/mnist-convnet.py)
+ Tensorpack supports any symbolic library. See the same MNIST example written with [tf.layers](basics/mnist-tflayers.py), and [with weights visualizations](basics/mnist-visualizations.py)
+ A tiny [Cifar ConvNet](basics/cifar-convnet.py) and [SVHN ConvNet](basics/svhn-digit-convnet.py)
+ If you've used Keras, check out [Keras+Tensorpack examples](keras)
+[A boilerplate file to start with, for your own tasks](boilerplate.py)
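The boilerplate file fills in tensorpack's usual model skeleton. The sketch below shows the shape of that pattern in plain Python — the method names (`inputs`, `build_graph`, `optimizer`) follow tensorpack's `ModelDesc` API, but the base class here is a stand-in stub so the structure is visible without TensorFlow installed:

```python
# Plain-Python sketch of the ModelDesc pattern boilerplate.py is built on.
# `ModelDesc` here is a stand-in stub; in real tensorpack it comes from
# `from tensorpack import ModelDesc` and these methods build TF graphs.

class ModelDesc:          # stand-in stub for tensorpack.ModelDesc
    pass

class Model(ModelDesc):
    def inputs(self):
        # Declare the model's inputs as (shape, dtype, name) triples;
        # in real tensorpack these become symbolic input tensors.
        return [((None, 28, 28), "float32", "input"),
                ((None,), "int32", "label")]

    def build_graph(self, image, label):
        # Build the symbolic forward pass and return the cost to
        # minimize; a string placeholder stands in for the tensor here.
        return "total_cost"

    def optimizer(self):
        # Return the optimizer the trainer should use.
        return "tf.train.GradientDescentOptimizer"

model = Model()
print([name for _, _, name in model.inputs()])  # ['input', 'label']
```

A trainer then wires these three pieces together with a dataflow and callbacks; see the linked examples for the real, runnable versions.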
baseline and they actually cannot beat this standard ResNet recipe.
To reproduce training or evaluation in the above table,
first decompress ImageNet data into [this structure](http://tensorpack.readthedocs.io/modules/dataflow.dataset.html#tensorpack.dataflow.dataset.ILSVRC12), then:
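Before launching a long training run it is worth sanity-checking the decompressed tree. A small stdlib sketch (the layout assumed here — `train/`, `val/`, `test/` with `n`-prefixed class folders under `train/` — is my reading of the linked `ILSVRC12` page; verify against it):

```python
from pathlib import Path

def check_ilsvrc12_layout(root):
    """Rough sanity check of an ImageNet tree: the ILSVRC12 dataflow
    expects train/, val/ and test/ under `root`, with per-class
    n######## folders of JPEGs under train/ (layout assumed from the
    linked documentation)."""
    root = Path(root)
    problems = []
    for split in ("train", "val", "test"):
        if not (root / split).is_dir():
            problems.append(f"missing directory: {split}/")
    train = root / "train"
    if train.is_dir():
        classes = [d for d in train.iterdir()
                   if d.is_dir() and d.name.startswith("n")]
        if len(classes) != 1000:
            problems.append(f"expected 1000 class folders under train/, "
                            f"found {len(classes)}")
    return problems

# Example on a deliberately incomplete toy tree:
import tempfile
tmp = Path(tempfile.mkdtemp())
(tmp / "train" / "n01440764").mkdir(parents=True)
(tmp / "val").mkdir()
issues = check_ilsvrc12_layout(tmp)
print(issues)  # missing test/, and far fewer than 1000 class folders
```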
Reproduce exactly the same setting of the [tensorpack ResNet example](../ResNet) on ImageNet.
It has:
+ ResNet-50 model modified from [keras.applications](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/keras/_impl/keras/applications/resnet50.py).
(We put the stride on the 3x3 conv in each bottleneck, which differs from some other implementations.)
+ Multi-GPU data-parallel __training and validation__ which scales:
  + Finished 100 epochs in 19 hours on 8 V100s, with >90% GPU utilization.
  + Still slightly slower than native tensorpack examples.
+ Good accuracy (same as [tensorpack ResNet example](../ResNet))
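The stride-placement note above can be made concrete with ordinary conv-shape arithmetic: a strided 3x3 conv still sees a full 3x3 neighbourhood at every output position, while a strided 1x1 conv simply discards three quarters of the positions before the 3x3 conv runs. A sketch using the standard output-size formula (the 56→28 bottleneck is an illustrative choice, not code from this example):

```python
def conv_out(size, kernel, stride, pad):
    """Standard conv output-size formula:
    floor((size + 2*pad - kernel) / stride) + 1."""
    return (size + 2 * pad - kernel) // stride + 1

# A bottleneck halving a 56x56 feature map.
# Stride on the 3x3 conv (this example's choice): every retained
# position still aggregates a 3x3 input neighbourhood.
s = conv_out(56, kernel=1, stride=1, pad=0)   # 1x1 reduce  -> 56
s = conv_out(s, kernel=3, stride=2, pad=1)    # strided 3x3 -> 28
s = conv_out(s, kernel=1, stride=1, pad=0)    # 1x1 expand  -> 28
print(s)  # 28

# Stride on the first 1x1 conv instead (the variant some other
# implementations use): the strided 1x1 reads only every other pixel,
# so the 3x3 conv never sees the dropped ones.
t = conv_out(56, kernel=1, stride=2, pad=0)   # strided 1x1 -> 28
t = conv_out(t, kernel=3, stride=1, pad=1)    # 3x3         -> 28
t = conv_out(t, kernel=1, stride=1, pad=0)    # 1x1 expand  -> 28
print(t)  # 28
```

Both orderings give the same output shape; the difference is only in which input pixels contribute to it.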
### Note:
Keras does not respect variable scopes or variable collections, which conflicts with tensorpack trainers.
Therefore Keras support is __experimental__ and __unofficial__.
These simple examples run smoothly within tensorpack, but note that a more complicated model or a future version of Keras may break them.