Commit d814317c authored by Yuxin Wu

update readme

parent 095c1cd9
@@ -3,7 +3,7 @@ Neural Network Toolbox on TensorFlow
Still in development, but usable.
See some interesting [examples](examples) to learn about the framework:
+ [DoReFa-Net: training binary / low bitwidth CNN](examples/DoReFa-Net)
+ [Double-DQN for playing Atari games](examples/Atari2600)
@@ -15,13 +15,19 @@ See some interesting [examples](https://github.com/ppwwyyxx/tensorpack/tree/master/examples) to learn about the framework:
Focus on modularity. You just have to define the following three components to start training:
1. The model, or the graph. Define the graph as well as its inputs and outputs. `models/` has some scoped abstraction of common models.
   `LinearWrap` and `argscope` make large models look simpler (see the first sketch after this list).
2. The data. tensorpack allows and encourages complex data processing.
   + All data producers have a unified `DataFlow` interface, allowing them to be composed to perform complex preprocessing (see the `DataFlow` sketch after this list).
   + Use Python to easily handle your own data format, yet still keep a good training speed thanks to multiprocess prefetch & TF Queue prefetch.
     For example, InceptionV3 can run at the same speed as the official code, which reads data using TF operators.
3. The callbacks, including everything you want to do apart from the training iterations. For example:
   + Change hyperparameters
   + Save models
   + Print some variables of interest
   + Run inference on a test dataset
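For component 1, a model graph built with `LinearWrap` and `argscope` looks roughly like the following sketch (loosely based on the mnist example; `image` is assumed to be an input tensor defined elsewhere, and the layer arguments are illustrative):

```python
import tensorflow as tf
from tensorpack import *  # Conv2D, MaxPooling, FullyConnected, LinearWrap, argscope

# argscope sets default arguments for Conv2D once, instead of repeating them per layer.
with argscope(Conv2D, kernel_shape=3, nl=tf.nn.relu, out_channel=32):
    logits = (LinearWrap(image)      # `image` assumed defined, e.g. a NHWC input tensor
              .Conv2D('conv0')       # 3x3 conv, 32 channels, ReLU (all from argscope)
              .MaxPooling('pool0', 2)
              .Conv2D('conv1')
              .MaxPooling('pool1', 2)
              .FullyConnected('fc0', 512, nl=tf.nn.relu)
              .FullyConnected('fc1', out_dim=10, nl=tf.identity)())  # final () unwraps the tensor
```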
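For component 2, `DataFlow`s compose by wrapping one another, and each stage is itself a `DataFlow`. A minimal sketch (the exact arguments here are illustrative):

```python
from tensorpack import *  # dataset, BatchData, PrefetchData

ds = dataset.Mnist('train')   # a DataFlow yielding [image, label] datapoints
ds = BatchData(ds, 128)       # compose: group datapoints into batches of 128
ds = PrefetchData(ds, 3, 2)   # compose: prefetch batches in 2 separate processes

ds.reset_state()              # initialize internal state (e.g. RNG) before iterating
for dp in ds.get_data():      # DataFlow is a plain Python generator interface
    pass                      # dp is a batched [image, label] datapoint
```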
With the above components defined, tensorpack trainer will run the training iterations for you, as sketched below.
Multi-GPU training is ready to use by simply changing the trainer.
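Concretely, a training script then reduces to roughly this sketch (callback and trainer names follow tensorpack's API, though a few may differ in this exact revision; `Model` stands for your `ModelDesc` subclass and `ds` for the `DataFlow` built above):

```python
import tensorflow as tf
from tensorpack import *

config = TrainConfig(
    dataset=ds,                              # component 2: the DataFlow
    optimizer=tf.train.AdamOptimizer(1e-3),
    callbacks=Callbacks([
        StatPrinter(),                       # print training statistics
        ModelSaver(),                        # save the model every epoch
        InferenceRunner(dataset.Mnist('test'),         # run inference on a test dataset
                        [ScalarStats('cost'), ClassificationError()]),
        ScheduledHyperParamSetter('learning_rate',     # change hyperparameters on a schedule
                                  [(10, 1e-4)]),
    ]),
    model=Model(),                           # component 1: your ModelDesc subclass
    max_epoch=100,
)
QueueInputTrainer(config).train()            # swap the trainer for multi-GPU training,
                                             # e.g. SyncMultiGPUTrainer(config).train()
```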
# tensorpack examples
Only allow examples with reproducible and meaningful performance.
+ [An illustrative mnist example](mnist-convnet.py)
+ [A small Cifar10 ConvNet with 91% accuracy](cifar-convnet.py)
+ [A tiny SVHN ConvNet with 97.5% accuracy](svhn-digit-convnet.py)
+ [Reproduce some reinforcement learning papers](Atari2600)
+ [char-rnn for fun](char-rnn)
+ [DisturbLabel, because I don't believe the paper](DisturbLabel)
+ [DoReFa-Net, binary / low-bitwidth CNN on ImageNet](DoReFa-Net)
+ [GoogleNet-InceptionV1 with 71% accuracy](Inception)
+ [ResNet for Cifar10 with similar accuracy, and for SVHN with state-of-the-art accuracy](ResNet)