Commit cf6567b7 authored by Yuxin Wu

update docs

parent 15f1b1df
@@ -15,24 +15,24 @@ It's Yet Another TF wrapper, but different in:
+ Speed comes for free with tensorpack -- it uses TensorFlow in the __correct way__ with no extra overhead.
On various CNNs, it runs 1.5~1.7x faster than the equivalent Keras code.
-+ Data-parallel multi-GPU training is off-the-shelf to use. It is as fast as Google's [official benchmark](https://www.tensorflow.org/performance/benchmarks).
-  + See [tensorpack/benchmarks](https://github.com/tensorpack/benchmarks) for some benchmark scripts.
++ Data-parallel multi-GPU training is off-the-shelf to use. It runs as fast as Google's [official benchmark](https://www.tensorflow.org/performance/benchmarks).
+  + See [tensorpack/benchmarks](https://github.com/tensorpack/benchmarks) for the benchmark scripts.
2. Focus on __large datasets__.
+ It's painful to read/preprocess data through TF.
Tensorpack helps you load large datasets (e.g. ImageNet) in __pure Python__ with autoparallelization.
3. It's not a model wrapper.
-+ There are already too many symbolic function wrappers.
-  Tensorpack includes only a few common models,
-  but you can use any other wrappers within tensorpack, including sonnet/Keras/slim/tflearn/tensorlayer/....
++ There are too many symbolic function wrappers.
+  Tensorpack includes only a few common models.
+  You can use any symbolic function library inside tensorpack, including tf.layers/Keras/slim/tflearn/tensorlayer/....
See [tutorials](http://tensorpack.readthedocs.io/en/latest/tutorial/index.html) to know more about these features.
## [Examples](examples):
-Instead of showing you 10 different ways to train MNIST to random accuracy,
+Instead of showing you 10 random networks with random accuracy,
[tensorpack examples](examples) faithfully replicate papers and care about performance.
And everything runs on multiple GPUs. Some highlights:
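The "load large datasets in pure Python with autoparallelization" point above can be illustrated with a generic sketch. Note this uses only the Python standard library's `multiprocessing` and is NOT tensorpack's actual DataFlow API; the function and batch sizes here are made up for illustration.

```python
# A minimal sketch of parallel pure-Python data loading, in the spirit of
# the README's claim. NOT tensorpack's DataFlow API -- a generic
# illustration using the standard library only.
from multiprocessing import Pool

def load_and_preprocess(index):
    # Placeholder for expensive per-sample work (decode an image,
    # augment it, ...). Here we just square the index.
    return index * index

def batches(samples, batch_size):
    """Group an iterable of samples into fixed-size batches."""
    batch = []
    for s in samples:
        batch.append(s)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # imap keeps results in order while workers run in parallel.
        samples = pool.imap(load_and_preprocess, range(10))
        for b in batches(samples, 4):
            print(b)
```

The point is only that the loading/preprocessing loop stays plain Python while the heavy per-sample work runs in parallel worker processes.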
@@ -11,11 +11,11 @@ See [Unawareness of Deep Learning Mistakes](https://medium.com/@ppwwyyxx/unaware
## Getting Started:
These examples don't have meaningful performance numbers. They are supposed to be just demos.
-+ [An illustrative MNIST example with explanation of the framework](Basics/mnist-convnet.py)
-+ The same MNIST example written with [tf.layers](Basics/mnist-tflayers.py), [tf-slim](Basics/mnist-tfslim.py), and [with weights visualizations](Basics/mnist-visualizations.py)
-+ A tiny [Cifar ConvNet](Basics/cifar-convnet.py) and [SVHN ConvNet](Basics/svhn-digit-convnet.py)
-+ [A boilerplate file to start with, for your own tasks](boilerplate.py)
++ [An illustrative MNIST example with explanation of the framework](basics/mnist-convnet.py)
++ A tiny [Cifar ConvNet](basics/cifar-convnet.py) and [SVHN ConvNet](basics/svhn-digit-convnet.py)
++ Tensorpack supports any symbolic libraries. See the same MNIST example written with [tf.layers](basics/mnist-tflayers.py), [tf-slim](basics/mnist-tfslim.py), and with [weights visualizations](basics/mnist-visualizations.py)
++ If you've used Keras, check out the [Keras examples](keras)
++ [A boilerplate file to start with, for your own tasks](boilerplate.py)
## Vision:
| Name | Performance |