Commit a759dbb0 authored by Yuxin Wu

update docs

parent a9b89567
......@@ -22,7 +22,7 @@ Feature Requests:
+ You can implement a lot of features by extending tensorpack
(See http://tensorpack.readthedocs.io/en/latest/tutorial/index.html#extend-tensorpack).
It does not have to be added to tensorpack unless you have a good reason.
+ We don't take feature requests for examples.
Usage Questions:
......
......@@ -6,7 +6,8 @@
There are two ways to do inference during training.
1. The easiest way is to write a callback, and use
[self.trainer.get_predictor()](../modules/train.html#tensorpack.train.TowerTrainer.get_predictor)
to get a callable under inference mode.
See [Write a Callback](extend/callback.html).
2. If your inference follows the paradigm of:
......@@ -29,15 +30,16 @@ Please note that, the metagraph saved during training is the training graph.
But sometimes you need a different one for inference.
For example, you may need a different data layout for CPU inference,
or you may need placeholders in the inference graph, or the training graph contains multi-GPU replication
which you want to remove. In fact, directly importing a huge training metagraph is usually not a good idea for deployment.
In this case, you can always construct a new graph by simply:
```python
a, b = tf.placeholder(...), tf.placeholder(...)
# call symbolic functions on a, b
```
The only tool tensorpack has for after-training inference is [OfflinePredictor](../modules/predict.html#tensorpack.predict.OfflinePredictor),
a simple function to build the graph and return a callable for you.
It is mainly for quick demo purposes.
It only runs inference on numpy arrays, and therefore may not be the most efficient way.
Check out examples and docs for its usage.
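The callable returned by `OfflinePredictor` simply maps input arrays to a list of output arrays. As a rough illustration of that pattern, here is a pure-Python sketch (hypothetical names, not tensorpack's actual implementation):

```python
# Hypothetical sketch: a predictor builds its "graph" once at construction,
# then behaves as a plain callable from inputs to a list of outputs.
class TinyPredictor:
    def __init__(self, graph_fn):
        # In tensorpack this step would build the TF graph and restore weights;
        # here graph_fn stands in for the compiled inference function.
        self._graph_fn = graph_fn

    def __call__(self, *inputs):
        # One entry per requested output tensor.
        return self._graph_fn(*inputs)

# Usage: "inference" that just adds its two inputs.
pred = TinyPredictor(lambda a, b: [a + b])
print(pred(2, 3))  # [5]
```

The real predictor is constructed from a `PredictConfig` and feeds the given arrays through a TF session, but the calling convention is the same: positional inputs in, list of outputs back.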
......@@ -4,6 +4,7 @@
# Author: Yuxin Wu <ppwwyyxxc@gmail.com>
from tensorpack import *
from tensorpack.tfutils import get_tf_version_number
from tensorpack.tfutils.summary import add_moving_summary
from tensorpack.tfutils.scope_utils import auto_reuse_variable_scope
import tensorflow as tf
......@@ -83,6 +84,7 @@ class Model(DCGAN.Model):
if __name__ == '__main__':
assert get_tf_version_number() >= 1.4
args = DCGAN.get_args(default_batch=64, default_z_dim=128)
M = Model(shape=args.final_size, batch=args.batch, z_dim=args.z_dim)
if args.sample:
......
......@@ -40,7 +40,7 @@ class PredictConfig(object):
tensors can be any computable tensor in the graph.
return_input (bool): same as in :attr:`PredictorBase.return_input`.
create_graph (bool): create a new graph, or use the default graph
when the predictor is first initialized.
You need to set either `model`, or `inputs_desc` plus `tower_func`.
"""
......