Commit 6ac34dfb authored by Yuxin Wu's avatar Yuxin Wu

add a faq page

parent 3cacaee8
# FAQs
## Does it support data format X / augmentation Y / layer Z?
The library aims to be general, but it cannot include everything.
For the XYZ you need, you can either implement it yourself, or take any existing code and wrap it
with the tensorpack interface. See [Extend Tensorpack](http://tensorpack.readthedocs.io/en/latest/tutorial/index.html#extend-tensorpack)
for more details.
If you think:
1. The framework has a limitation, so your XYZ cannot be supported, OR
2. Your XYZ is very common or very well-defined, so it would be nice to include it,

then it's a good time to open an issue.
## How to dump/inspect a model
When you enable `ModelSaver` as a callback,
trained models will be stored in TensorFlow checkpoint format, which typically includes a
`.data-xxxxx` file and a `.index` file. Both are necessary.
To inspect a checkpoint, the easiest way is `tf.train.NewCheckpointReader`. Please note that it
expects a path without the extension.
You can dump a cleaner version of the model (containing only the model/trainable variables) with
`scripts/dump-model-params.py`, which saves a simple `var-name: value` dict in npy format.
It expects a metagraph file, which is also saved by `ModelSaver`.
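The dumped npy file is just a pickled dict, so it can be inspected with plain numpy. A minimal sketch (the file name and variable names here are made up for illustration; a real dump would come from `scripts/dump-model-params.py`):

```python
import numpy as np

# Stand-in for the script's output: a `var-name: value` dict saved in npy format.
params = {'conv0/W': np.zeros((3, 3, 3, 64), dtype=np.float32),
          'conv0/b': np.zeros((64,), dtype=np.float32)}
np.save('params.npy', params)

# A dict is a Python object, so loading it needs allow_pickle=True.
loaded = np.load('params.npy', allow_pickle=True).item()
for name, value in loaded.items():
    print(name, value.shape)
```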
## How to load a model / do transfer learning
All model loading (in either training or testing) goes through the `session_init` interface
in `TrainConfig` or `PredictConfig`.
It accepts a `SessionInit` instance; the common options are `SaverRestore`, which restores
a TF checkpoint, and `DictRestore`, which restores a dict. `get_model_loader` is a small helper that
decides which one to use from the file name.
Doing transfer learning is painless. Variable restoring is based entirely on name matching between
the current graph and the `SessionInit` initializer.
Therefore, if you want to re-train some layer from scratch, just rename it.
Unmatched variables on both sides are printed as warnings.
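The name-matching rule can be sketched in plain Python (this is an illustration of the behavior, not tensorpack's actual code; the variable names are made up):

```python
import numpy as np

def restore_by_name(graph_vars, saved_vars):
    """Toy illustration of name-based restoring: copy saved values whose
    names exist in the graph, and warn about unmatched names on both sides."""
    restored = {name: saved_vars[name] for name in graph_vars if name in saved_vars}
    for name in set(graph_vars) - set(saved_vars):
        print('WARN: variable %s in graph not found in checkpoint' % name)
    for name in set(saved_vars) - set(graph_vars):
        print('WARN: variable %s in checkpoint not used by graph' % name)
    return restored

# Renaming 'fc/W' to 'fc_new/W' in the graph means it gets no saved value,
# i.e. it keeps its fresh initialization -- so that layer is re-trained.
graph = {'conv0/W': None, 'fc_new/W': None}
saved = {'conv0/W': np.ones((3, 3)), 'fc/W': np.ones((10, 10))}
restored = restore_by_name(graph, saved)
```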
To freeze some variables, there are [different ways](https://github.com/ppwwyyxx/tensorpack/issues/87#issuecomment-270545291)
with pros and cons.
@@ -39,6 +39,7 @@ User Tutorials
   model
   trainer
   callback
+  faq

 Extend Tensorpack
 =================
@@ -22,6 +22,7 @@ assert args.config or args.meta, "Either config or metagraph must be present!"
 with tf.Graph().as_default() as G:
     if args.config:
+        logger.warn("Using a config script is not reliable. Please use metagraph.")
         MODEL = imp.load_source('config_script', args.config).Model
         M = MODEL()
         with TowerContext('', is_training=False):