Commit b419cc81 authored by Yuxin Wu

update docs

parent 99872a7b
@@ -16,11 +16,13 @@ Otherwise, you can post here for:
 Some typical questions that we DO NOT answer:
-+ "Could you improve/implement an example/paper ?" --
-We have no plans to do so. We don't consider feature
-requests for examples or implement a paper for you.
++ "Could you implement a paper / other variants of a paper / additional features of a paper ?"
+-- The answer is: we have no plans to do so.
+We take feature requests on the library, but not on examples.
+We don't implement papers or variants/features of a paper for you,
+unless it can demonstrate some Tensorpack features not yet demonstrated in the existing examples.
 If you don't know how to do something yourself, you may ask a usage question.
 + "The examples do not perform as expected after I change the models/dataset/parameters/etc."
 Tensorpack maintainers make sure the examples perform well without modifications.
...
@@ -8,8 +8,9 @@ about: Suggest an idea for Tensorpack
 (See http://tensorpack.readthedocs.io/tutorial/index.html#extend-tensorpack).
 It does not have to be added to Tensorpack unless you have a good reason.
-+ "Could you implement a paper / other variants of a paper ?"
++ "Could you implement a paper / other variants of a paper / additional features of a paper ?"
 -- The answer is: we have no plans to do so.
-We don't implement papers or variants of a paper for you,
-unless it demonstrates some Tensorpack features not yet demonstrated in the existing examples.
+We take feature requests on the library, but not on examples.
+We don't implement papers or variants/features of a paper for you,
+unless it can demonstrate some Tensorpack features not yet demonstrated in the existing examples.
 If you don't know how to do something yourself, you may ask a usage question.
@@ -11,6 +11,7 @@ This is how TensorFlow summaries eventually get logged/saved/printed:
 1. __What to Log__: Define what you want to log in the graph.
    When you call `tf.summary.xxx` in your graph code, TensorFlow adds an op to
    the `tf.GraphKeys.SUMMARIES` collection (by default).
+   Tensorpack further removes summaries that are not from the first training tower.
 2. __When to Log__: The [MergeAllSummaries](../modules/callbacks.html#tensorpack.callbacks.MergeAllSummaries)
    callback is one of the [default callbacks](../modules/train.html#tensorpack.train.DEFAULT_CALLBACKS).
    It runs ops in the `tf.GraphKeys.SUMMARIES` collection (by default) every epoch (by default),
@@ -25,6 +26,8 @@ This is how TensorFlow summaries eventually get logged/saved/printed:
    saves scalars to a JSON file.

 All the "what, when, where" can be customized in either the graph or with the callbacks/monitors setting.
+You can call `tf.summary.xxx(collections=[...])` to add your custom summaries to a different collection,
+and use the `MergeAllSummaries(key=...)` callback to write them to monitors.

 The design goal of disentangling "what, when, where" is to make components reusable.
 Suppose you have `M` items to log
...
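The "what, when, where" separation described in the hunks above can be illustrated with a toy sketch. This is plain Python standing in for the real TensorFlow/Tensorpack APIs: `scalar_summary`, `merge_all_summaries`, and `PrintMonitor` are illustrative names invented for this sketch, not actual library calls.

```python
# Toy sketch (NOT real TensorFlow) of the pattern described above:
# summaries register into named collections ("what to log"), a merge
# step gathers one collection ("when to log"), and a monitor receives
# the merged values ("where to log").
from collections import defaultdict

SUMMARIES = "summaries"          # stand-in for tf.GraphKeys.SUMMARIES
_collections = defaultdict(list)


def scalar_summary(name, fn, collections=(SUMMARIES,)):
    """Register a value-producing function under one or more collection keys."""
    for key in collections:
        _collections[key].append((name, fn))


def merge_all_summaries(key=SUMMARIES):
    """Evaluate every summary registered under `key` and return the values."""
    return {name: fn() for name, fn in _collections[key]}


class PrintMonitor:
    """A minimal monitor: just prints whatever merged values it receives."""
    def process(self, values):
        for name, value in sorted(values.items()):
            print(f"{name}: {value}")


# Goes to the default collection:
scalar_summary("loss", lambda: 0.25)
# Goes to a custom collection, analogous to tf.summary.xxx(collections=["my_key"]):
scalar_summary("val-accuracy", lambda: 0.9, collections=["my_key"])

monitor = PrintMonitor()
monitor.process(merge_all_summaries())              # only "loss"
monitor.process(merge_all_summaries(key="my_key"))  # only "val-accuracy"
```

Because each collection is merged independently, a second `MergeAllSummaries(key=...)`-style step can run on a different schedule without touching the default summaries, which is the reusability point the doc is making.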