Shashank Suhas / seminar-breakout · Commits

Commit e78e2e1e, authored Aug 09, 2018 by Yuxin Wu

update docs

Parent: 63bdc43b
Showing 2 changed files with 25 additions and 10 deletions (+25 −10)
- docs/tutorial/faq.md (+1 −1)
- docs/tutorial/summary.md (+24 −9)
docs/tutorial/faq.md

```diff
@@ -20,7 +20,7 @@ Then it is a good time to open an issue.
 1. Learn `tf.Print`.
-2. Know [DumpTensors](../modules/callbacks.html#tensorpack.callbacks.DumpTensors[]),
+2. Know [DumpTensors](../modules/callbacks.html#tensorpack.callbacks.DumpTensors),
    [ProcessTensors](../modules/callbacks.html#tensorpack.callbacks.ProcessTensors)
    callbacks. And it's also easy to write your own version of them.
```
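The two callbacks mentioned in the diff above boil down to one idea: fetch extra values every training step and hand them to some consumer. A minimal pure-Python sketch of that pattern follows; the class names and the `trigger_step` hook are loosely modeled on tensorpack's callback interface, but nothing here imports tensorpack or TensorFlow, so treat it as an illustration rather than the real API.

```python
# Sketch of a "dump/process intermediate results" callback, loosely
# modeled on tensorpack's DumpTensors/ProcessTensors callbacks.
# Hypothetical stand-in code: no tensorpack or TensorFlow imported.

class ProcessValues:
    """Fetch named values every step and hand them to a user function."""

    def __init__(self, names, fn):
        self.names = names   # which values to fetch ("what")
        self.fn = fn         # what to do with them ("where")

    def trigger_step(self, run_values):
        # run_values plays the role of a session.run() result dict
        fetched = {n: run_values[n] for n in self.names}
        self.fn(fetched)


class DumpValues(ProcessValues):
    """Specialization that just collects the fetched values."""

    def __init__(self, names):
        self.dumped = []
        super().__init__(names, self.dumped.append)


# Simulated training loop producing per-step values:
dump = DumpValues(["loss"])
for step in range(3):
    dump.trigger_step({"loss": 0.5 / (step + 1), "lr": 0.1})

print(dump.dumped)  # three dicts, each holding only "loss"
```

This also illustrates why "it's easy to write your own version of them": the base class does all the fetching, and a subclass only decides what to do with the values.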
docs/tutorial/summary.md

```diff
@@ -8,7 +8,8 @@ The default logging behavior should be good enough for normal use cases, so you
 This is how TensorFlow summaries eventually get logged/saved/printed:
-1. __What to Log__: When you call `tf.summary.xxx` in your graph code, TensorFlow adds an op to
+1. __What to Log__: Define what you want to log in the graph.
+   When you call `tf.summary.xxx` in your graph code, TensorFlow adds an op to
    `tf.GraphKeys.SUMMARIES` collection (by default).
 2. __When to Log__: [MergeAllSummaries](../modules/callbacks.html#tensorpack.callbacks.MergeAllSummaries)
    callback is in the [default callbacks](../modules/train.html#tensorpack.train.DEFAULT_CALLBACKS).
```
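The "What to Log" step in the hunk above relies on TensorFlow's collection mechanism: each `tf.summary.xxx` call registers a summary op in a graph collection, and a later merge step gathers everything registered. The following is a toy plain-Python stand-in for that registration/merge flow (the function names mimic `tf.summary.scalar` and `tf.summary.merge_all` but are not the TensorFlow API):

```python
# Toy model of the "what to log" step: summary ops accumulate in a
# default collection, and a merge step gathers everything registered.
# Mimics the role of tf.GraphKeys.SUMMARIES / tf.summary.merge_all,
# but is plain Python, not the TensorFlow API.

SUMMARIES = []  # stand-in for the tf.GraphKeys.SUMMARIES collection

def summary_scalar(name, value_fn):
    """Register a named scalar to be logged, by appending it to the
    default collection (the role tf.summary.scalar plays)."""
    SUMMARIES.append((name, value_fn))

def merge_all():
    """Gather every registered summary into one callable: evaluating
    it evaluates all of them (the role tf.summary.merge_all plays)."""
    def merged():
        return {name: fn() for name, fn in SUMMARIES}
    return merged

# "Graph construction": declare what to log.
summary_scalar("loss", lambda: 0.3)
summary_scalar("accuracy", lambda: 0.9)

# "When to log": a callback evaluates the merged op, e.g. every epoch.
run_summaries = merge_all()
print(run_summaries())  # {'loss': 0.3, 'accuracy': 0.9}
```

The key property, which the doc change emphasizes, is that graph code only declares *what* to log; *when* the merged result is evaluated is decided elsewhere, by the callback.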
```diff
@@ -25,8 +26,22 @@ This is how TensorFlow summaries eventually get logged/saved/printed:
 All the "what, when, where" can be customized in either the graph or with the callbacks/monitors setting.
-Since TF summaries are evaluated infrequently (every epoch) by default, if the content is data-dependent, the values
-could have high variance. To address this issue, you can:
+The design goal of disentangling "what, when, where" is to make components reusable.
+Suppose you have `M` items to log (possibly from different places, not necessarily the graph)
+and `N` backends to log your data to; you automatically obtain all `MxN` combinations.
+
+That said, if you only care about logging one specific item (e.g. for
+debugging purposes), you can check out the
+[FAQ](http://tensorpack.readthedocs.io/tutorial/faq.html#how-to-print-dump-intermediate-results-in-training)
+for easier options.
+
+### Noisy TensorFlow Summaries
+Since TF summaries are evaluated infrequently (every epoch) by default,
+if the content is data-dependent, the values could have high variance.
+To address this issue, you can:
 1. Change "When to Log": log more frequently, but note that certain summaries can be expensive to
    log. You may want to use a separate collection for frequent logging.
 2. Change "What to Log": you can call
```
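The `MxN` claim in the added paragraph can be made concrete with a tiny sketch: `M` logged items and `N` backends that never know about each other, wired together by a dispatcher. All names below are invented for illustration; this is not tensorpack's actual monitor API.

```python
# Sketch of the M x N "what/where" disentanglement: M items to log and
# N backends, combined by a dispatcher. Class and method names are
# invented for illustration, not tensorpack's actual monitor API.

class StdoutBackend:
    def put(self, name, value):
        print(f"{name}: {value}")

class MemoryBackend:
    def __init__(self):
        self.history = []
    def put(self, name, value):
        self.history.append((name, value))

class Dispatcher:
    """Send every logged item to every registered backend."""
    def __init__(self, backends):
        self.backends = backends
    def log(self, name, value):
        for b in self.backends:
            b.put(name, value)

mem = MemoryBackend()
d = Dispatcher([StdoutBackend(), mem])

# M = 2 items, N = 2 backends -> all 4 (item, backend) deliveries,
# without any item knowing about any backend.
d.log("loss", 0.25)
d.log("accuracy", 0.9)
print(len(mem.history))  # 2 items reached the memory backend
```

Adding an `M+1`-th item or an `N+1`-th backend requires touching neither the existing items nor the existing backends, which is exactly the reusability argument the doc change makes.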