Shashank Suhas / seminar-breakout · Commits

Commit ab507874, authored Jul 14, 2017 by Yuxin Wu

    Reenter the same name_scope when reusing a variable_scope

Parent: 79f4760a
3 changed files with 8 additions and 3 deletions (+8 -3)
tensorpack/callbacks/summary.py    +2 -1
tensorpack/tfutils/scope_utils.py  +3 -1
tensorpack/tfutils/summary.py      +3 -1
tensorpack/callbacks/summary.py (view file @ ab507874)

...
@@ -27,7 +27,8 @@ class MovingAverageSummary(Callback):
     def _setup_graph(self):
         ops = tf.get_collection(self._collection)
-        logger.info("Maintain moving averages of {} tensors.".format(len(ops)))
+        logger.info("Maintain moving average summary of {} tensors.".format(len(ops)))
         self.ema_op = tf.group(*ops, name='summary_moving_averages')
         self._fetch = tf.train.SessionRunArgs(fetches=self.ema_op)
...
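The callback above gathers every EMA-maintain op registered under a collection key and runs them as one grouped op each step. A pure-Python sketch of that pattern (the dict stands in for `tf.get_collection` and the closure for `tf.group`; all names here are illustrative, not tensorpack's API):

```python
# Minimal stand-in for a graph collection of maintain ops.
collections = {"MOVING_SUMMARY_OPS": []}

def add_maintain_op(fn):
    # Analog of adding an op to a tf collection.
    collections["MOVING_SUMMARY_OPS"].append(fn)

def setup_graph(collection="MOVING_SUMMARY_OPS"):
    ops = collections[collection]
    # Analog of tf.group(*ops): one callable that runs every maintain op.
    def grouped():
        for op in ops:
            op()
    return grouped

state = {"ema": 0.0}
# Register one maintain op: a simple moving-average update toward 1.0.
add_maintain_op(lambda: state.__setitem__("ema", 0.9 * state["ema"] + 0.1 * 1.0))
ema_op = setup_graph()
ema_op()  # run once per training step, as the callback's _trigger would
```

The callback registers this grouped op as a per-step fetch, so every moving average is refreshed by a single run call.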
tensorpack/tfutils/scope_utils.py (view file @ ab507874)

...
@@ -44,8 +44,10 @@ def auto_reuse_variable_scope(func):
         h = hash((tf.get_default_graph(), scope.name))
         # print("Entering " + scope.name + " reuse: " + str(h in used_scope))
         if h in used_scope:
+            ns = scope.original_name_scope
             with tf.variable_scope(scope, reuse=True):
-                return func(*args, **kwargs)
+                with tf.name_scope(ns):
+                    return func(*args, **kwargs)
         else:
             used_scope.add(h)
             return func(*args, **kwargs)
...
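The fix matters because re-entering `tf.variable_scope(scope, reuse=True)` opens a fresh auxiliary name scope (e.g. `scope_1/`), so ops built on reuse get different names than on the first call; re-entering `scope.original_name_scope` keeps op names stable. The caching half of the decorator can be sketched in plain Python (no TensorFlow; the keying mirrors `hash((tf.get_default_graph(), scope.name))`, everything else is illustrative):

```python
import functools

def auto_reuse(func):
    used_scope = set()  # first call per key creates; later calls reuse
    @functools.wraps(func)
    def wrapper(graph, scope_name, *args, **kwargs):
        # Same keying idea as the real decorator: one entry per (graph, scope).
        h = hash((graph, scope_name))
        reuse = h in used_scope
        used_scope.add(h)
        return func(graph, scope_name, reuse, *args, **kwargs)
    return wrapper

@auto_reuse
def build_layer(graph, scope_name, reuse):
    # A real version would open tf.variable_scope(scope, reuse=reuse) here,
    # and (after this commit) re-enter the original name_scope when reusing.
    return "reuse" if reuse else "create"
```

The first call per (graph, scope) key returns "create"; later calls for the same key return "reuse", while a different graph starts fresh.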
tensorpack/tfutils/summary.py (view file @ ab507874)

...
@@ -166,8 +166,10 @@ def add_moving_summary(v, *args, **kwargs):
     # TODO will produce variable tower0/xxx?
     # TODO not saved under distributed
     # TODO use zero_debias
+    # TODO create EMA for each variable separately, so that the maintain ops
+    # have a decent name (rather than EMA)
     gs = get_global_step_var()
-    with tf.device(gs.device):
+    with tf.name_scope(None), tf.device(gs.device):
         averager = tf.train.ExponentialMovingAverage(
             decay, num_updates=gs, name='EMA')
         avg_maintain_op = averager.apply(v)
...
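`add_moving_summary` hands each tensor to `tf.train.ExponentialMovingAverage` with `num_updates=gs`; per TF's documentation, that caps the effective decay at `(1 + step) / (10 + step)`, so early steps adapt faster. A numeric sketch of that update rule (plain Python; the function name and defaults here are illustrative, with `decay=0.95` matching tensorpack's default):

```python
def ema_update(ema, value, step, decay=0.95):
    # With num_updates set, TF uses the smaller of `decay` and
    # (1 + step) / (10 + step), so the average warms up quickly.
    d = min(decay, (1.0 + step) / (10.0 + step))
    return d * ema + (1.0 - d) * value

ema = 0.0
for step, v in enumerate([10.0, 10.0, 10.0]):
    ema = ema_update(ema, v, step)
# After three steps the average is already close to 10.0, because the
# capped decay (0.1, 2/11, 0.25) weights new values heavily at first.
```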