Shashank Suhas / seminar-breakout · Commits
Commit c8028236, authored Sep 26, 2017 by Yuxin Wu

fix local variable override bug (fix #430)

parent 8648d571
Showing 1 changed file with 8 additions and 10 deletions:

tensorpack/train/distributed.py (+8, −10)
```diff
...
@@ -13,7 +13,6 @@ from ..tfutils.sesscreate import NewSessionCreator
 from ..tfutils.common import get_global_step_var, get_op_tensor_name
 from .multigpu import MultiGPUTrainerBase
 from .utility import override_to_local_variable

 __all__ = ['DistributedTrainerReplicated']
...
@@ -205,7 +204,6 @@ class DistributedTrainerReplicated(MultiGPUTrainerBase):
         cbs = self._input_source.setup(self.model.get_inputs_desc())
         self.config.callbacks.extend(cbs)

         with override_to_local_variable():
             # Ngpu * Nvar * 2
             grad_list = MultiGPUTrainerBase.build_on_multi_tower(
                 self.config.tower,
...
```
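The change above scopes graph construction with `override_to_local_variable()`, tensorpack's context manager that makes variables created inside it register as worker-local rather than shared, while leaving things like the input-source callbacks outside that scope. As a rough illustration of the pattern (a toy registry with hypothetical names, not tensorpack's actual implementation, which hooks TensorFlow's variable collections):

```python
import contextlib

class VariableRegistry:
    """Toy stand-in for TF's global/local variable collections."""
    def __init__(self):
        self.global_vars = []
        self.local_vars = []
        self._use_local = False

    @contextlib.contextmanager
    def override_to_local_variable(self):
        # While this context is active, new variables go to the
        # local collection instead of the shared (global) one.
        prev = self._use_local
        self._use_local = True
        try:
            yield
        finally:
            # Restore the previous mode even if creation raised.
            self._use_local = prev

    def create(self, name):
        target = self.local_vars if self._use_local else self.global_vars
        target.append(name)
        return name

reg = VariableRegistry()
reg.create("global_step")               # registered in the shared collection
with reg.override_to_local_variable():
    reg.create("tower0/W")              # registered locally on this worker
print(reg.global_vars, reg.local_vars)  # → ['global_step'] ['tower0/W']
```

The bug class being fixed here is one of scope: if the override is opened too early (or never restored), variables that must be shared across workers, such as the global step, end up in the local collection. Restoring the previous mode in a `finally` block and keeping the `with` span minimal avoids both failure modes.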
...
Write
Preview
Markdown
is supported
0%
Try again
or
attach a new file
Attach a file
Cancel
You are about to add
0
people
to the discussion. Proceed with caution.
Finish editing this message first!
Cancel
Please
register
or
sign in
to comment