Commit 6efe0deb
authored Dec 17, 2017 by Yuxin Wu

    update docs

parent c2cec01e

Showing 3 changed files with 8 additions and 5 deletions (+8 -5):
tensorpack/graph_builder/distributed.py   +1 -1
tensorpack/graph_builder/training.py      +2 -2
tensorpack/train/trainers.py              +5 -2
tensorpack/graph_builder/distributed.py

@@ -23,7 +23,7 @@ class DistributedReplicatedBuilder(DataParallelBuilder):
     and get synchronously applied to the global copy of variables located on PS.
     Then each worker copies the latest variables from PS back to local.
-    It is an equivalent of `--variable_update=distributed_replicated` in
+    It is an equivalent of ``--variable_update=distributed_replicated`` in
     `tensorflow/benchmarks <https://github.com/tensorflow/benchmarks>`_.
     Note:
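For readers unfamiliar with the distributed_replicated scheme this docstring refers to, a minimal conceptual sketch of one synchronous update step may help. This is plain NumPy standing in for the real multi-machine machinery; none of it is tensorpack or TensorFlow API, and the names (sync_ps_step, ps_vars) are invented for illustration:

    import numpy as np

    def sync_ps_step(ps_vars, worker_grads, lr=0.1):
        """One synchronous parameter-server update, as the docstring describes.

        ps_vars: dict of name -> np.ndarray, the global copy held on the PS.
        worker_grads: list of dicts (one per worker), name -> gradient.
        """
        for name, var in ps_vars.items():
            # Gradients from all workers are averaged...
            avg = np.mean([g[name] for g in worker_grads], axis=0)
            # ...and synchronously applied to the global copy on the PS.
            var -= lr * avg
        # Each worker then copies the latest variables from PS back to local.
        return [{k: v.copy() for k, v in ps_vars.items()} for _ in worker_grads]

    # Two workers, one shared variable:
    ps = {"w": np.zeros(3)}
    grads = [{"w": np.array([1.0, 0.0, 0.0])},
             {"w": np.array([0.0, 1.0, 0.0])}]
    local_copies = sync_ps_step(ps, grads)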
tensorpack/graph_builder/training.py

@@ -106,7 +106,7 @@ class SyncMultiGPUParameterServerBuilder(DataParallelBuilder):
     shared variable scope. It synchronizes the gradients computed
     from each tower, averages them and applies to the shared variables.
-    It is an equivalent of `--variable_update=parameter_server` in
+    It is an equivalent of ``--variable_update=parameter_server`` in
     `tensorflow/benchmarks <https://github.com/tensorflow/benchmarks>`_.
     """
     def __init__(self, towers, ps_device=None):
@@ -165,7 +165,7 @@ class SyncMultiGPUReplicatedBuilder(DataParallelBuilder):
     It will build one tower on each GPU under its own variable scope.
     Each gradient update is averaged across all GPUs through NCCL.
-    It is an equivalent of `--variable_update=replicated` in
+    It is an equivalent of ``--variable_update=replicated`` in
     `tensorflow/benchmarks <https://github.com/tensorflow/benchmarks>`_.
     """
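The replicated mode in this second hunk works differently from the parameter-server builders above: every GPU holds its own full copy of the variables, and only gradients cross devices, through an NCCL all-reduce. A conceptual sketch of that averaging step, again in plain NumPy with invented names rather than any real NCCL or tensorpack API:

    import numpy as np

    def allreduce_average(tower_grads):
        """Average one gradient across towers, as an NCCL all-reduce would.

        tower_grads: list of np.ndarray, one gradient per GPU tower.
        Returns the averaged gradient, which every tower then applies
        to its own local copy of the variable.
        """
        return np.mean(tower_grads, axis=0)

    # Each tower applies the same averaged update to its own replica,
    # so the replicas stay in sync without a central parameter server.
    local_vars = [np.ones(4) for _ in range(3)]        # one replica per GPU
    grads = [np.full(4, g) for g in (0.1, 0.2, 0.3)]   # per-tower gradients
    avg = allreduce_average(grads)
    local_vars = [v - 0.01 * avg for v in local_vars]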
tensorpack/train/trainers.py

@@ -71,13 +71,16 @@ class SyncMultiGPUTrainerParameterServer(SingleCostTrainer):
     """
     @map_arg(gpus=_int_to_range)
-    def __init__(self, gpus, ps_device='gpu'):
+    def __init__(self, gpus, ps_device=None):
         """
         Args:
             gpus ([int]): list of GPU ids.
-            ps_device: either 'gpu' or 'cpu', where variables are stored. Setting to 'cpu' might help when #gpu>=4
+            ps_device: either 'gpu' or 'cpu', where variables are stored.
+                The default value is subject to change.
         """
         self.devices = gpus
+        if ps_device is None:
+            ps_device = 'gpu' if len(gpus) <= 2 else 'cpu'
         self._builder = SyncMultiGPUParameterServerBuilder(gpus, ps_device)
         super(SyncMultiGPUTrainerParameterServer, self).__init__()
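The net effect of this change for users: leaving ps_device unset now picks 'gpu' for one or two GPUs and 'cpu' for more, instead of always 'gpu'. A usage sketch, assuming tensorpack as of this commit is installed; the class and its signature come from the diff above, and the GPU lists are arbitrary:

    from tensorpack.train.trainers import SyncMultiGPUTrainerParameterServer

    # ps_device left as None: the trainer now decides based on the GPU count.
    trainer = SyncMultiGPUTrainerParameterServer(gpus=[0, 1])        # -> ps_device 'gpu'
    trainer = SyncMultiGPUTrainerParameterServer(gpus=[0, 1, 2, 3])  # -> ps_device 'cpu'

    # Pinning variables explicitly still behaves exactly as before:
    trainer = SyncMultiGPUTrainerParameterServer(gpus=[0, 1, 2, 3], ps_device='gpu')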