Shashank Suhas / seminar-breakout

Commit 7b4bc9cf
authored Jun 17, 2017 by Yuxin Wu
parent b6aced91

update docs and deprecation

Showing 4 changed files with 6 additions and 12 deletions (+6 -12)
tensorpack/tfutils/collection.py  +1 -0
tensorpack/tfutils/optimizer.py   +1 -0
tensorpack/utils/fs.py            +2 -12
tensorpack/utils/utils.py         +2 -0
tensorpack/tfutils/collection.py

@@ -18,6 +18,7 @@ def backup_collection(keys):
     """
     Args:
         keys (list): list of collection keys to backup
     Returns:
         dict: the backup
     """
 ...
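For orientation, backup_collection snapshots the contents of the named TensorFlow graph collections into a dict so they can be put back later. A minimal usage sketch, assuming its companion restore_collection from the same module (everything else is stock TF 1.x):

import tensorflow as tf
from tensorpack.tfutils.collection import backup_collection, restore_collection

# Snapshot UPDATE_OPS before building graph pieces we may want to discard.
backup = backup_collection([tf.GraphKeys.UPDATE_OPS])
tf.add_to_collection(tf.GraphKeys.UPDATE_OPS, tf.no_op())  # pollutes the collection
restore_collection(backup)  # collection is back to its snapshotted state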
tensorpack/tfutils/optimizer.py

@@ -41,6 +41,7 @@ def apply_grad_processors(opt, gradprocs):
         opt (tf.train.Optimizer):
         gradprocs (list[GradientProcessor]): gradient processors to add to the
             optimizer.
     Returns:
         a :class:`tf.train.Optimizer` instance which runs the gradient
         processors before updating the variables.
 ...
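A hedged sketch of how the wrapped optimizer is used; GlobalNormClip from tensorpack.tfutils.gradproc stands in for an arbitrary GradientProcessor here (that choice is an assumption, not part of this diff):

import tensorflow as tf
from tensorpack.tfutils.optimizer import apply_grad_processors
from tensorpack.tfutils.gradproc import GlobalNormClip

# Wrap a stock optimizer so every gradient passes through the
# processors before the variable update is applied.
base_opt = tf.train.AdamOptimizer(learning_rate=1e-3)
opt = apply_grad_processors(base_opt, [GlobalNormClip(5.0)])
# `opt` then behaves like any tf.train.Optimizer, e.g. opt.minimize(loss).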
tensorpack/utils/fs.py

@@ -84,19 +84,9 @@ def get_dataset_path(*args):
     """
     d = os.environ.get('TENSORPACK_DATASET', None)
     if d is None:
-        old_d = os.path.abspath(os.path.join(
-            os.path.dirname(__file__), '..', 'dataflow', 'dataset'))
-        old_d_ret = os.path.join(old_d, *args)
-        new_d = os.path.join(os.path.expanduser('~'), 'tensorpack_data')
-        if os.path.isdir(old_d_ret):
-            # there is an old dir containing data, use it for back-compat
-            logger.warn("You seem to have old data at {}. This is no longer \
-the default location. You'll need to move it to {} \
-(the new default location) or another directory set by \
-$TENSORPACK_DATASET.".format(old_d, new_d))
-        d = new_d
+        d = os.path.join(os.path.expanduser('~'), 'tensorpack_data')
         if execute_only_once():
-            logger.warn("$TENSORPACK_DATASET not set, using {} for dataset.".format(d))
+            logger.warn("Env var $TENSORPACK_DATASET not set, using {} for datasets.".format(d))
     if not os.path.isdir(d):
         mkdir_p(d)
         logger.info("Created the directory {}.".format(d))
 ...
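To make the new resolution order concrete ($TENSORPACK_DATASET wins; otherwise ~/tensorpack_data is used and created on demand), a small sketch; 'mnist_data' is a hypothetical subdirectory, and the return values assume get_dataset_path joins its arguments onto the resolved root, as the deleted old_d_ret line suggests:

import os
from tensorpack.utils.fs import get_dataset_path

os.environ['TENSORPACK_DATASET'] = '/data/tensorpack'
print(get_dataset_path('mnist_data'))  # -> /data/tensorpack/mnist_data

del os.environ['TENSORPACK_DATASET']
# Falls back to ~/tensorpack_data (created if missing) and warns once.
print(get_dataset_path('mnist_data'))  # -> ~/tensorpack_data/mnist_data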
tensorpack/utils/utils.py

@@ -44,6 +44,8 @@ _RNG_SEED = None
 def fix_rng_seed(seed):
     """
+    Call this function at the beginning of program to fix rng seed within tensorpack.
+
     Args:
         seed (int):
 ...
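As the added docstring line stresses, the seed must be fixed before any tensorpack component creates its RNG. A sketch of the intended call order; get_rng is this module's companion helper, and its exact seeding behavior here is my assumption:

from tensorpack.utils.utils import fix_rng_seed, get_rng

fix_rng_seed(42)          # must run first, at program start
rng = get_rng()           # numpy RandomState derived from the fixed seed
print(rng.randint(1000))  # reproducible across runs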