Commit 860f7a38
Authored Jan 03, 2019 by Yuxin Wu

Add "toco_compatible" option for `export_compact`. (#1029)
Parent: c44b65fc
Showing 1 changed file with 8 additions and 3 deletions:

tensorpack/tfutils/export.py  (+8, -3)
tensorpack/tfutils/export.py @ 860f7a38:
@@ -13,7 +13,7 @@ from tensorflow.python.platform import gfile
 from tensorflow.python.tools import optimize_for_inference_lib
 
 from ..input_source import PlaceholderInput
-from ..tfutils.common import get_tensors_by_names
+from ..tfutils.common import get_tensors_by_names, get_tf_version_tuple
 from ..tfutils.tower import PredictTowerContext
 from ..utils import logger
 

@@ -34,11 +34,15 @@ class ModelExporter(object):
         super(ModelExporter, self).__init__()
         self.config = config
 
-    def export_compact(self, filename):
+    def export_compact(self, filename, toco_compatible=False):
         """Create a self-contained inference-only graph and write final graph (in pb format) to disk.
 
         Args:
             filename (str): path to the output graph
+            toco_compatible (bool): See TensorFlow's
+                `optimize_for_inference
+                <https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/tools/optimize_for_inference.py>`_
+                for details. Only available after TF 1.8.
         """
         self.graph = self.config._maybe_create_graph()
         with self.graph.as_default():

@@ -66,12 +70,13 @@ class ModelExporter(object):
                 variable_names_blacklist=None)
 
             # prune unused nodes from graph
+            toco_args = () if get_tf_version_tuple() < (1, 8) else (toco_compatible, )
             pruned_graph_def = optimize_for_inference_lib.optimize_for_inference(
                 frozen_graph_def,
                 [n.name[:-2] for n in input_tensors],
                 [n.name[:-2] for n in output_tensors],
                 [dtype.as_datatype_enum for dtype in dtypes],
-                False)
+                *toco_args)
 
             with gfile.FastGFile(filename, "wb") as f:
                 f.write(pruned_graph_def.SerializeToString())
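For context, a minimal usage sketch of the API this commit extends (not part of the diff). It assumes a tensorpack PredictConfig named pred_config has already been built for the trained model; the output paths are placeholders. Note that on TF versions before 1.8 the version gate in the last hunk passes no extra argument to optimize_for_inference, so the flag has no effect there.

# Usage sketch (assumption: `pred_config` is an existing tensorpack
# PredictConfig describing the trained model; paths are hypothetical).
from tensorpack.tfutils.export import ModelExporter

exporter = ModelExporter(pred_config)

# Default behaviour, unchanged by this commit: freeze, prune, and write the graph.
exporter.export_compact("/tmp/frozen_graph.pb")

# With TF >= 1.8, request a graph that the TOCO (TF Lite) converter can ingest;
# the flag is forwarded to optimize_for_inference as shown in the last hunk.
exporter.export_compact("/tmp/frozen_graph_toco.pb", toco_compatible=True)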