seminar-breakout · Commits

Commit 61c113b8, authored Jul 13, 2017 by Yuxin Wu

    Add a ModelDescBase without single-cost assumptions

parent 7240f877
Showing 1 changed file with 13 additions and 14 deletions:

tensorpack/graph_builder/model_desc.py (+13, −14)
@@ -12,7 +12,7 @@ from ..utils.argtools import memoized
 from .input_source_base import InputSource
 from ..models.regularize import regularize_cost_from_collection
 
-__all__ = ['InputDesc', 'ModelDesc']
+__all__ = ['InputDesc', 'ModelDesc', 'ModelDescBase']
 
 
 class InputDesc(
@@ -87,12 +87,10 @@ class InputDesc(
 @six.add_metaclass(ABCMeta)
-class ModelDesc(object):
-    """ Base class for a model description.
-    """
+class ModelDescBase(object):
+    """ Base class for a model description. """
 
-    # inputs:
-    # TODO remove this method? Now mainly used in predict/
+    # TODO remove this method?
     @memoized
     def get_reused_placehdrs(self):
         """
@@ -151,11 +149,18 @@ class ModelDesc(object):
     def _build_graph(self, inputs):
         pass
 
+
+class ModelDesc(ModelDescBase):
+    """
+    A ModelDesc with single cost and single optimizers.
+    """
+
     def get_cost(self):
         """
         Return the cost tensor in the graph.
         Used by some of the tensorpack :class:`Trainer` which assumes single-cost models.
-        You can ignore this method if you use your own trainer with more than one cost.
+        You can ignore this method (or just use :class:`ModelDescBase`)
+        if you use your own trainer with more than one cost.
 
         It calls :meth:`ModelDesc._get_cost()` which by default returns
         ``self.cost``. You can override :meth:`_get_cost()` if needed.
@@ -175,7 +180,7 @@ class ModelDesc(object):
         """
         Return the optimizer used in the task.
         Used by some of the tensorpack :class:`Trainer` which assume single optimizer.
-        You can (and should) ignore this method if you use a custom trainer with more than one optimizers.
+        You should use :class:`ModelDescBase` if you use a custom trainer with more than one optimizers.
 
         Users of :class:`ModelDesc` will need to implement `_get_optimizer()`,
         which will only be called once per each model.
@@ -187,9 +192,3 @@ class ModelDesc(object):
     def _get_optimizer(self):
         raise NotImplementedError()
-
-    def get_gradient_processor(self):
-        return self._get_gradient_processor()
-
-    def _get_gradient_processor(self):
-        return []
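The net effect of the commit is to split the old ModelDesc into a cost-agnostic base class and a subclass that adds the single-cost/single-optimizer assumptions. Below is a minimal, self-contained sketch of that split in plain Python — not tensorpack's actual implementation. `MyModel`, `build_graph`, the list of numbers standing in for input tensors, and the `"sgd"` string standing in for an optimizer object are all hypothetical illustration names:

```python
from abc import ABCMeta, abstractmethod


class ModelDescBase(metaclass=ABCMeta):
    """Base class: only knows how to build a graph from inputs.

    Makes no assumption about how many costs or optimizers exist,
    so a custom trainer can drive it however it likes.
    """

    def build_graph(self, inputs):
        return self._build_graph(inputs)

    @abstractmethod
    def _build_graph(self, inputs):
        ...


class ModelDesc(ModelDescBase):
    """Adds the single-cost / single-optimizer assumptions on top."""

    def get_cost(self):
        # Delegates to _get_cost(), which by default returns self.cost,
        # mirroring the docstring in the diff above.
        return self._get_cost()

    def _get_cost(self):
        return self.cost

    def get_optimizer(self):
        return self._get_optimizer()

    def _get_optimizer(self):
        raise NotImplementedError()


class MyModel(ModelDesc):
    def _build_graph(self, inputs):
        # Pretend the "cost" is a plain number rather than a TF tensor.
        self.cost = sum(inputs)

    def _get_optimizer(self):
        return "sgd"  # stand-in for a real optimizer object


m = MyModel()
m.build_graph([1, 2, 3])
print(m.get_cost())       # -> 6
print(m.get_optimizer())  # -> sgd
```

A trainer with multiple costs would subclass `ModelDescBase` directly and skip `get_cost`/`get_optimizer` entirely, which is exactly the escape hatch the changed docstrings point at.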