Commit 822997c7 authored Aug 29, 2018 by Yuxin Wu
update docs and new model
parent 65bbd28a
Showing 7 changed files with 42 additions and 26 deletions:
.github/ISSUE_TEMPLATE.md                  +3   -0
docs/tutorial/dataflow.md                  +7   -5
examples/FasterRCNN/README.md              +16  -14
examples/FasterRCNN/train.py               +1   -1
tensorpack/callbacks/trigger.py            +5   -2
tensorpack/utils/compatible_serialize.py   +0   -2
tensorpack/utils/serialize.py              +10  -2
.github/ISSUE_TEMPLATE.md  (+3 -0)

@@ -2,3 +2,6 @@ An issue has to be one of the following:
 - Unexpected Problems / Potential Bugs
 - Feature Requests
 - Questions on Using/Understanding Tensorpack
+
+To post an issue, please click "New Issue", choose your category, and read
+instructions there.
docs/tutorial/dataflow.md  (+7 -5)

@@ -13,6 +13,13 @@ A datapoint is a **list** of Python objects which are called the `components` of
 that yields datapoints (lists) of two components:
 a numpy array of shape (64, 28, 28), and an array of shape (64,).

+As you saw,
+DataFlow is __independent__ of TensorFlow since it produces any python objects
+(usually numpy arrays).
+To `import tensorpack.dataflow`, you don't even have to install TensorFlow.
+You can simply use DataFlow as a data processing pipeline and plug it into any other frameworks.
+
 ### Composition of DataFlow

 One good thing about having a standard interface is to be able to provide
 the greatest code reusability.

@@ -65,8 +72,3 @@ generator = df.get_data()
 for dp in generator:
     # dp is now a list. do whatever
 ```
-
-DataFlow is __independent__ of both tensorpack and TensorFlow.
-To `import tensorpack.dataflow`, you don't even have to install TensorFlow.
-You can simply use it as a data processing pipeline and plug it into any other frameworks.
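The paragraph moved up in this edit makes a concrete promise: `tensorpack.dataflow` can be used as a plain Python data pipeline with no TensorFlow installed. A minimal standalone sketch of that usage, following the tutorial's own `get_data()` loop (the toy datapoints below are invented for illustration):

    # A minimal sketch of using tensorpack.dataflow without TensorFlow.
    # DataFromList and BatchData are existing dataflow classes; the toy data is made up here.
    import numpy as np
    from tensorpack.dataflow import DataFromList, BatchData

    # each datapoint is a list of components: [image, label]
    points = [[np.zeros((28, 28), dtype='float32'), i % 10] for i in range(256)]
    df = BatchData(DataFromList(points, shuffle=False), 64)

    df.reset_state()              # call once before iterating
    for dp in df.get_data():      # the iteration API referenced by this version of the tutorial
        # dp is a list: an array of shape (64, 28, 28) and an array of shape (64,)
        print(dp[0].shape, dp[1].shape)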
examples/FasterRCNN/README.md  (+16 -14)

(This diff is collapsed on the page; its contents are not shown.)
examples/FasterRCNN/train.py  (+1 -1)

@@ -441,7 +441,7 @@ class EvalCallback(Callback):
         logger.info("[EvalCallback] Will evaluate every {} epochs".format(interval))

     def _eval(self):
-        if cfg.TRAINER == 'replicated' or cfg.TRAIN.NUM_GPUS == 1:
+        if cfg.TRAINER == 'replicated':
             with ThreadPoolExecutor(max_workers=self.num_predictor, thread_name_prefix='EvalWorker') as executor, \
                     tqdm.tqdm(total=sum([df.size() for df in self.dataflows])) as pbar:
                 futures = []
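The unchanged context around the modified condition shows the pattern the 'replicated' branch relies on: evaluation work is fanned out to a thread pool while one shared tqdm bar tracks overall progress. A generic, self-contained sketch of that pattern (the task function and data below are placeholders, not the FasterRCNN evaluation code):

    # Generic sketch of the ThreadPoolExecutor + tqdm pattern seen in EvalCallback._eval.
    # evaluate_one and shards are placeholders for the per-dataflow evaluation work.
    from concurrent.futures import ThreadPoolExecutor
    import tqdm

    def evaluate_one(shard, pbar):
        total = 0
        for x in shard:        # stand-in for iterating one evaluation dataflow
            total += x
            pbar.update(1)     # advance the shared progress bar from the worker thread
        return total

    shards = [list(range(10)), list(range(20)), list(range(30))]
    with ThreadPoolExecutor(max_workers=2, thread_name_prefix='EvalWorker') as executor, \
            tqdm.tqdm(total=sum(len(s) for s in shards)) as pbar:
        futures = [executor.submit(evaluate_one, s, pbar) for s in shards]
        results = [f.result() for f in futures]
    print(results)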
tensorpack/callbacks/trigger.py  (+5 -2)

@@ -86,11 +86,13 @@ class PeriodicRunHooks(ProxyCallback):

 class EnableCallbackIf(ProxyCallback):
     """
-    Enable the ``{before,after}_epoch``, ``{before,after}_run``,
+    Disable the ``{before,after}_epoch``, ``{before,after}_run``,
     ``trigger_{epoch,step}``
-    methods of a callback, only when some condition satisfies.
+    methods of a callback, unless some condition satisfies.
     The other methods are unaffected.

+    A more accurate name for this callback should be "DisableCallbackUnless", but that's too ugly.
+
     Note:
         If you use ``{before,after}_run``,
         ``pred`` will be evaluated only in ``before_run``.

@@ -101,6 +103,7 @@ class EnableCallbackIf(ProxyCallback):
         Args:
             callback (Callback):
             pred (self -> bool): a callable predicate. Has to be a pure function.
+                The callback is disabled unless this predicate returns True.
         """
         self._pred = pred
         super(EnableCallbackIf, self).__init__(callback)
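For reference, a hedged usage sketch of the wrapper this docstring describes; the wrapped callback and the predicate are illustrative, not part of this commit:

    # Sketch: run ModelSaver only after epoch 10 by wrapping it in EnableCallbackIf.
    # EnableCallbackIf and ModelSaver are existing tensorpack callbacks; the predicate is invented.
    from tensorpack.callbacks import EnableCallbackIf, ModelSaver

    saver = EnableCallbackIf(
        ModelSaver(),
        # `pred` receives the EnableCallbackIf instance itself; callbacks expose epoch_num
        lambda self: self.epoch_num > 10)

While the predicate returns False, the wrapped ``{before,after}_epoch``, ``{before,after}_run`` and ``trigger_{epoch,step}`` methods are skipped; the other methods run as usual, as the docstring states.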
tensorpack/utils/compatible_serialize.py  (+0 -2, file mode 100755 → 100644)

-#!/usr/bin/env python
-
 import os

 from .serialize import loads_msgpack, loads_pyarrow, dumps_msgpack, dumps_pyarrow
tensorpack/utils/serialize.py  (+10 -2)

@@ -2,8 +2,6 @@
 # File: serialize.py

 import os
-import pyarrow as pa
-
 from .develop import create_dummy_func

 __all__ = ['loads', 'dumps']

@@ -46,6 +44,16 @@ def loads_pyarrow(buf):
     return pa.deserialize(buf)


+try:
+    # import pyarrow has a lot of side effect: https://github.com/apache/arrow/pull/2329
+    # So we need an option to disable it.
+    if os.environ.get('TENSORPACK_SERIALIZE', 'pyarrow') == 'pyarrow':
+        import pyarrow as pa
+except ImportError:
+    pa = None
+    dumps_pyarrow = create_dummy_func('dumps_pyarrow', ['pyarrow'])  # noqa
+    loads_pyarrow = create_dummy_func('loads_pyarrow', ['pyarrow'])  # noqa
+
 try:
     import msgpack
     import msgpack_numpy
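As the added comment explains, importing pyarrow has noticeable side effects, so the import is now guarded by the TENSORPACK_SERIALIZE environment variable. A small usage sketch of that switch ('msgpack' is just an example value; per the condition above, anything other than 'pyarrow' skips the import):

    # Sketch: opt out of the pyarrow import before tensorpack is loaded.
    # Any value other than 'pyarrow' makes serialize.py skip `import pyarrow`.
    import os
    os.environ['TENSORPACK_SERIALIZE'] = 'msgpack'   # must be set before the first tensorpack import

    from tensorpack.utils import serialize           # pyarrow is no longer imported as a side effect

Which implementations `dumps` and `loads` resolve to in that case is decided elsewhere in serialize.py and is not visible in this hunk.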