Commit f873681f, authored Oct 22, 2016 by Yuxin Wu
bug fix
parent 491e7144
Showing 1 changed file with 3 additions and 3 deletions

tensorpack/utils/loadcaffe.py (+3, -3)
@@ -79,7 +79,7 @@ class CaffeLayerProcessor(object):
     def proc_scale(self, idx, name, param):
         bottom_name = self.net.bottom_names[name][0]
-        # find te bn layer before this scaling
+        # find the bn layer before this scaling
         for i, layer in enumerate(self.net.layers):
             if layer.type == 'BatchNorm':
                 name2 = self.layer_names[i]
@@ -88,8 +88,8 @@ class CaffeLayerProcessor(object):
                     # scaling and BN share the same bottom, should merge
                     logger.info("Merge {} and {} into one BatchNorm layer".format(
                         name, name2))
-                    return {name + '/beta': param[1].data,
-                            name + '/gamma': param[0].data}
+                    return {name2 + '/beta': param[1].data,
+                            name2 + '/gamma': param[0].data}
         # assume this scaling layer is part of some BN
         logger.error("Could not find a BN layer corresponding to this Scale layer!")
         raise ValueError()
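In effect, the returned beta/gamma entries are now keyed by the name of the BatchNorm layer being merged into (name2) rather than by the Scale layer's own name, presumably so they share a prefix with the rest of that merged BatchNorm layer's parameters. A minimal sketch of the resulting key change, using hypothetical layer names and dummy arrays (not part of the commit):

import numpy as np

# Hypothetical names: a Caffe BatchNorm layer followed by a Scale layer
# that shares the same bottom blob, as in the merge case above.
bn_name, scale_name = 'conv1/bn', 'conv1/scale'

gamma = np.ones(64, dtype='float32')    # Scale layer weight -> param[0].data
beta = np.zeros(64, dtype='float32')    # Scale layer bias   -> param[1].data

# Before the fix: keys were built from the Scale layer's own name ...
before = {scale_name + '/beta': beta, scale_name + '/gamma': gamma}
# ... after the fix they use the merged BatchNorm layer's name (name2).
after = {bn_name + '/beta': beta, bn_name + '/gamma': gamma}

print(sorted(before))  # ['conv1/scale/beta', 'conv1/scale/gamma']
print(sorted(after))   # ['conv1/bn/beta', 'conv1/bn/gamma']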