Shashank Suhas / seminar-breakout / Commits
Commit 784417db, authored Aug 31, 2017 by Yuxin Wu

fix resnet_group

parent 0e2e305a

Showing 1 changed file with 13 additions and 9 deletions (+13 -9):
examples/ResNet/imagenet_resnet_utils.py
...
@@ -124,14 +124,11 @@ def resnet_shortcut(l, n_out, stride, nl=tf.identity):
 def apply_preactivation(l, preact):
     """
     'no_preact' for the first resblock in each group only, because the input is activated already.
-    'bnrelu' for all the non-first blocks, where identity mapping is preserved on shortcut path.
+    'bnrelu/relu' for all the non-first blocks, where identity mapping is preserved on shortcut path.
     """
     if preact == 'bnrelu':
-        shortcut = l
+        shortcut = l    # preserve identity mapping
         l = BNReLU('preact', l)
+    elif preact == 'relu':
+        shortcut = l
+        l = tf.nn.relu(l)
     else:
         shortcut = l
     return l, shortcut
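The control flow of the patched apply_preactivation can be checked framework-free. In this sketch NumPy stands in for TensorFlow ops, and `bn_relu` is a hypothetical stub for tensorpack's BNReLU layer (the real layer also batch-normalizes):

```python
import numpy as np

def bn_relu(x):
    # hypothetical stub for BNReLU('preact', l); batch norm omitted
    return np.maximum(x, 0.0)

def apply_preactivation(l, preact):
    # mirrors the patched control flow: the shortcut always keeps the
    # un-activated input, preserving identity mapping on the shortcut path
    if preact == 'bnrelu':
        shortcut = l               # preserve identity mapping
        l = bn_relu(l)
    elif preact == 'relu':         # branch added by this commit
        shortcut = l
        l = np.maximum(l, 0.0)     # stands in for tf.nn.relu
    else:                          # 'no_preact'
        shortcut = l
    return l, shortcut

x = np.array([-1.0, 2.0])
out, sc = apply_preactivation(x, 'relu')
# sc keeps the raw input [-1., 2.]; out is activated to [0., 2.]
```

In all three branches the shortcut is bound before any activation runs, which is exactly what the "preserve identity mapping" comment in the diff is about.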
...
@@ -152,6 +149,7 @@ def preresnet_basicblock(l, ch_out, stride, preact):
 def preresnet_bottleneck(l, ch_out, stride, preact):
+    # stride is applied on the second conv, following fb.resnet.torch
     l, shortcut = apply_preactivation(l, preact)
     l = Conv2D('conv1', l, ch_out, 1, nl=BNReLU)
     l = Conv2D('conv2', l, ch_out, 3, stride=stride, nl=BNReLU)
...
@@ -172,6 +170,13 @@ def preresnet_group(l, name, block_func, features, count, stride):
     return l
 
 
+def resnet_basicblock(l, ch_out, stride, preact):
+    l, shortcut = apply_preactivation(l, preact)
+    l = Conv2D('conv1', l, ch_out, 3, stride=stride, nl=BNReLU)
+    l = Conv2D('conv2', l, ch_out, 3, nl=get_bn(zero_init=True))
+    return l + resnet_shortcut(shortcut, ch_out, stride, nl=get_bn(zero_init=False))
+
+
 def resnet_bottleneck(l, ch_out, stride, preact):
     l, shortcut = apply_preactivation(l, preact)
     l = Conv2D('conv1', l, ch_out, 1, nl=BNReLU)
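One design choice in the added resnet_basicblock is that its last conv uses get_bn(zero_init=True), which in tensorpack zero-initializes the final BN's scale, so the residual branch outputs zeros at initialization and the block starts out as just the (projected) shortcut. A minimal NumPy sketch of that effect, with hypothetical stubs standing in for the Conv2D/BN layers:

```python
import numpy as np

def bn(x, gamma):
    # stub for a BN layer at inference; normalization omitted, scale kept
    return gamma * x

def residual_branch(x, gamma_last):
    h = np.maximum(x, 0.0)      # stub for Conv2D('conv1', ..., nl=BNReLU)
    return bn(h, gamma_last)    # stub for Conv2D('conv2', ..., nl=get_bn(...))

x = np.array([[1.0, -2.0]])
out_init = residual_branch(x, gamma_last=0.0) + x   # zero_init=True: gamma starts at 0
# the branch contributes nothing at init, so out_init equals the shortcut x
```

With gamma nonzero (zero_init=False, as on the shortcut projection), the branch contributes immediately; zero-initializing only the residual side eases early optimization.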
...
@@ -184,11 +189,10 @@ def resnet_group(l, name, block_func, features, count, stride):
     with tf.variable_scope(name):
         for i in range(0, count):
             with tf.variable_scope('block{}'.format(i)):
-                # first block doesn't need activation
                 l = block_func(l, features,
                                stride if i == 0 else 1,
-                               'no_preact')
-        # end of each group need an extra activation
-        l = tf.nn.relu(l)
+                               'no_preact' if i == 0 else 'relu')
+                # end of each block need an activation
+                l = tf.nn.relu(l)
     return l
...
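The resnet_group fix is easiest to read as a per-block schedule: the first block gets the group's stride and 'no_preact' (its input is already activated), every later block gets stride 1 and the newly added 'relu' preact, and the extra relu now runs after each block instead of once at the end of the group. A small sketch of that schedule (the function name is illustrative, not from the repo):

```python
def resnet_group_schedule(count, stride):
    # (stride, preact) pairs passed to block_func, one per block,
    # matching the patched loop in resnet_group
    return [(stride if i == 0 else 1,
             'no_preact' if i == 0 else 'relu')
            for i in range(count)]

# a group of 3 blocks with stride 2 yields:
# [(2, 'no_preact'), (1, 'relu'), (1, 'relu')]
```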