seminar-breakout / Commits / 34d20a81

Commit 34d20a81, authored May 10, 2019 by Yuxin Wu
Parent: 6926d22a

    fix #1182

Showing 1 changed file with 14 additions and 2 deletions:
examples/keras/mnist-keras.py (+14, -2)
@@ -24,9 +24,21 @@ Note: this example does not work for replicated-style data-parallel trainers.
 IMAGE_SIZE = 28
 
 
+# Work around a Keras issue: it append name scopes to variable names..
+# May not work well if you use Keras layers inside other name scopes.
+@contextmanager
+def clear_tower0_name_scope():
+    ns = tf.get_default_graph().get_name_scope()
+    if ns == 'tower0':
+        with tf.name_scope('/'):
+            yield
+    else:
+        yield
+
+
 @memoized  # this is necessary for sonnet/keras to work under tensorpack
 def get_keras_model():
-    with tf.name_scope('/'):
+    with clear_tower0_name_scope():
         M = keras.models.Sequential()
         M.add(KL.Conv2D(32, 3, activation='relu', input_shape=[IMAGE_SIZE, IMAGE_SIZE, 1], padding='same'))
         M.add(KL.MaxPooling2D())
@@ -36,7 +48,7 @@ def get_keras_model():
         M.add(KL.Conv2D(32, 3, padding='same', activation='relu'))
         M.add(KL.Flatten())
         M.add(KL.Dense(512, activation='relu', kernel_regularizer=keras.regularizers.l2(1e-5)))
-        M.add(KL.Dropout(0.5))
+        M.add(KL.Dropout(rate=0.5))
         M.add(KL.Dense(10, activation=None, kernel_regularizer=keras.regularizers.l2(1e-5)))
         return M
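Note that the diff keeps `get_keras_model` behind tensorpack's `@memoized` decorator, so repeated calls (one per trainer tower) return the same model instead of rebuilding it. A rough sketch of why that matters, using the standard library's `functools.lru_cache` as a stand-in for `@memoized` (the function and its return value are hypothetical):

```python
from functools import lru_cache


@lru_cache(maxsize=None)  # stand-in for tensorpack's @memoized
def get_model():
    # The body runs once; every later call returns the cached object,
    # so multiple towers share one set of layers instead of rebuilding.
    return {"layers": ["conv", "pool", "flatten", "dense"]}


a = get_model()
b = get_model()
print(a is b)  # True -- built once, shared thereafter
```

Without the cache, each tower would construct a fresh `Sequential` model with its own variables, which is exactly what the name-scope workaround and memoization together avoid.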