Commit 4e3849e0 in seminar-breakout (Shashank Suhas), authored Dec 22, 2017 by Yuxin Wu

[PTB] really allow any number of layers (fix #567)

Parent: 4de54e62

1 changed file, 9 additions and 7 deletions: examples/PennTreebank/PTB-LSTM.py (+9, -7)
@@ -67,9 +67,10 @@ class Model(ModelDesc):
             return tf.get_variable(n, [BATCH, HIDDEN_SIZE],
                                    trainable=False,
                                    initializer=tf.constant_initializer())
-        self.state = state_var = \
-            (rnn.LSTMStateTuple(get_v('c0'), get_v('h0')),
-             rnn.LSTMStateTuple(get_v('c1'), get_v('h1')))
+        state_var = [rnn.LSTMStateTuple(
+            get_v('c{}'.format(k)), get_v('h{}'.format(k)))
+            for k in range(NUM_LAYER)]
+        self.state = state_var = tuple(state_var)

         embeddingW = tf.get_variable('embedding', [VOCAB_SIZE, HIDDEN_SIZE], initializer=initializer)
         input_feature = tf.nn.embedding_lookup(embeddingW, input)  # B x seqlen x hiddensize
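For readers skimming the hunk: the old code hard-coded a two-layer state (c0/h0 and c1/h1), so setting NUM_LAYER to anything other than 2 left the extra layers without persistent state variables. Below is a minimal standalone sketch of the new pattern, assuming TF 1.x and tensorflow.contrib.rnn; the hyper-parameter values are hypothetical, and get_v mirrors the nested helper in the example.

# Sketch only, not the full example: one persistent (c, h) variable pair
# per layer, generated for any NUM_LAYER instead of exactly two.
import tensorflow as tf
from tensorflow.contrib import rnn

BATCH, HIDDEN_SIZE, NUM_LAYER = 20, 650, 3   # hypothetical values

def get_v(n):
    # Non-trainable, zero-initialized variables that carry the LSTM
    # state across unrolled sequences.
    return tf.get_variable(n, [BATCH, HIDDEN_SIZE],
                           trainable=False,
                           initializer=tf.constant_initializer())

state_var = tuple(
    rnn.LSTMStateTuple(get_v('c{}'.format(k)), get_v('h{}'.format(k)))
    for k in range(NUM_LAYER))

The final tuple(...) conversion matches what a TF 1.x MultiRNNCell expects for its initial state: a tuple of LSTMStateTuples, one per layer.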
@@ -102,10 +103,11 @@ class Model(ModelDesc):
     def reset_lstm_state(self):
         s = self.state
         z = tf.zeros_like(s[0].c)
-        return tf.group(s[0].c.assign(z),
-                        s[0].h.assign(z),
-                        s[1].c.assign(z),
-                        s[1].h.assign(z), name='reset_lstm_state')
+        ops = []
+        for k in range(NUM_LAYER):
+            ops.append(s[k].c.assign(z))
+            ops.append(s[k].h.assign(z))
+        return tf.group(*ops, name='reset_lstm_state')

     def _get_optimizer(self):
         lr = tf.get_variable('learning_rate', initializer=1.0, trainable=False)
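The second hunk fixes the matching reset logic, which likewise enumerated only layers 0 and 1, so with more layers their state was never zeroed between sequences (presumably the behavior tracked as #567). A sketch of the generalized version, assuming state is the tuple of LSTMStateTuples built above:

# Sketch only: zero every layer's LSTM state with a single grouped op.
def reset_lstm_state(state, num_layer):
    z = tf.zeros_like(state[0].c)         # zeros of shape [BATCH, HIDDEN_SIZE]
    ops = []
    for k in range(num_layer):
        ops.append(state[k].c.assign(z))  # reset cell state of layer k
        ops.append(state[k].h.assign(z))  # reset hidden state of layer k
    # tf.group bundles the assigns so a single session.run triggers them all.
    return tf.group(*ops, name='reset_lstm_state')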