Shashank Suhas / seminar-breakout / Commits / e457e2db

Commit e457e2db, authored Nov 20, 2016 by Yuxin Wu

ConcatWith

parent 881c6c4b

Showing 3 changed files with 25 additions and 0 deletions (+25 −0)
tensorpack/models/_common.py    +1  −0
tensorpack/models/nonlin.py     +1  −0
tensorpack/models/shapes.py     +23 −0
tensorpack/models/_common.py

@@ -33,6 +33,7 @@ def layer_register(
         summary the output(activation) of this layer.
         Can be overridden when creating the layer.
     :param log_shape: log input/output shape of this layer
+    :param use_scope: whether to call this layer with an extra first argument as scope
     """
     def wrapper(func):
...
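The hunk above only adds a docstring line for the new `use_scope` flag. To make the flag's meaning concrete, here is a simplified, hypothetical sketch of a `layer_register`-style decorator (not tensorpack's actual implementation, which also handles variable scopes and shape logging): with `use_scope=True` the wrapped layer takes a leading scope-name argument; with `use_scope=False` it is called on its inputs directly.

```python
import functools

def layer_register(use_scope=True, log_shape=True):
    # Simplified sketch: use_scope controls whether the first positional
    # argument is treated as a scope name rather than a layer input.
    def wrapper(func):
        @functools.wraps(func)
        def wrapped(*args, **kwargs):
            if use_scope:
                name, inputs = args[0], args[1:]
                # A real implementation would open tf.variable_scope(name) here.
                return func(*inputs, **kwargs)
            return func(*args, **kwargs)
        return wrapped
    return wrapper

@layer_register(use_scope=False, log_shape=False)
def Double(x):
    return 2 * x

@layer_register()  # use_scope=True: the first argument is the scope name
def Triple(x):
    return 3 * x
```

Under this sketch, `Double(5)` is a plain call, while `Triple('scope', 5)` must pass a scope name first; `use_scope=False` is what lets `ConcatWith` below behave like a thin functional wrapper.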
tensorpack/models/nonlin.py

@@ -64,6 +64,7 @@ def LeakyReLU(x, alpha, name=None):
     #x = ((1 + alpha) * x + (1 - alpha) * tf.abs(x))
     #return tf.mul(x, 0.5, name=name)
+# TODO wrap it as a layer with use_scope=False?
 def BNReLU(x, name=None):
     x = BatchNorm('bn', x, use_local_stat=None)
     x = tf.nn.relu(x, name=name)
...
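The commented-out lines in this hunk rely on the identity 0.5·((1 + α)·x + (1 − α)·|x|) = max(x, α·x), i.e. a branch-free way to compute Leaky ReLU. A quick NumPy check of that identity (this is only an illustration of the math, not tensorpack code):

```python
import numpy as np

def leaky_relu_abs(x, alpha):
    # Branch-free Leaky ReLU from the commented-out lines:
    # for x >= 0 the two |x| terms sum to 2x; for x < 0 they sum to 2*alpha*x.
    return 0.5 * ((1 + alpha) * x + (1 - alpha) * np.abs(x))

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
alpha = 0.2
# Matches the usual definition max(x, alpha * x) for 0 < alpha < 1.
assert np.allclose(leaky_relu_abs(x, alpha), np.maximum(x, alpha * x))
```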
tensorpack/models/shapes.py (new file, mode 100644)

#!/usr/bin/env python
# -*- coding: utf-8 -*-
# File: shapes.py
# Author: Yuxin Wu <ppwwyyxxc@gmail.com>

import tensorflow as tf

from ._common import layer_register

__all__ = ['ConcatWith']


@layer_register(use_scope=False, log_shape=False)
def ConcatWith(x, dim, tensor):
    """
    A wrapper around `tf.concat` to support `LinearWrap`.

    :param x: the input tensor
    :param dim: the dimension along which to concatenate
    :param tensor: a tensor or a list of tensors to concatenate with x;
        x will be at the beginning
    :return: tf.concat(dim, [x] + tensor)
    """
    if not isinstance(tensor, list):
        tensor = [tensor]
    return tf.concat(dim, [x] + tensor)
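To illustrate the semantics of `ConcatWith` without a TensorFlow dependency (note the pre-1.0 `tf.concat(dim, values)` argument order used in this 2016 commit), here is a minimal NumPy sketch of the same behavior, with a hypothetical `concat_with` helper standing in for the layer:

```python
import numpy as np

def concat_with(x, dim, tensor):
    # Mirrors ConcatWith's list normalization: a single array becomes a
    # one-element list, so x is always placed first along axis `dim`.
    if not isinstance(tensor, list):
        tensor = [tensor]
    return np.concatenate([x] + tensor, axis=dim)

a = np.ones((2, 3))
b = np.zeros((2, 5))
out = concat_with(a, 1, b)  # shapes (2, 3) and (2, 5) -> (2, 8)
```

Because `use_scope=False`, the layer needs no scope-name argument, which is what allows it to slot into a `LinearWrap` chain as a plain function of its input.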