Commit 9995c69e
Authored Oct 11, 2016 by Yuxin Wu

    use int32 to init global_step explicitly

Parent: 965aa953
Showing 2 changed files, with 3 additions and 3 deletions:

    tensorpack/tfutils/common.py               +1  -1
    tensorpack/tfutils/symbolic_functions.py   +2  -2
tensorpack/tfutils/common.py  @ 9995c69e

...
@@ -46,7 +46,7 @@ def get_global_step_var():
         assert scope.name == '', \
             "Creating global_step_var under a variable scope would cause problems!"
         var = tf.get_variable(GLOBAL_STEP_OP_NAME, shape=[],
-                              initializer=tf.constant_initializer(),
+                              initializer=tf.constant_initializer(value=0, dtype=tf.int32),
                               trainable=False, dtype=tf.int32)
         return var
...
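Why the explicit arguments matter (my reading of the change, not stated in the commit itself): with no arguments, tf.constant_initializer() produces a floating-point zero, which can disagree with the variable's declared dtype=tf.int32. A minimal sketch of the fixed pattern, assuming the TF 0.x-era API this code targets:

    import tensorflow as tf

    GLOBAL_STEP_OP_NAME = 'global_step'  # mirrors the constant used in tensorpack

    with tf.Graph().as_default():
        # Explicit value/dtype keeps the initial value consistent with the
        # variable's declared dtype instead of relying on the float default.
        var = tf.get_variable(GLOBAL_STEP_OP_NAME, shape=[],
                              initializer=tf.constant_initializer(value=0, dtype=tf.int32),
                              trainable=False, dtype=tf.int32)
        with tf.Session() as sess:
            sess.run(tf.initialize_all_variables())  # TF 0.x variable-init op
            print(sess.run(var))                     # -> 0, an int32 scalar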
tensorpack/tfutils/symbolic_functions.py  @ 9995c69e

...
@@ -72,9 +72,9 @@ def class_balanced_sigmoid_binary_class_cross_entropy(pred, label, name='cross_e
     #eps = 1e-12
     logstable = tf.log(1 + tf.exp(-tf.abs(z)))
     loss_pos = -beta * tf.reduce_mean(-y *
-            (logstable - tf.minimum(0, z)))
+            (logstable - tf.minimum(0.0, z)))
     loss_neg = (1. - beta) * tf.reduce_mean((y - 1.) *
-            (logstable + tf.maximum(z, 0)))
+            (logstable + tf.maximum(z, 0.0)))
     cost = tf.sub(loss_pos, loss_neg, name=name)
     return cost
...
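Why 0 became 0.0 (again my reading, not stated in the commit): tf.minimum and tf.maximum need both arguments in the same dtype, and a bare Python 0 can be converted to an int32 constant that clashes with a float z, while 0.0 converts to a float one. The surrounding lines are the numerically stable sigmoid cross-entropy, built on the identity log(1 + e^z) = max(z, 0) + log(1 + e^{-|z|}), so that log sigmoid(z) = min(0, z) - log(1 + e^{-|z|}) and log(1 - sigmoid(z)) = -max(z, 0) - log(1 + e^{-|z|}). A minimal sketch with a stand-in logit tensor z (hypothetical, not from the original file), using TF 0.x-era ops:

    import tensorflow as tf

    z = tf.constant([-1.5, 0.0, 2.0])   # float32 logits (stand-in example)
    ok = tf.maximum(z, 0.0)             # 0.0 converts to float32, matching z
    # risky = tf.maximum(z, 0)          # a bare int may be inferred as int32 and
    #                                   # trip a dtype mismatch against float32 z

    # The stable building blocks used by the loss above:
    logstable = tf.log(1 + tf.exp(-tf.abs(z)))        # log(1 + e^{-|z|}), overflow-safe
    log_sigmoid = tf.minimum(0.0, z) - logstable      # log sigmoid(z)
    log_one_minus = -tf.maximum(z, 0.0) - logstable   # log(1 - sigmoid(z))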