Shashank Suhas / seminar-breakout · Commits

Commit b097e7d5 authored Dec 25, 2018 by Yuxin Wu
[MaskRCNN] use same actual warmup init LR regardless of total bs
parent 1f498ed6
Showing 2 changed files with 4 additions and 5 deletions
examples/FasterRCNN/config.py   +3 -4
examples/FasterRCNN/train.py    +1 -1
examples/FasterRCNN/config.py
@@ -108,8 +108,9 @@ _C.BACKBONE.STRIDE_1X1 = False  # True for MSRA models
 # schedule -----------------------
 _C.TRAIN.NUM_GPUS = None  # by default, will be set from code
 _C.TRAIN.WEIGHT_DECAY = 1e-4
-_C.TRAIN.BASE_LR = 1e-2  # defined for a total batch size of 8. Otherwise it will be adjusted automatically
+_C.TRAIN.BASE_LR = 1e-2  # defined for total batch size=8. Otherwise it will be adjusted automatically
 _C.TRAIN.WARMUP = 1000  # in terms of iterations. This is not affected by #GPUs
+_C.TRAIN.WARMUP_INIT_LR = 1e-2 * 0.33  # defined for total batch size=8. Otherwise it will be adjusted automatically
 _C.TRAIN.STEPS_PER_EPOCH = 500
 _C.TRAIN.STARTING_EPOCH = 1  # the first epoch to start with, useful to continue a training
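A quick sanity check of the schedule constants in the hunk above: WARMUP is counted in iterations while the LR schedule is epoch based, so with the defaults the warmup spans exactly two epochs. The short sketch below only replays the arithmetic that the unchanged lines of train.py (further down) perform; the variable names are local to the sketch.

# Worked example using the defaults from the hunk above
WARMUP = 1000            # iterations; not affected by #GPUs
STEPS_PER_EPOCH = 500

warmup_end_epoch = WARMUP * 1. / STEPS_PER_EPOCH
print(warmup_end_epoch)  # 2.0 -> the stepwise LR schedule starts at epoch int(2.0 + 0.5) == 2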
@@ -235,9 +236,7 @@ def finalize_configs(is_training):
     if is_training:
         train_scales = _C.PREPROC.TRAIN_SHORT_EDGE_SIZE
-        if not isinstance(train_scales, (list, tuple)):
-            train_scales = [train_scales, train_scales]
-        if train_scales[1] - train_scales[0] > 100:
+        if isinstance(train_scales, (list, tuple)) and train_scales[1] - train_scales[0] > 100:
             # don't warmup if augmentation is on
             os.environ['TF_CUDNN_USE_AUTOTUNE'] = '0'
         os.environ['TF_AUTOTUNE_THRESHOLD'] = '1'
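The second hunk rewrites the guard around the cuDNN autotune variable without changing which inputs trigger it. A small hypothetical comparison (the function names and the example TRAIN_SHORT_EDGE_SIZE values are illustrative only, not taken from the repo):

# Old vs. new form of the condition that gates os.environ['TF_CUDNN_USE_AUTOTUNE'] = '0'
def old_condition(train_scales):
    # old form: a single scale was first expanded to [x, x], so its range is 0 and never exceeds 100
    if not isinstance(train_scales, (list, tuple)):
        train_scales = [train_scales, train_scales]
    return train_scales[1] - train_scales[0] > 100

def new_condition(train_scales):
    # new form: single scales are skipped outright; only genuine (min, max) ranges are checked
    return isinstance(train_scales, (list, tuple)) and train_scales[1] - train_scales[0] > 100

print(old_condition((640, 800)), new_condition((640, 800)))  # True True   (range 160 > 100)
print(old_condition(800), new_condition(800))                # False False (a single scale)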
examples/FasterRCNN/train.py
@@ -548,7 +548,7 @@ if __name__ == '__main__':
         stepnum = cfg.TRAIN.STEPS_PER_EPOCH

         # warmup is step based, lr is epoch based
-        init_lr = cfg.TRAIN.BASE_LR * 0.33 * min(8. / cfg.TRAIN.NUM_GPUS, 1.)
+        init_lr = cfg.TRAIN.WARMUP_INIT_LR * min(8. / cfg.TRAIN.NUM_GPUS, 1.)
         warmup_schedule = [(0, init_lr), (cfg.TRAIN.WARMUP, cfg.TRAIN.BASE_LR)]
         warmup_end_epoch = cfg.TRAIN.WARMUP * 1. / stepnum
         lr_schedule = [(int(warmup_end_epoch + 0.5), cfg.TRAIN.BASE_LR)]
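The train.py change is what the commit title refers to: the warmup starting LR is no longer derived from BASE_LR but from the dedicated WARMUP_INIT_LR constant, so raising BASE_LR for a larger total batch size no longer drags the warmup starting point up with it. A numeric sketch (the 2e-2 value is a hypothetical "adjusted for a larger total batch size" BASE_LR, not something set in this diff):

NUM_GPUS = 8
BASE_LR = 2e-2                # hypothetical: raised from the 1e-2 default for a larger total batch size
WARMUP_INIT_LR = 1e-2 * 0.33  # the new config constant, defined for total batch size=8

old_init_lr = BASE_LR * 0.33 * min(8. / NUM_GPUS, 1.)   # 6.6e-3: moves together with BASE_LR
new_init_lr = WARMUP_INIT_LR * min(8. / NUM_GPUS, 1.)   # 3.3e-3: same starting point as a batch-size-8 run

With the default BASE_LR of 1e-2 both formulas give the same 3.3e-3, so the default behaviour is unchanged.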