scheduler

xvr.model.scheduler

WarmupCosineSchedule

WarmupCosineSchedule(
    optimizer, warmup_steps, t_total, cycles=0.5, last_epoch=-1
)

Linear warmup followed by cosine decay. Linearly increases the learning rate multiplier from 0 to 1 over warmup_steps training steps, then decreases it from 1 to 0 over the remaining t_total - warmup_steps steps following a cosine curve. If cycles differs from its default of 0.5, the learning rate follows cycles full cosine cycles after warmup; the default of 0.5 corresponds to a single half-cycle decay from 1 to 0.

Copied from https://github.com/TalSchuster/pytorch-transformers/blob/64fff2a53977ac1caac32c960d2b01f16b7eb913/pytorch_transformers/optimization.py#L64-L81

Source code in src/xvr/model/scheduler.py
def __init__(self, optimizer, warmup_steps, t_total, cycles=0.5, last_epoch=-1):
    self.warmup_steps = warmup_steps
    self.t_total = t_total
    self.cycles = cycles
    # The parent LambdaLR scheduler drives the schedule via self.lr_lambda,
    # which is defined elsewhere on the class.
    super().__init__(optimizer, self.lr_lambda, last_epoch=last_epoch)
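The lr_lambda that __init__ passes to LambdaLR is not shown above. As a sketch of the schedule's shape, the standalone function below mirrors the lr_lambda in the linked pytorch-transformers implementation; the function name is illustrative, not part of the xvr API.

```python
import math


def warmup_cosine_multiplier(step, warmup_steps, t_total, cycles=0.5):
    """Learning-rate multiplier for a given training step (a sketch,
    adapted from the linked pytorch-transformers lr_lambda)."""
    if step < warmup_steps:
        # Linear warmup: multiplier rises from 0 to 1 over warmup_steps.
        return float(step) / float(max(1, warmup_steps))
    # Cosine decay over the remaining t_total - warmup_steps steps;
    # cycles=0.5 traces half a cosine period, i.e. a decay from 1 to 0.
    progress = float(step - warmup_steps) / float(max(1, t_total - warmup_steps))
    return max(0.0, 0.5 * (1.0 + math.cos(math.pi * float(cycles) * 2.0 * progress)))


# With warmup_steps=100, t_total=1000, cycles=0.5:
# step 0 -> 0.0, step 100 -> 1.0, step 1000 -> 0.0
```

The multiplier is what LambdaLR multiplies each parameter group's base learning rate by, so the actual learning rate peaks at the optimizer's configured value when warmup ends.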