In your example, `warmup_steps=30000` and `steps_in_epoch=10000`, so the condition `timestep < warmup_steps` is always met. I guess the constants in the provided example are a mistake.

So I want to use your LR scheduler, but I can't figure out how to use it properly. When should I call `scheduler.step()` inside the per-epoch training loop? On every step? Or should I increment `timestep` across epochs and call `scheduler.step()` only once `timestep > warmup_steps`?
```python
scheduler = WarmupReduceLROnPlateauScheduler(
    optimizer,
    init_lr=1e-10,
    peak_lr=1e-4,
    warmup_steps=30000,
    patience=1,
    factor=0.3,
)

for epoch in range(max_epochs):
    for timestep in range(steps_in_epoch):
        ...
        if timestep < warmup_steps:
            scheduler.step()

    val_loss = validate()
    scheduler.step(val_loss)
```
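To make my question concrete, here is a minimal self-contained sketch of the pattern I think is intended: keep one step counter that is global across epochs (so it does not reset back below `warmup_steps` every epoch), call `step()` once per batch during warmup, and pass `val_loss` once per epoch afterwards. The `WarmupThenPlateau` class below is my own stand-in for illustration, not the library's scheduler, and its `step()` signature is an assumption.

```python
class WarmupThenPlateau:
    """Hypothetical stand-in: linear warmup from init_lr to peak_lr over
    warmup_steps calls, then multiply lr by `factor` whenever val_loss has
    failed to improve for more than `patience` validation rounds."""

    def __init__(self, init_lr, peak_lr, warmup_steps, patience, factor):
        self.init_lr, self.peak_lr = init_lr, peak_lr
        self.warmup_steps = warmup_steps
        self.patience, self.factor = patience, factor
        self.lr = init_lr
        self.global_step = 0          # survives across epochs -- the key point
        self.best = float("inf")
        self.bad_rounds = 0

    def step(self, val_loss=None):
        if self.global_step < self.warmup_steps:
            # Warmup phase: called once per training batch.
            self.global_step += 1
            frac = self.global_step / self.warmup_steps
            self.lr = self.init_lr + frac * (self.peak_lr - self.init_lr)
        elif val_loss is not None:
            # Plateau phase: called once per epoch with the validation loss.
            if val_loss < self.best:
                self.best, self.bad_rounds = val_loss, 0
            else:
                self.bad_rounds += 1
                if self.bad_rounds > self.patience:
                    self.lr *= self.factor
                    self.bad_rounds = 0
        return self.lr


# Small constants so the demo finishes warmup quickly (30000/10000 scaled down).
sched = WarmupThenPlateau(init_lr=1e-10, peak_lr=1e-4,
                          warmup_steps=30, patience=1, factor=0.3)
steps_in_epoch = 10
fake_val_losses = [0.9, 0.8, 0.7, 0.7, 0.7, 0.7]   # made-up numbers

for epoch, val_loss in enumerate(fake_val_losses):
    for _ in range(steps_in_epoch):
        if sched.global_step < sched.warmup_steps:
            sched.step()                 # per-batch call during warmup only
    if sched.global_step >= sched.warmup_steps:
        sched.step(val_loss)             # per-epoch call after warmup

print(f"{sched.lr:.1e}")  # warmup peaked at 1e-4, then one plateau cut: 3.0e-05
```

With these toy numbers warmup spans the first three epochs, the LR reaches the peak of `1e-4`, and the later plateau on `val_loss=0.7` triggers one `factor=0.3` reduction.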
*pikaliov changed the title from "How to use WarmupReduceLROnPlateauScheduler on one epoch train part?" to "How to use WarmupReduceLROnPlateauScheduler in one epoch train part?" on Apr 1, 2023.*