lr_schedulers
This module contains learning rate schedulers for TensorFlow optimization.
- class tensiometer.synthetic_probability.lr_schedulers.ExponentialDecayScheduler(lr_max, lr_min, roll_off_step, steps)[source]
Exponentially decaying learning rate.
- Parameters:
lr_max – maximum learning rate
lr_min – minimum learning rate
roll_off_step – step at which the scheduler starts rolling off
steps – total number of steps
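A minimal sketch of how the scheduler could be instantiated, together with an illustrative decay curve matching the parameters above; the decay formula in `sketch_lr` is an assumption made for illustration, not the exact expression used by the class.

```python
import numpy as np
from tensiometer.synthetic_probability.lr_schedulers import ExponentialDecayScheduler

# Instantiate with the documented parameters.
schedule = ExponentialDecayScheduler(lr_max=1e-3, lr_min=1e-5,
                                     roll_off_step=100, steps=1000)

def sketch_lr(step, lr_max=1e-3, lr_min=1e-5, roll_off_step=100, steps=1000):
    # Assumed shape (illustration only): constant at lr_max up to roll_off_step,
    # then exponential decay that reaches lr_min at the final step.
    if step <= roll_off_step:
        return lr_max
    rate = np.log(lr_min / lr_max) / (steps - roll_off_step)
    return lr_max * np.exp(rate * (step - roll_off_step))
```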
- class tensiometer.synthetic_probability.lr_schedulers.LRAdaptLossSlopeEarlyStop(monitor='val_loss', factor=0.31622776601683794, patience=25, cooldown=10, verbose=0, min_lr=1e-05, threshold=0.0, **kwargs)[source]
Adaptive reduction of learning rate when likelihood improvement stalls for a given number of epochs.
- Parameters:
monitor – quantity monitored for stalled improvement
factor – multiplicative factor applied to the learning rate when improvement stalls
patience – number of epochs without improvement before the learning rate is reduced
cooldown – number of epochs to wait after a reduction before resuming monitoring
verbose – verbosity level
min_lr – lower bound on the learning rate
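Since the parameters mirror those of tf.keras.callbacks.ReduceLROnPlateau, a plausible usage sketch follows; the callback-style usage in model.fit is an assumption, while the parameter names come from the signature above.

```python
from tensiometer.synthetic_probability.lr_schedulers import LRAdaptLossSlopeEarlyStop

# Hypothetical usage sketch (Keras-callback style usage is assumed).
callback = LRAdaptLossSlopeEarlyStop(
    monitor='val_loss',  # quantity watched for stalled improvement
    factor=0.1,          # multiply the learning rate by this factor on a stall
    patience=25,         # epochs without improvement before reducing the learning rate
    cooldown=10,         # epochs to wait after a reduction before monitoring again
    min_lr=1e-5,         # do not reduce the learning rate below this value
)

# model.fit(x_train, y_train, validation_data=(x_val, y_val),
#           epochs=200, callbacks=[callback])
```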