lr_schedulers

This file contains learning rate schedulers for TensorFlow optimization.

class tensiometer.synthetic_probability.lr_schedulers.ExponentialDecayScheduler(lr_max, lr_min, roll_off_step, steps)[source]

Exponentially decaying learning rate.

Parameters:
  • lr_max – maximum learning rate

  • lr_min – minimum learning rate

  • roll_off_step – step at which the scheduler starts rolling off

  • steps – total number of steps
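
The intended behavior can be sketched as follows. The formula below is an assumption (the schedule holds lr_max until roll_off_step, then decays exponentially toward lr_min, reaching it at the final step), not necessarily the library's exact implementation:

```python
def exponential_decay_lr(step, lr_max, lr_min, roll_off_step, steps):
    # Hypothetical sketch: constant at lr_max before roll_off_step,
    # then exponential interpolation from lr_max down to lr_min.
    if step <= roll_off_step:
        return lr_max
    frac = (step - roll_off_step) / (steps - roll_off_step)
    return lr_max * (lr_min / lr_max) ** frac

# Starts at lr_max, begins rolling off at roll_off_step,
# and reaches lr_min (up to float rounding) at the last step:
lr_start = exponential_decay_lr(0, 1e-3, 1e-5, 100, 1000)
lr_end = exponential_decay_lr(1000, 1e-3, 1e-5, 100, 1000)
```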

class tensiometer.synthetic_probability.lr_schedulers.LRAdaptLossSlopeEarlyStop(monitor='val_loss', factor=0.31622776601683794, patience=25, cooldown=10, verbose=0, min_lr=1e-05, threshold=0.0, **kwargs)[source]

Adaptive reduction of learning rate when likelihood improvement stalls for a given number of epochs.

Parameters:
  • monitor – quantity to be monitored

  • factor – factor by which the learning rate is reduced (new_lr = lr * factor)

  • patience – number of epochs with no sufficient improvement after which the learning rate is reduced

  • cooldown – number of epochs to wait after a learning rate reduction before resuming normal operation

  • verbose – verbosity level

  • min_lr – lower bound on the learning rate

  • threshold – minimum improvement of the monitored quantity required to count as progress
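
A minimal sketch of the reduce-on-plateau logic behind this callback, following the standard Keras ReduceLROnPlateau pattern (the exact bookkeeping inside the class is an assumption):

```python
def reduce_on_plateau(losses, lr, factor=0.5, patience=25,
                      cooldown=10, min_lr=1e-5, threshold=0.0):
    # Hypothetical sketch: whenever the monitored loss fails to improve
    # by more than `threshold` for `patience` consecutive epochs, the
    # learning rate is multiplied by `factor` (never dropping below
    # `min_lr`), and the counter pauses for `cooldown` epochs.
    best = float("inf")
    wait = 0
    cool = 0
    history = []
    for loss in losses:
        if loss < best - threshold:
            best = loss
            wait = 0
        elif cool > 0:
            cool -= 1
        else:
            wait += 1
            if wait >= patience:
                lr = max(lr * factor, min_lr)
                wait = 0
                cool = cooldown
        history.append(lr)
    return history

# With patience=2, two stalled epochs in a row trigger a reduction:
lrs = reduce_on_plateau([1.0, 0.9, 0.9, 0.9], 1e-3,
                        factor=0.5, patience=2, cooldown=0)
```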

class tensiometer.synthetic_probability.lr_schedulers.PowerLawDecayScheduler(lr_max, lr_min, power, steps)[source]

Power law decaying learning rate.

Parameters:
  • lr_max – maximum learning rate

  • lr_min – minimum learning rate

  • power – power law index

  • steps – total number of steps
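
One plausible form of such a schedule (an assumption for illustration, not necessarily the library's exact formula) interpolates between lr_max and lr_min with a power-law profile:

```python
def power_law_decay_lr(step, lr_max, lr_min, power, steps):
    # Hypothetical sketch: power-law interpolation from lr_max
    # at step 0 down to lr_min at the final step. `power` sets
    # how fast the decay front-loads (power > 1) or back-loads
    # (power < 1) the reduction.
    frac = step / steps
    return lr_min + (lr_max - lr_min) * (1.0 - frac) ** power

# Starts at lr_max and reaches lr_min at the last step:
lr_start = power_law_decay_lr(0, 1e-3, 1e-5, 2.0, 1000)
lr_end = power_law_decay_lr(1000, 1e-3, 1e-5, 2.0, 1000)
```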

class tensiometer.synthetic_probability.lr_schedulers.StepDecayScheduler(lr_max=None, change_every=None, steps=None, steps_per_epoch=None, boundaries=None, values=None)[source]

Piecewise-constant (step function) learning rate schedule.

Parameters:
  • lr_max – maximum (initial) learning rate

  • change_every – number of epochs between learning rate changes

  • steps – total number of steps

  • steps_per_epoch – number of optimization steps per epoch

  • boundaries – list of step boundaries at which the learning rate changes

  • values – list of learning rate values, one per interval defined by boundaries
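
Assuming the boundaries/values pair follows the usual piecewise-constant convention (as in tf.keras.optimizers.schedules.PiecewiseConstantDecay), values has one more entry than boundaries and the learning rate is constant on each interval. A sketch of the lookup:

```python
import bisect

def step_decay_lr(step, boundaries, values):
    # Piecewise-constant lookup: values[i] applies while
    # step <= boundaries[i]; values[-1] after the last boundary.
    return values[bisect.bisect_left(boundaries, step)]

# Hypothetical example: three plateaus over training.
boundaries = [1000, 5000]    # steps at which the rate drops
values = [1e-2, 1e-3, 1e-4]  # one value per interval
```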