loss_functions
This module implements the loss functions used to train the normalizing flow. Since different loss terms can be combined, several weighting schemes are provided.
- class tensiometer.synthetic_probability.loss_functions.SharpStep(step_epoch=50, value_1=1.0, value_2=0.1, beta=0.0, **kwargs)[source]
Implement a sharp step between two weight values at a given epoch.
Initialize loss function
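The scheduling behavior can be sketched as follows. This is a hypothetical reimplementation for illustration only: the parameter names follow the `SharpStep` signature, but the actual class also carries a `beta` term and integrates with the training loop.

```python
def sharp_step(epoch, step_epoch=50, value_1=1.0, value_2=0.1):
    """Sharp step schedule (illustrative sketch): the weight equals
    value_1 before step_epoch and jumps discontinuously to value_2
    at step_epoch."""
    return value_1 if epoch < step_epoch else value_2

# The weight jumps at step_epoch:
weights = [sharp_step(e) for e in (0, 49, 50, 100)]
# → [1.0, 1.0, 0.1, 0.1]
```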
- class tensiometer.synthetic_probability.loss_functions.SoftAdapt_weight_loss(tau=1.0, beta=0.0, smoothing=True, smoothing_tau=20, quantity_1='val_rho_loss', quantity_2='val_ee_loss', **kwargs)[source]
Implement SoftAdapt weighting, as in arXiv:1912.12355, with optional smoothing of the loss histories.
Initialize loss function
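The core SoftAdapt idea from arXiv:1912.12355 can be sketched as below: each loss term is weighted by a temperature-`tau` softmax over its recent rate of change, so the term that is improving more slowly receives more weight. This is an illustrative reimplementation, not the package's code; the smoothing of loss histories is omitted.

```python
import numpy as np

def softadapt_weights(loss_hist_1, loss_hist_2, tau=1.0):
    """SoftAdapt weighting for two losses (illustrative sketch).
    loss_hist_* are sequences of recent loss values."""
    # finite-difference rate of change of each loss
    s1 = loss_hist_1[-1] - loss_hist_1[-2]
    s2 = loss_hist_2[-1] - loss_hist_2[-2]
    # softmax with temperature tau (subtract the max for stability)
    e = np.exp(tau * (np.array([s1, s2]) - max(s1, s2)))
    return e / e.sum()

# Loss 1 is decreasing, loss 2 is stalled: loss 2 gets more weight.
w = softadapt_weights([1.0, 0.9], [1.0, 1.0])
```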
- class tensiometer.synthetic_probability.loss_functions.annealed_weight_loss(anneal_epoch=125, lambda_1=1.0, beta=0.0, roll_off_nepoch=10, **kwargs)[source]
Slowly transition from the density loss to the evidence-error loss.
Initialize loss function
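A minimal sketch of the annealing schedule, assuming a sigmoid roll-off centered on `anneal_epoch` with width `roll_off_nepoch` (the exact roll-off shape is an assumption for illustration; only the parameter names follow the `annealed_weight_loss` signature):

```python
import numpy as np

def annealed_weight(epoch, anneal_epoch=125, roll_off_nepoch=10):
    """Weight on the density loss, rolling off smoothly toward the
    evidence-error loss around anneal_epoch (illustrative sketch)."""
    return 1.0 / (1.0 + np.exp((epoch - anneal_epoch) / roll_off_nepoch))

# total_loss = w * density_loss + (1 - w) * evidence_error_loss
w_early, w_mid, w_late = (annealed_weight(e) for e in (0, 125, 250))
```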
- class tensiometer.synthetic_probability.loss_functions.constant_weight_loss(alpha=1.0, beta=0.0)[source]
Constant weighting of the two loss functions.
Initialize loss function
- class tensiometer.synthetic_probability.loss_functions.random_weight_loss(initial_random_epoch=0, lambda_1=1.0, beta=0.0, **kwargs)[source]
Random weighting of the two loss functions.
Initialize loss function
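The random-weighting idea can be sketched as follows. This is an illustrative sketch under stated assumptions: before `initial_random_epoch` the weight is held at `lambda_1`, and afterwards a fresh weight is drawn each call; the uniform draw on [0, 1] is an assumption, not taken from the package's code.

```python
import numpy as np

def random_weight(epoch, rng, initial_random_epoch=0, lambda_1=1.0):
    """Random weighting of two loss terms (illustrative sketch)."""
    if epoch < initial_random_epoch:
        return lambda_1
    return rng.uniform(0.0, 1.0)

rng = np.random.default_rng(0)
w = random_weight(10, rng)  # a value in [0, 1]
```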
- class tensiometer.synthetic_probability.loss_functions.standard_loss[source]
Standard density loss function for the normalizing flow.
- call(y_true, y_pred)[source]
The standard normalizing flow loss function is the KL divergence between the two abstract distributions.
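This can be sketched with NumPy, assuming the usual Keras-style loss convention in which `y_pred` holds the flow's log-probabilities of the training samples and `y_true` is unused: minimizing the negative mean log-probability minimizes the KL divergence between the target and learned distributions up to an additive constant. The actual class is a TensorFlow loss; this sketch only illustrates the estimator.

```python
import numpy as np

def standard_loss(y_true, y_pred):
    """Negative mean log-probability (illustrative sketch of the
    standard normalizing flow loss). y_true is unused, following the
    Keras (y_true, y_pred) loss convention."""
    return -np.mean(y_pred)

log_probs = np.array([-1.0, -2.0, -3.0])
loss = standard_loss(None, log_probs)  # → 2.0
```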