loss_functions

This module contains the loss functions used to train the normalizing flow.

Since we combine different loss terms, several weighting schemes are available.

class tensiometer.synthetic_probability.loss_functions.SharpStep(step_epoch=50, value_1=1.0, value_2=0.1, beta=0.0, **kwargs)[source]

Implement a sharp step between two weight values at a given epoch.

Initialize loss function

print_feedback(padding='')[source]

Print feedback to screen

update_lambda_values_on_epoch_begin(epoch, **kwargs)[source]

Update the values of lambda at the start of the epoch. Accepts all keyword arguments so as not to crowd the interface.
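The stepping rule can be sketched as a stand-alone function. This is a minimal illustration of the schedule, not the library implementation; `sharp_step_lambda` is a hypothetical helper built from the parameters in the signature above.

```python
def sharp_step_lambda(epoch, step_epoch=50, value_1=1.0, value_2=0.1):
    """Sharp step schedule: return value_1 for epochs before step_epoch
    and value_2 from step_epoch onward."""
    return value_1 if epoch < step_epoch else value_2
```

The weight jumps discontinuously: `sharp_step_lambda(49)` still gives `value_1`, while `sharp_step_lambda(50)` already gives `value_2`.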

class tensiometer.synthetic_probability.loss_functions.SoftAdapt_weight_loss(tau=1.0, beta=0.0, smoothing=True, smoothing_tau=20, quantity_1='val_rho_loss', quantity_2='val_ee_loss', **kwargs)[source]

Implement SoftAdapt as in arXiv:1912.12355, with optional smoothing

Initialize loss function

print_feedback(padding='')[source]

Print feedback to screen

update_lambda_values_on_epoch_begin(epoch, **kwargs)[source]

Update the values of lambda at the start of the epoch. Accepts all keyword arguments so as not to crowd the interface.
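The core of SoftAdapt is a softmax over the recent rates of change of the component losses, so that slower-improving terms receive larger weights. A minimal sketch, assuming `tau` acts as the softmax temperature (the library additionally supports smoothing of the monitored quantities, omitted here):

```python
import math

def softadapt_weights(loss_histories, tau=1.0):
    """SoftAdapt (arXiv:1912.12355): weight each loss term by a softmax
    of the recent rate of change of its loss history, so that terms
    improving more slowly receive larger weights."""
    # Rate of change estimated from the last two recorded values.
    slopes = [h[-1] - h[-2] for h in loss_histories]
    # Shift by the maximum slope for a numerically stable softmax.
    m = max(slopes)
    exps = [math.exp((s - m) / tau) for s in slopes]
    total = sum(exps)
    return [e / total for e in exps]
```

For example, with histories `[1.0, 0.9]` and `[1.0, 0.99]`, the second (slower-improving) term receives the larger weight.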

class tensiometer.synthetic_probability.loss_functions.annealed_weight_loss(anneal_epoch=125, lambda_1=1.0, beta=0.0, roll_off_nepoch=10, **kwargs)[source]

Slowly transition from the density loss to the evidence-error loss.

Initialize loss function

print_feedback(padding='')[source]

Print feedback to screen

update_lambda_values_on_epoch_begin(epoch, **kwargs)[source]

Update the values of lambda at the start of the epoch. Accepts all keyword arguments so as not to crowd the interface.
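The annealing can be sketched as a smooth roll-off of the density-loss weight. The sigmoid shape and the exact use of `roll_off_nepoch` here are illustrative assumptions, not the library's documented schedule:

```python
import math

def annealed_lambda(epoch, anneal_epoch=125, lambda_1=1.0, roll_off_nepoch=10):
    """Roll the density-loss weight smoothly from lambda_1 down toward
    zero around anneal_epoch, over a window of order roll_off_nepoch
    epochs, so that training shifts to the evidence-error loss."""
    return lambda_1 / (1.0 + math.exp((epoch - anneal_epoch) / roll_off_nepoch))
```

The weight is approximately `lambda_1` early in training, exactly `lambda_1 / 2` at `anneal_epoch`, and approaches zero afterwards.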

class tensiometer.synthetic_probability.loss_functions.constant_weight_loss(alpha=1.0, beta=0.0)[source]

Initialize loss function

compute_loss(y_true, y_pred, sample_weight)[source]

Combine the density and evidence-error losses.

compute_loss_components(y_true, y_pred, sample_weight)[source]

Compute different components of the loss function

print_feedback(padding='')[source]

Print feedback to screen

reset()[source]

Reset the loss function's hyperparameters.
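With the components in hand, the combination is a fixed linear one. A sketch follows; reading `alpha` as the density-loss weight and `beta` as the evidence-error weight is inferred from the signature above, not confirmed against the library source:

```python
def constant_weight_combined_loss(density_loss, evidence_error_loss,
                                  alpha=1.0, beta=0.0):
    """Fixed linear combination of the two loss components; the weights
    do not change during training."""
    return alpha * density_loss + beta * evidence_error_loss
```

With the defaults (`alpha=1.0`, `beta=0.0`) this reduces to the plain density loss.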

class tensiometer.synthetic_probability.loss_functions.random_weight_loss(initial_random_epoch=0, lambda_1=1.0, beta=0.0, **kwargs)[source]

Random weighting of the two loss functions.

Initialize loss function

print_feedback(padding='')[source]

Print feedback to screen

update_lambda_values_on_epoch_begin(epoch, **kwargs)[source]

Update the values of lambda at the start of the epoch. Accepts all keyword arguments so as not to crowd the interface.
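Randomly reweighting the two terms each epoch can be sketched as follows. Drawing uniformly in [0, 1] is an assumption for illustration, as is reading `initial_random_epoch` as the epoch at which randomization begins:

```python
import random

def random_lambda(epoch, initial_random_epoch=0, lambda_1=1.0, rng=None):
    """Return the fixed weight lambda_1 before initial_random_epoch and
    a fresh uniform draw in [0, 1] from then on."""
    if epoch < initial_random_epoch:
        return lambda_1
    rng = rng if rng is not None else random.Random()
    return rng.random()
```

Passing an explicit `rng` makes the schedule reproducible across runs.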

class tensiometer.synthetic_probability.loss_functions.standard_loss[source]

Standard density loss function for the normalizing flow.

call(y_true, y_pred)[source]

The standard normalizing-flow loss is the KL divergence between the two abstract-space distributions.

compute_loss_components(y_true, y_pred, sample_weight)[source]

Compute different components of the loss function

print_feedback(padding='')[source]

Print feedback to screen

reset()[source]

Reset the loss function's hyperparameters.
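Concretely, the standard loss is the negative mean log-probability of the training samples under the flow, which equals the KL divergence between target and flow up to the constant entropy of the target. A minimal sketch operating on precomputed log-probabilities (a real training step would obtain these from the flow's density evaluation):

```python
def standard_nf_loss(log_probs):
    """Negative mean log-probability of samples from the target under
    the flow; minimizing it minimizes KL(target || flow) up to the
    constant entropy of the target distribution."""
    return -sum(log_probs) / len(log_probs)
```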

class tensiometer.synthetic_probability.loss_functions.variable_weight_loss(lambda_1=1.0, lambda_2=0.0, beta=0.0)[source]

Initialize loss function

compute_loss(y_true, y_pred, sample_weight)[source]

Combine the density and evidence-error losses.

compute_loss_components(y_true, y_pred, sample_weight, lambda_1=None, lambda_2=None)[source]

Compute different components of the loss function

print_feedback(padding='')[source]

Print feedback to screen

reset()[source]

Reset the loss function's hyperparameters.

update_lambda_values_on_epoch_begin(epoch, **kwargs)[source]

Update the values of lambda at the start of the epoch. Accepts all keyword arguments so as not to crowd the interface.