Training losses

Asymmetric-multilabel loss

class src.training.losses.asymetric.AsymmetricLossOptimized(gamma_neg=4, gamma_pos=1, clip=0.05, eps=1e-08, disable_torch_grad_focal_loss=False)[source]

Bases: Module

This class is an optimized implementation of the asymmetric loss calculation: it is more memory efficient, allocates GPU memory more effectively, and favors in-place operations.

Asymmetric loss is a loss function that penalizes different types of errors differently. This is useful in tasks where some errors are more costly, or far more frequent, than others.

In multi-label classification, most labels are negative for any given sample, so the gradient is easily dominated by easy negative examples. Asymmetric loss addresses this imbalance by applying a stronger focusing factor to negative samples (gamma_neg, default 4) than to positive samples (gamma_pos, default 1), together with a probability margin (clip) that discards very easy negatives entirely. As a result, the rare positive labels contribute more to training. For example, in image tagging, an image typically carries only a handful of the thousands of possible labels.
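The per-element behavior can be sketched in plain Python. This is an illustrative reduction, not the optimized class above: `asl_element` and its arguments are hypothetical names, and it operates on a single predicted probability `p` with binary target `y`, mirroring the `gamma_pos`, `gamma_neg`, and `clip` parameters of the constructor.

```python
import math

def asl_element(p, y, gamma_pos=1.0, gamma_neg=4.0, clip=0.05):
    """Per-element asymmetric loss (illustrative sketch).

    p -- predicted probability for one label
    y -- binary target (1 = positive, 0 = negative)
    """
    if y == 1:
        # Positive term: focal weighting with the (small) gamma_pos.
        return -((1.0 - p) ** gamma_pos) * math.log(max(p, 1e-8))
    # Negative term: shift the probability down by `clip` so that
    # very easy negatives (p <= clip) contribute exactly zero loss.
    p_m = max(p - clip, 0.0)
    return -(p_m ** gamma_neg) * math.log(max(1.0 - p_m, 1e-8))
```

With the defaults, a negative predicted at `p = 0.04` falls below the clip margin and is discarded, while a hard negative at `p = 0.9` still incurs a substantial loss; this is the asymmetry between the positive and negative branches.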

forward(x, y)[source]

Loss forward step function.

Parameters:
  • x – input logits

  • y – targets (multi-label binarized vector)

Returns:

the loss value (float)
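A minimal sketch of what `forward(x, y)` computes, assuming raw logits are passed through a sigmoid before the asymmetric weighting is applied. The function name `asymmetric_loss` and the summed reduction are assumptions for illustration; the optimized class operates on tensors in place rather than in a Python loop.

```python
import math

def asymmetric_loss(x, y, gamma_pos=1.0, gamma_neg=4.0,
                    clip=0.05, eps=1e-8):
    """Sketch of forward(x, y).

    x -- input logits (one per label)
    y -- multi-label binarized target vector (0/1, same length)
    Returns the summed loss over all labels.
    """
    total = 0.0
    for logit, target in zip(x, y):
        p = 1.0 / (1.0 + math.exp(-logit))  # sigmoid
        if target == 1:
            total += -((1.0 - p) ** gamma_pos) * math.log(p + eps)
        else:
            p_m = max(p - clip, 0.0)  # probability shifting by `clip`
            total += -(p_m ** gamma_neg) * math.log(1.0 - p_m + eps)
    return total

logits = [2.0, -3.0, 0.5]  # hypothetical scores for three labels
targets = [1, 0, 0]        # binarized multi-label target vector
loss = asymmetric_loss(logits, targets)
```

As expected, confident predictions in the right direction drive the loss toward zero, while uncertain or wrong predictions increase it.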