Generalized Focal Loss: Learning Qualified and Distributed Bounding Boxes for Dense Object Detection

The starting point of this paper is the inconsistency between the IoU/centerness quality estimate and the classification loss: at NMS time, boxes are ranked by the product of the quality score (IoU or centerness) and the classification score, but during training the classification score is supervised with focal loss while the quality estimate is treated as a separate regression problem. Training and inference are therefore misaligned.

To address this, the paper first proposes Quality Focal Loss (QFL), which extends focal loss from discrete labels to a "continuous classification" target, so classification and quality estimation are learned jointly. It then proposes Distribution Focal Loss (DFL), which represents box locations as general discrete distributions rather than Dirac deltas, so arbitrary (including ambiguous) location distributions can be fit. Finally, Generalized Focal Loss (GFL) unifies both as instances of a single formulation.

Focal loss
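For reference, the standard focal loss on a discrete label $y \in \{0, 1\}$ with predicted probability $p$ is

$\mathrm{FL}(p) = -(1 - p_t)^{\gamma}\log(p_t), \qquad p_t = \begin{cases} p, & y = 1 \\ 1 - p, & y = 0 \end{cases}$

where the modulating factor $(1 - p_t)^{\gamma}$ down-weights easy, well-classified examples.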

Quality Focal Loss
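QFL generalizes the focal loss target from $\{0, 1\}$ to a continuous quality label $y \in [0, 1]$ (the IoU between the predicted and ground-truth box for positives, 0 for negatives). With sigmoid output $\sigma$:

$\mathrm{QFL}(\sigma) = -|y - \sigma|^{\beta}\big((1 - y)\log(1 - \sigma) + y\log(\sigma)\big)$

which reduces to the standard focal loss when $y \in \{0, 1\}$ and $\beta = \gamma$.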

import torch
import torch.nn.functional as F
# weight_reduce_loss applies the optional per-sample weight and then the
# requested reduction; in mmdetection it lives in mmdet/models/losses/utils.py
from mmdet.models.losses.utils import weight_reduce_loss

def quality_focal_loss(
          pred,          # (n, 80) raw class logits
          label,         # (n,) 0 is negative, 1-80 is the positive class index
          score,         # (n,) quality target in [0, 1], meaningful for positives only
          weight=None,
          beta=2.0,
          reduction='mean',
          avg_factor=None):
    """
        from https://github.com/implus/GFocal/blob/cc0e72680f16a8abe0770eb531d6baa07a6e511f/mmdet/models/losses/gfocal_loss.py
    """
    # first treat every (location, class) entry as a negative with target 0;
    # since the target is 0, the focal modulator reduces to sigma^beta
    pred_sigmoid = pred.sigmoid()
    pt = pred_sigmoid
    zerolabel = pt.new_zeros(pred.shape)
    loss = F.binary_cross_entropy_with_logits(
           pred, zerolabel, reduction='none') * pt.pow(beta)

    # shift to 0-indexed classes: negatives become -1
    label = label - 1
    pos = (label >= 0).nonzero().squeeze(1)
    a = pos
    b = label[pos].long()

    # for positives, overwrite the ground-truth class entry with BCE against
    # the continuous quality score, modulated by |score - sigma|^beta
    pt = score[a] - pred_sigmoid[a, b]
    loss[a, b] = F.binary_cross_entropy_with_logits(
           pred[a, b], score[a], reduction='none') * pt.pow(beta)

    loss = weight_reduce_loss(loss, weight, reduction, avg_factor)
    return loss
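As a sanity check of the formula, here is a minimal self-contained sketch (hypothetical helper `qfl_simple`, assuming mean reduction and no sample weights) that mirrors the function above:

```python
import torch
import torch.nn.functional as F

def qfl_simple(pred, label, score, beta=2.0):
    # pred: (n, C) raw logits; label: (n,) with 0 = negative, 1..C = class
    # score: (n,) quality target in [0, 1], used for positives only
    sigma = pred.sigmoid()
    # negatives: BCE against target 0, modulated by sigma^beta
    loss = F.binary_cross_entropy_with_logits(
        pred, torch.zeros_like(pred), reduction='none') * sigma.pow(beta)
    pos = (label > 0).nonzero(as_tuple=True)[0]
    cls = (label[pos] - 1).long()
    # positives: BCE against the continuous quality score,
    # modulated by |score - sigma|^beta
    mod = (score[pos] - sigma[pos, cls]).abs().pow(beta)
    loss[pos, cls] = F.binary_cross_entropy_with_logits(
        pred[pos, cls], score[pos], reduction='none') * mod
    return loss.mean()

pred = torch.zeros(2, 3)          # all logits 0 -> sigma = 0.5 everywhere
label = torch.tensor([0, 2])      # one negative, one positive of class 2
score = torch.tensor([0.0, 0.5])  # the positive's quality target is 0.5
loss = qfl_simple(pred, label, score)
```

Here the positive entry contributes zero loss because sigma already equals its quality target (0.5), so only the five negative entries contribute.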

Distribution Focal Loss (DFL)

When the regression target is discretized into multiple bins $\{y_0, y_1, \dots, y_n\}$ and the network predicts a distribution $\{S_i\}$ over them, the box offset at inference is decoded as the expectation

$\hat{y} = \sum_{i=0}^{n} S_i\, y_i.$

The intuition of DFL is to encourage probability mass on the two bins nearest the continuous ground truth $y$ (with $y_i \le y \le y_{i+1}$), each weighted by its distance to $y$:

$\mathrm{DFL}(S_i, S_{i+1}) = -\big((y_{i+1} - y)\log(S_i) + (y - y_i)\log(S_{i+1})\big).$

def distribution_focal_loss(
            pred,
            label,
            weight=None,
            reduction='mean',
            avg_factor=None):
    """
        from https://github.com/implus/GFocal/blob/cc0e72680f16a8abe0770eb531d6baa07a6e511f/mmdet/models/losses/gfocal_loss.py
    """
    # indices of the two integer bins that bracket the continuous label
    disl = label.long()
    disr = disl + 1

    # linear interpolation weights: the closer bin gets the larger weight
    wl = disr.float() - label
    wr = label - disl.float()

    # cross entropy against both neighboring bins, weighted by proximity
    loss = F.cross_entropy(pred, disl, reduction='none') * wl \
         + F.cross_entropy(pred, disr, reduction='none') * wr
    loss = weight_reduce_loss(loss, weight, reduction, avg_factor)
    return loss
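A self-contained sketch (hypothetical helpers, assuming a unit-spaced bin grid $y_i = i$ and mean reduction) showing both the loss and the expectation-based decoding used at inference:

```python
import torch
import torch.nn.functional as F

def dfl_simple(pred, label):
    # pred: (n, n_bins) logits over integer bins 0..n_bins-1
    # label: (n,) continuous regression target inside [0, n_bins-1]
    disl = label.long()           # left bin index
    disr = disl + 1               # right bin index
    wl = disr.float() - label     # weight of the left bin
    wr = label - disl.float()     # weight of the right bin
    loss = F.cross_entropy(pred, disl, reduction='none') * wl \
         + F.cross_entropy(pred, disr, reduction='none') * wr
    return loss.mean()

def decode_expectation(pred):
    # inference-time decoding: expectation over the softmaxed bin distribution
    bins = torch.arange(pred.size(1), dtype=torch.float32)
    return (pred.softmax(dim=1) * bins).sum(dim=1)

# mass concentrated on bins 2 and 3 decodes to a value near 2.5
pred = torch.tensor([[0., 0., 10., 10., 0.]])
expected = decode_expectation(pred)
loss = dfl_simple(pred, torch.tensor([2.5]))
```

With the target exactly midway between the two peaked bins, the loss approaches its minimum of $\log 2 \approx 0.693$ as the distribution sharpens.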

Generalized Focal Loss (GFL)
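The two losses above are unified into one form. Let $y_l < y_r$ be two target values with predicted probabilities $p_{y_l}$ and $p_{y_r}$ (with $p_{y_l} + p_{y_r} = 1$), so the prediction is $\hat{y} = y_l p_{y_l} + y_r p_{y_r}$. Then

$\mathrm{GFL}(p_{y_l}, p_{y_r}) = -\big|y - (y_l p_{y_l} + y_r p_{y_r})\big|^{\beta}\big((y_r - y)\log(p_{y_l}) + (y - y_l)\log(p_{y_r})\big)$

QFL is the special case $y_l = 0$, $y_r = 1$; DFL is the special case $\beta = 0$, $y_l = y_i$, $y_r = y_{i+1}$.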