Distance-IoU Loss: Faster and Better Learning for Bounding Box Regression

Abstract

Bounding box regression is a key step in object detection. In existing methods, the ℓn-norm loss is widely adopted for bounding box regression, but it is not tailored to the evaluation metric, i.e., IoU. Recently, IoU loss and GIoU loss have been proposed to benefit the IoU metric, but they still suffer from slow convergence and inaccurate regression. In this paper, by incorporating the normalized distance between the predicted box and the target box, we propose a Distance-IoU (DIoU) loss, which converges much faster in training than the IoU and GIoU losses. Furthermore, the paper summarizes three geometric factors in bounding box regression: overlap area, central point distance and aspect ratio. Based on these, a Complete IoU (CIoU) loss is proposed, leading to faster convergence and better performance. By incorporating the DIoU and CIoU losses into state-of-the-art object detection algorithms, e.g., YOLOv3, SSD and Faster R-CNN, we achieve notable performance gains in terms of both the IoU and GIoU metrics. Moreover, DIoU can easily be adopted as the criterion in non-maximum suppression (NMS), further boosting performance.
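For concreteness, the two penalties described above can be written as L_DIoU = 1 − IoU + ρ²(b, b_gt)/c² and L_CIoU = 1 − IoU + ρ²(b, b_gt)/c² + αv, where ρ is the distance between the two box centers, c is the diagonal length of the smallest enclosing box, v measures aspect-ratio consistency and α is its trade-off weight. The following is a minimal PyTorch sketch of these losses for axis-aligned boxes; the function name `diou_ciou_loss` and the (x1, y1, x2, y2) box format are illustrative assumptions, not the authors' reference implementation (see the repositories listed below).

```python
import math
import torch

def diou_ciou_loss(pred, target, use_ciou=True, eps=1e-7):
    """Sketch of the DIoU/CIoU losses for boxes in (x1, y1, x2, y2) format.

    pred, target: tensors of shape (N, 4). Returns per-box losses of shape (N,).
    """
    # Widths, heights and areas of predicted and target boxes.
    w1 = (pred[:, 2] - pred[:, 0]).clamp(min=0)
    h1 = (pred[:, 3] - pred[:, 1]).clamp(min=0)
    w2 = (target[:, 2] - target[:, 0]).clamp(min=0)
    h2 = (target[:, 3] - target[:, 1]).clamp(min=0)
    area1, area2 = w1 * h1, w2 * h2

    # Intersection and IoU (the overlap-area factor).
    inter_w = (torch.min(pred[:, 2], target[:, 2]) - torch.max(pred[:, 0], target[:, 0])).clamp(min=0)
    inter_h = (torch.min(pred[:, 3], target[:, 3]) - torch.max(pred[:, 1], target[:, 1])).clamp(min=0)
    inter = inter_w * inter_h
    iou = inter / (area1 + area2 - inter + eps)

    # Squared distance between box centers (the normalized-distance factor).
    cx1, cy1 = (pred[:, 0] + pred[:, 2]) / 2, (pred[:, 1] + pred[:, 3]) / 2
    cx2, cy2 = (target[:, 0] + target[:, 2]) / 2, (target[:, 1] + target[:, 3]) / 2
    center_dist2 = (cx1 - cx2) ** 2 + (cy1 - cy2) ** 2

    # Squared diagonal of the smallest box enclosing both boxes.
    enc_w = torch.max(pred[:, 2], target[:, 2]) - torch.min(pred[:, 0], target[:, 0])
    enc_h = torch.max(pred[:, 3], target[:, 3]) - torch.min(pred[:, 1], target[:, 1])
    diag2 = enc_w ** 2 + enc_h ** 2 + eps

    loss = 1 - iou + center_dist2 / diag2  # DIoU loss
    if use_ciou:
        # Aspect-ratio consistency term v and its trade-off weight alpha (CIoU).
        v = (4 / math.pi ** 2) * (torch.atan(w2 / (h2 + eps)) - torch.atan(w1 / (h1 + eps))) ** 2
        with torch.no_grad():
            alpha = v / (1 - iou + v + eps)
        loss = loss + alpha * v
    return loss
```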

Contributions:

1. A Distance-IoU loss, i.e., DIoU loss, is proposed for bounding box regression, which has faster convergence than IoU and GIoU losses.

2. A Complete IoU loss, i.e., CIoU loss, is further proposed by considering three geometric measures, i.e., overlap area, central point distance and aspect ratio, which better describes the regression of rectangular boxes.

3. DIoU is deployed in NMS, and is more robust than the original NMS for suppressing redundant boxes (a sketch follows this list).

4. The proposed methods can be easily incorporated into the state-of-the-art detection algorithms, achieving notable performance gains.
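Regarding point 3, DIoU-NMS replaces the plain IoU suppression criterion with IoU minus the normalized center-distance penalty, so that two boxes with high overlap but distant centers (likely covering different objects) can both be kept. Below is a minimal greedy sketch of that criterion; the function name `diou_nms`, the box format and the default threshold are assumptions for illustration, not the paper's official code.

```python
import torch

def diou_nms(boxes, scores, iou_threshold=0.5):
    """Greedy NMS that suppresses a box only when IoU minus the DIoU
    center-distance penalty exceeds the threshold.

    boxes: (N, 4) tensor in (x1, y1, x2, y2) format; scores: (N,) tensor.
    Returns a list of kept box indices.
    """
    order = scores.argsort(descending=True)
    keep = []
    while order.numel() > 0:
        i = order[0]
        keep.append(i.item())
        if order.numel() == 1:
            break
        rest = order[1:]

        # IoU between the current highest-scoring box and the remaining boxes.
        xx1 = torch.max(boxes[i, 0], boxes[rest, 0])
        yy1 = torch.max(boxes[i, 1], boxes[rest, 1])
        xx2 = torch.min(boxes[i, 2], boxes[rest, 2])
        yy2 = torch.min(boxes[i, 3], boxes[rest, 3])
        inter = (xx2 - xx1).clamp(min=0) * (yy2 - yy1).clamp(min=0)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_r - inter + 1e-7)

        # DIoU penalty: squared center distance over squared enclosing-box diagonal.
        ci = (boxes[i, :2] + boxes[i, 2:]) / 2
        cr = (boxes[rest, :2] + boxes[rest, 2:]) / 2
        center_dist2 = ((ci - cr) ** 2).sum(dim=1)
        enc_w = torch.max(boxes[i, 2], boxes[rest, 2]) - torch.min(boxes[i, 0], boxes[rest, 0])
        enc_h = torch.max(boxes[i, 3], boxes[rest, 3]) - torch.min(boxes[i, 1], boxes[rest, 1])
        diag2 = enc_w ** 2 + enc_h ** 2 + 1e-7

        # Keep only boxes whose (IoU - penalty) stays below the threshold.
        order = rest[(iou - center_dist2 / diag2) <= iou_threshold]
    return keep
```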

Open-source implementations:

https://github.com/Zzh-tju/DIoU

https://github.com/generalized-iou/g-darknet  

https://github.com/JaryHuang/awesome_SSD_FPN_GIoU

https://github.com/generalized-iou/Detectron.pytorch
