Conference paper, 2023

Towards few-annotation learning for object detection: are transformer-based models more efficient?

Abstract

For specialized and dense downstream tasks such as object detection, labeling data requires expertise and can be very expensive, making few-shot and semi-supervised models much more attractive alternatives. While we observe that, in the few-shot setup, transformer-based object detectors perform better than convolution-based two-stage models with a similar number of parameters, they are not as effective when used with recent approaches in the semi-supervised setting. In this paper, we propose a semi-supervised method tailored for the current state-of-the-art object detector Deformable DETR in the few-annotation learning setup, using a student-teacher architecture that avoids relying on sensitive post-processing of the pseudo-labels generated by the teacher model. We evaluate our method on the semi-supervised object detection benchmarks COCO and Pascal VOC, where it outperforms previous methods, especially when annotations are scarce. We believe our contributions open new possibilities for adapting similar object detection methods to this setting.
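The abstract's central ingredient, a student-teacher architecture trained on teacher-generated pseudo-labels, can be illustrated with a minimal, hypothetical sketch. The code below is generic and is not the paper's exact method: it assumes a placeholder criterion(outputs, targets) detection loss (for Deformable DETR this would be a set-based matching loss), two augmented views of each unlabeled image, and a teacher kept as an exponential-moving-average (EMA) copy of the student.

```python
# Hypothetical sketch of a generic student-teacher semi-supervised loop
# (not the paper's exact method). The teacher is an EMA copy of the student
# and produces pseudo-labels on unlabeled images.
import copy
import torch


def build_teacher(student: torch.nn.Module) -> torch.nn.Module:
    """Create a frozen teacher initialized from the student."""
    teacher = copy.deepcopy(student)
    for p in teacher.parameters():
        p.requires_grad_(False)  # the teacher is never updated by gradients
    return teacher


@torch.no_grad()
def ema_update(teacher, student, momentum=0.999):
    """Exponential moving average of student weights into the teacher."""
    for t, s in zip(teacher.parameters(), student.parameters()):
        t.mul_(momentum).add_(s, alpha=1.0 - momentum)


def training_step(student, teacher, labeled_batch, unlabeled_batch,
                  criterion, optimizer, unsup_weight=1.0):
    images_l, targets_l = labeled_batch
    images_u_weak, images_u_strong = unlabeled_batch  # two augmented views

    # 1) Supervised loss on the small labeled set.
    loss_sup = criterion(student(images_l), targets_l)

    # 2) The teacher predicts pseudo-labels on weakly augmented unlabeled images.
    with torch.no_grad():
        pseudo_targets = teacher(images_u_weak)

    # 3) The student is trained to match them on strongly augmented views.
    loss_unsup = criterion(student(images_u_strong), pseudo_targets)

    loss = loss_sup + unsup_weight * loss_unsup
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # 4) Slowly propagate student weights into the teacher.
    ema_update(teacher, student)
    return loss.item()
```

In many semi-supervised detection pipelines, the teacher's raw predictions are filtered by confidence thresholding and non-maximum suppression before being used as pseudo-labels; the paper's contribution is a method that avoids depending on such sensitive post-processing, which the generic sketch above does not capture.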
Main file

0604_NoteIEEE_CVF.pdf (1 MB)
Origin: Files produced by the author(s)

Dates and versions

cea-04044305, version 1 (24-03-2023)

Identifiers

HAL Id: cea-04044305
DOI: 10.1109/WACV56688.2023.00016

Cite

Quentin Bouniot, Angélique Loesch, Amaury Habrard, Romaric Audigier. Towards few-annotation learning for object detection: are transformer-based models more efficient?. WACV 2023 - IEEE/CVF Winter Conference on Applications of Computer Vision, Jan 2023, Waikoloa, HI, United States. pp.75-84, ⟨10.1109/WACV56688.2023.00016⟩. ⟨cea-04044305⟩