dddzg/up-detr

[TPAMI 2022 & CVPR 2021 Oral] UP-DETR: Unsupervised Pre-training for Object Detection with Transformers

Score: 39 / 100 (Emerging)

Introduces a random query patch detection pretext task for unsupervised transformer pre-training, eliminating annotation requirements during pre-training while leveraging a SwAV-initialized CNN backbone. Built on the DETR codebase with a ResNet-50 backbone and a transformer encoder-decoder, it achieves 43.1 AP on COCO after 300-epoch fine-tuning, outperforming supervised ImageNet pre-training at comparable training cost.
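For intuition, here is a minimal PyTorch sketch of the data side of random query patch detection, based only on the high-level description above; the function name and the patch count/size parameters are illustrative, not taken from the repo. It crops random patches from an unlabeled image and records each patch's location as the regression target.

import torch
import torchvision.transforms.functional as TF

def random_query_patch_targets(image, num_patches=10, min_frac=0.1, max_frac=0.5):
    # image: (C, H, W) tensor of an unlabeled image.
    # Returns resized patches plus their normalized (cx, cy, w, h) boxes,
    # the targets the decoder learns to predict during pre-training.
    _, H, W = image.shape
    patches, boxes = [], []
    for _ in range(num_patches):
        ph = max(1, int(torch.empty(1).uniform_(min_frac, max_frac).item() * H))
        pw = max(1, int(torch.empty(1).uniform_(min_frac, max_frac).item() * W))
        top = int(torch.randint(0, H - ph + 1, (1,)))
        left = int(torch.randint(0, W - pw + 1, (1,)))
        patches.append(TF.resized_crop(image, top, left, ph, pw, [128, 128]))
        boxes.append(torch.tensor([(left + pw / 2) / W, (top + ph / 2) / H,
                                   pw / W, ph / H]))
    return torch.stack(patches), torch.stack(boxes)

In the paper's scheme, each cropped patch is encoded by the frozen backbone, global-average-pooled, added to a subset of object queries, and the decoder is trained with DETR's bipartite-matching loss to recover the recorded boxes.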

489 stars. No commits in the last 6 months.

Flags: Stale (6 months), No Package, No Dependents

Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 9 / 25
Community: 20 / 25

Stars: 489
Forks: 72
Language: Python
License: Apache-2.0
Last pushed: Jul 19, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/dddzg/up-detr"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000 requests/day.
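For scripted access, a minimal Python sketch using requests; the response schema is not documented here, so the example simply prints the raw JSON payload.

import requests

URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/dddzg/up-detr"

# No key is needed within the free 100-requests/day tier.
resp = requests.get(URL, timeout=10)
resp.raise_for_status()
print(resp.json())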