# LD

> Localization Distillation for Dense Object Detection

## Abstract

Knowledge distillation (KD) has proven to be a powerful technique for learning compact models in object detection. Previous KD methods for object detection mostly focus on imitating deep features within the imitation regions rather than mimicking classification logits, because logit mimicking is inefficient at distilling localization information. In this paper, by reformulating the knowledge distillation process on localization, we present a novel localization distillation (LD) method that can efficiently transfer localization knowledge from the teacher to the student. Moreover, we heuristically introduce the concept of a valuable localization region, which helps to selectively distill the semantic and localization knowledge for a given region. Combining these two new components, we show for the first time that logit mimicking can outperform feature imitation, and that localization knowledge distillation is more important and more efficient than semantic knowledge for distilling object detectors. Our distillation scheme is simple as well as effective and can be easily applied to different dense object detectors. Experiments show that LD can boost the AP score of GFocal-ResNet-50 with a single-scale 1x training schedule from 40.1 to 42.1 on the COCO benchmark without any sacrifice in inference speed.
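The core of LD is a temperature-scaled KL divergence between the teacher's and the student's discretized box-edge distributions (the per-edge bin distributions used by GFocal). The dependency-free sketch below illustrates that computation for a single box edge; it is an illustration only, not the repository's implementation (which is a PyTorch loss operating on batched logits), and the names `ld_loss` and `temperature` are chosen here for clarity.

```python
import math

def _softmax(logits, temperature):
    """Numerically stable softmax of logits / temperature."""
    m = max(x / temperature for x in logits)
    exps = [math.exp(x / temperature - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def ld_loss(student_logits, teacher_logits, temperature=10.0):
    """Localization distillation for one box edge.

    Computes KL(teacher || student) over the edge's bin distribution,
    with both distributions softened by a temperature and the result
    scaled by T^2 (the standard KD scaling that keeps gradient
    magnitudes comparable across temperatures).
    """
    p = _softmax(teacher_logits, temperature)
    q = _softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0.0)
    return kl * temperature * temperature
```

When student and teacher agree exactly, the loss is zero; any mismatch in the predicted edge distribution yields a positive penalty, pushing the student's localization uncertainty toward the teacher's.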

## Results and Models

### GFocalV1 with LD

| Teacher   | Student | Training schedule | Mini-batch size | AP (val) | Config | Download     |
| :-------: | :-----: | :---------------: | :-------------: | :------: | :----: | :----------: |
| --        | R-18    | 1x                | 6               | 35.8     |        |              |
| R-101     | R-18    | 1x                | 6               | 36.5     | config | model \| log |
| --        | R-34    | 1x                | 6               | 38.9     |        |              |
| R-101     | R-34    | 1x                | 6               | 39.9     | config | model \| log |
| --        | R-50    | 1x                | 6               | 40.1     |        |              |
| R-101     | R-50    | 1x                | 6               | 41.0     | config | model \| log |
| --        | R-101   | 2x                | 6               | 44.6     |        |              |
| R-101-DCN | R-101   | 2x                | 6               | 45.5     | config | model \| log |

### Note

- Config naming: in `ld_r18-gflv1-r101_fpn_1x_coco.py`, `r18` is the student backbone, `gflv1` the base detector (GFocalV1), `r101` the teacher backbone, `fpn` the neck, `1x` the training schedule (12 epochs), and `coco` the dataset.

## Citation

```latex
@inproceedings{zheng2022LD,
  title={Localization Distillation for Dense Object Detection},
  author={Zheng, Zhaohui and Ye, Rongguang and Wang, Ping and Ren, Dongwei and Zuo, Wangmeng and Hou, Qibin and Cheng, Ming-Ming},
  booktitle={CVPR},
  year={2022}
}
```