Active Object Detection with Knowledge Aggregation and Distillation from Large Models

Cited by: 2
Authors: Yang, Dejie [1]; Liu, Yang [1]
Affiliations: [1] Peking Univ, Wangxuan Inst Comp Technol, Beijing, Peoples R China
Funding: National Natural Science Foundation of China
DOI: 10.1109/CVPR52733.2024.01573
CLC number: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Accurately detecting active objects undergoing state changes is essential for comprehending human interactions and facilitating decision-making. Existing methods for active object detection (AOD) primarily rely on the visual appearance of objects within the input, such as changes in size, shape, and relationship with hands. However, these visual changes can be subtle, posing challenges, particularly in scenarios with multiple distracting no-change instances of the same category. We observe that state changes are often the result of an interaction being performed upon the object, and thus propose to use informed priors about plausible object-related interactions (including semantics and visual appearance) to provide more reliable cues for AOD. Specifically, we propose a knowledge aggregation procedure to integrate the aforementioned informed priors into oracle queries within the teacher decoder, offering more object affordance commonsense to locate the active object. To streamline the inference process and reduce extra knowledge inputs, we propose a knowledge distillation approach that encourages the student decoder to mimic the detection capabilities of the teacher decoder's oracle query by replicating its predictions and attention. Our proposed framework achieves state-of-the-art performance on four datasets, namely Ego4D, Epic-Kitchens, MECCANO, and 100DOH, which demonstrates the effectiveness of our approach in improving AOD. The code and models are available at https://github.com/idejie/KAD.git.
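The distillation described in the abstract pairs two objectives: the student decoder replicates the teacher's predictions and its attention. The paper's exact losses are not given here, so the following is only a minimal NumPy sketch of that general pattern; the function names, the temperature-scaled KL term for predictions, the MSE term for attention maps, and the `alpha` weighting are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def distillation_loss(student_logits, teacher_logits,
                      student_attn, teacher_attn,
                      temperature=2.0, alpha=0.5):
    """Hypothetical combined distillation loss.

    - Prediction term: KL divergence between temperature-softened
      class distributions of teacher and student.
    - Attention term: mean squared error between decoder attention maps.
    `alpha` balances the two terms.
    """
    t = softmax(teacher_logits / temperature)
    s = softmax(student_logits / temperature)
    # KL(teacher || student), averaged over queries; small epsilon for stability.
    kl = np.sum(t * (np.log(t + 1e-8) - np.log(s + 1e-8)), axis=-1).mean()
    attn_mse = np.mean((student_attn - teacher_attn) ** 2)
    # temperature**2 rescales soft-target gradients, as in standard KD.
    return alpha * kl * temperature**2 + (1 - alpha) * attn_mse

# A student that exactly matches the teacher incurs zero loss.
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 10))          # 4 object queries, 10 classes
attn = softmax(rng.normal(size=(4, 25)))   # attention over 25 image tokens
print(distillation_loss(logits, logits, attn, attn))  # → 0.0
```

The key design point this sketch illustrates is that only the teacher consumes the extra knowledge inputs; at inference time the student runs alone, so the knowledge cost is paid once during training.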
Pages: 16624-16633
Page count: 10