FOE-based regularization for optical flow estimation from an in-vehicle event camera

Cited by: 0
Authors
Nagata J. [1 ]
Sekikawa Y. [1 ]
Hara K. [2 ]
Aoki Y. [1 ]
Affiliations
[1] Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama, Kanagawa
[2] Denso IT Laboratory Inc., Shibuya Cross Tower 28th Floor, 2-15-1 Shibuya, Shibuya-ku, Tokyo
Source
IEEJ Transactions on Electronics, Information and Systems | 2019 / Vol. 139 / No. 10
Keywords
Event camera; Focus of expansion; Optical flow;
DOI
10.1541/ieejeiss.139.1113
Abstract
Optical flow estimation from an in-vehicle camera is an important task in automatic driving and advanced driver-assistance systems. However, conventional optical flow estimation is error-prone under high contrast and high-speed motion. An event camera can overcome these situations because it reports only per-pixel intensity changes, with high dynamic range and low latency. However, the L1 smoothness regularization used in conventional optical flow estimation methods is not suitable for the radial optical flow of a driving scene. Therefore, we propose to use the focus of expansion (FOE) to regularize optical flow estimation with an event camera. The FOE is defined as the intersection of the camera's translation vector with the image plane. Excluding the rotational component, the optical flow radiates from the FOE. Using this property, the optical flow can be regularized toward the correct direction during optimization. We demonstrate on a public dataset that introducing our regularization improves the estimated optical flow. © 2019 The Institute of Electrical Engineers of Japan.
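As a rough illustration of the directional constraint described in the abstract (a minimal sketch, not the authors' implementation), the regularizer can be viewed as a penalty on the flow component orthogonal to the ray from the FOE to each pixel; the function name foe_direction_penalty and the dense-array representation below are hypothetical, and the FOE is assumed to be known.

import numpy as np

def foe_direction_penalty(flow, foe):
    # flow: (H, W, 2) array of optical flow vectors (u, v) in pixels.
    # foe:  (2,) array with the focus of expansion (x, y) in pixel coordinates.
    # Returns the mean squared magnitude of the flow component orthogonal to
    # the ray from the FOE to each pixel; it is zero for purely radial flow.
    h, w = flow.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    dx, dy = xs - foe[0], ys - foe[1]
    norm = np.sqrt(dx**2 + dy**2) + 1e-8
    rx, ry = dx / norm, dy / norm                    # unit direction from the FOE to each pixel
    ortho = flow[..., 0] * ry - flow[..., 1] * rx    # 2D cross product: non-radial flow component
    return np.mean(ortho**2)

# Purely radial flow emanating from the FOE incurs (almost) no penalty.
H, W = 4, 6
foe = np.array([W / 2.0, H / 2.0])
ys, xs = np.mgrid[0:H, 0:W]
radial_flow = 0.1 * np.stack([xs - foe[0], ys - foe[1]], axis=-1)
print(foe_direction_penalty(radial_flow, foe))  # ~0.0

In an energy-minimization setting, a penalty of this form would complement the usual L1 smoothness term so that the optimizer favors flow fields that radiate from the FOE, which is the behaviour the abstract attributes to purely translational ego-motion.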
Pages: 1113 - 1118
Page count: 5
Related papers
50 records in total
  • [1] FOE-based Regularization for Optical Flow Estimation from an In-vehicle Event Camera
    Nagata, Jun
    Sekikawa, Yusuke
    Hara, Kosuke
    Aoki, Yoshimitsu
    INTERNATIONAL WORKSHOP ON ADVANCED IMAGE TECHNOLOGY (IWAIT) 2019, 2019, 11049
  • [2] FOE-based regularization for optical flow estimation from an in-vehicle event camera
    Nagata, Jun
    Sekikawa, Yusuke
    Hara, Kosuke
    Aoki, Yoshimitsu
    ELECTRONICS AND COMMUNICATIONS IN JAPAN, 2020, 103 (1-4) : 19 - 25
  • [3] Vehicle Speed Estimation by In-Vehicle Camera
    Kaneko, Hiroki
    Morimoto, Masakazu
    Fujii, Kensaku
    2012 WORLD AUTOMATION CONGRESS (WAC), 2012,
  • [4] Simultaneous Optical Flow and Intensity Estimation from an Event Camera
    Bardow, Patrick
    Davison, Andrew J.
    Leutenegger, Stefan
    2016 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2016, : 884 - 892
  • [5] Single Image Optical Flow Estimation with an Event Camera
    Pan, Liyuan
    Liu, Miaomiao
    Hartley, Richard
    2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2020, : 1669 - 1678
  • [6] Visibility estimation in foggy conditions by in-vehicle camera and radar
    Mori, Kenji
    Kato, Terutoshi
    Takahashi, Tomokazu
    Ide, Ichiro
    Murase, Hiroshi
    Miyahara, Takayuki
    Tamatsu, Yukimasa
    ICICIC 2006: FIRST INTERNATIONAL CONFERENCE ON INNOVATIVE COMPUTING, INFORMATION AND CONTROL, VOL 2, PROCEEDINGS, 2006, : 548 - +
  • [7] Automatic Calibration of an in-Vehicle Camera based on Structure from Motion
    Hayakawa, Kazutaka
    Nishio, Haruki
    Nakagawa, Yoshiaki
    Sato, Tomokazu
    ITE TRANSACTIONS ON MEDIA TECHNOLOGY AND APPLICATIONS, 2025, 13 (01): : 136 - 146
  • [8] Time to collision estimation for vehicles coming from behind using in-vehicle camera
    Cosic, Luka
    Vranjes, Mario
    Ilkic, Veljko
    Mihic, Velibor
    2019 ZOOMING INNOVATION IN CONSUMER TECHNOLOGIES CONFERENCE (ZINC), 2019, : 109 - 112
  • [9] Camera orientation estimation using voting approach on the Gaussian sphere for in-vehicle camera
    Jo, Youngran
    Jang, Jinbeum
    Shin, Minwoo
    Paik, Joonki
    OPTICS EXPRESS, 2019, 27 (19) : 26600 - 26614
  • [10] Removing Reflection from In-vehicle Camera Image
    Inoue, Keisuke
    Sakaue, Fumihiko
    Sato, Jun
    VISAPP: PROCEEDINGS OF THE 15TH INTERNATIONAL JOINT CONFERENCE ON COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS THEORY AND APPLICATIONS, VOL 4: VISAPP, 2020, : 222 - 228