Over-Parameterized Variational Optical Flow

Cited by: 0
Authors
Tal Nir
Alfred M. Bruckstein
Ron Kimmel
Affiliations
[1] Technion—Israel Institute of Technology, Department of Computer Science
Keywords
Optical flow; Variational methods; Regularization; Over-parametrization
DOI
Not available
Abstract
A novel optical flow estimation process is introduced, based on a spatio-temporal model in which varying coefficients multiply a set of basis functions at each pixel. Previous optical flow estimation methodologies did not use such an over-parameterized representation of the flow field, as the problem is ill-posed even without introducing additional parameters: neighborhood-based methods of the Lucas–Kanade type determine the flow at each pixel by constraining it to be described by a few parameters within small neighborhoods, while modern variational methods represent the optical flow directly via the flow field components at each pixel. The benefit of over-parameterization becomes evident in the smoothness term, which, instead of directly penalizing changes in the optical flow, accumulates a cost for deviating from the assumed optical flow model. The proposed method is very general: the classical variational optical flow techniques are special cases of it, obtained when it is used in conjunction with constant basis functions. Experimental results with the novel flow estimation process yield significant improvements over the best results published so far.
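The key point of the abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: it assumes an affine basis (1, x, y) for the horizontal flow component and compares a classical smoothness cost on the flow itself with the over-parameterized cost on the coefficients.

```python
import numpy as np

# Hypothetical sketch of the over-parameterized representation: the
# horizontal flow u at each pixel is a sum of per-pixel coefficients
# multiplying fixed basis functions (here an affine basis: 1, x, y).
H, W = 4, 4
ys, xs = np.mgrid[0:H, 0:W].astype(float)
basis = np.stack([np.ones((H, W)), xs, ys])        # phi_1, phi_2, phi_3

# A globally affine motion is captured by spatially CONSTANT coefficients.
coeffs = np.stack([0.5 * np.ones((H, W)),          # c_1: translation
                   0.1 * np.ones((H, W)),          # c_2: x-dependence
                   -0.2 * np.ones((H, W))])        # c_3: y-dependence

u = np.sum(coeffs * basis, axis=0)                 # u(x, y) = sum_i c_i * phi_i

# A classical regularizer penalizes variation of u itself, which is
# nonzero here even though the motion fits the affine model exactly.
grad_u = (np.abs(np.diff(u, axis=1)).sum()
          + np.abs(np.diff(u, axis=0)).sum())

# The over-parameterized smoothness term instead penalizes variation of
# the coefficients, which vanishes for a pure affine flow.
grad_c = sum(np.abs(np.diff(c, axis=1)).sum()
             + np.abs(np.diff(c, axis=0)).sum()
             for c in coeffs)

print(grad_u > 0)   # True: classical smoothness cost is positive
print(grad_c == 0)  # True: model-based cost is zero
```

This is why the over-parameterized smoothness term charges nothing for motions that obey the assumed model, whereas a direct penalty on the flow field would discourage them.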
Pages: 205–216
Page count: 11