Over-Parameterized Variational Optical Flow

Cited by: 0
Authors
Tal Nir
Alfred M. Bruckstein
Ron Kimmel
Affiliations
[1] Technion—Israel Institute of Technology, Department of Computer Science
Keywords
Optical flow; Variational methods; regularization; Over-parametrization
DOI
Not available
Abstract
A novel optical flow estimation process is introduced, based on a spatio-temporal model in which varying coefficients multiply a set of basis functions at each pixel. Previous optical flow estimation methodologies did not use such an over-parameterized representation of the flow field, since the problem is ill-posed even without introducing additional parameters: neighborhood-based methods of the Lucas–Kanade type determine the flow at each pixel by constraining it to be described by a few parameters over small neighborhoods, while modern variational methods represent the optical flow directly via the flow field components at each pixel. The benefit of over-parameterization becomes evident in the smoothness term, which, instead of directly penalizing changes in the optical flow, accumulates a cost for deviating from the assumed optical flow model. The proposed method is very general, and the classical variational optical flow techniques are special cases of it, obtained when constant basis functions are used. Experimental results with the novel flow estimation process yield significant improvements over the best results published so far.
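To make the construction concrete, the following is a minimal sketch, in LaTeX notation, of the kind of energy functional the abstract describes. The symbols are illustrative assumptions rather than the paper's exact notation: q_i are the per-pixel coefficients, \phi_i the basis functions, \Psi a robust penalty, and \alpha the smoothness weight.

    % Over-parameterized flow: each flow component is expanded over n basis
    % functions \phi_i with spatially varying coefficients q_i (likewise for v)
    u(x,y) = \sum_{i=1}^{n} q_i(x,y)\, \phi_i(x,y)

    % Energy: a brightness-constancy data term plus a smoothness term that
    % penalizes variation of the coefficients rather than of the flow itself
    E(q) = \int \Psi\big( ( I(x+u,\, y+v,\, t+1) - I(x,y,t) )^2 \big)\, dx\, dy
           + \alpha \int \Psi\Big( \textstyle\sum_{i=1}^{n} \|\nabla q_i\|^2 \Big)\, dx\, dy

    % Special case: a single constant basis function \phi_1 \equiv 1 gives
    % u = q_1, so the smoothness term reduces to the classical penalty on
    % \|\nabla u\|^2 and the model recovers standard variational optical flow

With a richer basis, for instance an affine one such as {1, x, y}, a flow that is truly affine over a region has constant coefficients there, so the regularizer assigns it no cost; this is the sense in which the smoothness term charges for deviation from the assumed motion model rather than for any change in the flow.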
Pages: 205-216
Number of pages: 11
Related papers
50 items in total
  • [21] DO-Conv: Depthwise Over-Parameterized Convolutional Layer
    Cao, Jinming
    Li, Yangyan
    Sun, Mingchao
    Chen, Ying
    Lischinski, Dani
    Cohen-Or, Daniel
    Chen, Baoquan
    Tu, Changhe
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2022, 31 : 3726 - 3736
  • [22] Convergence beyond the over-parameterized regime using Rayleigh quotients
    Robin, David A. R.
    Scaman, Kevin
    Lelarge, Marc
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022, 35
  • [23] Understanding Implicit Regularization in Over-Parameterized Single Index Model
    Fan, Jianqing
    Yang, Zhuoran
    Yu, Mengxin
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2023, 118 (544) : 2315 - 2328
  • [24] Does Preprocessing Help Training Over-parameterized Neural Networks?
    Song, Zhao
    Yang, Shuo
    Zhang, Ruizhe
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [26] Over-Parameterized Model Optimization with Polyak-Lojasiewicz Condition
    Chen, Yixuan
    Shi, Yubin
    Dong, Mingzhi
    Yang, Xiaochen
    Li, Dongsheng
    Wang, Yujiang
    Dick, Robert P.
    Lv, Qin
    Zhao, Yingying
    Yang, Fan
    Gu, Ning
    Shang, Li
    11TH INTERNATIONAL CONFERENCE ON LEARNING REPRESENTATIONS (ICLR 2023), 2023
  • [27] Rethinking Gauss-Newton for learning over-parameterized models
    Arbel, Michael
    Menegaux, Romain
    Wolinski, Pierre
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [28] Equivalence of predictors under real and over-parameterized linear models
    Gan, Shengjun
    Sun, Yuqin
    Tian, Yongge
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2017, 46 (11) : 5368 - 5383
  • [29] Gradient descent optimizes over-parameterized deep ReLU networks
    Zou, Difan
    Cao, Yuan
    Zhou, Dongruo
    Gu, Quanquan
    MACHINE LEARNING, 2020, 109 (03) : 467 - 492
  • [30] Preconditioned Gradient Descent for Over-Parameterized Nonconvex Matrix Factorization
    Zhang, Gavin
    Fattahi, Salar
    Zhang, Richard Y.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34