View-dependent Scene Appearance Synthesis using Inverse Rendering from Light Fields

Cited by: 4
Authors
Kang, Dahyun [1 ]
Jeon, Daniel S. [1 ]
Kim, Hakyeong [1 ]
Jang, Hyeonjoong [1 ]
Kim, Min H. [1 ]
Affiliations
[1] Korea Adv Inst Sci & Technol, Sch Comp, Daejeon 34141, South Korea
Keywords
Light field; view synthesis; inverse rendering
DOI
10.1109/ICCP51581.2021.9466274
CLC classification
TP18 [Artificial Intelligence Theory]
Subject classification
081104; 0812; 0835; 1405
Abstract
To enable view-dependent appearance synthesis from the light fields of a scene, it is critical to evaluate the geometric relationships between light and view over surfaces in the scene with high accuracy. Perfectly diffuse reflectance is commonly assumed when estimating geometry from light fields via multiview stereo. However, this diffuse-surface assumption is invalid for real-world objects, and geometry estimated from light fields degrades severely over specular surfaces. Additional scene-scale 3D scanning based on active illumination could provide reliable geometry, but it is sparse and thus still insufficient for computing view-dependent appearance, such as specular reflection, in geometry-based view synthesis. In this work, we present a practical inverse-rendering solution that enables view-dependent appearance synthesis, particularly at scene scale. We enhance the scene geometry by eliminating the specular component, thus enforcing photometric consistency. We then estimate spatially-varying diffuse, specular, and normal components from wide-baseline light fields. To validate our method, we built a wide-baseline light field imaging prototype consisting of 32 machine vision cameras with 185° fisheye lenses that cover the forward hemispherical appearance of scenes. We captured various indoor scenes, and the results confirm that our method estimates scene geometry and reflectance parameters with high accuracy, enabling high-fidelity view-dependent appearance synthesis at scene scale, i.e., specular reflection changes according to the virtual viewpoint.
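The abstract describes removing the specular component from the light-field observations so that multiview stereo sees photometrically consistent (diffuse) surfaces. A minimal illustrative sketch of that idea, not the authors' actual pipeline, is the classic multi-view minimum heuristic: specular highlights appear only from a few viewing directions, so the per-point minimum radiance across views approximates the diffuse component. The function names and the pre-warped input layout below are assumptions for illustration only.

```python
import numpy as np

def diffuse_from_views(observations: np.ndarray) -> np.ndarray:
    """Estimate the diffuse component of surface points observed from
    multiple views (heuristic sketch, not the paper's full method).

    observations: (n_views, H, W, 3) radiance samples of the same surface
    points, assumed already warped into a common reference view.
    Returns an (H, W, 3) diffuse estimate: the per-point minimum over
    views, which suppresses view-dependent specular lobes.
    """
    return observations.min(axis=0)

def specular_residual(observations: np.ndarray) -> np.ndarray:
    """Per-view specular component: radiance exceeding the diffuse floor.
    Returns an array of the same shape as `observations`."""
    return observations - diffuse_from_views(observations)[None]
```

The residual is nonnegative by construction, so downstream stereo matching can run on the diffuse estimate while the residual feeds a separate specular-parameter fit, as the paper's pipeline does with its spatially-varying reflectance model.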
Pages: 12