A deep learning approach to water point detection and mapping using street-level imagery

Cited by: 0
Authors
Patel, Neil [1]
Affiliation
[1] MIT, 50 Mem Dr, Cambridge, MA 02139 USA
Keywords
deep learning; geographic information systems; street-level imagery; urban; water point mapping; sanitation; access
DOI
10.2166/wpt.2024.197
CLC number
TV21 [Water resources survey and water conservancy planning]
Subject classification code
081501
Abstract
Households in developing countries often rely on shared alternative water sources that fall outside the datasets of public service providers. This makes it difficult to measure accurately how many households outside the public service system use a safe and accessible water source. This article proposes a deep learning approach that uses a convolutional neural network to detect water points in street-level imagery from Google Street View. In a case study of the Agege local government area (LGA) in Lagos, Nigeria, the model detected 36 previously unregistered water points with 94.7% precision.
HIGHLIGHTS
  • The article presents a pipeline that extracts street-level imagery from Google Street View and applies a YOLOv5 object detection model to detect shared water points.
  • In a pilot study of Agege LGA in Lagos, Nigeria, the model detected 36 previously unregistered water points with 94.7% precision.
  • Model performance is evaluated in terms of inference speed, cost, and scalability.
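The abstract outlines a two-stage pipeline: pull street-level frames from Google Street View, then run a custom-trained YOLOv5 detector over them. The following is a minimal Python sketch of that pipeline, not the authors' released code, assuming the Google Street View Static API and the torch.hub interface to ultralytics/yolov5. The weights filename (water_point.pt), API key placeholder, confidence threshold, sample coordinate, and four-heading sweep are illustrative assumptions, not details taken from the paper.

    # Sketch of the abstract's pipeline: fetch a Street View frame for a
    # coordinate, then run a custom-trained YOLOv5 detector over it.
    import requests
    import torch

    STREET_VIEW_URL = "https://maps.googleapis.com/maps/api/streetview"
    API_KEY = "YOUR_GOOGLE_MAPS_API_KEY"  # assumption: caller supplies a valid key

    def fetch_street_view(lat: float, lon: float, heading: int, path: str) -> str:
        """Download one 640x640 Street View frame for a point and heading."""
        params = {
            "size": "640x640",
            "location": f"{lat},{lon}",
            "heading": heading,
            "key": API_KEY,
        }
        resp = requests.get(STREET_VIEW_URL, params=params, timeout=30)
        resp.raise_for_status()
        with open(path, "wb") as f:
            f.write(resp.content)
        return path

    # Load YOLOv5 via torch.hub; 'water_point.pt' stands in for whatever
    # custom weights the study trained (not distributed with the article).
    model = torch.hub.load("ultralytics/yolov5", "custom", path="water_point.pt")
    model.conf = 0.5  # confidence threshold; the paper's actual value is unknown

    # Example: sweep four headings at one illustrative Agege-area coordinate
    # and collect water-point detections from each frame.
    detections = []
    for heading in (0, 90, 180, 270):
        img = fetch_street_view(6.6186, 3.3403, heading, f"frame_{heading}.jpg")
        results = model(img)
        detections.append(results.pandas().xyxy[0])  # boxes as a DataFrame

Running such a sweep over every sampled road point in an LGA, then deduplicating nearby detections, is one plausible way to arrive at the kind of unregistered-water-point count the abstract reports; the paper itself evaluates the approach on inference speed, cost, and scalability.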
Pages: 3485-3494 (10 pages)