
Research Achievements
Li Lin's Paper Published in REMOTE SENSING
Posted: 2018-12-04 13:07:19     Posted by: Yi Zhen

Title: A Multiple-Feature Reuse Network to Extract Buildings from Remote Sensing Imagery

Authors: Li, L (Li, Lin); Liang, J (Liang, Jian); Weng, M (Weng, Min); Zhu, HH (Zhu, Haihong)

Source: REMOTE SENSING  Volume: 10  Issue: 9  Article Number: 1350  DOI: 10.3390/rs10091350  Published: SEP 2018

Abstract: Automatic building extraction from remote sensing imagery is important in many applications. The success of convolutional neural networks (CNNs) has also led to advances in using CNNs to extract man-made objects from high-resolution imagery. However, the large appearance and size variations of buildings make it difficult to extract both crowded small buildings and large buildings. High-resolution imagery must be segmented into patches for CNN models due to GPU memory limitations, and buildings are typically only partially contained in a single patch with little context information. To overcome the problems involved when using different levels of image features with common CNN models, this paper proposes a novel CNN architecture called a multiple-feature reuse network (MFRN), in which each layer is connected to all the subsequent layers of the same size, enabling the direct use of the hierarchical features in each layer. In addition, the model includes a smart decoder that enables precise localization with less GPU load. We tested our model on a large real-world remote sensing dataset and obtained an overall accuracy of 94.5% and an 85% F1 score, which outperformed the compared CNN models, including a 56-layer fully convolutional DenseNet with 93.8% overall accuracy and an F1 score of 83.5%. The experimental results indicate that the MFRN approach to connecting convolutional layers improves the performance of common CNN models for extracting buildings of different sizes and can achieve high accuracy with a consumer-level GPU.
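The connectivity pattern the abstract describes (each layer connected to all subsequent layers of the same spatial size, so that earlier feature maps are reused directly) follows the DenseNet-style idea of feature reuse. The sketch below is a minimal, illustrative PyTorch block showing that pattern only; it is not the authors' MFRN implementation, and the class name, growth rate, and channel counts are assumptions chosen for the example.

```python
# Minimal sketch of dense feature reuse: each layer receives the
# concatenation of the input and all earlier layers' outputs at the
# same spatial size. Illustrative only, not the authors' MFRN code.
import torch
import torch.nn as nn


class DenseBlock(nn.Module):
    def __init__(self, in_channels: int, growth_rate: int = 16, num_layers: int = 4):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth_rate, kernel_size=3, padding=1, bias=False),
            ))
            channels += growth_rate  # each new layer sees all earlier feature maps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            # Reuse every earlier feature map by channel-wise concatenation.
            out = layer(torch.cat(features, dim=1))
            features.append(out)
        return torch.cat(features, dim=1)


if __name__ == "__main__":
    block = DenseBlock(in_channels=32)
    patch = torch.randn(1, 32, 256, 256)  # one image patch, as the abstract describes
    print(block(patch).shape)  # torch.Size([1, 96, 256, 256]): 32 + 4 layers x 16
```

Concatenating rather than summing earlier outputs is what lets later layers see both low-level and high-level features directly, which is the "multiple-feature reuse" property the abstract credits for handling buildings of very different sizes.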

Accession Number: WOS:000449993800030

Document Type: Article

Language: English

Author Keywords: building extraction; deep learning; CNN; FCN

Reprint Address: Li, L (reprint author), Wuhan Univ, Sch Resource & Environm Sci, 129 Luoyu Rd, Wuhan 430079, Hubei, Peoples R China.
Li, L (reprint author), Wuhan Univ, Collaborat Innovat Ctr Geospatial Technol, 129 Luoyu Rd, Wuhan 430079, Hubei, Peoples R China.

E-mail Addresses: lilin@whu.edu.cn; liangjian@whu.edu.cn; wengmin@whu.edu.cn; hhzhu@whu.edu.cn

Addresses:

[Li, Lin; Liang, Jian; Weng, Min; Zhu, Haihong] Wuhan Univ, Sch Resource & Environm Sci, 129 Luoyu Rd, Wuhan 430079, Hubei, Peoples R China.
[Li, Lin] Wuhan Univ, Collaborat Innovat Ctr Geospatial Technol, 129 Luoyu Rd, Wuhan 430079, Hubei, Peoples R China.

Impact Factor: 3.406

