Deep Convolutional Neural Network for Flood Extent Mapping Using Unmanned Aerial Vehicles Data

Published in: Sensors (Basel, Switzerland), Volume 19, Issue 7, p. 1486
Main Authors: Gebrehiwot, Asmamaw; Hashemi-Beni, Leila; Thompson, Gary; Kordjamshidi, Parisa; Langan, Thomas E.
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 27 March 2019
ISSN: 1424-8220
Online Access: Get full text
Abstract: Flooding is one of the leading natural-disaster threats to human life and property, especially in densely populated urban areas. Rapid and precise extraction of flooded areas is key to supporting emergency-response planning and providing damage assessment in both spatial and temporal measurements. Unmanned Aerial Vehicle (UAV) technology has recently been recognized as an efficient photogrammetric data-acquisition platform that can quickly deliver high-resolution imagery because of its cost-effectiveness, ability to fly at lower altitudes, and ability to enter hazardous areas. Different image classification methods, including Support Vector Machines (SVMs), have been used for flood extent mapping. In recent years, there has been significant improvement in remote sensing image classification using Convolutional Neural Networks (CNNs). CNNs have demonstrated excellent performance on various tasks, including image classification, feature extraction, and segmentation. CNNs can learn features automatically from large datasets through the organization of multiple layers of neurons and can implement nonlinear decision functions. This study investigates the potential of CNN approaches to extract flooded areas from UAV imagery. A VGG-based fully convolutional network (FCN-16s) was used in this research. The model was fine-tuned, and k-fold cross-validation was applied to estimate its performance on the new UAV imagery dataset. This approach allowed FCN-16s to be trained on datasets that contained only one hundred training samples and resulted in highly accurate classification. A confusion matrix was calculated to estimate the accuracy of the proposed method. The image segmentation results obtained from FCN-16s were compared with those obtained from FCN-8s, FCN-32s, and SVMs. Experimental results showed that the FCNs could extract flooded areas more precisely from UAV images than traditional classifiers such as SVMs.
The classification accuracy achieved by FCN-16s, FCN-8s, FCN-32s, and SVM for the water class was 97.52%, 97.8%, 94.20%, and 89%, respectively.
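Per-class figures like those above are read off a confusion matrix. A minimal sketch of that computation, using an illustrative three-class matrix (the counts are made up for demonstration, not the paper's data):

```python
import numpy as np

# Hypothetical 3-class confusion matrix (rows = reference, cols = predicted);
# the pixel counts below are illustrative only.
classes = ["water", "building", "vegetation"]
cm = np.array([
    [975,  10,  15],   # reference: water
    [ 20, 940,  40],   # reference: building
    [  5,  30, 965],   # reference: vegetation
])

# Producer's accuracy (recall) per class: correct pixels / total reference pixels.
producers = np.diag(cm) / cm.sum(axis=1)

# Overall accuracy: trace of the matrix over the grand total.
overall = np.trace(cm) / cm.sum()

for name, acc in zip(classes, producers):
    print(f"{name}: {acc:.2%}")
print(f"overall: {overall:.2%}")
```

The diagonal divided by the row sums gives each class's producer's accuracy; the trace over the grand total gives the overall accuracy.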
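The k-fold protocol described in the abstract amounts to index bookkeeping: with only about one hundred training samples, each fold serves as the held-out validation set exactly once. A minimal NumPy sketch (the model-training call is a hypothetical placeholder, not the paper's code):

```python
import numpy as np

def k_fold_indices(n_samples, k, seed=0):
    """Yield (train, validation) index arrays for k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)          # shuffle once up front
    folds = np.array_split(idx, k)            # k near-equal folds
    for i in range(k):
        val = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, val

n, k = 100, 5
seen = []
for train_idx, val_idx in k_fold_indices(n, k):
    # train_model(train_idx); evaluate(val_idx)  # placeholder for the real loop
    seen.extend(val_idx.tolist())

# Every sample is held out for validation exactly once across the k folds.
assert sorted(seen) == list(range(n))
```

Averaging the per-fold validation scores estimates how the fine-tuned model generalizes despite the small dataset.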
Author Hashemi-Beni, Leila
Thompson, Gary
Langan, Thomas E.
Gebrehiwot, Asmamaw
Kordjamshidi, Parisa
AuthorAffiliation 3 Computer Science Department, Tulane University, 6823 St. Charles Avenue, Tulane University, New Orleans, LA 70118, USA; pkordjam@tulane.edu
1 Geomatics Program, Department of Built Environment, North Carolina A&T State University, Greensboro, NC 27411, USA; aagebrehiwot@aggies.ncat.edu
4 Florida Institute for Human and Machine Cognition, Pensacola, FL 32502, USA
2 North Carolina Emergency Management, Geodetic Survey, Raleigh, NC 27699-4298, USA; gary.thompson@ncdps.gov (G.T.); tom.langan@ncdps.gov (T.E.L.)
BackLink: https://www.ncbi.nlm.nih.gov/pubmed/30934695 (View this record in MEDLINE/PubMed)
ContentType Journal Article
Copyright 2019 by the authors. 2019
DOI 10.3390/s19071486
Discipline Engineering
EISSN 1424-8220
ExternalDocumentID oai_doaj_org_article_6a0e7afd1b82418294da070acafcb4ba
PMC6479537
30934695
10_3390_s19071486
Genre Journal Article
GrantInformation_xml – fundername: North Carolina Collaboratory Policy
  grantid: Collaboratory
– fundername: National Science Foundation
  grantid: 1800768
ISICitedReferencesCount 149
ISSN 1424-8220
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 7
Keywords floodplain mapping
fully convolutional network
remote sensing
convolutional neural networks
unmanned aerial vehicles
geospatial data processing
Language English
License Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
OpenAccessLink https://doaj.org/article/6a0e7afd1b82418294da070acafcb4ba
PMID 30934695
PQID 2202199546
PQPubID 23479
PublicationDate 2019-03-27
PublicationPlace Switzerland
PublicationTitle Sensors (Basel, Switzerland)
PublicationTitleAlternate Sensors (Basel)
PublicationYear 2019
Publisher MDPI AG
StartPage 1486
Title Deep Convolutional Neural Network for Flood Extent Mapping Using Unmanned Aerial Vehicles Data
URI https://www.ncbi.nlm.nih.gov/pubmed/30934695
https://www.proquest.com/docview/2202199546
https://pubmed.ncbi.nlm.nih.gov/PMC6479537
https://doaj.org/article/6a0e7afd1b82418294da070acafcb4ba
Volume 19