A deep convolutional encoder-decoder architecture for autonomous fault detection of PV plants using multi-copters
Saved in:
| Title: | A deep convolutional encoder-decoder architecture for autonomous fault detection of PV plants using multi-copters |
|---|---|
| Authors: | Mohammadreza Aghaei, Sayyed Majid Esmailifar, Amirmohammad Moradi Sizkouhi |
| Source: | Solar Energy. 223:217-228 |
| Publisher Information: | Elsevier BV, 2021. |
| Publication Year: | 2021 |
| Subjects: | 0202 electrical engineering, electronic engineering, information engineering, 02 engineering and technology |
| Description: | This study presents an autonomous fault detection method for a wide range of common failures and defects that are visually detectable on PV modules. The paper focuses in particular on the detection of bird droppings, a very typical defect on PV modules. As a crucial prerequisite, a dataset of aerial images of PV strings affected by bird droppings was collected through several experimental flights by multi-copters in order to train an accurate fully convolutional deep network. These images are divided into training, testing, and validation sets. For bird-dropping segmentation, an improved encoder-decoder architecture is employed, with a modified VGG16 model used as the backbone of the encoder. The encoder has a flexible architecture that can be modified and retrained for other visual failure detection tasks. The extracted feature maps are then passed to a decoder network that maps the low-resolution features back to full resolution for pixel-wise segmentation. In addition, an image object positioning algorithm is presented to find the exact position of detected failures in a local coordinate system. In a post-processing step, the detected damages are prioritized based on parameters such as the severity of shading and the extent of impact on the PV module's output current. For further validation, the affected PV modules were characterized according to the output patterns of the classification step in order to accurately evaluate the effect of bird droppings and the consequent shading on PV module parameters, depending on their severity and location. Finally, the training and testing results demonstrate that the proposed FCN is able to precisely predict the pixels covered by bird droppings on PV modules, with average pixel-level accuracies of 98% and 93% for training and testing, respectively. |
| Document Type: | Article |
| File Description: | |
| Language: | English |
| ISSN: | 0038-092X |
| DOI: | 10.1016/j.solener.2021.05.029 |
| Access URL: | http://ui.adsabs.harvard.edu/abs/2021SoEn..223..217M/abstract https://www.sciencedirect.com/science/article/pii/S0038092X21003935 |
| Rights: | CC BY-NC-ND |
| Accession Number: | edsair.doi.dedup.....e2d46373fbdafb5b51eedfca99a20107 |
| Database: | OpenAIRE |
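The abstract above describes a fully convolutional encoder-decoder in which a modified VGG16 serves as the encoder backbone and a decoder upsamples the low-resolution feature maps back to full resolution for pixel-wise segmentation. The following is a minimal sketch of that general architecture in TensorFlow/Keras; the input size, decoder layout, loss, and training settings are illustrative assumptions and are not taken from the paper.

```python
# Hedged sketch: a VGG16-backbone encoder-decoder for binary pixel-wise
# segmentation of soiling/bird droppings on PV modules.
# Layer choices, input size, and training settings are assumptions for
# illustration, not the authors' exact configuration.
import tensorflow as tf
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import VGG16

def build_segmentation_model(input_shape=(224, 224, 3)):
    # Encoder: VGG16 convolutional base (pretrained weights, no dense head)
    encoder = VGG16(include_top=False, weights="imagenet", input_shape=input_shape)
    x = encoder.output  # low-resolution feature maps (7x7x512 for a 224x224 input)

    # Decoder: upsample back to the input resolution for pixel-wise prediction
    for filters in (512, 256, 128, 64, 32):
        x = layers.UpSampling2D(size=2)(x)
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
        x = layers.BatchNormalization()(x)

    # One output channel: probability that a pixel is covered by a defect
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(x)
    return Model(encoder.input, outputs)

model = build_segmentation_model()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_images, train_masks, validation_data=(val_images, val_masks), epochs=50)
```

In this sketch, five 2x upsampling stages return the 7x7 VGG16 feature maps to the 224x224 input resolution, so every input pixel receives a defect probability; the segmented masks could then feed a positioning and prioritization step like the one the abstract outlines.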