Synthetic time series dataset generation for unsupervised autoencoders
| Published in: | 2022 IEEE 27th International Conference on Emerging Technologies and Factory Automation (ETFA), pp. 1 - 8 |
|---|---|
| Main Authors: | , , |
| Format: | Conference Proceeding |
| Language: | English |
| Published: | IEEE, 06.09.2022 |
| Subjects: | |
| Summary: | In Machine Learning, large models need access to a huge amount of training data. This requirement applies to many applications in an industrial environment. However, in specific processes it is not easy to obtain such a large amount of data, for reasons ranging from privacy and security to impacts on the process itself. Therefore, this work proposes the creation of synthetic time series datasets that simulate processes from a given subset of functional relationships. Moreover, transfer learning is used to improve the performance of four autoencoder architectures for unsupervised time series reconstruction while requiring less target data for training. We outline multiple concepts of data generation and use statistical analysis to evaluate dataset performance and complexity. The synthetic data is then used to pre-train unsupervised models and enables them to improve their reconstruction performance over 52 sensors and multiple fault cases. Even when the amount of available training data is reduced, pre-training still yields sufficient results. Overall, we see significant performance and interpretability improvements with a new time series analysis approach named Bag-of-Functions compared to convolutional and linear autoencoders. |
| DOI: | 10.1109/ETFA52439.2022.9921598 |
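
The summary above describes generating synthetic time series from a subset of functional relationships for autoencoder pre-training. The paper's actual generation procedure is not reproduced in this record; the following is only a minimal illustrative sketch of the general idea, assuming each synthetic sensor channel is a random composition of a few hypothetical basis functions (periodic, trend, decay) plus Gaussian noise. All function names and parameter ranges here are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only: hypothetical "bag of functions" style generation,
# not the procedure used in the cited paper.
import numpy as np

rng = np.random.default_rng(seed=0)

# Assumed basis functions; the paper's functional relationships may differ.
BASIS_FUNCTIONS = [
    lambda t, a, b: a * np.sin(b * t),   # periodic component
    lambda t, a, b: a * t + b,           # linear trend
    lambda t, a, b: a * np.exp(-b * t),  # exponential decay
]

def synth_channel(t, n_terms=2, noise_std=0.05):
    """Generate one synthetic sensor channel as a sum of random basis terms."""
    signal = np.zeros_like(t)
    for _ in range(n_terms):
        f = BASIS_FUNCTIONS[rng.integers(len(BASIS_FUNCTIONS))]
        a, b = rng.uniform(0.5, 2.0), rng.uniform(0.1, 1.0)
        signal += f(t, a, b)
    return signal + rng.normal(0.0, noise_std, size=t.shape)

def synth_dataset(n_samples=1000, n_channels=52, length=256):
    """Build an array shaped (samples, channels, time) for autoencoder pre-training."""
    t = np.linspace(0.0, 10.0, length)
    return np.stack([
        np.stack([synth_channel(t) for _ in range(n_channels)])
        for _ in range(n_samples)
    ])

if __name__ == "__main__":
    data = synth_dataset(n_samples=8)
    print(data.shape)  # (8, 52, 256)
```

In a transfer-learning setup of the kind the summary mentions, such a synthetic dataset would be used to pre-train the autoencoders before fine-tuning on the smaller set of real target sensor data.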