Fast deep autoencoder for federated learning


Detailed Description

Bibliographic Details
Published in: Pattern Recognition, Vol. 143, p. 109805
Main authors: Novoa-Paradela, David; Fontenla-Romero, Oscar; Guijarro-Berdiñas, Bertha
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.11.2023
Subjects:
ISSN: 0031-3203
Online access: Full text
Description
Abstract:
• A fast and privacy-preserving training algorithm for deep autoencoders is proposed.
• The encoder block of the autoencoder is trained using Singular Value Decomposition.
• The decoder block of the autoencoder is trained layer by layer.
• The non-iterative nature of the proposed training algorithm makes it faster than other approaches.
• The method is especially suitable for federated learning and edge computing environments.

This paper presents a novel, fast and privacy-preserving implementation of deep autoencoders. DAEF (Deep AutoEncoder for Federated learning), unlike traditional neural networks, trains a deep autoencoder network in a non-iterative way, which drastically reduces training time. Training can be performed incrementally, in parallel and distributed, and, thanks to its mathematical formulation, the information to be exchanged does not endanger the privacy of the training data. The method has been evaluated and compared with other state-of-the-art autoencoders, showing strong results in terms of accuracy, speed, and use of available resources. This makes DAEF a valid method for edge computing and federated learning, in addition to other classic machine learning scenarios.
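The record does not include the paper's actual equations, but the general idea it describes, an SVD-based encoder and a closed-form, non-iterative fit of the decoder, can be sketched for a single linear layer. Everything below (function name, the use of PCA-style SVD and a least-squares decoder) is an illustrative assumption, not the DAEF algorithm itself:

```python
import numpy as np

# Illustrative sketch only: DAEF's exact formulation is not in this record.
# Shown here is the basic non-iterative recipe the highlights describe:
# an encoder taken from the SVD of the data (as in PCA) and a decoder
# fitted in closed form by least squares, with no gradient descent.

def train_svd_autoencoder(X, k):
    """Fit a one-layer linear autoencoder for n x d data X, latent size k."""
    mu = X.mean(axis=0)
    Xc = X - mu                              # center the data
    # SVD of the centered data matrix; rows of Vt are principal directions
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W_enc = Vt[:k].T                         # d x k encoder weights
    Z = Xc @ W_enc                           # latent codes, n x k
    # Closed-form decoder: least-squares map from codes back to the data
    W_dec, *_ = np.linalg.lstsq(Z, Xc, rcond=None)
    return W_enc, W_dec, mu

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3)) @ rng.normal(size=(3, 10))  # rank-3 data in 10-D
W_enc, W_dec, mu = train_svd_autoencoder(X, k=3)
X_hat = (X - mu) @ W_enc @ W_dec + mu        # encode then decode
print(np.allclose(X, X_hat, atol=1e-8))      # rank-3 data is recovered exactly
```

Because both steps are closed-form matrix factorizations rather than iterative optimization, training cost is fixed and predictable, which is the property the abstract credits for DAEF's speed in federated and edge settings.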
DOI: 10.1016/j.patcog.2023.109805