Fast deep autoencoder for federated learning


Detailed bibliography
Published in: Pattern Recognition, Volume 143, p. 109805
Main authors: Novoa-Paradela, David; Fontenla-Romero, Oscar; Guijarro-Berdiñas, Bertha
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.11.2023
ISSN:0031-3203
Description
Summary:
• A fast and privacy-preserving training algorithm for deep autoencoders is proposed.
• The encoder block of the autoencoder is trained using Singular Value Decomposition.
• The decoder block of the autoencoder is trained layer by layer.
• The non-iterative nature of the proposed training algorithm makes it faster than other approaches.
• The method is especially suitable for federated learning and edge computing environments.

This paper presents a novel, fast and privacy-preserving implementation of deep autoencoders. Unlike traditional neural networks, DAEF (Deep AutoEncoder for Federated learning) trains a deep autoencoder network in a non-iterative way, which drastically reduces training time. Training can be performed incrementally, in parallel, and in a distributed fashion, and, thanks to its mathematical formulation, the information to be exchanged does not endanger the privacy of the training data. The method has been evaluated and compared with other state-of-the-art autoencoders, showing interesting results in terms of accuracy, speed, and use of available resources. This makes DAEF a valid method for edge computing and federated learning, in addition to other classic machine learning scenarios.
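The abstract's core idea of replacing iterative gradient descent with an SVD-based, closed-form fit can be illustrated with the well-known equivalence between a linear autoencoder and PCA. This is only a minimal sketch of that general idea, not the paper's DAEF algorithm (whose exact layer-wise formulation is not given here); the function names `svd_encoder`, `encode`, and `decode` are hypothetical.

```python
import numpy as np

def svd_encoder(X, k):
    # Center the data and take the top-k right singular vectors as
    # the weights of a linear encoder -- a single SVD, no iterative
    # gradient training required.
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    W = Vt[:k].T                  # (d, k) encoder matrix
    return mu, W

def encode(X, mu, W):
    return (X - mu) @ W

def decode(Z, mu, W):
    # For an orthonormal encoder, the transpose is the least-squares
    # optimal linear decoder.
    return Z @ W.T + mu

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
mu, W = svd_encoder(X, k=10)      # full rank: reconstruction is lossless
X_rec = decode(encode(X, mu, W), mu, W)
print(np.allclose(X, X_rec))      # True
```

With k smaller than the input dimension, the same construction yields the best rank-k linear reconstruction of the data; the paper's contribution lies in extending this kind of non-iterative fitting to deep, nonlinear autoencoders in a federated setting.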
DOI:10.1016/j.patcog.2023.109805