Privacy Preserving Machine Learning with Homomorphic Encryption and Federated Learning

Bibliographic Details
Published in: Future Internet, Vol. 13, No. 4, p. 94
Main Authors: Fang, Haokun; Qian, Quan
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.04.2021
Subjects:
ISSN: 1999-5903
Description
Summary: Privacy protection has become an important concern alongside the great success of machine learning. This paper proposes a multi-party privacy-preserving machine learning framework, named PFMLP, based on partially homomorphic encryption and federated learning. The core idea is that all learning parties transmit only homomorphically encrypted gradients. Experiments show that a model trained with PFMLP reaches almost the same accuracy as one trained without encryption, with a deviation of less than 1%. To reduce the computational overhead of homomorphic encryption, an improved Paillier algorithm is used, which speeds up training by 25–28%. Comparisons of encryption key length, learning network structure, number of learning clients, etc., are also discussed in detail in the paper.
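As a rough illustration of that core idea, the sketch below shows additively homomorphic gradient aggregation using the third-party python-paillier (phe) library; it is not the authors' PFMLP implementation or their improved Paillier variant, and the gradient values, key length, and client count are illustrative only.

from phe import paillier  # third-party Paillier library (assumed available)

# Key pair for the learning parties; key distribution is simplified here.
public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

# Illustrative per-client gradients for a single model parameter.
client_gradients = [0.12, -0.05, 0.33]

# Each client encrypts its gradient before transmitting it.
encrypted = [public_key.encrypt(g) for g in client_gradients]

# The aggregator sums ciphertexts directly; Paillier's additive homomorphism
# means the ciphertext sum decrypts to the sum of the plaintext gradients.
encrypted_sum = sum(encrypted[1:], encrypted[0])

# Only the key holder can decrypt the aggregated (averaged) gradient.
aggregated = private_key.decrypt(encrypted_sum) / len(client_gradients)
print(f"averaged gradient: {aggregated:.4f}")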
DOI: 10.3390/fi13040094