Authors
Ekanut Sotthiwat, Liangli Zhen, Zengxiang Li, Chi Zhang
Publication date
2021
Conference
IEEE/ACM International Symposium on Cluster, Cloud and Internet Computing (CCGrid)
Description
Multi-party computation (MPC) allows distributed machine learning to be performed in a privacy-preserving manner, so that end-hosts are unaware of the true models on the clients. However, the standard MPC algorithm also incurs additional communication and computation costs due to its expensive cryptographic operations and protocols. In this paper, instead of applying heavy MPC over the entire local models for secure model aggregation, we propose to encrypt only the critical part of the model (gradient) parameters to reduce communication cost, while maintaining MPC's privacy-preserving advantages without sacrificing the accuracy of the learnt joint model. Theoretical analysis and experimental results are provided to verify that our proposed method can prevent deep-leakage-from-gradients attacks from reconstructing the original data of individual participants. Experiments using deep learning models over the MNIST …
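The excerpt does not specify which gradient components are treated as critical or which MPC protocol the authors use, but the general idea (secret-share only a small "critical" slice of each client's gradient and send the rest in plaintext) can be sketched as follows, assuming additive secret sharing with fixed-point encoding. All parameter values, the index choice, and the helper names are illustrative assumptions, not details from the paper.

```python
# Minimal sketch (illustrative, not the authors' implementation): each client keeps
# most of its gradient in plaintext and additively secret-shares only a small set of
# "critical" coordinates among the parties. No single party ever sees a client's
# critical values; only their aggregate is reconstructed. The critical-index rule,
# field size, and fixed-point scale below are assumptions made for this example.
import numpy as np

PRIME = 2**31 - 1   # field modulus for additive sharing (assumed)
SCALE = 10**4       # fixed-point scaling of float gradients (assumed)

def encode(x):
    """Map float gradients to field elements via fixed-point encoding."""
    return np.mod(np.round(x * SCALE).astype(np.int64), PRIME)

def decode(x):
    """Map field elements back to floats, recovering negative values."""
    x = np.where(x > PRIME // 2, x - PRIME, x)
    return x.astype(np.float64) / SCALE

def make_shares(values, n_parties, rng):
    """Additive secret sharing: n random-looking shares that sum to `values` mod PRIME."""
    shares = rng.integers(0, PRIME, size=(n_parties - 1, len(values)), dtype=np.int64)
    last = np.mod(values - shares.sum(axis=0), PRIME)
    return np.vstack([shares, last])    # row j is the share sent to party j

def split_gradient(grad, critical_idx, n_parties, rng):
    """Keep non-critical entries in plaintext; secret-share the critical ones."""
    plain = grad.copy()
    plain[critical_idx] = 0.0
    return plain, make_shares(encode(grad[critical_idx]), n_parties, rng)

# --- toy round: 3 clients, a 6-dimensional gradient, 2 critical coordinates ---
rng = np.random.default_rng(0)
n_parties, critical_idx = 3, np.array([0, 3])
grads = [rng.normal(size=6) for _ in range(n_parties)]
plains, shares = zip(*(split_gradient(g, critical_idx, n_parties, rng) for g in grads))

# Each party sums the shares it received from all clients (it sees only noise);
# the aggregator then combines the per-party sums, recovering only the aggregate
# of the critical coordinates, and adds the plaintext parts directly.
party_sums = [np.mod(sum(s[j] for s in shares), PRIME) for j in range(n_parties)]
aggregate = np.sum(plains, axis=0)
aggregate[critical_idx] += decode(np.mod(sum(party_sums), PRIME))

assert np.allclose(aggregate, np.sum(grads, axis=0), atol=1e-3)
```

In this toy round the final assertion confirms that the reconstructed aggregate matches the plain sum of all clients' gradients, while each client's critical entries were only ever transmitted as random-looking shares.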
Total citations
Cited by 14 (2022: 8, 2023: 2, 2024: 4)
Scholar articles
E Sotthiwat, L Zhen, Z Li, C Zhang - 2021 IEEE/ACM 21st International Symposium on …, 2021