Authors
Tabitha Ogilvie, Rachel Player, Joe Rowell
Publication date
2020
Journal
Cryptology ePrint Archive
Description
The fixed-Hessian minimisation method can be used to implement privacy-preserving machine learning training from homomorphic encryption. This is a relatively under-explored part of the literature, with the main prior work being that of Bonte and Vercauteren (BMC Medical Genomics, 2018), who proposed a simplified Hessian method for logistic regression training over the BFV homomorphic encryption scheme. Our main contribution is to revisit the fixed-Hessian approach for logistic regression training over the CKKS homomorphic encryption scheme. We improve on the prior work in several aspects, most notably showing how the native encoding in CKKS can be used to take advantage of SIMD operations. We implement our new fixed-Hessian approach in SEAL and compare it to more commonly used minimisation methods, based on Gradient Descent and Nesterov’s Accelerated Gradient Descent. We find that the fixed-Hessian approach exhibits fast run time and comparable accuracy to these alternative approaches. Moreover, it can be argued to be more practical in the privacy-preserving training context, as no step size parameter needs to be chosen. As an additional contribution, we describe and implement three distinct training algorithms, based on Gradient Descent, Nesterov’s Accelerated Gradient Descent, and a fixed-Hessian method respectively, to achieve privacy-preserving ridge regression training from homomorphic encryption. To the best of our knowledge, this is the first time homomorphic encryption has been used to implement ridge regression training on encrypted data.
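To illustrate the idea behind the fixed-Hessian approach described above, the following is a minimal plaintext sketch of a simplified fixed-Hessian update for logistic regression, in the spirit of the Bonte–Vercauteren method the abstract cites. It is not the paper's encrypted CKKS/SEAL implementation; the function names, the choice of a diagonal curvature bound, and the assumption that features are scaled to [0, 1] are illustrative assumptions.

```python
# Plaintext sketch of a simplified fixed-Hessian iteration for logistic
# regression. Hypothetical names; NOT the authors' encrypted implementation.
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def simplified_fixed_hessian_train(X, y, iterations=10):
    """Fit logistic regression weights using a fixed, diagonal Hessian bound.

    X : (n, d) data matrix, assumed scaled to [0, 1]
        (a leading column of ones gives an intercept).
    y : (n,) labels in {0, 1}.
    """
    n, d = X.shape

    # Boehning-style bound: the log-likelihood Hessian is dominated by
    # -(1/4) X^T X. The simplified variant replaces this with a diagonal
    # matrix whose k-th entry is -(1/4) * sum_j (X^T X)_{kj}, so
    # "inverting" it is just d scalar divisions -- the property that
    # makes the method friendly to homomorphic evaluation.
    XtX = X.T @ X
    h_diag = -0.25 * XtX.sum(axis=1)

    beta = np.zeros(d)
    for _ in range(iterations):
        grad = X.T @ (y - sigmoid(X @ beta))  # gradient of the log-likelihood
        beta = beta - grad / h_diag           # Newton-style step, no step size
    return beta
```

Because the curvature term is fixed before training starts, no step-size parameter has to be tuned, which is the practical advantage the abstract highlights for the privacy-preserving setting.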