Authors
Cong Xie, Sanmi Koyejo, Indranil Gupta
Publication date
2020/11/21
Conference
International Conference on Machine Learning
Pages
10495-10503
Publisher
PMLR
Description
We propose Zeno++, a new robust asynchronous Stochastic Gradient Descent (SGD) procedure, intended to tolerate Byzantine failures of workers. In contrast to previous work, Zeno++ removes several unrealistic restrictions on worker-server communication, now allowing for fully asynchronous updates from anonymous workers, for arbitrarily stale worker updates, and for the possibility of an unbounded number of Byzantine workers. The key idea is to estimate the descent of the loss value after the candidate gradient is applied, where large descent values indicate that the update results in optimization progress. We prove the convergence of Zeno++ for non-convex problems under Byzantine failures. Experimental results show that Zeno++ outperforms existing Byzantine-tolerant asynchronous SGD algorithms.
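For intuition, a minimal sketch of the descent-based filtering idea described above, written in Python. The function and parameter names (zeno_score, accept_update, lr, rho, eps) and the first-order approximation of the descent are illustrative assumptions, not taken from the paper:

    import numpy as np

    def zeno_score(g, v, lr=0.1, rho=1e-3):
        # Estimated descent of the validation loss if gradient g were applied:
        # f(x) - f(x - lr*g) is approximated to first order by lr * <v, g>,
        # where v is a gradient computed on a small server-side validation batch.
        # A penalty rho * ||g||^2 discourages overly large candidate updates.
        return lr * np.dot(v, g) - rho * np.dot(g, g)

    def accept_update(g, v, lr=0.1, rho=1e-3, eps=0.0):
        # Accept the candidate (possibly stale or Byzantine) gradient only if
        # its estimated descent clears the threshold -lr * eps.
        return zeno_score(g, v, lr, rho) >= -lr * eps

In this sketch, the server would occasionally recompute v from a held-out validation batch and apply each asynchronously arriving worker gradient only when accept_update returns True; rejected gradients are simply discarded, so even an unbounded number of Byzantine workers cannot force a harmful update.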
Total citations
[Per-year citations chart, 2019–2024]
Scholar articles
Zeno++: Robust fully asynchronous SGD
C Xie, S Koyejo, I Gupta - International Conference on Machine Learning, 2020