field | value
---|---
title | Fixing by Mixing: A Recipe for Optimal Byzantine ML under Heterogeneity
abstract | Byzantine machine learning (ML) aims to ensure the resilience of distributed learning algorithms to misbehaving (or Byzantine) machines. Although this problem has received significant attention, prior works often assume that the data held by the machines are homogeneous, which is seldom true in practice. Data heterogeneity makes Byzantine ML considerably more challenging, since a Byzantine machine can hardly be distinguished from a non-Byzantine outlier. A few solutions have been proposed to tackle this issue, but they provide suboptimal probabilistic guarantees and fare poorly in practice. This paper closes the theoretical gap, achieving optimality and yielding good empirical results. Specifically, we show how to automatically adapt existing solutions for (homogeneous) Byzantine ML to the heterogeneous setting through a powerful mechanism we call nearest neighbor mixing (NNM), which boosts any standard robust distributed gradient descent variant to yield optimal Byzantine resilience under heterogeneity. We obtain similar guarantees (in expectation) by plugging NNM into the distributed stochastic heavy ball method, a practical substitute for distributed gradient descent. Our empirical results significantly outperform state-of-the-art Byzantine ML solutions.
section | Regular Papers
layout | inproceedings
series | Proceedings of Machine Learning Research
publisher | PMLR
issn | 2640-3498
id | allouah23a
month | 0
tex_title | Fixing by Mixing: A Recipe for Optimal Byzantine ML under Heterogeneity
firstpage | 1232
lastpage | 1300
page | 1232-1300
order | 1232
cycles | false
bibtex_author | Allouah, Youssef and Farhadkhani, Sadegh and Guerraoui, Rachid and Gupta, Nirupam and Pinot, Rafael and Stephan, John
author |
date | 2023-04-11
address |
container-title | Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
volume | 206
genre | inproceedings
issued |
extras |
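The abstract's central mechanism, nearest neighbor mixing (NNM), can be illustrated with a short sketch. The code below is not the authors' implementation; it assumes the common formulation in which each of the n input gradients is replaced by the average of its n − f nearest neighbors (Euclidean distance, self included) before a base robust aggregator is applied. The coordinate-wise trimmed mean used as the base rule, and all function names, are illustrative choices.

```python
import numpy as np

def nnm(gradients: np.ndarray, f: int) -> np.ndarray:
    """Nearest neighbor mixing (illustrative sketch, not the paper's code).

    Assumes each of the n input gradients is replaced by the average of
    its n - f nearest neighbors (Euclidean distance, self included),
    where at most f inputs may be Byzantine.
    """
    n = gradients.shape[0]
    # Pairwise Euclidean distances between all gradients, shape (n, n).
    dists = np.linalg.norm(gradients[:, None, :] - gradients[None, :, :], axis=-1)
    mixed = np.empty_like(gradients)
    for i in range(n):
        # Indices of the n - f gradients closest to gradient i
        # (i itself is included, since dists[i, i] == 0).
        neighbors = np.argsort(dists[i])[: n - f]
        mixed[i] = gradients[neighbors].mean(axis=0)
    return mixed

def trimmed_mean(gradients: np.ndarray, f: int) -> np.ndarray:
    """Coordinate-wise trimmed mean: in each coordinate, drop the f
    smallest and f largest values, then average the rest."""
    sorted_grads = np.sort(gradients, axis=0)
    return sorted_grads[f : gradients.shape[0] - f].mean(axis=0)

# Hypothetical usage: n = 10 workers, f = 2 Byzantine, d = 5 dimensions.
rng = np.random.default_rng(0)
grads = rng.normal(size=(10, 5))
grads[:2] = 100.0  # two Byzantine workers send large outliers
robust_update = trimmed_mean(nnm(grads, f=2), f=2)
```

Intuitively, the mixing step pulls honest gradients closer together before aggregation, which is what lets an off-the-shelf robust rule remain effective when the workers' data are heterogeneous, consistent with the abstract's claim that NNM boosts any standard robust distributed gradient descent variant.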