TY - JOUR
T1 - Dynamics based privacy preservation in decentralized optimization
AU - Gao, Huan
AU - Wang, Yongqiang
AU - Nedić, Angelia
N1 - Publisher Copyright: © 2023 Elsevier Ltd
PY - 2023/5
Y1 - 2023/5
N2 - With decentralized optimization finding increased application in domains ranging from machine learning and control to robotics, its privacy is also receiving increased attention. Existing privacy solutions for decentralized optimization achieve privacy by patching on information-technology privacy mechanisms such as differential privacy or homomorphic encryption, which either sacrifices optimization accuracy or incurs heavy computation/communication overhead. We propose an inherently privacy-preserving decentralized optimization algorithm that exploits the robustness of decentralized optimization dynamics. More specifically, we present a general decentralized optimization framework, based on which we show that the privacy of participating nodes’ gradients can be protected by adding randomness in optimization parameters. We further show that the added randomness has no influence on the accuracy of optimization, and prove that our inherently privacy-preserving algorithm has R-linear convergence when the global objective function is smooth and strongly convex. We also prove that the proposed algorithm prevents the gradient of a node from being inferred by other nodes. Simulation results confirm the theoretical predictions.
KW - Decentralized optimization
KW - Privacy preservation
UR - http://www.scopus.com/inward/record.url?scp=85159887273&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85159887273&partnerID=8YFLogxK
U2 - 10.1016/j.automatica.2023.110878
DO - 10.1016/j.automatica.2023.110878
M3 - Article
SN - 0005-1098
VL - 151
JO - Automatica
JF - Automatica
M1 - 110878
ER -