Network Gradient Descent Algorithm for Decentralized Federated Learning
Posted: 2022-05-11
  • Speaker: Hansheng Wang (Peking University)
  • Time: Wednesday, May 25, 2022, 14:00
  • Venue: Tencent Meeting, Meeting ID: 556-509-844
  • Hosts: Center for Data Science and Institute of Statistics, Zhejiang University

Abstract: We study a fully decentralized federated learning algorithm, which is a novel gradient descent algorithm executed on a communication-based network. For convenience, we refer to it as the network gradient descent (NGD) method. In the NGD method, only statistics (e.g., parameter estimates) need to be communicated, minimizing the risk of privacy leakage. Meanwhile, different clients communicate with each other directly according to a carefully designed network structure, without a central master. This greatly enhances the reliability of the entire algorithm. These attractive properties inspire us to study the NGD method carefully, both theoretically and numerically. Theoretically, we start with a classical linear regression model. We find that both the learning rate and the network structure play significant roles in determining the NGD estimator's statistical efficiency. The resulting NGD estimator can be statistically as efficient as the global estimator if the learning rate is sufficiently small and the network structure is well balanced, even if the data are distributed heterogeneously. These findings are then extended to general models and loss functions. Extensive numerical studies are presented to corroborate our theoretical findings. Classical deep learning models are also presented for illustration purposes.
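For readers unfamiliar with decentralized optimization, the sketch below illustrates the general flavor of a network gradient descent update on a linear regression model: each client averages the parameter estimates of its network neighbors and then takes a gradient step on its own local data. The ring network, doubly stochastic weight matrix, learning rate, and update order are illustrative assumptions; the NGD method presented in the talk may differ in its details.

```python
# A minimal sketch of decentralized (network) gradient descent for linear
# regression, assuming the standard "neighbor averaging + local gradient step"
# update. This is NOT necessarily the exact NGD algorithm from the talk.
import numpy as np

rng = np.random.default_rng(0)

# Problem setup: K clients, each holding n local observations of a shared
# linear model y = X @ beta_true + noise, with heterogeneous X across clients.
K, n, p = 10, 50, 5
beta_true = rng.normal(size=p)
X = [rng.normal(loc=k * 0.1, size=(n, p)) for k in range(K)]  # heterogeneous data
y = [X[k] @ beta_true + 0.1 * rng.normal(size=n) for k in range(K)]

# Communication network: a ring, with equal weights over each node's closed
# neighborhood (itself plus its two ring neighbors). This weight matrix is
# doubly stochastic, one way to make the network "well balanced".
W = np.zeros((K, K))
for k in range(K):
    for j in (k - 1, k, (k + 1) % K):
        W[k, j] = 1.0 / 3.0

beta = np.zeros((K, p))  # one parameter estimate per client
lr = 0.01                # a small learning rate, as the theory suggests

for _ in range(2000):
    # Step 1: each client averages its neighbors' current estimates.
    beta_avg = W @ beta
    # Step 2: each client takes a gradient step on its local least-squares loss.
    grad = np.stack([X[k].T @ (X[k] @ beta_avg[k] - y[k]) / n for k in range(K)])
    beta = beta_avg - lr * grad

print("max client deviation from true beta:", np.abs(beta - beta_true).max())
```

Note that only the parameter estimates `beta` cross the network; the raw data `X` and `y` never leave their clients, which is the privacy property the abstract emphasizes.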


About the speaker: Hansheng Wang is Jiamao Chair Professor, doctoral supervisor, and Chair of the Department of Business Statistics and Econometrics at the Guanghua School of Management, Peking University. He is a recipient of the National Science Fund for Distinguished Young Scholars, Director of the Business Intelligence Research Center at Peking University, founding president of the Young Statisticians Society of the National Industrial Statistics Education and Research Association, a Fellow of the American Statistical Association (ASA), and an Elected Member of the International Statistical Institute (ISI). He has served as Associate Editor for eight international academic journals, including the Annals of Statistics, the Journal of the American Statistical Association, and the Journal of Business & Economic Statistics. His research interests include high-dimensional data analysis, variable selection, dimension reduction, extreme value theory, semiparametric models, search engine marketing, and social networks. He has published more than 100 papers in professional journals at home and abroad, co-authored one English monograph, and (co-)authored four Chinese textbooks. He has also been named an Elsevier Most Cited Chinese Researcher (Mathematics, 2014-2019; Applied Economics, 2020).


Contact: Rongmao Zhang (rmzhang@zju.edu.cn)


All faculty and students are welcome to attend!