Online Academic Talk | Shuyuan Wu: Quasi-Newton Updating for Large-Scale Distributed Learning
Abstract
Distributed computing is critically important for modern statistical analysis. Herein, we develop a distributed quasi-Newton (DQN) framework with excellent statistical, computational, and communication efficiency. The DQN method requires no Hessian matrix inversion or communication, which considerably reduces its computation and communication complexity. Notably, related existing methods analyze only numerical convergence and require a diverging number of iterations to converge. In contrast, we investigate the statistical properties of the DQN method and theoretically demonstrate that, under mild conditions, the resulting estimator is statistically efficient after a small number of iterations. Extensive numerical analyses demonstrate the finite-sample performance of the proposed method.
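To make the abstract's idea concrete, below is a minimal, hypothetical sketch of a distributed quasi-Newton step: each worker holds a data shard and communicates only its local gradient; the driver averages the gradients and maintains an inverse-curvature estimate via a secant (one-dimensional BFGS-style) update, so no Hessian is ever formed, inverted, or communicated. All function names and the toy least-squares problem are illustrative assumptions, not the paper's actual DQN algorithm.

```python
def local_gradient(shard, theta):
    """Gradient of 0.5 * (theta*x - y)^2 averaged over one worker's shard."""
    return sum((theta * x - y) * x for x, y in shard) / len(shard)

def distributed_qn(shards, theta0=0.0, h0=1.0, iters=10):
    """Driver loop: average worker gradients, take a quasi-Newton step.

    h approximates the inverse curvature 1 / f''(theta); it is updated
    from gradient differences (the secant condition), so no second
    derivative is computed and no matrix is inverted.
    """
    theta, h = theta0, h0
    g = sum(local_gradient(s, theta) for s in shards) / len(shards)
    for _ in range(iters):
        theta_new = theta - h * g  # quasi-Newton step with current h
        g_new = sum(local_gradient(s, theta_new) for s in shards) / len(shards)
        s_k, y_k = theta_new - theta, g_new - g
        if abs(y_k) > 1e-12:
            h = s_k / y_k          # secant update of the inverse curvature
        theta, g = theta_new, g_new
    return theta

# Toy data y = 2*x, split across three "workers"; the fit converges near 2.0.
data = [(x, 2.0 * x) for x in range(1, 13)]
shards = [data[0:4], data[4:8], data[8:12]]
print(distributed_qn(shards))
```

Because the toy loss is quadratic, the secant update recovers the exact curvature after one step, mirroring why quasi-Newton methods can reach statistical efficiency in few iterations.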
Speaker
Shuyuan Wu is a Ph.D. student in the Department of Business Statistics and Econometrics at the Guanghua School of Management, Peking University, advised by Professor Hansheng Wang. Her research interests include resampling methods, statistical optimization algorithms, and statistical modeling for large-scale data. Her work has been published in journals including the Journal of Business and Economic Statistics, Statistica Sinica, and the Journal of the Royal Statistical Society, Series C.
The 狗熊会 (Clubear) online academic lecture series is open to scholars and practitioners in data science and related fields, and we warmly welcome registrations and speaker recommendations. For inquiries, please contact Ying Chang at ying.chang@clubear.org.
Please add 熊二 (WeChat ID: clubear2) for details on how to join the talk.