
beat365 Academic Lecture No. 31

Posted: 2021-12-13

Title: Innovation Compression for Communication-Efficient Distributed Optimization with Linear Convergence

Speaker: Keyou You, Associate Professor

Time: 19:00, Tuesday, December 14, 2021

Venue: Tencent Meeting, ID 325-896-083

Audience: undergraduates, graduate students, and faculty in related majors of the School of Electrical Engineering

Host: beat365

Speaker Bio:

Keyou You is a tenured associate professor and doctoral supervisor in the Department of Automation at Tsinghua University. He received his B.S. in Statistical Science from Sun Yat-sen University in 2007. From August 2007 to June 2012, he pursued doctoral and postdoctoral research in the School of Electrical and Electronic Engineering at Nanyang Technological University, Singapore. He has been on the faculty of the Department of Automation at Tsinghua University since July 2012, and has held invited visiting positions at institutions including Politecnico di Torino (Italy), the University of Melbourne (Australia), and the Hong Kong University of Science and Technology. His research focuses on learning, optimization, and control of complex networked systems and their applications. He currently serves as an Associate Editor of international journals including IEEE Transactions on Control of Network Systems, IEEE Transactions on Cybernetics, Systems & Control Letters, and IEEE Control Systems Letters. He has led projects including the NSFC Excellent Young Scientists Fund, NSFC key projects, and National Key R&D Program subprojects, and has received the Guan Zhaozhi Best Paper Award and the Asian Control Association Temasek Young Educator Award.

Abstract: Information compression is essential to reduce communication cost in distributed optimization over peer-to-peer networks. We propose a communication-efficient linearly convergent distributed (COLD) algorithm to solve strongly convex optimization problems. By compressing innovation vectors, which are the differences between decision vectors and their estimates, COLD is able to achieve linear convergence for a class of δ-contracted compressors. We explicitly quantify how the compression affects the convergence rate and show that COLD matches the rate of its uncompressed version. To accommodate a wider class of compressors that includes the binary quantizer, we further design a novel dynamical scaling mechanism and obtain the linearly convergent Dyna-COLD. Importantly, our results strictly improve existing results for the quantized consensus problem. Numerical experiments demonstrate the advantages of both algorithms under different compressors.
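To illustrate the key idea in the abstract, the sketch below shows innovation compression with one δ-contracted compressor (a top-k sparsifier, for which δ = k/n). This is a minimal toy illustration of the compression mechanism only, not an implementation of the COLD algorithm; the vector sequence, dimensions, and variable names are assumptions for the demo.

```python
import numpy as np

def top_k(v, k):
    """Top-k sparsifier: keep the k largest-magnitude entries, zero the rest.
    It is delta-contracted with delta = k/n:
    ||top_k(v) - v||^2 <= (1 - k/n) * ||v||^2."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

# Innovation compression: instead of compressing the decision vector x itself,
# a node transmits a compressed version of the *innovation* x - x_hat, where
# x_hat is the receiver's running estimate of x. Once x_hat tracks x, the
# innovation is small, so compression loses little information.
rng = np.random.default_rng(0)
n, k = 20, 4
x_hat = np.zeros(n)                      # receiver's estimate of x
for t in range(50):
    # a toy sequence converging (linearly) to the all-ones vector
    x = np.ones(n) + 0.5 ** t * rng.standard_normal(n)
    innovation = x - x_hat               # shrinks as x_hat tracks x
    q = top_k(innovation, k)             # only q (k entries) is communicated
    x_hat = x_hat + q                    # sender and receiver update x_hat
print(np.linalg.norm(x - x_hat))         # estimate error decays geometrically
```

Compressing the innovation rather than the iterate is what lets the estimate error contract alongside the optimization error, which is the mechanism behind the linear-convergence result in the talk.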
