Title: Towards Gradient-based Bilevel Optimization in Machine Learning
Speaker: Jin Zhang, Associate Professor, Southern University of Science and Technology (NSFC Excellent Young Scientists Fund recipient)
Time: 4:30 PM, Thursday, May 25, 2023
Venue: S3-313
Abstract: Recently, Bi-Level Optimization (BLO) techniques have received extensive attention from the machine learning community, and gradient methods have become the mainstream approach to BLO in learning applications. However, the validity of existing work relies heavily on a restrictive Lower-Level Strong Convexity (LLSC) condition. In this talk, we will discuss some recent advances that remove the LLSC restriction. First, by formulating bi-level models from the optimistic viewpoint and aggregating hierarchical objective information, we establish Bi-level Descent Aggregation (BDA), a flexible and modularized double-loop algorithmic framework for BLO. Second, by averaging the upper- and lower-level objectives, we propose a single-loop Bi-level Averaged Method of Multipliers (sl-BAMM) that is simple yet efficient for large-scale BLO. We further provide a non-asymptotic convergence analysis of sl-BAMM towards KKT stationary points. Experimental results demonstrate the superiority of our methods.
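For readers unfamiliar with gradient-based BLO, the sketch below illustrates the classical double-loop scheme that the talk takes as its starting point, not the speaker's BDA or sl-BAMM methods: an inner gradient loop approximates the lower-level solution, and the outer loop descends along an implicit hypergradient. The toy problem, the matrix A, the vector b, the regularizer lam, and all step sizes are illustrative assumptions; the quadratic lower level is chosen so that the LLSC condition holds and the implicit hypergradient is exact.

# A minimal double-loop sketch for a toy bilevel problem (assumed example, not the speaker's method).
# Lower level: f(x, y) = 0.5 * ||y - A x||^2   (strongly convex in y, so LLSC holds)
# Upper level: F(x, y) = 0.5 * ||y - b||^2 + 0.5 * lam * ||x||^2
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))        # maps x (dim 3) into y-space (dim 5); illustrative data
b = rng.standard_normal(5)
lam = 0.1

def inner_solve(x, y0, steps=50, lr=0.2):
    """Inner loop: approximate y*(x) = argmin_y f(x, y) by gradient descent."""
    y = y0
    for _ in range(steps):
        y = y - lr * (y - A @ x)       # grad_y f(x, y) = y - A x
    return y

def hypergradient(x, y):
    """Implicit-function-theorem hypergradient. For this quadratic lower level,
    grad2_yy f = I and grad2_xy f = -A^T, so the formula
    grad_x F - grad2_xy f @ inv(grad2_yy f) @ grad_y F reduces to lam*x + A^T (y - b)."""
    return lam * x + A.T @ (y - b)

x = np.zeros(3)
y = np.zeros(5)
for k in range(200):                   # outer loop over the upper-level variable
    y = inner_solve(x, y)              # warm-started inner loop
    x = x - 0.05 * hypergradient(x, y)

print("upper-level value:", 0.5 * np.sum((A @ x - b) ** 2) + 0.5 * lam * x @ x)

Because the inner problem here is strongly convex, the inner loop converges geometrically and the hypergradient is well defined; it is exactly this reliance on LLSC that the BDA and sl-BAMM methods discussed in the talk aim to remove.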
Biography:
Jin Zhang is an Associate Professor in the Department of Mathematics at Southern University of Science and Technology (SUSTech) and at the National Center for Applied Mathematics Shenzhen. He received his B.S. (2007) and M.S. (2010) from Dalian University of Technology and his Ph.D. (2014) from the University of Victoria, Canada. From 2015 to 2018 he worked in the Department of Mathematics at Hong Kong Baptist University, and he joined SUSTech in early 2019. His research focuses on optimization theory and applications, with representative results published in influential optimization, computational mathematics, and machine learning journals and conferences, including Math Program, SIAM J Optim, Math Oper Res, SIAM J Numer Anal, J Mach Learn Res, IEEE T Pattern Anal Mach Intell, ICML, and NeurIPS. His work has been recognized with the Youth Science and Technology Award of the Operations Research Society of China and the Guangdong Youth Science and Technology Innovation Award. He is the principal investigator of an NSFC Excellent Young Scientists Fund project, a Guangdong Natural Science Funds for Distinguished Young Scholars project, and NSFC Young Scientists/General Program projects.