
Accelerate Convex Optimization in Machine Learning by Leveraging Functional Growth Conditions

Published: November 23, 2022, 09:17

Title: Accelerate Convex Optimization in Machine Learning by Leveraging Functional Growth Conditions

Speaker: Yi Xu (School of Artificial Intelligence)

Time: November 24, 2022, 14:30–15:30

Venue: Room A1101, Haishan Building

Campus contact: Prof. Xiantao Xiao, Tel: 84708351-8307


Abstract: In recent years, the unprecedented growth in the scale and dimensionality of data has posed major computational challenges for traditional optimization algorithms, making it important to develop efficient and effective optimization algorithms for solving machine learning problems. Many traditional algorithms (e.g., the gradient descent method) are black-box algorithms: they are simple to implement but ignore the underlying geometric properties of the objective function. A recent trend in accelerating these black-box algorithms is to exploit geometric properties of the objective, such as strong convexity. However, most existing methods rely heavily on knowledge of strong convexity, which makes them inapplicable to problems that are not strongly convex or whose strong-convexity parameters are unknown. To bridge the gap between traditional black-box algorithms, which ignore the problem's geometry, and accelerated algorithms that require strong convexity, can we develop algorithms that adapt to the objective function's underlying geometric properties? To answer this question, this talk focuses on convex optimization problems and explores an error bound condition that characterizes the growth of the objective function around a global minimum. Under this error bound condition, we develop algorithms that (1) adapt to the problem's geometric properties to enjoy faster convergence in stochastic optimization; (2) leverage the problem's structured regularizer to further improve the convergence speed; and (3) address both deterministic and stochastic optimization problems with an explicit max-structured loss.
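As a toy illustration of the functional growth condition discussed above (a minimal sketch, not the speaker's algorithm): the one-dimensional objective f(x) = x² satisfies a quadratic growth (error bound) condition f(x) − f* ≥ c · dist(x, X*)² with c = 1, f* = 0, and X* = {0}, and under this condition plain gradient descent with a fixed step size already converges at a linear (geometric) rate.

```python
# Toy example: gradient descent on f(x) = x^2, which satisfies the
# quadratic growth condition f(x) - f* >= c * dist(x, X*)^2 with c = 1.
# This is only an illustration of the growth condition, not the
# adaptive algorithms presented in the talk.

def f(x):
    return x ** 2

def grad(x):
    return 2.0 * x

x = 5.0
eta = 0.25  # step size below 1/L for the gradient's Lipschitz constant L = 2
for _ in range(50):
    x -= eta * grad(x)

# Each step contracts the iterate: x_{k+1} = (1 - 2*eta) * x_k = 0.5 * x_k,
# so the objective gap f(x_k) - f* shrinks geometrically.
assert f(x) < 1e-10
```

Without such a growth condition (e.g., for a flat objective like f(x) = x⁴ near its minimum), the same fixed-step method slows down markedly, which is exactly the gap that geometry-adaptive algorithms aim to close.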


Speaker bio: Yi Xu is an associate professor and master's supervisor at the School of Artificial Intelligence. He received his Ph.D. in Computer Science from the University of Iowa and his B.S. in Mathematics from Zhejiang University. His research interests include machine learning, convex and non-convex optimization, deep learning, and statistical learning theory. He has published more than 20 papers in top international machine learning conferences and journals, including ICML, AAAI, NeurIPS, IJCAI, and CVPR. He has served repeatedly as a reviewer for major conferences and journals in the field, including ICML, NeurIPS, AAAI, and TPAMI, and as a session chair for SIAM OP17 and ICML 2021.






Postal code: 116024

Tel: (86)-531-88565657

Address: No. 2 Linggong Road, Ganjingzi District, Dalian
