
Stochastic alternating structure-adapted proximal gradient descent method with variance reduction for nonconvex nonsmooth optimization

Source: School of Mathematics   Date: 2023/11/30 09:31:25

Lecture title: Stochastic alternating structure-adapted proximal gradient descent method with variance reduction for nonconvex nonsmooth optimization

Time: Thursday, December 7, 2023, 15:30-16:30

Venue: Room 30456, Xipu Campus

About the speaker: Zhang Wenxing is an associate professor at the School of Mathematical Sciences, University of Electronic Science and Technology of China. He received his Ph.D. from Nanjing University in 2012 and was a postdoctoral researcher at Université Toulouse III in France from 2014 to 2015. His main research interests are the design and application of algorithms for variational inequalities. He has led General and Young Scientists projects funded by the National Natural Science Foundation of China, and has published papers in Math Comput, Inverse Problems, J Sci Comput, Comput Optim Appl, SIAM J Imaging Sci, and IEEE journals.

Abstract:

Block-structured optimization has gained a significant amount of attention in far-reaching practical applications. Following the recent work (M. Nikolova and P. Tan, SIAM J Optim, 29:2053-2078) on solving a class of nonconvex nonsmooth optimization problems, we develop a stochastic alternating structure-adapted proximal (s-ASAP) gradient descent method for solving block optimization problems. By deploying state-of-the-art variance-reduced gradient estimators (rather than the full gradient) in stochastic optimization, the s-ASAP method is applicable to nonconvex optimization whose objective is the sum of a nonsmooth data-fitting term and a finite number of differentiable functions. The sublinear convergence rate of s-ASAP is built upon the proximal point algorithmic framework, whilst the linear convergence rate of s-ASAP is achieved under the error bound condition. Furthermore, convergence of the sequence produced by s-ASAP is established under the Kurdyka-Lojasiewicz (KL) property. Preliminary numerical simulations on image processing applications demonstrate the compelling performance of the proposed method.
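To make the "variance-reduced gradient estimator plus proximal step" idea concrete, here is a minimal sketch (not the authors' s-ASAP code) of an SVRG-style proximal gradient loop for minimizing a finite sum of smooth terms plus a nonsmooth term, using the ℓ1 norm as an illustrative nonsmooth component; all function and parameter names are assumptions for illustration.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_svrg(grads, full_grad, x0, step, lam, epochs=10, inner=None, seed=None):
    """SVRG-style proximal gradient sketch for
        min_x (1/n) * sum_i f_i(x) + lam * ||x||_1.

    grads:     list of per-sample gradient functions f_i'.
    full_grad: function returning the average gradient (1/n) sum_i f_i'(x).
    """
    rng = np.random.default_rng(seed)
    n = len(grads)
    inner = inner or n
    x = x0.copy()
    for _ in range(epochs):
        x_ref = x.copy()          # snapshot point for this epoch
        mu = full_grad(x_ref)     # one full gradient per epoch
        for _ in range(inner):
            i = rng.integers(n)
            # variance-reduced estimator: unbiased, with vanishing variance
            # as the iterates approach the snapshot
            v = grads[i](x) - grads[i](x_ref) + mu
            # proximal (forward-backward) step on the nonsmooth term
            x = soft_threshold(x - step * v, step * lam)
    return x
```

For example, with f_i(x) = 0.5*(a_i^T x - b_i)^2 this runs a stochastic proximal gradient pass over a lasso-type problem. The snapshot gradient `mu` is what distinguishes this from plain stochastic proximal gradient: it trades one full-gradient evaluation per epoch for much lower variance in the inner steps, which is the mechanism behind the improved convergence rates discussed in the abstract.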

Hosted by: Graduate School   Organized by: School of Mathematics


Author: Wang Zhongbao   Editor: Liu Zhonghui


[Copyright Southwest Jiaotong University News Network; use without written authorization is prohibited]
