Details of Research Outputs

Title: Level-Set Subdifferential Error Bounds and Linear Convergence of Bregman Proximal Gradient Method
Author (Name in English or Pinyin): Zhu, Daoli [1,2,6]; Deng, Sien [3]; Li, Minghua [4]; Zhao, Lei [5]
Date Issued: 2021-06-01
Indexed By: SCIE
First-Level Discipline: Mathematics
Education Discipline: Science & Technology
Published Range: Foreign academic journal
Volume/Issue/Pages: Volume 189, Issue 3, Pages 889–918
Citation Statistics
Cited Times: 1 (WOS)
Document Type: Journal article
Collection: School of Data Science
Corresponding Author: Deng, Sien
1.Shanghai Jiao Tong Univ, Antai Coll Econ & Management, Shanghai, Peoples R China
2.Shanghai Jiao Tong Univ, Sino US Global Logist Inst, Shanghai, Peoples R China
3.Northern Illinois Univ, Dept Math Sci, De Kalb, IL 60115 USA
4.Chongqing Univ Arts & Sci, Sch Math & Big Data, Chongqing, Peoples R China
5.Shanghai Jiao Tong Univ, Sch Naval Architecture Ocean & Civil Engn, Shanghai, Peoples R China
6.Chinese Univ Hong Kong, Sch Data Sci, Shenzhen, Peoples R China
Recommended Citation
GB/T 7714
Zhu, Daoli, Deng, Sien, Li, Minghua, et al. Level-Set Subdifferential Error Bounds and Linear Convergence of Bregman Proximal Gradient Method[J]. JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2021.
APA Zhu, Daoli, Deng, Sien, Li, Minghua, & Zhao, Lei. (2021). Level-Set Subdifferential Error Bounds and Linear Convergence of Bregman Proximal Gradient Method. JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS.
MLA Zhu, Daoli, et al. "Level-Set Subdifferential Error Bounds and Linear Convergence of Bregman Proximal Gradient Method". JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS (2021).
Files in This Item:
There are no files associated with this item.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.