LEE Ching-pei
Contact: leechingpei@gmail.com / chingpei@ism.ac.jp
I am an associate professor in the Department of Advanced Data Science
at the Institute of Statistical Mathematics
(ISM). I am also affiliated with the Statistical Science
Program, Graduate Institute for Advanced Studies of the Graduate University for Advanced Studies (SOKENDAI).
Before joining ISM, I was a tenure-track assistant research fellow at Academia Sinica and
a Peng Tsu Ann Assistant Professor in the Department of Mathematics and the
Institute for Mathematical Sciences at the National University of Singapore.
Prior to that, I obtained my Ph.D. in Computer Sciences with a minor in Mathematics at the University of Wisconsin-Madison.
I work on nonlinear optimization and its applications, especially to large-scale problems.
Discussion and collaboration on research are welcome. Please feel free to shoot me an e-mail.
If you would like to do a PhD with me, you should apply to the PhD program of SOKENDAI.
See the admission information here and the scholarship information here.
If you are interested in being an intern student or a postdoc in my group, you need to either be self-funded or go through the Postdoctoral Fellowship Program of JSPS.
Upcoming Travels/Presentations
- 2024 Oct. 20 - 23: INFORMS Annual Meeting, Seattle.
- 2024 Dec. 9 - 15: NeurIPS, Vancouver.
Research Interests
- Nonlinear optimization (problems involving real, as opposed to integer or discrete, variables): I am mostly interested in algorithms that are efficient for large-scale real-world applications. This includes the design and implementation of new algorithms, the analysis of existing algorithms, the study of specific problems, and nonsmooth analysis that helps us better understand both the problems and the algorithms.
- Applications of nonlinear optimization, especially large-scale ones: Currently, the major application I work on is practically efficient solvers for large-scale machine learning problems, especially those suitable for distributed and/or multicore environments, but I am also open to other applications.
Selected Publications and Preprints (ordered by the time the work was finished/uploaded)
(See my Google Scholar page for the full publication list.)
- Jan Harold Alcantara, Ching-pei Lee, Akiko Takeda. A four-operator splitting algorithm for nonconvex and nonsmooth optimization, 2024.
- Zih-Syuan Huang, Ching-pei Lee. Regularized Adaptive Momentum Dual Averaging with an Efficient Inexact Subproblem Solver for Training Structured Neural Network, 2024. The 38th Conference on Neural Information Processing Systems (NeurIPS). (Code.)
- Jan Harold Alcantara, Ching-pei Lee. Accelerated projected gradient algorithms for sparsity constrained optimization problems, 2022. The 36th Conference on Neural Information Processing Systems (NeurIPS).
- Ching-pei Lee, Ling Liang, Tianyun Tang, Kim-Chuan Toh. Accelerating nuclear-norm regularized low-rank matrix optimization through Burer-Monteiro decomposition, 2022.
- Jan Harold Alcantara, Ching-pei Lee. Global convergence and acceleration of projection methods for feasibility problems involving union convex sets, 2022.
- Zih-Syuan Huang, Ching-pei Lee. Training Structured Neural Networks Through Manifold Identification and Variance Reduction, 2022. The 10th International Conference on Learning Representations (ICLR). (Code.)
- Ching-pei Lee. Accelerating Inexact Successive Quadratic Approximation for Regularized Optimization Through Manifold Identification. Mathematical Programming, 2023. (Code.)
- Ching-pei Lee, Po-Wei Wang, Chih-Jen Lin. Limited-memory Common-directions Method for Large-scale Optimization: Convergence, Parallelization, and Distributed Optimization. Mathematical Programming Computation, 2022. Implementation available in MPI-LIBLINEAR. A preliminary version appeared in SDM 2017.
- Yu-Sheng Li, Wei-Lin Chiang, Ching-pei Lee. Manifold Identification for Ultimately Communication-Efficient Distributed Optimization. The 37th International Conference on Machine Learning (ICML), 2020. (Code.)
- Ching-pei Lee, Cong Han Lim, Stephen J. Wright. A Distributed Quasi-Newton Algorithm for Primal and Dual Regularized Empirical Risk Minimization, 2019. (Code for experiments in the paper.) A preliminary version appeared in KDD 2018.
- Ching-pei Lee, Stephen J. Wright. First-order algorithms converge faster than O(1/k) on convex problems. The 36th International Conference on Machine Learning (ICML), 2019.
- Ching-pei Lee, Stephen J. Wright. Inexact Variable Metric Stochastic Block-Coordinate Descent for Regularized Optimization. Journal of Optimization Theory and Applications, 2020.
- Ching-pei Lee, Stephen J. Wright. Inexact Successive Quadratic Approximation for Regularized Optimization. Computational Optimization and Applications, 2019.
- Stephen J. Wright, Ching-pei Lee. Analyzing Random Permutations for Cyclic Coordinate Descent. Mathematics of Computation, 2020.
- Ching-pei Lee, Kai-Wei Chang. Block-diagonal approximation for distributed dual regularized empirical risk minimization. Machine Learning, 2020. Implementation available in MPI-LIBLINEAR. (Code.) A preliminary version appeared in ICML 2015.
- Ching-pei Lee, Stephen J. Wright. Random Permutations Fix a Worst Case for Cyclic Coordinate Descent. IMA Journal of Numerical Analysis, 2019.
- Po-Wei Wang, Ching-pei Lee, Chih-Jen Lin. The Common-directions Method for Regularized Empirical Risk Minimization. Journal of Machine Learning Research, 2019. (Supplementary materials, code for experiments in the paper.)
- Tzu-Ming Kuo, Ching-pei Lee, Chih-Jen Lin. Large-scale Kernel RankSVM. SIAM International Conference on Data Mining, 2014. Implemented in the LIBSVM extension for rankSVM. (Supplementary materials, code for experiments in the paper.)
- Wei-Cheng Chang, Ching-pei Lee, Chih-Jen Lin. A revisit to support vector data description (SVDD), 2015. Implemented in the LIBSVM extension for SVDD.
- Ching-pei Lee, Chih-Jen Lin. Large-scale Linear RankSVM. Neural Computation, 2014. Implemented in the LIBLINEAR extension for rankSVM. (Supplementary materials, code for experiments in the paper.)
Selected Talks
Software
(GitHub Repositories)
RMDA - A Regularized Modernized Dual Averaging Algorithm for Training Structured Neural Network Models
MADPQN - A Manifold-Aware Distributed Proximal Quasi-Newton Method for Regularized Optimization
Distributed LIBLINEAR
LIBLINEAR
RankSVM extensions of LIBSVM and LIBLINEAR