We consider sparse convex optimization problems under general affine constraints, with couplings between variables in both the cost function and the constraints.
We propose a distributed Jacobi algorithm to solve this problem in a cooperative manner. By using a local update at each iteration that is equivalent to a convex combination of the new local solutions and the old iterate, the Jacobi algorithm is guaranteed to produce a feasible solution at every iteration.
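As a rough illustration (not the talk's actual algorithm), the flavor of such a Jacobi iteration with a convex-combination update can be sketched on a simple unconstrained quadratic: each "agent" minimizes over its own variable with the others fixed at their old values, and the iterate is then blended with the old one by a weight w. The matrix Q, weight w, and iteration count below are illustrative choices.

```python
# Hypothetical sketch of a damped (weighted) Jacobi iteration for
# minimizing 0.5*x'Qx - c'x with a diagonally dominant Q, so the
# iteration converges. This only illustrates the update structure
# described in the abstract, not the constrained algorithm itself.
Q = [[4.0, 1.0, 0.0],
     [1.0, 4.0, 1.0],
     [0.0, 1.0, 4.0]]   # diagonally dominant coupling matrix
c = [1.0, 2.0, 3.0]
x = [0.0, 0.0, 0.0]      # initial iterate
w = 0.5                  # convex-combination weight, 0 < w <= 1

for _ in range(200):
    # Jacobi step: agent i minimizes over x_i with the other
    # variables held at their OLD values
    x_new = [(c[i] - sum(Q[i][j] * x[j] for j in range(3) if j != i))
             / Q[i][i] for i in range(3)]
    # convex combination of new local solutions and old iterate
    x = [(1 - w) * x[i] + w * x_new[i] for i in range(3)]
```

For affine constraints the convex combination is what preserves feasibility: if both the old iterate and the new local solutions satisfy Ax = b, so does any convex combination of them.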
We provide an a posteriori certificate of centralized optimality for the distributed solutions, as well as a priori conditions that guarantee convergence to optimality in several problem settings.
The proposed approach is useful for distributed model predictive control applications, where feasibility is an important requirement. It favors distributing the computations, especially in settings with a large number of subsystems, a sparse coupling structure, and local communication.
Bio: Dang Doan has a background in Mechatronics and in Systems and Control. In 2012, he received his PhD from Delft University of Technology (The Netherlands) in the field of distributed optimization-based approaches for control. He then served as a lecturer at Cantho University of Technology (Vietnam) for three years. Currently, he is a postdoctoral researcher at the University of Freiburg (Germany), supported by a Georg Forster Fellowship from the Alexander von Humboldt Foundation. He works on distributed convex optimization algorithms and on the implementation of fast solvers on embedded controllers for nonlinear model predictive control and moving-horizon estimation.
Everyone is welcome!