Gradient-based block coordinate descent algorithms for joint approximate diagonalization of matrices
In this paper, we propose a gradient-based block coordinate descent (BCD-G) framework to solve the joint approximate diagonalization of matrices defined on the product of the complex Stiefel manifold and the special linear group. Instead of cycling through the blocks in a fixed order, we choose the block to optimize based on the Riemannian gradient. To update the first block variable, which lies in the complex Stiefel manifold, we use the well-known line search descent method. To update the second block variable, which lies in the special linear group, we construct two classes of updates, Jacobi-GLU and Jacobi-GLQ, built from four different kinds of elementary rotations, and thereby obtain two BCD-G algorithms: BCD-GLU and BCD-GLQ. We establish the weak convergence and global convergence of these two algorithms using the Łojasiewicz gradient inequality, under the assumption that the iterates are bounded. In particular, the problem studied in this paper includes as special cases the well-known joint approximate diagonalization of Hermitian (or complex symmetric) matrices by invertible transformations in blind source separation, for which our algorithms specialize to gradient-based Jacobi-type algorithms. All the algorithms and convergence results in this paper also apply to the real case.
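To make the gradient-based block selection rule concrete, below is a minimal sketch of a generic BCD-G iteration skeleton in Python. It assumes a cost over two block variables (U, V) and user-supplied routines for the Riemannian gradients and the per-block descent steps; the names bcd_g, riem_grad_U, riem_grad_V, update_U, and update_V are hypothetical placeholders, not the paper's implementation. In the paper's setting, update_U would be a retraction-based line search step on the complex Stiefel manifold and update_V a Jacobi-GLU or Jacobi-GLQ elementary-rotation update in the special linear group.

import numpy as np

def bcd_g(U0, V0, riem_grad_U, riem_grad_V, update_U, update_V,
          max_iter=500, tol=1e-8):
    """Generic BCD-G loop over two block variables (U, V).

    riem_grad_U(U, V), riem_grad_V(U, V): Riemannian gradients of the
        cost with respect to each block (as arrays).
    update_U(U, V), update_V(U, V): one descent step in the chosen block,
        e.g. a line search step on the Stiefel manifold or an
        elementary-rotation update in the special linear group.
    """
    U, V = U0, V0
    for _ in range(max_iter):
        gU = np.linalg.norm(riem_grad_U(U, V))  # Frobenius norm
        gV = np.linalg.norm(riem_grad_V(U, V))
        if max(gU, gV) < tol:        # approximate joint stationarity
            break
        if gU >= gV:                 # BCD-G rule: update the block with
            U = update_U(U, V)       # the larger Riemannian gradient,
        else:                        # rather than cycling
            V = update_V(U, V)
    return U, V

The point of the selection rule is that each iteration works on the block that currently offers the steepest descent direction, which is what the convergence analysis via the Łojasiewicz gradient inequality exploits.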