A Family of Iteration Functions for General Linear Systems
We develop novel theory and algorithms for computing an approximate solution to Ax = b, or to A^T A x = A^T b, where A is an m × n real matrix of arbitrary rank. First, we describe the Triangle Algorithm (TA): given an ellipsoid E_{A,ρ} = {Ax : ‖x‖ ≤ ρ}, in each iteration it either computes a successively improving approximation b_k = Ax_k ∈ E_{A,ρ}, or proves b ∉ E_{A,ρ}. We then extend TA to compute an approximate solution or a minimum-norm solution. Next, we develop a dynamic version of TA, the Centering Triangle Algorithm (CTA), generating residuals r_k = b - Ax_k via iterations of the simple formula F_1(r) = r - (r^T H r / r^T H^2 r) H r, where H = A when A is symmetric PSD and H = AA^T otherwise; H need not be computed explicitly. More generally, CTA extends to a family of iteration functions F_t(r), t = 1, …, m, satisfying the following. On the one hand, given t ≤ m and r_0 = b - Ax_0, where x_0 = A^T w_0 with w_0 ∈ ℝ^m arbitrary, for all k ≥ 1 we have r_k = F_t(r_{k-1}) = b - Ax_k, and A^T r_k converges to zero. Algorithmically, if H is invertible with condition number κ, then in k = O((κ/t) ln ε^{-1}) iterations ‖r_k‖ ≤ ε. If H is singular, with κ^+ the ratio of its largest to smallest positive eigenvalues, then in k = O(κ^+/(tε)) iterations either ‖r_k‖ ≤ ε or ‖A^T r_k‖ = O(√ε). If N is the number of nonzero entries of A, each iteration takes O(Nt + t^3) operations. On the other hand, given r_0 = b - Ax_0, suppose its minimal polynomial with respect to H has degree s. Then Ax = b is solvable if and only if F_s(r_0) = 0. Moreover, only A^T A x = A^T b is solvable (and Ax = b is not) if and only if F_s(r_0) ≠ 0 but A^T F_s(r_0) = 0. Additionally, {F_t(r_0)}_{t=1}^s is computable in O(Ns + s^3) operations.
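For concreteness, below is a minimal Python sketch of the t = 1 member of the family, i.e., the residual update F_1(r) = r - (r^T H r / r^T H^2 r) H r with H = AA^T applied only through matrix-vector products, so H is never formed. The function name, stopping tolerance, and iteration cap are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def cta_f1(A, b, w0=None, tol=1e-8, max_iter=10_000):
    """Sketch of the F_1 iteration: r <- r - (r^T H r / r^T H^2 r) H r, H = A A^T.

    Maintains w with x = A^T w so that r = b - A x holds at every step.
    H is applied via the two matvecs A^T r and A (A^T r).
    """
    m, n = A.shape
    w = np.zeros(m) if w0 is None else np.asarray(w0, dtype=float)
    x = A.T @ w                      # x_0 = A^T w_0
    r = b - A @ x                    # r_0 = b - A x_0
    for _ in range(max_iter):
        ATr = A.T @ r                # A^T r
        Hr = A @ ATr                 # H r, with H = A A^T
        denom = Hr @ Hr              # r^T H^2 r
        if denom == 0.0:             # then A^T r = 0: normal equations satisfied
            break
        alpha = (ATr @ ATr) / denom  # r^T H r / r^T H^2 r
        w = w + alpha * r            # keeps r = b - H w after the update below
        r = r - alpha * Hr           # F_1(r)
        if np.linalg.norm(r) <= tol or np.linalg.norm(A.T @ r) <= tol:
            break
    return A.T @ w, r
```

Calling, say, cta_f1(np.random.randn(50, 30), np.random.randn(50)) returns an approximate x and its residual; the test on ‖A^T r‖ corresponds to the case, described above, where Ax = b is unsolvable but the normal equations A^T A x = A^T b are approximately satisfied.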