Talk by Chris Paige, McGill University

The surprising accuracy of Lanczos tridiagonalization for eigenproblems and solution of equations


Cornelius Lanczos's 1952 process for tridiagonalizing a symmetric matrix \(A\in\mathbb{R}^{n\times n}\) is the basis for many well-known and widely used large sparse matrix algorithms. In finite precision the process can lose orthogonality as soon as the first eigenvector of \(A\) has converged to machine precision, and so the Lanczos process was largely discarded, since many believed it did not work. But since about 1971 it has become the basis for some of our most powerful tools for large sparse matrix problems---even though its computational behaviour was poorly understood.
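The three-term recurrence underlying the process can be sketched as follows (a minimal NumPy illustration, not the speaker's code; the function name and interface are assumptions). It performs no reorthogonalization, so for large problems it exhibits exactly the orthogonality loss described above:

```python
import numpy as np

def lanczos(A, v, k):
    """Run k steps of the Lanczos process on symmetric A with start vector v.

    Returns the diagonal (alphas) and off-diagonal (betas) of the
    tridiagonal matrix T = V^T A V, plus the Lanczos vectors V.
    No reorthogonalization is performed (a bare sketch).
    """
    n = len(v)
    V = np.zeros((n, k))
    alphas = np.zeros(k)
    betas = np.zeros(k - 1)
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(k):
        w = A @ V[:, j]                      # apply the operator
        alphas[j] = V[:, j] @ w              # diagonal entry of T
        w -= alphas[j] * V[:, j]             # orthogonalize against v_j
        if j > 0:
            w -= betas[j - 1] * V[:, j - 1]  # ... and against v_{j-1}
        if j < k - 1:
            betas[j] = np.linalg.norm(w)
            if betas[j] == 0:                # invariant subspace found
                break
            V[:, j + 1] = w / betas[j]
    return alphas, betas, V
```

For small, well-conditioned matrices a full run of \(n\) steps recovers the eigenvalues of \(A\) as the eigenvalues of the tridiagonal \(T\); in large sparse settings one instead stops after \(k \ll n\) steps and uses eigenvalues of \(T\) as Ritz approximations.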

We show that the computational Lanczos process does eventually give all eigenvalues and eigenvectors, or solve equations, with an accuracy similar to that of backward stable methods.

Golub and Kahan's 1965 bidiagonalization of a general matrix \(B\) can be used to find the singular value decomposition of \(B\), or to solve equations or least squares problems involving \(B\). It can be framed and analyzed as a Lanczos process, and so the present results will probably show that it, too, is as accurate as we could hope.
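The bidiagonalization can likewise be sketched as a pair of coupled recurrences (again a minimal NumPy illustration under the standard Golub--Kahan conventions, not the speaker's code; names are illustrative, and no reorthogonalization is done):

```python
import numpy as np

def golub_kahan(B, u, k):
    """Run k steps of Golub-Kahan bidiagonalization of B (m x n).

    Builds orthonormal columns U, V and coefficients alphas, betas
    so that U^T B V is lower bidiagonal (alphas on the diagonal,
    betas on the subdiagonal). Bare sketch: assumes no breakdown
    (no alpha or beta becomes zero) and does no reorthogonalization.
    """
    m, n = B.shape
    U = np.zeros((m, k))
    V = np.zeros((n, k))
    alphas = np.zeros(k)
    betas = np.zeros(k - 1)
    U[:, 0] = u / np.linalg.norm(u)
    for j in range(k):
        v = B.T @ U[:, j]                    # alpha_j v_j = B^T u_j - beta_{j-1} v_{j-1}
        if j > 0:
            v -= betas[j - 1] * V[:, j - 1]
        alphas[j] = np.linalg.norm(v)
        V[:, j] = v / alphas[j]
        if j < k - 1:
            w = B @ V[:, j] - alphas[j] * U[:, j]   # beta_j u_{j+1} = B v_j - alpha_j u_j
            betas[j] = np.linalg.norm(w)
            U[:, j + 1] = w / betas[j]
    return alphas, betas, U, V
```

Running this to completion on a small square matrix reproduces the singular values of \(B\) as those of the bidiagonal factor; in practice (as in LSQR-style solvers) only a few steps are taken and the small bidiagonal subproblem is solved instead.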
