

ISBN: 9780898714029

The Symmetric Eigenvalue Problem

Description
According to Parlett, "Vibrations are everywhere, and so too are the eigenvalues associated with them. As mathematical models invade more and more disciplines, we can anticipate a demand for eigenvalue calculations in an ever richer variety of contexts." Anyone who performs these calculations will welcome the reprinting of Parlett's book (originally published in 1980). In this unabridged, amended version, Parlett covers aspects of the problem that are not easily found elsewhere. The chapter titles, given below, convey the scope of the material succinctly. The aim of the book is to present the mathematical knowledge needed to understand the art of computing eigenvalues of real symmetric matrices, either all of them or only a few. The author explains why the selected information really matters, and he is not shy about making judgments. The commentary is lively but the proofs are terse. The first nine chapters work with a matrix on which similarity transformations can be made explicitly, so the only source of error is inexact arithmetic. The last five chapters turn to large sparse matrices and the task of making approximations and judging them.

Key Features:
- Convergence theory for the Rayleigh quotient iteration (see the sketch after this list).
- The direct rotation as a rival to Householder reflectors.
- Eigenvectors of tridiagonals.
- Convergence theory, simpler than Wilkinson's, for Wilkinson's shift strategy in QL and QR.
- New proofs and sharper results for error bounds.
- Optimal properties of Rayleigh-Ritz approximations.
- Approximation theory from Krylov subspaces, Paige's theorem for noisy Lanczos algorithms, and semiorthogonality among Lanczos vectors.
- Four flavours of subspace iteration.
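The first key feature above concerns the Rayleigh quotient iteration. As an illustration only (not taken from the book), here is a minimal NumPy sketch of the basic iteration for a real symmetric matrix; the function name, tolerance, and iteration cap are our own choices.

import numpy as np

def rayleigh_quotient_iteration(A, x0, tol=1e-12, max_iter=50):
    # A: real symmetric matrix; x0: nonzero starting vector.
    # Returns an approximate eigenpair (rho, x).
    x = x0 / np.linalg.norm(x0)
    rho = x @ A @ x                      # Rayleigh quotient of the current iterate
    n = A.shape[0]
    for _ in range(max_iter):
        try:
            # Solve the shifted system (A - rho I) y = x; near convergence this
            # system is almost singular, which drives y toward an eigenvector.
            y = np.linalg.solve(A - rho * np.eye(n), x)
        except np.linalg.LinAlgError:
            break                        # exactly singular shift: rho is an eigenvalue
        x = y / np.linalg.norm(y)
        rho_new = x @ A @ x
        if abs(rho_new - rho) <= tol * max(1.0, abs(rho_new)):
            rho = rho_new
            break
        rho = rho_new
    return rho, x

# Example: a small random symmetric matrix.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2
rho, x = rayleigh_quotient_iteration(A, rng.standard_normal(5))
print(rho, np.linalg.norm(A @ x - rho * x))   # residual should be tiny

For symmetric matrices this iteration converges cubically to an eigenpair; results of that kind are what the book's convergence theory makes precise.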
Table of Contents

Preface to the First Edition
Preface to the Classics Edition
Introduction
Chapter 1: Basic Facts About Self-Adjoint Matrices
Chapter 2: Tasks, Obstacles, and Aids
Chapter 3: Counting Eigenvalues
Chapter 4: Simple Vector Iterations
Chapter 5: Deflation
Chapter 6: Useful Orthogonal Matrices (Tools of the Trade)
Chapter 7: Tridiagonal Form
Chapter 8: The QL and QR Algorithms
Chapter 9: Jacobi Methods
Chapter 10: Eigenvalue Bounds
Chapter 11: Approximations from a Subspace
Chapter 12: Krylov Subspaces
Chapter 13: Lanczos Algorithms
Chapter 14: Subspace Iteration
Chapter 15: The General Linear Eigenvalue Problem
Appendix A: Rank-One and Elementary Matrices
Appendix B: Chebyshev Polynomials
Annotated Bibliography
References
Index