Linear Algebra and Matrix Analysis for Statistics
Linear Algebra and Matrix Analysis for Statistics offers a gradual exposition of linear algebra without sacrificing the rigor of the subject. It presents both the vector space approach and the canonical forms in matrix theory. The book is as self-contained as possible, assuming no prior knowledge of linear algebra.

The authors first address the rudimentary mechanics of linear systems using Gaussian elimination and the resulting decompositions. They introduce Euclidean vector spaces using less abstract concepts and make connections to systems of linear equations wherever possible. After illustrating the importance of the rank of a matrix, they discuss complementary subspaces, oblique projectors, orthogonality, orthogonal projections and projectors, and orthogonal reduction.

The text then shows how these theoretical concepts are useful in analyzing solutions of linear systems. The authors also explain how determinants can be used to characterize matrices and linear systems and to derive their properties. They then cover eigenvalues, eigenvectors, the singular value decomposition, the Jordan decomposition (including a proof), quadratic forms, and Kronecker and Hadamard products. The book concludes with accessible treatments of advanced topics, such as linear iterative systems, convergence of matrices, more general vector spaces, linear transformations, and Hilbert spaces.