# Linear algebra for pattern processing : projection, singular value decomposition, and pseudoinverse / Kenichi Kanatani.

Material type: Text
Series: Synthesis lectures on signal processing ; #21 | Synthesis digital library of engineering and computer science
Publisher: San Rafael, California (1537 Fourth Street, San Rafael, CA 94901 USA) : Morgan & Claypool Publishers, [2021]
Description: 1 PDF (xiv, 141 pages) : illustrations (some color)
Content type: text
Media type: electronic
Carrier type: online resource
ISBN: 9781636391083
Subject(s): Algebras, Linear | Pattern recognition systems -- Mathematical models | linear spaces | eigenvalues | spectral decomposition | singular value decomposition | pseudoinverse | least-squares solution | Karhunen-Loève expansion | principal component analysis | trifocal tensors
Genre/Form: Electronic books.
Additional physical formats: Print version: No title
DDC classification: 526/.1/015125
LOC classification: TA347.L5 | K365 2021eb
Online resources: Abstract with links to resource | Abstract with links to full text
Also available in print.

| Item type | Current library | Call number | Status | Date due | Barcode |
|---|---|---|---|---|---|
| Ebooks | Indian Institute of Technology Delhi - Central Library | | Available | | |

Mode of access: World Wide Web.

System requirements: Adobe Acrobat Reader.

Part of: Synthesis digital library of engineering and computer science.

Includes bibliographical references (pages 135-136) and index.

1. Introduction -- 1.1. Linear space and projection -- 1.2. Eigenvalues and spectral decomposition -- 1.3. Singular values and singular value decomposition -- 1.4. Pseudoinverse -- 1.5. Least-squares solution of linear equations -- 1.6. Probability distribution of vectors -- 1.7. Fitting spaces -- 1.8. Matrix factorization -- 1.9. Triangulation from three views -- 1.10. Fundamentals of linear algebra

2. Linear space and projection -- 2.1. Expression of linear mapping -- 2.2. Subspaces, projection, and rejection -- 2.3. Projection matrices -- 2.4. Projection onto lines and planes -- 2.5. Schmidt orthogonalization -- 2.6. Glossary and summary -- 2.7. Supplemental notes -- 2.8. Problems

3. Eigenvalues and spectral decomposition -- 3.1. Eigenvalues and eigenvectors -- 3.2. Spectral decomposition -- 3.3. Diagonalization of symmetric matrices -- 3.4. Inverse and powers -- 3.5. Glossary and summary -- 3.6. Supplemental notes -- 3.7. Problems

4. Singular values and singular value decomposition -- 4.1. Singular values and singular vectors -- 4.2. Singular value decomposition -- 4.3. Column domain and row domain -- 4.4. Matrix representation -- 4.5. Glossary and summary -- 4.6. Supplemental notes -- 4.7. Problems

5. Pseudoinverse -- 5.1. Pseudoinverse -- 5.2. Projection onto the column and row domains -- 5.3. Pseudoinverse of vectors -- 5.4. Rank-constrained pseudoinverse -- 5.5. Evaluation by matrix norm -- 5.6. Glossary and summary -- 5.7. Supplemental notes -- 5.8. Problems

6. Least-squares solution of linear equations -- 6.1. Linear equations and least squares -- 6.2. Computing the least-squares solution -- 6.3. Multiple equations of one variable -- 6.4. Single multivariate equation -- 6.5. Glossary and summary -- 6.6. Supplemental notes -- 6.7. Problems

7. Probability distribution of vectors -- 7.1. Covariance matrices of errors -- 7.2. Normal distribution of vectors -- 7.3. Probability distribution over a sphere -- 7.4. Glossary and summary -- 7.5. Supplemental notes -- 7.6. Problems

8. Fitting spaces -- 8.1. Fitting subspaces -- 8.2. Hierarchical fitting -- 8.3. Fitting by singular value decomposition -- 8.4. Fitting affine spaces -- 8.5. Glossary and summary -- 8.6. Supplemental notes -- 8.7. Problems

9. Matrix factorization -- 9.1. Matrix factorization -- 9.2. Factorization for motion image analysis -- 9.3. Supplemental notes -- 9.4. Problems

10. Triangulation from three views -- 10.1. Trinocular stereo vision -- 10.2. Trifocal tensor -- 10.3. Optimal correction of correspondences -- 10.4. Solving linear equations -- 10.5. Efficiency of computation -- 10.6. 3d position computation -- 10.7. Supplemental notes -- 10.8. Problems

A. Fundamentals of linear algebra -- A.1. Linear mappings and matrices -- A.2. Inner product and norm -- A.3. Linear forms -- A.4. Quadratic forms -- A.5. Bilinear forms -- A.6. Basis and expansion -- A.7. Least-squares approximation -- A.8. Lagrange's method of indeterminate multipliers -- A.9. Eigenvalues and eigenvectors -- A.10. Maximum and minimum of a quadratic form -- B. Answers.

Abstract freely available; full-text restricted to subscribers or individual document purchasers.

Indexed in: Compendex; INSPEC.

Linear algebra is one of the most basic foundations of a wide range of scientific domains, and most linear algebra textbooks are written by mathematicians. This book, however, is specifically intended for students and researchers of pattern information processing: those who analyze signals such as images and explore computer vision and computer graphics applications. The author is himself a researcher in this domain. Pattern information processing deals with large amounts of data, represented by high-dimensional vectors and matrices, and there the role of linear algebra is not merely the numerical computation of large-scale vectors and matrices. In fact, data processing is usually accompanied by "geometric interpretation." For example, we can think of one data set as being "orthogonal" to another, define a "distance" between them, or invoke geometric relationships such as "projecting" some data onto some space. Such geometric concepts not only help us mentally visualize abstract high-dimensional spaces in intuitive terms but also lead us to the kind of processing appropriate for a given goal. First, we take up the concept of "projection" in linear spaces and describe "spectral decomposition," "singular value decomposition," and "pseudoinverse" in terms of projection. As applications, we discuss least-squares solutions of simultaneous linear equations and covariance matrices of probability distributions of vector random variables that are not necessarily positive definite. We also discuss fitting subspaces to point data and factorizing matrices in high dimensions in relation to motion image analysis. Finally, we introduce a computer vision application, reconstructing the 3D location of a point from three camera views, to illustrate the role of linear algebra in dealing with noisy data. This book is expected to help students and researchers of pattern information processing deepen their geometric understanding of linear algebra.
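As a concrete illustration of the themes named in the abstract, the following sketch (using NumPy; the matrix `A` and vector `b` are made-up toy data, not taken from the book) builds a pseudoinverse from the singular value decomposition and uses it to obtain the least-squares solution of an overdetermined linear system:

```python
import numpy as np

# Hypothetical overdetermined system A x = b (more equations than unknowns).
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])

# Singular value decomposition: A = U diag(s) V^T.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Pseudoinverse A^+ = V diag(1/s) U^T, inverting only singular values
# above a small tolerance (so rank-deficient A is handled gracefully).
tol = max(A.shape) * np.finfo(float).eps * s.max()
s_inv = np.where(s > tol, 1.0 / s, 0.0)
A_pinv = Vt.T @ np.diag(s_inv) @ U.T

# The least-squares solution of A x = b is x = A^+ b.
x = A_pinv @ b
print(x)  # for this toy data, x = [0.0, 0.5], since b is exactly 0.5 * (second column of A)
```

Zeroing out the reciprocals of negligible singular values is what makes the pseudoinverse well defined even for rank-deficient matrices; `np.linalg.pinv` and `np.linalg.lstsq` perform essentially this computation internally.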


Title from PDF title page (viewed on May 3, 2021).
