Syllabus Detail

Department of Mathematics Syllabus

This syllabus is advisory only. For details on a particular instructor's syllabus (including books), consult the instructor's course page. For a list of what courses are being taught each quarter, refer to the Courses page.

MAT 167: Applied Linear Algebra

Approved: 2018-05-30, N. Saito, J. De Loera, E. G. Puckett
Suggested Textbook: (the actual textbook varies by instructor; check with your instructor)
Lars Eldén: Matrix Methods in Data Mining and Pattern Recognition, SIAM, 2007
https://doi.org/10.1137/1.9780898718867
ISBN: 978-0-89871-626-9
Prerequisites:
MAT 022A or MAT 027A or MAT 067 or BIS 027A.
Suggested Schedule:

The lecture notes prepared and used by Naoki Saito are available at: https://www.math.ucdavis.edu/~saito/courses/167/lectures.html. The corresponding homework problems as well as sample midterm and final exam problems are available upon request.

Lectures / Sections / Topics
(Chapter numbers refer to Eldén's book; the corresponding lecture slides by Saito are easy to identify.)

1st Week (Chap. 1 and various sources; see notes below): Motivational introduction with examples; review of basic linear algebra: the meaning of matrix-vector and matrix-matrix multiplications

2nd Week (Chap. 2): Review of basic linear algebra: range; null space; linear independence; bases; dimension; rank; inverse matrices; inner products; vector and matrix norms

3rd Week (Sec. 3.6; Chaps. 4, 5): Introductory least squares problems; orthogonality; projectors; QR factorization; classical Gram-Schmidt orthogonalization

4th Week (Chap. 5): Modified Gram-Schmidt; Householder triangularization; Givens rotations

5th Week (Midterm Exam; Chap. 6): Introduction to the Singular Value Decomposition (SVD)

6th Week (Chaps. 6, 7): Low-rank approximation; condition numbers; SVD vs. Principal Component Analysis

7th Week (Chap. 9): Data clustering; Nonnegative Matrix Factorization

8th Week (Chap. 10): Applications of SVD I: pattern classification

9th Week (Chap. 11): Applications of SVD II: text mining

10th Week (Chap. 12, plus extra material and handouts): Applications of SVD III: search engines (see notes below)

  • In the 1st Week, it is important to motivate the students with examples showing how linear algebra is used in the real world. Suggested examples: music signal representation and compression, image compression, web search engines, inverse problems (e.g., tomography), etc.
  • In the 8th and 9th Weeks, the instructor should present applications of the SVD, including web search engines, least squares problems, image approximation, pattern classification, and inverse problems, in more detail; some of these were discussed in the 1st Week.
  • In the 10th Week, the instructor can freely choose various applications of interest. The instructor is encouraged to check the following optional textbooks for further examples and applications:
    • Carl D. Meyer: Matrix Analysis and Applied Linear Algebra, SIAM, 2000.
    • Lloyd N. Trefethen and David Bau, III: Numerical Linear Algebra, SIAM, 1997.
    • Michael W. Berry and Murray Browne: Understanding Search Engines: Mathematical Modeling and Text Retrieval, 2nd Ed., SIAM, 2005. Available free online via UCD’s SIAM subscription at https://doi.org/10.1137/1.9780898718164
    • David Skillicorn: Understanding Complex Datasets: Data Mining with Matrix Decompositions, Chapman & Hall/CRC, 2007. Available free online via UCD’s subscription at https://www.taylorfrancis.com/books/9781584888338
    • Cleve Moler: Numerical Computing with MATLAB. Available for free online at https://www.mathworks.com/moler/chapters.html. Chapter 2: Linear Equations, Chapter 5: Least Squares, and Chapter 10: Eigenvalues and Singular Values are quite relevant for this course.
Additional Notes:

Naoki Saito’s lecture slides are available online at: https://www.math.ucdavis.edu/~saito/courses/167/lectures.html. The ebook is available free of charge thanks to UCD’s SIAM subscription: https://doi.org/10.1137/1.9780898718867

  • Basic concepts of linear algebra (22A or 67) are assumed, but the most important concepts will be briefly reviewed in class.
  • Important: our Math Analytics & Operations Research majors are required to take this course. Thus, data science applications (e.g., pattern classification, text mining, search engines, etc.) must be included.
  • Since this course emphasizes modern applications, it is reasonable to have a longer, more difficult final project instead of an in-class final exam. Here are three examples of final projects:
    • Google's PageRank algorithm: The challenge is to explain and illustrate how Google's search algorithm works. The project is nicely structured by solving Problems 2.26 and 2.27 in Chapter 2 of Moler.
    • Dolby-style sound and noise removal: The challenge is, given a noisy sound (e.g., an AM radio broadcast), to clean it up by decomposing it into Fourier components. Chapter 8 of Moler's book shows how to work with sounds in MATLAB; students can structure a final project starting with solutions to Problems 8.8 and 8.9 of Moler's book.
    • Image processing and eigenvalues: The challenge is to illustrate in a meaningful way how linear algebra can be used to analyze simple images. For example, how can you recognize a particular face (say, Barack Obama's) within a gallery of pictures? The principles of recognizing images using the SVD are in Eldén's book, and Problems 10.12 and 10.13 in Chapter 10 of Moler provide good concrete challenges with which to start a nice project.
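For the PageRank project, the core computation is a power iteration on a column-stochastic link matrix. The following is a minimal sketch in Python/NumPy; the tiny four-page graph and the damping factor 0.85 are illustrative assumptions, not taken from Moler's problems:

```python
import numpy as np

# Tiny illustrative web graph (4 pages): A[i, j] = 1 means page j links to page i.
# This particular graph is an assumption chosen for illustration only.
A = np.array([[0, 0, 1, 1],
              [1, 0, 0, 0],
              [1, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)

# Column-normalize so each column sums to 1: a page splits its "vote" equally
# among its out-links.
G = A / A.sum(axis=0)

def pagerank(G, damping=0.85, tol=1e-10, max_iter=1000):
    """Power iteration on M = damping * G + (1 - damping)/n * (all-ones matrix)."""
    n = G.shape[0]
    x = np.full(n, 1.0 / n)          # start from the uniform distribution
    for _ in range(max_iter):
        x_new = damping * (G @ x) + (1.0 - damping) / n
        if np.linalg.norm(x_new - x, 1) < tol:
            return x_new
        x = x_new
    return x

r = pagerank(G)  # r[i] is the importance score of page i; the scores sum to 1
```

In this toy graph, page 0 receives the full vote of two pages, so it ends up with the largest score; ranking the entries of `r` is exactly what a search engine would do with real link data.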
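For the noise-removal project, the idea of "decomposing into Fourier components" can be sketched as a crude low-pass filter: transform the signal, zero out the frequency bins above a cutoff, and transform back. The signal (a 50 Hz tone), the noise level, the sample rate, and the 100 Hz cutoff below are all illustrative assumptions, not values from Moler's Chapter 8:

```python
import numpy as np

fs = 1000                       # samples per second (assumed)
t = np.arange(0, 1, 1 / fs)     # one second of samples
clean = np.sin(2 * np.pi * 50 * t)             # a pure 50 Hz tone
rng = np.random.default_rng(0)
noisy = clean + 0.5 * rng.standard_normal(t.size)  # tone buried in white noise

# Decompose into Fourier components and keep only frequencies below a cutoff.
spectrum = np.fft.rfft(noisy)
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
spectrum[freqs > 100] = 0       # crude low-pass filter at 100 Hz
denoised = np.fft.irfft(spectrum, n=t.size)
```

Because white noise spreads its energy evenly across all frequencies while the tone lives in a single bin, discarding the high-frequency bins removes most of the noise energy and leaves the tone essentially intact; the same recipe applies to a recorded sound loaded from a file.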
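For the image-processing project, the workhorse is the truncated SVD: by the Eckart-Young theorem, keeping the k largest singular triples gives the best rank-k approximation of a matrix. The sketch below uses a synthetic "image" (a random low-rank matrix plus a little noise) as a stand-in for a real photo; the sizes and noise level are assumptions for illustration:

```python
import numpy as np

# Build a 64x64 matrix that is approximately rank 5, as a stand-in for an image.
rng = np.random.default_rng(1)
U0 = rng.standard_normal((64, 5))
V0 = rng.standard_normal((5, 64))
image = U0 @ V0 + 0.01 * rng.standard_normal((64, 64))

# Thin SVD: image = U @ diag(s) @ Vt, with s sorted in decreasing order.
U, s, Vt = np.linalg.svd(image, full_matrices=False)

def rank_k(U, s, Vt, k):
    """Best rank-k approximation (Eckart-Young): keep the k largest triples."""
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

approx = rank_k(U, s, Vt, 5)
# Relative Frobenius-norm error of the rank-5 approximation.
rel_err = np.linalg.norm(image - approx) / np.linalg.norm(image)
```

On a real photo the same code compresses the image (storing k columns of U and Vt plus k singular values instead of the full matrix), and comparing a new face image against the leading singular vectors of a gallery is the starting point for the SVD-based face recognition described in Eldén's book.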