Linear algebra #3: projections, orthogonalization, and least-squares

Mike X Cohen, Neuroscientist, teacher, writer

15 Videos (2h 43m)
  • Projections in R^2 (9:37)
  • Projections in R^N (14:08)
  • Orthogonal and parallel vector components (10:55)
  • Code challenge: decompose vector to orthogonal components (8:57)
  • Orthogonal matrices (13:44)
  • Gram-Schmidt and QR decomposition (15:39)
  • Matrix inverse via QR decomposition (2:26)
  • Code challenge: Inverse via QR (7:52)
  • Introduction to least-squares (12:55)
  • Least-squares via left inverse (10:49)
  • Least-squares via orthogonal projection (7:55)
  • Least-squares via row-reduction (10:50)
  • Model-predicted values and residuals (6:32)
  • Least-squares application 1 (12:22)
  • Least-squares application 2 (18:14)
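As a taste of the material in the list above, here is a minimal NumPy sketch (my own illustration, not the course's code) of two of the topics: decomposing a vector into components parallel and orthogonal to a reference vector, and solving least-squares via the left inverse.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Orthogonal and parallel vector components ---
w = rng.standard_normal(4)   # target vector
v = rng.standard_normal(4)   # reference vector

beta = (w @ v) / (v @ v)     # scalar projection coefficient
w_par = beta * v             # component of w parallel to v
w_perp = w - w_par           # component of w orthogonal to v

# The two components reconstruct w, and w_perp is orthogonal to v
assert np.allclose(w_par + w_perp, w)
assert np.isclose(w_perp @ v, 0)

# --- Least-squares via the left inverse ---
A = rng.standard_normal((10, 3))         # tall, full-rank design matrix
b = rng.standard_normal(10)              # data vector

left_inv = np.linalg.inv(A.T @ A) @ A.T  # (A^T A)^{-1} A^T
x = left_inv @ b                         # least-squares solution to Ax ~ b

# Agrees with NumPy's built-in least-squares solver
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])
```

In practice `np.linalg.lstsq` (or a QR-based solve, also covered in the course) is preferred over forming the left inverse explicitly, since inverting A^T A is numerically less stable.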

About This Class

This class follows on from Linear algebra #1 and #2.


Mike X Cohen

Neuroscientist, teacher, writer

Officially I'm Dr. Michael X Cohen, but I prefer just "Mike" or "Mike X" or "the mysterious X." I'm a scientist because I believe that discovery and the drive to understand mysteries are among the most important drivers of progress in human civilization. And I believe in teaching because, well, because I really like teaching. I've been doing it my whole life. I teach "real-life" courses, online courses, university courses. I've written several books about neuroscience and data analysis, which...
