# Computational Linear Algebra for Coders

Summer 2017, Prof. Rachel Thomas

Updated: 2 Feb 2019


This course is focused on the question: How do we do matrix computations with acceptable speed and acceptable accuracy? The course is taught in Python with Jupyter Notebooks, using libraries such as scikit-learn and numpy for most lessons, as well as numba and pytorch in a few lessons.


Course materials available here: https://github.com/fastai/numerical-linear-algebra

SVD is intimately connected to the eigendecomposition, so we will now learn how to calculate eigenvalues for a large matrix. We will use DBpedia, a large dataset of Wikipedia links; the principal eigenvector of the link matrix gives the relative importance of different Wikipedia pages (this is the basic idea behind Google's PageRank algorithm).

Topics covered:

- Full vs Reduced Factorizations

- Matrix Inversion is Unstable

- SVD

- DBpedia Dataset

- Power Method

This material is reviewed in the Lesson 10 video.
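The power method listed above can be sketched in a few lines of NumPy: repeatedly multiply a random vector by the matrix and renormalize, and the iterate converges to the principal eigenvector (this is a minimal illustration, not the course notebook's code; the function name and tolerance are illustrative choices).

```python
import numpy as np

def power_method(A, n_iter=100, tol=1e-10):
    """Estimate the dominant eigenvalue/eigenvector of A by power iteration.

    Note: this is an illustrative sketch; it assumes A has a unique
    dominant eigenvalue so the iteration converges.
    """
    rng = np.random.default_rng(0)
    v = rng.random(A.shape[0])
    v /= np.linalg.norm(v)          # start from a random unit vector
    for _ in range(n_iter):
        w = A @ v                   # one matrix-vector multiply per step
        w /= np.linalg.norm(w)      # renormalize to avoid overflow/underflow
        if np.linalg.norm(w - v) < tol:
            v = w
            break
        v = w
    eigenvalue = v @ A @ v          # Rayleigh quotient of the unit iterate
    return eigenvalue, v

# Tiny symmetric example: the dominant eigenvalue of [[2, 1], [1, 2]] is 3,
# with eigenvector proportional to [1, 1].
A = np.array([[2.0, 1.0], [1.0, 2.0]])
lam, v = power_method(A)
```

In the DBpedia lesson the same idea is applied to a sparse link matrix, where each iteration is just one sparse matrix-vector product.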

Course overview blog post: http://www.fast.ai/2017/07/17/num-lin-alg/

Taught in the University of San Francisco MS in Analytics (MSAN) graduate program: https://www.usfca.edu/arts-sciences/graduate-programs/analytics

Ask questions about the course on our fast.ai forums: http://forums.fast.ai/c/lin-alg

Sam

Sep 12, 2018

Excellent course. It helped me understand topics that I couldn't while attending my college.

Dembe

March 29, 2019

Great course. Thank you very much.