
Computational Linear Algebra for Coders

Summer 2017, Prof. Rachel Thomas

Updated on 2 Feb 2019

Overview

This course is focused on the question: How do we do matrix computations with acceptable speed and acceptable accuracy? The course is taught in Python with Jupyter Notebooks, using libraries such as scikit-learn and NumPy for most lessons, as well as Numba and PyTorch in a few lessons.

Includes

Lecture 2: Computational Linear Algebra 2: Topic Modelling with SVD & NMF

Rating: 4.1 (11 ratings)


Lecture Details

Course materials available here: https://github.com/fastai/numerical-linear-algebra
We use a dataset of messages posted on discussion forums to identify topics. A term-document matrix represents the frequency of the vocabulary in the documents. We factor it using Singular Value Decomposition (SVD) and Non-negative Matrix Factorization (NMF).

We use PyTorch as a GPU-accelerated alternative to NumPy to speed things up, and we cover Stochastic Gradient Descent (SGD), a very useful, general-purpose optimization algorithm.
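The lesson applies SGD in PyTorch; as a framework-free sketch of the algorithm itself, here is SGD fitting a least-squares line in plain NumPy. The synthetic data, learning rate, and epoch count are illustrative assumptions, not values from the course: each step computes the gradient of the squared error on a single randomly chosen sample and moves the parameters a small step against it.

```python
# Minimal SGD sketch: fit y = w*x + b by stochastic gradient descent.
# Data is synthetic (true slope 3.0, intercept 0.5, small Gaussian noise).
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-1, 1, n)
y = 3.0 * x + 0.5 + 0.1 * rng.standard_normal(n)

w, b = 0.0, 0.0
lr = 0.1                                     # learning rate (step size)
for epoch in range(50):
    for i in rng.permutation(n):             # one random sample per update
        err = (w * x[i] + b) - y[i]
        w -= lr * err * x[i]                 # gradient of 0.5*err**2 w.r.t. w
        b -= lr * err                        # gradient of 0.5*err**2 w.r.t. b

print(w, b)                                  # close to 3.0 and 0.5
```

The same update rule, applied to the entries of the NMF factors instead of a slope and intercept, is what lets the lesson fit NMF with SGD on the GPU.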

This video is fast-paced, so be sure to watch Lesson 3 for a review and Q&A of the topics covered here.

Course overview blog post: http://www.fast.ai/2017/07/17/num-lin-alg/
Taught in the University of San Francisco MS in Analytics (MSAN) graduate program: https://www.usfca.edu/arts-sciences/graduate-programs/analytics
Ask questions about the course on our fast.ai forums: http://forums.fast.ai/c/lin-alg

Topics covered:
- Singular Value Decomposition (SVD)
- Non-negative Matrix Factorization (NMF)
- Stochastic Gradient Descent (SGD)
- Intro to PyTorch

Comments

Sam

Excellent course, it helped me understand topics that I couldn't while attending my college.


Dembe

Great course. Thank you very much.
