You already know about fast.ai, so I won’t bore you with yet another explanation. But while you may be familiar with fast.ai’s fantastic deep learning courses, perhaps you don’t know about their equally remarkable Computational Linear Algebra course.

The course, created and taught by Rachel Thomas, co-founder of fast.ai, is equal parts Jupyter notebook-based textbook and accompanying lecture video series, both made by Rachel for the course. What exactly does it cover?

This course is focused on the question: How do we do matrix computations with acceptable speed and acceptable accuracy?

What does it take to understand and utilize computational linear algebra in the wild, and why would you bother? From the course textbook’s Motivation section in Chapter 1:

It’s not just about knowing the contents of existing libraries, but knowing how they work too. That’s because often you can make variations to an algorithm that aren’t supported by your library, giving you the performance or accuracy that you need. In addition, this field is moving very quickly at the moment, particularly in areas related to deep learning, recommendation systems, approximate algorithms, and graph analytics, so you’ll often find there are recent results that could make a big difference in your project but aren’t in your library.
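
To give that speed-versus-accuracy question a concrete flavor, here is a minimal sketch (my own illustration, not taken from the course materials) comparing NumPy's exact, full SVD with scikit-learn's randomized_svd, an approximate algorithm of the kind the course studies, on a matrix whose singular values decay quickly:

```python
import time

import numpy as np
from sklearn.utils.extmath import randomized_svd

# A matrix with rapidly decaying singular values (low rank plus a little
# noise): the setting where randomized methods shine. Sizes are arbitrary.
rng = np.random.default_rng(0)
A = rng.standard_normal((2000, 50)) @ rng.standard_normal((50, 1000))
A += 0.01 * rng.standard_normal(A.shape)

k = 10  # number of singular values/vectors we actually need

# Exact: compute the full SVD, then keep the top k components.
t0 = time.perf_counter()
U, s, Vt = np.linalg.svd(A, full_matrices=False)
full_time = time.perf_counter() - t0

# Approximate: randomized SVD computes only the top k components.
t0 = time.perf_counter()
Uk, sk, Vtk = randomized_svd(A, n_components=k, random_state=0)
rand_time = time.perf_counter() - t0

print(f"full SVD:       {full_time:.3f} s")
print(f"randomized SVD: {rand_time:.3f} s")
print("max relative error in the top-k singular values:",
      np.max(np.abs(s[:k] - sk) / s[:k]))
```

On a matrix like this, the randomized version typically finishes far faster while its leading singular values agree closely with the exact ones; understanding when and why such approximations are safe to use is the kind of territory the course covers.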
