Instructor: Ashwin Pananjady
TAs: Mouhyemen Khan and TBA
(Tentative) Schedule:
Lectures: MW 3.30-4.45pm, Weber Space Science and Technology Lecture Hall 1
Instructor OH: 5-5.30pm M (outside lecture hall) and 8-9am Th (online)
TA OH and problem solving session: 5.30-7pm F (location TBD)
Description: This course for beginning graduate students develops the mathematical foundations of machine learning, introducing students to modeling and representation, statistical inference, and optimization. The class rigorously builds up the two pillars of modern machine learning: linear algebra and probability. The class is proof-based, and much of our development will be theoretical, although there will also be exposure to simulating algorithms to see when they work (or don’t). The eventual focus is not necessarily to understand cutting-edge applied machine learning (although we will touch upon this), but to understand the mathematical principles on which these ideas are built.
Upon successful completion of the course, you will have learned:
(a) The linear algebraic principles behind modeling function classes, with exposure to both finite and infinite dimensional modeling techniques.
(b) The probabilistic principles that allow us to perform statistical estimation with our models given data.
(c) Some basic principles that govern the design and analysis of optimization algorithms used to fit models to data.
The most important takeaway for some of you might be to recognize that these ideas can help in designing new, principled machine learning methodology, or, conversely, that an immense opportunity exists to place many modern machine learning techniques on a rigorous footing.
Prerequisites:
Students will be expected to have a working knowledge of probability and statistics, linear algebra, and multivariable calculus (at the level of MATH 1553 and 2551 or equivalent), basic optimization, and proficiency with Python programming. The most important prerequisite is that you are open to learning how to write rigorous proofs (if you don’t already know how). Having taken a rigorous, proof-based undergraduate course will prove very helpful. We will try our best to bring you up to speed with some of the prerequisites through auxiliary handouts, reviews during lecture, and optional homework (HW0). Given the breadth of machine learning and the fact that we will try to cover topics with rigor, the class will proceed at a fast pace. Students should expect to do extra work in proportion to the amount of background that they are missing.
Syllabus: See the detailed syllabus for the class here.
HW0 for the class is here. The introductory lecture (with a link to Piazza signup) is here.