This course covers core concepts in machine learning and statistical inference. The ML topics are spectral methods (matrices and tensors), non-convex optimization, probabilistic models, neural networks, representation theory, and generalization. The statistical inference topics are detection and estimation, sufficient statistics, Cramér-Rao bounds, Rao-Blackwell theory, variational inference, and multiple testing. The course assumes students are comfortable with analysis, probability, statistics, and basic programming. Beyond the core concepts, the course encourages students to ask critical questions, such as: How relevant is theory in the age of deep learning? What are the outstanding open problems? Assignments will include exploring failure modes of popular algorithms, in addition to traditional problem-solving questions.
Students are allowed up to 48 late hours over the term. Late hours must be used in whole-hour increments; specify the number of late hours used when turning in the assignment. Late hours cannot be used on the projects. There will be no TA support over the weekends.
Collaboration Policy
Homeworks: (taken from CS 1) It is common for students to discuss ideas for the homework assignments. When you are helping another student with their homework, you are acting as an unofficial teaching assistant, and thus must behave like one. Do not simply answer the question or dictate the code to others. If you just give them your solution or code, you are violating the Honor Code. To clarify how you can help and discuss ideas with other students (especially when it comes to coding and proofs), we ask you to obey the "50 foot rule": while you are helping another student, your own solution must be at least 50 feet away. If you cannot help another student without consulting your solution, don't help them; refer them instead to a teaching assistant.
Projects: Students may collaborate fully within their project teams, but no collaboration is allowed between teams.
| Lecture | Topic |
|---|---|
| Lecture 1 | Introduction, probability |
| Lecture 2 | Models, sufficient statistics, setup for Neyman-Pearson, Bayesian, minimax |
| Lecture 3 | Solving for Neyman-Pearson, Bayesian |
| Lecture 4 | Sequential detection |
| Lecture 5 | Estimation, UMVU, different loss functions |
| Lecture 6 | Cramér-Rao and maximum likelihood |
| Lecture 7 | Stein's method |
| Lecture 8 | First quiz: Jan 30 (in class) |
| Lecture 9 | Spectral methods: PCA/CCA |
| Lecture 10 | Spectral methods: HMMs |
| Lecture 11 | Spectral methods: tensor methods |
| Lecture 12 | Spectral methods: method of moments |
| Lecture 13 | Optimization: non-convex analysis |
| Lecture 14 | Optimization: competitive problems |
| Lecture 15 | Second quiz: Feb 25 (in class) |
| Lecture 16 | Representation theory: neural networks, approximation results |
| Lecture 17 | Representation theory: neural networks, approximation results (continued) |
| Lecture 18 | Generalization theory: VC and Rademacher bounds |
| Lecture 19 | Generalization theory: VC and Rademacher bounds (continued) |
| Final Presentation Day | March 13, 9am-1pm, ANB 105 |
Lecture videos can be found in this YouTube playlist.