MRF 340 / 8:30am-9:50am Mon/Wed / Prof. Fei Sha

GFS 116 / 2:00pm-3:20pm Tue/Thu / Prof. Yan Liu

Farhad Pourtaran (pourtara@usc.edu), Taha Bahadori (mohammab@usc.edu), Wenzhe Li (wenzheli@usc.edu) and Yuan Shi (yuanshi@usc.edu).

The chief objective of this course is to teach methods in pattern classification and machine learning. Key components include statistical learning approaches, including but not limited to various parametric and nonparametric methods for supervised and unsupervised learning problems. The course places particular focus on the theoretical understanding of these methods, as well as their computational implications.

Undergraduate-level training or coursework in linear algebra, calculus and multivariate calculus, and basic probability and statistics is required; an undergraduate-level course in Artificial Intelligence may be helpful but is not required.

As you already know, there will be an entrance exam on the first day of class (Monday or Tuesday, depending on your section). However, because of seating limitations and the number of students interested in this course, we may not have enough seats for everyone on the waitlist. We will therefore seat students for the exam on a first-come, first-served basis. If you cannot be seated for Monday's exam, you are welcome to use Tuesday's exam as a backup. (You may take either exam, irrespective of which section you are interested in.) To prepare for the exam, you can review the references listed in the syllabus or consult the review slides (to be covered in the second class) here.

The exam is closed-book. We will distribute the exam and answer sheets; you only need to bring writing tools (pens or pencils). You must take the exam whether you are registered or still on the waitlist. There are no exceptions to this rule.

The schedule below shows Prof. Liu's lecture dates. Prof. Sha's lectures are one class date ahead.

Date | Topics | Reading Assignment |
---|---|---|
8/26 | Entrance exam | |
8/28 | Overview of ML | |
9/2 | Review of basic math topics | |
9/4 | Nearest neighbors | [PP] 1.4.1-1.4.3; [SL] 13.3 |
9/9 | Decision trees | [PP] 3.5, 16.2; [SL] 6.6.3, 9.2. HW#1 out |
9/11 | Naive Bayes | [PP] 3.5; [SL] 6.6.3 |
9/16 | Logistic regression (Part 1) | [PP] 1.4.6, 8.1-8.3; [SL] 4.1-4.2, 4.4 |
9/18 | Logistic regression (Part 2) | [PP] 1.4.6, 8.1-8.3; [SL] 4.1-4.2, 4.4. HW#2 out |
9/23 | Linear/Gaussian discriminant analysis, perceptron, online learning | [PP] 4.2.1-4.2.5, 8.5.1-8.5.4; [SL] 4.3, 4.5 |
9/25 | Linear regression | [PP] 1.4.5, 7.1-7.3, 7.5.1-7.5.2, 7.5.4, 7.6; [SL] 3.1-3.2 |
9/30 | Overfitting, bias/variance tradeoff | [PP] 1.4.7, 1.4.8; [SL] 7.1-7.3, 7.10 |
10/2 | Bias/variance tradeoff, regularization | [PP] 1.4.7, 1.4.8; [SL] 7.1-7.3, 7.10 |
10/7 | Kernel methods, SVM (Part 1) | [PP] 14.1, 14.2.1-14.2.4, 14.4.1, 14.4.3; [SL] 5.8, 6.3, 6.7 |
10/9 | Kernel methods, SVM (Part 2) | [PP] 14.5.2-14.5.4; [SL] 12.1-12.3. HW#3 out |
10/14 | SVM (Part 3) | [PP] 14.5.2-14.5.4; [SL] 12.1-12.3 |
10/16 | Pragmatics: comparing and evaluating classifiers | [PP] 16.7, 16.8 |
10/21 | Quiz 1 | |
10/23 | Boosting | [PP] 16.4.1-16.4.5, 16.4.8, 16.4.9; [SL] 16.3 |
10/28 | Neural networks and deep learning | [PP] 16.5.1-16.5.6, 28; [SL] 11.3-11.7 |
10/30 | Clustering | [PP] 11.1-11.3, 11.4.1-11.4.4, 11.5; [SL] 14.3.1-14.3.9, 8.5 |
11/4 | Mixture models | [PP] 11.1-11.3, 11.4.1-11.4.4, 11.5; [SL] 14.3.1-14.3.9, 8.5. HW#4 out |
11/6 | Large-scale learning for big data [Slides] | No reading |
11/11 | Dimensionality reduction and visualization | [PP] 12.2; [SL] 14.5.1 |
11/13 | Kernel PCA and hidden Markov models | [PP] 17.1-17.4, 17.5.1-17.5.2 |
11/18 | Hidden Markov models (Part 2) | [PP] 17.1-17.4, 17.5.1-17.5.2. HW#5 out |
11/20 | Introduction to graphical models | [PP] 10.1, 10.2.1-10.2.3, 10.3-10.5 |
11/25 | Other current and trendy topics | No reading |
11/27 | No class | Happy Thanksgiving |
12/2 | Course review/summary | No reading |
12/4 | Quiz 2 (cumulative) | |