Machine Learning Group
Department of Mathematics and Computer Science - Saarland University

MACHINE LEARNING

Winter semester 2010/2011

GENERAL INFORMATION

From a broader perspective, machine learning tries to automate the process of the empirical sciences, namely extracting knowledge about natural phenomena from measured data, with the goal of either better understanding the underlying processes or making good predictions. Machine learning methods are therefore widely used in many different fields: bioinformatics, computer vision, information retrieval, computational linguistics, robotics, and more.

The lecture gives a broad introduction to machine learning methods. After the lecture, students should be able to solve and analyze learning problems.

List of topics (tentative)

  • Reminder of probability theory
  • Maximum Likelihood/Maximum A Posteriori Estimators (see the worked example after this list)
  • Bayesian decision theory
  • Linear classification and regression
  • Kernel methods
  • Model selection and evaluation of learning methods
  • Feature selection
  • Nonparametric methods
  • Boosting, Decision trees
  • Neural networks
  • Structured Output
  • Semi-supervised learning
  • Unsupervised learning (Clustering, Independent Component Analysis)
  • Dimensionality Reduction and Manifold Learning
  • Statistical learning theory
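
As a small worked example for the maximum likelihood topic above (a standard textbook case, not taken from the lecture notes): for i.i.d. samples x_1, ..., x_n from a Gaussian with known variance, the maximum likelihood estimator maximizes the log-likelihood and reduces to the sample mean,

    \hat{\mu}_{\mathrm{ML}} \;=\; \arg\max_{\mu} \sum_{i=1}^{n} \log p(x_i \mid \mu) \;=\; \frac{1}{n} \sum_{i=1}^{n} x_i \quad \text{for } x_i \sim \mathcal{N}(\mu, \sigma^2).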

Previous knowledge of machine learning is not required. Participants should be familiar with linear algebra, analysis, and probability theory at the level of the local `Mathematics for Computer Scientists I-III' lectures.

Type: Core lecture (Stammvorlesung), 9 credit points

LECTURE MATERIAL

Incremental lecture notes: ML Lecture notes (Version: 30.01.2012).

Old lecture notes: PDF. It is not recommended to print them, as the notes will be updated over the semester.

The practical exercises will be in Matlab.
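
To give a flavor of these exercises, here is a minimal Matlab sketch of least-squares linear regression, one of the topics above; the data and variable names are illustrative and not taken from an actual exercise sheet:

    % Minimal least-squares linear regression sketch (illustrative only).
    n = 50;                            % number of training points
    x = rand(n,1);                     % inputs drawn uniformly from [0,1]
    y = 1 + 2*x + 0.1*randn(n,1);      % noisy targets from a linear model
    X = [ones(n,1), x];                % design matrix with a bias column
    w = X \ y;                         % least-squares solution via backslash
    fprintf('Estimated weights: %.2f, %.2f\n', w(1), w(2));

The backslash operator solves the least-squares problem in a numerically stable way (via a QR factorization) and is the standard Matlab idiom for linear regression.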

SLIDES AND EXERCISES

21.10. - Introduction
25.10. - Reminder: Probability Theory Exercise 1 Solution 1
29.10. - Reminder: Probability Theory (cont)
01.11. - no lecture Exercise 2 Solution 2
05.11. - Decision Theory Matlab Decision Boundary Demo
08.11. - Decision Theory (cont.) Exercise 3 Solution 3
11.11. - Empirical risk minimization
15.11. - Linear Regression Exercise 4 Solution 4 Data
15.11. - Linear Regression (cont)
22.11. - Introduction to Optimization Exercise 5 Solution 5 Data (contains Matlab code of Ex4)
26.11. - Linear Classification
29.11. - Linear Classification + Kernels Exercise 6 Solution 6 Data for Ex. 15
03.12. - Linear Classification + Kernels (cont.)
13.12. - Kernels (RKHS+Representer Th.)
17.12. - Kernels (RKHS+Representer Th.) (cont.)
03.01. - Evaluation, ROC-Curve, AUC Exercise 7 Solution 7 Data for Exercise 19
07.01. - Tests, Model selection
10.01. - Feature selection I Exercise 8 Solution 8 Data for Exercise 20/21
14.01. - Feature selection II
17.01. - Feature selection II (cont.) Exercise 9 Solution 9 Data for Exercise 22/23
20.01. - Boosting
24.01. - Decision Trees, Neural Networks, and Nearest Neighbor Methods Exercise 10 Solution 10
27.01. - Semi-supervised Learning
31.01. - K-Means and Spectral Clustering
04.02. - Hierarchical Clustering
07.02. - Dimensionality Reduction
11.02. - Statistical Learning Theory

LITERATURE AND OTHER RESOURCES

The lecture will be partially based on the following books and partially on recent research papers:

  • R. O. Duda, P. E. Hart, and D. G. Stork: Pattern Classification, Wiley, (2000).
  • B. Schölkopf and A. J. Smola: Learning with Kernels, MIT Press, (2002).
  • J. Shawe-Taylor and N. Cristianini: Kernel Methods for Pattern Analysis, Cambridge University Press, (2004).
  • C. M. Bishop: Pattern Recognition and Machine Learning, Springer, (2006).
  • T. Hastie, R. Tibshirani, and J. Friedman: The Elements of Statistical Learning, Springer, second edition, (2008).
  • L. Devroye, L. Györfi, and G. Lugosi: A Probabilistic Theory of Pattern Recognition, Springer, (1996).
  • L. Wasserman: All of Statistics, Springer, (2004).
  • S. Boyd and L. Vandenberghe: Convex Optimization, Cambridge University Press, (2004).

Other resources:

  • Matlab is available on cip[101-114] and cip[220-238].studcs.uni-sb.de, and on gpool[01-27].studcs.uni-sb.de.
    The path is /usr/local/matlab/bin.
    On the Sun workstations, select Applications/studcsApplications/Matlab from the menu.
    Access from outside should be possible via ssh: ssh -X username@computername.studcs.uni-sb.de

NEWS

Certificates (Scheine): You can get the certificate for the machine learning lecture at the secretary's office, E1 1, Room 221, Mon-Thu, 7.30-11.30.

Results of the re-exam, together with the final grades of all participants: Results.

Exam review (Klausureinsicht): You can review your re-exam on Tuesday, 05.04., 16-18, in E1 1, Room 225.

TIME AND LOCATION

Lecture: Mon, 10.15-12, E1 3, HS III, and Fri, 10.15-12, E1 3, HS I

Exercise groups (tentative):

  • Group A, Monday, 16-18, Seminar room U12 (ground floor), E1 1, tutor: Thomas Buehler
  • Group B, Tuesday, 16-18, Seminar room U12 (ground floor), E1 1, tutor: Martin Slawski
  • Group C, Friday, 16-18, Seminar room 3 (216), E2 4, tutor: Maksim Lapin

EXAMS AND GRADING

Exams: Mid-term: 10.12., 14-17     End-term: 18.02., 14-17     Re-exam: 25.03., 14-17

Grading:

  • 50% of the points in the exercises (up to that point) are needed to take part in the exams (end-term/re-exam). In order to be admitted to the end-term and re-exam, you must also have properly presented a solution once in the exercise groups.
  • An exam is passed if you get at least 50% of the points.
  • The grading is based on the best result of the end-term and re-exam.

LECTURER

Prof. Dr. Matthias Hein

Office hours: Mon, 16-18, and Thu, 16-18