Machine Learning

Semester: WS 2018
Type: Lecture
Lecturer:
Credits: V3 + Ü1 (6 ECTS credits)

News

  • We are aware that there are currently some problems with registration for this course in the new RWTHonline system. We are working to fix them. Please be patient.

Lecture Description

The goal of Machine Learning is to develop techniques that enable a machine to "learn" how to perform certain tasks from experience.

The important part here is the learning from experience. That is, we do not try to encode the knowledge ourselves; instead, the machine should learn it on its own from training data. The tools for this are statistical learning and probabilistic inference techniques. Such techniques are used in many real-world applications. This lecture will teach the fundamental machine learning know-how that underlies such capabilities. In addition, we show current research developments and how they are applied to solve real-world tasks.

Example questions that could be addressed with the techniques from the lecture include:

  • Is this email important or spam?
  • What is the likelihood that this credit card transaction is fraudulent?
  • Does this image contain a face?
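As a small flavor of the kind of reasoning covered in the lecture, the sketch below applies a Bayes decision rule to the spam question from the list above. It is illustration only, not course code: all word probabilities and class priors are invented values.

    # Minimal naive-Bayes-style spam check (illustration only).
    # All word frequencies and priors below are made-up values.
    import math

    word_probs = {
        "spam": {"offer": 0.05, "meeting": 0.001, "free": 0.06, "report": 0.002},
        "ham":  {"offer": 0.004, "meeting": 0.03, "free": 0.005, "report": 0.04},
    }
    priors = {"spam": 0.4, "ham": 0.6}  # assumed class priors

    def log_posterior(words, label):
        # log P(label) + sum of log P(word | label); unseen words get a small floor
        score = math.log(priors[label])
        for w in words:
            score += math.log(word_probs[label].get(w, 1e-4))
        return score

    def classify(words):
        # Bayes decision rule with 0/1 loss: pick the class with the higher posterior.
        return max(priors, key=lambda label: log_posterior(words, label))

    print(classify(["free", "offer"]))      # -> spam
    print(classify(["meeting", "report"]))  # -> ham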

Exercises

The class is accompanied by exercises that will allow you to collect hands-on experience with the algorithms introduced in the lecture.

There will be both pen-and-paper exercises and practical programming exercises (roughly 1 exercise sheet every 2 weeks). Please submit your solutions electronically through the L2P system.

We ask you to work in teams of 2-3 students.

Literature

The first half of the lecture will follow the book by Bishop. For the second half, we will use the Deep Learning book by Goodfellow as a reference.

  • Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006
  • Ian Goodfellow, Yoshua Bengio, Aaron Courville, Deep learning, MIT Press, 2016

Wherever research papers are necessary for a deeper understanding, we will make them available in the L2P.

Additional Resources

  • Kevin Murphy, Machine Learning -- A Probabilistic Perspective, MIT Press, 2012.

Python Resources

Course Schedule
  • Introduction: Introduction, Probability Theory, Bayes Decision Theory, Minimizing Expected Loss
  • Prob. Density Estimation I: Parametric Methods, Gaussian Distribution, Maximum Likelihood
  • Prob. Density Estimation II: Bayesian Learning, Nonparametric Methods, Histograms, Kernel Density Estimation
  • Prob. Density Estimation III: Mixture of Gaussians, k-Means Clustering, EM-Clustering, EM Algorithm
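
To give a flavor of the practical programming exercises, here is a minimal k-means clustering sketch, one of the topics in the schedule above. It is not taken from the course materials: it assumes NumPy is available, and the toy data and number of clusters are invented for illustration.

    # Minimal k-means sketch (illustration only), assuming NumPy.
    import numpy as np

    rng = np.random.default_rng(0)
    # Two hypothetical 2-D clusters as toy data.
    data = np.vstack([rng.normal(0.0, 0.5, (50, 2)),
                      rng.normal(3.0, 0.5, (50, 2))])

    def kmeans(points, k, n_iter=20):
        # Initialize centers by picking k random data points.
        centers = points[rng.choice(len(points), k, replace=False)]
        for _ in range(n_iter):
            # Assignment step: each point goes to its nearest center.
            dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            # Update step: each center becomes the mean of its assigned points.
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = points[labels == j].mean(axis=0)
        return centers, labels

    centers, labels = kmeans(data, k=2)
    print(centers)  # roughly the two cluster means (near (0, 0) and (3, 3))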