
Please sign up in Moodle for "Probabilistic Graphical Models 2019" or "PGM2019".



Here is the course’s teaching schedule. Check the description for possible other schedules.

Tue 15.1.2019, 16:15–18:00
Thu 17.1.2019, 16:15–18:00
Tue 22.1.2019, 16:15–18:00
Thu 24.1.2019, 16:15–18:00
Tue 29.1.2019, 16:15–18:00
Thu 31.1.2019, 16:15–18:00
Tue 5.2.2019, 16:15–18:00
Thu 7.2.2019, 16:15–18:00
Tue 12.2.2019, 16:15–18:00
Thu 14.2.2019, 16:15–18:00
Tue 19.2.2019, 16:15–18:00
Thu 21.2.2019, 16:15–18:00
Tue 26.2.2019, 16:15–18:00
Thu 28.2.2019, 16:15–18:00

Other teaching

16.01.–27.02.2019, Wed 12:15–14:00, Kari Rantanen (teaching language: English)
22.01.–26.02.2019, Tue 14:15–16:00, Kari Rantanen (teaching language: English)


Master's Programme in Data Science is responsible for the course.

The course belongs to the Machine learning module.

The course is available to students from other degree programmes.

Prerequisites in terms of knowledge

The students must be familiar with the basics of first-order logic and probability calculus (multivariate distributions, Bayes formula). For example, the course "Introduction to Machine Learning" covers these preliminaries.

Prerequisites for students in the Data Science programme, in terms of courses

DATA11002 Introduction to Machine Learning

Prerequisites for other students in terms of courses

DATA11002 Introduction to Machine Learning

Recommended preceding courses


Data Science Project I–II


Learning outcomes

  • Can explain the meaning of a Bayesian network model as a parametric model (a set of probability distributions), as a factorization of a joint probability distribution, and as an independence model (using d-separation, and local and global Markov properties).
  • Can derive the maximum likelihood parameters, the maximum a posteriori parameters (with a conjugate prior), and the expected parameters for the Multinomial distribution.

  • Can explain the role of the parameter prior in parameter learning

  • Can explain the model structure learning problem and how that differs from the parameter learning problem

  • Can explain what overfitting is

  • Can explain the concept of equivalence class and determine whether two networks are equivalent or not

  • Can compute conditional distributions from simple discrete probabilistic models, like a fixed Naïve Bayes classifier, a finite mixture model or a Hidden Markov Model
  • Can implement a probabilistic inference algorithm for a fixed singly-connected graph with the parameters given
  • Can learn a Naïve Bayes Classifier from a set of data and use the model for predictive inference

  • Can learn the parameters of a Bayesian network from a set of data and use the model for predictive inference

  • Knows how to compute the marginal likelihood for discrete Bayesian networks and can explain how to use it for model structure selection
  • Can implement an algorithm for learning a discrete Bayesian network, given data
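One of the outcomes above, learning a Naïve Bayes classifier from data and using it for predictive inference, can be sketched roughly as follows. This is a minimal illustrative example, not course material: the tiny binary data set and the add-one (Dirichlet(1, 1)) smoothing choice are assumptions made here for concreteness.

```python
# Hypothetical sketch: train a binary-feature Naive Bayes classifier and
# predict the most probable class. Data and smoothing are illustrative.
from collections import Counter

def train_nb(X, y):
    """Expected parameters with add-one smoothing: P(c) and P(x_j = 1 | c)."""
    n = len(y)
    classes = sorted(set(y))
    counts = Counter(y)
    prior = {c: (counts[c] + 1) / (n + len(classes)) for c in classes}
    cond = {}
    for c in classes:
        rows = [x for x, yc in zip(X, y) if yc == c]
        d = len(X[0])
        cond[c] = [(sum(r[j] for r in rows) + 1) / (len(rows) + 2) for j in range(d)]
    return prior, cond

def predict_nb(prior, cond, x):
    """Return the class maximizing P(c) * prod_j P(x_j | c), i.e. P(c | x) up to normalization."""
    def score(c):
        p = prior[c]
        for j, xj in enumerate(x):
            p *= cond[c][j] if xj else 1 - cond[c][j]
        return p
    return max(prior, key=score)

X = [[1, 1], [1, 0], [0, 1], [0, 0]]
y = ['a', 'a', 'b', 'b']
prior, cond = train_nb(X, y)
print(predict_nb(prior, cond, [1, 1]))   # predicts 'a'
```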

Recommended time/stage of studies for completion:

  • after completing the course "Introduction to Machine Learning".
  • in the spring term of the first year.

Term/teaching period when the course will be offered:

  • period III

This course provides an introduction to probabilistic modeling from a computer scientist's perspective. Many research issues in Artificial Intelligence, Computational Intelligence and Machine Learning/Data Mining can be viewed as topics in the "science of uncertainty", which addresses the problem of optimally processing incomplete information, i.e., plausible inference. This course shows how the probabilistic modeling framework forms a theoretically elegant and practically useful solution to this problem. The course focuses on the "degree-of-belief" interpretation of probability and illustrates the use of the Bayes Theorem as a general rule of belief updating. As a concrete example of methodological tools based on this approach, we use probabilistic graphical models, focusing in particular on (discrete) Bayesian networks and their applications in different probabilistic modeling tasks.
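The belief-updating view of the Bayes Theorem mentioned above can be illustrated with a small numerical sketch (the disease/test numbers below are made-up assumptions, not course content):

```python
# Minimal illustration of the Bayes Theorem as a belief-updating rule:
# prior belief + likelihood of the evidence -> posterior belief.
# All probabilities below are illustrative assumptions.

def bayes_update(prior, likelihood, evidence_prob):
    """Posterior P(H | E) = P(E | H) * P(H) / P(E)."""
    return likelihood * prior / evidence_prob

# Hypothesis H: "patient has the disease"; evidence E: "test is positive".
p_h = 0.01                    # prior P(H)
p_e_given_h = 0.95            # sensitivity P(E | H)
p_e_given_not_h = 0.05        # false-positive rate P(E | not H)
# P(E) by the law of total probability:
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

posterior = bayes_update(p_h, p_e_given_h, p_e)
print(round(posterior, 3))    # the positive test raises the belief from 1% to ~16%
```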

All materials are available from the course webpage.

The following is the current plan for Spring 2019; teaching methods evolve from year to year.

The primary mode of instruction consists of lectures and exercise sessions with active guidance, supported by other teaching methods when applicable. Students are encouraged to attend the lectures, and they need to solve homework exercises, including small programming tasks, to reach the learning outcomes related to implementation skills. The weekly exercise sessions comprise (i) online problem solving in small groups, where the students teach each other the solutions found, and (ii) discussion of the completed homework exercises.

The following is the current plan for Spring 2019; assessment practices evolve from year to year. The grading scale is 1–5. One must obtain 30 points out of the maximum of 60 to pass the course with grade 1; grade 5 requires 50 points. A linear scale is applied for the other grades. Deviations from this scheme are possible depending on the difficulty of the exam.

The course exam takes into account exercise points gathered throughout the course. Alternatively, the course can be completed by self-study and a separate exam.