UCL Module Catalogue

Probabilistic and Unsupervised Learning (COMP0086)

Key information

Faculty
Faculty of Engineering Sciences
Teaching department
Computer Science
Credit value
15
Restrictions
Module delivery for PGT (FHEQ Level 7) available on MSc Computational Statistics and Machine Learning; MSc Machine Learning.

Alternative credit options

There are no alternative credit options available for this module.

Description

Aims:

This module provides students with an in-depth introduction to statistical modelling and unsupervised learning techniques. It presents probabilistic approaches to modelling and their relation to coding theory and Bayesian statistics. A variety of latent variable models will be covered, including mixture models (used for clustering), dimensionality reduction methods, time series models such as hidden Markov models (used in speech recognition and bioinformatics), independent components analysis, hierarchical models, and nonlinear models. The course will present the foundations of probabilistic graphical models (e.g., Bayesian networks and Markov networks) as an overarching framework for unsupervised modelling. We will cover Markov chain Monte Carlo sampling methods and variational approximations for inference. Time permitting, students will also learn about other topics in probabilistic (or Bayesian) machine learning.

Intended learning outcomes:

On successful completion of the module, a student will be able to:

  1. Understand the theory of unsupervised learning systems.
  2. Have in-depth knowledge of the main models used in unsupervised learning, and understand the methods of exact and approximate inference in probabilistic models.
  3. Recognise which models are appropriate for different real-world applications of machine learning methods.

Indicative content:

The following are indicative of the topics the module will typically cover:


  • Basics of Bayesian learning and regression.
  • Latent variable models, including mixture models and factor models.
  • The Expectation-Maximisation (EM) algorithm.
  • Time series, including hidden Markov models and state-space models.
  • Spectral learning.
  • Graphical representations of probabilistic models.
  • Belief propagation, junction trees and message passing.
  • Model selection, hyperparameter optimisation and Gaussian-process regression.
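To give a flavour of the material, the Expectation-Maximisation (EM) algorithm for a mixture of Gaussians, one of the central topics above, can be sketched in a few lines. The following is a minimal illustration in Python/NumPy (the module itself uses MATLAB/Octave); the function name `em_gmm_1d` and the percentile-based initialisation are illustrative choices, not course code.

```python
import numpy as np

def em_gmm_1d(x, n_iter=50):
    """EM for a two-component 1-D Gaussian mixture.

    Returns mixing weights, means, and variances after n_iter iterations.
    """
    # Initialise: equal weights, means at the data quartiles, pooled variance.
    pi = np.array([0.5, 0.5])
    mu = np.percentile(x, [25.0, 75.0])
    var = np.array([np.var(x)] * 2)

    for _ in range(n_iter):
        # E-step: responsibilities r[n, k] proportional to pi_k * N(x_n | mu_k, var_k),
        # computed in the log domain for numerical stability.
        log_p = (np.log(pi)
                 - 0.5 * np.log(2 * np.pi * var)
                 - 0.5 * (x[:, None] - mu) ** 2 / var)
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: re-estimate parameters from responsibility-weighted statistics.
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Example: data drawn from two well-separated Gaussians.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-5, 1, 500), rng.normal(5, 1, 500)])
pi, mu, var = em_gmm_1d(x)
```

Each iteration alternates an E-step (inferring the posterior over the latent component assignments) with an M-step (maximising the expected complete-data log likelihood), a pattern that recurs throughout the module in more elaborate latent variable models.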

Audit:

Any student or researcher at UCL meeting the requisite conditions is welcome to attend the lectures. Students who wish to formally register on the module should consult the module leader.

Requisites:

To be eligible to select this module as an optional or elective, a student must: (1) be registered on a programme and year of study for which it is formally available; (2) have a good background in statistics, calculus, linear algebra, and computer science; and (3) be competent in MATLAB or Octave (or be taking a class on MATLAB/Octave, or be willing to learn it independently).

Students wishing to take the module should thoroughly review the maths in the provided materials before the start of the module.

Module deliveries for 2024/25 academic year

Intended teaching term: Term 1
Postgraduate (FHEQ Level 7)

Teaching and assessment

Mode of study
In person
Methods of assessment
50% Exam
50% Coursework
Mark scheme
Numeric Marks

Other information

Number of students on module in previous year
43
Module leader
Dr Dmitry Adamskiy
Who to contact for more information
cs.pgt-students@ucl.ac.uk

Last updated

This module description was last updated on 8th April 2024.
