UCL Module Catalogue


Robot Vision and Navigation (COMP0249)

Key information

Faculty
Faculty of Engineering Sciences
Teaching department
Computer Science
Credit value
15
Restrictions
Module delivery for PGT (FHEQ Level 7) available on MSc Artificial Intelligence for Biomedicine and Healthcare; MSc Artificial Intelligence for Sustainable Development; MSc Computer Graphics, Vision and Imaging; MSc Robotics and Artificial Intelligence; MSc Medical Robotics and Artificial Intelligence.
Timetable

Alternative credit options

There are no alternative credit options available for this module.

Description

Students will gain knowledge about robot real-time pose estimation and mapping, with an emphasis on the use of vision as a primary sensor for mapping the environment. The module will provide students with an understanding and practical experience of how to combine information from satellite navigation and motion sensing systems, recover geometry from optical sensors, and create an environment map which a robot can use for navigation and motion planning.

To navigate safely, robots need the ability to localise themselves autonomously using their onboard sensors. Potential tasks include the automatic 3D reconstruction of buildings, inspection, and surveillance.

Aims:

The aims of this module are to:

  • Support students to design and optimise robot vision and navigation systems that can operate reliably and accurately in real-world environments.
  • Develop students’ knowledge about robot real-time pose estimation and mapping, with an emphasis on the use of vision as a primary sensor for mapping the environment.
  • Develop students’ understanding and practical experience of how to combine information from satellite navigation and motion sensing systems and recover geometry from optical sensors.
  • Develop students’ understanding and practical experience of creating an environment map which a robot can use for navigation and motion planning.

Intended learning outcomes:

On successful completion of the module, a student will be able to:

  1. Apply fundamental techniques used for real-time estimation in linear and nonlinear systems.
  2. Formulate algorithms to fuse data from satellite and motion sensing systems to estimate robot position.
  3. Formulate mapping and localisation problems in which robots construct sparse maps of their environment.
  4. Create 3D reconstructions of the environment using camera data.
  5. Program in MATLAB, Python, or C++.
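The estimation techniques referenced in outcome 1 are typically introduced through the linear Kalman filter. The following sketch is illustrative only (it is not part of the module specification); the motion model, noise values, and measurements are assumed for the example:

```python
# Minimal 1D linear Kalman filter: fuse noisy position measurements
# of a target assumed to move with roughly constant velocity.
# All parameter values (dt, q, r) are illustrative assumptions.

def kalman_1d(measurements, dt=1.0, q=0.01, r=1.0):
    """Track position and velocity from noisy position readings."""
    x = [0.0, 0.0]                    # state: [position, velocity]
    P = [[1.0, 0.0], [0.0, 1.0]]      # state covariance
    estimates = []
    for z in measurements:
        # Predict with a constant-velocity model, F = [[1, dt], [0, 1]],
        # process noise Q = q * I.
        x = [x[0] + dt * x[1], x[1]]
        P = [
            [P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
             P[0][1] + dt * P[1][1]],
            [P[1][0] + dt * P[1][1], P[1][1] + q],
        ]
        # Update: we measure position only, H = [1, 0], noise variance r.
        s = P[0][0] + r                   # innovation covariance
        k = [P[0][0] / s, P[1][0] / s]    # Kalman gain
        y = z - x[0]                      # innovation (residual)
        x = [x[0] + k[0] * y, x[1] + k[1] * y]
        P = [
            [(1 - k[0]) * P[0][0], (1 - k[0]) * P[0][1]],
            [P[1][0] - k[1] * P[0][0], P[1][1] - k[1] * P[0][1]],
        ]
        estimates.append(x[0])
    return estimates

est = kalman_1d([1.0, 2.1, 2.9, 4.2, 5.0])
```

The same predict/update structure generalises to the nonlinear estimators (e.g. the extended Kalman filter) used for sensor fusion in navigation.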

Indicative content:

The following are indicative of the topics the module will typically cover:

  • Mathematical formulation of the SLAM problem.
  • Graphical models and probability.
  • Sparse SLAM algorithms (e.g. ORB-SLAM2).
  • Grid and volume-based algorithms (e.g. OctoMap).
  • The Normal Distributions Transform (NDT).
  • Implicit representations such as NeRFs.
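Grid and volume-based systems such as OctoMap maintain a per-cell occupancy probability updated in log-odds form. A minimal sketch of that update for a single cell, with an assumed inverse sensor model (the probability values are illustrative, not taken from the module):

```python
import math

def logodds(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1.0 - p))

def prob(l):
    """Convert log-odds back to a probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))

# One cell with prior occupancy 0.5; apply three "hit" observations
# under an assumed inverse sensor model p(occupied | hit) = 0.7.
l = logodds(0.5)
for _ in range(3):
    l += logodds(0.7)   # Bayesian update is addition in log-odds form

p_occ = prob(l)          # occupancy belief after three hits (about 0.93)
```

Working in log-odds turns the per-observation Bayesian update into a simple addition, which is why grid-mapping systems store log-odds rather than probabilities.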

Requisite conditions:

To be eligible to select this module as optional or elective, a student must (1) be registered on a programme and year of study for which it is formally available; and (2) have completed an introductory module on machine learning.

Module deliveries for 2024/25 academic year

Intended teaching term: Term 2 (Postgraduate, FHEQ Level 7)

Teaching and assessment

Mode of study
In person
Intended teaching location
UCL East
Methods of assessment
100% Coursework
Mark scheme
Numeric Marks

Other information

Number of students on module in previous year
0
Who to contact for more information
cs.pgt-students@ucl.ac.uk

Last updated

This module description was last updated on 8th April 2024.