The Unmanned Autonomous Systems Laboratory (UASL) is led by Dr Lounis Chermak and is part of the Centre for Electronic Warfare, Information and Cyber (CEWIC). It works closely with industrial organisations that have a strong heritage in autonomous systems research.

Summary of applications

The UASL is increasingly establishing itself as a world leader in applications related to the automatic sensing, processing and intelligent autonomy of unmanned vehicles. In GPS-denied navigation of UAVs, for example, the UASL has developed innovative imaging-based solutions using either a monocular or a stereo visible-band camera, for single or multiple cooperative UAVs. It has also developed a unique navigation system based on multispectral visible-infrared stereo imaging that meets the needs of many military applications. The UASL also leads all activities linked to infrared-based/3D seekers, particularly the detection, tracking and recognition of targets for missile applications.
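
As an illustration of the kind of imaging-based, GPS-denied navigation described above, the sketch below estimates the relative motion of a monocular camera between two frames from feature matches and the essential matrix. It is a minimal example using OpenCV, not the UASL's actual pipeline; the function name, the calibration matrix `K` and the two greyscale input images are assumptions supplied for the sake of the example.

```python
import cv2
import numpy as np

def estimate_relative_pose(img_prev, img_curr, K):
    """Estimate the rotation and (scale-less) translation of a monocular
    camera between two frames via ORB matching and the essential matrix."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)

    # Brute-force Hamming matching suits ORB's binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # RANSAC on the essential matrix rejects outlier correspondences.
    E, mask = cv2.findEssentialMat(pts1, pts2, K,
                                   method=cv2.RANSAC, prob=0.999, threshold=1.0)

    # With a single camera the recovered translation is only known up to scale;
    # a stereo (or multispectral stereo) rig resolves that ambiguity directly.
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t
```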

About the facility

The UASL is equipped, through its research funding, with the facilities needed to validate the algorithms it develops in extensive experiments. The laboratory has two indoor localisation systems that serve as ground truth in its navigation experiments. The first is a multi-view calibrated camera-based system providing a positional accuracy of within 3-4 cm. The second is a high-quality laser-based motion tracking system able to provide indoor localisation to within a millimetre for vehicles ranging from ground robotic platforms to aerial vehicles, such as the quadrotors acquired for both indoor and outdoor use. Other systems, including infrared and visible-band cameras and acoustic sensors, are also available for the UASL team's research.
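
When such camera- or laser-based tracking systems are used as ground truth, a navigation estimate is typically scored against them with a trajectory error metric. The sketch below computes an absolute trajectory error (RMSE after a rigid alignment) between an estimated trajectory and the reference positions. It is a generic illustration, not the laboratory's own evaluation code; the function name is hypothetical and the two trajectories are assumed to be already time-synchronised N x 3 position arrays.

```python
import numpy as np

def absolute_trajectory_error(est_xyz, gt_xyz):
    """RMSE between an estimated trajectory and ground-truth positions
    (e.g. from a motion-capture system), after rigidly aligning the two.
    Both inputs are N x 3 arrays of time-synchronised positions."""
    est_c = est_xyz - est_xyz.mean(axis=0)
    gt_c = gt_xyz - gt_xyz.mean(axis=0)

    # Closed-form (Kabsch) rotation that best maps the estimate onto the truth.
    U, _, Vt = np.linalg.svd(est_c.T @ gt_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

    aligned = (R @ est_c.T).T + gt_xyz.mean(axis=0)
    return np.sqrt(np.mean(np.sum((aligned - gt_xyz) ** 2, axis=1)))
```

Because the alignment step removes any constant offset and global rotation between the two coordinate frames, the metric captures drift and shape error in the estimate rather than penalising the arbitrary choice of starting pose.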

In addition to Dr Chermak, the UASL team comprises:

  • 1 x lecturer
  • 2 x research fellows
  • 11 x PhD students

The research work carried out at the UASL is recognised not only at the national level but also, very much, at the international level. In 2009, the laboratory's expertise in unmanned aerial vehicle navigation was selected for funding by the European FP7 Marie Curie International Research Staff Exchange Scheme. Through the International Cooperation Program for Unmanned Aerial Systems Research and Development, in collaboration with the Polytechnic University of Madrid and the Australian Research Centre for Aerospace Automation (ARCAA) at Queensland University of Technology, it investigated to what extent navigation and planning decisions and intelligence could be provided safely to unmanned aerial systems. The UASL also secured funding from BAE Systems and EPSRC in 2010 to investigate initial research concepts of cooperative mosaicking for the guidance of multiple UGVs.

In 2013, another research programme, co-sponsored by MBDA Systems and EPSRC, investigating cooperative navigation for multiple UAVs was completed. Additionally, the UASL has been selected by ESA on a number of occasions, under European open-competition NPI schemes, to contribute to research programmes related to the navigation and autonomy of systems. One of these was linked to the well-known ExoMars programme and involved developing vision-based localisation for a Mars rover. Currently, the UASL is leading technical activities in two major and highly competitive defence research programmes under the MCM ITP, a partnership funded by the UK MOD and the French DGA. These programmes are named 3D Automatic Target Recognition Seekers and Architecture, and Rapid Static Target Modelling. In the former, the UASL leads the 3D ATR and human-in-the-loop interface/decision activities for 3D-based seekers; in the latter, it optimises the 3D modelling of scenes for missile planning.

Current PhD supervision

  • Visible band, multi-spectral and thermal stereo navigation for ground and air vehicles;
  • Investigation of convex optimisation techniques in problems of motion and pose estimation;
  • Fault detection and isolation for UAVs' inertial navigation systems;
  • Robust vision-based slope estimation and rock detection for an autonomous space lander;
  • Infrared (IR)-based relative navigation and guidance for active debris removal;
  • 3D automatic target recognition for future LIDAR-based missiles;
  • Increasing micro UAVs' autonomy via vision-based control (application to power asset inspection);
  • Acoustic and video detection and localisation in sensor networks;
  • Optimal fault-tolerant flight tracking control for aircraft with imperfect actuators;
  • Robust 3D registration and tracking with RGBD sensors;
  • Haptic/active stick tele-operation for UAVs;
  • Enhanced speech recognition with fused audio and video modalities.