Student theses topics (non-exhaustive list)

Heterogeneous sensor fusion and calibration

Robotic autonomy is a rapidly evolving field, with sensor fusion playing a pivotal role in enabling robots to make informed real-time decisions, navigate complex environments, and operate safely and efficiently. Sensor fusion integrates data from diverse sensors, such as cameras, lidars, radars, and GPS, into a unified and comprehensive understanding of the robot's surroundings and its location in the environment. To achieve accurate and robust sensor fusion, it is imperative to address the calibration of these sensor systems.

We aim to develop algorithms and methodologies that determine the spatial relations between these varied sensor systems, synchronize them in time, and, finally, fuse the incoming information to estimate the robot's position in the environment.
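
As an illustration only, the sketch below shows the two ingredients in their simplest form: applying a known extrinsic calibration to express a lidar observation in the robot base frame, and fusing two independent position estimates by inverse-covariance weighting. All frames, extrinsics, measurements, and covariances are hypothetical placeholders, not results from any specific project.

# Minimal sketch (hypothetical values): extrinsic calibration + simple fusion.
import numpy as np

def to_base_frame(T_base_lidar, p_lidar):
    """Transform a 3D point from the lidar frame to the robot base frame
    using a 4x4 homogeneous extrinsic calibration matrix."""
    p_h = np.append(p_lidar, 1.0)            # homogeneous coordinates
    return (T_base_lidar @ p_h)[:3]

def fuse(p_a, cov_a, p_b, cov_b):
    """Fuse two independent estimates of the same quantity by
    inverse-covariance (information) weighting."""
    info_a, info_b = np.linalg.inv(cov_a), np.linalg.inv(cov_b)
    cov = np.linalg.inv(info_a + info_b)
    return cov @ (info_a @ p_a + info_b @ p_b), cov

# Assumed extrinsic: lidar mounted 0.3 m forward and 0.5 m above the base origin.
T_base_lidar = np.eye(4)
T_base_lidar[:3, 3] = [0.3, 0.0, 0.5]
landmark_base = to_base_frame(T_base_lidar, np.array([2.0, 0.1, -0.4]))

# Two position estimates (e.g., lidar odometry vs. GPS), with assumed covariances.
fused, fused_cov = fuse(np.array([1.00, 0.50, 0.00]), 0.02 * np.eye(3),
                        np.array([1.20, 0.35, 0.05]), 0.25 * np.eye(3))
print(landmark_base, fused)   # fused result leans toward the more certain estimate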

Object manipulation and human-robot collaboration

Manipulating objects in the environment is fundamental in many industrial and commercial applications, from manufacturing assembly lines to warehouse management. Automating this process with robotic arms not only enhances efficiency and reduces human error, but also allows operation in environments that might be hazardous to humans. The goal of this topic is to design, simulate, and implement a system in which a robotic arm can identify, pick, and place objects at specified locations. Additionally, bringing the human into the process so that complex tasks can be executed jointly often requires detecting the human and inferring their intentions.

Specific thesis topics can focus on perception, motion planning, motion control, and integration of the complete pipeline, as well as on human detection, pose estimation, and intention estimation using the Bayesian theory of mind. The challenge is to ensure that robotic arms can autonomously and accurately perform this task with or without the human in the loop, adapting to different objects and environments.
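
To give a rough idea of how such a pipeline fits together, the sketch below is a minimal, purely illustrative skeleton; all function names, interfaces, and values are hypothetical placeholders rather than an existing implementation. Perception proposes object poses, a stand-in planner interpolates waypoints, and a human-awareness check gates execution.

# Minimal pick-and-place pipeline skeleton (all interfaces hypothetical).
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float    # position only; orientation omitted for brevity

def detect_objects(rgb_image, depth_image):
    """Placeholder perception step: return estimated object poses."""
    return [Pose(0.45, -0.10, 0.05)]      # hard-coded stand-in detection

def plan_motion(start, goal, steps=10):
    """Placeholder planner: straight-line interpolation instead of a real planner."""
    return [Pose(start.x + (goal.x - start.x) * t / steps,
                 start.y + (goal.y - start.y) * t / steps,
                 start.z + (goal.z - start.z) * t / steps)
            for t in range(steps + 1)]

def human_clear(workspace_observation):
    """Placeholder intention/safety check: assume the workspace is clear."""
    return True

def pick_and_place(rgb, depth, place_pose, home_pose):
    for obj_pose in detect_objects(rgb, depth):
        if not human_clear(None):
            continue                       # defer to the human collaborator
        for wp in plan_motion(home_pose, obj_pose):
            pass                           # waypoint would be sent to the arm controller
        for wp in plan_motion(obj_pose, place_pose):
            pass                           # carry the object to the place pose

pick_and_place(rgb=None, depth=None,
               place_pose=Pose(0.20, 0.40, 0.05), home_pose=Pose(0.0, 0.0, 0.30))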

3D perception, localization and mapping

Interpreting and inferring the state of the environment is a fundamental problem in various emerging fields, from autonomous driving and mobile robotics to virtual reality. Additionally, localizing an autonomous agent within an unknown environment while building a map of it, known as simultaneous localization and mapping or SLAM, is one of the pillars of robot autonomy. To tackle these challenges, various perception sensors can be used, e.g., cameras, 3D laser range sensors, and radars, as well as proprioceptive sensors like IMUs. Nowadays, novel deep learning approaches are optimized to exploit the rich information present in the sensor data and aid in solving these challenges.

Specific topics can focus on visual, laser, and radar odometry and SLAM, as well as on the fusion thereof. The goal of this topic is to explore existing approaches and develop advanced solutions for the localization and mapping of autonomous agents.
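
As a purely illustrative sketch, the snippet below shows the SE(2) pose composition that underlies odometry accumulation in many SLAM front ends: each estimated body-frame motion increment is composed onto the current world-frame pose. The increments here are made-up values; a real front end would estimate them from camera, lidar, radar, or IMU data.

# Minimal sketch of odometry accumulation in SE(2) with illustrative increments.
import math

def compose(pose, increment):
    """Compose an SE(2) pose (x, y, theta) with a body-frame increment (dx, dy, dtheta)."""
    x, y, th = pose
    dx, dy, dth = increment
    return (x + math.cos(th) * dx - math.sin(th) * dy,
            y + math.sin(th) * dx + math.cos(th) * dy,
            (th + dth + math.pi) % (2 * math.pi) - math.pi)   # wrap angle to [-pi, pi)

pose = (0.0, 0.0, 0.0)
increments = [(1.0, 0.0, math.pi / 2)] * 4    # drive a 1 m square
for inc in increments:
    pose = compose(pose, inc)
print(pose)   # ideally back near the origin; any offset here is only rounding error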

Robot motion control and planning

The main task of mobile robots is to move through the environment to transport goods or people or to perform tasks while avoiding obstacles. To achieve the desired movement of the robot, it is necessary to plan a path and ensure that the robot follows it. However, many challenges, such as dynamic obstacles, large environments, rough or complex terrain, and multi-robot collaboration, need to be resolved to obtain a feasible, possibly optimal, path to a given target position and to ensure efficient path re-planning.

Specific topics can focus on: developing path planning algorithms capable of efficiently navigating dynamic environments; exploring strategies for managing large maps, both to optimize computational efficiency and to keep maps up to date as the environment changes over time; extending path planning algorithms to handle rough or complex terrain; and investigating how multiple autonomous agents can collaborate to navigate the same environment efficiently and safely, particularly in scenarios such as multi-robot exploration or transportation.
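
As a simple, purely illustrative baseline, the sketch below implements A* search on a small occupancy grid, a common starting point for the planners described above. The grid, start, and goal are made up, and dynamic obstacles, terrain costs, and re-planning are omitted for brevity.

# Minimal A* path planner on a 4-connected occupancy grid (0 = free, 1 = obstacle).
import heapq

def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
    open_set = [(h(start), 0, start)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, current = heapq.heappop(open_set)
        if current == goal:                                   # reconstruct the path
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (current[0] + dr, current[1] + dc)
            if 0 <= nb[0] < rows and 0 <= nb[1] < cols and grid[nb[0]][nb[1]] == 0:
                if g + 1 < g_cost.get(nb, float("inf")):
                    g_cost[nb] = g + 1
                    came_from[nb] = current
                    heapq.heappush(open_set, (g + 1 + h(nb), g + 1, nb))
    return None                                               # no feasible path

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
print(astar(grid, (0, 0), (3, 3)))   # list of grid cells from start to goal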


