Project designation
HTPDIR: Human Tracking and Perception in Dynamic Immersive Rooms


The potential of immersive reality systems and their broad range of applications are well known. Despite the many systems available, no current solution maps the static and dynamic obstacles in the physical space, so users must move within empty rooms or interact with the immersive system in a very limited area. Likewise, user tracking is typically restricted to the head (or to the hands, via controllers); when limbs and hands are tracked, users must stand in front of RGB-D (or other) sensors within a confined space. Full-body motion tracking over larger areas is usually achieved with tags, using optical or radiofrequency modulation, placed at several points on the body and captured by high-frequency sensors. This makes such systems inflexible and extremely expensive.

This project proposes a low-cost system based on multiple RGB-D sensors (≥ 4) that simultaneously provides full-body tracking of the user at any point in the physical space (areas of tens of m²) in unstructured and dynamic environments, gesture recognition for more natural interaction with the immersive environment, and interaction with the real world from within the immersive world.

The system has a broad range of applications in many areas, namely retail (real-estate development, allowing final finishes and decoration of spaces to be shown), education (space simulation in classrooms, museums, showrooms), and simulation-based training (accident simulation, police training, social training). The possibility of adapting any space into a new dynamic virtual world in which the user can move freely opens up a whole new range of solutions with potential that can be exploited across many areas of activity.
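To illustrate the multi-sensor tracking idea, the sketch below shows one naive way to fuse a single body joint observed by several calibrated RGB-D sensors: transform each sensor's estimate into a shared world frame and average. This is a hypothetical baseline, not the project's actual method; the function name, the translation-only extrinsics, and the unweighted mean are all simplifying assumptions (a real rig would also apply sensor rotations and weight estimates by confidence or filter them over time).

```python
def fuse_joint_estimates(estimates, offsets):
    """Fuse one joint's 3D position observed by several RGB-D sensors.

    estimates : list of (x, y, z) points in each sensor's local frame
    offsets   : list of (x, y, z) sensor positions in the shared world
                frame (assumes calibrated, translation-only extrinsics
                for simplicity; a real setup also needs rotations)

    Returns the mean world-frame position -- a naive fusion baseline.
    """
    # Transform each local estimate into the world frame.
    world = [
        tuple(pc + oc for pc, oc in zip(p, o))
        for p, o in zip(estimates, offsets)
    ]
    # Average component-wise across sensors.
    n = len(world)
    return tuple(sum(c) / n for c in zip(*world))

# Two sensors observing the same shoulder joint:
p0 = (1.0, 0.5, 3.0)      # seen by sensor 0, placed at the world origin
p1 = (-1.0, 0.5, 3.0)     # seen by sensor 1, shifted 2 m along x
fused = fuse_joint_estimates([p0, p1], [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)])
# fused == (1.0, 0.5, 3.0)
```

Averaging redundant views is also what lets the system keep tracking when one sensor's view of the user is occluded by a static or dynamic obstacle.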

Project code
Start date
End date