Over the last decade, GPS-denied navigation has been a formidable challenge whose solution enables a wide range of applications, from space exploration and autonomous ground and aerial vehicles, to indoor localization and augmented reality.
Interestingly, the endeavor toward GPS-denied localization began with the push to expand the frontiers of space exploration, since it was a prerequisite for the success of interplanetary missions. Specifically, researchers have investigated fusing visual information from a camera with motion information provided by an Inertial Measurement Unit (IMU) in order to track the position and orientation of an object, in what is termed a Vision-aided Inertial Navigation System (VINS).
Recently, the decreasing cost of inertial and visual sensors, along with the increasing computing capabilities of modern mobile devices, has allowed VINS technology to transfer from spacecraft navigation to consumer-grade devices such as wearables, cell phones, and tablets.
This presentation will cover solutions to three key problems in VINS.
It will also include videos and demos of how these techniques are being used by the MARS lab.