Research Project

EB-SLAM: Event-based simultaneous localization and mapping

Type

National Project

Start Date

01/01/2018

End Date

30/06/2021

Project Code

DPI2017-89564-P


Project Description

We will develop a high-speed, high-dynamic-range localization and mapping device that fuses inertial measurements with those of a dynamic vision sensor, commonly known as an event camera. Such a device could be used to estimate the motion of an autonomous vehicle or a UAV in GPS-denied environments, under high dynamics, and in conditions of poor illumination or severe illumination changes.

The problem is multidisciplinary, requiring expertise in computer vision, inertial systems and stochastic estimation. Our approach is unconventional because we will use event cameras. Unlike conventional cameras, which deliver full image frames at comparatively low frequencies, event cameras provide asynchronous sequences of events at very high speed for those pixels whose brightness changes in the scene. Such cameras have seldom been used to estimate ego-motion, which is the challenge we seek to explore. When fused with an IMU, the device will be able to recover 3D position, orientation, velocity and acceleration with respect to the gravity direction. And since camera events arrive with microsecond precision, the computed estimates can be published at rates approaching 1 MHz.
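To make the event-camera data model concrete, here is a minimal sketch of an asynchronous event stream and of accumulating events over a short time window into a sparse "event frame". The `Event` type and `accumulate_events` helper are illustrative assumptions, not the project's actual code.

```python
from collections import namedtuple

# An event camera reports per-pixel brightness changes asynchronously:
# each event carries a microsecond timestamp, pixel coordinates, and a
# polarity (+1 for a brightness increase, -1 for a decrease).
Event = namedtuple("Event", ["t_us", "x", "y", "polarity"])

def accumulate_events(events, t_start_us, t_end_us):
    """Sum event polarities per pixel over a time window, producing a
    sparse 'event frame' as a dict mapping (x, y) -> net polarity."""
    frame = {}
    for e in events:
        if t_start_us <= e.t_us < t_end_us:
            key = (e.x, e.y)
            frame[key] = frame.get(key, 0) + e.polarity
    return frame

# Example: three events at the same pixel inside a 100 us window.
events = [
    Event(10, 5, 7, +1),
    Event(40, 5, 7, +1),
    Event(90, 5, 7, -1),
    Event(150, 2, 3, +1),  # falls outside the window below
]
print(accumulate_events(events, 0, 100))  # {(5, 7): 1}
```

The contrast with a conventional camera is visible in the data layout: there is no fixed frame rate, only a stream of timestamped pixel changes, which is what allows estimates at microsecond granularity.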

We will develop algorithms to extract the camera motion and the environment structure from the asynchronous sequence of camera events. To integrate the IMU's measurements of angular velocity and acceleration, we will model their systematic biases and perform an observability analysis. We will use advanced sparse nonlinear optimization techniques to solve the set of kinematic constraints resulting from IMU integration together with the spatial constraints resulting from the observed events. We will also address the relative spatial and temporal calibration between the two sensors. The accuracy and performance of our system will be validated against trajectories recorded with an absolute positioning system (OptiTrack). We will build prototypes for three specific applications: perception for humanoid locomotion, autonomous high-speed UAV maneuvering, and long-range car dead reckoning. All three applications involve abrupt illumination changes and/or high-speed motion, conditions for which event cameras are an adequate sensor choice.
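The IMU integration step described above can be illustrated with a deliberately simplified planar dead-reckoning sketch: angular velocity and body-frame acceleration are corrected for constant biases (the systematic biases the project proposes to estimate) and then integrated into heading, velocity and position. This is a toy 2D stand-in for the full 3D problem, and all names here are hypothetical.

```python
import math

def integrate_imu(samples, dt, gyro_bias, acc_bias):
    """Planar dead reckoning from IMU samples.

    samples: list of (omega, ax, ay) body-frame measurements.
    Constant biases are subtracted before integration, mirroring the
    bias states that a full estimator would solve for.
    Returns (heading, (vx, vy), (px, py)) in the world frame.
    """
    theta, vx, vy, px, py = 0.0, 0.0, 0.0, 0.0, 0.0
    for omega, ax, ay in samples:
        omega -= gyro_bias
        ax -= acc_bias[0]
        ay -= acc_bias[1]
        # Rotate body-frame acceleration into the world frame.
        c, s = math.cos(theta), math.sin(theta)
        awx = c * ax - s * ay
        awy = s * ax + c * ay
        # Euler integration of the kinematic chain.
        px += vx * dt + 0.5 * awx * dt * dt
        py += vy * dt + 0.5 * awy * dt * dt
        vx += awx * dt
        vy += awy * dt
        theta += omega * dt
    return theta, (vx, vy), (px, py)

# Constant forward acceleration of 1 m/s^2 for 1 s (10 samples at 100 ms).
theta, v, p = integrate_imu([(0.0, 1.0, 0.0)] * 10, 0.1, 0.0, (0.0, 0.0))
print(v[0], p[0])  # approximately 1.0 m/s and 0.5 m
```

In the actual system, such integrated IMU terms become kinematic constraints (factors) in a sparse nonlinear optimization, combined with the spatial constraints contributed by the camera events; the factor-graph machinery itself is beyond this sketch.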

Project Publications

Journal Publications

  • J. Solà, J. Vallvé, J. Casals, J. Deray, M. Fourmy, D. Atchuthan, A. Corominas Murtra and J. Andrade-Cetto. WOLF: A modular estimation framework for robotics based on factor graphs. IEEE Robotics and Automation Letters, 7(2): 4710-4717, 2022.

  • V. Vaquero, I. del Pino, F. Moreno-Noguer, J. Solà, A. Sanfeliu and J. Andrade-Cetto. Dual-branch CNNs for vehicle detection and tracking on LiDAR data. IEEE Transactions on Intelligent Transportation Systems, 22(11): 6942-6953, 2021.

  • J. Vallvé, J. Solà and J. Andrade-Cetto. Pose-graph SLAM sparsification using factor descent. Robotics and Autonomous Systems, 119: 108-118, 2019.

  • N. Palomeras, M. Carreras and J. Andrade-Cetto. Active SLAM for autonomous underwater exploration. Remote Sensing, 11(23): 2827:1-19, 2019.

  • J. Vallvé, J. Solà and J. Andrade-Cetto. Graph SLAM sparsification with populated topologies using factor descent optimization. IEEE Robotics and Automation Letters, 3(2): 1322-1329, 2018.


Conference Publications

  • Y. Tian and J. Andrade-Cetto. Event transformer FlowNet for optical flow estimation, 2022 British Machine Vision Conference, 2022, London.

  • W.O. Chamorro, J. Andrade-Cetto and J. Solà. High-speed event camera tracking, 2020 British Machine Vision Conference, 2020, (Virtual).

  • J. Martí, A. Santamaria-Navarro, C. Ocampo-Martínez and J. Andrade-Cetto. Multi-task closed-loop inverse kinematics stability through semidefinite programming, 2020 IEEE International Conference on Robotics and Automation, 2020, Paris, France, pp. 7108-7114, IEEE.

  • J. Martí, J. Solà, C. Mastalli and A. Santamaria-Navarro. Squash-box feasibility driven differential dynamic programming, 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2020, Las Vegas, NV, USA, pp. 7637-7644.

  • J. Deray, B. Magyar, J. Solà and J. Andrade-Cetto. Timed-elastic smooth curve optimization for mobile-base planning, 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2019, Macau, China, pp. 3143-3149.

  • J. Deray, J. Solà and J. Andrade-Cetto. Joint on-manifold self-calibration of odometry model and sensor extrinsics using pre-integration, 9th European Conference on Mobile Robots, 2019, Prague, pp. 1-6.

  • M. Fourmy, D. Atchuthan, N. Mansard, J. Solà and T. Flayols. Absolute humanoid localization and mapping based on IMU Lie group and fiducial markers, 2019 IEEE-RAS International Conference on Humanoid Robots, 2019, pp. 237-243.


Other Publications

  • J. Solà, J. Deray and D. Atchuthan. A micro Lie theory for state estimation in robotics. Technical Report IRI-TR-18-01, Institut de Robòtica i Informàtica Industrial, CSIC-UPC, 2018.
