Neuromorphic Technology for Embodied AI

At a glance

  • Project leader: Prof. Yulia Sandamirskaya
  • Project team: Paul Fox
  • Project budget: CHF 194'000
  • Project status: ongoing
  • Funding partner: Public sector (excl. federal government) (Kanton Zürich / Digitalisierungsinitiative DIZH / DIZH Fellowship)
  • Contact person: Yulia Sandamirskaya

Description

Automation in dynamic environments, such as agriculture, healthcare, or small-scale production, remains limited: rigid industrial robots fail in these complex settings. The key to more flexible, adaptive, and safe automation is better perception. Robots need to assess their environment visually, fast enough to track and react to moving objects. Today, vision systems are limited in their speed by the frame rate and data density of image-based sensors.

Neuromorphic computing offers fast and energy-efficient event-based cameras and spiking neural network accelerators that can enable real-time 3D vision for object detection, localization, and pose estimation, key tasks for vision in robotics. Neuromorphic computing spans computing hardware, sensors, algorithms, and application development. Neuromorphic hardware reaches up to five orders of magnitude better energy-delay product for recurrent neural networks, parallel search, and optimization algorithms.
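To make the data model concrete: an event-based camera emits a sparse stream of (x, y, timestamp, polarity) tuples instead of full frames. The minimal Python sketch below (not the project's actual pipeline; the resolution and decay constant are illustrative assumptions) accumulates such a stream into a decaying time surface that a downstream detector could read out at any instant.

    # Minimal time-surface sketch; sensor resolution and tau are assumed values.
    import numpy as np

    WIDTH, HEIGHT = 640, 480

    def update_time_surface(surface, events, tau=0.03):
        # surface holds the timestamp of the last event at each pixel;
        # events is an iterable of (x, y, t, polarity) tuples.
        for x, y, t, _pol in events:
            surface[y, x] = t
        t_now = max((e[2] for e in events), default=0.0)
        # Exponential decay turns raw timestamps into features in [0, 1]:
        # fresh events are near 1, stale pixels fall toward 0.
        return np.exp(-(t_now - surface) / tau)

    surface = np.full((HEIGHT, WIDTH), -np.inf)  # -inf means "no event yet"
    events = [(320, 240, 0.010, +1), (321, 240, 0.012, -1)]
    features = update_time_surface(surface, events)
    print(features[240, 320])  # ~0.94: a 2 ms old event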

Our research will include:

  1. Building a real-time 3D event-based vision sensor based on depth from motion, event-based stereo, or active structured light (a toy triangulation sketch follows this list);
  2. Developing efficient 3D representations for neuromorphic hardware: a hierarchical representation for shapes, similar to CNNs for texture; NeRFs or other implicit representations are a starting point;
  3. Developing methods for object detection, localization, and pose estimation using a) neuromorphic pattern matching with kNN search and b) VSA-like resonator networks (see the resonator-network sketch below);
  4. In parallel to the technical work, we will collaborate with the DSI on ethical considerations regarding the use of robots in public spaces and in the new application areas that our technology will enable.
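As a toy illustration of the event-based stereo option in point 1: once an event in the left camera is matched to an event in the right camera of a rectified stereo pair, depth follows from the standard triangulation relation Z = f * B / d. The focal length and baseline below are assumed values for illustration, not project parameters.

    # Toy stereo triangulation; FOCAL_PX and BASELINE_M are assumed values.
    FOCAL_PX = 800.0    # focal length in pixels
    BASELINE_M = 0.10   # distance between the two cameras in metres

    def depth_from_disparity(x_left: float, x_right: float) -> float:
        # x_left, x_right: column coordinates of matched events.
        disparity = x_left - x_right
        if disparity <= 0:
            raise ValueError("matched events must have positive disparity")
        return FOCAL_PX * BASELINE_M / disparity

    print(depth_from_disparity(322.0, 318.0))  # 4 px disparity -> 20.0 m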
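For point 3b, here is a minimal sketch of a VSA resonator network in the spirit of Frady et al. (2020): a composite bipolar hypervector s = a * b is factorized by alternating unbind and clean-up steps against two codebooks. The dimension, codebook sizes, and two-factor setup are illustrative assumptions, not the project's design.

    # Resonator-network sketch; D, K, and the two-factor setup are assumed.
    import numpy as np

    rng = np.random.default_rng(0)
    D, K = 2048, 10                       # hypervector dimension, codebook size
    A = rng.choice([-1, 1], size=(K, D))  # codebook for factor a
    B = rng.choice([-1, 1], size=(K, D))  # codebook for factor b

    s = A[3] * B[7]  # composite vector whose factors we want to recover

    def binarize(v):
        v = np.sign(v)
        v[v == 0] = 1  # break rare ties deterministically
        return v

    # Initialize each estimate with the superposition of its codebook.
    a_hat = binarize(A.sum(axis=0))
    b_hat = binarize(B.sum(axis=0))

    for _ in range(50):
        # Unbind the other factor's estimate (bipolar vectors are their own
        # inverses under elementwise binding), then clean up by projecting
        # onto the codebook and re-binarizing.
        a_hat = binarize(A.T @ (A @ (s * b_hat)))
        b_hat = binarize(B.T @ (B @ (s * a_hat)))

    print(np.argmax(A @ a_hat), np.argmax(B @ b_hat))  # expected: 3 7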