Robots may soon be able to navigate complex environments
Getting machines to move autonomously in space requires a great deal of computing power and energy. Researchers have now drawn on nature to develop a new solution.
Even animals as small as bees can easily find their way around complex environments. They use visual cues, among other things, to estimate their own movement and track their position relative to key landmarks. Machines such as robots, drones and augmented-reality glasses rely on cameras for the same task, but these cameras generate countless images containing a great deal of information that is not actually needed. Processing these images requires considerable computing power and energy, which is why such devices tend to be larger and heavier.
A research team including Yulia Sandamirskaya, who heads the Research Centre for Cognitive Computing in Life Sciences at the ZHAW School of Life Sciences and Facility Management, has developed a new type of energy-efficient solution and demonstrated its applicability to a real robot task. The results were published in the renowned journal Nature Machine Intelligence.
Eye-like camera
In this innovative solution, not only the algorithms but also the hardware has been modelled on natural neural networks. The camera functions in a similar way to the human eye, continuously registering changes in light intensity. Thanks to this lightning-fast perception, machines can keep reorienting themselves without having to interrupt their movement. According to Yulia Sandamirskaya, this could enable robots to navigate dynamic environments more safely and efficiently. “This is crucial for developing support technologies for senior citizens, for example.”
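To illustrate the principle, here is a minimal sketch of event-based sensing: instead of transmitting full frames, only pixels whose light intensity changes beyond a threshold report an "event". This is a simplified toy model for illustration only, not the actual hardware or algorithm used by the research team; the function name, threshold value, and toy frames are all assumptions.

```python
import math

def events_from_frames(prev, curr, threshold=0.2):
    """Compare two grayscale frames (lists of rows of pixel values) and
    emit sparse ON/OFF events only where the log intensity changed by
    more than the threshold, mimicking an eye-like event camera.
    Simplified illustration, not the published system."""
    on, off = [], []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            # Event-camera pixels respond to relative (logarithmic) change.
            diff = math.log1p(c) - math.log1p(p)
            if diff > threshold:
                on.append((y, x))       # pixel got brighter -> ON event
            elif diff < -threshold:
                off.append((y, x))      # pixel got darker -> OFF event
    return on, off

# Two toy 4x4 frames: one pixel brightens, one darkens, the rest stay constant.
prev = [[100] * 4 for _ in range(4)]
curr = [row[:] for row in prev]
curr[1][2] = 200   # brightens
curr[3][0] = 40    # darkens

on, off = events_from_frames(prev, curr)
print(on, off)  # [(1, 2)] [(3, 0)]
```

Because unchanged pixels produce no output at all, the data stream stays tiny compared with full frames, which is the source of the energy and bandwidth savings described above.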