Prof. Darius Burschka learned from mariners

Calculating collision risks with the constant bearing principle

Preventing accidents by computing impending collisions of drones or cars: that is the goal of Darius Burschka. The professor at the Technical University of Munich (TUM) tracks every point in images generated by cameras carried by flying drones or by cars on the road. In essence, he applies the “constant bearing” technique traditionally used by navigators at sea.

iStockphoto.com / eyewave
Prof. Darius Burschka uses the constant bearing principle to avoid collisions between vehicles or drones.

The compound eye of a wasp gave Prof. Burschka an idea. By moving its body back and forth, the insect detects which objects are nearby and which ones are further away. In this way it builds a mental map as it flies.

Airspace and road traffic: 60 measurements a second for greater safety

A similar principle is applied in a solution with which Prof. Burschka, Co-Head of the Perception Group at the Munich Institute of Robotics and Machine Intelligence (MIRMI) at TUM, aims to identify impending collisions between drones or cars and other objects. His computer system checks the image points from a camera 60 times per second and determines the “collision conditions”. “We track up to a million pixels of an image in real time,” says Burschka. To compute this “optical flow”, he does not need a supercomputer. Instead, he works with a “mere” high-performance graphics processor that handles the images, a second process that calculates the collision paths, and a camera. The researcher explains: “We look at the detectable characteristics in the image and watch how they move across it.”
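To illustrate the general idea, the following is a minimal sketch of dense optical flow between two consecutive camera frames. It uses OpenCV’s off-the-shelf Farneback method purely as a stand-in; the article does not disclose the details of Prof. Burschka’s real-time pipeline, and the function name here is hypothetical.

    # Illustrative sketch only: dense optical flow between two consecutive
    # frames, using OpenCV's Farneback method as a stand-in for the
    # real-time tracking pipeline described above.
    import cv2

    def dense_flow(prev_frame, next_frame):
        """Return a per-pixel (dx, dy) displacement field between two frames."""
        prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
        next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)
        # One flow vector per pixel -- up to a million tracked points per image.
        return cv2.calcOpticalFlowFarneback(
            prev_gray, next_gray, None,
            pyr_scale=0.5, levels=3, winsize=15,
            iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

Calling such a function on every new frame of a 60 Hz camera stream corresponds to the measurement rate mentioned above.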

Two-dimensional images as a foundation: similar to the constant bearing approach at sea

To calculate the immediate danger of a collision, the TUM professor only needs two-dimensional images from a perspective like the one used by the wasp to fix individual points and perceive changes in them – or like a sailor applying the constant bearing method. With that method, a ship is determined to be on a collision course if the absolute bearing to an approaching ship shows little or no change while the distance decreases. “The best way of detecting a potential collision is to keep an eye on which surrounding objects are not moving,” says Burschka: an object that stays fixed in the image while growing larger is heading straight for the camera. The TUM scientist calculates where and in which direction objects fly past the camera, i.e. where they “penetrate the observational plane”. In conventional applications, autonomous driving experts, for example, use several cameras to compute distances to nearby objects using vectors. “When the objects are far away from the camera, the 3D process is no longer reliable,” explains Burschka, because changes in the positions of points from one image to the next can no longer be detected.
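The nautical rule itself can be sketched in a few lines, assuming simultaneous 2D position fixes for both vessels; the function name, tracks and tolerance below are illustrative, not from the article.

    # Illustrative sketch of "constant bearing, decreasing range" (CBDR):
    # if the bearing to another vessel barely changes while the range
    # shrinks, the two are on a collision course.
    import math

    def on_collision_course(own_track, target_track, bearing_tol_deg=1.0):
        """Check CBDR over a sequence of simultaneous (x, y) position fixes."""
        bearings, ranges = [], []
        for (ox, oy), (tx, ty) in zip(own_track, target_track):
            bearings.append(math.degrees(math.atan2(ty - oy, tx - ox)))
            ranges.append(math.hypot(tx - ox, ty - oy))
        nearly_constant = max(bearings) - min(bearings) < bearing_tol_deg
        closing = all(b < a for a, b in zip(ranges, ranges[1:]))
        return nearly_constant and closing

    # Two straight-line tracks that meet at (4, 0): the bearing stays
    # constant at 90 degrees while the range drops from 8 to 4.
    own = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
    other = [(0.0, 8.0), (1.0, 6.0), (2.0, 4.0)]
    print(on_collision_course(own, other))  # True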

Paradigm shift: time to interaction replaces metric state analysis

With the new method, rapidly approaching objects that are still remote from the observer are recognized as more dangerous than closer ones moving in the same direction. “Instead of moving objects being prioritized by their motion alone, this is done on the basis of dynamic collision conditions,” says Prof. Burschka. All “features” in the image are now under observation, and the potentially dangerous ones can be flagged accordingly. “We measure the ‘time to interaction’ – in other words, the time that will elapse before a collision occurs,” he explains. The new method enables scientists to analyze the paths of moving objects with just one camera, which may itself be in motion. “Unlike metric reconstruction, this approach is much cheaper and more robust,” Prof. Burschka believes. The ‘time to interaction’ approach would thus represent a paradigm shift in research. The professor plans to use his invention in drones, networked vehicles and service robotics.
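The underlying relationship can be shown with the classic time-to-contact relation: for an object on a collision course, its apparent size s in the image grows, and the time until contact is approximately s divided by its growth rate ds/dt. The sketch below is illustrative; the function name and the numbers are assumptions, not values from the article.

    # Illustrative sketch: estimating "time to interaction" from a single
    # camera via the classic time-to-contact relation tau = s / (ds/dt),
    # where s is an object's apparent size in the image.
    def time_to_interaction(size_prev, size_next, dt):
        """Seconds until contact, from the change in apparent size (pixels)."""
        growth_rate = (size_next - size_prev) / dt  # ds/dt in pixels per second
        if growth_rate <= 0:
            return float("inf")  # not approaching on a collision path
        return size_next / growth_rate

    # Example: a feature's image width grows from 40 to 44 px between two
    # frames taken 1/60 s apart -> roughly 0.18 s until contact.
    tau = time_to_interaction(40.0, 44.0, 1.0 / 60.0)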

Publications

Estimating dense optical flow of objects for autonomous vehicles; Ee Heng Chen, Jöran Zeisler, Darius Burschka; IEEE Intelligent Vehicles Conference 2021

Task representation in robots for robust coupling of perception to action in dynamic scenes; Darius Burschka; Robotics Research 2020 

Further information and links

Prof. Darius Burschka is a principal investigator and co-head of the Perception Group at the Munich Institute of Robotics and Machine Intelligence (MIRMI). With MIRMI, TUM has established an integrative research center for science and technology to develop innovative and sustainable solutions for the central challenges of our time. It has cutting-edge expertise in central areas of robotics, perception and data science. In the key research and application field “Future of Health”, work is conducted in the areas of machine learning in medicine, data mining and analysis, virtual and augmented reality, sensor systems in robotics, safe human-robot interaction (HRI), and soft robotic design and control. For more information, see: https://www.mirmi.tum.de/.

Technical University of Munich

Contacts for this article:

Prof. Dr.-Ing. Darius Burschka

Chair of Robotics, Artificial Intelligence and Real Time Systems

Technische Universität München (TUM)

burschka@tum.de

+49-89-289-17638

 

Andreas Schmitz

Corporate Communication Center 

Press Robotics and Machine Intelligence

andreas.schmitz@tum.de
