Wide-ranging research on embodied intelligence at TUM
Robotics and AI as the decathlon of engineering sciences
From food services and nursing care to the retail sector: the shortage of trained personnel is making itself felt in everyday life. What could be more obvious than having robots take on certain tasks in these areas? Take a restaurant, for example: a robot stands behind the counter, keeps an eye on new customers, goes to the table when they are ready to order, takes the order and passes it on. It serves meals and drinks, chats with the guests and finally brings the check.

For Prof. Daniel Rixen, this scenario misses the point: “Do we really want a robot that looks human?” asks the professor at the Chair of Applied Mechanics at TUM. “Or do we want a machine of some kind that replaces or improves on the functions that a human performs?” Cameras could keep an eye on people in a restaurant, notice when they want something and take their orders using a microphone built into the table. And do the drinks and meals actually have to be carried to the table by a humanoid robot? “That could be done by robotic arms that pour the champagne from above and serve the steak dinner from the side,” says Prof. Rixen.

This leads to a realization: what robots look like can play a role, but it doesn’t have to. Robots can range from nanorobots to drones, from self-driving cars to robotic arms. The principle “form follows function” applies not only to design, but also to robotics. Applications where robots are assigned clear tasks are easy to control. An example would be picking peppers in a greenhouse. “The pepper doesn’t care whether it’s picked by a human or a robot,” says Prof. Rixen. One thing is clear: in the modern era, the tool must fulfill its purpose.
The walking robot: do we really need it?
Professor Rixen, whose projects include research with the humanoid robot Lola, knows what it means to teach a two-legged robot to walk, especially on uneven ground. If the robot is also expected to carry drinks without spilling a drop, the requirements for a humanoid waiter in a restaurant become even more demanding. “Lola was built 15 years ago as a rigid machine that can be precisely controlled,” says Prof. Rixen, explaining basic research done at the Applied Mechanics chair. “We now know that it is important for joints in robots to have more flexibility and give. When Lola stumbles, she can make fewer than seven adjustments per second,” says the professor, who is responsible for everything that moves and oscillates. Algorithms use input from motion sensors in the chest area to control motors and joints. But human beings, with millions of years of development behind them, are in a completely different league. In Rixen’s words, making the ‘muscle’ of a robot as efficient as a human muscle is the holy grail of robotics. The machine is still unable to store energy to be released later, as in the human Achilles tendon, which delivers important impetus when a person walks. In mechanical terms, that would perhaps mean placing springs across the joints. But at present that could cause potentially uncontrollable vibrations.
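The trade-off Prof. Rixen describes can be sketched numerically: a spring placed across a joint stores energy, but without damping the joint keeps ringing. The following is a minimal illustration, not Lola's actual controller, and all parameter values are hypothetical assumptions chosen only to show the effect.

```python
# Minimal 1-DOF sketch of an elastic joint: mass * a = -k*x - c*v.
# An undamped spring (c = 0) sustains oscillation; adding damping
# dissipates it. Parameters are illustrative, not from a real robot.

def simulate_joint(stiffness, damping, mass=1.0, dt=0.001, steps=5000):
    """Semi-implicit Euler integration of a spring-damper joint.

    Returns the largest deflection seen in the second half of the
    simulation, i.e. how much the joint is still oscillating.
    """
    x, v = 0.1, 0.0  # initial deflection 0.1 rad, starting at rest
    peak_late = 0.0
    for i in range(steps):
        a = (-stiffness * x - damping * v) / mass
        v += a * dt
        x += v * dt
        if i > steps // 2:  # only measure after transients had time to decay
            peak_late = max(peak_late, abs(x))
    return peak_late

undamped = simulate_joint(stiffness=100.0, damping=0.0)
damped = simulate_joint(stiffness=100.0, damping=5.0)
# The undamped joint is still swinging near its initial amplitude,
# while the damped joint has essentially settled.
```

The undamped case is the "potentially uncontrollable vibration" in miniature: the spring returns every joule it stores, so the oscillation never dies out on its own.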
Robotics: the decathlon of engineering sciences
“The humanoid robot is an ideal concept in the world of research,” says Prof. Alin Albu-Schäffer. The robotics expert at the German Aerospace Center (DLR) and at TUM calls it the “decathlon of engineering science”. Mechanics, informatics, medicine, electrical engineering and ethics are just a few of the scientific disciplines whose expertise is needed in the development of robots. True to the motto of physicist Richard Feynman, “What I cannot create, I do not understand,” the director of the Institute for Robotics and Mechatronics at DLR and head of the TUM Chair of Sensor Based Robotic Systems and Intelligent Assistance Systems works on the technical implementation of robotics concepts. With “cobots” – collaborative robots – the focus is on practical applications. This approach, which emerged around 10 years ago, centers on a robot that is adapted to the size, strength and speed of a human being. “Cobot arms” like those produced by companies such as Franka Emika and Kuka learn complex tasks that they will later perform with people nearby.
Robots of the future will be a little better than humans
For Albu-Schäffer it is clear: “The more human the surroundings of a robot and the more varied its tasks, the more important it is for it to be humanoid.” He also has a further prerequisite: “It needs to be a little better than a person – to run a bit faster, have a little more dexterity or be able to fly.” The cobot approach, which Prof. Sami Haddadin, the head of the Munich Institute of Robotics and Machine Intelligence (MIRMI), helped to launch, is benefiting from considerable advances in artificial intelligence. Progress in areas ranging from image and language recognition to the ChatGPT language generator and real-time technologies will facilitate further developments. One important goal: the ability of a robot to find its way in whatever surroundings it encounters. The main protagonist of cobot technology at MIRMI is GARMI, a nursing care robot from the TUM Geriatronics Research Center in Garmisch-Partenkirchen. It is designed to work with the elderly and persons requiring nursing care.
Embodied intelligence: coping with complex surroundings
A key qualification of robots is intelligence – in other words, the ability to make decisions independently. “The robot needs a physical body and artificial intelligence,” says Prof. Albu-Schäffer, who uses the term ‘embodied intelligence’. AI is the specialized field of TUM professor Angela Schoellig, who came to TUM from the University of Toronto in 2022. Her task: to integrate machine learning into robots to enable them to perform more complex tasks. “By ‘more complex’, we especially mean as compared to robots that perform a pre-programmed movement and do the same thing over and over all day long,” says Schoellig, referring to the robots typically used in mass production. The ideal robot will operate in complex environments and do its own planning. Instead of being programmed by hand, it will be capable of learning and making adjustments.
Transferring ChatGPT to robotic hardware
Assume that a robot has to find a certain floor and room in a building on its own. “Unless a programmer knows all the details along the way, they can’t code this in advance,” says the AI expert Schoellig, Director Industry & International at MIRMI. So the robot has to find its own way: learn how to spot the elevator, press the button for the correct floor, know which hallway to take and which door to open. This involves many other skills – recognizing objects, not pressing the elevator buttons too hard, avoiding obstacles and even asking the way if necessary. And: sharing what it learns with other robots. The latter is similar to ChatGPT: “A lot of people have posted things on the internet. Now a neural network is learning it,” says Prof. Schoellig. It is obvious, however, that ChatGPT is not much help to a robot at present: “AI has to interact with the physical world. It’s not enough ‘just’ to be able to chat,” says Prof. Albu-Schäffer, downplaying the role of programs such as ChatGPT in robotics research for now. The special challenge facing robotics is the diversity of systems. For robots using different machine learning models or sensors, it is still difficult to transfer knowledge and make it available to all of the other robots, for example through the cloud.
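The task decomposition and skill sharing described above can be sketched as a sequence of named skills drawn from a shared library. Everything here – the skill names, the building layout, the dictionary-based "world state" – is a hypothetical stand-in, not an actual MIRMI or GARMI API; the point is only the structure: skills one robot registers become available to any robot that executes a plan.

```python
# Hedged sketch: a shared skill library plays the role of knowledge
# exchanged between robots. Each skill is a small function that
# transforms a (hypothetical) world-state dictionary.

skill_library = {}  # skills one robot learns, any robot can reuse

def learn_skill(name):
    """Register a skill under a name so other robots can look it up."""
    def register(fn):
        skill_library[name] = fn
        return fn
    return register

@learn_skill("find_elevator")
def find_elevator(state):
    state["location"] = "elevator"
    return state

@learn_skill("press_floor_button")
def press_floor_button(state):
    state["floor"] = state["goal_floor"]  # gently, not too hard
    return state

@learn_skill("navigate_hallway")
def navigate_hallway(state):
    state["location"] = state["goal_room"]
    return state

def execute_plan(state, plan):
    """Run named skills in order; a missing skill means asking for help."""
    for step in plan:
        skill = skill_library.get(step)
        if skill is None:
            raise LookupError(f"unknown skill {step!r}: ask a human for help")
        state = skill(state)
    return state

state = {"location": "lobby", "floor": 0, "goal_floor": 3, "goal_room": "3.014"}
result = execute_plan(state, ["find_elevator", "press_floor_button", "navigate_hallway"])
```

The "asking the way if necessary" from the article shows up here as the fallback when a skill is missing from the library; the hard research problem the text names – transferring such skills between robots with different sensors and learning models – is exactly what this toy shared dictionary glosses over.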
Robotics: developing tools to perform tasks faster and better
Past research has addressed many individual AI-based skills, but always for a specific purpose. For example, Prof. Schoellig has deployed flying robots in mining to continually track the size of rock fragments after blasting in order to optimize the explosive charges over time. In addition, developers have built a robot – known as The Thing – that can catch balls or balance objects on a tray. A robot is capable of taking objects out of a basket, positioning them correctly on an assembly line or assembling them directly. Consequently, the issue in the future will be not so much whether a robot in a restaurant looks, moves and talks like a person, but rather whether it serves a useful purpose.
At the Automatica trade fair from 27 to 30 June 2023, you will find over 30 demonstrations of research work on robotics and AI in Hall B4 in the AI Society area. Click here for the overview.
In a press tour on 27 June 2023, from 11:30 to 12:30, representatives of the media will get to see newly developed demos from the areas of health, mobility, environment and work. Accreditation via andreas.schmitz@tum.de
Prof. Angela Schoellig will join Prof. Sandra Hirche and Prof. Stefan Leutenegger from TUM at the roundtable "ONE Munich: Trustworthiness of Human-Centered Robotics and AI" on 28 June between 12:45 and 13:30 (Hall B4, 329).
On 28 June 2023, the munich_i Hightech Summit (Hall B4/530) will take place between 9:30 and 18:00. The contributions will be moderated by Prof. Angela Schoellig, Prof. Cristina Piazza, Prof. Alin Albu-Schaeffer and Prof. Darius Burschka. Click here for the agenda of the munich_i Hightech Summit 2023.
All lectures, discussion rounds and roundtables on the fringe of munich_i can be found here.
Contacts to this article:
Prof. Angela Schoellig
Chair for Safety, Performance and Reliability of Learning Systems
Technical University of Munich (TUM)
Prof. Daniel Rixen
Chair of Applied Mechanics
Technical University of Munich (TUM)
Prof. Alin Albu-Schaeffer
Chair of Sensor Based Robotic Systems and Intelligent Assistance Systems
Technical University of Munich (TUM)
Head of the Institute of Robotics and Mechatronics
German Aerospace Center (DLR)