She Teaches Robots to See


Eva Reitbauer is trying to teach robots to "see". Her robots include autonomous compost-turning machines whose sensory organs are laser scanners and cameras.

Robots have their own "sensory organs". Image source: Schmied - TU Graz
Compost turners are responsible for regularly turning the compost, which lies in triangular heaps at the composting site. This is an essential quality factor in commercial composting: the raw material must be exposed to the right temperature, moisture and aeration during the composting process in order to ultimately become nutrient-rich, valuable soil. In the past, the compost heaps were turned manually. One person had to steer the compost turner across the site - a hot and, for the nose, very unpleasant job. Eva Reitbauer worked on an autonomous compost-turning machine as part of her doctoral thesis at Graz University of Technology (TU Graz). For such a machine to navigate independently and work without accidents, it is particularly important that it is aware of its surroundings. She therefore answered some very fundamental questions about "seeing" with respect to robots.

The following interview deals exclusively with the topic of "seeing". A portrait of Eva Reitbauer ("With the compost turner to her doctorate") and a detailed report on the project ("TU Graz develops autonomous electric compost turner") are available separately.

News+Stories: Robots look very humanoid in my imagination. So - again in my imagination - they also see like humans. Is that right? Do they have eyes or do they work with other sensory organs?

Eva Reitbauer: Robots don’t always have the classic humanoid form that we know from films. For example, my robots are vehicles that drive autonomously over composting facilities, turning the rows of compost and aerating them. To do this, the robot must be able to recognise the compost rows, react to obstacles and avoid them.

Robots don’t have eyes like we humans do, but they can perceive their surroundings with the help of various sensors. On the one hand, they have cameras. These can either be "normal" RGB colour cameras that deliver images like classic photos, or cameras with special functions, such as thermal cameras that react to heat. There are also so-called stereo cameras, which work with two lenses that record images simultaneously. These cameras are very similar to the human pair of eyes and enable the robots to "see" in three dimensions. This allows them to estimate distances and sizes.
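
To give a feel for the idea behind stereo vision: once the same point has been found in both lens images, the horizontal offset between the two views (the disparity) can be converted into a distance. The following Python sketch shows this textbook relationship; the focal length, baseline and disparity values are invented examples for illustration, not parameters of the TU Graz compost turner.

```python
def depth_from_disparity(focal_length_px: float,
                         baseline_m: float,
                         disparity_px: float) -> float:
    """Classic pinhole stereo relationship: depth = f * B / d.

    focal_length_px -- focal length of the cameras in pixels
    baseline_m      -- distance between the two lenses in metres
    disparity_px    -- horizontal pixel offset of the same point
                       seen in the left and right image
    """
    if disparity_px <= 0:
        raise ValueError("point must be visible in both images with positive disparity")
    return focal_length_px * baseline_m / disparity_px


# Invented example numbers: a 700 px focal length, 12 cm baseline and
# 35 px disparity put the observed point roughly 2.4 m away.
print(depth_from_disparity(700.0, 0.12, 35.0))  # -> 2.4
```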

But cameras are not the only sensors, are they?

Reitbauer: No. In addition to cameras, we also use LiDAR (Light Detection and Ranging) sensors. These are laser scanners that scan the surroundings with active light pulses that they emit themselves. In this way, the robot’s surroundings can even be recorded directly in 3D. These sensors are used in almost all autonomous robotics applications today. Autonomous cars also use ultrasound and radar sensors.
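
To picture how such a laser scanner measures, the sketch below converts the round-trip time of one emitted light pulse into a range, and the range plus the beam angles into a 3D point. It is a simplified, assumed model for illustration only, not code from the compost-turner project.

```python
import math

SPEED_OF_LIGHT_M_S = 299_792_458.0


def range_from_time_of_flight(round_trip_s: float) -> float:
    """The pulse travels to the object and back, so distance = c * t / 2."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0


def point_from_measurement(range_m: float, azimuth_rad: float, elevation_rad: float):
    """Convert one LiDAR return (range + beam angles) into an x, y, z point."""
    x = range_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = range_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return x, y, z


# Invented example: a pulse returning after about 66.7 nanoseconds
# corresponds to an object roughly 10 m away.
r = range_from_time_of_flight(66.7e-9)
print(round(r, 2), point_from_measurement(r, math.radians(30), math.radians(2)))
```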

Sensors and algorithms are constantly evolving, as are the robots’ capabilities. To perceive their surroundings especially well and correctly in many situations, robots do not rely on a single sensor, but combine many different sensors in order to see as well and as comprehensively as possible. A simple illustration of such a combination follows below.
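
One very simple way to picture this combination of sensors: if a stereo camera and a LiDAR each report a distance to the same obstacle, the two readings can be merged by weighting each one by how uncertain it is. The snippet below shows this textbook inverse-variance fusion with invented example values; real systems use far more elaborate estimation filters, so treat this purely as an illustration rather than the method used on the compost turner.

```python
def fuse_two_measurements(value_a: float, variance_a: float,
                          value_b: float, variance_b: float):
    """Inverse-variance weighting: the less uncertain a sensor is,
    the more its reading counts. Returns the fused value and variance."""
    weight_a = 1.0 / variance_a
    weight_b = 1.0 / variance_b
    fused_value = (weight_a * value_a + weight_b * value_b) / (weight_a + weight_b)
    fused_variance = 1.0 / (weight_a + weight_b)
    return fused_value, fused_variance


# Invented example: the stereo camera says an obstacle is 5.2 m away
# (rather uncertain), the LiDAR says 5.0 m (much more precise).
# The fused estimate ends up close to the LiDAR reading.
print(fuse_two_measurements(5.2, 0.25, 5.0, 0.01))  # -> (~5.01 m, ~0.0096)
```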

Why do robots still find it difficult to see? Why are we humans still ahead of them?

Reitbauer: Robots are not so bad at seeing. Sometimes they can even detect things that the human eye cannot see - for example, heat sources using thermal cameras or things in the dark using LiDAR.

What is still very difficult for robots, however, is understanding the environment they see in the same way that we humans do. It is a particular challenge to put the images they see into the right context, to correctly analyse complex situations and to derive the appropriate conclusions and actions from them.