Debugging robots: an unresolved source of frustration for developers.
Robots can relieve people of many tasks, but only as long as they work. As soon as a fault occurs, the time-consuming search for its cause begins. Compared with conventional software development, robots offer even more places where bugs, i.e. errors, can hide, especially once development has progressed and the interaction between the robot and its environment comes into play. A field report from the world of research.
Is there a problem with the general power supply? Are all the sensors working properly? Is the recorded data transferred correctly to the software, i.e. the robot’s brain? Or did someone simply forget to activate a certain function? A recent research project, which analyzed discussions with people working in the field of robotics, revealed that debugging (troubleshooting) robots frequently leads to frustration. There are several reasons why:
What makes troubleshooting robots so difficult?
Because specialized tools are scarce or little known, robot debugging largely relies on generic methods from software development, such as printing sensor data to the command line. For certain sensors, such as a laser scanner, this output takes the form of a seemingly endless matrix of numbers that no human eye can interpret. Visualization tools such as RViz address this: they display the information collected by the robot and thereby reconstruct its subjective view of its environment.
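To illustrate why the raw console output is so hard to read, here is a minimal sketch, assuming a ROS 1 setup with rospy and a robot that publishes its laser scanner on a hypothetical /scan topic; every incoming message already contains hundreds of range values:

    #!/usr/bin/env python
    # Minimal sketch: dump raw laser scans to the console (ROS 1 / rospy).
    # The topic name /scan is an assumption; adapt it to the actual robot.
    import rospy
    from sensor_msgs.msg import LaserScan

    def on_scan(msg):
        # A single scan typically holds hundreds of range readings, which is
        # why the raw console output is practically unreadable.
        rospy.loginfo("%d ranges, first ten: %s", len(msg.ranges), msg.ranges[:10])

    if __name__ == "__main__":
        rospy.init_node("scan_console_dump")
        rospy.Subscriber("/scan", LaserScan, on_scan)
        rospy.spin()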
To compare this subjective environment with the robot’s actual environment, however, developers have to run a simulator at the same time or observe the robot in the laboratory. Because of this switch in visual context, it is almost impossible to judge in real time whether the camera angle is correct, for instance. Repeating runs and recording them costs time and is tedious, as is every attempt to rectify the error: whenever a software change is to be tried out, the entire code must first be rebuilt and reloaded onto the robot.
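One common way to avoid repeating runs on the real robot is to record the data once and inspect it offline. A minimal sketch of that, assuming a ROS 1 bag file named debug_run.bag recorded on the hypothetical /scan topic:

    # Minimal sketch: inspect a recorded run offline instead of repeating it
    # on the real robot (ROS 1 rosbag API). File and topic names are assumptions.
    import rosbag

    with rosbag.Bag("debug_run.bag") as bag:
        for topic, msg, t in bag.read_messages(topics=["/scan"]):
            # Print only a summary per message; the full data would again be
            # a wall of numbers.
            print(t.to_sec(), topic, len(msg.ranges))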
So why, despite all these problems, are specific robot debugging tools not used? On the one hand, very few tools are available that focus on debugging. There are various simulation and visualization tools with useful functions, but the information they provide is sometimes either too complex or not detailed enough to localize an error. On the other hand, although the problem has been recognized in research and new approaches have been developed repeatedly, none of them has managed to establish itself on the market. And because developers already have to watch a visualization tool, a simulator and the console at the same time, skepticism about integrating yet another tool is understandable.
How can the situation be improved?
As part of a master’s thesis in collaboration with LMU Munich, the Fraunhofer Institute for Cognitive Systems IKS recently examined the problem of debugging in robotics from an application-oriented point of view. Based on discussions with developers and an extensive literature review, the project produced a set of 14 requirements that helpful debugging software should meet. These include, for example, visualizing sensor data directly in the robot’s simulation and integrating the tool into one that is already in use. Development of a prototype then got underway.
The development is based on Unity, one of the two leading game engines (frameworks for game development), and its robotics extension, which allows a robot to be imported and simulated. This works via a network connection between Unity and the robot’s ROS workspace. The robot itself runs on the Robot Operating System (ROS), the framework in which its software is developed. The Unity prototype can visualize the collected data in both 2D and 3D.
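The article does not detail the interface itself, but on the ROS side such a link usually means no more than the robot’s nodes publishing their data on topics to which the Unity application subscribes over the network. A minimal, hypothetical sketch of one such publisher (topic name, message type and rate are assumptions):

    #!/usr/bin/env python
    # Hypothetical sketch of the ROS side of such a network link: the robot
    # publishes its data on a topic; an external visualizer such as the Unity
    # prototype can subscribe to it. Topic name and message type are assumptions.
    import rospy
    from geometry_msgs.msg import PoseStamped

    def publish_pose():
        rospy.init_node("pose_publisher")
        pub = rospy.Publisher("/robot_pose", PoseStamped, queue_size=10)
        rate = rospy.Rate(10)  # 10 Hz
        while not rospy.is_shutdown():
            msg = PoseStamped()
            msg.header.stamp = rospy.Time.now()
            msg.header.frame_id = "map"
            # In a real system the pose would come from localization; here it
            # stays at the origin as a placeholder.
            pub.publish(msg)
            rate.sleep()

    if __name__ == "__main__":
        publish_pose()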
A visual overview of the robot’s architecture is also integrated, showing which functional components are active, where each of these components obtains its information, where it sends that information, and how much data flows in each case. All of this information is exchanged between Unity and ROS practically in real time.
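The article does not specify how this overview is generated, but the underlying information is available from the ROS master, which knows every active node and every topic connection. A hedged sketch of querying it (the caller name is arbitrary; measuring actual data volumes would additionally require subscribing to each topic and counting bytes):

    # Hedged sketch: query the ROS master for the node/topic graph that an
    # architecture overview like the one described could be built from.
    import rosgraph

    master = rosgraph.Master("/architecture_overview")  # arbitrary caller name
    publishers, subscribers, services = master.getSystemState()

    for topic, nodes in publishers:
        print("topic %s is published by %s" % (topic, nodes))
    for topic, nodes in subscribers:
        print("topic %s is subscribed to by %s" % (topic, nodes))
    # Data volumes per topic are not part of this state; they would have to be
    # measured separately, e.g. by subscribing and summing message sizes.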
To evaluate the prototype, a user test with 16 participants compared it with RViz and rqt, the tools currently popular among robotics developers. The prototype showed significantly better performance and higher usability (user-friendliness) across all levels of experience, among testers experienced with robots as well as inexperienced ones. The test also showed that the experienced users in particular preferred to fall back on the console output, which they attributed to poor past experiences with the visualization tools they already knew.