A cardboard sign with a colorful pattern: that is all the researchers at the Faculty of Engineering Technology at KU Leuven needed to fool an intelligent camera. To be clear: Wiebe Van Ranst, Simen Thys and Toon Goedemé, from the EAVISE research group, have no bad intentions. Their research aims to expose the weaknesses of intelligent detection systems.
"Intelligent detection systems are based on pattern recognition," says Professor Goedemé, head of EAVISE (Embedded and Artificially Intelligent Vision Engineering) at the De Nayer campus. "They consist of a camera and software that interprets the images automatically. If you train these systems on images of different people for a while, they learn to recognize people and distinguish them from objects. Although we differ in height, hair color or facial features, the algorithm identifies us all as human beings."
"This makes intelligent detection systems very suitable for security purposes. They automatically give a signal as soon as the cameras detect an intruder, even when that person tries to hide. In the past, you needed security guards for that, who watched the screens for hours on end. A tedious job that is becoming a thing of the past."
Millions of parameters
However, intelligent detection systems are not infallible. Sometimes they have difficulty detecting certain patterns, and researchers around the world are trying to expose this Achilles heel. Small changes can be enough: fake cardboard glasses, for example, can confuse a facial recognition system.
Professor Goedemé and his team have taken things a step further. Master's student Simen Thys and postdoctoral researcher Wiebe Van Ranst managed to fool YOLO, one of the most popular algorithms for detecting objects and people.
The researchers held a 40-by-40 cm cardboard poster with a colorful print in front of their bodies. That was enough to fool YOLO: carrying the sign makes you invisible to the system.
Very remarkable, according to Professor Goedemé: "In previous tests, people wore a T-shirt with the image of a bird. The algorithm did not recognize a person, but detected a bird. Our pattern, which was designed using artificial intelligence, makes people invisible. If you carry the sign, the system does not detect you at all, neither as a human being nor as an object. Remarkably, we do not know exactly why this particular pattern fools YOLO. After all, algorithms based on neural networks use millions of parameters. For researchers, this is still a black box."
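The general idea behind designing such a pattern with artificial intelligence is to treat the patch pixels as adjustable parameters and use gradient descent to push the detector's "person" score down. A minimal sketch of that idea is shown below, using a toy linear model with a sigmoid output as a stand-in for the detector; the model, sizes and learning rate are all illustrative assumptions, not the researchers' actual setup, which targets the full YOLO network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a detector: a fixed linear model whose sigmoid output
# plays the role of the "person" confidence score. (Illustrative only;
# the real attack optimizes against YOLO's full neural network.)
w = rng.normal(size=(8, 8))

def person_score(image):
    """Sigmoid of the weighted pixel sum: a score in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-(w * image).sum()))

image = rng.uniform(0.4, 0.6, size=(8, 8))  # stand-in for a person image
patch = rng.uniform(0.0, 1.0, size=(4, 4))  # adversarial patch (top-left)

def paste(image, patch):
    """Overlay the patch on the top-left corner of the image."""
    out = image.copy()
    out[:4, :4] = patch
    return out

before = person_score(paste(image, patch))

# Gradient descent on the patch pixels to suppress the score.
# For score s = sigmoid(sum(w * x)), d s / d x = s * (1 - s) * w,
# restricted here to the patched region; pixels stay clipped to [0, 1].
for _ in range(200):
    s = person_score(paste(image, patch))
    grad = s * (1.0 - s) * w[:4, :4]
    patch = np.clip(patch - 0.5 * grad, 0.0, 1.0)

after = person_score(paste(image, patch))
print(f"score before: {before:.3f}, after: {after:.3f}")
```

The same loop, run against a real detector with the gradient supplied by backpropagation, is how an optimized pattern like the researchers' poster can drive the detection confidence low enough that the system no longer reports a person.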
The researchers are enthusiastic, but are quick to warn of other security flaws.
"Where to go from here? That's easy: we found a weakness, and now it needs to be fixed. In this case, I could teach the YOLO algorithm that people holding a sign with this particular pattern are also human beings. That is hard to do, however, and it is safe to assume that YOLO has other weaknesses too. Can we ever fix all the security flaws? I don't believe so. As I mentioned, such an algorithm is a black box. This is the beginning of an arms race."
"We did not expect this kind of hype, but we can see why the application appeals to the imagination: the idea that you can become invisible to security cameras using nothing more than a colorful poster is intriguing."
Source: KU Leuven.