Researchers create a patch that can hide people from AI object detectors

Artificial intelligence (AI) is taking surveillance technology to new heights. Using machine learning, cameras can now spot objects, people, and potential trouble without human supervision. These systems have improved the efficiency and accuracy of person detection at airports, railway stations, and other sensitive public places.

Want to go undetected by these AI-powered surveillance cameras?

Three machine learning researchers from KU Leuven in Belgium have found a way to confuse such an AI system. They developed a small, square, colorful printed patch that can act as a cloaking device, hiding people from object detectors. They call it an “adversarial patch”.

This printed patch, roughly 40 centimetres on a side, can significantly lower the accuracy of an object detector. The researchers, Simen Thys, Wiebe Van Ranst, and Toon Goedemé, have published their paper on the arXiv preprint server.

The team has also uploaded a video to YouTube demonstrating how a person holding the patch is ignored, or remains undetected, by the AI camera, while a person without the patch is successfully detected. When the patch is flipped to its blank side, the camera detects the person’s presence again.

How does it all happen?

An AI detection system is trained to recognize different objects and people by being shown thousands of labeled examples that fit into given categories. But the new research shows that human-spotting AI systems can be fooled by carefully crafted images placed in front of a person’s body.
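To make concrete what such a detector actually outputs, here is a minimal sketch of running a generic pretrained person detector on a camera frame. It uses torchvision’s Faster R-CNN purely as a readily available stand-in, not the YOLOv2 model the researchers attacked, and the file name is an assumption made for the example.

```python
# Illustrative only: a generic pretrained detector, not the researchers' YOLOv2 setup.
import torch
from torchvision.io import read_image
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

# "camera.jpg" is a stand-in for a frame from a surveillance camera.
img = read_image("camera.jpg").float() / 255.0

with torch.no_grad():
    detections = model([img])[0]

# In the COCO label set used by this model, class 1 is "person".
for box, label, score in zip(detections["boxes"], detections["labels"], detections["scores"]):
    if label == 1 and score > 0.5:
        print(f"person at {box.tolist()} with confidence {score:.2f}")
```

Each detection is a bounding box with a class label and a confidence score; an adversarial patch works by dragging that person confidence down.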

The KU Leuven researchers write, “Creating adversarial attacks for person detection is more challenging because human appearance has many more variations than for example stop signs.”

For testing, they used the YOLOv2 object detector, a popular fully convolutional neural network model. The team generated several candidate images and tested them against the detector until they found one that worked particularly well. The patch seen in the video started as a photo of people holding colorful umbrellas, which was then altered by rotating it and adding noise.
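In broad strokes, such a patch is found by gradient descent: the patch pixels are treated as trainable parameters, pasted onto training images of people with random rotations and noise, and optimized so that the detector’s confidence that a person is present goes down. The sketch below illustrates that idea only; it is not the authors’ code, and `load_yolov2`, `apply_patch`, and `person_confidence` are hypothetical placeholders standing in for a real YOLOv2 model, a differentiable paste operation, and the detector’s person/objectness score.

```python
# Minimal sketch of adversarial-patch optimization (illustrative only).
# load_yolov2(), apply_patch(), person_confidence() are hypothetical helpers.
import torch

def optimize_patch(images, steps=1000, lr=0.03, patch_size=300):
    model = load_yolov2()          # hypothetical: pretrained YOLOv2 with frozen weights
    model.eval()

    # The patch itself is the only thing being trained.
    patch = torch.rand(3, patch_size, patch_size, requires_grad=True)
    opt = torch.optim.Adam([patch], lr=lr)

    for step in range(steps):
        total = 0.0
        for img in images:
            # Paste the patch onto the person with a random rotation and some noise,
            # so the result survives printing and imperfect placement.
            angle = torch.empty(1).uniform_(-20, 20)
            noisy = (patch + 0.05 * torch.randn_like(patch)).clamp(0, 1)
            attacked = apply_patch(img, noisy, angle)      # hypothetical, differentiable

            # Loss: the detector's confidence that a person is present.
            # Driving this down is what "hides" the wearer.
            total = total + person_confidence(model, attacked)   # hypothetical

        opt.zero_grad()
        total.backward()
        opt.step()
        patch.data.clamp_(0, 1)    # keep pixel values in a printable range

    return patch.detach()
```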

The printed image also had to be held so that it occupied the bounding box the AI system draws when deciding whether a given object is identifiable. The researchers found the patches worked “quite well” as long as they were positioned correctly.

“From our results, we can see that our system is able to significantly lower the accuracy of a person detector… In most cases, our patch is able to successfully hide the person from the detector. Where this is not the case, the patch is not aligned to the center of the person,” the researchers said.
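To illustrate that alignment requirement, the small sketch below computes where a patch would have to sit in image coordinates so that it is centred on a detected person’s bounding box; the (x1, y1, x2, y2) box format and the example numbers are assumptions made purely for illustration.

```python
# Place a square patch at the centre of a person's bounding box (illustrative only).
def patch_placement(box, patch_size):
    """box = (x1, y1, x2, y2) in pixels; returns the patch's top-left corner."""
    x1, y1, x2, y2 = box
    cx, cy = (x1 + x2) / 2, (y1 + y2) / 2            # centre of the person box
    return cx - patch_size / 2, cy - patch_size / 2  # top-left corner of the patch

# Example with assumed values: a 200x400-pixel person box and a 120-pixel patch.
print(patch_placement((100, 50, 300, 450), 120))     # -> (140.0, 190.0)
```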
