Artificial intelligence (AI) based surveillance has been on the rise, eroding public anonymity by allowing governments and law enforcement agencies to track citizens en masse. But as with every technology, ways to trick it soon follow.
A group of students from KU Leuven in Belgium presented a paper on the preprint server arXiv demonstrating how the AI systems used to recognize people in images can easily be tricked with simple printed patterns.
Printing one of the students' specially designed patches and wearing it around your neck makes you invisible to the AI detector. According to the authors, if this technique is combined with clothing simulation, a T-shirt print could be designed that renders the wearer virtually invisible to current automated surveillance cameras.
This may sound strange to the general public, but in the AI world these patterns are known as adversarial examples: inputs that exploit the brittle intelligence of computer vision systems, fooling them into detecting something that is not there, or into missing something that is.
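To give a rough sense of how adversarial examples work, here is a minimal sketch using a toy linear "detector" rather than a real vision model. The weights, input, and step size are all illustrative assumptions, not taken from the KU Leuven paper or from any real detector such as YOLOv2; the perturbation step follows the well-known Fast Gradient Sign Method (Goodfellow et al., 2014):

```python
import numpy as np

# Toy linear "detector": a positive score means "person detected".
# These weights are an illustrative stand-in for a learned model,
# not taken from any real detector.
w = np.linspace(-1.0, 1.0, 64)

def score(x):
    return float(x @ w)

# A clean input the detector confidently flags as a person.
x = w / np.linalg.norm(w)          # chosen so score(x) > 0

# Fast Gradient Sign Method: nudge every "pixel" a small step in the
# direction that lowers the detection score. For a linear model the
# gradient of the score with respect to x is just w.
epsilon = 0.3
x_adv = x - epsilon * np.sign(w)

print(score(x))      # positive: person detected
print(score(x_adv))  # negative: the structured perturbation hides it
```

A printed adversarial patch applies the same idea under harder constraints: the perturbation must survive printing, lighting, and camera angles, which is why it ends up as a visible, carefully optimized pattern rather than an imperceptible tweak.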
Adversarial examples have already been used to trick facial recognition systems with specially designed glasses, to fool algorithms with printed stickers and 3D-printed objects, and even to make art out of fooling such systems.
Adversarial examples also carry serious risks, since researchers note they can be used maliciously. For example, a self-driving car could be fooled into seeing a stop sign as a lamppost, or a medical AI vision system could be prevented from detecting disease.
David Ha, a researcher at Google, found that this adversarial patch only fools the YOLOv2 algorithm and fails against off-the-shelf computer vision systems developed by various tech companies, including Google. It also does not work if a human is looking at the image.
So, for now, making yourself invisible to surveillance systems remains the stuff of science fiction, though many would surely be interested in preserving their anonymity from AI surveillance.