Want to avoid being seen by person-recognizing camera systems? Wear a shirt printed with a complex, confusing image that looks like a mangled JPG of a crowd scene.
The bright adversarial pattern, which a human viewer can darn-near see from space, renders the wearer invisible to the software looking at him. ...

Code does not "think" in terms of facial features the way a human does, but it does look for and classify features in its own way. To foil it, the "cloaks" need to interfere with most or all of the features the detector has learned to rely on; simply obscuring a few of them is not enough. Facial recognition systems used in China, for example, have been trained to identify people wearing the medical masks meant to slow the spread of COVID-19 and other illnesses.
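The article doesn't include code, but the general recipe behind patches like this is well known: treat the printed pattern as trainable pixels and run gradient descent to push down the detector's "person" confidence. Here's a minimal sketch of that idea; the choice of Faster R-CNN, the fixed patch placement, the loss, and all hyperparameters are illustrative assumptions, not the researchers' actual method.

```python
import torch
import torchvision

# A pretrained person detector; Faster R-CNN is an assumption here,
# standing in for whatever detector a cloak actually targets.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()
for p in model.parameters():
    p.requires_grad_(False)  # attack the patch, not the model

patch = torch.rand(3, 100, 100, requires_grad=True)  # the printable pattern
opt = torch.optim.Adam([patch], lr=0.01)
PERSON = 1  # "person" class index in the COCO label map

def apply_patch(img, patch, y=150, x=100):
    """Paste the patch at a fixed, torso-ish spot (placement is illustrative)."""
    img = img.clone()
    img[:, y:y + patch.shape[1], x:x + patch.shape[2]] = patch.clamp(0.0, 1.0)
    return img

for step in range(200):
    img = torch.rand(3, 480, 640)  # stand-in for a real photo of the wearer
    detections = model([apply_patch(img, patch)])[0]
    person_scores = detections["scores"][detections["labels"] == PERSON]
    if person_scores.numel() == 0:
        continue  # detector already sees no person in this frame
    # Loss: the detector's total confidence that a person is present.
    # Descending on the patch pixels pushes that confidence toward zero.
    loss = person_scores.sum()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In practice you'd optimize over many real photos, poses, and lighting conditions rather than one frame, which is exactly why the patterns end up so loud and busy: they have to disrupt the detector's features from every angle.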
And of course, to make the task even more challenging, different object-detection frameworks use different mechanisms to detect people, Goldstein explained. "We have different cloaks that are designed for different kinds of detectors, and they transfer across detectors, so a cloak designed for one detector might also work on another," he said.
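Checking that kind of transfer is straightforward: run the same patch through a second, architecturally different detector and see whether the "person" score drops there too. A hedged sketch, reusing `patch` and `apply_patch` from the block above and picking RetinaNet purely as an example of a different architecture:

```python
import torch
import torchvision

# A second detector the patch was never optimized against (assumption).
other = torchvision.models.detection.retinanet_resnet50_fpn(weights="DEFAULT")
other.eval()

with torch.no_grad():
    img = torch.rand(3, 480, 640)  # again, a stand-in for a real photo
    out = other([apply_patch(img, patch)])[0]
    person = out["scores"][out["labels"] == 1]
    best = person.max().item() if person.numel() else 0.0
    print(f"max 'person' confidence under the transferred patch: {best:.3f}")
```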
See also Adversarial Fashion.