
These stickers make AI hallucinate things that aren’t there - The Verge
The sticker “allows attackers to create a physical-world attack without prior knowledge of the lighting conditions, camera angle, type of classifier being attacked, or even the other items within the scene.” So, after such an image is generated, it could be “distributed across the Internet for other attackers to print out and use.”

This is why many AI researchers are worried about how these methods might be used to attack systems like self-driving cars. Imagine a little patch you can stick onto the side of the motorway that makes your sedan think it sees a stop sign, or a sticker that stops you from being identified by AI surveillance systems. “Even if humans are able to notice these patches, they may not understand the intent [and] instead view it as a form of art,” the researchers write.
self-driving  cars  ai  adversarial-classification  security  stickers  hacks  vision  surveillance  classification 
january 2018 by jm
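
The core trick is that the patch is optimized against the model rather than against any one scene, so the same printed sticker transfers to scenes it was never optimized on. A minimal sketch of that idea, using a hypothetical toy linear classifier in place of the real convnets the paper attacks:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an image classifier: a random linear model over
# 32-dimensional "scenes" with 3 classes. (Hypothetical -- the paper
# attacks real convnets; this only illustrates the optimization idea.)
W = rng.normal(size=(3, 32))

def predict(x):
    return int(np.argmax(W @ x))

TARGET = 2              # class we want everything mistaken for
PATCH = slice(0, 8)     # the "sticker" overwrites dims 0..7 of any scene

# Optimize the patch to push the target logit above the other logits.
# Because this toy model is linear, the gradient w.r.t. the patch does
# not depend on the background at all -- a cartoon of why one printed
# patch can keep working across lighting, angle, and scene content.
others = [c for c in range(3) if c != TARGET]
grad = W[TARGET, PATCH] - W[others].mean(axis=0)[PATCH]
patch = np.clip(100.0 * grad, -3, 3)   # bounded, "printable" values

# Apply the same fixed patch to unseen random scenes.
fooled = 0
for x in rng.normal(size=(200, 32)):
    x[PATCH] = patch
    fooled += predict(x) == TARGET
print(fooled, "of 200 unseen scenes classified as TARGET")
```

Against real image classifiers the patch is instead trained by gradient ascent over many random scenes, placements, and transformations, but the shape of the attack is the same: one fixed pattern, optimized once, reused everywhere.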
