
Now that's sticker shock: Sticky labels make image-recog AI go bananas for toasters

Google boffins develop tricks to fool machine-learning software

By Thomas Claburn in San Francisco


Despite dire predictions from tech industry leaders about the risks posed by unfettered artificial intelligence, software-based smarts remain extremely fragile.

The eerie competency of machine-learning systems performing narrow tasks under controlled conditions can easily become buffoonery when such software faces input designed to deceive.

Adversarial examples have been used to dupe image recognition algorithms and fool facial recognition systems, among other things. Recently, a team of MIT students working under the name LabSix demonstrated a way to create a 3D-printed turtle that image recognition software would identify as a rifle.

In a research paper presented in December through a workshop at the 31st Conference on Neural Information Processing Systems (NIPS 2017) and made available last week through arXiv, a team of researchers from Google discuss a technique for creating an adversarial patch.

This patch, sticker, or cutout consists of a psychedelic graphic which, when placed next to an object like a banana, makes image recognition software see something entirely different, such as a toaster.

Fooled ... the sticker that tricked the software. Credit: Google

The Google researchers – Tom B. Brown, Dandelion Mané, Aurko Roy, Martín Abadi, and Justin Gilmer – demonstrated their perception-altering scheme using a pre-trained Keras model called VGG16.
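
For readers who want to poke at the same setup, here is a minimal sketch of that baseline in Python using the stock Keras VGG16 classifier. The file name banana.jpg is a stand-in for whatever test photo you have to hand, not anything from the paper.

    import numpy as np
    from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input, decode_predictions
    from tensorflow.keras.preprocessing import image

    # Load the pre-trained ImageNet classifier the researchers attacked
    model = VGG16(weights="imagenet")

    # Load and preprocess a test photo (224x224 is VGG16's input size)
    img = image.load_img("banana.jpg", target_size=(224, 224))
    x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))

    # Print the classifier's top three guesses, e.g. ('banana', 0.99)
    print(decode_predictions(model.predict(x), top=3)[0])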

The attack differs from other approaches in that it doesn't rely on altering an image with graphic artifacts. Rather, it involves adding the adversarial patch to the scene being captured by image recognition software.

"We construct an attack that does not attempt to subtly transform an existing item into another," the researchers explain. "Instead, this attack generates an image-independent patch that is extremely salient to a neural network. This patch can then be placed anywhere within the field of view of the classifier, and causes the classifier to output a targeted class."

The boffins observe that because the patch is separate from the scene, it allows attacks on image recognition systems without concern for lighting conditions, camera angles, the type of classifier being attacked, or other objects present in the scene.
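
The paper's own code isn't reproduced here, but the core idea can be sketched in a few lines of TensorFlow: treat the patch pixels as trainable variables, paste the patch at a random position each step (the paper also randomises rotation and scale), and nudge up the probability the classifier assigns to the target class – "toaster", ImageNet index 859, in the demo. The train_images batch of 224x224 photos scaled to [0, 1] is an assumption for illustration.

    import tensorflow as tf

    SIZE, P, TARGET = 224, 50, 859                     # image size, patch size, "toaster"
    patch = tf.Variable(tf.random.uniform([P, P, 3]))  # patch pixels, kept in [0, 1]
    model = tf.keras.applications.VGG16(weights="imagenet")
    opt = tf.keras.optimizers.Adam(learning_rate=0.05)

    def apply_patch(images, patch):
        # Paste the patch at one random location; a fuller implementation
        # also samples rotation and scale, as the paper describes
        y = tf.random.uniform([], 0, SIZE - P, dtype=tf.int32)
        x = tf.random.uniform([], 0, SIZE - P, dtype=tf.int32)
        pad = [[y, SIZE - P - y], [x, SIZE - P - x], [0, 0]]
        mask = tf.pad(tf.ones([P, P, 3]), pad)
        return images * (1.0 - mask) + tf.pad(patch, pad) * mask

    for step in range(1000):
        with tf.GradientTape() as tape:
            imgs = apply_patch(train_images, tf.clip_by_value(patch, 0.0, 1.0))
            probs = model(tf.keras.applications.vgg16.preprocess_input(imgs * 255.0))
            loss = -tf.reduce_mean(tf.math.log(probs[:, TARGET]))  # raise "toaster" probability
        opt.apply_gradients(zip(tape.gradient(loss, [patch]), [patch]))

Averaging over random placements is what buys the robustness described above: wherever the patch lands in the frame, it has been trained to win the saliency contest.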


While the ruse recalls schemes to trick face-scanning systems with geometric makeup patterns, it doesn't involve altering the salient object in the scene. The addition of the adversarial patch to the scene is enough to confuse the image classification code.

The researchers say they believe the attack works because of the way image classification tasks are designed.

"While images may contain several items, only one target label is considered true, and thus the network must learn to detect the most 'salient' item in the frame," the paper explains. "The adversarial patch exploits this feature by producing inputs much more salient than objects in the real world."

As demonstrated in this video, the colorful patch is so packed with features that it demands attention from the classifier, which then ignores the original object.
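
At test time, in the sketch above, it's simply a matter of pasting the trained patch into a fresh photo and asking VGG16 what it sees. Here banana_img is a hypothetical 224x224 photo scaled to [0, 1].

    # Paste the trained patch into a new image and check the top guesses
    patched = apply_patch(tf.expand_dims(banana_img, 0), patch)
    probs = model(tf.keras.applications.vgg16.preprocess_input(patched * 255.0))
    print(tf.keras.applications.vgg16.decode_predictions(probs.numpy(), top=3)[0])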

The researchers conclude that those designing defenses against attacks on machine learning models need to consider not just imperceptible pixel-based alterations that mess with the math but also additive data that confuses classification code. ®


