Nightshade tool protects images from AI use without permission



Published by: Jude Foster

08 April 2024, 03:50PM GMT+00:00

In Brief

University of Chicago launches Nightshade to protect images from AI.

Nightshade embeds hidden info to disrupt AI learning without consent.

Tool works on images even if cropped, edited, or displayed digitally.

Aims to enforce permissions, challenging AI firms' ineffective opt-outs.

Part of a growing toolkit for artists to defend against AI misuse.


Researchers from the University of Chicago have come up with a clever solution called Nightshade to stop AI programs from using images they shouldn't. The tool embeds hidden information in pictures that disrupts AI attempts to learn from them without a green light from the image owners.

Nightshade acts like a secret guardian for artists and copyright owners. It hides a trick in their images that doesn't change how the picture looks to us, but to an AI, it's a big deal. The trick survives even if the image is cropped, edited, or compressed. Even screenshots or digital displays of these "poisoned" images will throw AI for a loop. The idea is simple but effective: make it costly for anyone to use artists' images without asking first.

Although some AI companies offer opt-outs for artists who don't want their images used, the folks behind Nightshade say these aren't good enough. They argue that such protections are too easy to ignore, leaving artists out in the cold.
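The "hidden trick" described above is, at its core, an imperceptible perturbation of pixel values. As a toy sketch only (this is not Nightshade's actual algorithm, and `perturb` is a hypothetical helper invented for illustration), here is the general principle of a change too small for a human to notice but still present in the data a model trains on:

```python
# Toy sketch only -- NOT Nightshade's real method. It illustrates the
# general idea: nudge pixel values by an amount a human viewer won't
# notice, while still altering the raw data an AI model would ingest.

def perturb(image, budget=2):
    """Shift every pixel up by at most `budget` (hypothetical helper)."""
    return [min(255, p + budget) for p in image]

image = [10, 128, 200, 255]   # a tiny grayscale "image" as a pixel list
poisoned = perturb(image)

# Largest per-pixel change -- bounded by the budget, so visually
# indistinguishable, yet the training data is no longer the same.
max_change = max(abs(a - b) for a, b in zip(image, poisoned))
```

Nightshade's real perturbations are far more sophisticated, optimized to mislead model training and to survive cropping, editing, and compression; this sketch only captures the "small visible change, big downstream effect" principle.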

Nightshade isn't out to break AI models but to make those who use images without permission think twice, adding a cost to their actions. It's similar to another University of Chicago project, Glaze, which also protects artists' work but focuses on keeping an artist's unique style safe from AI replication; Nightshade's main aim is to stop images from being scraped without consent. Anyone can download the tool online and run it on both Windows and Mac computers without any specialized hardware.

As we see more tools like MIT’s PhotoGuard, Google DeepMind's SynthID, and Meta's Stable Signature popping up, it's clear that there's a growing toolkit for creators to protect their work from being misused by AI. These tools offer different ways to shield artwork, indicating a broader movement towards empowering artists in the digital realm.



