Nightshade, a tool against AI scraping

Nightshade is a new tool artists can use to disrupt AI models. This matters because AI models cannot “forget” the copyrighted material they have ingested, even when its owners ask for it to be removed. So let’s give the models something bad to eat: something that corrupts what they learn. Enjoy the meal, assholes.

“More precisely, Nightshade transforms images into ‘poison’ samples, so that models training on them without consent will see their models learn unpredictable behaviors that deviate from expected norms, e.g. a prompt that asks for an image of a cow flying in space might instead get an image of a handbag floating in space.” (from “What is Nightshade?”, https://nightshade.cs.uchicago.edu)

💚 Congratulations to all the researchers and artists who worked on this tool.

That said, Nightshade is currently available only for powerful desktop computers, but a browser version is coming. Especially interesting is Web Glaze, which lets artists glaze their images even if they are unable to install Glaze locally.

“How can I help support Nightshade and Glaze?
Thank you so much for thinking of us. It is important to us that we not only continue to provide Glaze to visual creators for free, but also extend its protective capabilities. If you or your organization may be interested in pitching in to support and advance our work, please contact our colleague Joshua Leavitt to learn more about gift opportunities for Glaze (managed by the University of Chicago campus offices).”
(from FAQ, https://nightshade.cs.uchicago.edu)

Beyond this, please support the project by following it on:
X/Twitter: @TheGlazeProject
IG: theglazeproject

Article written on 12/02/2024 – Elena Greenedera Zambelli #noai

