"Nightshade," a tool that blocks AI from learning from images by adding "poison" to them, is now publicly available for anyone to use



On January 19, 2024, Nightshade, a tool developed by a research team at the University of Chicago to prevent AI from learning from images, was released to the public, and anyone can now download it.

Nightshade: Protecting Copyright

https://nightshade.cs.uchicago.edu/whatis.html




Glaze - What is Glaze
https://glaze.cs.uchicago.edu/what-is-glaze.html

AI-poisoning tool Nightshade now available for artists to use | VentureBeat
https://venturebeat.com/ai/nightshade-the-free-tool-that-poisons-ai-models-is-now-available-for-artists-to-use/

Developed under Professor Ben Zhao of the University of Chicago, Nightshade is a tool that "makes changes to images that are difficult for the human eye to notice in order to interfere with AI training." The same research team announced an earlier anti-training tool called "Glaze" in March 2023, but according to the team, Glaze was developed as a "defensive" tool, while Nightshade was developed as an "offensive" one.

Glaze is a tool that applies a process called "perturbation" to the original image so that an AI cannot learn the artist's style and fails to imitate it. Nightshade, on the other hand, is not a defense against style imitation: it can fundamentally disrupt AI training by distorting how the features in an image are represented. The research team describes this mechanism as "putting poison on the image."
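The article does not spell out Nightshade's actual optimization, but the general idea of shifting an image's feature representation with a small, hard-to-see perturbation can be sketched roughly as below. This is a hypothetical illustration assuming PyTorch and torchvision; the ResNet encoder, the perturbation bound eps, and the step count are placeholder choices, not Nightshade's published method.

```python
# Minimal, hypothetical sketch of feature-space image poisoning, NOT the
# published Nightshade algorithm: nudge an image's pixels within a small bound
# so that a frozen image encoder sees features of an unrelated target image.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Frozen encoder standing in for a text-to-image model's image feature extractor.
encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()  # keep penultimate features
encoder.eval().to(device)

preprocess = T.Compose([T.Resize((224, 224)), T.ToTensor()])

def poison(original_path: str, target_path: str,
           eps: float = 8 / 255, steps: int = 200, lr: float = 0.01):
    """Return the original image plus a bounded perturbation whose features
    move toward those of an unrelated target image."""
    x = preprocess(Image.open(original_path).convert("RGB")).unsqueeze(0).to(device)
    t = preprocess(Image.open(target_path).convert("RGB")).unsqueeze(0).to(device)
    with torch.no_grad():
        target_feat = encoder(t)

    delta = torch.zeros_like(x, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        feat = encoder((x + delta).clamp(0, 1))
        loss = torch.nn.functional.mse_loss(feat, target_feat)
        opt.zero_grad()
        loss.backward()
        opt.step()
        # Keep the change small so it stays hard for a human to notice.
        with torch.no_grad():
            delta.clamp_(-eps, eps)

    return (x + delta).detach().clamp(0, 1).squeeze(0).cpu()
```

A downstream model trained on many such images would repeatedly see pixels labeled as one concept while their feature representation points at another, which is the sense in which the team calls the image "poisoned."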

Also, according to the research team, even if an image processed with Nightshade is cropped, resampled, or compressed, these operations have little effect on the poisoning.
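Reusing the hypothetical encoder, preprocess, and poison helpers from the sketch above, one rough way to probe such a robustness claim is to re-encode the poisoned image (for example as JPEG) and check how far its features still sit from the original image's features. The quality setting here is an arbitrary example value.

```python
# Rough robustness check for the hypothetical sketch above: save the poisoned
# image as JPEG, reload it, and measure how far its features remain from the
# original image's features. A large distance suggests the perturbation
# survived re-encoding.
import io
import torchvision.transforms.functional as TF

def feature_shift_after_jpeg(original_path: str, poisoned_tensor, quality: int = 75):
    buf = io.BytesIO()
    TF.to_pil_image(poisoned_tensor).save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    reencoded = preprocess(Image.open(buf).convert("RGB")).unsqueeze(0).to(device)
    original = preprocess(Image.open(original_path).convert("RGB")).unsqueeze(0).to(device)
    with torch.no_grad():
        return torch.nn.functional.mse_loss(encoder(reencoded), encoder(original)).item()
```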




Nightshade's mechanism and examples of its actual output are described in detail in the following article.

"Nightshade" is a learning prevention tool that can poison illustrations and photographs and inhibit the learning of image generation AI - GIGAZINE



The research team says, "Nightshade's interference with AI training is weaker than Glaze's, but its changes are also less visible to the human eye."




On the other hand, there are cases where images processed with Nightshade change enough for the difference to be visible to the human eye. Below is an example presented by 2D artist Повышенная синицевость: the left image is the original, the center image is the output with Nightshade's "Low Fast" setting, and the right image is the output with the "Low Slow" setting. Zooming in shows that details such as the background curtains are rougher in the Nightshade-processed images than in the original.



In response to this point, the research team said, "This can happen with complex renderings like this image. If you feel the change is too great for your art style, there is no need to use Nightshade."




In addition, at the time of writing, Nightshade has also drawn criticism to the effect that "it hardly works against the previous generation of training processes," that "it does not work at all against the latest training processes," and that "while academically interesting, it may be a case of exaggerating an unrealistic result."

Nightshade is available in a Windows version and a version for Macs with Apple silicon, and anyone can download it. At the time of writing, demand for Nightshade is high and downloads have reportedly been slow, so the research team recommends downloading from the fast mirror links for the Nightshade binaries.




As a future goal, the research team aims to develop an integrated version of Glaze and Nightshade to further strengthen their interference with AI training.




The team also revealed plans to integrate Nightshade into the browser-based version of Glaze.




Furthermore, the team is reportedly considering developing Glaze not only for still images but also for video.


in Software, Posted by log1r_ut