Meet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training data
venturebeat.com
Nightshade was developed by University of Chicago researchers under computer science professor Ben Zhao and will be added as an option to...
All Comments
119 comments
It should be pretty easy to filter out everything that is not visible to humans.