Photographers Push Back on Facebook's 'Made with AI' Labels Triggered by Adobe Metadata. Do you agree “‘AI was used in this image’ is completely different than ‘Made with AI’”?
I agree pretty heartily with this metadata-signing approach to sussing out AI content.
Create a cert org that verifies that a given piece of creative software properly signs work made with its tools, get eyeballs on the cert so consumers know to look for it, then watch and laugh while everyone who can't get the cert starts claiming they're being censored because nobody trusts any of their shit anymore.
Bonus points if you can get the largest social media companies to only accept signed content and to flag uploads whose signatures indicate Photoshop work, AI generation, or removal of another artist's watermark.
There are ways to make signatures hard to forge, and a signature can be unique to every piece of media made, meaning a fake can't be created reliably.
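Roughly what that per-file signing could look like, just as an illustration (this uses Ed25519 from Python's `cryptography` package; the tool name and metadata fields are made up, not any real cert org's scheme):

```python
import json
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# Hypothetical: the creative tool holds a private key certified by the cert org.
tool_key = ed25519.Ed25519PrivateKey.generate()

def sign_export(image_bytes: bytes, metadata: dict) -> bytes:
    """Sign the image bytes together with the provenance metadata.
    Because the image bytes are part of the signed payload, every
    exported file gets its own unique signature."""
    payload = image_bytes + json.dumps(metadata, sort_keys=True).encode()
    return tool_key.sign(payload)

def verify_export(image_bytes: bytes, metadata: dict, signature: bytes) -> bool:
    """What a platform could run at upload time."""
    payload = image_bytes + json.dumps(metadata, sort_keys=True).encode()
    try:
        tool_key.public_key().verify(signature, payload)
        return True
    except InvalidSignature:
        return False

# The metadata declares how the work was made; the signature binds it to the pixels.
meta = {"tool": "ExamplePaint 2.0", "ai_used": False, "edits": ["crop", "levels"]}
img = b"...raw image bytes..."
sig = sign_export(img, meta)
print(verify_export(img, meta, sig))         # True
print(verify_export(img + b"x", meta, sig))  # False: any change to the pixels breaks it
```

Since the signature covers both the pixels and the declared metadata, you can't strip off the "AI was used" flag without also invalidating the signature.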
Importing and screen-capture software can also carry the certified signing code and sign the copy with the metadata of the original file it's copying (rough sketch below). Taking a picture of the screen with a separate device, or recreating the image pixel by pixel, could in theory get around it. In practice, though, people will see at best a camera photo being presented as a Photoshop piece or digital painting, and at worst some loser pointing their phone at their laptop trying to pass something off dishonestly. As for pixel-by-pixel recreations: again, the software can be given the metadata stamp, and if sites refuse to accept unstamped content, going pixel by pixel in unvetted software just leaves you with a neat PNG for your trouble. Doing it manually? If someone is hand-placing squares just to slip a single deepfake through, that person is a state actor, and that's a whole other can of worms.
ETA: the software can also sign a pixel-art creation as pixel art, since it's built square by square, so that would tip people off in the signature notes of a post.
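To make the inherit-the-original's-metadata idea concrete, here's a loose sketch (hypothetical field names, not any real provenance standard): the capture/import tool records a hash of the source file it copied, carries the source's metadata forward, and signs the whole manifest with its own certified key.

```python
import hashlib
import json
from cryptography.hazmat.primitives.asymmetric import ed25519

# Hypothetical: the screen-capture / import tool has its own certified key.
capture_key = ed25519.Ed25519PrivateKey.generate()

def capture_manifest(original_bytes: bytes, original_meta: dict,
                     captured_bytes: bytes) -> dict:
    """Build a signed manifest for the captured copy that points back
    to the original file it was copied from."""
    manifest = {
        "action": "screen_capture",
        "parent_sha256": hashlib.sha256(original_bytes).hexdigest(),
        "parent_metadata": original_meta,   # e.g. the original's tool and AI flags
        "child_sha256": hashlib.sha256(captured_bytes).hexdigest(),
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = capture_key.sign(payload).hex()
    return manifest

# A site checking an upload can hash the uploaded bytes, compare against
# child_sha256, verify the signature, and surface parent_metadata
# ("originally exported from ExamplePaint, no AI") in the post's signature notes.
original = b"...original export..."
copy = b"...screen capture of it..."
print(capture_manifest(original, {"tool": "ExamplePaint 2.0", "ai_used": False}, copy))
```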
The opposite way could work, though. A label that guarantees the image isn't [created with AI / digitally edited in specific areas / overall digitally adjusted / edited at all]. I wonder if that's cryptographically viable? Of course it would have to start at the camera itself to work properly.
Signing the photo on the camera would achieve this, but ultimately it's just rehashing the debate from back when this whole Photoshop thing was new. History shows us that some will fight it, but new artistic tools end up creating new artistic styles and niches.
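On the "is it cryptographically viable" question: the camera-side part is the same signing trick applied at capture time. A minimal sketch, assuming a camera with an embedded private key whose matching public key the manufacturer publishes (all names here are made up):

```python
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# Hypothetical: key burned into the camera at manufacture; the manufacturer
# publishes the matching public key so anyone can check a "not edited at all" claim.
camera_key = ed25519.Ed25519PrivateKey.generate()
camera_pubkey = camera_key.public_key()

def shoot(sensor_bytes: bytes) -> tuple[bytes, bytes]:
    """Camera signs the raw capture the moment it is taken."""
    return sensor_bytes, camera_key.sign(sensor_bytes)

def is_unedited(image_bytes: bytes, signature: bytes) -> bool:
    """Anyone (a platform, a viewer) can verify the file still matches what
    the camera signed; any later edit invalidates the label."""
    try:
        camera_pubkey.verify(signature, image_bytes)
        return True
    except InvalidSignature:
        return False

photo, sig = shoot(b"...raw capture...")
print(is_unedited(photo, sig))           # True: straight out of camera
print(is_unedited(photo + b"!", sig))    # False: edited after capture
```

The finer-grained labels in the bracketed list (edited in specific areas, overall adjusted, etc.) would need each editing tool to append its own signed record of what it changed on top of the camera's original signature, rather than relying on a single signature over the whole file.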