Pictures as proof have been questionable for a while, at least when a good photoshopper is involved. Video was much harder to fake. AI still isn't perfect, but a short moving clip paired with AI audio can fool many people now.
Especially in combination with real video: twisting words, cutting something out of the middle, replacing a word, changing an expression, and so on.
I’ve been thinking it would be fun to take a 100% real photo of a stack of real hands from exactly the right angle so it looks like a mutated abomination generated by some stupid AI.
Biometrics should be usernames, not passwords. Fingerprints, irises, faces, vocal patterns, all of it, no matter how good it is, only identifies the person trying to enter/use something and is somewhat easy to steal without their knowledge.
If you want true security you still need to ask for a passcode that only the now-identified user will know.
And yes, it is still possible to intercept the passcode at the moment that the user interacts with the locking mechanism, but that is completely different from grabbing it when they're randomly walking down the street, etc.
(Edit to add: I didn't think this needed to be explained, but I'm not saying biometrics should replace usernames, I'm saying they shouldn't have replaced passwords. And yes, you can still use biometrics in the authentication process to identify that it's you, i.e. your username, but you still need a password.)
The only form of authentication that will work long term is to run a hash on the entire person.
Basically instead of authenticating that it is the same person, you authenticate that whatever is attempting access shares enough characteristics with that person to use the resource in the same way.
Like, a perfect transporter clone of me can get access to my stuff, but it’s okay because he’s got my same goals and moral constraints.
Maybe, but did you hear about all the topless photoshoots Scarlett Johansson has been doing? Neither have I, but I've certainly seen pictures of them, so they must have happened. Just don't look at the fingers.
Very true. It's the immediacy of it that struck me this morning.
If I make a post telling you that I met Elvis Presley as an old man and you respond with "Pics or it didn't happen", I can post a reply with a picture of me meeting an elderly Elvis Presley within a couple of minutes (if I knew where to go). Whereas before, I'd have to put some effort into it and respond a day later with my finished photoshop creation. The immediacy lends it a credibility that traditional photo manipulation didn't have.
Sometimes it feels like technology may doom us all in the end. Society is entering a rough patch now that liars and cheats can be more convincingly backed up, and honest folk can be hidden behind credible doubt that they are the liars.
AI isn’t just on the path to making convincing lies; it’s on the path to ensuring that all truth can be doubted as well. At which point, there is no such thing as truth until we learn some new way to tell the difference.
“They don’t need to convince us that what they are saying (the lies) is true. Just that there is no truth, and you cannot believe anything you are told.”
It kind of makes me want to go live in a shack in the woods, grow a garden, and live out the rest of my days growing my beard and writing increasingly obscure poetry until it gets published posthumously.
They could make it difficult to open up the camera and extract its signing key, but only one person has to do it successfully for the entire system to be unusable.
In theory you could have a central authority that keeps track of cameras that have had their keys used for known-fake images, but then you're trusting that authority not to invalidate someone's keys for doing something they disagree with, and it still wouldn't prevent someone from buying a camera, extracting its key themselves, and making fraudulent images with a fresh, trusted key.
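The failure mode above can be sketched in a few lines. This is a simplified stand-in: a real camera scheme would use an asymmetric signature (e.g. Ed25519) with the public key registered to the device, but an HMAC over the image bytes with a hypothetical device secret shows the same trust problem, because anyone who extracts the key can mint valid signatures for fabricated images.

```python
import hashlib
import hmac

# Hypothetical secret burned into the camera's hardware. In a real design
# this would be the private half of an asymmetric keypair.
CAMERA_KEY = b"secret-burned-into-camera-hardware"

def sign_image(image: bytes) -> str:
    # The camera attaches this tag to every capture.
    return hmac.new(CAMERA_KEY, image, hashlib.sha256).hexdigest()

def verify(image: bytes, tag: str) -> bool:
    # A verifier checks the tag against the image bytes.
    return hmac.compare_digest(sign_image(image), tag)

photo = b"raw sensor data from a genuine capture"
tag = sign_image(photo)

print(verify(photo, tag))        # genuine capture verifies
print(verify(b"edited", tag))    # a tampered image fails

# But once CAMERA_KEY is extracted, an attacker can sign anything,
# and the fake verifies exactly like a real capture:
fake = b"AI-generated image"
print(verify(fake, sign_image(fake)))
```

The verification step can't distinguish "this came from a camera sensor" from "this came from someone holding the key", which is why one successful extraction undermines the whole scheme.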
Anything from the present that people want to authenticate in the future, they can publish a hash of now.
So long as people trust the fact that the hash was published now, in the future when it’s fakable they can trust that it existed before the faking capability was developed.