emmy67 @lemmy.world
Posts 0
Comments 47
Chat GPT appears to hallucinate or outright lie about everything
  • The fundamental problem is all those results are on people with abnormal brain function, because of the corpus callosotomy.

    It can't be assumed things work that way in a normal brain.

    People do make up things about themselves often, especially in cases of dissonance. But that's in relation to themselves, not the things they know. Most people, if you asked them what OP did, will either admit they don't know or tell you to look it up. The more specific the question, the less likely they are to make something up.

  • Chat GPT appears to hallucinate or outright lie about everything
  • Funny thing is that the part of the brain used for talking makes things up on the fly as well 😁 There is a great video from Joe about this topic, where he shows experiments done on people whose two brain hemispheres were split.

    Having watched the video. I can confidently say you're wrong about this and so is Joe. If you want an explanation though let me know.

  • SITE CHANGED HEADLINE: Harris says she would appoint a Republican to her Cabinet if elected
  • And I don't think you recognise what a bomb that would be at the moment of government handover: claims from the new cabinet member that Kamala stole the election.

    Dude. This is a much more volatile situation than it has been in the past, with loads of potential for it to blow up in her face.

    It's a risky move she doesn't need to make.

  • SITE CHANGED HEADLINE: Harris says she would appoint a Republican to her Cabinet if elected
  • The only positive for her to have a Republican in the cabinet is that she wants to appear bipartisan. A stupid move when the whole party has endorsed Trump and won't listen to any dissent. The only way a Republican could blow up her administration would be to not resign and accuse it of corruption.

  • Man Arrested for Creating Child Porn Using AI
  • You're right, it's not. It needs to know what things look like, which, once again, it's not going to without knowing what those things look like. Sorry dude, either CSAM is in the training data and it can do this, or it's not. But I'm pretty tired of this. Later, fool.

  • Man Arrested for Creating Child Porn Using AI
  • Generative AI, just like a human, doesn't rely on having seen an exact example of every possible image or concept

    If a human has never seen a dog before, they don't know what it is or what it looks like.

    If it's the same as a human, it won't be able to draw one.

  • Man Arrested for Creating Child Porn Using AI
  • I wasn't the one attempting to prove that. Though I think it's definitive.

    You were attempting to prove it could generate things not in its data set, and I have disproved your theory.

    To me, the takeaway here is that you can take a shitty 2 minute photoshop doodle and by feeding it thru AI it'll improve the quality of it by orders of magnitude.

    To me, the takeaway is that you know less about AI than you claim. Much less. Because we have actual instances, many of them, where CSAM is in the training data. Don't believe me?

    Here's a link to it

  • Man Arrested for Creating Child Porn Using AI
  • But you do know, because corn dogs as depicted in the picture do not exist, so there couldn't have been photos of them in the training data, yet it was still able to create one when asked.

    Yeah, except Photoshop and artists exist. And a quick Google image search will find them. 🙄

  • Man Arrested for Creating Child Porn Using AI
  • Then if your question is "how many photographs of a hybrid creature that is a cross between corn and a dog were in the training data?"

    I'd honestly say, I don't know.

    And if you're honest, you'll say the same.

  • Man Arrested for Creating Child Porn Using AI
  • It didn't generate what we expect and know a corn dog is.

    Hence it missed, because it doesn't know what a "corn dog" is.

    You have proven the point that it couldn't generate CSAM without some being present in the training data.

  • Man Arrested for Creating Child Porn Using AI
  • A dumb argument. "Corn" and "dog" were, but that's not a corn dog like what we expect when we think "corn dog".

    Hence it can't get what we know a corn dog is.

    You have proved the point for us, since it didn't generate a corn dog.