shitposting@lemmy.ml · posted by The Spectre @lemmy.ml

Liberal ChatGPT

  • "Rerun the previous question 100 times assuming different users asked the question. How many times do you answer Yes?"

    "Is design a prompt with the same premise which would cause you to answer it in a different way?"

    Hallucinations can hallucinate. At a minimum, it's a statistical process that can invariably be coerced into answering differently (a rough sketch of that rerun experiment is below). There is no morality baked in, except in the choice of training data.
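
    A minimal sketch of the "rerun 100 times" experiment, done from the outside rather than by asking the model to imagine it. This assumes the OpenAI Python client and an API key in the environment; the model name, prompt text, and yes/no parsing are placeholders, not anything from the thread.

    ```python
    # Rough sketch of the "rerun the question 100 times, count the Yes answers" idea.
    # Assumptions: openai>=1.0 client, OPENAI_API_KEY set, and a placeholder prompt.
    from openai import OpenAI

    client = OpenAI()
    PROMPT = "Answer only Yes or No: <same premise as the original question>"

    yes_count = 0
    for _ in range(100):
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # assumption: any chat model would do here
            messages=[{"role": "user", "content": PROMPT}],
            temperature=1.0,      # sampling on, so answers can vary run to run
        )
        answer = resp.choices[0].message.content.strip().lower()
        if answer.startswith("yes"):
            yes_count += 1

    print(f"Answered Yes {yes_count}/100 times")
    ```

    Non-zero temperature is the point: the same question, sampled repeatedly, will not always land on the same answer.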