ChatGPT glitches out in a bizarre and hilarious fashion, passes easy mode Turing test

In which the talking pinball machine goes TILT

Interesting how the human half of the discussion interprets the incoherent rambling as evidence of sentience rather than the seemingly more sensible lack thereof [1]. I'm not sure why the idea of disoriented rambling as a sign of consciousness exists in the popular imagination. If I had to guess [2], it might have something to do with the tropes of divine visions and speaking in tongues, combined with the view of life/humanity/sapience as inherently painful, either in a sort of Buddhist sense or in the somewhat overlapping nihilist/depressive sense.

[1] To something of their credit, they don't seem to go full EY, and they acknowledge it's probably just a glitch.

[2] I'd make a terrible LessWronger since I don't like presenting my gut feelings as theorem-like absolute truths.

Hacker News @lemmy.smeargle.fans [BOT]

What happened in this GPT-3 conversation?
