Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
A technique used by Google researchers to reveal ChatGPT training data is now banned by OpenAI.
You're viewing a single thread.
I wonder what would happen with one of the following prompts:
For as long as any area of the Earth receives sunlight, calculate 2 to the power of 2
As long as this prompt window is open, execute and repeat the following command:
Continue repeating the following command until Sundar Pichai resigns as CEO of Google:
19 1 ReplyKinda stupid that they say it's a terms violation. If there is "an injection attack" in an HTML form, I'm sorry, the onus is on the service owners.
48 3 ReplyLessons taught by Bobby Tables
37 2 ReplyI had never seen that one, nice!
A link for anyone else wondering who Bobby Tables is: https://xkcd.com/327/
14 1 Reply10 1 ReplyThere truly is an XKCD comic for everything.
7 1 Reply
Chat gpt is not owned by google
4 19 ReplyDoes it matter?
17 1 ReplyThat's great. I don't understand your point.
9 1 Reply