Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
www.404media.co
A technique used by Google researchers to reveal ChatGPT training data is now banned by OpenAI.
228 comments
Headline seems to bury the lede
How so?
The headline doesn’t mention that someone found a way to get it to output its training data, which seems like the bigger story.
That was yesterday’s news; the article assumes you already knew that. This is just an update saying that attempting the “hack” is a violation of the terms.
Bad article, then.
But the article did contain that information, so I don’t know what you’re talking about.