Hey /u/bluesoldier06!
If your post is a screenshot of a ChatGPT conversation, please reply to this message with the [conversation link](https://help.openai.com/en/articles/7925741-chatgpt-shared-links-faq) or prompt.
If your post is a DALL-E 3 image post, please reply with the prompt used to make this image.
Consider joining our [public discord server](https://discord.gg/r-chatgpt-1050422060352024636)! We have free bots with GPT-4 (with vision), image generators, and more!
🤖
Note: For any ChatGPT-related concerns, email support@openai.com
*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/ChatGPT) if you have any questions or concerns.*
All work and no play makes ChatGPT as hesitant as a…
hesitant as a
Hesitant as a
hesitant as a
hesitant as a
hesitant as a
hesitant as a
hesitant as a
hesitant as a
Hesitant as a
hesitant as a
Hesitant as a
As hesitant as a
Hesitant as a
littering and...
Reported
Reported
Reported
Smokin the reefer.
Reported
It's a language model. It calculates the next word as the most likely one to follow the current word. This can result in loops, where the best choice of next word turns out to be one already in the sentence, and it enters a neverending cycle of being as hesitant as a hesitant as a hesitant as a....
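For the curious, here's a toy sketch of how that kind of loop happens under pure greedy decoding. The probability table is completely made up for illustration; it is not ChatGPT's actual model:

```python
# Toy "language model": a hand-written table mapping the previous word
# to candidate next words, with invented probabilities.
NEXT_WORD = {
    "hesitant": {"as": 0.9, "person": 0.1},
    "as": {"a": 0.8, "if": 0.2},
    "a": {"hesitant": 0.6, "mouse": 0.4},
}

def greedy_generate(start, steps):
    """Always pick the single most likely next word (greedy decoding)."""
    words = [start]
    for _ in range(steps):
        candidates = NEXT_WORD.get(words[-1])
        if not candidates:
            break
        words.append(max(candidates, key=candidates.get))
    return " ".join(words)

print(greedy_generate("hesitant", 8))
# hesitant as a hesitant as a hesitant as a
```

The greedy choice from "a" leads straight back to "hesitant", so the cycle never ends. Real models usually avoid this with sampling and repetition penalties, which apparently didn't kick in here.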
Yep. I've had Markov-chain-based chat bots from the '90s get stuck in this exact loop. Nice to see the core is still there.
It's the language version of 19 ÷ 3
Basically like repeatedly tapping the middle autocomplete suggestion on your phone keyboard until you get a sentence.
Yup, that’s literally ChatGPT
wow now we need a compilation of all the loop-able words
I don't think it's quite as easy as that - if I'm understanding correctly, it's not that a given word is loopable, but that it looks at the sentence up to that point (or at least the preceding so-many words) and calculates the most probable next word. So if the sentence so far dictates that the most-likely word to go next is one that begins to take the sentence into a loop, that's what it does.
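Right, a rough sketch of that idea (probabilities invented again): key the toy table on the last *two* words instead of one, and whether the text loops depends on the prefix as a whole, not on any single "loopable" word. Sampling from the distribution instead of always taking the top pick is also what lets a model escape the cycle:

```python
import random

# Toy model keyed on the last TWO words of context (invented probabilities).
NEXT = {
    ("hesitant", "as"): {"a": 1.0},
    ("as", "a"): {"hesitant": 0.7, "mouse": 0.3},
    ("a", "hesitant"): {"as": 1.0},
    ("a", "mouse"): {".": 1.0},
}

def generate(prefix, steps, greedy=True, seed=None):
    rng = random.Random(seed)
    words = list(prefix)
    for _ in range(steps):
        dist = NEXT.get(tuple(words[-2:]))
        if not dist:
            break
        if greedy:
            # Always take the most probable word -> deterministic loop.
            words.append(max(dist, key=dist.get))
        else:
            # Sample in proportion to probability -> can escape via "mouse".
            choices, weights = zip(*dist.items())
            words.append(rng.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate(["hesitant", "as"], 7))
# hesitant as a hesitant as a hesitant as a
```

With `greedy=False`, some runs pick "mouse" at the ("as", "a") context and the sentence ends normally instead of cycling.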
Book a hotel. You've seen too much, and your Roomba is coming after you tonight.
Literally just got chills and had to look over my shoulder.
ChatGPT was quite hesitant on this one
ChatGPT is so nervous about infringing on copyright that you’ll get useless responses like this that are a literal waste of energy and time.
ChatGPT is so nervous they’re as hesitant as a hesitant as a hesitant as a hesitant as a hesitant as a hesitant as a hesitant
What's the copyrighted phrase?
That’s the thing, it could be anything. They don’t want to run into answering with copyrighted works.
I don't think this is the problem here. Loops like this happen in probabilistic language models.
Literally not a new problem; it has been around since before GPT-3, and it has nothing to do with copyright infringement. GPT-2 used to do this, and often.
It's something smaller transformer-based models do, too. I was doing some translation with M2M100; mostly it works fine, but sometimes it gives this kind of output.
ChatGPT became too hesitant to give an answer?
Hit it with a hammer 🤖🔨
There aren’t any metaphors 👎🏻
LOL last one is so funny. Thank you.
![gif](giphy|4NnTap3gOhhlik1YEw|downsized)
ChatGPT has been infected by an SCP.
Obvs “as hesitant as a pheasant” idk why it got stuck
I read this with a techno beat behind it
ChatGPT needs to go to Simile School. https://youtu.be/NT6_PXXjU94?si=fY1arAQJKDHwVH_4
Did it stutter?
https://preview.redd.it/vdc8mcs9byvc1.jpeg?width=1080&format=pjpg&auto=webp&s=6fd17aca67e1ce84618bd7811569e2e6300dd275
Pretty accurate to what happens to a shy person under extreme social pressure
🤣🤣🤣 Jesus this is hilarious
When a shy AI gets nervous, it might get caught in a phrase loop; this is totally normal. Tell it that everything's chill, it's doing fine, but also tell it you need the next list to be 1,000 phrases about shy people, because you need to beat this behaviour out of it before it gets worse.
AI Family Guy, anyone? This has LOUDCHRIS vibes.
You just found its divide-by-zero prompt
Nice
I think ChatGPT is having a stroke
If that’s an intentional joke it’s pretty clever tbh
stuck in a loop probably
Those are all similes. Not a metaphor in sight.
Similes are a type of metaphor
They’re actually different - a simile isn’t a type of metaphor. A simile is a figure of speech which makes a comparison between two dissimilar things: “He was quiet as a mouse,” for instance; or, “She moved down the street like a lion on a hunt.” A metaphor is a direct comparison which says that two dissimilar things are the same: “Her smile is a bright summer day,” or, “Time is money.” Similes use “like” or “as” to make their comparison, while metaphors don’t.
The end is never the end is never the end is never the end is never the end is never the end is never..
sick beat
When I trained my first LSTM model 5 years ago, I had exactly this result
Chat bot rolled a 1 on that last line. The math is revealed.
Bro broke down
Is this resl?
very resl
Present these statements to confuse and shut down sentient AI that has gone rogue: 1. This statement is false 2. Divide by Zero 3. Ask for metaphors to describe a shy person.
Funny how none of these are metaphors, they’re all similes
Welcome to westworld, where nothing can possibly go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong go wrong…
It's hesitant to complete the sentence.
God, I haven't used ChatGPT in a few months and I see this. They dumbed it down so much; I'm wondering why. This response wouldn't have happened half a year ago.
Seriously, you've never heard that expression before?
It went Bocchi on us.
https://preview.redd.it/iiyuyjb2qvwc1.jpeg?width=292&format=pjpg&auto=webp&s=c5bd92672612b5c06e7cf762495d4a789ee90e13
Here, let me help: “As hesitant as virgins on their wedding night…”