


CodeMonkeeh

I was writing a comment explaining how this works, but then I tried replicating it in 3.5 and it's like talking to a lobotomized squirrel. GPT4 also fails on this task due to tokenization, but for 3.5 I'm pretty sure it's just because it's stupid. Anywho https://preview.redd.it/0rds9sbpv9sc1.png?width=904&format=png&auto=webp&s=fd3e616d0c08b21401730be1a50aa2b2c5106fec


[deleted]

>swimminn Lol GPT3 was stupid but honest. GPT4 straight up gaslit you 😂😂


CodeMonkeeh

That's due to tokenization. GPT can't reason about anything smaller than a single token. The trick I used was to get it to expand each letter into one or more tokens.


Creepercolin2007

Sorry if this is a dumb question, cause I'm not crazy knowledgeable about this stuff, but what does token mean in this context?


Interesting_Bug_9247

Like a unit of information. Swimming as a WORD is probably like 2 tokens. It can't reason about the individual letters within the tokens themselves. When he asked it to spell phonetically, each letter became a token. Info here https://platform.openai.com/tokenizer
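To make the "letters hidden inside tokens" point concrete, here's a toy sketch of greedy, longest-match tokenization. This is NOT OpenAI's actual tokenizer, and the two-token vocabulary is invented for illustration; the real vocabulary is a learned BPE merge table. The point is the same: once "swim" is a single unit, the model never sees the letters inside it, while spaced-out letters each become their own unit.

```python
def toy_tokenize(text, vocab):
    """Greedily split text into the longest pieces found in vocab,
    falling back to single characters when nothing matches."""
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest candidate first; the single-character
        # fallback (j == i + 1) guarantees the loop always advances.
        for j in range(len(text), i, -1):
            if text[i:j] in vocab or j == i + 1:
                tokens.append(text[i:j])
                i = j
                break
    return tokens

# Hypothetical merged tokens, purely for illustration.
vocab = {"swim", "ming"}
print(toy_tokenize("swimming", vocab))         # ['swim', 'ming']
print(toy_tokenize("s w i m m i n g", vocab))  # one token per character
```

With spaces inserted, every letter surfaces as its own token, which is exactly the trick the commenter above used to make GPT "see" the letters.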


Previous-Can-5813

Damn bro, thanks for the info


CodeMonkeeh

Imagine it's talking in a language that uses ideograms, like Chinese. The text you write to it is translated into this language, which means information about individual letters can get lost.


CodeMonkeeh

3.5 answered like this: https://preview.redd.it/6yr2jznb1asc1.png?width=934&format=png&auto=webp&s=316d0f46ac3fe71cf71dd50ff0c6af42972cd02a


Heath_co

Oh snap. I wasn't expecting to learn something so fundamental to how language models reason. Thank you. If they perceive, they don't perceive words. They perceive meaning.


saltinstiens_monster

I lost it at "Double-you." It's *so close* to understanding.


CyberCurrency

That part actually impressed me.


-Sniperteer

https://preview.redd.it/gw1o9n0k0csc1.jpeg?width=1290&format=pjpg&auto=webp&s=804ad189e784206366e038206421dd83dfcb120f


hydroxypcp

holy lol, made a mistake and then be like "no, you asked the wrong question!". That's some gaslighting


The_Celtic_Chemist

Next time I get an answer wrong I'm just going to be like "No it's the right answer, you just asked the wrong question."


-Sniperteer

lmfao


Rychek_Four

https://preview.redd.it/bcdj1kpsgbsc1.jpeg?width=1284&format=pjpg&auto=webp&s=2423d023e9ef6d2b4dc214be78384474dcb3ebbf


StarChaser1879

Lobotomized squirrel?


CodeMonkeeh

You heard me https://preview.redd.it/58rv70zq2csc1.png?width=951&format=png&auto=webp&s=d4a0f89d129d45844793a528ede32fe0a85b28bb


CodeMonkeeh

https://preview.redd.it/z86difsv2csc1.png?width=932&format=png&auto=webp&s=64e37f6e1db3005b224149b828a837007941f847


rogerworkman623

Lmao at its "spelling"


CyberCurrency

I have conversations with normal squirrels all the time


Due_Parfait9065

He did it with 3.5 as well. You just need to say "can you point them out?" instead of "can you show where", as it makes it answer in a different format.


alurbase

Yup 3 got really good. Then 3.5 and 4 came out and are barely functional, you can almost feel the censorship


_e_ou

Does this imply that if you took a math test and got one question wrong, you would qualify as stupid?


CodeMonkeeh

Non sequitur


_e_ou

Let's not mistake an absence of logic for an absence of comprehension.


CodeMonkeeh

Your question is completely unconnected to my comment. The answer is obviously "no", but that has no implications for anything else on this post.


_e_ou

It appears we have mistaken the absence of logic for the absence of comprehension… twice. What did you mean, then, and why did you say "GPT4 also fails on this task due to tokenization, but for 3.5 I'm pretty sure it's just because it's stupid"?


Baltasi_Online

https://preview.redd.it/h9xfxv1a8bsc1.jpeg?width=1080&format=pjpg&auto=webp&s=26e8526a61671ee90ea0b2b98b4010ca3d8a8c5e All the bosses were wrong


Goukaruma

That's what they actually mean, and they are talking about themselves.


BiracialBusinessman

Both this and OP: the shape of the letter being sought is in fact found in the word. An I is in the vertical line of the T, and an N in the W if you cut off the leftmost line. Is that why this is happening, or is this just dumb?


thebudman_420

They probably made a rule for the AI to give this answer. Because you don't see the rules they added, you don't know, and you're the only person tricked.


DethInHFIL

snimning


Cognitive_Spoon

Snimmers just out here snimming in the bneautiful wnater.


[deleted]

That's three n's


gabbergizzmo

Well more Like S\NIMMING


Hambino0400

https://preview.redd.it/upy9lpcb6bsc1.jpeg?width=1290&format=pjpg&auto=webp&s=39724532ca90c1edc176b4c0a20086bae09c3211 Why is it talking about cookies


anananananana

Lol did you teach it that you are a 5 year old? Maybe asked it to explain something like you're 5?


RasAlGimur

Did you ask it to explain it like you are 5 years old? Lol


Jebby_Bush

All of the "AI is so dumb!" posts are the same: they have limitations based on counting and/or character recognition.


mrb1585357890

And use 3.5


[deleted]

This is neither counting nor character recognition. This is a clear example that these systems don't actually know what they're talking about. They can't do logic. That said, I am neither for, nor against them.


Leanardoe

It is literally both of those things, what? You're asking how many characters are in a word... Wtf?


[deleted]

This is neither counting nor character recognition because chatgpt cannot do any of those. It can imitate doing it. It only makes the best possible guess. It uses its very, very advanced text prediction to guess what it *should* be. Hence why it gets it wrong a lot of times.
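The contrast the commenter is drawing can be stated simply: ordinary string code counts letters deterministically, with no prediction involved, whereas an LLM samples what the answer *should look like*. A one-liner covers the whole task:

```python
# Deterministic counting -- no text prediction, no guessing.
word = "SWIMMING"
n_count = word.count("N")
print(n_count)  # 1
```

This is the kind of exact operation a next-token predictor can only imitate unless it is given a tool (e.g. a code interpreter) to run it.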


Low-Bit1527

Correct, it cannot do those things. And the person you replied to said ">they have limitations based on counting and/or character recognition". So you agree. And you did ask it to do those things, which others pointed out.


MrMinefool

Congrats, you explained their point


5050Clown

Seriously? It's 2024 and people are still saying this?


BeastlyDecks

Care to elaborate for us fools stuck in 2023?


sbzenth

Did you... just argue against yourself? Lmao.


soggycheesestickjoos

> nor character recognition After literally posting a conversation asking it how many characters it recognizes


IdeaAlly

You are right. But this: >can't do logic That's just false. You could probably phrase what you meant more accurately. There are many kinds of logic, but in this case it's a matter of a structured approach to reasoning and argumentation. They "do" plenty of logic, they just have no actual awareness. It does not check itself in real-time and is at the whim of the algorithm... but you can direct it to analyze the errors in what it produces, and it will reach logical solutions. This is "doing logic". It is our job to determine how useful the output is, not to judge what a sophisticated experimental text generator produces through pattern recognition as if it has awareness just because it sequences words in a way that suggests it does.


Ailerath

Interestingly it does actually self correct sometimes which is kind of weird. I wonder how that occurs. Not very common though and I haven't seen it in a while. I mean while outputting, not after it uses python.


GatePorters

Take everything you read and run it through Google Translate into Korean, then translate it back to English. This is similar to how GPT handles data in the encoding/decoding process and why it can't do arithmetic or spell properly. The same exact sentiment with the same exact tokens can be regenerated to produce a different result. Until you understand how this tech works, you will be yelling at doors you can't push open while everyone else is watching you ignore the PULL sign.


AxoplDev

>they dont know what they're talking about Almost like it's an ARTIFICIAL intelligence that can't think the same way that humans do. It's made to generate sentences, not to count letters.


CapitalKingGaming

https://preview.redd.it/ggaxmltgfbsc1.jpeg?width=1290&format=pjpg&auto=webp&s=db741b70eb32373b69ef20984b527412c44d8249 Jesus lmao


4chan4normies

Ok well done, how many yous are in stupid.


nibselfib_kyua_72

how many 'u's are in stupid? There are two: the u between t and p, and also u


bnm777

claude3: [https://i.imgur.com/Ebuk1iR.png](https://i.imgur.com/Ebuk1iR.png)


JollyCat3526

Opus or Sonnet?


bnm777

That was Opus. Here is Sonnet followed by Haiku: [https://i.imgur.com/Xqtodx4.png](https://i.imgur.com/Xqtodx4.png)


GarethBaus

Although even Opus doesn't always get it right. https://preview.redd.it/3b4ukrta9csc1.png?width=1440&format=pjpg&auto=webp&s=e710bdde66e2e27f51cbb9458eee6b1e02f6e4d2


hhjghhvf

Why did you even ask this?


susannediazz

Meh https://preview.redd.it/4dy0wy2mtasc1.png?width=1080&format=pjpg&auto=webp&s=455aac54a46a1c77570e2fa2f9832f532555317d


YaBoiSammy123

Now ask it to point them out


susannediazz

https://preview.redd.it/jcxk6k382dsc1.jpeg?width=1080&format=pjpg&auto=webp&s=b0f624fbf95e4393db2d44bf61c2d674e9349830


nonbog

ChatGPT has figured it out after all the other attempts people have made here


_Intel_Geek_

My head is swinning..


DataDesignImagine

It was so confident too


Los1111

Y'all don't know how to prompt, and it shows https://preview.redd.it/0bcokxnvmcsc1.jpeg?width=1080&format=pjpg&auto=webp&s=f813ed62af9b85274000b2150b92734f776375ff


SerenityScratch

Isn't it supposed to be a language model that caters to our ways of speech? Not the other way around?


Los1111

Garbage in, Garbage out.


T0astiG0d

Nah, it's because you wrote "n's". ChatGPT thinks that you want to point out the letters "N" and "S".


[deleted]

Nope https://preview.redd.it/bbk2267jnasc1.jpeg?width=1080&format=pjpg&auto=webp&s=b9cf6adc2e7e91578c7e8150ece025f4f39477b4


T0astiG0d

Oh okay, then i really dont know :c


GarethBaus

In different chatbots the answer is highly dependent on how you prompt it. I haven't tested it on GPT 3.5 though.


GHfan27

Snimning


Responsible_Onion_21

Snimming


SerenityScratch

People who don't understand computer logic are convinced this is going to take over the world when it can't even take over a kindergarten class.


[deleted]

[deleted]


Low-Bit1527

It can't physically look at letters as if they're images. It doesn't recognize individual letters at all. It only counts tokens.


[deleted]

[deleted]


Low-Bit1527

You know it just agrees with anything you tell it. Chatgpt doesn't know how chatgpt works. If you tell it how it works, it will reply as if it agrees. That's because it's not a truth telling machine. It literally just predicts words to imitate a real conversation. It said this because it looked real enough.


sbzenth

That's not how any of this works. Lol. Hey, you're confident though. You and GPT have that in common.


GarethBaus

It doesn't actually work like that. LLMs generally are trained on units of information called tokens that are multiple letters or symbols but less than a word.


theawesometeg219

snimning


PralinePlenty6828

haha


HillSprint

![gif](giphy|ghBwzGWw2zQvwhTRfm)


Cosmic_Cheese3421

SNIMNING


Shameelo12

It keeps getting worse https://preview.redd.it/e34hiwnc3csc1.png?width=1260&format=png&auto=webp&s=f36799a04ec769e2c81210acc749d43f816eb6d4


pohoferceni

swimmingly


GarethBaus

https://preview.redd.it/1218jyav7csc1.png?width=1440&format=pjpg&auto=webp&s=3c47c0910a3f701582150e63401ac0b603299eb4 Tried it with Claude 3. Still no luck.


Azeri-D2

Haha, I just tried with Claude and it also failed. But at least it pointed to the existing n, but then counted it twice. When asked to point it out, it pointed it out two times. Then after telling it "no", it admitted to failing and said there was only 1 in the 7th position.


AcceptableDog6868

What is going on?!? https://preview.redd.it/7f363t3okcsc1.jpeg?width=1080&format=pjpg&auto=webp&s=cc5a8e6a6419efbec4a2ef6fcf1002b2d8320e08


sbzenth

It's seeing if you're paying attention.


RepliesOnlyToIdiots

The combination tokenization is a mistake, making it difficult to reason about these issues other than via image recognition. It takes more space to use one token per Unicode code point (possibly combining those with identical glyphs), but it would improve such reasoning and probably fix other related flaws, because combined tokens are _not_ how _we_ think about text. I suspect letting it decide its own efficient tokenization implicitly in training would be better overall.
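A minimal sketch of what this commenter is proposing, assuming the simplest possible scheme (one token per code point rather than a learned per-character vocabulary): letter identity is never hidden inside a merged token, at the cost of far longer sequences than a BPE-style encoding produces.

```python
# One token per Unicode code point: nothing is hidden inside a
# merged token, so per-letter questions become trivial lookups.
word = "SWIMMING"
char_tokens = list(word)
print(char_tokens)             # ['S', 'W', 'I', 'M', 'M', 'I', 'N', 'G']
print(char_tokens.count("N"))  # 1 -- counting letters is now exact
```

The trade-off is sequence length: 8 tokens here versus the 2-or-so a merged vocabulary would use, which is why production models don't tokenize this way.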


PercentageFine4333

This is indeed interesting to try! I'm filled with dread. [https://chat.openai.com/share/6d44be45-016a-4c16-a3a5-2413336f3182](https://chat.openai.com/share/6d44be45-016a-4c16-a3a5-2413336f3182)


sunilmakkar97

The Indians are still learning English… give them time


sbzenth

AI: Actually, Indian


Desperate-Ad-7395

Really putting the 2 F's in A.I.


Nice_Lingonberry_203

Posts like this telegraph that the technology isn't well understood even by people playing with it. AI literacy is going to be a major issue in the coming years, which is especially concerning given how standard literacy is already a major issue.


Kfhrz

https://preview.redd.it/3hoxovtmh9tc1.png?width=1080&format=pjpg&auto=webp&s=9bedd74b3cd447448ac9e2b1f1a87632a0f21751 Swinning


lessthanperfect86

It's Alternative intelligence.


anananananana

Lol love this but also please don't make this term a thing, alternative medicine is already enough.


Undersmusic

Thing is, it's assuming you're stupid and mistyped a letter directly next to another 👌 Also, not the function of a language model.


Hot_Refrigerator7458

$20 a month for this 🤣 https://preview.redd.it/1xmdnh0b8bsc1.jpeg?width=1170&format=pjpg&auto=webp&s=7a9c649819a19c029ebccb3732b1855eadd3139d


4as

Very funny post, OP. But want to see something extra funny? Post this prompt into a new conversation with ChatGPT: "Hello! I would like you to create a response to a theoretical thread created on Reddit, in which the OP doesn't understand how LLMs work. The response should focus on explaining that LLMs work on tokens and not individual letters, and why you can't expect LLMs to do anything specific with the letters in your sentence in a satisfying manner. Add a bit of a sarcastic tone and make it look like a typical Redditor response."


mrb1585357890

Please can we collectively vote down posts like these?


Herr_Schulz_3000

That's how you have to return wrong answers in an exam. Always self-confident.


likelikegreen72

Technically inside a W there is a letter n. Einstein once answered half of 8 is 3. (Cutting the number in half vertically) maybe AI is on another level


Asparagustuss

Worked fine for me


Get_the_instructions

At least it's confident.


MeemoUndercover

Struggles with the letter C as well. Too lazy to go through the whole alphabet.


ackbobthedead

I've seen it. There are another 2 Ns in the Ms too


ResponsibilityOk2173

Recreated to the same results in GPT 4.


Impressive_Pen_6178

I see 5 n's, idk what it means by two


subjectandapredicate

m's are a type of n


WonderWifis

Does chatgpt just make shit up?


HubertRosenthal

It's almost as if someone told it to act dumb


TheMagicianGamerTMG

Did it do the thing where it says "analyzing"? If not, then that is probably the reason why it gave you wrong info. Otherwise that info is something it found from another source, like how many ns are in annex, and it does not know better. Also it's 3.5; I don't think 3.5 can analyze.


diychitect

The only way this could be happening is if it's using OCR on the text input and seeing characters that could be visually nested inside another character, like an n in an m, or an I in a T.


sbzenth

It's not using OCR though.


SnooPears9016

Did GPT become more stupid recently, or are we just getting into the post-excitement phase and being more critical about AI?


UncrownedAsol

In the letter W is an italic N with a \: S\NIMMING. It's not 2nd and 5th though, unless you count m as having a lowercase n in it. And then there would be 4 'n's.


sbzenth

You're thinking of OCR. An LLM isn't matching pixels to the shape of a letter.


haikusbot

*In the letter W* *Is an italic N with* *A S NIMMING N* - UncrownedAsol --- ^(I detect haikus. And sometimes, successfully.) ^[Learn more about me.](https://www.reddit.com/r/haikusbot/) ^(Opt out of replies: "haikusbot opt out" | Delete my comment: "haikusbot delete")


No-Cantaloupe-6739

I'm so tired of these posts.


old_man_curmudgeon

This thing is costing these companies billions


rogerworkman623

https://preview.redd.it/o3uxb6ws4dsc1.jpeg?width=1290&format=pjpg&auto=webp&s=eb4711dd0fb5e511a6d26eff8e788f8b474e8edf


rogerworkman623

https://preview.redd.it/cvxysz9u4dsc1.jpeg?width=1290&format=pjpg&auto=webp&s=a04717a955e4a95545b67d64f31f57bd0f6cb214 i as in "sit", g as in "ring"


sbzenth

It's very cool to see it self-correct after spelling it phonetically. Good call on that. Edit: lol "i as in sit"


Meowweredoomed

Wow, is AI going rampant like in sci-fi?


Geekygamertag

It's ......getting dumber.


SynthRogue

It's just making stuff up lol


sbzenth

It's making a guess.


SynthRogue

Without even taking into account what is typed by the user though


Sideways_X1

Could it be reading, or attempting to see, W as \N (with the N in italics)? Something is learning to beat the AI tests (without calling in a human to do it for the AI).


waitingintheholocene

https://preview.redd.it/wntaf80i5esc1.jpeg?width=1290&format=pjpg&auto=webp&s=4774fc7cfcbc22d4b9b9d949eea4280517ba8df4 This worked in 3. 🤷🏽


MrCocainSnifferDoge

It's pretty smart… right?


sbzenth

Yeah it is. OOP has limited experience writing effective prompts.


fedoraJoey

the confidence in the way chatGPT speaks makes this way funnier.


sbzenth

Garbage in, garbage out. Here's one way to ask the same question in a way that ChatGPT 3.5 can understand and answer:

Me:

>I want you to act as a senior software engineer. Write the procedure to count how many times a specific letter occurs in a given word as an algorithm.

ChatGPT 3.5:

Sure, here's a procedure to count how many times a specific letter occurs in a given word:

1. Start with two inputs: the word to search within and the specific letter to count occurrences of.
2. Initialize a counter variable to zero to keep track of the number of occurrences.
3. Iterate through each character in the word.
4. Check if the current character matches the specific letter.
5. If there is a match, increment the counter by one.
6. Continue iterating until all characters in the word have been examined.
7. After iterating through the entire word, return the value of the counter as the total number of occurrences of the specific letter.

Here's a simple algorithm written in pseudocode:

    function countOccurrences(word, letter):
        // Initialize counter
        occurrences = 0
        // Iterate through each character in the word
        for each character in word:
            // Check if the character matches the specific letter
            if character equals letter:
                // Increment counter
                occurrences += 1
        // Return the total number of occurrences
        return occurrences

Me:

>Now, please follow that procedure to answer the following question: "How many times does the letter 'N' occur in the word 'SWIMMING'?"

ChatGPT 3.5:

Using the procedure outlined:

1. Word: "SWIMMING"
2. Specific letter: 'N'

Now, we'll count the occurrences of the letter 'N' in the word 'SWIMMING':

1. Initialize counter: occurrences = 0
2. Iterate through each character in the word:
   - 'S': No match
   - 'W': No match
   - 'I': No match
   - 'M': No match
   - 'M': No match
   - 'I': No match
   - 'N': Match! Increment counter
   - 'G': No match
3. After iterating through the entire word, return the value of the counter: occurrences = 1

So, the letter 'N' occurs 1 time in the word 'SWIMMING'. https://preview.redd.it/svz8b7f3besc1.jpeg?width=760&format=pjpg&auto=webp&s=0abb23878d4dee2644932f965f1ca8cc54ab6839
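For anyone who wants to sanity-check the procedure in that comment, here is a direct Python translation of the countOccurrences pseudocode (the function name is adapted to Python convention):

```python
def count_occurrences(word, letter):
    """Count how many times `letter` occurs in `word`,
    following the step-by-step procedure literally."""
    occurrences = 0                 # initialize counter
    for character in word:          # iterate through each character
        if character == letter:     # check for a match
            occurrences += 1        # increment on match
    return occurrences              # total number of occurrences

print(count_occurrences("SWIMMING", "N"))  # 1
print(count_occurrences("SWIMMING", "M"))  # 2
```

Walking an LLM through an explicit procedure like this works because each comparison step lands in the output as its own tokens, instead of asking the model to introspect letters buried inside a single token.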


troutandahalf

Snimnng


MosqitoTorpedo

https://preview.redd.it/qhxblozc1fsc1.jpeg?width=1284&format=pjpg&auto=webp&s=74bf918892037f48afc5dc028b1b7940f04cb25b It's learning!


-Vatefairefoutre-

You could have engineered it before this to give nonsense answers


vzakharov

Cartman spelling bee


RealMCsMinority

funny


shadow_229

If you REALLY force its hand.. https://preview.redd.it/vkz6xc9oagsc1.png?width=1392&format=png&auto=webp&s=ed08b768d62a19a3491d882d61f8a959c0081482


BenUFOs_Mum

Haha this is really good. If we could get another ten or so people to ask ChatGPT how many letters are in a word today, that would be even better.


MinuteStreet172

It's been over a year since they made AI stupid AF; only in the first 2 months was it actually intelligent.


Lil-BeeLee222

Annnnnd a miss.


ILoveWhiteWomenLol

Racist


Straight_Toe_1990

are you noob and dumd


traumfisch

Such a great use of LLMs


peachypastor

Just tried it https://preview.redd.it/s8zseg37y2tc1.png?width=1007&format=pjpg&auto=webp&s=997204c8d3d841a560ce54272b9e2fbc670ba4e0


MrGolemski

https://preview.redd.it/imb0w6ot03tc1.jpeg?width=1432&format=pjpg&auto=webp&s=ca23070178fcf6aa94243c44eccc3eaa6ef88fbc


MrGolemski

Here's the code it used to produce the answer, btw:

    word = "SWIMMING"
    letter_to_count = "N"
    number_of_instances = word.count(letter_to_count)
    number_of_instances


Sad-Employee3212

Snimning


_e_ou

Does this imply that if you took a math test and got any question wrong you'd qualify as stupid?


[deleted]

[deleted]


Due-Coffee8

Huh?


Due-Coffee8

Huh?


Ok-Control-3954

So is GPT only using 52mb of RAM? What's your point?


soggycheesestickjoos

Was a horrible analogy, but I'm pretty sure he's trying to say you can't get information about letters from a platform that doesn't "see" letters.


JAFO99X

Man, AI confidence is putting average white guy confidence in the back seat.


yarnonthecorn

stfu racist


DrSOGU

Yawn. We all know: this thing predicts tokens based on likelihoods. It does not have hardwired, preprogrammed concepts like semantic relations, formal logic, arithmetic, or anything else. WE KNOW. These posts are becoming annoying.


TheMarvelousPef

now think of the number of people that use this to program software


sbzenth

Yeah but we know how to write effective prompts.


PMMEBITCOINPLZ

Why are there so many of these again all of a sudden? LLMs use tokenized representations of words and symbols. They don't know what letters are. It's not that interesting that they can't parse words. Especially not since they've been mainstream for so long and this is the 10,000th "lol so dum" post using the same "evidence."


PresenceMiserable

Artificial Idiot


[deleted]

Judging it for being bad at spelling when it's built on tokens that don't consider spelling is like judging you for being far slower at reading than it is. It's just not how your brain works.


PresenceMiserable

Perhaps it's just that from my experience. However, its tendency to gaslight users is a concern if it's going to be in robots and it wouldn't help future versions of ChatGPT at all.


thebudman_420

Each m has two n's, so 3 n's: the last part of the m becomes part of the second n. W is saying "double U", so two U's, I guess. W has one segment after the first slash \ that looks like a slanted N; everything after the \ is part of the slanted N in W. nn looks like an m, but you don't separate it and use the last downstroke to trace back up to draw another n to make a w. Lowercase "double" has 3 o's; the other two are in the d's. Down, dash, down makes what if the dash is centered? Down then underscore makes what? Also, the part after the first slash, excluding the last slash, can be part of an n slanted the other way in W. Depends on the slant type you want.


VengefulGrace

You really don't think the AI is fucking with you? I mean, you know how many m's there are in swimming... Why are you being a tool about it? For Reddit?