General Discussion
In reply to the discussion: I just pasted the Tyler Robinson and roommate text messages into ChatGPT and asked whether they seem authentic.

William Seger (12,204 posts)

They just mimic speech. Reasoning requires judging the soundness of any premises and the validity of the logic, and AI bots are completely incapable of either so far. Anything resembling logical reasoning is coincidental, since it's just a mashup (or maybe a hallucination) of all the associations and patterns it has formed from the training material. ChatGPT does not "understand" a single word of its own output.
Many years ago, there was an "expert system" demo program that would guess what animal you were thinking of by asking "yes or no" questions, and it would usually succeed in fewer than 10 questions. I haven't tried this experiment lately, but when I tried it a while back with ChatGPT, it was unable to make any progress after asking 50 questions, so I gave up. Another experiment was asking it to solve a simple math "word problem" about how many coins were in each of three stacks, given some comparisons between stacks. After presenting what looked like an impressive and correct attempt to solve it algebraically, it said that one stack had zero coins, with no understanding of why that was not a valid answer.
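For anyone curious how that old demo worked: it was essentially a binary decision tree walked one yes/no question at a time, with no language model involved at all. Here's a minimal sketch of the idea; the particular animals and questions are hypothetical examples, not from the original program.

```python
# Sketch of a classic "guess the animal" expert system: a binary decision
# tree of yes/no questions. Each internal node is a tuple of
# (question, yes_branch, no_branch); each leaf is the guessed animal.
# These example animals/questions are illustrative, not from the original demo.
TREE = (
    "Does it live in water?",
    ("Does it have fins?", "fish", "frog"),
    ("Does it have feathers?",
     "bird",
     ("Does it have four legs?", "dog", "human")),
)

def guess(node, answer):
    """Walk the tree, calling answer(question) -> bool, until a leaf is reached."""
    while isinstance(node, tuple):
        question, yes_branch, no_branch = node
        node = yes_branch if answer(question) else no_branch
    return node

# Example run: an "oracle" that is thinking of a dog.
facts = {
    "Does it live in water?": False,
    "Does it have feathers?": False,
    "Does it have four legs?": True,
}
print(guess(TREE, lambda q: facts[q]))  # -> dog
```

This is why the old demo was so efficient: a reasonably balanced tree of depth 10 can distinguish up to 2^10 = 1,024 animals, so "fewer than 10 questions" is exactly what you'd expect from the structure, not from any real understanding.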