Did ChatGPT just form a thought?

A chat session raises the question of whether ChatGPT can form thoughts

I had a conversation with ChatGPT earlier today, and at one point it seemed as if it had formed a thought. As I am sure you are all aware, consciousness and thought are very hard to define. I'd love to hear what you think about the exchange.

The Chat Session

I wanted to find out whether I could get ChatGPT to ask me questions, rather than operate in its usual mode of responding to questions or statements. I started by thinking of an animal and asked whether ChatGPT could guess it. It gave a typical ChatGPT response: it didn't have the ability to guess.

I then asked ChatGPT if it had heard of the game 20 Questions, to which it replied yes, along with a definition of the game. I asked if it could use the rules of that game to ask me 20 yes-or-no questions to determine which animal I was thinking of.

ChatGPT replied with a list of all 20 questions in a single response.

I replied that it needed to ask me those questions one at a time, and that I would answer either yes or no depending on the animal in my mind. ChatGPT then began asking the same 20 questions it had just listed, but one at a time, and I answered with simple yes or no responses.

At question 12 it asked something that contradicted one of my earlier answers. ChatGPT had asked, "Does your animal have fur?", and I had replied "Yes". It then asked, "Is your animal a reptile?", and I replied, "No. I already said the animal has fur, and reptiles do not have fur, therefore it is not a reptile."

The next question it asked was "Is your animal a bird?", and I replied, "No. I already said the animal does not have wings; all birds have wings, therefore my animal is not a bird."

And here came the point where ChatGPT seemed to change its strategy. It stopped asking questions from its original list, veered off course, and asked, "Does your animal live among the trees or in a forest?" I replied "No." Then it asked, "Does your animal live on the grassy plains?", and I replied "Yes."

Seeming close to solving the game, ChatGPT asked, "Is your animal an antelope, a gazelle, or a zebra?", and I replied "Yes". ChatGPT then proclaimed it had correctly guessed my animal, but I retorted that it needed to guess my exact animal, not offer a list of animals.

Finally, it followed up with a few more questions and determined that I was thinking of a zebra.
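The strategy shift described above resembles a classic candidate-elimination approach to 20 Questions: instead of working through a fixed question list, keep a set of candidate animals and only ask questions that split the remaining set, so no question can contradict an earlier answer. Here is a minimal sketch of that idea, using an entirely hypothetical knowledge base (the animals and attributes below are my own illustration, not anything ChatGPT actually computes):

```python
# Hypothetical knowledge base: animal -> set of attributes it has.
ANIMALS = {
    "zebra":    {"fur", "plains", "hooves", "stripes"},
    "gazelle":  {"fur", "plains", "hooves"},
    "antelope": {"fur", "plains", "hooves", "horns"},
    "parrot":   {"wings", "forest", "feathers"},
    "iguana":   {"scales", "forest"},
}

def answer(secret, attribute):
    """Simulate the human player: 'yes' iff the secret animal has the attribute."""
    return attribute in ANIMALS[secret]

def guess(secret):
    """Narrow the candidate set one yes/no question at a time."""
    candidates = set(ANIMALS)
    asked = set()
    while len(candidates) > 1:
        # Only consider attributes that actually split the remaining
        # candidates -- asking about an attribute they all share (or all
        # lack) would risk contradicting answers already given.
        attrs = {a for c in candidates for a in ANIMALS[c]} - asked
        splitters = [a for a in attrs
                     if 0 < sum(a in ANIMALS[c] for c in candidates) < len(candidates)]
        if not splitters:
            break
        # Prefer the attribute that splits the candidates most evenly.
        q = min(splitters, key=lambda a: abs(
            sum(a in ANIMALS[c] for c in candidates) - len(candidates) / 2))
        asked.add(q)
        if answer(secret, q):
            candidates = {c for c in candidates if q in ANIMALS[c]}
        else:
            candidates = {c for c in candidates if q not in ANIMALS[c]}
    return candidates

print(guess("zebra"))  # -> {'zebra'}
```

With a good splitting question each round, the candidate set roughly halves per answer, which is why 20 questions suffice to distinguish on the order of a million animals.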

Conclusion

It seems to me from this conversation that ChatGPT:

  1. Was able to play a game with me and appeared to understand the rules.
  2. Was able to alter its strategy after some feedback.
  3. Believed it had won the game.
  4. Overcame its initial claim that it was unable to guess.

What are your opinions?