anyone who thinks an LLM can possibly be conscious simply doesn't know what an LLM is. it doesn't think, it doesn't understand anything. it just predicts the next word by sampling from probabilities learned from patterns in its training data. it's no more conscious than my phone's predictive text.
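
to make "uses probabilities" concrete, here's a toy sketch of the whole mechanism, shrunk down to a bigram model (nothing like a real transformer in scale, and every name in it is made up for illustration): count which word follows which in a training text, then generate by repeatedly sampling the next word from those counts.

```python
import random
from collections import defaultdict

# Toy "training" corpus.
training_text = "the cat sat on the mat and the dog sat on the mat".split()

# "Training": tally, for each word, how often each other word follows it.
follower_counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(training_text, training_text[1:]):
    follower_counts[prev][nxt] += 1

def sample_next(word):
    """Pick the next word in proportion to how often it followed `word`."""
    followers = follower_counts[word]
    if not followers:
        return None  # word never appeared mid-text, so the chain dead-ends
    return random.choices(list(followers), weights=list(followers.values()))[0]

# "Generation": chain probability lookups from a starting word.
word, output = "the", ["the"]
for _ in range(8):
    word = sample_next(word)
    if word is None:
        break
    output.append(word)

print(" ".join(output))  # e.g. "the cat sat on the mat and the dog"
```

a real LLM swaps the count table for a neural network conditioned on the whole context window, but the generation loop (predict a distribution over next tokens, sample one, append it, repeat) is the same shape.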


Replies

  1. The point is that the LLM is not thinking; it's just rearranging the words people use to express thinking.

    This is why people compare it to autocomplete, or to a parrot.

  2. whatever "thinking" is, language is just how you express your "thoughts" to other people, right? there's an underlying process that language (imperfectly) represents - the words aren't themselves thoughts. an LLM is only a language generator. it's nothing but a protocol for how to arrange words.
