• Blue_Morpho@lemmy.world · 15 hours ago

    Hallucinate is the word that has been assigned to what you described. If you don’t load the word with extra emotional baggage, hallucinate is a reasonable word to pick to describe when an LLM follows a chain of words that have internal correlation but no basis in external reality.

    • Oascany@lemmy.world · 13 hours ago

      Trying to strip out the “emotional baggage” is not how language works. A term means something and applies somewhere. Generative models do not have the capacity to hallucinate. If you need to apply a human term to a non-human technology that pretends to be human, “confabulate” is the better fit: hallucination is a response to a stimulus, while confabulation is, in simple terms, bullshitting.

      • Blue_Morpho@lemmy.world · 11 hours ago

        “A term means something and applies somewhere.”

        Words are redefined all the time. Kilo should mean 1000; that was the international standard definition for 150 years. But in computing it came to mean 1024.

        Confabulation would have been a better choice, but people have chosen “hallucinate.”

        • mushroommunk@lemmy.today · 11 hours ago

          Although I agree with you, you chose a poor example.

          Kilo doesn’t mean 1024; that’s kibi. Many of us in tech differentiate because it’s important.
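
          For anyone who wants the distinction made concrete, here is a minimal Python sketch (the 500 GB drive is just a hypothetical example figure):

          ```python
          KILO = 10**3   # 1 kB  = 1,000 bytes (SI decimal prefix)
          KIBI = 2**10   # 1 KiB = 1,024 bytes (IEC binary prefix)

          size_bytes = 500 * KILO**3               # a "500 GB" drive as marketed
          print(f"{size_bytes / KILO**3:.2f} GB")  # 500.00 GB (decimal)
          print(f"{size_bytes / KIBI**3:.2f} GiB") # 465.66 GiB, what many OSes report
          ```

          The gap between the two readings is exactly why the kibi/mebi/gibi prefixes were standardized: at gigabyte scale, the decimal and binary interpretations already differ by about 7%.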