• cub Gucci@lemmy.today · 2 hours ago

If I understand it correctly, a hallucination is not just any mistake. LLMs make plenty of ordinary mistakes too, and that is the primary reason I don’t use them for my coding job.

About a year ago, ChatGPT made up a Python library, complete with a made-up API, to solve the particular problem I asked about. The last hallucination I can recall was it claiming that manual is a keyword in PostgreSQL, which it is not.
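
    For anyone who wants to verify that kind of claim instead of taking the model’s word for it: PostgreSQL exposes its own keyword list through pg_get_keywords(), so a quick query against any running server settles it (assuming you have psql or equivalent access):

    ```sql
    -- pg_get_keywords() returns every keyword the PostgreSQL parser recognizes,
    -- along with its category description.
    SELECT word, catdesc
    FROM pg_get_keywords()
    WHERE word = 'manual';
    -- Returns zero rows: "manual" is not a PostgreSQL keyword.
    ```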