It’s even worse: It answers the questions correctly enough that most people cannot tell the difference – but still not reliably correctly. Meaning you get answers that sound very convincing, but could easily still be dead wrong.
Except when you ask it for the meaning of an acronym and it says something with totally different letters. Yet people treat it as a source on something they know so little about that they cannot possibly tell it’s just spitting out nonsense.
Just like your regular uncle. Or ultra right podcaster.
It’s no surprise LLMs behave like the most vocal and dubiously confident people in the world.
It says, right as you type, that it can make mistakes.