I assume it’s just because when writing about potential nuclear war, most people write about the bombs going off. There aren’t a lot of stories and articles about nobody doing anything and everything turning out fine, presumably. And LLMs are kind of just a glorified autocomplete so that’s what they go with.
True, also I saw another comment that said there was a mechanic that randomly escalates the models' actions, and almost every single nuclear choice was actually a different choice that got escalated