• koper@feddit.nl · +11 · 19 hours ago

    MS Paint isn’t marketed or treated as a source of truth. LLMs are.

  • backgroundcow@lemmy.world · +2/−5 · 17 hours ago

      Does the marketing matter when the reason for the offending output is that the user spent significant deliberate effort in coaxing the LLM to output what it did? It still seems like MS Paint with extra steps to me.

      I get not wanting LLMs to produce “offensive content” unprompted, just as it would be noteworthy if “Clear canvas” in MS Paint sometimes yielded a violent, bloody photograph. But that isn’t what is going on in the OP’s clickbait.

      • Passerby6497@lemmy.world · +3 · 11 hours ago

        when the reason for the offending output is that the user spent significant deliberate effort in coaxing the LLM to output what it did?

        What about all the mentally unstable people who aren’t trying to get it to say crazy things, but end up getting it to say crazy things just by the very nature of the conversations they’re having with it? We’re talking about a stochastic yes-man that can take any input and turn it into psychosis under the right circumstances, and we already have plenty of examples of it sending unstable people over the edge.

        The only reason this is “clickbait” is that someone chose to do this deliberately, rather than their own mental instability bringing it out organically. The fact that this can, and does, happen when someone is trying to provoke it should make you seriously consider the sort of things it will tell someone who is in a state where they legitimately consider crazy shit to be good advice.

        • backgroundcow@lemmy.world · +1/−2 · edited · 9 hours ago

          The only reason this is “click bait” is because someone chose to do this, rather than their own mental instability bringing this out organically.

          This is my point. The case we are discussing now isn’t noteworthy, because someone doing this deliberately is about as “impressive” as typing out a disturbing sentence in MS Paint. One cannot create a useful “answer engine” without it being capable of producing something that looks weird, provoking, or offensive when taken out of context, any more than one can create a useful drawing program that blocks all offensive content. Nor is that a worthwhile goal.

          The cases to care about are those where the LLM takes a perfectly reasonable conversation off the rails. Clickbait like the one in the OP is actively harmful in that it drowns out those real cases, and is therefore deserving of ridicule.