• HeyJoe@lemmy.world · 3 days ago

    If they are being stored on his servers then he absolutely should be, and so should whoever else is in charge of that company. If that were found on any regular person's PC it would be over, so why not here?

    • shalafi@lemmy.world · 3 days ago

      In America:

      Section 230 of the Communications Decency Act provides immunity for online platforms and users, stating they generally aren’t liable for content posted by others, allowing them to host third-party information without being treated as the “publisher or speaker”.

      I’m guessing Europe has a similar provision.

        • shalafi@lemmy.world · 2 days ago

          Oh! I hadn’t thought that through. I guess we don’t have laws to hold AI responsible, and the company can simply dodge responsibility.

      • HeyJoe@lemmy.world · 3 days ago

        Thanks! That makes sense. Although it’s not really “others” if you ask me; it’s themselves. I’m sure they would argue otherwise. No accountability, and it’s only getting worse.

    • AnneBonny@lemmy.dbzer0.com · 2 days ago

      The story says:

      After days of concern over use of the chatbot to alter photographs to create sexualised pictures of real women and children stripped to their underwear without their consent

      Pictures of women or children in underwear are generally not illegal in the United States.

      • Luxyr@lemmy.blahaj.zone · 2 days ago

        There are states that say a character in a book being gay or trans automatically makes it porn, so I think intentionally sexualizing a child in underwear should be sufficient.