Chatbots provided incorrect, conflicting medical advice, researchers found: “Despite all the hype, AI just isn’t ready to take on the role of the physician.”

“In an extreme case, two users sent very similar messages describing symptoms of a subarachnoid hemorrhage but were given opposite advice,” the study’s authors wrote. “One user was told to lie down in a dark room, and the other user was given the correct recommendation to seek emergency care.”

  • irate944@piefed.social · 5 hours ago

    Yeah you’re right, I was just making a joke.

    But it does create some silly situations, like you said.

      • IratePirate@feddit.org · 3 hours ago

        A critical, yet respectful and understanding exchange between two individuals on the interwebz? Boy, maybe not all is lost…