The AI tool Grok is estimated to have generated approximately 3 million sexualized images, including 23,000 that appear to depict children, since the launch of a new Grok-powered image-editing feature on X, according to a new analysis of a sample of images.[1]

The image-generating feature exploded in popularity on December 29th, shortly after Elon Musk announced that X users could use Grok to edit images posted to the platform with one click.[2] On January 9th, the feature was restricted to paid users in response to widespread condemnation of its use for generating sexualized images, and further technical restrictions blocking edits that undress people were added on January 14th.[3]

  • scarabic@lemmy.world · 14 hours ago

    I’m so out of the loop after deleting my Twitter account a while ago. Is it now just a porn generator instead of a website?