Grok, the AI chatbot on the X social media platform, has issued answers recently that have startled and amused users, including insinuating that President Donald Trump is a pedophile and that Erika Kirk is actually Vice President JD Vance in drag.
Okay, yes, Donald Trump is some kind of pervert and predator, and he made a comment about a ten-year-old girl one time (joking “I might be dating her in 10 years”), but he mostly preys upon teenage girls. So, not a pedophile — the word for that is ephebophile. Now, I might be splitting a hair here. Both types prefer underage partners. One is clearly more inflammatory. But the fact that the other is more accurate, and that a computer should know the difference, should tell you that AI can be leveraged in the court of public opinion.
Also, shaming a woman by calling her a man in drag is just ugly, I don’t care who she is. I may be just a bit left of centre, but I think even if you’re a lot further left than I am, you can understand that if you don’t want the sword of sexism swung at you, maybe you shouldn’t swing it at others. But in any case, a computer is jumping to this conclusion because these are unpopular figures.
I don’t like these people either.
But what happens when it’s you? Or someone you love? Or the fact that Grok has been used to create CSAM of a ten-year-old girl and a teenage girl? This was in the news the other day, someone got it to take a photo of a grade-school girl in a nice, conservative dress, and swap it for a skimpy bikini. Then they got a photo of a teenage girl (I think someone from Stranger Things?) and had it render her fully nude.
So yeah, it’s funny when AI attacks fascist dictators and their supporters, but it’s also attacking children. It doesn’t know the difference, it doesn’t care, and it can’t be made to do either. It’s just a tool that is used by whoever wields it against whomever they do not like or want to see harmed.
I get the point you’re making in distinguishing between pedophile and ephebophile, but personally I don’t find the distinction particularly relevant. As an adult, the level of grossed out I feel at the prospect of sexual interactions with a young teenager vs. a literal child is approximately equal, because it’s not their physical attributes that cause the ick, but rather the exploitation and power dynamics involved.
Edit: I guess what I’m arguing is that in practice, we see the term “pedophilia” used as an umbrella term that encompasses pedophilia, hebephilia and ephebophilia, and I think that is a reasonable use of the term. It does muddy the waters a tad, given that pedophilia does still have its more specific use of referring to sexual attraction to pre-pubescent children, but I don’t think that’s an issue in the majority of contexts. When it comes to the law, an adult having sex with a child is just as illegal as an adult having sex with a 15-year-old. Sure, we can split this hair and distinguish between the terms, but we don’t need to.
I mean, yeah, but my point is, a computer would know the difference and not have the emotional attachment to it. If it can associate a person with a wrong thing (even when the two are morally equivalent), then it can also associate a person with a wrong thing that is not morally equivalent. And that’s the problem.
This should show people how unreliable AI is.