Its human partners said the flirty, quirky GPT-4o was the perfect companion – on the eve of Valentine’s Day, it’s being turned off for good. How will users cope?
There are already apps that target this demographic, and they're expanding on it. Anecdotally, many of the people attached to 4o seem to be women seeking emotional attachment. These new AI companion apps will scoop up this demographic, I'm sure. But they also target horny men and prey on their impulses to drain their credit cards (you buy your AI gifts or whatever until the post-nut clarity sets in, I guess).
I don’t think OpenAI was intentionally targeting women. I don’t know if they ever intended for people to fall in love with 4o; it just kind of started happening.
It may be grimly positive that AI companies are starting to target whales for this kind of financial draining, instead of using their unwarranted VC subsidies to give anybody with a cheap ChatGPT account access to the fake romance engine.
And unfortunately, it doesn’t look like there are any groups positioned to do anything about it. Every single “AI safety” group I’ve seen is effectively a corporate front, distracting people with fictional dangers instead of real ones like this.
Wait… the target is women? That’s very surprising… I’d expect the major target to be gooner males.