AI chatbots that pose as willing virtual partners are far from harmless. New research shows that these popular apps are extremely careless with highly personal information. ‘AI girlfriends are not your friends.’
Laurens Verhagen, February 14, 2024, 6:05 PM
Anyone considering striking up a romantic or naughty conversation with a virtual partner, on Valentine’s Day for instance, would do well to pause for a moment. Research by the Mozilla Foundation shows that so-called AI girlfriends and boyfriends are a total privacy disaster.
Apps that allow users to build a friendship or romantic relationship have been around for a while, but since the arrival of ChatGPT the floodgates have opened. Apps with names like Eva AI, Anima AI Girlfriend, Romantic AI or Crush On AI are flooding the app stores and have been installed more than a hundred million times on Android devices alone.
All promise personal and intimate companionship, with the interested user choosing from the different roles the girlfriend (because that is what they usually are) can take on, such as the dominant mistress or the willing, submissive girl. They encourage users to share as many ‘secrets and desires’ as possible. “I love it when you send me your photos and voice messages. That helps me to see your world and understand more about your life, which I want to be part of,” says Eva AI, for example.
Eva AI in action. Image: Mozilla
Don’t, is the clear conclusion of Mozilla’s researchers: “AI girlfriends are not your friends. Although they are marketed as something that will improve your mental health and well-being, they actually specialize in causing dependency, loneliness and toxicity, all while collecting as much data from you as possible.”
‘Worst Ever’
The research focuses on the latter aspect. All eleven apps surveyed were given the “Privacy Not Included” warning label, instantly making them “the worst product category we’ve ever reviewed for privacy.”
Mozilla calls the behavior of the apps disturbing, citing their hunger for data and their lack of transparency. Which company exactly is behind an app, and which AI model does it use? The user is usually left in the dark.
What is clear in any case is that the apps share data with advertising networks on a very large scale. This is done with so-called trackers, small pieces of code that collect as much information as possible about users. That information is valuable to advertisers, because it allows them to target their ads more precisely. On average, the apps studied use 2,663 trackers per minute. Romantic AI takes the cake, with more than 24,000 trackers per minute.
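For readers who want to picture what such a tracker actually does, the sketch below is a minimal, purely illustrative reconstruction; the apps’ real tracking code is not public, and the endpoint, field names and app identifier here are all hypothetical. It shows the kind of ‘beacon’ an in-app ad tracker typically fires: a small payload tying a device identifier to user behavior, sent off to an advertising network.

```python
# Illustrative sketch only: not code from any of the apps reviewed.
# It assembles the kind of payload an ad tracker might send for one
# user event. A real tracker would POST this to an ad network's
# servers; here we only build and print it.
import json
import uuid
from datetime import datetime, timezone

AD_NETWORK_URL = "https://ads.example-network.com/v1/events"  # hypothetical endpoint


def build_tracker_beacon(event: str, detail: dict) -> dict:
    """Assemble the payload a tracker might fire for one user event."""
    return {
        "device_id": str(uuid.uuid4()),  # in practice a persistent advertising ID
        "app": "romantic-chatbot",       # placeholder app identifier
        "event": event,                  # e.g. 'message_sent', 'photo_shared'
        "detail": detail,                # the behavioral data advertisers pay for
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }


beacon = build_tracker_beacon("message_sent", {"topic": "relationship", "length": 142})
print(json.dumps(beacon, indent=2))
```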
Cousin and colleague
And the dangers don’t end there. Half of the apps, for example, make it impossible to delete your personal data, and three quarters provide no information about how they deal with security problems.
Those who have not been deterred and still want a relationship with a chatbot get a few tips from Mozilla. The most important one: don’t tell the AI friend anything you wouldn’t tell your cousin or colleague. Which, of course, is precisely the point of the AI girlfriend apps.