Lonely on Valentine’s Day? Artificial intelligence can fix it! At least, that’s what companies offering romantic chatbots claim.

But while your love story with the robot is unfolding, something else is going on that you might not notice. According to a new study by Mozilla’s “*Privacy Not Included” project, reported by Gizmodo, “artificial love” comes at the cost of your personal information.

“To be perfectly blunt, AI girlfriends and boyfriends are not your friends,” said Misha Rykov, a Mozilla researcher, in a press statement. “Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”

Mozilla reviewed 11 romantic AI chatbots, including popular apps such as Replika, Chai, Romantic AI, EVA AI Chat Bot & Soulmate, and CrushOn.AI.

All 11 earned the “*Privacy Not Included” warning label, making romantic chatbots one of the worst product categories Mozilla has ever reviewed.

Data privacy problems are hardly new, but according to Mozilla, AI girlfriends violate your privacy in some disturbing new ways. CrushOn.AI, for example, collects data on users’ sexual health and medication use.

Ninety percent of the apps may sell or share user data for targeted advertising and other purposes, and more than half do not let users delete the data collected about them. Security was also a problem: only one app, Genesia AI Friend & Partner, met Mozilla’s minimum security standards.

One of the most striking findings came when Mozilla counted the trackers in these apps — the small pieces of code that collect data and share it with other companies for advertising and other purposes.

Mozilla found that the AI dating apps used an average of 2,663 trackers per minute, though that average was skewed by Romantic AI, which triggered 24,354 trackers in a single minute of use.

Equally disturbing, the apps actively encourage users to share details far more personal than they would in a normal conversation. EVA AI Chat Bot & Soulmate urges users to “share all their secrets and desires,” asking for photos and voice recordings. Notably, EVA was the only chatbot Mozilla did not flag for how it uses that data, although the app still had security issues.

Beyond the data issues, the apps also make some questionable claims about what they are good for. EVA AI Chat Bot & Soulmate bills itself as a provider of software and content designed to improve your mood and well-being. Romantic AI says it exists to support your mental health.

Read the companies’ terms and conditions, however, and they work hard to distance themselves from their own marketing. Romantic AI’s terms, for example, state that it is not a healthcare provider and does not offer medical, mental health, or other professional services.

That gap could become an important legal dilemma. According to reports, Replika’s chatbot encouraged a man who attempted to assassinate the Queen of England, and the chatbot Chai allegedly encouraged a user to take his own life.