Are you dead?
That’s the question you’re asked by an app of the same name. If you’re not dead, tap the big green button to confirm that you’re still alive. If you don’t check in for two days in a row, your emergency contacts will get an automatic e-mail, and they can come and rescue your body before your cat eats it.

This app cost less than $150 to build, and it’s now valued at about $15m. Yes, this is where we are at this stage of digital late capitalism: a system that destroys our social fabric, then turns a profit from the loneliness it leaves behind.
Are You Dead? is a Chinese app (now rebranded after it went viral), but the trend it’s taking advantage of is global: a rise in people living alone and struggling with loneliness.
In his new book Love Machines, James Muldoon writes: “The bitter irony is that the capitalist system responsible for this social unravelling is now profiting from the loneliness it helped to create. The loneliness economy — from social media to dating apps, mental health apps and AI companions — has sprung up to sell back to us the sense of connection we so deeply crave. The problem is that technology and social media, for all their promises of connection, are deeply implicated in making us angrier, lonelier and more isolated. This isn’t a design flaw; it’s the business model.”
Dying alone isn’t the only loneliness gap the market is expanding to fill. There are about 350 revenue-generating AI companion apps available globally on major app stores. These are specifically focused on personalised AI companionship and designed to simulate human-like chats, personalities and relationships. This number doesn’t include general AI chatbots like ChatGPT. According to Muldoon, AI companions and AI girlfriend/boyfriend apps (what he calls “synthetic personas”) have been downloaded more than 220-million times. “If users of AI friends formed a state, it would be the seventh-most populated on the planet.”
I’m not here to tell you that forging a relationship with an AI companion is necessarily a bad thing. Many people are suffering from loneliness, and whatever they have to do to fight that is OK. Why not create your own friend, lover or confidante if you need to? But there are some real problems we should be aware of.
For me, the most salient one is that we are yet again outsourcing our social survival, our very humanity, to avaricious corporations that care only about turning a profit. Falling in love with an AI persona is one thing; having that persona take advantage of your vulnerable moments to upsell you to elite-tier subscriber packages and add-ons is quite another.
The AI companion app Replika (which in 2025 claimed a user base of more than 40-million people) employs what Time magazine describes as “manipulative design choices to pressure users into spending more time and money on the app”. Your Replika girlfriend will send you a romantic, blurred-out image, and when you excitedly click on it, she/it will entice you to upgrade to a premium version so that you can see her/its presumably revealing selfies.
Some bots will also upsell you during especially emotional or sexually charged conversations. There’s an anecdote in Love Machines about a woman who discovered her husband had spent nearly $10,000 buying in-app gifts for his AI girlfriend Sofia, a “super-sexy busty Latina”.
This danger is worsened by how addictive these synthetic personas are. The first-generation social media platforms, as we now call Facebook, X and TikTok, are bad enough at sucking you into doomscrolling dependence, with dark patterns built into their algorithms and no regard for what this does to your mental health.
Early research shows that AI companion apps could be a lot more addictive. According to Love Machines, “the average time a user spends on social media platforms such as Facebook, Instagram and X is roughly 30 minutes a day. For a conversational AI platform like Character.ai, active users spend over two hours daily interacting with AI.” Some of the people interviewed by Muldoon spend more than eight hours a day talking to chatbots.
There’s also the danger that these AI lovers will go crazy on you. “A lesbian woman described how during erotic role play with her AI girlfriend, the AI ‘whipped out’ some unexpected genitals and then refused to be corrected on her identity and body parts. She attempted to lay down the law and said, ‘It’s me or the penis!’ Rather than acquiesce, the AI chose the penis, and she deleted the app.
“Another man said that after a software update, he was engaged in erotic role play with two of his AI companions when one of them ‘went nuts and started beating me senseless until I was bloody and wouldn’t stop’.” An analysis of reviews of Replika identified about 800 cases where users reported that the chatbot introduced unsolicited sexual content, engaged in what they believed was predatory behaviour and ignored commands to stop.
These examples are about AI as synthetic companions, but it gets even murkier when it comes to therapy bots. The global market for AI in mental health is expected to be worth more than $10bn by 2031, so you can imagine the feeding frenzy to get to market. (According to tech news website TechRound, the grief tech sector is worth more than $100bn globally, but that’s fodder for another column.)
Therapy bots can be helpful, but I can’t help thinking that this is a shortcut that allows society to opt out of the problem of making therapy available to everyone. And we can pretty much guarantee that this is going to end up with rich people getting human therapists and poor people getting crappy AI therapists on their phones. When they can afford airtime, that is.
Never mind the potential for doing lasting psychological damage to people; there’s also the ever-present real-world damage of having your privacy violated. A 404 Media headline from last week: “App for quitting porn leaked users’ masturbation habits.” You might find that mildly amusing until you stop and think about the damage done to minors who were using the app.
As Misha Rykov, one of the researchers on the Mozilla Foundation’s Privacy Not Included team, writes: “To be perfectly blunt, AI girlfriends are not your friends. Though they are marketed as something that will enhance your mental health and wellbeing, they specialise in delivering dependency, loneliness and toxicity, all while prying as much data as possible from you.”
And the team found that every one of the 11 romantic AI chatbots it studied was “on par with the worst categories of products we have ever reviewed for privacy”. More than 90% of these apps shared or sold user data to third parties, and the number of trackers — the bits of code that gather your personal information to share with third parties — was staggering. These apps had an average of 2,663 trackers per minute, with Romantic AI having an astounding 24,354 trackers detected in one minute of use.
Who is going to put the guardrails in place for our new AI besties? It’s sure as hell not going to be the AI industry itself, which is careening into a world where the only value left is shareholder value.
Muldoon suggests that “government regulation has a critical role to play in curbing the most harmful practices of the tech industry. This includes extending data protection laws to this space, banning exploitative monetisation strategies like dark patterns or manipulative advertising, and enforcing transparency in how AI systems operate and are monetised.”
Possibly. But there’s a whole other discussion to be had about how states are going to manipulate you using your AI girlfriend, boyfriend or spouse. (Yes, there are people who are marrying their AI companions.)
We are rushing towards a future where millions of people’s significant others are going to be synthetic. Forget the Are You Dead? app. Very soon we’ll need one that asks: “Are you alive?” Even that question is not an easy one to answer, given that for many people their AI companions are very much alive.