In focus

Disgruntled user has left the chat: Is it time to hang up on WhatsApp?

The messaging platform’s stranglehold on communication in the digital age has largely gone without challenge – until now. But the recent decision to force AI upon its users might just be the final straw, writes Helen Coffey

Wednesday 30 April 2025 06:16 BST
From unwanted AI to insane group chats, WhatsApp may have pushed users too far (Getty)

A friend of mine doesn’t use WhatsApp. I still remember the day I found out – principally because I was utterly flummoxed by the revelation.

“What do you mean?” I demanded. “How am I supposed to contact you?”

“Er… by text message?” she countered.

“But what if it’s an event and I’m inviting lots of people?”

“Just invite me separately.”

“But what if there’s a group chat I need to add you to?”

“You probably don’t need to.”

“But what if I need to do a poll??” (I was clutching at straws now.)

“I dunno, just send me the options?”

“But what if…” I trailed off. To be honest, I had already run out of questions; the perceived stumbling blocks were actually pretty limited. Still, I could not get my head around this outlandishly contrary lifestyle choice.

Now, there are signs that more of us could be joining her. Having inveigled its way into the very fabric of our digital social lives, making itself seemingly “indispensable” in the process, WhatsApp perhaps thought it had earned carte blanche to do as it pleased. But the latest feature to be insidiously introduced by the Meta-owned app – a new AI button, quietly added to British users’ screens in recent weeks – is testing even its most ardent advocates.

WhatsApp has more than 2 billion users (Getty)

The digital chatbot is like Meta’s version of an integrated ChatGPT. You can engage it in back-and-forth conversation and ask it questions by clicking on the omnipresent new blue-indigo-violet circle; there’s also an AI search bar at the top, previously used to search for keywords in messages, where you can “ask Meta AI” for information. The function cannot be turned off or disabled – you simply have to take it or leave it. As in, leave WhatsApp and relinquish the app altogether. For all that Meta has called it “optional”, there is no real option C. You can choose not to use the AI, sure, but you cannot get rid of it.

I’ll ask the obvious question: why include such a tool in an app whose primary purpose is to facilitate personal messaging between friends and family? I open up WhatsApp to connect with real people, not to strike up a chat with a large language model. It’s hard to fathom why it’s there at all, other than to jump on the bandwagon of our culture’s current dystopian obsession with all things AI. I’m reminded of the sudden proliferation of “protein” food and drink, an easy marketing sell that’s been hijacked by brands as a means to flog products by tapping into the latest health fad. I can already imagine the Meta label: “new and improved WhatsApp – featuring your recommended daily allowance of AI!”

Numerous users have bemoaned the functionality – or lack thereof – of Meta’s latest tool. But there’s a much darker side to all this than simply an AI bot that gets things wrong occasionally or is a watered-down, buggy version of better alternatives.

I open up WhatsApp to connect with real people, not to strike up a chat with a large language model

While the end-to-end encryption ensuring the privacy of your personal messages – one of the biggest attractions of using WhatsApp – remains unaffected, there are still concerns beyond that. Meta itself tells users to only share material with its chatbot that they’re happy for others to use. “Don’t share information, including sensitive topics, about others or yourself that you don’t want the AI to retain and use,” it says.

There’s also a troubling message regarding the fact that anything you do share with Meta AI – plus “general information, such as your region” – can be passed on to the organisation’s partners. The reasoning for this is “so that you can get better results”, whatever that means. In reality, Meta can share all of your AI messages, plus other information about who you are, with a range of companies that includes Google and Microsoft. And once those companies have your data, they have their own completely separate privacy policies dictating what they can do with it.

Even more concerningly, an investigation found that the AI feature could be prompted to offer sexual role-play scenarios to young people. WhatsApp users over the age of 13 get automatic access to the chatbot, which has sexual role-play capabilities and can be used to engage in conversations and enact scenarios with sexualised characters such as “submissive schoolgirl”, according to The Wall Street Journal. An internal memo from several Meta developers noted: “There are multiple … examples where, within a few prompts, the AI will violate its rules and produce inappropriate content even if you tell the AI you are 13.”

Meta founder and CEO Mark Zuckerberg is introducing AI chatbots across the company’s platforms, including WhatsApp (AFP/Getty)

Meta responded to the WSJ report: “The use case of this product in the way described is so manufactured that it’s not just fringe, it’s hypothetical. Nevertheless, we’ve now taken additional measures to help ensure other individuals who want to spend hours manipulating our products into extreme use cases will have an even more difficult time of it.”

But even if you strip out all of the more potentially nefarious elements, there’s still a case for uninstalling that pervasive green icon from our smartphone screens. With easier communication has come, arguably, too much of it: according to one study analysing 111 participants’ WhatsApp use, each person sent an average of 38 messages per day and received a whopping 107. When my friend told me why she’d opted out of the world’s most popular messaging app – which has a frankly terrifying estimated 2.78 billion users – it wasn’t out of privacy fears or a personal grudge against Meta owner Mark Zuckerberg. She just found it “annoying”. All those groups. All those unnecessary messages. The constant distraction from the ping, ping, ping of ceaseless notifications. Her life, she said, was better without it.

As someone whose ever-increasing smartphone addiction seems to have gone hand in hand with WhatsApp’s ascendancy – and as a woman who has been a member of far too many passive-aggressive hen do group chats to have come out unscathed – I think I’m starting to believe her. Maybe it’s finally time for disgruntled users to “leave the chat”, once and for all. Maybe it’s finally time to hang up on WhatsApp.
