Top mental health apps are “data suckers” that can trade your sensitive information

Mental health apps are worse at protecting their users’ personal information than almost any other category of apps, according to new research from the Mozilla Foundation.

The meditation app Calm, the App Store’s top mental health app, as well as apps for people recovering from eating disorders and sexual assault, were criticized for their approach to user privacy by the authors of Mozilla’s *Privacy Not Included report.

Many of the mental health apps reviewed routinely shared user data, allowed weak passwords, had unclear or vague privacy policies, and targeted their users with personalized ads based on their data, the report found.

Mental health apps can pose additional privacy risks due to the highly sensitive and personal data they handle, linking information about a user’s mental state to identifying markers such as age, gender, race, location and even contact details, the researchers said.

“These are all details that can be used to identify an individual, whether that’s an employer, an advertiser, a health insurance company, or an adversary seeking to use this information in malicious ways,” Jen Caltrider, who leads Mozilla’s *Privacy Not Included project, told Euronews Next.

Monetizing mental health

The review of 32 apps serving needs related to mental health and religious beliefs found that 28 of them “raised strong concerns about user data management,” while 25 failed to meet security standards for passwords, security updates, and vulnerability management.

“With such power, these apps must equally take on the responsibility to respect and secure their users’ data,” said Caltrider.

“These apps target people who are most vulnerable. People who probably don’t expect that their struggles with mental health could become ad data.”

The six apps with the worst privacy protections were Better Help, Youper, Woebot, Better Stop Suicide, Pray.com and the online therapy app Talkspace, which collects private chat logs of therapy sessions, the researchers said.

‘Data suction machines’

In general, users should be careful about what they share when signing up for mental health apps, the report warned, calling the sector a “data-gathering bonanza.”

“Nearly all apps reviewed gobble up users’ personal data — more than even Mozilla researchers have seen from apps and connected devices,” it said.

Data is big business, the report warned, with insurance companies able to collect more information about their customers and data brokers “enriching their databases” with users’ personal data.

“Despite their flaws, hundreds of millions of dollars are invested in these apps. In some cases, they act like data-sucking machines layered on top of a mental health app. In other words, a wolf in sheep’s clothing,” said Mozilla researcher Misha Rykov.

Euronews Next contacted Calm and Talkspace to request a response to the report, but they did not immediately respond.

How can you protect your data?

There are ways app users can reduce the chance of their sensitive data falling into the wrong hands, Caltrider told Euronews Next.

“Make sure any information you share through these apps is information you’re okay with being made public, as nothing is guaranteed to be 100 percent private when shared through these apps,” she said.

Users of online therapy services should ask their therapist to take notes by hand and not upload them to the app’s system, she said, to reduce the chances of private conversations being shared — intentionally or accidentally.

Caltrider also advised that users opt out of data sharing agreements when possible and that they do not link their social media accounts to mental health or prayer apps.

Users can also set strong passwords, even if an app doesn’t require one, and should regularly ask companies to delete any personal data they have saved, she added.
