Android users, beware! Several popular mental health apps, with a combined 14.7 million downloads on the Google Play Store, may be putting your data at risk. According to research by Oversecured (via Bleeping Computer), multiple mental health apps, including AI-based therapy chatbots, contain vulnerabilities capable of exposing private therapy conversations, mood logs and medical details. In one case, researchers found more than 85 medium- and high-risk security flaws in a single app. According to the report, some of these apps – with millions of downloads globally – claim to offer privacy and encryption on the vendor’s servers. “Mental health data carries unique risks. On the dark web, therapy records sell for $1,000 or more per record, far more than credit card numbers,” says Sergey Toshin, founder of mobile security company Oversecured.
A total of 1,575 security flaws detected
In its research, Oversecured scanned ten mobile apps advertised as tools for various mental health problems. The organisation uncovered a total of 1,575 security vulnerabilities: 54 rated high-severity, 538 medium-severity, and 983 low-severity. Though none is rated critical, these vulnerabilities can be exploited to intercept login credentials, spoof notifications, inject HTML, or locate the user.
“These apps collect and store some of the most sensitive personal data in mobile: therapy session transcripts, mood logs, medication schedules, self-harm indicators, and in some cases, information protected under HIPAA,” the researchers note.
How the flaws can be misused
According to the report, some apps improperly handle links and commands from outside sources. This could allow attackers to reach internal parts of the app that are not meant to be exposed, including areas that handle login tokens or session data. In simple terms, a hacker could trick an app into opening protected sections and gain access to therapy records.

One therapy app with over a million downloads allegedly calls Intent.parseUri() on an externally controlled string and launches the resulting intent without validating the target component. This allows an attacker to force the app to open any internal activity, even one not intended for external access. “Since these internal activities often handle authentication tokens and session data, exploitation could give an attacker access to a user’s therapy records,” the Oversecured researchers say.

Other apps were found storing sensitive information locally in ways that any app on the phone could read, potentially exposing CBT session notes, mood scores, and personal journal entries. Researchers also found unprotected configuration data, such as backend server addresses, and the use of weak random number generators for security keys.

Many of the apps also lack basic protections such as root detection, the research found. This means that on a rooted phone, other apps could freely access stored health data.
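The validation step the vulnerable app reportedly skips can be sketched in plain Java. This is a hypothetical illustration, not code from any of the scanned apps: it uses java.net.URI instead of Android’s Intent class so it runs anywhere, and the scheme and allowlist names are invented. The principle is the same one Oversecured describes – an externally supplied link should only ever reach an explicit allowlist of screens, never arbitrary internal activities.

```java
import java.net.URI;
import java.util.Set;

// Hypothetical sketch: check an externally supplied deep link against an
// allowlist before dispatching it. The scheme "myapp" and the host names
// are illustrative assumptions.
public class DeepLinkValidator {
    // Only these destinations may be opened from an external link; screens
    // that handle tokens or session data are deliberately absent.
    private static final Set<String> ALLOWED_HOSTS = Set.of("home", "articles");

    public static boolean isAllowed(String rawUri) {
        try {
            URI uri = URI.create(rawUri);
            String host = uri.getHost();
            // Reject foreign schemes, unparseable authorities, and any
            // destination outside the allowlist.
            return "myapp".equals(uri.getScheme())
                    && host != null
                    && ALLOWED_HOSTS.contains(host);
        } catch (IllegalArgumentException e) {
            return false; // malformed input is rejected outright
        }
    }

    public static void main(String[] args) {
        System.out.println(isAllowed("myapp://home"));             // true
        System.out.println(isAllowed("myapp://tokens"));           // false
        System.out.println(isAllowed("intent://evil#Intent;end")); // false
    }
}
```

In an Android app the same check would run on the string before it is ever passed to Intent.parseUri(), so that a crafted link can never resolve to a token-handling activity.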
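The weak-RNG finding also has a standard fix worth illustrating. The sketch below is a minimal example of the correct pattern, not code from any scanned app: key material should come from a cryptographically secure source such as java.security.SecureRandom, whereas java.util.Random becomes fully predictable once its 48-bit seed is recovered.

```java
import java.security.SecureRandom;

// Minimal sketch of generating key material from a CSPRNG. Using
// java.util.Random here instead would be the kind of weakness the
// Oversecured report describes.
public class KeyGen {
    public static byte[] newKey(int lengthBytes) {
        byte[] key = new byte[lengthBytes];
        new SecureRandom().nextBytes(key); // seeded from the OS entropy source
        return key;
    }

    public static void main(String[] args) {
        byte[] key = newKey(32); // e.g. a 256-bit symmetric key
        System.out.println("generated " + key.length + " key bytes");
    }
}
```

On Android, such a key would additionally be kept in the hardware-backed Keystore rather than in app-readable storage, which also addresses the world-readable-files finding above.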
Limited updates raise concerns
In its research, Oversecured also noted that most of these apps still had medium-level problems that weaken overall security. Only four of the 10 apps had been updated recently, while others had not seen updates since late 2025 or even 2024.

The scans were carried out in late January 2026, and researchers said they could not confirm whether the issues have since been fixed.