
Analysis of 200 education dept-endorsed school apps finds most are selling BS when it comes to the privacy of children’s data

Analysis of almost 200 school-endorsed apps found that most start harvesting children’s data within seconds, in contravention of the developers’ own privacy policies, leaving underage users exposed to significant privacy and security risks.

The findings by UNSW researchers come from an audit of around 200 Android educational apps sourced from school recommendation lists, state Department of Education websites, and the Google Play Store.

The results were presented in the paper “Analysing Privacy Risks in Children’s Educational Apps in Australia,” authored by Dr Rahat Masood, a cyber security expert at UNSW, and her colleagues Sicheng Jin, Jung-Sook Lee and Hye-Young (Helen) Paik.

The research team found that many of the apps collected sensitive data, transmitted it to third parties, and hid behind privacy policies so complex that very few parents can understand them.

Dr Masood said the team wanted to analyse whether Australia’s federal government and state education departments are aware of the security and privacy risks children face as teaching goes digital and relies on tech suppliers.

Illusion of safety

What quickly became apparent is that tech platforms are driving a truck through the privacy of students while claiming to be safer for underage users. Apps marketed to young children – using terms such as “Kids,” “Preschool,” or “ABC” – were often no safer than general-audience apps, and in some instances showed worse alignment between their stated privacy commitments and their actual behaviour.

The research paper described this as “the illusion of safety” – child-centric branding cultivates parental trust without providing genuine protection.

A staggering 76% of apps targeted at children showed at least one form of policy distortion, compared with 67% of general educational titles.

The researchers found apps carrying child-friendly names often embedded the same advertising and analytics tools found in commercial entertainment apps, including the same tools used to track adults using the internet.

API vulnerabilities

They also found significant security concerns.

Almost 80% of apps contained “hard-coded secrets” – API (Application Programming Interface) keys and credentials embedded directly in the app’s code in a way that could be accessed by anyone who decompiled the application.

“Hard-coded secrets mean that if you configure an API, you have a password or passphrase and the API key is hard-coded within the code,” Dr Masood said.

“Anyone can access it and do whatever they want with the API. It is not a good practice from a development point of view.”
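A decompiled Android app is essentially searchable text, which is why embedded credentials are so easy to recover. The paper doesn’t describe the researchers’ exact tooling, but as an illustrative sketch, a minimal scanner for key-like strings in decompiled source could look like this (the patterns below are examples of well-known credential formats, not an exhaustive list):

```python
import re

# Illustrative credential patterns: Google API keys begin with "AIza",
# AWS access key IDs with "AKIA"; the generic rule catches assignments
# such as apiKey = "..." regardless of service.
SECRET_PATTERNS = {
    "google_api_key": re.compile(r"AIza[0-9A-Za-z_-]{35}"),
    "aws_access_key_id": re.compile(r"AKIA[0-9A-Z]{16}"),
    "generic_assignment": re.compile(
        r"(?i)(api[_-]?key|secret|passwd|password)\s*[=:]\s*[\"'][^\"']{8,}[\"']"
    ),
}

def scan_for_secrets(source_text: str) -> list[tuple[str, str]]:
    """Return (pattern_name, matched_text) pairs found in decompiled source."""
    hits = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.finditer(source_text):
            hits.append((name, match.group(0)))
    return hits
```

Real secret-scanning tools add entropy checks and far larger pattern sets, but even this crude approach shows why shipping keys inside an APK offers no protection.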

Their analysis found that 89.3% of apps began transmitting data to third parties before a user had interacted with the app at all. Opening an app was enough to send device identifiers, location metadata, and other sensitive information to analytics platforms and advertising networks.

“Even if you are not interacting with the app – you just open it and that’s it – it is still transferring lots of data,” Dr Masood said.

“Telemetry data mainly refers to tracker-related identifiers and is used for the automatic collection and transmission of data to remote servers. Despite just opening the app and not using any educational feature, it is still transferring a lot of information that is sensitive and can actually identify your device.”

Report coauthor Dr Rahat Masood
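Flagging this kind of “idle telemetry” amounts to intercepting an app’s network traffic at launch and checking which requests go out before the user touches anything. As a hedged sketch (the domains below are examples of well-known analytics endpoints, not the paper’s actual tracker list), the classification step might look like:

```python
# Example tracker domains; real audits use curated lists such as Exodus.
TRACKER_DOMAINS = {
    "app-measurement.com",   # Firebase / Google Analytics for apps
    "graph.facebook.com",    # Facebook SDK
    "unity3d.com",           # Unity Analytics endpoints
}

def idle_telemetry(requests, first_interaction_ts):
    """Return tracker requests sent before the user's first interaction.

    `requests` is an iterable of (timestamp, host) pairs captured from an
    intercepting proxy while the freshly launched app sat untouched.
    """
    return [
        (ts, host)
        for ts, host in requests
        if ts < first_interaction_ts
        and any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS)
    ]
```

Anything this function returns is data the app chose to send before the child did a single thing – the behaviour the study observed in 89.3% of apps.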

The research findings also sit in contrast to the government’s ban on children under 16 using social media amid concerns that tech companies target young people.

Australia’s privacy commissioner flagged concerns about privacy and safety during the trial period for the ban, but the issues she raised were largely ignored in the final report.

The Office of the Australian Information Commissioner (OAIC) told the organisers of the Age Assurance Technology Trial (AATT), which preceded the under-16s ban, that their reports used inflated privacy language that couldn’t be supported by the trial’s own methodology.  The OAIC noted that a comprehensive privacy assessment against the Privacy Act had not been conducted as part of the trial, despite being proposed in the evaluation proposal.

Feeding Facebook

That broad interpretation of privacy appears to also apply to assessments of government-endorsed apps for school kids.

The UNSW researchers found that 83.6% of apps checked transmit persistent identifiers – unique codes that can track a device across sessions and across different apps. More than two-thirds (67.9%) of the apps contained at least one embedded tracker or analytics tool, such as Firebase, Facebook SDK, or Unity Analytics.

Dr Masood noted that “none of these are needed to actually run the educational app.”

The research team also analysed the privacy policies of the apps and found that just 3% were “fairly easy” to read. The other 97% required university-level literacy or higher to grasp their meaning.
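Readability findings like this are typically produced with standard formulas. The paper’s exact metric isn’t specified here, but a common proxy is the Flesch Reading Ease score, sketched below with a deliberately naive syllable counter (scores below about 50 indicate college-level text; 30 and below is “very difficult”):

```python
import re

def count_syllables(word: str) -> int:
    """Crude syllable estimate: count runs of vowels, minimum one per word."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease: 206.835 - 1.015*(words/sentence) - 84.6*(syllables/word)."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    n_syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (n_syllables / n_words)
```

Run over a typical privacy policy, long sentences packed with polysyllabic legal jargon drive the score down sharply, which is consistent with only 3% of the audited policies rating as “fairly easy” to read.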

“Nobody will understand these terminologies and jargon,” she said.

“Comprehension, readability, understandability – all these metrics that we analysed were all very bad.”

On top of that, the legal text often doesn’t reflect what the app actually does. Just a quarter of the apps examined – i.e., about 50 – were fully consistent between their stated privacy policy and their observed behaviour during testing.

“We matched the privacy policy with the dynamic analysis – when the app is running, whether it is collecting the data and whether it is mentioned in the privacy policy or not,” Dr Masood said.

“Only one in four were matching. Some of the policies appear to have been generated using AI tools.”

One app listed in its store description as “Data Not Collected” was observed initialising Firebase analytics and transmitting persistent identifiers from the moment it first launched. Another that claimed “no ads, no tracking” was found to be sending data to Unity Analytics and Google before a user had done anything.

Crackdown needed

Dr Masood said the problem starts with each state’s Department of Education drawing up its recommended list of apps for educators.

“They look at very high-level details and they don’t download the app – they don’t do the dynamic analysis, they don’t go through the accessibility and readability of the privacy policies,” she said.

Schools are told the apps were assessed through a quality assurance framework, but she said that framework is inadequate. Teachers are largely unaware of the risks embedded in these tools, while parents assume that if an app has been approved, it is safe.

“They [teachers] are out of resources – first of all – and they don’t know about any security issues. They were just given an app to use and that’s it,” she said.

Dr Masood and her colleagues believe a “traffic light” system would be a better solution as a visual summary of an app’s privacy and security profile, bypassing the legal jargon.
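The researchers’ actual rating scheme isn’t detailed in this article, but a hypothetical version of such a traffic-light summary could simply aggregate the audit signals discussed above (all function names and thresholds here are assumptions for illustration):

```python
def traffic_light(has_hardcoded_secrets: bool,
                  idle_telemetry_hits: int,
                  tracker_count: int,
                  policy_consistent: bool) -> str:
    """Hypothetical aggregation of audit findings into a red/amber/green label.

    This is not the researchers' actual scoring scheme, just an illustration
    of how technical findings could bypass the legal jargon for parents.
    """
    issues = sum([
        has_hardcoded_secrets,
        idle_telemetry_hits > 0,
        tracker_count > 0,
        not policy_consistent,
    ])
    if issues == 0:
        return "green"
    return "amber" if issues == 1 else "red"
```

The appeal of such a scheme is that each input is mechanically measurable, so the label reflects observed behaviour rather than a policy’s promises.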

Their research calls for stricter oversight of the “child-directed” app category, arguing that labels such as “Kids” or “Educational” should have a verified technical baseline, rather than being used as a content descriptor.

They also want regulators to prohibit “idle telemetry” – transmitting data before a user has done anything.

The project was funded by the UNSW Australian Human Rights Institute.
