UK regulators reject social media ban for kids, write open letter to YouTube, Facebook and Instagram: ‘Tell us how…’

UK regulators have officially rejected the proposal for a total social media ban for children under 16. Instead, they are putting tech giants in the spotlight, demanding that platforms like YouTube, TikTok, Facebook and Instagram demonstrate exactly how they are keeping young users safe. The UK's online safety watchdogs – Ofcom and the Information Commissioner's Office (ICO) – sent a joint "open letter" to the world's biggest social media companies, giving them until April 30 to report on their progress.

"Our message to platforms is simple: act today to keep children safe online. There's now modern technology at your fingertips, so there is no excuse not to have effective age assurance measures in place. Platforms need to be ready to demonstrate what they're doing to keep underage children out and safeguard those children that are old enough to access their services," said Paul Arnold, ICO Chief Executive Officer.

The move comes after British lawmakers voted against a proposed blanket ban earlier this month. Rather than cutting children off from the internet, regulators are now focusing on "stricter enforcement" of existing safety laws.

Read the open letter sent by ICO CEO Paul Arnold to Big Tech

To social media and video sharing platforms operating in the UK,

Today, the ICO is issuing a call for you to urgently review and strengthen your age assurance measures to prevent underage children accessing your services. New, viable age assurance technologies are now readily available. You should act now to implement them to enforce your own minimum age requirements.

Our Children's code strategy work has identified that many services set a minimum age of 13 but rely on self-declaration as the primary method to enforce it. As self-declaration is easily circumvented, this means underage children can easily access services that have not been designed for them. This puts under-13s at risk by allowing their information to be collected and used unlawfully, without the protections they are entitled to.

Age assurance technologies have rapidly advanced in recent years, creating new viable and privacy-friendly solutions that can enable you to much more accurately identify if children are 13 or over before they are able to access your service. We are concerned that services have not yet implemented the technology that is now available to protect young children. With growing concerns about the risks to children even older than 13 accessing services like yours, now is the time to act. You should be making full use of the current viable technologies to prevent under-13s – who you already recognise are too young to be on your services – from gaining access.

Minimum age obligations

Where services have set a minimum age – such as 13 – they generally have no lawful basis for processing the personal data of children under that age on their service. If your service is not suitable for children under a minimum age set out in your terms of service, you should therefore prevent access to children under your minimum age by implementing an effective age gate.

Given the advances in age assurance technologies, we expect services to be making use of current viable technologies – examples include, but are not limited to, facial age estimation, digital ID, or one-time photo matching – when enforcing minimum age requirements. We understand that most services are relying on self-declaration to identify whether children are 13 or over, with a limited number also utilising some form of profiling to enforce minimum age requirements. As currently deployed, we don't think that these tools are effective, and therefore they should not continue to be relied upon to prevent access to under-13s.

Any age assurance technology you choose must comply with data protection law. It must be lawful, fair, proportionate, secure, collect the minimum data necessary, and be clearly explained to users in an age-appropriate way. We have published more information about our position on age assurance in our Opinion.

Action for industry

With ever-growing public concern, the status quo is not working and industry must do more to protect children. You should act now to identify and implement current viable technologies to prevent children under your minimum age from accessing your service.

We expect industry to take urgent steps to meet this call to action, and we will be monitoring practices to decide whether further regulatory action is necessary. We have started direct engagement with some of the highest-risk services and expect them to work directly with us to strengthen their age assurance measures over the next two months. The ICO expects full cooperation during engagement.

Now is the time to act to keep children safe online.

Paul Arnold
Chief Executive Officer
Information Commissioner's Office
