Social media firms asked to toughen up age checks for under-13s

By Laura Cress and Imran Rahman-Jones, Technology reporters

Major technology companies have been asked to bring in more robust age checks for under-13s in the UK, similar to those currently in place for services designed for adults.

The platforms contacted by media regulator Ofcom and data watchdog the Information Commissioner's Office (ICO) are Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox and X. They have been told they should do more to make sure younger children are kept safe online.

Ofcom chief executive Dame Melanie Dawes said services were currently "failing to put children's safety at the heart of their products".

The companies have defended the safeguards they have in place, with YouTube owner Google saying it was surprised by Ofcom's approach and urging it to focus on higher-risk services instead. But both regulators said the social media platforms needed to strengthen their commitment to stopping children under 13 from signing up.

"We're not saying this is a completely blank sheet of paper they need to address, but they have not gone far enough," Dame Melanie told BBC Radio 4's Today programme.

She said she believed Google was "uncomfortable" with Ofcom's call to action. "It's putting the spotlight on them and asking them to account for what they're doing, not through a Silicon Valley press release but on the UK public's terms."

Currently, many platforms rely on people who sign up to self-report their own ages. "As self-declaration is easily circumvented, this means underage children can easily access services that have not been designed for them," the ICO said in an open letter to social media and video platforms.

Most social media platforms have a minimum age limit of 13, but Ofcom research suggests 86% of children aged 10-12 have their own social media profile.
Ofcom wants firms to use "highly effective age checks", which are currently only required by law for certain services that provide over-18 content, such as pornography. Implementing similar checks for younger children's social media accounts would require the big tech firms to bring in more robust measures voluntarily.

The ICO's focus is on the handling of young children's data. "Where services have set a minimum age - such as 13 - they generally have no lawful basis for processing the personal data of children under that age on their service," its letter, from chief executive Paul Arnold, said.

Technology secretary Liz Kendall said no platform would get a "free pass" when it came to protecting children and that Ofcom had her full support in holding the platforms to account. "No company should need a court order to act responsibly to protect children," she added.

What do the tech companies say?

YouTube said it was surprised by Ofcom's "move away from a risk-based approach, particularly given that we routinely
Originally reported by BBC Technology. Published on ABN12.
