Data watchdog launches new code to protect children’s privacy online
Social media firms will be required to tighten up their privacy settings for children using their platforms as part of a new code outlined by the data watchdog.
The Information Commissioner’s Office (ICO) today unveiled its age appropriate design code, which sets out 15 standards for online services.
It is aimed at companies responsible for designing, developing or providing online services such as social media, apps, online games and streaming platforms that are likely to be accessed by children.
The new code will require digital services to provide an automatic base level of data protection for children when they download a new app or game or visit a website.
This means privacy settings should be set to high by default and alerts should not be used to encourage children to weaken their settings.
Location settings and profiling — which allows firms to serve up targeted content — should be switched off by default, while data collection and sharing should be minimised, the watchdog said.
“In an age when children learn how to use an iPad before they ride a bike, it is right that organisations designing and developing online services do so with the best interests of children in mind,” said information commissioner Elizabeth Denham. “Children’s privacy must not be traded in the chase for profit.”
The code, which is the first of its kind, is rooted in the General Data Protection Regulation (GDPR) rules rolled out in 2018.
It will be put before parliament for approval, after which companies will have 12 months to update their practices. The code is expected to come into full effect by autumn 2021.
Andy Burrows, head of child safety online policy at the NSPCC, said: “This transformative code will force high-risk social networks to finally take online harm seriously and they will suffer tough consequences if they fail to do so.
“For the first time, tech firms will be legally required to assess their sites for sexual abuse risks, and can no longer serve up harmful self-harm and pro-suicide content.”
But Dom Hallas, executive director of the Coalition for a Digital Economy (Coadec), said the code was a “textbook example of bad regulation that will entrench big companies”.
“The practical impact of the code will be that thousands of tech companies, from e-commerce to maps, have to build multiple versions of the same product with different sets of rules. Startups can’t afford to do this, big tech can.”
It comes amid a wider crackdown on so-called online harms, which will likely force digital firms to take greater responsibility for the safety of people using their platforms.
The government has set out plans for a new internet regulator to enforce rules on online harms, while the Organisation for Economic Cooperation and Development (OECD) is also discussing similar reforms.
Denham said the ICO will work with other bodies in the UK and around the world to ensure the code complements other measures to address online harms.