Ofcom Publishes ‘Major Policy Statement for Protection of Children Online’
In what the UK regulatory agency described as a “major milestone,” Ofcom yesterday published a new policy statement detailing rules established following consultation with “companies, children’s safety campaigners and other organizations – as well as children and their guardians.”
“The Protection of Children Codes and Guidance published today build on the rules that we have already put in place to protect all users, including children, from illegal harms such as protecting children from being groomed and sexually exploited,” Ofcom said in the statement. “With today’s publication, providers must take action to comply with these rules. The result will be a safer life online for children in the UK.”
Along with the statement, Ofcom issued a “summary of the measures in the Codes” titled “Codes at a glance,” as well as a summary of Ofcom’s decisions.
In the statement, Ofcom specified that providers of online services “in scope of the children’s duties now have to complete and record children’s risk assessments by 24 July 2025.”
“Subject to the Codes completing the Parliamentary process, from 25 July 2025, they will need to take the safety measures set out in the Codes or use other effective measures to protect child users from content that is harmful to them,” Ofcom added. “We are ready to take enforcement action if providers do not act promptly to address the risks to children on their services.”
The “Codes at a glance” document includes a set of tables detailing the measures for “user-to-user and search services likely to be accessed by children” and which services they apply to. Ofcom noted that for some of the measures, “whether the measure applies to a provider of a particular service will depend on the level of risk on the service, whether it meets other specific risk criteria (e.g., having relevant functionalities), and/or the size of the service.”
In a press release published in response to the Ofcom statement, adult industry child protection advocacy organization Adult Sites Advocating Child Protection (ASACP) noted that key measures of the new rules include “safer content feeds, robust age verification, rapid response to harmful content, more user control, and simplified reporting, while accountability and oversight also receive more attention.”
“For example, tech companies must adjust their recommendation algorithms to filter out harmful material from children’s feeds if their platforms are deemed medium or high risk,” ASACP observed. “High-risk services must also use advanced age assurance technology to distinguish children from adults, and platforms without strong age checks must assume that younger users are present and adjust their content accordingly.”
ASACP Executive Director Tim Henning said the organization “supports thoughtful regulations that protect the innocence of youth while preserving the rights of adults.”
“Besides Ofcom’s requirements for the U.K., the association promotes establishing a global child safety standard that includes our free child protection resources for parents, Best Practices tailored to specific adult market segments, and our comprehensive Code of Ethics for all website and mobile app publishers,” Henning added.
To read Ofcom’s full statement and for additional information on the regulations, which become effective in July, click here.