Ofcom Issues Statement on ‘Age Checks to Protect Children Online’
LONDON – UK regulator Ofcom, the organization responsible for enforcing the UK’s “Online Safety Act” (OSA), released a statement Thursday on “age checks to protect children online,” calling the decisions reflected in the statement “the next step in Ofcom implementing the Online Safety Act and creating a safer life online for people in the UK, particularly children.”
The statement follows Ofcom’s first major policy statement on its new online safety rules, published last month.
“For too long, many online services which allow porn and other harmful material have ignored the fact that children are accessing their services,” asserted Ofcom Chief Executive Melanie Dawes. “Either they don’t ask or, when they do, the checks are minimal and easy to avoid. That means companies have effectively been treating all users as if they’re adults, leaving children potentially exposed to porn and other types of harmful content. Today, this starts to change.”
Dawes added that as “age checks start to roll out in the coming months, adults will start to notice a difference in how they access certain online services.”
“Services which host their own pornography must start to introduce age checks immediately, while other user-to-user services – including social media – which allow pornography and certain other types of content harmful to children will have to follow suit by July at the latest,” Dawes said.
Dawes said Ofcom will be “monitoring the response from industry closely.”
“Those companies that fail to meet these new requirements can expect to face enforcement action from Ofcom,” Dawes warned.
In the statement, Ofcom sought to address two central questions: “What are online services required to do, and by when?” and “What does highly effective age assurance mean?”
Noting that the OSA “divides online services into different categories with distinct routes to implement age checks,” Ofcom said “the action we expect all of them to take starts from today.”
“All user-to-user and search services – defined as ‘Part 3’ services – in scope of the Act, must carry out a children’s access assessment to establish if their service – or part of their service – is likely to be accessed by children,” Ofcom said in the statement. “From today, these services have three months to complete their children’s access assessments, in line with our guidance, with a final deadline of 16 April. Unless they are already using highly effective age assurance and can evidence this, we anticipate that most of these services will need to conclude that they are likely to be accessed by children within the meaning of the Act. Services that fall into this category must comply with the children’s risk assessment duties and the children’s safety duties.”
Ofcom said it will publish its “Protection of Children Codes and children’s risk assessment guidance” in April, meaning that “services that are likely to be accessed by children will need to conduct a children’s risk assessment by July 2025 – that is, within three months” of the publication of those codes.
“Following this, they will need to implement measures to protect children on their services, in line with our Protection of Children Codes to address the risks of harm identified,” Ofcom added. “These measures may include introducing age checks to determine which of their users are under-18 and protect them from harmful content.”
Ofcom added that services which “allow pornography must introduce processes to check the age of users: all services which allow pornography must have highly effective age assurance processes in place by July 2025 at the latest to protect children from encountering it.”
As Ofcom noted in its statement, the OSA “imposes different deadlines on different types of providers.”
“Services that publish their own pornographic content (defined as ‘Part 5’ services) including certain Generative AI tools, must begin taking steps immediately to introduce robust age checks, in line with our published guidance,” Ofcom said. “Services that allow user-generated pornographic content – which fall under ‘Part 3’ services – must have fully implemented age checks by July.”
As to “highly effective age assurance,” Ofcom summarized its “final position” on what satisfies its definition of the term as follows:
- confirms that any age-checking methods deployed by services must be technically accurate, robust, reliable and fair in order to be considered highly effective;
- sets out a non-exhaustive list of methods that we consider are capable of being highly effective, including: open banking, photo ID matching, facial age estimation, mobile network operator age checks, credit card checks, digital identity services and email-based age estimation;
- confirms that methods including self-declaration of age and online payments which don’t require a person to be 18 are not highly effective;
- stipulates that pornographic content must not be visible to users before, or during, the process of completing an age check. Nor should services host or permit content that directs or encourages users to attempt to circumvent an age assurance process; and
- sets expectations that sites and apps consider the interests of all users when implementing age assurance – affording strong protection to children, while taking care that privacy rights are respected and adults can still access legal pornography.
Ofcom said “this approach will secure the best outcomes for the protection of children online in the early years of the Act being in force.”
“While we have decided not to introduce numerical thresholds for highly effective age assurance at this stage (e.g. 99% accuracy), we acknowledge that numerical thresholds may complement our four criteria in the future, pending further developments in testing methodologies, industry standards, and independent research,” Ofcom added.
The full text of the Ofcom statement “Age checks to protect children online,” along with Ofcom’s “Quick guide to children’s risk assessments,” is available on Ofcom’s website.