FSC: Ofcom Age Verification Guidance Lacks Specificity, Clarity
LONDON – In feedback submitted yesterday on age assurance guidelines published by Ofcom, the regulatory agency responsible for enforcing the UK’s Online Safety Act (“OSA”), the Free Speech Coalition repeatedly cited a lack of clarity in the guidelines, arguing in places that Ofcom simply hasn’t lived up to its own responsibilities under the OSA.
“Section 82(2)(a) of the Act requires Ofcom to produce ‘examples of kinds and uses of age verification and age estimation that are, or are not, highly effective at correctly determining whether or not a particular user is a child,’” FSC Executive Director Alison Boden noted in the FSC’s comments. “The guidance does not do this.”
Entire sections of the guidance are “quite difficult to understand,” Boden wrote, including the section titled “What does ‘published or displayed by the provider on its internet service’ mean?”
Boden noted that the Ofcom guidance states that “where an entity or individual has control over which content is published or displayed on an internet service, that entity or individual will be treated as the provider of the internet service,” but this provision “seems to contradict the fact that the entity that owns/controls the service (i.e., the public-facing website or application on which content appears) is the only one that can practically implement the guidance, regardless of the degree to which it was actively involved in publishing the content.”
Boden then gave examples of the wide variety of business models and platform types operated by the adult industry, each of which would be subject to the OSA, including subscription websites, content retailers, tube sites, fan platforms, clip marketplaces, live-streaming sites and cam platforms.
With each type of site/business, Boden noted, “an entity or individual that does not control whether and how a provider implements age assurance does have control over which content is published on an internet service, at least in part.”
“I think that simplifying language where possible and incorporating more examples from business models being used in the adult industry would be helpful for clarity,” Boden added.
Boden also explained that in general, the FSC “feels that placing the burden of age assurance on individual websites is detrimental to the law’s goals.”
Boden also pointed out a section of Ofcom’s guidance in which the regulator concedes that it does “not have sufficient evidence as to the effectiveness and potential risks of different age assurance methods to recommend specific metrics for assessing whether or not any given age assurance method or process should be considered highly effective.”
“If Ofcom lacks sufficient evidence to assess age assurance methods, what reason would it have to expect that pornographic service providers are in a position to generate it?” Boden asked. “The guidance seems to assume that providers will be designing and implementing age assurance, which will not be the case for the vast majority of those affected by the law. Platforms that publish pornographic content are not experts in age assurance technologies and cannot be reasonably expected to conduct tests to evaluate their technical accuracy, robustness, reliability, and fairness.”
“Rather than suggesting methods that could be effective, Ofcom needs to assess the options and provide a standard, as well as a list of methods that meet that standard, as required by the Act,” Boden asserted. “We appreciate the level of flexibility offered by the guidance – certainly a few platforms will attempt to build their own age assurance technology – but it seems better targeted to the third parties that will provide age verification services, not the overwhelming proportion of platforms that will be implementing their solutions.”
Other aspects of Ofcom’s guidance, and what compliance with those provisions would demand of service providers, are entirely impracticable, Boden argued. In response to the question “Do you agree with our proposed guidance that providers should consider accessibility and interoperability when implementing age assurance?”, Boden wrote that the “burden Ofcom is placing on individual service providers to ensure that the age assurance method they use is highly effective, extremely secure, and accessible by persons with protected characteristics, but not easily circumvented, and doesn’t ‘unduly prevent adult users from accessing legal content’ is, simply put, impossible to meet.”
Boden further observed that Ofcom’s own guidance notes that “an age assurance method which performs poorly in test conditions will perform worse in a real-world deployment,” and called attention to the fact that the products offered by two large, commercial providers have, in fact, performed poorly in testing.
Boden also pointed to testing conducted by euCONSENT, a consortium of UK-based companies and organizations, which “found that 16% of all adults and 21% of parents they tested were unable to complete the age verification process” under test conditions.
“This is unsurprising in light of news reporting that demonstrated the ‘lengthy, time-consuming process’ of age assurance using Yoti – a task that required 52 separate steps to complete. Ultimately, 42% of the euCONSENT research participants did not rate their experience positively, and 21% rated it negatively.”
The FSC comments also highlighted the potential cost of compliance with the OSA and asserted that the “assessment that this guidance will not unduly affect competition is wholly incorrect.”
“Compliance will be so costly that, in direct contradiction with Ofcom’s duty to promote investment and innovation, small and independent providers will be driven out of business, entrepreneurs and startups will be unable to enter the market, and well-resourced corporations may become further entrenched if they have the means to comply,” Boden wrote.
Ironically, the upshot of implementing the OSA as it is currently envisioned, Boden argued, is likely to be internet users flocking to sites that don’t even bother trying to comply with the law.
“The real winners, from what we’ve seen in the United States, are almost certain to be websites that disregard the laws, are unreachable by the authorities, and present the greatest danger to users,” Boden wrote. “That is, pirate and dark web sites that also have no incentive to police illegal content such as child sex abuse material (CSAM).”
You can read the FSC’s full response here.