“Filter Bubble Transparency Act” Introduced in U.S. Senate
WASHINGTON – Last week, U.S. Senators John Thune (R-S.D.), Marsha Blackburn (R-Tenn.), Richard Blumenthal (D-Conn.) and Jerry Moran (R-Kan.) introduced legislation called the “Filter Bubble Transparency Act” (“FBTA”), which its sponsors say is designed to give consumers more control over how content is presented to them on large internet platforms.
The legislation takes its name from Eli Pariser’s 2011 book, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. According to its sponsors, the FBTA will “make it easier for internet platform users to understand the potential manipulation that exists with secret algorithms and require large-scale platforms to allow those users to consume information outside of that potential manipulation zone or ‘filter bubble.’”
While the FBTA is unlikely to directly affect many adult industry platforms because of exemptions written into the bill (detailed in the next paragraph), it is bound to impact members of the adult industry as social media users, if not as platform owners and operators.
Under its current text, the FBTA would not apply to platforms which are “wholly owned, controlled, and operated by a person that for the most recent 6-month period, did not employ more than 500 employees; for the most recent 3-year period, averaged less than $50,000,000 in annual gross receipts; collects or processes on an annual basis the personal data of less than 1,000,000 individuals; or is operated for the sole purpose of conducting research that is not made for profit either directly or indirectly.”
For platforms covered by the FBTA, it would be “unlawful for any person to operate a covered internet platform that uses an opaque algorithm unless… the person provides notice to users of the platform that the platform uses an opaque algorithm that makes inferences based on user-specific data to select the content the user sees.”
The legislation requires the notice advising users of the “opaque algorithm” to be “presented in a clear, conspicuous manner on the platform whenever the user interacts with an opaque algorithm for the first time” and specifies that the notice “may be a one-time notice that can be dismissed by the user.”
The Act defines the term “opaque algorithm” as “an algorithmic ranking system that determines the order or manner that information is furnished to a user on a covered internet platform based, in whole or part, on user-specific data that was not expressly provided by the user to the platform for such purpose.”
The FBTA also includes an exception that pertains to “age-appropriate content filters,” such that the term “opaque algorithm” does not include “an algorithmic ranking system used by a covered internet platform if the only user-specific data (including inferences about the user) that the system uses is information relating to the age of the user; and such information is only used to restrict a user’s access to content on the basis that the individual is not old enough to access such content.”
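To make these definitions concrete, here is a minimal, hypothetical sketch in Python of how the bill’s distinctions might map onto feed-ranking code. Nothing here comes from the bill itself or any real platform; every name (Post, User, age_gate, transparent_feed, opaque_feed, inferred_interests) is invented for illustration, under the assumption that “expressly provided” data means things like the accounts a user chose to follow, while inferred interest scores derived from browsing behavior would not qualify.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    timestamp: int            # seconds since epoch
    adult_content: bool = False

@dataclass
class User:
    age: int
    follows: set = field(default_factory=set)                 # data the user expressly provided
    inferred_interests: dict = field(default_factory=dict)    # scores inferred from behavior

def age_gate(user: User, posts: list) -> list:
    # Likely exempt under the FBTA's carve-out: the only user-specific
    # datum consulted is age, and it is used solely to restrict access
    # to age-inappropriate content.
    return [p for p in posts if not (p.adult_content and user.age < 18)]

def transparent_feed(user: User, posts: list) -> list:
    # Ranks only on data the user expressly provided for this purpose
    # (accounts they chose to follow), newest first -- so the ordering
    # would arguably fall outside the bill's "opaque algorithm" definition.
    return sorted(
        (p for p in posts if p.author in user.follows),
        key=lambda p: p.timestamp,
        reverse=True,
    )

def opaque_feed(user: User, posts: list) -> list:
    # "Opaque" in the bill's sense: the order depends on interest scores
    # the platform inferred from the user's behavior, which the user never
    # expressly supplied to the platform for ranking purposes.
    return sorted(
        posts,
        key=lambda p: user.inferred_interests.get(p.author, 0.0),
        reverse=True,
    )
```

On this reading, a covered platform running something like opaque_feed would have to notify users and, per the sponsors’ framing, let them consume content outside that “filter bubble,” while a platform applying only the age_gate filter would not trigger the requirement.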
A violation of the FBTA would be treated as “a violation of a rule defining an unfair or deceptive act or practice prescribed under section 18(a)(1)(B) of the Federal Trade Commission Act.”
In a press release announcing the legislation, Thune, who serves as the Chairman of the Senate Subcommittee on Communications, Technology, Innovation, and the Internet, said the FBTA is “about transparency and consumer control.”
“For free markets to work as effectively and as efficiently as possible, consumers need as much information as possible, including a better understanding of how internet platforms use artificial intelligence and opaque algorithms to make inferences from the reams of personal data at their fingertips that can be used to affect behavior and influence outcomes,” Thune said. “That’s why I believe consumers should have the option to either view a platform’s opaque algorithm-generated content or its filter bubble-free content, and, at the very least, they deserve to know how large-scale internet platforms are delivering information to their users.”
While the requirement to disclose the use of algorithms likely wouldn’t present a constitutional problem, attorney Larry Walters told YNOT that the bill may have other defects from a First Amendment perspective.
“The First Amendment protects traditional editorial functions such as how a platform arranges or displays content,” Walters said. “To the extent the bill forces large online platforms to display content in a certain manner, at the option of the user, it may be vulnerable to a First Amendment attack.”
Walters added that while FBTA pertains to “cutting-edge Internet law issues,” making it difficult to assess how courts might view the law’s requirements, the bill still “triggers free speech concerns.”
“A platform should retain the right to display lawful user content in any manner it chooses,” Walters said. “There is a role for the FTC to play in preventing consumer deception. However, that concern is typically addressed by requiring conspicuous disclosures as opposed to mandating access to an alternate version of the platform.”