from Humans Are Free:
Putting children under surveillance and limiting their access to information doesn’t make them safer — in fact, research suggests just the opposite. Unfortunately, those tactics are the ones endorsed by the Kids Online Safety Act of 2022 (KOSA), introduced by Sens. Blumenthal and Blackburn. The bill deserves credit for attempting to improve online data privacy for young people, and for attempting to update 1998’s Children’s Online Privacy Protection Act (COPPA). But its plan to require surveillance and censorship of anyone under sixteen would greatly endanger the rights, and safety, of young people online.
KOSA would require the following:
- A new legal duty for platforms to prevent certain harms: KOSA outlines a wide collection of content that platforms can be sued for if young people encounter it, including “promotion of self-harm, suicide, eating disorders, substance abuse, and other matters that pose a risk to physical and mental health of a minor.”
- A requirement that platforms provide data to researchers
- An elaborate age-verification system, likely run by a third-party provider
- Parental controls, turned on and set to their highest settings, to block or filter a wide array of content
There are numerous concerns with this plan. The parental controls would in effect require a vast number of online platforms to create systems for parents to spy on — and control — the conversations young people are able to have online, and require those systems be turned on by default. It would also likely result in further tracking of all users.
Data collection is a scourge for every internet user, regardless of age.
In order to avoid liability for causing the listed harms, nearly every online platform would hide or remove huge swaths of content. And because each of the listed areas of concern involves significant gray areas, platforms will over-censor in an attempt to steer clear of the new liability risks.
These requirements would be applied far more broadly than the law KOSA hopes to update, COPPA. Whereas COPPA applies to anyone under thirteen, KOSA would apply to anyone under sixteen — an age group that child rights organizations agree has a greater need for privacy and independence than younger teens and kids. And in contrast to COPPA’s age self-verification scheme, KOSA would authorize a federal study of “the most technologically feasible options for developing systems to verify age at the device or operating system level.”
Age verification systems are troubling — requiring such systems could hand over significant power, and private data, to third-party identity verification companies like Clear or ID.me. Additionally, such a system would likely lead platforms to set up elaborate age-verification systems for everyone, meaning that all users would have to submit personal data.
Lastly, KOSA’s incredibly broad definition of a covered platform would include any “commercial software application or electronic service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.”
That would likely encompass everything from Apple’s iMessage and Signal to web browsers, email applications and VPN software, as well as platforms like Facebook and TikTok — platforms with wildly different user bases and uses.
It’s also unclear how deep into the ‘tech stack’ such a requirement would reach – web hosts or domain registries likely aren’t the intended platforms for KOSA, but depending on interpretation, could be subject to its requirements.
And, the bill raises concerns about how providers of end-to-end encrypted messaging platforms like iMessage, Signal, and WhatsApp would interpret their duty to monitor minors’ communications, with the potential that companies will simply compromise encryption to avoid litigation.
Censorship Isn’t The Answer
KOSA would force sites to use filters to block content — filters that we’ve seen, time and time again, fail to properly distinguish “good” speech from “bad” speech. The types of content targeted by KOSA are complex, and often dangerous — but discussing them is not bad by default.
It’s very hard to differentiate between minors discussing these topics in a way that encourages them and discussing them in a way that discourages them. Under this bill, all discussion and viewing of these topics by minors would likely be blocked.
The law requires platforms to ban the potentially infinite category of “other matters that pose a risk to physical and mental health of a minor.”