Kids Online Safety Act

In 1997, the Supreme Court of the United States unanimously struck down two provisions of the Communications Decency Act, which made "the knowing transmission of obscene or indecent messages to any recipient under 18 years of age" and the "knowing sending or displaying of patently offensive messages in a manner that is available to a person under 18 years of age" felonies[1]. "The level of discourse reaching a mailbox simply cannot be limited to that which would be suitable for a sandbox", the Court observed, quoting its precedents; "the Government may not 'reduce the adult population . . . to . . . only what is fit for children.'"[2]

It's been 25 years, and a bill pending in the Senate shows that this lesson hasn't been learned. The bill is entitled the "Kids Online Safety Act"[3]. The following summarizes and analyzes the Act; numbers and letters in parentheses indicate particular clauses of the bill.

Current Status

Reported Out of Committee with an Amendment

The analysis below relates to the original version of the bill and will be edited to reflect the committee substitute.

Definitions

Anyone 16 or under is a "minor". (2(5))

A "covered platform" is an "online platform" that "is reasonably likely to be used, by a minor." (2(3))

Duty of Care

This Act requires a "covered platform...to act in the best interests of a minor that uses [its] products or services." (3(a))

It must "take reasonable measures...to prevent and mitigate:"

  • "mental health disorders" (3(b)(1))
  • "addiction-like behaviors" (3(b)(2))
  • bullying and harassment (3(b)(3))
  • "sexual exploitation" (3(b)(4))
  • advertising of illegal drugs, gambling, tobacco, and alcohol (3(b)(5))
  • "financial harms" (3(b)(6))

Restrictions

Safeguards for Minors

A covered platform must have "readily accessible and easy-to-use safeguards" for minors and their parents to:

  • stop other people, especially "adults with no relationship to" them, from finding them (4(a)(1)(A))
  • "prevent other users, registered or not from viewing" and "restricting public access to" their personal data (4(a)(1)(B))
  • limit features that make them want to use the covered platform more (4(a)(1)(C))
  • opt out of, or limit recommendations from, algorithmic recommendation systems (4(a)(1)(D))
  • delete their account and all its data (4(a)(1)(E))
  • restrict sharing of their geolocation, and tell them when it's tracked (4(a)(1)(F))
  • limit the time they spend there (4(a)(1)(G))

If a platform "knows or reasonably believes" a user to be a minor, these settings must be "the strongest option available" (4(a)(2)).

Parental Tools

A covered platform must provide "readily accessible and easy-to-use" tools for parents of minors to:

  • "control" their "privacy and account settings" including the above "#Safeguards for Minors" (4(b)(2)(A))
  • restrict their purchases (4(b)(2)(B))
  • track the time they spend (4(b)(2)(C))
  • turn off parental controls (4(b)(2)(D))
  • "allow [them] to address the harms" covered in "#Duty of Care" (4(b)(2)(E))

A minor must be notified when those tools are in effect (4(b)(3)).

If a platform "knows or reasonably believes" a user to be a minor, these tools must be in effect by default (4(b)(4)).

Reporting Mechanisms

A covered platform has to have "a readily accessible and easy-to-use means to submit reports of harms to a minor" (4(c)(1)(A)). It has to "receive and respond to reports in a reasonable and timely manner" (4(c)(2)).

Disclosure

Before a person "a covered platform reasonably believes is a minor" signs up for it, the platform must get the minor and their parents to acknowledge

  • how it looks after their personal data (5(a)(1)(A))
  • how to access the #Restrictions (5(a)(1)(B))
  • whether it poses any "heightened risk of harm" to them (5(a)(1)(C))

The terms and conditions of any covered platform that uses an "algorithmic recommendation system" must describe

  • how the system uses personal data belonging to minors (5(b)(1))
  • how minors and their parents can "opt out or down-rank...recommendations" provided by the system (5(b)(2))

Any "advertising aimed at minors" on a covered platform must

  • tell them why they've been targeted for the ad, including what personal data was used (5(c)(2))
  • explicitly say that it is an ad (5(c)(3))

All these disclosures must be "clear, accessible, and easy-to-understand".

Transparency

Every year, a covered platform has to get an "independent, third-party audit" including "reasonable inspection" (6(a)(1)).