Kids Online Safety Act


In 1997, the Supreme Court of the United States unanimously struck down two provisions of the Communications Decency Act, which had made it a felony to engage in "the knowing transmission of obscene or indecent messages to any recipient under 18 years of age" or the "knowing sending or displaying of patently offensive messages in a manner that is available to a person under 18 years of age".[1] "The level of discourse reaching a mailbox simply cannot be limited to that which would be suitable for a sandbox", the Court observed, quoting its earlier precedents; "the Government may not 'reduce the adult population . . . to . . . only what is fit for children.'"[2]

It's been 25 years, and a bill pending in the Senate shows that Congress hasn't learned this lesson. The bill is entitled the "Kids Online Safety Act".[3] The following summarizes and analyzes the bill; numbers and letters in parentheses refer to its particular clauses.

Current Status

In Committee

Definitions

Anyone under 16 is a "minor". (2(3))

A "covered platform" is "a commercial software application or electronic service that connects to the internet and that is used, or is reasonably likely to be used, by a minor." (2(2))

Duty of Care

"A covered platform has a duty to act in the best interests of a minor that uses the platform's products or services." (3(a))

In particular, this duty includes preventing harm to minors from the following:

  • promotion of self-harm, suicide, and eating disorders (3(b)(1))
  • addiction-like behaviors (3(b)(2))
  • bullying and harassment (3(b)(3))
  • grooming of minors and trafficking of child pornography (3(b)(4))
  • advertising of illegal drugs, gambling, tobacco, and alcohol (3(b)(5))
  • deceptive marketing practices (3(b)(6))

Restrictions

Safeguards for Minors

A covered platform must have "readily accessible and easy-to-use safeguards" for minors and their parents to:

  • "limit the ability of other individuals...in particular adults with no relationship to the minor" to find them (4(a)(1)(A))
  • "prevent other individuals from viewing" and "restricting public access to" their personal data (4(a)(1)(B))
  • limit features that make them want to use the covered platform more, like autoplay, rewards for usage, and notifications (4(a)(1)(C))
  • opt out of algorithms that use their personal data (4(a)(1)(D))
  • delete their account and all its data (4(a)(1)(E))
  • restrict the sharing of their geolocation, and notify them when it is being tracked (4(a)(1)(F))
  • limit the time they spend on the covered platform (4(a)(1)(G))

If a platform "knows or reasonably believes" a user to be a minor, these settings must default to "the strongest option available" (4(a)(2)).

A platform can't "encourage minors to weaken or turn off safeguards" and has to provide "age appropriate" information about them (4(a)(3)).

Parental Tools

A covered platform must provide "readily accessible and easy-to-use" tools for the parents of minors to:

  • "control" the minor's "privacy and account settings", including the above "#Safeguards for Minors" (4(b)(2)(A))
  • restrict the minor's purchases (4(b)(2)(B))
  • track the time the minor spends on the platform (4(b)(2)(C))
  • turn off parental controls (4(b)(2)(D))
  • learn about the minor's activity so they can address the harms covered in "#Duty of Care" (4(b)(2)(E))

A minor will be told when those tools are in effect (4(b)(3)).

If a platform "knows or reasonably believes" a user to be a minor, these tools must be in effect by default (4(b)(4)).

Reporting Mechanisms

A covered platform must provide "a readily accessible and easy-to-use means to submit reports of harms to a minor" (4(c)(1)(A)). It must also "receive and respond to reports in a reasonable and timely manner" (4(c)(2)).