Kids Online Safety Act
In 1997, the Supreme Court of the United States unanimously struck down two provisions of the Communications Decency Act, which made "the knowing transmission of obscene or indecent messages to any recipient under 18 years of age" and "knowing[ly] sending or displaying of patently offensive messages in a manner that is available to a person under 18 years of age" felonies[1]. "The level of discourse reaching a mailbox simply cannot be limited to that which would be suitable for a sandbox", the Court observed, quoting an earlier case; "the Government may not 'reduce the adult population . . . to . . . only what is fit for children.'"[2]
It has been 25 years, and a bill pending in the Senate shows that Congress has not learned this lesson. The bill is entitled the "Kids Online Safety Act"[3]. That bill was not passed by the 117th Congress, but a successor has since been reported from committee in the 118th[4]. The following summarizes and analyzes this Act; numbers and letters in parentheses indicate particular clauses of the bill.
Current Status
The "Kids Online Safety Act" passed the Senate July 31, 2024.
Definitions
These definitions are arranged so that every definition appears after all the definitions it relies on; defined terms appear in bold when used in the remainder of this analysis.
Anyone under 17 is a "minor" (2(8)), and anyone under 13 is a "child" (2(1)).
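Note that both cut-offs are exclusive: a 17-year-old is neither a "minor" nor a "child" under the bill. A minimal sketch of the two definitions in Python (the function and labels are mine, not the bill's):

```python
def classify_user(age: int) -> str:
    """Classify a user under the bill's age definitions.

    "Child" means under 13 (2(1)); "minor" means under 17 (2(8)).
    Every child is also a minor; a 17-year-old is neither.
    """
    if age < 13:
        return "child"
    if age < 17:
        return "minor"
    return "adult"

assert classify_user(12) == "child"
assert classify_user(16) == "minor"
assert classify_user(17) == "adult"  # 17-year-olds fall outside the Act
```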
An "online platform" is any public-facing website or mobile app "that predominantly provides a community forum for user generated content" and includes social media services and virtual reality environments (2(9)).
An "online video game" is any video game that connects to the internet and allows users to create content, engage in microtransactions, or communicate with other users, or has minor-specific advertising (2(10)).
A "covered platform" is an online platform, online video game, messaging, or streaming service that "is reasonably likely to be used, by a minor." Broadband internet, email, teleconferencing, and text messaging services, schools, nonprofits, libraries, news apps, and VPNs are not covered platforms (2(3)). Video streaming services have their own regulations.
"Knows" includes "knowledge fairly implied on the basis of objective circumstances" (2(6)). The FTC must consider "whether the operator, using available technology, exercised reasonable care", in deciding whether or not that clause applies if it had no actual knowledge (15(b)).
"Mental health disorder" means any mental disorder included in DSM-5 or its successor (2(7)).
"Personal data" means info "that identifies or is linked or reasonably linkable to a particular minor" and includes "consumer device identifiers" (2(12)).
Duty of Care
A covered platform must "take reasonable measures...to prevent and mitigate the following harms to minors:"
- "anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors" (3(a)(1))
- "addiction-like behaviors" (3(a)(2))
- bullying and harassment (3(a)(3))
- "sexual exploitation and abuse" (3(a)(4))
- advertising of narcotics, gambling, tobacco, and alcohol (3(a)(5))
- "financial harms" (3(a)(6))
A covered platform doesn't have to stop minors from "deliberately and independently searching for...content" (3(b)(1)), and nothing requires it to ban "resources for the prevention or mitigation" of those harms (3(b)(2)).
Restrictions
Safeguards for Minors
A covered platform must have "readily accessible and easy-to-use safeguards" for minors to:
- stop other people from finding them (4(a)(1)(A))
- "prevent other users, whether registered or not from viewing" and "restricting public access to" their personal data (4(a)(1)(B))
- limit features that result in compulsive usage like rewards, notifications, or automatic playing of media (4(a)(1)(C))
- opt out of, or limit recommendations from, algorithms (4(a)(1)(D))
- restrict sharing of their geolocation, and tell them when it's tracked (4(a)(1)(E))
A minor must be able to delete their account and all its personal data, and limit the amount of time they spend there (4(a)(2)).
If a platform knows a user is a minor, these settings must be set by default to the strongest (4(a)(3)).
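As a rough sketch of what the 4(a)(3) default rule would mean for an implementer (a hypothetical settings model; none of these field names come from the bill):

```python
from dataclasses import dataclass

@dataclass
class SafeguardSettings:
    # Hypothetical fields mirroring 4(a)(1)(A)-(E); the names are illustrative.
    discoverable_by_others: bool = True        # 4(a)(1)(A)
    personal_data_public: bool = True          # 4(a)(1)(B)
    compulsive_features_enabled: bool = True   # 4(a)(1)(C): rewards, notifications, autoplay
    personalized_recommendations: bool = True  # 4(a)(1)(D)
    geolocation_sharing: bool = True           # 4(a)(1)(E)

def default_settings(platform_knows_minor: bool) -> SafeguardSettings:
    """Per 4(a)(3), if the platform knows the user is a minor,
    every safeguard must default to its most protective value."""
    if platform_knows_minor:
        return SafeguardSettings(
            discoverable_by_others=False,
            personal_data_public=False,
            compulsive_features_enabled=False,
            personalized_recommendations=False,
            geolocation_sharing=False,
        )
    return SafeguardSettings()
```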
Parental Tools
A covered platform must provide "readily accessible and easy-to-use" tools for parents of minors to:
- view their "privacy and account settings", including the safeguards listed under "Safeguards for Minors" above, and change the settings for any child (4(b)(2)(A))
- restrict their purchases (4(b)(2)(B))
- track and restrict the time they spend (4(b)(2)(C))
A minor will be told when those tools are in effect (4(b)(3)).
If a platform knows a user is a child, these tools must be in effect by default (4(b)(4)).
Reporting Mechanisms
A covered platform needs "a readily accessible and easy-to-use" way to report "harms to minors" (4(c)(1)(A)), an "electronic point of contact" specific to them (4(c)(1)(B)), and a way to confirm and track reports (4(c)(1)(C)). It must respond to reports "in a reasonable and timely manner": within 10 days of receiving them if it has at least 10 million users in the US (4(c)(2)(A)), and within 21 days if not (4(c)(2)(B)). Imminent threats must be dealt with "as promptly as needed" (4(c)(2)(C)).
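The response-time rule reduces to a simple decision, sketched below (the user threshold and day counts are from 4(c)(2); the function itself is hypothetical):

```python
from typing import Optional

def response_deadline_days(us_users: int, imminent_threat: bool) -> Optional[int]:
    """Deadline for substantively responding to a report under 4(c)(2).

    Returns a day count, or None for an imminent threat, which must be
    handled "as promptly as needed" rather than on a fixed clock.
    """
    if imminent_threat:
        return None       # 4(c)(2)(C)
    if us_users >= 10_000_000:
        return 10         # 4(c)(2)(A): large platforms
    return 21             # 4(c)(2)(B): everyone else
```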
Other Restrictions
A covered platform can't facilitate the advertising of narcotics, gambling, alcohol, and tobacco to minors (4(d)), or modify the UI "with the purpose or substantial effect of subverting or impairing user autonomy, decision-making, or choice with respect to" what's required by "Safeguards for Minors" and "Parental Tools" above (4(e)(2)).
A covered platform must let minors and parents know about the "Safeguards for Minors" and "Parental Tools" it imposes, but can't encourage them to weaken or disable them. Controls to enable and disable them must be "readily-accessible and easy-to-use" and "in the same language, form, and manner" as the covered platform (4(e)(1)).
Exceptions
This Act doesn't:
- stop a covered platform from preventing its algorithms from recommending "harmful, obscene, or unlawful content to minors" (4(e)(3)(A)(i)), or from blocking or filtering spam or protecting its security (4(e)(3)(A)(ii))
- force a covered platform to disclose the content of a minor's communications (4(e)(3)(B))
- stop a covered platform from cooperating with law enforcement agencies regarding activity that it "reasonably and in good faith" believes is illegal (15(d)(1))
- force a covered platform to implement age gating or verification (15(c)(2))
Personalized recommendation systems are still allowed, but only if they're based on the language spoken by the minor and the city where they live (4(e)(3)(C)). The requirements imposed can be bundled with OSes and consoles as long as the minor is informed (4(e)(3)(D)).
Disclosure
Before a person "a covered platform reasonably believes is a minor" signs up, the platform must get them and their parents to acknowledge:
- how it looks after their personal data (5(a)(1)(A))
- how to access the protections under "Restrictions" above (5(a)(1)(B))
- whether it poses any "heightened risk of harm" to them (5(a)(1)(C))
For a child, parental consent is also required (5(a)(2)).
The terms and conditions of any covered platform that uses an algorithm for recommendations must describe:
- how the system uses personal data belonging to minors (5(b)(1))
- how minors and their parents can opt out of or control the system (5(b)(2))
Any ad aimed at minors on a covered platform must:
- tell them why they've been targeted for the ad, including what personal data was used (5(c)(1)(B))
- explicitly say that it is an ad (5(c)(1)(C))
All these disclosures must be "clear, accessible, and easy-to-understand", and must be "in the same language, form, and manner as the covered platform provides any product or service used by minors and their parents" (5(e)).
Transparency
Every year, a covered platform that has at least 10 million "active users on a monthly basis in the United States" has to get an "independent, third-party...reasonable inspection" (6(a)(1)).

References

1. 521 U.S. 844, 859: https://www.supremecourt.gov/opinions/boundvolumes/521bv.pdf
2. 521 U.S. 844, 875: https://www.supremecourt.gov/opinions/boundvolumes/521bv.pdf
3. S. 3663 (117th Congress): https://www.congress.gov/bill/117th-congress/senate-bill/3663/text
4. S. 1409 (118th Congress): https://www.congress.gov/bill/118th-congress/senate-bill/1409/text