A Comment on Malwarebytes v. Enigma

From Bibliotheca Anonoma
Revision as of 06:13, 29 October 2020 by Quintuplicate

Disclaimer: I am not a lawyer, and this article is not and should not be relied on as legal advice.

At first glance, Malwarebytes v. Enigma may not seem very important, and in a sense it isn't. It's one of those cases where one party has appealed all the way up to the Supreme Court, but the Court doesn't think the case is important enough to take.

But, even if its legal effect may not extend beyond the medium- or even short-term, it has historical value. For one, it is the first time the Supreme Court has interpreted or even mentioned a very important statute for the Internet, section 230 of the Communications Decency Act[1]. For two, it shows how the Supreme Court, or at least one justice of it (nobody else joined this "statement"), is inclined to interpret section 230 should a case revolving around it ever come before the Court.

When a judge interprets a law, he will of course first look to the plain meaning of the text. If the plain meaning supplies no clear meaning or a clearly absurd one, he will then look to the purpose of the legislature in enacting the law.

So first, a brief discussion of the Communications Decency Act, with a view to its purpose. In the early 1990s, the Internet had been opened to commercial use, and massive growth began as entrepreneurs decided to make their pot of gold by colonizing it. The Religious Right, very much in power at the time, was afraid that there would be content on the Internet that would corrupt "good morals". Hence the name of the Communications Decency Act: it was meant to make it illegal to transmit "indecent" content on the Internet, even though the Supreme Court had found such content otherwise protected by the First Amendment.

Most state codes have provisions stating that headings of sections are not part of the law and shouldn't be used to interpret laws. The US Code is different in that section "catchlines" and "analyses" (what it calls headings and tables of contents) are part of the law. So in some cases you'll have tables of contents that don't match up with the actual headings, and the people who update the US Code are powerless to fix it, because only Congress can change them to fit. What I'm getting at with this is that section headings can be used to find out what Congress meant in the sections they head.

So, the title of the actual enactment, "Telecommunications Act of 1996", is a welcome break from the blatantly ideological titles and/or contrived attempts to come up with "clever" acronyms that have become so common in recent years. This doesn't tell us much, and that's a good thing.

Title V of this Act has the short title of "Communications Decency Act" and the long title of "Obscenity and Violence". Section 509 is located within Subtitle A, headed "Obscene, Harassing, and Wrongful Utilization of Telecommunications Facilities". So we get an idea of what the Congress wanted to target by making this law.

Section 509 itself is entitled "Online Family Empowerment". The section it inserts into the Communications Act of 1934 is entitled "Protection for private blocking and screening of offensive material." This accords with the general purpose of the CDA: the purpose of this section is to allow families to block "offensive material" from their children through private means.

Now we come to subsections (a) and (b) of this section. Respectively, they declare the "findings" of Congress and the "policy" of the United States. They are an almost uniquely American feature; their purpose is to help the courts discover the purpose of the laws they are in, and so make it more likely for them to interpret the laws according to the will of Congress.

The fourth "finding" basically sums up the gist of all five findings: "The Internet and other interactive computer services have flourished, to the benefit of all Americans, with a minimum of government regulation." The "policy" states that Congress wants to keep the Internet free, but is concerned about obscene and harassing content online; it will encourage the private development of technology that will block such content.

And now, the meat of the whole section, subsection (c):

(c) Protection for `Good Samaritan' Blocking and Screening of Offensive Material.—
   ``(1) Treatment of publisher or speaker.—No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
   ``(2) Civil liability.—No provider or user of an interactive computer service shall be held liable on account of—
       ``(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
       ``(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).

This was intended as a replacement for the punitive provisions of the CDA, which made all indecency on the Internet illegal. Instead, this was enacted as another section of the CDA, and the punitive provisions remained in the final law[2].

No doubt, paragraph (2) was intended to be the more important of the two short paragraphs of the section; it fits better with the heading of the subsection. Yet in the long run, paragraph (1) became more momentous. Here's why: if you're not treated as the publisher or speaker of content someone else created, you can't be sued for defamation over it.

The ACLU soon took the Attorney General, Janet Reno, to court over this Act. In Reno v. ACLU (1997), the Supreme Court struck down the punitive provisions of the CDA but left this section, which was not challenged, intact. Subsections (d) and (e) simply set out ancillary provisions. "Interactive computer service", as defined by this section, includes users of the Internet as well as websites.

Since then, section 230 has become one of the most important enactments regulating the Internet. It allows sites to exist that cannot reasonably moderate all the content posted on them, because they are not liable for posted content they don't know about. And because of this section, even if they remove some content, they're not treated as the publisher or speaker of the content they choose to leave up, so long as they act in good faith.


Notes

  1. To be sure, this is section 230 of the Communications Act of 1934, which was added by section 509 of the Telecommunications Act of 1996. The CDA is the name for the part of the Telecommunications Act that section 509 is in. This was codified at 47 U.S.C. § 230. Confusing, I know.
  2. 521 U.S. 844, 858