Good Samaritan 47 USC § 230(c)
© Cybertelecom
A funny thing happened on the way to censoring the Internet – ISPs were given immunity from liability related to third party content. Senator Exon, in 1996, set out on his crusade to stop the barbarians at the gate of the Internet. He successfully introduced the Communications Decency Act, which attempted to criminalize offensive content. What he ended up doing was protecting ISPs and users from certain third party liability.
To understand what Congress was up to, you have to step back a few years to a lawsuit called Stratton Oakmont v. Prodigy. Stratton Oakmont was a securities brokerage firm that, at the time, was reportedly experiencing a degree of difficulty. One of the difficulties was that someone in a Prodigy chat room had posted some material allegedly defamatory to Stratton Oakmont's good name.
There were two problems. First, the poster of the material could not be found. Second, Prodigy was engaged in the practice of monitoring and filtering the chat rooms for material that was offensive to children. It was part of their sales pitch that their services were kid friendly.
Stratton Oakmont, feeling defamed and seeking damages, filed a lawsuit. Of course, they could not sue the anonymous poster, so they sued the next best thing; they sued Prodigy. Prodigy's defense was elegant – Prodigy said "get outta here – we're just an ISP – we do not have any control over what is posted to our system, nor is it technically feasible for us to exert control." But Stratton Oakmont argued back that Prodigy was not merely an online service; Prodigy had become a publisher. By filtering the chat rooms for content offensive to children, Prodigy had exercised editorial control over what would and would not be published, and therefore had the liability of a publisher. And if a publisher publishes defamatory remarks, that publisher can be liable. [Donato]
The Court bought it. Congress did not. Congress was appalled at the thought: here was Sen. Exon warning of this threat to the youth of America, and here was Prodigy acting the good citizen and fighting the filth – and Prodigy ends up getting slammed for something completely unrelated.
CDA Score Card: If you are keeping a score card, this is where all these situations have ended up. Prodigy appealed the Stratton Oakmont decision. Stratton Oakmont then thought it might be a good idea to settle, and did, relieving Prodigy of all liability. [NYTimes] The Communications Decency Act – at least those portions of the act criminalizing filth on the Internet – was declared unconstitutional by the Supreme Court in Reno v. ACLU. And the Good Samaritan defenses are still standing and remain good law.
Rule: Congress included the Good Samaritan defenses in the Communications Decency Act. These provisions state that ISPs are not publishers of third party content and cannot be held liable as publishers for that content (aka intermediary liability). Furthermore, ISPs cannot be held liable under any law for actions taken in good faith to protect children from filth. According to the Act:
47 U.S.C. § 230
(c) Protection for “Good Samaritan” blocking and screening of offensive material
(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of –
(A) Any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be
- obscene,
- lewd,
- lascivious,
- filthy,
- excessively violent,
- harassing, or
- otherwise objectionable,
whether or not such material is constitutionally protected; or
(B) Any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph ([A]).
(d) Obligations of interactive computer service
(e) Effect on other laws
(1) No effect on criminal law
Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.
(2) No effect on intellectual property law
Nothing in this section shall be construed to limit or expand any law pertaining to intellectual property.
(3) State law
Nothing in this section shall be construed to prevent any State from enforcing any State law that is consistent with this section. No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.
(4) No effect on communications privacy law
Nothing in this section shall be construed to limit the application of the Electronic Communications Privacy Act of 1986 or any of the amendments made by such Act, or any similar State law.
Congress was directly reacting against the situation where an ISP was making decisions in a chat room about what could and could not be posted. Even where the service provider takes an active role, much like a publisher, it shall not be held liable as a publisher. Liability rests solely with the actual content producers, and enforcement is directed at them.
Hare v. Richie, Dist. Court, D. Maryland 2012: "In passing section 230, Congress sought to spare interactive computer services this grim choice by allowing them to perform some editing on user-generated content without thereby becoming liable for all defamatory or otherwise unlawful messages that they didn't edit or delete." Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157, 1165 (9th Cir. 2008) (en banc). Thus, "'Congress decided not to treat providers of interactive computer services like other information providers such as newspapers, magazines or television and radio stations, all of which may be held liable for publishing or distributing obscene or defamatory material written or prepared by others.'" Batzel v. Smith, 333 F.3d 1018, 1026 (9th Cir. 2003) (quoting Blumenthal v. Drudge, 992 F. Supp. 44, 49 (D.D.C. 1998)). As the Fourth Circuit explained in its touchstone decision in Zeran v. America Online, Inc., supra, 129 F.3d at 330: "§ 230 precludes courts from entertaining claims that would place a computer service provider in a publisher's role. Thus, lawsuits seeking to hold a service provider liable for its exercise of a publisher's traditional editorial functions—such as deciding whether to publish, withdraw, postpone or alter content—are barred." It is "immaterial whether this decision comes in the form of deciding what to publish in the first place or what to remove among the published material." Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1102 n.8 (9th Cir. 2009). Moreover, "an editor's minor changes to the spelling, grammar, and length of third-party content do not strip him of section 230 immunity." Fair Housing Council, 521 F.3d at 1170.
"[T]he section is titled “Protection for ‘good samaritan’ blocking and screening of offensive material” and, as the Seventh Circuit recently held, the substance of section 230(c) can and should be interpreted consistent with its caption." [Roommates.com, p. 3455, 9th Cir. 2009] [Chicago Lawyers’ Committee for Civil Rights Under Law, Inc. v. craigslist, Inc., No. 07-1101, slip op. at 6 (7th Cir. Mar. 14, 2008) (quoting Doe v. GTE Corp., 347 F.3d 655, 659-60 (7th Cir. 2003))]
Barnes, 570 F.3d at 1099-1100, identified Congress's goals in Section 230(c): "to promote the free exchange of information and ideas over the Internet and to encourage voluntary monitoring for offensive or obscene material," and to preserve the "vibrant and competitive free market" for interactive computer services.
Zeran v. America Online, Inc., 129 F.3d 327, 331 (4th Cir. 1997):
Congress' purpose in providing the § 230 immunity was thus evident. Interactive computer services have millions of users. See Reno v. ACLU, ___ U.S. at ___, 117 S.Ct. at 2334 (noting that at time of district court trial, "commercial online services had almost 12 million individual subscribers"). The amount of information communicated via interactive computer services is therefore staggering. The specter of tort liability in an area of such prolific speech would have an obvious chilling effect. It would be impossible for service providers to screen each of their millions of postings for possible problems. Faced with potential liability for each message republished by their services, interactive computer service providers might choose to severely restrict the number and type of messages posted. Congress considered the weight of the speech interests implicated and chose to immunize service providers to avoid any such restrictive effect.
Another important purpose of § 230 was to encourage service providers to self-regulate the dissemination of offensive material over their services. In this respect, § 230 responded to a New York state court decision, Stratton Oakmont, Inc. v. Prodigy Servs. Co., 1995 WL 323710 (N.Y.Sup.Ct. May 24, 1995). There, the plaintiffs sued Prodigy — an interactive computer service like AOL — for defamatory comments made by an unidentified party on one of Prodigy's bulletin boards. The court held Prodigy to the strict liability standard normally applied to original publishers of defamatory statements, rejecting Prodigy's claims that it should be held only to the lower "knowledge" standard usually reserved for distributors. The court reasoned that Prodigy acted more like an original publisher than a distributor both because it advertised its practice of controlling content on its service and because it actively screened and edited messages posted on its bulletin boards.
Congress enacted § 230 to remove the disincentives to self regulation created by the Stratton Oakmont decision. Under that court's holding, computer service providers who regulated the dissemination of offensive material on their services risked subjecting themselves to liability, because such regulation cast the service provider in the role of a publisher. Fearing that the specter of liability would therefore deter service providers from blocking and screening offensive material, Congress enacted § 230's broad immunity "to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children's access to objectionable or inappropriate online material." 47 U.S.C. § 230(b)(4). In line with this purpose, § 230 forbids the imposition of publisher liability on a service provider for the exercise of its editorial and self-regulatory functions.
In Blumenthal v. Drudge (a suit against Matt Drudge and AOL), the judge quoted Chief Judge Wilkinson, who stated:
The purpose of this statutory immunity is not difficult to discern. Congress recognized the threat that tort-based lawsuits pose to freedom of speech in the new and burgeoning Internet medium. The imposition of tort liability on service providers for communications of others represented, for Congress, simply another form of intrusive government regulation of speech. Section 230 was enacted, in part, to maintain the robust nature of Internet communication, and, accordingly, to keep government interference in the medium to a minimum.
See also Zeran, 129 F.3d at 331.
Compare Roommates, 521 F.3d at 1164 ("The Communications Decency Act was not meant to create a lawless no-man's-land on the Internet.").
Sec. 230(c)(2) No Liability for Making Us Safe
As stated above, the impetus for the Good Samaritan provision was Prodigy attempting to make its chat rooms safe for children, and then being held liable as a publisher for those editorial decisions. Sec. 230(c)(1) says interactive services shall not be liable for third party content; Sec. 230(c)(2) says interactive services shall not be liable for taking actions to make their services safe. Note that while Sec. 230(c)(1) provides immunity for an interactive service for the actions of others, Sec. 230(c)(2) provides immunity for an interactive service for its own actions.
While this latter provision has not seen much litigation, it has come up. In Zango v. Kaspersky, the 9th Circuit confronted a case of first impression concerning whether a distributor of Internet security software is entitled to immunity under Sec. 230(c)(2).
Zango, Inc. (Zango) is an Internet company that provides access to a catalog of online videos, games, music, tools, and utilities to consumers who agree to view advertisements while they browse the Internet. It brought this action against Kaspersky Lab, Inc., (Kaspersky) which distributes software that helps filter and block potentially malicious software, for improperly blocking Zango's software. Kaspersky invoked the protection of § 230(c)(2)(B) for "good samaritan" blocking and screening of offensive material.
The 9th Circuit affirmed that Kaspersky is immune under Sec. 230(c)(2). Kaspersky is an interactive computer service as defined by the statute because it is an access software provider that provides software or enabling tools that filter, screen, allow, or disallow content that the provider or user considers obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable. Sec. 230(f)(4). The Sec. 230(c)(2) immunity covers a defendant's own actions (unlike Sec. 230(c)(1), which is aimed at third party content). This provision of Sec. 230 advances two of the policy goals articulated by the statute:
"to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services;" and "to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children's access to objectionable or inappropriate online material[.]" § 230(b)(3), (4).
A New Jersey District Court struck down a New Jersey law that would have made interactive computer services liable for knowingly publicizing or displaying advertisements of sexual abuse of children. The problem is, noted the court, "in doing so, it creates an incentive for online service providers not to monitor the content that passes through its channels. This was precisely the situation that the CDA was enacted to remedy." - Backpage.com, D.N.J. 2013 (quoting McKenna, 881 F. Supp. 2d at 1273); Backpage.com, LLC v. McKenna, 881 F. Supp. 2d 1262 (W.D. Wash. 2012)
- 230(c)(2)(A) has a good faith requirement;
- 230(c)(2)(B) has no good faith requirement. [Zango slip 7]
- Levitt v. Yelp!, Inc., No. C-10-1321, 2011 WL 5079526, at *7-8 (N.D. Cal. Oct. 26, 2011) (holding that CDA immunity applied "regardless of whether the publisher acts in good faith")
See also Internet Access Liability