Communications Decency Act
- 1st Amendment
- Internet Freedom
- Children, Protection
- - COPA
- - CIPA
- - CPPA
- - Child Porn
- - Child Porn, Reporting
- - Protect Act
- - V Chip
- - Deceptive Content
- - Sex Offenders
- - Privacy
- - Notification
- SPAM Labels
- Good Samaritan Defense
The conference agreement adopts the Senate provisions with modifications.
New subsection 223(d)(1) applies to content providers who send prohibited material to a specific person or persons under 18 years of age. Its "display" prohibition applies to content providers who post indecent material for online display without taking precautions that shield that material from minors.
New section 223(d)(1) codifies the definition of indecency from FCC v. Pacifica Foundation, 438 U.S. 726 (1978). Defenses to violations of the new sections assure that attention is focused on bad actors and not those who lack knowledge of a violation or whose actions are equivalent to those of common carriers.
The conferees intend that the term indecency (and the rendition of the definition of that term in new section 502) has the same meaning as established in FCC v. Pacifica Foundation, 438 U.S. 726 (1978), and Sable Communications of California, Inc. v. FCC, 492 U.S. 115 (1989). These cases clearly establish the principle that the federal government has a compelling interest in shielding minors from indecency. Moreover, these cases firmly establish the principle that the indecency standard is fully consistent with the Constitution and specifically limited in its reach so that the term is not unconstitutionally vague. See also Action for Children's Television v. FCC, 58 F.3d 654, 662-63 (D.C. Cir. 1995) (en banc), cert. denied, 64 U.S.L.W. 3465 (1996); Alliance for Community Media v. FCC, 56 F.3d 105, 1124-25 (D.C. Cir. 1995), cert. granted sub nom. Denver Area Educational Telecommunications Consortium v. FCC, 116 S. Ct. 471 (1995); Dial Information Services Corp. of New York v. Thornburgh, 938 F.2d 1535, 1540-41 (2d Cir. 1991), cert. denied sub nom. Dial Information Services Corp. of New York v. Barr, 502 U.S. 1072 (1992); Action for Children's Television v. FCC, 932 F.2d 1504, 1508 (D.C. Cir. 1991).
The precise contours of the definition of indecency have varied slightly depending on the communications medium to which it has been applied. The essence of the phrase, patently offensive descriptions of sexual and excretory activities, has remained constant, however. At the time of this writing, the Supreme Court will consider at least one constitutional challenge to federal indecency statutes. Importantly, the question whether indecency is overly broad or unconstitutionally vague is not seriously at issue in that challenge. See Alliance for Community Media, supra (addressing whether State action exists as to private decisions by cable operators). There is little doubt that indecency can be applied to computer-mediated communications consistent with constitutional strictures, insofar as it has already been applied without rejection in other media contexts, including telephone, cable, and broadcast radio.
The conferees considered, but rejected, the so-called "harmful to minors" standard. See Ginsberg v. New York, 390 U.S. 629, 641-43 (1968). The proponents of the "harmful to minors" standard contended that that standard contains an exemption for material with "serious literary, artistic, political, and scientific value," and therefore was the better of the two alternative standards. ("Harmful to minors" laws use the "variable obscenity" test and prohibit the sale, and sometimes the display, of certain sexually explicit material to minors.) This assertion misapprehends the indecency standard itself, and disregards the Supreme Court's various rulings on this issue. See Pacifica, 438 U.S. at 743, n. 18, and its progeny.
The gravamen of the indecency concept is "patent offensiveness." Such a determination cannot be made without a consideration of the context of the description or depiction at issue. It is the understanding of the conferees that, as applied, the patent offensiveness inquiry involves two distinct elements: the intention to be patently offensive, and a patently offensive result. In the Matter of Sagittarius Broadcasting Corp. et al., 7 FCC Rcd. 6873, 6875 (1992); In the Matter of Audio Enterprises, Inc., 3 FCC Rcd. 930, 932 (1987). Material with serious redeeming value is quite obviously intended to edify and educate, not to offend. Therefore, it will be imperative to consider the context and the nature of the material in question when determining its "patent offensiveness."
In view of the solid constitutional pedigree of the indecency standard (see Pacifica, 438 U.S. at 743 (describing indecency as low value and marginally protected by the First Amendment)), use of the indecency standard poses no significant risk to the free-wheeling and vibrant nature of discourse, or to the serious literary and artistic works, that currently can be found on the internet and are expected to continue and grow. As the Supreme Court itself noted when upholding the constitutionality of indecency prohibitions, prohibiting indecency merely forces speakers to re-cast their message into less offensive terms, but does not prohibit or disfavor the essential meaning of the communication. Pacifica, 438 U.S. at 743, n. 18. Likewise, requiring that access restrictions be imposed to protect minors from exposure to indecent material does not prohibit or disfavor the essential meaning of the indecent communication; it merely puts it in its appropriate place: away from children.
Violators of this section shall be fined under title 18, U.S. Code, or imprisoned not more than two years, or both.
Each intentional act of posting indecent content for display shall be considered a separate violation of this subsection, rather than each occasion upon which indecent material is accessed or downloaded from an interactive computer service or posted without the content provider's knowledge on such a service. New subsection 223(d)(2) sets forth the standard of liability for facilities providers who intentionally permit their facilities to be used for an activity prohibited by new subsection 223(d)(1).
New subsection 223(e) includes statutory defenses for violations of new sections 223 (a) and (d) that supplement other defenses available at law, such as common law defenses. New subsections 223(e)(1), (e)(2) and (e)(3) set forth the "access provider" defense. The defense protects entities from liability for providing access or connection to or from a facility, network or system not under their control. The defense covers provision of related capabilities incidental to providing access, such as server and software functions, that do not involve the creation of content.
The defense does not apply to entities that conspire with entities actively involved in the creation of content prohibited by this section, or who advertise that they offer access to prohibited content. Nor does it apply to provision of access or connection to a facility, system or network that engages in violations of this section and that is owned or controlled by the access provider. In the absence of these conditions, commercial and non-profit internet operators who provide access to the internet and other interactive computer services shall not be liable for indecent material accessed by means of their services. This provision is designed to target the criminal penalties of new sections 223(a) and (d) at content providers who violate this section and persons who conspire with such content providers, rather than entities that simply offer general access to the internet and other online content. The conferees intend that this defense be construed broadly to avoid impairing the growth of online communications through a regime of vicarious liability.
New subsection 223(e)(4) provides a defense to employers whose employees or agents make unauthorized use of office communications systems. This defense is intended to limit vicarious or imputed liability of employers for actions of their employees or agents. To be outside the defense, the prohibited action must be within the scope of the employee's or agent's employment. In addition, the employer must either have knowledge of the prohibited action and affirmatively act to authorize or ratify it, or recklessly disregard the action. Both conditions must be met in order for employers to be held liable for the actions of an employee or agent.
The good faith defenses set forth in new subsection 223(e)(5) are provided for "reasonable, effective, and appropriate" measures to restrict access to prohibited communications. The word "effective" is given its common meaning and does not require an absolute 100% restriction of access to be judged effective. The managers acknowledge that content selection standards, and other technologies currently under development that enable restriction of minors' access to prohibited communications, might qualify as reasonable, effective, and appropriate access restriction devices if they are effective at protecting minors from exposure to indecent material via the internet.
New subsection 223(e)(6) permits the Commission to describe its view of what constitute "reasonable, effective and appropriate" measures and provides that use of such measures shall be admissible as evidence that the defendant qualifies for the good faith defense. This new subsection grants no further authority to the Commission over interactive computer services and should be narrowly construed.
New subsection 223(f)(1) supplements, without in any way limiting, the "Good Samaritan" liability protections of new section 230.
New subsection 223(f)(2) preempts inconsistent State and local regulations of activities and actions described in new subsections 223(a)(2) and (d). This provision is intended to establish a uniform national standard of content regulation for a national, and indeed a global, medium, in which content is transmitted instantaneously in point-to-point, and point-to-multipoint communications. As originally passed by the Senate, this subsection excluded non-commercial content providers. The conferees have expanded this section to provide for consistent national and State and local content regulation of both commercial and non-commercial providers. The conferees recognize and wish to protect the important work of nonprofit libraries and higher educational institutions in providing the public with both access to electronic communications networks like the internet, and valuable content which they are uniquely well-positioned to provide. Accordingly, nonprofit libraries and educational institutions, like commercial entities, are assured by this provision that they will not be subjected to liability at the State or local level in a manner inconsistent with the treatment of their activities or actions under this legislation.
The conferees also recognize the critical importance of access software in making the internet and other interactive computer services accessible to Americans who are not computer experts. Accordingly, provision of "access software" is included within the access provider defense. As defined in new subsection 223(h)(3), the term includes software that enables a user to do any of an enumerated list of functions that are set forth in technical language. It includes client and server software, such as proxy server software that downloads and caches popular web pages to reduce the load of traffic on the internet and to permit faster retrieval. The definition distinguishes between software that actually creates or includes prohibited content and software that allows the user to access content provided by others.
The conference agreement adopts the House provision with minor modifications as a new section 230 of the Communications Act. This section provides "Good Samaritan" protections from civil liability for providers or users of an interactive computer service for actions to restrict or to enable restriction of access to objectionable online material. One of the specific purposes of this section is to overrule Stratton Oakmont v. Prodigy and any other similar decisions which have treated such providers and users as publishers or speakers of content that is not their own because they have restricted access to objectionable material. The conferees believe that such decisions create serious obstacles to the important federal policy of empowering parents to determine the content of communications their children receive through interactive computer services.
These protections apply to all interactive computer services, as defined in new subsection 230(e)(2), including non-subscriber systems such as those operated by many businesses for employee use. They also apply to all access software providers, as defined in new section 230(e)(5), including providers of proxy server software.
The conferees do not intend, however, that these protections from civil liability apply to so-called "cancelbotting," in which recipients of a message respond by deleting the message from the computer systems of others without the consent of the originator or without having the right to do so.