Amdt1.7.5.12 Child Pornography

First Amendment:

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

In New York v. Ferber,1 the Court recognized another category of expression that is outside the coverage of the First Amendment: the visual depiction of children in films or still photographs engaged in a variety of sexual activities or exposing their genitals. The reason such depictions may be prohibited is the governmental interest in protecting the physical and psychological well-being of children, whose participation in the production of these materials would subject them to exploitation and harm. The state may go beyond a mere prohibition of the use of children because it is not possible to protect children adequately without prohibiting the exhibition and dissemination of the materials and advertising about them. Thus, the Court held that “the evil to be restricted so overwhelmingly outweighs the expressive interests, if any, at stake, that no process of case-by-case adjudication is required.” 2 But because expression is involved, the government must carefully define what conduct is to be prohibited and may reach only “works that visually depict sexual conduct by children below a specified age.” 3

The Court has also considered the private possession of child pornography in the home. In Osborne v. Ohio,4 the Court upheld a state law criminalizing the possession or viewing of child pornography as applied to someone who possessed such materials in his home. Distinguishing a prior case protecting the personal possession of obscene material, the Court ruled that Ohio’s interest in preventing the exploitation of children far exceeded what it characterized as Georgia’s “paternalistic interest” in protecting the minds of adult viewers of obscene material.5 Because the state’s interest in regulating child pornography was of greater importance, the Court saw less need to require states to demonstrate a strong necessity for regulating private possession in addition to commercial distribution and sale.

In Ashcroft v. Free Speech Coalition, the Court held unconstitutional the federal Child Pornography Prevention Act (CPPA) to the extent that it prohibited pictures that were not produced with actual minors.6 The law prohibited computer-generated (“virtual”) child pornography and photographs of adult actors who appeared to be minors, and could have extended to “a Renaissance painting depicting a scene from classical mythology.” 7 The Court observed that statutes prohibiting child pornography that uses real children are constitutional because they target “[t]he production of the work, not the content.” 8 The CPPA, by contrast, targeted the content, not the means of production. The government’s rationales for the CPPA included that “[p]edophiles might use the materials to encourage children to participate in sexual activity” and might “whet their own sexual appetites” with it, “thereby increasing . . . the sexual abuse and exploitation of actual children.” 9 The Court found these rationales inadequate because the government “cannot constitutionally premise legislation on the desirability of controlling a person’s private thoughts” and “may not prohibit speech because it increases the chance an unlawful act will be committed ‘at some indefinite future time.’” 10 The government had also argued that the existence of “virtual” child pornography “can make it harder to prosecute pornographers who do use real minors,” because, “[a]s imaging technology improves . . . , it becomes more difficult to prove that a particular picture was produced using actual children.” 11 This rationale, the Court found, “turns the First Amendment upside down. The Government may not suppress lawful speech as a means to suppress unlawful speech.” 12

In United States v. Williams,13 the Supreme Court upheld a federal statute that prohibits knowingly advertising, promoting, presenting, distributing, or soliciting material “in a manner that reflects the belief, or that is intended to cause another to believe, that the material” is child pornography that is obscene or that depicts an actual minor (that is, is child pornography that is not constitutionally protected).14 Under the provision, in other words, “an Internet user who solicits child pornography from an undercover agent violates the statute, even if the officer possesses no child pornography. Likewise, a person who advertises virtual child pornography as depicting actual children also falls within the reach of the statute.” 15 The Court found that these activities are not constitutionally protected because “[o]ffers to engage in illegal transactions [as opposed to abstract advocacy of illegality] are categorically excluded from First Amendment protection,” even “when the offeror is mistaken about the factual predicate of his offer,” such as when the child pornography that one offers to buy or sell does not exist or is constitutionally protected.16

However, the principles applying to child pornography do not extend to protecting children from encountering sexually explicit material. Although the government has a “compelling” interest in protecting children from seeing or hearing indecent material, total bans applicable to adults and children alike are constitutionally suspect.17 In Reno v. American Civil Liberties Union,18 the Court struck down two provisions of the Communications Decency Act of 1996 (CDA), one of which would have prohibited use of an “interactive computer service” to display indecent material “in a manner available to a person under 18 years of age.” 19 This prohibition would, in effect, have banned indecent material from all internet sites except those accessible only by adults. Although intended “to deny minors access to potentially harmful speech . . . , [the CDA’s] burden on adult speech,” the Court wrote, “is unacceptable if less restrictive alternatives would be at least as effective. . . . [T]he Government may not ‘reduc[e] the adult population . . . to . . . only what is fit for children.’” 20

In Reno, the Court distinguished FCC v. Pacifica Foundation,21 in which it had upheld the Federal Communications Commission’s (FCC) restrictions on indecent radio and television broadcasts, because (1) “[t]he CDA’s broad categorical prohibitions are not limited to particular times and are not dependent on any evaluation by an agency familiar with the unique characteristics of the Internet,” (2) the CDA imposes criminal penalties, and the Court has never decided whether indecent broadcasts “would justify a criminal prosecution,” and (3) broadcast radio and television, unlike the internet, have, “as a matter of history . . . ‘received the most limited First Amendment protection,’ . . . in large part because warnings could not adequately protect the listener from unexpected program content.” 22 By contrast, on the internet, at least as it existed in 1997, the Court believed “the risk of encountering indecent material by accident is remote because a series of affirmative steps is required to access specific material.” 23

After the Supreme Court struck down the CDA, Congress enacted the Child Online Protection Act (COPA), which banned “material that is harmful to minors” on websites that have the objective of earning a profit.24 In ACLU v. Reno, the Third Circuit upheld a preliminary injunction against enforcement of the statute on the ground that, “because the standard by which COPA gauges whether material is ‘harmful to minors’ is based on identifying ‘contemporary community standards[,]’ the inability of Web publishers to restrict access to their Web sites based on the geographic locale of the site visitor, in and of itself, imposes an impermissible burden on constitutionally protected First Amendment speech.” 25 The Third Circuit reasoned that COPA would have resulted in communications available to a nationwide audience being judged by the standards of the community most likely to be offended. In Ashcroft v. ACLU, the Supreme Court vacated and remanded the Third Circuit decision, holding “that COPA’s reliance on community standards to identify ‘material that is harmful to minors’ does not by itself render the statute substantially overbroad for purposes of the First Amendment.” 26

Upon remand, the Third Circuit again upheld the preliminary injunction, and the Supreme Court affirmed and remanded the case for trial. The Supreme Court found that the district court had not abused its discretion in granting the preliminary injunction, because the government had failed to show that proposed alternatives to COPA would not be as effective in accomplishing its goal. The primary alternative to COPA, the Court noted, is blocking and filtering software. Filters are less restrictive than COPA because “[t]hey impose selective restrictions on speech at the receiving end, not universal restriction at the source.” 27 Subsequently, the district court found COPA to violate the First Amendment and issued a permanent injunction against its enforcement; the Third Circuit affirmed, and the Supreme Court denied certiorari.28

In United States v. American Library Association, Inc., a four-Justice plurality of the Supreme Court upheld the Children’s Internet Protection Act (CIPA), which, as the plurality summarized it, provides that a public school or “library may not receive federal assistance to provide Internet access unless it installs software to block images that constitute obscenity or child pornography, and to prevent minors from obtaining access to material that is harmful to them.” 29 The plurality asked “whether libraries would violate the First Amendment by employing the filtering software that CIPA requires” 30 —in other words, whether CIPA would effectively violate library patrons’ rights. The plurality concluded that it did not, after finding that “Internet access in public libraries is neither a ‘traditional’ nor a ‘designated’ public forum,” and that it therefore would not be appropriate to apply strict scrutiny to determine whether the filtering requirements are constitutional.31 The plurality acknowledged “the tendency of filtering software to ‘overblock’—that is, to erroneously block access to constitutionally protected speech that falls outside the categories that software users intend to block.” 32 It found, however, that, “[a]ssuming that such erroneous blocking presents constitutional difficulties, any such concerns are dispelled by the ease with which patrons may have the filtering software disabled.” 33

The plurality also considered whether CIPA imposes an unconstitutional condition on the receipt of federal assistance—in other words, whether the government can require public libraries to limit their speech if they accept federal funds. The plurality found that, assuming that government entities have First Amendment rights (it did not decide the question), “CIPA does not ‘penalize’ libraries that choose not to install such software, or deny them the right to provide their patrons with unfiltered Internet access. Rather, CIPA simply reflects Congress’s decision not to subsidize their doing so.” 34

Footnotes
1
458 U.S. 747 (1982). The Court’s decision was unanimous, although there were several limiting concurrences. Compare, e.g., id. at 775 (Justice William Brennan, arguing for exemption of “material with serious literary, scientific, or educational value”), with id. at 774 (Justice Sandra Day O’Connor, arguing that such material need not be excepted). The Court did not pass on the question, inasmuch as the materials before it were well within the prohibitable category. Id. at 766–74. back
2
Id. at 763–64. back
3
Id. at 764 (emphasis in original). Child pornography need not meet Miller obscenity standards to be unprotected by the First Amendment. Id. at 764–65. back
4
495 U.S. 103 (1990). back
5
Id. at 108. back
6
535 U.S. 234 (2002). back
7
Id. at 241. back
8
Id. at 249; see also id. at 241. back
9
Id. back
10
Id. at 253. back
11
Id. at 242. back
12
Id. at 255. Following Ashcroft v. Free Speech Coalition, Congress enacted the PROTECT Act, Pub. L. No. 108-21, 117 Stat. 650 (2003), which, despite the decision in that case, defined “child pornography” so as to continue to prohibit computer-generated child pornography (but not other types of child pornography produced without an actual minor). 18 U.S.C. § 2256(8)(B). In United States v. Williams, 128 S. Ct. 1830, 1836 (2008), the Court, without addressing the PROTECT Act’s new definition, cited Ashcroft v. Free Speech Coalition with approval. back
13
553 U.S. 285 (2008). back
14
18 U.S.C. § 2252A(a)(3)(B). back
15
128 S. Ct. at 1839. back
16
128 S. Ct. at 1841, 1842, 1843. In a dissenting opinion joined by Justice Ruth Bader Ginsburg, Justice David Souter agreed that “Congress may criminalize proposals unrelated to any extant image,” but disagreed with respect to “proposals made with regard to specific, existing [constitutionally protected] representations.” Id. at 1849. Justice Souter believed that, “if the Act stands when applied to identifiable, extant [constitutionally protected] pornographic photographs, then in practical terms Ferber and Free Speech Coalition fall. They are left as empty as if the Court overruled them formally.” Id. at 1854. Justice Antonin Scalia’s opinion for the majority replied that this “is simply not true . . . Simulated child pornography will be as available as ever, so long as it is offered and sought as such, and not as real child pornography . . . There is no First Amendment exception from the general principle of criminal law that a person attempting to commit a crime need not be exonerated because he has a mistaken view of the facts.” Id. at 1844–45. back
17
See Sable Commc’ns v. FCC, 492 U.S. 115 (1989) (FCC’s “dial-a-porn” rules imposing a total ban on “indecent” speech are unconstitutional, given less restrictive alternatives—e.g., credit cards or user IDs—of preventing access by children). Pacifica Foundation is distinguishable, the Court reasoned, because that case did not involve a “total ban” on broadcast, and also because there is no “captive audience” for the “dial-it” medium, as there is for the broadcast medium. 492 U.S. at 127–28. Similar rules apply to regulation of cable TV. In Denver Area Educ. Telecommc’ns Consortium v. FCC, 518 U.S. 727, 755 (1996), the Court, acknowledging that protection of children from sexually explicit programming is a “compelling” governmental interest (but refusing to determine whether strict scrutiny applies), nonetheless struck down a requirement that cable operators segregate and block indecent programming on leased access channels. The segregate-and-block restrictions, which included a requirement that a request for access be in writing, and which allowed for up to thirty days’ delay in blocking or unblocking a channel, were not sufficiently protective of adults’ speech and viewing interests to be considered either narrowly or reasonably tailored to serve the government’s compelling interest in protecting children. In United States v. Playboy Ent. Group, Inc., 529 U.S. 803 (2000), the Supreme Court, explicitly applying strict scrutiny to a content-based speech restriction on cable TV, struck down a federal statute designed to “shield children from hearing or seeing images resulting from signal bleed.” Id. at 806. In striking down the Communications Decency Act of 1996, the Court would “neither accept nor reject the Government’s submission that the First Amendment does not forbid a blanket prohibition on all ‘indecent’ and ‘patently offensive’ messages communicated to a 17-year-old—no matter how much value the message may have and regardless of parental approval. It is at least clear that the strength of the Government’s interest in protecting minors is not equally strong throughout the coverage of this broad statute.” Reno v. ACLU, 521 U.S. 844 (1997). In Playboy Ent. Grp., 529 U.S. at 825, the Court wrote: “Even upon the assumption that the government has an interest in substituting itself for informed and empowered parents, its interest is not sufficiently compelling to justify this widespread restriction on speech.” The Court also would “not discount the possibility that a graphic image could have a negative impact on a young child” (id. at 826), thereby suggesting again that it may take age into account when applying strict scrutiny. back
18
521 U.S. 844 (1997). back
19
The other provision the Court struck down would have prohibited indecent communications, by telephone, fax, or e-mail, to minors. Id. back
20
Id. at 874–75. The Court did not address whether, if less restrictive alternatives would not be as effective, the government would then be permitted to reduce the adult population to only what is fit for children. Id. back
21
438 U.S. 726 (1978). back
22
521 U.S. at 867. back
23
Id. back
24
“Harmful to minors” statutes ban the distribution of material to minors that is not necessarily obscene under the Miller test. In Ginsberg v. New York, 390 U.S. 629, 641 (1968), the Supreme Court, applying a rational basis standard, upheld New York’s harmful-to-minors statute. back
25
ACLU v. Reno, 217 F.3d 162, 166 (3d Cir. 2000). back
26
Ashcroft v. ACLU, 535 U.S. 564, 585 (2002). back
27
Ashcroft v. ACLU, 542 U.S. 656, 667 (2004). Justice Stephen Breyer, dissenting, wrote that blocking and filtering software is not a less restrictive alternative because “it is part of the status quo” and “[i]t is always less restrictive to do nothing than to do something.” Id. at 684. The majority opinion countered that Congress “may act to encourage the use of filters,” and “[t]he need for parental cooperation does not automatically disqualify a proposed less restrictive alternative.” Id. at 669. back
28
ACLU v. Gonzales, 478 F. Supp. 2d 775 (E.D. Pa. 2007), aff’d sub nom. ACLU v. Mukasey, 534 F.3d 181 (3d Cir. 2008), cert. denied, 555 U.S. 1137 (2009). back
29
539 U.S. 194, 199 (2003). back
30
Id. at 203. back
31
Id. at 205. back
32
Id. at 208. back
33
Id. at 209. Justice Anthony Kennedy, concurring, noted that, “[i]f some libraries do not have the capacity to unblock specific Web sites or to disable the filter . . . that would be the subject for an as-applied challenge, not the facial challenge made in this case.” Id. at 215. Justice David Souter, dissenting, noted that “the statute says only that a library ‘may’ unblock, not that it must.” Id. at 233. back
34
Id. at 212. back