The "Sufficient Backdoor" Test: A New Model for Indecency Regulation of Converged Media
Content-based regulation is subject to the Supreme Court's "strict scrutiny" standard, which considers three issues: (1) whether the regulation furthers a compelling government interest; (2) whether the regulation is narrowly tailored to that interest; and (3) whether the regulation operates by the least restrictive means, so as not to be overbroad. Indecency restrictions, aimed at protecting American children from harm, are classic examples of content-based restrictions. Today, that "strict scrutiny" standard can be modified to account for technological advances and convergence trends.
Broadcast, cable and Internet media are currently treated differently under the First Amendment, with the Internet being the least regulated and broadcast the most. Current trends suggest that the three media are converging, and technology affords the receiver greater control over content. Future content-based regulation may therefore be applied to all three media equally. Under the modified "strict scrutiny" standard, content-based regulation will pass if: (1) the regulation furthers a compelling government interest; and (2) qualified receivers are able to counteract the regulation by using a "sufficient backdoor." In indecency regulations requiring the use of receiver-based content control mechanisms (filters), qualified receivers (adults) must be able to turn the mechanisms off by verifying certain information. The burden of future indecency regulations falls on neither the government nor the content producer, preserving the First Amendment rights of all involved.
Broadcast, cable and Internet technologies are converging, becoming more and more alike in how content is distributed and consumed. As these technologies advance and become more accessible in daily life, children are exposed to more indecent content. Currently, broadcast, cable and Internet regulations pertaining to indecent content are treated differently by federal courts—broadcast being subject to the most sweeping regulation, and the Internet to the least. With indecent content becoming more prevalent, such disparate judicial treatment of the converging broadcast, cable and Internet media is no longer warranted.
If the Federal Communications Commission (FCC) were to gain jurisdiction over the three convergent media, it could create an independent agency made up of advocates, industry and ordinary citizens who could produce uniform classification standards for content. Advanced filtering technologies would then work in conjunction with the classification standards and concerned parents could set boundaries for their children’s viewing. As long as the filtering technology has a “sufficient backdoor” for adults (a mechanism by which adults can counteract the filter), any indecency regulation will pass the court’s “strict scrutiny” test. Most importantly, though, a mandate creating a classification agency and requiring filtering technology—shifting the restriction from the source of the content to the receiver—takes the government out of the content regulation process.
Many concerns are associated with classification standards and filtering technology. The ability to block content is, in its simplest form, censorship. Other problems arise when definitional standards are based upon the interests of the very industry they are supposed to regulate, not to mention privacy concerns of identity verification systems. There is also the possibility that the technology simply will not work. However, requiring filtering technology preserves important First Amendment rights—such as autonomy and the editorial privilege—that outweigh any concerns future technology produces.
This paper acknowledges the fact that any suggestions for future regulation are simplistic and unrealistic at this time. Obtaining jurisdiction for the FCC over cable and Internet requires the disregard of legislative and judicial precedent. Additionally, no two people have the same definition of indecency, and filtering technology is not yet capable of operating in the way this paper envisions. Eventually, though, technology and common opinion will evolve to the point where the provisions laid out in this paper are feasible.
Indecency is a vague notion that is highly dependent on the context in which it is used. Black’s Law Dictionary defines it as such: “Indecency, n. The state or condition of being outrageously offensive, esp. in a vulgar or sexual way. Unlike obscene material, indecent speech is protected under the First Amendment.”
The problem with this definition is that it does not provide a clear example of what is "outrageously offensive" or what is "vulgar"—every person and every community may have a different standard. Because indecent speech is protected under the First Amendment, regulations attempting to stifle indecent speech have been subject to the "strict scrutiny" standard. This test requires that all content-restrictive regulations fulfill a compelling government interest, be narrowly tailored to their purpose, and operate by the least restrictive means—that is, regulations cannot restrict any more speech than necessary ("Brief of Petitioner and Intervenor" 49-50).
The FCC has jurisdiction over indecent speech on broadcast media. The FCC defines broadcast indecency as material that "describe[s] or depict[s] sexual or excretory organs or activities" and is "patently offensive as measured by contemporary community standards" ("Notices of Apparent Liability" 5). Again, this definition of indecency is vague and subjective. The FCC acknowledges that "the determination as to whether certain programming is patently offensive is not a local one and does not encompass any particular geographic area…[r]ather, the standard is that of an average broadcast viewer or listener and not the sensibilities of any individual complainant" ("Notices of Apparent Liability" 5). The FCC considers the "full context in which the material appeared" and the "manner and purpose of broadcast material" (i.e., whether it is news or live broadcast, etc.) on a case-by-case basis ("Notices of Apparent Liability" 5-6).
The last ruling the Supreme Court made regarding the FCC's oversight of broadcast indecency came a full thirty years ago, in the 1978 case FCC v. Pacifica Foundation. Since 1978, indecency standards have shifted, both at the FCC and in American culture. The fact that the FCC can now fine broadcast media for a single utterance of an expletive is considered by broadcasters and some federal courts to be "arbitrary and capricious by departing from prior precedents without explanation," creating an atmosphere where "[b]roadcasters are…left without any guidelines that would enable them to understand what is forbidden and what is not," and where potential prior restraint reigns ("Brief of Petitioner and Intervenor" 2, 47). It creates a situation where, as Jeffrey H. Smulyan put it, "[f]rom a broadcaster's standpoint, what was fine yesterday is now…indecent today" (quoted in Ahrens 1).
This paper will address the treatment of indecent speech and attempted regulation across three media: broadcast, cable and Internet. Though the FCC has expressed an interest in the regulation of cable content, and Congress has attempted to regulate content on the Internet, to date the FCC only has jurisdiction over indecent content in broadcast media. Ongoing concern with the effects of indecency on children will likely drive more attempts at FCC oversight in other media absent any official ruling otherwise.
Indecency has long been addressed by the FCC, Congress and the Supreme Court in regard to children; the “compelling government interest” in safeguarding American children is not questioned. The main concern with indecency regulation, or content regulation, is that it can be overly broad, leading to censorship, and inhibits adult autonomy. This paper will suggest a new standard under which to consider proposed indecency regulation: the “sufficient backdoor” test.
Current Treatment of Broadcast 1
Broadcast media—radio, then television—have been tightly regulated by the FCC. There are a number of reasons for this, most notably the relative scarcity of the spectrum required to operate broadcast stations, and the pervasiveness and accessibility of broadcast media. The FCC has historically issued licenses to potential broadcasters who, because of the limited amount of operable spectrum, are required to act in the "public interest." If licensees betray the public interest, they can lose the license, though such occurrences are few2.
The landmark broadcast indecency case was FCC v. Pacifica Foundation, decided in 1978. Pacifica explained why broadcast media was subject to such strict regulation:
“First, the broadcast media have established a uniquely pervasive presence in the lives of all Americans. Patently offensive, indecent material presented over the airwaves confronts the citizen not only in public, but also in the privacy of the home, where the individual’s right to be left alone plainly outweighs the First Amendment rights of an intruder…Secondly, broadcasting is uniquely accessible to children, even those too young to read” (FCC v. Pacifica 748-750). [emphasis added]
The Pacifica case remains the single most influential broadcast indecency case, though it was recently announced that the Supreme Court will review the FCC’s policy of fining broadcasters for just one fleeting use of an expletive (Ahrens and Barnes 1).
In setting a precedent for the First Amendment rights of the listener over those of the speaker, broadcast media are very clearly governed by a collectivist framework3. The underlying architecture of broadcast also contributes to this collectivist treatment. Operable broadcast spectrum was originally thought to be scarce, requiring a licensing system overseen by the FCC—this meant that relatively few could speak, and the costs of entry to the system were very high. The cost of access, by contrast, was very low, meaning that a large number of people would likely tune in4. Therefore, due to the limited number of voices, broadcast could be regulated to benefit collective knowledge and the democratic process over individual rights.
Current Treatment of Cable
The FCC has recognized an interest in cable but remains unable to regulate content. Though it is often argued that the reason the FCC cannot regulate cable is that cable is not as pervasive as broadcast, a major reason is that cable policy is ambiguous and contradictory. The FCC has no real jurisdiction over cable, as the regulation of cable infrastructure was left to local franchise authorities, with federal oversight allowed only when explicitly stated in an act. However, the FCC does retain a small amount of power over the cable industry in promoting localism and public educational programming, governing basic tier pricing structure, and regulating obscenity (requiring pornography to be scrambled).
In Turner v. FCC, the Supreme Court identified a governmental interest in preserving local broadcast stations in an era of cable prevalence. While not a regulation of cable per se, preserving localism does require a fair amount of oversight. By mandating a “must-carry” policy, the FCC
serve[s] three interrelated and important governmental interests: (1) preserving the benefits of free, over-the-air local broadcast television, (2) promoting the widespread dissemination of information from a multiplicity of sources, and (3) promoting fair competition in the television programming market (Turner Broadcasting System v. FCC 180-182).
In preserving a “multiplicity of sources” to better inform the public and generally enhance the democratic process, cable leans toward collectivist treatment.
Cable is commonly regarded as a "naturally monopolistic" industry. Cable delivery is cheaper and more efficient if one company provides the service, and competition among cable companies is virtually nonexistent. Cable service providers are able to choose which channels to include in their packages (with the exception of the local broadcast and public education requirements), creating an opportunity to censor indirectly by exclusion. Because of this threat to content diversity, treatment of cable in federal courts leans toward collectivism.
Current Treatment of the Internet
The Internet is a medium that possesses characteristics of all preceding media. The Supreme Court tends to treat the Internet more like print—the medium that receives the highest protection—though the Internet also operates somewhat like broadcast. Because the Internet is a relatively new medium, little settled legal precedent exists. The FCC has jurisdiction over Internet "lines," at least in the case of telephone companies' DSL service offerings, while cable Internet service lines remain untouched. The FCC has no control over Internet content.
Congress has attempted to regulate indecent content on the Internet in two ways: through the use of an overbroad statute criminalizing the purposeful sending of indecent content to minors, and by allocating federal funds only to libraries employing computer “filters.” The overbroad Communications Decency Act of 1996 (CDA) demonstrates the general belief that the Internet is not as pervasive or invasive as broadcast, since it takes deliberate click-throughs and disregard of multiple warnings to consume indecent material online (Reno v. ACLU 869). The Children’s Internet Protection Act (CIPA) provided funds for libraries employing software to block indecent material. The Supreme Court upheld the statute because, among other things, it provided a “backdoor” for adult library patrons wishing to view content that was caught by the filter by simply asking the librarian to turn the filter off (Shobaki 349). It is on this basis that this paper proposes a new First Amendment test, explored in more detail later.
Despite attempts at indecency regulation and infrastructure oversight, the Internet is treated in an overwhelmingly individualist fashion, that is, the rights of the individual trump all other concerns (Shobaki 349). The Internet is treated as such because of its virtually unlimited capacity (the opposite of scarcity), its cheap and simple mode of access, and its audience reach (many-to-many). Reaching content requires direct action and a certain know-how, and “early Internet cases demonstrate the Court’s reluctance to accept the pervasiveness rationale outside of broadcast” (Shobaki 349).
Technological Trend: Convergence
Broadcast spectrum is no longer scarce, and broadcast is no longer “uniquely pervasive” as it was thirty years ago. Common estimates cite 85% of the American populace as receiving television signals from sources other than over-the-air broadcast (cable or satellite), and subscription rates are likely to increase once American television signals switch from analog to digital on February 17, 2009. Additionally, recent reports show that 99% of Americans live in Zip Codes covered by high-speed Internet providers (“High Speed Services” 5). With such a large segment of the American populace capable of obtaining such services, it would seem that cable and Internet are now as pervasive as broadcast. Even the Supreme Court acknowledges that, at least with cable, “there is little difference between cable and broadcast television when it comes to the effects,” and that it is now just as pervasive and easily accessed by children (“Brief of Petitioner and Intervenor” 55).
The convergence of technologies allows devices to perform functions they were not originally or solely intended to perform. For example, users may now access the Internet, watch television content, or listen to music on their cellular telephones. Media has been trending toward convergence for quite some time; Ithiel de Sola Pool recognized that the “‘convergence of modes’ is blurring the lines between media, even between point-to-point communications” far before the trend was apparent (de Sola Pool 23). Rapid advances in technology have made this convergence trend possible, but, because of the relative sluggishness of the FCC and Congress (burdened by their own bureaucratic policies), outdated case law, and general misunderstanding of rapidly developing convergent trends, realistic and effective technological solutions to indecent speech on convergent media have largely been ignored (Shobaki 346, 350)5.
Whereas in the past broadcast, cable and the Internet have been treated differently due to their physical architectures and business models, the melding of the three media necessitates similar First Amendment treatment. This treatment, explored in more detail later, would take the government out of the regulation process, requiring only a mandate on receiver-based content control mechanisms. The actual regulation of content would be on the receiving end; the source of the content and the content itself would not be affected.
Technological Trend: Receiver-Based Content Control
Technology has advanced to a point where it permeates every aspect of American life. The rapid proliferation of technology has brought indecent content more readily into American homes, but it is also with technology that indecent content can be combated for the well-being of children. Filtering technologies have been employed since the 1990s to fulfill a number of functions: “they can organize information (for example, by classifying it), they can select information, or they can block information” (Balkin 64). Television and computer filters rely on classifications defined by the industry (major broadcast networks, computer operating system manufacturers, etc.) but are implemented on a voluntary basis. The classifications are not based on any consistent standard.
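Balkin's three filter functions named above—organizing (classifying), selecting, and blocking information—can be sketched in a few lines of code. The following is a toy illustration only; the keyword list, category names, and program titles are invented for this sketch and do not correspond to any existing classification standard:

```python
# A toy illustration of the three filter functions: organize, select, block.
# BLOCK_TERMS and the "indecent"/"general" categories are invented examples.

BLOCK_TERMS = {"explicit"}

def classify(title: str) -> str:
    """Organize: assign a crude content category by keyword match."""
    return "indecent" if any(w in title.lower() for w in BLOCK_TERMS) else "general"

def select(titles, category):
    """Select: surface only items in the category of interest."""
    return [t for t in titles if classify(t) == category]

def block(titles):
    """Block: strip out anything classified as indecent."""
    return [t for t in titles if classify(t) != "indecent"]

programs = ["Evening news", "Explicit comedy special", "Nature documentary"]
print(block(programs))               # ['Evening news', 'Nature documentary']
print(select(programs, "indecent"))  # ['Explicit comedy special']
```

The sketch also shows why consistent standards matter: all three functions hinge entirely on the classification step, so an inconsistent or industry-defined `classify` taints selecting and blocking alike.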
A subsection of the 1996 Telecommunications Act did require all new televisions with screens thirteen inches or larger, manufactured after January 1, 2000, to be equipped with the V-chip, which works in conjunction with ratings information to block programs exceeding the rating limits parents set. However, the Act did not lay out any definitional standard for the ratings system and instead left the "TV Parental Guidelines" up to the broadcast industry itself ("Brief of Petitioner and Intervenor" 51-52). The industry recognized an economic and popularity incentive in participating in the ratings system and reaps benefits directly by self-regulating (Samoriski 162).
There is currently no requirement for computers to contain filtering technology, though filters are commercially available to parents concerned about indecent content reaching their children. These filters rely on a complex set of classification standards and tend to block content that may have artistic merit. Another type of filter is employed by search engines like Google or Yahoo that simply sift through excess information to provide the user with more relevant content. This same type of filter blocks spam from e-mail. Again, no coherent definitional standards for classification are provided.
Though current technology is fairly rudimentary and imprecise, it will evolve to the point where it can differentiate between indecent content and art, "junk" and relevant material, and so on. In the future, the FCC would require all entertainment content providers to participate in a ratings program, and would create a new agency in which advocates, industry, and consumers alike could participate in creating classification standards. In this case, the burden of content restriction would be on the receiver, not the source of the content—restricting the source would smack of censorship and is not the "least restrictive means" of protecting a compelling government interest (Samoriski 163). Enhanced filtering technology of the future would "open an avenue that could remove government from the constitutionally disfavored position of having to decide what its citizens are allowed to see and hear" (Samoriski 146).
A New Model: The “Sufficient Backdoor” Test
The current standard by which indecency regulation—or, more generally, content-based regulation—is treated is called “strict scrutiny.” In the future, indecency regulation would be subject to a modified “strict scrutiny” standard. This modified “strict scrutiny” standard preserves the “compelling government interest” requirement and replaces the “narrowly tailored” and “least restrictive means” requirements by evaluating a “sufficient backdoor” by which qualified receivers can counteract the restrictive measures. If indecency regulation of the future requires content to be classified uniformly across all three media, and provides a technological means by which to block access if the receiver so wishes, it must also provide a qualified receiver (in this case, adults) with a sufficient means—a “sufficient backdoor”—to get around the requirement. This can generally be done by entering passwords, PINs, or some other age/identity verification system.
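The mechanics of the test can be sketched as a small, hypothetical receiver-side filter. Every name below (`ReceiverFilter`, `open_backdoor`, the PIN scheme, and the TV ratings standing in for a uniform classification) is an illustrative assumption, not an existing system or API:

```python
from dataclasses import dataclass

@dataclass
class Program:
    title: str
    rating: str  # stand-in for a label under a uniform classification scheme

# Classification labels ordered from least to most restricted (illustrative).
RATING_ORDER = ["TV-G", "TV-PG", "TV-14", "TV-MA"]

@dataclass
class ReceiverFilter:
    max_rating: str = "TV-PG"  # boundary a parent sets for the household
    unlocked: bool = False     # state of the "sufficient backdoor"

    def open_backdoor(self, pin_entered: str, pin_on_file: str) -> bool:
        # A qualified receiver (an adult) counteracts the filter by
        # verifying a credential -- here a PIN; a real system might rely
        # on age or identity verification instead.
        if pin_entered == pin_on_file:
            self.unlocked = True
        return self.unlocked

    def allows(self, program: Program) -> bool:
        # The backdoor, once opened, overrides all blocking.
        if self.unlocked:
            return True
        return RATING_ORDER.index(program.rating) <= RATING_ORDER.index(self.max_rating)

f = ReceiverFilter(max_rating="TV-PG")
show = Program("Late-night drama", "TV-MA")
print(f.allows(show))            # blocked under the child-safe setting
f.open_backdoor("1234", "1234")  # the adult verifies and opens the backdoor
print(f.allows(show))            # the same content is now viewable
```

A real backdoor would of course rest on stronger verification than a four-digit PIN; the point of the sketch is only that the override lives entirely at the receiver, so the content itself, and its source, are never touched.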
Receiver-based content control mechanisms are "narrowly tailored" to the "compelling government interest" of protecting children from harmful material. These filters keep indecent material out by the "least restrictive means" ("Brief of Petitioner and Intervenor" 51). Any excess or misclassified material caught by the imprecision of the filter can be counteracted by simply turning the filter off. In this way, works of art or other relevant material that may have been caught by the filter can still be viewed, at the discretion of a qualified receiver (an adult).
Providing a "sufficient backdoor" preserves the autonomous function of the First Amendment, which posits that "individual autonomy, or liberty…allows individuals to define, develop, and express themselves" (Napoli 34). The concept of autonomy is based on philosopher John Stuart Mill's formulation that "men should be free to act upon their opinions—to carry these out in their lives, without hindrance, either physical or moral, from their fellow men, so long as it is at their own risk and peril" (Mill 53). In this case, qualified receivers must have the autonomy to block certain content (for the benefit of themselves or their children), but also the autonomy to unblock the same content. Receiver-based content control mechanisms also preserve another First Amendment right: they entail "important positive expression in the form of editorial judgment about other content" (Weitzner 215). Most importantly, receiver-based content control does not censor speech; the content remains intact, just as the producer intended it.
The Effect on Content
Television and Internet “filtering at the receiving end…allow[s] constitutionally protected speech to thrive in the marketplace,” as it does not prohibit speech, but rather, enables the receiver to decide which speech to receive (Samoriski 158). Filtering technology does not create a “chilling effect”—the choice not to speak for fear of penalty—on speech; rather, it could “facilitate more explicit, or more reserved programming” that is “more accurately [targeted to] audiences, knowing that programs rated a certain way will only be seen by certain consumer groups” (Samoriski 160).
Industry is likely, at first, to oppose any filter mandates6. Blocking content could mean lost viewership, which would lead to lost revenue; but if content can be targeted more accurately, content originators stand to profit from targeted advertising revenue. Targeted advertising is a trend in the current media environment, a response to the common belief that generic television advertising is no longer effective. More advertising revenue will likely, though not necessarily, translate into higher-quality programming.
Concerns with Requiring Receiver-Based Content Control
The primary First Amendment concern with requiring filters is that blocking content is censorship. It does not matter that there is a “backdoor”; speech is still affected. Of major concern is the source of the classification standards. In this case, a new regulatory agency is mandated by the government but does not include the government. However, industry-created classification standards are suspect as well, because “the use of [filtering] techniques…implies that the standard or rating scheme is not defined by the user, but basically by private organization, which entails the risk of censorship” (Bodard 267). It is also troubling if the industry also manufactures the filter. Microsoft, for example, designed its Windows 95 operating system so that its software could “scan a user’s computer hard disk drive and then transmit a list of the programs back to Microsoft” (Samoriski 163). Thus, industry can potentially control what the receiver sees and invade the receiver’s privacy. It is simply a matter of consumer advocacy to ensure that all citizens, if they so wish, are able to participate in the classification process. The FCC currently works on this principle—despite the relative slowness of any rule-making, to the agency’s benefit, it openly solicits comments on any proposed rules from institutions and citizens alike.
Filtering technologies do raise glaring privacy and surveillance issues, especially considering the steps necessary in actually opening the “backdoor.” Requiring filtering technologies would also require a system of age or identity verification. On the Internet, this might mean entering state-issued driver’s license numbers, or even Social Security numbers. It would create a fear that somehow content viewed by qualified receivers might be monitored, or that personal information inserted into the system could be hacked. However, similar systems are now in place all over the world. Consider, for example, metropolitan transit cards that scan upon boarding a bus or train; these cards ostensibly track a commuter’s every move. There is also a disturbing trend of citizens “giving away” their privacy rights in times of national crises or on social networking sites—the younger generation makes very personal information available on their “profiles.” Privacy concerns, if the pace of technology and social trends continues, will lessen and warrant reexamination at a later time.
Requiring filters also necessitates a major overhaul of media infrastructure. Uniform classification standards would mean uniform filtering technologies for broadcast, cable and Internet; though the three media are becoming one and the same, such interoperability will be costly and time-consuming to implement. The Internet is especially problematic because "the same interconnecting technology that makes the Internet virtually indestructible also makes it difficult to monitor…[t]here is no centralized storage location, control point or communications channel" (Samoriski 155). To require uniform filtering systems would also require meddling with the architecture that makes the Internet uniquely open and anonymous. Cable Internet service providers now routinely build excess capacity into their lines, knowing that demand and data will increase; this practice can be adopted across all media. Though predicting the future of a technology as elusive as the Internet is difficult, technology will eventually store ever larger amounts of verification data in an anonymous fashion.
The simplest concern with filtering technology is simply that it will not work. It is possible that “parents will be unable to use the blocking device,” or that “children will be able to break through and watch the programming anyway,” as children tend to be more tech-savvy than their parents (Balkin 70). A major counteraction is parental education, as “educational efforts, rather than legislative or technological solutions, may be the best defense against protecting certain consumer segments (e.g., children) from objectionable content” (Hoffman 41). An open mode of communication between parents and children about what is appropriate is also key.
Likely Judicial Reaction
Were the mandate ordered today, it would most likely be struck down. The FCC has no jurisdiction over cable or Internet content, and any legislation giving the FCC that power would be complex and overreaching. The judiciary would have to change its definitions of the various media, and invalidate a massive amount of case law. It would also lead to widespread confusion and disagreement over what is acceptable by law. However, the Supreme Court must adjust, just like the FCC and Congress, to technological trends it did not initially foresee or understand. If American citizens can accept convergent technologies so readily, so too can the Supreme Court.
Broadcast, cable and the Internet have historically been subject to disparate First Amendment treatment in federal courts, based on physical and economic characteristics that no longer apply. As the three media converge and technology evolves to the point where users have ultimate control, the Supreme Court should consider all three under the same standard—a modified “strict scrutiny” test that evaluates whether there is a “sufficient backdoor” to content-based regulation.
Indecency, a vague and highly contextual concept, permeates all three media, and threatens to harm America’s children. If the FCC, which currently has jurisdiction over broadcast indecency only, can mandate a new and independent regulatory agency made up of various groups to classify entertainment content in a uniform fashion, then highly evolved receiver-based content control mechanisms (filters) can block—or unblock—objectionable content. The burden of content-based regulation falls not on the government or the content producers, but the receivers of the content, who are free to choose what they want to watch. As long as there is a “sufficient backdoor” for qualified receivers (adults, in this case), content is not censored nor is speech stifled—and the government is spared from overstepping its bounds.
1 It is the intention of this paper to address television broadcasting only, as radio broadcasting is no longer as pervasive as it once was.
2 Station WLBT in Jackson, MS had its license revoked in 1969 and transferred in 1971 due to racist and anti-Civil Rights practices. Licenses can also be revoked for sexual deviancy.
3 For the purposes of this paper, a collectivist view of the First Amendment puts the rights of the group as a whole over the rights of the individual; very generally, collectivists believe that “[l]egislation which abridges that freedom [of speech] is forbidden, but not legislation to enlarge and enrich it” (Napoli 48).
4 Technically, television broadcast signals are free and over-the-air, at least until the DTV transition on February 17, 2009 forces viewers to either purchase new televisions, obtain a digital converter box, or subscribe to a cable or satellite service.
5 With the exception of the V-Chip and Internet filters used by search engines or parents, though their effectiveness is debated and the ratings system on which the technology is based is neither thorough nor consistent.
6 Industry opposed the V-Chip in 1996 until it realized that defining its own regulation scheme, and keeping politicians and parents alike happy, meant more profits.
Ahrens, Frank. “Delays, Low Fines Weaken FCC Attack on Indecency.” The Washington Post 10 Nov. 2005: A01.
Balkin, J.M. “Media Filters and the V-Chip.” The V-Chip Debate: Content Filtering From Television to the Internet. Ed. Price, Monroe E. Mahwah, NJ: Lawrence Erlbaum Associates, 1998.
Barnes, Robert and Ahrens, Frank. “Supreme Court to Review FCC Ban on Profanity.” The Washington Post 18 Mar. 2008: A01.
Black’s Law Dictionary. 2nd ed. St. Paul, MN: West Group, 2001.
Bodard, Katia. “Free Access to Information Challenged by Filtering Techniques.” Information & Communication Technology Law. Vol. 12, No. 3. Oct. 2003.
FCC. “High Speed Services for Internet Access: Status as of June 30, 2007.” Industry Analysis and Technology Division, Wireline Competition Bureau. Mar. 2008.
FCC. “Notices of Apparent Liability and Memorandum Opinion and Order.” In the Matter of Complaints Regarding Various Television Broadcasts Between February 2, 2002 and March 8, 2005. 15 Mar. 2006.
FCC v. Pacifica Foundation. 438 U.S. 726. US Supreme Ct. 1978.
Fox Television Stations, Inc. “Brief of Petitioner and Intervenor.” Fox Television Stations, Inc., et al. v. FCC. Docket No. 06-1760-ag. Ct. of Appeals, 2nd Cir. 2006.
Hoffman, Donna L., Novak, Thomas P. and Schlosser, Ann E. “Locus of Control, Web Use, and Consumer Attitudes Toward Internet Regulation.” Journal of Public Policy & Marketing. Vol. 22, No. 1, spr. 2003.
Hudson Jr., David L. “Indecency Regulation: Beyond Broadcast?” First Amendment Center. 27 Dec. 2007.
Mill, John Stuart. On Liberty. Ed. Shields, Currin V. Indianapolis: Bobbs-Merrill, 1956.
Napoli, Philip M. Foundations of Communication Policy: Principles and Process in the Regulation of Electronic Media. Cresskill, NJ: Hampton Press, 2001.
Reno v. ACLU. 521 U.S. 844. US Supreme Ct. 1997.
Samoriski, Jan H., Huffman, John L. and Trauth, Denise M. “The V-Chip and Cyber Cops: Technology vs. Regulation.” Communication Law & Policy. Vol. 2, Issue 1, wtr. 1997.
Shobaki, Khaldoun. “Speech Restraints for Converged Media.” UCLA Law Review. Los Angeles: Regents of the University of California, 2004.
de Sola Pool, Ithiel. Technologies of Freedom: On Free Speech in an Electronic Age. Cambridge, MA: The Belknap Press of Harvard, 1983.
Sunstein, Cass M. Democracy and the Problem of Free Speech. New York: The Free Press, 1995.
Turner Broadcasting System, Inc. v. FCC. 520 U.S. 180. US Supreme Ct. 1997.
United States v. American Library Association. 539 U.S. 194. US Supreme Ct. 2003.
Weitzner, Daniel J. “Yelling “Filter” on the Crowded Net: The Implications of User Control Technologies.” The V-Chip Debate: Content Filtering From Television to the Internet. Ed. Price, Monroe E. Mahwah, NJ: Lawrence Erlbaum Associates, 1998.