
Obscenity, Community Standards and Internet Filtering

in the Information Age

Colleen Canavan

Info 520

8/25/03

The perceived need to regulate the dissemination of information in order to spare certain individuals from ideas deemed unacceptable can be found as far back as ancient Greek civilization, when Plato urged the suppression of “indecency” in the creative arts and called for the censorship of writers (Heins, 2001, p. 3). Today, our society still grapples with defining constitutionally protected speech, and the definition of and laws regarding obscenity are issues that the United States has revisited continually in recent decades. This paper begins with an exploration of the definition of obscenity in the United States, provides a historical overview of the laws that have shaped our current definition of what is legally considered obscene, and exposes problems relating to the “community standards” aspect of that definition. It then explores how libraries are affected by obscenity law in the current information age, with specific focus on the controversy surrounding Internet filtering in public libraries.

What is Obscenity?

When confronted with the concept of obscenity, individuals may be inclined to consider obscenity as synonymous with pornography. The words “pornography” and “obscenity” are frequently interchanged in lay communication; they are, however, two distinct terms for material that some individuals may consider indecent. The word “pornography” derives from the Greek “porne” (harlot) and “graphos” (writing), and is defined in Webster’s Third New International Dictionary as “a description of prostitutes or prostitution, a depiction of licentiousness or lewdness, or a portrayal of erotic behavior designed to cause sexual excitement.” Pornography describes sexually explicit material, but it is not a legally defined term. Obscenity, on the other hand, is legally defined; however, the legal definition is purposefully vague. The current definition is based on what is commonly referred to as the Miller Test, which stems from the 1973 Supreme Court case Miller v. California (discussed later in this paper). For material to be found obscene, it must meet all three prongs of this test. Material that is found to be obscene is not protected under the First Amendment.

There has been a resurgence of interest in the definition of and laws regarding obscenity since the advent of the Internet. Because of new information technology, words and images spread faster and more widely than ever before, and although much of the information found on the information superhighway is of legitimate social value, there is a large amount of content that even the most liberal-minded would consider to be of questionable value. As is clear to most Internet users, pornographic websites are abundant and available for anyone to view. J. Robert Flores of the National Law Center for Children and Families states, “the pornography industry has… become among the most aggressive marketers on the Internet…. Today only a lucky few are able to avoid [pornographic material]” (Clancy, 2002, p. 50). In addition, Forrester, an American research company, estimated that in 1998 the commercial Internet market for pornography was nearly $1 billion, about ten percent of the amount spent on e-commerce (Clancy, 2002, p. 50). Clearly, the online pornography industry is flourishing. Although some find its growth alarming, it is important to remember that pornographic material is not illegal unless it is found to be obscene under the Miller Test. With the online proliferation of material of questionable legality, it is in our interest to be familiar with the recent history of obscenity law.

Obscenity Law Cases

Early in our nation’s history, some states found it necessary to put obscenity laws into place before any such law existed at the national level. In 1712, Massachusetts made it illegal to publish “any filthy, obscene, or profane song, pamphlet, libel, or mock sermon in imitation of religious services,” and the first obscenity prosecution occurred in Pennsylvania in 1815. In 1824, Vermont passed the first law making the sale or publication of “lewd or obscene material” illegal (Coglianese, 1995, Part III, para. 2). Later, many states adopting such laws used the definition of obscenity found in the 1868 English case of Regina v. Hicklin, which defined obscene material as that which tends to corrupt the minds of individuals who are open to immoral influence (Morgan, J., 2000, p. 52).

The first obscenity case decided by the U.S. Supreme Court, however, did not occur until 1957, when the Court ruled that obscenity is not constitutionally protected speech (Coglianese, 1995, Part III, para. 3). In Roth v. United States, the defendant was charged with mailing sexually explicit materials to others. In this case, the Supreme Court adopted a test under which material was to be found obscene if, “to the average person, applying contemporary community standards, the dominant theme of the material taken as a whole appeals to prurient interest” (Morgan, J., 2000, p. 52). Nine years later, in Memoirs of a Woman of Pleasure v. The Attorney General, the Supreme Court refined its definition of obscenity by adding the requirement that the material in question also be “utterly without redeeming social value,” making obscenity cases more difficult to prosecute (Morgan, J., 2000, p. 52). It was the 1973 case of Miller v. California, however, that put forth the definition of obscenity as we now know it. In that case, defendant Marvin Miller had been convicted in California of violating the state’s obscenity statute when he mailed unsolicited brochures containing sexually explicit words and pictures from his mail-order pornography company to a restaurant in Newport Beach (Morgan, J., 2000, p. 53). In its ruling, the Supreme Court repudiated the “utterly without redeeming social value” clause it had established in Memoirs (Coglianese, 1995, Part IV, para. 2) and formulated the three-part test for obscenity that is used today. According to the Miller Test, a jury must decide, with regard to the material in question,

  1. whether “the average person, applying contemporary community standards” would find that the work appeals to prurient interest,
  2. whether the work depicts or describes sexual conduct in a patently offensive way, and
  3. whether the work lacks serious literary, artistic, political, or scientific value (Morgan, J., 2000, p. 53).

Critics of the new definition feel that the Miller decision broadened the states’ ability to attempt to ban all pornography, since it is now considered easier to prove that the Miller standard has been met than the standard put forth in Memoirs (Coglianese, 1995, Part IV, para. 3).

In the late 1990s, due to the Internet’s explosive growth and popularity as a means of communication and commerce, the government of the United States sought to enact new laws regarding obscene materials on the Internet. The Communications Decency Act (CDA), part of the Telecommunications Reform Act of 1996, criminalized the transmission of “obscene, lewd, lascivious, filthy or indecent” materials (Friedman, 1996, Part V, 1, para. 1), with penalties of up to two years in prison and $250,000 in fines (Heins, 2001, p. 3). Nebraska Senator Jim Exon drafted the CDA with the goal of “making the new Internet and information superhighway as safe as possible for kids to travel” (Friedman, 1996, Part V, 1, para. 1). Although the aim of Congress was to promote the expansion of telecommunication technology, the law came to be viewed as an instrument of Internet censorship (Rappaport, 1998, The Communications Decency Act of 1996, para. 1). The American Civil Liberties Union immediately challenged the Act, and in the 1997 case Reno v. ACLU the CDA was found to be unconstitutional. For the Act to have passed constitutional muster, the government’s interest needed to be compelling and the means for accomplishing its objectives needed to be the least restrictive available. The Court ruled that although the protection of children was a compelling interest, less restrictive means of blocking obscene material from children, such as filtering software, were available (Morgan, J., 2000, p. 58). A second attempt by Congress to introduce a similar law occurred in 1998 with the Child Online Protection Act (COPA), part of which mandated that owners of commercial sexually oriented websites verify that visitors to their sites were of legal age. One year later, a federal judge enjoined COPA because it abridged adults’ access to constitutionally protected materials (Morgan, J., 2000, p. 57).

Community Standards in the Internet Age

The “community standards” clause of the Miller test for obscenity has been a focus of heated debate since the Miller ruling in 1973. Proponents of the test feel that local community standards are just and workable, while much of the opposition to the Miller test is based on the wrongfulness of local majority rule over individual access to information. In fact, Coglianese goes so far as to say that “having local community standards allows the majority to decide what is acceptable or not acceptable. The First Amendment does not have to be part of the Constitution if it only protects what the majority finds acceptable” (1995, Part IV, para. 16). The emergence of the Internet has further complicated the issue of what constitutes a community, and who sets the community standards. For example, some states define community as the whole state, while others define community by county. Some states, such as Oregon, have no laws against obscenity (Coglianese, 1995, Part IV-B, para. 2). Additionally, many argue that the Internet is a community unto itself, and cannot fit into any existing definition of community because it has no geographical boundaries.

The 1996 case of United States v. Thomas shows how the “community standards” clause of the Miller test can cause concern in the Internet age. The defendants in the case, Robert and Carlene Thomas, owners of a California-based amateur pornography bulletin board service (BBS), were convicted of transporting obscene materials via interstate phone lines after a Postal Inspector in Tennessee received sexually explicit images from their BBS (Morgan, J., 2000, p. 53). The importance of the case was that it was tried in Tennessee, where the images were received, and not in California, where the BBS was based. The defense claimed that there would be “an impermissible chill on free speech” since a BBS operator does not control who can receive materials on the BBS or from where (Friedman, 1996, Part III-C, para. 4); however, the Court upheld the conviction on appeal because the Thomases had required applications from their subscribers, and therefore knew where their subscribers were located and could have denied access to their BBS based on applicants’ locations. According to the ruling in Thomas, online offenses could presumably be prosecuted anywhere in the U.S., since Internet access is available nationwide, and many fear that “venue shopping” may become a common occurrence, with offenses prosecuted in the areas with the strictest community standards (Friedman, 1996, Part III-C, para. 4). In addition to the threat of venue shopping, the Due Process clause of the Fourteenth Amendment, which entitles defendants to fair notice of where they may be prosecuted, could be jeopardized. J. Morgan stated in 2000, “if a defendant can be prosecuted in any jurisdiction in which Internet access is available, Due Process may be denied” (p. 55).

As a result of the problems associated with differing community standards throughout the US, the concept of a national community standard has been discussed. According to Morgan:

A national community standard eliminates the lowest-common denominator effect and forum-shopping by federal prosecutors. Therefore, vendors would no longer need to determine each local community standard nor restrict the content of their material to what the least tolerant community would deem “fit”. (p. 53)

Although such a proposal would alleviate some of the legal concerns of vendors of pornographic materials and better safeguard defendants’ Due Process rights, a national community standard would be extremely difficult to define due to the diverse levels of tolerance in the US, a diversity that it would be in our best interest to maintain. In fact, the Supreme Court noted in Miller that setting a national standard would be unrealistic, since residents of Maine and Mississippi may have very different tolerance levels than residents of New York or California (Friedman, 1996, Part IV, para. 7). Furthermore, it is in the best interest of local governments to allow the majority of the local community to have some rights against the national majority (Bass, 1996, Part II-B, para. 4).

In addition to the problems noted above, the concept of “community standards” in the Information Age poses risks to the First Amendment. As noted by Coglianese:

It is futile to have every hamlet in this country decide what it finds acceptable on the Internet and resolve the question in favor of the most repressive answer. This turns the purpose of the First Amendment on its head. Instead of protecting minority rights from an encroaching majority, the majority is given constitutional protection to remove the freedom of speech from the minority. (1995, Part VI, para. 12)

The problem with the community standards clause of the Miller test as applied to the Internet is clear: the Internet has no definable geographical boundaries, which are intrinsic to the definition of community put forth by the Supreme Court. As Friedman states:

when the supreme court decided in 1973 that local communities should set their own standards for what constitutes obscenity, the justices probably never dreamed of the day when computer networks would make it possible for someone in California to connect instantaneously, both verbally and pictorially, with someone in Tennessee. (1996, Part IV, para. 8)

The Supreme Court, in future cases, must reexamine the community standards clause in obscenity cases in light of current and future technology. In fact, the argument of the dissenting opinion in Miller was that the local community standard was “unconstitutionally vague and would lead to the dispensing of unevenhanded justice” (Morgan, J., 2000, p. 54), an argument worth taking a second look at as more and more people gain access to and seek information on the Internet. Users of the Internet contribute to a more diverse community than any that can be found within geographically prescribed limits: a community, according to Morgan, “in which people communicate instantaneously as if they lived next door to each other [and where] the proper community standard should be one where the members … can themselves define obscenity” (p. 54).

Library Internet Filtering

With the current proliferation of sexually explicit websites available for Internet users to view (intentionally or not), public libraries that offer Internet access must contend with the question of how best to serve their patrons while limiting the amount of illegal material that may pass through their networks. Again, it is important to emphasize that not all pornographic material is considered obscene, but variations in tolerance of such materials put public libraries in a position where individual libraries can and will be held accountable for obscene (and therefore illegal) words and images that patrons access at their computer terminals. The American Library Association is staunchly opposed to censoring constitutionally protected speech; its Library Bill of Rights, which states that “libraries should not exclude materials because of their origin, background, or views of those contributing to their creation” and that “libraries should provide materials and information presenting all points of view,” is open to broad interpretation, but it can be assumed that these principles apply only to constitutionally protected speech. However, as we have seen, the definition of obscenity varies from locale to locale, and an attempt to define obscenity on the Internet as a whole is therefore a nearly impossible task. Due to this obscenity quandary and considerations regarding children’s access to objectionable material, debate has surfaced in recent years over the issue of Internet filtering in public libraries. Gardner, in 2002, laid out three general philosophies regarding filtering:

  1. It is better to filter and delete useful information in order to protect children and/or prohibit the distribution of certain types of information and/or viewpoints.
  2. It is better not to filter in order that people have the information they need, that they can decide what they want to access and that children can learn appropriate behaviors when dealing with the online world.
  3. It is better to allow each person to decide if they want filtered access. (p. 46)

Proponents of filtering feel that it is necessary to filter questionable material from library networks in order to protect minors from accessing pornographic and/or obscene material, and that library Internet use policies alone are not strong enough enforcement. For example, a study of the Greenville County, South Carolina, Library Board found that patrons were inadvertently exposing other patrons, including children, to pornography and obscenity, and that library staff were unable to enforce the time limits set in their Internet use policy (Clancy, 2002, pp. 52-54). Clancy, who is Senior Counsel with the National Law Center for Children and Families and an advocate of Internet filtering, quotes J. M. LaRue of the Family Research Council to bolster her pro-filtering argument. According to LaRue:

In addition to the many other serious problems caused by the proliferation of hard-core pornography in our country, its accessibility via the Internet is turning America’s public libraries into virtual peep shows open to the children and funded by taxpayers. This is primarily due to failure of the Department of Justice to enforce federal obscenity laws. (p. 50)

Advocates of filtering technology have the well-intentioned goal of shielding children from pornographic material. However, the arguments against filtering are far more persuasive to any individual who holds his or her First Amendment rights in high regard. Filtering imposes one viewpoint on the patrons of a library, and in doing so “violates and negates the spirit of [our] country” (Gardner, 2002, p. 46). The ALA opposes the use of filtering technology in libraries, and in 1997 adopted a resolution stating that “the use of filtering software in libraries to block access to constitutionally protected speech violates the Library Bill of Rights” (Bastian, 1997, Introduction, para. 4). In the 1998 case of Mainstream Loudoun v. Loudoun County Library, a federal district court ruled that the filtering policy used by the Loudoun County library system was unconstitutional. The court found that the filtering policy in place was over-inclusive in that it blocked non-offensive material, and that less restrictive measures could have been put in place by the libraries, such as filtering only the Internet terminals used by minors. Additionally, the court expressed concern that the library had abdicated its role in the selection of material to a software filtering company (Morgan, J., 2000, p. 58). In fact, a large part of the anti-filtering argument addresses the problems inherent in using third-party software to filter Internet content, when librarians have historically been the selectors of all materials found within libraries. As Crawford stated in 2002:

Filtering software does not mirror collection development policies, since libraries have no say in what is blocked. Filtering is like buying a magazine or encyclopedia and tearing out pages, except that someone else tears out pages and won’t tell you which ones or why. (Some Problems with CIPA, para. 2)

Commercial filtering software blocks a large amount of constitutionally protected speech, for children as well as adults. When keyword-blocking software is used, for instance, information is blocked on topics such as breast cancer or Essex County, while when host-filtering software is used, whole sites devoted to specific organizations, such as those dealing with women’s issues, are blocked (Bastian, 1997, Filters, para. 3). In fact, government witnesses have admitted that filters wrongly block up to 15 percent of sites, or more (Crawford, 2002, Some Problems with CIPA, para. 3). Clearly, Internet filtering is a poor remedy for keeping only constitutionally unprotected speech off of library networks, and the “community standards” clause regarding obscenity cannot be met constitutionally by the filtering software approach (a brief sketch of keyword over-blocking follows Gardner’s observation below). As Gardner notes:

the most ironic part of the filtering debate is that due to technological advances, we now have access to more of the total human information product than ever before. The barriers of oceans, mountains, time and money have all been torn down and in their place, people who feel that they know what others should and should not access have become barriers themselves. (p. 46)
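The over-blocking Bastian describes follows directly from how naive keyword blocking works. The sketch below is purely illustrative: the keyword list and page titles are hypothetical and are not drawn from any actual commercial filtering product. It simply shows how substring matching sweeps legitimate health and geographic content into the same net as pornographic content.

    # Illustrative sketch of naive keyword blocking (hypothetical keyword list
    # and page titles; not taken from any real filtering product).
    BLOCKED_KEYWORDS = ["sex", "breast", "porn", "xxx"]

    def is_blocked(page_title: str) -> bool:
        """Return True if any blocked keyword appears anywhere in the title."""
        lowered = page_title.lower()
        return any(keyword in lowered for keyword in BLOCKED_KEYWORDS)

    pages = [
        "Breast cancer screening guidelines",      # legitimate health information
        "Essex County public library hours",       # place name containing "sex"
        "Middlesex community college admissions",  # another innocent match
        "Hardcore pornography gallery",            # the intended target
    ]

    for title in pages:
        print(("BLOCKED" if is_blocked(title) else "allowed") + ": " + title)

    # All four titles are blocked: substring matching cannot tell a medical
    # term or a place name apart from the material it is meant to catch.

Host-based filtering avoids such accidental keyword matches, but it introduces the opposite problem noted above: an entire site is blocked or allowed on the basis of a third party’s list, with no regard for individual pages of legitimate value.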

The use of filtering software, as it stands today, is not an adequate means of filtering only constitutionally unprotected speech. The best means of filtering only obscene material from library networks would be to have libraries themselves be the gatekeepers of their Internet access. If libraries could determine what is considered obscene according to their own communities, and block only the websites that fall under that category, they might be able to comply with the “community standards” aspect of the Miller test while also protecting intellectual freedom. In this vein, Bastian states:

If … ALA could assist in defining those constitutionally unprotected sites, perhaps setting standards for identification, libraries could keep within the bounds of First Amendment rights while at the same time proclaiming their own rights, as librarians, to make choices in the material that is accessed in the public library.

The concept of librarians controlling the information that library patrons access on the Internet is theoretically the most promising answer to the filtering debate. Librarians could select information appropriate to their libraries in the same manner in which their collection development policies are enacted. However, it is obviously far more challenging to select materials for patrons to view via the Internet, due to the vast amount of material available and the rate at which this material changes. In addition, the manpower necessary for such a feat would far exceed the limited resources that libraries have today, especially in the current economic climate, in which public library funding is being slashed at alarming rates. Unfortunately, it is not economically feasible today for libraries to be the gatekeepers of information available via the Internet.
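To make the contrast with commercial filters concrete, the following is a purely hypothetical sketch of what such local gatekeeping could look like in software terms: the library maintains its own explicit, reviewable list of hosts judged obscene under its community’s standard, rather than licensing an opaque third-party word list. The host names are placeholders, not real sites, and the scale problem described above is exactly why this remains a sketch rather than a practical solution.

    # Hypothetical sketch of a librarian-curated blocklist: an explicit list of
    # hosts reviewed and blocked under the local community standard.
    # Host names below are placeholders, not real sites.
    from urllib.parse import urlparse

    LOCALLY_BLOCKED_HOSTS = {
        "example-obscene-site.test",  # reviewed and blocked by local librarians
    }

    def allowed_by_library(url: str) -> bool:
        """Allow everything except hosts the library itself has reviewed and blocked."""
        host = urlparse(url).hostname or ""
        return host not in LOCALLY_BLOCKED_HOSTS

    print(allowed_by_library("http://example-obscene-site.test/page1"))   # False: locally blocked
    print(allowed_by_library("http://essexcountylibrary.example/hours"))  # True: no keyword guessing

The design difference is transparency and local control, yet the labor of reviewing even a fraction of the web at the pace it changes is precisely the economic barrier noted above.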

Conclusion

The definition of obscenity has changed throughout history, and will undoubtedly continue to change as societies evolve. Just as we may scoff at the censoring of “erroneous representations” of gods that Plato found necessary (Heins, 2001, p. 3), future societies may find our current law regarding obscenity crude and unworkable. Certainly, more obscenity trials will make their way to the Supreme Court in upcoming years, and the Court will have to contend with its current test of obscenity as applied to the Internet community. Courts today continue to struggle with the shortcomings of the Miller test in the information age, and it is arguable, according to Friedman, that “the very nature of computers requires new laws which pay greater attention to the technology’s effect on the issues, and not the issues themselves” (Friedman, 1996, Part I, para. 2). Indeed, many would argue that our problem today is that of trying to fit a new communication medium into existing law that cannot adequately cope with new technological advances.

Until obscenity law is revisited in terms of the Internet, libraries will continue to face the daunting challenge of balancing access to constitutionally protected speech with the filtering of obscene, and therefore illegal, material from their networks. Complicating this balance is the recently enacted Children’s Internet Protection Act (CIPA), which requires public libraries that receive federal funding to adopt Internet filtering software. In accordance with CIPA, public libraries and schools must block adults’ access to visual depictions of obscenity, and minors’ access to visual depictions of obscenity as well as other depictions that are “harmful to minors” (CIPA Questions and Answers, 2003). In regard to CIPA, the ALA is quick to note that no filtering technology available today will block all illegal content while allowing access to constitutionally protected materials. Unfortunately, our society is faced with one more situation in which our ability to cope ethically and legally with ever-advancing technology is severely lacking; addressing today’s problems with yesterday’s information is a fault of which our society has been guilty for far too long. For our society to advance ethically and to remain in accordance with the constitutional protections that our founders carved out for us, the Supreme Court of the United States must revisit the “community standards” clause it put in place thirty years ago, with the understanding that what defines a community has changed enormously in recent years. Hopefully, libraries will benefit from such a revisiting of obscenity law and will be able to further their role as repositories for all constitutionally protected speech.

References

American Library Association. Library Bill of Rights. Retrieved June 30, 2003, from http://ala.org

Bass, T.S. (1996). Obscenity in cyberspace: Some reasons for retaining the local community standard. 1996 The University of Chicago Legal Forum 471. Retrieved August 20, 2003 from Lexis-Nexis database.

Bastian, J. A. (1997). Filtering the Internet in American public libraries: Sliding down the slippery slope. First Monday. Retrieved July 2, 2003, from http://www.firstmonday.dk/issues/issue2_10/bastian

CIPA Questions and Answers. Retrieved August 14, 2003, from http://ala.org

Clancy, C. A. (2002). Libraries in the cyberage – filtering, censorship and the First Amendment: Libraries at the crossroads. Against the Grain, 14(4), 46-59.

Coglianese, R. N. (1995). Sex, bytes, and community entrapment: The need for a new obscenity standard for the twenty-first century. 24 Capital University Law Review 385. Retrieved August 20, 2003 from Lexis-Nexis database.

Crawford, W. (2002). The real CIPA issues – and a friendly challenge. American Libraries, 33(3), 83.

Friedman, M. L. (1996). Keeping sex safe on the information superhighway: Computer pornography and the First Amendment. 40 New York Law School Law Review 1025. Retrieved August 20, 2003 from Lexis-Nexis database.

Gardner, C. (2002). Libraries in the cyberage – filtering, censorship and the First Amendment: Libraries at the crossroads. Against the Grain, 14(4), 43-46.

Heins, M. (2001). Not in front of the children: “Indecency”, censorship, and the innocence of youth. New York: Hill & Wang.

Morgan, C. D. (2001). Liability for obscenity and pornography in providing Internet access in the library. Library Administration and Management, 15(1), 17-19.

Morgan, J. (2000). Obscenity on the Internet: A challenge to intellectual freedom in the United States. Libri, 50, 51-59.

Rappaport, K. L. (1998). In the wake of Reno v. ACLU: The continued struggle in western constitutional democracies with Internet censorship and freedom of speech online. 13 American University International Law Review 765. Retrieved August 20, 2003 from Lexis-Nexis database.