


Lesson 77 - Dispute Resolution in Cyberspace

Introduction

Freedom of Speech in Cyberspace from the Listener's Perspective: Private Speech Restrictions, Libel, State Action, Harassment, and Sex

Introduction

I. Edited Conferences
A. Electronic Conferences and Their Hazards
B. The Right to Edit
1. The Right to Exclude Content
2. The Right to Exclude Speakers
3. Possibly Permissible Requirements
C. Editing from the Listener's Perspective
D. Defamation Liability
E. Edited Conference Groups on Public Computers or Run by Public Employees
1. No Constitutional Barriers to Editing
2. No Constitutional Right to Edit
II. Government Protection of Listeners Against Offensive Messages
A. Protecting Some Listeners Without Burdening Other Listeners
B. Telephone Harassment Laws
C. "Electronic Harassment" in Electronic Conferences
D. Hostile-Environment Harassment
E. One-to-One Online Harassment
F. The Continued Unwanted Contact Model
III. Sexually Explicit Material and Minors
A. The Potential Restrictions
B. The Least Restrictive Alternative Requirement
C. Ratings
1. The Clean List/Dirty List Models
2. The Ratings Model
3. Enforcing the Ratings System
4. Why This is at Least an Equally Effective Alternative
5. Community Standards
Conclusion

INTRODUCTION

Speakers' desires are fairly simple: generally, they want more listeners. But listeners don't just want more speakers talking to them. Listeners want more control over their speech diet: a larger range of available speech coupled with greater ease of selecting the speech that's most useful or interesting to them.

The success of the new electronic media in the "marketplace of marketplaces" of ideas - where information providers compete for that scarcest of resources, the attention span of modern man - will turn on how well they can satisfy listeners' desires. The new media have one significant advantage: they can give listeners many more choices. But for listeners, that's not enough. For listeners, what the new media omit - time-wasting junk, insults, material that might be harmful to their children - is just as important as what they include. Listeners care about this outside the online world, and they care about it just as much online.

In the following pages, I will discuss three categories of online speech issues and look at them partly, though only partly, through the lens of the listeners' interests:

1. Edited Electronic Conferences: One of the most significant features of the new media is the interactive electronic conference - bulletin board, newsgroup, discussion list, or the like. People who listen in on these conferences (and most participants spend much more of their time listening than speaking) want speech that's relevant to their interests, readable, reliable, and not rude. Sometimes an open, unedited electronic conference can provide this, but often it can't. Often - as conference operators have been learning - editing is critical to making online speech worth listening to.

At the same time, editing is content control, the sort of thing that, if the government did it, would be called "censorship." It includes limitations on who may speak, removal of people who speak badly (in the editor's opinion), the deletion of inappropriate messages, and automatic screening of messages for profanities. Many have expressed concern about this sort of private speech restriction.

In Part I, I will defend the propriety of private, nongovernmental content control on electronic conferences. I'll argue that this sort of editing doesn't violate the Constitution, that it is itself constitutionally protected, and that it is often valuable to listeners.

2. Avoidance of Offense: Listeners don't want to hear material that offends them. This doesn't mean they only want to hear what they agree with; controversy is usually more fun than agreement. But some speech is offensive enough that its emotional cost to the listeners can exceed the informational benefit they derive from the conversation.

No one likes to be personally insulted. No one likes to hear one's race, sex, religion, or deeply held moral beliefs rudely attacked. Often, we're discomfited even by watching others argue rudely with one another. Some speech like this can be annoying; some can ruin one's mood for hours. People don't go to parties where they think it likely that the other guests will be rude to them; neither do they want to participate in electronic conferences where this happens.

In Part I, I'll argue that private editing is an important tool for giving people the opportunity to interact in the polite environment they may prefer. In Part II, I'll discuss what the government may do to protect people from speech that offends them when private editors can't or won't edit it out. I'll suggest that:

In general, the government ought not be able to restrict offensive speech in electronic conferences (unless it's a threat or falls into some other Free Speech Clause exception). Some telephone harassment laws, and possibly some aspects of hostile environment harassment law, seem to already impose restrictions on online speech, but to the extent they do, they're unconstitutional. On the other hand, the government should be more able to restrict one-to-one speech - such as personal e-mail - that's aimed at unwilling listeners; such restrictions protect unwilling listeners while still leaving speakers able to communicate with willing ones. The best way to implement such a restriction would be to let listeners demand that a speaker stop sending them direct e-mail, a power people already enjoy with regard to normal mail. Such a rule would be better than direct extensions of telephone harassment laws, which often embody dangerously vague prohibitions on speech that's "annoying" or "harassing."

3. Giving Parents Control Over Their Children's Access: Finally, online as well as offline, parents are concerned about their children gaining access to sexually explicit materials; and, online as well as offline, the question becomes how the law can restrict children's access without also restricting the access of willing adult listeners. In Part III, I'll suggest that:

- Some laws may already prohibit the online posting of nonobscene sexually explicit material that might be "harmful to minors" (a term of art described below).

- These laws have been upheld in the offline world, partly because they've been seen as not imposing much of a burden on adults who want to get the material. Online, though, where it's hard to tell who's a child and who's not, these laws are much more burdensome to adult viewers.

- These laws ought to be unconstitutional online because there's a less speech-restrictive alternative which could protect children while maximizing the choices available to adult listeners: a self-rating system that would identify which images or discussions are sexually explicit. This approach will still impose something of a burden on speakers and adult listeners, but this burden should be constitutionally permissible.

In focusing on listeners, I don't mean to suggest that listeners' rights are generally more important than the rights of speakers. After all, the Free Speech Clause guarantees "the freedom of speech," and much of the Court's doctrine has - in my view correctly - protected speakers, even where most listeners might object to what the speakers are saying. But this very emphasis on the rights of speakers can lead people to ignore the rights of listeners, rights the Court has also recognized. And worse still, focusing exclusively on the rights of speakers can make us ignore how critical listener satisfaction can be to the survival of the new media. If we think the new media can be valuable tools for public discourse, it's worth trying to make sure that the law doesn't make them unattractive to the listening public.

I. EDITED CONFERENCES

A. Electronic Conferences and Their Hazards

Listeners want to hear more of what interests them and less of what doesn't. This becomes especially important for electronic conferences, some of the biggest speech activities on the infobahn. These conferences are electronic "places" where people from all over the world can communicate with one another on a particular topic, from the law of government and religion to Jewish issues in Star Trek. Electronic conferences can be organized as Internet discussion lists; as Internet newsgroups; or as special dial-in services, ranging from the big public ones like Prodigy, America Online, and CompuServe, to the smaller and more specialized ones like Counsel Connect, to single-PC bulletin boards that may have only a few hundred subscribers.

Regardless of implementation, though, these conferences are means by which each of hundreds or thousands of participants can talk to all the others, and will have to listen to what the others have to say. The conferences are like faculty or law firm symposia, where everyone present can speak (though not all at once) and comment on what everyone else has said. They are, however, symposia that go on continuously, and that can include hundreds of people who've never physically met one another. And, as in a symposium, though everyone has an opportunity to speak, the overwhelming majority of participants are "lurkers," people who only listen.

The advantage of these conferences over the traditional media is their openness and interactivity; but this is also their great risk. An electronic conference is a compilation of the messages (or "posts") written by all its participants, and as with any compilation, its value lies both in the substance of the materials it contains and in their selection. As with other compilation media, such as radio programs, magazines, or live conferences, people look for high ratios of wheat to chaff (or, as computer people say, "signal to noise"): electronic conferences in which they find a large fraction of the speech to be interesting.

I've seen many users turn away from electronic conferences, even conferences on topics that interest them, because the signal-to-noise ratio was too low. Conference users have limited time, and messages that are irrelevant to the conference topic, messages from people who don't know what they're talking about, and messages that are repetitive all make the conference less valuable. People want conversations of higher quality than talk radio.

Conference users also want an emotionally congenial environment. It may be a pleasure to listen to people discuss an issue civilly, but a strain to listen to them yell at each other. Even if the intellectual content is the same, the tone of the speech can be a serious burden. Few physical conferences, for instance, invite speakers who insult one another. Many newspapers refuse to print certain profanities. We don't go to clubs or parties where we know boors are likely to be declaiming; similar online conduct can make an electronic conference much less valuable for us. One bad apple on a discussion list can spoil many people's enjoyment. And as more people get online and begin to use online resources, the risk of information overload and of the occasional rude participant escalates.

Most electronic conferences try to keep both the signal-to-noise ratio and civility high by moral suasion. Each conference has an official topic; people generally know not to stray too far from it, and if they do, others might ask them to "take it off-list" - continue the discussion in personal e-mail rather than on the electronic conference. When people start getting rude, others might chime in to quiet everyone down. Most conferences have conference operators who are in charge of the technical details of conference administration; they often also take responsibility for informally keeping everyone in line.

Increasingly, though, conference operators have begun to edit more coercively, in several ways:

1. They may limit who can access their conference. Counsel Connect, for instance, is limited to lawyers; the LAWPROF Internet discussion group is limited to law professors. The theory is that those are the people who are most likely to contribute something valuable to the discussion. List operators have discretion, of course, to waive these rules - either in favor of admission or of exclusion - in particular cases.

2. They may kick out troublemakers, people who have proven to be consistently off-topic, or rude, or kooky - as always, in the operator's judgment.

3. They may automatically filter all messages to exclude particular words, such as profanities.

4. They may manually screen each message that's sent to the conference before actually passing it along to all conference participants. This can, of course, be a time-consuming process, though for many conferences it might not be prohibitively so.

And, as with newspapers and physical conferences, editorial decisions make a big difference in the conference content on three different levels. First, editing decisions directly dictate what can be said. A conference that's open to everyone will be different from one that's open only to law professors. A conference that allows militant and even rude debate will be different from one that requires more gentility. A conference on, say, religious freedom that doesn't allow posts which rely on explicitly religious foundations (on the theory that such posts are likely to distract discussion into unresolvable theological arguments) will be different from one that takes another approach.

Second, editing one post can cut short a whole discussion. Because conferences are interactive, one post leads to others. Excluding one rude message may avoid dozens of messages responding to it, responding to the responses, commenting on the merits of polite debate over rude debate, and so on.

Third, including or excluding certain messages will change who participates. Some people will get involved in polite discussions - both speaking and listening - but will turn away if the discussion turns nasty. Others will participate only so long as there are comparatively few posts that they see as irrelevant. A law professor might be willing to read a law professors-only conference but not be interested in a general public conference, or vice versa.

B. The Right to Edit

1. The right to exclude content.

Where there are editors, there are speakers who resent being edited out. Prodigy, for instance, has been criticized for its editing practices, which have at times included automatically screening out profanities, deleting messages that denied the existence of the Holocaust, deleting messages criticizing Prodigy's pricing policies (and urging a boycott of Prodigy), and kicking off users who persisted in posting these messages. Similarly, operators of Internet discussion lists who try to restrict what's said on the list, and by whom it's said, can expect a good deal of resistance.

Private conference operators clearly don't violate the Constitution by editing, just like newspaper editors don't abridge free speech by refusing to publish a letter to the editor. Some have argued that private providers' use of government-funded or government-regulated Net backbones makes them state actors and thus bound by the First Amendment; this, though, is certainly not so under existing state-action doctrine.

But editing is not only constitutional, it is constitutionally protected. A law that would, for instance, prohibit conference operators from screening messages, or even allow screening for relevance but prohibit screening for viewpoint, would violate the First Amendment. As the Court held in 1974 in Miami Herald Publishing Co. v Tornillo and recently reaffirmed in Hurley v Irish-American Gay, Lesbian and Bisexual Group, the freedom to speak includes the freedom to create one's own mix of speech. A parade, a magazine, the playlist of a music radio station, and an edited electronic conference are all speech products created by an editor. "Rather like a composer, the [editor] selects the expressive units of the [publication] from potential participants" to create a particular work. Restricting the editor's right to edit would be restricting his right to create a particular speech product: it would make illegal the production of certain speech mixes and require instead the production of others.
"The choice of material to go into a [publication], and the decisions made as to limitations on the [publication's] size and content . . . constitute the exercise of editorial control and judgment," and this editorial judgment is constitutionally protected from government interference. This is the rule for newspapers and parades, and it should be the same for electronic conferences.

Of course, many conference operators don't have a specific ideological perspective they want to communicate. They may just want to spread information about a certain topic and may exclude material only because they think it's irrelevant, impolite, or inaccurate, not because it clashes with their viewpoint. But "a narrow, succinctly articulable message is not a condition of constitutional protection." The decision by the St. Patrick's Day parade organizers to exclude one group was protected even though the remaining parade wasn't communicating much by way of a specific ideology. The editorial choices of a nonpartisan newspaper that assiduously tries to avoid all political bias are as protected as other papers' choices. The editing decisions of a conference operator are equally a conscious attempt to create a speech product with a particular content; their claim to being protected exercises of editorial judgment is as strong as that of the decisions of the newspapers or parade organizers.

Nor should it be relevant that electronic conferences are less selective than newspapers and magazines. It's true that newspapers and magazines publish only a small fraction of what might be submitted to them, while even edited electronic conferences tend to let through almost all the messages that people try to post. But this lower selectivity shouldn't keep the editors' editorial judgments from being protected. Technology eliminates the need to edit for paper-saving reasons, but the editors' desires to edit for germaneness, civility, and even viewpoint remain as legitimate as they are in the newspaper context. Compelling operators to give access to the conferences to everyone probably won't cost the operators anything out of pocket, but it will cost them the ability to create the speech product they prefer. As the Court has held, "[e]ven if a newspaper would face no additional cost to comply with a compulsory access law and would not be forced to forgo the publication of [its own chosen materials] by the [compelled access]," the First Amendment would still prohibit the compelled access law's "intrusion into the function of editors." And, of course, Hurley made clear that the editorial function remains protected even when, as in a parade, the editors rarely exercise it.

The Court has, in two contexts, upheld laws that require private property owners to let others speak on their property, but neither of these narrow exceptions should be applicable to electronic conferences:

Broadcasting: The Court has tolerated "more intrusive regulation of broadcast speakers than of speakers in other media"; in particular, Red Lion Broadcasting Co. v FCC upheld a rule requiring broadcasters to give time to opposing views. But the Court has refused to extend its relatively deferential scrutiny of broadcasting controls to other media, such as newspapers or cable TV. Red Lion has been read as turning entirely on "the unique physical limitations of the broadcast medium" - the physical scarcity of available broadcast channels.
No such limitations exist for electronic conferences; there are thousands of conferences available to everyone who has Internet access (which includes users of Prodigy, CompuServe, America Online, and similar services). Even if one counts only the three big services, three is still more than the number of cable operators or large local newspapers that serve the typical city. Given that the Court has refused to apply Red Lion to cable and newspapers, I don't see how it could justify applying Red Lion to electronic conferences.

Content-Neutral Access Mandates: In two cases, the Court at least partly approved content-neutral speaker access mandates. Turner Broadcasting System v FCC indicated that the government may in some circumstances require cable-system operators to leave open some channels for local broadcasters. PruneYard Shopping Center v Robins held that state law may require shopping center owners to let members of the public speak in the center's public areas.

The more recent Hurley decision, though, makes clear that these cases wouldn't justify even content-neutral access mandates to electronic conferences. An electronic conference, like the parade involved in Hurley or the newspaper in Miami Herald, is a more or less coherent speech product, one whose content is a function of all its components. A parade organizer, newspaper editor, or conference operator may solicit speech from the public, and may decide to let much of it through unedited; but this is just one possible choice on his part, and he might equally well choose to fashion his speech product out of only a certain set of messages. When the government requires a conference operator to include speech that he would prefer to exclude, it's ordering the operator to change the character of the information the conference conveys. Just as a St. Patrick's Day parade which includes an "Irish American Gay, Lesbian and Bisexual Group of Boston" banner communicates something different from a St. Patrick's Day parade which excludes this banner, so an unedited conference communicates something other than an edited one.

PruneYard, as Hurley pointed out, "did not involve 'any concern that [mandated access by other speakers] might affect the shopping center owner's exercise of his own right to speak.'" Shopping centers aren't usually in the speech business; "[t]he selection of material for publication is not generally a concern of shopping centers." The speech by members of the public didn't interfere with any messages the shopping center was trying to communicate. "The principle of speaker's autonomy was simply not threatened in that case."

It's true that in some cases compelled access to a shopping center can indeed interfere with the center owner's speech. If the owner, for instance, decided to have a patriotic Fourth of July festival, letting flagburning protesters on the property might affect the center's speech as much as would letting unwanted signs into a parade. But such coordinated expressive activity - the use of the shopping center to communicate a message or a set of messages - is the exception rather than the rule.
PruneYard didn't foreclose the possibility that, in such a context, government-mandated inclusion of other speakers in the festival would be unconstitutional; as Justice Marshall said in his concurrence, the shopping center owners were "permitted to impose reasonable restrictions on expressive activity." And the Court has never suggested that the government could compel access to, for instance, bookstores or holiday displays or other places where access might seriously interfere with the owner's own message.

Turner Broadcasting did involve a limitation on the property owner's speech - the cable operator could no longer use the channels that were set aside by the law to carry the materials it preferred. But, as Hurley pointed out, cable operators have a monopoly; the justification supporting the law in Turner Broadcasting was the survival of broadcast stations that might be threatened when a monopolist excludes them. This was the interest that made the law valid, and this interest is absent in the electronic-conference context. The interest served by restrictions on conference editing - the interest in "requir[ing] speakers [here, conference operators] to modify the content of their expression to whatever extent beneficiaries of the law choose to alter it with messages of their own" - is, according to Hurley, "exactly what the general rule of speaker's autonomy forbids."

Of course, speakers would prefer to have access to an existing conference, with its established pool of listeners, even when setting up new conferences isn't hard. But, as in Hurley, though "the size and success of [an existing conference] makes it an enviable vehicle for the dissemination of [disparate] views, . . . that fact, without more, would fall far short of supporting a claim that [the conference] enjoy[s] an abiding monopoly of access to spectators."

2. The right to exclude speakers.

Some conference operators may want to limit access based on who a person is - for instance, on the person's occupation, professional standing, political opinions, religion, sex, or race - and not just on what he posts. Many traditional conferences certainly select their speakers this way. Of course, it's often hard to tell these things about a person online, so someone might lie his way into a closed group without much difficulty. But few people generally want to do this, and in any event, even online the truth might come out.

Sometimes, the person's status may be used as a proxy for his knowledge: Counsel Connect, for instance, is generally restricted to lawyers, largely because lawyers are more likely to talk and think in particular ways, ways useful to other lawyers. If a lawyer asks a question about First Amendment law, other lawyers are more likely to respond by citing cases; laypeople may instead respond with textual arguments ("Congress shall make no law") or moral arguments that lawyers may know are generally not accepted by courts.

A person's identity can also be directly relevant to the conference operator's purposes. Some conferences are aimed at dealing with issues facing a particular group. Democrats might want to argue about what the Democratic Party platform should be. Homosexuals might want to debate what the homosexual community's stance ought to be on a particular issue. Southern Baptists might want to discuss what stance Southern Baptist churches should take on homosexuality.
Blacks might want to argue how they as blacks should react to Louis Farrakhan; whites might want to debate how whites should deal with problems of police racism; men or women might want to share thoughts on why their own sex is superior. In each situation, people might specifically want to hear the voices of their fellow group members (whatever they have to say) and not of others (no matter how sympathetic to the group they might be). We see these sorts of group-limited symposia often in the offline world, where they are sometimes praised and sometimes condemned.

Finally, a person's identity might more subtly influence the conference operator's actions. An operator might be more willing to bend the rules for, say, academics or women or whites or agnostics than he would be for others. An operator might more quickly kick off a misbehaving person with a male-sounding name than with a female-sounding name, or vice versa. And regardless of the operator's actual reasons, a person who's denied access to a conference - or who's kicked off after having had access - might believe that his group membership was the reason.

Can the government bar conference operators from discriminating based on, say, race, sex, religion, age, political opinion, marital status, sexual orientation, profession, or education? The law already bars various forms of discrimination in public accommodations. Private clubs, parades, and the Boy Scouts have all been viewed by at least some government agencies as places of public accommodation; the same logic might be applied to electronic conferences. And some jurisdictions prohibit much more than just discrimination based on race, sex, religion, or national origin: The District of Columbia, for instance, also bars discrimination based on "age, marital status, personal appearance, sexual orientation, family responsibilities, disability, matriculation, political affiliation, source of income, or place of residence or business."

The Court has never squarely dealt with this. Hurley held only that the parade organizers had the right to exclude speech that endorsed homosexuality - it didn't decide whether parade organizers had the right to exclude homosexuals. The Court has several times confronted the question whether barring sex discrimination by private clubs would abridge the club members' rights to freedom of expressive association, but this too is a somewhat different matter. When a club is forced to admit unwanted members, the danger is the possibility that "admission of [the unwanted people] as voting members will change the message communicated by the group's speech." When a conference is forced to accept unwanted speakers, the danger is the certainty that admission of the speakers will change the message communicated within the conference.

Still, Hurley did suggest that the expressive association cases may be relevant to determining whether parades - or, presumably, conferences, whether electronic or not - can discriminate in selecting their participants. The rule the Court announced in those cases has two components.

First, the First Amendment is implicated only when the law will indeed "change the content or impact of the organization's speech." To determine whether this is so, a court can't just consider generalizations, even statistically accurate ones, about the beliefs shared by most women or most men (or, presumably, by most members of other classes defined by quasi-suspect or suspect attributes such as race).
When a law "requires no change in the [association's] creed . . . and . . . imposes no restrictions on the organization's ability to exclude individuals with ideologies or philosophies different from those of its existing members," the organization must "show that it is organized for specific expressive purposes and that it will not be able to advocate its desired viewpoints nearly as effectively if it cannot confine its membership to those who share [a particular attribute]."

Second, even if the First Amendment is implicated, the regulation may still be upheld if it's narrowly tailored to a compelling state interest which is unrelated to the suppression of ideas. The prevention of sex discrimination, even in organizations which do not actually engage in commerce but only provide leadership skills and business contacts, is such a compelling interest. Presumably the same would be true of the prevention of race and national origin discrimination. And a prohibition on discrimination in membership is by definition narrowly tailored to the interest.

Justice O'Connor, recently joined by Justice Kennedy, has taken a different view. Under her approach, the only inquiry must be whether the organization is primarily commercial, which is true "when, and only when, the association's activities are not predominantly of the type protected by the First Amendment" - are not predominantly expressive. If the organization's activities are primarily commercial, then it has only minimal expressive association rights, and government interference with its membership criteria would be permissible. If the organization's activities are primarily expressive, then "both the content of its message and the choice of its members" are protected.

Thus, if Justice O'Connor's approach is transplanted from the expressive association context to electronic conferences, most conference operators - all those whose conferences are devoted to something other than commercial transactions - would have the unlimited right to discriminate in membership.

If the majority view is applied, however, the result is less clear. To begin with, the conference operator will have to show that "the content or impact of the [conference's] speech" will be changed if the participant restrictions are lifted. Under one definition of "content" this shouldn't be hard; certainly the content of a conference will be changed whenever new speakers are allowed. On the other hand, if courts insist that the change not just be to the exact words the conference contains, but to some substantive aspects of the discussion, the operator will have to show that, say, nonlawyers or women or blacks as a group will probably have different views on various topics than lawyers or men or whites as a group. And when the group is defined by a suspect or quasi-suspect attribute, such as race or sex, the operator will have to show this using more than just statistical generalizations (even empirically valid ones).

Next, the court would ask whether the government has a compelling interest in barring the particular form of discrimination. Such a compelling interest quite likely exists for race, national origin, sex, and probably religion. For other attributes, it's less clear.
State courts are, for instance, split on whether preventing discrimination based on marital status, sexual orientation, and age are compelling interests, and I know of no decisions dealing with whether there's a compelling interest in barring private discrimination based on, say, political affiliation.

In my view, Justice O'Connor's framework is the better one, especially when one is dealing with choice of speakers and not just with choice of members in an organization. An all-lawyer, all-Republican, all-female, all-white, or all-Catholic electronic conference presents a unique speech mix for its participants. We might not be entirely happy that some people prefer to talk only to members of their own group, but - especially where no salary or other tangible economic benefits are directly involved - people's choice of correspondents seems as much a part of the freedom of speech as their choice of what to say or listen to.

Nor is it proper to allow only those exclusions that are in some way germane to the conference topic - to say that, for instance, women might be excluded from an electronic conference for discussion of men's issues, but not from an electronic conference on, say, bankruptcy law. It shouldn't be the government's job to determine what's germane to an expressive association's purposes and what's not. Excluding women (or men) from a bankruptcy law discussion will definitely change the discussion content; we might not think it will change the content in any remotely interesting way, but presumably the discussion organizers disagree. Hurley tells us it's up to the parade organizer, not the government, to decide whether including a certain message would unacceptably change the parade's message. If Justice O'Connor is right to equate an expressive association's interest in "the content of its message" and "the choice of its members," then the decisions about membership in an expressive association - or an electronic conference - should likewise be in the organizer's hands.

In at least one area, in fact, antidiscrimination law has been appropriately trumped by the First Amendment; despite Title VII, churches continue to have the right to discriminate based on race and sex in their choice of clergy. "The right to choose ministers without government restriction underlies the well-being of religious community, for perpetuation of a church's existence may depend upon those whom it selects to preach its values, teach its message, and interpret its doctrines both to its own membership and to the world at large." Though the interest in stopping race and sex discrimination is normally compelling, in the context of clergy selection it must yield to the church's rights under the Free Exercise Clause. Conference participants are to a conference what clergy are to a church: The perpetuation of a conference's distinctive content depends on those whom the operator selects to contribute to it.

3. Possibly permissible requirements.

All the above has turned on the conference operator's right to create a coherent speech product. Regulations that don't jeopardize this right are a different story. For instance, a law that required online service providers to offer person-to-person e-mail services to everyone and barred providers from restricting the content of such messages would probably be constitutional.
The same should even be true of a law that required service providers (Prodigy and the like) to give their users the ability to create new electronic conferences. Such laws wouldn't prohibit the creation of any speech products; operators could still edit their own conferences any way they please. (One could, of course, still oppose these laws on the policy grounds that the government generally shouldn't interfere with private businesses, or even argue that such laws may sometimes be takings of private property without just compensation.)

It may also be permissible to restrict operators from changing posts (as opposed to deleting them) without the authors' permission. Changing people's posts essentially represents them as having said something they didn't say - it implies a false statement of fact, "X said Y" where in reality X said Z. This sort of knowingly false statement should be constitutionally unprotected.

In fact, such modifications of others' posts may already be prohibited in many situations. Under the Copyright Act, both copying someone's work and transforming or abridging it are presumptively infringements. By posting to an electronic conference the author obviously gives the conference operator an implied license to copy the work in order to forward it to the other conference members, but he probably doesn't give an implied license to change it. Putting words into someone's (electronic) mouth may also risk a false-light invasion of privacy lawsuit, and, in a commercial context, a misattribution claim under the Lanham Act. If an operator wants to change a conference participant's words, the operator should get the person's agreement, either at the time of the post or beforehand (for instance, when the person signs a Terms of Service agreement that makes clear that certain words will be deleted from all posts).

C. Editing from the Listener's Perspective

The Value of Editing: I've focused on the interests of the conference editor as speaker, largely because this is what the doctrine has generally done. But, as I mentioned earlier, editing is also critical to the interests of listeners.

As the Court has recognized, listeners have substantial claims to autonomy in their selection of the speech they hear: "no one has a right to press even 'good' ideas on an unwilling recipient." In many contexts, this autonomy can't justify silencing a speaker, because other, willing listeners might be present. But listener choice remains an important value; practically, if a medium can't give listeners what they want, listeners aren't likely to use it.

Private intermediaries are a vital tool for listener choice. Listeners who want more options are generally better served not by a media outlet which carries everything submitted to it, or even by many such outlets, but by many edited media outlets, each with its own editorial judgment. I'd rather have access to twenty radio stations, each with its own playlist, than to twenty (or even fifty) stations that are all open to all comers. And just as speakers' rights to speak can often be fully realized only by their right to associate to form a more powerful speaker (albeit one that might not always perfectly track the ideas of each of its individual members), so listeners' control over what they listen to is often made possible by the editors' right to edit (even though the result might not perfectly track the interests of each of the editor's customers).

Nor would it be wise to prohibit viewpoint-based editing, while allowing editing based on subject matter. Some of the most useful forms of editing are at least partly based on viewpoint. This is common in the print world: The New Republic and The National Review, for instance, are useful to their readers precisely because they have particular outlooks on the world. Likewise, conventional conferences often invite speakers precisely because of the viewpoints they express.

The same goes for electronic conferences. A biology discussion group might, for instance, reject messages that take a creationist perspective. A gay-rights discussion group might reject messages that argue that gay rights are a bad idea because homosexuality is evil. A Christian theology discussion group might reject messages that try to prove there is no God. Even the most open-minded of us can't devote our time to debating everything. Once we're confident enough about a certain proposition - that evolution is correct, that homosexuality isn't immoral, that God exists - we may want to spend time discussing its implications rather than rehashing the arguments about whether it's correct. Messages that express contrary viewpoints, messages that respond to them, further responses to the responses, and so on, will be useless to us, and by decreasing the signal-to-noise ratio will make the whole conference less useful.

Of course, this doesn't mean that most electronic conferences will be completely doctrinaire. Debate is the lifeblood of electronic conferences; few conferences of which I know have much of an ideological litmus test. But while the typical conference may tolerate a wide range of opinions, the editor may decide that certain perspectives are beyond the pale. He may do participants a service - making the conference more valuable to them - by excluding those perspectives.

The Drawbacks of Editing: Of course, editorial judgment itself limits listener choice, by depriving listeners of access to voices which they might like. Perhaps I might be disappointed by the law professor-only conference, and wish that my friend the layman could participate. Perhaps a biologist might think his colleagues could profit from learning more about the creation scientists' arguments, even if many of his colleagues might think they've heard enough of them.

But because these limitations are a matter of private decision, not of government rule, they are generally easier to avoid. If enough listeners want to hear a particular view and one conference doesn't carry it, others probably will. And if no conference is interested in carrying the view, then chances are that this is because too few listeners want to hear it. In that case, the only way that those who are interested can be satisfied is by imposing on the greater number who aren't interested.

We see this happening already. Among the big services, Prodigy advertises its editing, while CompuServe generally imposes no content controls. Counsel Connect provides lawyer-only discussions, while many conferences on the big services and on the Internet are open to everyone. There's an AMEND1-L Internet free speech discussion list that's open to all and a CLSPEECH list just for law professors. New Internet discussion lists are cheap to start; people who already have accounts with certain Internet providers such as Netcom - accounts that cost about $10 per month - can set up such lists for free.
And these discussion lists will be open to all Prodigy, America Online, and CompuServe users, as well as those who have direct Internet access.

Of course, editing won't always be beneficial to listeners. To take one example, Prodigy's notorious removal of messages critical of Prodigy's pricing policies was in no one's interests but Prodigy's own. One could defend this on Miami Herald grounds as part of Prodigy's editorial control rights, but it's hard to justify it on listener autonomy grounds. This, though, was an unusual incident, and one that had little overall negative impact on speakers or listeners. Banning this sort of conduct might not hurt listeners, but it wouldn't have helped them much, either.

On the other hand, Prodigy's removal of anti-Semitic messages from some bulletin boards and its automatic editing of offensive words may be of significant value to many listeners. Just as I'm entitled to avoid magazines that print anti-Semitic propaganda, I have a legitimate interest in having magazine editors, acting as my agents, exclude the anti-Semitic material for me. Having Prodigy impose this editing policy gives me as listener a choice: be exposed to a restricted set of views on Prodigy, or to an unedited set on CompuServe or on the Internet. Barring Prodigy from editing would deprive me of that choice. On government property, we may have no choice but to suffer offensive speech, but there's no reason this has to apply to privately owned fora.

Naturally, by increasing listener choice, editing also increases listeners' ability to choose unwisely. Listeners who choose conferences that tolerate only their own viewpoint, or those that shut down passionate debate, or even those that exclude racist speech, might be sealing themselves off from important arguments, arguments they might find persuasive (or at least worth knowing) if they saw them. Some might see this danger as a justification for laws that would open up the conferences and make sure that listeners don't shut themselves off from balanced debate.

But though listeners may make the wrong decision, I believe it's better to leave these decisions to the listeners rather than to the government. It seems morally troubling for the government to force unwanted speech onto listeners; and I'm skeptical that even a well-motivated government can be good at determining what listeners really ought to hear and what they can legitimately seek to avoid.

Equally importantly, I doubt that any attempts to save listeners from their narrow-mindedness will really work. If listeners want to cocoon themselves from opposing ideas, it's hard to see what can be done about that. Maybe compelled access will give some listeners some information for which they'll ultimately be grateful, even though they didn't at first want it. But I doubt this will often happen; you can make people receive messages, but you can't make them read them. And in some situations - for instance, if the government bars editors from screening out insults or racial attacks or even ignoramuses - many listeners may just stop reading the conference altogether. In trying to make people more informed, the government might cause them to become less informed.

D. Defamation Liability

There have so far been no direct legal threats to conference operators' right to edit.
There has been, however, one recent indirect threat: the assertion by one court - and some commentators - that editing should increase the conference operator's exposure to defamation liability.

People who run electronic conferences, edited or not, may be liable for defamatory statements posted on those conferences. As a general rule, those who participate in distributing a libel are liable together with the original author; the same may apply to the conference operator. A republisher of a libel - for instance, a newspaper printing an op-ed or even a letter to the editor - is liable under the familiar constitutional framework:

Public Figure/Public Concern: If the false statement is about a public figure and is on a matter of public concern, the republisher is liable if he knows the statement is false or recklessly disregards the possibility that it's false.

Private Figure/Public Concern: If the false statement is about a private figure and is on a matter of public concern, the republisher is liable if he acts negligently in publishing the statement (for instance, doesn't do the factual investigation that a reasonable person would have done).

Private Concern: If the false statement is on a matter of private concern, the republisher might theoretically be strictly liable. In practice, though, few states impose strict liability even on the person who originally makes the statement, and I've seen no recent case that imposed strict liability on the republisher. Whether such strict liability would be constitutionally permissible is an open question.

A distributor who isn't a republisher - for instance, a bookstore or a newsstand - can be held liable, too, but such a distributor is given an extra immunity: It isn't liable if it "neither knows nor has reason to know of the defamatory article," and it is generally "under no duty to examine the various publications that [it] offers . . . to ascertain whether they contain any defamatory items," unless a particular publication "notoriously persists in printing scandalous items."

Much of the recent controversy about online libel turns on whether conference operators should be seen as republishers or distributors; but in practice this distinction isn't that important here. Whether he's a republisher or a distributor, a conference operator will be liable for public figure/public concern statements only if he's acting with actual malice. And regardless of whether he's a republisher or a distributor, an operator will be liable for private figure/public concern statements under some sort of negligence standard - if an operator "has reason to know" that a post is defamatory, he is vulnerable to a lawsuit even if he is seen only as a distributor. The one area of possible difference would be statements on matters of private concern, but even there it seems unlikely that a conference operator would be held strictly liable even as a republisher.

The conference operator's main worry, then, has to be about what the negligence standard - the lowest standard with which he'll have to deal - means in practice. Is it reasonable to let all posts through, or does the duty of care include a duty to prescreen? If the operator does have a chance to screen the messages, must he read them carefully, or is it reasonable to adopt a "let-it-through-unless-it's-clearly-off-topic" policy? It's here that a court's attitude towards editing becomes important, because the choice of a duty of care is in large part a policy decision for the court to make.
What level of expense and effort is "reasonable" to expect can turn, rightly or wrongly, on the court's view of the social utility of the underlying conduct. If a court believes that editing is a bad thing, it might impose more liability on conference operators who edit than it would on those who don't edit.

Stratton Oakmont, Inc. v Prodigy Services Co., a libel case which held Prodigy to a higher standard than CompuServe because Prodigy edited and CompuServe didn't, is a case in point. Formally, the court claimed it was simply determining whether the better analogy for Prodigy was the bookstore (a distributor) or the newspaper (a publisher). "Prodigy's conscious choice, to gain the benefits of editorial control," the court concluded, "has opened it up to a greater liability than . . . other computer networks that make no such choice," and than "bookstores, libraries, and network affiliates." The "decision to regulate the content of its bulletin boards . . . simply require[s] that . . . [Prodigy] also accept the concomitant legal consequences" - with editorial control comes increased liability. Prodigy was more like a publisher than a distributor because it "[had] uniquely arrogated to itself the role of determining what is proper for its members to post and read on its bulletin boards."

Now it's true that, under conventional negligence principles, one's ability to avoid harmful conduct (here, defamation) is relevant to whether one has a duty to try to avoid it, so situations in which the operators have the opportunity to exercise editorial control may indeed properly lead to greater liability. If operators actually read all incoming messages before distributing them to the conference, then it becomes more likely that they'll know about the defamatory statement; in that case, the operator may be liable regardless of whether it's viewed as a distributor. And even if the operator doesn't read the messages, the fact that it has an "editorial staff . . . who have the ability to continually monitor incoming transmissions" might make it fair to impose on it the duty to read the messages.

But the Stratton Oakmont court didn't limit its discussion of Prodigy's editing to manual editing: It also referred to the automatic software screening program and to Prodigy's practice of deleting some messages after they've been posted, presumably once they have triggered subscriber complaints. These practices don't change the cost to the operator (and, indirectly, to its customers) of greater monitoring for libel; they don't change the benefits to defamation victims that such greater monitoring would bring. There's no inherent reason that these sorts of editing decisions should affect the negligence calculus. Thus, the court's decision wasn't an application of settled negligence principles.

But neither was it an application of a settled libel law distinction between publishers and distributors. The court's models of distributors - bookstores and network affiliates - do exercise editorial control; they, no less than Prodigy, determine what their customers and viewers will see. They select which books they'll carry or which shows they'll broadcast. Sometimes they refuse to carry an item if it seems to them to contain offensive, ideologically unpalatable, or just unpopular material. They may lack the ability to edit books or TV shows line-by-line, but for practical purposes Prodigy lacks this ability, too.
Actually, Prodigy is much less selective about its posts than a bookstore is about the books it carries: Bookstores choose to stock only a fraction of all the books that are available to them, while Prodigy lets through virtually all the posts submitted to it.

It therefore seems to me that the court's decision reflects not the commands of established libel or negligence principles, but rather a policy judgment about the propriety of editing. Consider the court's assertion that Prodigy "has uniquely arrogated to itself the role of determining what is proper for its members to post and read on its bulletin boards"; its stress that editing "may have a chilling effect on freedom of communication in Cyberspace, . . . [a] chilling effect [which appears to be] exactly what PRODIGY wants"; and its two references to Prodigy's conduct as "censorship." The court seemed to be exacting greater liability as the price for bad, or at least suspect, behavior.

For the reasons I mentioned above, this is the wrong policy choice to make. Editing is a valuable service, and conference operators shouldn't be discouraged from performing it. Depending on how you weigh the interests in private reputation and in uninhibited speech, some sort of operator liability may be appropriate. The economic feasibility of editing might play a role in the balance, just as the economic feasibility of preventive measures is generally relevant in negligence analyses. But whether the operator actually edits shouldn't affect the place the line is ultimately drawn.

The Communications Act of 1996 can be read as prohibiting courts from penalizing conference operators for editing. The Act says:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. . . . No provider or user shall be held liable on account of . . . any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable . . . . The term "information content provider" means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.

I think that the best reading of the statute is that no conference operator shall be held liable as a publisher or republisher for defamatory speech by conference participants; that the operator's editing can't be considered in determining whether he should be exposed to defamation liability; and that the "or otherwise objectionable" clause gives protection to any of the operator's editing choices. On the other hand, one can at least argue that: (1) a conference operator can be held liable as transmitter of a defamatory statement on the same terms as the original speaker, so long as state law does not label him a "publisher" or a "speaker"; (2) considering an operator's editing decisions as a factor in determining the operator's standard of care does not constitute imposing liability "on account of" his editing; (3) the "or otherwise objectionable" proviso only protects editing choices that turn on the offensive form of speech and not, say, on viewpoint-based or subject matter-based editing choices; or (4) "information content provider," despite its breadth, doesn't include individual contributors to a conference.
These latter arguments, I believe, are something of a stretch, which is why I think the Communications Decency Act does bar Stratton Oakmont-type reasoning. But there's enough ambiguity in the statute that the matter is not free from doubt.

E. Edited Conference Groups on Public Computers or Run by Public Employees

Many of the computers that make up the Internet are run by public institutions, generally public universities. Many Internet electronic conferences are operated from those computers, and many are operated by public employees, especially academics. Does the editing of such conferences stand on a different footing from the editing of private conferences? Does the public-forum doctrine, for instance, make certain forms of editing unconstitutional? Conversely, does the government's ownership of the computers or control over its employees give it the power (even if not the duty) to restrict their editing activities?

1. No constitutional barriers to editing.

A private person operating an electronic conference on a public computer is not bound by the requirements of the Free Speech Clause; he may restrict speech even based on its viewpoint. Private speakers don't become state actors just because they're speaking on or using public property. This is most obvious with regard to the traditional public forum (the organizers of a rally or a parade may control the speech that goes on there, even though they're using public property for this speech), but it should be equally true for other public property. A student group meeting in a public school, for instance, should still be able to control its own speaker selection. Use of a public classroom, or of time and space on a public computer, is a valuable government subsidy, but taking subsidies (even large ones) doesn't turn a private organization into a state actor for Free Speech Clause purposes. So long as the group's speech-based decisions aren't dictated by the government, there's no state action.

The same should be true when the subsidy comes in the form of the government letting an employee edit a conference on government time. Books or journals edited by public university professors, for instance, have never been thought of as involving state action. Certainly the editors routinely make viewpoint-based decisions about what gets published and what doesn't, something state actors generally can't do even in a nonpublic forum. A government employee isn't always a state actor, even when he's acting on government time. So long as he isn't "exercising power 'possessed by virtue of state law and made possible only because the wrongdoer is clothed with the authority of state law,'" he's acting as a private person. Like a public defender, a professor editing an electronic conference is engaging in "essentially a private function, traditionally filled by [private persons], for which state office and authority are not needed," though in this case state dollars are paying his salary.

2. No constitutional right to edit.

On the other hand, the government may let someone set up an electronic conference on its computers, or allow him to operate it on government time, only on condition that he not edit, or that he edit only in certain ways. This is akin to the government's power to create a designated public forum limited to the discussion of particular subjects.
The government may conclude that, if it's going to let public property be used, it should be used only in the way that best serves the public: for instance, for a conference that's open to everyone, or a conference that, even if edited, is edited only in viewpoint-neutral ways.

Nonetheless, I'd recommend that the government be hesitant to restrict editors' power. As I mentioned above, even viewpoint-based editing may often create a more valuable speech product. For instance, a scholarly biology list that accepts messages based on evolutionary premises but refuses to accept messages based on creation science premises would probably be discriminating based on viewpoint: The theory that man was created directly by God is certainly an alternate viewpoint to the theory that man naturally evolved from other animals. But such a restriction may well be quite appropriate. Even an open-minded group of scientists may reasonably conclude that they'll use one theory as their operating assumption. At that point, further arguments based on other theories may just become distractions from the business at hand. The scientific community, it seems to me, is better served by one conference devoted to discussion of biological problems from an evolutionary perspective, another from a creation science perspective, and a third for arguing about which is the better perspective, than by three conferences on which creationists and evolutionists fight it out.

If the concerns about limited access can be allayed not by restraining the editors but by providing more discussion lists on the same topic (but edited in different ways), then providing more lists should be the preferred alternative. Providing the extra list doesn't cost anything by itself; the list is just another entry in the computer's tables. Sometimes adding the new edited list may lead to more messages coming through the computer, but often it won't: The separate edited lists may end up having fewer bandwidth-consuming flame wars and fewer digressions. Editing is a good thing; as a general rule, government computer owners (and especially academic institutions) should encourage it, not discourage it.

II. GOVERNMENT PROTECTION OF LISTENERS AGAINST OFFENSIVE MESSAGES

Another desire of listeners is a pleasant, polite speech environment, both one in which they aren't personally insulted, and one in which they don't have to hear more general statements that they might find offensive (for instance, because the statements are profane, racist, or sexually explicit). As I argued above, editing decisions by conference owners can be valuable to listeners for precisely this reason. Just as many people prefer a "family newspaper," or a newspaper run by polite editors rather than by bigots or fanatics, so many people would prefer a list where flame wars and other forms of abuse are screened out, or at least quickly suppressed. But what if the conference operators choose not to intervene: if they decide not to edit generally, or if they agree with the abusive messages or at least find them valuable enough not to edit out?

A. Protecting Some Listeners Without Burdening Other Listeners

In my view, the government may generally restrict speech to protect unwilling listeners only if the restriction doesn't interfere with the flow of speech to willing listeners. Thus, speech on electronic conferences should be protected even if it's offensive, insulting, profane, or bigoted, because restricting such speech would "permit[] majoritarian tastes . . . to preclude a protected message from [reaching] a receptive, unoffended minority." In this, electronic conferences are like billboards, demonstrations, and newspapers. The Court has made clear that restricting offensive speech in these media would impermissibly impoverish public discourse, and there's no reason the rule should be different online. The interests of the speaker and of the willing listeners must, I believe, prevail over those of the offended listener.

On the other hand, some restrictions on unwanted one-to-one communications, such as physical mail, phone calls, and e-mail, should be constitutionally sound. For one-to-one communications, it's possible to create laws that are "narrowly tailored to protect only unwilling recipients of the communications." A law that, for instance, stops mailers from sending material to people who've already expressed a desire not to get it is constitutional; the same should be true for e-mail. Of course, speakers might still want to communicate even to unwilling listeners, and imposition on a speaker's self-expression ought not be taken lightly. Nonetheless, especially when a listener has already told the speaker that he's not interested in hearing more, I don't believe the speaker's desire to keep talking should be treated with much solicitude. In such a context, the speech is likely only to annoy or offend, and not enlighten or persuade anyone. I agree with the Court that "no one has a right to press even 'good' ideas on an unwilling recipient," so long as the unwilling recipient is the only listener involved.

Space constraints keep me from defending this theoretical position in detail here, though I've talked about it at some length elsewhere. Moreover, a broad theoretical defense is probably premature, since there've been few explicit proposals for regulating offensive online speech (other than sexually themed speech, which I discuss in Part III). Instead, I'll focus more specifically on the two sorts of fairly broad restrictions on online speech that may already exist, though they generally aren't enforced this way today: telephone harassment statutes and hostile-environment harassment law.

B. Telephone Harassment Laws

In recent years, some telephone harassment statutes (which are today generally used to stop indecent, threatening, and otherwise annoying phone calls) have been specifically extended to online communications. Others had always been broad enough to include online messages. These statutes vary by jurisdiction, but they tend to prohibit some mix of the following:

- Threats, a prohibition that generally raises no First Amendment problems.
- "[R]epeated telephone calls with intent to annoy another person," sometimes limited to phone calls directed to the other person's home or work.
- Use of "indecent or obscene language," sometimes with "intent to annoy, abuse, or harass."
- Any communication made "with intent to harass, annoy or alarm another person," sometimes limited to those made "in a manner likely to cause annoyance or alarm."
- Anonymous communications made "with intent to annoy, abuse, . . . or harass."

These statutes have generally been upheld against First Amendment challenges, a result which is defensible (though, as I discuss below in Part II.E, still problematic). But extending them literally to online communications causes significant problems.
C. "Electronic Harassment" in Electronic Conferences

Consider, for instance, the Connecticut telephone harassment statute, which has recently been amended to say:

(a) A person is guilty of harassment in the second degree when: . . . (2) with intent to harass, annoy or alarm another person, he communicates with a person by telegraph or mail, . . . by computer network . . . or by any other form of written communication, in a manner likely to cause annoyance or alarm . . . .

Read literally, the statute would prohibit me from posting any message to an electronic conference "with intent to . . . annoy" one of the participants (say, someone with whom I'm arguing). After all, by posting a message to a conference, I'm communicating with each of the conference participants; for instance, if the conference is a distribution list, I'm causing a message to be e-mailed to everyone on the list, "communicat[ing] with" each of them "by computer network." This includes the person I'm trying to annoy.

How broadly "annoy" would be read is anybody's guess, but a lot of things said in online conversation are intended at least in part (and sometimes entirely) to annoy one's opponents. Perhaps electronic conferences would be better if everyone intended only to enlighten, and never to annoy, but annoying and offensive speech is nonetheless constitutionally protected. Leaflets, newspaper articles, books, and movies can all be annoying (sometimes intentionally) to parts of their audiences; despite this, it seems clear that a ban on "harassing, annoying, or alarming" speech in them would be unconstitutional. I see no reason why electronic messages should be less protected.

The problem is that a statute which originally applied to one-to-one communications is being applied to one-to-many communications. Keeping me from sending annoying messages to one particular person doesn't severely restrain public discourse; if the message is meant to irritate the recipient, it's unlikely to persuade or enlighten him. Its only likely consequence is annoyance. But in a one-to-many context, a message that's annoying, even intentionally so, to one person may indeed be valuable to others.

I'm not sure that the extension of telephone harassment laws to online communications was meant to cover electronic conferences. Quite possibly the drafters of the laws were only contemplating direct e-mail, a one-to-one medium not much different from conventional phone calls. In this context, as I mention below, restrictions similar to those imposed on telephone harassment might indeed be permissible. But whether intentionally or not, some of the laws on their face sweep considerably further than they should.

D. Hostile-Environment Harassment

Hostile-environment harassment law is a very different creature from telephone harassment law, but it too might have unexpected consequences in cyberspace. The most familiar form of hostile environment harassment is workplace harassment: speech or conduct that is

- "severe or pervasive" enough to
- create a "hostile or abusive work environment"
- based on race, sex, religion, national origin, age, disability, veteran status, or, in some jurisdictions, sexual orientation, citizen/alien status, political affiliation, marital status, or personal appearance
- for the plaintiff and for a reasonable person.
An employer is liable for hostile environment harassment perpetrated by its employees, and even its customers, so long as it knows or has reason to know about the conduct.

This is a broad definition, and it has in fact been applied to a broad range of speech. A state court, for instance, has held that it was religious harassment for an employer to put religious articles in its employee newsletter and Christian-themed verses on its paychecks. The EEOC has likewise concluded that a claim that an employer permitted the daily broadcast of prayers over the public address system was "sufficient to allege the existence of a hostile working environment predicated on religious discrimination." Similarly, a court has characterized an employee's hanging "pictures of the Ayatollah Khome[i]ni and a burning American flag in Iran in her own cubicle" as "national-origin harassment" of an Iranian employee who saw the pictures. Courts have used harassment law to enjoin people from making "remarks or slurs contrary to their fellow employees' religious beliefs," displaying materials that are "sexually suggestive [or] sexually demeaning," or uttering "any racial, ethnic, or religious slurs whether in the form of 'jokes,' 'jests,' or otherwise." A federal agency has likewise characterized anti-veteran postings at Ohio State University as harassment based on Vietnam Era veteran status. Hostile-environment law may even cover coworkers' use of job titles such as "foreman" and "draftsman," sexually themed (but not misogynistic) jokes, and "legitimate" art. The constitutionality of workplace harassment law is being hotly debated, but as of this writing the risk of harassment liability is certainly a fact of life.

What does this have to do with the Internet? Well, the foundation of workplace harassment law is the theory that harassment is itself discrimination: the denial to certain people of a particular kind of employment benefit (a tolerable work environment) based on their race, sex, and so on. This theory is equally applicable to other discrimination statutes, including statutes that bar discrimination in places of public accommodation. Some statutes make this explicit, prohibiting, for instance, "communication of a sexual nature" that creates "an intimidating, hostile, or offensive . . . public accommodations . . . environment." Other statutes that speak only of discrimination have also been interpreted as barring harassment: For instance, a recent Wisconsin administrative agency decision has concluded that an overheard (but loud) discussion that used the word "nigger" created an illegal hostile public accommodations environment for black patrons, even though the statements weren't said to or about the patrons. Likewise, the Minnesota Supreme Court has held a health club liable for creating a hostile public accommodations environment, based on the club's owners "belittl[ing]" a patron's religious views (expressed in a book the patron had written) and "lectur[ing] her on fundamentalist Christian doctrine." And it's fairly well-established that other antidiscrimination statutes, which ban discrimination in education and housing, also apply to hostile environment harassment; it stands to reason that the same would be true for public accommodations statutes.
As Part I.B.2 discusses, it's eminently plausible that commercial online services would be considered places of public accommodation. Given that some judges have seen even noncommercial establishments (such as parades, the Boy Scouts, and private clubs) as places of public accommodation, Prodigy, Counsel Connect, and others would quite likely qualify. At least one commentator has in fact suggested this very point.

Say, then, that someone frequently posts slurs or sexual jokes or sexually explicit messages or sexist or racist or anti-veteran or religiously bigoted statements or even religious proselytizing to an electronic conference. In the eyes of some factfinders, such messages may well create a "hostile or abusive" environment for some of the conference participants. If the conference operator has the power to do something about this (for instance, if the conference is moderated but the moderator lets these messages through, or if the operator can kick off the offender but refuses to do so), the speech could give rise to liability.

The best real-life online example of this came in the context of hostile educational environment law. In late 1994, in the wake of a controversy about an allegedly sexist ad in the Santa Rosa Junior College newspaper, some students posted sexist remarks about two female student newspaper staffers on a college-run electronic conference. Though the female students didn't see the message, they eventually learned about it, and when they did, they filed a complaint with the U.S. Department of Education's Office for Civil Rights. The Office concluded that the messages were probably "so severe and pervasive as to create a hostile [educational] environment on the basis of sex" for one of the students. A college tolerating speech that creates a sexually hostile educational environment would, in the Office's view, violate Title IX of the Education Amendments of 1972. If this is so, then a service provider tolerating similar speech on its computers would probably be violating public accommodations statutes.

I believe these sorts of speech restrictions are generally unconstitutional, entirely so in the educational and public accommodations contexts, and partly so in the workplace context. I don't want to go into the details of this here; the arguments have been amply discussed elsewhere. Put briefly, I can't deny that "hostile or abusive" speech can greatly diminish the value of an online conference (public or university) for those who are offended; but such speech, even racially, religiously, or sexually bigoted speech, is protected by the Free Speech Clause from government abridgment. It's protected on sidewalks, in private homes, in the pages of newspapers. Despite the recent spate of campus speech codes, courts have held that it's protected in universities. There's no reason it shouldn't be equally protected in Prodigy, Counsel Connect, and the like.

Some courts have been willing to uphold hostile-environment harassment law in the workplace, though it bears emphasizing that others have suggested that even there it may face substantial constitutional problems. But this has been in large part because of their view that "the workplace is for working," not for debate. Electronic conferences are created precisely for debate. Whatever the constitutional status of workplace harassment law, such speech restrictions in places devoted to communication can't be valid.
But though I'm confident that most restrictions on harassing speech will ultimately be struck down, the fact remains that they are something of a growth field in free speech law today, and that they enjoy a good deal of support. Hostile environment-based restrictions on online speech are likely to arise with some frequency in coming years.

E. One-to-One Online Harassment

Restrictions on some one-to-one messages, such as personal e-mail, are at least theoretically more defensible because they help insulate unwilling listeners while still protecting the right to communicate to willing ones. Some telephone harassment laws have been upheld precisely on these grounds.

Still, even in the one-to-one context, these laws pose significant problems. "Annoying," "harassing," and "indecent" (the words the laws use to define the speech they bar) are vague terms, and fairly read, the laws can sweep quite broadly. If an acquaintance of mine has botched a task I asked him to do, and I phone him or e-mail him and say, "You idiot, you really fucked this up," I may well have committed harassment; I've said something that's arguably both "annoying" and "indecent," I've said it with the intent to annoy (and perhaps "harass," whatever that means), and my statement is in fact likely to annoy. Under many telephone harassment statutes, I've committed a crime.

Likewise, I edit an electronic poetry journal, and our subscribers sometimes send us messages about our poems. One such message said nothing more than "Your poems suck!" Under the Connecticut law, sending that message may well have been a misdemeanor; the message was likely to annoy me and could have been intended to do so (as well as to communicate the sender's views). I can't claim that the messages in these examples are of remarkably great First Amendment value, but it isn't clear that this sort of speech should be criminal.

Some of the broader telephone harassment statutes have been held unconstitutional for this very reason. Other statutes have been upheld, especially if they've required that the speech be intended to annoy (which not all statutes do); and some commentators argue that this intent requirement should help save the law from invalidity. But I'm not sure that even those statements made with the intent to annoy should be considered criminal; as my example above shows, many of us might say such things in an exasperated moment, with little likely harm to anyone.

In practice, of course, telephone harassment laws are considerably less menacing than their language suggests. It takes conscious effort to make an annoying call to a stranger; few people complain to the police about the occasional annoying call from an acquaintance; in many such situations, prosecutors may decide not to prosecute; and it's often hard to prove what the caller actually said. In some respects, these checks might limit the law's reach to only the most serious situations.

But the e-mail environment changes some of these conditions. While annoying phone calls are usually deliberate pranks, thought-through insults, or conscious attempts to menace, annoying e-mail can easily happen on the spur of the moment. A person sees a message he dislikes on an electronic conference, and in a few seconds he can send an angry retort (one intended to and likely to annoy) directly to the author's mailbox. And because e-mail, unlike phone calls, leaves a written record, its content is easy to prove.
The person who sent me the "Your poems suck!" e-mail probably wouldn't have called the publisher of a print magazine to say the same thing. The ease with which one can reply to an e-mail makes such replies spontaneous and sometimes rude. This sort of conduct seems less like a deliberately harassing phone call, and more like the annoying words said in public (one can imagine someone saying the same thing at a poetry reading in a coffeehouse), which are generally not punishable unless they're likely to cause a fight. In the great majority of cases, recipients will still not complain and prosecutors won't prosecute, but in my view these shouldn't be the only barriers between a basically decent computer user and a misdemeanor conviction.

F. The Continued Unwanted Contact Model

Instead of outlawing a particular category of speech (the annoying or even the intentionally annoying), a better solution might be to leave the decision in the listener's hands. If you don't like what I'm e-mailing to you (and to you alone), you should be able to demand that I stop. The demand would put me on notice that my messages have gone beyond the tolerable; the law might even require that the notice specifically alert me that further contact is illegal. Such a law would thus avoid criminalizing the occasional intemperate outburst, while still giving annoyed recipients the power to demand some peace.

The Court has already, in Rowan v United States Post Office Department, upheld such a law for physical mail. The law provided that, if any householder concludes that a sender has been sending him sexually suggestive advertisements, he can notify the Postmaster General that all further mailings from the sender must stop. The Postmaster General would then order the sender to remove the recipient from its mailing list; if the sender kept sending material, it would be committing a crime.

Though the law focused on sexually suggestive material, the Court didn't ground its (unanimous) conclusion on any supposed "lower value" of such material. In fact, it was clear that the householder could label any material, including political ads and dry goods catalogs, as being sexually suggestive: The householder's judgment was unreviewable, and the very unreviewability of this judgment was, in the Court's view, central to the law's validity because it kept the decision about what's offensive out of the government's hands. Though the law kept the sender from "speaking" to the recipient, the Court concluded that "no one has a right to press even 'good' ideas on an unwilling recipient."

This logic seems eminently applicable to e-mail; I see no reason for treating electronic communications differently from paper ones. True, you can easily delete e-mail, but you can easily throw out paper mail. You can usually even set up your e-mail program to automatically delete messages that come from a particular address, but you can likewise throw out unread letters that come from a particular place. The auto-delete capability might make a Rowan-type law less necessary because the messages would get deleted without the recipient even knowing they arrived; still, the sender could get around this by using remailers or other means that hide his address.
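To make the auto-delete idea concrete, here is a minimal sketch in Python of the kind of sender-based rule just described. The message format, the function name, and the blocked address are illustrative assumptions, not a description of any actual mail program's feature.

    # A minimal sketch (assumptions only) of a sender-based auto-delete rule:
    # any message whose From address is on the recipient's personal block list
    # is silently discarded before the recipient ever sees it.

    BLOCKED_SENDERS = {"persistent.critic@example.com"}  # hypothetical address

    def should_auto_delete(headers: dict) -> bool:
        """Return True if the message comes from a sender the recipient has blocked."""
        sender = headers.get("From", "").strip().lower()
        return sender in BLOCKED_SENDERS

    # Example: an incoming message from a blocked address is dropped unread.
    incoming = {"From": "persistent.critic@example.com", "Subject": "Your poems suck!"}
    if should_auto_delete(incoming):
        pass  # discard the message; the recipient never sees it

As the text notes, such a rule works only when the sender's address is visible; a remailer or a fresh account defeats it, which is part of why a Rowan-style legal remedy might still matter.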
The reasoning in Rowan rested in large part on the privacy of the home; the unwanted continued mail was seen as particularly offensive because it intruded on this privacy. One might therefore argue that any Rowan-inspired law for e-mail should only apply to e-mail that's received at home. But I don't think the concern about unwilling recipients of mail or of e-mail should be any different when they pick up the message at school or at work. One of the great features of the new technologies is that one's physical location no longer matters. A person can pick up messages from the office one day and from home the next. The emotional and dignitary burden to listeners of having to see messages to which they've already lodged their objections is the same whether they're in the home or in their office. The burden to speakers of having to stop talking to the listeners is likewise the same regardless of where the offended listener is located. So long as the speakers retain the ability to communicate to unoffended listeners, the offended listener should be entitled to demand that communication to him stop, whether he's at home or not.

To make sure that speakers do retain the ability to communicate to others, the law should be clearly limited only to those situations where the sender can personally choose whether the message is to go to the recipient. For instance, if the message is e-mailed through a distribution list whose membership is outside the sender's control, then the messages to the recipient can't be suppressed without restricting the messages to other list members. The law should also make sure the sender gets adequate notice; a simple "Oh you idiot, stop bothering me" should probably not be seen as a legally enforceable bar to all future communications. But if someone clearly says they don't want any more e-mail from the sender (and e-mail programs can easily have the proper language programmed into them so that a sender can send the right form, which would also be saved somewhere for evidentiary purposes, at the touch of a button), there should be no constitutional difficulty with the law enforcing such a demand.

All this merely shows that the law is constitutional, not necessarily that it's wise. It's not clear that we need to bring the might of the criminal law down on people just for sending repeated unwanted messages, whether they are ads or personal insults or just pranks. And of course, certain kinds of unwanted messages, such as threats or extortion attempts, would be barred anyway by other laws.

But there are contexts where repeated unwanted messages can indeed cause more than just mild annoyance. A Rowan-inspired law would apply only when the recipient has specifically asked the sender to stop. When someone keeps contacting you despite your specific requests to the contrary (when he knows you're so uninterested in hearing from him that you've taken the trouble to tell him to stop talking), this might reasonably cause you some alarm. This can be especially so for repeated romantic overtures; the recent spate of stalking legislation reflects the fact that persistent unwanted romantic interest, expressed in a context where it's clearly not reciprocated, can be quite disconcerting even if no specific threat is present.
III. SEXUALLY EXPLICIT MATERIAL AND MINORS

Besides wanting to control their own speech diet, people may also want to control that of their children, especially with respect to sexually explicit materials. Some are skeptical of the need to do this, and compared to the threats to which children are exposed in the physical world, the danger of exposure to explicit online material indeed seems slight. Still, many people are concerned about it; the legal system has generally decided that they have a right to translate their concerns into law; and outside the online world, laws that restrict minors from accessing sexually explicit material have generally been upheld. For purposes of this discussion, I'll assume that preventing children from seeing such material is indeed an important goal.

A. The Potential Restrictions

The most prominent restriction on sexually explicit material online is, of course, in the Communications Decency Act of 1996. The Act makes it a crime for people to "use[] any interactive computer service to display in a manner available to a person under 18 years of age, any [material] that, in context, depicts or describes, in terms patently offensive as measured by contemporary community standards, sexual or excretory activities or organs." Many electrons have been spilled regarding the Act, and the Court is scheduled to pass on its constitutionality this Term in Reno v. ACLU. I don't intend to specifically focus on it, though some of the things I say below may apply to it.

Rather, I want to focus on a different kind of statute. Many states have laws that bar "public display" of certain kinds of sexually explicit matter, and many such statutes have been upheld against First Amendment challenge. For instance, Georgia Code § 16-12-103 provides, among other things, that:

(e) It shall be unlawful for any person knowingly to . . . display in public . . . at any . . . public place frequented by minors or where minors are or may be invited as part of the general public:
(1) any [visual representation] . . . which depicts sexually explicit nudity, sexual conduct, or sadomasochistic abuse and which is harmful to minors; or
(2) any [printed or audio material] which contains . . . explicit and detailed verbal descriptions or narrative accounts of sexual excitement, sexual conduct, or sadomasochistic abuse and which, taken as a whole, is harmful to minors.

"Harmful to minors" is defined as:

that quality of description or representation, in whatever form, of nudity, sexual conduct, sexual excitement, or sadomasochistic abuse, when it:
(A) Taken as a whole, predominantly appeals to the prurient, shameful, or morbid interest of minors;
(B) Is patently offensive to prevailing standards in the adult community as a whole with respect to what is suitable material for minors; and
(C) Is, when taken as a whole, lacking in serious literary, artistic, political, or scientific value for minors.

Under this definition, even material that's not obscene as to adults is barred from public places. If, as seems quite possible, electronic places that are accessible to the public (such as conferences or Web pages) are found to be "public places," then many items that are posted on the Net are already in violation of Georgia law, and of similar laws in other states. And the law isn't just limited to obscenity or even "pornography" (whatever that may be). It could equally apply to "legitimate" art in an online gallery, or to a person's favorite Mapplethorpe photo on his Web page, or to online sex-talk groups, or conceivably even to groups that talk clinically about sex. The problem is that, online, most places (electronic conferences, Web sites, and so on) are "frequented" by at least some minors.
And while offline one could put the explicit material in a separate room and check people's IDs before one let them in, there's no similar mechanism online. Internet newsgroups are accessible to everyone. So, generally speaking, are discussion lists. While Web pages can theoretically ask for a credit-card number before they let a person into particular places, that won't work if someone wants to display material for free. The Georgia statute and others like it might thus do something quite similar to what the Communications Decency Act is criticized for doing: they might bar a great deal of sexually themed material from most freely available places on the Net.

B. The Least Restrictive Alternative Requirement

In my view, the state public display laws, and the similar Communications Decency Act provisions, are unconstitutional when applied online, because there are means of protecting children in cyberspace without unduly limiting adults. In Sable Communications v FCC, the Supreme Court made clear that the government may not, in trying to protect children, bar sexually explicit speech generally when it could implement a less restrictive alternative that restricts only children. If such an alternative exists, and if it serves the compelling interest as well as would a total ban, then the ban is unconstitutional. Butler v Michigan, which struck down a law that tried to "shield juvenile innocence" by "reduc[ing] the adult population . . . to reading only what is fit for children," suggests the same.

Formally, the rule that the government may prohibit distribution to minors of materials that are harmful to them doesn't include such a proviso; when distributed to minors, material that's harmful to minors is considered essentially obscene and lacking in First Amendment value. But if a law that bars distribution to minors also interferes with distribution to adults, the government should have to show, as in Sable, that the law is the means that's least restrictive of adult access. As to adults, of course, nonobscene material is constitutionally protected.

C. Ratings

Fortunately, technology may provide part of the solution here. Unlike in the offline world, a child's online eyes (the software that retrieves and displays the material) can be set up to screen out unwanted material. The difficulty comes in recognizing it: To a computer, a Mapplethorpe and Mickey Mouse are both just 1s and 0s. There has to be (as in the offline world) some human agent that identifies what's suitable for minors and what's not.

One possible solution is for someone to provide a list of online locations (World Wide Web sites or Internet electronic conferences) that have been checked and certified "clean," together with a Net access program that allows access only to those locations. The program can, when it's first set up, ask the buyer (presumably the parent) to configure a password; then, whenever it's run, it will ask for the password, and if the right password isn't entered, it can run in clean-only mode. Alternatively, if someone comes up with a list of places that are "dirty," the software can allow access to all places except the dirty ones.
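As a concrete illustration of these two models, here is a minimal sketch in Python. The site names, the list contents, the mode names, and the password handling are all illustrative assumptions, not a description of any actual filtering product.

    # A minimal sketch (assumptions only) of the clean-list and dirty-list models.
    # In clean-only mode, only certified-clean sites may be fetched; in
    # block-dirty mode, everything is allowed except sites on the dirty list.

    CLEAN_LIST = {"museum.example.org", "encyclopedia.example.org"}   # certified clean
    DIRTY_LIST = {"adult-content.example.com"}                        # certified dirty

    def may_visit(site: str, mode: str) -> bool:
        """Decide whether the access program should fetch the given site."""
        if mode == "clean-only":
            return site in CLEAN_LIST       # clean-list model: allow only what's been checked
        if mode == "block-dirty":
            return site not in DIRTY_LIST   # dirty-list model: allow all but known bad sites
        return True                         # unrestricted mode

    def choose_mode(entered_password: str, parent_password: str) -> str:
        """Without the parent's password, the program runs in its restricted mode."""
        return "unrestricted" if entered_password == parent_password else "clean-only"

The two branches make the tradeoff discussed below visible: the clean-list branch leaks almost nothing but admits only what has already been checked, while the dirty-list branch admits everything the list maintainers haven't yet found.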

There are already several commercially available products (often called filters) that, among other things, maintain lists of dirty sites and prohibit access to them. The best-known one is called SurfWatch; it costs about $50, and for $5.95 per month a customer will get the list of dirty sites constantly updated. SurfWatch in fact employs people to look through the Net, always on the lookout for new bad locations. If the government wanted to, it could buy out SurfWatch or one of its competitors (for a fraction of the cost of legally enforcing any sort of speech restriction) and then distribute the software for free to all users.

This need for updates, of course, shows a weakness of the dirty list approach. The Net is a constantly changing environment; new Web pages and discussion groups are constantly being added, and existing resources are constantly being changed. It may take the filter distributors a while to detect the changes.

The clean list model does promise to shield children almost perfectly from harmful-to-minors material, because children would be able to access only those pages that the filter distributors have screened. Leakage of dirty material could happen only when a clean page is modified after the filter people check it.

But precisely because a child can see a page only if it's been certified clean, any clean list program will give children access to only a fraction of the clean material on the Web. Screeners almost certainly couldn't check more than a small fraction of existing Web resources, and any new resources (including new pages at existing sites) might go unchecked for quite a long time. Many parents, even those who want to shield their children from harmful matter, might be understandably reluctant to restrict their children's access that greatly.

Courts might, despite this, still conclude that a clean list is a less restrictive but equally effective alternative to a ban: After all, the clean list will effectively serve the primary interest, shielding children from material that's harmful to minors. The fact that it does this only by insulating children from a lot of other material, the argument would go, should be ignored for purposes of this inquiry. The cases neither clearly endorse nor clearly foreclose such an argument, largely because there've been so few cases that have seriously elaborated on the meaning of "less restrictive alternative."

It seems to me, though, that such an argument is ultimately unpersuasive. A proposed alternative may aptly be called "not equally effective" if it serves the compelling interest only through an unacceptable sacrifice of other important interests. For example, say that "clean list" filters were unavailable, and opponents of online speech restrictions suggested that the less restrictive alternative would simply be parents keeping their kids off the Internet. This alternative would indeed shield children quite well, but at an unacceptable cost. The clean list model is of course quite different in degree from keeping kids off the Net altogether, but it seems similar in kind.

Finally, the clean list model appears quite inadequate to the task of filtering newsgroup posts and discussion list messages. The distributors of clean list software can identify a newsgroup that's mostly clean (one that generally doesn't carry any harmful-to-minors material), but there can be no assurance of this; anyone can post something dirty at any time.

Ultimately, even all these considerations might not be much of a problem. The various filters may catch the great majority of offending sites, and, after all, restrictions on the distribution of pornography to minors outside the electronic world are also notoriously imperfect.

Nonetheless, the fact that the filters will miss some sites makes it possible to argue that these programs are not the sort of less restrictive alternative that would invalidate a total ban: that they are not as effective as a combination of the ban and whatever technological assists (such as filters) parents can use. Under the strict scrutiny test the Court applied in Sable, a law is unconstitutional if there are other means that are less restrictive but as effective as the ones being challenged. If the proposed means "fall short of serving [the] compelling interests," then the challenged law may be constitutional.

2. The ratings model.

There is, however, a possible supplement to the clean list and dirty list approaches that would probably be at least as effective as a total ban: a rating system by which people can self-identify explicit material that they make available.

On the computer, pictures are stored in files that are organized in a conventional way. If one establishes a convention that a small part of every file contains a marker indicating whether the file contains sexually explicit material, then the programs that read these files can also look at this marker. If the program is running in child-only mode (something the parents can set), then the program would refuse to display the file. Even text posted to electronic conferences might be specially labeled so that programs which read these conferences can filter it out.

The rating system might even rate the pictures in finer-grained ways, for instance for degree of sexual explicitness, or violence, or what have you; then the parent could set whatever threshold rating he wants. Of course, if the system gets too detailed, the risk of mislabeling might increase.
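To illustrate both the simple marker and the finer-grained version, here is a minimal sketch in Python. The rating fields, the 0-to-4 scale, and the treat-unrated-as-restricted default are illustrative assumptions, not an existing labeling standard.

    # A minimal sketch (assumptions only) of a self-rating check: each file carries
    # a small rating record, and the viewing program compares it against thresholds
    # the parent has set before deciding whether to display the file.

    PARENT_THRESHOLDS = {"sexual_explicitness": 0, "violence": 2}  # parent-chosen limits, 0-4 scale

    def may_display(file_rating, child_only_mode=True):
        """Return True if the file may be shown under the current settings."""
        if not child_only_mode:
            return True      # adult mode: show everything
        if file_rating is None:
            return False     # unrated material is treated as presumptively restricted
        return all(file_rating.get(dimension, 0) <= limit
                   for dimension, limit in PARENT_THRESHOLDS.items())

    # Example: a file self-rated as mildly violent but not sexually explicit passes;
    # an unrated file does not.
    print(may_display({"sexual_explicitness": 0, "violence": 1}))  # True
    print(may_display(None))                                       # False

The unrated-means-restricted default mirrors the "voluntary" scheme described in the enforcement discussion below: no one has to attach a rating, but unrated material is simply not shown in child-only mode.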

3. Enforcing the ratings system.

The ratings system can work only if the ratings are accurate. The law might make the system "voluntary" in the sense that no one has to attach a rating to his pictures; the child-only programs would then just view any unrated picture as presumptively dirty. But there'd still be a problem if a picture that should be X-rated is self-rated as G.

One can question the propriety of using the criminal law in this context; I agree with Justice Stevens that distribution of obscenity should be at most civilly punishable, and I'd say the same about material that's harmful to minors. But I think that, under the existing doctrine, it would be constitutional to criminalize the display of any harmful-to-minors material which does not carry a correct rating.

The courts have already agreed that it's permissible to burden such speech with restrictions on its distribution to minors, and with restrictions on its display in places where minors may be present. Requiring that it be labeled seems to me no different from requiring that it be put on a separate shelf which minors can't reach. Of course, if a defendant makes an error in labeling, he may be found criminally liable. But the same is true under the laws upheld in Ginsberg: If a defendant erroneously decides that material isn't harmful to minors, he may end up in jail.

4. Why this is at least an equally effective alternative.

The rating system would, of course, not be perfect at screening out material that's harmful to minors. But it would generally be no worse at this than would an outright criminalization of Internet displays of such material. Under a rating system, minors would still be able to get access to illegally misrated materials, but under a general prohibition, minors would be able to get access to illegally posted materials. If posters aren't deterred by the risk of criminal punishment for misrating, it seems unlikely that they'd be deterred by the risk of criminal punishment under a general prohibition.

Likewise, the ratings laws won't protect children from material posted by foreign or anonymous sources; but, of course, neither would a total ban. Foreign and anonymous posts are the weak point of any system for controlling cyberspace materials. Any domestic control regime (of pornography, gambling, or what have you) will at most cut down on the number of places that contain the prohibited material; it can't even come close to eliminating them.

Rating systems won't shield minors who get access to a computer that isn't running the screening software (for instance, a computer at a friend's house). In that respect, a ban might be more effective; by deterring the posting of certain materials, the ban would decrease the amount of harmful-to-minors matter available even to those children using an unshielded computer.

But balanced against that is the likelihood that more people will comply with a rating system than with a general prohibition. People who post explicit material, even for free, presumably get some value out of doing it. Realistically, criminal liability won't be much of a deterrent: criminal prosecutions will probably be rare, just as prosecutions for distributing material that's harmful to minors are rare now. For foreign and anonymous posts, prosecution may be next to impossible.

Given the low cost of violating the law, the important issue for many posters will be the cost of following the law. Many people might not want to stop posting explicit material and might therefore be willing to run the low risk of prosecution instead of complying with a total prohibition. But some of these people may gladly attach a "dirty" rating to the material, because it will basically let them do what they want: make the material available. Even people who face no risk of criminal liability may voluntarily comply with a ratings system; many of them might be quite happy to prevent children from accessing their posts.

If I'm right about all this, then a total ban on display of harmful-to-minors matter in places which might be accessible to minors would be unconstitutional. The alternative, a ratings system, would be both less restrictive and at least as effective.

5. Community standards.

In all this, I've omitted discussion of varying community standards. Under existing law, whether material is obscene (or is harmful to minors) is determined with reference to the standards of the geographical community to which it's being distributed. If a product is being sold by mail order or telephone, the burden is on the seller to determine where it's going and what the relevant standards are. In theory, then, any mechanism for dealing with material that's harmful to minors will need to rate each item separately for each possible jurisdiction. This will be much more complicated than the normal clean/dirty determination (which itself might be complicated enough along the edges).

I don't talk much about this because the whole concept of community standards will in any event have to be rethought for cyberspace, both for conventional obscenity and for material that's harmful to minors. Online, of course, one can't tell a person's place of residence any more than one can tell his age. Making distributors of possibly obscene material (either sellers or people who post it for free) liable based on local community standards will reduce online sexually explicit speech to the level of the most restrictive community.

One way or another, then, a national online community standard will be implemented. The only question is whether such a national standard will be consciously chosen, or whether the most restrictive local community standard ends up becoming the national one by default. In any event, others have discussed this point at length, and I would rather refer the reader to their treatment of it.

CONCLUSION

The new information technologies can be a boon for the marketplace of ideas; speakers, listeners, and society at large can all profit from them. Free speech in cyberspace should be protected as zealously as it is in the offline world.

But the more the new media are viewed as primarily a soapbox, the more likely they are to become as irrelevant as a soapbox. Cyberspace can only thrive if it provides listeners material that's valuable to them: material that's relevant, that doesn't make them feel annoyed or harassed, and that doesn't make them afraid to give access to their children, who may have a great deal to gain from the online world.

The difficulty, as always, is accommodating the interests of speakers, of listeners who want to hear what a particular speaker says, and of listeners who don't want to hear it. And a general preference for the interests of speakers, at least as far as government regulation of the speaker's message goes, is justifiable. Despite my endorsement of some speech restrictions, I remain generally skeptical about government speech regulations.

Still, while the Constitution might require that speakers prevail in many cases, there are ways in which listeners can also be properly protected, even at some speakers' expense. Modest restrictions on persistent unwanted e-mail and modest restrictions on sexually explicit material that might be accessible to children are two such mechanisms; but of all the possible protections for listeners, a thriving market of private editors is by far the most important. The new age will indeed be, in Virginia Postrel's words, "[t]he Age of the Editor." The more we interfere with this, the less valuable the new technologies will become.

Authors: Larry Lessig, David Post, Eugene Volokh




Copyright © 1999 Social Science Electronic Publishing, Inc. All Rights Reserved