Usenet Death Penalty Coalition PICS a Fight with Spam

Tom W. Bell[*]

v. 9/27/97
Note: An earlier, unfootnoted version of this paper appeared in Telecom. & Elect. Media News, Fall 1997, at 1, 4.

Spam, known among policy wonks and Hormel executives as "unsolicited commercial email," has raised hackles for a good many Internet years. On Friday, August 1, 1997, an informal coalition of system administrators attacked the problem of Usenet spam by taking matters into their own hands. They issued a "Usenet Death Penalty" (UDP)[1] against UUNet, a major Internet service provider that had ignored repeated requests that it curb the spam pouring through its dialup accounts.[2]

Without resorting to regulatory or legislative action, this UDP Coalition managed to convince UUNet to staunch its flow of spam. Satisfied, the Coalition lifted its UDP on August 6.[3] Despite this success--or perhaps because of it--the Coalition's strategy drew widespread criticism.

Most news accounts of this private attempt to curb newsgroup spam accused the UDP Coalition of unilaterally canceling third parties' messages.[4] Granted, the UDP Coalition stated that "all traffic coming from these sources is to be canceled until further notice."[5] In fact, however, the UDP had no power to force anyone to cancel UUNet postings.

The UDP Coalition merely generated cancel messages--labels, if you will--that corresponded to particular newsgroup postings--namely, those from UUNet. Each Usenet site administrator bore responsibility for honoring or ignoring the Coalition's cancellation request.[6] In this respect, the UDP more closely resembled a third-party rating system than it did vigilante censorship.
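As a rough sketch of the mechanism: a cancel message is just an ordinary Usenet article whose Control: header names the posting to be removed--a request, not a command. The header names follow the RFC 1036 convention; the newsgroup, addresses, and helper name below are invented for illustration.

```python
from email.message import Message

def make_cancel(target_id: str, issuer: str) -> Message:
    """Build a Usenet cancel control article (RFC 1036 style).

    The Control: header is the entire mechanism: it *asks* news
    servers to remove the article named by target_id. Each site
    administrator's server then honors or ignores the request.
    """
    cancel = Message()
    cancel["From"] = issuer
    cancel["Newsgroups"] = "news.admin.net-abuse.announce"  # assumed group
    cancel["Subject"] = "cmsg cancel " + target_id
    cancel["Control"] = "cancel " + target_id
    cancel["Message-ID"] = "<cancel." + target_id.lstrip("<")
    cancel.set_payload("Cancelled under the UUNet UDP.")
    return cancel

msg = make_cancel("<spam123@dialup.example>", "udp@example.org")
print(msg["Control"])  # cancel <spam123@dialup.example>
```

Note that nothing in the message itself deletes anything; the receiving server's configuration decides whether the cancel takes effect.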

The UDP thus gives us an early example on Usenet of what the Platform for Internet Content Selection (PICS) will allow, for better or worse, on the World Wide Web and other parts of the Internet. PICS establishes a protocol for labeling Web sites or other electronic documents. Among other uses, filtering software such as SurfWatch or CyberPatrol could rely on PICS labels to block violent or indecent content.[7]
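For concreteness, a PICS-1.1 label is a short parenthesized expression tying a rating service, a rated document, and a set of category scores together. The fragment below is hypothetical--the rating-service URL, date, rater, and category names are all invented for illustration--but it follows the general PICS-1.1 label shape, including the case of an independent third party rating someone else's page:

```
(PICS-1.1 "http://ratings.example.org/service-v1.html"
  labels on "1997.08.01T00:00-0500"
  for "http://www.example.com/some-page.html"
  by "An Independent Rating Bureau"
  ratings (violence 0 sex 0 language 2))
```

Filtering software consults such labels and applies whatever threshold its operator has configured--just as a news server consults a cancel message and applies its administrator's policy.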

With regard to both the UDP and PICS, third parties can generate metadata (i.e., labels) that Internet access providers (under the UDP, Usenet site administrators; under PICS, various parties controlling access to the Web) choose to use or ignore. Intended recipients of filtered messages must, if they want more open access to the Internet, either find another channel, persuade the responsible access provider to implement a less restrictive filter, or implement some sort of hack around the access provider's barriers.

Of course UDP and PICS differ at a technical level. As social mechanisms, however, they effect similar results. The UDP Coalition represented a third party--potentially one of many--which offered its own, admittedly idiosyncratic metadata about UUNet transmissions; namely, "Cancel them!" The PICS labeling system likewise allows third parties to rate others' information. As Paul Resnick has noted, "PICS labels can be attached or detached. An information provider that wishes to offer descriptions of its own materials can directly embed labels in Web documents or send them along with items retrieved from the Web. Independent third parties can describe materials as well."[8]

Nobody forced or tricked site administrators into obeying the UDP "rating" system. Granted, some Usenet cancellation strategies rely on inherently suspect forgery techniques (essentially, X saying, "I'm Y and I want to retract my earlier Usenet post.")[9] Forgery does not appear to have played any part in the call to apply the Usenet Death Penalty to UUNet posts, however. Rather, the UDP Coalition quite forthrightly proclaimed its intention to cut off UUNet in an emergency measure to save Usenet from spam overload.

Site administrators should recognize that a request for a UDP represents a severe measure, and would arguably act with negligence if they honored a UDP without due consideration. According to the Net Abuse FAQ, "[T]he general consensus among participants is that [a] UDP . . . should only be employed after every other method has been tried and failed."[10] Usenet site administrators should thus bear the responsibility for carefully evaluating UDPs. Willful ignorance of the reasons behind a UDP provides no excuse for automatically implementing it. The same would hold true in the PICS context of access providers who incautiously use labels to effect extraordinarily censorious results--e.g., a public library banning all but "Hillary-approved" web sites.
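That duty of due consideration can be pictured as a simple server-side policy. Everything here is hypothetical--the trusted-issuer list, addresses, and function name are invented--but the sketch captures the point: rather than auto-executing every cancel, the administrator's software honors only requests from issuers the site has deliberately vetted, and only when the administrator knows why the UDP was called.

```python
# Hypothetical site policy: honor cancel requests only from issuers
# the administrator has deliberately chosen to trust.
TRUSTED_ISSUERS = {"udp-coalition@example.org"}  # assumed, not a real address

def honor_cancel(issuer: str, reason_known: bool) -> bool:
    """Return True if this site should act on a cancel request.

    Blind execution is ruled out twice over: the issuer must be
    vetted, and the administrator must know the reason for the UDP.
    """
    return issuer in TRUSTED_ISSUERS and reason_known

print(honor_cancel("udp-coalition@example.org", reason_known=True))  # True
print(honor_cancel("anonymous@forged.example", reason_known=True))   # False
```

A site that shipped with an empty trusted list would ignore every UDP; one that trusted everyone would implement cancels automatically--precisely the negligence the FAQ warns against.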

To fill out the parallel between the UDP and PICS, note that those who rely on a third party's PICS labels will not always have full knowledge of the criteria used to generate those labels. Therefore, even if site administrators did blindly and non-negligently obey the UDP Coalition, they merely did what consumers of PICS labels will no doubt sometimes do as well. The problem of secret blacklists has already popped up in the context of filtering software, after all.[11] If PICS sees wide use in the consumer context, we can expect it to stir up similar controversies.

The problem of non-transparent criteria pervades metadata projects like software filters and PICS. Consumers of metadata inevitably cede part of the work of interpreting information to third parties. Indeed, consumers demand some opacity in this enterprise. It is by exercising their delegated authority to pre-interpret information that metadata generators add value to information. If consumers of metadata services had perfect information about the criteria used in that pre-filtering, they would have no need for metadata. Every metadata project will thus entail some measure of opacity (though how much opacity consumers want remains subject to trial-by-market).

If they demand opacity of interpretation criteria, how can consumers evaluate the performance of metadata services? They can do so by relying on reputational mechanisms--specifically, service marks that identify generally trusted third parties. You might buy metadata from the ACLU, or the Christian Coalition, or Disney, for example, on the strength of the reputations that their service marks convey. We will thus cure the opacity problem with data about metadata--or, if you prefer, meta-metadata.

In sum, the UDP Coalition's successful attack on spam relied on consensual acts among private parties, and it did so to at least as great a measure as PICS probably will. For better or worse, then, the UDP serves as something of a test run of the principles behind PICS. That the UDP Coalition represents one of many institutions in a civil society that can help to generate metadata about others' information does not, of course, make its actions wise or commendable. But it does excuse the Coalition from claims that it played the same role as a State-sanctioned censor. To the extent that censors--including those who have suggested mandatory PICS ratings of web pages--rely on State power to effect their goals, they do exactly what the UDP did not: Coercively restrict the free flow of information.


[*] Director, Telecommunications and Technology Studies, Cato Institute.

[1] For a description of the UDP and other cancellation methods used on Usenet, see Usenet Cancel Message FAQ, at . . . /part3/. With regard to the UDP specifically, the FAQ says, "Originally, the UDP referred to auto-cancellation of all messages from a certain site as a final solution to too much abuse. . . . [T]he meaning mutated . . . to the aliasing out of a certain site by many major sites, thus 'shunning' them off of Usenet." Id. at para. VIII.D. The former meaning remains favored. Ibid.

[2] See Mark Frauenfelder, UUNet Given the 'Death Penalty', Wired News, Aug. 1, 1997; Denise Pappalardo & Todd Wallack, Antispammers Take Matters Into Their Own Hands, Network World, Aug. 11, 1997, at 8; Rajiv Chandrasekaran, Group Blocks Postings of UUNet Customers, Washington Post, Aug. 5, 1997, at D1, D2.

[3] Usenet posting by Ken Lucke, UDP Coalition activist, reprinted at ,5,13187,00.html. In practice, lifting the UDP entailed some complications due to the automated nature of anti-spamming technology. See Janet Kornblum, Word of UUNet Truce Stifled, C|Net, Aug. 7, 1997, at ,4,13188,00.html.

[4] See, for example, Frauenfelder, supra ("The penalty entails the use of a program called a cancelbot . . . . When [it finds] a target message, the message is erased from the newsgroup."); Pappalardo & Wallack, supra (claiming that the UDP Coalition "unleashed an army of cancelbots against UUnet" which "were set to attack any message posted from a address."); Chandrasekaran, supra (claiming the UDP Coalition "blocked or destroyed thousands of postings made by customers of UUNet. . . .").

[5] See, for example, Frauenfelder, supra.

[6] Mark Frauenfelder, Usenet's Etiquette-Enforcement Agency, Wired News (undated) ("It is up to a site administrator to honor 'cancel' messages, though most do."). See also, more generally, the cancelbot FAQ.

[7] For general information on PICS, see W3C, Platform for Internet Content Selection. Or, for an experiment in irony, try getting information on PICS by typing just that--"PICS"--into an Internet search engine.

[8] Paul Resnick, Filtering Information on the Internet, Scientific American, Mar. 1997 (emphasis added).

[9] Apparently, it does not take much beyond a cancellation to effect a forgery. "Usenet was not built with security in mind; the fact that it's relatively simple to forge a cancel proves this." The Net Abuse FAQ, para. VII.A.

[10] The Net Abuse FAQ, para. 3.19.

[11] See, for example, Rebecca Vesely, Cybersitter Goes After Teen, Wired News, Dec. 9, 1996; Jonathan Wallace, The X-Stop Files, The Ethical Spectacle.


(C) 1996-9 Tom W. Bell. All rights reserved. Fully attributed noncommercial use of this document permitted if accompanied by this paragraph.
