Discussions Relevant to PICSRules and Free Speech.

Joseph M. Reagle Jr.

Much of this debate can be found at:
    ftp://vorlon.mit.edu/pub/f-c/

Many of these issues are also addressed at:
    PICS, Censorship, and Intellectual Freedom FAQ
    The ILPN Interview on PICS

My General Statement

A fair amount of concern has been expressed regarding the PICSRules Proposed Recommendation -- about its intent and its likely effect on speech on the Web. I want to clearly and briefly present what PICSRules is, its intent, and my personal expectations and hopes about its impact on the Web.

PICSRules is a mechanism for exchanging user settings, resulting in an easy one-click configuration. It allows such preferences to be easily saved, moved, and exchanged. This capability, along with the ability to sign PICS labels, has been a part of our work on PICS since its inception. The exchange of preferences can be between users, agents, search engines, proxies, and servers -- anyone. PICSRules specifies a format in which users can communicate their preferences to agents who will act on their behalf, or to search engines that can return links likely to be to the user's tastes. This could include the quality, privacy, or security of the content at the link. Also, a user can have different PICSRules files for different Web personalities (or ways they wish to interact on the Web) that are easily portable between all of their computers. These features should be an easy-to-use part of desktop applications and browsers.
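For the curious, the flavor of the format can be conveyed with a small sketch. The rule below is adapted from the sort of examples given in the Proposed Recommendation; the URL is invented for illustration, and readers should consult the specification itself for the exact syntax:

```text
(PicsRule-1.1
 (
  Policy (RejectByURL ("http://*@www.example.com:*/*"))
  Policy (AcceptIf "otherwise")
 )
)
```

A user (or their GUI) saves such a file once, and any PICSRules-aware client, proxy, or search engine can then apply the same preferences -- that is the one-click portability described above.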

We did not undertake PICS and PICSRules for the purpose of enabling government control of content, nor at the behest of any government; quite the opposite, in fact. My belief is that PICSRules will encourage diverse communities to participate on the Web, empower users, increase the visibility of filtering decisions, and mitigate the suppression of speech on the Web. Discussion about the best use of such technologies is critical to realizing this.

General Comments

With respect to the contention that PICSRules will make it easier for governments to censor their populaces: I disagree (whereas with PICS itself, I think there is a danger that it could be used in such a way). PICSRules is merely a format for specifying rules and making decisions over the processing of meta-data. It presupposes different and mobile rule-sets. Governments have no need to frequently interchange rule-sets with other agents in a dynamic and ad-hoc manner, nor to maintain multiple sets. There would be one, and only one, which they could control quite well in some proprietary format -- which is not hard to do regardless. (This issue is distinct from whether there is a single, global rating system and that system's effect on speech.)

With respect to PICS' ease of use by governments, and the suggestion that it somehow be designed such that it cannot be scaled to such uses: regardless of whether engineers could agree amongst themselves to do such a thing, it would be impossible to do. And PICSRules doesn't make it any easier for governments to scale; the only people it gives more power to are smaller organizations and individuals exchanging preferences. And no, we don't expect grandmothers to write PICSRules code. It is an interchange format for the preferences expressed on the sliders and checkboxes of a GUI.

I see meta-data, decision making, data exchange, and negotiation as critical (and unstoppable) parts of technological evolution. The technical world will mirror the social. Capabilities to manage information and trust relationships are needed. The W3C, and others, are advancing the technology from merely moving HTML documents back and forth to creating collaborative tools -- tools by which individuals can fashion a community and environment in which they operate efficiently and with ease. For instance, I use filtering to block out content and spam that I don't want, use encryption to secure the boundaries of a community that I create in terms of privacy and authenticity, and use proxies to hide my IP number and to filter out banners, email addresses, and referrer fields. The Internet Junkbuster has an extremely powerful mechanism for filtering content, and it is built right into a proxy! But it would be absurd to say the Junkbuster shouldn't have been developed because an organization could use it. Indeed, one of the questions in its FAQ is, "How can I get my ISP to run the Internet Junkbuster?"

Tools which enable social functions to be exercised are being developed. The question is, will governments be more likely to use them, or people? I'd like to see them used by people to govern their interactions as they best see fit. Governments would obviously like to use them to extend their own mandate. I've often said that on the Net, people vote with their clicks. If people's behavior on the Net is contrary to how the government thinks they should act, there is obviously a source of contention, and resolving it isn't something we can do. The issue of governing legitimacy is an interesting and difficult problem.

Email Discussions:

At 10:42 PM 12/13/97 +1000, Irene Graham wrote:

>The original PICS specifications were intended to enable a multiplicity of
>rating systems. Two years later the number of rating systems can be counted
>on the fingers of one hand. Few organisations or community groups have
>evidenced any interest in developing rating systems.

In addition, those rating systems and filtering products that do exist are often, rightfully, criticized for being opaque: the biases inherent in the rating system and filtering decisions are not readily comprehensible to the user.

>On the other hand, great interest has been shown by governments.

Yes, of course. They are trying to extend their perceived mandates to a new forum.

>However, the existing PICS specifications do not lend themselves overly
>well to facilitating mass censorship by governments. PICSRules 1.1 is about
>to change that. (http://www.w3.org/TR/PR-PICSRules.htm)
>PICSRules is about a multiplicity of preferences (including, of course,
>those of governments), not a multiplicity of rating systems.

There are a couple of components to making decisions. First there is some bit of information, then there are some preferences about that information, and then a resulting action (order, sort, select, block, etc.). PICSRules captures the preference information. PICSRules signals the end of our work on the PICS specifications, and something like it was considered necessary to the platform from the outset. Lorrie Cranor has talked about the consideration of other scripting languages for trust management. I continue to believe that meta-data and trust management will be key to the development of the Web. PICSRules, and other processing languages, will be very important to preference exchange between users and agents, as well as to increasing the ability of engines to return targeted results.
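To make the three components concrete: in a PICSRules file the information comes from a rating service's labels, the preference is a rule over that service's categories, and the action is to accept or reject. The sketch below is illustrative only -- the service name, shortname, and "quality" category are invented, and the exact syntax should be checked against the Proposed Recommendation:

```text
(PicsRule-1.1
 (
  serviceinfo (
   name "http://ratings.example.org/v1.html"
   shortname "Ex"
  )
  Policy (RejectIf "(Ex.quality < 3)")
  Policy (AcceptIf "otherwise")
 )
)
```

Here the information is the hypothetical "Ex.quality" label, the preference is "less than 3", and the resulting action is to block; everything else is accepted.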

>That is the other reason to "bother with" PICSRules - to heed the calls of
>governments desirous of one global/universal rating system, which they can
>each easily adapt to enable a far more effective means of censorship of
>Internet content than presently exists.

PICSRules implies that there will be multiple systems, preferences, and ways of selecting/blocking content. I think one of the most outstanding difficulties with filtering systems today is their lack of transparency (the user can't see what is going on). The product, the rating system, or the method for including sites in a white list or black list is biased. Consequently, efforts to remove cultural biases from rating systems and to make them clear, unbiased, and simple can be a good thing IMHO. I think PICSRules is supportive of point 5 of the EFF guidelines (though you would obviously criticize it with respect to points 11 and 12):

http://www.eff.org/pub/Net_info/Tools/Ratings_filters/eff_filter.principles

5) Customer Choice and Control
* The customer should as much as possible be able to configure what is being filtered, such as by a user-friendly means of adjusting defaults for filtration/ratings categories, by selectively adding or deleting specific new sites or keywords, by turning on or off topics to filter for, or by swapping entire sets of filtration criteria, as examples.

* Customers should not be placed in the position of purchasing someone else's morality or preferences for lack of ability to customize or make meaningful choices. Instead, they need tools that help them filter out material they do not find appropriate. Notes: Systems based on the Platform for Internet Content Selection (PICS) could become "passively compliant" with this principle, in theory, as PICS allows for multiple ratings systems from which the user may select. However, this facility neither guarantees a multitude of ratings systems, nor that such competition would assure meaningful choices. Labels or filters built into search engines may not be particularly configurable by users - they may simply cause the search engine to refuse to index and list certain sites and content. Users should know about this, even if they have no control over it, so they can make informed choices about whether to use the service or not.

>However, a global rating system on its own, or even together with a
>critical mass of labelled content, won't work because of cultural and
>policy differences.

See our policy statement at:
http://www.w3.org/Policy/statement.html

W3C works on behalf of and with its membership to build and maintain an inter-operable network architecture. This architecture must allow local policies to co-exist without cultural fragmentation or domination.

>PICRules 1.1 provides an essential element of this proposed mass censorship
>system. It facilitates the development by any government (or other entity)
>of a country-specific profile, based on the global rating system, which
>matches their existing, for example, movie/video ratings such a PG, M, R,
>X, Refused, etc. http://www.theaustralian.com.au/national/4196812.htm

This could be done without PICSRules; PICS rating-system translators have already been given some thought by various entities. Regardless, a question to ask oneself when presented with the options of

1. a government stating that it will pass a law which makes various entities liable, or that content must be suppressed at the source, or
2. a government stating that you should/must use a specific preference file when browsing the Web

is: which is better?

I believe that your position is that Australia would do the first, and the policy would be bad but unenforceable. The net gain is that the law on the books would be moot, its implementation would be impracticable, and "free speech" would not be practically hindered. With PICSRules, the policy itself may be less onerous, but the practical impact would be troublesome. My response is, first, that this is a pragmatic issue of assessing the likelihood of various outcomes, where others may actually disagree with you. Second, it is not the position of the W3C at this point to try to weigh these various outcomes in the many different cultures and jurisdictions that want to use the Web, and to make value judgements about which is better. Though I will say that such an analysis did seem to be the reason PICS was started in the US/CDA context (and now that the Supreme Court has struck down the CDA, people's assessment of the balance and slipperiness (as in slope) of certain outcomes is changing). Also, the people working on the technology probably make these types of judgements for themselves, personally. I can't speak for them, but I myself, and I think most of the people involved, are still trying to push client-side, self-empowering solutions.


>However, once there is a global rating system, people in Sweden, Brazil,
>USA, or wherever, will be able to rate material with that rating system.

I think the global rating system is of concern in your context given the pragmatic assessments you make, but it is not something the W3C is responsible for, and it is orthogonal to PICSRules. Also, we are not in a position to publicly question and judge the motivations of our members. Work is produced according to the typical W3C process; there has to be a fair amount of interest and consensus.

>Given that in Australia it is a criminal offence to exhibit, sell, etc.
>certain types of off-line material if it is not pre-rated, (and the
>government is planning to make ISPs criminally liable for content provided
>by others http://www.afr.com.au/content/971213/inform/inform6.html),

If Australia's people and/or government want to censor, it is not the W3C's place to stop them. That is not to say we will build them such technology; our intent is to motivate self-empowerment.

>Paul Resnick states in his PICS FAQ "Some people argue that
>unrestricted access to information is a fundamental human rights question
>that transcends national sovereignty. W3C has not adopted that position."
>http://www.si.umich.edu/~presnick/pics/intfree/FAQ.htm

True. We would not pronounce such a thing one way or the other. In fact, the restriction of certain speech is considered consistent with fundamental human rights in most of the world (unfortunately, IMHO), including in some aspects of US legislation. From my own FAQ:

________

Does my country have a right to filter what I see?

The norms for what is considered appropriate speech obviously vary across nations and cultures. The W3C is not the proper organization for resolving cultural differences. However, some of these differences have been addressed by a number of international organizations. The following quotation demonstrates that international treaties on civil rights actually have provisions for restricting certain types of speech:

Americans don't rely solely on the First Amendment for guidelines on freedom of expression, according to Ann Ginger of the Meiklejohn Civil Liberties Institute. Also on the books are:

The Genocide Convention Implementation Act, 18 USC Sec.1091 CHAPTER 50A: Passed in 1987 by the U.S. Senate, the act made it a crime to try to destroy people on the basis of race, creed or religious orientation. More significantly, the act also criminalized direct public incitement to genocide, such as hate speech.

Article 20 of the International Covenant on Civil and Political Rights: Passed by the U.S. Senate, it became effective Sept. 8, 1992. The covenant prohibits any propaganda for war, as well as "any advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence."


http://www.fac.org/publicat/fafuture.htm


____

>P.S. The majority of the contents of this message is being posted to the
>fight-censorship list and the (Aust) Link list and newsgroups. It will also
>be forwarded to some other W3C people involved in the development of PICS.
>I invite any corrections, or other comments, regarding what PICRules 1.1
>facilitates - in either public fora or by whisper if that is preferred.

You may forward my response on to the appropriate fora. BTW: my ability to continue this discussion in email is unfortunately limited by both time (where does it go?) and typing stamina (RSI). However, I know this will be a topic at CFP, as well as at a forum created by the CDT and some meetings being hosted by the Internet Free Expression Alliance, where we are happy to participate.

On Mon, 15 Dec 1997 08:01:00 +1000, Michael Baker (mbaker@pobox.com) wrote:

[Edited to exclude much of the redundancy from the earlier post. -JR]

>* Article 19 of the Universal Declaration of Human Rights explicitly
>protects freedom of expression for all and specifically the "freedom to
>hold opinions without interference and to seek, receive and impart
>information and ideas through any media".

Mr. Baker,

Thank you for the very well-presented concerns regarding PICS and PICSRules.

I understand the issue of human rights is a problematic one.

>* This principle has been reaffirmed in multiple international
>agreements, including the International Covenant on Civil and Political
>Rights.

See Article 20. My point is that it is not our place to determine what is the proper use or exercise of free speech.

>* PICSRules 1.1 have been developed for, or can be used for, the
>purposes of:
>
> - preventing individuals from using the Internet to exchange
>information on topics that may be controversial or unpopular,
>
> - enabling the development of country profiles to facilitate a
>global/universal rating system desired by governments,

This is not our intent, though we have been very frank in stating that PICS technology could be used in such a way. That is not to say we advise or endorse this. Our intent has been the exchange of preference files (particularly when it comes to privacy) and to continue to enable user empowerment on the Web.

>* PICSRules 1.1 go far beyond the original objective of PICS to
>empower Internet users to control what they and those under their care
>access. They further facilitate the implementation of server/
>proxy-based filtering thus providing a more simplified means of
>enabling upstream censorship, beyond the control of the end user.

Some form of trust management (how to make decisions given meta-data, certs, vouchers, digitally signed statements, etc.) has long been a part of our work on meta-data. Lorrie Cranor did a good job of explaining this on the Fight Censorship list.

>We draw to W3C's attention that:
>
>* similar techniques that block Internet sites have prevented access
>to innocuous speech, either by deliberate intent, through oversight,
>or as a result of ignorance of the infrastructure of the Web,

This is lamentable, and I'm always glad to see people such as yourselves call those organizations on it.

>climate of confidence for the furtherance of electronic commerce. In
>fact, filtering and rating systems intended for the protection of
>minors have proven inefficient and counter-productive,

I am not sure I would agree with this.

>even a cursory analysis of PICSRules 1.1 indicates that the likelihood
>of community organisations developing complex profiles is slim. The
>necessary expertise is more likely to be acquired by governments
>seeking to restrict access to content and inhibit freedom of
>expression.

Hopefully, you would just set your preferences in a nice GUI, then dump them to a file; we have working test code that does this at the W3C.

>It seems apparent that PICSRules have been developed in response to
>calls from governments who seek a more efficient and effective
>technological means of restricting human-to-human communications.

I do not believe this is the case.

>From: Seth Finkelstein <sethf@MIT.EDU>

>Subject: Re: FC: W3C's Joseph Reagle responds to GILC, defends PICS
>
>[I'm really not spending much time on this these days, but the combination
>of logical fallacies, fluff, and "Braunism" in this message motivated
>me to write a comment. Feel free to forward around to lists]
>
>[Much snipped, I'm just going to hit the high points]
>>>At 10:42 PM 12/13/97 +1000, Irene Graham wrote
>> From: "Joseph M. Reagle Jr." <reagle@w3.org>
>> PICSRules implies that there will be multiple systems, preferences,
>> and ways of selecting/blocking content.
>
>    This is a false statement. PICSRules doesn't "imply" that at
>all - it may PERMIT it, but that doesn't mean one system won't become
>all that matters, by government fiat. Especially if that is the one
>connected to criminal liability if not used.

    Whether there is one vocabulary is independent of whether you have different rule sets.

>> Consequently, efforts to remove cultural biases from rating systems
>> and to make them clear, unbiased, and simple can be a good thing IMHO.
>
>    Except that destroys their utility - nobody wants a system
>that treats a medical text, a crime-victim autopsy photo, a painting,
>and an XXX movie identically (i.e, showing a nude body).

    Agreed. There are benefits to subjective as well as less-subjective systems.

>> Regardless, a question to ask ones-self when presented with the options of
>
>> 1. a government stating they will pass a law which makes various entities
>> liable or that content must be suppressed at the source
>> 2. or a government stating that you should/must use a specific preference
>> file when browsing the Web
>>
>> which is better?
>
>    This is a VERY common rhetorical trick of "PICS propaganda", it's
>a classical excluded-middle logical fallacy. The problem is
>the above two options are not exclusive-or (one prevents the other), but
>in fact possible AND's (number 2 helps implement number 1). In fact,
>one of the chilling aspects of the US Communications Decency Act

    It wouldn't make much sense for a government to say, "X is illegal; if you are a content provider or ISP you must censor it, oh, and users' clients must ban it too." And installing client-side filtering does not lead to the suppression of speech (where I have to refrain from speaking).


>> With PICSRules, the policy itself may be less onerous, but the
>> practical impact would be troublesome. My response to this is that
>> this is a pragmatic issue of assessing the likelihood of various
>> outcomes, where others may actually disagree with you.
>
>    Note how the passive statement of "others may actually
>disagree with you" is snidely substituted for an actual support of the
>assessment. It's not like that "disagreement" is a revelation at this
>point. That's another stock bit I've seen many times lately, the retreat
>from actually supporting the PICS position by just stating there's a
>"different" position. Sure, but how well does it stand up to critical
>examination? We never find out, because that statement of existence
>is used as justification.

    I was merely pointing out that much of this debate is predicated on assumptions about what things are desirable, assessments of the world, and how best to move the world toward your assumptions. We may share the same assumptions (hard to say), but I am quite sure my assessment of where the world stands, and how to affect it, is different.


>From: "Michael Sims" <jellicle@inch.com>

>To: fight-censorship@vorlon.mit.edu
>Date: Tue, 16 Dec 1997 08:30:00 -0400
>Subject: Re: W3C's Joseph Reagle responds to GILC, defends PICS
>CC: reagle@w3.org, sethf@mit.edu, rene@pobox.com
>
>
>Joseph Reagle wrote:
>
>> At 10:42 PM 12/13/97 +1000, Irene Graham wrote:
>>
>> >On the other hand, great interest has been shown by governments.
>>
>> Yes, of course. They are trying to extend their percieved mandates
>> to a new forum.
>
>Since you grasp this point, why are you developing a tool that helps
>them do that? Two choices I see: W3C thinks it would be nifty to
>segregate the net according to national boundaries, or they are
>fools. I believe the first is the case, and I'm wondering, WHY?

I think creating boundaries is useful. I don't want to see the Web "fragment" (into a US Web, a Chinese Web, a Microsoft Web, or a Netscape Web, in terms of both technology and culture); I want to see people create the boundaries. In the first instance, all the content is still there, and people will find ways to it if they want; in the second, we have China behind its Intranet.

>You're going to claim with a straight face that this is useful for
>something other than denying access, I suppose.

Hard to say. My interest in this type of technology comes from our privacy work (that Lorrie discussed earlier). But this will be based on RDF and some other, more generalizable rules. (The precursor to PICSRules 1.1 was more generalizable, but considered to be too complex, I believe; I'm not the authority on that, though.) We've played with privacy disclosure schema in PICS 1.1, and even demonstrated the idea of "recommended settings" in that context before the FTC during the summer.

>Current meta-data already performs all functions that you propose for
>PICSRules, except one only: providing a seamless and assured manner
>to deny access to forbidden material.

    Yes, PICSRules is in the context of that which motivated PICS: children accessing inappropriate content.


>> PICSRules, and other processing languages will be very important to
>> preference exchange between users and agents, as well as increasing
>> the ability of engines to return targeted results.
>
>Untrue. Search engines already index the document and meta-data, and
>return very focused results. Just how does proposing a spec to deny
>propogation help search engines?

    I pass on my preferences with respect to the quality, code safety, content, or privacy of the site to the search engine.

>Seth pointed out that PICSRules only permits, not implies. I would
>add that there are strong market forces, as always, that would push
>for one universal rating system. There COULD be a variety of movie
>ratings in the US. There COULD be a variety of ratings in Australia
>for all publications. There COULD be, but there aren't. No one
>wants to rate their crap using 12 different systems, or even one.
>Only the one single system which achieves critical mass through
>government or Microsoft pushing it, such as, say, RSACi, will matter.

But I don't believe PICSRules promotes a single system either.

>> I think one of the most outstanding difficulty with filtering
>> systems today is there lack of transparency (the user can't see
>> what is going on.)
>
>As opposed to having the content denied at some nameless upstream
>provider. I'm sure the user gets a lot of feedback that way.

    I think that should be transparent as well.


>> 5) Customer Choice and Control
>> * The customer should as much as possible be able to configure
> ^^^^^^^^
>
>Not the government. "Customers" can't configure upstream proxies,
>sorry.

    Yes, of course. And no, we haven't taken a position on whether governments can do this or not. The difficulty, though, is that even if you create this stuff for the client and for customers, it is hard to keep governments from using it too.

>Seth addressed this. My two cents: IF that either/or existed,
>government liability is the way to go. There the bans are a matter
>of public record.

    I believe, from a policy point of view, explicit "bans" are preferable to behind-the-scenes maneuvering and selective enforcement.

>> Also, the people working on the technology probably make these types
>> of judgements for themselves, personally. I can't speak for them,
>> but for myself, and I think for most of the people, are still trying
>> to push client side, self empowering type solutions.
>
>Hmmm, let's see. Please provide any one URL where anyone from W3C
>pushes a client side solution/speaks against upstream filtering in
>an official setting. I've not seen that.

No, we haven't taken "official" positions with respect to policy. The staff can and does take personal positions.


> "threat which on its face and in the circumstances in which it
>is made is so unequivocal, unconditional, immediate and specific as to
>the person threatened, as to convey a gravity of purpose and imminent
>prospect of execution."

My point being that the protection of absolute free speech is not without its caveats in any legal system, and we aren't competent to draw the line.