W3C Privacy Workshop Day 2

13 Jul 2010


Dominique Hazaël-Massieux (W3C; <dom>)
Thomas Roessler (W3C; <tlr>)
Caspar Bowden (MS), John Morris (CDT; <jmorris>)
Patrick Walshe (GSMA)
Charles Brookson (BIS)
Mischa Tuffield (Garlik; <mischat>)
Marcos Caceres (Opera; <marcos>)
David Singer (Apple; <dsinger>)
John Carr (European NGO Alliance for Child Safety Online)
Kristin Tigart (Tessitura Network)
Alissa Cooper (CDT; <alissa>)
Frederick Hirsch (Nokia; <fjh>)
Robin Berjon (Vodafone; <darobin>)
Graham Steel (INRIA)
Thomas Duebendorfer (Google)
Simon Moritz (Ericsson Research)
Hannes Tschofenig (Nokia Siemens Networks)
Bruce Lawson (Opera)
Jens de Smit (SURFnet)
David Rogers (WAC; <drogersuk>)
WonSuk Lee (ETRI)
Kangchan Lee (ETRI)
Sören Preibusch (University of Cambridge)
Ioannis Krontiris (Goethe University Frankfurt)
Cullen Jennings (CISCO; <cullenfluffyjenni>)
Zoli Piroska (Secret Sauce partners)
Ian Fette (Google; <ifette>)
Jochen Eisinger (Google; <jochen>)
Karl Dubost (<karl>)
Bryan Sullivan (AT&T; <bryan_sullivan>)
Dong-Young Lee (LGE)
Youn-Sung Chu (LGE)
Kasey Chappelle (Vodafone)
Richard Barnes (BBN; <rbarnes>)
Richard Carlsson (Ericsson)
Aza Raskin (Mozilla)
Patrick Gage Kelley (CMU; <pkelley>)
Franco Papeschi (Vodafone)
Mark Lizar (Smart Species)
Soonho Lee (SK Telecom; <soonho>)
Daniel Appelquist (Vodafone; <dka>)
Aram Perez (Qualcomm)
Kai Hendry (Aplix; <hendry>)
Deirdre Mulligan (UC Berkeley)
Henry Story (<bblfish>)
Chairs: Dan Appelquist, Thomas Roessler
Scribes: Patrick Kelley, Kai Hendry, Bryan Sullivan, David Singer


Thomas Duebendorfer: Making Privacy a Fundamental Component of Web Resources

See also: position paper / slides

Duebendorfer: Understanding the Web Browser Threat - 45% of people don't use the newest browser
... Web Browser Security Update Effectiveness - three weeks after an update how many have upgraded
... how to improve privacy - better transparency, more control
... put the right security building blocks in place to build features
... build privacy features, on top of security
... did a survey on sharing in social networks
... levels of settings are inconsistent across networks
... because the settings are inconsistent it is hard to understand what is going on
... Facebook and MySpace have proprietary APIs; OpenSocial has no API for privacy settings and applies network defaults after upload
... decided we need a very fundamental, clear way to have privacy settings
... proposal: generic access controls, available in APIs, and controls in the interfaces of the uploading-apps
... you need the access control not just on one item, but on a group of items
... can share with open groups -- a group that may have 10 known members today, 1000 members tomorrow
... and the networks may not be sharing the lists
... the future privacy implications of sharing with these groups will not be known
... implemented a proof of concept on Orkut
... created a photo sharing application on Android
... called Photocial, uploads to Orkut, Myspace, Facebook
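Duebendorfer's proposal of generic access controls over groups of items, shared with "open" groups whose membership can grow, could be sketched roughly as follows. All class and method names here are illustrative, not any social network's actual API:

```python
# Hypothetical sketch of "generic access controls": permissions attach to
# groups of items and to audience groups whose membership may change after
# sharing. Purely illustrative; no real social-network API is used.

class OpenGroup:
    """An audience whose membership can change after sharing."""
    def __init__(self, name, members):
        self.name = name
        self.members = set(members)

    def add(self, member):
        # Future members gain access to everything already shared.
        self.members.add(member)

class ItemGroup:
    """A batch of items (e.g. a photo album) sharing one ACL."""
    def __init__(self, items):
        self.items = list(items)
        self.audiences = []

    def share_with(self, group):
        self.audiences.append(group)

    def can_view(self, user):
        return any(user in g.members for g in self.audiences)

album = ItemGroup(["beach.jpg", "party.jpg"])
friends = OpenGroup("friends", ["alice", "bob"])
album.share_with(friends)

assert album.can_view("alice")
assert not album.can_view("mallory")
friends.add("mallory")   # the open-group problem: access grows silently
assert album.can_view("mallory")
```

The last assertion illustrates the concern raised in the talk: anyone added to an open group later silently gains access to everything previously shared with it.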

Preibusch: why must we upload multiple times with different settings

Duebendorfer: currently this practically doesn't exist.

Preibusch: this will be the way forward, though - do people actually upload to multiple networks?

Duebendorfer: no solid data on this, many people use at least two anecdotally, personal/professional

<mischat> henry is talking about foaf+ssl as a universal identity solution

Story: -- this does exist, there are federated identity solutions

<cullenfluffyjenni> ah

Story: tying this in, through different resources around the web - can map this to an ontology

Duebendorfer: need unified user management, but it isn't practical yet

<ifette> Can we stick to clarifying questions?

<ifette> that was hardly a clarifying question

??: uploading three images is just insane

??: hosted in the cloud, is there a public bug database for facebook to see what they have screwed up on

<alissa> http://www.tosback.org/

Chappelle: there is a site that will look for policy comparisons - for privacy policies on the web
... the ACL contains lots of information about users' relationships; need to be careful about the relations it exposes

Thomas: can be careful with who has access to the unified ACL information

??: talked about managing change in privacy, changes happen, but we think statically - the open groups change

??: is there any solution about the open group management

Thomas: a UI issue, be aware of sharing with current + future members
... there is a presumption that you trust who the admins allow into the group

John Morris: Binding Privacy Rules to Data: Empowering Users on the Web

See also: position paper / slides

Morris: one slide summary ---- © 2010 CDT

Morris: there is value to putting these words on the document
... creates a strong legal recourse

<dom> (I'm not sure that the copyright sign still carries any specific legal value — copyright exists by default, doesn't it?)

Morris: these creative commons logos - create a more granular set of restrictions
... the current model of privacy on the web is a failed model
... take it or leave it doesn't really work
... policy makers both in europe and the US are beginning to realize that
... I think most people at this meeting agree that what we have now doesn't work, and we are groping to find a way to do it better
... the idea is to empower users to create privacy rules, and then bind those rules to the information
... so they travel with the information at all times
... this was tried in the IETF starting in 2001, with Geopriv
... and that was focused on location privacy
... with two very simple rules - you can have my location, you can do what you want with my location, but do not do anything else

<rbarnes> location object format with rules: http://tools.ietf.org/html/rfc4119

Morris: the second rule was retain it for x long
... proposed in 2008 for W3C Geolocation API

<rbarnes> 1) Retransmission / secondary use

<rbarnes> 2) Retention limit
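As a rough illustration, the two rules above could travel with the location object like this. This is not the actual PIDF-LO XML schema of RFC 4119; the field names are simplified assumptions:

```python
# Rough sketch of the two Geopriv rules (retransmission / secondary use,
# and retention limit) bound to a location object, cf. RFC 4119's
# <usage-rules> element. Field names are simplified, not the real schema.
from datetime import datetime, timezone

location_object = {
    "position": {"lat": 48.86, "lon": 2.35},
    "usage_rules": {
        "retransmission_allowed": False,                   # rule 1: no secondary use
        "retention_expiry": "2010-07-20T00:00:00+00:00",   # rule 2: retain until then
    },
}

def may_retransmit(obj):
    return obj["usage_rules"]["retransmission_allowed"]

def must_delete(obj, now):
    expiry = datetime.fromisoformat(obj["usage_rules"]["retention_expiry"])
    return now >= expiry

assert not may_retransmit(location_object)
assert must_delete(location_object, datetime(2011, 1, 1, tzinfo=timezone.utc))
```

As discussed below, nothing technically stops a recipient from stripping the rules off; the point is that, having received them, the recipient cannot later claim ignorance.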

<ifette> this makes little sense to me. If you're talking about copyright, it's dubious how much it's actually helped for things like the music industry. The one thing it does help with is preventing other large entities from distributing data -- but for privacy-sensitive data, you're not worried about large sites selling your information, you're worried about the info being out there and accessible at all

<ifette> e.g. photos of you

Morris: this talk is about - the idea of binding rules to data
... what is the value of binding rules?
... how are the rules enforced, and what relevance does this have

<dom> John Morris' slides

<alissa> ifette, not sure who the "you" is that is not worried about large sites selling your info. some people certainly have that worry.

<rbarnes> ifette, these rules could act as guidance for how the photos could be handled

Morris: this is not a technical way to make sure that no one violates my privacy
... but it provides a legal means

<rbarnes> e.g., if retransmit==no, then it might signal to facebook not to open up beyond friends, even if tagged

Morris: data privacy commissioners, the FTC, private legal actions, and even market response could take over
... this does not guarantee perfect compliance, just as copyright does not

<ifette> rbarnes, even the speaker is admitting that the rules won't be adhered to

Morris: (i.e. file sharing)
... but it produces substantial compliance, and that is of a lot of value
... in the U.S. the question of whether the government can get information

<alissa> ifette, the rules don't have to work 100% perfectly in every instance in order to be of any value at all

Morris: has much to do with the user's privacy expectations
... if users were to bind these rules - it is evidence of the users' expectations of privacy

<rbarnes> ifette, the utility is not that the rules are absolutely enforced, but they can still have some value

Preibusch: can you give details on market response, i think that is quite interesting

Morris: if it becomes known that i tell pizza hut where I am so that I can get a pizza
... but tell them, don't redistribute my information
... then there may be a market backlash if it is found out they are redistributing this information
... we did hear yesterday -- users may take the dollar discount even if they know there is less privacy
... so the market response may not clearly be a compelling response all the time

<ifette> alissa, rbarnes, i'm much more interested in proactive than reactive for things like privacy

<ifette> once something is out there and it's privacy sensitive, i give up

<rbarnes> ifette, i think there are both proactive and reactive use cases, let's talk in more detail offline

Morris: there still needs to be notice, users need to understand how their information will be used, but this is not currently working adequately

<alissa> that's fine. data brokers and government litigants do not give up.

<tlr> rbarnes, ifette, don't take that discussion offline -- it's precisely the conversation that we need to have

Morris: you could say, I am fine with coupons, that is fine, but don't give that information to someone else
... and then if I learned the grocery store gave it to the insurance company that would be upsetting
... there are lots of objections to binding rules
... and I try to reject most of them, but they are all well-founded concerns
... oh well we talked about it in Geolocation - we don't need to talk about it again
... the user interface will be hard, pretty much any privacy focused interface is hard
... but there is value in moving towards a clear, consistent UI
... there are ways to simplify the process of what is going on
... users will blame the browser, when websites break the rules
... and there is some risk here, but some of that can be mitigated by the UI
... don't say "what rules do you want the website to follow"
... say "what rules do you want me to send to the website"
... don't claim the browser is the perfect protector of everything out there
... it is a big ugly world out there

<dom> "Geolocation is not a privacy success story", "status quo is not working" — is this referring to specific events/reports?

Morris: it may well be true that inadequate security protection could lead to a catastrophic loss
... but i would argue that privacy is not like that
... getting a little bit of protection, or getting half protection, is better than getting no protection
... if more companies honor my privacy than there is gain
... while we are not sure this will work, what we currently have isn't working

<dom> hmm... in some cases, pretense of privacy is as damaging as pretense of security; I'm not sure "some privacy is better than no privacy" is true in all situations

Appelquist: having been involved in the Geolocation group - and I was there at the meeting in 2008 - when we had these discussions, personally I think this is exactly the opportunity to revisit the question
... in light of the implementation experience; I wanted to take off my neutrality hat and speak in support of John

Fette: having no desire to revisit geolocation, at the beginning you used the example of copyright, which has a legal framework -
... in the US, if I took a photo of you passed out, I own the photo, and you have no right to that data
... do you think this would help in a material way

Morris: Google execs were convicted in Italy for breaking their laws; serious problems with that decision, but this proposal only deals with rules about the information you already have - nothing to do with people depicted in the photograph
... and yes copyright has a clear legal frame and structure, but privacy has a pretty good legal frame as well.

??: are we building a protocol framework in order for this to be useful?

Morris: the original P3P from 15 years ago, did envision an expression of rules, and then a negotiation process, and then an implicit acceptance. that didn't get there. but there is some value in sending these rules, even if recipients don't understand what they are getting initially
... ultimately, I think Pizza hut can make a choice - where is the closest pizza hut, but John is saying I can't use this information for secondary market purposes, and how much value am I getting from selling pizzas, so I will give John his service, and honor his preference.

<Singer> but the 'secondary marketing' may be requesting a sign 'pizza hut 0.5m' from the city, citing all the searches in that area as evidence that they need the sign...

Appelquist: or they could give you a discount on a pizza if they can market too
... copyright has a long, long history, and I like your idea, but it is going to take a long time, before you can put a few letters on the screen that explain exactly what I want. and even creative commons, I don't know what all those symbols are, I use them with my flickr photos

Morris: yeah we have had a couple hundred years of copyright practice...
... the notion that, because there is no technical way to enforce a rule set, it should be thrown out is just silly
... laws don't enforce themselves

Singer(?): it seems you are saying that, as a user, I have to think about every rule I want to send out into the cloud, which sounds onerous, and that, as a business, I need to accept all of those


Roessler: there is a difference between todays unknown, and the idea of sending some information along - and move this to general discussion

Morris: let me respond a little bit.
... for users, yes, we need to find ways for users to set defaults
... on the business side, Alissa is going to suggest a limited set of privacy rulesets that could actually be conveyed

Hirsch: legal expectations, so rule is probably a bad term, we are talking about setting a small number of expectations about what will happen, maybe we are making it sound more complicated than it is. but copyright isn't all that confusing.

Morris: the word rule in a technical body may well have implications here, but it might be the right word to use with users

Story: the relation between copyright and this, is interesting
... perhaps we are dealing with "data copyright"

Morris: you can't really copyright pure facts, and a lot of stuff that is personal about me may be a pure fact. my birthdate for example.

Bowden: I am broadly sympathetic to this proposal; I think the analogy with copyright is a bit of a red herring. But it designs in a channel for the user to affirm they have, or have not, given consent for secondary purposes. I think it will be tough to create something that just slots in, but this will allow more of what the users want to specify. And if we, as designers, go about designing UIs for this challenge, we should be rather devious about it,
... otherwise we get a closed system where the user has less and less choice about whatever pops up as legal balderdash. But this is something I think we should incrementally move towards and support.

de Smit: you showed the copyright logo; it is embedded in the presentation, so the copyright comes with it naturally. The rules you are proposing are attached, not embedded.

Morris: the IETF attempted to address that, with geopriv, defining an envelope, and declaring that whenever you send the location, the entire envelope must be transferred, but you are absolutely right there is no technical binding.

de Smit: i am wondering if you try to implement this, if every privacy notion you send out will just fall off at the boundary.

Morris: they might, but later they cannot deny that they stripped it off
... they may choose to ignore the rules, but they would not be able to deny having received them

<dom> (re TOSBack, I had contemplated suggesting browsers plugins that kept track of changes in privacy policies in services that the user is registered with, and which would warn you when you log in a service when the policy has changed)

Aza Raskin: Privacy: A Pictographic Approach

See also: position paper / slides

Raskin: take your negative no energy, and help me come up with ways to say why it won't work - and how it can.
... starting with P3P, this idea is not new.
... trying to take an ontological tack
... think P3P was flawed from the beginning
... even the logo sort of shows you this was not designed for end users
... should point out I dropped out of grad school.

Raskin: This graph - in particular is how data gets moved around when we send a URL to check if it is a phishing site.
... this showed us our privacy policy was wrong
... People do not know what they want
... We are the experts, if we can't figure it out
... how can they

<dom> (people do not understand the ramifications of what they do — indeed, otherwise chess would be a very boring game)

Raskin: People do not understand the ramifications of what they do,
... you can easily get their SSN from gender, birthdate, hometown
... The point here is that people did not think these things were very bad to give up
... Similarly, every time we ask the user a question they don't understand or don't care about, we have failed.
... Before you can use Microsoft Help, answer this question
... about indexing your help or not
... and the user says - you suck
... We should always respect the user.
... Asking the user, is not respecting them - we need to respect their desires to choose, but also not to distract them.
... Focus on 1 question - what attributes of privacy should people care about?
... Creative Commons? (for privacy)
... Washing machine symbols for clothes actually seem closer to me
... Irreducibly complex?
... Each company wants to talk about their particular case.
... We came up with what we think is a pragmatic solution
... We have some icons that are understandable, you write your privacy policy as normal, and then you add some icons and some text
... and this text says - no matter what else the privacy policy states, at the very least we don't share with third party people
... that is the basic idea
... and this has to be read+write
... Creative commons - everyone is making decisions about what to put on their work
... With Privacy policies - they have to be readable by laypeople but not choosable by laypeople
... Also, we are trying to back this up with actual product.
... We are taking the notion of OpenID, connecting to sites, etc.
... When you go to the site, you can say connect to the site, with the browser's knowledge of who i am
... And I can go through progressive disclosure with the browser being my intermediary

Raskin: Help them make actionable choices at the point where they connect to the site.
... we can bubble up the most important things of the privacy policy and show them to the users.
... the 7 things that matter most
1. is your data used for secondary use, and is it shared with third parties
2. is your data bartered or sold
3. your data is given up without a subpoena

Fette: aggregate marketing - does that count as secondary use?

Raskin: gonna let the lawyers figure that out
4. This site has a security rating of 2.5....

<pkelley> (scribe note, oh my god)

Raskin: the answer to that, and to the passive-aggressive question of whether bad actors will simply not use this:
... if the major sites adopt this, other sites would fall in line.
... and you can augment this with crowdsourcing, but this is a start
... we haven't figured out how to inform the users of the complexities here, for explaining security
... maybe five different levels that feel reasonable, we are ok
... does the service give you control of your data? delete? export?
... Does the service use your data to build and save a profile
... that was number 6, and service control of your data is 5

Chappelle: sorry if I have problems with these, but ecommerce sites are an issue
... i can't think of a company i have worked with that doesn't do all of these

Raskin: the language needs to codify business as not usual

Singer: with things like washing machine labels, 40deg means 40deg, with things like legal aspects of privacy, it is not consistent, it is domain specific. do we not run the risk of the icons becoming meaningless?

Raskin: 1. I am a designer and product person
2. I sat down with Joi Ito from Creative Commons; once we have the strawman, he can take the global community of Creative Commons lawyers and figure out if the icons work
... the last icon is ADS; ADS with an eye means tracking.

Berjon: i really like the icons idea. I have concerns that that is just too many icons.
... they have variations, perhaps in terms of deployment might you start with three?

Raskin: icons + description always

Singer: it rests on the idea that these are an exception to the baseline normalcy

Raskin: we would fail if we go to every site and put all these icons on every site

??: this one with the stars, that one is bad. everyone thought there would be an ever expanding number of licenses, but what has happened in practice, is we have had less and less of them, that got applied to a very large number of cases.

Hirsch: there are nuances they are trying to get at, you can't just override everything, and how much would be stripped out

Raskin: we haven't sat down with this yet - Zittrain is going to get involved

Hirsch: there is a whole business model for security ratings & stars
... if I don't see one of these icons, is the opposite true?

Raskin: you only care about the bad things, which is why we show the bad things.

Appelquist: what if I want to use your data for secondary use?

much discussion

de Smit: based on terminology - do you expect popular sites to have three or four icons

Morris: data given up with a subpoena - this is a real issue. every company in the world does this

Raskin: your data is sold, here is why - your data isn't sold at all

Dubost: you are already at the website, so it is too late

Raskin: before you go to the website, i can see them before the site gets my information

Fette: with regards to ads, most sites probably don't know what ads they run

<hendry> be good if there was "highway code" published at PRIVACY-ICONS.org so you can quickly look them up. I reckon we'll need more than 3.

<karl> reduce the set of icons to 2 or 3 with binary information, and learn from that experience; then move on

<hendry> i wonder how many icon permutations are there with Creative Commons. Must be quite a few

<karl> hendry, Creative Commons permutation - http://creativecommons.org/licenses/

<hendry> karl, thanks. so there are 6 there.

<fjh> note that lawyers have reasons for writing text, expressing nuances etc, hence need to examine how much might need to be removed

Mulligan: this will fail for all the same reasons as P3P; this forces the decisions to stay in the same hands, and not switch to the users' hands, which is the radical departure of Creative Commons

Morris: this is just another way to have companies say take it or leave it for users

Singer: people are thinking new and inventive things to do with my data that people haven't thought about yet

??: what happens when the site changes their privacy, how am i going to know that I have to go look for these bad icons.

<fjh> use of such icons as a notification mechanism seems to be possible


Story: an icon that says your rules are going to be looked at.

<hendry> i like ifette's point about using real world examples like Google with these icons. it won't look pretty, will it?



<karl> PERSONAL PRIVACY AND THE INTERNET June 1997 http://epic.org/reports/surfer-beware.html

Bryan Sullivan: Some Perspective on User Data Privacy

See also: position paper / slides

<tlr> http://www.w3.org/2010/api-privacy-ws/slides/sullivan.pdf

<karl> Privacy Policies without Privacy Protection - December 1999 - http://epic.org/reports/surfer-beware3.html

<inserted> ScribeNick: hendry

Sullivan: identifies a number of roles and responsibilities in the eco-system

<karl> Top 10 Mobile Device Privacy Policies

<karl> [I wonder what is happening when a webcam is controlled by someone to take an image of another person for uploading on a site.]

<karl> [there is also the issue of the same device used by multiple people]

<dom> Bryan's slides

Hirsch: question about role of guardian/parent

<fjh> note, child protection is really a case of delegated authority, applicable to elder law as well

<fjh> how does BONDI deal with child protection

Sullivan: the policy does not define the roles

<fjh> bryan's answer - nothing specific is explicit in the policy, out of scope of the XACML definition

Sullivan: how policies are managed is not figured out; the client side (end user) is obviously a good place
... group policies can be modelled in the BONDI policy proposal (answering Sören's question)

Alissa Cooper: Privacy Rulesets: A User-Empowering Approach to Privacy on the Web

See also: position paper / slides

Cooper: talking about the privacy rulesets that were proposed earlier by John
... efforts to reduce privacy policies to icons / nutrition labels
... argues people have their own privacy policy that they want to identify
... talks about Creative Commons, as raised earlier, as an example of a way of expressing these rules
... acknowledges the existing privacy rules proposal is too complex, leading to misunderstandings
... some "primary" purposes like the service resizing the image on upload
... secondary purposes is when the service provider uses your data, in other ways, that's not part of the core service to the user
... explains retention policy, long could be indefinite

<fjh> least permissive = most private, marketing choice of name?

Cooper: "least permissive" ~ only primary use, little or no secondary use, and no retention

<dom> (yes, "least permissive" has a bad connotation that "most private" doesn't)

<dom> Privacy Rulesets draft submitted to DAP WG

Cooper: something about combinations not handled ? (couldn't hear)
... gives some examples of sites and corresponding policy
... open question how to communicate policies to users
... shows 7 options for licensing a flickr image
... upfront widget install could show policy
... shows browser preference panel
... as a place where privacy rulesets can be shown
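One way to picture the rulesets being discussed: three axes (secondary use, sharing, retention), with "least permissive" meaning the most private value on each axis. The axis names and orderings below are illustrative assumptions, not the DAP draft's actual vocabulary:

```python
# Illustrative sketch of a privacy ruleset over three axes. For each axis,
# values are ordered from most to least private; "least permissive" combines
# rulesets by taking the most private value per axis. All vocabulary here is
# assumed for illustration only.

AXES = {
    "secondary_use": ["none", "contextual", "any"],
    "sharing": ["none", "affiliates", "third_parties"],
    "retention": ["none", "short", "indefinite"],
}

def least_permissive(*rulesets):
    """Combine rulesets by taking the most private value on each axis."""
    combined = {}
    for axis, order in AXES.items():
        combined[axis] = min((r[axis] for r in rulesets), key=order.index)
    return combined

user_pref = {"secondary_use": "contextual", "sharing": "none", "retention": "short"}
site_policy = {"secondary_use": "any", "sharing": "affiliates", "retention": "short"}

assert least_permissive(user_pref, site_policy) == {
    "secondary_use": "contextual", "sharing": "none", "retention": "short"
}
```

This also shows why a small fixed vocabulary matters: combining or comparing rulesets is only mechanical when every party uses the same ordered values.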

Morris: clarification - least permissive allows contextual advertising

Fette: asks whether, if a user is given an option, they will just choose least permissive and be done with it

Roessler: is it a global preference?

Cooper: does require people to care, otherwise it will fall to defaults
... argues people will begin to care

Sullivan: the noscript extension could be an implementation of a particular policy

Cooper: the policy can be setup as you go (actionable)

Cooper: there can be conflicting rulesets (answering fjh question with regards to its simplicity)

Hirsch: "the change over time" problem

<fjh> if you change your choice of ruleset in a later similar transaction, then I would argue that should apply to that current interaction, but not to the previous one, since that had a potentially different context (even if at same provider)

Berjon: asks if the ontology can be used elsewhere

<fjh> one repeated issue during the workshop is that things can change over time, thus we need to deal with notification of changes, as well as clearly defining whether choices apply over time or only to current interactions.

Cooper: conceptually yes, pragmatically no

Fette: is there any proof to show that Creative Commons reference point is actually understood?

Cooper: rephrased it - only the people who USE the data understand it (a smaller subset)
... agreement that it's a valid point; proof is missing that CC is a good reference point for privacy issues

Robin Berjon: Privacy Workshop Position Paper - The DAP Perspective

See also: position paper / slides

Berjon: talking about the death of privacy thanks to DAP (joke)

<jmorris> on "does creative commons work" -- a good question, and I have not read the database linked from this page, but it might be a start to answering this question: http://www.readwriteweb.com/archives/does_creative_commons_work_database.php

Berjon: designing security into APIs, "naturally secure". it's a hard problem.

Hirsch: asking for help from the privacy community here

<lkagal> followup on question about CC usage http://dig.csail.mit.edu/2009/Papers/ISWC/policy-aware-reuse/paper.pdf

Berjon: educating people about how bad the situation is today

<karl> [should there be participation of desktop data vault software implementers: iPhoto, Addressbook, etc on the mac to include the same APIs.]

<mischat> hendry: http://dspace.mit.edu/bitstream/handle/1721.1/53327/550582069.pdf?sequence=1

<mischat> oshani's work on CC ^^

<hendry> mischat, argh, PDF & 3G don't mix

<lkagal> Another followup on CC usage (sorry can't resist) http://www.switched.com/2007/09/21/virgin-mobile-steals-teens-flickr-photo-for-ad/

Berjon: mentions crowd sourcing WRT policies

<mischat> that is a classic example of not handling/abiding-by CC policies, nice one lkagal

Hirsch: different markets with needs and requirements
... aiming for simplicity, the minimum bar to meet the needs

Dubost: are there desktop implementors involved?

Berjon: yes, mozilla guy working on addressbook

Moritz: aren't we moving from security to privacy?

Hirsch: yes

Berjon: no
... privacy problems maybe easier to address at this point

Eisinger: wanted to know of companies that use input tracking as a product

Berjon: [scribe: gave two examples i didn't hear (help)]

<cullenfluffyjenni> company that tracks cut and paste : http://www.tynt.com/

Preibusch: talks about privacy negotiations and how can businesses facilitate this?

<jochen__> and "addthis"

Fette: too complex to implement in the backend for companies to facilitate this

Sullivan: we'll see domain specific APIs. emphasises the onus is on the user.

Cooper: argues privacy policies will be negotiated over a long period of time

Raskin: 1) does a global default undermine privacy?
... 2) Social networks are quite different to companies who just sell stuff, how can policies reflect this?

Cooper: identifies opportunities here ?

fjh: ??

Chappelle: (identified an example which is quite difficult for a user to express)

Eisinger: argues that the better ad networks will just do the right thing to attract customers

<karl> [device apis can be used in many contexts which are not devices. for example proxies to collect the data you are sharing and understand your own shared data]

Moritz: asks how these rulesets (Geopriv?) can integrate with the DAP APIs

<bblfish> My question/request was that one should publish the "rules" as an ontology for privacy, in such a way that it can be used not just in browsers via javascript API, but also in a RESTful manner. This is not such a complicated thing to do: imagine Facebook is your Personal Data Store - but you would also be good to have that datastore be in your control, either on your computer as Opera does with the HTTP server in the browser, or remotely on your server. So

Cooper & Berjon: we are experimenting, but we are not satisfied so far

Fette: asks what the implementation will do WRT to (failed) negotiation

Roessler: are there incentives to come up with this sort of user action (negotiation?)

Fette: what happens if you refuse ads? do you not get content?

Preibusch: argues, citing some percentage, that if companies do identify a policy, they will drum up business

<karl> [I fear that if blocking ads goes stellar, there will be new strategies from Marketing people such as ads in text content or Product placement in images, etc.]

Singer: ad networks aren't necessarily doing "bad" things.
... users can't be trusted to make the right decision
... worries about some permutation explosion that businesses have to handle somehow

<karl> [it will be interesting also for ads included in maps (Google Maps in Japan), tags (StackOverflow)]

Jennings: this is already a problem today (flash might not be enabled)

<karl> [in Google Japan, example of ads on maps http://maps.google.co.jp/maps?f=q&source=s_q&hl=fr&geocode=&q=shimokitazawa&sll=35.675147,139.921875&sspn=11.162499,17.973633&brcurrent=3,0x6018f36ba569c4bf:0xf22b87faf9377270,0&ie=UTF8&hq=shimokitazawa&hnear=&ll=35.661254,139.667537&spn=0.002728,0.004388&z=17 ]

<karl> [and for stackoverflow branded tags http://stackoverflow.com/tags ]

<karl> [I repeat, do not underestimate the "creativity" of marketing people ]

de Smit: study users to develop some understanding of what people want

Story: points to the social networking space as a good ground for more study

Morris: identifies the huge challenge here

<karl> scribenick, tlr

<karl> adjourned

<bblfish> I suppose one issue on privacy and the social web that I forgot to mention is that it will be also a good educational forum for people to get used to these concepts

<bblfish> just as Facebook makes social networking obvious to people, and made privacy issues become headline news

<bblfish> so distributed social networks will be a good place for people to learn of the implications of privacy policies

<bblfish> (just a thought)

<bryan_sullivan> scribenick: bryan_sullivan

Wrap-up discussion

Mulligan: P3P failed because lawyers want to say maybe, companies do not want absolutes, good guys and bad guys were not distinguishable
... technologies for spyware etc are the same as used for ad serving etc, differentiation was not easy
... P3P was useful because though pre-norm (many service models now common were not established) it enabled discussion of the needs
... an alternate view: we need to be focusing on the service providers that are most likely to adopt the approaches we take
... using icons provides a way to nudge service providers in the right direction

Appelquist: those who are marketing themselves to consumers as privacy protectors should be our focus - they are serving the early adopters

Hirsch: what do we agree upon - e.g. transparency is good?

Roessler: these are good things, and a path well-trodden - the hard part is what to do about it

Preibusch: who is going to adopt privacy seals? users should be cautious - usually the opposite is true: the worse the practices, the more seals there will be

Appelquist: that is a problem if true that we can do something about

<fjh> there is no such thing as 100% secure, there is only a degree of confidence in the reduction of risk

<fjh> david s notes that transparency can be beneficial to avoid "spooking users", control might not be as helpful and maybe more emphasis should be on transparency

Singer: it is not obvious that the users will be able to answer adequately unless they have clear information - i.e. transparency

Fette: it's not reasonable to expect web service providers to automatically respond and comply

<fjh> deirdre notes agreements on transparency and simplification seem to be apparent

Fette: users hear a lot of things are bad and this affects their trust in service providers

Roessler: what are the options for service providers to increase transparency?

Singer: you can apply social pressure on the service providers

Kelley: users losing trust reflects a failure to help them understand

Cooper: there are features that will be introduced in different browsers due to their different incentive models - it's not fair to rely upon web services as a reason not to comply

Singer: users' inability to express what they want will result in inaccurate responses

Fette: across all the services users use, needing to explain every technical nuance is unrealistic - a global option can succeed where a detailed explanation fails

Roessler: simplifying the problem to conditions users care about and being able to express them - will that help?

<fjh> david notes that ruleset type information might be useful for transparency when conveyed to the user, but it is not clear that the user can choose correctly among choices - a concern

<fjh> discussion of cookies as an analogy

<fjh> alissa notes assumption - user interaction in DAP when sharing information, hence integration with rulesets

<fjh> issue for dap - privacy issue about sharing other users contact information from own address book

<fjh> entered as ISSUE-86

Appelquist: web based banking - we had to create a visual language (locks etc) to get users to use services

Appelquist: but re APIs, the disclosure of information through social networks has no visual language

<fjh> dan notes new use case is using social network to share health care information

Roessler: this is a core challenge, talking about data as data within various regulatory frameworks

Preibusch: we need to increase privacy salience - primarily on web servers
... salience is being aware of issues e.g. privacy - awareness of claims increases salience
... we don't have to overload the user with this information, we can use tools and download protective profiles

Fette: as a browser vendor, the majority of users do not use tools such as noscript

Preibusch: why have some browsers taken action to incorporate such tools?

Hirsch: we should think about a simple thing, e.g. secondary use as a key thing for users - would it be objectionable to control?

Fette: secondary use is still too broad

Roessler: if we had a term that was useful for this would it help?

Fette: things that users expect to be private e.g. email inbox, are easy to conceptualize, it gets harder in the more middle cases with additional complexity

Singer: http and cookies and html invented the facebook tagging problem - when link styling was created, we created the link visiting problem. we need to continue to think about what we are doing to expose user data

Appelquist: the meta-issue for W3C is to have a more robust privacy review

<fjh> note - secondary use is defined in http://dev.w3.org/2009/dap/privacy-rulesets/#secondary-use
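To make the discussion of rulesets concrete, here is a minimal sketch of the idea - a machine-readable ruleset attached to shared data, and a check a receiving service could run before a proposed use. The field names (`sharing`, `secondaryUse`, `retention`) are illustrative only, not the exact vocabulary of the DAP privacy-rulesets draft:

```javascript
// Hypothetical ruleset a user attaches to data they share.
// Field names are assumptions for illustration, not the DAP draft's terms.
const ruleset = {
  sharing: false,        // may the recipient pass the data to third parties?
  secondaryUse: false,   // may the data be used beyond the original purpose?
  retention: "limited"   // how long the recipient may keep the data
};

// A proposed use names the practices it involves; every practice
// must be permitted by the ruleset for the use to go ahead.
function isUsePermitted(ruleset, use) {
  if (use.sharesWithThirdParties && !ruleset.sharing) return false;
  if (use.beyondOriginalPurpose && !ruleset.secondaryUse) return false;
  return true;
}
```

The point of the sketch is the workshop's "simplification" theme: a handful of coarse booleans that a user agent could surface, rather than a 6,000-word policy.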

Roessler: the general discussion has worked out some points - and we need to find out how to get that into the discussion in DAP

Mulligan: re robin's question about how rulesets could be used - rights expression in oasis was criticized as a one-way communication - concern was that rights would be held by big people and claimed

Mulligan: DAP can consider bidirectional communication to address that issue

Mulligan: users need to also be able to give it back

Fette: it's good to address both aspects - although data export is a larger scope
... DAP should focus on APIs that enable the user to know what type of data is being accessed

Berjon: DAP is taking that approach - and since it's tough on UI, we need implementer involvement for that

<jmorris> question back to Ian -- how would that change the current broken privacy model? If users cannot set rules, then companies will continue to use users' data without concern for what the users want...

Raskin: W3C could start to work on secondary use, define what they are etc

Singer: we did agree that we can remove the common stuff from policies and simplify them

<jmorris> I'm not sure there was consensus on that

Dubost: getting marketing people involved will help

Singer: it's not reasonable to expect users will be able to make detailed policy decisions, but it is feasible for users to define who they trust and rely upon policies they set

<dsinger> there was consensus that 6,000 word privacy policies are somewhat of a problem; we didn't discuss how to ameliorate, but one idea was to centralize and refer to a common set of basis definitions etc.

fjh: definition of secondary use is given in the DAP rulesets doc
... sense as co-chair of DAP is that we will move rulesets forward; what would be the result of that effort in the perspective of the workshop attendees?

<tlr> a definition, that is

<jmorris> to be clear, as one of the co-authors of the ruleset document, we certainly do not think the precise language or ruleset composition is set in stone

Hazaël-Massieux: refining the terms is clear, defining something that will be used specifically in code will be more difficult

Walshe: it's important to stay close to this, as there is a desire to protect privacy while services are wanting to use more data

Caceres: this sounds pretty experimental - it's unclear if the policy stuff will work

Roessler: there is room to do specifications in a way that enable experimentation

Appelquist: there was very real evidence when geoloc was defined that it would be used

Morris: if we go forward with the APIs without a privacy story, a year from now it may be too late and we may have govt regulation

Cooper: if developer demand is a metric, there should be no privacy discussion in DAP since there is no demand yet

Appelquist: we need to focus on simplifying expression of intent, and transmitting it

<jmorris> or, the developers will say "give me access to contacts" -- they will not say "give me access to contacts in a privacy sensitive way"

Appelquist: transmission of intent should be in DAP scope

Fette: what reasonable controls should we provide in the case that the user does nothing to express them? it's better to express this through the page and not local to the device

<karl> [the same way you can use different sharing license for the same content, you should be able to define different rulesets for the same data depending on who you communicate with]

Berjon: we should compartmentalize our work in DAP - e.g. terminology and granularity

Hazaël-Massieux: re granularity, re rulesets an API call may not be the right granularity
... support continuing work on defining rulesets though


Hirsch: the assumptions that a service provider has all this figured out are questionable

Morris: supporting that, in geoloc a preexisting relationship with a service provider can override what is bundled with the data

<jmorris> to be clear, that is what we argued in geoloc, but of course geoloc did not pursue this approach

Preibusch: re potential conclusions, potential implementations / mockups may help

<fjh> can navigate to web site that does not have privacy mechanisms in place, yet still wish to convey intent, hence rulesets valuable. That site could access javascript APIs

<karl> [there is also the issue on how you shared each piece of data. Logs of ruleset you sent]

<fjh> issue for DAP - degree of ruleset transmission with API calls, how often, which
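One way the "ruleset transmission with API calls" issue could be approached is for the user agent to attach the ruleset to each API result, so the policy travels with the payload. The following is a sketch under assumed names - the API shape here is a mock, not the actual DAP interface:

```javascript
// Illustrative only: wrap a device-API call so its result carries the
// user's ruleset. "withRuleset" and the ruleset fields are hypothetical.
function withRuleset(apiCall, ruleset) {
  return function (...args) {
    const data = apiCall(...args);
    // The recipient receives the rules alongside the data itself.
    return { data, ruleset };
  };
}

// Mock "geolocation" call standing in for a real device API.
const getLocation = () => ({ lat: 48.2, lon: 16.4 });

const getLocationWithPolicy = withRuleset(getLocation, {
  sharing: false,
  retention: "limited"
});
```

This also makes the granularity question visible in code: the wrapper attaches one ruleset per call, which may or may not be the right unit, as Hazaël-Massieux notes below.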

<karl> [important when you export your data from the service and you want to keep track of what you have done OR when you want to move data from one place to another]

<fjh> issue for DAP - user interaction for ruleset confirmation when multiple APIs are used to provide functionality, usability etc

Fette: focusing on content vs APIs may be a better approach

<jmorris> but ian, how are rules associated with content if not through the API

<fjh> added DAP ISSUE-87, ISSUE-88

Fette: if you want location to be shared with a, files with b, - that is difficult to assess as compared to types of content in general

<karl> [sometimes the rulesets could be defined by person. Example: each time I share with this person or this service I want this ruleset. hierarchy of rules from content to persons/services to groups.]

<karl> [it might even be more complex than just hierarchy, groups being different depending on the context.]
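The hierarchy karl sketches above (rules per content item, per person/service, per group, with a fallback) can be expressed as a simple lookup. All names here are hypothetical, chosen only to illustrate the idea:

```javascript
// Sketch of a per-content -> per-recipient -> per-group -> default
// resolution order for rulesets. Structure and field names are assumptions.
function resolveRuleset(policy, { contentId, recipient, group }) {
  return (
    (policy.byContent && policy.byContent[contentId]) ||
    (policy.byRecipient && policy.byRecipient[recipient]) ||
    (policy.byGroup && policy.byGroup[group]) ||
    policy.default
  );
}

const policy = {
  byRecipient: { "ads.example": { sharing: false } },  // a specific service
  byGroup: { friends: { sharing: true } },             // a group of people
  default: { sharing: false }                          // everyone else
};
```

As karl adds, real groupings would likely be context-dependent rather than a strict hierarchy; the sketch only captures the simplest case.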

<fjh> issue for DAP: clarify how rulesets interact with pre-existing relationships

<fjh> added DAP ISSUE-89

<karl> [Bridging the gap between our online and offline social network http://www.slideshare.net/padday/bridging-the-gap-between-our-online-and-offline-social-network]

??: there is no one size fits all, a pre-established relationship may work in some cases, and not in others

<fjh> is Dom making analogy to certificate validation dialogs?

Hazaël-Massieux: users may make decisions based upon who is offering the website, and not discrete APIs or data types


Berjon: granularity discussion is one we need to have - we can take it offline
... what other discussions do we need to have

<fjh> karl notes that intent is to attach ruleset as early as possible to data

Dong-Young Lee: it's difficult to determine what is good vs bad; control at the API level is pragmatic

<fjh> question asked is whether to attach policy during the API call or during interaction with the site that aggregates

<fjh> but isn't the privacy concern directly about the data returned by the API call, regardless of subsequent aggregation?

<karl> Are there use cases on sharing data *already* available in the WG?

Roessler: we don't have an answer to the granularity question - it is a critical one, also interaction with negotiation

Roessler: privacy concerns arise mostly at access and less so at transmission

Roessler: we have an enforcement issue in transmission

<dsinger> scribenick: dsinger

Roessler: not the api discussion any more, but what other points?

Hazaël-Massieux: multiple discussions in best practices in privacy; should we pursue?
... best practices for website and application developers
... (many) this is a current work item in DAP, just not yet started

Dubost: how do you export the data a web site has? (safely)
... the log of what you have communicated to a site or it has given on?

<fjh> added DAP ISSUE-90 - Create privacy best practices document for web site developer

Perez: security of the attachment of rulesets to the data

Hazaël-Massieux: transparency and control are related...

Roessler: two different categories: common formats (karl), desirable features
... also implementation best practices (UAs)

Sullivan: the first-order sharing is primary for the user; what happens deeper than that is different, do I need to know?
... a common format needs to be usable on servers as well as on clients

Roessler: process, review, privacy discussions in specs (e.g. at transitions)

Chappelle: the same way we highlight security issues, maybe we need to highlight privacy issues

<karl> to give an example of Best Practices done in the past http://www.w3.org/TR/qaframe-spec/

Appelquist: the TAG is looking at this whole area -- will report back there on stuff the TAG might want to think of getting involved in

Dubost: could be a note/extension to the QA framework

Roessler: volunteers for spec. best practices?
... volunteers for UA best practices (leveraging the experience in GeoLoc)? (names again collected)
... volunteers for best practices for site developers and web apps?

Chappelle: gsma and MPI (mobile privacy initiative) should have material to contribute

Roessler: volunteers for helping shorten privacy policies?

Chappelle: what could we help them take out? if we have an industry standard that says what a cookie is, etc., then they can get it by referral

Sullivan: the MPI is also looking at this

Appelquist: would like to see the line of communication with Aza's work continue

Roessler: much may be hidden in the terms in the P3P spec.
(many) and there are studies, of course

Roessler: thinks this is not the policy languages interest group

<karl> dom, "Our logging is passive; we do not use technologies such as cookies to maintain any information on users."

Preibusch: while not able to provide *best* practice, he can help identify bad...

Appelquist: moves on to wrap up...
... when we did geo, we thought we were doing the right thing at the time (not including policy in the API); maybe we missed an opportunity, and am not comfortable that we have internalized that as a community...
... don't want to think that we have closed the door on adding privacy-related info to API calls

Berjon: ... a can of worms for DAP....

Roessler: will it succeed? we don't know. should we continue to look? yes
... geo will be re-chartered, what input do we want to provide to that?
... we should also look for implementation reports.

Cooper: the geo spec has normative language constraining what sites can do; in the best practices we need to explore whether we can do better
... if no-one read the spec., then will they read a separate best practices?

Appelquist: the Berkeley paper showed that setting normative requirements on site developers, in a document read by others, may have poor results;

Hazaël-Massieux: whereas a best practices document targeted at them might work

Appelquist: we can add warnings to validation tools

Roessler: we can document what was tried and what the effects were,
... you can actually be more forceful than when in normative text

Caceres: to follow on, needing more teeth, should we show that to regulators

Chappelle: if we self-regulate better, we will probably be regulated de jure less
... the UK information commissioner has asked the industry to take a stand and report on best practices that are working

Sullivan: we are indeed working with those commissions
... we should pave the way

Tschofenig: ... (scribe fell behind)

tlr: wrap-up! thank you all!

(general) applause

[End of minutes]

Minutes formatted by David Booth's scribe.perl version 1.134 (CVS log)
$Date: 2010/08/12 12:13:14 $