<scribe> scribe: jyasskin
<gendler_> present
jrosewell_: Thanks for being
here. This came out of a question on a PING call: how much
privacy is enough?
... if everything must look exactly the same, what does that
mean for innovation?
... not clear what the policy is.
... Please discuss.
... Does anyone have thoughts you want to share?
JohnWilander: A question for you:
when you asked how this affects competition/innovation, we have
a set of standards (what the W3C produces), and that becomes
the platform. If a browser deviates from basic things, that's
considered a bug, and the browser goes and fixes it.
... Is that what you're thinking of? "This is what you should
expect from all browsers"? Or are you asking to ensure that
browsers can have different privacy promises so you can
innovate, e.g. in privacy?
jrosewell_: Conversation in the PING meeting, we were reviewing a feature that involved a hyphenation dictionary. In the review, it was raised that the way a word is hyphenated indicates the algorithm used, which could differentiate one browser from another.
JohnWilander: Fingerprinting
jrosewell_: If we want to make
all dictionaries work in exactly the same way, and we can come
up with thousands of ways to implement algorithms differently,
is there a point at which it's "enough"?
... Also something about national laws
JohnWilander: To start a conversation: we should not have accidental fingerprinting. If there's no user benefit to a difference, then we should not have it. But if there are user-beneficial differences, e.g. American vs British English, then it's ok.
<Zakim> nigel, you wanted to mention consistency relating to common requirements like fetching resources, and HR
<MikeSmith> in https://cryptpad.w3ctag.org/code/#/2/code/edit/4ht9YHtVS9AB4UBlh-oPvHej/ I see “James: privacy baseline + definition session. Good paper from Tess.
<MikeSmith> ... what paper is that?
nigel: The topic of consistency came up. A member noted a different experience doing a horizontal review for accessibility vs privacy. Accessibility has a written set of principles, whereas privacy doesn't. It'd be good to have better documentation about what it means to have privacy.
<Joshue108> +1 to Nigel
nigel: "Private vs not-private" isn't really a good distinction. "Fetching a resource given a URI" is repeated everywhere, and doing a privacy review of that fetching ... should be able to refactor those concerns into a shared document.
jrosewell_: I hear it'd be good to have greater clarity.
<MikeSmith> http://darobin.github.io/api-design-privacy/api-design-privacy.html
hober: On the point that
accessibility reviews have more written resources that spec
authors can use to learn best practices, that's because the W3C
over many years has prioritized accessibility in its staffing
and resource allocation. The WAI has done a tremendous amount
of work and been very successful. If we prioritize privacy in
the same way, we'd probably get similar outcomes.
... Before we did the Fetch spec, lots of specs fetched
resources, but they made different assumptions. The big benefit
of the Fetch spec isn't the new fetch() API; that's a side
effect. The big benefit is to have a common model that all
specs use, so we can understand the security and privacy
implications.
... If your spec defines a fetch that's not in terms of the
Fetch spec, that's when you get security/privacy concerns worth
looking at. Fetch is a success case of recent spec work, in
that we've been able to define resource loads on top of it, so
the same questions don't come up over and over again.
... nigel, is this a case where fetching isn't defined in terms
of the Fetch spec?
nigel: The bit I'm talking about refactoring is about privacy concerns around fetching resources, regardless of mechanism of doing the fetch.
hober: Refactoring into the common model, from the Fetch spec, gets you a shared understanding.
<boazsender> thinking about https://www.w3.org/TR/security-privacy-questionnaire/
hober: Would be good if we had a common model like accessibility has, but the answer in *this* case is to point to Fetch.
<boazsender> (and about https://w3ctag.github.io/design-principles)
jrosewell_: To summarize at this point, there was a particular case around fetching, and we had to point to Fetch. And if there's ambiguity in Fetch we should address that. But nigel also pointed at the experience of horizontal review, and the need for more detail in what's expected.
nigel: Tess is right that there's
background here, that accessibility was prioritized in the
past.
... We're now playing catch-up with privacy.
<wseltzer> jyasskin: wanted to point to Privacy Threat model
<Zakim> jyasskin, you wanted to mention privacy threat model
<wseltzer> ... as an attempt to document this sort of privacy baseline
<MikeSmith> models to consider include the spec-development process of WHATWG spec editors, and the way they incorporate privacy considerations into their spec development
<wseltzer> ... explicitly discusses that an attacker can or cannot learn specific information
<wseltzer> ... that document has been lagging, would welcome more people contributing
<dom> https://w3cping.github.io/privacy-threat-model/
<Zakim> MikeSmith, you wanted to ask for concrete examples of problem specs that this would have helped to solve
MikeSmith: It would help me to
have some concrete examples of failures from the past: specs
that had some sort of privacy miss, where a baseline would
have helped the spec editors avoid the mistake.
... I've spent a lot of time on core specs for Javascript APIs
and HTML, and we spend a lot of time talking about privacy.
It's already high in everyone's awareness. WHATWG specs,
Web Performance, Web Apps: it's fundamental to everything we do
together, and I'd say the same about accessibility.
... In what areas is it not happening? We need more peer-to-peer
discussion among editors and chairs to share best
practices.
jrosewell_: When I do business in the UK, I can go to a lawyer who tells me what I need to do as a businessperson. That's the sort of thing that'd be useful in the W3C. What am I aiming for?
Christine Runnegar: Acknowledge what nigel said. Experience with your review wasn't how you wanted it to be. Want to work together to make reviews more positive for everyone.
Christine: This is a big topic. Privacy is really hard. Hard in a legal sense: a lawyer can say "do X", which is a risk-based interpretation of the law. Security is hard too, and was really, really hard 20 years ago; people doing security reviews had the same experience.
<nigel> thanks Christine, I didn't want to focus on a not-completely-positive experience, rather to think about how we can work better in the future.
Christine: We need people to help
develop more documentation for spec authors.
... I hear the request for examples. That's the intention of
the self-review privacy questionnaire: it sets out threats and
mitigations as we learn from doing reviews.
<dka_> That self-review questionnaire: https://w3ctag.github.io/security-questionnaire/
Christine: I've noticed that some of
the really obvious privacy problems are less common now. People
are much more alert within the W3C about the risk of increasing
the fingerprinting surface. We're aware that creating a
globally-unique persistent identifier is bad. We have a long
way to go, but we're moving along.
... The Mitigating Fingerprinting document is good.
JohnWilander: Replying to Mike Smith: Looking back several years, a big thing that was missed was the requirement for partitioning. Local storage was specified as a global identifier in third parties. IndexedDB, Service Workers. Now we need to go back and revisit all of those.
<jrosewell_> From Zoom Christine: https://www.w3.org/TR/fingerprinting-guidance/
JohnWilander: Other specs that didn't consider that a browser might partition things. e.g. Clear-Site-Data
<MikeSmith> is the partitioning issue more of an implementation issue than a spec issue?
JohnWilander: Another thing that's often overlooked is credentialed vs uncredentialed requests. So we have to specify whether or not particular fetches need a cookie.
<hober> MikeSmith: it's many spec issues, Anne's coordinating it all from here: https://github.com/privacycg/storage-partitioning
<slightlyoff> (clear-site-data did consider partitioned UAs, but not all partitioning strategies are identical or spec'd; JohnWilander is incorrect)
JohnWilander: Another mistake is that as soon as something is stateful in any capacity, it's a potential supercookie. Is the server in control of what the browser stores?
hadleybeeman: I'm on the TAG. This is a fantastic session. Re jrosewell_ about going to a lawyer to understand the law. We should also look at how that law is created. I work with the UK government, and a lot of what came into the conversation about the GDPR came from what we created here.
<Zakim> MikeSmith, you wanted to comment about privacy related to legal concerns vs privacy related to protecting users from privacy exploits, for the sake of protecting users (regardless
<ivan> From Christine Runnegar to Everyone: Here is the document - https://www.w3.org/TR/fingerprinting-guidance/
MikeSmith: On the first thing
JohnWilander was describing, about partitioning: some things
are spec requirements, which we can address in spec language.
Other things are implementation details. Browsers can do all
sorts of things that aren't addressed by the specs, but are
still bad.
... How many concerns about partitioning are guidance about
implementations, vs concerns for spec language?
... I realized that I personally don't think of privacy in
terms of legal issues or legal requirements; I'm just not aware
of them. We have lots of fingerprinting vectors, and we're
trying to get rid of them. A lot of them we can't get rid of.
Some we're stuck with, which doesn't mean we should add more.
How much of this is about addressing legal concerns vs things
like fingerprinting, where there aren't laws against it?
... Legal issues vs privacy hygiene?
<MikeSmith> robin well I guess I meant laws against providing fingerprinting vectors in web APIs and features
pes: If the mental model is "I'd
like to show something to my lawyer so they can tell me what to
do", it's probably not something that'll ever exist. W3C isn't
a governance body to tell people what to do. It's about
bringing browsers and others together to meet unmet
needs.
... Getting principles about what we're trying to achieve, and
mechanisms to achieve them. Interest in moving them into the
security and privacy questionnaire.
... Privacy threat model could be a concise summary of harms
identified in other specs. That's also available in the privacy
considerations of other specs, although it's not easily
searchable.
jrosewell_: Hear that lots of information is available.
pes: Information that exists isn't necessarily what you want. It's what's happened in the past, not restrictions on what you can do.
jrosewell_: When I pay lawyers, I
rarely get a black-or-white answer: "It could be this or
that."
... Re fingerprinting: if you're using it to get data that
would be considered personal, then you do have laws that cover
what you can do with it.
... Aaron presented research, covering tens of thousands of
websites, on which ones write cookies prior to the consent
dialog. European websites were better than other regions, which
validates that the GDPR has had an effect.
<boazsender> GDPR has definitely had an effect
Bleparmentier: When talking about fingerprinting, you said a lot couldn't be gotten rid of because it would break things. What if we want a new thing that would be very useful but would add fingerprinting?
<AramZS> Resources and participants from that talk are available at https://docs.google.com/document/d/1BZrAXjydtzzufDWOaPAS1oDKYsAHmic5juNUxKVnW4s/edit?usp=sharing
Bleparmentier: How do you think about it? If something adds fingerprinting but adds no utility, we won't add it, but if it adds fingerprinting but is useful, how do you decide?
<MikeSmith> it seems worth taking into consideration the Do Not Track case, in terms of how successful it was at solving the problem it was intended to solve, and if it was not successful, why not?
<Zakim> Christine, you wanted to answer Mike
Christine: Not answering the
tradeoff question. As PING cochair, I'm trying to reduce the
fingerprinting surface. In a situation where fingerprinting
can't be removed, we ask whether it can be made more
detectable. Then privacy researchers can call out websites that
are using the API purely for tracking.
... To Mike's question about privacy law: laws vary across the
world in detail, and some countries don't have them. But
there's some global consensus about privacy principles. One
instance where the W3C could be helpful: we could build
technical specifications that allow websites to comply with the
law. We might see that in the Privacy CG.
dka_: Samsung, cochair of TAG. Want to promote TAG Ethical Web Principles by me and Hadley.
<dka_> https://w3ctag.github.io/ethical-web-principles/
dka_: "Security and privacy are
essential" "web must enable freedom of expression" "web must
not cause harm to society"
... Trying to promote some principles as bedrock that define
how we design web standards and technology.
... Want to root the architectural understanding in fundamental
human rights. Looking at Article 19 of the UN Declaration of
Human rights. <missed text of the rights>
... Privacy is fundamental to that right
... The web, as a platform, should provide more privacy
than other platforms.
... We don't want to limit ourselves to even the lowest common
denominator of privacy laws, but should do a better job. The
web should be *the* privacy-protecting platform.
<JohnWilander> Thank you, Dan!
<Zakim> robin, you wanted to answer Basile's question
<robin> Zoom crashed when I unmuted....
<wseltzer> qq+ robin
<AramZS> Robin, not sure if you unmuted but we could not hear you
<Zakim> Joshue, you wanted to talk about degrees of accessibility vs degrees of privacy
Joshue108: ex-cochair of WCAG.
Like the talk around taking the best bits of what we do well.
Accessibility: if someone says "this is inaccessible", that
isn't binary. Inaccessible to whom? Different groups have
different issues. That opens a whole range of user needs and
requirements that are complex and difficult to balance.
... With privacy, I won't say it's easier, but it's a different
thing. Needs are more generic.
<Zakim> robin, you wanted to react to robin
robin: Back to Bleparmentier's
question: in general we'd use first principles to build these
tradeoffs.
... Core value of the web is that it enables trust. In the
IETF's Internet is for End Users, there's a section about
putting users first and developing trust.
<MikeSmith> I’m not sure that getting privacy protections adopted in the market at scale is easier than doing similar for accessibility
robin: Difference between web and
native app, you can load something and know you're in a
trustworthy environment.
... Do anything in a 1:1 context, and be confident you won't be
tracked. The browser is the user's fiduciary agent. The user
delegates to the browser, and it's the browser's job to ensure
users aren't reidentified, and not to use their data to track
people.
<wseltzer> The Internet is for End Users (RFC 8890)
robin: Start from the idea of
trust, and trying to use it to answer fingerprinting tradeoff,
it'll vary from feature to feature. When looking at a feature
that might add fingerprinting surface, we ask how we might make
it safe. Might involve users, probably not via a consent
prompt, but make user aware that doing the thing might identify
them.
... If it's not possible to do a feature without adding
fingerprinting, would always answer not to do the feature.
<Zakim> wseltzer, you wanted to discuss resolving tradeoffs
wseltzer: Note that in the W3C
process, the tradeoffs are resolved first in the WG when
working with horizontal reviewers, to reach consensus. If they
can't, and there's still an objection, it comes to the
Director, who again tries to broker a consensus, and then makes
a ruling.
... Process community group is asking how we might get to a
Director-free version of the process, where the community could
find the right resolution.
<Zakim> MikeSmith, you wanted to comment and to talk about legal/privacy concerns for sites/content vs privacy concerns for development of primitives and APIs for the web platform and to
MikeSmith: Back to "no laws against fingerprinting": I was thinking of how we design primitives and APIs, rather than how content providers design websites. If a website uses fingerprinting in malicious ways, that causes legal issues. But there are no legal issues around new web primitives or browser features.
<wseltzer> [and of course I meant to include the TAG among the groups giving guidance around design decisions]
MikeSmith: Look at Do Not Track.
How successful was it in the market? Very simple technically,
and if it wasn't successful, it wasn't due to technical
complexity of the specification.
... A lot of things the W3C can do. We don't create websites.
We create technologies, and we can make great technical
solutions, but if market forces prevent it from being adopted,
how much more can we do? If people wanted DNT to be successful,
what more could we have done?
jrosewell_: Re Dan, it's dangerous when a standards body takes the role of lawmakers and goes beyond what the law requires. Takes choice away from <?>.
Dan: Disagree
<boazsender> lol, cars are terrible
<cpn> +1 Dan
<boazsender> +1 dan
robin: No law requires cars to have locks, but very difficult to sell a lock-less car.
hadleybeeman: Law responds to what's going on; it doesn't have a magic source of truth. It responds to what we build and to the harms society and lawmakers see.
<cwilso> +1 Dan, +1 Hadley
<boazsender> cars are a huge blunder from a technology for society perspective
<boazsender> +1 dan and hadley
nigel: The thing that's useful
from accessibility is a testable set of outcomes. And WCAG has
gotten adopted by lawmakers, who say "do that".
... Would be good to build a privacy test suite.
<hadleybeeman> I like the WCAG parallel
<boazsender> +1 to a privacy test suite... we could add that to WPT and wpt.fyi
<Zakim> nigel, you wanted to say that we need Rec(s) with testable statements of outcome in implementations
nigel: Could test about leaking cookies, and then a new feature might be impossible to implement without breaking the test suite.
... Then HR for privacy can check for implementability without breaking the privacy outcome requirements.
Bleparmentier: Back to tradeoffs.
Minority Report targeted ads based on height (?)
... Quite an agreement that that's too much targeting.
... In the UK, we can't stop you on the street and ask for
ID.
... Is the ID card in France still too much of a privacy
imposition?
... Should we stop at France? Go to the UK? Even more?
... There's a gray area.
bmay_: What do we do about user
consent? How does it factor into everything? If user says they
want a browser to behave a certain way without caring about
privacy implications?
... Have browser makers put in different modes, and let users
choose?
<AramZS> In brief: The DNT question, while a valid one, is not the best way to look at the question of 'what we can do about privacy'. The DNT standard is a voluntary request to downstream systems, our new approach can and *should* be considered as enforced mechanical standards on what is made available to downstream (from the browser) systems.
hadleybeeman: When we started the Ethical Web Principles, we were reviewing a design that made ads more effective at the cost of the user's privacy. Thought that was bad, but didn't have a document to point to.
<robin> MikeSmith: ^^ AramZS is addressing your point
<robin> uhm, hadleybeeman, cross-context tracking is _never_ in the interest of the publisher either
hadleybeeman: A lack of privacy, like fingerprinting, etc., is almost always in the interest of the content author or publisher, but not the user. It's our job as a community to protect privacy and make sure users have the tools they need to interact with the web as they want to.
<MikeSmith> AramZS, thanks for the clarification
jrosewell_: Thanks for the discussion.
<Joshue108> thanks all
<AramZS> So the answer I have to 'what more can we do than DNT' is - discard making requests when it is clear that the other systems have no intent to follow those requests and instead switch to making requirements.
<AramZS> Thanks all!
<wseltzer> [adjourned]
Present: cpn, Nigel_Megitt, kleber, boazsender, gendler_, cwilso, Tatsuya_Igarashi, Benjamin_Young, Karima, danyao, dom, JohnWilander, Laszlo_Gombos, Bleparmentier, AramZS, wseltzer, ivan, Jemma, Dan_Appelquist, Bert, Joshue, Robin_Berjon, xiaoqian, hadleybeeman, dsinger, Joshue108
Scribe: jyasskin