W3C

– DRAFT –
Online Harms – a European and UK perspective

26 October 2020

Attendees

Present
Alan, Bert, Brent_Zundel, Dan Appelquist, dom, dsinger, ivan, Janina, jeff, Joshue108, jrosewell, Karima, Tatsuya_Igarashi, weiler, wseltzer, [+ ~40]
Regrets
-
Chair
James_Rosewell
Scribe
dom

Meeting minutes

Alistair Kelman's Slides

<wseltzer> [note that W3C has a strong royalty-free patent policy]

<wseltzer> https://www.w3.org/Consortium/Patent-Policy/

<Zakim> wseltzer, you wanted to discuss RF commitment, submission? and to discuss assertions about companies

wseltzer: Pointing out W3C's royalty-free Patent Policy ^, one way to invite consideration of work at W3C is to make a Member Submission including the RF patent commitment. Second, noting some assertions about companies' compliance, as a standards body, we're not in a position to address those.

jeff: very much against online harms
… pleased to see actions to cure online harms
… the lack of success around PICS and POWDER is full of lessons for us
… W3C doesn't legislate, we're a voluntary standards organization
… we also do much better on tech standardization than we do on policy
… my impression from your presentation is that there is quite a bit of intermixing of the two in your proposal
… we would need to tease out where W3C can contribute
… I like the idea of a Member Submission, which would help clarify what is technical in nature, for which we have a process to help it go through standardization as it matures
… whereas for policy, we have no consensus to work on policy-related matters
… we know how to standardize metadata, but we're not the place for enforcement
… we have extensive means in W3C to bring communities together and bring proposals to standards

James: fwiw, this is first and foremost an awareness session

Jeff: I was responding to the optimistic projections from Alistair that W3C could solve this in 6 months!

James: Alistair is dedicated to fixing this problem

Alistair: the opportunity here is that, as a member of the Digital Production Partnership, we look at managed metadata for videos
… by getting Ofcom to put out recommendations on what the standard metadata should contain
… and then have middleware deployed throughout the world to react to this metadata
… that's what the opportunity is
… this would enable effective child protection in a matter of months, not years
… by using practices and technical standards, there are ways to circumvent some of the adoption difficulties
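[Scribe note: a minimal illustrative sketch, in TypeScript, of the metadata-plus-middleware idea Alistair describes. The field names, classification levels, and the isViewable helper are hypothetical assumptions for illustration only, not taken from SafeCast, the DPP, or any Ofcom recommendation.]

// Hypothetical content label carried as metadata alongside a video
// (field names are illustrative, not from any published schema).
interface ContentLabel {
  classificationLevel: 1 | 2 | 3 | 4;  // 4 = most restricted
  originalBroadcastTime?: string;      // e.g. "23:30", used as a classification hint
}

// Hypothetical viewer profile supplied by a parent-configured device or OS.
interface ViewerProfile {
  schoolAgeBand: "primary" | "secondary" | "adult";
}

// Middleware-style check: decide on the client whether to fetch/show the media,
// so the decision does not have to be reported back to the content site.
function isViewable(label: ContentLabel, viewer: ViewerProfile): boolean {
  const maxLevel = { primary: 1, secondary: 2, adult: 4 }[viewer.schoolAgeBand];
  return label.classificationLevel <= maxLevel;
}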

<Zakim> dsinger, you wanted to ask about definitions

DavidSinger: a few years ago I moderated a panel at the AC meeting around fake news, moderation, and what constitutes Internet harm
… "fake" is only relative to "truth", which is notoriously hard to agree upon
… "might harm" seems likewise hard to interpret
… this intersects with the right to free expression
… you talked about unintended consequences - I'm fearful of the consequences of these vaguely worded statements
… to what extent is it a societal problem vs a parental problem vs a free speech problem?
… very delicate balances to find
… not sure the very heavy handed regulation is the right way to go

Alistair: I'm not proposing a heavy handed regulatory regime
… I say there is a spectrum of content
… it would be quite a debate whether something would cause a child a fright or not
… but you knew exactly what content was at the extremes
… you could classify people in categories, and use big data to help people make decisions
… we know that in the UK, you've had a television watershed - the most salacious programs have tended to be shown later and later at night
… so if a program was originally broadcast late at night, it may contain salacious content; not so if it was shown early in the morning
… this could be used to classify our present content
… on top of that, you could put forward dispute procedures to re-classify content from level 4 to level 3
… using 3rd-party support
… In addition to SafeCast, we have SafeCheck which is designed for professional people to certify whether content complies with specific standards
… using the SafeCheck trademark to ensure compliance
… we're in discussion with regulators to make this a voluntary standard to make this area properly managed
… We're not talking about censorship; adults would always be able to see the content

James: does that address the question about the role of parents?

Alistair: Parents need assistance - we're trying to give parents the tools that they need to protect their children
… the internet is a kind of babysitter in a lot of ways
… we have ensured that you can use the school age of a child to balance this properly
… so that kids can share among their peers without causing them unnecessary harms

James: it's helping the parents, not substituting for them

DavidS: at the moment, people on the internet are unidentified

<Zakim> dsinger, you wanted to mention "on the internet…"

DavidS: allowing tools to see if there is a child at the end of the conversation would allow targeting children specifically
… "on the internet nobody knows you're a dog" is not as true as it used to be due to tracking
… but there remains an advantage in the relative anonymity of the end user
… once you know that you can't show pornography because the user is a child, then the child could become a target for manipulation by the site

Alistair: by embedding the limitation in the metadata, we make it possible to filter the media out

DavidS: but the site can detect that it wasn't fetched

Alistair: but there is no feedback here

DavidS: this can be worked around

James: this highlights the challenge of solving that issue

DKA: +1 to David Singer on the potential abuses
… we are in a different age when it comes to online harms, esp to children

<kleber_> David Singer is definitely correct: there is no technical solution to filtering without anyone being able to tell that filtering has happened

DKA: pornography & adult content is one type, but both as a parent and as a technologist, I'm a lot more concerned with grooming, sexting, children being groomed into cults such as QAnon
… being groomed into alt-right conspiracy followers
… there are a lot of kinds of online harms happening outside of content delivered to browsers, via social media
… Every OS has parental controls built-in
… given that online harms are bigger than the browser
… also, locking down a given browser gives a false sense of security
… e.g. TOR can be used to work around network blocking
… any device where software can be installed can be used to work around these limitations
… device-based systems feel more effective - you can lock down different type of contents, but you can also limit what additional browser/software can be installed
… it feels like the focus of the proposal is from a decade in the past - we need to look to the harms of today's internet

Alistair: Ofcom is regulating this space and determines how they go about it
… what we've been looking at is how to let the technology help with that
… usage of TOR by children, esp young children (<13 years), is very limited
… it needs to be possible to keep children safe, so they are not groomed via a 2nd-hand mobile phone given to them
… we are in discussion with the Home Office
… what we're proposing is being looked upon seriously

James: on one hand, we have an argument that it is up to lawmakers to determine what's needed
… and then technology organizations need to support that
… vs how effective these approaches can be, and the risk around censorship

James: you mentioned April as a timeline for the UK
… what can we expect from the EU?

Alistair: at the moment, Spain is having a consultation on online harms similar to what the UK did at the end of September
… the report that I've seen on the submissions is very similar to the UK's
… I would hope that the UK's proposals are going to be such that they will be accepted as effective right away across all the EU
… that we won't have a divergence of standards
… that the UK will show the way which will be in compliance with what the European Commission will do in this space
… there is no difference between the UK & EU goals
… French legislation is going through, and Germany is going through its Länder as well
… but COVID has slowed down these discussions

<Zakim> jyasskin, you wanted to mention "incentives to label accurately, and danger to trans kids whose parents don't want them to be trans"

Jeffrey: I wanted to raise a worry about labeling content as harmful when it shouldn't be
… e.g. YouTube labeling LGBT content as mature, even though it is similar to non-LGBT content that is not labeled mature

<dka_> +1

Jeffrey: likewise, had a friend who really needed access to that content as support

Alistair: wholeheartedly agree on this
… we had a video on this last year
… I'm more than happy to continue the conversation - feel free to get in touch with me

James: thank you for preparing this presentation - not an easy topic

<jrosewell> And thank you Dom for scribing.

Minutes manually created (not a transcript), formatted by scribe.perl version 123 (Tue Sep 1 21:19:13 2020 UTC).
