
Online Harms - a European and UK perspective

Facilitator: James Rosewell

Speakers: Alistair Kelman

Presentation to cover
  1. the current state of online harm legislation in the UK and Europe, including the Audio-Visual Media Services Directive (AVMS-D), which comes into force on 1st November 2020;
  2. child safety and the impact DNS over HTTPS is having; and
  3. the role of standards, laws and industry adoption in solutions.

Slides

Minutes (including discussions that were not audio-recorded)



Transcript

So I'd just like to thank Alistair Kelman for joining us today, to discuss online harms, in particular, the work that he has done in this space.

So I will now hand over to Alistair and at the end we'll then take questions via IRC chat.

So q+ to join the queue, but you know the score.

Thank you, Alistair.

Thank you, James.

Good afternoon, everyone.

To put everything in context today, I better say a bit about myself.

I used to practice as a barrister, but I left the bar to concentrate on technology and patents.

I started out working in technology and law, back in the days before TCP/IP eclipsed the Open Systems Interconnection standards.

I have a personal website and on screen is the best email for you to contact me today after the session.

Today I work via the SafeCast company that I co-founded with my wife Diana.

We're working on protecting children from being harmed by content on television and the internet.

We have two websites, safecast.co.uk and SafeCast.Global which we invite you to visit.

SafeCast.Global is the home of our non-commercial initiative in child protection.

And we would like you to look at this and if you so wish, sign the petition we have drawn up on change.org.

I want to distinguish between two types of online harms, since we're only going to be addressing the first of these today, when looking at the UK and the European situation.

The first group is harms caused by content on the internet.

Viewing the content causes harm to the viewer.

But there's a second group of harms, which are things like child abduction, coordination of criminal activities, criminal conspiracies, organized crime and terrorism.

The first group of harms is addressed by particular government departments and regulators: the UK Department for Digital, Culture, Media and Sport (DCMS), the Information Commissioner's Office and Ofcom.

Also, the UK Department of Health and Social Care deals with the mental harms that arise from viewing content.

The second group of harms falls within the scope of the National Crime Agency, the Home Office and our security services, GCHQ.

I'm not gonna go into these harms today.

Save for saying that my wife and I have put our proposals before these agencies.

And it would appear to be the case that they consider our proposals to be either beneficial or neutral.

So, long ago, online harms, arising from content, were considered by the W3C.

They're linked in my presentation so you can check on anything afterwards.

In the mid-1990s, the W3C drew up a metadata standard to protect children.

It's called PICS, the Platform for Internet Content Selection.

PICS was later superseded by POWDER, the Protocol for Web Description Resources.

POWDER was far too complicated: it tried to integrate search engine capabilities into child protection.

It had a commercial cost of around $40 per device, per child, which in those days was thought to be acceptable.

It had serious censorship issues, because there was no consensus amongst the members of the W3C on how it should be used.

It had database design issues.

The content was separated from the labels through pointers, a very sensible database design decision, but it meant that some of the labels got lost.

But the thing that finally wiped it out was when they took the basic standard and enhanced it with quality marks.

Is this a good website or not?

This led to the standard being taken over by astroturfers and search engine optimizers, to push their websites up the ratings.

By 2005, the standard was effectively dead and PICS and POWDER were abandoned around 2008.

Looking now at the legislation in this area, the key piece of UK legislation is the European Union Audiovisual Media Services Directive, and specifically the AVMS-D Regulations, which strictly speaking come into force on 1st November of this year.

Following on from this new legislation is the Age Appropriate Design Code, which comes into force on the 1st of April of next year.

Then we have the Online Harms Bill which is to be published early next year and should be in full force by the end of next year.

And finally, we have the new European Commission Digital Services Act.

The consultation on this closed in September.

It's at a very early stage, and my estimate is that it will come into force around 2024.

Now the AVMS-D is the responsibility of the UK Department for Digital, Culture, Media and Sport, and is enforced by Ofcom, the telecommunications and Online Harms regulator.

The Age Appropriate Design Code is the responsibility of the UK Information Commissioner's Office.

The Online Harms Bill is being brought in by the UK Government.

And the Digital Services Act is being brought in by the European Commission.

All these pieces of legislation are based around a simple point.

Children and vulnerable people need protection from being harmed by content on the internet without censorship.

And my case today is that the W3C can make this happen.

With your help, we can bring in effective protection within months, not years.

Through the use of open standards.

At the end of these discussions, I hope you can tell me what groups and forums the W3C has to support the European legislation and preserve what are termed family-friendly filters for parents.

Something we've all experienced is that legislation runs behind technological change.

Regulation of audiovisual works in the European Union was based upon the Audiovisual Media Services Directive, which took effect across the entire European Union in 2010.

This directive ensured that broadcast television was highly regulated.

Now, broadcast television means broadcasters like the BBC, ITV, Sky, Channel 4, RTÉ, Canal+, RTL and ORF. On all those systems, content which might harm a child is banned.

But on-demand content, which in those days was little more than a few premium television channels on cable and satellite with very small numbers of viewers, was made subject to a less rigorous regulatory regime.

Specifically, pornographic satellite and cable channels lobbied Brussels for a less onerous regime.

Then along came YouTube, Facebook, Twitch, and TikTok.

Five years ago, the European Union drew up a revised Audiovisual Media Services Directive, which was passed in 2018.

This brought everything within the same standard of content regulation.

So the content which might harm a child was banned on both broadcast and on-demand video.

It's a pan-European directive, and every country in the EU, including the UK, was required to bring in new laws to give it effect by September 2020 at the latest.

Then came COVID-19, which of course has impacted every single country in the EU and has made it impossible to meet this deadline.

Turning now to the UK position.

We are actually the first country in the EU to bring the new law into effect.

The new law is set out in the Audiovisual Media Services Regulations, and content which might harm a child is banned on both broadcast TV and on-demand video.

It's in effect from the 1st of November, 2020, but it's not gonna be actively enforced before the 1st of April, 2021.

Instead, Ofcom, the regulator, has recently had a consultation on the subject of the regulation of video sharing platforms and online harms.

And the Ofcom report is awaited by everyone with interest.

Under these new regulations, video sharing platforms must enforce their membership requirements, that's the click-wrap agreement, to protect children.

There must be no harmful or misleading advertising.

There must be no harmful content.

They must age verify their users.

They must also register with Ofcom, pay the fee and accept UK jurisdiction over content served to UK citizens.

The first issue is how VSPs are going to comply with the new Audiovisual Media Services Directive Regulations, to remove harmful content and advertising.

They appear to be intent on doing this through the use of artificial intelligence and databases, to filter away anything that might breach the regulations.

How do these systems work?

The main ones are Google SafeSearch and Symantec RuleSpace.

Both of these use proprietary AI database systems, and because these are proprietary systems, their operations are not public.

Ofcom, in their VSP consultation which closed last month, asked some very detailed questions of VSPs.

And it will be looking at how VSPs' artificial intelligence database filters comply with Ofcom's core requirements of freedom of expression and transparency.

Now, within this area sits the major problem of family-friendly filters.

Symantec RuleSpace and Google SafeSearch are core parts of family-friendly filters in the UK, something parents depend upon.

90% of UK families get their broadband from four internet service providers.

All four of these ISPs have family-friendly filters on, by default.

All mobile networks in the UK have family-friendly filters.

The big retailers, where children are present, like McDonald's and Starbucks, also have family-friendly filters.

The Internet Watch Foundation provides these family-friendly filters with a DNS list to block child sexual abuse material.

But new Internet Engineering Task Force protocols undermine the operation of these DNS filters.

There's no solution yet offered by the Encrypted DNS Deployment Initiative members, as I'll explain on the next slide.

To protect security, new IETF (Internet Engineering Task Force) standards are coming in.

These are essential to protect the operation of the internet against threats.

Privacy is being embedded within the internet so that communications can be private, and to inhibit stalking and abuse.

The technical changes are being rolled out by Google, Apple, Mozilla, Cloudflare and Comcast, who are key members of the Encrypted DNS Deployment Initiative (EDDI).

These will protect against man-in-the-middle attacks, and make it less easy for governments to censor the internet and for attackers to stalk users' online behavior.

These changes are coming in during this year, but as an unintended consequence, our family-friendly filters will fail.
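To see why, here is a minimal sketch, in Python, of the two lookup paths. It assumes only Cloudflare's public DNS-over-HTTPS JSON endpoint and a stand-in domain; no particular filter product is modeled. The point is that the encrypted query goes straight to a third-party resolver, so a blocklist applied at the ISP's resolver never sees the question.

```python
# A minimal sketch of why encrypted DNS defeats resolver-level filtering.
# A family-friendly filter at the ISP works by answering (or refusing)
# queries sent to the ISP's own resolver. DNS over HTTPS carries the
# same question inside an ordinary TLS connection to a third party, so
# the ISP resolver -- and its blocklist -- is never consulted.

import json
import socket
import urllib.request

DOMAIN = "example.com"  # stand-in for a domain on a filter's blocklist

# 1. Classic lookup: goes to the system-configured (often ISP) resolver,
#    which is where family-friendly filtering is applied today.
try:
    print("System resolver:", socket.gethostbyname(DOMAIN))
except socket.gaierror as err:
    print("System resolver blocked or failed the lookup:", err)

# 2. DNS over HTTPS: the identical question, sent over HTTPS to a public
#    resolver (Cloudflare's JSON endpoint here). The ISP sees only an
#    opaque TLS stream and cannot apply its DNS blocklist.
request = urllib.request.Request(
    f"https://cloudflare-dns.com/dns-query?name={DOMAIN}&type=A",
    headers={"Accept": "application/dns-json"},
)
with urllib.request.urlopen(request) as response:
    answer = json.load(response)
print("DoH resolver:", [record["data"] for record in answer.get("Answer", [])])
```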

It was for this reason that the IWF asked me to sit in on the EDDI discussions to suggest how EDDI members could address this problem.

I want now to briefly outline how we can regulate the internet.

We can do this using laws or we can also do this using practices.

Laws go through a series of steps.

Laws are made by Parliament: bill, second reading, committee stage, yadda yadda; as you know, all the processes they go through.

They're lengthy, expensive, difficult processes; there are many loose ends, and bills can be lost in going through the legislative process.

They also can have unintended consequences.

Waiting lists for medical appointments are a well-known example of how certain hospital boards gamed the system to get themselves supposedly into compliance.

Also, laws are difficult to interpret correctly, and from time to time, we need judicial review of the law.

So it's not a straightforward process.

In contrast, practices are made by merchants and the business community.

They're experimental means of arriving at a desired result.

An established business practice can become law via judicial interpretation and case law.

That's what happened over the centuries effectively with double entry bookkeeping.

And a modern example of a practice would be the placing of adult magazines on the top shelf in a newsagent, as a means of stopping young children from perusing them.

It's not a law, but it's a best practice.

And practices can become universal by the network effect as more and more people use them.

A particularly good example of this is the QWERTY keyboard, which has embedded itself in our lives, and GSM phones as well.

So, here's a way that the W3C could help.

This is Larry Lessig.

He's a professor at Harvard and Larry's a great thinker on human rights and copyright.

He's the man behind Copyleft and the Creative Commons.

Just over 20 years ago, he published a book called "Code and Other Laws of Cyberspace", which looks at how computer programs and logic could be made to embed laws in the systems which we humans use and rely upon.

His work has inspired me and has led me to set out SafeCast's proposals, which are to use global standards as a proxy for legislation to regulate the internet.

I now want to tell you a bit about how we got to where we are.

Back in 2012, the embryo of our SafeCast company started looking at content labeling on the internet to protect children.

We started from a different place than the W3C.

We looked at how the Independent Broadcasting Authority, the predecessor to Ofcom, regulated television in the 1960s and seventies.

Through our early work, my wife and I came up with a notification standard which was equivalent to the television watersheds.

And we put our proposals before the European Commission and Ofcom as the basis of a self-applied content labeling system, initially as a best practice.

We found that all video could be divided into just seven classes.

From video which is suitable for anyone to watch, to video which could never be broadcast without restrictions, such as execution and beheading videos.

We also found that our classifications could be mapped onto the key stages of the national curriculum, which means that generic age filtering based upon the school age of a child can be used in digital devices as the basis of filtering for child protection purposes.

In 2013, we were granted a UK patent for our invention.

The claims only apply to automated filtering of advertisements, which allows our invention to be used completely free of charge for automatic filtering of content which is not advertising.

Now, if you think of content as a continuous spectrum, then the boundaries between the colored areas are blurred.

There'll always be a debate about whether something is blue-green or green-blue.

Thus, whether a piece of content falls within level two or level three.

But onto this you can impose fixed bright lines for advertising, which ensures that inappropriate advertisements are not shown to children who are too young to view them.

In 2016, we got a US patent for our invention.

In the USA, the TV watershed is called the safe harbor rules.

And also in 2016, SafeCast gave the UK government a FRAND undertaking, that's fair, reasonable and non-discriminatory, so that our patents could effectively be standard essential patents.

This is a bit like your mobile phone, which has thousands of patents involved in its operation.

The final barrier to universal adoption was removed when the UK Intellectual Property Office notified us that it was prepared to act as a mediator in determining royalty rates.

So in 2018, the Office of the Children's Commissioner asked SafeCast to approach Facebook with our proposals.

There was already a means of coding filtering into YouTube through its API, under which videos could be given BBFC classifications in YouTube metadata.

This is the code, for all those programmers amongst you; there's a sketch of it at the end of this passage.

Very straightforward.

We proposed a similar code for SafeCast: the SafeCast Headcodes, which we knew could fall within the ISO standard.

This is what we put to Facebook, and Facebook has yet to respond.

I spoke some time ago about laws and practices and how practices can turn into standards.

Well, today's standards enable seamless mirroring between devices.

This is all done through embedded metadata.

With one flick of a finger, you can send a video from a mobile phone onto a 4K television screen.
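For those programmers, here is a minimal sketch, in Python, of reading a BBFC classification from YouTube metadata via the public YouTube Data API v3, whose contentDetails.contentRating.bbfcRating field carries this kind of label. The API key and video ID below are placeholders, and the code shown on the slide may differ.

```python
# A minimal sketch of reading a BBFC classification from YouTube video
# metadata via the public YouTube Data API v3. The bbfcRating field is
# part of that API; the key and video ID here are placeholders.

import json
import urllib.request

API_KEY = "YOUR_API_KEY"   # placeholder: a real Data API v3 key is needed
VIDEO_ID = "dQw4w9WgXcQ"   # placeholder video ID

url = (
    "https://www.googleapis.com/youtube/v3/videos"
    f"?part=contentDetails&id={VIDEO_ID}&key={API_KEY}"
)
with urllib.request.urlopen(url) as response:
    data = json.load(response)

for item in data.get("items", []):
    rating = item["contentDetails"].get("contentRating", {})
    # bbfcRating values look like "bbfcU", "bbfcPg", "bbfc12a", "bbfc15"...
    print("BBFC rating in metadata:", rating.get("bbfcRating", "unrated"))
```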
All the complex changes are done by interoperable standards.

In video, this was done by two organizations working together: the Digital Production Partnership, which we're members of, and SMPTE, the Society of Motion Picture and Television Engineers, who publish their standard on GitHub as TSP 2121.

The full name is up there on the slide.

Also in this area is EIDR, the Entertainment Identifier Registry, which is based in Hollywood and looks after ISO 26324, the universal digital object identifier system.

This has the in-built capacity to cope with all possible and potential video content created by the human race, forever, allowing child protection through metadata labeling and filtering to become a core part of every video without censorship.

Protecting children and vulnerable people on the internet is not just about pornography and censorship.

There are other issues at stake here too: fraud and crime.

On the 1st of April 2021, the Age Appropriate Design Code will come into force under section 123 of the Data Protection Act.

This code says that all advertising must comply with the Committee of Advertising Practice code, the CAP code.

The CAP code is an industry standard, which is self-regulating and forms a core part of the regulation of broadcast advertising in the UK, which must be legal, decent, honest and truthful.

These are the requirements built into the CAP code.

Under the code, the ICO has ordered that all advertising must contain details on ownership, that is, who owns the advert and is promoting it; provenance, where the advert is coming from; and product placement, identifying that the advert is an advert.

These requirements do not cause any problems for UK regulated broadcasters like ITV or Sky or Channel 4, but there are serious problems for Facebook, YouTube and Twitter, who do not have this data.

In my slides there are two hotlinks which can take you to major news stories from the Wall Street Journal and the New York Times, which show how the situation is out of control.

SafeCast has put forward some proposals.

We're saying that, to adhere to the CAP code, all advertising must contain metadata on ownership, provenance and product placement, so that it can be filtered away for child protection purposes using the provisions of section 104 of the Digital Economy Act; that's already law.

And we're also saying that, to enable lightweight content filtering without censorship, all content, including advertising, should contain SafeCast Headcodes for open-standards-based lightweight filtering for child protection purposes; a sketch of such a filter follows at the end of this passage.

We're waiting on Ofcom's response to our suggestions.

But if we're successful in getting our proposals taken up, then as a bonus it will mean that metadata labeling within an open standard can protect the family-friendly internet in the UK and the EU.

Furthermore, because of the capability within ISO 26324, the universal digital object identifier system, the labeling can be implemented nationally in accordance with the digital sovereignty requirements of nation states.

There will be no need for Google, Mozilla, Facebook, Apple, Cloudflare and Comcast to become global censors.

There's a relatively new legal concept in international law called digital sovereignty or tech sovereignty.

It allows nation states to harmonize technical issues in a consensual manner.

It allows us to give effect to good technical engineering and business practices, which improve trade by removing barriers to trade, increase the volume of trade and improve safety.
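To make the Headcode proposal concrete, here is a minimal sketch of the lightweight, label-based filtering described above. The seven classes and the mapping onto national curriculum key stages come from the talk itself; the specific numeric ceilings below are hypothetical illustrative values, not SafeCast's published scheme.

```python
# A minimal sketch of lightweight, label-based filtering: a self-applied
# Headcode (seven classes, 1 = suitable for anyone, 7 = never shown
# without restriction) is compared against a ceiling derived from the
# child's school key stage. The numeric mapping below is hypothetical
# and purely illustrative -- SafeCast's actual scheme may differ.

HEADCODE_CEILING_BY_KEY_STAGE = {  # hypothetical illustrative values
    "EYFS": 1,    # early years: only universally suitable content
    "KS1": 2,
    "KS2": 3,
    "KS3": 4,
    "KS4": 5,
    "adult": 6,   # class 7 is never shown without restriction
}

def is_viewable(content_headcode: int, viewer_key_stage: str) -> bool:
    """Allow content only if its class is within the viewer's ceiling."""
    return content_headcode <= HEADCODE_CEILING_BY_KEY_STAGE[viewer_key_stage]

# Example: a class-4 video is filtered away for a KS2 child,
# while a class-2 video is shown.
print(is_viewable(4, "KS2"))   # False -> filtered
print(is_viewable(2, "KS2"))   # True  -> shown
```

The design point is that the filter reads only a small piece of self-applied metadata; the content itself is never scanned, and no central party decides what anyone may see.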
The picture on the right is Dr. John Postgate, a 19th-century Birmingham doctor who campaigned for over 20 years of his life for legislation to stamp out the practice of the adulteration of food and medicines, in the face of powerful vested interests and the indifference of most parliamentarians.

Today, the unfiltered internet is a polluted sea of information mixed with disinformation, pornography, horror and violence.

Filtering the internet to protect children from being harmed is a universal public health issue, akin to food safety and the provision of clean water.

I ask for your support for SafeCast's proposals and your guidance on how we can get the W3C behind them.

Thank you for your attention.
