InterestingCases

From Policy Languages Interest Group
Revision as of 18:56, 5 January 2010 by Dwvisser (Talk | contribs)


On this page, the PLING collects interesting real world cases that are examples for specific aspects. Those aspects can be found in the policy itself or in the case, thus giving more insight on the traps and holes waiting for the creator of a computer aided policy.


Dog Poo in the entry hall

As usual, banks operate video surveillance systems, mainly for security reasons: they are meant to record attempts to manipulate ATMs or to film attacks.

A German bank found an innovative new use for this monitoring. A woman entered the bank with her child, and the video recording showed that the child left traces of excrement on the floor; apparently, the kid had stepped in dog poo before entering the bank. The bank used the recording to identify its customer and sent her an invoice for cleaning the floor.

The case was taken up by the press and provoked strong reactions. The data protection commissioner started an investigation.

As an interesting follow-up, the city of Tilburg in the Netherlands plans to create a registry of the DNA of Tilburg dogs in order to determine the originator of any dog poo. Now, if I step in dog poo in Tilburg, can I ask to have the remains on my shoes analyzed to find the owner of the dog?

Specifics

Apparently, there was an unexpected use of the video recording. People generally assume that such recordings exist for security reasons, i.e. to identify robbers or manipulation attempts. The bank, however, points to its general policy, which also allows the data to be used for other purposes, such as billing for dirt left behind. Nobody reads the policy; everybody relies on the natural assumption. Assumption and policy diverge, and a scandal is created. This is the interesting aspect of this case: to what extent can natural assumptions be factored into a policy language or a user interface, warning people when their natural assumption does not match the actual policy? This is an interesting field for research. P3P already allows such a mismatch to be identified, but it presupposes that the user has entered all of his assumptions into his profile. Nobody would do that. We may, however, come up with a common set of default assumptions that could serve as a starting point.
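The idea of a common set of default assumptions could be sketched as a simple comparison between what a user expects and what the policy declares. The sketch below is illustrative only, not P3P syntax; the purpose labels and the function name are invented for this example.

```python
# Illustrative sketch: warn when a declared policy permits purposes
# that a default set of user assumptions does not anticipate.
# The purpose vocabulary here is hypothetical, not taken from P3P.

DEFAULT_ASSUMPTIONS = {"security", "fraud-prevention"}  # what people expect

def unexpected_purposes(declared_policy: set, assumptions: set) -> set:
    """Purposes the policy allows that the user did not anticipate."""
    return declared_policy - assumptions

bank_policy = {"security", "fraud-prevention", "damage-billing"}
surprises = unexpected_purposes(bank_policy, DEFAULT_ASSUMPTIONS)
if surprises:
    print("Warning: data may also be used for:", sorted(surprises))
```

A user interface could surface such a warning before the user interacts with the service, without requiring the user to author a full preference profile.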

Added on 13 Feb 2008 by Rigo Wenning

Virgin Mobile and Flickr Photos

Virgin Mobile used photos that it found on Flickr in a national advertising program. The ads were large bus-stop billboards in Adelaide, South Australia. The photos were tagged under a Creative Commons license that allowed commercial use. The final images followed the rules of the CC license and added the required attribution.

Both the owner of the photo and the subject of the photo were completely unaware of this use of their images. They were both surprised when they found out.

Issues

This real-world use case raises many issues.

  • If you upload photos of your friends onto Flickr (or any other social network) should they be informed?
  • If you tag photos with a license that allows commercial use, should all the subjects be made aware?
  • If you tagged the photos with the CC Attribution license, were you aware that this allows commercial use?
  • If you were aware of this, were you aware you needed Model Release declarations from the subjects in your photos?
  • Should Virgin Mobile have informed the photo owners/subjects of their plans?
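The commercial-use question above hinges on a detail of the CC license identifier that many uploaders miss: only licenses carrying the NonCommercial ("NC") module forbid commercial use, so plain Attribution (CC-BY) permits it. A minimal sketch of that check, using a hypothetical helper function:

```python
# Hypothetical helper: CC license identifiers containing the "NC"
# (NonCommercial) module forbid commercial use; all other CC license
# modules (BY, SA, ND) do not restrict it.
def allows_commercial_use(license_id: str) -> bool:
    modules = license_id.upper().split("-")
    return "NC" not in modules

print(allows_commercial_use("CC-BY"))     # plain Attribution
print(allows_commercial_use("CC-BY-NC"))  # Attribution-NonCommercial
```

Such a check in an upload interface could have warned the Flickr photographer that the chosen tag opened the door to advertising use.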

Added on 13 May 2008 by Renato Iannella

21 million bank accounts for sale

On 8 December 2008, Wirtschaftswoche reported that around 21 million bank records of German citizens are circulating on the market. There have been privacy scandals in the past, such as when Deutsche Telekom lost 17 million records with names, phone numbers and addresses of customers. But this time it is even worse: the records contain not only identifying personal data, but also details about bank accounts, birth dates and even financial information. This is sufficient to attack a bank account and defraud its owner of funds.

As in the Deutsche Telekom case, the origin of the lost records seems to be some smaller call centers. Wirtschaftswoche blames the excessive outsourcing by big telcos and media companies in the past.

To optimize their business, companies outsource hotlines and customer service to smaller, cheaper call centers. Some of the heavily underpaid employees of those call centers collect the data they are given, augment it with what comes out of their interactions with customers, and sell it to data merchants. Those merchants sell the data on to other data merchants, who acquire data from different sources, aggregate it and clean it up. In this way, a huge amount of data has accumulated and is now offered for sale on the market.

Issues

This case raises the issue of data governance. There were no accompanying measures to safeguard, audit or otherwise remain in control of the data. So how does a data controller remain in control in outsourcing scenarios? This question hasn't been resolved so far, which is why the media suggest that big companies abandon their outsourcing strategies.

Possible Remedies

In 2003, Pearson, Casassa Mont & Bramhall published a paper on Accountable Management of Identity and Privacy: Sticky Policies and Enforceable Tracing Services. It uses trusted computing technologies to enforce correct data handling by the party to which work requiring access to certain personal data is outsourced.
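The core of the sticky-policies idea is that the policy travels with the data, and every access attempt is checked against it and audited. The sketch below illustrates that idea only; all class and field names are invented, and the paper itself relies on trusted computing to make such checks non-bypassable, which plain application code cannot guarantee.

```python
from dataclasses import dataclass, field

# Illustrative sketch of a "sticky" record: data, its usage policy and
# an audit trail travel together, and access is mediated by the policy.
@dataclass
class StickyRecord:
    data: dict
    allowed_purposes: frozenset
    audit_log: list = field(default_factory=list)

    def access(self, requester: str, purpose: str) -> dict:
        """Release the data only for a permitted purpose; log every attempt."""
        if purpose not in self.allowed_purposes:
            self.audit_log.append((requester, purpose, "DENIED"))
            raise PermissionError(f"{purpose!r} not allowed by sticky policy")
        self.audit_log.append((requester, purpose, "GRANTED"))
        return self.data
```

In the outsourcing scenario above, a call center receiving such a record could read it for customer service, while an attempted resale would be refused and would leave a trace for the auditor.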

Added by Rigo Wenning: 08 December 2008

Privacy friendly data analysis

The MIT SENSEable City Lab analyzed the crowd and its communications during the inauguration of US President Obama. Remarkably, they applied EU data protection standards:

The data analyzed consists of hourly counts of mobile phone calls served in Washington, D.C. and includes the origin of the phones involved in the calls. To ensure the complete privacy of the mobile customers, analyses are performed in compliance with the 2002 Directive of the European Parliament and Council on Privacy. The use of aggregate call data implies that at no time can individual users be identified. As a comparison, consider information about highway traffic: we know how many cars from each state are traveling at any given time, but we do not know the license plate number or the individuals driving.
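The aggregate-only approach described in the quote can be sketched in a few lines: individual call records are reduced to hourly counts per origin and then discarded, so no identifier survives into the published result. The records and field names below are invented for the illustration.

```python
from collections import Counter

# Illustrative sketch: reduce per-call records to hourly counts per
# origin, the only data that leaves this scope. The caller field is
# deliberately never read, mirroring the license-plate analogy above.
raw_calls = [
    {"hour": 11, "origin": "US", "caller": "A1"},
    {"hour": 11, "origin": "US", "caller": "B2"},
    {"hour": 11, "origin": "FR", "caller": "C3"},
]

hourly_counts = Counter((c["hour"], c["origin"]) for c in raw_calls)
print(dict(hourly_counts))
```

As with the highway-traffic analogy, the output reveals how many calls came from each origin in each hour, but nothing about any individual caller.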

Added by Rigo Wenning: 26 May 2009