
DeviceDescriptionEcosystemTrust


This is the Trust section of the Device Description Ecosystem document. See the DeviceDescriptionEcosystem overview.

Trust

Trust is a value we place on data. When two or more sources offer the same information, we choose the source in which we place the greatest trust, to increase the chance that the information we obtain from the data is good.

The issues to be resolved include:

  • Representing the trust values (a minimal sketch follows this list).
  • Determining trust values for information sources.
  • Communicating trust to the data consumer.
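
One way to picture the first issue is to attach a numeric trust value to each competing claim and select the claim from the most-trusted source. This is only a sketch: the 0-to-1 scale, the class and field names, and the example values are assumptions for illustration, not part of any agreed vocabulary.

    # Minimal sketch: trust-annotated claims about the same property,
    # with selection of the most-trusted one. All names/values are illustrative.
    from dataclasses import dataclass

    @dataclass
    class Claim:
        source: str    # who asserted the value
        value: str     # the asserted property value
        trust: float   # trust placed in this source: 0.0 (none) to 1.0 (full)

    def most_trusted(claims):
        """Return the claim whose source we trust most."""
        return max(claims, key=lambda c: c.trust)

    claims = [
        Claim(source="manufacturer M", value="keyboard: excellent", trust=0.8),
        Claim(source="accessibility group A", value="keyboard: poor", trust=0.6),
    ]
    print(most_trusted(claims).value)   # -> keyboard: excellent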

Consider the following scenario:

Manufacturer M provides a highly portable device with a built-in keyboard. Service provider S wants to deliver an interactive form to M's device and will choose a menu-based solution or a text-entry solution based on the keyboard's usability. User U has indicated no particular preference for stylus, buttons or keyboard. Accessibility group A has rated M's keyboard as poor, while M has rated the keyboard as excellent. How should S proceed?

In the above scenario, S may decide that M has a reputation for good usability testing and is honest in its descriptions, whereas A has a tendency to be over-critical. Thus S trusts M's description more than A's, and therefore chooses to send the device a form requiring the user to enter some text into a field.

Alternatively, S may have no opinion about the trustworthiness of M's descriptions, but knows that the government has suggested that A's usability assessments be used whenever possible. In this case, S will defer to A's judgment and choose to deliver a sequence of simple menus to the device, to avoid the user having to type text.

The scenario above highlights a problem with subjective descriptions (such as the usability of certain device features), since any such description can only be the opinion of the people who created it. However, a similar problem can occur with data that appears to be objective. The size of a screen would seem to be an objectively measurable value. Yet to the manufacturer it means the number of pixels present in the screen, while to the content author it means the subset of those pixels that can actually be used to render content. The two values are often different. One could attempt to derive the "usable size" from the "physical size", but this introduces a processing step into the acquisition of the information, and the trustworthiness of that processing must itself be considered. Thus, any variability in the interpretation of data can lead to discrepancies and a loss of trust.
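
To make the processing-step concern concrete, the following sketch derives a usable size from a physical size and discounts the trust of the derived value accordingly. The sizes, the allowance for browser chrome and the discount factor are all assumptions for illustration.

    # Illustrative only: a derived value ("usable size") carries less trust
    # than the original measurement ("physical size"), because the derivation
    # itself is a processing step whose trustworthiness must be considered.

    PHYSICAL_SIZE = (240, 320)   # pixels, as stated by the manufacturer
    PHYSICAL_TRUST = 0.9         # trust placed in the manufacturer's figure

    CHROME = (0, 40)             # assumed pixels lost to browser chrome etc.
    DERIVATION_TRUST = 0.8       # trust placed in the derivation step itself

    usable_size = (PHYSICAL_SIZE[0] - CHROME[0], PHYSICAL_SIZE[1] - CHROME[1])
    usable_trust = PHYSICAL_TRUST * DERIVATION_TRUST   # trust degrades per step

    print(usable_size, round(usable_trust, 2))   # (240, 280) 0.72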

Trust is important when critical decisions must be made. Selecting between two sources of data is one such decision. Choosing between a number of adaptation options in critical phases of interaction with the user is also trust-related. For example, suppose your service is delivering a map (or an X-ray medical image, a security picture from a remote camera, or any other fidelity-sensitive image) to a mobile device, and you have information from a moderately trusted source that the device will scale rather than clip the image. Do you, or do you not, adapt the image on the server before transmission? The consequences of making the wrong choice could be significant.
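
One way to model such a decision is to gate it on a trust threshold that reflects the cost of being wrong. The threshold and trust values below are assumptions for illustration; a real service would tune them to the consequences at stake.

    # Sketch of a trust-gated adaptation decision for the fidelity-sensitive
    # image scenario above. Threshold and trust values are assumed.

    def should_adapt_on_server(device_scales_images: bool, trust: float,
                               threshold: float = 0.9) -> bool:
        """Adapt on the server unless we are sufficiently confident that the
        device will scale (rather than clip) the full image by itself."""
        return not (device_scales_images and trust >= threshold)

    # A moderately trusted source says the device scales rather than clips:
    print(should_adapt_on_server(device_scales_images=True, trust=0.6))  # True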

Trust the source of information

There is a relationship between the trust you assign to an instance of data and the trust you assign to the source of that data. Generally, the more you trust the source, the more you will trust the data. New data produced by a source will initially be trusted to the same degree as the source itself. When the data is subsequently used, its quality can be evaluated. If the results of using the data are good, this can improve the trust one places in the source. Similarly, if the results of using the data are bad, this can negatively affect the trust in the source.
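
This feedback loop can be sketched as a simple reputation update in which each observed outcome nudges the source's trust towards 1.0 or 0.0. The exponential-moving-average rule and the weight are assumed policy choices, not a prescription.

    # Sketch: new data inherits the source's current trust, and good or bad
    # outcomes from using the data nudge that trust up or down over time.

    def update_source_trust(current: float, outcome_good: bool,
                            weight: float = 0.1) -> float:
        """Move trust towards 1.0 after a good outcome, towards 0.0 after a bad one."""
        target = 1.0 if outcome_good else 0.0
        return (1 - weight) * current + weight * target

    trust = 0.5                      # initial trust in a new source
    for outcome in (True, True, False, True):
        trust = update_source_trust(trust, outcome)
    print(round(trust, 3))           # 0.582: trust drifts with observed quality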

The trust one associates with a source can be affected by the proximity of the source to the origin of the data. For example, a manufacturer of a display screen may be highly trusted with respect to objective technical details of the screen, such as its dimensions. A retailer of devices in which many screens are used might be trusted less because the retailer was not involved in the manufacture of the screens.
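
It follows that trust need not be a single number per source; it can vary per property, according to the source's proximity to the origin of that property. The table below is an illustrative assumption of how such ratings might be held.

    # Sketch: trust keyed by (source, property), so a source close to the
    # origin of a datum is trusted more for that datum than for others.

    TRUST = {
        ("screen_manufacturer", "screen.dimensions"): 0.95,  # made the screen
        ("device_retailer",     "screen.dimensions"): 0.60,  # resells many models
        ("device_retailer",     "retail.price"):      0.95,  # origin of this datum
    }

    def trust_in(source: str, prop: str, default: float = 0.3) -> float:
        return TRUST.get((source, prop), default)

    print(trust_in("device_retailer", "screen.dimensions"))  # 0.6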

Missing information

The information provided by sources is never complete. There are several reasons for missing information, including:

  • Costs associated with gathering, verification and publication.
  • Lack of access to the origin of the information.
  • Concern that revealing some information may reflect negatively on the source. In particular, some manufacturers may be cautious about making public information on device limitations or known erroneous behaviour.
  • Belief that certain data would not be valuable, and therefore not worth publishing.
  • Delegation of responsibility for publication of some of the data to other authorities.
  • Mistakes made by the collators of the data.

Multiple sources

¤ Discuss: the reasons for there being multiple sources of information, and how to deal with merging them into a single set, selecting from conflicting information, etc.

Errors in original information

¤ Discuss: causes of errors and the problem of getting such errors corrected, and of dealing with legacy erroneous data even after the corrections have been made.

Objective vs Subjective information

¤ Discuss: difference between objective and subjective information, with examples, and consequences for trust. Mention issue of verification (objective=easy, subjective=controversial).

Trust how information is used

¤ Discuss: trust issues for providers of device information who may have good reason to be concerned about how such information is used (or abused).

User preferences

User preferences are not part of a device description, though they may be part of an extended repository. ¤ Needs rewriting.

Implied user properties

The implication of some device properties is that the end-user may be categorised. For example, the presence of a screen reader would be seen as implying that the user was visually impaired, and this information could influence the content/service provider (e.g. a health insurance provider offering a service via the Web).

Legal restrictions

Jurisdiction issues

Trust the fidelity of forwarded information

What happens to cached information? Will it go stale? Will forwarded information be delivered verbatim, or will an intermediary be permitted to summarize or aggregate it?

Many of these issues are addressed by established information security technologies. However, implementing them introduces overheads (e.g. the calculation of digital signatures) that will affect both the cost and the efficiency of the DD repository.
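
As a sketch of one such technique, a repository could attach a message authentication code to each description so that a consumer can detect alteration by an intermediary. HMAC with a shared key is used here for brevity; a real DD repository might well prefer public-key signatures, and the key and payload are illustrative assumptions.

    # Sketch: signing a device description so a consumer can detect tampering
    # by intermediaries. The per-description hash is part of the overhead
    # discussed above.
    import hashlib
    import hmac

    KEY = b"shared-secret"   # assumes an out-of-band key agreement

    def sign(description: bytes) -> str:
        return hmac.new(KEY, description, hashlib.sha256).hexdigest()

    def verify(description: bytes, signature: str) -> bool:
        return hmac.compare_digest(sign(description), signature)

    payload = b'{"screen": {"width": 240, "height": 320}}'
    sig = sign(payload)
    print(verify(payload, sig))         # True: delivered verbatim
    print(verify(payload + b" ", sig))  # False: the forwarded copy was altered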

In a working ecosystem, those demanding trust must place a value on this trust and therefore accept the associated overheads.