ISSUE-223 Criteria for implementation testing

Hello all,

As discussed previously, the group should have criteria by which to
measure success during the implementation/testing phase. By establishing
criteria, the group will have benchmarks against which to measure test
implementations, and will then know in which cases it has achieved the
desired result and in which cases it may need to revise the specification.

This email is to begin the conversation around establishing what these 
benchmarks shall be.

  * What is the anticipated cost of implementation for a server? Is that
    cost reasonable?
      o Overly expensive implementation costs will impede adoption.
      o Does anyone have an idea of a reasonable range?
      o Should the cost be expressed as a fixed cost or as a percentage
        of revenue?
  * What is the economic impact of implementation?
      o How are particular businesses, market segments, or the industry
        affected?
  * What is the impact on the user experience?
      o On the micro level, what is the experience of users interacting
        with choices related to DNT?
      o On the macro level, does DNT alter the overall experience of
        users on the net? Does it alter the quantity or nature of
        content that is available to users?
      o How does DNT change the behavior of publishers, and how do those
        changes affect users?
  * Rate of adoption and range of implementers
      o What level of adoption are we seeing or anticipating? What level
        is considered success?
  * Does DNT satisfy policy requirements?
      o Does it reduce the amount of data being collected about users?
      o Does it mitigate real or perceived privacy concerns?
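For context on the micro-level questions above: a user's preference
reaches the server as the DNT request header defined in the Tracking
Preference Expression draft ("1" = user requests no tracking, "0" =
user grants consent, absent = no preference expressed). A minimal
sketch of how a server might interpret it (the function name and
return values are illustrative, not from any spec):

```python
def interpret_dnt(headers):
    """Interpret the DNT request header per the Tracking Preference
    Expression draft: "1" means the user requests no tracking, "0"
    means the user consents, and an absent header means the user has
    expressed no preference."""
    value = headers.get("DNT")
    if value == "1":
        return "no-tracking"
    if value == "0":
        return "consent"
    # Absent or unrecognized value: treat as no expressed preference.
    return "no-preference"
```

For example, a request carrying "DNT: 1" would be interpreted as
"no-tracking", while a request with no DNT header at all would be
"no-preference".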

I look forward to hearing others' input.

Best,

David

Received on Monday, 18 November 2013 20:32:51 UTC