AB/2014-2015 Priorities/w3c work success

From W3C Wiki

Raison d'être

W3C Management has asked for the AB's help in identifying projects that do not seem to be on the road to success. This task force is intended to:

  • Identify measurable indicators that predict whether a spec development effort is on the road to success
  • Identify specific things the Team can do to identify red flags and mitigate risks

The challenge, of course, is that "prediction is very difficult, especially about the future". This task force is charged with gathering information and building consensus on how to do a better job than we have been doing at making such predictions and prescriptions.


Definitions

  • success - Need to define success in this context.
  • failure - Need to define failure in this context.
  • metrics - What metrics will be used to determine success vs. failure?

Basics of how the project will work

The project will mostly work in public, though the AB reserves the right to have confidential discussions with people who do not want to be identified or quoted in public, to make sure we get solid data on controversial topics. The project is open to any volunteer (AB, AC, or any W3C participant willing to help). At the AC meeting in May 2015, the project will propose a way to identify work not destined for success, as well as a plan for mitigating common reasons for a lack of success.

Methodology: Learning from W3C's successes and failures

The team will attempt to mine W3C's experience to learn some "lessons of history". The work will be done in public on the public-success-fail@w3.org list.

Mining the /TR page, participation lists, mailing lists, and issues lists for patterns
  1. Rates of bugs being raised and resolved
  2. Does the group lack energy or critical mass, or does it have too many cooks?
  3. Are specs advancing in maturity?

There is now a table mashing up data from the working groups database and the mailing list archive. See WG Data Mining for more information.

From a crude analysis based on sorting the table by a) having an expired charter for multiple years, b) number of messages on their mailing lists in the last half of 2014, c) number of technical reports published in 2013-2014, and d) number of CRs / PRs / Recommendations in 2013-2015, the following WGs scored low on multiple criteria:

  • Web Notifications
  • Forms
  • Near Field Communications
  • Voice Browser

See WG Data Mining Analysis for more information.
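The crude analysis above amounts to a simple multi-criteria screen over the mashed-up table. A minimal sketch of that screen, assuming illustrative field names and thresholds (the actual WG database schema and cut-off values are not specified here):

```python
# Hypothetical sketch of the "crude analysis": flag WGs that score low on
# several of the criteria a)-d) listed above. Field names and thresholds
# are illustrative assumptions, not the real WG database schema.

THRESHOLDS = {
    "charter_expired_years": lambda v: v >= 2,  # a) charter expired for multiple years
    "messages_h2_2014":      lambda v: v < 50,  # b) low mailing-list traffic, H2 2014
    "tr_published_2013_14":  lambda v: v == 0,  # c) no technical reports, 2013-2014
    "rec_track_2013_15":     lambda v: v == 0,  # d) no CRs / PRs / RECs, 2013-2015
}

def low_score_flags(wg: dict) -> list:
    """Return the names of the criteria on which a working group scores low."""
    return [name for name, is_low in THRESHOLDS.items() if is_low(wg[name])]

def flag_wgs(wgs: dict, min_flags: int = 2) -> list:
    """WGs scoring low on at least `min_flags` criteria, worst offenders first."""
    hits = [(len(low_score_flags(wg)), name) for name, wg in wgs.items()
            if len(low_score_flags(wg)) >= min_flags]
    return [name for _, name in sorted(hits, reverse=True)]
```

A screen like this only surfaces candidates for closer inspection; a WG can legitimately be quiet (e.g. while waiting on implementations), so low scores are a prompt for human review, not a verdict.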

Initial list of factors that may inhibit W3C work success
  1. Overly ambitious plans that require "boiling the ocean"
  2. Trying to prematurely standardize a rapidly evolving technology
  3. Trying to standardize technology that is competitive / key patent holders not willing to make RF commitments
  4. Trying to standardize technology where W3C doesn't have a critical mass of contributed expertise
  5. Continuing to invest after industry has moved on
  6. Members don't contribute adequate resources to stabilize / test draft specs, causing them to languish in WD or Candidate Recommendation stage
Initial list of failures to learn from
  1. XHTML 2
  2. XForms
  3. URL spec (there was a plausible plan, W3C couldn't execute on it for a combination of resource and political reasons)
  4. P3P
Strawman Proposals for avoiding failures
  1. Don't charter WGs without strong member support
    1. Issue: must define a metric for strong member support in this context
  2. Don't charter WGs without a contributed spec, e.g. from a CG or a Member Submission
  3. Team should be resourced and ready to take over editing / testing if member contributions don't materialize
    1. Issue: in this scenario, the Members should re-evaluate if the group should be continued
  4. Only charter WGs that will focus on objectively measurable real world interoperability issues/problems.
Strawman Proposals for detecting WG failures

The following indicators should be positive signs:

  1. The relevant industry representatives are mostly in the WG
  2. There is a high conversion rate from workshop/CG participants to WG members
  3. A substantial number of active contributors
  4. Group vitality (number of calls, F2F meetings, bug activity, …)
  5. Effective liaison with other market-related standardization bodies
  6. Alignment of deliverables with the initial calendar
  7. Lots of public communication on the topic (by members, by W3C, ...)

The following are probably "red flags"

  1. Irreconcilable differences on fundamental issues
  2. Turf wars with other organizations
  3. A lack of implementations


Timeline

  • October 2014: set up wiki and mailing list, call for participation
  • November/December 2014: identifying patterns indicating successful / unsuccessful work and testing their usefulness for prediction
  • January/February 2015: summarizing what we learned and presenting a proposal for how to identify work not destined for success going forward
  • March/April 2015: assessing feedback, evaluating impact on W3C reputation, culture, etc.
  • May 2015: recommendations and action plan

Team members

  • Michael Champion
  • Virginie Galindo
  • Chaals
  • Jeff
  • Marcos

Getting productive discussion from people may be easier if they are not exposed to public criticism. Confidential input is therefore allowed, but the bulk of the discussion should take place in public.