Testing

About this Task Force

The Web & TV Testing Task Force is part of the Web & TV Interest Group. The Task Force is responsible for developing testing requirements specific to Web & TV applications. The focus is expected to be on HTML5 and associated specifications. The Task Force will work with the Web Testing Interest Group and other relevant W3C groups to ensure that Web & TV requirements are met by the W3C testing framework.

Background

Over the last couple of years, much work has been done to give HTML5 the features necessary to support commercial video. Adaptive bit-rate streaming, protected content, and capabilities required by commercial video regulation have all been developed so that commercial video can be delivered through standard interfaces. Being able to reliably test these new features is a necessary step towards ensuring high-quality implementations that run correctly and consistently on HTML5 browsers. The W3C has recognized the need for testing to encourage consistent and reliable implementation of HTML5-compliant browsers. The Web & TV Testing Task Force will work with other testing efforts in W3C to make sure the W3C tests are rich enough to support Web & TV use cases.

The Web & TV Testing Task Force will do the following:

  • Collect Web & TV testing use cases (e.g. "Testing a browser embedded in a commercial TV" or "use of W3C tests by third-party certification organizations").
  • List requirements for W3C test tool features to achieve the use cases (e.g. "ability to test embedded browsers").
  • List prioritized requirements for W3C specification test coverage to achieve the use cases (e.g. HTML5, Media Source Extensions).
  • Identify gaps in the current test tools.
  • Identify gaps in the current test coverage.
  • List features important to Web & TV members and classify them in terms of testing priority.
  • Work with the Web Testing IG, Browser Testing and Tools WG, HTML Testing TF and other relevant W3C groups to communicate the requirements and develop a strategy to fill the identified gaps.
  • Liaise with external organizations to inform them about ongoing activities and gather input on the use cases and requirements.

Dashboard

This dashboard will be used to provide timely and relevant information on the work of the Web & TV Testing Task Force.

Use Cases

Please bear in mind that plain text should be used whenever possible (instead of email attachments) so that content can be indexed.

Resources


Tentative Requirements

  • Test Coverage and Priorities
  1. Examine existing tools (e.g. Modernizr)
  2. Survey major web sites for issues
  3. Hold workshops
  4. Consider input from external organizations (DLNA, OIPF, etc.; see below)
  • Central Test Runner (a minimal runner sketch follows this list)
  1. One URL as central location for all tests
  2. One click to run all tests
  3. A clear summary of top-level pass/fail results
  4. Detailed pass/fail results for individual tests
  5. Test configuration options
  6. Certifiable test logs
  • Device Tests
  1. Remote testing
  2. Differences for products vs. prototypes?
  • Requirements from Use Cases
  1. Standardized APIs for test hooks
  2. An ecosystem (web sites, workshops, etc.) to get feedback from the community on bugs, priorities, features, etc.
  3. Provide one home for all W3C tests
  4. Performance measurement (a measurement sketch follows this list)
    1. Time to start a stream
    2. Average frame rate
    3. Possibly define an "acceptable" bound, but the measurement itself matters most, since different applications will require different performance.
  5. Management of testing with various codecs and encoding schemes (e.g. adaptive bit rate)
  6. Management of testing with various externally provided CDM implementations
  7. Ensure testing can be performed with an EME implementation in JavaScript
  8. Interface and testing to override cross-origin restrictions for specific devices and services authorized by the user
  9. Testing security of discovery process (e.g. user must specifically authorize access to each device)
  10. Synchronization of streams in MSE (e.g. works with ad insertion or concatenation of programs in continuous stream)
  11. Network impairment/feedback mechanism to force MSE adaptive bit-rate implementations to change bit rate.
  12. Test cases in support of EME (an EME probe sketch follows this list)
    1. The same CDM across two different browsers yields the same result.
    2. Two different CDMs can decrypt the same stream.
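
The following is a minimal sketch (in TypeScript) of the "one URL, one click" runner described under Central Test Runner above. It is a non-normative illustration: the test URLs and the postMessage result format are assumptions, not an existing W3C interface.

    // Hypothetical runner: load each test page in an iframe and collect
    // the pass/fail result that each page reports via postMessage.
    const testUrls = ['/tests/mse-append.html', '/tests/eme-probe.html']; // hypothetical

    interface TestResult { url: string; pass: boolean; detail: string; }

    function runTest(url: string): Promise<TestResult> {
      return new Promise((resolve) => {
        const frame = document.createElement('iframe');
        const onMessage = (e: MessageEvent) => {
          if (e.source !== frame.contentWindow) return; // ignore other frames
          window.removeEventListener('message', onMessage);
          frame.remove();
          resolve({ url, pass: e.data.pass, detail: e.data.detail });
        };
        window.addEventListener('message', onMessage);
        frame.src = url;
        document.body.appendChild(frame);
      });
    }

    async function runAll(): Promise<void> {
      const results: TestResult[] = [];
      for (const url of testUrls) results.push(await runTest(url));
      const passed = results.filter((r) => r.pass).length;
      // Summary first (requirement 3), then per-test detail (requirement 4).
      console.log(`${passed}/${results.length} tests passed`);
      results.forEach((r) => console.log(`${r.pass ? 'PASS' : 'FAIL'} ${r.url}: ${r.detail}`));
    }

A production runner would also need per-test timeouts, the configuration options, and the certifiable logs listed above; the sketch only shows the aggregation pattern.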
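
Next, a sketch of the performance measurements listed above (time to start a stream, average frame rate). The element id, stream URL, and 10-second window are arbitrary assumptions, and getVideoPlaybackQuality() (from the Media Playback Quality specification) would need feature detection on devices that lack it.

    // Measure start-up time and average displayed frame rate for a <video>.
    const video = document.getElementById('player') as HTMLVideoElement; // hypothetical id

    const requested = performance.now();
    video.src = 'https://example.com/tv-stream.mp4'; // hypothetical stream
    video.play();

    // Time to start a stream: from the play request to the first rendered frame.
    video.addEventListener('playing', () => {
      console.log(`Start-up time: ${(performance.now() - requested).toFixed(0)} ms`);
    }, { once: true });

    // Average frame rate: displayed frames divided by elapsed media time.
    setTimeout(() => {
      const q = video.getVideoPlaybackQuality();
      const fps = (q.totalVideoFrames - q.droppedVideoFrames) / video.currentTime;
      console.log(`Average displayed frame rate: ${fps.toFixed(1)} fps`);
    }, 10000); // arbitrary 10 s measurement window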
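
Finally, a sketch of the kind of EME capability probe the test cases above would build on. The key system name is a hypothetical placeholder; a real cross-browser test would iterate over the key systems shipped with each browser's CDM and compare decryption results for the same stream.

    // Probe whether a given key system (CDM) is usable via the standard EME API.
    const emeConfig: MediaKeySystemConfiguration[] = [{
      initDataTypes: ['cenc'],
      videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
    }];

    async function probeKeySystem(keySystem: string): Promise<boolean> {
      try {
        const access = await navigator.requestMediaKeySystemAccess(keySystem, emeConfig);
        await access.createMediaKeys(); // instantiate the CDM
        return true;
      } catch {
        return false;
      }
    }

    // 'com.example.cdm' is a hypothetical key system name.
    probeKeySystem('com.example.cdm').then((ok) =>
      console.log(`com.example.cdm ${ok ? 'is' : 'is not'} available`));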

Survey on testing priorities

External groups (listed below) and IG members have been asked to complete a Feature Coverage Table.

External Groups

In order to get feedback on testing priorities from various organizations in the TV industry, we sent out a survey. Below is a list of the external organizations we have reached out to, along with the status of each reply to our liaison letter. Note that the list has been built based on members' input on which organizations to contact.

Survey Results

The detailed results from the survey are hosted on the member wiki and are MEMBER CONFIDENTIAL.

An aggregated version of the results is also available on this public wiki.

Other Relevant W3C activities

This is a list of groups/activities that may be related to the TF's work. Note that this list doesn't imply any endorsement of the linked technologies or an official liaison with the linked groups.

Specification Proposals

  • TBD

Participants

  • Clarke Stevens, CableLabs (moderator) c.stevens@cablelabs.com
  • Mark Vickers, Comcast
  • Giuseppe Pascale, Opera
  • Yosuke Funahashi, Tomo-Digi
  • Kaz Ashimura, W3C
  • Bob Lund, CableLabs
  • Sheau Ng, Comcast/NBCU
  • Bin Hu, AT&T
  • Daniel Davis, W3C
  • [add your name here if you wish to participate in this TF]

Discussion & Calls