About this Task Force
The Web & TV Testing Task Force is part of the Web & TV Interest Group. The Task Force is responsible for developing testing requirements specific to Web & TV applications. The focus is expected to be on HTML5 and associated specifications. The Task Force will work with the Web Testing Interest Group and other relevant W3C groups to ensure that Web & TV requirements are met by the W3C testing framework.
Over the last couple of years, much work has been done to give HTML5 the features necessary to support commercial video. Adaptive bit-rate video formats, protected content, and features required by commercial video regulation have all been developed so that common means of providing commercial video are available through standard interfaces. Being able to reliably test these new features is a necessary step towards ensuring high-quality implementations that run correctly and consistently on HTML5 browsers. W3C has recognized that testing is needed to encourage consistent and reliable HTML5 browser implementations, and the Web & TV Testing Task Force will work with other testing efforts in W3C to make sure the W3C tests are rich enough to support Web & TV use cases.
The Web & TV Testing Task Force will do the following:
- Collect Web & TV testing use cases (e.g. "Testing a browser embedded in a commercial TV" or "use of W3C tests by third-party certification organizations").
- List requirements for W3C test tool features to achieve the use cases (e.g. "ability to test embedded browsers").
- List prioritized requirements for W3C specification test coverage to achieve the use cases (e.g. HTML5, Media Source Extensions).
- Identify gaps in the current test tools.
- Identify gaps in the current test coverage.
- List features important to Web & TV members and classify them in terms of testing priority.
- Work with the Web Testing IG, Browser Testing and Tools WG, HTML Testing TF and other relevant W3C groups to communicate the requirements and develop a strategy to fill the identified gaps.
- Liaise with external organizations to inform them about the ongoing activities and gather input on the use cases and requirements.
This dashboard will be used to provide timely and relevant information on the work of the Web & TV Testing Task Force.
Please bear in mind that text should be used whenever possible (instead of email attachments) so that contributions can be indexed.
- Use Cases & Open/Closed Issues: a page where contributions can be uploaded or linked for discussion with other IG members.
- Test Coverage and Priorities
- Examine existing tools (e.g. Modernizr)
- Surveys of major web sites for issues
- Consider external organizations (DLNA, OIPF, etc.; see below)
- Central Test Runner
- One URL as central location for all tests
- One click to run all tests
- Clear results summarizing top pass/fail results
- Detailed pass/fail results for individual tests
- Test configuration options
- Certifiable test logs
- Device Tests
- Remote testing
- Differences for products vs. prototypes?
- Requirements from Use Cases
- Standardized APIs for test hooks
- An ecosystem (web sites, workshops, etc.) to get feedback from the community on bugs, priorities, features, etc.
- Provide one home for all W3C tests
- Performance measurement
- Time to start a stream
- Average frame rate
- Possibly define an "acceptable" bound, but the measurement itself is most important because different applications will require different performance.
- Management of testing with various codecs and encoding schemes (e.g. adaptive bit rate)
- Management of testing with various externally provided CDM implementations
- Interface and testing to override cross-origin restrictions for specific devices and services authorized by the user
- Testing security of discovery process (e.g. user must specifically authorize access to each device)
- Synchronization of streams in MSE (e.g. works with ad insertion or concatenation of programs in continuous stream)
- Network impairment/feedback mechanism to force MSE adaptive bit-rate implementations to change bit rate.
- Test cases in support of EME
- The same CDM across two different browsers yields the same result
- Two different CDMs can decrypt the same stream
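To make the "clear results" runner requirements above more concrete, here is a minimal sketch of a pass/fail summarizer. The `summarize` helper and the test names are hypothetical illustrations, not part of any W3C tooling:

```python
from collections import Counter

def summarize(results):
    """Summarize per-test results ("pass"/"fail") into the kind of
    top-level report the runner requirements above describe."""
    counts = Counter(status for _, status in results)
    return {
        "total": len(results),
        "pass": counts.get("pass", 0),
        "fail": counts.get("fail", 0),
        # Detailed view: names of the individual tests that failed.
        "failed_tests": [name for name, status in results if status == "fail"],
    }

# Hypothetical test names, for illustration only.
results = [("video-canplay", "pass"),
           ("mse-append-buffer", "fail"),
           ("eme-request-access", "pass")]
print(summarize(results))
```

A real runner would also carry test configuration and certifiable logs alongside this summary, per the requirements above.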
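The performance-measurement items above (time to start a stream, average frame rate) reduce to simple computations over frame presentation timestamps. The helpers below are an illustrative sketch only, not a proposed API:

```python
def average_frame_rate(timestamps):
    """Average frame rate (fps) from frame presentation timestamps in
    seconds. Needs at least two frames to measure an interval."""
    if len(timestamps) < 2:
        raise ValueError("need at least two frame timestamps")
    elapsed = timestamps[-1] - timestamps[0]
    return (len(timestamps) - 1) / elapsed

def startup_time(request_time, first_frame_time):
    """Time from stream request to first rendered frame, in seconds."""
    return first_frame_time - request_time

# Example: 25 frames presented at 40 ms intervals -> 25 fps.
ts = [i * 0.040 for i in range(25)]
print(round(average_frame_rate(ts), 2))  # -> 25.0
print(startup_time(0.0, 0.25))           # -> 0.25
```

As noted above, whether 25 fps or a 250 ms startup is "acceptable" depends on the application; the test framework only needs to report the measured values.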
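The network-impairment item above assumes an adaptive bit-rate player that switches representations when throughput drops. A toy rate-selection rule (purely illustrative; real MSE players use their own heuristics, and the `safety` margin here is an assumption) shows what such a test could force and then assert:

```python
def select_representation(bitrates, throughput, safety=0.8):
    """Pick the highest representation bitrate not exceeding
    safety * measured throughput (bits/s); fall back to the lowest
    representation when even that does not fit."""
    usable = safety * throughput
    candidates = [b for b in sorted(bitrates) if b <= usable]
    return candidates[-1] if candidates else min(bitrates)

# Hypothetical bit-rate ladder for a single stream.
ladder = [500_000, 1_500_000, 3_000_000, 6_000_000]
print(select_representation(ladder, 5_000_000))  # unimpaired link
print(select_representation(ladder, 1_000_000))  # after network impairment
```

A test harness with a network-impairment hook could throttle throughput and assert that the player down-switches, much as the second call selects a lower rung than the first.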
This is a list of groups/activities that may be related to the TF work. Note that this list doesn't imply any endorsement of the linked technologies or official liaison with the linked group.
- Test Framework Task Force
- Test Management Task Force
- Resource Center Task Force: Contribute requirements here. Provide a prioritized table by topic.
- Common Mobile/TV specification list
- Mobile/TV differences
External Groups and Liaison Representatives:
- DLNA: Mark Vickers
- OIPF: Giuseppe Pascale, (Paul Higgs?)
- CEA: Clarke Stevens
- DTG: Giuseppe Pascale
- HbbTV: Giuseppe Pascale
- SmartTV Alliance: Giuseppe Pascale
- ATSC: Clarke Stevens
- MMT Group (in MPEG): Mark Vickers, Clarke Stevens
- IPTV Forum: Yosuke Funahashi
- OMA: Bryan Sullivan
Task Force Participants:
- Clarke Stevens, CableLabs (moderator) firstname.lastname@example.org
- Mark Vickers, Comcast
- Giuseppe Pascale, Opera
- Yosuke Funahashi, Tomo-Digi
- Kaz Ashimura, W3C
- Bob Lund, CableLabs
- Sheau Ng, Comcast/NBCU
- Bin Hu, AT&T
- [add your name here if you wish to participate in this TF]
Discussion & Calls
- This TF conducts its work primarily on the public mailing list at email@example.com (archive); prefix the email subject with the TF identifier [testing].
- General Info (bridge, schedule, IRC etc)
- Agenda Telco 6th of March 2013 (minutes)
- Agenda Telco 27th of March 2013 (minutes)
- Agenda Telco 3rd of April 2013 (minutes)
- Agenda Telco 10th of April 2013 (minutes)
- Agenda Telco 17th of April 2013 (minutes)
- Agenda Telco 24th of April 2013 (minutes)
- Agenda Telco 8th of May 2013 (minutes)