Web and TV Interest Group Teleconference

10 Apr 2013

See also: IRC log


Present: Clarke, Bin, Bryan, Yosuke, Kaz, Mark_Vickers, Sheau


Clarke starts the meeting and goes over the agenda

Topic: Final update of liaison letter

Clarke: 2. Follow-through: How do we maximize response from liaison letter and internal poll?

Clarke: it is each member's individual responsibility to follow through on that
... does anyone know how to get responses from people ASAP?

Sheau: internal or external?

Clarke: either one

<sheau> Bin, that was me Sheau speaking.

Clarke: I think these people haven't signed a formal liaison agreement


<bryan> I agree; for the organizations we are members of, we can reach out to the liaison lead to ensure quick followup. For OMA we are involved and can help get a response by June.

Clarke: anything else for the liaison letter?

Topic: Use cases

Clarke: no new ones
... we want to decide the requirements for testing and test tools
... we have candidate use cases; we need to decide whether to approve each one
... the criterion is whether requirements are generated from the use case

<Clarke> use case 1: http://www.w3.org/2011/webtv/wiki/Testing/Web_%26_TV_Testing_Discussions/Improve_Web_Platform_Consistency

Clarke: does this use case add a requirement?
... I think it does
... if we pass it on to the Testing TF, do you think it will be accepted?

Mark: agrees that this use case will generate requirements

<bryan> Mark, I agree we need to document the need while it's still not met

Clarke: any other comments?
... does anyone oppose adopting this use case?

move on to next use case

Mark: it is important to have it recorded
... if there are wording changes, they are certainly welcome

Clarke: or we can say it is recommended to pass this test suite

Mark: outside groups reference W3C tests, but you are saying W3C should also reference outside tests

Clarke: we are providing our perspective on requirements for the W3C test ecosystem
... any other comments on this use case?
... recommend to accept it
... does anyone oppose?

<Clarke> use case 3: http://www.w3.org/2011/webtv/wiki/Testing/Web_%26_TV_Testing_Discussions/performance_testing

Next one is Giuseppe's

Browser performance testing

Clarke: it's a performance test
... my question is whether we need performance tests

<bryan> yes we do

Bryan: we really need performance tests; in Coremob, for example, we have performance test requirements
... usually they are part of functional testing
... at the least, we need the ability to measure

<bryan> time to start a stream and average frame rate for example are big impacts to user experience and need to be assessable at the least

... to understand what the user experience will be
... and to make sure the network environment is consistent

Mark: agrees with what Bryan said
... what we are really going for is the binding functionality for a correct implementation

<bryan> If you ask OEMs, they will argue that device variation makes performance tests less useful, and we agree that devices vary for valid reasons, e.g. processor class and memory. But at least being able to consistently measure performance, with elimination of variables where possible, allows you to assess the result for your own purposes. W3C does not need to set expectations, except as a minimum, and then probably only as a recommendation.

... a good example is a performance "bound" for a class of devices

Sheau: acceptable if the use case describes the need well
... suggests adding clarification of the minimum user-acceptability requirements in addition to benchmark metrics

Clarke: if you think anything needs to be added, feel free to edit it
... suggest it be accepted
... does anyone oppose?

Clarke: next use case is Bin's

<Clarke> use case 4: http://www.w3.org/2011/webtv/wiki/Testing/Web_%26_TV_Testing_Discussions/MSE_Testing

Clarke: we had some discussion about this on the last conference call
... the question is whether this MSE use case adds requirements to pass on to the Testing TF

Mark: it might be good if we add specific test cases
... they might be good guidance for the test team

Clarke: I don't see that the MSE tests have covered areas such as dealing with a media stream with a particular start time, multiple simultaneous tracks, and adaptive bit rate
... there are several things to extend into testing requirements
... my recommendation is similar to the last one
... to extend the requirements
... assign an action item to Clarke, Bin, and Mark to add specific test cases
... add an action to communicate with the HTML WG
... recommend accepting this use case
... does anyone oppose?

<Clarke> use case 5: http://www.w3.org/2011/webtv/wiki/Testing/Web_%26_TV_Testing_Discussions/EME_Testing

Clarke: next is use case 5
... very similar to the last one
... given the security nature of CDMs, it would probably add value to the testing platform

Sheau: does the order of the 3 scenarios suggest anything?
... it feels like the 3rd one should be the 1st

<bryan> CDMs are similar, though, to video codecs in that the test environment needs to support a variety of codecs as well as CDMs. Specific CDM-supporting test environments may need to be provided (server side) by whoever defines/deploys the CDM

Mark: nothing is implied by the order
... 1 and 2 represent the fundamental functionality; 3 also adds graphics transport
... 3 is orthogonal to 1 and 2

<bryan> and presumably the CDM-drawn bits are not (or may not be) accessible to the application, e.g. taking snapshots in a canvas

Clarke: my recommendation is to accept this
... does anyone oppose?

One more: use case 6

<Clarke> use case 6: http://www.w3.org/2011/webtv/wiki/Testing/Web_%26_TV_Testing_Discussions/NSD_Testing

Clarke: the last use case is network service discovery
... it gives the user agent a particular way to access local network resources
... it adds security considerations and cross-origin features, and manages those features to keep the user in control
... unique test platform features are required by this use case
... we need to add more specifics on that
... recommend accepting it
... does anyone oppose?

Clarke: a more general question is whether the above use cases provide adequate test coverage
... this is a call for additional use cases to cover the broader Web and TV area more adequately

Clarke: basically we have gone through the use cases and requirements
... the main testing group is evolving
... we are looking for suggestions on how to accomplish our goals
... and for advice and recommendations
... is the landscape clear enough? are there contracting tools?
... or are the tools already there, so that we cannot add anything yet?

Bryan: hasn't heard of any contracting tools
... but they might hire someone to add a vendor's test suites

Clarke: we welcome advice on how to meet our deliverables
... any other business to discuss?

meeting adjourned

<Clarke> Thanks for scribing, Bin


let me generate the minutes

Summary of Action Items

[NEW] ACTION: Clarke, Bin, and Mark to add specific test cases for the MSE testing use case
[NEW] ACTION: communicate with the HTML WG regarding the MSE testing requirements

[End of minutes]

Minutes formatted by David Booth's scribe.perl version 1.137 (CVS log)
$Date: 2013-04-11 01:28:04 $