W3C

- DRAFT -

HTML Media Task Force Teleconference

06 Jan 2015

Agenda

See also: IRC log

Attendees

Present
+1.650.458.aaaa, +1.408.536.aabb, markw, Vincent, davide, jdsmith, BobLund, Aaron_Colwell, paulc, joesteele, adrianba, +1.650.458.aacc, pal, ddorwin
Regrets
Chair
paulc
Scribe
joesteele

Contents


<trackbot> Date: 06 January 2015

<scribe> scribe: joesteele

<paulc> trackbot, start meeting

<trackbot> Meeting: HTML Media Task Force Teleconference

<trackbot> Date: 06 January 2015

<BobLund> I joined but am in a noisy place

<cyril> http://lists.w3.org/Archives/Public/public-html-media/2015Jan/0007.html

<scribe> agenda: http://lists.w3.org/Archives/Public/public-html-media/2015Jan/0007.html

roll call

Aaron is here with a cold

paulc: meeting will be on MSE topics

update on editors draft

paulc: bugs are on agenda for later

acolwell: since the last set of changes I have not been able to make more edits yet
... still working on it
... clear path for everything except the bug filed today
... fine with doing a heartbeat
... not sure whether folks want me to resolve all existing bugs -- at least a couple may be non-trivial

paulc: wait till we go through the bugs

acolwell: we can mark the ones we want to do

MSE test suite status

paulc: F2F had ACTION-71 re: Cyril figuring out where we stood

ACTION-71?

<trackbot> ACTION-71 -- Paul Cotton to Investigate testing status to move things forward -- due 2014-11-07 -- CLOSED

<trackbot> http://www.w3.org/html/wg/media/track/actions/71

paulc: cyril sent a report

<paulc> http://lists.w3.org/Archives/Public/public-html-media/2014Dec/0012.html

paulc: can you step through the reports and remark on any changes since then?
... please use the queue for questions
... want to figure out how we should make progress

cyril: the report is in three parts
... part 1 is about producing the reports of the runner -- took a while to set up correctly
... needed to set it up on my machine; it is now working correctly again
... I have run the test suite we have in the runner on three different browsers -- Firefox, Chrome, IE
... reports are in JSON and are in the git repository called "test results"
... that is the initial report but folks can update as needed
... for IE I had lots of crashes at the time, so the reports might not be good

paulc: link in the report was for which one?
... #3 is the raw HTML I think?

cyril: that is the report generated by the online runner

paulc: is there a way to look at that in human readable form?

acolwell: it's basically the same way we do specs -- raw git URL

cyril: even that gives you plaintext

<paulc> Raw data of results: https://github.com/w3c/test-results/commit/265eaeb832a4deb19412dfa863fd422889263ffc

paulc: I can't see it -- can you give us a short summary of the trends?

cyril: let me find a way to view the results

<BobLund> http://rawgit.com/w3c/test-results/265eaeb832a4deb19412dfa863fd422889263ffc/media-source/all.html

boblund: this should give you a human readable version

paulc: lots of red
... lots of IE timeouts
... have you figured out how to get around that problem?

cyril: I was running IE11 on Win8 -- lots of problems at that time

paulc: are the green and yellow things that were not run in IE?

?1: there were no results in those columns

paulc: seems like a lot of fails in UC10

cyril: not sure what those are -- they were in the tests at that time

paulc: anyone know what the status is in Safari?

acolwell: they have been filing bugs so I know they are working

paulc: let's keep going

cyril: that was part 1
... for part 2 started looking at the test files -- very long task
... I looked at one file in particular and filed some bugs on that
... Aaron commented but have not had time to respond yet

paulc: so you thought some tests needed changing?

cyril: this test was making calls to isTypeSupported -- which is static -- and was assuming the WebM format to be supported
... my main suggestion was to split this into a core test and separate tests for each byte stream format
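
A minimal sketch (not from the meeting) of the split Cyril describes: probing each registered byte stream format through the static MediaSource.isTypeSupported call rather than assuming WebM support. The type strings below are illustrative only.

    // Sketch: probe support per byte stream format; support for any
    // combination is implementation-dependent.
    var formats = [
      'audio/webm; codecs="vorbis"',        // WebM byte stream format
      'video/mp4; codecs="avc1.42E01E"',    // ISO BMFF byte stream format
      'video/mp2t; codecs="avc1.42E01E"',   // MPEG-2 TS byte stream format
    ];
    formats.forEach(function (type) {
      console.log(type, MediaSource.isTypeSupported(type));
    });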

acolwell: I put in some comments; in general I approve of his approach
... not sure whether this is establishing a base or documenting all possible use cases
... pretty sure we will not have interop in some cases
... for example for some codecs

paulc: cyril can you explain the tests you added and why?

<acolwell> https://github.com/cconcolato/web-platform-tests/commit/0499e3fc0103f99fb64f386e5db070af47e5a62a

cyril: the MP4 file format is a container for many types of codecs
... I looked at all existing codecs and proposed a test for those codecs
... I agree with Aaron that most of these codecs will not be implemented interoperably
... but there are some codecs and variations of codecs that would be worth testing

paulc: where did you pull the codecs from?

cyril: from the MP4 registration authority that maintains a list of standards and codecs
... any vendor can add a codec -- mp4ra.org

adrianba: I think the goal that we have is to test interop of MSE itself, and of the normative requirements. MSE does not mandate any particular format support.
... the spec itself does not have specific codec requirements
... using isTypeSupported as an example -- we want to make sure implementations do this correctly, not enumerate a large number of codecs to see which are supported
... if someone has an implementation they want to submit which does not support WebM or other codecs, it would be reasonable for them to request a new codec be added to indicate their support for that API

<paulc_> MSE registry: http://www.w3.org/2013/12/byte-stream-format-registry/

adrianba: but having a long list of codecs is probably not needed and we should be cautious about the ones we add
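
As a rough illustration of Adrian's point, a behavior-only check in web-platform-tests testharness.js style (a sketch; it assumes only the spec requirement that empty or unknown type strings are reported as unsupported):

    // Sketch: assumes /resources/testharness.js is loaded. Exercises
    // normative isTypeSupported behavior with no real codec assumptions.
    test(function () {
      assert_false(MediaSource.isTypeSupported(''),
                   'empty string is never supported');
      assert_false(MediaSource.isTypeSupported('bogus/type'),
                   'an unknown MIME type is not supported');
    }, 'isTypeSupported rejects clearly invalid input');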

paulc: when I asked the question about where the list was from -- wondered if it was from this registry
... that would have been the place to start

cyril: the registry only lists the byte stream formats, not the codecs
... you have to go to the other registry for the codecs
... I agree with Adrian that we want to test the behavior not the byte stream formats
... what I thought was needed to test MSE specifically were things like --
... what happens when one file has multiple tracks and some tracks use codecs that are not supported

acolwell: I agree with Adrian that we should keep the tests focused on testing MSE, not the byte streams, but I also agree that we should cover behavior where there are unsupported codecs
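
A sketch (not from the meeting) of the unsupported-codec path being discussed; the type string is deliberately bogus and assumed unsupported, and the page is assumed to contain a video element:

    // Sketch: per the spec, addSourceBuffer throws NotSupportedError
    // for a type the implementation does not support.
    var ms = new MediaSource();
    ms.addEventListener('sourceopen', function () {
      try {
        ms.addSourceBuffer('video/x-bogus; codecs="none"');
      } catch (e) {
        console.log(e.name); // expected: "NotSupportedError"
      }
    });
    // Attaching the MediaSource to a media element fires 'sourceopen'.
    document.querySelector('video').src = URL.createObjectURL(ms);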

adrianba: was not suggesting that we have tests that do not use codecs; this could be similar to how we did video testing in HTML5 -- H.264, Ogg, or WebM
... did not matter what format the files were in as long as you could play video and the interaction worked correctly
... but we had to have media content to test the elements
... in this case we will have to have media that does support what is being tested
... that is different from going to the MP4 registry, enumerating all those formats, and testing whether they are supported
... don't need to be exhaustive
... ultimately we are looking to see how many passes there are for each test; we will have to explain why failures are not important

joesteele: is there an explicitly unsupported codec?

cyril: yes, I did add that to the tests
... I think I agree, I did trim the tests to keep the codecs tested to a minimum
... would like to move those MP4 codec tests to a separate test

paulc: does anyone disagree with that strategy?

acolwell: so you want to keep the tests in a separate file?

cyril: some would be kept in the tests, some might be kept in a separate file (e.g. on my website)

adrianba: not sure what the split out tests would be for
... definitely agree that breaking things down and not having all in one place is helpful
... in the end want it to be easy to analyze the results
... e.g. isTypeSupported is implemented interoperably across multiple browsers

cyril: you can give me an action to do that and I will review
... I will first define a codec that is supported and one that is not supported for each implementation, and I will test both cases for each
... what matters is that it is a supported codec

paulc: let's pop back up a level -- describing the testing you were doing

cyril: in part 3
... this work was suggested by Paul at the TPAC
... looking for coverage of the test suite
... I looked at the spec and tried to determine how many tests were needed per section of the spec, and tried to see what was there, what was missing, and what I had added during my analysis
... this is preliminary work

<cyril> https://docs.google.com/spreadsheets/d/1XKjIuGWjEaSvMkf31HiTaYYkLbyuNg4saP2Sf-iW-cs/edit?usp=sharing

cyril: there are two spreadsheets
... this is test coverage -- listing tests that are needed per section, tests available, and link to existing tests for that section

paulc: given that this is organized by section, are you going to revisit each section?

cyril: there are sections I have not scanned yet -- e.g. the audio splice algorithm -- don't know how many tests are needed there yet.
... from that perspective it is not final yet
... also I have not linked all the tests to a given section
... the first tab in that google doc is the test coverage, the second is the list of existing files in that test suite and some notes
... probably need to work on the notes

paulc: do you need help from other people to generate tests, or finish the "tested in" column first?

cyril: if folks think this is the right thing to do, need to get agreement on each test

paulc: has anyone reviewed these tests, and does anyone have comments?

jdsmith: I think this approach is solid. The intent is to identify testable assertions in the spec; we are going to find out whether we have full coverage.

<acolwell> +1

jdsmith: we have to prioritize where we have gaps. I think Cyril is doing a good job and I am impressed with what he has done. We can split things up and help him if he needs it

cyril: I would be happy to have help but I will continue on this

<acolwell> https://github.com/w3c/web-platform-tests/pull/1238

acolwell: I am also happy with what Cyril has done, and I have a pull request that I would appreciate review on

cyril: my intent was to get this to a stable state and then review that pull request as well

acolwell: this might reduce the number of tests that need to be reviewed

paulc: want to have some time to review existing bugs and discuss heartbeat
... should we schedule another meeting in a month?
... want to identify places you could use help from others

cyril: that makes sense

MSE bugs

paulc: there are 5 bugs -- Aaron do you want to discuss those bug you mentioned

bug 27649

acolwell: think we have agreement on this

bug 27599

acolwell: think we also have agreement on this -- need to reach out and get clarification

bug 27239

acolwell: need to get clarification on how to approach that


bug 27758

acolwell: Bob and I talked a while ago on this -- can he refresh?

<paulc> https://www.w3.org/Bugs/Public/show_bug.cgi?id=27758

BobLund: think we had agreement that we should have an informative reference to the track attribute specification -- the question was whether it belongs in the byte stream spec or the main specification
... there was a statement that the byte stream spec should include the track attribute reference

paulc: does your proposal fill that in for each existing item in the registry?

BobLund: yes it just says it should follow the existing media format

acolwell: I think that is fine

bug 27242

<adrianba> +1

acolwell: have not started this yet -- might be an edge case folks have not hit yet
... buffered ranges can do weird things with B-frame content
... depending on how folks interpret what is buffered in that content -- need the right test cases
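
For context, a sketch of the kind of check this would need; sourceBuffer and segmentWithBFrames are assumed to exist already:

    // Sketch: after appending media whose decode order differs from
    // presentation order (B-frames), inspect the reported ranges.
    sourceBuffer.addEventListener('updateend', function () {
      var ranges = sourceBuffer.buffered; // a TimeRanges object
      for (var i = 0; i < ranges.length; i++) {
        console.log('buffered:', ranges.start(i), '-', ranges.end(i));
      }
    });
    sourceBuffer.appendBuffer(segmentWithBFrames);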


MSE heartbeat

<paulc> I propose we include 27599 and 27649 in the heartbeat, with 27758 being optional depending on the amount of work to resolve it

paulc: I am proposing you do the bugs you said were ready, and leave to the editors whether the others are included

acolwell: that sounds fine

<adrianba> I can help with heartbeat if you like

paulc: put the folks in the CC list and CC me as well

acolwell: sure -- will coordinate with Adrian as well

paulc: next week will continue with EME -- revisit MSE in about a month


paulc: think we are done for today
... thanks Cyril for attending and all your work!
... thanks all!

Summary of Action Items

[End of minutes]

Minutes formatted by David Booth's scribe.perl version 1.140 (CVS log)
$Date: 2015-01-06 17:00:31 $
