See also: IRC log
<renato> Chair: Renato
<victor> hi
<renato> https://www.w3.org/2017/09/14-poe-minutes
<michaelS> scribe: CarolineB
<michaelS> scribenick: CarolineB
Minutes approved
<renato> https://github.com/w3c/poe/blob/gh-pages/test/test-regime.md
renato: while we wait for CR approval - we
can look at improving this test regime
... let's have a dry run internally first - using implementers in the group
benws_: where do we want to move on to
renato: so that non members can understand how to test
benws_: is what we have now enough?
michaelS: a result template would be good.
... ivan: who will combine the results into an implementation report?
They could tell us what format would be best
<simonstey> https://w3c.github.io/data-shapes/data-shapes-test-suite/#implementation-reports
ivan: we have n features that must be
implemented. The report must list these.
... a table with green and red cells is common. It shows which
implementation does which
... Implementors could provide this table themselves (including ideally
reasons why something was not implemented)
... Commercial implementors may not need to make everything public, but
we should see
renato: we have a list of features - exit criteria
ivan: we have more tests than features, so we could map features to tests - two level hierarchy
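The two-level hierarchy Ivan describes could be kept as a simple mapping from features to their tests. A minimal Python sketch - the feature and test identifiers below are purely hypothetical placeholders, not the actual ODRL exit-criteria list:

```python
# Hypothetical two-level mapping of features to test cases.
# Feature and test identifiers are illustrative placeholders only,
# not the actual ODRL exit-criteria list.
features_to_tests = {
    "Permission": ["test-01", "test-02"],
    "Prohibition": ["test-03"],
    "Constraint": ["test-04", "test-05", "test-06"],
}

# Flatten to count tests and check none is claimed by two features.
all_tests = [t for tests in features_to_tests.values() for t in tests]
assert len(all_tests) == len(set(all_tests)), "a test maps to two features"
print(f"{len(features_to_tests)} features, {len(all_tests)} tests")
```

A structure like this makes it easy to render the report table with features as the top level and test cases nested under each.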
michaelS: question re the ODRL implementation page - there are two roles, publisher and consumer - do both roles have to be covered?
renato: we can remove that - it's from a while ago
michaelS: Is it sufficient to just evaluate but not create policies?
<ivan> example report (annotation wg): https://w3c.github.io/test-results/annotation-model/all.html
renato: that is fine
ivan: here are the examples we should list
... then there is a big table that lists examples. There are many
yellows where an implementation didn't address a feature; otherwise green
michaelS: Is working on an implementation. What should we do about constraints. Should all variants be tested and shown?
benws_: that's what we do with the evaluator
michaelS: It isn't in the IM. Put it on the evaluation page?
ivan: is asking for a volunteer...
renato: Yes. We'll need list of features, broken into test cases, advice to implementors. Get table ready
ivan: understands most implementors are from this group, so we could all take part as we add bits to the table
benws_: will map features to exit criteria
renato: victor has 76 examples - map these to features?
benws_: there are 42 tests
ivan: so roughly 100 - which is manageable
michaelS: October/November is very busy, so I wouldn't be available at the right time
renato: suggests - in the short term - that
Michael and Ben work to get tests and features into one page
... on GitHub
ivan: can we use the example I have put up? Then people could fill data into it themselves.
renato: so a page with all features and test cases under each for implementors to fill in would work
ivan: Excel is good - can we use Google Docs?
<victor> I agree with Renato, I can help. I agree with Ivan, we can have a Google spreadsheet.
<simonstey> https://www.ctan.org/pkg/excel2latex?lang=en excel 2 latex fwiw
michaelS: we use Google Sheets and I have a Python script to turn them into HTML or Markdown
ivan: excellent!
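A stdlib-only sketch of the kind of conversion Michael mentions, assuming the sheet is exported as CSV; the column names and values below are invented for illustration, not the real result sheet:

```python
import csv
import io

def csv_to_markdown(csv_text: str) -> str:
    """Turn CSV text (e.g. a Google Sheets export) into a Markdown table."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, body = rows[0], rows[1:]
    lines = ["| " + " | ".join(header) + " |",
             "| " + " | ".join("---" for _ in header) + " |"]
    lines += ["| " + " | ".join(row) + " |" for row in body]
    return "\n".join(lines)

# Illustrative data only, not the actual implementation-results sheet.
sample = "Feature,Impl A,Impl B\nPermission,pass,pass\nConstraint,pass,fail\n"
print(csv_to_markdown(sample))
```

The same row data could just as easily be emitted as an HTML table; Markdown keeps the output diff-friendly on GitHub.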
<renato> FYI: ODRL V1.1 in Chinese: https://drive.google.com/file/d/0BzVxGpAd27SqMGY4NzcwMTItNTZkMy00Y2Q5LWEwOWEtNzYxNGFiOWIxYTI4/view
renato: Michael will start off with Victor and Ben (Chinese only optional)
ivan: What about the SHACL stuff?
renato: Is this only for SHACL tool users?
simonstey: idea with SHACL test cases - to
see if you really implement a feature.
... You can run your output against SHACL shapes
... useful when the implementation is not open
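As a concrete sketch of the check Simon describes, an implementation's output policy could be validated from the command line; the pySHACL tool and both file names below are assumptions for illustration - the group has not fixed a particular tool:

```shell
# Hypothetical: validate an implementation's output against SHACL shapes.
# Requires pySHACL (pip install pyshacl); both file names are placeholders.
pyshacl -s odrl-shapes.ttl policy-output.ttl
# Exit status 0 indicates the data graph conforms to the shapes.
```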
ivan: so is SHACL not part of the official / necessary testing?
simonstey: If we make it a requirement then people without SHACL capability could not test. So not mandatory
<renato> https://w3c.github.io/poe/vocab/#cr-exit
renato: the other one is vocab. there are 4 in the exit criteria
ivan: So we pick 2 independent schema
validators and run the ontology through some of the tools. Same for JSON-LD
... has the Ontology gone through Protege? (yes) so we can check it's
consistent for the first two bullet points
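For the JSON-LD bullet, even without a full JSON-LD processor a first smoke test is simply parsing the context as JSON; a minimal sketch, where the inline string is a stand-in for the published ODRL context file:

```python
import json

# Stand-in for the published ODRL JSON-LD context; in practice you would
# fetch or open the real file and json.load() it.
context_text = '{"@context": {"odrl": "http://www.w3.org/ns/odrl/2/"}}'
doc = json.loads(context_text)
assert "@context" in doc, "missing top-level @context"
print("well-formed JSON with a top-level @context")
```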
victor: I removed the test from the validator because it took 15 seconds (too long for everyday use)
<victor> Also, my OWL profiler here http://owlprofiler.appspot.com/
<victor> uses OWLAPI to check the profiles - and as a side task validates consistency
renato: So now, we get the features and test cases documented so that implementors can add their notes
<michaelS> That's the Google sheet for the implementation results: https://docs.google.com/spreadsheets/d/1I2-qht3KRjkIvwdvsfkAMq4AJ0FXjn3Wl5mH-tPe5bs/edit?usp=sharing
renato: maybe sort to list features first?
... who can supply implementation reports?
<victor> +1. The difficult part will be starting the table, it will be easy to complete it after a few "examples" are available
simonstey: looking into it - needs to check - not an implementation in a classical sense
benws_: we're very distributed so may need to massage to get the test results we want
simonstey: there may be some implementations that can ignore e.g. a missing assigner.
michaelS: will talk to Stuart and know more later
simonstey: re timings - we had issues with firewalls blocking
<victor> what about the API? well, I'll ask you by email...
renato: Michael will work with rest of team to get spreadsheet up so we can review next week
*thanks!
renato: Ben. Do you want to close the couple that are for you?
... having a meeting with directors coming up - if all goes well we can publish on the 26th
<victor> bye bye!
<ivan> trackbot, end telcon