W3C

- DRAFT -

WoT-PlugFest/Test

05 Dec 2018

Attendees

Present
Kaz_Ashimura, Michael_McCool, Ege_Korkan, Kunihiko_Toumura, Michael_Lagally, Taki_Kamiya, Toru_Kawaguchi, Tomoaki_Mizushima
Regrets
Chair
McCool
Scribe
ege, kaz

Contents

Topics
    1. Test plan update
    2. TestFest logistics
    3. CR exit criteria
    4. TD version for TestFest
    5. JSON-LD validation tool
    6. Additional meeting?
    7. Possible interoperability report Note
Summary of Action Items
Summary of Resolutions

<ege> link for the assertion tester: https://github.com/egekorkan/thingweb-playground/tree/assertionTest

<McCool> https://www.w3.org/WoT/IG/wiki/PlugFest_WebConf#Agenda_05.12.2018

<kaz> scribenick: ege

Test plan update

McCool: so let's get started
... sort out logistics for next week
... made some changes, but mostly cosmetic
... we can go to my repo to see the actual branch
... input data is needed for the report
... implementation descriptions are needed from Fujitsu and Hitachi
... it would be useful to have them before the testfest
... would be good to have that out of the way
... so email me or do a PR
... against my repo or the main repo
... Ege made some progress

<kaz> scribenick: kaz

Ege: I have been working on the TD validation tool
... and on how to test the assertions
... the new tool is here

https://github.com/egekorkan/thingweb-playground/tree/assertionTest

Ege: it runs through all the assertions
... and generates a CSV
... in the same format as McCool's report
... plus some extra information on why an assertion failed

McCool: ok
... TD being tested

Ege: directly creates the results
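
For illustration, a minimal sketch (in TypeScript, using the ajv JSON Schema validator) of how a schema-checkable assertion could be evaluated against a TD and written out as one CSV row; the column layout (ID, Status, Comment) and the assertion ID are assumptions, not the playground's actual code.

```typescript
// Sketch only: evaluate one JSON-Schema-based assertion against a TD and
// print a CSV row. Column names and the assertion ID are hypothetical.
import Ajv from "ajv";
import { readFileSync } from "fs";

interface AssertionResult {
  id: string;
  status: "pass" | "fail" | "not-impl";
  comment: string; // the "extra information why failed"
}

function checkAssertion(id: string, schema: object, td: unknown): AssertionResult {
  const ajv = new Ajv();
  const validate = ajv.compile(schema);
  if (validate(td)) {
    return { id, status: "pass", comment: "" };
  }
  return { id, status: "fail", comment: ajv.errorsText(validate.errors) };
}

const td = JSON.parse(readFileSync("td.json", "utf-8"));
// Hypothetical assertion: a TD must have a "name" of type string
const result = checkAssertion(
  "td-name-type",
  { type: "object", required: ["name"], properties: { name: { type: "string" } } },
  td
);
console.log(`"${result.id}","${result.status}","${result.comment}"`);
```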

McCool: forgot to mention this tool on the testfest logistics page
... (visits his updated-test-results branch)

https://github.com/mmccool/wot-thing-description/tree/updated-test-results

McCool: once you have a result file
... put it here

https://github.com/mmccool/wot-thing-description/tree/updated-test-results/testing/inputs/results

Kaz: can you put that instruction to the testfest page?

McCool: there is a README.md in the testing directory already

https://github.com/mmccool/wot-thing-description/tree/updated-test-results/testing

Kaz: you can add the above URL to the testfest page then

Ege: some of the assertions have problems
... some of them are a combination of multiple assertions
... in this case, both assertions have to be handled at once
... should I mark both of them as failed if either of them fails?

McCool: there is a mechanism to track the situation (parents/children)
... you can go ahead and create a new assertion which is specialized
... we need more specialized assertions

Kaz: if the parent assertion fails, the child assertions should also fail

McCool: (explains the example of the top-level assertion)
... we need all the assertions done
... good to know which assertions could be checked by the automatic tool
... maybe some of the assertions can't be checked by the automatic tool
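
As a rough sketch of the parent/child tracking McCool describes (the mapping and the assertion IDs below are made up for illustration), a parent failure could simply be propagated to its child assertions in the results:

```typescript
// Sketch only: if a parent assertion fails, mark its child assertions as failed too.
type Status = "pass" | "fail" | "not-impl";

// Hypothetical parent -> children map
const children: Record<string, string[]> = {
  "td-security": ["td-security-scheme", "td-security-in-header"],
};

function propagateParentFailures(results: Map<string, Status>): Map<string, Status> {
  for (const [parent, kids] of Object.entries(children)) {
    if (results.get(parent) === "fail") {
      for (const kid of kids) {
        results.set(kid, "fail");
      }
    }
  }
  return results;
}
```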

Ege: e.g., an idempotence test

McCool: we need to go through the test specification descriptions as well
... maybe we can put some notes there (in the test specification descriptions)
... if only part of the assertions can be tested automatically, that's fine
... once you find out which ones can be handled automatically, that would be good

Toru: question
... Panasonic has some devices like an air conditioner
... but their TDs are hand-written
... can they also be included?

Kaz: I think they can be included, given that the TD is also exposed to outside applications
... as part of a TD producer, e.g., an air conditioner in this case

McCool: yeah, so we should strike the phrase "programmatically generated"
... this description is loose enough to cover proxies as well
... consume and produce
... Panasonic currently gave me an implementation description here
... 4 devices
... all from one implementation of one code base
Toru: tx

McCool: Ege probably needs to flesh out the tool more
... some of the tests are manual

Ege: would it be OK to categorize the assertions?
... e.g., JSON Schema, network tool, and manual

McCool: yeah, we can flesh that out

Ege: also, what about other sorts of assertions?
... additional fields

McCool: there are 4 fields: pass, fail, not-impl, total
... the context column includes a contextual link
... the report is actually not in terrible shape now
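
As a rough illustration of that shape (the column layout, assertion ID, and context link below are assumed for the example, not the actual report format), a summary row could look like:

```csv
"ID","Pass","Fail","Not-Impl","Total","Context"
"td-name-type",4,1,1,6,"https://www.w3.org/TR/wot-thing-description/"
```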

TestFest logistics

McCool: can add information here

https://github.com/w3c/wot/tree/master/testfest/2018-12-online/

McCool: add information
... schedule
... webex
... any restriction for that?

Kaz: no
... anybody from the WG/IG can join the calls

McCool: Monday: Scripting webex
... Wednesday: Editors webex
... Friday: TD webex (partly)
... will work on the procedure
... and the preparation TODOs
... each organization with one or more implementations needs to submit an implementation description
... make sure all the implementations are online
... copy all TDs to the TDs subdirectory
... and the data collection procedure
... validate TDs, generating a results file per TD
... merge results files, giving one results file per implementation
... check in result files
... record any interop tests
... run npm
... (shows the resource for the "interop test" part)
... the system merges all the CSV reports to generate the table in the interop test part
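
A minimal sketch (in TypeScript) of that merge step, assuming one results CSV per TD with "ID","Status" columns; the actual tooling is driven by the npm scripts in the testing directory and may differ.

```typescript
// Sketch only: merge per-TD results CSVs into per-assertion counts
// (pass / fail / not-impl / total) for one implementation.
import { readdirSync, readFileSync } from "fs";
import { join } from "path";

type Counts = { pass: number; fail: number; "not-impl": number; total: number };

function mergeResults(dir: string): Map<string, Counts> {
  const merged = new Map<string, Counts>();
  for (const file of readdirSync(dir).filter((f) => f.endsWith(".csv"))) {
    const rows = readFileSync(join(dir, file), "utf-8").trim().split("\n").slice(1);
    for (const row of rows) {
      const [id, status] = row.split(",").map((s) => s.replace(/"/g, "").trim());
      const counts = merged.get(id) ?? { pass: 0, fail: 0, "not-impl": 0, total: 0 };
      if (status === "pass" || status === "fail" || status === "not-impl") {
        counts[status]++;
        counts.total++;
      }
      merged.set(id, counts);
    }
  }
  return merged;
}

// Example: merge everything under the results directory mentioned above
console.log(mergeResults("testing/inputs/results"));
```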

Toru: do we use Google Hangouts video for that purpose?

McCool: last time Matthias provided that
... the question is what would happen this time
... Matthias is not here
... will ask him

Ege: this is a free service, isn't it?

McCool: not sure about the participant limit for the free service
... let me look into it
... another point to mention
... penetration testing
... not really complicated
... it looks similar to Ege's network service testing
... Elena wanted to try various things
... next, the section on assertion testing
... and then interop testing

Ege: can it be automatically generated?

McCool: that's what I'm assuming
... if you can do that, let's do that
... Elena can look into Burp Suite
... we can automatically generate a configuration file
... procedure to be determined
... btw, as for the interop testing part
... record any interop tests in testing/input/interop
... next week during the scripting call, we'll continue the discussion

CR exit criteria

https://github.com/w3c/wot/blob/master/testing/requirements.md

https://github.com/w3c/wot/blob/master/testing/criteria.md

McCool: Kaz mentioned that DCAT/SSN are better examples of data model specs

Kaz: based on feedback from the call with Ralph and PLH yesterday, what we need to do is clarify the TD vocabulary and show that 2 independent implementations use the vocabulary

McCool: yeah
... there is still confusion about assertions within the draft report; to be updated

TD version for TestFest

<kaz> FYI, the diff between the published TD (Oct 21) and the current editor's draft (Nov 29) is available at: https://w3c.github.io/wot-thing-description/diff.html

McCool: which version of the TD spec should be used for the TestFest?
... essentially, we freeze the TD spec today
... if your implementation actually fails, that's OK
... we're checking the testing procedure now
... this is a snapshot as of today

https://w3c.github.io/wot-thing-description/diff.html

Kaz: fyi, the above is the diff between the published version on Oct 21 and the current editor's draft on Nov 29

McCool: ok
... let me capture the URL on the testfest page

JSON-LD validation tool

https://json-ld.org/

Kaz: another suggestion from W3M was
... we might want to look into JSON-LD WG's validator above
... Ege, is your playground based on that?

Ege: I built my playground validator from scratch

Kaz: can you quickly look into the generic JSON-LD validator at: https://json-ld.org/?

Ege: can do that

Kaz: would be helpful
... tx
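
As a quick sketch of what such a check could look like (using the jsonld npm library rather than the json-ld.org playground itself; the file name is a placeholder), one could verify that a TD at least expands without errors:

```typescript
// Sketch only: a TD that fails JSON-LD expansion is likely not valid JSON-LD.
import * as jsonld from "jsonld";
import { readFileSync } from "fs";

async function checkJsonLd(path: string): Promise<void> {
  const td = JSON.parse(readFileSync(path, "utf-8"));
  try {
    const expanded = await jsonld.expand(td);
    console.log(`OK: expanded to ${expanded.length} top-level node(s)`);
  } catch (err) {
    console.error("JSON-LD processing failed:", err);
  }
}

checkJsonLd("td.json"); // placeholder file name
```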

Additional meeting?

McCool: a small group of us is working on validation
... maybe do some more work tomorrow?

Kaz: let's do that during the Monday meeting

McCool: first hour on Monday?

Kaz: 1pm on Monday in Europe

Ege: can make it

Possible interoperability report Note

Kaz: btw, the feedback from the W3M members included that the interoperability part of the draft implementation report doesn't have to be part of the official implementation report
... on the other hand, it would be useful for implementers to publish it as part of a separate interoperability test report, as a WG Note

[adjourned]

Summary of Action Items

Summary of Resolutions

[End of minutes]

Minutes formatted by David Booth's scribe.perl version 1.152 (CVS log)
$Date: 2018/12/06 02:58:16 $