Meeting minutes
(Some more photos from the WoT Meeting Day 1 are available online (Member-only))
Opening
<kaz> Slides
(self introduction around the table)
<McCool shows the agenda Day 1>
McCool: we will also talk about AI. Wonsuk Lee from ETRI will join for this topic.
Use Cases and Requirements
Use Cases
<kaz> Slides
<Tomoaki Mizushima shows slides>
Mizushima: there are many use cases for the IoT
… we ask stakeholders to fill out the use case template
<kaz> Use Case Template
<TM shows a use case template>
… use cases are not limited to W3C members
… template has a simple structure
… we ask about the use case background and the specific problem statement
<TM shows the process for how Use Cases are created in the use case repository>
Mizushima: if you have a question, you can create an issue
Ege: TD TF wants to use the Use Case process to turn worthy issues into use cases
… we had 3 trials so far
… tomorrow in TD session there will be background
McCool: use cases are always the first step of the process to implement new features
Sebastian: is there a clear border between a small feature request and a bigger concept request?
McCool: I have some slides about the kind of process
Requirements
<kaz> Slides
<McCool shows slide #3>
McCool: the default process is that we have a UC description, from which we derive requirements that are then reflected by work items
McCool: there is a requirement format
… user story template is "As a PERSONA, I want CAPABILITY so that PURPOSE."
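The template on the slide could be instantiated like so; the persona, capability, and purpose values below are invented for illustration, not taken from the slides:

```python
# Hypothetical instantiation of the user story template from the slides.
TEMPLATE = "As a {persona}, I want {capability} so that {purpose}."

story = TEMPLATE.format(
    persona="smart-home user",
    capability="to discover nearby Things automatically",
    purpose="I do not have to configure each device manually",
)
print(story)
```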
McCool: what is the difference between functional vs. technical requirements?
… functional is about "Why"
… technical is about "What"
<McCool shows a user story example; slide 6>
<... and Requirement examples; slide 7>
McCool: We assign some categories such as Privacy, Cloud Integration, etc.
… special cases are Security and privacy requirements
… those are generally to mitigate "risks"
McCool: last slide shows a suggested plan
… we should expand the requirements section in the use cases and requirements document to define requirements and connect them to use cases
… we should also keep everything simple
McCool: we have time for questions
Ege: for me it is not enough to provide a detailed description of a requirement. We need a good understanding of its impact on, e.g., the TD
McCool: we are not consultants.
McCool: there is a problem that the provided use cases are not very detailed
<Zakim> EgeKorkan, you wanted to talk about the need for use case if one can write a requirement like in the example
DavidE: I'm confused about the slide 7. Not sure what case 3 has to do with "Profile"?
<McCool shows the WoT Security and Architecture Spec 1.1>
… there are places which list the different stakeholders
<kaz> WoT Security - 3.1 WoT Primary Stakeholders
Kaz: I agree with McCool's approach
… for use cases, we need to clarify the stakeholders first and then the need
McCool: I will make an alternative suggestion tomorrow
RobA: there is a difference between use case and use case scenario
… how to handle connection to ontologies?
… and language models
McCool: it would be great if you can provide links of good use cases samples
RobA: there is Spatial Data on the Web WG as a good example
Kaz: tx for your input, Rob! For this WoT Charter period, it is very important to handle input from other SDOs and W3C groups as use cases and requirements. So let's continue joint discussion :)
McCool: tomorrow we have a joint meeting
Refactoring
<kaz> Slides
McCool: the document that caused the most discussion is the Architecture document
… whether it should be informative or normative
… readers go to it in the beginning to understand WoT and also to understand the organization of the other documents
… there are also high level assertions that cannot be tested for the implementation report
… we have to find a solution for the assertions. Should we remove them or move them somewhere
… about half of the assertions are about security or privacy and other half are about TD/TM and discovery
… we can migrate them to the relevant documents
… there are also use cases in it that belong in the use cases document
… the minimum work is to move the assertions
… any discussion on this?
<kaz> WoT Architecture 1.1 Implementation Report
Ege: implementers do not follow the assertions in any way so there is no danger in removing them altogether
McCool: we have implementation experience
Ege: but those were not implemented by following the spec, they happened to satisfy those
McCool: we have proven their implementability already so they can be moved around
<Zakim> EgeKorkan, you wanted to say that implementers are not aware of the assertions (features) anyways
Sebastian: on one hand, I would say that we should not change a running system
… on the other, with a 2.0 charter, we have the opportunity to make bigger changes
… requirements are in architecture somehow, which does not make sense
Kaz: this proposal slide is ok. To reach consensus, we need to check the assertions in the document in detail
McCool: who can do it though?
Kaz: maybe we should have a basic resolution on this direction, and then think about who, how, when to do the changes as the next steps.
McCool: or we do nothing, simply
Kaz: going to the TR track, we will need to explain again during wide review why Architecture needs to be a Rec Track document.
McCool: we have to evaluate them one by one and they can move into the TD since that is the only document we have committed to publish as REC
Ege: I am fine with going through them one by one but I strongly think that implementers are not aware of them nor did they read those assertions when implementing
McCool: that is not true. They submitted implementation experience
Ege: no, I sat with implementers and "forced" them to submit pass/fail in the CSV. They did not read the spec when building their implementations
Mizushima: Architecture is an important document. It is important to explain the relationship between the documents
McCool: It will stay and will be like an explainer
<McCool> proposal: Adopt the plan to convert Architecture to an informative document, with existing assertions to be evaluated and removed or moved to other normative documents.
Sebastian: we have to follow the formalism.
<sebastian> +1
<McCool> proposal: Adopt the plan in https://
Ege: since some assertions are about scripting runtimes, we should remove the word "normative"
RESOLUTION: Adopt the plan in https://
ACTION: Kaz to talk with PLH about the proposed change of the Architecture
Discovery
<kaz> Slides
McCool: I got pulled into other work and did not have time to work on the discovery spec
McCool: we can postpone the discovery work until TD 2.0 is ready.
… in the meantime, we can publish errata for the current document and allow submission of TD 2.0 instances
Sebastian: I would not like to have two specs that contradict each other. We should check that Discovery supports TD 2.0
McCool: we can do that with errata and keep the API etc.
Zoltan: I wanted to mention that we (Scripting API) have a dependency on the Discovery spec. There are no algorithms in Discovery, etc. If there will be no changes to the spec, the Scripting API can go ahead and be flexible in specifying a JS API for discovery
McCool: at least for the short term, we can do this
Kaz: agree with McCool. We do not have to work on Discovery until we get use cases and requirements
<McCool> proposal: Defer work on Discovery until the next charter, but publish errata if necessary to address ambiguity or refinements to work with TD 2.0 or the Scripting API.
<sebastian> +1
RESOLUTION: Defer work on Discovery until the next charter, but publish errata if necessary to address ambiguity or refinements to work with TD 2.0 or the Scripting API.
Liaisons
OPC Foundation
<kaz> Slides
Sebastian: OPC UA is used in manufacturing environments
… the key concept is that we have shopfloor devices. They typically speak different protocols
… OPC UA collects this data and presents it in its own ecosystem
… if you have everything at the OPC UA level, it is easy to develop applications.
… however, there are non-OPC-UA devices and the mapping of their datapoints is manual
… for some months now, there has been a new spec from OPCF that uses TDs to onboard non-OPC-UA devices into OPC UA environments
… with this spec, TD is part of it
… we have started the discussion about northbound
… we want to define an official opc ua binding within the OPC Foundation as a new working group
… we have biweekly meetings. Next is on October 8th
… W3C members can simply join
… we plan the OPC UA binding to be the first external binding for the WoT 2.0 registry
… there will be more information at the WoT week at the end of November
ETSI ISG CIM
<kaz> Slides
McCool: They are doing NGSI-LD
… the group works on context information management
… NGSI-LD is the deliverable of interest.
… we have been meeting for the past 6 months. Now there are regular meetings since we have a liaison
… some NGSI-LD entities can be seen as WoT Things. So we would need a binding
… linking to entities between two systems would also make sense
… if you have heard of FIWARE and want to see its relation to WoT, please join the meeting
https://
McCool: we run the NGSI-LD meetings so if you are a member, you can just join
AIoT
<kaz> Federated learning CG
Wonsuk: we want to present our proposal for a federated learning API and a new TF
Sungpil: here is the motivation of our work
… security and privacy is a critical concern in centralized machine learning
… there is strict regulation of collection and processing of private data, mandated by GDPR
… centralized ML requires significant resources as well
… these can be mitigated by doing federated learning
… it can learn from data on the client side, so the data is not transmitted
… and less burden on the server
<McCool> (aside: this may be interesting: https://
Sungpil: you can see the architecture and the sequence diagram for the interactions
… server can update the aggregated parameters and make a new global model
… this can happen cyclically, thus many times
… we identified that some WoT use cases are relevant for FL
… there are some issues we identified as well
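The parameter-aggregation cycle Sungpil describes (clients train locally, the server averages their parameters into a new global model, and the cycle repeats) could be sketched as federated averaging. The one-parameter model, client data, and learning rate below are invented for illustration:

```python
# Illustrative sketch of federated averaging (FedAvg): each client trains
# on its local data and sends only model parameters; the server averages
# them into a new global model, so raw data never leaves the client.

def local_update(weights, data, lr=0.1):
    # Hypothetical one-step gradient update for a 1-parameter model y = w*x.
    w = weights[0]
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return [w - lr * grad]

def fed_avg(client_weights):
    # Server-side aggregation: element-wise mean of client parameters.
    n = len(client_weights)
    return [sum(ws[i] for ws in client_weights) / n
            for i in range(len(client_weights[0]))]

global_model = [0.0]
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]  # each client's private (x, y) data

for _ in range(50):  # "this can happen cyclically, thus many times"
    updates = [local_update(global_model, data) for data in clients]
    global_model = fed_avg(updates)

print(round(global_model[0], 2))  # → 2.0 (the data follows y = 2x)
```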
McCool: we can define a common interface via thing models
… for example entering a low power mode
Sebastian: we should discuss this in a use case and possibly as an AI session
… we can also invite microsoft
McCool: this can be a CG topic
Ege: we can organize something in a next meeting
Wonsuk: this was one technology. We should consider other AI technologies as well
… we should create a TF to collect more information
McCool: indeed. FL is one Thing. There are other technologies to consider
Sebastian: this can be also a plugfest topic
McCool: the reason I said CG is because it seems like an incubation topic
Kaz: let's start further collaboration at the WoT CG first, then Use Cases and Requirements at the WoT IG, then concrete standardization at the WoT WG :)
McCool: we should close the meeting and make dinner plans
JSON-LD/WoT Joint Call
Sebastian: starting joint meeting between WoT and JSON-LD WG
Sebastian: let's start with introductions
DavidL: digital bazaar
Gregg: indep, json-ld
Hirata: hitachi, wot wg
Jan: ie wot, research univ, constrained devices, coap, SDF
Kaz: team contact w3c
McCool: intel, iot, ai, accelerated computing, wot co-chair
Koster: iot, wot co-chair
RobA: sdw co-chair, ogc, json-ld to link to openapi specs
Ted: OpenLink Software. W3C WG/CG/XG/IG involvement since 200x. Primary coding language is english
Mizushima: use case tf of wot
Sebastian: wot co-chair, siemens
Josh: ignite, retail
Brian: ignite retail
Eric: Legendary Requirements
Benjamin: digital bazaar
Toumura: hitachi
Ben: social wg
Rigo: w3c team ld, legal counsel
Current Status of JSON-LD
Sebastian: update on status of topics relevant to WoT
<Rob-OGC> bigbluehat lets take this offline for a deep dive... some initial work https://
Gregg: have some slides we have been using in other meetings...
Sebastian: please
<bigbluehat> JSON-LD Star https://
<bigbluehat> CBOR-LD, YAML-LD, and the JSON-LD Recharter: https://
Gregg: the JSON-LD CG has been working on RDF-Star
… a notable change: rather than quoted triples, we have now added an indirection through a reifier
… slide 10
… we now have an rdf:reifies relationship
… JSON does not have special syntax for quoted triples
… so we use the @triple tag
Sebastian: so there is an additional annotation
Gregg: it is an opaque term that has the form of a triple; you can talk about it without claiming it is true
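A minimal sketch of what such a reified, non-asserted statement might look like; the keyword spellings (@triple, rdf:reifies) follow the discussion as scribed and should be checked against the JSON-LD CG drafts, and the ex: terms are invented for illustration:

```python
import json

# Sketch of an annotated ("quoted") statement using the reifier indirection
# Gregg describes: @triple names a triple without asserting it, and
# rdf:reifies links an annotation node to it. All ex: terms are made up.
doc = {
    "@context": {
        "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
        "ex": "https://example.org/",
    },
    "@id": "_:claim1",
    "rdf:reifies": {
        "@triple": {                 # the quoted triple: not claimed to be true
            "@id": "ex:sensor1",
            "ex:temperature": 21.5,
        }
    },
    "ex:confidence": 0.9,            # a statement *about* the quoted triple
}
print(json.dumps(doc, indent=2))
```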
McCool: summary is syntax exists, but we need to understand the use cases
McCool: my understanding is that we would have to register all the contexts to be used with a registry maintained by the JSON-LD group
… most compressed would be hand-rolled
… but can't support change over time
Sebastian: but also means a generic CBOR decoder would not work
Benjamin: however it does operate in a constrained environment, for example it can work in barcode readers, and CBOR is small enough to be encoded in QR codes, etc.
McCool: having a set of contexts in the registry has other benefits, eg. we can specify prefixes
Gregg: they also need to be immutable
McCool: imo they should be anyway
<kaz> CBOR-LD
Gregg: you can still use contexts that are not in the registry, they are just not as compressed
McCool: are there any limits in the number of things in the registry?
Gregg: terms are in the map, which is constructed dynamically
… only contexts need to be registered, numbers are assigned to terms dynamically
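A toy illustration (not the real CBOR-LD algorithm) of why registering contexts enables compression, per Gregg's description: the context URL collapses to a registry number, while term codes are assigned dynamically on first use. The registry entry and its number are hypothetical:

```python
# Hypothetical registry entry: a registered context URL maps to a small
# integer. The number 17 is invented for illustration.
CONTEXT_REGISTRY = {"https://www.w3.org/2022/wot/td/v1.1": 17}

def compress(doc):
    # Terms get small integer codes assigned dynamically, as Gregg says:
    # "only contexts need to be registered, numbers are assigned to terms
    # dynamically". Real CBOR-LD encoding differs in many details.
    term_ids, next_id = {}, 100
    def encode(value):
        nonlocal next_id
        if isinstance(value, dict):
            out = {}
            for k, v in value.items():
                if k == "@context":
                    out[0] = CONTEXT_REGISTRY[v]  # registered context -> small int
                    continue
                if k not in term_ids:
                    term_ids[k] = next_id         # dynamic term numbering
                    next_id += 1
                out[term_ids[k]] = encode(v)
            return out
        return value
    return encode(doc), term_ids

compact, table = compress({
    "@context": "https://www.w3.org/2022/wot/td/v1.1",
    "title": "Lamp",
    "properties": {"status": {"type": "string"}},
})
```

An unregistered context would have to be carried inline as a string, which is why Gregg notes such documents are "just not as compressed".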
McCool: any gotchas in mapping to CBOR, e.g. internationalization strings?
Gregg: not that I'm aware of, although there are things in CBOR that don't map to JSON
Sebastian: what is the process, an email?
Benjamin: we plan to set up a registry with its own process to add things
… will include statement of immutability, and a hash
McCool: but we can still version, right?
Gregg: yes, using versioned urls is fine
Benjamin: there is also YAML-LD, which should also be equivalent
… and actually JSON is a subset of YAML syntax... technically
McCool: I also assume the new syntax for JSON-LD-Star will work with CBOR-LD, etc.?
Benjamin: yes, can use some reserved keyword slots
McCool: for WoT, technically we define an information model with serialization as JSON-LD, so we could just add additional serializations
Sebastian: comments?
Benjamin: comments don't go into the graph by default; they are allowed in YAML but ignored in conversion to other formats
Benjamin: there is some future work for language tags, etc., but those are very YAML-specific and could not be pasted into JSON
Sebastian: so you will recharter in 1-2 years, right?
Gregg: yes, but gated behind RDF-Star, and at least 3mo after that
McCool: we *could* publish an appendix that just adds a serialization, rather than re-opening the TD spec.
Canonicalization
Sebastian: we need to trust that the TD is correct; we would like signed TDs...
Gregg: sounds like you would want a VC
Gregg: and yes, that uses RDF canonicalization
McCool: issue is that we would like to work on constrained devices
Benjamin: JWT could be used...
McCool: what we were thinking, but would specify prefixes
Benjamin: uses JCS...
McCool: we did look but realized we needed to do a bit more work e.g. for default values
Gregg: might be able to use selective disclosure
McCool: was thinking that we could sign just the template slots as a key-value map plus a link to the TM
Benjamin: in VC we talk about envelope proofs and embedded proofs
… data integrity would include a proof, can be read as JSON
… whereas an envelope takes work to unpack
Rigo: when you say we want to confirm signatures on constrained devices, you want to look at the use cases
<Zakim> rigo, you wanted to talk about limitation when creating value chains from sensors
rigo: instead it may be better to do it on the controller
McCool: it's also possible we can avoid the issues that cause RDF processing to blow up
<rigo> certainly wanted to insist that at the borderline to the further value chains, there should be a gate for semantification so all metadata is disambiguated and can be used down the pipe
McCool: although... we would also have to constrain all extensions also
RobA: in the context of registering contexts
… we are exploring building a number of contexts
… which are combined
… is it possible to have a "federation" of registries?
McCool: are you thinking about a hierarchy of registries?
RobA: maybe, but also need a way to do testing, etc.
Kaz: probably WoT needs to think about use cases and requirements for this
kk: want to think about constraints, ad-hoc might be ok, but it would be good to see how well the current algorithm works on current input even with no changes
… suspect it would work ok on TD and follow the fast path
… as long as you are not using funny extensions
Benjamin: a lot of that second level is to prevent things like graph poisoning
McCool: generally the cases that cause problems are very hard to express in JSON
… in summary, probably a non-problem
Special Topics
Sebastian: a few things we have run into
… Ege and Mahda are really the main contacts, but they are not available right now
… my understanding is there is strange behaviour with @type and type
… type comes from JSON schema
… when we convert to triples, type and @type get merged
Gregg: @type serializes to rdf:type
Gregg: think what you are seeing is that type is being treated as an alias for rdf:type
… inheriting a term definition for type
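The clash Gregg describes can be made concrete with two context fragments; the explicit IRI below is invented for illustration and is not from the TD spec or the JSON Schema ontology:

```python
# Many JSON-LD contexts declare "type" as an alias for "@type" (i.e.
# rdf:type), so a JSON Schema "type" member gets folded into rdf:type
# when the document is converted to triples.
aliasing_context = {"type": "@type"}

# One conceivable way to keep the meanings apart: map "type" to an IRI
# of its own. This IRI is hypothetical.
explicit_context = {"type": {"@id": "https://example.org/json-schema#type"}}

doc = {"type": "string"}  # JSON Schema usage of "type"
# Under aliasing_context, doc["type"] would become rdf:type;
# under explicit_context, it stays a distinct property.
```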
McCool: is there a way to force it not to be confused?
Gregg: maybe
<sebastian8> https://
McCool: if the JSON Schema ontology is defining type in terms of @type
Benjamin: sounds like it was a relatively recent change that caused the bug
<bigbluehat> w3c/
Sebastian: comment from mahda seems to imply it is a recent issue
Benjamin: it may work but may not round-trip like you expect, but it's not wrong per se
Sebastian: maybe we can look at what DID and VC are doing?
Benjamin: maybe, but is not exactly the same
Sebastian: next step is to clarify with mahda what is going on, then ping you
Benjamin: sure
Gregg: sure
Sebastian: next, JSON Schema for validating TD instances
… but currently there is no normative reference we can use
Sebastian: JSON Schema is like a "living" standard; they have their own SDO
… trying to figure out how we can reference it
Gregg: there are other cases of "community standards" that can be referenced
… it's not a W3C standard, but can be considered a community standard, may be other cases
McCool: I will say that the approach we have taken is redefining what we need and avoiding a direct dependency
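McCool's "redefine what we need" approach could be sketched as a tiny validator supporting only the keywords a spec actually relies on; the chosen keyword subset (type, required, properties) is illustrative, not what the TD spec actually redefines:

```python
# Minimal sketch of redefining a needed JSON Schema subset in-spec,
# avoiding a normative dependency on the "living" standard.
TYPES = {"object": dict, "array": list, "string": str,
         "number": (int, float), "integer": int, "boolean": bool}

def validate(instance, schema):
    # Supports only "type", "required", and "properties" as an example.
    if "type" in schema and not isinstance(instance, TYPES[schema["type"]]):
        return False
    if isinstance(instance, dict):
        if any(k not in instance for k in schema.get("required", [])):
            return False
        for key, sub in schema.get("properties", {}).items():
            if key in instance and not validate(instance[key], sub):
                return False
    return True

td_schema = {"type": "object", "required": ["title"],
             "properties": {"title": {"type": "string"}}}
assert validate({"title": "Lamp"}, td_schema)
assert not validate({}, td_schema)            # missing required member
assert not validate({"title": 42}, td_schema) # wrong member type
```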
Benjamin: that is not an unreasonable approach
… note that there is another thing called JSON Types
<Rob-OGC> can that redefinition be a common profile we can re-use - or do we have to re-specify each time?
Benjamin: there is a danger with governments etc. not accepting community specs
Sebastian: that means we should keep it as an informative reference
Benjamin: yes, I would recommend that, and also talk about how to stay aligned
… OpenAPI has a similar issue
<Rob-OGC> OpenAPI 3.1 matches the latest JSON Schema version - 3.0 had a local version
McCool: and honestly we'd like to stay aligned with OpenAPI, but there is not a common spec we can refer to
Benjamin: issue is a lot of them are volunteers with no funding, and could not afford the W3C processes
… could also reference OpenAPI's dialect
McCool: the real question is "do we need to do all this work" and the answer seems to be "yes"
Benjamin: I think you are fine with what you are currently doing
<Rob-OGC> yes
Benjamin: we have seen a hash of the context file published in a TR; the context is then frozen and does not have to be dereferenced
McCool: nice
<kaz> Verifiable Credentials JSON Schema Specification
RobA: OGC has also invested in OpenAPI 3.1, which is aligned with a formal release of JSON Schema
… tomorrow we will talk about how we down-compile for compatibility
Kaz: we need to do some more survey on these resources, and at that time we will need your help again
<Rob-OGC> +1
[adjourned]