DID Working Group F2F, 2nd day — Minutes

Date: 2020-01-30

See also the Agenda and the IRC Log

Attendees

Present: Markus Sabadello, Brent Zundel, Yoshiaki Fukami, Eugeniu Rusu, David Ezell, Ivan Herman, Christopher Allen, Tobias Looker, Justin Richer, Ganesh Annan, Kenneth Ebert, Drummond Reed, Manu Sporny, Phil Archer, Michael Jones, Kaliya Young, Joe Andrieu, Oliver Terbu, Jonathan Holt, Charles Cunningham, Joachim Lohkamp

Regrets:

Guests: Juan Caballero, Oskar van Deventer, Eric Welton, Carsten Stöcker

Chair: Brent Zundel, Daniel Burnett

Scribe(s): Markus Sabadello, Brent Zundel, Oskar van Deventer

Content:


Ivan Herman: Meeting slides: https://tinyurl.com/didwg-ams2020-slides

Ivan Herman: (going over logistics of afternoon activity)

Brent Zundel: Yesterday we got a lot done, we have more today; hopefully leading to some resolutions
… This morning we will talk about interoperability: what it means, and at what levels; hopefully we come to a decision on what kind of interoperability we want
… Then we talk about interplay between interoperability and extensibility
… After that we work on JSON PRs
… Then we talk about next steps for the spec
… During lunch I will present on ZKPs
… After that we will have a boat tour on the canals
… (Going over dinner logistics)

Daniel Burnett: Everyone type present+ if you’re participating in the meeting; just being logged in to IRC is not enough
… Still looking for scribes for tomorrow afternoon sessions

Brent Zundel: Mike and Manu will present

1. Levels of Interoperability

Ivan Herman: See slides in the deck.

Manu Sporny: Mike and I will run through some things to understand about interop; often we don’t define what it means
… Will go over types of interop, then Mike will go over how you get to those levels of interop
… Kinds of interoperability: Not necessarily a layered thing, just different types of interop
… First option: No interop at all (we wouldn’t be here if we wanted that)

Daniel Burnett: Making this statement is very important.

Manu Sporny: As we discussed yesterday, this confuses the market and everybody suffers
… Next step: Interop at the data model layer, you got some abstract data model, people agree on types of things we want to write down in the spec.
… This is not ideal, there are different ways of implementing and deploying
… We’d like something better than that
… Next layer: You interop at the data model layer, and you interop on some basic syntax layer
… This is where discussions about JSON, JSON-LD, CBOR, etc. come in
… In VC work, we achieved data model interop, and basic syntax interop
… Then there are different kinds of interop that you can mix in there, nice to have
… Interop on extension mechanism. One aspect of a good spec is to have ways for people to use it for their own use cases (via extension mechanisms).
… Interop on canonical form. Is there a canonical form of the data model, is there a stream of bytes?
… Abstract canonicalization and syntax-based canonicalization. Do you always serialize it in the same way
… Syntax-based canonicalization is interesting if you want to do digital signatures.
… You have a data model, and there is ONE way to serialize it.
… Interop on behavior. Not just data model and syntax, but the spec also defines what you can do with it.
… People do something with the data, and you get the same result
… With the DID Core spec, we expect CRUD operations for all DID methods, we expect interop there.

Christopher Allen: I think there is a missing section: What are the forms of cryptography that are being used?
… When we did our Wyoming interop project, couldn’t find a combination of curve, signature suite, canonical method, selective disclosure that worked consistently across different implementations.

Manu Sporny: Any other kinds of interop?

Kaliya Young: At IIW there was an SSI stack diagram of different layers and protocols at each layer

Ivan Herman: Interop at the user interface level, and the tools that are being used.
… We will have tools that implement DIDs, and there will have to be some conceptual interop between those tools

Juan Caballero: is this oliver’s stack diagram from IIW that Kaliya mentioned? https://medium.com/decentralized-identity/the-self-sovereign-identity-stack-8a2cc95f2d45 ?

Juan Caballero: https://miro.medium.com/max/1227/14zUczSBaVH-8qilvK4nKwQ.png

Ivan Herman: Out of scope for this WG, but some other group may look at this

Kaliya Young: https://medium.com/decentralized-identity/the-self-sovereign-identity-stack-8a2cc95f2d45

Phil Archer: One of the ways you will share DIDs will be via optical QR codes. How will people know what the QR code will do? You need some visual indication

Christopher Allen: Related to that is the wire forms. Do you support QR codes, do you use UDP, what transports do you use.

Michael Jones: Interop on protocol messages (showing SSI stack diagram)

Manu Sporny: Mike will now talk about how to get to those levels of interop

Michael Jones: Will talk about experience how you can get implementations of specs to interoperate with each other
… This is informed by the process we used for OIDC, JWT, etc
… In the early beginning, you can get together at IIW, or virtually, etc., and you try to use your implementations together in the ways that the spec intends
… E.g. manu wrote a relying party, and I wrote an identity provider. Now we can try if they can communicate successfully at all.
… As you do this, you will find some things that work, and some things that don't work. Either you did things differently, or your implementations are incomplete
… This gives you data what works and what doesn’t, and you can improve things.
… The next level could be creation of a test suite that attempts to test some known parts of the protocol, and you run your implementation against the test suite. Again you learn what works and what doesn’t. You also learn about bugs in the test suite. Then you iterate and fix things.
… Next level is you take a test suite, and you morph it into a formal certification tool, where a WG defines a set of tests. If you successfully test all of them, you get a certification mark.
… In our OIDC work, we got a lot of feedback, the end result is that certified implementations tend to work together much more seamlessly
… Difference: In the interop test suite, you can implement the parts you like and leave out the rest. OTOH to reach a certification level, you have to implement a minimum bar.

Daniel Burnett: Regarding certification program: For most WGs I have been in, that has been out of scope of the WG, it better fits into an industry consortium. It is highly political.
… It’s one thing to say “the spec contains a list of features”, it’s another thing to say “your implementation has failed”.

Michael Jones: Yes there are different certification models.
… One is third party certification. You pay a large amount of money to an independent organization to configure and run the code
… E.g. Microsoft got a SAML certification, product got better because we did that
… In OIDC we said that we wanted certification, but using a third party is a lot of work and costly
… Instead we tried another model: self-certification. You run the tests yourself and submit all logs for public inspection. It’s kind of like the open-source model. Not everyone will read my logs, but the point is that everybody could read them
… Another aspect is making legal statements if a third party certifies

Manu Sporny: Certification is something that W3C has traditionally not done. DIF has more recently said that they are interested in running certification programs. W3C would produce specs, and a different body would do certification

Michael Jones: Virtuous cycle between testing and specification work. Interop testing can expose implementation bugs as well as spec bugs.
… Developers will get involved early and often, they can tell us what is good and what is bad
… Interop testing can expose all of that. You can interop, test, fix the code, fix the spec.
… Ideally you do several rounds of this. Do a snapshot of the spec, get developer feedback, change spec until it’s interoperably implementable.
… What do we want to test to improve interop?
… Test the MUSTs and SHOULDs
… Test the positive cases (the paths that work), but also at least as important to test the negative cases (what does your implementation do if it encounters bad input)
… Example from the SAML work: in a specific situation, more than 50% of implementations accepted a bad signature. All the positive cases worked, but the negative cases "worked" too

Markus Sabadello: (laughter)
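The SAML anecdote can be illustrated as a negative-case test: verification must fail on a tampered signature, not only succeed on a good one. A minimal sketch (HMAC stands in for the actual SAML signature scheme, which is not shown here):

```python
import hmac
import hashlib

def sign(key: bytes, msg: bytes) -> bytes:
    # Produce a MAC over the message (stand-in for a real signature suite)
    return hmac.new(key, msg, hashlib.sha256).digest()

def verify(key: bytes, msg: bytes, sig: bytes) -> bool:
    # Constant-time comparison against a freshly computed MAC
    return hmac.compare_digest(sign(key, msg), sig)

key, msg = b"secret", b"payload"
good = sign(key, msg)
assert verify(key, msg, good)                      # positive case: valid signature accepted
assert not verify(key, msg, b"\x00" * len(good))   # negative case: bad signature must be rejected
```

A test suite that only exercises the positive case would never have caught the implementations that accepted bad signatures.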

Michael Jones: Make sure your implementation raises an error and logs it
… You want to test uses of the extensibility points. Deployments will often specialize their implementations by using the extensibility points. You want your tests to check that nothing breaks if extensibility points are being used.
… Example: In OAuth, any request value you don't understand must be ignored. This lets you do many extensions where you send extra request parameters. You need test cases that inject arbitrary additional values and check if implementations successfully ignore them.
… If there is a state machine in your specification where you can progress between states, you want to have tests that the succession of states implemented by the code is the expected succession.
… E.g. if you do a write operation, then you do a read operation, test that you get the correct result.
… Regarding burn’s point about political aspects of certification: You want test definitions to be public. They correspond very closely to the specification. There have to be objective reasons why the tests exist (ideally pointing to a specific part of the spec).
… You keep everything in an entirely binary pass/fail basis. All have to run the same tests to get the same certifications. No exceptions are made for dominant player in the market. Even if you have something in production, you may not get certification.
… If you do or do not include a test, that is the result of a working group consensus decision. The WG represents the industry, it serves as a check and balance that the tests are fair and complete.
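The "must ignore unknown values" rule and a fuzz-style test for it can be sketched as follows; the parameter names are illustrative, not taken from any actual OAuth implementation:

```python
import json

def parse_request(raw: str) -> dict:
    # Keep the parameters the implementation understands; silently ignore
    # anything unrecognized (the OAuth-style extensibility rule)
    KNOWN = {"client_id", "scope", "redirect_uri"}
    params = json.loads(raw)
    return {k: v for k, v in params.items() if k in KNOWN}

# Test: inject an arbitrary extra parameter and check that it is
# ignored rather than treated as an error
base = {"client_id": "abc", "scope": "openid"}
fuzzed = dict(base, x_vendor_extension="whatever")
assert parse_request(json.dumps(fuzzed)) == base
```

An implementation that instead rejects requests containing unknown parameters would break every extension built on that extensibility point.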

Daniel Burnett: I worked for a company to make sure it was 100% compliant.
… W3C is not an enforcement body. It depends on what your industry wants and needs. A third party is better as an enforcement body.
… Reliability of self-certification is not always as good as it sounds, even if all results are public.
… We define a data format and not a protocol. We cannot stray too far into how our document formats are used. We had a similar situation in the VC WG.
… As we get further along, we have to keep this in mind.
… The IETF used to say: be strict in what you issue, be liberal in what you accept. The IETF doesn’t say that anymore. It now says you also have to be strict in what you accept. Otherwise implementations get lazy, things don’t work, and security issues follow.
… Agree that it is important to test negative cases.

Michael Jones: In a data format, there are things you can test and things you can’t test. Example: In DID document, some things have to be present, some things have to follow a certain format. You can test this.
… Lazy example: In JWK, key values are encoded as base64url strings. It turns out that Google used to publish their keys in base64 encoding, not base64url encoding.
… Some implementations that read the keys worked (they accepted the value even though the encoding was wrong), other implementation didn’t work.
… Google eventually fixed their keys, this was a success of the certification program.
… Example of the OIDC Financial API: they require certification every 6 months. When that was run by OBEI, they gave certifications with footnotes, saying that bank X implemented most of it but got client authentication wrong. They were certified, even though something didn’t work.
… Market interest said everybody passed, even though they were not interoperable
… Later this was changed to not use footnotes; it either works or it doesn’t.
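The base64 vs. base64url pitfall from the JWK anecdote can be reproduced in a few lines. This is an illustrative sketch of strict versus "lazy" decoding, not actual JWK-handling code:

```python
import base64

def decode_b64url_strict(s: str) -> bytes:
    # Strict base64url decoder: reject characters from the plain-base64 alphabet
    if "+" in s or "/" in s:
        raise ValueError("not base64url: found standard-base64 characters")
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

def decode_lax(s: str) -> bytes:
    # A "lazy" decoder that silently accepts either alphabet
    return base64.urlsafe_b64decode(
        s.translate(str.maketrans("+/", "-_")) + "=" * (-len(s) % 4)
    )

# A key value published in plain base64 rather than base64url, as in the anecdote
wrong_encoding = base64.b64encode(b"\xfb\xef\xff").decode()  # contains '+' and '/'
assert decode_lax(wrong_encoding) == b"\xfb\xef\xff"  # lax decoder accepts it anyway
try:
    decode_b64url_strict(wrong_encoding)
except ValueError:
    pass  # strict decoder correctly rejects the wrong encoding
```

The lax implementations "worked" against the mis-encoded keys while the strict ones failed, which is exactly the interop split the certification program surfaced.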

Justin Richer: You can run the test suite yourself without going through certification program.
… Newer versions of the test suites are more easily packaged; you can run them in a local dev environment. This is very important. All test suites must be open source and publicly available.
… Regarding extension points, testing those will be very important to this group in particular, given lots of discussion about extensibility.
… If you’re a JSON-LD processor and you see something in the doc you don’t understand, you have to react correctly.
… The Java test suite doesn’t use an OIDC library, since you have to test bad things (bad signatures, etc.) to make sure that things are exercised fully. This is easier to do without a library. Just write custom tests for everything.
… Regarding public availability of logs, this is very valuable. If you look at logs of Azure test suite, they pass, but only on single tenancy, therefore Microsoft passes. Other tenancies are different.

Daniel Burnett: A company may get certified for something they built, but the product they sell may be something different

Ivan Herman: One point about certification: W3C has had several problems where it developed things that turned out to be in competition with what members wanted to do for money.
… W3C long ago had a browser implementation. Members didn’t like that because they wanted to have their own business around that.
… How does the spec CR phase fit into this picture? In this testing phase, formally the emphasis is different: When we test in CR, we need to submit a report to the management. Without that we cannot get to a recommendation.
… The primary function of the CR phase is testing. W3C CR goal is to test the spec, not the implementations
… Make sure the spec is “implementable”. Must have at least two independent implementations.
… From W3C point of view: Brave, chrome, vivaldi, etc, don’t count as separate browser implementations.
… We have to report that every feature is implementable, proven by concrete implementations.
… Agree with Justin_R’s point about self-reporting. From the W3C point of view, if someone comes in and produces tests, it helps them in the public image. But it’s not about public image, it’s about understanding that we did the right thing with the spec, and the basic approach is to trust the submitters of those reports that their claims are correct.

Christopher Allen: I have history here with SSL/TLS. Three intertwined problems: Early on you couldn’t request a certificate unless the code had security reviews.
… We looked at code for max. 8 hours. If we didn’t find any problems we let it pass.
… We usually found serious problems within an hour or two.
… We got feedback that we can’t allow >50% of people to fail.
… Eventually, everybody ended up using only openssl.
… There were non-free implementations that came with support
… Tragedy of open source. Tragedy of the free. Security certification is harder than data certification.
… TLS 1.3 should have been done in 2003. Got done in 2018.
… Keep in mind that once we do DID 1.0, there may not be an opportunity to do DID 2.0

1.1. Discussion

Brent Zundel: Let’s discuss do we want to have interop. How much?

Michael Jones: Regarding centralized vs. diversity. We intentionally made the barrier to certification low enough to make it available also to open source.
… In OIDC, over 100 certified implementations.

Brent Zundel: Which of the kinds of interop do we want

Joe Andrieu: Missing topic is method interop, we should talk about whether we want that

Tobias Looker: what do you mean by that

Joe Andrieu: We don’t have any DID method specification under our control. So we can’t test/certify compliance of DID methods
… Many conversations I’ve heard: We are going to use one DID method. This doesn’t mean it will interoperate with another DID method.

Daniel Burnett: Let’s go over the types of interop and get opinions on which ones we want
… Interop on data model?

Markus Sabadello: (everybody raises hand)

Daniel Burnett: Interop on data model and basic syntax?

Markus Sabadello: (everybody raises hands)

Daniel Burnett: Interop on extension mechanism?

Markus Sabadello: (most people raise hands)

Daniel Burnett: Any concerns with that?

Christopher Allen: Extension mechanisms, what parts? Maybe some yes, others not

Daniel Burnett: Interop on canonical form?

Michael Jones: If there is a canonical form, should we interop?

Manu Sporny: In VC we defined JWTs and LD signature. So people interop’ed on different things. Is that interop or not?

Daniel Burnett: Interop on cryptography?

Drummond Reed: What does this mean?

Brent Zundel: If two implementations agree on the signature type, one can produce it, the other can verify it

Daniel Burnett: Anyone disagrees?

Markus Sabadello: (nobody disagrees)

Tobias Looker: The data model needs to be able to express the type of signature

Daniel Burnett: Is there any other aspect of interop we need to ask about?

Michael Jones: Cryptographic algorithms are extension points. New ones will get added, old ones will be removed.

Daniel Burnett: Anyone disagrees?

Markus Sabadello: (nobody disagrees)

Christopher Allen: How many people feel extension mechanisms are a very orthogonal aspect? I feel that’s the case

Joe Andrieu: To the extent crypto is an extension point, we may have no interop except between implementations that support the same extensions. Are there any crypto suites that we will test for interop? Is there a subset of the extensible world that we want to test on?

Tobias Looker: E.g. in JWT there is a subset of crypto suites that must be supported, but extensions are still possible.

Daniel Burnett: Let’s try to get some more high level comments about other items

Manu Sporny: Interop on behavior, user experience, transport, protocol are out of scope.
… Test MUSTs and SHOULDs, the ones in the spec apply to data model, not behavior. But there is also a gray area.

Brent Zundel: We also specify how methods get implemented.

Daniel Burnett: Because we specify how methods need to work, therefore there will be some behavioral statements that are testable
… Interop on user experience?

Markus Sabadello: (agreement this is out of scope)

Justin Richer: even if it’s not in scope to specify and test, it’s going to influence every decision we make. What’s available to the user, how the user is presenting things. We all bring our personal biases about what kinds of interactions are available.
… We have assumptions based on what we’re building

Daniel Burnett: There may not be testable interop on user experience, but we absolutely must consider user experience.
… Reaching edge cases that require much discussion.

Daniel Burnett: Trying to get additional quick results

Ivan Herman: Based on the experience of other groups, as soon as we have a stable draft, we need to start working on test suites. This takes more time than you think.

Manu Sporny: A test suite exists. A framework is there, but tests are outdated.

Daniel Burnett: Much more to discuss, but we got some agreement on basic statements.

2. Extensibility and Interoperability

Ivan Herman: See slide in the deck.

Daniel Burnett: Chairs believe we’ve had enough of this conversation to now move on to the big topic.
… JSON, JSON-LD, abstract data model come down to extensibility and interoperability.
… Now we can start a discussion on this key topic. That’s why the next topic is “extensibility and interoperability”
… manu has some proposals that result from various conversations with WG members
… manu presenting this is an attempt to record something that the group might be able to agree to
… This is not what manu wants, but what manu thinks we can agree on.
… (short break now)

Manu Sporny: We are going to attempt two proposals based on conversations on these topics
… The purpose is to try to lower the tension
… Proposal #1
… I believe everyone was on board to have an abstract data model that can be cleanly represented in JSON, JSON-LD, CBOR. There will be some graphical depiction of the abstract data model.
… The extension mechanism in the spec will be a registry. People have been asking for it. The registry will have to be managed by the Working Group (or maintenance group). This means more process.
… Levels of interop we want to get to: We want to have interop between JSON and JSON-LD
… In order to add an entry to the registry, an extension author has to provide a spec AND a JSON-LD context, in order to keep compatibility with the JSON-LD world. This does NOT mean you have to use @context
… With this proposal, you have the option to not use @context and be purely registry-driven

Justin Richer: what do you mean by “use @context”?

Daniel Burnett: There is a JSON format and a JSON-LD format. The specification for the JSON-LD format has to provide context information
… If you do JSON-only there is no @context, there is nothing that looks like JSON-LD
… If you want to add something to the registry (if you want to add a field), you also have to add context information, just like the original specification has to add context information
… This means everything has a context, but no requirements are imposed on JSON-only
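Under this proposal, converting between the two representations would amount to adding or dropping @context. A hypothetical sketch; the context URL and the document fields are illustrative, not normative:

```python
# Illustrative DID context URL (assumed for this sketch)
DID_CONTEXT = "https://www.w3.org/ns/did/v1"

def to_json_ld(did_doc: dict) -> dict:
    # JSON-LD representation: same data, with @context prepended
    return {"@context": DID_CONTEXT, **did_doc}

def to_plain_json(did_doc: dict) -> dict:
    # Plain-JSON representation: same data, with @context dropped
    return {k: v for k, v in did_doc.items() if k != "@context"}

doc = {"id": "did:example:123", "authentication": ["did:example:123#key-1"]}
assert to_plain_json(to_json_ld(doc)) == doc  # round trip is lossless
```

This matches Ivan's question later in the session: the two serializations carry the same information and differ only in the presence of @context.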

Manu Sporny: We thought through the details, we’re quite certain this can work. There may be details, but in general the shape looks okay.

Tobias Looker: Does this apply to extensions of the core?

Manu Sporny: Applies to all extensions (properties; not DID methods)

Tobias Looker: In JOSE, people can extend but not formally register. JSON developers can just add what they want and not register it.
… Would that not be allowed in the DID document?

Manu Sporny: I think we should make that decision later. It’s important.
… One path: This is for all extensions. Other path: Be more loose about it. Either way, we can figure it out later

Michael Jones: I wanted to ask: 1. Does it matter if the abstract model is “graphical” or “textual”?

Daniel Burnett: It means there will ALSO be graphical representation, in additional to textual abstract model.

Manu Sporny: This will be helpful to a group of people

Michael Jones: Main question: Mike Lodder’s pull request proposes to add a method name to the top level. Does this break the JSON-LD model?

Manu Sporny: This model allows it.

Oskar van Deventer: I’d like to understand governance of the proposed registry. Who will allow/reject registrations?

Drummond Reed: There was really good discussion at W3C TPAC about registries
… This looked encouraging

Manu Sporny: This will take some time

Drummond Reed: Regarding yesterday’s three-tier model. Does this mean there are only two layers (only registry-based extensions, no decentralized extensions)?

Manu Sporny: In the JSON model, you have to use the registry. Future decision: Can JSON-LD people do extensions without using the registry?

Drummond Reed: I think there’s a way to allow both. Registry-based and decentralized extensibility.

Manu Sporny: This proposal attempts to achieve the three tiers (did core, registry, decentralized extensibility)

Drummond Reed: But the registry is not decentralized?

Ivan Herman: If I take the same DID information and put it into JSON and JSON-LD, are the two equal except for the presence of @context?

Manu Sporny: Yes

Ivan Herman: When we define the shape of the JSON, it has to make sense in the JSON-LD as well, even though the user doesn’t have to care
… JSON schema could be used to define that

Samuel Smith: It seems there is an asymmetry here; we have an abstract data model, then we have three syntaxes, but then it says there has to be a context
… Instead of saying JSON, JSON-LD, CBOR, we should distinguish between semantic encodings and non-semantic encodings

Drummond Reed: JSON-LD would be one representation

Samuel Smith: You should talk about RDF as the data model, and JSON-LD as an encoding

Phil Archer: Adding to a registry… This proposal looks like an arbitrary starting point. To get an extension in the registry, what does it mean? Does it reserve the namespace or does it go further? What’s the governance model, what’s the test suite, what’s the IPR, etc?

Michael Jones: Respond to Oskar’s question, what are the rules for adding something to the registry. The registry includes instructions to the experts with rules what should be added.

Samuel Smith: +1 phila we should be abstract in the semantic model as well to enable other encodings besides json-ld

Michael Jones: E.g. in JWT registry, one rule is a new claim must not duplicate functionality that’s already covered by another claim
… Regarding manu’s question we decide later whether JSON-LD extensions will go into a registry. Absolutely.

Drummond Reed: Drummond reminds himself that his queue turn is to suggest that the registry should also handle DID method names.

Michael Jones: The registry is documentation, not just reserving names

Samuel Smith: This allows a true semantic overlay that is a semantic model not merely a json-ld encoding of that abstract model

Drummond Reed: Drummond also reminds himself about the point that JSON and JSON-LD and CBOR are not the only potential representations.

Markus Sabadello: I had a similar question as drummond
… does the open world extensibility need to go in the registry? If not, does that mean only the JSON-LD would be able to function with that extension?

Daniel Burnett: As chair, I have mixed opinions. The next proposal will be more controversial. I want to get to agreement on something.
… manu it’s up to you, should people see the second proposal?

Drummond Reed: Drummond also reminds himself that his third point is to add support for decentralized extension to this proposal.

Joe Andrieu: How does versioning happen? Does the JSON-LD context update to a new version?

Manu Sporny: JSON-LD contexts will work the same way as they work today.

Drummond Reed: I think the proposal is a good starting point. Some of us will think there is a clear enough path to the third tier toward decentralized extensions. There may be evolutionary pressure to support that.
… Even though we call out three representations, we are committed to an abstract data model. Every extensibility model should be expressible in all concrete formats.
… Governance of the registry is important. I think we should explore if we also want to apply this same mechanism to method names.

Samuel Smith: +1 method names

Tobias Looker: The way you use URIs, how DID documents reference each other, I want that to be consistent. I don’t want to use different URI formats depending on the DID document format (e.g. have JSON pointers in a DID URL)
… I want to be able to consistently link to keys, for example.

Drummond Reed: Drummond reminds himself to talk about lossless conversion between representations

Manu Sporny: Agree with tplooker, that should be a goal, would be surprised if anyone argued otherwise
… SamSmith: Regarding your point to have an abstract data model for syntax, and an abstract data model for semantics. I think this will add complexity to the specification that we probably don’t need to be successful.
… E.g. with JSON-LD, anyone who uses it is aware that the underlying semantic model is RDF, so whoever works with RDF already can also express it in Turtle and other RDF serializations. We could go into details in the DID Core spec, but let’s be pragmatic

Daniel Burnett: Concern with RDF + Registry - is not specific enough
… need specific JSON-LD
… Registry replicates what we are already doing in the spec
… Registry gives us interop
… anyone can privately extend anyway, one cannot stop that
… however, one is not interoperable if not in registry
… We could also be more strict

Michael Jones: Versioning question
… increasing set of version numbers is antipattern
… one wants to mix and match
… Cf Dan’s comment about interop
… want the same language in PRs as in OAuth: “a field that you don’t understand you must ignore”
… this maintains interop in case of private extensions

Ivan Herman: “Sam is right, but …”
… if you want to do it right, one should also use, e.g., OWL or the like to define things properly
… don’t underestimate the anti-Semantic-Web forces and anti-RDF forces
… would be interested in separate note on semantic web, on which they can build

Daniel Burnett: Agenda - THIS IS OUR MAIN TOPIC, we’ll continue to lunch

Ganesh Annan: Proposal is incomplete and not feasible to fill in the holes
… Can we agree on something this incomplete?

Samuel Smith: Semanticweb not for us
… We should add method names to the registry
… this should resolve namespace collisions
… do something similar for interop with methods

Justin Richer: I would agree with “a registry” not “the registry” but that’s a detail

Daniel Burnett: Maybe add methods to a registry, but not the same one that we are currently discussing for extensions

Drummond Reed: test - if we have this registry what would be the constraints?
… talked with folks who made PRs
… start from requirements, and puts these into core spec
… could you define/require lossless conversion between representations?

Brent Zundel: +1 to lossless conversion between formats

Drummond Reed: that would make things a lot easier for DID Controllers (authors) to produce DID documents in multiple representations
… Lossless conversion would be the test
… this could be done even decentralised

Justin Richer: +1 to lossless conversion, I think it’s required (modulo signature verification)

Markus Sabadello: Can a decentralised, non-registry solution also allow interoperability?
… or can the JSON-LD @context achieve the same?

Manu Sporny: Group should look down the list and identify objections
… Let’s drive to agreement

Kenneth Ebert: Are ALL extensions done via JSON-LD context, or are there also others?

Manu Sporny: That is the fall-back position, requiring EVERYONE to go via the registry?
… Would this take away the JSON-LD extensibility mechanism?

Christopher Allen: Not part of this proposal

Drummond Reed: Here’s a counterproposal: any extension designed for lossless conversion MUST use the registry mechanism, but decentralized extensions are still allowed.

Brent Zundel: Let’s focus on proposal.
… Disagreement on point 1)?

Ivan Herman: With my proposed modification

Daniel Burnett: Five hands up for worries

Ivan Herman: Then the slide needs to include the shape constraint

Tobias Looker: URI, different serialisation should not cause different but equivalent …

Drummond Reed: Text is changed. Lossless conversion covers every constraint. AT LEAST JSON-LD and CBOR

Michael Jones: I don’t understand what shape constraint means, so please delete that sentence from the slide?

Manu Sporny: Shape constraint = “there must be loss-less conversion”

Michael Jones: Can you then sign the data?
… Then please write “lossless”, not “shape”

Daniel Burnett: OK, changed, new formulation of 1)
… any further concerns, no? Consensus, hurray!!!

Justin Richer: VC document has sentence “process of serialisation has to be lossless and bidirectional”
… This limits stuff from JSON-LD and CBOR

Justin Richer: From VC core spec: The process of serialization and/or de-serialization has to be deterministic, bi-directional, and lossless.
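The quoted VC requirement can be stated as round-trip properties. A minimal sketch using plain JSON, with illustrative field names (the real conversion would cover JSON-LD and CBOR as well):

```python
import json

def serialize(doc: dict) -> str:
    # Deterministic: fixed key order and separators, so the same document
    # always yields the same byte sequence
    return json.dumps(doc, sort_keys=True, separators=(",", ":"))

def deserialize(s: str) -> dict:
    return json.loads(s)

doc = {"id": "did:example:123", "service": [{"type": "agent"}]}
assert deserialize(serialize(doc)) == doc                        # lossless, bi-directional
assert serialize(deserialize(serialize(doc))) == serialize(doc)  # deterministic
```

Determinism is what makes signing feasible, which is why Mike asked earlier whether you can sign the data under the "lossless conversion" constraint.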

Daniel Burnett: Lots of head-nodding on Justin_R’s remark
… any concerns?
… Alternative formulation on screen

Samuel Smith: Can’t depend on semantic, or constraints on @context of JSON-LD?

Daniel Burnett: Sorry, too detailed, wanted to pursue headnods

Michael Jones: Process point, agreement on 1), not further wordsmithing

Daniel Burnett: Procedural …
… concerns about 2)?

Jonathan Holt: jonathan just virtually nodding his head

Markus Sabadello: Not the ONLY extension mechanism

Tobias Looker: Extension of the CORE SPEC will be administered by the registry
… all the CORE attributes go to the single registry
… versus decentralised extension mechanism

Daniel Burnett: Not too many nods
… Let ONLY manu do text editing in the slide, please
… who has concerns on 2)

Phil Archer: How are multiple registries linked to each other?

Michael Jones: One per type; they are disjoint

Drummond Reed: But that is not what the wording says

Phil Archer: “Typed registry”?

Daniel Burnett: Registry mechanism is used for extensions (no singular/plural)

Drummond Reed: Want that, but don’t want to exclude others from doing decentralized extensions

Daniel Burnett: Anyone can provide additional proposal, anyone can make a PR

Drummond Reed: Does this exclude the PR mechanism?

Michael Jones: Add the word “interoperable”

Michael Jones: The registry mechanism is the one that will be used for interoperable extensions

Joe Andrieu: can we clarify and say properties? so we know we’re not talking about other things?

Juan Caballero: and DID methods?

Markus Sabadello: same concern, it sounds like you NEED to use the registry

Joe Andrieu: Wording is incorrect; this is about extensibility for everything, that isn’t the proposal

Tobias Looker: Same as Markus, use a registry mechanism, proper documentation for JSON-only developers
… We can agree about CORE extensions
… We can also include other extension mechanisms

Jonathan Holt: method to model the “expressivity” of deterministic extensibility

Daniel Burnett: lots of hands up
… start with Joe’s point

2.1. property extensibility

Joe Andrieu: Presents
… Essence is properties, what they mean, and their extension
… Make clear that we are talking about properties, not matrix parameters

Justin Richer: I interpret this as core properties
… Method extension is a different issue
… we should be specific about properties, but let’s look also at other things
… pairwise agreement is not spec interoperability
… former is DIF universal resolver
… need ways for the latter
… There should be rules on how to handle unrecognized fields: ignore, allow, error, …

Daniel Burnett: Queue is not moving well, please be brief
… one minute each

Manu Sporny: Reminder that this is a hold-your-nose proposal; no perfection needed, just general agreement about direction

Brent Zundel: Focus is joe’s topic, properties

Drummond Reed: This is about extensions of the DATA MODEL
… Extensions to the core? Core=core, as defined by spec. Extensions != core

Michael Jones: Unless we delete “properties”, I don’t hold my nose. Other extensions, other than properties.
… please delete “properties”

Ivan Herman: Properties AND possible values

Markus Sabadello: Type of extensibility relates to purpose of document as a whole. Is it minimalistic DNS-like?
… Or is it like a WebID profile? E.g. extensions for publishing a list of friends

Michael Jones: If you qualify the kind of extensions, then there could be extensions that don’t need registry.
… ONLY registry for interoperable extensions

Manu Sporny: Delete “Properties”

Daniel Burnett: Objections? Yes …

Joe Andrieu: OK with getting rid of “property”
… But then how to address Markus’ and Drummond’s points?

Daniel Burnett: Opening scope?

Manu Sporny: No, focus is issue 2) on the slide

2.2. registry extension mechanism

Joe Andrieu: Markus wants (also) decentralised extension mech

Drummond Reed: Agree with …, but it should not exclude decentralized extensions (i.e. non-registered ones)
… Disagree if 2) disallows decentralized extensions

Justin Richer: Issue with universality without/with the word “property”

Joe Andrieu: Registry mechanism is how we publish interoperable extensions
… Use that for those; it does not exclude non-published solutions for extensions

Markus Sabadello: What goes into my DID document is controlled by ME. However others seem to see this differently. Interop issue

Michael Jones: Key word is INTEROPERABLE extension. Private agreements can always happen. However, the registry is for global interop
… Interop between .. out of time

Christopher Allen: Globally interoperable via the registry, but smaller communities should not be impeded from extending

Brent Zundel: Add “globally” interoperable

Markus Sabadello: +1, add “globally”
… not sure whether that is good enough? “mandatory”?

Michael Jones: Please delete “property and possible values”; let us leave that at provisional consensus

Manu Sporny: Updated the statement

Daniel Burnett: Any objections?

Joe Andrieu: Want to have the word “published”
… withdrawn

Brent Zundel: Why “in general”?

Manu Sporny: Wiggle room

Markus Sabadello: Add “allow other extension mech”?
… lossless conversion?

Oskar van Deventer: Consensus about number 2), hurray!

Daniel Burnett: Consensus, hurray, thank you!

2.3. proposal 1.3

Daniel Burnett: Number 3), any objection?
… several objections

Michael Jones: Objecting to singular “registry”

Manu Sporny: change to “registry mechanism”

Justin Richer: Clarifying, “managed” means what goes in or out?

Manu Sporny: “governed”?

Justin Richer: Don’t want to redo IANA

Ivan Herman: Replace W3C groups by just “governed by W3C”

Daniel Burnett: Not accepted

Michael Jones: Clarification, WG should govern registry, and delegate day-to-day operations to designated experts of W3C staff

Daniel Burnett: Seems agreement about that notion

Drummond Reed: WE define the governance, that defines who/what/…

Joe Andrieu: In response to ivan –> want “DID WG” governs …

Manu Sporny: updated …

Tobias Looker: Process carried out by …
… DID WG reserves the right to manage the registry

Daniel Burnett: Not agreed

Ivan Herman: DID WG ceases to exist, cannot mandate maintenance WG, cannot load work on that. ONLY W3C can do this long term, though W3C will also delegate to, e.g., dedicated groups or experts.

Phil Archer: +1 to Ivan

Drummond Reed: Registry mechanism governance will be defined by the DID WG

Samuel Smith: +1 drummonds suggestions

Daniel Burnett: Agreed

Manu Sporny: Includes Drummond’s text

Joe Andrieu: Thanks, addresses my point and ivan’s

Michael Jones: Like this, simpler

Phil Archer: Separate document or in this spec

Daniel Burnett: That is detail, not now

Drummond Reed: Will follow from W3C policy

Brent Zundel: Any objections to 3)?

Daniel Burnett: Hurray, consensus!

2.4. proposal 1.4

Daniel Burnett: Start discussing item 4)
… any objections?
… many objections

Justin Richer: Providing “a” specification is too one-to-one
… how about JSON-LD reachability of @context
… compatibility between producers and consumers, not just about types; who is writing and reading

Phil Archer: Governance defined by group, delete 4)

Manu Sporny: Is technical interop

Ganesh Annan: No wording; nothing is enforcing that the JSON-LD context is correct or valid, so what does that mean? Valid for what?

Daniel Burnett: add “valid” –> agreed

Michael Jones: Seconding Justin’s first point: “must provide references to specs for each new attribute”

Daniel Burnett: Agreed!

Markus Sabadello: Is about “entries” in general

Samuel Smith: We are mixing between specs. DID Core is an abstract data model, which is different from the semantic model as expressed by JSON-LD

Drummond Reed: Getting semantic AND syntactic interop, no proposal

Brent Zundel: Compatibility between formats?

Daniel Burnett: I see objections

Michael Jones: “Producers or consumers” –> AND

Brent Zundel: Any further objections to 4)

Joe Andrieu: Producers and consumers, the language does not flow here. It is about compatibility between serialisations, not consumers and producers

Markus Sabadello: Same point, “entry”?

Drummond Reed: To add an entry to the registry, wording could be better

Ganesh Annan: entry versus entries, plural?

Manu Sporny: fixed

Daniel Burnett: editorial

Samuel Smith: Compatibility between JSON and JSON-LD is not our goal, but semantic interoperability. There should be an @context

Daniel Burnett: problem with presentation mode, use Google Docs correctly, please

David Ezell: In general, if you give too many reasons then it is harder to get consensus (so remove reasons)

Daniel Burnett: (slight confusion about the numbering on screen)

David Ezell: Lot of “reasons”, focus on pure wording

Justin Richer: Confused about “for example”; it changes the rest of the sentence
… more than ONLY semantic interop

Manu Sporny: changed

Daniel Burnett: change agreed

Michael Jones: Delete ALL the reasons, only what, not why

Joe Andrieu: Replace JSON-LD @context with lower-case context

Daniel Burnett: Not agreed

Drummond Reed: Let’s use terminology consistently, “representation”

Daniel Burnett: No agreement on last line?
… Any further objections on 4)?

Joe Andrieu: Serialisation and producers/consumers are not siblings, remove producers/consumers

Markus Sabadello: We want lossless conversion, can we repeat THAT language as explanation of compatibility?

Joe Andrieu: Delete producers/consumers

Daniel Burnett: Not agreed

Justin Richer: Important that producer and consumer considerations are different. Proposes “lossless conversion …”

Manu Sporny: correcting

Daniel Burnett: Any objections?
… Hurray, we have consensus on item 4!!

2.5. formal resolution

Manu Sporny: Here’s the proposal #1 we’ve all seemed to agree to:

  1. The DID Core specification will define an abstract data model that can be cleanly represented in at least JSON, JSON-LD, and CBOR. There will also be a graphical depiction of the abstract data model. There must be lossless conversion between multiple syntaxes (modulo signatures and verification).
  2. In general, the registry mechanism is the one that will be used for globally interoperable extensions.
  3. The governance of the registry mechanism will be defined by the W3C DID Working Group.
  4. Extension authors must provide references to specifications for new entries and a valid JSON-LD Context to be associated with each entry to ensure lossless conversion between serializations for both producers and consumers. This is partly being done to ensure semantic interoperability.
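Items 2 and 4 of the proposal can be pictured as a registry entry carrying a spec reference plus a JSON-LD context, such that a JSON-LD consumer and a plain-JSON consumer can convert losslessly between representations. A minimal sketch; all property names, URLs, and helper functions here are hypothetical illustrations, not part of the proposal:

```python
# Hypothetical registry: per item 4, each extension entry must reference
# a specification and a JSON-LD context.
registry = {
    "exampleProperty": {
        "spec": "https://example.org/example-property-spec",
        "context": "https://example.org/contexts/example-property/v1",
    },
}

# A plain-JSON DID document using the registered extension property.
plain = {
    "id": "did:example:123",
    "exampleProperty": "some value",
}

def to_jsonld(doc):
    """Add the core context plus the registered context for each
    extension property present in the document."""
    contexts = ["https://www.w3.org/ns/did/v1"]
    contexts += [registry[k]["context"] for k in doc if k in registry]
    return {"@context": contexts, **doc}

def to_plain(doc):
    """Drop the @context to recover the plain-JSON representation."""
    return {k: v for k, v in doc.items() if k != "@context"}

# Lossless round-trip between the two serializations.
assert to_plain(to_jsonld(plain)) == plain
```

The registry lookup is what makes the conversion mechanical: because every registered entry names a context, a consumer never has to guess which @context a given extension property belongs to.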

Joe Andrieu: Compliments to manu

all: applause

Manu Sporny: That is proposal #1.

Daniel Burnett: Voting on IRC

Markus Sabadello: Is this agreed for the spec?

Daniel Burnett: No this is not yet spec text, but it was agreed at DID WG

Joe Andrieu: It shall be noted that this is a hold-your-nose agreement

Daniel Burnett: We are saying, by agreeing here, that the working group accepts this proposal at this time

Manu Sporny: I have a proposed text

Daniel Burnett: Voting via IRC, observers don’t vote
… any suggestions for changes to manu’s PROPOSAL wordings?


Joe Andrieu: “#1”

Proposed resolution: Proposal #1 above is agreed to by the Working Group. Further proposals may modify the baseline proposal above. (Manu Sporny)

Drummond Reed: +1

Brent Zundel: +1

Ganesh Annan: +1

Tobias Looker: +1

Kenneth Ebert: +1

Joe Andrieu: +1

David Ezell: +1

Christopher Allen: +1

Oliver Terbu: +1

Justin Richer: +1

Phil Archer: +1

Michael Jones: +1

Manu Sporny: +1

Samuel Smith: +1

Daniel Burnett: +1

Ivan Herman: +1

Eugeniu Rusu: +1

Resolution #1: Proposal #1 above is agreed to by the Working Group. Further proposals may modify the baseline proposal above.

Daniel Burnett: No objections –> RESOLVED, hurray!

Samuel Smith: Strong consensus!

Daniel Burnett: No discussion now, let’s congratulate ourselves first

all: Applause!!

Samuel Smith: Compliments to burn

Joe Andrieu: Plus thanks for the whole day!

all: Again applause

2.6. proposal #2

Ivan Herman: See slide in the deck.

Daniel Burnett: Have Manu present proposal #2, so all can have at least seen it
… Any objections to that approach?

Christopher Allen: Manu may need time to incorporate #1 into #2

Manu Sporny: already done
… Proposal #2 is parallel/additional to #1!
… Line 1 is same

Daniel Burnett: manu, please read all out

Manu Sporny: Reading proposal #2
… No need for registry and maintenance of registry

Daniel Burnett: manu, please clarify rationale

Manu Sporny: Proposal #1 has a weak technical argument. Benefit of #2 is decentralized extensibility: no need for governance, addresses many requirements for extensibility, without registry process overhead

Justin Richer: This feels heavily biased to JSON-LD
… this direction feels OK, but then spec becomes JSON-LD spec
… that should be made explicit

Samuel Smith: +1 Justin

Phil Archer: This contradicts previous proposal?

Samuel Smith: Not technically equivalent

Ivan Herman: indeed, is alternate proposal

Phil Archer: feels strange

Michael Jones: Making things easier for the WG? That’s not what matters. Ease for developers is more important.
… This would hurt implementation, is not developer friendly

Samuel Smith: Broad Industry Adoption is hurt by a JSON-LD specification

Daniel Burnett: manu proposed #2 to replace #1

Michael Jones: Proposal number #2 would impose a substantial unnecessary burden on developers

Drummond Reed: I understand what this is about, purely political thing, JSON-only PRs
… those who want to see the abstract data model in multiple representations may find this too JSON-LD centric

Markus Sabadello: Biggest extension is in the wording itself, “in general”, “could be”, language

Christopher Allen: I believe that #2 is equivalent, but politically unsound
… We can remedy that by reframing some words
… We just don’t mention JSON-LD by name, but use only wording pointing the registry in that direction

Jonathan Holt: ipld://bafyreiauq2tulhnkrktu6brs4jfe472cvgbf2gvvmd6gjjzxy2lyedrmyq

Jonathan Holt: item in array, IANA

Samuel Smith: JSON-LD centric serves one community well, and others poorly

Joe Andrieu: I don’t like registries
… Don’t see how JSON-only developer can extend this

Michael Jones: Registries per #1 are for global interoperability, #2 makes it dead in the water

Daniel Burnett: This is START of conversation, we have limited time to discuss. Please talk about it offline with each other please!

Michael Jones: Per #1, the registries are there to enable global interoperability. Without the registries, you don’t have that. Therefore, #2 is dead in the water.

Daniel Burnett: and please seek common ground, or at least something for common agreement
… after break other items on the agenda

Manu Sporny: Registries are not the only way to get global interop

Michael Jones: But registries are a proven, developer-friendly means of achieving interoperability

Daniel Burnett: Any other issues?


3. Resolutions