This is an archived snapshot of W3C's public bugzilla bug tracker, decommissioned in April 2019. Please see the home page for more details.

Bug 25618 - Extensibility: Offer spec-blessed ways to extend the algorithms and curves, rather than monkey-patching the spec
Status: RESOLVED FIXED
Alias: None
Product: Web Cryptography
Classification: Unclassified
Component: Web Cryptography API Document
Version: unspecified
Hardware: PC All
Importance: P2 normal
Target Milestone: ---
Assignee: Ryan Sleevi
QA Contact:
URL:
Whiteboard:
Keywords:
Depends on:
Blocks: 25198
Reported: 2014-05-09 00:13 UTC by Ryan Sleevi
Modified: 2014-10-22 20:39 UTC
CC: 11 users

Description Ryan Sleevi 2014-05-09 00:13:32 UTC
To avoid monkey-patches to the Web Crypto spec, it should have defined extension points for how additional algorithms can be implemented.

This is akin to the Structured Clone ( http://www.whatwg.org/specs/web-apps/current-work/multipage/common-dom-interfaces.html#safe-passing-of-structured-data ), which includes language to the effect of 

"If input is an object that another specification defines how to clone
Let output be a clone of the object as defined by the other specification."

E.g., consider the comments from the W3C TAG review ( https://github.com/w3ctag/spec-reviews/issues/3#issuecomment-41521737 ), which note the problem with monkey patches ( http://annevankesteren.nl/2014/02/monkey-patch )

Possible places where defined extension points are needed:
- JWK "alg" handling
- Named Curves
  - The string name (for JWK)
  - Debate on enum vs string (is it monkey patching the enum)
  - The ASN.1 encoding/decoding rules
- Hash algorithms used in signatures
  - How to invoke the underlying hash algorithm
  - The string names (for JWK)
  - The ASN.1 encoding/decoding rules
- Import/Export Key
  - ASN.1 handling of the algorithm OID
  - JWK "alg" handling

This is "conceptually" encapsulated in the spec already with the notion of "registered algorithms". However, there have been issues raised about the confusion of the language. As much as possible, the spec should be clear on exactly how algorithms are resolved and extensions implemented.

A successful resolution of this bug will be ensuring that one can, in a way that does not alter the Web Crypto spec, define an entirely new spec that adds additional algorithm(s).
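The structured-clone-style deferral the description asks for can be sketched as follows. This is a hypothetical illustration only: every name here (`registeredAlgorithms`, `extensionSpecifications`, `normalizeAlgorithm`) is invented for the sketch and is not text from the spec.

```javascript
// Core spec: a table of registered algorithms it defines itself.
const registeredAlgorithms = new Map([
  ["aes-gcm", (alg) => ({ ...alg, name: "AES-GCM" })],
]);

// Extension point: other specifications hook in here instead of
// monkey-patching the table above.
const extensionSpecifications = [];

function normalizeAlgorithm(alg) {
  const desired = typeof alg === "string" ? { name: alg } : alg;
  const normalize = registeredAlgorithms.get(desired.name.toLowerCase());
  if (normalize) return normalize(desired);
  // "If input is an algorithm that another specification defines how to
  //  normalize, let the result be as defined by the other specification."
  for (const spec of extensionSpecifications) {
    if (spec.recognizes(desired.name)) return spec.normalize(desired);
  }
  throw new Error("NotSupportedError: unrecognized algorithm " + desired.name);
}
```

With such a hook, a separate spec defining, say, a new curve-based algorithm would only push an entry onto the extension list; the core text never changes.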
Comment 1 vijaybh 2014-05-10 06:29:22 UTC
Just wanted to express wholehearted support for this. In particular, the ability to add support for additional elliptic curves in an extension spec, instead of monkey-patching them on, seems very important given the current level of interest in that area.
Comment 2 Mark Watson 2014-07-28 15:51:35 UTC
Specifically regarding new elliptic curves, presently, each of the existing algorithms is associated with a single definition of the key material on which it is based, with serializations of this keying material being independent of any other algorithm parameters.

If we are to add additional curves to the existing EC algorithms, we break this pattern. This might involve quite extensive re-working of some aspects of the specification.

Would it make sense to model different families of elliptic curves as separate algorithms? So, the existing algorithms become ECDSA-NIST etc., and we can add ECDSA-NUMS in a separate specification?
Comment 3 Trevor Perrin 2014-07-28 16:00:10 UTC
(In reply to Mark Watson from comment #2)
> 
> Would it make sense to model different families of Elliptic Curves as
> separate algorithms ? So, the existing algorithms become ECDSA-NIST etc. and
> we can add ECDSA-NUMS in a separate specification ?

That's how I planned to do 25519, FWIW (add algorithms named CURVE25519 and ED25519, or perhaps ECDH-CURVE25519 and ECSCHNORR-ED25519).

Trevor
Comment 4 Ryan Sleevi 2014-07-28 18:55:26 UTC
(In reply to Mark Watson from comment #2)
> Specifically regarding new elliptic curves, presently, each of the existing
> algorithms is associated with a single definition of the key material on
> which it is based, with serializations of this keying material being
> independent of any other algorithm parameters.
> 
> If we are to add additional curves to the existing EC algorithms, we break
> this pattern. This might involve quite extensive re-working of some aspects
> of the specification.

Mark,

I'm having trouble making sense of this, so I'm hoping you can explain.

> 
> Would it make sense to model different families of Elliptic Curves as
> separate algorithms ? So, the existing algorithms become ECDSA-NIST etc. and
> we can add ECDSA-NUMS in a separate specification ?

For NUMS, no, it certainly doesn't make sense. The point of NUMS is to be compatible with the NIST curves (which are really the specification of the SECG, and plenty of other related bits). Brainpool also fits into this pattern.

There's no technical reason preventing Curve25519 from fitting into this pattern either, other than the proponents of Curve25519 not wanting to go through the effort. Having Curve25519 fit into this model is something that's being discussed within the IETF, but the WebCrypto proponents don't want to wait on this effort.

Just making sure we're clear as to the limitations or lack thereof.
Comment 5 Trevor Perrin 2014-07-28 20:01:23 UTC
(In reply to Ryan Sleevi from comment #4)
> (In reply to Mark Watson from comment #2)
> > 
> > Would it make sense to model different families of Elliptic Curves as
> > separate algorithms ? So, the existing algorithms become ECDSA-NIST etc. and
> > we can add ECDSA-NUMS in a separate specification ?
> 
> For NUMS, no, it certainly doesn't make sense. The point of NUMS is to be
> compatible with the NIST (which is really the specification of the SECG, and
> plenty of other related bits). Brainpool also fits into this.

The NUMS Edwards curves are more similar to 25519 than to the NIST curves, IMO.

> There's no technical reason that prevents Curve25519 from not fitting into
> this pattern either, other than the proponents of Curve25519 not wanting to
> go through the effort.

25519 is intentionally presented in the form of rigidly-specified ECDH and signature algorithms.  It's not just "not wanting to go through the effort" of providing a more naked curve, which would of course be less effort.

In particular, the Curve25519 ECDH paper specifies the sort of details you've left to ANSI X9.63, and the Ed25519 paper specifies the sort of details you've left to ANSI X9.62.

Since some of these details are curve-dependent, it's reasonable to provide them in monolithic ECDH and signature packages, i.e. Curve25519 and Ed25519.

For example, I believe the ANSI X9.63 Section 5.4.1 ECDH primitive you specify requires encoding the full point (not just the x-coordinate); for non-cofactor-1 curves (like some of the NUMS curves) it requires an extra scalar multiplication for point validation, and it doesn't account for the cofactor in the ECDH. That's probably NOT what you want for all the NUMS curves.

And your ANSI X9.62 ECDSA primitive with a random nonce is considered by many cryptographers to be inferior to Schnorr signatures with a deterministic nonce, like Ed25519, which are more robust against hash collisions and RNG failures, more efficient, more amenable to things like threshold signatures, and have a better security proof.

We could argue more about these "modular" vs "monolithic" design decisions, but there are good reasons to rigorously specify algorithm details to match a particular curve.  You haven't done this, but that doesn't mean everything "just works" - it means you have work ahead of you to match your algorithm specification with your curves.

Trevor
Comment 6 Mark Watson 2014-07-28 20:11:30 UTC
(In reply to Ryan Sleevi from comment #4)
> (In reply to Mark Watson from comment #2)
> > Specifically regarding new elliptic curves, presently, each of the existing
> > algorithms is associated with a single definition of the key material on
> > which it is based, with serializations of this keying material being
> > independent of any other algorithm parameters.
> > 
> > If we are to add additional curves to the existing EC algorithms, we break
> > this pattern. This might involve quite extensive re-working of some aspects
> > of the specification.
> 
> Mark,
> 
> I'm having trouble making sense of this, so I'm hoping you can explain.
> 

I meant only that extending the existing algorithms involves a fair amount of editorial work on the main specification to provide the necessary extension points, some of which you outline above.

By contrast, we already have the extension point necessary to add new algorithms without monkey-patching. It may be that adding a completely new algorithm (in a separate spec) for the new curves involves duplicating some material present in the existing EC algorithms, but I think that would be acceptable.
Comment 7 Mike Jones 2014-07-30 00:32:03 UTC
The IETF uses the IANA Registry model to good effect to enable extensibility.  (In fact, we're using the registries defined by the JOSE specs to define new algorithm identifiers, etc.)  Using registries would similarly enable the WebCrypto spec to be extended by other specs without revising the WebCrypto spec itself.

One way we could do this is to use IANA registries. RFCs are required to establish new registries. Therefore, we could create a small RFC as a companion spec to the WebCrypto spec that establishes the registries needed by WebCrypto. This could progress quickly as an area-director-sponsored spec, rather than going through the working group process.

Alternatively, the W3C could establish its own registries and means of administering them.

I would like us to seriously consider using registries for extensibility of WebCrypto - either IANA registries or W3C registries - as it cleanly solves the extensibility problem better than any of the other approaches that have been discussed.
Comment 8 Ryan Sleevi 2014-07-30 00:42:31 UTC
(In reply to Mike Jones from comment #7)
> I would like us to seriously considering using registries for extensibility
> of WebCrypto - either IANA registries or W3C registries, as it cleanly
> solves the extensibility problem better than any of the other approaches
> that have been discussed.

Mike,

This comment is completely unrelated to the meat of this issue. The spec already supports registration of arbitrary algorithms, by virtue of how algorithms are defined through Algorithm Normalization. A registry serves no direct technical purpose in support of that - it's purely documentary.

However, much in the same way that the W3C does not provide a "registry" of specs that modify the IDL for the Window or Document, it would be a serious change in precedent to provide a registry for Algorithms - and, arguably, a detrimental one.

The W3C process is fully capable of embracing specs on REC track, within this WG or other.

However, in order to meaningfully address this issue, we don't need registries. We need to identify the places that are hardcoded with certain assumptions about algorithms being supported, and tease them out to be like algorithm normalization. Then, whether you hide it behind the documentation of a registry (unnecessarily) or you place it within the realm of REC track documents, spec authors can meaningfully avoid monkey patching.
Comment 9 Mark Watson 2014-07-30 01:16:06 UTC
Registries are necessary to manage a namespace to which multiple parties, by multiple routes, might add new entries. They also provide a convenient reference for someone looking to see all the values defined to date.

However, we have a more fundamental problem here that some of our namespaces are not extensible at all (e.g. named curves) without 'monkey patching' which is considered a bad thing (TM).

To be concrete, named curves are where this started. We have two options:
(a) do not allow extension of our existing EC algorithms and require new algorithms to be defined for new curves
(b) modify the specification to support non-monkey-patch addition of new curves

For (b) we must introduce the concept of a "supported curves list" and define the information that must be provided for each supported curve (at least: the curve name, the algorithms supported, the import/export procedures, and the algorithm procedures for each algorithm).

We already have this for algorithm normalization, defined in terms of nested abstract associative containers (a little abstruse if you ask me, but that's stylistic).

This algorithm extension mechanism is very powerful. It is limited only by the text in Section 15.3, which defines the method procedures. I see two problems with that text, namely "recognized key format values" and "recognized key usage values". For both of these we should explain in the definition that they may be extended, and specify the information necessary to comprise a definition. In several of the procedures there is a switch across key format values which needs an "otherwise, if the key format is defined in another specification, follow the procedures of that specification" clause, in structured-clone style.

The above may be sufficient. New algorithms may be defined which are more extensible than the existing ones. However, enabling new key formats to be defined for the existing algorithms is straightforward once the notion has been introduced as above: each switch on key format in the algorithm-specific procedures needs the same "otherwise ..." clause.
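The "otherwise ..." arm described above can be sketched as follows. This is a hypothetical illustration; `externallyDefinedFormats` and `importKeyForAlgorithm` are invented names for the sketch, not spec text.

```javascript
// Extension point: formats defined in other specifications register an
// import procedure here instead of monkey-patching the switch below.
const externallyDefinedFormats = new Map(); // format name -> import procedure

function importKeyForAlgorithm(format, keyData) {
  switch (format) {
    case "raw":
      return { type: "secret", data: keyData };
    case "jwk":
      return { type: keyData.kty === "oct" ? "secret" : "public", data: keyData };
    default: {
      // "Otherwise, if the key format is defined in another specification,
      //  follow the procedures of that specification."
      const procedure = externallyDefinedFormats.get(format);
      if (procedure) return procedure(keyData);
      throw new Error("NotSupportedError: unsupported key format " + format);
    }
  }
}
```

The same default arm, repeated in each algorithm-specific procedure, is all that's needed to make the per-format switches extensible.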

Ryan also mentions JWK "alg" values. I am not sure there is much to do here. We expect the site to support WebCrypto algorithm identifiers and all we do with JWK "alg" is check for consistency. Is the suggestion that JOSE may define new values in future which happen to be consistent with existing WebCrypto algorithms ?

Finally, recursive algorithm normalization has a slight problem in that we do not have generality in terms of the operation against which a sub-field which is an AlgorithmIdentifier should be normalized. This is easily fixed and I will file a separate bug for this.
Comment 10 Mike Jones 2014-07-30 21:16:57 UTC
First, I support Mark's comment that we must define a procedure for implementations to support new algorithm parameters, such as curve identifiers - not just new algorithms.  The list of "Possible places where defined extension points are needed" is valuable, and should be addressed by any proposed resolution to this issue.

Next, without a registry, as an implementer of the specs, how do I know where to look for the definitions of the different algorithms and algorithm parameters that I might want to use in my implementation?  Also, without a registry, who administers the namespace of algorithm identifiers and other extensible sets of identifiers?  If I'm writing a spec defining a new algorithm, how do I reserve the identifiers that my spec uses?
Comment 11 Ryan Sleevi 2014-07-30 21:26:42 UTC
(In reply to Mike Jones from comment #10)
> First, I support Mark's comment that we must define a procedure for
> implementations to support new algorithm parameters, such as curve
> identifiers - not just new algorithms.  His list of "Possible places where
> defined extension points are needed" is valuable, and should be address by
> any proposed resolution to this issue.

Considering that these issues are the entire point of the bug - they're why it was filed to begin with - it seems unnecessary to express support for them.

> 
> Next, without a registry, as an implementer of the specs, how do I know
> where to look for the definitions of the different algorithms and algorithm
> parameters that I might want to use in my implementation?  

As I mentioned in my previous reply: the exact same place as for all other aspects of your user agent - the W3C (and the WHATWG).

> Also, without a
> registry, who administers the namespace of algorithm identifiers and other
> extensible sets of identifiers? 

The exact same august body of collaborators that determines that the Window IDL object shall have an "open" method, that the PerformanceTiming interface should have a "navigationStart" member, and that the Document object shall fire load events - the W3C.

>  If I'm writing a spec defining a new
> algorithm, how do I reserve the identifiers that my spec uses?

The exact same process you reserve the identifier "PerformanceTiming" if you're attempting to reserve an identifier for providing performance details, or reserving the identifier "serviceWorker" on the navigator object.

Mike, these are API decisions, and the W3C process - and the WebApps work mode - have determined that the questions you're raising as hypothetical issues do not need a registry to solve them. They're part of the standards-driven W3C process. I realize that, coming from the IETF, there may be a penchant for IANA registries. This is especially true with protocols, which are agnostic as to the implementation and to how they're exposed to developers.

This is, still and always, fundamentally a matter of API decisions. That there are string identifiers may make this slightly confusing, but it's important to realize that window.crypto.subtle.encrypt({"name": "AES-GCM", ...}, ...) is, from the point of view of interoperability, web developers, and its very purpose, not one lick functionally different from window.crypto.subtle.aesgcm.encrypt(...)
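As a toy illustration (not the real WebCrypto API - both objects below are invented for the sketch), the two spellings name the algorithm in exactly the same way; only the surface syntax differs:

```javascript
// String-parameter style: the algorithm is named by a string identifier.
const subtleByName = {
  encrypt: (algorithm, data) => `${algorithm.name}(${data})`,
};

// Method style: the algorithm is named by a property on the API object.
const subtleByMethod = {
  aesgcm: { encrypt: (data) => `AES-GCM(${data})` },
};
```

Either way, adding a new algorithm is a change to the shared API surface, which is why it goes through the same standards process as any other Web API addition.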

It's an API surface. As such, any changes are treated like all other changes to the Web Platform - by bringing to the WHATWG / W3C, agreeing, and implementing.
Comment 12 Harry Halpin 2014-08-04 14:58:03 UTC
Here's a concrete proposal that I think would satisfy this without relying on an IANA registry. It needs to be worked out, but the general gist is here, very similar to how HTML5 maintains a registry of link relations:

The Web Cryptography Working Group will support algorithm extensibility in the Web Cryptography API, allowing new algorithms to be defined and specified as extension specifications.

The extension specification publication process is as follows:

1) The W3C maintains indefinitely a wiki of reserved algorithm identifiers, with links to specifications. This wiki should be referenced from the W3C Web Cryptography API specification.

2) When a new algorithm is suggested and the name reserved, a timeline (no more than 6 months) should be given for the production of a draft specification. This specification should be formatted as a W3C Editor's Note, the Extension Algorithm Note, and sent to the Web Cryptography Working Group for discussion and consensus. If the Web Cryptography Working Group has been closed, the W3C will determine an appropriate Interest or Working Group for reaching consensus over publication of the Extension Algorithm Note. One requirement for the addition of new algorithms should be that they be well-specified enough for implementers and supported by at least one implementation.

3) If consensus is reached, the Extension Algorithm Note should be published as a W3C Working Group Note and the wiki should be updated to reflect the maturity of the Extension Algorithm Note. When known, any further supporting implementations should be listed.

In the case of any disputes over reserved names or process, the Web Cryptography Working Group will be empowered to make a decision. If the charter of the Web Cryptography Working Group is not in effect, the W3C will find an appropriate Working Group or Interest Group, to be reflected in the wiki, to maintain the list of algorithms, or will maintain the list of extension specs directly.
Comment 13 Ryan Sleevi 2014-08-04 17:42:21 UTC
(In reply to Harry Halpin from comment #12)
> Here's a concrete proposal that I think would satisfy this without relying
> on an IANA registry. It needs to be worked out, but the general geist is
> here, very similar to how HTM5 maintains a registry of link relations:

This seems to miss the forest for the trees.

> 
> The Web Cryptography Working Group will support algorithm extensibility in
> the Web Cryptography API, allowing new algorithms to be defined and
> specified as extension specifications.
> 
> The extension specification publication process is as follows:
> 
> 1) The W3C maintain indefinitely a wiki of reserved algorithm identifiers,
> with links to specifications. This wiki should be referenced from the W3C
> Web Cryptography API specification. 

This seems unnecessary. That is, every W3C spec is inherently part of a "registry" - the W3C registry. Every extension to the Window object, to the Navigator object, etc plays by this.

There's no need for any explicit registry.

> 
> 2) When a new algorithm is suggested and the name reserved, a timeline (no
> more than 6 months) should be given for the production of a draft
> specification. This specification should be formatted as an W3C Editor's
> Note, the Extension Algorithm Note, and sent to the Web Cryptography Working
> Group for discussion and consensus. If the Web Cryptography Working Group
> has been closed, the W3C will determine an appropriate Interest or Working
> Group for reaching consensus over publication of the Extension Algorithm
> Note. One requirement for the addition of new algorithms should be that they
> should be well-specified enough for implementers and are supported by at
> least one implementation. 

I see no value in artificial timelines like this, nor does it match how the rest of the Web Platform works.

> 
> 3) If consensus is reached, the Extension Algorithm should be published as a
> W3C Working Group Note and the wiki should be updated to reflect the
> maturity of the Algorithm Extension Note. When known, any further supporting
> implementations should be listed  
> 
> In the case of in which there are any disputes over reserved names or
> process, the Web Cryptography Working Group will be empowered to make a
> decision. If the charter of the Web Cryptography Working Group is not in
> effect, the W3C will find an appropriate Working Group or Interest Group, to
> be reflected in the wiki, to maintain the list of algorithms or will
> maintain the list of extension specs directly.

Harry,

I appreciate you taking the time for this. However, it sets forth a solution for problems that it fails to define. I appreciate that you may see problems with handling this exactly as every other Web specification is handled, but you owe it to the WG to enumerate those problems before proposing a solution for dealing with them.

That is, what remains to be identified is how, in any shape, form, or way, is this different than how every other Web API works?
Comment 14 Harry Halpin 2014-08-04 18:20:29 UTC
(In reply to Ryan Sleevi from comment #13)
> (In reply to Harry Halpin from comment #12)
> > Here's a concrete proposal that I think would satisfy this without relying
> > on an IANA registry. It needs to be worked out, but the general geist is
> > here, very similar to how HTM5 maintains a registry of link relations:
> 
> This seems to miss the forest for the tree.
> 
> > 
> > The Web Cryptography Working Group will support algorithm extensibility in
> > the Web Cryptography API, allowing new algorithms to be defined and
> > specified as extension specifications.
> > 
> > The extension specification publication process is as follows:
> > 
> > 1) The W3C maintain indefinitely a wiki of reserved algorithm identifiers,
> > with links to specifications. This wiki should be referenced from the W3C
> > Web Cryptography API specification. 
> 
> This seems unnecessary. That is, every W3C spec is inherently part of a
> "registry" - the W3C registry. Every extension to the Window object, to the
> Navigator object, etc plays by this.
> 
> There's no need for any explicit registry.
> 
> > 
> > 2) When a new algorithm is suggested and the name reserved, a timeline (no
> > more than 6 months) should be given for the production of a draft
> > specification. This specification should be formatted as an W3C Editor's
> > Note, the Extension Algorithm Note, and sent to the Web Cryptography Working
> > Group for discussion and consensus. If the Web Cryptography Working Group
> > has been closed, the W3C will determine an appropriate Interest or Working
> > Group for reaching consensus over publication of the Extension Algorithm
> > Note. One requirement for the addition of new algorithms should be that they
> > should be well-specified enough for implementers and are supported by at
> > least one implementation. 
> 
> I see no value in artificial timelines like this, nor does it match how the
> rest of the Web Platform works.
> 
> > 
> > 3) If consensus is reached, the Extension Algorithm should be published as a
> > W3C Working Group Note and the wiki should be updated to reflect the
> > maturity of the Algorithm Extension Note. When known, any further supporting
> > implementations should be listed  
> > 
> > In the case of in which there are any disputes over reserved names or
> > process, the Web Cryptography Working Group will be empowered to make a
> > decision. If the charter of the Web Cryptography Working Group is not in
> > effect, the W3C will find an appropriate Working Group or Interest Group, to
> > be reflected in the wiki, to maintain the list of algorithms or will
> > maintain the list of extension specs directly.
> 
> Harry,
> 
> I appreciate you taking the time for this. However, it sets forth a solution
> for problems that it fails to define. I appreciate you may see there are
> problems with the approach of handling this exactly as every other Web
> Specification, but you owe it to the WG to enumerate these, before proposing
> a solution for how to deal with them.

The problem is how does one add an algorithm identifier (with appropriate extension spec) without returning to Last Call?

If you think you have a better way, please provide one. I assumed a wiki that tracks the reserved names and the status of the extension spec as a WG Note was about as basic and sensible as you can get.

> 
> That is, what remains to be identified is how, in any shape, form, or way,
> is this different than how every other Web API works?
Comment 15 Ryan Sleevi 2014-08-04 18:45:23 UTC
(In reply to Harry Halpin from comment #14)
> The problem is how does one add an algorithm identifier (with appropriate
> extension spec) without returning to Last Call?
> 
> If you think you have a better way, please provide. I assumed a wiki that
> tracks the reserved names and the status of the extension spec as a WG Note
> was about as basic and sensible as you can get. 

Harry,

How does WebApps introduce/propose/spec-through-CFI a new API without sending HTML5 to Last Call?

For example, the recently introduced Service Workers? Or the Mixed Content spec?

I hope, by example and extension, you can see that, by design, there is nothing that should require the "Web Crypto API" to re-enter last call. If there is, well, that's the general extensibility bug. This does not require a registry to solve, nor does it require a Note or a Wiki page, nor does the W3C's version of "point in time HTML5" require a "forward declaration" towards the work of WebApps or WebAppsSec.

Again, I remain opposed to registries because this is not a generic, arbitrary registration surface. This is a key, intrinsic component of the API, and it should not be treated any different than how the Web Platform and User Agents have treated every other aspect of API - through specifications, WGs, and consensus.

The only "risk" to requiring a registry is if a UA decides to forgo standards-based development, forgo the W3C process, and begin shipping untested, unspecified, unadopted work directly to the Web. I think the UAs participating in this WG, and in the W3C, have learned of the deleterious effects, and thus consensus is pursued in parallel or even prior to anything being shipped, and only shipping when such a thing is believed to be stable enough such that changes will not affect web developers (or, in the case of some UAs, prefixing).

Thus even the remotest probabilities of:
1) Some new hip algorithm being introduced
2) Two UAs deciding to implement some new hip algorithm immediately after it's announced
3) Coming up with incompatible definitions of some new hip algorithm

are greatly reduced by the incredible unlikelihood that a UA would actually ship them before building consensus and a standards-based approach.

Microsoft's approach with the NUMS curves is no different than if Microsoft were to come to WebApps tomorrow and propose a new depth-sensor API for use with devices like Kinect. It would be discussed, consensus would be built, Microsoft (might) ship it behind a prefix if they believe it's stable/mature, or they (might) wait until the WG has adopted and it's progressed to a point of maturity.

This is API. This is not a protocol format. API changes, by design, take time, because we need all UAs to agree to the shape and purpose of such API changes, since it's the shared API of the web.

Once again, a registry is unnecessary to serve this in any form of prescriptive role. A registry to document this is no more a WG activity than it is WebApps' responsibility to document every extension of the Document or Window interfaces. The W3C - along with its member organizations - has already invested significantly in providing a place for such documentation "for developers" (which a registry is often pointed to as serving): webplatform.org
Comment 16 Harry Halpin 2014-08-04 19:50:28 UTC
(In reply to Ryan Sleevi from comment #15)
> (In reply to Harry Halpin from comment #14)
> > The problem is how does one add an algorithm identifier (with appropriate
> > extension spec) without returning to Last Call?
> > 
> > If you think you have a better way, please provide. I assumed a wiki that
> > tracks the reserved names and the status of the extension spec as a WG Note
> > was about as basic and sensible as you can get. 
> 
> Harry,
> 
> How does WebApps introduce/propose/spec-through-CFI a new API without
> sending HTML5 to Last Call?
> 
> For example, the recently introduced Service Workers? Or the Mixed Content
> spec?
> 
> I hope, by example and extension, you can see that, by design, there is
> nothing that should require the "Web Crypto API" to re-enter last call. If
> there is, well, that's the general extensibility bug. This does not require
> a registry to solve, nor does it require a Note or a Wiki page, nor does the
> W3C's version of "point in time HTML5" require a "forward declaration"
> towards the work of WebApps or WebAppsSec.
> 
> Again, I remain opposed to registries because this is not a generic,
> arbitrary registration surface. This is a key, intrinsic component of the
> API, and it should not be treated any different than how the Web Platform
> and User Agents have treated every other aspect of API - through
> specifications, WGs, and consensus.

We are not discussing an IANA registry. We are addressing the concern of "where developers find out whether an algorithm is specified, if it's not in the main spec". For that, a W3C wiki is a lightweight solution with a longevity that is not tied to the spec.

We may have that issue with both NUMS and Curve25519. We are simply stating *where* the specifications not in the main spec can be found, what process they go through for maturation (i.e. the WG process, if available), and how they are published, i.e. as Notes.

That's one concrete proposal. We still need to go through and make sure the spec clearly lists all extension points where "the object is defined by the other specification." However, it would seem useful that, for algorithm identifiers in particular, the "other specifications" can be easily found, lest folks reinvent the wheel.


> 
> The only "risk" to requiring a registry is if a UA decides to forgo
> standards-based development, forgo the W3C process, and begin shipping
> untested, unspecified, unadopted work directly to the Web. I think the UAs
> participating in this WG, and in the W3C, have learned of the deleterious
> effects, and thus consensus is pursued in parallel or even prior to anything
> being shipped, and only shipping when such a thing is believed to be stable
> enough such that changes will not affect web developers (or, in the case of
> some UAs, prefixing).
> 
> Thus even the remotest probabilities of:
> 1) Some new hip algorithm being introduced
> 2) Two UAs deciding to implement some new hip algorithm immediately after
> it's announced
> 3) Coming up with incompatible definitions of some new hip algorithm
> 
> Are greatly reduced by the incredible unlikelihood that a UA would actually
> ship them before building consensus and a standards-based approach.
> 
> Microsoft's approach with the NUMS curves is no different than if Microsoft
> were to come to WebApps tomorrow and propose a new depth-sensor API for use
> with devices like Kinect. It would be discussed, consensus would be built,
> Microsoft (might) ship it behind a prefix if they believe it's
> stable/mature, or they (might) wait until the WG has adopted and it's
> progressed to a point of maturity.
> 
> This is API. This is not a protocol format. API changes, by design, take
> time, because we need all UAs to agree to the shape and purpose of such API
> changes, since it's the shared API of the web.
> 
> Once again, a registry is unnecessary to serve this in any form of
> prescriptive role. A registry to document this is no more a WG activity than
> it is the WebApps responsibility to document every extension of the Document
> or Window interfaces. The W3C has already invested - along with its member
> organizations - significantly in providing a place for such documentation
> "for developers" (as a registry is often pointed as serving) -
> webplatform.org
Comment 17 Mike Jones 2014-08-04 20:01:11 UTC
> In comment 15, Ryan wrote:
> This is API. This is not a protocol format. API changes, by design, take
> time, because we need all UAs to agree to the shape and purpose of such API
> changes, since it's the shared API of the web.

You're missing a key distinction, at least as I see it, Ryan.  APIs are about things like how you express the "sign" operation and the "decrypt" operation.  These will not change over time.

Whereas, the names of the algorithms used with those operations *will* change over time as new algorithms are adopted and old algorithms are deprecated.  That's a consequence of crypto agility, and critical to the long-term success of the spec.  That's why the set of algorithms needs to be extensible without updating the base spec.  It's *not* API.

This could happen via a registry, a wiki, a web site with expert review, etc.  I'm not all that picky about the particular mechanism.  But the point is that it needs to be a mechanism that accommodates algorithm changes as a normal part of the life cycle of the spec's usage - unlike methods like "sign" and "decrypt", which *are* API, and which aren't expected to change.
Comment 18 Ryan Sleevi 2014-08-04 20:30:45 UTC
(In reply to Mike Jones from comment #17)
> > In comment 15, Ryan wrote:
> > This is API. This is not a protocol format. API changes, by design, take
> > time, because we need all UAs to agree to the shape and purpose of such API
> > changes, since it's the shared API of the web.
> 
> You're missing a key distinction, at least as I see it, Ryan.  APIs are
> about things like how you express the "sign" operation and the "decrypt"
> operation.  These will not change over time.
> 
> Whereas, the names of the algorithms used with those operations *will*
> change over time as new algorithms are adopted and old algorithms are
> deprecated.  That's a consequence of crypto agility, and critical to the
> long-term success of the spec.  That's why the set of algorithms needs to be
> extensible without updating the base spec.  It's *not* API.
> 
> This could happen via a registry, a wiki, a web site with expert review,
> etc.  I'm not all that picky about the particular mechanism.  But the point
> is that it needs to be a mechanism that accommodates algorithm changes as a
> normal part of the life cycle of the spec's usage - unlike methods like
> "sign" and "decrypt", which *are* API, and which aren't expected to change.

Mike, Harry,

Since this bug has morphed into the registry discussion, I'm going to kindly ask you to take it to another bug. This discussion of registries is entirely orthogonal to why I filed this bug and to the issues that remain.

Mike, I'm very much aware of that distinction, and I violently disagree with your assertion. This is *not* a naming issue; it is very much an API issue, because window.crypto.subtle.encrypt("AES-GCM", foo, bar, baz) is no different in form or function from window.crypto.subtle.encrypt.aes-gcm(foo, bar, baz).
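[Editor's note: Ryan's point can be sketched as follows. This is an illustrative sketch only, not the real WebCrypto API (which is promise-based and takes an algorithm dictionary); the dispatcher, names, and placeholder cipher are hypothetical.]

```javascript
// Sketch: the set of recognized algorithm names is itself API surface,
// whether the name is a string argument (shape A) or baked into a
// method name (shape B).

// Shape A: algorithm selected by a string parameter.
const implementations = {
  // Stand-in for a real cipher; a placeholder, not actual crypto.
  "AES-GCM": (key, iv, data) => `aes-gcm(${data})`,
};

function encrypt(algorithm, key, iv, data) {
  const impl = implementations[algorithm];
  if (!impl) throw new Error("NotSupportedError: " + algorithm);
  return impl(key, iv, data);
}

// Shape B: one method per algorithm.
const subtle = {
  encryptAesGcm: (key, iv, data) => encrypt("AES-GCM", key, iv, data),
};

// Introducing a new algorithm means adding an entry in shape A or a
// method in shape B; either way, callers and implementers must agree on
// the new name and its semantics, so the name set cannot change "for free".
```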

If you wish to continue this discussion - a discussion that we've repeatedly had and seemingly closed - on a new bug, I'm happy to follow-up there. But I think we'd be doing a great disservice to this bug, and the very real and pressing spec issues, to try to mix that discussion in here.

Cheers
Comment 19 virginie.galindo 2014-08-06 12:34:29 UTC
FYI: opening a thread on the Web Crypto WG mailing list with a strawman proposal to progress on extension integration:
http://lists.w3.org/Archives/Public/public-webcrypto/2014Aug/0036.html

Results of discussion (if any) will be reported on that bug.
Comment 21 Mark Watson 2014-10-06 17:17:02 UTC
Re-opening as per conf call discussion.

The current status seems to be as follows:
1) Several WG members dislike 'forward references', meaning references to as-yet-unwritten 'other specifications': the only method of extension should be the writing of new specification text which is either referenced from or included in a 'base' or 'root' specification.
2) In such a model, the purpose of explicit extensibility points is to provide implementors with information about what procedures may or may not be modified by 'other specifications' - they are thereby directed to look for references to other such specifications in the 'root' specification and implement those requirements as well.
3) We presently only have one specification. We can modify anything we like in this specification in future versions.

Combining (1) and (3), there is no possibility of 'monkey-patching' and therefore no need for extension points. Anne seems to agree with this.

In future, we may decide we need additional specifications (e.g. for additional elliptic curves). According to (1), we will need to modify our base specification anyway, to include the references to these new curve specifications. So, I see no reason why we could not include the extensibility provisions then.

As a result, I suggest we just revert the changes related to extensibility.
Comment 22 Harry Halpin 2014-10-07 12:29:45 UTC
Maybe this wasn't clear, but W3C's experience with XML-DSIG showed that anytime we want to update the spec we have to go all the way through the W3C process again. I know people have different tastes here, but c'est la vie. 

Thus, option 1) is not viable for W3C and will likely prevent us from going to CR. Also, at least one browser vendor (Microsoft) wants extension points. If you don't believe me, feel free to ask Wendy Seltzer or Tim Berners-Lee. 

Thus, I suggest we do not revert the changes but simply have the WG, including those that want them like BAL, check them before we go through Last Call. 

Folks who don't like extension points simply don't have to be bothered by them, given that they likely have no impact on running code and only allow future code to run without monkey-patching.
Comment 23 Ryan Sleevi 2014-10-07 12:54:39 UTC
(In reply to Mark Watson from comment #21)
> Re-opening as per conf call discussion.
>
> Combining (1) and (3) there is no possibility of 'monkey-patching' and
> therefore no need for extension points. Anne seems to agree with this.

Agreed

> 
> In future, we may decide we need additional specifications (e.g. for
> additional elliptic curves). According to (1), we will need to modify our
> base specification anyway, to include the references to these new curve
> specifications. So, I see no reason why we could not include the
> extensibility provisions then.

Agreed

> 
> As I result, I suggest we just revert the changes related to extensibility.

Agreed

(In reply to Harry Halpin from comment #22)
> Maybe this wasn't clear, but W3C's experience with XML-DSIG showed that
> anytime we want to update the spec we have to go all the way through the
> W3C process again. I know people have different tastes here, but c'est la vie. 
> 
> Thus, option 1) is not viable to W3C and will likely prevent us from going
> to CR. Also, at least one browser vendor (Microsoft) wants extension points.
> If you don't believe me, feel free to ask Wendy Seltzer or Tim Berners-Lee. 
> 

While I cannot speak for Microsoft, and do not attempt to position this as a final Mozilla position, from discussing both internally and externally, there is a strong belief that 1 is the ONLY viable option from the browser side.

If that's not something the W3C can fit for process reasons, then this spec may be better maintained through activities that can support it, such as the WHATWG.

> Folks who don't like extension points, given they likely have no impact on
> running code but only allow future code to run without monkey-patching,
> simply don't have to be bothered.

That's not a viewpoint we share. The complexity - in both text and in possible implications for future efforts regarding "how" extensibility is handled -
Comment 24 Boris Zbarsky 2014-10-08 03:37:01 UTC
> anytime we want to update the spec we have to go all the way through the
> W3C process again

As in, the W3C errata/update process is somewhere between nonexistent and untenable.  Yes, it is.  That's something the W3C really needs to fix, and has needed to for over a decade.
Comment 25 Mark Watson 2014-10-08 15:01:41 UTC
Re-reading the above, we should now be careful with the term "extension points". I think this term has been widely assumed to refer to the idea that future as-yet-unwritten specification can add to a base specification without modification of the base specification.

However, what I'm hearing is that there is a desire to avoid such "forward references" and in this context the term "extension point" refers to the idea that other specifications *referenced from the base specification* can add to the base specification. This is a new requirement: in fact comment#0 says "A successful resolution of this bug will be ensuring that one can, in a way that *does not alter the Web Crypto spec*, define an entirely new spec that adds additional algorithm(s)." We have such a proposal in the Editor's Draft, but apparently no consensus.

So, unless some opinions change, we will need to update the base specification anyway for any and all extensions, even if only to include a new reference.

There are a few options for making this viable, process-wise:

(1) Improve the W3C process for such minor updates to specifications
(2) Modularize, so that the "base specification" is little more than references and common data types, and thus much lighter-weight to update
(3) Modularize in a different way, so that the "root" for WebCrypto contains only references and is in fact a web page, not a specification, and so can be very easily updated (by the WebCrypto WG).
(4) Hand everything over to WHATWG

At this stage, I am reluctant to embark on (2) or (3) and I don't support (4). I suggest we publish now as a monolithic specification and work on modularizing for the next version. This could happen very quickly - we essentially republish WebCrypto v1 in modular form, with extension points, whilst we in parallel and on a different timescale start work on the modules that form WebCrypto v2.
Comment 26 Harry Halpin 2014-10-08 15:18:47 UTC
Just to be clear, "extension point" means that other specifications *which normatively reference the base specification* and *which are not explicitly referenced in the base specification itself* can use the operations of the base specifications with new algorithms that are not listed in the base specification. We have, for example, the Editor's Draft of Curve 25519 from Trevor Perrin. 

I do not see any clear arguments against this, and it's a pretty reasonable request given the state of crypto right now. 

I don't see any normative or informative references to "as-yet-unwritten specifications" (the caveat being that we already have an extension spec for Curve 25519); rather, this is just about making sure that the spec as written does not hardcode anything that makes it impossible or difficult to extend. 

If folks don't want to implement Curve25519 or other algorithms, they can just not implement them (including anything outside the base specification), and this will be recorded in the test suite for the base specification or any other future specification like the Curve25519 spec. 

I do not see how Mark's edits cause more work for implementers and, given they provide at least a first try at the needed extensibility points, I do not see why they should be removed. "Not liking something" is not a sufficient reason I think.
Comment 27 Mark Watson 2014-10-08 15:39:33 UTC
(In reply to Harry Halpin from comment #26)
> Just to be clear, "extension point" means that other specifications *which
> normatively reference the base specification* and *which are not explicitly
> referenced in the base specification itself* can use the operations of the
> base specifications with new algorithms that are not listed in the base
> specification. We have, for example, the Editor's Draft of Curve 25519 from
> Trevor Perrin. 

This was my understanding and apparently the intention of this bug (see comment#0). But as so often with WebCrypto, the goalposts have changed at the very last minute: We have Google and Mozilla arguing that the extension specifications must be referenced from the base (please Ryan and Anne correct me if I am wrong there).

> 
> I do not see any clear arguments against this and it's a pretty reasonable
> request give the state of crypto right now. 

The argument, IIUC, is that implementors should have a single starting point from which they can determine what to implement.

> 
> I don't see any normative or informative references to "as yet to written
> specifications" (caveat being that we already have an extension spec for
> Curve 25519), rather than just making sure that the spec as written does not
> hardcode anything that makes it impossible or difficult to extend. 

The Editor's Draft refers to "other specifications", without constraining what those specifications may be (in common with the 'clone' procedures on which I based the text). This is what Ryan calls a 'forward reference', like declaring a class without defining it. What's being asked for now is that we constrain the 'other specifications' to those referenced from the base specification.

> 
> If folks don't want to implement Cuve25519 or other algorithms, they can
> just not implement them (including anything outside the base specification)
> and this will be recorded in the test suite for the base specification or
> any other future specification like the Curve25519 spec. 
> 
> I do not see how Mark's edits cause more work for implementers and, given
> they provide at least a first try at the needed extensibility points, I do
> not see why they should be removed. "Not liking something" is not a
> sufficient reason I think.

Ryan objects to the fact that my proposal allows extension specifications to override import / export procedures for existing algorithm / hash combinations. However, constraining them is quite involved, text-wise. I don't see this as a problem since whether extension specifications do override those things is still up to the WebCrypto WG alone.

Anyway, to make progress, and since it seems we have to update the base specification anyway for all extensions, I suggest we add the extension points when we need them, instead of now.
Comment 28 Brian LaMacchia 2014-10-09 14:09:37 UTC
[Copying my e-mail message to the list here so it's also directly in Bugzilla...]

Folks,
 
As you are all well aware, we have had extensive discussions in this WG (on both the list and during our conference calls) on the need for defined extensibility points in the WebCrypto specification. The result of those discussions was an agreement that those defined extension points would be added to the specification as part of resolving Bug #25618 (https://www.w3.org/Bugs/Public/show_bug.cgi?id=25618), which was the placeholder for this work.  Co-editor Mark Watson has already made those changes to the draft and asked for them to be reviewed.  
 
Speaking on behalf of Microsoft and our two independent implementations of WebCrypto (Internet Explorer 11+ and the MSR JavaScript Cryptography Library), we believe that the spec should not exit Last Call without having a well-defined extensibility mechanism that allows the definition and integration of new cryptographic algorithms.  Our expectation is that whatever the mechanism, an extension will not impact the base specification nor compromise implementations that comply with the base spec.
 
--bal
Comment 29 Mark Watson 2014-10-09 14:36:44 UTC
(In reply to Brian LaMacchia from comment #28)
> [Copying my e-mail message to the list here so it's also directly in
> Bugzilla...]
> 
> Folks,
>  
> As you are all well aware, we have had extensive discussions in this WG (on
> both the list and during our conference calls) on the need for defined
> extensibility points in the WebCrypto specification. The result of those
> discussions was an agreement that those defined extension points would be
> added to the specification as part of resolving Bug #25618
> (https://www.w3.org/Bugs/Public/show_bug.cgi?id=25618), which was the
> placeholder for this work.  Co-editor Mark Watson has already made those
> changes to the draft and asked for them to be reviewed.  
>  
> Speaking on behalf of Microsoft and our two independent implementations of
> WebCrypto (Internet Explorer 11+ and the MSR JavaScript Cryptography
> Library), we believe that the spec should not exit Last Call without having
> a well-defined extensibility mechanism that allows the definition and
> integration of new cryptographic algorithms.  Our expectation is that
> whatever the mechanism, an extension will not impact the base specification
> nor compromise implementations that comply with the base spec.

Brian,

The decision to require extensibility points was based on the assumption that the objective was to allow extension of the specification without modification of the base specification, as described in Comment#0 above.

Things have now changed, with two browser vendors arguing that this should not be possible: the base specification should always be updated, at the least with references to the new specification. With this basic assumption now contested, the earlier decision is meaningless.

Given this, can you address my point above that we might as well add the extension points when / if we make such updates to the base specification ? Indeed, we might as well just add the new stuff into the spec if we have to update it anyway.

Or, are you objecting to the position from Google and Mozilla that extensions require an update to the base specification ?

It would be really great if, instead of simply taking contrary positions which leave us at an impasse, people could throw out ideas for resolution. I have suggested several things above and it would be nice to have feedback on those.


>  
> --bal
Comment 30 Brian LaMacchia 2014-10-09 14:47:30 UTC
(In reply to Mark Watson from comment #29)
> (In reply to Brian LaMacchia from comment #28)
> > [Copying my e-mail message to the list here so it's also directly in
> > Bugzilla...]
> > 
> > Folks,
> >  
> > As you are all well aware, we have had extensive discussions in this WG (on
> > both the list and during our conference calls) on the need for defined
> > extensibility points in the WebCrypto specification. The result of those
> > discussions was an agreement that those defined extension points would be
> > added to the specification as part of resolving Bug #25618
> > (https://www.w3.org/Bugs/Public/show_bug.cgi?id=25618), which was the
> > placeholder for this work.  Co-editor Mark Watson has already made those
> > changes to the draft and asked for them to be reviewed.  
> >  
> > Speaking on behalf of Microsoft and our two independent implementations of
> > WebCrypto (Internet Explorer 11+ and the MSR JavaScript Cryptography
> > Library), we believe that the spec should not exit Last Call without having
> > a well-defined extensibility mechanism that allows the definition and
> > integration of new cryptographic algorithms.  Our expectation is that
> > whatever the mechanism, an extension will not impact the base specification
> > nor compromise implementations that comply with the base spec.
> 
> Brian,
> 
> The decision to require extensibility points was based on the assumption
> that the objective was to allow extension of the specification without
> modification of the base specification, as described in Comment#0 above.
> 
> Things have now changed, with two browser vendors arguing that this should
> not be possible: the base specification should always be updated, at the
> least with references to the new specification. With this basic assumption
> now contested, the earlier decision is meaningless.
> 
> Given this, can you address my point above that we might as well add the
> extension points when / if we make such updates to the base specification ?
> Indeed, we might as well just add the new stuff into the spec if we have to
> update it anyway.
> 
> Or, are you objecting to the position from Google and Mozilla that
> extensions require an update to the base specification ?
> 
> It would be really great if, instead of simply taking contrary positions
> which leave us at an impasse, people could throw out ideas for resolution. I
> have suggested several things above and it would be nice to have feedback on
> those.
> 
> 
> >  
> > --bal

Yes, I am specifically objecting to the position that extensions require an update to the base specification.  I don't believe this is the case and that's why I wrote "Our expectation is that whatever the mechanism, an extension will not impact the base specification nor compromise implementations that comply with the base spec." We can make a specification that allows extensions in defined places and the act of making those extensions changes nothing in the base spec.  Having a defined extension point in the base spec is not in any way equivalent to a forward reference to an undefined normative or informative reference.  

--bal
Comment 31 Mark Watson 2014-10-09 14:59:26 UTC
Ok, well as it happens I completely agree. People have written extensible specifications for decades.

But I do not feel strongly enough about this to delay the specification.

Ryan, Anne, this appears to be a point-of-principle disagreement between Browser vendors where the principle is not at all specific to WebCrypto. Is there some way we can punt this disagreement to a forum with the right scope to address the point of principle ?
Comment 32 Mark Watson 2014-10-09 15:05:48 UTC
Oh, and the disagreement on principle is recent, arising after this bug was filed: Comment#0 clearly envisages extensions without modification of the base specification.
Comment 33 Boris Zbarsky 2014-10-09 15:08:03 UTC
The disagreement seems to be simple to me, but I welcome corrections if I misstate someone's position.  It goes as follows:

Say we have a base spec S1 and an extension S2.

The fundamental question is whether it's useful to ship a browser that supports S1 but not S2, or whether in practice anyone implementing S1 will also need to implement S2.

Brian's position seems to be that he wants the freedom to ship S1 but not S2, which implicitly suggests that he doesn't think support for S1 will necessarily require supporting S2.

Anne and Ryan's position seems to be that in practice anyone shipping S1 will also be forced to ship S2, so we do implementors of S1 a disservice by pretending in S1 that S2 does not exist: they end up discovering the hard way, via bug reports about stuff being broken, that S2 does in fact exist and needs to be implemented.

The history of web specs suggests to me that Anne and Ryan are right, for what it's worth, assuming that S2 is implemented by anyone at all....  but in the end this really depends on what exactly S2 is.  For example, if S2 is not meant to be implemented by browsers to start with, then it doesn't matter all that much whether it's explicitly pointed to from S1 or not.
Comment 34 Mark Watson 2014-10-09 15:18:41 UTC
In this case S2 is a specification extending the set of Elliptic Curves or Hash algorithms supported. It's certainly not the case that everyone has to implement S2. All the algorithms in WebCrypto are optional (which is a whole different story, but anyway).
Comment 35 Boris Zbarsky 2014-10-09 15:20:28 UTC
> It's certainly not the case that everyone has to implement S2

As a theoretical, or practical matter?

> All the algorithms in WebCrypto are optional

As a theoretical but not practical matter....
Comment 36 Harry Halpin 2014-10-09 15:25:12 UTC
"Anne and Ryan's position seems to be that in practice anyone shipping S1 will also be forced to ship S2, so we do implementors of S1 a disservice by pretending in S1 that S2 does not exist: they end up discovering the hard way, via bug reports about stuff being broken, that S2 does in fact exist and needs to be implemented."

By this logic, I might add, we can't ship Version 5 until we are done with Version 6. 

Again, "this really depends on what exactly S2 is". S2 is at least one spec (the Curve 25519 spec, and a possible NUMS curve spec depending on the results of the CFRG discussion), and likely future extension specs that are being developed at their own pace. No one is "pretending" S2 doesn't exist. It's just that consensus to implement them uniformly across all browsers does not exist yet. However, at least one browser vendor (Microsoft) will implement some S2s, and possibly more in the future. 

Also we have algorithms that are nationally mandated (GOST in Russia, SEED in Korea, etc.) supported by entire industries around custom browser builds for those governments. Being a global standards body, it seems we don't want to shut them out of using W3C standards. 

It seems like extension specs are needed in this case. 

I still do not see a good argument against it. The argument is "People might think an extension spec is implemented, realize it isn't, and file a bug report."
In which case, if the bug report is for a spec like Curve 25519, which Google isn't sure about supporting in WebCrypto, having those bug reports from developers might actually help them make that decision. 

If developers want to know what browsers definitely support an extension S2 like Curve 25519, they could just check on where it is in W3C process as well.

Thus, the pain of not having the ability to easily extend seems to outweigh the fact that folks might file bug reports.
Comment 37 Brian LaMacchia 2014-10-09 15:28:13 UTC
Using Boris's S1/S2 notation, two points to keep in mind:

1) The desire for S2 often happens later than S1 is implemented, in particular when there's a new cryptographic attack against an algorithm or protocol construction.  So it's not "I want to ship S1 and not S2" as much as "I want to ship S1 and have the flexibility to add a new hash algorithm/elliptic curve/KDF/padding mode/whatever in some scenario quickly without needing to reopen S1".

2) As I've said before, this spec is going to be implemented by clients other than browsers.  You need to think about the WebCrypto spec as the crypto platform API for JavaScript, wherever that JS runs. 

--bal
Comment 38 Boris Zbarsky 2014-10-09 15:40:36 UTC
> By this logic, I might add, we can't ship Version 5 until we are done with
> Version 6. 

By this logic, when you start work on version 6 you need to add a link in version 5 pointing out that it's no longer the most recent specification and that version 6 exists.  Assuming you were talking about versions of specifications.

If you were talking about versions of implementations, I have no idea what you were trying to say.

> No one is "pretending" S2 doesn't exist.

Please put yourselves in the shoes of an implementor who is told to "implement S1".  They find the S1 spec and implement.  How do they find out about S2?

> It seems like extension specs are needed in this case. 

I'm not against extension specs at all.  I'm not even against extension specs that are clearly labeled as "optional unless you want your UA to be usable in Russia"; in fact I'm quite for it.

I _am_ against non-discoverable extension specs.

Obviously I can't speak for Anne and Ryan here, but I suspect that they agree with me.

> The argument is "People might think extension spec is implemented, realize it
> isn't, and file a bug report."

No, the argument is "A browser implementor will think this S1 is implemented and ship it and sites will break in the browser because they assume that if S1 is implemented so is S2, otherwise they polyfill S1".

As in, if S2 is in fact a required consequence of having S1, then by not making that clear you set up people implementing S1 to fail.

> If developers want to know what browsers definitely support an extension S2

Based on historical evidence, developers will just test their favorite browser, or maybe two, and if those support S2 they will assume everyone who implements S1 does.  They are likely to not even realize that the things they're using are in S2 and not in S1, since developer-facing blogs, documentation, etc., are not likely to make that clear.

> Thus, the pain of not having the ability to easily extend 

Adding a single forward-reference line to a spec does not seem like a high bar for "easily".

On the other hand, I think you underestimate the pain of "ship a browser that causes pages to break because a spec was incomplete"; a pain we have ended up dealing with on a fairly regular basis with W3C specs and are somewhat familiar with.

Basically, you are prioritizing the comfort of the working group over implementors.  This is a direct reversal of <http://www.w3.org/TR/html-design-principles/#priority-of-constituencies>.  You're not bound by that design principle of course, being a different working group and all that, but I'll posit that design principle is still a good idea in general.  You might disagree, of course, in which case we really do have a fundamental disagreement.
Comment 39 Boris Zbarsky 2014-10-09 15:45:09 UTC
> "I want to ship S1 and have the flexibility to add a new hash
> algorithm/elliptic curve/KDF/padding mode/whatever in some scenario quickly
> without needing to reopen S1"

As an implementation matter, you can just add it, right?  If you really need to do it quickly, you can't wait for the likely-long process of S2 going to REC.

> this spec is going to be implemented by clients other than browsers.

That's fine.  Having different normative requirements for different conformance classes is something other specs have done.
Comment 40 Mark Watson 2014-10-09 15:59:49 UTC
(In reply to Boris Zbarsky from comment #35)
> 
> As a theoretical, or practical matter?
> 

Like I said, algorithm optionality is a whole other story ...

But, if we are being practical, would it work to simply maintain a list - in a small spec or on a WebCrypto WG web page - of WebCrypto specifications, which we would point to from the base spec (under the definition of 'other specifications')?

This _surely_ solves the practical matter of discoverability, without invoking the presently-rather-heavy W3C specification update process for each new extension.
Comment 41 Brian LaMacchia 2014-10-09 16:22:34 UTC
(In reply to Boris Zbarsky from comment #39)
> > "I want to ship S1 and have the flexibility to add a new hash
> > algorithm/elliptic curve/KDF/padding mode/whatever in some scenario quickly
> > without needing to reopen S1"
> 
> As an implementation matter, you can just add it, right?  If you really need
> to do it quickly, you can't wait for the likely-long process of S2 going to
> REC.

But I need a way to do it without violating S1, which I don't have today.  Using my previous example, without extensibility points like the ones Mark was adding, I can't add support for "SHA-3" to the RSA-PSS list without violating the S1-specified behavior.  The base spec needs to have cryptographic agility built in, or I have to break the behavior contract to extend it.
Comment 42 Brian LaMacchia 2014-10-09 16:25:19 UTC
(In reply to Mark Watson from comment #40)
> (In reply to Boris Zbarsky from comment #35)
> > 
> > As a theoretical, or practical matter?
> > 
> 
> Like I said, algorithm optionality is a whole other story ...
> 
> But, if we are being practical, would it work to simply maintain a list - in
> a small spec or on a WebCrypto WG web page - of WebCrypto specifications,
> which we would point to from the base spec (under the definition of 'other
> specifications').
> 
> This _surely_ solves the practical matter of discoverability, without
> invoking the presently-rather-heavy W3C specification update process for
> each new extension.

Note that this is essentially what the IETF does with the searchable RFC database; new extension RFCs will have back-pointers to the RFCs they extend, but no one expects to re-open and re-publish old RFCs to add forward pointers to new RFCs that define extensions.  Imagine if we had to re-open PKIX Part 1 every time someone defined a new certificate extension...
Comment 43 Harry Halpin 2014-10-09 17:06:17 UTC
(In reply to Mark Watson from comment #40)
> (In reply to Boris Zbarsky from comment #35)
> > 
> > As a theoretical, or practical matter?
> > 
> 
> Like I said, algorithm optionality is a whole other story ...
> 
> But, if we are being practical, would it work to simply maintain a list - in
> a small spec or on a WebCrypto WG web page - of WebCrypto specifications,
> which we would point to from the base spec (under the definition of 'other
> specifications').
> 
> This _surely_ solves the practical matter of discoverability, without
> invoking the presently-rather-heavy W3C specification update process for
> each new extension.


Note that the W3C has offered to host this web page and spec, BTW. The offer still stands.
Comment 44 Boris Zbarsky 2014-10-09 18:17:29 UTC
> would it work to simply maintain a list - in a small spec or on a WebCrypto WG
> web page - of WebCrypto specifications, which we would point to from the base
> spec (under the definition of 'other specifications').

That would work just fine for addressing the issue I have.  I obviously can't speak for Anne or Ryan.

> But I need a way to do it without violating S1

I don't see how you can, unless S1 doesn't really define behavior at all.

Either S1 says "you can add support for whatever", and then you're not violating it no matter what you do.

Or S1 says "you can add support for things defined in extension specs" and then you're violating it if you add stuff before said extension spec exists.

Again, as a practical matter, for the specific scenario you described, UAs will act as if S1 allowed them to add support for whatever, even if it actually says you have to wait for an extension spec, because they will need to add support before the lifecycle of the extension spec completes.  The extension spec will then backfill and document existing practice so new UAs don't have to reverse-engineer.  Therefore I don't have a strong opinion on whether in our particular S1 we should take one or the other option.  I do care that once the backfill happens it gets referenced.

> but no one expects to re-open and re-publish old RFCs to add forward pointers
> to new RFCs that define extensions.

This is a huge problem with RFCs.  It's very common for people to read and implement an RFC without realizing that it's been obsoleted.

The fact that adding such a forward reference would involve any sort of concept of "re-opening" is a process issue that just needs to be addressed.  Obsolete things should get clearly marked obsolete, and "IETF doesn't do that" doesn't mean the W3C shouldn't.

> Note that W3C has offered to host this web-page and spec BTW. 

I'm not sure what the argument is about, then... ;)
Comment 45 Harry Halpin 2014-10-10 14:29:02 UTC
(In reply to Boris Zbarsky from comment #38)
> > By this logic, I might add, we can't ship Version 5 until we are done with
> > Version 6. 
> 
> By this logic, when you start work on version 6 you need to add a link in
> version 5 pointing out that it's no longer the most recent specification and
> that version 6 exists.  Assuming you were talking about versions of
> specifications.
 
Extension spec S2 would normatively reference S1. Why would it make sense to have S1 reference S2? 

> 
> If you were talking about versions of implementations, I have no idea what
> you were trying to say.
> 
> > No one is "pretending" S2 doesn't exist.
> 
> Please put yourselves in the shoes of an implementor who is told to
> "implement S1".  They find the S1 spec and implement.  How do they find out
> about S2?

Like I said, the W3C has offered to host either a web-page or wiki, which can be pointed at from S1, for these. That's pretty simple and solves your problem. 

> 
> > It seems like extension specs are needed in this case. 
> 
> I'm not against extension specs at all.  I'm not even against extension specs that
> are clearly labeled as "optional unless you want your UA to be usable in
> Russia"; in fact I'm quite for it.

Exactly.

> 
> I _am_ against non-discoverable extension specs.
> 
> Obviously I can't speak for Anne and Ryan here, but I suspect that they
> agree with me.

Agreed. We wanted extension specs to be discoverable. Ryan pushed against it for reasons which he has not clarified. 

> 
> > The argument is "People might think extension spec is implemented, realize it
> > isn't, and file a bug report."
> 
> No, the argument is "A browser implementor will think this S1 is implemented
> and ship it and sites will break in the browser because they assume that if
> S1 is implemented so is S2, otherwise they polyfill S1".

Why would sites assume S2 is implemented if it's clearly an extension?

> 
> As in, if S2 is in fact a required consequence of having S1, then by not
> making that clear you set up people implementing S1 to fail.

However, using Curve25519 and NUMS curves as examples, we are *not* saying they are required.

> 
> > If developers want to know what browsers definitely support an extension S2
> 
> Based on historical evidence, developers will just test their favorite
> browser or maybe two, and if those support S2 they will assume everyone who
> implements S1 does.  They are likely to not even realize that the things
> they're using are in S2 and not in S1, since developer-facing blogs,
> documentation, etc., are not likely to make that clear.

Given there is a vastly decreasing amount of browsers, I don't think this is a huge problem. Indeed, the problem is that some jurisdictions and applications *require* crypto that, for some reason or another, some browser vendor won't implement but another will. To discipline this, an extension spec mechanism makes sense.

Also, note the point made by Mark that at least his company is interested in using this API in JS programs that aren't browsers. 

> 
> > Thus, the pain of not having the ability to easily extend 
> 
> Adding a single forward-reference line to a spec does not seem like a high
> bar for "easily".

Yes, because the patent commitments go to the spec, and adding algorithms to the base spec would cause us to go back to Last Call.

That said, a reference from the main spec to *a wiki* or *another website* that forward-references all the other specs and their status/testing solves this problem rather easily.

> 
> On the other hand, I think you underestimate the pain of "ship a browser
> that causes pages to break because a spec was incomplete"; a pain we have
> ended up dealing with on a fairly regular basis with W3C specs and are
> somewhat familiar with.
> 
> Basically, you are prioritizing the comfort of the working group over
> implementors.  This is a direct reversal of
> <http://www.w3.org/TR/html-design-principles/#priority-of-constituencies>. 
> You're not bound by that design principle of course, being a different
> working group and all that, but I'll posit that design principle is still a
> good idea in general.  You might disagree, of course, in which case we
> really do have a fundamental disagreement.

No, we are not prioritizing the comfort of the working group. Certain groups, like *entire countries*, need crypto that is not included in the base spec for some reason or another. 

As a global standards body we can't ignore whole countries. I rank ordinary people above both the comfort of standards bodies and implementers to be honest. 

We will clearly define a browser profile that every developer can depend on. 

However, we do understand some browsers don't have the resources or desire to implement certain crypto (like Curve25519 or NUMS or SEED or GOST) that certain people (and again, whole countries) need. As long as specs for these are written and code works in at least one browser, I think extension specs are a good idea for tackling this issue without putting any additional weight on browser vendors. If the browser vendors decide later to implement extension specs, then we can update accordingly but no reason to hold up other browser vendors and the rest of W3C now.
Comment 46 Boris Zbarsky 2014-10-10 15:14:26 UTC
> Why would it make sense to have S1 reference S2? 

We've covered that already in this discussion.  Please just read what was already said instead of us repeating the same thing over and over?

> Why would sites assume S2 is implemented if it's clearly an extension?

Because the people who create sites don't read specs.  They read tutorials and copy/paste demos.

So when browser X implements S1+S2 and puts out a blog post saying "we implement S1!" with a demo that uses S1+S2, after that everyone assumes that "S1" includes the functionality that's actually in S2.

We've seen this play out over and over.  The fact that you seem unaware of it while being on W3C staff is somewhat disturbing to me.

> we are *not* saying they are required.

I meant "required" in the sense of "you are not web-compatible unless you implement it", not in a normative spec requirement sense.

I mean, you could go and say that support for document.write is optional, but it's still required to be implemented by a web browser that wants to actually, say, browse the web.

> Given there is a vastly decreasing amount of browsers

This fact is itself a problem.

To address this problem, and various other issues, new browser engines (Servo, but also others) are in fact being created.  Those are one of the primary customers of W3C specifications: they need the specs to avoid having to reverse-engineer Trident/Blink/WebKit/Firefox.

You may not care about whether creation of new browser engines is feasible, but I do.  Maybe you do too; in that case I don't understand this reluctance all along to make it more feasible.

> and adding algorithms to the base spec

I'm not requesting that algorithms be added to the base spec, and I don't see why you thought I was.  I'm glad we're clear on the fact that I'm not.  ;)

It sounds like you and I at least are in agreement: the base spec should reference an up-to-date list of extension specs.
Comment 47 Ryan Sleevi 2014-10-10 15:15:51 UTC
(In reply to Harry Halpin from comment #45)
> Extension spec S2 would normatively reference S1. Why would it make sense to
> have S1 reference S2? 
> 

This was answered two lines below in the message you were replying to.

> > 
> > If you were talking about versions of implementations, I have no idea what
> > you were trying to say.
> > 
> > > No one is "pretending" S2 doesn't exist.
> > 
> > Please put yourselves in the shoes of an implementor who is told to
> > "implement S1".  They find the S1 spec and implement.  How do they find out
> > about S2?
> 
> Like I said, the W3C has offered to host either a web-page or wiki, which
> can be pointed at from S1, for these. That's pretty simple and solves your
> problem. 

It doesn't solve the problem for a second.

One, the issue at hand is "Implement S1", except the Web now depends on "S1+S2" because some Vendor C implemented S2 and decided to ship it, and sites and authors came to believe that S1 (the Web Crypto API, in this case) also guarantees support for S2, because for some vendors, it does.

So now you have two problems:
1) How do authors realize that S1 and S2 are disjoint? (Answer: they don't, and this is true for nearly every implementor, no matter how much we try to spin it.)
2) How do implementors realize that sites are expecting S2 when they ask for S1? (Answer: again, plenty of experience here shows that they don't, until well after the fact.)

The question of "Put it in S1" vs "Put it in a wiki that is put in S1" is silly. It highlights the exact process issues being discussed in http://www.w3.org/community/w3process/track/issues/141

> Agreed. We wanted extension specs to be discoverable. Ryan pushed against it
> for reasons which he has not clarified. 

No, that's not really true. I've provided quite clear reasons why non-discoverable extension specs are a problem. I've also provided quite clear reasons why "random extension specs" are a problem, especially for the Web Platform (i.e., the IETF- and JOSE-developed extension specs).

> Why would sites assume S2 is implemented if it's clearly an extension?

Because that's exactly how the web and authors work? We're being pragmatic, rather than dogmatic, in that the process here continually leads to failure.

> > As in, if S2 is in fact a required consequence of having S1, then by not
> > making that clear you set up people implementing S1 to fail.
> 
> However, using Curve25519 and NUMs curves as examples, we are *not* saying
> they are required.

And yet, when a critical mass of UAs implement Curve25519 or NUMS, then _regardless_ of what the spec says, it is in fact required for a UA that wishes to expose the web.

This is what we've seen time and time again w/r/t HTML, which is why we have the WHATWG documenting the way the world _is_, not the way the W3C imagined the world should look. Because ultimately, the world we live in is the messy one where sites depend on these algorithms, and so any new UA, or any UA freshly implementing S1, needs to know that S2, despite being an "extension", is in effect "required" (the same as the majority of S1 already is).

> Given there is a vastly decreasing amount of browsers, I don't think this is
> a huge problem. Indeed, the problem is that some jurisdictions and
> applications *require* crypto that, for some reason or another, some browser
> vendor won't implement but another will. To discipline this, an extension
> spec mechanism makes sense.
> 
> Also, note the point made by Mark that at least his company is interested in
> using this API in JS programs that aren't browsers. 

Which just amplifies the problem that either the W3C and Web Crypto WG are set up to serve UAs for the web, or they aren't. If you're trying to be ecumenical for NodeJS, WinJS, <insert some language here that doesn't even use ES but uses WebIDL for reasons yet unknown>, then all you're doing is forcing UAs to go off on their own to focus on the things that matter.

This is how and why we got the WHATWG, and why it increasingly may be a better place for future development. UAs cannot effectively discuss what matters or is idiomatic for NodeJS or WinJS, nor, as has been repeatedly shown in this WG, can vendors of non-UAs be expected to effectively discuss what matters or is idiomatic for the Web.

> Yes, because the patent commitments go to the spec and adding algorithms to the
> base spec would cause us to go back to Last Call.

That's not a BUG. It's a FEATURE.

> No, we are not prioritizing the comfort of the working group. Certain
> groups, like *entire countries*, need crypto that is not included in the
> base spec for some reason or another. 

Put differently: You believe that the legislative fiat of independent countries should trump implementor or developer concerns.

Either the boutique crypto needs of these *entire countries* are of concern for implementors - in which case, it's an implementation issue - or they aren't, in which case discussions of them are in effect placing a higher priority on legislation than implementation.

That's not a good path.

> 
> As a global standards body we can't ignore whole countries. I rank ordinary
> people above both the comfort of standards bodies and implementers to be
> honest. 
> 
> We will clearly define a browser profile that every developer can depend on. 
> 
> However, we do understand some browsers don't have the resources or desire
> to implement certain crypto (like Curve25519 or NUMS or SEED or GOST) that
> certain people (and again, whole countries) need. As long as specs for these
> are written and code works in at least one browser, I think extension specs
> are a good idea for tackling this issue without putting any additional
> weight on browser vendors. If the browser vendors decide later to implement
> extension specs, then we can update accordingly but no reason to hold up
> other browser vendors and the rest of W3C now.

This ignores how the Web has worked, currently works, and, from implementors, how they desire it to work.

For the W3C to ignore these concerns or reality seems... less than ideal.
Comment 48 Harry Halpin 2014-10-10 15:29:06 UTC
(In reply to Ryan Sleevi from comment #47)
> (In reply to Harry Halpin from comment #45)
> > Extension spec S2 would normatively reference S1. Why would it make sense to
> > have S1 reference S2? 
> > 
> 
> This was answered two lines below in the message you were replying to.
> 
> > > 
> > > If you were talking about versions of implementations, I have no idea what
> > > you were trying to say.
> > > 
> > > > No one is "pretending" S2 doesn't exist.
> > > 
> > > Please put yourselves in the shoes of an implementor who is told to
> > > "implement S1".  They find the S1 spec and implement.  How do they find out
> > > about S2?
> > 
> > Like I said, the W3C has offered to host either a web-page or wiki, which
> > can be pointed at from S1, for these. That's pretty simple and solves your
> > problem. 
> 
> It doesn't solve the problem for a second.
> 
> One, the issue at hand is "Implement S1", except the Web now depends on
> "S1+S2" because some Vendor C implemented S2 and decided to ship it, and
> sites and authors came to believe that S1 (the Web Crypto API, in this case)
> also guarantees support for S2, because for some vendors, it does.

You are assuming the "entire Web" depends on S2. It is more realistic that certain classes of applications and countries depend on S2.

First of all, can you please clarify if this position is *your personal position* or an agreed-upon internal Google position?

Microsoft has made their agreed-upon position clear, and W3C would appreciate the same from Google. 

> 
> So now you have two problems:
> 1) How do authors realize that S1 and S2 are disjoint (answer: And this is
> true for nearly every implementor - is that they don't, no matter how much
> we try to spin it)

There are separate specs.

Again, cryptographic applications are not adding new shiny graphics to CSS that we expect everyone to implement.

Implementers are not idiots, particularly if they are using a library called "SubtleCrypto".

> 2) How do implementors realize that sites are expecting S2 when they ask for
> S1 (answer: Again, plenty of experience here is that they don't, until well
> after the fact)
> 
> The question of "Put it in S1" vs "Put it in a wiki that is put in S1" is
> silly. It highlights the exact process issues being discussed in
> http://www.w3.org/community/w3process/track/issues/141

No, it's not silly. It's a perfectly fine solution. I think not supporting Curve25519 or other specific crypto and then asking the rest of the world and other browser vendors not to specify it is not exactly a good idea.

Furthermore, you have changed your mind *several* times on this. Now, you are basically arguing *all algorithms* should be normative.

Can you explain why?

> 
> > Agreed. We wanted extension specs to be discoverable. Ryan pushed against it
> > for reasons which he has not clarified. 
> 
> No, that's not really true. I've provided quite clear reasons why
> non-discoverable extension specs are a problem. I've also provided quite
> clear reasons why "random extension specs" are a problem, especially for the
> Web Platform (i.e.. the 'IETF and JOSE developed extension specs')

Again, see above question. You can't have it both ways. Either you declare all algorithms are normative and MUST be implemented or you allow extension specs. Otherwise none of the examples you discuss hold.

> 
> > Why would sites assume S2 is implemented if it's clearly an extension?
> 
> Because that's exactly how the web and authors work? We're being pragmatic,
> rather than dogmatic, in that the process here continually leads to failure.
> 
> > > As in, if S2 is in fact a required consequence of having S1, then by not
> > > making that clear you set up people implementing S1 to fail.
> > 
> > However, using Curve25519 and NUMs curves as examples, we are *not* saying
> > they are required.
> 
> And yet, when a critical mass of UAs implement Curve25519 or NUMS, then
> _regardless_ of what the spec says, it is in fact required for a UA that
> wishes to expose the web.
> 
> This is what we've seen time and time again w/r/t HTML, which is why we have
> the WHATWG documenting the way the world _is_, not the way the W3C imagined
> the world should look like. Because ultimately, the world we live in is the
> messy one where sites depend on these algorithms, and so any new UA, or any
> UA freshly implementing S1, needs to know that S2, despite being
> "extension", is in effect "required" (the same as the majority of S1 is
> already)

Hey, I'm not the one blocking Curve25519, which developers do want; I believe you are. We are trying to have some agility so that as crypto changes and developers' needs change, we can let them have it. I think that's dealing with the messy world, actually.

> 
> > Given there is a vastly decreasing amount of browsers, I don't think this is
> > a huge problem. Indeed, the problem is that some jurisdictions and
> > applications *require* crypto that, for some reason or another, some browser
> > vendor won't implement but another will. To discipline this, an extension
> > spec mechanism makes sense.
> > 
> > Also, note the point made by Mark that at least his company is interested in
> > using this API in JS programs that aren't browsers. 
> 
> Which just amplifies the problem that either the W3C and Web Crypto WG are
> set up to serve UAs for the web, or they aren't. If you're trying to
> be ecumenical for NodeJS, WinJS, <insert some language here that doesn't
> even use ES but uses WebIDL for reasons yet unknown>, then all you're doing
> is forcing UAs to go off on their own to focus on the things that matter.
> 
> This is how and why we got the WHATWG, and why it increasingly may be a
> better place for future development. UAs cannot effectively discuss what
> matters or is idiomatic for NodeJS or WinJS, nor, as has been repeatedly
> shown in this WG, can vendors of non-UAs be expected to effectively discuss
> what matters or is idiomatic for the Web.
> 
> > Yes, because the patent commits go to the spec and adding algorithms to the
> > base spec would cause us to go back to Last Call.
> 
> That's not a BUG. It's a FEATURE.

Not if the editors are not fixing bugs on the timescale needed to push the spec out to developers with test suites.

> 
> > No, we are not prioritizing the comfort of the working group. Certain
> > groups, like *entire countries*, need crypto that is not included in the
> > base spec for some reason or another. 
> 
> Put differently: You believe that the legislative fiat of independent
> countries should trump implementor or developer concerns.

No, I believe the needs of developers and users outweigh the comfort of browser vendors and the W3C. It's interesting that you call the concern for non-NIST curves "boutique crypto". I think a lot of people would disagree.

> 
> Either the boutique crypto needs of these *entire countries* are of concerns
> for implementors - in which case, it's an implementation issue - or they
> aren't, in which case, discussions of them are in effect placing a higher
> priority on legislation than implementation.
> 
> That's not a good path.
> 
> > 
> > As a global standards body we can't ignore whole countries. I rank ordinary
> > people above both the comfort of standards bodies and implementers to be
> > honest. 
> > 
> > We will clearly define a browser profile that every developer can depend on. 
> > 
> > However, we do understand some browsers don't have the resources or desire
> > to implement certain crypto (like Curve25519 or NUMS or SEED or GOST) that
> > certain people (and again, whole countries) need. As long as specs for these
> > are written and code works in at least one browser, I think extension specs
> > are a good idea for tackling this issue without putting any additional
> > weight on browser vendors. If the browser vendors decide later to implement
> > extension specs, then we can update accordingly but no reason to hold up
> > other browser vendors and the rest of W3C now.
> 
> This ignores how the Web has worked, currently works, and, from
> implementors, how they desire it to work.
> 
> For the W3C to ignore these concerns or reality seems... less than ideal.

Ditto for any individual person. For implementors to ignore the needs of users isn't great either. 

That's why the W3C has a consensus-based process and clear governance.
Comment 49 Domenic Denicola 2014-10-10 15:32:11 UTC
With my TAG hat on, I would really like to second Ryan's point that this seems to highlight the issues described by Jeff Jaffe in http://www.w3.org/community/w3process/track/issues/141, and also those of Boris about the priority of constituencies.

This kind of thinking about extension specs seems symptomatic of the historical 'process and practice' issues Jeff talks about. In practice the web crypto spec needs to 'continuously evolve' (Jeff's phrase) to reflect the requirements for implementing a web-compatible browser, whereas the advocates of extension specs seem to argue from the point of view that actually updating the web crypto spec is too difficult for process reasons.

I'd strongly encourage the web crypto working group to lead the way on this W3C-wide initiative to improve working practices, by having a spec that reflects implementation realities and does not become stale. As such, algorithms required for web compatibility should not be put into other specs, but kept in the main, continuously-updated spec. The proposed "process hack" of (normatively?) linking to a wiki that documents the required extension algorithms that didn't make it in time for last call is just that: a hack. If there are process consequences for issuing this kind of "errata", that needs to be fixed instead of hacked around.

With my Google/implementer hat on, a continuously-updated standard with obvious requirements for what needs to be implemented, and how, is clearly preferable. This is why we use living standards and editors' drafts exclusively when implementing Chrome.
Comment 50 Harry Halpin 2014-10-10 15:43:33 UTC
Domenic,

I believe the issue you reference is *not* that there should be no extension specs.

To be precise, the issue is: *Improve Errata Management*.

"For reasons of process and practice, W3C working groups do not necessarily issue errata in an expeditious fashion. We should fix the W3C Process so that it encourages groups to consistently issue errata. We should also explore Best Practices that groups could adopt to improve their handling of this issue."

I agree with that. I would of course be happy to add errata to the spec pointing to any extensions that have made it through the W3C process.

However, currently the Web Crypto spec does *not* have a mandatory-to-implement list of algorithms.

You have two options here:

1) All algorithms are mandatory to implement. Thus, developers know exactly what algorithms to implement and there are no extensions. 

2) Some or no algorithms are mandatory to implement. Then, there may be extensions.

Ryan has switched his position from 2) to 1). 

In particular, in the case of 1) there is no way to add Curve25519 to the spec without having all browsers implement it and re-opening Last Call.

So, you gotta choose - logically you can't have 2) and then not allow extension specs (and yes, extension specs could be mentioned in errata and be easily discoverable).

Microsoft has already chosen 2). I'd like to know what Google's position is - not yours with a TAG hat on, nor Ryan's personal position.
Comment 51 Ryan Sleevi 2014-10-11 01:18:28 UTC
(In reply to Harry Halpin from comment #48)
> You are assuming "entire Web" depends on S2. It is more realistic certain
> classes of applications and countries depend on S2.
> 
> First of all, can you please clarify if this position is *your personal
> position* or an agreed-upon internal Google position?
> 
> Microsoft has made their agreed-upon position clear, and W3C would
> appreciate the same from Google. 

As a W3C staff contact, both the tone and the knowledge of the space you are representing are disappointing. It is well known within the W3C and its member organizations that this "entire web" distinction you make does not exist for implementors, and that assumption has caused great harm and damage to the web platform, harm that the WHATWG has been correcting.

Our position and preference for living specs has been made clear, repeatedly, within the W3C and the TAG. Beyond this, I don't think much more productive discussion can be had here with you.

> 
> > 
> > So now you have two problems:
> > 1) How do authors realize that S1 and S2 are disjoint (answer: And this is
> > true for nearly every implementor - is that they don't, no matter how much
> > we try to spin it)
> 
> There are separate specs.
> 
> Again, cryptographic applications are not adding new shiny graphics to CSS
> that we expect everyone to implement.
> 
> Particular implementers are not idiots, particularly if they are using a
> library that is called "SubtleCrypto". 

This isn't at all what I was saying. I encourage you to re-read Boris' thoughtful replies to your message, which have already spelled out the issues here.

This is not hypothetical. This is something we see time and time again - and which the W3C has made efforts to address, seeing that it had been supplanted by the WHATWG. This tone and response suggest that perhaps it's harder for the process to adjust to reality - the concerns Jeff Jaffe was talking about, which you've heard from Boris, Anne, and Domenic.


> Furthermore, you have changed your mind *several* times on this.
> Now, you are basically arguing *all algorithms* should be normative.
> 
> Can you explain why?
> 

I am not arguing this anymore with you, for the reasons I explain below.

> 
> Again, see above question. You can't have it both ways. Either you declare
> all algorithms are normative and MUST be implemented or you allow extension
> specs. Otherwise none of the examples you discuss hold.

You've so consistently misinterpreted this point that it's no longer productive to discuss. You've heard from both Boris and myself about the distinction between "spec required" and "implementation required". I've spent quite a bit of time explaining to you, both publicly and privately, how the normative requirements of the spec play out for implementors.

You've heard from other UAs and representatives to this same effect.

> 
> Hey, I'm not the one blocking Curve 25519, which developers do want; I
> believe you are. We are trying to have some agility so that, as crypto
> changes and developers need new things, we can let them have them. I think
> that's dealing with the messy world, actually. 

I'm telling you, with an implementation hat on, where Curve25519 sits for priorities.

Either we add it to a spec, which is then ignored, or we have the spec reflect reality.

> No, I believe the needs of developers and users outweigh the comfort of
> browser vendors and the W3C. It's interesting that you call the concern for
> non-NIST curves "boutique crypto". I think a lot of people would disagree.

You don't convince UAs to implement it by putting it in a spec. If anything, that's how you get UAs ignoring the W3C - when it fails to reflect the realities of the web, it is no longer relevant nor productive.

> 
> Ditto for any individual person. For implementors to ignore the needs of
> users isn't great either. 
> 
> That's why the W3C has a consensus-based process and clear governance.

Indeed. But that doesn't, for a second, mean that it produces specs relevant to UAs. The specs most driven by consensus, rather than implementation, reflect this - no one uses them.
Comment 52 Ryan Sleevi 2014-10-11 01:23:22 UTC
(In reply to Harry Halpin from comment #50)
> Domenic,
> 
> I believe the issue you reference is *not* there should be no extension
> specs.
> 
> To be precise, the issue is: *Improve Errata Management*.
> 
> "For reasons of process and practice, W3C working groups do not necessarily
> issue errata in an expeditious fashion. We should fix the W3C Process so
> that it encourages groups to consistently issue errata. We should also
> explore Best Practices that groups could adopt to improve their handling of
> this issue."
> 
> I agree with that. I would of course be happy to add errata to the spec
> pointing to any extensions that have made through the W3C process. 
> 
> However, currently the Web Crypto spec does *not* have a
> mandatory-to-implement list of algorithms.
> 
> You have two options here:
> 
> 1) All algorithms are mandatory to implement. Thus, developers know exactly
> what algorithms to implement and there are no extensions. 
> 
> 2) Some or no algorithms are mandatory to implement. Then, there may be
> extensions.
> 
> Ryan has switched his position from 2) to 1). 
> 
> In particular, for the case of 1) there is no way to add Curve 25519 to the
> spec without having all browsers implement it and re-opening Last Call. 
> 
> So, you have to choose - logically you can't have 2) and then not allow
> extension specs (and yes, extension specs could be mentioned in errata and
> be easily discoverable).
> 
> Microsoft has already chosen 2). I'd like to know what Google's position is,
> not you with a TAG hat or Ryan's personal position.

You are conflating two distinct issues. There is zero requirement to make something normative in order to improve the process. 

That is, you're distinctly ignoring option 3:

3) The spec (and errata, aka 'living spec') lists the algorithms. New algorithms are incorporated, via errata, into "The spec", without necessitating extension specs.

The issue of normative and profiles, which we've discussed at length and you very well know Google's position on, is addressed as an orthogonal and separate concern.

Also, from a W3C staff representative, the tone is not appreciated; this has been communicated privately before, but unfortunately the attempts to dismiss the concerns Google is bringing to you continue. Let's try to keep things positive and productive, please.
Comment 53 Harry Halpin 2014-10-11 09:11:04 UTC
(In reply to Ryan Sleevi from comment #52)
> (In reply to Harry Halpin from comment #50)
> > Domenic,
> > 
> > I believe the issue you reference is *not* there should be no extension
> > specs.
> > 
> > To be precise, the issue is: *Improve Errata Management*.
> > 
> > "For reasons of process and practice, W3C working groups do not necessarily
> > issue errata in an expeditious fashion. We should fix the W3C Process so
> > that it encourages groups to consistently issue errata. We should also
> > explore Best Practices that groups could adopt to improve their handling of
> > this issue."
> > 
> > I agree with that. I would of course be happy to add errata to the spec
> > pointing to any extensions that have made through the W3C process. 
> > 
> > However, currently the Web Crypto spec does *not* have a
> > mandatory-to-implement list of algorithms.
> > 
> > You have two options here:
> > 
> > 1) All algorithms are mandatory to implement. Thus, developers know exactly
> > what algorithms to implement and there are no extensions. 
> > 
> > 2) Some or no algorithms are mandatory to implement. Then, there may be
> > extensions.
> > 
> > Ryan has switched his position from 2) to 1). 
> > 
> > In particular, for the case of 1) there is no way to add Curve 25519 to the
> > spec without having all browsers implement it and re-opening Last Call. 
> > 
> > So, you have to choose - logically you can't have 2) and then not allow
> > extension specs (and yes, extension specs could be mentioned in errata
> > and be easily discoverable).
> > 
> > Microsoft has already chosen 2). I'd like to know what Google's position is,
> > not you with a TAG hat or Ryan's personal position.
> 
> You are conflating two distinct issues. There is zero requirement to make
> something normative in order to improve the process. 

I'm going to point out that Boris' objections about people expecting S2 hold in the current spec unless *all algorithms* are normative. That should be straightforward. 

> 
> That is, you're distinctly ignoring option 3:
> 
> 3) The spec (and errata, aka 'living spec') lists the algorithms. New
> algorithms are incorporated, via errata, into "The spec", without
> necessitating extension specs.

If this is rephrasing Richard Barnes's "we can add forward-links to extension specs in an errata", per Mozilla's rather reasonable proposal on the mailing list, then yes: at least Mark Watson and Mozilla agree we can go with that. That is fine per W3C process, as there is a well-defined procedure for errata; it solves the discoverability/maturity problem and gives the spec the agility that BAL requires. Thus, we have consensus. 

> 
> The issue of normative and profiles, which we've discussed at length and you
> very well know Google's position on, is addressed as an orthogonal and
> separate concern.
> 
> Also, from a W3C staff representative, the tone is not appreciated; this
> has been communicated privately before, but unfortunately the attempts to
> dismiss the concerns Google is bringing to you continue. Let's try to keep
> things positive and productive, please.


I think those questions I asked are exceedingly reasonable, as this is the last substantial blocking bug and Microsoft has been very clear about what they want. We have simply asked the same of you and Google. 

I think Richard's proposal solves this bug. Richard, can you please put what you think is the final version of your proposal in bugzilla to make sure we have consensus?
Comment 54 Harry Halpin 2014-10-11 09:20:46 UTC
(In reply to Ryan Sleevi from comment #51)
> (In reply to Harry Halpin from comment #48)
> > You are assuming the "entire Web" depends on S2. It is more realistic
> > that certain classes of applications and countries depend on S2.
> > 
> > First of all, can you please clarify if this position is *your personal
> > position* or an agreed-upon internal Google position?
> > 
> > Microsoft has made their agreed-upon position clear, and W3C would
> > appreciate the same from Google. 
> 
> As a W3C staff contact, both the tone and the knowledge of the space on
> display here are disappointing. It is well known within the W3C and its
> member organizations that this "entire web" distinction you make does not
> exist for implementors, and that assumption has caused great harm to the
> web platform, harm that the WHATWG has been working to correct.

To summarize, previously you wanted *everything* to be viewed as an extension and no normative algorithms. This is backed by other real-world vendors like Netflix who plan to implement, albeit not in a browser. However, given concerns from browser implementers, the process would be that a normative "browser profile" would be developed during CR to satisfy Boris' earlier concerns. 

We have to double-check that you haven't changed your minds toward making everything normative and disallowing extensions. 

If what you want is forward references, then extension specs can be listed in errata that the editors, the WG, and the W3C will commit to keeping up to date. 

In which case, I believe all problems are solved. We have forward-references so extension specs can be discovered, we have a clear "browser profile" so developers and users know what to expect, and we don't prematurely optimize our algorithms given the state of play in crypto. 

Unless you can see another way forward, I suggest we keep Richard's proposal as consensus and go forward out of Last Call with that as the resolution.

> 
> Our position and preference for living specs have been made clear,
> repeatedly, within the W3C and the TAG. Beyond this, I don't think much
> more productive discussion can be had here with you. 
> 
> > 
> > > 
> > > So now you have two problems:
> > > 1) How do authors realize that S1 and S2 are disjoint? (The answer -
> > > and this is true for nearly every implementor - is that they don't,
> > > no matter how much we try to spin it.)
> > 
> > There are separate specs.
> > 
> > Again, cryptographic applications are not adding new shiny graphics to CSS
> > that we expect everyone to implement.
> > 
> > Particular implementers are not idiots, particularly if they are using a
> > library that is called "SubtleCrypto". 
> 
> This isn't at all what I was saying. I encourage you to re-read Boris'
> thoughtful replies to your message, which have already spelled out the
> issues here.
> 
> This is not hypothetical. This is something we see time and time again,
> and something the W3C has made efforts to address, having seen itself
> supplanted by the WHATWG. This tone and response suggest that perhaps it's
> hard for the process to adjust to that reality; these are the concerns
> Jeff Jaffe was talking about, which you've heard from Boris, Anne, and
> Domenic as well.
> 
> 
> > Furthermore, you have changed your mind *several* times on this. Now,
> > you are basically arguing *all algorithms* should be normative.
> > 
> > Can you explain why?
> > 
> 
> I am not arguing this anymore with you, for the reasons I explain below.
> 
> > 
> > Again, see above question. You can't have it both ways. Either you declare
> > all algorithms are normative and MUST be implemented or you allow extension
> > specs. Otherwise none of the examples you discuss hold.
> 
> You've so consistently misinterpreted this point that it's no longer
> productive to discuss. You've heard from both Boris and myself about the
> distinction between "spec required" and "implementation required". I've
> spent quite a bit of time explaining to you, both publicly and privately,
> how the normative requirements of the spec play out for implementors.
> 
> You've heard from other UAs and representatives to this same effect.
> 
> > 
> > Hey, I'm not the one blocking Curve 25519, which developers do want; I
> > believe you are. We are trying to have some agility so that, as crypto
> > changes and developers need new things, we can let them have them. I
> > think that's dealing with the messy world, actually. 
> 
> I'm telling you, with an implementation hat on, where Curve25519 sits for
> priorities.
> 
> Either we add it to a spec, which is then ignored, or we have the spec
> reflect reality.
> 
> > No, I believe the needs of developers and users outweigh the comfort of
> > browser vendors and the W3C. It's interesting that you call the concern
> > for non-NIST curves "boutique crypto". I think a lot of people would
> > disagree.
> 
> You don't convince UAs to implement it by putting it in a spec. If anything,
> that's how you get UAs ignoring the W3C - when it fails to reflect the
> realities of the web, it is no longer relevant nor productive.
> 
> > 
> > Ditto for any individual person. For implementors to ignore the needs of
> > users isn't great either. 
> > 
> > That's why the W3C has a consensus-based process and clear governance.
> 
> Indeed. But that doesn't, for a second, mean that it produces specs relevant
> to UAs. The specs most driven by consensus, rather than implementation,
> reflect this - no one uses them.
Comment 55 Mike Jones 2014-10-11 17:30:58 UTC
(In reply to Harry Halpin from comment #54)
> Unless you can see another way forward, I suggest we keep Richard's proposal
> as consensus and go forward out of Last Call with that as the resolution.

Richard's proposal ignores the requirement that it be possible to define extensions without modifying the base spec, and so can't be used as the basis for consensus.

If, instead, the base spec *referenced* a Web page run by the W3C that could be updated as extensions are created, providing pointers to the extension specs for developers, that would solve the problem.  Harry, I believe you've previously offered to do that.  Let's go with that solution.
Comment 56 virginie.galindo 2014-10-14 15:30:32 UTC
A proposal from the chair and W3C staff to progress on this bug, based on Richard Barnes proposal. 
Read : http://lists.w3.org/Archives/Public/public-webcrypto/2014Oct/0119.html

Virginie
Comment 57 Mark Watson 2014-10-15 23:40:41 UTC
I have implemented the revised proposal from Richard in the Editor's draft:

https://dvcs.w3.org/hg/webcrypto-api/rev/4677d99c9a2e
https://dvcs.w3.org/hg/webcrypto-api/rev/ae06638b018b
https://dvcs.w3.org/hg/webcrypto-api/rev/d9b3d6f2d930
https://dvcs.w3.org/hg/webcrypto-api/rev/160514715d11
https://dvcs.w3.org/hg/webcrypto-api/rev/1499886c4da3
https://dvcs.w3.org/hg/webcrypto-api/rev/10c01a8e208e
https://dvcs.w3.org/hg/webcrypto-api/rev/d17c700dd816

This proposal considerably limits elliptic-curve extensibility to curves that share various similarities with the existing NIST curves. Closing this bug should be dependent on resolving that issue, which I have explained in more detail on the list:

http://lists.w3.org/Archives/Public/public-webcrypto/2014Oct/0128.html
Comment 58 Colin Gallagher 2014-10-17 08:24:22 UTC
Please see my comment(s) 4 and 5, shown here:
https://www.w3.org/Bugs/Public/show_bug.cgi?id=24444#c4
Thank you
Comment 59 Mark Watson 2014-10-17 22:53:21 UTC
Following the list discussion, I have slightly generalized the curve extensibility for ECDSA and ECDH:

https://dvcs.w3.org/hg/webcrypto-api/rev/2eecd936e1e8
https://dvcs.w3.org/hg/webcrypto-api/rev/24e0e32852f8

According to my understanding, this would allow curve25519 and other curves to be added to the existing ECDSA and ECDH algorithms through extension specifications.

Please also remember that even if a proposed curve does not fit these extensibility points, a new algorithm can always be added (and probably should be, if the curve is so different that it does not fit the newly generalized extension points).
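To make the extension model above concrete, here is a hedged sketch of how a page might request an extension-defined curve through the existing ECDH algorithm. The generateKey call shape follows the standard Web Crypto API, but "curve25519" as a NamedCurve value is an assumption: the base spec does not register that name, an extension specification would.

```javascript
// Hypothetical parameters: "curve25519" is an assumed extension-registered
// curve name, not one defined by the base Web Crypto spec.
const keyGenParams = {
  name: "ECDH",
  namedCurve: "curve25519",
};

// Feature-detect rather than assume support: an implementation that has not
// adopted the extension spec is expected to reject the request.
async function tryGenerateExtensionCurve(subtle) {
  try {
    return await subtle.generateKey(keyGenParams, true, ["deriveBits"]);
  } catch (e) {
    return null; // curve not supported by this implementation
  }
}
```

The point of the extension design is exactly this: callers pass a curve name through the existing algorithm, and unsupporting implementations fail cleanly rather than requiring a new API surface.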
Comment 60 Colin Gallagher 2014-10-17 23:06:13 UTC
Thank you. I look forward to additional information ensuring that curve25519 and in-browser secp256k1 support, per prior discussions and suggestions, will be included in the Named Curve dictionary as deliberations on the Web Cryptography API document and its extensibility/errata proceed.

Respect,

Colin
Comment 61 Mike Jones 2014-10-18 01:53:19 UTC
The JWK elliptic curve language has been generalized to allow new curves to be registered that represent their keys using only an “x” value, rather than always requiring “x” and “y” values.  See http://tools.ietf.org/html/draft-ietf-jose-json-web-algorithms-35#section-6.2.1 and 6.2.1.1.

I’m calling this out to WebCrypto because I believe this opens the door for new curves for WebCrypto to have the same flexibility.
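The generalization described above can be pictured with a sketch like the following. All coordinate values are placeholders rather than real key material, and "curve25519" as a "crv" registration is an assumption for illustration, not something either document defines.

```javascript
// A conventional Weierstrass-curve JWK carries both coordinates.
const weierstrassJwk = {
  kty: "EC",
  crv: "P-256",
  x: "base64url-x-coordinate", // placeholder, not real key material
  y: "base64url-y-coordinate", // placeholder
};

// Under the generalized JWA text, a newly registered curve may represent
// its public key with only an "x" value.
const xOnlyJwk = {
  kty: "EC",         // assumption: the new curve reuses the EC key type
  crv: "curve25519", // hypothetical "crv" registration
  x: "base64url-u-coordinate", // placeholder
};

// A consumer can branch on the presence of "y" when importing a key.
function coordinateCount(jwk) {
  return "y" in jwk ? 2 : 1;
}
```

This is the flexibility being called out: a WebCrypto extension spec for an x-only curve could reuse the JWK import/export machinery without inventing a second key format.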
Comment 62 Mark Watson 2014-10-22 20:39:29 UTC
Based on the meeting discussion and comment #61, the remaining issue here is to delegate decoding of the private key data structure to extension specifications for ECDSA and ECDH. This is fixed in:

https://dvcs.w3.org/hg/webcrypto-api/rev/b71fc3eaf6db
https://dvcs.w3.org/hg/webcrypto-api/rev/be599c620546