
Bug 27402 - Specify the behavior when returning an octet string with a particular _bit_ length
Summary: Specify the behavior when returning an octet string with a particular _bit_ length
Status: RESOLVED MOVED
Alias: None
Product: Web Cryptography
Classification: Unclassified
Component: Web Cryptography API Document
Version: unspecified
Hardware: PC Linux
Importance: P2 normal
Target Milestone: ---
Assignee: Ryan Sleevi
QA Contact:
URL:
Whiteboard:
Keywords:
Depends on:
Blocks:
 
Reported: 2014-11-21 19:53 UTC by Eric Roman
Modified: 2016-05-23 22:53 UTC
CC List: 2 users

See Also:


Attachments

Description Eric Roman 2014-11-21 19:53:08 UTC
There are a few places in the spec where octet strings are used either as input or as output, but not all bits in the string are relevant.

For instance (sketched in code after this list):
  * Importing an HMAC key whose length is not a multiple of 8 bits
  * Exporting an HMAC key whose length is not a multiple of 8 bits
  * Deriving bits for ECDH, using a length that is not a multiple of 8 bits
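
To make those concrete, here is a minimal sketch of where such lengths surface in the API. It assumes a browser-style crypto.subtle, runs inside an async function, and uses peerPublicKey and myPrivateKey as hypothetical, already-established ECDH CryptoKey objects (the latter created with the "deriveBits" usage):

  // Import an HMAC key whose declared length (1 bit) is not a multiple of 8.
  const hmacKey = await crypto.subtle.importKey(
      "raw",
      new Uint8Array([0xff]),
      { name: "HMAC", hash: "SHA-256", length: 1 },
      true,                                // extractable, so it can be exported
      ["sign", "verify"]);

  // Export it again; the spec does not say what the 7 unused bits contain.
  const exported = new Uint8Array(await crypto.subtle.exportKey("raw", hmacKey));

  // Derive 12 bits with ECDH; the low 4 bits of the second octet are unused.
  const bits12 = new Uint8Array(await crypto.subtle.deriveBits(
      { name: "ECDH", public: peerPublicKey },
      myPrivateKey,
      12));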

The spec is ambiguous about how exactly this mechanism works, which could lead to implementation incompatibilities if users come to rely on the behavior chosen by a particular implementation.

For instance consider these scenarios:

  * Import an HMAC key using data = [0xff] and length=1 bit. When exporting that key, implementations could return any of the following key values:

   [0xff] (the exact octet stream imported)
   [0x80] (the unused bits having been zeroed out)
   [0x84] (or any other value whose first bit is one, with the unused bits arbitrary)

  * When importing an HMAC key whose unused bits are not zero, should this be treated as an error, to catch potential misuse?

  * When deriving 12 bits for ECDH, it is natural for an implementation to return the same thing as when deriving 16 bits. However, there is nothing in the spec that mandates this; if another implementation instead zeroed out those last 4 bits, users who had come to rely on the first behavior would see incompatible results (see the sketch below).
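
The last scenario can be made concrete by deriving 16 bits from the same hypothetical key pair used in the sketch above and comparing the result with bits12:

  // Derive 16 bits from the same ECDH key pair as bits12 above.
  const bits16 = new Uint8Array(await crypto.subtle.deriveBits(
      { name: "ECDH", public: peerPublicKey },
      myPrivateKey,
      16));

  // bits12[0] === bits16[0] is clearly expected, but nothing in the spec says
  // whether bits12[1] === bits16[1] (unused bits carried through) or
  // bits12[1] === (bits16[1] & 0xf0) (unused bits zeroed).
  console.log(bits12[1], bits16[1] & 0xf0);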


My recommendation is to mandate that, when returning an octet string, the unused bits be set to zero.
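
For illustration only, that rule amounts to masking the trailing unused bits before the octet string is handed back. maskUnusedBits below is a hypothetical helper, not part of the API:

  // Zero the unused trailing bits of an octet string that carries
  // bitLength significant bits (counted from the most significant bit
  // of the first octet).
  function maskUnusedBits(bytes, bitLength) {
      const out = bytes.slice(0, Math.ceil(bitLength / 8));
      const unused = out.length * 8 - bitLength;   // number of unused low bits
      if (unused > 0) {
          out[out.length - 1] &= 0xff << unused;   // clear them
      }
      return out;
  }

  // maskUnusedBits(new Uint8Array([0xff]), 1)        -> Uint8Array [0x80]
  // maskUnusedBits(new Uint8Array([0xab, 0xcd]), 12) -> Uint8Array [0xab, 0xc0]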
Comment 1 Mark Watson 2016-05-23 22:53:26 UTC
Moved to https://github.com/w3c/webcrypto/issues/31