<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<!DOCTYPE bugzilla SYSTEM "https://www.w3.org/Bugs/Public/page.cgi?id=bugzilla.dtd">

<bugzilla version="5.0.4"
          urlbase="https://www.w3.org/Bugs/Public/"
          
          maintainer="sysbot+bugzilla@w3.org"
>

    <bug>
          <bug_id>27402</bug_id>
          
          <creation_ts>2014-11-21 19:53:08 +0000</creation_ts>
          <short_desc>Specify the behavior when returning an octet string with a particular _bit_ length</short_desc>
          <delta_ts>2016-05-23 22:53:26 +0000</delta_ts>
          <reporter_accessible>1</reporter_accessible>
          <cclist_accessible>1</cclist_accessible>
          <classification_id>1</classification_id>
          <classification>Unclassified</classification>
          <product>Web Cryptography</product>
          <component>Web Cryptography API Document</component>
          <version>unspecified</version>
          <rep_platform>PC</rep_platform>
          <op_sys>Linux</op_sys>
          <bug_status>RESOLVED</bug_status>
          <resolution>MOVED</resolution>
          
          
          <bug_file_loc></bug_file_loc>
          <status_whiteboard></status_whiteboard>
          <keywords></keywords>
          <priority>P2</priority>
          <bug_severity>normal</bug_severity>
          <target_milestone>---</target_milestone>
          
          
          <everconfirmed>1</everconfirmed>
          <reporter name="Eric Roman">ericroman</reporter>
          <assigned_to name="Ryan Sleevi">sleevi</assigned_to>
          <cc>public-webcrypto</cc>
    
    <cc>watsonm</cc>
          
          

      

      

      

          <comment_sort_order>oldest_to_newest</comment_sort_order>  
          <long_desc isprivate="0" >
    <commentid>115314</commentid>
    <comment_count>0</comment_count>
    <who name="Eric Roman">ericroman</who>
    <bug_when>2014-11-21 19:53:08 +0000</bug_when>
    <thetext>There are a few places in the spec in which octet strings are used either as input or as output, yet not all bits in the string are relevant.

For instance:
  * Importing an HMAC key where the length is not a multiple of 8 bits
  * Exporting an HMAC key whose length is not a multiple of 8 bits
  * Deriving bits for ECDH, using a length that is not a multiple of 8 bits

The spec is ambiguous on how exactly that mechanism works. This could lead to implementation incompatibilities if users rely on the behavior chosen by a particular implementation.

For instance consider these scenarios:

  * Import an HMAC key using data = [0xff] and length=1 bit. When exporting that key, implementations could return any of the following key values:

   [0xff] (the exact octet stream imported)
   [0x80] (the unused bits having been zeroed out)
   [0x84] (or any other value whose first bit is set, the unused bits being arbitrary)

  * When importing an HMAC key whose unused bits are not zero, we could consider treating this as an error, to catch potential misuse.

  * When deriving 12 bits for ECDH, it is natural for an implementation to return the same thing as if deriving 16 bits. However, nothing in the spec mandates this. If one implementation zeroed out those last 4 bits while users had become reliant on another implementation's behavior, content would break between implementations.
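
The zero-fill approach can be sketched roughly as follows (maskUnusedBits is a hypothetical helper for illustration, not spec text; arithmetic is used in place of bitwise operators only so the snippet stays valid inside this XML export):

```javascript
// Hypothetical helper, not spec text: zero the unused trailing bits of
// an octet string that carries only bitLength significant bits.
function maskUnusedBits(bytes, bitLength) {
  const out = Uint8Array.from(bytes);
  const usedBits = bitLength % 8;
  if (usedBits !== 0) {
    const last = Math.ceil(bitLength / 8) - 1;
    // Clear the low-order (8 - usedBits) bits of the final octet by
    // subtracting the remainder modulo 2^(8 - usedBits).
    const step = Math.pow(2, 8 - usedBits);
    out[last] = out[last] - (out[last] % step);
  }
  return out;
}

// Import [0xff] with length=1: only the first bit is significant, so a
// zero-masking implementation would export [0x80].
maskUnusedBits([0xff], 1); // yields Uint8Array [0x80]
```

Under this rule, deriving 12 bits would return the first 16-bit derivation with its last 4 bits cleared, e.g. [0xab, 0xcd] becomes [0xab, 0xc0].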


My recommendation is to mandate that unused bits when returning an octet string should be set to zero.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>126491</commentid>
    <comment_count>1</comment_count>
    <who name="Mark Watson">watsonm</who>
    <bug_when>2016-05-23 22:53:26 +0000</bug_when>
    <thetext>Moved to https://github.com/w3c/webcrypto/issues/31</thetext>
  </long_desc>
      
      

    </bug>

</bugzilla>