BBS Cryptosuite v2023

Securing Verifiable Credentials with Selective Disclosure using BBS Signatures

W3C Working Draft

This version:
https://www.w3.org/TR/2023/WD-vc-di-bbs-20231104/
Latest published version:
https://www.w3.org/TR/vc-di-bbs/
Latest editor's draft:
https://w3c.github.io/vc-di-bbs/
History:
https://www.w3.org/standards/history/vc-di-bbs/
Commit history
Editors:
Greg Bernstein (Invited Expert)
Manu Sporny (Digital Bazaar)
Feedback:
GitHub w3c/vc-di-bbs (pull requests, new issue, open issues)

Abstract

This specification describes a Data Integrity Cryptosuite for use when generating digital signatures using the BBS signature scheme. The Signature Suite utilizes BBS signatures to provide selective disclosure and unlinkable derived proofs.

Status of This Document

This section describes the status of this document at the time of its publication. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at https://www.w3.org/TR/.

This is an experimental specification and is undergoing regular revisions. It is not fit for production deployment.

This document was published by the Verifiable Credentials Working Group as a Working Draft using the Recommendation track.

Publication as a Working Draft does not imply endorsement by W3C and its Members.

This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.

This document was produced by a group operating under the W3C Patent Policy. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.

This document is governed by the 12 June 2023 W3C Process Document.

1. Introduction

This specification defines a cryptographic suite for the purpose of creating, verifying, and deriving proofs using the BBS Signature Scheme in conformance with the Data Integrity [VC-DATA-INTEGRITY] specification. The BBS signature scheme directly provides for selective disclosure and unlinkable proofs. It provides four high-level functions that work within the issuer, holder, verifier model. Specifically, an issuer uses the BBS Sign function to create a cryptographic value known as a "BBS signature" which is used in signing the original credential. A holder, on receipt of a credential signed with BBS, then verifies the credential with the BBS Verify function.

The holder then chooses information to selectively disclose from the received credential and uses the BBS ProofGen function to generate a cryptographic value, known as a "BBS proof", which is used in creating a proof for this "derived credential". The cryptographic "BBS proof" value is not linkable to the original "BBS signature" and a different, unlinkable "BBS proof" can be generated by the holder for additional "derived credentials", including any containing the exact same information. Finally, a verifier uses the BBS ProofVerify function to verify the derived credential received from the holder.

Applying the BBS signature scheme to verifiable credentials involves the processing specified in this document. In general the suite uses the RDF Dataset Normalization Algorithm [RDF-DATASET-NORMALIZATION] to transform an input document into its canonical form. An issuer then uses selective disclosure primitives to separate the canonical form into mandatory and non-mandatory statements. These are processed separately with other information to serve as the inputs to the BBS Sign function along with appropriate key material. This output is used to generate a secured credential. A holder uses a set of selective disclosure functions and the BBS Verify function on receipt of the credential to ascertain validity.

Similarly, on receipt of a BBS signed credential, a holder uses the RDF Dataset Normalization Algorithm [RDF-DATASET-NORMALIZATION] to transform an input document into its canonical form, and then applies selective disclosure primitives to separate the canonical form into mandatory and selectively disclosed statements, which are appropriately processed and serve as inputs to the BBS ProofGen function. Suitably processed, the output of this function becomes the signed selectively disclosed credential sent to a verifier. Using canonicalization and selective disclosure primitives, the verifier can then use the BBS ProofVerify function to validate the credential.

1.1 Terminology

This section defines the terms used in this specification. A link to these terms is included whenever they appear in this specification.

data integrity proof
A set of attributes that represent a digital proof and the parameters required to verify it.
private key
Cryptographic material that can be used to generate digital proofs.
challenge
A random or pseudo-random value used by some authentication protocols to mitigate replay attacks.
domain
A string value that specifies the operational domain of a digital proof. This could be an Internet domain name like example.com, an ad-hoc value such as mycorp-level3-access, or a very specific transaction value like 8zF6T8J34qP3mqP. A signer could include a domain in its digital proof to restrict its use to a particular target, identified by the specified domain.
cryptographic suite
A specification defining the usage of specific cryptographic primitives in order to achieve a particular security goal. These documents are often used to specify verification methods, digital signature types, their identifiers, and other related properties.
decentralized identifier (DID)
A globally unique persistent identifier that does not require a centralized registration authority and is often generated and/or registered cryptographically. The generic format of a DID is defined in [DID-CORE]. Many—but not all—DID methods make use of distributed ledger technology (DLT) or some other form of decentralized network.
controller
An entity that has the capability to make changes to a controller document.
controller document
A set of data that specifies one or more relationships between a controller and a set of data, such as a set of public cryptographic keys.
subject
The entity identified by the id property in a controller document. Anything can be a subject: person, group, organization, physical thing, digital thing, logical thing, etc.
distributed ledger (DLT)
A non-centralized system for recording events. These systems establish sufficient confidence for participants to rely upon the data recorded by others to make operational decisions. They typically use distributed databases where different nodes use a consensus protocol to confirm the ordering of cryptographically signed transactions. The linking of digitally signed transactions over time often makes the history of the ledger effectively immutable.
verifier
A role an entity performs by receiving data containing one or more data integrity proofs and then determining whether or not the proof is valid.
verifiable credential
A standard data model and representation format for expressing cryptographically-verifiable digital credentials, as defined by the W3C Verifiable Credentials specification [VC-DATA-MODEL-2.0].
verification method

A set of parameters that can be used together with a process to independently verify a proof. For example, a cryptographic public key can be used as a verification method with respect to a digital signature; in such usage, it verifies that the signer possessed the associated cryptographic private key.

"Verification" and "proof" in this definition are intended to apply broadly. For example, a cryptographic public key might be used during Diffie-Hellman key exchange to negotiate a shared symmetric key for encryption. This guarantees the integrity of the key agreement process. It is thus another type of verification method, even though descriptions of the process might not use the words "verification" or "proof."

1.2 Conformance

As well as sections marked as non-normative, all authoring guidelines, diagrams, examples, and notes in this specification are non-normative. Everything else in this specification is normative.

The key words MAY, MUST, MUST NOT, and SHOULD in this document are to be interpreted as described in BCP 14 [RFC2119] [RFC8174] when, and only when, they appear in all capitals, as shown here.

A conforming proof is any concrete expression of the data model that complies with the normative statements in this specification. Specifically, all relevant normative statements in Sections 2. Data Model and 3. Algorithms of this document MUST be enforced.

A conforming processor is any algorithm realized as software and/or hardware that generates or consumes a conforming proof. Conforming processors MUST produce errors when non-conforming documents are consumed.

This document contains examples of JSON and JSON-LD data. Some of these examples are invalid JSON, as they include features such as inline comments (//) explaining certain portions and ellipses (...) indicating the omission of information that is irrelevant to the example. Such parts need to be removed if implementers want to treat the examples as valid JSON or JSON-LD.

2. Data Model

The following sections outline the data model that is used by this specification for verification methods and data integrity proof formats.

2.1 Verification Methods

These verification methods are used to verify Data Integrity Proofs [VC-DATA-INTEGRITY] produced using BLS12-381 cryptographic key material that is compliant with [CFRG-BBS-SIGNATURE]. The encoding formats for these key types are provided in this section. Lossless cryptographic key transformation processes that result in equivalent cryptographic key material MAY be used during the processing of digital signatures.

2.1.1 Multikey

The Multikey format, as defined in [VC-DATA-INTEGRITY], is used to express public keys for the cryptographic suites defined in this specification.

The publicKeyMultibase property represents a Multibase-encoded Multikey expression of a BLS12-381 public key in the G2 group. The encoding of this field is the two-byte prefix 0xeb01 followed by the 96-byte compressed public key data. The resulting 98-byte value is then encoded using the base58-btc alphabet, with z as the multibase prefix. Any other encodings MUST NOT be allowed.

Developers are advised to not accidentally publish a representation of a private key. Implementations of this specification will raise errors in the event of a [MULTICODEC] value other than 0xeb01 being used in a publicKeyMultibase value.
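
The following non-normative Python sketch illustrates the encoding rules above by decoding a publicKeyMultibase value and checking the Multicodec prefix. The minimal base58-btc decoder is included only for illustration; production implementations would use a maintained multibase library.

B58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def b58decode(s: str) -> bytes:
    # Interpret the string as a big-endian base-58 number; leading "1"s are zero bytes.
    num = 0
    for ch in s:
        num = num * 58 + B58_ALPHABET.index(ch)
    body = num.to_bytes((num.bit_length() + 7) // 8, "big")
    return b"\x00" * (len(s) - len(s.lstrip("1"))) + body

def decode_bls12381_g2_multikey(public_key_multibase: str) -> bytes:
    if not public_key_multibase.startswith("z"):
        raise ValueError("expected multibase base58-btc prefix 'z'")
    raw = b58decode(public_key_multibase[1:])
    if raw[:2] != bytes([0xeb, 0x01]):
        raise ValueError("expected Multicodec prefix 0xeb01 (BLS12-381 G2 public key)")
    public_key = raw[2:]
    if len(public_key) != 96:
        raise ValueError("expected a 96-byte compressed G2 public key")
    return public_key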

Example 1: A BLS12-381 G2 group public key, encoded as a Multikey
{
  "id": "https://example.com/issuer/123#key-0",
  "type": "Multikey",
  "controller": "https://example.com/issuer/123",
  "publicKeyMultibase": "zUC7EK3ZakmukHhuncwkbySmomv3FmrkmS36E4Ks5rsb6VQSRpoCrx6
  Hb8e2Nk6UvJFSdyw9NK1scFXJp21gNNYFjVWNgaqyGnkyhtagagCpQb5B7tagJu3HDbjQ8h
  5ypoHjwBb"
}
Example 2: A BLS12-381 G2 group public key, encoded as a Multikey in a controller document
{
  "@context": [
    "https://www.w3.org/ns/did/v1",
    "https://w3id.org/security/data-integrity/v1"
  ],
  "id": "https://example.com/issuer/123",
  "verificationMethod": [{
    "id": "https://example.com/issuer/123#key-1",
    "type": "Multikey",
    "controller": "https://example.com/issuer/123",
    "publicKeyMultibase": "zUC7EK3ZakmukHhuncwkbySmomv3FmrkmS36E4Ks5rsb6VQSRpoCr
    x6Hb8e2Nk6UvJFSdyw9NK1scFXJp21gNNYFjVWNgaqyGnkyhtagagCpQb5B7tagJu3HDbjQ8h
    5ypoHjwBb"
  }]
}

2.2 Proof Representations

This suite relies on detached digital signatures represented using [MULTIBASE] and [MULTICODEC].

2.2.1 DataIntegrityProof

The verificationMethod property of the proof MUST be a URL. Dereferencing the verificationMethod MUST result in an object containing a type property with the value set to Multikey.

The type property of the proof MUST be DataIntegrityProof.

The cryptosuite property of the proof MUST be bbs-2023.

The created property of the proof MUST be an [XMLSCHEMA11-2] formatted date string.

The proofPurpose property of the proof MUST be a string, and MUST match the verification relationship expressed by the verification method controller.

The value of the proofValue property of the proof MUST be a BBS signature or BBS proof produced according to [CFRG-BBS-SIGNATURE], then serialized and encoded according to the procedures in Section 3. Algorithms.

3. Algorithms

The following algorithms describe how to use verifiable credentials with the BBS Signature Scheme [CFRG-BBS-SIGNATURE]. When using the BBS signature scheme, the SHAKE-256 variant SHOULD be used.

Implementations SHOULD fetch and cache verification method information as early as possible when adding or verifying proofs. Parameters passed to functions in this section use information from the verification method — such as the public key size — to determine function parameters — such as the cryptographic hashing algorithm.

When the RDF Dataset Canonicalization Algorithm [RDF-CANON] is used, implementations of that algorithm will detect dataset poisoning by default, and abort processing upon detection.

3.1 Selective Disclosure Functions

3.1.1 createShuffledIdLabelMapFunction

The following algorithm creates a label map factory function that uses an HMAC to shuffle canonical blank node identifiers. The required input is an HMAC (previously initialized with a secret key), HMAC. A function, labelMapFactoryFunction, is produced as output.

  1. Create a function, labelMapFactoryFunction, with one required input (a canonical node identifier map, canonicalIdMap), that will return a blank node identifier map, bnodeIdMap, as output. Set the function's implementation to:
    1. Generate a new empty bnode identifier map, bnodeIdMap.
    2. For each map entry, entry, in canonicalIdMap:
      1. Perform an HMAC operation on the canonical identifier from the value in entry to get an HMAC digest, digest.
      2. Generate a new string value, b64urlDigest, and initialize it to "u" followed by appending a base64url-no-pad encoded version of the digest value.
      3. Add a new entry, newEntry, to bnodeIdMap using the key from entry and b64urlDigest as the value.
    3. Derive the shuffled mapping from the bnodeIdMap as follows:
      1. Set hmacIds to be the sorted array of values from the bnodeIdMap, and set bnodeKeys to be the ordered array of keys from the bnodeIdMap.
      2. For each key in bnodeKeys, replace the bnodeIdMap value for that key with the index position of the value in the hmacIds array prefixed by "b", i.e., bnodeIdMap.set(bkey, 'b' + hmacIds.indexOf(bnodeIdMap.get(bkey))).
    4. Return bnodeIdMap.
  2. Return labelMapFactoryFunction.
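
A non-normative Python sketch of the steps above is shown below. Python's hmac module does not support extendable-output functions, so HMAC-SHA-256 stands in here for the HMAC required by this suite; the choice of hash is an illustrative assumption, not part of the algorithm.

import base64, hashlib, hmac

def create_shuffled_id_label_map_function(hmac_key: bytes):
    def label_map_factory_function(canonical_id_map: dict) -> dict:
        bnode_id_map = {}
        for key, canonical_id in canonical_id_map.items():
            # HMAC the canonical identifier (SHA-256 used as a stand-in hash).
            digest = hmac.new(hmac_key, canonical_id.encode("utf-8"), hashlib.sha256).digest()
            bnode_id_map[key] = "u" + base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")
        # Replace each HMAC-derived value with "b" + its position in the sorted values.
        hmac_ids = sorted(bnode_id_map.values())
        for key in bnode_id_map:
            bnode_id_map[key] = "b" + str(hmac_ids.index(bnode_id_map[key]))
        return bnode_id_map
    return label_map_factory_function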

3.2 bbs-2023 Functions

3.2.1 serializeBaseProofValue

The following algorithm serializes the base proof value, including the BBS signature, HMAC key, and mandatory pointers. The required inputs are a base signature bbsSignature, an HMAC key hmacKey, and an array of mandatoryPointers. A single base proof string value is produced as output.

  1. Initialize a byte array, proofValue, that starts with the BBS base proof header bytes 0xd9, 0x5d, and 0x02.
  2. Initialize components to an array with three elements containing the values of bbsSignature, hmacKey, and mandatoryPointers.
  3. CBOR-encode components and append it to proofValue.
  4. Initialize baseProof to a string with the multibase-base64url-no-pad-encoding of proofValue. That is, return a string starting with "u" and ending with the base64url-no-pad-encoded value of proofValue.
  5. Return baseProof as base proof.
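
A non-normative Python sketch of this serialization, using the third-party cbor2 package for CBOR encoding:

import base64
import cbor2  # third-party CBOR library, used here for illustration

BBS_BASE_PROOF_HEADER = bytes([0xd9, 0x5d, 0x02])

def serialize_base_proof_value(bbs_signature: bytes, hmac_key: bytes,
                               mandatory_pointers: list) -> str:
    components = [bbs_signature, hmac_key, mandatory_pointers]
    proof_value = BBS_BASE_PROOF_HEADER + cbor2.dumps(components)
    # Multibase base64url-no-pad: a "u" prefix followed by the unpadded encoding.
    return "u" + base64.urlsafe_b64encode(proof_value).rstrip(b"=").decode("ascii")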

3.2.2 parseBaseProofValue

The following algorithm parses the components of a bbs-2023 selective disclosure base proof value. The required input is a proof value (proofValue). A single object, parsed base proof, containing three elements, using the names "bbsSignature", "hmacKey", and "mandatoryPointers", is produced as output.

  1. Ensure the proofValue string starts with u, indicating that it is a multibase-base64url-no-pad-encoded value, and throw an error if it does not.
  2. Initialize decodedProofValue to the result of base64url-no-pad-decoding the substring following the leading u in proofValue.
  3. Ensure that the decodedProofValue starts with the BBS base proof header bytes 0xd9, 0x5d, and 0x02, and throw an error if it does not.
  4. Initialize components to an array that is the result of CBOR-decoding the bytes that follow the three-byte BBS base proof header. Ensure the result is an array of three elements.
  5. Return an object with properties set to the three elements, using the names "bbsSignature", "hmacKey", and "mandatoryPointers", respectively.
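
A matching non-normative Python sketch of the parsing steps, again assuming the cbor2 package:

import base64
import cbor2  # third-party CBOR library, used here for illustration

def parse_base_proof_value(proof_value: str) -> dict:
    if not proof_value.startswith("u"):
        raise ValueError("proofValue is not multibase base64url-no-pad encoded")
    encoded = proof_value[1:]
    decoded = base64.urlsafe_b64decode(encoded + "=" * (-len(encoded) % 4))
    if decoded[:3] != bytes([0xd9, 0x5d, 0x02]):
        raise ValueError("expected BBS base proof header 0xd9 0x5d 0x02")
    components = cbor2.loads(decoded[3:])
    if not isinstance(components, list) or len(components) != 3:
        raise ValueError("expected three CBOR-encoded components")
    bbs_signature, hmac_key, mandatory_pointers = components
    return {"bbsSignature": bbs_signature, "hmacKey": hmac_key,
            "mandatoryPointers": mandatory_pointers}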

3.2.3 createDisclosureData

The following algorithm creates data to be used to generate a derived proof. The inputs include a JSON-LD document (document), a BBS base proof (proof), an array of JSON pointers to use to selectively disclose statements (selectivePointers), and any custom JSON-LD API options (such as a document loader). A single object, disclosure data, is produced as output, which contains the "bbsProof", "labelMap", "mandatoryIndexes", "selectiveIndexes", and "revealDocument" fields.

  1. Initialize bbsSignature, hmacKey, and mandatoryPointers to the values of the associated properties in the object returned when calling the algorithm in Section 3.2.2 parseBaseProofValue, passing the proofValue from proof.
  2. Initialize hmac to an HMAC API using hmacKey. The HMAC uses the same hash algorithm used in the signature algorithm, i.e., SHAKE-256.
  3. Initialize labelMapFactoryFunction to the result of calling the createShuffledIdLabelMapFunction algorithm passing hmac as HMAC.
  4. Initialize combinedPointers to the concatenation of mandatoryPointers and selectivePointers.
  5. Initialize groupDefinitions to a map with the following entries: key of the string "mandatory" and value of mandatoryPointers; key of the string "selective" and value of selectivePointers; and key of the string "combined" and value of combinedPointers.
  6. Initialize groups and labelMap to the result of calling the algorithm in Section 3.3.16 canonicalizeAndGroup of the [DI-ECDSA] specification, passing document, labelMapFactoryFunction, groupDefinitions, and any custom JSON-LD API options. Note: This step transforms the document into an array of canonical N-Quads whose order has been shuffled based on 'hmac' applied blank node identifiers, and groups the N-Quad strings according to selections based on JSON pointers.
  7. Compute the mandatory indexes relative to their positions in the combined statement list, i.e., find the position at which a mandatory statement occurs in the list of combined statements. One method for doing this is given below.
    1. Initialize mandatoryIndexes to an empty array. Set mandatoryMatch to groups.mandatory.matching map; set combinedMatch to groups.combined.matching; and set combinedIndexes to the ordered array of just the keys of the combinedMatch map.
    2. For each key in the mandatoryMatch map, find its index in the combinedIndexes array (e.g., combinedIndexes.indexOf(key)), and add this value to the mandatoryIndexes array.
  8. Compute the selective indexes relative to their positions in the non-mandatory statement list, i.e., find the position at which a selected statement occurs in the list of non-mandatory statements. One method for doing this is given below.
    1. Initialize selectiveIndexes to an empty array. Set selectiveMatch to the groups.selective.matching map; set mandatoryNonMatch to the map groups.mandatory.nonMatching; and set nonMandatoryIndexes to the ordered array of just the keys of the mandatoryNonMatch map.
    2. For each key in the selectiveMatch map, find its index in the nonMandatoryIndexes array (e.g., nonMandatoryIndexes.indexOf(key)), and add this value to the selectiveIndexes array.
  9. Initialize bbsMessages to an array of byte arrays obtained from the UTF-8 encoding of the values in the nonMandatory array.
  10. Recompute the bbsHeader using the following steps:
    1. Initialize proofHash to the result of calling the RDF Dataset Canonicalization algorithm [RDF-CANON] on proof with the proofValue removed and then cryptographically hashing the result using the same hash that is used by the signature algorithm, i.e., SHAKE-256. Note: This step can be performed in parallel; it only needs to be completed before this algorithm terminates, as the result is part of the return value.
    2. Initialize mandatoryHash to the result of calling the algorithm in Section 3.3.17 hashMandatoryNQuads of the [DI-ECDSA] specification, passing the values from the map groups.mandatory.matching and utilizing the SHAKE-256 algorithm.
    3. Set bbsHeader to the concatenation of proofHash and mandatoryHash in that order.
  11. Set bbsProof to the value computed by the ProofGen procedure from [CFRG-BBS-SIGNATURE], i.e., ProofGen(PK, signature, header, ph, messages, disclosed_indexes), where PK is the original issuer's public key, signature is the bbsSignature, header is the bbsHeader, ph is an empty byte array, messages is bbsMessages, and disclosed_indexes is selectiveIndexes.
  12. Initialize revealDocument to the result of the "selectJsonLd" algorithm, passing document, and combinedPointers as pointers.
  13. Run the RDF Dataset Canonicalization Algorithm [RDF-CANON] on the joined combinedGroup.deskolemizedNQuads, passing any custom options, and get the canonical bnode identifier map, canonicalIdMap. Note: This map includes the canonical blank node identifiers that a verifier will produce when they canonicalize the reveal document.
  14. Initialize verifierLabelMap to an empty map. This map will map the canonical blank node identifiers produced by the verifier when they canonicalize the revealed document, to the blank node identifiers that were originally signed in the base proof.
  15. For each key (inputLabel) and value (verifierLabel) in canonicalIdMap:
    1. Add an entry to verifierLabelMap, using verifierLabel as the key, and the value associated with inputLabel as a key in labelMap as the value.
  16. Return an object with properties matching bbsProof, "labelMap" for verifierLabelMap, mandatoryIndexes, selectiveIndexes, and revealDocument.
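
Steps 7 and 8 above amount to re-expressing absolute statement indexes relative to the combined and non-mandatory statement lists. A non-normative Python sketch, assuming the groups value has the shape produced by the canonicalizeAndGroup algorithm of [DI-ECDSA] (maps keyed by absolute statement index):

def relative_disclosure_indexes(groups: dict) -> tuple:
    combined_indexes = sorted(groups["combined"]["matching"].keys())
    non_mandatory_indexes = sorted(groups["mandatory"]["nonMatching"].keys())
    # Position of each mandatory statement within the combined statement list.
    mandatory_indexes = [combined_indexes.index(i)
                         for i in sorted(groups["mandatory"]["matching"].keys())]
    # Position of each selected statement within the non-mandatory statement list.
    selective_indexes = [non_mandatory_indexes.index(i)
                         for i in sorted(groups["selective"]["matching"].keys())]
    return mandatory_indexes, selective_indexes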

3.2.4 compressLabelMap

The following algorithm compresses a label map. The required input is a label map (labelMap). The output is a compressed label map.

  1. Initialize map to an empty map.
  2. For each entry (k, v) in labelMap:
    1. Add an entry to map, with a key that is a base-10 integer parsed from the characters following the "c14n" prefix in k, and a value that is a base-10 integer parsed from the characters following the "b" prefix in v.
  3. Return map as compressed label map.
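
A non-normative Python sketch of this compression:

def compress_label_map(label_map: dict) -> dict:
    # "c14n7" -> 7 for keys, "b3" -> 3 for values.
    return {int(k[len("c14n"):]): int(v[len("b"):]) for k, v in label_map.items()}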

3.2.5 decompressLabelMap

The following algorithm decompresses a label map. The required input is a compressed label map (compressedLabelMap). The output is a decompressed label map.

  1. Initialize map to an empty map.
  2. For each entry (k, v) in compressedLabelMap:
    1. Add an entry to map, with a key that adds the prefix "c14n" to k, and a value that adds a prefix of "b" to v.
  3. Return map as decompressed label map.
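
A non-normative Python sketch of the inverse operation; for example, decompress_label_map({0: 2}) yields {"c14n0": "b2"}.

def decompress_label_map(compressed_label_map: dict) -> dict:
    return {"c14n" + str(k): "b" + str(v) for k, v in compressed_label_map.items()}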

3.2.6 serializeDerivedProofValue

The following algorithm serializes a derived proof value. The required inputs are a BBS proof (bbsProof), a label map (labelMap), an array of mandatory indexes (mandatoryIndexes), and an array of selective indexes (selectiveIndexes). A single derived proof value, serialized as a byte string, is produced as output.

  1. Initialize compressedLabelMap to the result of calling the algorithm in Section 3.2.4 compressLabelMap, passing labelMap as the parameter.
  2. Initialize a byte array, proofValue, that starts with the BBS disclosure proof header bytes 0xd9, 0x5d, and 0x03.
  3. Initialize components to an array with four elements containing the values of bbsProof, compressedLabelMap, mandatoryIndexes, and selectiveIndexes.
  4. CBOR-encode components and append it to proofValue.
  5. Return the derived proof as a string with the multibase-base64url-no-pad-encoding of proofValue. That is, return a string starting with "u" and ending with the base64url-no-pad-encoded value of proofValue.
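
A non-normative Python sketch, assuming the cbor2 package and inlining the label map compression of Section 3.2.4:

import base64
import cbor2  # third-party CBOR library, used here for illustration

BBS_DISCLOSURE_PROOF_HEADER = bytes([0xd9, 0x5d, 0x03])

def serialize_derived_proof_value(bbs_proof: bytes, label_map: dict,
                                  mandatory_indexes: list, selective_indexes: list) -> str:
    # Compress the label map: "c14nN" -> N for keys, "bM" -> M for values.
    compressed_label_map = {int(k[4:]): int(v[1:]) for k, v in label_map.items()}
    components = [bbs_proof, compressed_label_map, mandatory_indexes, selective_indexes]
    proof_value = BBS_DISCLOSURE_PROOF_HEADER + cbor2.dumps(components)
    return "u" + base64.urlsafe_b64encode(proof_value).rstrip(b"=").decode("ascii")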

3.2.7 parseDerivedProofValue

The following algorithm parses the components of the derived proof value. The required input is a derived proof value (proofValue). A single derived proof value object is produced as output, which contains a set of four elements, using the names "bbsProof", "labelMap", "mandatoryIndexes", and "selectiveIndexes".

  1. Ensure the proofValue string starts with u, indicating that it is a multibase-base64url-no-pad-encoded value, and throw an error if it does not.
  2. Initialize decodedProofValue to the result of base64url-no-pad-decoding the substring that follows the leading u in proofValue.
  3. Ensure that the decodedProofValue starts with the BBS disclosure proof header bytes 0xd9, 0x5d, and 0x03, and throw an error if it does not.
  4. Initialize components to an array that is the result of CBOR-decoding the bytes that follow the three-byte BBS disclosure proof header. Ensure the result is an array of four elements — a byte array, a map of integers to integers, an array of integers, and another array of integers; otherwise, throw an error.
  5. Replace the second element in components using the result of calling the algorithm in Section 3.2.5 decompressLabelMap, passing the existing second element of components as compressedLabelMap.
  6. Return derived proof value as an object with properties set to the four elements, using the names "bbsProof", "labelMap", "mandatoryIndexes", and "selectiveIndexes", respectively.
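
A matching non-normative Python sketch, with the label map decompression of Section 3.2.5 inlined:

import base64
import cbor2  # third-party CBOR library, used here for illustration

def parse_derived_proof_value(proof_value: str) -> dict:
    if not proof_value.startswith("u"):
        raise ValueError("proofValue is not multibase base64url-no-pad encoded")
    encoded = proof_value[1:]
    decoded = base64.urlsafe_b64decode(encoded + "=" * (-len(encoded) % 4))
    if decoded[:3] != bytes([0xd9, 0x5d, 0x03]):
        raise ValueError("expected BBS disclosure proof header 0xd9 0x5d 0x03")
    components = cbor2.loads(decoded[3:])
    if not isinstance(components, list) or len(components) != 4:
        raise ValueError("expected four CBOR-encoded components")
    bbs_proof, compressed_label_map, mandatory_indexes, selective_indexes = components
    label_map = {"c14n" + str(k): "b" + str(v) for k, v in compressed_label_map.items()}
    return {"bbsProof": bbs_proof, "labelMap": label_map,
            "mandatoryIndexes": mandatory_indexes, "selectiveIndexes": selective_indexes}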

3.2.8 createVerifyData

The following algorithm creates the data needed to perform verification of a BBS-protected verifiable credential. The inputs include a JSON-LD document (document), a BBS disclosure proof (proof), and any custom JSON-LD API options (such as a document loader). A single verify data object value is produced as output containing the following fields: "bbsProof", "proofHash", "mandatoryHash", "selectiveIndexes", and "nonMandatory".

  1. Initialize proofHash to the result of performing RDF Dataset Canonicalization [RDF-CANON] on the proof options, i.e., the proof portion of the document with the proofValue removed. The hash used is the same as that used in the signature algorithm, i.e., SHAKE-256. Note: This step can be performed in parallel; it only needs to be completed before this algorithm needs to use the proofHash value.
  2. Initialize bbsProof, labelMap, mandatoryIndexes, and selectiveIndexes to the values associated with their property names in the object returned when calling the algorithm in Section 3.2.7 parseDerivedProofValue, passing proofValue from proof.
  3. Initialize labelMapFactoryFunction to the result of calling the "createLabelMapFunction" algorithm of [DI-ECDSA], passing labelMap.
  4. Initialize nquads to the result of calling the "labelReplacementCanonicalize" algorithm of [DI-ECDSA], passing document, labelMapFactoryFunction, and any custom JSON-LD API options. Note: This step transforms the document into an array of canonical N-Quads with pseudorandom blank node identifiers based on labelMap.
  5. Initialize mandatory to an empty array.
  6. Initialize nonMandatory to an empty array.
  7. For each entry (index, nq) in nquads, separate the N-Quads into mandatory and non-mandatory categories:
    1. If mandatoryIndexes includes index, add nq to mandatory.
    2. Otherwise, add nq to nonMandatory.
  8. Initialize mandatoryHash to the result of calling the "hashMandatory" primitive, passing mandatory.
  9. Return an object with properties matching bbsProof, proofHash, nonMandatory, mandatoryHash, and selectiveIndexes.
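
Steps 5 through 8 above separate the canonical N-Quads and hash the mandatory group. A non-normative Python sketch, assuming a 32-byte SHAKE-256 output as the suite's hash:

import hashlib

def split_and_hash_mandatory(nquads: list, mandatory_indexes: list) -> tuple:
    mandatory = [nq for i, nq in enumerate(nquads) if i in mandatory_indexes]
    non_mandatory = [nq for i, nq in enumerate(nquads) if i not in mandatory_indexes]
    # hashMandatory: hash of the joined mandatory N-Quads (32-byte SHAKE-256 assumed).
    mandatory_hash = hashlib.shake_256("".join(mandatory).encode("utf-8")).digest(32)
    return mandatory, non_mandatory, mandatory_hash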

3.3 bbs-2023

The bbs-2023 cryptographic suite takes an input document, canonicalizes the document using the Universal RDF Dataset Canonicalization Algorithm [RDF-CANON], and then applies a number of transformations and cryptographic operations resulting in the production of a data integrity proof. The algorithms in this section also include the verification of such a data integrity proof.

3.3.1 Add Base Proof (bbs-2023)

To generate a base proof, the algorithm in Section 4.1: Add Proof of the Data Integrity [VC-DATA-INTEGRITY] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation algorithm is defined in Section 3.3.2 Base Proof Transformation (bbs-2023), the hashing algorithm is defined in Section 3.3.3 Base Proof Hashing (bbs-2023), and the proof serialization algorithm is defined in Section 3.3.5 Base Proof Serialization (bbs-2023).

3.3.2 Base Proof Transformation (bbs-2023)

The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm in Section 3.3.3 Base Proof Hashing (bbs-2023).

Required inputs to this algorithm are an unsecured data document (unsecuredDocument) and transformation options (options). The transformation options MUST contain a type identifier for the cryptographic suite (type), a cryptosuite identifier (cryptosuite), and a verification method (verificationMethod). The transformation options MUST contain an array of mandatory JSON pointers (mandatoryPointers) and MAY contain additional options, such as a JSON-LD document loader. A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.

  1. Initialize hmac to an HMAC API using a locally generated and exportable HMAC key. The HMAC uses the same hash algorithm used in the signature algorithm, i.e., SHAKE-256.
  2. Initialize labelMapFactoryFunction to the result of calling the createShuffledIdLabelMapFunction algorithm passing hmac as HMAC.
  3. Initialize groupDefinitions to a map with an entry with a key of the string "mandatory" and a value of mandatoryPointers.
  4. Initialize groups to the result of calling the algorithm in Section 3.3.16 canonicalizeAndGroup of the [DI-ECDSA] specification, passing labelMapFactoryFunction, groupDefinitions, unsecuredDocument as document, and any custom JSON-LD API options. Note: This step transforms the document into an array of canonical N-Quads whose order has been shuffled based on 'hmac' applied blank node identifiers, and groups the N-Quad strings according to selections based on JSON pointers.
  5. Initialize mandatory to the values in the groups.mandatory.matching map.
  6. Initialize nonMandatory to the values in the groups.mandatory.nonMatching map.
  7. Initialize hmacKey to the result of exporting the HMAC key from hmac.
  8. Return an object with "mandatoryPointers" set to mandatoryPointers, "mandatory" set to mandatory, "nonMandatory" set to nonMandatory, and "hmacKey" set to hmacKey.

3.3.3 Base Proof Hashing (bbs-2023)

The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the algorithms in Section 3.3.5 Base Proof Serialization (bbs-2023).

The required inputs to this algorithm are a transformed data document (transformedDocument) and canonical proof configuration (canonicalProofConfig). A hash data value represented as an object is produced as output.

  1. Initialize proofHash to the result of calling the RDF Dataset Canonicalization algorithm [RDF-CANON] on canonicalProofConfig and then cryptographically hashing the result using the same hash that is used by the signature algorithm, i.e., SHAKE-256. Note: This step can be performed in parallel; it only needs to be completed before this algorithm terminates, as the result is part of the return value.
  2. Initialize mandatoryHash to the result of calling the algorithm in Section 3.3.17 hashMandatoryNQuads of the [DI-ECDSA] specification, passing transformedDocument.mandatory and utilizing the SHAKE-256 algorithm.
  3. Initialize hashData as a deep copy of transformedDocument, and add proofHash as "proofHash" and mandatoryHash as "mandatoryHash" to that object.
  4. Return hashData as hash data.
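
A non-normative Python sketch of the two hashing steps, assuming the canonical proof configuration has already been serialized to N-Quads and assuming a 32-byte SHAKE-256 output (consistent with the hash lengths in the test vectors of Appendix A):

import hashlib

def base_proof_hashing(canonical_proof_config: str, mandatory_nquads: list) -> dict:
    proof_hash = hashlib.shake_256(canonical_proof_config.encode("utf-8")).digest(32)
    mandatory_hash = hashlib.shake_256("".join(mandatory_nquads).encode("utf-8")).digest(32)
    return {"proofHash": proof_hash, "mandatoryHash": mandatory_hash}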

3.3.4 Base Proof Configuration (bbs-2023)

The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the base proof hashing algorithm.

The required inputs to this algorithm are proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MUST contain a cryptosuite identifier (cryptosuite). A proof configuration object is produced as output.

  1. Let proofConfig be an empty object.
  2. Set proofConfig.type to options.type.
  3. If options.cryptosuite is set, set proofConfig.cryptosuite to its value.
  4. If options.type is not set to DataIntegrityProof or proofConfig.cryptosuite is not set to bbs-2023, an INVALID_PROOF_CONFIGURATION error MUST be raised.
  5. Set proofConfig.created to options.created. If the value is not a valid [XMLSCHEMA11-2] datetime, an INVALID_PROOF_DATETIME error MUST be raised.
  6. Set proofConfig.verificationMethod to options.verificationMethod.
  7. Set proofConfig.proofPurpose to options.proofPurpose.
  8. Set proofConfig.@context to unsecuredDocument.@context.
  9. Let canonicalProofConfig be the result of applying the Universal RDF Dataset Canonicalization Algorithm [RDF-CANON] to the proofConfig.
  10. Return canonicalProofConfig.
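
A non-normative Python sketch of the configuration and validation steps above; full [XMLSCHEMA11-2] datetime validation and the RDF canonicalization of the resulting object (steps 9 and 10) are left to dedicated libraries and omitted here:

def base_proof_configuration(options: dict, unsecured_document: dict) -> dict:
    proof_config = {
        "type": options["type"],
        "cryptosuite": options.get("cryptosuite"),
        "created": options["created"],
        "verificationMethod": options["verificationMethod"],
        "proofPurpose": options["proofPurpose"],
        "@context": unsecured_document["@context"],
    }
    if proof_config["type"] != "DataIntegrityProof" or proof_config["cryptosuite"] != "bbs-2023":
        raise ValueError("INVALID_PROOF_CONFIGURATION")
    return proof_config  # canonicalize with [RDF-CANON] before hashing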

3.3.5 Base Proof Serialization (bbs-2023)

The following algorithm, to be called by an issuer of a BBS-protected Verifiable Credential, specifies how to create a base proof. The base proof is to be given only to the holder, who is responsible for generating a derived proof from it, exposing only selectively disclosed details in the proof to a verifier. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData) and proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A single digital proof value represented as a series of bytes is produced as output.

  1. Initialize proofHash, mandatoryPointers, mandatoryHash, nonMandatory, and hmacKey to the values associated with their property names in hashData.
  2. Initialize bbsHeader to the concatenation of proofHash and mandatoryHash in that order.
  3. Initialize bbsMessages to an array of byte arrays obtained from the UTF-8 encoding of the values in the nonMandatory array.
  4. Compute the bbsSignature using the Sign procedure of [CFRG-BBS-SIGNATURE], with appropriate key material, bbsHeader for the header, and bbsMessages for the messages.
  5. Initialize proofValue to the result of calling the algorithm in Section 3.2.1 serializeBaseProofValue, passing bbsSignature, hmacKey, and mandatoryPointers as parameters to the algorithm.
  6. Return proofValue as digital proof.
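
A non-normative Python sketch of the steps above. The bbs_sign callable stands in for the Sign procedure of [CFRG-BBS-SIGNATURE] as provided by a BBS library and is a hypothetical placeholder; serialize_base_proof_value is the sketch from Section 3.2.1.

def base_proof_serialization(hash_data: dict, secret_key, public_key, bbs_sign) -> str:
    bbs_header = hash_data["proofHash"] + hash_data["mandatoryHash"]
    bbs_messages = [nq.encode("utf-8") for nq in hash_data["nonMandatory"]]
    # Sign(SK, PK, header, messages) per [CFRG-BBS-SIGNATURE]; bbs_sign is hypothetical.
    bbs_signature = bbs_sign(secret_key, public_key, bbs_header, bbs_messages)
    return serialize_base_proof_value(bbs_signature, hash_data["hmacKey"],
                                      hash_data["mandatoryPointers"])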

3.3.6 Add Derived Proof (bbs-2023)

The following algorithm, to be called by a holder of a bbs-2023-protected verifiable credential, creates a selective disclosure derived proof. The derived proof is to be given to the verifier. The inputs include a JSON-LD document (document), a BBS base proof (proof), an array of JSON pointers to use to selectively disclose statements (selectivePointers), and any custom JSON-LD API options, such as a document loader. A single selectively revealed document value, represented as an object, is produced as output.

  1. Initialize bbsProof, labelMap, mandatoryIndexes, selectiveIndexes, and revealDocument to the values associated with their property names in the object returned when calling the algorithm in Section 3.2.3 createDisclosureData, passing the document, proof, selectivePointers, and any custom JSON-LD API options, such as a document loader.
  2. Initialize newProof to a shallow copy of proof.
  3. Replace proofValue in newProof with the result of calling the algorithm in Section 3.2.6 serializeDerivedProofValue, passing bbsProof, labelMap, mandatoryIndexes, and selectiveIndexes.
  4. Set the value of the "proof" property in revealDocument to newProof.
  5. Return revealDocument as the selectively revealed document.

3.3.7 Verify Derived Proof (bbs-2023)

The following algorithm attempts verification of a bbs-2023 derived proof. This algorithm is called by a verifier of a BBS-protected verifiable credential. The inputs include a JSON-LD document (document), a BBS disclosure proof (proof), and any custom JSON-LD API options (such as a document loader). A single boolean verification result value is produced as output.

  1. Initialize bbsProof, proofHash, mandatoryHash, selectiveIndexes, and nonMandatory to the values associated with their property names in the object returned when calling the algorithm in Section 3.2.8 createVerifyData, passing the document, proof, and any custom JSON-LD API options (such as a document loader).
  2. Initialize bbsHeader to the concatenation of proofHash and mandatoryHash in that order. Initialize disclosedMessages to an array of byte arrays obtained from the UTF-8 encoding of the elements of the nonMandatory array.
  3. Initialize verificationResult to the result of applying the verification algorithm ProofVerify of [CFRG-BBS-SIGNATURE] with PK set as the public key of the original issuer, proof set as bbsProof, header set as bbsHeader, disclosed_messages set as disclosedMessages, ph set as an empty byte array, and disclosed_indexes set as selectiveIndexes. Return verificationResult as verification result.
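
A non-normative Python sketch of the verification steps; bbs_proof_verify stands in for the ProofVerify procedure of [CFRG-BBS-SIGNATURE] as exposed by a BBS library and is a hypothetical placeholder.

def verify_derived_proof(verify_data: dict, issuer_public_key, bbs_proof_verify) -> bool:
    bbs_header = verify_data["proofHash"] + verify_data["mandatoryHash"]
    disclosed_messages = [nq.encode("utf-8") for nq in verify_data["nonMandatory"]]
    # ProofVerify(PK, proof, header, ph, disclosed_messages, disclosed_indexes).
    return bbs_proof_verify(issuer_public_key, verify_data["bbsProof"], bbs_header,
                            b"", disclosed_messages, verify_data["selectiveIndexes"])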

4. Privacy Considerations

Issue 1

TODO: We need to add a complete list of privacy considerations.

5. Security Considerations

Issue 2

TODO: We need to add a complete list of security considerations.

A. Test Vectors

This section is non-normative.

Demonstration of selective disclosure features including mandatory disclosure, selective disclosure, and overlap between those, requires an input credential document with more content than previous test vectors. To avoid excessively long test vectors, the starting document test vector is based on a purely fictitious windsurfing (sailing) competition scenario. In addition, we break the test vectors into two groups, based on those that would be generated by the issuer (base proof) and those that would be generated by the holder (derived proof).

A.1 Base Proof

To add a selective disclosure base proof to a document, the issuer needs the following cryptographic key material:

  1. The issuer's private/public key pair, i.e., the key pair corresponding to the verification method that will be part of the proof.
  2. An HMAC key. This is used to randomize the order of the blank node IDs to avoid potential information leakage via the blank node ID ordering. This is used only once, and is shared between issuer and holder. The HMAC in this case is functioning as a pseudorandom function (PRF).

The key material used for generating the base proof test vectors is shown below. Hexadecimal representation is used for the BBS key pairs and the HMAC key.

Example 3: Private and Public keys for Signature
{
  "publicKeyHex": "a4ef1afa3da575496f122b9b78b8c24761531a8a093206ae7c45b80759c168ba4f7a260f9c3367b6c019b4677841104b10665edbe70ba3ebe7d9cfbffbf71eb016f70abfbb163317f372697dc63efd21fc55764f63926a8f02eaea325a2a888f",
  "privateKeyHex": "66d36e118832af4c5e28b2dfe1b9577857e57b042a33e06bdea37b811ed09ee0",
  "hmacKeyString": "00112233445566778899AABBCCDDEEFF00112233445566778899AABBCCDDEEFF"
}

In our scenario, a sailor is registering with a race organizer for a series of windsurfing races to be held over a number of days on Maui. The organizer will inspect the sailor's equipment to certify that what has been declared is accurate. The sailor's unsigned equipment inventory is shown below.

Example 4: Credential without Proof
{
  "@context": [
    "https://www.w3.org/ns/credentials/v2",
    {
      "@vocab": "https://windsurf.grotto-networking.com/selective#"
    }
  ],
  "type": [
    "VerifiableCredential"
  ],
  "credentialSubject": {
    "sailNumber": "Earth101",
    "sails": [
      {
        "size": 5.5,
        "sailName": "Kihei",
        "year": 2023
      },
      {
        "size": 6.1,
        "sailName": "Lahaina",
        "year": 2023
      },
      {
        "size": 7.0,
        "sailName": "Lahaina",
        "year": 2020
      },
      {
        "size": 7.8,
        "sailName": "Lahaina",
        "year": 2023
      }
    ],
    "boards": [
      {
        "boardName": "CompFoil170",
        "brand": "Wailea",
        "year": 2022
      },
      {
        "boardName": "Kanaha Custom",
        "brand": "Wailea",
        "year": 2019
      }
    ]
  }
}

In addition to letting other sailors know what kinds of equipment their competitors may be sailing on, it is mandatory that each sailor disclose the year of their most recent windsurfing board and full details on two of their sails. Note that all sailors are identified by a sail number that is printed on all their equipment. This mandatory information is specified via an array of JSON pointers as shown below.

Example 5: Mandatory Pointers
["/credentialSubject/sailNumber", "/credentialSubject/sails/1", "/credentialSubject/boards/0/year", "/credentialSubject/sails/2"]

The result of applying the above JSON pointers to the sailor's equipment document is shown below.

Example 6: JSON Pointers and Values
[
  {
    "pointer": "/sailNumber",
    "value": "Earth101"
  },
  {
    "pointer": "/sails/1",
    "value": {
      "size": 6.1,
      "sailName": "Lahaina",
      "year": 2023
    }
  },
  {
    "pointer": "/boards/0/year",
    "value": 2022
  },
  {
    "pointer": "/sails/2",
    "value": {
      "size": 7,
      "sailName": "Lahaina",
      "year": 2020
    }
  }
]

Transformation of the unsigned document begins with canonicalizing the document, as shown below.

Example 7: Canonical Document
[
  "_:c14n0 <https://windsurf.grotto-networking.com/selective#boardName> \"CompFoil170\" .\n",
  "_:c14n0 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n",
  "_:c14n0 <https://windsurf.grotto-networking.com/selective#year> \"2022\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n",
  "_:c14n1 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n",
  "_:c14n1 <https://windsurf.grotto-networking.com/selective#size> \"7.8E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n",
  "_:c14n1 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n",
  "_:c14n2 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .\n",
  "_:c14n2 <https://www.w3.org/2018/credentials#credentialSubject> _:c14n6 .\n",
  "_:c14n3 <https://windsurf.grotto-networking.com/selective#boardName> \"Kanaha Custom\" .\n",
  "_:c14n3 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n",
  "_:c14n3 <https://windsurf.grotto-networking.com/selective#year> \"2019\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n",
  "_:c14n4 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n",
  "_:c14n4 <https://windsurf.grotto-networking.com/selective#size> \"7\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n",
  "_:c14n4 <https://windsurf.grotto-networking.com/selective#year> \"2020\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n",
  "_:c14n5 <https://windsurf.grotto-networking.com/selective#sailName> \"Kihei\" .\n",
  "_:c14n5 <https://windsurf.grotto-networking.com/selective#size> \"5.5E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n",
  "_:c14n5 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n",
  "_:c14n6 <https://windsurf.grotto-networking.com/selective#boards> _:c14n0 .\n",
  "_:c14n6 <https://windsurf.grotto-networking.com/selective#boards> _:c14n3 .\n",
  "_:c14n6 <https://windsurf.grotto-networking.com/selective#sailNumber> \"Earth101\" .\n",
  "_:c14n6 <https://windsurf.grotto-networking.com/selective#sails> _:c14n1 .\n",
  "_:c14n6 <https://windsurf.grotto-networking.com/selective#sails> _:c14n4 .\n",
  "_:c14n6 <https://windsurf.grotto-networking.com/selective#sails> _:c14n5 .\n",
  "_:c14n6 <https://windsurf.grotto-networking.com/selective#sails> _:c14n7 .\n",
  "_:c14n7 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n",
  "_:c14n7 <https://windsurf.grotto-networking.com/selective#size> \"6.1E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n",
  "_:c14n7 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n"
]

To prevent possible information leakage from the ordering of the blank node IDs these are processed through a PRF (i.e., the HMAC) to give the canonicalized HMAC document shown below. This represents an ordered list of statements that will be subject to mandatory and selective disclosure, i.e., it is from this list that statements are grouped.

Example 8: Canonical HMAC Document
[
  "_:b0 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n",
  "_:b0 <https://windsurf.grotto-networking.com/selective#size> \"6.1E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n",
  "_:b0 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n",
  "_:b1 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n",
  "_:b1 <https://windsurf.grotto-networking.com/selective#size> \"7.8E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n",
  "_:b1 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n",
  "_:b2 <https://windsurf.grotto-networking.com/selective#boardName> \"CompFoil170\" .\n",
  "_:b2 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n",
  "_:b2 <https://windsurf.grotto-networking.com/selective#year> \"2022\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n",
  "_:b3 <https://windsurf.grotto-networking.com/selective#sailName> \"Kihei\" .\n",
  "_:b3 <https://windsurf.grotto-networking.com/selective#size> \"5.5E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n",
  "_:b3 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n",
  "_:b4 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .\n",
  "_:b4 <https://www.w3.org/2018/credentials#credentialSubject> _:b6 .\n",
  "_:b5 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n",
  "_:b5 <https://windsurf.grotto-networking.com/selective#size> \"7\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n",
  "_:b5 <https://windsurf.grotto-networking.com/selective#year> \"2020\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n",
  "_:b6 <https://windsurf.grotto-networking.com/selective#boards> _:b2 .\n",
  "_:b6 <https://windsurf.grotto-networking.com/selective#boards> _:b7 .\n",
  "_:b6 <https://windsurf.grotto-networking.com/selective#sailNumber> \"Earth101\" .\n",
  "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b0 .\n",
  "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b1 .\n",
  "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b3 .\n",
  "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b5 .\n",
  "_:b7 <https://windsurf.grotto-networking.com/selective#boardName> \"Kanaha Custom\" .\n",
  "_:b7 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n",
  "_:b7 <https://windsurf.grotto-networking.com/selective#year> \"2019\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n"
]

The above canonical document gets grouped into mandatory and non-mandatory statements. The final output of the selective disclosure transformation process is shown below. Each statement is now grouped as mandatory or non-mandatory, and its index in the previous list of statements is remembered.

Example 9: Add Base Transformation
{
  "mandatoryPointers": [
    "/credentialSubject/sailNumber",
    "/credentialSubject/sails/1",
    "/credentialSubject/boards/0/year",
    "/credentialSubject/sails/2"
  ],
  "mandatory": {
    "dataType": "Map",
    "value": [
      [
        0,
        "_:b0 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n"
      ],
      [
        1,
        "_:b0 <https://windsurf.grotto-networking.com/selective#size> \"6.1E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n"
      ],
      [
        2,
        "_:b0 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n"
      ],
      [
        8,
        "_:b2 <https://windsurf.grotto-networking.com/selective#year> \"2022\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n"
      ],
      [
        12,
        "_:b4 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .\n"
      ],
      [
        13,
        "_:b4 <https://www.w3.org/2018/credentials#credentialSubject> _:b6 .\n"
      ],
      [
        14,
        "_:b5 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n"
      ],
      [
        15,
        "_:b5 <https://windsurf.grotto-networking.com/selective#size> \"7\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n"
      ],
      [
        16,
        "_:b5 <https://windsurf.grotto-networking.com/selective#year> \"2020\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n"
      ],
      [
        17,
        "_:b6 <https://windsurf.grotto-networking.com/selective#boards> _:b2 .\n"
      ],
      [
        19,
        "_:b6 <https://windsurf.grotto-networking.com/selective#sailNumber> \"Earth101\" .\n"
      ],
      [
        20,
        "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b0 .\n"
      ],
      [
        23,
        "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b5 .\n"
      ]
    ]
  },
  "nonMandatory": {
    "dataType": "Map",
    "value": [
      [
        3,
        "_:b1 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n"
      ],
      [
        4,
        "_:b1 <https://windsurf.grotto-networking.com/selective#size> \"7.8E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n"
      ],
      [
        5,
        "_:b1 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n"
      ],
      [
        6,
        "_:b2 <https://windsurf.grotto-networking.com/selective#boardName> \"CompFoil170\" .\n"
      ],
      [
        7,
        "_:b2 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n"
      ],
      [
        9,
        "_:b3 <https://windsurf.grotto-networking.com/selective#sailName> \"Kihei\" .\n"
      ],
      [
        10,
        "_:b3 <https://windsurf.grotto-networking.com/selective#size> \"5.5E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n"
      ],
      [
        11,
        "_:b3 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n"
      ],
      [
        18,
        "_:b6 <https://windsurf.grotto-networking.com/selective#boards> _:b7 .\n"
      ],
      [
        21,
        "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b1 .\n"
      ],
      [
        22,
        "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b3 .\n"
      ],
      [
        24,
        "_:b7 <https://windsurf.grotto-networking.com/selective#boardName> \"Kanaha Custom\" .\n"
      ],
      [
        25,
        "_:b7 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n"
      ],
      [
        26,
        "_:b7 <https://windsurf.grotto-networking.com/selective#year> \"2019\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n"
      ]
    ]
  },
  "hmacKeyString": "00112233445566778899AABBCCDDEEFF00112233445566778899AABBCCDDEEFF"
}

The next step is to create the base proof configuration and canonicalize it. This is shown in the following two examples.

Example 10: Base Proof Configuration
{
  "type": "DataIntegrityProof",
  "cryptosuite": "bbs-2023",
  "created": "2023-08-15T23:36:38Z",
  "verificationMethod": "did:key:zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ#zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ",
  "proofPurpose": "assertionMethod",
  "@context": [
    "https://www.w3.org/ns/credentials/v2",
    {
      "@vocab": "https://windsurf.grotto-networking.com/selective#"
    }
  ]
}
Example 11: Canonical Base Proof Configuration
_:c14n0 <http://purl.org/dc/terms/created> "2023-08-15T23:36:38Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> .
_:c14n0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://w3id.org/security#DataIntegrityProof> .
_:c14n0 <https://w3id.org/security#cryptosuite> "bbs-2023" .
_:c14n0 <https://w3id.org/security#proofPurpose> <https://w3id.org/security#assertionMethod> .
_:c14n0 <https://w3id.org/security#verificationMethod> <did:key:zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ#zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ> .

In the hashing step, we compute the SHAKE-256 hash of the canonicalized proof options to produce the proofHash, and we compute the SHAKE-256 hash of the join of all the mandatory N-Quads to produce the mandatoryHash. These are shown below in hexadecimal format.

Example 12: Add Base Hashes
{
  "proofHash": "109514ed8101a836d240819e30630f48639bf7f1f247074e928eaad99e5775d4",
  "mandatoryHash": "e8bf46bff3db96eabc3a9410795dc94bc3537165e082f4a3e58841982fd7d4b3"
}

Shown below are the computed bbsSignature in hexadecimal, and the mandatoryPointers. These are fed to the final serialization step with the hmacKey.

Example 13: Add Base Signing
{
  "bbsSignature": "93c7abe23fdf4856654bc858e607b7659af82b564340731454884724ec01e25360ac49e39cf0df7631535373042caed256abed6e81884e71a21590fef8dbe07e177dcedd8cfe94e4574c4ab51a22bdf9",
  "mandatoryPointers": [
    "/credentialSubject/sailNumber",
    "/credentialSubject/sails/1",
    "/credentialSubject/boards/0/year",
    "/credentialSubject/sails/2"
  ]
}

Finally, the values above are run through the algorithm of Section 3.2.1 serializeBaseProofValue, to produce the proofValue which is used in the signed base document shown below.

Example 14: Signed Base Document
{
  "@context": [
    "https://www.w3.org/ns/credentials/v2",
    {
      "@vocab": "https://windsurf.grotto-networking.com/selective#"
    }
  ],
  "type": [
    "VerifiableCredential"
  ],
  "credentialSubject": {
    "sailNumber": "Earth101",
    "sails": [
      {
        "size": 5.5,
        "sailName": "Kihei",
        "year": 2023
      },
      {
        "size": 6.1,
        "sailName": "Lahaina",
        "year": 2023
      },
      {
        "size": 7,
        "sailName": "Lahaina",
        "year": 2020
      },
      {
        "size": 7.8,
        "sailName": "Lahaina",
        "year": 2023
      }
    ],
    "boards": [
      {
        "boardName": "CompFoil170",
        "brand": "Wailea",
        "year": 2022
      },
      {
        "boardName": "Kanaha Custom",
        "brand": "Wailea",
        "year": 2019
      }
    ]
  },
  "proof": {
    "type": "DataIntegrityProof",
    "cryptosuite": "bbs-2023",
    "created": "2023-08-15T23:36:38Z",
    "verificationMethod": "did:key:zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ#zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ",
    "proofPurpose": "assertionMethod",
    "proofValue": "u2V0Cg9hAWFCTx6viP99IVmVLyFjmB7dlmvgrVkNAcxRUiEck7AHiU2CsSeOc8N92MVNTcwQsrtJWq-1ugYhOcaIVkP742-B-F33O3Yz-lORXTEq1GiK9-dhAWCAAESIzRFVmd4iZqrvM3e7_ABEiM0RVZneImaq7zN3u_4R4HS9jcmVkZW50aWFsU3ViamVjdC9zYWlsTnVtYmVyeBovY3JlZGVudGlhbFN1YmplY3Qvc2FpbHMvMXggL2NyZWRlbnRpYWxTdWJqZWN0L2JvYXJkcy8wL3llYXJ4Gi9jcmVkZW50aWFsU3ViamVjdC9zYWlscy8y"
  }
}
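
The proofValue above begins with the bbs-2023 base proof header bytes 0xd9, 0x5d, 0x02, followed by a CBOR encoding of the three components (bbsSignature, hmacKey, mandatoryPointers), multibase-encoded with the base64url-no-pad prefix "u". The following is a non-normative sketch assuming the third-party Python cbor2 package; the published test vector additionally tags its byte strings as CBOR typed arrays, so this sketch is not guaranteed to reproduce the value byte for byte.

import base64
import cbor2  # assumed third-party CBOR library; any CBOR encoder would do

BBS_BASE_PROOF_HEADER = bytes([0xD9, 0x5D, 0x02])

def serialize_base_proof_value(bbs_signature: bytes, hmac_key: bytes,
                               mandatory_pointers: list[str]) -> str:
    # CBOR-encode the three components behind the base proof header, then
    # multibase-encode as base64url without padding ("u" prefix).
    components = [bbs_signature, hmac_key, mandatory_pointers]
    payload = BBS_BASE_PROOF_HEADER + cbor2.dumps(components)
    return "u" + base64.urlsafe_b64encode(payload).rstrip(b"=").decode("ascii")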

A.2 Derived Proof

To create a derived proof, a holder starts with a signed document containing a base proof. The base document we will use for these test vectors is the final example from Section A.1 Base Proof, above. The first step is to run the algorithm of Section 3.2.2 parseBaseProofValue to recover bbsSignature, hmacKey, and mandatoryPointers, as shown below.

Example 15: Recovered Base Signature Data
{
  "bbsSignature": "93c7abe23fdf4856654bc858e607b7659af82b564340731454884724ec01e25360ac49e39cf0df7631535373042caed256abed6e81884e71a21590fef8dbe07e177dcedd8cfe94e4574c4ab51a22bdf9",
  "hmacKey": "00112233445566778899aabbccddeeff00112233445566778899aabbccddeeff",
  "mandatoryPointers": [
    "/credentialSubject/sailNumber",
    "/credentialSubject/sails/1",
    "/credentialSubject/boards/0/year",
    "/credentialSubject/sails/2"
  ]
}
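
A non-normative sketch of this parsing step, the inverse of the serialization sketch in A.1, is shown below; it again assumes the Python cbor2 package.

import base64
import cbor2  # assumed third-party CBOR library

def parse_base_proof_value(proof_value: str):
    # Strip the multibase prefix, restore base64url padding, check the
    # bbs-2023 base proof header, and CBOR-decode the three components.
    if not proof_value.startswith("u"):
        raise ValueError("expected multibase base64url-no-pad ('u') encoding")
    data = proof_value[1:]
    payload = base64.urlsafe_b64decode(data + "=" * (-len(data) % 4))
    if payload[:3] != bytes([0xD9, 0x5D, 0x02]):
        raise ValueError("not a bbs-2023 base proof")
    # Note: the published test vectors tag byte strings as CBOR typed arrays
    # (tag 64); depending on the decoder, those may need unwrapping.
    bbs_signature, hmac_key, mandatory_pointers = cbor2.loads(payload[3:])
    return bbs_signature, hmac_key, mandatory_pointers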

Next, the holder needs to indicate what else, if anything, they wish to reveal to verifiers by specifying JSON pointers for selective disclosure. In our windsurfing competition scenario, a sailor (the holder) has just completed their first day of racing and wishes to reveal to the general public (the verifiers) all the details of the windsurfing boards they used in the competition. These pointers are shown below. Note that this selection slightly overlaps with the mandatorily disclosed information, which included only the year of their most recent board.

Example 16: Selective Disclosure Pointers
["/credentialSubject/boards/0", "/credentialSubject/boards/1"]

To produce the revealDocument (i.e., the unsigned document that will eventually be signed and sent to the verifier), we append the selective pointers to the mandatory pointers and pass the combined pointers, along with the document without its proof, to the selectJsonLd algorithm of [DI-ECDSA]; the result is shown below.

Example 17: Unsigned Reveal Document
{
  "@context": [
    "https://www.w3.org/ns/credentials/v2",
    {
      "@vocab": "https://windsurf.grotto-networking.com/selective#"
    }
  ],
  "type": [
    "VerifiableCredential"
  ],
  "credentialSubject": {
    "sailNumber": "Earth101",
    "sails": [
      {
        "size": 6.1,
        "sailName": "Lahaina",
        "year": 2023
      },
      {
        "size": 7,
        "sailName": "Lahaina",
        "year": 2020
      }
    ],
    "boards": [
      {
        "year": 2022,
        "boardName": "CompFoil170",
        "brand": "Wailea"
      },
      {
        "boardName": "Kanaha Custom",
        "brand": "Wailea",
        "year": 2019
      }
    ]
  }
}

Now that we know what the revealed document looks like, we need to furnish the verifier with appropriately updated information about which statements are mandatory, and with the indexes of the selected non-mandatory statements. Running step 6 of the algorithm of Section 3.2.3 createDisclosureData yields an abundance of information about the various statement groups relative to the original document. Below we show a portion of the indexes for those groups.

Example 18: Derived Group Indexes
{
  "combinedIndexes": [0, 1, 2, 6, 7, 8, 12, 13, 14, 15, 16, 17, 18, 19, 20, 23, 24, 25, 26],
  "mandatoryIndexes": [0, 1, 2, 8, 12, 13, 14, 15, 16, 17, 19, 20, 23 ],
  "nonMandatoryIndexes": [3, 4, 5, 6, 7, 9, 10, 11, 18, 21, 22, 24, 25, 26],
  "selectiveIndexes": [6, 7, 8, 12, 13, 17, 18, 24, 25, 26]
}

The verifier needs to be able to aggregate and hash the mandatory statements. To enable this, we furnish them with the indexes of the mandatory statements adjusted to their positions in the reveal document (i.e., relative to the combinedIndexes), while the selectiveIndexes are adjusted relative to their positions within the nonMandatoryIndexes. These "adjusted" indexes are shown below.

Example 19: Adjusted Mandatory and Selective Indexes
{
  "adjMandatoryIndexes":[0,1,2,5,6,7,8,9,10,11,13,14,15],
  "adjSelectiveIndexes":[3,4,8,11,12,13]
}
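
A non-normative sketch of this adjustment, reproducing Example 19 from the groups in Example 18, is shown below.

# Re-express mandatory indexes as positions within combinedIndexes, and
# selective indexes as positions within nonMandatoryIndexes (selective
# statements that are also mandatory are dropped).
combined = [0, 1, 2, 6, 7, 8, 12, 13, 14, 15, 16, 17, 18, 19, 20, 23, 24, 25, 26]
mandatory = [0, 1, 2, 8, 12, 13, 14, 15, 16, 17, 19, 20, 23]
non_mandatory = [3, 4, 5, 6, 7, 9, 10, 11, 18, 21, 22, 24, 25, 26]
selective = [6, 7, 8, 12, 13, 17, 18, 24, 25, 26]

adj_mandatory_indexes = [combined.index(i) for i in mandatory]
adj_selective_indexes = [non_mandatory.index(i) for i in selective if i in non_mandatory]

assert adj_mandatory_indexes == [0, 1, 2, 5, 6, 7, 8, 9, 10, 11, 13, 14, 15]
assert adj_selective_indexes == [3, 4, 8, 11, 12, 13]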

The last important piece of disclosure data is a mapping of canonical blank node IDs to HMAC-based shuffled IDs, the labelMap, computed according to Section 3.2.3 createDisclosureData. This is shown below along with the rest of the disclosure data minus the reveal document.

Example 20: Disclosure Data
{
  "bbsProof":"b29c719aba8103c713c5facba9b690930ad458816645adc1a53b251010bc3b128d72580239f66ff4e9739e28425794e881b5737fb3abce02b2655d4fb3babebd515685ce7567eab5bd01360e8131150576357509db309294569d822d56e1c581420a8af29b7c7984d50fd5c79a06d64a2586da8a24e93c3742d09f2c0e24d7fe4891927c7ffe408d563a64f586737867a1f020f742fc6eaa1d37eda426c9c75566de8be54822f69749fc462c86caaaf4f9f73ee1b08726f378432e382322a3cc0e87d5b23fc36364bc5c94cfb8a305be6f912bd7152e7a48d4d41571c653d58e5fea8a8238e05aea910e5b62c9d15b8d527c0d59f619fbab6a8799b1ce1da13c6516c23eefc03b247672878c34949943e02f4b3991139276c89a00c4ee64bbce570201ac3502fb4769e6b869919320ad9f3121dfeeecdb2914cfc7d4a386b6153f54b18b4148742ec7b66c81cff0b1de88d2d299f35f2ff817fb422fe0bbf65b5cd7deb939a10cc524f08eff46f31b5631afbd0551d9816e32fb2e4bb7214ce76136057c1298e2a161b5ec3280f0530130ab9600426c7e521d1b893850ae83cf4f211987c93f3a41c16b0cbac29e5dcf88eb65892518f643d5c2acd4888045d4",
  "labelMap":{"dataType":"Map",
    "value":[["c14n0","b2"],["c14n1","b4"],["c14n2","b7"],["c14n3","b6"],["c14n4","b5"],["c14n5","b0"]]
  },
  "mandatoryIndexes":[0,1,2,5,6,7,8,9,10,11,13,14,15],
  "adjSelectiveIndexes":[3,4,8,11,12,13]
}

Finally, using the disclosure data above with the algorithm of Section 3.2.6 serializeDerivedProofValue, we obtain the signed derived (reveal) document shown below.

Example 21: Signed Derived Document
{
  "@context": [
    "https://www.w3.org/ns/credentials/v2",
    {
      "@vocab": "https://windsurf.grotto-networking.com/selective#"
    }
  ],
  "type": [
    "VerifiableCredential"
  ],
  "credentialSubject": {
    "sailNumber": "Earth101",
    "sails": [
      {
        "size": 6.1,
        "sailName": "Lahaina",
        "year": 2023
      },
      {
        "size": 7,
        "sailName": "Lahaina",
        "year": 2020
      }
    ],
    "boards": [
      {
        "year": 2022,
        "boardName": "CompFoil170",
        "brand": "Wailea"
      },
      {
        "boardName": "Kanaha Custom",
        "brand": "Wailea",
        "year": 2019
      }
    ]
  },
  "proof": {
    "type": "DataIntegrityProof",
    "cryptosuite": "bbs-2023",
    "created": "2023-08-15T23:36:38Z",
    "verificationMethod": "did:key:zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ#zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ",
    "proofPurpose": "assertionMethod",
    "proofValue": "u2V0DhNhAWQHAspxxmrqBA8cTxfrLqbaQkwrUWIFmRa3BpTslEBC8OxKNclgCOfZv9OlznihCV5TogbVzf7OrzgKyZV1Ps7q-vVFWhc51Z-q1vQE2DoExFQV2NXUJ2zCSlFadgi1W4cWBQgqK8pt8eYTVD9XHmgbWSiWG2ook6Tw3QtCfLA4k1_5IkZJ8f_5AjVY6ZPWGc3hnofAg90L8bqodN-2kJsnHVWbei-VIIvaXSfxGLIbKqvT59z7hsIcm83hDLjgjIqPMDofVsj_DY2S8XJTPuKMFvm-RK9cVLnpI1NQVccZT1Y5f6oqCOOBa6pEOW2LJ0VuNUnwNWfYZ-6tqh5mxzh2hPGUWwj7vwDskdnKHjDSUmUPgL0s5kROSdsiaAMTuZLvOVwIBrDUC-0dp5rhpkZMgrZ8xId_u7NspFM_H1KOGthU_VLGLQUh0Lse2bIHP8LHeiNLSmfNfL_gX-0Iv4Lv2W1zX3rk5oQzFJPCO_0bzG1Yxr70FUdmBbjL7Lku3IUznYTYFfBKY4qFhtewygPBTATCrlgBCbH5SHRuJOFCug89PIRmHyT86QcFrDLrCnl3PiOtliSUY9kPVwqzUiIBF1KYAAgEEAgcDBgQFBQCNAAECBQYHCAkKCw0OD4YDBAgLDA0"
  }
}
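
The derived proofValue above begins with the bbs-2023 derived proof header bytes 0xd9, 0x5d, 0x03, followed by a CBOR encoding of four components: the bbsProof, a compressed form of the labelMap in which "c14nN" keys and "bM" values become the integers N and M, the adjusted mandatoryIndexes, and the adjSelectiveIndexes. The following non-normative sketch mirrors the base proof sketch in A.1, with the same caveat that the test vector's typed-array tagging of byte strings means a byte-for-byte match is not guaranteed.

import base64
import cbor2  # assumed third-party CBOR library

BBS_DERIVED_PROOF_HEADER = bytes([0xD9, 0x5D, 0x03])

def compress_label_map(label_map: dict[str, str]) -> dict[int, int]:
    # "c14n0" -> 0 on the key side, "b2" -> 2 on the value side.
    return {int(k.removeprefix("c14n")): int(v.removeprefix("b"))
            for k, v in label_map.items()}

def serialize_derived_proof_value(bbs_proof: bytes, label_map: dict[str, str],
                                  mandatory_indexes: list[int],
                                  selective_indexes: list[int]) -> str:
    # CBOR-encode the four components behind the derived proof header, then
    # multibase-encode as base64url without padding ("u" prefix).
    components = [bbs_proof, compress_label_map(label_map),
                  mandatory_indexes, selective_indexes]
    payload = BBS_DERIVED_PROOF_HEADER + cbor2.dumps(components)
    return "u" + base64.urlsafe_b64encode(payload).rstrip(b"=").decode("ascii")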

B. Acknowledgements

Portions of the work on this specification have been funded by the United States Department of Homeland Security's (US DHS) Silicon Valley Innovation Program under contracts 70RSAT20T00000003 and 70RSAT20T00000033. The content of this specification does not necessarily reflect the position or the policy of the U.S. Government, and no official endorsement should be inferred.

C. References

C.1 Normative references

[CFRG-BBS-SIGNATURE]
The BBS Signature Scheme. Tobias Looker; Vasilis Kalos; Andrew Whitehead; Mike Lodder. IRTF. Internet-Draft. URL: https://www.ietf.org/archive/id/draft-irtf-cfrg-bbs-signatures-02.html
[DI-ECDSA]
The Elliptic Curve Digital Signature Algorithm Cryptosuites v1.0. David Longley; Manu Sporny; Marty Reed. W3C Verifiable Credentials Working Group. W3C Working Draft. URL: https://www.w3.org/TR/vc-di-ecdsa/
[MULTIBASE]
Multibase. IETF. Internet-Draft. URL: https://tools.ietf.org/html/draft-multiformats-multibase-01
[RDF-CANON]
RDF Dataset Canonicalization. Gregg Kellogg; Dave Longley; Dan Yamamoto. W3C. 31 October 2023. W3C Candidate Recommendation. URL: https://www.w3.org/TR/rdf-canon/
[RDF-DATASET-NORMALIZATION]
RDF Dataset Normalization 1.0. David Longley; Manu Sporny. JSON-LD Community Group. Community Group Draft Report. URL: http://json-ld.github.io/normalization/spec/
[RFC2119]
Key words for use in RFCs to Indicate Requirement Levels. S. Bradner. IETF. March 1997. Best Current Practice. URL: https://www.rfc-editor.org/rfc/rfc2119
[RFC3986]
Uniform Resource Identifier (URI): Generic Syntax. T. Berners-Lee; R. Fielding; L. Masinter. IETF. January 2005. Internet Standard. URL: https://www.rfc-editor.org/rfc/rfc3986
[RFC8174]
Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words. B. Leiba. IETF. May 2017. Best Current Practice. URL: https://www.rfc-editor.org/rfc/rfc8174
[VC-DATA-INTEGRITY]
Verifiable Credential Data Integrity 1.0. David Longley; Manu Sporny. W3C Verifiable Credentials Working Group. W3C Working Draft. URL: https://www.w3.org/TR/vc-data-integrity/
[XMLSCHEMA11-2]
W3C XML Schema Definition Language (XSD) 1.1 Part 2: Datatypes. David Peterson; Sandy Gao; Ashok Malhotra; Michael Sperberg-McQueen; Henry Thompson; Paul V. Biron et al. W3C. 5 April 2012. W3C Recommendation. URL: https://www.w3.org/TR/xmlschema11-2/

C.2 Informative references

[DID-CORE]
Decentralized Identifiers (DIDs) v1.0. Manu Sporny; Amy Guy; Markus Sabadello; Drummond Reed. W3C. 19 July 2022. W3C Recommendation. URL: https://www.w3.org/TR/did-core/
[MULTICODEC]
Multicodec. URL: https://github.com/multiformats/multicodec/
[VC-DATA-MODEL-2.0]
Verifiable Credentials Data Model v2.0. Manu Sporny; Orie Steele; Michael Jones; Gabe Cohen; Oliver Terbu. W3C. 1 November 2023. W3C Working Draft. URL: https://www.w3.org/TR/vc-data-model-2.0/