Copyright © 2023 World Wide Web Consortium. W3C® liability, trademark and permissive document license rules apply.
This specification describes a Data Integrity Cryptosuite for use when generating a digital signature using the Elliptic Curve Digital Signature Algorithm (ECDSA).
This section describes the status of this document at the time of its publication. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at https://www.w3.org/TR/.
This is an experimental specification and is undergoing regular revisions. It is not fit for production deployment.
This document was published by the Verifiable Credentials Working Group as a Working Draft using the Recommendation track.
Publication as a Working Draft does not imply endorsement by W3C and its Members.
This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.
This document was produced by a group operating under the W3C Patent Policy. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.
This document is governed by the 12 June 2023 W3C Process Document.
This specification defines a cryptographic suite for the purpose of creating and verifying proofs for ECDSA signatures in conformance with the Data Integrity [VC-DATA-INTEGRITY] specification. ECDSA signatures are specified in [FIPS-186-5] with elliptic curves P-256 and P-384 specified in [NIST-SP-800-186]. [FIPS-186-5] includes the deterministic ECDSA algorithm which is also specified in [RFC6979].
This specification uses either the RDF Dataset Canonicalization Algorithm [RDF-CANON] or the JSON Canonicalization Scheme [RFC8785] to transform the input document into its canonical form. It uses one of two mechanisms to digest and sign: SHA-256 [RFC6234] as the message digest algorithm and ECDSA with Curve P-256 as the signature algorithm, or SHA-384 [RFC6234] as the message digest algorithm and ECDSA with Curve P-384 as the signature algorithm.
The elliptic curves P-256 and P-384 of [NIST-SP-800-186] are referred to as secp256r1 and secp384r1 respectively in [SECG2]. In addition, this notation is sometimes used in ECDSA software libraries.
This section defines the terms used in this specification. A link to these terms is included whenever they appear in this specification.
A string value that specifies the operational domain of a digital proof. This could be an Internet domain name like example.com, an ad-hoc value such as mycorp-level3-access, or a very specific transaction value like 8zF6T8J34qP3mqP. A signer could include a domain in its digital proof to restrict its use to a particular target, identified by the specified domain.
The entity identified by the id property in a controller document. Anything can be a subject: a person, a group, an organization, a physical thing, a digital thing, a logical thing, etc.
A set of parameters that can be used together with a process to independently verify a proof. For example, a cryptographic public key can be used as a verification method with respect to a digital signature; in such usage, it verifies that the signer possessed the associated cryptographic private key.
"Verification" and "proof" in this definition are intended to apply broadly. For example, a cryptographic public key might be used during Diffie-Hellman key exchange to negotiate a shared symmetric key for encryption. This guarantees the integrity of the key agreement process. It is thus another type of verification method, even though descriptions of the process might not use the words "verification" or "proof."
As well as sections marked as non-normative, all authoring guidelines, diagrams, examples, and notes in this specification are non-normative. Everything else in this specification is normative.
The key words MAY, MUST, and MUST NOT in this document are to be interpreted as described in BCP 14 [RFC2119] [RFC8174] when, and only when, they appear in all capitals, as shown here.
A conforming proof is any concrete expression of the data model that complies with the normative statements in this specification. Specifically, all relevant normative statements in Sections 2. Data Model and 3. Algorithms of this document MUST be enforced.
A conforming processor is any algorithm realized as software and/or hardware that generates or consumes a conforming proof. Conforming processors MUST produce errors when non-conforming documents are consumed.
This document also contains examples that contain JSON and JSON-LD content. Some
of these examples contain characters that are invalid JSON, such as inline
comments (//) and the use of ellipsis (...) to denote
information that adds little value to the example. Implementers are cautioned to
remove this content if they desire to use the information as valid JSON or
JSON-LD.
The following sections outline the data model that is used by this specification to express verification methods, such as cryptographic public keys, and data integrity proofs, such as digital signatures.
These verification methods are used to verify Data Integrity Proofs [VC-DATA-INTEGRITY] produced using Elliptic Curve cryptographic key material that is compliant with [FIPS-186-5]. The encoding formats for these key types are provided in this section. Lossless cryptographic key transformation processes that result in equivalent cryptographic key material MAY be used during the processing of digital signatures.
The Multikey format, as defined in [VC-DATA-INTEGRITY], is used to express public keys for the cryptographic suites defined in this specification.
The publicKeyMultibase property represents a Multibase-encoded Multikey expression of a P-256 or P-384 public key. The encoding of a P-256 public key is the two-byte prefix 0x8024 (the varint expression of the multicodec value 0x1200) followed by the 33-byte compressed public key data. The resulting 35-byte value is then encoded using the base-58-btc alphabet and prepended with the Multibase base-58-btc header (z). The encoding of a P-384 public key is the two-byte prefix 0x8124 (the varint expression of the multicodec value 0x1201) followed by the 49-byte compressed public key data. The resulting 51-byte value is then encoded using the base-58-btc alphabet and prepended with the Multibase base-58-btc header (z). Any other encodings MUST NOT be allowed.
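As a non-normative sketch, the encoding steps above can be expressed in JavaScript. The helper names base58btcEncode and encodeP256PublicKeyMultibase are illustrative, not part of this specification; a production implementation would use a vetted Multibase library.

```javascript
// Minimal base-58-btc encoder (BigInt-based), for illustration only.
const B58_ALPHABET =
  '123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz';

function base58btcEncode(bytes) {
  let n = 0n;
  for (const b of bytes) n = n * 256n + BigInt(b);
  let out = '';
  while (n > 0n) {
    out = B58_ALPHABET[Number(n % 58n)] + out;
    n /= 58n;
  }
  // Preserve leading zero bytes as '1' characters.
  for (const b of bytes) {
    if (b !== 0) break;
    out = '1' + out;
  }
  return out;
}

function encodeP256PublicKeyMultibase(compressedKey) {
  if (compressedKey.length !== 33) {
    throw new Error('Expected a 33-byte compressed P-256 public key.');
  }
  const prefixed = new Uint8Array(35);
  prefixed.set([0x80, 0x24]); // varint expression of the multicodec value 0x1200
  prefixed.set(compressedKey, 2);
  return 'z' + base58btcEncode(prefixed); // 'z' is the base-58-btc Multibase header
}
```

Decoding reverses these steps: strip the z, base-58-btc-decode, check the two-byte multicodec prefix, and take the remaining 33 bytes of compressed key data.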
Developers are advised not to accidentally publish a representation of a private key. Implementations of this specification will raise errors if a [MULTICODEC] value other than 0x1200 or 0x1201 is used in a publicKeyMultibase value.
{
"id": "https://example.com/issuer/123#key-0",
"type": "Multikey",
"controller": "https://example.com/issuer/123",
"publicKeyMultibase": "zDnaerx9CtbPJ1q36T5Ln5wYt3MQYeGRG5ehnPAmxcf5mDZpv"
}
{
"id": "https://example.com/issuer/123#key-0",
"type": "Multikey",
"controller": "https://example.com/issuer/123",
"publicKeyMultibase": "z82LkvCwHNreneWpsgPEbV3gu1C6NFJEBg4srfJ5gdxEsMGRJUz2sG9FE42shbn2xkZJh54"
}
{
"@context": [
"https://www.w3.org/ns/did/v1",
"https://w3id.org/security/data-integrity/v1"
],
"id": "did:example:123",
"verificationMethod": [{
"id": "https://example.com/issuer/123#key-1",
"type": "Multikey",
"controller": "https://example.com/issuer/123",
"publicKeyMultibase": "zDnaerx9CtbPJ1q36T5Ln5wYt3MQYeGRG5ehnPAmxcf5mDZpv"
}, {
"id": "https://example.com/issuer/123#key-2",
"type": "Multikey",
"controller": "https://example.com/issuer/123",
"publicKeyMultibase": "z82LkvCwHNreneWpsgPEbV3gu1C6NFJEBg4srfJ5gdxEsMGRJUz2sG9FE42shbn2xkZJh54"
}],
"authentication": [
"did:example:123#key-1"
],
"assertionMethod": [
"did:example:123#key-2"
],
"capabilityDelegation": [
"did:example:123#key-2"
],
"capabilityInvocation": [
"did:example:123#key-2"
]
}
This suite relies on detached digital signatures represented using [MULTIBASE] and [MULTICODEC].
The verificationMethod property of the proof MUST be a URL.
Dereferencing the verificationMethod MUST result in an object
containing a type property with the value set to
Multikey.
The type property of the proof MUST be DataIntegrityProof.
The cryptosuite property of the proof MUST be ecdsa-rdfc-2019 or ecdsa-jcs-2019.
The created property of the proof MUST be an [XMLSCHEMA11-2]
formatted date string.
The proofPurpose property of the proof MUST be a string, and MUST
match the verification relationship expressed by the verification method
controller.
The proofValue property of the proof MUST be an ECDSA or deterministic ECDSA
signature produced according to [FIPS-186-5] using the curves and hashes as
specified in section 3. Algorithms, encoded according to section 7
of [RFC4754] (sometimes referred to as the IEEE P1363 format), and serialized
according to [MULTIBASE] using the base58-btc base encoding.
{
"@context": [
{"title": "https://schema.org/title"},
"https://w3id.org/security/data-integrity/v1"
],
"title": "Hello world!",
"proof": {
"type": "DataIntegrityProof",
"cryptosuite": "ecdsa-rdfc-2019",
"created": "2020-11-05T19:23:24Z",
"verificationMethod": "https://example.com/issuer/123#key-2",
"proofPurpose": "assertionMethod",
"proofValue": "z4oey5q2M3XKaxup3tmzN4DRFTLVqpLMweBrSxMY2xHX5XTYVQeVbY8nQAVHMrXFkXJpmEcqdoDwLWxaqA3Q1geV6"
}
}
The following section describes multiple Data Integrity cryptographic suites that utilize the Elliptic Curve Digital Signature Algorithm (ECDSA) [FIPS-186-5].
The ecdsa-rdfc-2019 cryptographic suite takes an input document, canonicalizes
the document using the Universal RDF Dataset Canonicalization Algorithm
[RDF-CANON], and then cryptographically hashes and signs the output
resulting in the production of a data integrity proof. The algorithms in this
section also include the verification of such a data integrity proof.
When the RDF Dataset Canonicalization Algorithm [RDF-CANON] is used, implementations of that algorithm will detect dataset poisoning by default, and abort processing upon detection.
To generate a proof, the algorithm in Section 4.1: Add Proof in the Data Integrity [VC-DATA-INTEGRITY] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation algorithm is defined in Section 3.1.3 Transformation (ecdsa-rdfc-2019), the hashing algorithm is defined in Section 3.1.4 Hashing (ecdsa-rdfc-2019), and the proof serialization algorithm is defined in Section 3.1.6 Proof Serialization (ecdsa-rdfc-2019).
To verify a proof, the algorithm in Section 4.2: Verify Proof in the Data Integrity [VC-DATA-INTEGRITY] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation algorithm is defined in Section 3.1.3 Transformation (ecdsa-rdfc-2019), the hashing algorithm is defined in Section 3.1.4 Hashing (ecdsa-rdfc-2019), and the proof verification algorithm is defined in Section 3.1.7 Proof Verification (ecdsa-rdfc-2019).
The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm in Section 3.1.4 Hashing (ecdsa-rdfc-2019).
Required inputs to this algorithm are an unsecured data document (unsecuredDocument) and transformation options (options). The transformation options MUST contain a type identifier for the cryptographic suite (type) and a cryptosuite identifier (cryptosuite). A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.
If options.type is not set to the string DataIntegrityProof and options.cryptosuite is not set to the string ecdsa-rdfc-2019, then a PROOF_TRANSFORMATION_ERROR MUST be raised.
The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the algorithms in Section 3.1.6 Proof Serialization (ecdsa-rdfc-2019) or Section 3.1.7 Proof Verification (ecdsa-rdfc-2019). One must use the hash algorithm appropriate in security level to the curve used, i.e., for curve P-256 one uses SHA-256 and for curve P-384 one uses SHA-384.
The required inputs to this algorithm are a transformed data document (transformedDocument) and canonical proof configuration (canonicalProofConfig). A single hash data value represented as series of bytes is produced as output.
The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the proof hashing algorithm.
The required inputs to this algorithm are proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MUST contain a cryptosuite identifier (cryptosuite). A proof configuration object is produced as output.
If proofConfig.type is not set to DataIntegrityProof and proofConfig.cryptosuite is not set to ecdsa-rdfc-2019, an INVALID_PROOF_CONFIGURATION error MUST be raised.
If proofConfig.created is set and the value is not a valid [XMLSCHEMA11-2] datetime, an INVALID_PROOF_DATETIME error MUST be raised.
The following algorithm specifies how to serialize a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData) and proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A single digital proof value represented as series of bytes is produced as output.
The following algorithm specifies how to verify a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData), a digital signature (proofBytes) and proof options (options). A verification result represented as a boolean value is produced as output.
The ecdsa-jcs-2019 cryptographic suite takes an input document, canonicalizes
the document using the JSON Canonicalization Scheme [RFC8785], and then
cryptographically hashes and signs the output
resulting in the production of a data integrity proof. The algorithms in this
section also include the verification of such a data integrity proof.
To generate a proof, the algorithm in Section 4.1: Add Proof of the Data Integrity [VC-DATA-INTEGRITY] specification MUST be executed. For that algorithm, the cryptographic suite-specific transformation algorithm is defined in Section 3.2.3 Transformation (ecdsa-jcs-2019), the hashing algorithm is defined in Section 3.2.4 Hashing (ecdsa-jcs-2019), and the proof serialization algorithm is defined in Section 3.2.6 Proof Serialization (ecdsa-jcs-2019).
To verify a proof, the algorithm in Section 4.2: Verify Proof of the Data Integrity [VC-DATA-INTEGRITY] specification MUST be executed. For that algorithm, the cryptographic suite-specific transformation algorithm is defined in Section 3.2.3 Transformation (ecdsa-jcs-2019), the hashing algorithm is defined in Section 3.2.4 Hashing (ecdsa-jcs-2019), and the proof verification algorithm is defined in Section 3.2.7 Proof Verification (ecdsa-jcs-2019).
The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm in Section 3.2.4 Hashing (ecdsa-jcs-2019).
Required inputs to this algorithm are an unsecured data document (unsecuredDocument) and transformation options (options). The transformation options MUST contain a type identifier for the cryptographic suite (type) and a cryptosuite identifier (cryptosuite). A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.
If options.type is not set to the string DataIntegrityProof and options.cryptosuite is not set to the string ecdsa-jcs-2019, then a PROOF_TRANSFORMATION_ERROR MUST be raised.
The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the algorithms in Section 3.2.6 Proof Serialization (ecdsa-jcs-2019) or Section 3.2.7 Proof Verification (ecdsa-jcs-2019). One must use the hash algorithm appropriate in security level to the curve used, i.e., for curve P-256 one uses SHA-256, and for curve P-384 one uses SHA-384.
The required inputs to this algorithm are a transformed data document (transformedDocument) and a canonical proof configuration (canonicalProofConfig). A single hash data value represented as series of bytes is produced as output.
The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the proof hashing algorithm.
The required inputs to this algorithm are proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MUST contain a cryptosuite identifier (cryptosuite). A proof configuration object is produced as output.
If proofConfig.type is not set to DataIntegrityProof and proofConfig.cryptosuite is not set to ecdsa-jcs-2019, an INVALID_PROOF_CONFIGURATION error MUST be raised.
If proofConfig.created is set and the value is not a valid [XMLSCHEMA11-2] datetime, an INVALID_PROOF_DATETIME error MUST be raised.
The following algorithm specifies how to serialize a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData) and proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A single digital proof value represented as series of bytes is produced as output.
The following algorithm specifies how to verify a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData), a digital signature (proofBytes), and proof options (options). A verification result represented as a boolean value is produced as output.
The Working Group is seeking implementer feedback on these generalized selective disclosure functions as well as horizontal security review on the features from parties at W3C and IETF. Those reviews might result in significant changes to these functions, migration of these functions to the core Data Integrity specification (for use by other cryptographic suites), or the removal of the functions from the specification during the Candidate Recommendation phase.
The following section contains a set of functions that are used throughout cryptographic suites that perform selective disclosure.
The following algorithm canonizes a JSON-LD document and replaces any blank node identifiers in the canonicalized document by applying a label replacement function, labelReplacementFunction. The required inputs are a JSON-LD document (document) and a label replacement function (labelReplacementFunction). An N-Quads representation of the canonized result, with the replaced blank node labels, and a map from the old blank node IDs to the new blank node IDs, bnodeIdMap, is produced as output.
The following algorithm creates a label replacement function that uses an HMAC to replace canonical blank node identifiers with their encoded HMAC digests. The required input is a canonical node identifier map (canonicalIdMap). A blank node identifier map, bnodeIdMap, is produced as output.
A different primitive could be created that sorted the resulting HMAC digests and assigned labels using a prefix and integers based on their sorted order instead. This primitive might be useful for index-based selective disclosure schemes such as BBS.
The following algorithm creates a label replacement function that uses a label map to replace canonical blank node identifiers with the associated value from the label map. The required input is a label map (labelMap). A function, labelMapReplacementFunction, is produced as output.
The following algorithm replaces all blank node identifiers in an array of N-Quad statements with a URN. The required inputs are an array of N-Quad strings (inputNquads) and a URN scheme (urnScheme). An array of N-Quad strings, skolemizedNquads, is produced as output.
s1.replace(/(_:([^\s]+))/g, '<urn:custom-scheme:$2>').
The following algorithm replaces all custom scheme URNs in an array of N-Quad statements with a blank node identifier. The required inputs are an array of N-Quad strings (inputNquads) and a URN scheme (urnScheme). An array of N-Quad strings, deskolemizedNquads, is produced as output.
s1.replace(/(<urn:custom-scheme:([^>]+)>)/g, '_:$2').
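For illustration, the two replacement expressions from this and the previous algorithm can be wrapped as helpers and round-tripped over a single N-Quad; the nquad value below is a made-up example.

```javascript
// The skolemize and deskolemize replacement steps, wrapped as helpers,
// using "custom-scheme" as the URN scheme.
const skolemize = s1 =>
  s1.replace(/(_:([^\s]+))/g, '<urn:custom-scheme:$2>');
const deskolemize = s1 =>
  s1.replace(/(<urn:custom-scheme:([^>]+)>)/g, '_:$2');

const nquad = '_:b0 <http://schema.org/name> "Jane" .';
const skolemized = skolemize(nquad);     // blank node label becomes a URN
const roundTripped = deskolemize(skolemized); // URN becomes the label again
```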
The following algorithm converts an array of N-Quads to a skolemized JSON-LD document. The required inputs are an array of N-Quad strings (inputNquads). A JSON-LD document, skolemizedJSONLD, is produced as output.
Set skolemizedQuads to the result of calling the algorithm in Section 3.3.4 skolemize, with inputNquads and "custom-scheme" as parameters. Implementations MAY choose a urnSchemeName that is different from "custom-scheme" so long as the same scheme name is used in the algorithm in Section 3.3.7 toDeskolemizedRDF.
Join skolemizedQuads into a single N-Quads string, dataset.
Convert dataset from RDF to a JSON-LD document, skolemizedJSONLD.
The following algorithm converts a skolemized JSON-LD document, such as one created using the algorithm in Section 3.3.6 toSkolemizedJSONLD, to an array of deskolemized N-Quads. The required inputs are a JSON-LD document, skolemizedJSONLD. An array of deskolemized N-Quad strings (outputNquads) is produced as output.
Set skolemizedDataset to the result of calling the Deserialize JSON-LD to RDF algorithm, passing any custom options (such as a document loader), to convert skolemizedJSONLD from JSON-LD to RDF in N-Quads format.
Split skolemizedDataset into an array of individual N-Quads, skolemizedNquads.
Set outputNquads to the result of calling the algorithm in Section 3.3.5 deskolemize, with skolemizedNquads and "custom-scheme" as parameters. Implementations MAY choose a urnSchemeName that is different from "custom-scheme" so long as the same scheme name is used in the algorithm in Section 3.3.6 toSkolemizedJSONLD.
The following algorithm converts a JSON Pointer [RFC6901] to an array of paths into a JSON tree. The required input is a JSON Pointer string (pointer). An array of paths (paths) is produced as output.
Initialize paths to an empty array.
Initialize splitPath to an array by splitting pointer on the "/" character and skipping the first, empty, split element. In JavaScript notation, this step is equivalent to the following code: pointer.split('/').slice(1)
For each path in splitPath:
If path does not include ~, then add path to paths, converting it to an integer if it parses as one, leaving it as a string if it does not.
Otherwise, unescape any JSON Pointer escape sequences in path and add the result to paths.
Return paths.
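As a non-normative sketch, the algorithm above can be implemented in JavaScript, with JSON Pointer escape handling per [RFC6901]:

```javascript
// Sketch of jsonPointerToPaths, following RFC 6901.
function jsonPointerToPaths(pointer) {
  const paths = [];
  // Split on "/" and skip the first, empty, split element.
  const splitPath = pointer.split('/').slice(1);
  for (const path of splitPath) {
    if (!path.includes('~')) {
      // Convert to an integer if it parses as one; otherwise keep the string.
      paths.push(/^\d+$/.test(path) ? parseInt(path, 10) : path);
    } else {
      // Unescape JSON Pointer escape sequences: "~1" => "/", then "~0" => "~".
      paths.push(path.replace(/~1/g, '/').replace(/~0/g, '~'));
    }
  }
  return paths;
}
```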
The following algorithm creates an initial JSON-LD frame based on a JSON-LD object. This is a helper function used within the algorithm in Section 3.3.10 jsonPointersToFrame. The required input is a JSON-LD object (value). A JSON-LD frame (frame) is produced as output.
If value has an id that is not a blank node identifier, set frame.id to its value. Note: All non-blank node identifiers in the path of any JSON Pointer MUST be included in the frame; this includes any root document identifier.
If value.type is set, set frame.type to its value. Note: All types in the path of any JSON Pointer MUST be included in the frame; this includes any root document type.
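These two steps can be sketched non-normatively as follows, assuming value is a JSON-LD object whose blank node identifiers use the conventional "_:" prefix:

```javascript
// Sketch of createInitialFrame.
function createInitialFrame(value) {
  const frame = {};
  // Include any non-blank-node identifier, such as a root document identifier.
  if (typeof value.id === 'string' && !value.id.startsWith('_:')) {
    frame.id = value.id;
  }
  // Include any type, such as a root document type.
  if (value.type !== undefined) {
    frame.type = value.type;
  }
  return frame;
}
```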
The following algorithm converts an array of JSON Pointers and a JSON-LD document to a JSON-LD Frame to be used on that specific document. The required input is an array of JSON Pointers (pointers) and a JSON-LD document (document). A JSON-LD frame (frame) is produced as output.
If pointers is empty, return null.
Initialize frame to an initial frame passing document as value to the algorithm in Section 3.3.9 createInitialFrame.
For each pointer in pointers, walk the document from root to the pointer target value, building the frame:
Initialize parentFrame to frame.
Initialize parentValue to document.
Initialize value to parentValue.
Initialize valueFrame to parentFrame.
Parse pointer into an array of paths using the algorithm in Section 3.3.8 jsonPointerToPaths.
For each path in paths:
Set parentFrame to valueFrame.
Set parentValue to value.
Set value to parentValue[path]. If value is now undefined, throw an error indicating that the JSON pointer does not match the given document.
Set valueFrame to parentFrame[path].
If valueFrame is undefined:
If value is an array, set valueFrame to an empty array.
Otherwise, set valueFrame to an initial frame passing value to the algorithm in Section 3.3.9 createInitialFrame.
Set parentFrame[path] to valueFrame.
Finalize the current selection in the frame, using value and valueFrame:
If value is not an object, then a literal has been selected: set valueFrame to value.
Otherwise, if value is an array: set valueFrame to the result of mapping every element in value to a deep copy of itself. If any element in value is also an array, throw an error indicating that arrays of arrays are not supported.
Otherwise, set valueFrame to an object that merges a shallow copy of valueFrame with a deep copy of value, e.g., {...valueFrame, ...deepCopy(value)}.
If paths has a length of zero, then the whole document has been selected by the pointer: set frame to valueFrame.
Otherwise, remove the last path, lastPath, from paths.
Set parentFrame[lastPath] to valueFrame.
Set frame['@context'] to a deep copy of document['@context'].
Return frame.
The following algorithm performs a JSON-LD framing operation on a JSON-LD document with strict framing options. The required inputs are a JSON-LD Document (document) and a JSON-LD Frame (frame). A JSON-LD document (framedDocument) is generated as output.
Perform the JSON-LD Framing algorithm, passing document and frame, and setting the options requireAll, explicit, and omitGraph to true. Any additional custom options passed, such as a document loader, are included as well.
The following algorithm groups N-Quads into matching and non-matching groups. The inputs are an array of N-Quads (nquads), an optional skolemized JSON-LD document (skolemizedDocument), an optional JSON-LD frame (frame), and any options, such as a document loader, to be passed to JSON-LD APIs. Each of the output groups (matching and non-matching) is expressed as a map from an index into nquads to the N-Quad value. This algorithm uses a JSON-LD frame to match specific N-Quads in the given nquads array. It internally skolemizes and then deskolemizes any blank nodes around the framing operation to ensure blank node identifiers do not change, which would prevent the matching operation from working properly. An object containing matching and nonMatching maps of N-Quads is generated as output.
Initialize matching to an empty map.
Initialize nonMatching to an empty map.
If frame is not given or is null, then there are no matches, so:
Add every entry in nquads to nonMatching.
Return an object with "matching" set to matching and "nonMatching" set to nonMatching.
If skolemizedDocument has not been given: set skolemizedDocument to the result of calling "createSkolemizedDocument", passing nquads and any custom JSON-LD API options (such as a document loader).
Initialize framed to the result of calling "strictFrame", passing skolemizedDocument, frame, and any custom JSON-LD API options. Note: This step filters the skolemized document to get only data that matches the frame as a new JSON-LD document.
Initialize matchingDeskolemized to the result of calling "toDeskolemizedRDF", passing framed and any custom JSON-LD API options. Note: This step converts any matching data back to deskolemized N-Quads, matching their original expression.
For each entry (index, nq) in nquads:
If matchingDeskolemized includes nq, add the entry to matching; otherwise, add the entry to nonMatching.
Return an object with "matching" set to matching and "nonMatching" set to nonMatching.
The following algorithm filters N-Quads, given in an array of N-Quads and a JSON-LD filtering frame, and then groups the N-Quads that passed the filter into matching and non-matching groups based on another JSON-LD grouping frame. This function internally performs skolemization and deskolemization around framing operations to ensure that any blank node identifiers do not change, which would prevent filtering and matching operations from working properly. The inputs to the algorithm are an array of N-Quads (nquads), a JSON-LD filtering frame (filterFrame), and a JSON-LD grouping frame (groupFrame). Additionally, any custom JSON-LD API options are expected to be given as input. An object containing two properties is provided as output; matching and nonMatching each hold maps of their associated N-Quads.
Initialize skolemizedDocument to the result of calling the algorithm in Section 3.3.6 toSkolemizedJSONLD, passing nquads and any custom JSON-LD API options (such as a document loader).
Initialize filteredDocument to the result of calling the algorithm in Section 3.3.11 strictFrame, passing skolemizedDocument, filterFrame, and any custom JSON-LD API options.
Initialize filteredNQuads to the result of calling the algorithm in Section 3.3.7 toDeskolemizedRDF, passing filteredDocument and any custom JSON-LD API options.
Produce a canonical blank node identifier map, canonicalIdMap, by calling the canonicalization algorithm in [RDF-CANON], passing the joined filteredNQuads. Note: These two steps can be performed in parallel.
Produce groupResult by calling the algorithm in Section 3.3.12 groupNquads, passing filteredNQuads, filteredDocument, groupFrame, and any custom JSON-LD API options.
Note: The indexing used in groupResult is different; it contains matching and non-matching maps that use the filteredNQuads indexes. Both maps of indexes are useful to callers.
Initialize matching to a new map.
Initialize nonMatching to a new map.
Initialize filteredMatches to groupResult.matching.
Initialize filteredNonMatching to groupResult.nonMatching.
For each entry (index, nq) in nquads:
If filteredMatches includes nq, then add the entry to matching.
If filteredNonMatching includes nq, then add the entry to nonMatching.
Initialize labelMap to the reverse of canonicalIdMap; labelMap uses canonical blank node identifiers as keys and original blank node identifiers as values.
Return an object with "groupResult" set to groupResult, "labelMap" set to labelMap, "matching" set to matching, and "nonMatching" set to nonMatching.
The following algorithm cryptographically hashes an array of mandatory-to-disclose N-Quads using a provided hashing API. The required inputs are an array of mandatory-to-disclose N-Quads (mandatory) and a hashing function (hasher). A cryptographic hash (mandatoryHash) is produced as output.
Initialize bytes to the UTF-8 representation of the joined mandatory N-Quads.
Initialize mandatoryHash to the result of using hasher to hash bytes.
Return mandatoryHash.
The Working Group is seeking implementer feedback on these cryptographic suite functions as well as horizontal security review on the features from parties at W3C and IETF. Those reviews might result in significant changes to these algorithms, or the removal of the algorithms from the specification during the Candidate Recommendation phase.
This section contains subalgorithms that are useful to the ecdsa-sd-2023
cryptographic suite.
The following algorithm serializes the data that is to be signed by the private key associated with the base proof verification method. The required inputs are the proof options hash (proofHash), the proof-scoped multikey-encoded public key (publicKey), and the mandatory hash (mandatoryHash). A single sign data value, represented as series of bytes, is produced as output.
The following algorithm serializes the base proof value, including the base signature, public key, HMAC key, signatures, and mandatory pointers. The required inputs are a base signature baseSignature, a public key publicKey, an HMAC key hmacKey, an array of signatures, and an array of mandatoryPointers. A single base proof string value is produced as output.
Initialize a byte array, proofValue, that starts with the ECDSA-SD base proof header bytes 0xd9, 0x5d, and 0x00.
Initialize components to an array with five elements containing the values of: baseSignature, publicKey, hmacKey, signatures, and mandatoryPointers.
CBOR-encode components and append it to proofValue.
Initialize baseProof to a string with the multibase-base64url-no-pad-encoding of proofValue. That is, return a string starting with "u" and ending with the base64url-no-pad-encoded value of proofValue.
Return baseProof as the base proof.
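The header and multibase steps can be sketched non-normatively as follows; the CBOR encoding of the five components is assumed to be handled by a separate CBOR library, and the names BASE_PROOF_HEADER and finishBaseProof are illustrative.

```javascript
// ECDSA-SD base proof header bytes.
const BASE_PROOF_HEADER = Buffer.from([0xd9, 0x5d, 0x00]);

function finishBaseProof(cborEncodedComponents) {
  // Prepend the header to the CBOR-encoded components.
  const proofValue = Buffer.concat([BASE_PROOF_HEADER, cborEncodedComponents]);
  // 'u' is the multibase header for base64url-no-pad; Node's 'base64url'
  // encoding is unpadded.
  return 'u' + proofValue.toString('base64url');
}
```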
The following algorithm parses the components of an ecdsa-sd-2023 selective disclosure base proof value. The required input is a proof value (proofValue). A single object, parsed base proof, containing five elements, using the names "baseSignature", "publicKey", "hmacKey", "signatures", and "mandatoryPointers", is produced as output.
1. Ensure the proofValue string starts with u, indicating that it is a multibase-base64url-no-pad-encoded value, throwing an error if it does not.
2. Initialize decodedProofValue to the result of base64url-no-pad-decoding the substring after the leading u in proofValue.
3. Ensure that the decodedProofValue starts with the ECDSA-SD base proof header bytes 0xd9, 0x5d, and 0x00, throwing an error if it does not.
4. Initialize components to an array that is the result of CBOR-decoding the bytes that follow the three-byte ECDSA-SD base proof header. Ensure the result is an array of five elements.
5. Return an object with properties set to the five elements, using the names "baseSignature", "publicKey", "hmacKey", "signatures", and "mandatoryPointers", respectively.
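The multibase and header validation in this parsing algorithm can be sketched as follows (non-normative; `parse_base_proof_payload` is a hypothetical helper name, and CBOR decoding of the returned component bytes is left to a CBOR library):

```python
import base64

BASE_PROOF_HEADER = bytes([0xd9, 0x5d, 0x00])  # ECDSA-SD base proof header

def parse_base_proof_payload(proof_value):
    if not proof_value.startswith("u"):
        raise ValueError("proofValue is not multibase-base64url-no-pad-encoded")
    encoded = proof_value[1:]
    # Restore the "=" padding that the no-pad encoding strips.
    decoded = base64.urlsafe_b64decode(encoded + "=" * (-len(encoded) % 4))
    if decoded[:3] != BASE_PROOF_HEADER:
        raise ValueError("missing ECDSA-SD base proof header")
    # The remaining bytes are the CBOR-encoded five-element components array.
    return decoded[3:]
```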
The following algorithm creates data to be used to generate a derived proof. The inputs include a JSON-LD document (document), an ECDSA-SD base proof (proof), an array of JSON pointers to use to selectively disclose statements (selectivePointers), and any custom JSON-LD API options, such as a document loader. A single object, disclosure data, is produced as output, which contains the "baseSignature", "publicKey", "signatures" for "filteredSignatures", "labelMap", "mandatoryIndexes", and "revealDocument" fields.
1. Initialize baseSignature, publicKey, hmacKey, signatures, and mandatoryPointers to the values of the associated properties in the object returned when calling the algorithm in Section 3.4.3 parseBaseProofValue, passing the proofValue from proof.
2. Initialize hmac to an HMAC API using hmacKey. The HMAC uses the same hash algorithm used in the signature algorithm, i.e., SHA-256 for a P-256 curve.
3. Initialize nquads to the result of calling the algorithm in Section 3.3.2 hmacIdCanonize, passing document, hmac, and any custom JSON-LD API options as parameters. Note: This step transforms the document into an array of canonical N-Quads with pseudorandom blank node identifiers based on hmac.
4. Initialize mandatoryFrame to the result of calling the algorithm in Section 3.3.10 jsonPointersToFrame, passing document and mandatoryPointers as pointers.
5. Initialize combinedFrame to the result of calling the algorithm in Section 3.3.10 jsonPointersToFrame, passing document and the concatenation of mandatoryPointers and selectivePointers as pointers.
6. If mandatoryFrame and combinedFrame are both null, then throw an error indicating that nothing is to be disclosed.
7. Initialize revealDocument to the result of calling the algorithm in Section 3.3.11 strictFrame, passing document, combinedFrame as frame, and any custom JSON-LD API options.
8. Initialize filterAndGroupResult to the result of calling the algorithm in Section 3.3.13 filterAndGroupNquads, passing nquads, combinedFrame for filterFrame, mandatoryFrame for groupFrame, and any custom JSON-LD API options.
9. Initialize labelMap to the value of "labelMap" in filterAndGroupResult.
10. Initialize relativeMandatory to the value of "matching" in the value of "filtered" in filterAndGroupResult.
11. Initialize absoluteMandatory to the value of "matching" in filterAndGroupResult.
12. Initialize absoluteNonMandatory to the value of "nonMatching" in filterAndGroupResult.
13. Initialize mandatoryIndexes to the keys from relativeMandatory.
14. Filter signatures to match with absoluteNonMandatory map keys:
    1. Initialize index to 0.
    2. Initialize filteredSignatures to an empty array.
    3. For every signature in signatures:
       1. While index is in absoluteMandatory, increment index.
       2. If index is in absoluteNonMandatory, add signature to filteredSignatures.
       3. Increment index.
15. Return an object with properties matching baseSignature, publicKey, "signatures" for filteredSignatures, labelMap, mandatoryIndexes, and revealDocument.
The following algorithm compresses a label map. The required input is a label map (labelMap). The output is a compressed label map.
1. Initialize map to an empty map.
2. For each entry (k, v) in labelMap:
   1. Add an entry to map with a key that is a base-10 integer parsed from the characters following the "c14n" prefix in k and a value that is a byte array resulting from base64url-no-pad-decoding the characters after the "u" prefix in v.
3. Return map as compressed label map.
The following algorithm decompresses a label map. The required input is a compressed label map (compressedLabelMap). The output is a decompressed label map.
1. Initialize map to an empty map.
2. For each entry (k, v) in compressedLabelMap:
   1. Add an entry to map with a key that adds the prefix "c14n" to k and a value that adds a prefix of "u" to the base64url-no-pad-encoded value for v.
3. Return map as decompressed label map.
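These two transformations are exact inverses of one another, which the following non-normative sketch makes concrete (function names are illustrative):

```python
import base64

def compress_label_map(label_map):
    compressed = {}
    for k, v in label_map.items():
        key = int(k[len("c14n"):])   # base-10 integer after the "c14n" prefix
        encoded = v[1:]              # characters after the "u" prefix
        compressed[key] = base64.urlsafe_b64decode(
            encoded + "=" * (-len(encoded) % 4))
    return compressed

def decompress_label_map(compressed_label_map):
    return {
        "c14n%d" % k: "u" + base64.urlsafe_b64encode(v).rstrip(b"=").decode()
        for k, v in compressed_label_map.items()
    }
```

Compression replaces string keys and multibase-encoded string values with integers and raw bytes, which CBOR encodes far more compactly.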
The following algorithm serializes a derived proof value. The required inputs are a base signature (baseSignature), public key (publicKey), an array of signatures (signatures), a label map (labelMap), and an array of mandatory indexes (mandatoryIndexes). A single derived proof value, serialized as a byte string, is produced as output.
1. Initialize compressedLabelMap to the result of calling the algorithm in Section 3.4.5 compressLabelMap, passing labelMap as the parameter.
2. Initialize a byte array, proofValue, that starts with the ECDSA-SD disclosure proof header bytes 0xd9, 0x5d, and 0x01.
3. Initialize components to an array with five elements containing the values of: baseSignature, publicKey, signatures, compressedLabelMap, and mandatoryIndexes.
4. CBOR-encode components and append it to proofValue.
5. Return the derived proof as a string with the multibase-base64url-no-pad-encoding of proofValue. That is, return a string starting with "u" and ending with the base64url-no-pad-encoded value of proofValue.
The following algorithm parses the components of the derived proof value. The required input is a derived proof value (proofValue). A single derived proof value object is produced as output, containing five elements, using the names "baseSignature", "publicKey", "signatures", "labelMap", and "mandatoryIndexes".
1. Ensure the proofValue string starts with u, indicating that it is a multibase-base64url-no-pad-encoded value, throwing an error if it does not.
2. Initialize decodedProofValue to the result of base64url-no-pad-decoding the substring after the leading u in proofValue.
3. Ensure that the decodedProofValue starts with the ECDSA-SD disclosure proof header bytes 0xd9, 0x5d, and 0x01, throwing an error if it does not.
4. Initialize components to an array that is the result of CBOR-decoding the bytes that follow the three-byte ECDSA-SD disclosure proof header. Ensure the result is an array of five elements: a byte array of length 64, a byte array of length 36, an array of byte arrays, each of length 64, a map of integers to byte arrays of length 32, and an array of integers, throwing an error if not.
5. Replace the fourth element of components using the result of calling the algorithm in Section 3.4.6 decompressLabelMap, passing the existing fourth element of components as compressedLabelMap.
6. Return derived proof value as an object with properties set to the five elements, using the names "baseSignature", "publicKey", "signatures", "labelMap", and "mandatoryIndexes", respectively.
The following algorithm creates the data needed to perform verification of an ECDSA-SD-protected verifiable credential. The inputs include a JSON-LD document (document), an ECDSA-SD disclosure proof (proof), and any custom JSON-LD API options, such as a document loader. A single verify data object value is produced as output containing the following fields: "baseSignature", "proofHash", "publicKey", "signatures", "nonMandatory", and "mandatoryHash".
1. Initialize proofHash to the result of performing RDF Dataset Canonicalization [RDF-CANON] on the proof options and then cryptographically hashing the result. The hash used is the same as the one used in the signature algorithm, i.e., SHA-256 for a P-256 curve. Note: This step can be performed in parallel; it only needs to be completed before this algorithm needs to use the proofHash value.
2. Initialize baseSignature, publicKey, signatures, labelMap, and mandatoryIndexes, to the values associated with their property names in the object returned when calling the algorithm in Section 3.4.8 parseDerivedProofValue, passing proofValue from proof.
3. Initialize nquads to the result of calling the "labelReplacementCanonize" primitive, passing document, the result of calling the "labelMapCanonize" primitive (passing labelMap) as labelReplacementFunction, and any custom JSON-LD API options. Note: This step transforms the document into an array of canonical N-Quads with pseudorandom blank node identifiers based on labelMap.
4. Initialize mandatory to an empty array.
5. Initialize nonMandatory to an empty array.
6. For each entry (index, nq) in nquads, separate the N-Quads into mandatory and non-mandatory categories:
   1. If mandatoryIndexes includes index, add nq to mandatory.
   2. Otherwise, add nq to nonMandatory.
7. Initialize mandatoryHash to the result of calling the "hashMandatory" primitive, passing mandatory.
8. Return an object with properties matching baseSignature, proofHash, publicKey, signatures, nonMandatory, and mandatoryHash.
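The mandatory/non-mandatory partition performed in the loop above can be sketched as follows (non-normative; `split_nquads` is a hypothetical helper name):

```python
def split_nquads(nquads, mandatory_indexes):
    # Partition canonical N-Quads into mandatory and non-mandatory arrays,
    # preserving order, based on whether each index is in mandatory_indexes.
    mandatory, non_mandatory = [], []
    for index, nq in enumerate(nquads):
        if index in mandatory_indexes:
            mandatory.append(nq)
        else:
            non_mandatory.append(nq)
    return mandatory, non_mandatory
```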
The Working Group is seeking implementer feedback on this cryptographic suite as well as horizontal security review on the feature from parties at W3C and IETF. Those reviews might result in significant changes to this algorithm, or the removal of the algorithm from the specification during the Candidate Recommendation phase.
The ecdsa-sd-2023 cryptographic suite takes an input document, canonicalizes
the document using the Universal RDF Dataset Canonicalization Algorithm
[RDF-CANON], and then cryptographically hashes and signs the output
resulting in the production of a data integrity proof. The algorithms in this
section also include the verification of such a data integrity proof.
To generate a base proof, the algorithm in Section 4.1: Add Proof in the Data Integrity [VC-DATA-INTEGRITY] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation algorithm is defined in Section 3.5.2 Base Proof Transformation (ecdsa-sd-2023), the hashing algorithm is defined in Section 3.5.3 Base Proof Hashing (ecdsa-sd-2023), and the proof serialization algorithm is defined in Section 3.5.5 Base Proof Serialization (ecdsa-sd-2023).
The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm in Section 3.5.3 Base Proof Hashing (ecdsa-sd-2023).
Required inputs to this algorithm are an unsecured data document (unsecuredDocument) and transformation options (options). The transformation options MUST contain a type identifier for the cryptographic suite (type), a cryptosuite identifier (cryptosuite), and a verification method (verificationMethod). The transformation options MUST contain an array of mandatory JSON pointers (mandatoryPointers) and MAY contain additional options, such as a JSON-LD document loader. A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.
1. Initialize hmac to an HMAC API using a locally generated and exportable HMAC key. The HMAC uses the same hash algorithm used in the signature algorithm, which is detected via the verificationMethod provided to the function, i.e., SHA-256 for a P-256 curve.
2. Initialize nquads to the result of calling the algorithm in Section 3.3.1 labelReplacementCanonize, passing unsecuredDocument, the result of calling the algorithm in Section 3.3.3 labelMapCanonize (passing hmac) as the labelReplacementFunction, and any custom JSON-LD API options. Note: This step transforms the document into an array of canonical N-Quads with pseudorandom blank node identifiers based on hmac.
3. Initialize mandatoryFrame to the result of calling the algorithm in Section 3.3.10 jsonPointersToFrame, passing unsecuredDocument and mandatoryPointers as pointers.
4. Initialize matching and nonMatching to the result of calling the algorithm in Section 3.3.12 groupNquads, passing nquads, mandatoryFrame as frame, and any custom JSON-LD API options. Note: This step separates the N-Quads into mandatory (to disclose) and non-mandatory groups.
5. Initialize mandatory to the values in the matching map.
6. Initialize nonMandatory to the values in the nonMatching map.
7. Initialize hmacKey to the result of exporting the HMAC key from hmac.
8. Return an object with "mandatoryPointers" set to mandatoryPointers, "mandatory" set to mandatory, "nonMandatory" set to nonMandatory, and "hmacKey" set to hmacKey.
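The idea behind the HMAC-based blank node relabeling used in this transformation can be sketched as follows (non-normative; `hmac_relabel` is a hypothetical helper name, and the exact label encoding shown is an assumption, not normative — only the "HMAC the canonical label, multibase-encode the digest" shape is illustrated):

```python
import base64
import hashlib
import hmac

def hmac_relabel(blank_node_label, hmac_key):
    # HMAC-SHA-256 the canonical blank node label (e.g., "c14n0"); the
    # multibase-base64url-no-pad form of the digest serves as a
    # pseudorandom replacement identifier.
    digest = hmac.new(hmac_key, blank_node_label.encode("utf-8"),
                      hashlib.sha256).digest()
    return "u" + base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
```

Because the key is random per proof, the replacement labels are deterministic for the holder (who receives the key) but unlinkable across proofs for anyone else.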
The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the algorithms in Section 3.5.5 Base Proof Serialization (ecdsa-sd-2023).
The required inputs to this algorithm are a transformed data document (transformedDocument) and canonical proof configuration (canonicalProofConfig). A hash data value represented as an object is produced as output.
1. Initialize proofHash to the result of calling the RDF Dataset Canonicalization algorithm [RDF-CANON] on canonicalProofConfig and then cryptographically hashing the result using the same hash that is used by the signature algorithm, i.e., SHA-256 for a P-256 curve. Note: This step can be performed in parallel; it only needs to be completed before this algorithm terminates as the result is part of the return value.
2. Initialize mandatoryHash to the result of calling the algorithm in Section 3.3.14 hashMandatoryNQuads, passing transformedDocument.mandatory.
3. Initialize hashData as a deep copy of transformedDocument and add proofHash as "proofHash" and mandatoryHash as "mandatoryHash" to that object.
4. Return hashData as hash data.
The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the base proof hashing algorithm.
The required inputs to this algorithm are proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MUST contain a cryptosuite identifier (cryptosuite). A proof configuration object is produced as output.
1. Let proofConfig be a clone of the options object.
2. If proofConfig.type is not set to DataIntegrityProof or proofConfig.cryptosuite is not set to ecdsa-sd-2023, an INVALID_PROOF_CONFIGURATION error MUST be raised.
3. If proofConfig.created is set and the value is not a valid datetime, an INVALID_PROOF_DATETIME error MUST be raised.
4. Let canonicalProofConfig be the result of applying the RDF Dataset Canonicalization Algorithm [RDF-CANON] to proofConfig.
5. Return canonicalProofConfig.
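The validation checks in this algorithm can be sketched as follows (non-normative; `validate_proof_config` is a hypothetical helper name, errors are modeled as exceptions, and the datetime check shown is simplified — full XML Schema datetimes also allow fractional seconds and timezone offsets):

```python
from datetime import datetime

def validate_proof_config(options):
    # Work on a clone so the caller's options object is untouched.
    proof_config = dict(options)
    if (proof_config.get("type") != "DataIntegrityProof"
            or proof_config.get("cryptosuite") != "ecdsa-sd-2023"):
        raise ValueError("INVALID_PROOF_CONFIGURATION")
    created = proof_config.get("created")
    if created is not None:
        valid = True
        try:
            datetime.strptime(created, "%Y-%m-%dT%H:%M:%SZ")
        except ValueError:
            valid = False
        if not valid:
            raise ValueError("INVALID_PROOF_DATETIME")
    return proof_config
```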
The following algorithm specifies how to create a base proof; it is called by an issuer of an ECDSA-SD-protected Verifiable Credential. The base proof is to be given only to the holder, who is responsible for generating a derived proof from it, exposing only selectively disclosed details in the proof to a verifier. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData) and proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A single digital proof value represented as a series of bytes is produced as output.
1. Initialize proofHash, mandatoryPointers, mandatoryHash, nonMandatory, and hmacKey to the values associated with their property names in hashData.
2. Initialize proofScopedKeyPair to a locally generated P-256 ECDSA key pair. Note: This key pair is scoped to the specific proof; it is not used for anything else and the private key will be destroyed when this algorithm terminates.
3. Initialize signatures to an array where each element holds the result of digitally signing the UTF-8 representation of each N-Quad string in nonMandatory, in order. The digital signature algorithm is ES256, i.e., uses a P-256 curve over a SHA-256 digest, and uses the private key from proofScopedKeyPair. Note: This step generates individual signatures for each statement that can be selectively disclosed using a local, proof-scoped key pair that binds them together; this key pair will be bound to the proof by a signature over its public key using the private key associated with the base proof verification method.
4. Initialize publicKey to the multikey expression of the public key exported from proofScopedKeyPair. That is, an array of bytes starting with the bytes 0x80 and 0x24 (which is the multikey p256-pub header (0x1200) expressed as a varint) followed by the compressed public key bytes (the compressed header with 2 for an even y coordinate and 3 for an odd one, followed by the x coordinate of the public key).
5. Initialize toSign to the result of calling the algorithm in Section 3.4.1 serializeSignData, passing proofHash, publicKey, and mandatoryHash as parameters to the algorithm.
6. Initialize baseSignature to the result of digitally signing toSign using the private key associated with the base proof verification method.
7. Initialize proofValue to the result of calling the algorithm in Section 3.4.2 serializeBaseProofValue, passing baseSignature, publicKey, hmacKey, signatures, and mandatoryPointers as parameters to the algorithm.
8. Return proofValue as digital proof.
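The multikey public key expression described above — the p256-pub multicodec header (0x1200) as a varint, followed by the compressed point — can be sketched as follows (non-normative; function names are illustrative):

```python
def multicodec_varint(code):
    # Encode an integer as an unsigned varint: 7 bits per byte,
    # high bit set on all but the final byte.
    out = bytearray()
    while True:
        byte = code & 0x7F
        code >>= 7
        if code:
            out.append(byte | 0x80)
        else:
            out.append(byte)
            return bytes(out)

def multikey_p256_pub(compressed_point):
    # Prefix a 33-byte compressed P-256 public key (0x02/0x03 header
    # plus the x coordinate) with the p256-pub multicodec header.
    if len(compressed_point) != 33 or compressed_point[0] not in (0x02, 0x03):
        raise ValueError("expected a compressed P-256 point")
    return multicodec_varint(0x1200) + compressed_point
```

Note that the varint encoding of 0x1200 is exactly the two bytes 0x80, 0x24 mentioned in step 4.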
The following algorithm creates a selective disclosure derived proof; called by
a holder of an ecdsa-sd-2023-protected verifiable credential.
The derived proof is to be given to the verifier. The inputs include a
JSON-LD document (document), an ECDSA-SD base proof
(proof), an array of JSON pointers to use to selectively disclose
statements (selectivePointers), and any custom JSON-LD API options,
such as a document loader. A single selectively revealed document
value, represented as an object, is produced as output.
1. Initialize baseSignature, publicKey, signatures, labelMap, mandatoryIndexes, and revealDocument to the values associated with their property names in the object returned when calling the algorithm in Section 3.4.4 createDisclosureData, passing the document, proof, selectivePointers, and any custom JSON-LD API options, such as a document loader.
2. Initialize newProof to a shallow copy of proof.
3. Replace proofValue in newProof with the result of calling the algorithm in Section 3.4.7 serializeDerivedProofValue, passing baseSignature, publicKey, signatures, labelMap, and mandatoryIndexes.
4. Set the value of the "proof" property in revealDocument to newProof.
5. If revealDocument has an @context field that includes a verifiable credential base context and it has a "credentialSubject" property that is a string, set the "credentialSubject" value to an object with an "id" value that matches the original string value.
6. Return revealDocument as the selectively revealed document.
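The credentialSubject normalization in the final steps can be sketched as follows (non-normative; the function name and the set of base context URLs checked are assumptions for illustration):

```python
VC_BASE_CONTEXTS = {  # assumed set of verifiable credential base contexts
    "https://www.w3.org/ns/credentials/v2",
    "https://www.w3.org/2018/credentials/v1",
}

def normalize_credential_subject(reveal_document):
    context = reveal_document.get("@context", [])
    if isinstance(context, str):
        context = [context]
    subject = reveal_document.get("credentialSubject")
    if VC_BASE_CONTEXTS.intersection(context) and isinstance(subject, str):
        # Restore the object form, preserving the original id string.
        reveal_document["credentialSubject"] = {"id": subject}
    return reveal_document
```

This handles the case where selective disclosure reduced "credentialSubject" to a bare identifier string, restoring the object shape that verifiable credential consumers expect.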
The following algorithm attempts verification of an ecdsa-sd-2023 derived
proof. This algorithm is called by a verifier of an ECDSA-SD-protected
verifiable credential. The inputs include a JSON-LD document
(document), an ECDSA-SD disclosure proof (proof), and any
custom JSON-LD API options, such as a document loader. A single boolean
verification result value is produced as output.
1. Initialize baseSignature, proofHash, publicKey, signatures, nonMandatory, and mandatoryHash to the values associated with their property names in the object returned when calling the algorithm in Section 3.4.9 createVerifyData, passing the document, proof, and any custom JSON-LD API options, such as a document loader.
2. If the length of signatures does not match the length of nonMandatory, throw an error indicating that the signature count does not match the non-mandatory message count.
3. Initialize publicKeyBytes to the public key bytes expressed in publicKey. Instructions on how to decode the public key value can be found in Section 2.1.1 Multikey.
4. Initialize toVerify to the result of calling the algorithm in Section 3.4.1 serializeSignData, passing proofHash, publicKey, and mandatoryHash.
5. Let verificationResult be the result of applying the verification algorithm of the Elliptic Curve Digital Signature Algorithm (ECDSA) [FIPS-186-5], with toVerify as the data to be verified against the baseSignature using the public key specified by publicKeyBytes. If verificationResult is false, return false.
6. For every entry (index, signature) in signatures, verify every signature for every selectively disclosed (non-mandatory) statement:
   1. Set verificationResult to the result of applying the verification algorithm of the Elliptic Curve Digital Signature Algorithm (ECDSA) [FIPS-186-5], with the UTF-8 representation of the value at index of nonMandatory as the data to be verified against signature using the public key specified by publicKeyBytes.
   2. If verificationResult is false, return false.
7. Return verificationResult as verification result.
This section is non-normative.
The security (integrity/authenticity) of a verifiable credential signed by a digital signature algorithm is dependent on a number of factors. In the following sections, we review these important points and direct the reader to additional information.
This section is non-normative.
The ECDSA signature scheme has the EUF-CMA (existential unforgeability under chosen message attacks) security property. This property guarantees that any efficient adversary who has the public key pk of the signer and received an arbitrary number of signatures on messages of its choice (in an adaptive manner) cannot output a valid signature for a new message (except with negligible probability).
SUF-CMA (strong unforgeability under chosen message attacks) is a stronger notion than EUF-CMA. It guarantees that any efficient adversary who has the public key pk of the signer and received an arbitrary number of signatures on messages of its choice cannot output a valid signature for a new message nor a new signature for an old message (except with negligible probability). The ECDSA signature scheme does not have the SUF-CMA property, while other schemes, such as EdDSA [FIPS-186-5], do.
Per [NIST-SP-800-57-Part-1], in the absence of large-scale quantum computers, a security strength level of 128 bits requires a key size of approximately 256 bits, while a security strength level of 192 bits requires a key size of 384 bits. The [NIST-SP-800-186] recommendations include curves P-256 and P-384 at these respective security strength levels.
This section is non-normative.
The ECDSA algorithm as detailed in [FIPS-186-5] states: "A new secret random number k, 0 < k < n, shall be generated prior to the generation of each digital signature for use during the signature generation process." The failure to properly generate this k value has led to some highly publicized integrity breaches in widely deployed systems. To counter this problem, a hash-based method of determining the secret number k, called Deterministic ECDSA, is given in [FIPS-186-5] and [RFC6979]. Verification of an ECDSA signature is independent of the method of generating k. Hence, it is generally recommended to use Deterministic ECDSA unless other requirements dictate otherwise.
This section is non-normative.
The security of the ECDSA algorithm is dependent on the quality and protection of its private signing key. Guidance in the management of cryptographic keys is a large subject and the reader is referred to [NIST-SP-800-57-Part-1] for more extensive recommendations and discussion. As strongly recommended in both [FIPS-186-5] and [NIST-SP-800-57-Part-1], an ECDSA private signing key is not to be used for any other purpose than ECDSA signatures.
ECDSA private signing keys and public verification keys are strongly advised to have limited cryptoperiods [NIST-SP-800-57-Part-1], where a cryptoperiod is "the time span during which a specific key is authorized for use by legitimate entities or the keys for a given system will remain in effect." [NIST-SP-800-57-Part-1] gives extensive guidance on cryptoperiods for different key types under different situations and generally recommends a 1-3 year cryptoperiod for a private signing key.
To deal with potential private key compromises, [NIST-SP-800-57-Part-1] gives recommendations for protective measures, harm reduction, and revocation. Although we have been emphasizing the security of the private signing key, assurance of public key validity is highly recommended on all public keys before using them, per [NIST-SP-800-57-Part-1].
Ensuring that cryptographic suites are versioned and tightly scoped to a very small set of possible key types and signature schemes (ideally one key type and size and one signature output type) is a design goal for most Data Integrity cryptographic suites. Historically, this has been done by defining both the key type and the cryptographic suite that uses it in the same specification. The downside of doing so, however, is that different cryptosuites might define the same key material differently, leading to a proliferation of multikey types. For example, one cryptosuite might use compressed Curve P-256 keys while another uses uncompressed values; if that occurs, it will harm interoperability. In the coming months to years, it will be important to fully define the multikey format in a separate specification, so that cryptosuite specifications such as this one can refer to it, reducing the chances of multikey type proliferation and improving the chances of maximum interoperability for the multikey format.
The following section describes privacy considerations that developers implementing this specification should be aware of in order to avoid violating privacy assumptions.
This cryptography suite does not provide for selective disclosure or unlinkability. If signatures are re-used, they can be used as correlatable data.
This section is non-normative.
All test vectors are produced using Deterministic ECDSA. The implementation was validated against the test vectors in [RFC6979].
The group is debating the names used for the cryptosuite identifiers in VC Data Integrity issue #38. Cryptosuite identifiers might change in the future.
The signer needs to generate a private/public key pair with the private key used
for signing and the public key made available for verification. The
[MULTIBASE]/[MULTICODEC] representation for the public key, p256-pub,
and the representation for the private key, p256-priv, are shown below.
{
"publicKeyMultibase": "zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP",
"privateKeyMultibase": "z42twTcNeSYcnqg1FLuSFs2bsGH3ZqbRHFmvS9XMsYhjxvHN"
}
Signing begins with a credential without an attached proof, which is converted to canonical form and then hashed, as shown in the following three examples.
{
"@context": [
"https://www.w3.org/ns/credentials/v2",
"https://www.w3.org/ns/credentials/examples/v2"
],
"id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33",
"type": ["VerifiableCredential", "AlumniCredential"],
"name": "Alumni Credential",
"description": "A minimum viable example of an Alumni Credential.",
"issuer": "https://vc.example/issuers/5678",
"validFrom": "2023-01-01T00:00:00Z",
"credentialSubject": {
"id": "did:example:abcdefgh",
"alumniOf": "The School of Examples"
}
}
<did:example:abcdefgh> <https://www.w3.org/ns/credentials/examples#alumniOf> "The School of Examples" .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/ns/credentials/examples#AlumniCredential> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://schema.org/description> "A minimum viable example of an Alumni Credential." .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://schema.org/name> "Alumni Credential" .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#credentialSubject> <did:example:abcdefgh> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#issuer> <https://vc.example/issuers/5678> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#validFrom> "2023-01-01T00:00:00Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> .
517744132ae165a5349155bef0bb0cf2258fff99dfe1dbd914b938d775a36017
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
{
"type": "DataIntegrityProof",
"cryptosuite": "ecdsa-2019",
"created": "2023-02-24T23:36:38Z",
"verificationMethod": "https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP",
"proofPurpose": "assertionMethod",
"@context": [
"https://www.w3.org/ns/credentials/v2",
"https://www.w3.org/ns/credentials/examples/v2"
]
}
_:c14n0 <http://purl.org/dc/terms/created> "2023-02-24T23:36:38Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> .
_:c14n0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://w3id.org/security#DataIntegrityProof> .
_:c14n0 <https://w3id.org/security#cryptosuite> "ecdsa-2019" .
_:c14n0 <https://w3id.org/security#proofPurpose> <https://w3id.org/security#assertionMethod> .
_:c14n0 <https://w3id.org/security#verificationMethod> <https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP> .
796bfbfac9833e0c0c199edbade954a34919bfbb91a874087dd5bcc3385e7e6b
Finally, we concatenate the hash of the proof options followed by the hash of the credential without proof, use the private key with the combined hash to compute the ECDSA signature, and then base58-btc encode the signature.
796bfbfac9833e0c0c199edbade954a34919bfbb91a874087dd5bcc3385e7e6b517744132ae165a5349155bef0bb0cf2258fff99dfe1dbd914b938d775a36017
2e3209073fbc0b203fa8f84272c2ad249fe180da63c2d9c15d6605c2594cc67847bb7350e3a04a2e26afb5939ea988addef2a9e2397ade3719737bd37ae4e71a
zvZyUGXX8cyJZRBkNw813SGsJHWrcpo4Y8hRJ7adYn35EetqXb23ZkdakfJNUhiTEdwyE598X7RLrkjnXEADLQZ7
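The final base58-btc encoding step (with the multibase "z" prefix prepended to the result) can be sketched as follows (non-normative; function names are illustrative):

```python
B58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58btc_encode(data):
    n = int.from_bytes(data, "big")
    out = ""
    while n:
        n, r = divmod(n, 58)
        out = B58_ALPHABET[r] + out
    # Leading zero bytes are preserved as leading "1" characters.
    pad = len(data) - len(data.lstrip(b"\x00"))
    return "1" * pad + out

def base58btc_decode(text):
    n = 0
    for ch in text:
        n = n * 58 + B58_ALPHABET.index(ch)
    body = n.to_bytes((n.bit_length() + 7) // 8, "big") if n else b""
    pad = len(text) - len(text.lstrip("1"))
    return b"\x00" * pad + body
```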
Assemble the signed credential with the following two steps:
1. Add a proofValue field with the previously computed base58-btc value to the proof options document.
2. Set the proof field of the credential to the augmented proof options document.
{
"@context": [
"https://www.w3.org/ns/credentials/v2",
"https://www.w3.org/ns/credentials/examples/v2"
],
"id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33",
"type": [
"VerifiableCredential",
"AlumniCredential"
],
"name": "Alumni Credential",
"description": "A minimum viable example of an Alumni Credential.",
"issuer": "https://vc.example/issuers/5678",
"validFrom": "2023-01-01T00:00:00Z",
"credentialSubject": {
"id": "did:example:abcdefgh",
"alumniOf": "The School of Examples"
},
"proof": {
"type": "DataIntegrityProof",
"cryptosuite": "ecdsa-2019",
"created": "2023-02-24T23:36:38Z",
"verificationMethod": "https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP",
"proofPurpose": "assertionMethod",
"proofValue": "zvZyUGXX8cyJZRBkNw813SGsJHWrcpo4Y8hRJ7adYn35EetqXb23ZkdakfJNUhiTEdwyE598X7RLrkjnXEADLQZ7"
}
}
The signer needs to generate a private/public key pair with the private key used
for signing and the public key made available for verification. The
[MULTIBASE]/[MULTICODEC] representation for the public key, p384-pub,
and the representation for the private key, p384-priv, are shown below.
{
"publicKeyMultibase": "z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ",
"privateKeyMultibase": "z2fanyY7zgwNpZGxX5fXXibvScNaUWNprHU9dKx7qpVj7mws9J8LLt4mDB5TyH2GLHWkUc"
}
Signing begins with a credential without an attached proof, which is converted to canonical form, and then hashed, as shown in the following three examples.
{
"@context": [
"https://www.w3.org/ns/credentials/v2",
"https://www.w3.org/ns/credentials/examples/v2"
],
"id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33",
"type": ["VerifiableCredential", "AlumniCredential"],
"name": "Alumni Credential",
"description": "A minimum viable example of an Alumni Credential.",
"issuer": "https://vc.example/issuers/5678",
"validFrom": "2023-01-01T00:00:00Z",
"credentialSubject": {
"id": "did:example:abcdefgh",
"alumniOf": "The School of Examples"
}
}
<did:example:abcdefgh> <https://www.w3.org/ns/credentials/examples#alumniOf> "The School of Examples" .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/ns/credentials/examples#AlumniCredential> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://schema.org/description> "A minimum viable example of an Alumni Credential." .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://schema.org/name> "Alumni Credential" .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#credentialSubject> <did:example:abcdefgh> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#issuer> <https://vc.example/issuers/5678> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#validFrom> "2023-01-01T00:00:00Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> .
8bf6e01df72c5b62f91b685231915ac4b8c58ea95f002c6b8f6bfafa1b251df476b56b8e01518e317dab099d3ecbff96
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
{
"type": "DataIntegrityProof",
"cryptosuite": "ecdsa-2019",
"created": "2023-02-24T23:36:38Z",
"verificationMethod": "https://vc.example/issuers/5678#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ",
"proofPurpose": "assertionMethod",
"@context": [
"https://www.w3.org/ns/credentials/v2",
"https://www.w3.org/ns/credentials/examples/v2"
]
}
_:c14n0 <http://purl.org/dc/terms/created> "2023-02-24T23:36:38Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> . _:c14n0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://w3id.org/security#DataIntegrityProof> . _:c14n0 <https://w3id.org/security#cryptosuite> "ecdsa-2019" . _:c14n0 <https://w3id.org/security#proofPurpose> <https://w3id.org/security#assertionMethod> . _:c14n0 <https://w3id.org/security#verificationMethod> <https://vc.example/issuers/5678#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ> .
deb6ee98fa0511308dd1d6bc74aee270fc233ec3f3fe8c817149ce5dd4fb6836454fe1ad5d8d8e908d613b55fbeeffbe
Finally, we concatenate the hash of the proof options with the hash of the credential without proof (proof options hash first), compute the ECDSA signature over the combined hash using the private key, and then base58-btc encode the signature.
deb6ee98fa0511308dd1d6bc74aee270fc233ec3f3fe8c817149ce5dd4fb6836454fe1ad5d8d8e908d613b55fbeeffbe8bf6e01df72c5b62f91b685231915ac4b8c58ea95f002c6b8f6bfafa1b251df476b56b8e01518e317dab099d3ecbff96
3a34d517cffe6146dcd99c44508710cc11e688e05c854a1c276cde23640454fa96c412841f2c3ec3876687b377c58ea8c7bb5acab4bbd2224ae8eafc57ff49395aa872d86ae3da719468f7b9c6018e7b4b5059feabe339bc0c2774f9405cd4c9
zM3wLGZPqFGbByS8HwpcXyGKvUFqjDKwPu7cExSsbKb5ABbJtGs53UzmsCFKHydPagV6smU4c48mW7SrFG5Mwu5GFFpBdcwmS74Hm6JpzSWBBAkEDMDHFq1d3dHZyHwfftM6
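The final base58-btc encoding step can be sketched in pure Python. This is a minimal illustration of the Multibase convention (base58-btc strings carry a leading "z"); production code would use a vetted Multibase library, and the hypothetical three-byte input stands in for the raw ECDSA signature bytes.

```python
# Minimal base58-btc encoder (Bitcoin alphabet), for illustration only.
B58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58btc_encode(data: bytes) -> str:
    n = int.from_bytes(data, "big")
    out = ""
    while n > 0:
        n, rem = divmod(n, 58)
        out = B58_ALPHABET[rem] + out
    # Leading zero bytes are preserved as leading '1' characters.
    pad = len(data) - len(data.lstrip(b"\x00"))
    return "1" * pad + out

def multibase_b58btc(data: bytes) -> str:
    # Multibase prefixes base58-btc strings with 'z'.
    return "z" + base58btc_encode(data)

# Hypothetical input; the real input is the raw signature bytes.
print(multibase_b58btc(b"\x01\x02\x03"))
```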
Assemble the signed credential with the following two steps:
1. Add the proofValue field with the previously computed base58-btc value to the proof options document.
2. Set the proof field of the credential to the augmented proof options document.
{
"@context": [
"https://www.w3.org/ns/credentials/v2",
"https://www.w3.org/ns/credentials/examples/v2"
],
"id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33",
"type": [
"VerifiableCredential",
"AlumniCredential"
],
"name": "Alumni Credential",
"description": "A minimum viable example of an Alumni Credential.",
"issuer": "https://vc.example/issuers/5678",
"validFrom": "2023-01-01T00:00:00Z",
"credentialSubject": {
"id": "did:example:abcdefgh",
"alumniOf": "The School of Examples"
},
"proof": {
"type": "DataIntegrityProof",
"cryptosuite": "ecdsa-2019",
"created": "2023-02-24T23:36:38Z",
"verificationMethod": "https://vc.example/issuers/5678#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ",
"proofPurpose": "assertionMethod",
"proofValue": "zM3wLGZPqFGbByS8HwpcXyGKvUFqjDKwPu7cExSsbKb5ABbJtGs53UzmsCFKHydPagV6smU4c48mW7SrFG5Mwu5GFFpBdcwmS74Hm6JpzSWBBAkEDMDHFq1d3dHZyHwfftM6"
}
}
The signer needs to generate a private/public key pair with the private key used
for signing and the public key made available for verification. The
[MULTIBASE]/[MULTICODEC] representation for the public key, p256-pub,
and the representation for the private key, p256-priv, are shown below.
{
"publicKeyMultibase": "zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP",
"privateKeyMultibase": "z42twTcNeSYcnqg1FLuSFs2bsGH3ZqbRHFmvS9XMsYhjxvHN"
}
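The structure of the public key representation can be checked by reversing the encoding. The sketch below decodes the base58-btc body of the publicKeyMultibase value above; the p256-pub multicodec code 0x1200 varint-encodes to the two bytes 0x80 0x24, followed by the 33-byte compressed P-256 public key point.

```python
B58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

# Minimal base58-btc decoder, for illustration only.
def base58btc_decode(s: str) -> bytes:
    n = 0
    for ch in s:
        n = n * 58 + B58_ALPHABET.index(ch)
    body = n.to_bytes((n.bit_length() + 7) // 8, "big")
    pad = len(s) - len(s.lstrip("1"))  # leading '1's encode zero bytes
    return b"\x00" * pad + body

public_key_multibase = "zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP"
raw = base58btc_decode(public_key_multibase[1:])  # strip the 'z' multibase prefix
# First two bytes are the p256-pub multicodec prefix (0x1200 as a varint);
# the remainder is the compressed elliptic curve point.
print(raw[:2].hex(), len(raw) - 2)
```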
Signing begins with a credential without an attached proof, which is converted to canonical form and then hashed, as shown in the following three examples.
{
"@context": [
"https://www.w3.org/ns/credentials/v2",
"https://www.w3.org/ns/credentials/examples/v2"
],
"id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33",
"type": ["VerifiableCredential", "AlumniCredential"],
"name": "Alumni Credential",
"description": "A minimum viable example of an Alumni Credential.",
"issuer": "https://vc.example/issuers/5678",
"validFrom": "2023-01-01T00:00:00Z",
"credentialSubject": {
"id": "did:example:abcdefgh",
"alumniOf": "The School of Examples"
}
}
{"@context":["https://www.w3.org/ns/credentials/v2","https://www.w3.org/ns/credentials/examples/v2"],"credentialSubject":{"alumniOf":"The School of Examples","id":"did:example:abcdefgh"},"description":"A minimum viable example of an Alumni Credential.","id":"urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33","issuer":"https://vc.example/issuers/5678","name":"Alumni Credential","type":["VerifiableCredential","AlumniCredential"],"validFrom":"2023-01-01T00:00:00Z"}
59b7cb6251b8991add1ce0bc83107e3db9dbbab5bd2c28f687db1a03abc92f19
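For a document like this one, whose values are all ASCII strings, the JSON Canonicalization Scheme [RFC8785] output coincides with Python's json.dumps using sorted keys and compact separators; a full JCS implementation must additionally follow RFC 8785's number and string serialization rules. The sketch below reproduces the canonical form and SHA-256 hash shown above.

```python
import hashlib
import json

credential = {
    "@context": [
        "https://www.w3.org/ns/credentials/v2",
        "https://www.w3.org/ns/credentials/examples/v2"
    ],
    "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33",
    "type": ["VerifiableCredential", "AlumniCredential"],
    "name": "Alumni Credential",
    "description": "A minimum viable example of an Alumni Credential.",
    "issuer": "https://vc.example/issuers/5678",
    "validFrom": "2023-01-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:abcdefgh",
        "alumniOf": "The School of Examples"
    }
}

# For this all-string credential, sorted keys plus compact separators
# match the JCS canonical form shown above.
canonical = json.dumps(credential, sort_keys=True, separators=(",", ":"))
digest = hashlib.sha256(canonical.encode("utf-8")).hexdigest()
print(digest)
```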
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
{
"type": "DataIntegrityProof",
"cryptosuite": "jcs-ecdsa-2019",
"created": "2023-02-24T23:36:38Z",
"verificationMethod": "https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP",
"proofPurpose": "assertionMethod"
}
{"created":"2023-02-24T23:36:38Z","cryptosuite":"jcs-ecdsa-2019","proofPurpose":"assertionMethod","type":"DataIntegrityProof","verificationMethod":"https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP"}
4f097fc73b1fd2df8e4f7e68049adab2455b76a009bc02b98e837bcb3dd63936
Finally, we concatenate the hash of the proof options with the hash of the credential without proof (proof options hash first), compute the ECDSA signature over the combined hash using the private key, and then base58-btc encode the signature.
4f097fc73b1fd2df8e4f7e68049adab2455b76a009bc02b98e837bcb3dd6393659b7cb6251b8991add1ce0bc83107e3db9dbbab5bd2c28f687db1a03abc92f19
e993f8af2edc8f144ecba79514ae66cc825c0101660f70bf6c6ad11c41782b08af336c29f498b6977acc8c4841101ce148f894e44213e1c7c9fd672f7a3e2030
z5frnhZZhdgMaVDzYoEcxw3gXHxqow5SsLFR63BHc4mSTJcVcU5LCeThJvzMLo8PTC58S4uxhXdMoiSp1nxzBoNGf
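The combined signing input is simply the proof options hash followed by the credential hash. Using the two hex digests shown above, this ordering can be checked directly:

```python
# Hex digests from the two hashing steps above.
proof_options_hash = "4f097fc73b1fd2df8e4f7e68049adab2455b76a009bc02b98e837bcb3dd63936"
credential_hash = "59b7cb6251b8991add1ce0bc83107e3db9dbbab5bd2c28f687db1a03abc92f19"

# The proof options hash comes first, then the credential hash;
# as raw bytes this yields the 64-byte input to ECDSA signing.
combined = proof_options_hash + credential_hash
combined_bytes = bytes.fromhex(combined)
print(len(combined_bytes))  # 64
```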
Assemble the signed credential with the following two steps:
1. Add the proofValue field with the previously computed base58-btc value to the proof options document.
2. Set the proof field of the credential to the augmented proof options document.
{
"@context": [
"https://www.w3.org/ns/credentials/v2",
"https://www.w3.org/ns/credentials/examples/v2"
],
"id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33",
"type": [
"VerifiableCredential",
"AlumniCredential"
],
"name": "Alumni Credential",
"description": "A minimum viable example of an Alumni Credential.",
"issuer": "https://vc.example/issuers/5678",
"validFrom": "2023-01-01T00:00:00Z",
"credentialSubject": {
"id": "did:example:abcdefgh",
"alumniOf": "The School of Examples"
},
"proof": {
"type": "DataIntegrityProof",
"cryptosuite": "jcs-ecdsa-2019",
"created": "2023-02-24T23:36:38Z",
"verificationMethod": "https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP",
"proofPurpose": "assertionMethod",
"proofValue": "z5frnhZZhdgMaVDzYoEcxw3gXHxqow5SsLFR63BHc4mSTJcVcU5LCeThJvzMLo8PTC58S4uxhXdMoiSp1nxzBoNGf"
}
}
The signer needs to generate a private/public key pair with the private key used
for signing and the public key made available for verification. The
[MULTIBASE]/[MULTICODEC] representation for the public key, p384-pub,
and the representation for the private key, p384-priv, are shown below.
{
"publicKeyMultibase": "z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ",
"privateKeyMultibase": "z2fanyY7zgwNpZGxX5fXXibvScNaUWNprHU9dKx7qpVj7mws9J8LLt4mDB5TyH2GLHWkUc"
}
Signing begins with a credential without an attached proof, which is converted to canonical form and then hashed, as shown in the following three examples.
{
"@context": [
"https://www.w3.org/ns/credentials/v2",
"https://www.w3.org/ns/credentials/examples/v2"
],
"id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33",
"type": ["VerifiableCredential", "AlumniCredential"],
"name": "Alumni Credential",
"description": "A minimum viable example of an Alumni Credential.",
"issuer": "https://vc.example/issuers/5678",
"validFrom": "2023-01-01T00:00:00Z",
"credentialSubject": {
"id": "did:example:abcdefgh",
"alumniOf": "The School of Examples"
}
}
{"@context":["https://www.w3.org/ns/credentials/v2","https://www.w3.org/ns/credentials/examples/v2"],"credentialSubject":{"alumniOf":"The School of Examples","id":"did:example:abcdefgh"},"description":"A minimum viable example of an Alumni Credential.","id":"urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33","issuer":"https://vc.example/issuers/5678","name":"Alumni Credential","type":["VerifiableCredential","AlumniCredential"],"validFrom":"2023-01-01T00:00:00Z"}
3e0be671cc1881035d463158c80921973dab3534d4f8dfacf4ff2725a4115eb718e49d66de0e90e7365cd6062abf2259
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
{
"type": "DataIntegrityProof",
"cryptosuite": "jcs-ecdsa-2019",
"created": "2023-02-24T23:36:38Z",
"verificationMethod": "https://vc.example/issuers/5678#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ",
"proofPurpose": "assertionMethod"
}
{"created":"2023-02-24T23:36:38Z","cryptosuite":"jcs-ecdsa-2019","proofPurpose":"assertionMethod","type":"DataIntegrityProof","verificationMethod":"https://vc.example/issuers/5678#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ"}
f2cb19bff507eb059ba952d2363ec3e889b889e7f2fd0cc7ade4c9ae27a1e22b948f7f77050404634049aedd44cf5f0c
Finally, we concatenate the hash of the proof options with the hash of the credential without proof (proof options hash first), compute the ECDSA signature over the combined hash using the private key, and then base58-btc encode the signature.
f2cb19bff507eb059ba952d2363ec3e889b889e7f2fd0cc7ade4c9ae27a1e22b948f7f77050404634049aedd44cf5f0c3e0be671cc1881035d463158c80921973dab3534d4f8dfacf4ff2725a4115eb718e49d66de0e90e7365cd6062abf2259
07d8d8d112272751f3c4eaadc7c8d6616c4b5b73a0f542f5707208e29146210ff4a701c7bcbae48182e27721d7f6ba48b0780aa9a61483a24d1f414ca4b134e2b7075eaf2d98daeccf91ce09cc4ff9dd2f6f27e7fe0cf76f9cfce2d57507efe9
z3hnH49Vkcutq5HJCxhue4fYyXvRzjjzd7WhSuDQ9ALhhoeAeHoHxmbwByayPFudy9zpXYXXXJD91cL2ajsBZS9exnrLfCvd1HFYDvprCGEspv1Qha8bVN7fvw4dTYRNVhbe
Assemble the signed credential with the following two steps:
1. Add the proofValue field with the previously computed base58-btc value to the proof options document.
2. Set the proof field of the credential to the augmented proof options document.
{
"@context": [
"https://www.w3.org/ns/credentials/v2",
"https://www.w3.org/ns/credentials/examples/v2"
],
"id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33",
"type": [
"VerifiableCredential",
"AlumniCredential"
],
"name": "Alumni Credential",
"description": "A minimum viable example of an Alumni Credential.",
"issuer": "https://vc.example/issuers/5678",
"validFrom": "2023-01-01T00:00:00Z",
"credentialSubject": {
"id": "did:example:abcdefgh",
"alumniOf": "The School of Examples"
},
"proof": {
"type": "DataIntegrityProof",
"cryptosuite": "jcs-ecdsa-2019",
"created": "2023-02-24T23:36:38Z",
"verificationMethod": "https://vc.example/issuers/5678#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ",
"proofPurpose": "assertionMethod",
"proofValue": "z3hnH49Vkcutq5HJCxhue4fYyXvRzjjzd7WhSuDQ9ALhhoeAeHoHxmbwByayPFudy9zpXYXXXJD91cL2ajsBZS9exnrLfCvd1HFYDvprCGEspv1Qha8bVN7fvw4dTYRNVhbe"
}
}