Copyright © 2024 the Contributors to the Verifiable Credential Barcodes v0.7 Specification, published by the Credentials Community Group under the W3C Community Contributor License Agreement (CLA) . A human-readable summary is available.
This specification describes a mechanism to protect legacy optical barcodes, such as those found on driver's licenses (PDF417) and travel documents (MRZ), using Verifiable Credentials [VC-DATA-MODEL-2.0]. The Verifiable Credential representations are compact enough to fit in under 150 bytes and can thus be integrated with traditional two-dimensional barcodes that are printed on physical cards using legacy printing processes.
This specification was published by the Credentials Community Group . It is not a W3C Standard nor is it on the W3C Standards Track. Please note that under the W3C Community Contributor License Agreement (CLA) there is a limited opt-out and other conditions apply. Learn more about W3C Community and Business Groups .
This specification is experimental.
GitHub Issues are preferred for discussion of this specification.
Legacy documents, such as driver's licenses, passports, and travel credentials, often include machine-readable data that can be used to quickly read the information from the document. This information is encoded in formats such as PDF417 [ISO15438-2015], machine-readable zone (MRZ) [ICAO9303-3], and other optically scannable codes that are formatted in one-dimensional or two-dimensional "bars"; thus the term "barcode". This information is often not protected from tampering, and readily available barcode generation and scanning libraries make it fairly trivial for anyone to generate these barcodes.
It is, therefore, useful for an issuer of these barcodes to protect the information contained within the barcode from tampering, as well as to make the entity that generated the barcode verifiable.
The Verifiable Credentials Data Model v2.0 specification provides a global standard for expressing credential information, such as those in a driver's license or travel document. The Verifiable Credential Data Integrity 1.0 specification provides a global standard for securing credential information. These two specifications, when combined, provide a means of protecting credentials from tampering, expressing authorship of the credential, and providing the current status of a credential in a privacy-protecting manner. These data formats, however, tend to be too large to express in an optical barcode.
The Compact Binary Object Representation for Linked Data v0.7 specification provides a means of compressing secured verifiable credentials to the point at which it becomes feasible to express the information as an optical barcode, or embedded within an optical barcode.
This specification describes a mechanism to protect legacy optical barcodes, such as those found on driver's licenses (PDF417) and travel documents (MRZ), by using a verifiable credential [VC-DATA-MODEL-2.0] to express information about the barcode, which is then secured using Data Integrity [VC-DATA-INTEGRITY] and compressed using CBOR-LD [CBOR-LD]. The resulting verifiable credential representations are compact enough to fit in under 140 bytes and can thus be integrated with traditional two-dimensional barcodes that are printed on physical cards using legacy printing processes. This adds tamper resistance to the barcode while optionally enhancing the barcode to provide information related to whether or not the physical document has been revoked or suspended by the issuer.
This section provides an example of how the technology in this specification can be utilized to secure the optical barcode on a driver's license that uses a PDF417 barcode. We start off with an example driver's license:
The back of the driver's license contains a PDF417 barcode:
The PDF417 data contains information that is secured using the algorithms described in this specification. Namely, the PDF417 barcode contains a verifiable credential of the following form.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://w3id.org/vdl/v2", "https://w3id.org/vdl/utopia/v1" ], "type": [ "VerifiableCredential", "OpticalBarcodeCredential" ], // the issuer value below is defined as a URL in the 'utopia/v1' context above "issuer": "did:web:dmv.utopia.example", "credentialStatus": { "type": "TerseBitstringStatusListEntry", "terseStatusListBaseUrl": "https://dmv.utopia.gov/statuses/12345/status-lists" "terseStatusListIndex": 123567890 }, "credentialSubject": { "type": "AamvaDriversLicenseScannableInformation", "protectedComponentIndex": "uP_BA" }, "proof": { "type": "DataIntegrity", "cryptosuite": "ecdsa-xi-2023", // the public key below is defined as a URL in the 'utopia/v1' context above "verificationMethod": "did:web:dmv.utopia.example#key-1", "proofPurpose": "assertionMethod", "proofValue": "z4peo48uwK2EF4Fta8P...HzQMDYJ34r9gL" } }
The verifiable credential above is then compressed using [ CBOR-LD ] to the following output (in CBOR Diagnostic Notation):
1281({
  1 => [32768, 32769, 32770],   // @context
  155 => [116, 164],            // type
  192 => 174,                   // issuer
  186 => {154 => 166, 206 => 178, 208 => 1234567890},  // credentialStatus
  188 => {154 => 172, 180 => h'753FF040'},             // credentialSubject
  194 => {                      // proof
    154 => 108,                 // type
    214 => 4,                   // cryptosuite
    224 => 230,                 // verificationMethod
    228 => 176,                 // proofPurpose
    210 => Uint8Array(65) [ ... ]  // proofValue
  }
})
The following terms are used to describe concepts in this specification.
Our definition of credential differs from NIST's definition of credential.
These properties result in separate graphs that contain all claims defined in the corresponding JSON objects.

A and B), using Unicode Codepoint Collation, as defined in [XPATH-FUNCTIONS], which defines a total ordering of strings comparing code points. Note that for UTF-8 encoded strings, comparing the byte sequences gives the same result as code point order.
As well as sections marked as non-normative, all authoring guidelines, diagrams, examples, and notes in this specification are non-normative. Everything else in this specification is normative.
The key words MUST , RECOMMENDED , REQUIRED , and SHOULD in this document are to be interpreted as described in BCP 14 [ RFC2119 ] [ RFC8174 ] when, and only when, they appear in all capitals, as shown here.
A conforming document is any concrete expression of the data model that complies with the normative statements in this specification. Specifically, all relevant normative statements in Sections 2. Data Model and 3. Algorithms of this document MUST be enforced.
A conforming processor is any algorithm realized as software and/or hardware that generates or consumes a conforming document . Conforming processors MUST produce errors when non-conforming documents are consumed.
This document contains examples of JSON and JSON-LD data. Some of these examples are invalid JSON, as they include features such as inline comments (//) explaining certain portions and ellipses (...) indicating the omission of information that is irrelevant to the example. Such parts need to be removed if implementers want to treat the examples as valid JSON or JSON-LD.
The following are the design goals of the technology in this specification:
The following sections outline the data model that is used by this specification to express verifiable credentials that secure optically printed information such as barcodes and machine-readable zones on travel documents.
An OpticalBarcodeCredential is used to secure the contents of an optical barcode in a way that provides 1) authorship information, 2) tamper resistance, and 3) optionally, revocation and suspension status. In other words, the credential can tell you who issued the optical barcode, whether the optical barcode has been tampered with since it was first issued, and whether or not the issuer of the optical barcode still warrants that the document is valid. These features provide significant anti-fraud protections for physical documents.
The credentialSubject of an OpticalBarcodeCredential is either of type AamvaDriversLicenseScannableInformation or MachineReadableZone. An AamvaDriversLicenseScannableInformation signifies that the verifiable credential secures the PDF417 barcode on the physical document as well as the information expressed in the verifiable credential. A MachineReadableZone signifies that the verifiable credential secures the machine-readable zone on the physical document as well as the information expressed in the verifiable credential.
If an OpticalBarcodeCredential is of type AamvaDriversLicenseScannableInformation, there is a REQUIRED additional field, protectedComponentIndex, that contains information about which fields in the PDF417 are digitally signed. protectedComponentIndex MUST be a three byte (24 bit) value that is multibase-base64url encoded, for a total of 5 characters in the JSON-LD credential. There are 22 mandatory fields in an AAMVA-compliant driver's license PDF417 [aamva-dl-id-card-design-standard], and the first 22 bits of the protectedComponentIndex value correspond to these fields. Each AAMVA mandatory field begins with a three character element ID (e.g., DBA for document expiration date). To construct a mapping between bits in the protectedComponentIndex value and these fields, sort these element IDs according to Unicode code point order. Then, if a bit in position i of protectedComponentIndex is 1, the AAMVA mandatory field in position i of the sorted element IDs is protected by the digital signature. The last two bits in protectedComponentIndex MUST be 0. For more information, see Section 3.5.4 Create opticalDataBytes.
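As a non-normative illustration, the sketch below (Python, with illustrative helper names) builds a protectedComponentIndex value from a set of protected element IDs; the list of 22 mandatory element IDs is taken from the worked example later in this specification.

import base64

# The 22 mandatory AAMVA element IDs (as they appear in the worked example later
# in this document), sorted according to Unicode code point order.
MANDATORY_ELEMENT_IDS = sorted([
    "DAC", "DAD", "DAG", "DAI", "DAJ", "DAK", "DAQ", "DAU", "DAY", "DBA", "DBB",
    "DBC", "DBD", "DCA", "DCB", "DCD", "DCF", "DCG", "DCS", "DDE", "DDF", "DDG",
])

def encode_protected_component_index(protected_ids):
    """Return the 5-character multibase-base64url protectedComponentIndex value."""
    bits = 0
    for i, element_id in enumerate(MANDATORY_ELEMENT_IDS):
        if element_id in protected_ids:
            bits |= 1 << (23 - i)  # the bit in position i marks mandatory field i as protected
    # The last two bits of the 24-bit value are left as 0, as required above.
    return "u" + base64.urlsafe_b64encode(bits.to_bytes(3, "big")).decode()

# Protecting first name (DAC), license number (DAQ), and last name (DCS):
print(encode_protected_component_index({"DAC", "DAQ", "DCS"}))  # prints "uggAg"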
In order to achieve as much compression as possible, it is RECOMMENDED that the issuer and verificationMethod fields utilize terms from a JSON-LD Context, which can then be compressed down to a few bytes due to CBOR-LD's semantic compression mechanism.
An example of an optical barcode credential that utilizes the properties specified in this section is provided below:
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://w3id.org/vdl/v2", "https://w3id.org/vdl/utopia/v1" ], "type": [ "VerifiableCredential", "OpticalBarcodeCredential" ], "issuer": "did:web:dmv.utopia.example", "credentialStatus": { "type": "TerseBitstringStatusListEntry", "terseStatusListBaseUrl": "dmv.utopia.gov/statuses/12345/status-lists" "terseStatusListIndex": 123567890 }, "credentialSubject": { "type": "AamvaDriversLicenseScannableInformation", "protectedComponentIndex": "uP_BA" } }
A TerseBitstringStatusListEntry is a compact representation of a BitstringStatusListEntry as defined in the Bitstring Status List v1.0 specification. An object of type TerseBitstringStatusListEntry MUST have two additional properties:

terseStatusListBaseUrl, which identifies the location of the status lists associated with this credential. terseStatusListBaseUrl MUST be a URL [URL].

terseStatusListIndex, which specifies an individual status at the above URL. terseStatusListIndex MUST be representable as a 32 bit unsigned integer.
To process a TerseBitstringStatusListEntry, apply the algorithm in Section 3.4 Convert Status List Entries to convert it to a BitstringStatusListEntry, then process it as in Bitstring Status List v1.0.
Implementers need to set a value listLength for the length of an individual status list. This then yields a number of status lists listCount = 2^32 / listLength for a 32-bit terseStatusListIndex. listLength is needed to convert from a TerseBitstringStatusListEntry to a BitstringStatusListEntry. Noting that some values of listLength will harm the privacy-preserving properties of these status lists, implementations MUST use listLength = 2^17 and listCount = 2^15.
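For concreteness, the required values satisfy this relationship (illustrative arithmetic only):

listLength = 2 ** 17          # 131,072 entries per status list (the REQUIRED value)
listCount = 2 ** 32 // listLength
assert listCount == 2 ** 15   # 32,768 status lists cover the 32-bit terseStatusListIndex space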
It is REQUIRED that implementers character-encode CBOR-LD encoded AamvaDriversLicenseScannableInformation credentials as base64url before encoding them in a PDF417. It is REQUIRED that implementers re-encode CBOR-LD encoded MachineReadableZone credentials as base45 with the string 'VC1-' prepended before encoding them in a QR code.
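A minimal sketch of these two character-encoding steps is shown below. It assumes the CBOR-LD bytes have already been produced; the particular base45 codec (here a b45encode function from a generic RFC 9285 base45 library) is an assumption for illustration, not a normative requirement of this specification.

import base64
import base45  # assumption: any RFC 9285 base45 implementation exposing b45encode()

def encode_for_pdf417(cborld_bytes: bytes) -> str:
    # AamvaDriversLicenseScannableInformation credentials are base64url-encoded
    # before being placed in the PDF417 (e.g., in a 'ZZA' field, as in the example below).
    return base64.urlsafe_b64encode(cborld_bytes).decode()

def encode_for_qr(cborld_bytes: bytes) -> str:
    # MachineReadableZone credentials are base45-encoded with 'VC1-' prepended
    # before being placed in a QR code.
    return "VC1-" + base45.b45encode(cborld_bytes).decode()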
The following section describes algorithms for adding and verifying digital proofs that protect optical information, such as barcodes and machine-readable zones, on physical media, such as driver's licenses and travel documents.
Generally speaking, the algorithms in this section are effectively the same as the ones in the Data Integrity ECDSA Cryptosuites v1.0 [VC-DI-ECDSA]. The algorithm name differs because the cryptographic hashing process includes the machine-readable barcode information when calculating the digital signature, such that the optical barcode is protected.
This specification requires that an application-specific compression table is provided to a CBOR-LD processor when encoding and decoding verifiable credentials of type OpticalBarcodeCredential. A registry of all context URLs for various issuers is provided as a comma-separated value file and can be updated and modified via change requests to the file on an append-only and first-come-first-served basis. Implementations SHOULD retrieve and utilize the latest file on a monthly basis to ensure that compression and decompression supports the latest values.
BitstringStatusListCredential (as defined in the Bitstring Status List v1.0 specification) that the issuer wishes to add to the OpticalBarcodeCredential.

TerseBitstringStatusListEntry and statusListEntryTerse.

index to the integer representation of statusListEntryVerbose.statusListIndex.

OpticalBarcodeCredential with unsignedStatus.issuer set to issuerUrl and unsignedStatus.credentialStatus set to statusListEntryTerse.

OpticalBarcodeCredential in securedDocument.
The algorithm in this section is used to convert the TerseBitstringStatusListEntry to a BitstringStatusListEntry, which is used after verification has been performed on the verifiable credential, during the validation process. After verifiable credential verification has been performed, the algorithm takes an OpticalBarcodeCredential verifiable credential (struct vc), an integer listLength containing the number of entries in the BitstringStatusListCredential associated with vc, and a string statusPurpose (e.g., 'revocation', 'suspension') as input, and returns a BitstringStatusListEntry object.

floor() operation).

The result can be used as input to the validation algorithm in the Bitstring Status List v1.0 specification.
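A non-normative sketch of the conversion is shown below. It assumes, consistent with the worked example at the end of this document, that the list index is the integer (floor) quotient of terseStatusListIndex by listLength, that the status list index is the remainder, and that the status list credential URL is formed as terseStatusListBaseUrl/statusPurpose/listIndex.

def to_bitstring_status_list_entry(terse_entry, list_length, status_purpose):
    """Convert a TerseBitstringStatusListEntry into a BitstringStatusListEntry."""
    terse_index = terse_entry["terseStatusListIndex"]
    list_index = terse_index // list_length        # the floor() operation mentioned above
    status_list_index = terse_index % list_length
    base_url = terse_entry["terseStatusListBaseUrl"]
    return {
        "type": "BitstringStatusListEntry",
        "statusListCredential": f"{base_url}/{status_purpose}/{list_index}",
        "statusListIndex": status_list_index,
        "statusPurpose": status_purpose,
    }

# With terseStatusListIndex 3851559041 and listLength 2^17 (as in the worked example),
# this yields a statusListCredential ending in '/revocation/29385' and statusListIndex 8321.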
Implementers are advised that not all issuers will publish status list information for their verifiable credentials . Some issuers might require authorization before allowing a verifier to access a status list credential.
The ecdsa-xi-2023 cryptosuite is effectively the ecdsa-rdfc-2019 algorithm [VC-DI-ECDSA] with an added step that takes some "extra information" (xi) as input, such as the original optical barcode data, and includes that data in the information that is protected by the digital signature. The algorithms in this section detail how such a signature is created and verified.
To generate a proof, the algorithm in Section 4.1: Add Proof in the Data Integrity [ VC-DATA-INTEGRITY ] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation algorithm is defined in the Transformation (ecdsa-rdfc-2019) section of the Data Integrity ECDSA Cryptosuites v1.0 , the hashing algorithm is defined in Section 3.5.3 Hashing (ecdsa-xi-2023) , and the proof serialization algorithm is defined in the Proof Serialization (ecdsa-rdfc-2019) section of the Data Integrity ECDSA Cryptosuites v1.0 .
To verify a proof, the algorithm in Section 4.2: Verify Proof in the Data Integrity [ VC-DATA-INTEGRITY ] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation algorithm is defined in the Transformation (ecdsa-rdfc-2019) section of the Data Integrity ECDSA Cryptosuites v1.0 , the hashing algorithm is defined in Section 3.5.3 Hashing (ecdsa-xi-2023) , and the proof verification algorithm is defined in the Proof Verification (ecdsa-rdfc-2019) section of the Data Integrity ECDSA Cryptosuites v1.0 .
The hashing algorithm is what is defined in the Hashing (ecdsa-rdfc-2019) section of the Data Integrity ECDSA Cryptosuites v1.0 specification with the addition of the hashing of the optical data, as described below. It is presumed that the implementation makes the machine-readable optical data (PDF417 or MRZ data) available to this hashing algorithm.
The required inputs to this algorithm are a transformed data document ( transformedDocument ), a canonical proof configuration ( canonicalProofConfig ), and the optical data ( opticalDataBytes ). A single hash data value represented as series of bytes is produced as output.
The hashing algorithm is what is defined in the Hashing (ecdsa-rdfc-2019) section of the Data Integrity ECDSA Cryptosuites v1.0 with step 3 replaced with the following two steps:
Decode credentialSubject.protectedComponentIndex from multibase-base64url to binary to obtain bitfieldDecoded.

For each protected field whose corresponding bit is 1 in bitfieldDecoded: append a line feed character (\n, U+000A) to the end, and append the result to dataToCanonicalize.
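The steps above, together with the worked example at the end of this document, suggest the following non-normative shape for creating opticalDataBytes in the AAMVA case; treating the final value as a SHA-256 digest of the canonicalized data is an assumption inferred from the 32-byte outputs in that example, and the helper names are illustrative.

import base64
import hashlib

MANDATORY_ELEMENT_IDS = [  # the 22 mandatory element IDs, in Unicode code point order
    "DAC", "DAD", "DAG", "DAI", "DAJ", "DAK", "DAQ", "DAU", "DAY", "DBA", "DBB",
    "DBC", "DBD", "DCA", "DCB", "DCD", "DCF", "DCG", "DCS", "DDE", "DDF", "DDG",
]

def create_optical_data_bytes(protected_component_index, aamva_fields):
    # Decode credentialSubject.protectedComponentIndex from multibase-base64url to binary.
    b64 = protected_component_index[1:]                  # strip the 'u' multibase prefix
    bitfield_decoded = base64.urlsafe_b64decode(b64 + "=" * (-len(b64) % 4))
    bits = int.from_bytes(bitfield_decoded, "big")
    data_to_canonicalize = []
    for i, element_id in enumerate(MANDATORY_ELEMENT_IDS):
        if bits & (1 << (23 - i)):                       # bit i is 1 in bitfieldDecoded
            # Append a line feed (\n, U+000A) to the end of the field value and add
            # the result to dataToCanonicalize.
            data_to_canonicalize.append(element_id + aamva_fields[element_id] + "\n")
    canonicalized_data = "".join(data_to_canonicalize)
    return hashlib.sha256(canonicalized_data.encode("utf-8")).digest()  # assumed SHA-256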
The proof configuration algorithm is what is defined in the Proof Configuration (ecdsa-rdfc-2019) section of the Data Integrity ECDSA Cryptosuites v1.0 with step 4 replaced with the following step:
If proofConfig.type is not set to DataIntegrityProof and proofConfig.cryptosuite is not set to ecdsa-xi-2023, an INVALID_PROOF_CONFIGURATION error MUST be raised.
This section is non-normative.
Before reading this section, readers are urged to familiarize themselves with general security advice provided in the Security Considerations section of the Data Integrity specification as well as the specific security advice provided in the Security Considerations section of the ECDSA Cryptosuites specification .
In the following sections, we review these important points and direct the reader to additional information.
One attack vector against OpticalBarcodeCredentials involves duplicating an optical barcode containing a digital signature for use on a fraudulent document. While a duplicated barcode will pass signature validation like the original, this attack is mitigated by the document verifier checking the following three things: the signed data matches the data visible on the document, the signed data matches the physical attributes of the user, and the visible data matches the physical attributes of the user. When these three are all equivalent, the only way the OpticalBarcodeCredential could be a duplicate is if the fraudulent document creator had access to a real OpticalBarcodeCredential where the signed physical attributes fully overlapped with those of the user of the fraudulent document. The low likelihood of an undetected stolen OpticalBarcodeCredential existing that completely matches the appearance of an arbitrary person makes this attack unlikely to succeed.
It is possible that in some cases the digital signature cannot be created over the entirety of the existing optical data. For example, consider a case where a serial number is injected by a physical credential manufacturer such that it is not known to the issuer at signature time. In this case, the verifier will assume that any data not digitally signed could have been changed in the optical barcode without impacting the OpticalBarcodeCredential's ability to successfully validate.
When checking that data from the optical barcode matches the data visible on the document as well as the characteristics of the document holder, implementers are advised to only use the fields that are digitally signed. Verifiers are advised to only use fields protected by the digital signature, no matter how commonly the other fields are used for fraud detection on unsigned documents. For example, if eye color and hair color are protected by the signature, but the holder's portrait is not, verifiers are advised to rely on the eye color and hair color rather than the portrait when attempting to detect fraud.
Implementers of software used by verifiers are advised to only display card data that has been secured via digital signature during the verification process. Displaying unsigned data, which could have been tampered with, could interfere with fraud detection.
Verifiers are advised to always use trusted programs and interfaces to check the validity of the OpticalBarcodeCredential. Use of untrusted software to verify a document could result in a fraudulent credential being accepted, or a genuine credential being stolen.
The following section describes privacy considerations that developers implementing this specification should be aware of in order to avoid violating privacy assumptions.
Add privacy considerations specific to this specification.
This section is non-normative.
This section contains examples of Verifiable Credential Barcodes as well as step-by-step processes for how they are generated and how they are verified.
In this section we analyze two running examples: a VCB securing the MRZ of a Utopia Employment Authorization Document, and a VCB securing the PDF417 of a Utopia Driver's License. We start with the data that will be signed by the VCB (i.e., an MRZ and mandatory AAMVA fields from a PDF417):
IAUTO0000007010SRC0000000701<<
8804192M2601058NOT<<<<<<<<<<<5
SMITH<<JOHN<<<<<<<<<<<<<<<<<<<
DACJOHN DADNONE DAG123 MAIN ST DAIANYVILLE DAJUTO DAKF87P20000 DAQF987654321 DAU069 IN DAYBRO DBA04192030 DBB04191988 DBC1 DBD01012024 DCAC DCBNONE DCDNONE DCFUTODOCDISCRIM DCGUTO DCSSMITH DDEN DDFN DDGN
Assume for simplicity that the only data in the PDF417 that you want to sign is first name (DAC), last name (DCS), and license number (DAQ). The bitstring value for use in protectedComponentIndex is then 100000100000000000100000, and the value of protectedComponentIndex is "uggAg".
Applying Algorithm 3.5.4.1, we get:
canonicalizedData = 'DACJOHN\nDAQ987654321\nDCSSMITH\n' opticalDataBytes: [188, 38, 200, 146, 227, 213, 90, 250, 50, 18, 126, 254, 47, 177, 91, 23, 64, 129, 104, 223, 136, 81, 116, 67, 136, 125, 137, 165, 117, 63, 152, 207]
For the EAD, we apply Algorithm 3.5.4.2 :
canonicalizedData = 'IAUTO0000007010SRC0000000701<<\n8804192M2601058NOT<<<<<<<<<<<5\nSMITH<<JOHN<<<<<<<<<<<<<<<<<<<\n' opticalDataBytes: [8, 198, 126, 183, 25, 160, 166, 112, 254, 184, 189, 47, 225, 211, 125, 210, 132, 137, 45, 86, 169, 28, 57, 165, 46, 253, 9, 137, 145, 42, 192, 113]
We can now use these hash values with Algorithm 3.5.3 to sign the VC. Executing Algorithm 3.2 with a BitstringStatusListCredential, we get the following JSON-LD VCs:
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://w3id.org/vc-barcodes/v1", "https://w3id.org/utopia/v2" ], "type": [ "VerifiableCredential", "OpticalBarcodeCredential" ], "credentialSubject": { "type": "AamvaDriversLicenseScannableInformation", "protectedComponentIndex": "uggAg" }, "issuer": "did:key:zDnaeWjKfs1ob9QcgasjYSPEMkwq31hmvSAWPVAgnrt1e9GKj", "credentialStatus": { "type": "TerseBitstringStatusListEntry", "terseStatusListBaseUrl": "https://sandbox.platform.veres.dev/statuses/z19rJ4oGrbFCqf3cNTVDHSbNd/status-lists", "terseStatusListIndex": 3851559041 }, "proof": { "type": "DataIntegrityProof", "verificationMethod": "did:key:zDnaeWjKfs1ob9QcgasjYSPEMkwq31hmvSAWPVAgnrt1e9GKj#zDnaeWjKfs1ob9QcgasjYSPEMkwq31hmvSAWPVAgnrt1e9GKj", "cryptosuite": "ecdsa-xi-2023", "proofPurpose": "assertionMethod", "proofValue": "z4g6G3dAZhhtPxPWgFvkiRv7krtCaeJxjokvL46fchAFCXEY3FeX2vn46MDgBaw779g1E1jswZJxxreZDCrtHg2qH" } }
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://w3id.org/vc-barcodes/v1", "https://w3id.org/utopia/v2" ], "type": [ "VerifiableCredential", "OpticalBarcodeCredential" ], "credentialSubject": { "type": "MachineReadableZone" }, "issuer": "did:key:zDnaeZSD9XcuULaS8qmgDUa6TMg2QjF9xABnZK42awDH3BEzj", "proof": { "type": "DataIntegrityProof", "verificationMethod": "did:key:zDnaeZSD9XcuULaS8qmgDUa6TMg2QjF9xABnZK42awDH3BEzj#zDnaeZSD9XcuULaS8qmgDUa6TMg2QjF9xABnZK42awDH3BEzj", "cryptosuite": "ecdsa-xi-2023", "proofPurpose": "assertionMethod", "proofValue": "z4B8AQgjwgsEdcPEZkrkK2mTVKn7qufoDgDkv9Qitf9tjxQPMoJaGdXwDrThjp7LUdvzsDJ7UwYu6Xpm9fjbo6QnJ" } }
We can now apply CBOR-LD compression to these VCs. Here, we use the newest version of CBOR-LD; however, at the end of the section, we provide VCBs encoded using older versions of CBOR-LD for interoperability testing with CBOR-LD implementations that are not up to date. For this specification, we have reserved the CBOR-LD registry entry with value 100 (i.e., these payloads will begin with tag 0x0664).
The parameters to encode using CBOR-LD, which can be found in the registry in the CBOR-LD specification, are then as follows:
registryEntryId: 100 typeTable: { "context": { "https://www.w3.org/ns/credentials/v2": 32768, "https://w3id.org/vc-barcodes/v1": 32769, "https://w3id.org/utopia/v2": 32770 }, "https://w3id.org/security#cryptosuiteString": { "ecdsa-rdfc-2019": 1, "ecdsa-sd-2023": 2, "eddsa-rdfc-2022": 3, "ecdsa-xi-2023": 4 } }
d90664a60183198000198001198002189d82187618a418b8a3189c18a618ce18b218d01ae592208118baa2189c18a018a8447582002018be18aa18c0a5189c186c18d60418e018e618e258417ab7c2e56b49e2cce62184ce26818e15a8b173164401b5d3bb93ffd6d2b5eb8f6ac0971502ae3dd49d17ec66528164034c912685b8111bc04cdc9ec13dbadd91cc18e418ac diagnostic: 1636( { 1: [32768, 32769, 32770], 157: [118, 164], 184: {156: 166, 206: 178, 208: 3851559041}, 186: {156: 160, 168: h'75820020'}, 190: 170, 192: { 156: 108, 214: 4, 224: 230, 226: h'7AB7C2E56B49E2CCE62184CE26818E15A8B173164401B5D3BB93FFD6D2B5EB8F6AC0971502AE3DD49D17EC66528164034C912685B8111BC04CDC9EC13DBADD91CC', 228: 172 } } )
d90664a50183198000198001198002189d82187618a418baa1189c18a218be18ae18c0a5189c186c18d20418dc18e218de58417a9ec7f688f60caa8c757592250b3f6d6e18419941f186e1ed4245770e687502d51d01cd2c2295e4338178a51a35c2f044a85598e15db9aef00261bc5c95a744e718e018b0 diagnostic: 1636( { 1: [32768, 32769, 32770], 157: [118, 164], 186: {156: 162}, 190: 174, 192: { 156: 108, 210: 4, 220: 226, 222: h'7A9EC7F688F60CAA8C757592250B3F6D6E18419941F186E1ED4245770E687502D51D01CD2C2295E4338178A51A35C2F044A85598E15DB9AEF00261BC5C95A744E7', 224: 176 } } )
Encoding the Driver's License CBOR-LD as base64url and inserting the result into the PDF417 bytes in the 'ZZA' field in the 'ZZ' subfile:
bytes(@\n\x1e\rANSI000000090002DL00410267ZZ03080162DLDAQF987654321\nDCSSMITH\nDDEN\nDACJOHN\nDDFN\nDADNONE\nDDGN\nDCAC\nDCBNONE\nDCDNONE\nDBD01012024\nDBB04191988\nDBA04192030\nDBC1\nDAU069 IN\nDAYBRO\nDAG123 MAIN ST\nDAIANYVILLE\nDAJUTO\nDAKF87P20000 \nDCFUTODOCDISCRIM\nDCGUTO\nDAW158\nDCK1234567890\nDDAN\rZZZZA2QZkpgGDGYAAGYABGYACGJ2CGHYYpBi4oxicGKYYzhiyGNAa5ZIggRi6ohicGKAYqER1ggAgGL4YqhjApRicGGwY1gQY4BjmGOJYQXq3wuVrSeLM5iGEziaBjhWosXMWRAG107uT_9bSteuPasCXFQKuPdSdF-xmUoFkA0yRJoW4ERvATNyewT263ZHMGOQYrA==\r)
Encoding the EAD CBOR-LD as base45 and prepending 'VC1-':
VC1-SJRPWCR803A3P0098G3A3-B02-J743853U53KGK0XJ6MKJ1OI0M.FO053.33963DN04$RAQS+4SMC8C3KM7VX4VAPL9%EILI:I1O$D:23%GJ0OUCPS0H8D2FB9D5G00U39.PXG49%SOGGB*K$Z6%GUSCLWEJ8%B95MOD0P NG-I:V8N63K53
The above can now be turned into barcodes:
We now apply the reverse process to verify.
We first read the data from the barcodes:
bytes(@\n\x1e\rANSI000000090002DL00410267ZZ03080162DLDAQF987654321\nDCSSMITH\nDDEN\nDACJOHN\nDDFN\nDADNONE\nDDGN\nDCAC\nDCBNONE\nDCDNONE\nDBD01012024\nDBB04191988\nDBA04192030\nDBC1\nDAU069 IN\nDAYBRO\nDAG123 MAIN ST\nDAIANYVILLE\nDAJUTO\nDAKF87P20000 \nDCFUTODOCDISCRIM\nDCGUTO\nDAW158\nDCK1234567890\nDDAN\rZZZZA2QZkpgGDGYAAGYABGYACGJ2CGHYYpBi4oxicGKYYzhiyGNAa5ZIggRi6ohicGKAYqER1ggAgGL4YqhjApRicGGwY1gQY4BjmGOJYQXq3wuVrSeLM5iGEziaBjhWosXMWRAG107uT_9bSteuPasCXFQKuPdSdF-xmUoFkA0yRJoW4ERvATNyewT263ZHMGOQYrA==\r)
VC1-SJRPWCR803A3P0098G3A3-B02-J743853U53KGK0XJ6MKJ1OI0M.FO053.33963DN04$RAQS+4SMC8C3KM7VX4VAPL9%EILI:I1O$D:23%GJ0OUCPS0H8D2FB9D5G00U39.PXG49%SOGGB*K$Z6%GUSCLWEJ8%B95MOD0P NG-I:V8N63K53
We extract the data after 'VC1-' and the data in field 'ZZA' in subfile 'ZZ', undoing the base encoding:
d90664a60183198000198001198002189d82187618a418b8a3189c18a618ce18b218d01ae592208118baa2189c18a018a8447582002018be18aa18c0a5189c186c18d60418e018e618e258417ab7c2e56b49e2cce62184ce26818e15a8b173164401b5d3bb93ffd6d2b5eb8f6ac0971502ae3dd49d17ec66528164034c912685b8111bc04cdc9ec13dbadd91cc18e418ac
d90664a50183198000198001198002189d82187618a418baa1189c18a218be18ae18c0a5189c186c18d20418dc18e218de58417a9ec7f688f60caa8c757592250b3f6d6e18419941f186e1ed4245770e687502d51d01cd2c2295e4338178a51a35c2f044a85598e15db9aef00261bc5c95a744e718e018b0
We now decompress with CBOR-LD to get the original JSON-LD VCs to be verified. Again, the parameters are associated with CBOR-LD registry entry 100.
typeTable: { "context": { "https://www.w3.org/ns/credentials/v2": 32768, "https://w3id.org/vc-barcodes/v1": 32769, "https://w3id.org/utopia/v2": 32770 }, "https://w3id.org/security#cryptosuiteString": { "ecdsa-rdfc-2019": 1, "ecdsa-sd-2023": 2, "eddsa-rdfc-2022": 3, "ecdsa-xi-2023": 4 } }
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://w3id.org/vc-barcodes/v1", "https://w3id.org/utopia/v2" ], "type": [ "VerifiableCredential", "OpticalBarcodeCredential" ], "credentialSubject": { "type": "AamvaDriversLicenseScannableInformation", "protectedComponentIndex": "uggAg" }, "issuer": "did:key:zDnaeWjKfs1ob9QcgasjYSPEMkwq31hmvSAWPVAgnrt1e9GKj", "credentialStatus": { "type": "TerseBitstringStatusListEntry", "terseStatusListBaseUrl": "https://sandbox.platform.veres.dev/statuses/z19rJ4oGrbFCqf3cNTVDHSbNd/status-lists", "terseStatusListIndex": 3851559041 }, "proof": { "type": "DataIntegrityProof", "verificationMethod": "did:key:zDnaeWjKfs1ob9QcgasjYSPEMkwq31hmvSAWPVAgnrt1e9GKj#zDnaeWjKfs1ob9QcgasjYSPEMkwq31hmvSAWPVAgnrt1e9GKj", "cryptosuite": "ecdsa-xi-2023", "proofPurpose": "assertionMethod", "proofValue": "z4g6G3dAZhhtPxPWgFvkiRv7krtCaeJxjokvL46fchAFCXEY3FeX2vn46MDgBaw779g1E1jswZJxxreZDCrtHg2qH" } }
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://w3id.org/vc-barcodes/v1", "https://w3id.org/utopia/v2" ], "type": [ "VerifiableCredential", "OpticalBarcodeCredential" ], "credentialSubject": { "type": "MachineReadableZone" }, "issuer": "did:key:zDnaeZSD9XcuULaS8qmgDUa6TMg2QjF9xABnZK42awDH3BEzj", "proof": { "type": "DataIntegrityProof", "verificationMethod": "did:key:zDnaeZSD9XcuULaS8qmgDUa6TMg2QjF9xABnZK42awDH3BEzj#zDnaeZSD9XcuULaS8qmgDUa6TMg2QjF9xABnZK42awDH3BEzj", "cryptosuite": "ecdsa-xi-2023", "proofPurpose": "assertionMethod", "proofValue": "z4B8AQgjwgsEdcPEZkrkK2mTVKn7qufoDgDkv9Qitf9tjxQPMoJaGdXwDrThjp7LUdvzsDJ7UwYu6Xpm9fjbo6QnJ" } }
Again, we apply Algorithm 3.5.4.1 and Algorithm 3.5.4.2 to create the opticalDataBytes that ecdsa-xi-2023 requires, using the scanned PDF417 as input for the driver's license, and using the MRZ on the EAD as input for the EAD:
canonicalizedData = 'DACJOHN\nDAQ987654321\nDCSSMITH\n' opticalDataBytes: [188, 38, 200, 146, 227, 213, 90, 250, 50, 18, 126, 254, 47, 177, 91, 23, 64, 129, 104, 223, 136, 81, 116, 67, 136, 125, 137, 165, 117, 63, 152, 207]
canonicalizedData = 'IAUTO0000007010SRC0000000701<<\n8804192M2601058NOT<<<<<<<<<<<5\nSMITH<<JOHN<<<<<<<<<<<<<<<<<<<\n' opticalDataBytes: [8, 198, 126, 183, 25, 160, 166, 112, 254, 184, 189, 47, 225, 211, 125, 210, 132, 137, 45, 86, 169, 28, 57, 165, 46, 253, 9, 137, 145, 42, 192, 113]
We then apply Algorithm 3.5.3 and Algorithm 3.5.2 to verify the credential.
The last step is to check the status information on the Driver's License credential. We apply Algorithm 3.4 to convert the TerseBitstringStatusListEntry into a BitstringStatusListEntry. Here we check two status types, 'revocation' and 'suspension', passing those strings as values of statusPurpose.
{ type: 'BitstringStatusListEntry', statusListCredential: 'https://sandbox.platform.veres.dev/statuses/z19rJ4oGrbFCqf3cNTVDHSbNd/status-lists/revocation/29385', statusListIndex: 8321, statusPurpose: 'revocation' }
{ type: 'BitstringStatusListEntry', statusListCredential: 'https://sandbox.platform.veres.dev/statuses/z19rJ4oGrbFCqf3cNTVDHSbNd/status-lists/suspension/29385', statusListIndex: 8321, statusPurpose: 'suspension' }
These can then be validated as in the Bitstring Status List v1.0: Validate Algorithm .
appContextMap: [['https://www.w3.org/ns/credentials/v2', 32768], ['https://w3id.org/vc-barcodes/v1', 32769], ['https://w3id.org/utopia/v2', 32770]]
This section is non-normative.
This section contains the substantive changes that have been made to this specification over time.
The content for this specification will be filled in after the standards-track process has been started.