This is an archived snapshot of W3C's public bugzilla bug tracker, decommissioned in April 2019.
QT approved comment: In 3.3.16.2, the text says "For example, '0FB7' is a hex encoding for the 16-bit integer 4023 (whose binary representation is 111110110111)." 0FB7 represents the two octets (00001111, 10110111). Whether these two octets represent the integer 4023 is debatable and irrelevant: they might represent a character string in UTF-7 encoding, an icon, a packed decimal number... Also, the term "tuple" in the definition seems strange; a "character pair" would be better.
Change the first paragraph (inherited from 1.0) of 3.3.16.2 Lexical Mapping to read: hexBinary's lexical space consists of strings of hex (hexadecimal) digits, two consecutive digits representing each octet in the corresponding value (treating the octet as the binary representation of a number between 0 and 255). For example, '0FB7' is a ·lexical representation· of the two-octet value 00001111 10110111.
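The mapping described in the corrected wording (each pair of consecutive hex digits denotes one octet, a number between 0 and 255) can be sketched as follows. This is an illustrative decoder only, not normative text from the spec; the function name is hypothetical:

```python
def decode_hexbinary(lexical: str) -> bytes:
    """Map a hexBinary lexical representation to its octet sequence.

    Each two consecutive hex digits represent one octet, treated as the
    binary representation of a number between 0 and 255.
    """
    if len(lexical) % 2 != 0:
        raise ValueError("hexBinary requires an even number of hex digits")
    return bytes(int(lexical[i:i + 2], 16) for i in range(0, len(lexical), 2))

# '0FB7' is a lexical representation of the two-octet value 00001111 10110111
octets = decode_hexbinary("0FB7")
print([format(b, "08b") for b in octets])  # → ['00001111', '10110111']
```

Note that the decoder yields octets, not an integer: whether those octets are later interpreted as the number 4023, a character string, or anything else is outside the scope of the datatype, which is exactly the point of the proposed rewording.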
The change proposed above was approved by the WG in its call of 1 December 2006. It is now reflected in the status quo version of the Datatypes spec. Accordingly, I am setting the disposition of this issue to RESOLVED / FIXED. If the originator of the issue would examine the change and let us know whether it satisfactorily resolves the problem or not, we'd be grateful. To signal that the resolution is acceptable, change the status of the issue to CLOSED. Otherwise, to signal that it's NOT acceptable, change the status to REOPENED (and tell us what's wrong). If we don't hear from you in the next three weeks, we'll assume that silence betokens consent, and close the issue ourselves.