
Bug 2970 - Datatypes 2006-02-17 WD: merge precisionDecimal and decimal
Summary: Datatypes 2006-02-17 WD: merge precisionDecimal and decimal
Status: CLOSED INVALID
Alias: None
Product: XML Schema
Classification: Unclassified
Component: Datatypes: XSD Part 2
Version: 1.1 only
Hardware: Macintosh
OS: All
Importance: P2 normal
Target Milestone: ---
Assignee: C. M. Sperberg-McQueen
QA Contact: XML Schema comments list
URL:
Whiteboard: medium, hard
Keywords:
Depends on:
Blocks:
 
Reported: 2006-03-02 23:54 UTC by Xan Gregg
Modified: 2006-10-09 16:19 UTC
CC List: 0 users

See Also:


Attachments

Description Xan Gregg 2006-03-02 23:54:52 UTC
Adding xs:precisionDecimal is clearly a big change for 1.1. But given that, why be so conservative with 
xs:decimal? Like adding a new datatype, expanding the value and lexical spaces for xs:decimal (e.g., 
INF, 2.1E-2) doesn't impact existing uses. Old valid decimals are still valid.

Furthermore, why not have xs:precisionDecimal be the base type of xs:decimal? This requires widening 
the xs:decimal value/lexical spaces, as above, and having the xs:decimal mapping choose a particular 
value for the precision property. A good choice would be to use the least precision value that maintains 
numerical equality and that forces at least one fraction digit in the precisionDecimal canonical 
representation.
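
(Purely for illustration, here is a rough Python sketch of one possible reading of that rule, on the
assumption that "precision" counts retained fraction digits, as the "at least one fraction digit"
wording suggests.  The function name and the use of Python's decimal module are not part of the
proposal, and special values such as INF are ignored.)

    from decimal import Decimal

    def minimal_precision(lexical):
        """Map a decimal lexical form to (value, precision): the smallest
        fraction-digit count that preserves numerical equality, but at least 1."""
        value = Decimal(lexical)             # accepts exponent forms such as 2.1E-2
        precision = 1
        while value != value.quantize(Decimal(1).scaleb(-precision)):
            precision += 1
        return value, precision

    # minimal_precision("2.1E-2") -> (Decimal('0.021'), 3)   canonical "0.021"
    # minimal_precision("5")      -> (Decimal('5'), 1)       canonical "5.0"
    # minimal_precision("2.500")  -> (Decimal('2.500'), 1)   canonical "2.5"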

I know it is undesirable for a type to have a different lexical mapping than its base type, but nearby 
xs:integer already has that defect. One could even argue that many xs:string types also have the same 
issue via the whitespace facet. For instance, some lexical representations will produce different values 
for xs:token than they produce for xs:string.
Comment 1 Dave Peterson 2006-03-03 02:44:49 UTC
(In reply to comment #0)
> Adding xs:precisionDecimal is clearly a big change for 1.1. But given that, why be so conservative with
> xs:decimal? Like adding a new datatype, expanding the value and lexical spaces for xs:decimal (e.g.,
> INF, 2.1E-2) doesn't impact existing uses. Old valid decimals are still valid.

Agreed, this part would be relatively easy--although there are technical details that would, for
example, require recasting a number of facet functions and the limits we set for minimum partial
implementations.

> Furthermore, why not have xs:precisionDecimal be the base type of xs:decimal? This requires widening
> the xs:decimal value/lexical spaces, as above, and having the xs:decimal mapping choose a particular
> value for the precision property.

We did work out in full the description of how the mapping could be defined, and the specialized
facet(s) that would have to be introduced to support this kind of derivation, but we decided there was
too much machinery and too much disruption to the organization of our datatypes.  At least, that is my
perception of why the WG chose not to do this.  I can assure you, though, that the idea was carefully
considered; it was not accidental that it wasn't done.

> I know it is undesirable for a type to have a different lexical mapping than its base type, but nearby 
> xs:integer already has that defect. 

I don't believe integer has anything unusual any more in its lexical mapping.  The only magic is in the 
canonical mapping, which is strictly advisory and no longer used in schema processing.

> One could even argue that many xs:string types also have the same
> issue via the whitespace facet. For instance, some lexical representations will produce different values
> for xs:token than they produce for xs:string.

Not true.  You're thinking of character strings like ' a ', but that character string is *not* in the lexical 
space of token.  If that character string occurs between the start- and end-tags of an element that is 
typed token, the leading and trailing whitespace will be stripped first, and then the resulting string ('a') 
is the lexical representation.  Whitespace processing happens before the string is even considered as a
candidate lexical representation.
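
(For illustration only, a tiny Python sketch of the order of operations described above: the
whitespace facet is applied to the character string first, and only the result is considered as a
candidate lexical representation.  The function name is made up, and the collapse step is only
approximated with Python's whitespace splitting.)

    def apply_whitespace_facet(chars, whitespace):
        """whitespace is 'preserve', 'replace', or 'collapse'
        (xs:string uses preserve; xs:token uses collapse)."""
        if whitespace == 'replace':
            return chars.replace('\t', ' ').replace('\n', ' ').replace('\r', ' ')
        if whitespace == 'collapse':
            return ' '.join(chars.split())   # also trims and merges spaces
        return chars                         # preserve

    # apply_whitespace_facet(' a ', 'collapse') -> 'a'
    # The string ' a ' itself never reaches the lexical mapping of xs:token; only 'a' does.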
Comment 2 Xan Gregg 2006-03-03 15:47:15 UTC
I remember some of the discussion for aligning these types in the hierarchy
before when I was involved with the WG, but I don't remember the option
presented in my original comment here ("minimal precision") being considered. I
mainly remember a proposal where precision=absent for xs:decimal, but that
expands the value space of xs:precisionDecimal since the latter doesn't allow
precision=absent for numeric values.

What is the extra machinery for treating xs:d as a "minimal precision" xs:pd
(and xs:integer as an integral-precision xs:pd)? I take it that instead of different
lexical mappings, you would have a shared lexical mapping that is conditioned by
a precision-mapping facet, whose values would be exact, minimal, or integral.
Is there any other machinery, or is that already too much?
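
(Again purely illustrative: one way such a shared, facet-conditioned mapping might look in Python,
using the same fraction-digit reading of "precision" as the sketch in the description.  The facet
name and its three values come from this comment; nothing below is WG or spec text.)

    from decimal import Decimal

    def shared_lexical_mapping(lexical, precision_mapping):
        """precision_mapping is 'exact', 'minimal', or 'integral' (hypothetical facet)."""
        value = Decimal(lexical)
        if precision_mapping == 'exact':        # xs:precisionDecimal: precision as written
            precision = -value.as_tuple().exponent
        elif precision_mapping == 'integral':   # xs:integer: no fraction digits
            precision = 0
        elif precision_mapping == 'minimal':    # xs:decimal under the "minimal precision" idea
            precision = 1
            while value != value.quantize(Decimal(1).scaleb(-precision)):
                precision += 1
        else:
            raise ValueError("unknown precision-mapping value")
        return value, precision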

Anyway, my memory is faulty and probably my analysis, too.

Thanks for the factual corrections on the lexical mapping issues for integer and
token.
Comment 3 Dave Peterson 2006-09-15 16:29:50 UTC
(In reply to comment #2)
> I remember some of the discussion for aligning these types in the hierarchy
> before when I was involved with the WG, but I don't remember the option
> presented in my original comment here ("minimal precision") being considered. I
> mainly remember a proposal where precision=absent for xs:decimal, but that
> expands the value space of xs:precisionDecimal since the latter doesn't allow
> precision=absent for numeric values.

Actually, both were worked out and considered.  For the variant you remember, you have mentioned one
difficulty.  For both, there have to be new, special-purpose facets to accomplish the derivation, which
would have to be tracked; if those facets exist, then other users could use them to create
other-namespace, other-named versions of decimal that could not be recognized by name, which would make
recognition for optimization difficult.  In addition, either proposal changes decimal from primitive to
derived, which IIRC makes unfortunate trouble for F&O.

> What is the extra machinery for treating xs:d as a "minimal precision" xs:pd
> (and xs:integer as an integral-precision xs:pd)? I take it that instead of different
> lexical mappings, you would have a shared lexical mapping that is conditioned by
> a precision-mapping facet, whose values would be exact, minimal, or integral.
> Is there any other machinery, or is that already too much?

"Too much" is, of course, in the eye of the beholder.  But requiring a lexical mapping that must depend for its results on a facet value introduces several complications:  One is that no libraried mappings work this way, so implimentations must always add their own wrapper.  Another is that technically a datatype exists on its own, independent of any simple type definitions that "point at it" and link it to a name.  For that datatype to have to have knowedge of those simple type definitions is a whole new version of datatype which (at least it's my impression that) the WG is not prepared to work out on the schedule it must work to to get to publication.
Comment 4 Dave Peterson 2006-10-06 16:15:36 UTC
The WG has looked once more at this issue and concluded that there is just too much machinery that needs to be added to the facet system for too little gain.

Please let us know if you agree with this resolution of your issue, 
by adding a comment to the issue record and changing the Status of 
the issue to Closed.  Or, if you do not agree with this resolution, 
please add a comment explaining why. If you wish to appeal the WG's 
decision to the Director, then also change the Status of the record 
to Reopened. If you wish to record your dissent, but do not wish to 
appeal the decision to the Director, then change the Status of the 
record to Closed.  If we do not hear from you in the next two weeks, 
we will assume you agree with the WG decision.
Comment 5 Xan Gregg 2006-10-09 16:19:49 UTC
Thanks for the consideration.