DefinitionalWebClosure

This is an attempt to define a subgraph of the WebClosure which is useful in establishing SocialMeaning. The idea is that computing the DefinitionalWebClosure of a graph and checking it for internal consistency (satisfiability) might be a practical and useful way to find common errors of NamespaceSquatting and NamespaceDistortion. The DefinitionalWebClosure may also serve as a good knowledge base for answering queries when the graph itself is too small and the full WebClosure is too large.

Consider the "definition" of a URI to be the subgraph of the graph fetched from that URI whose truth value constrains the interpretations of that URI. That is, any triple in the retrieved graph is part of the "definition" if its being true would tell you something about what the URI could possibly denote. (Feel free to rephrase this, MT gods.)

PFPS: The problem with the definition above is that it depends on the entirety of one's knowledge. Take the old "Morning Star"/"Evening Star" example. If I don't know that they are the same, then anything said about one doesn't (well, sort of) affect the other. However, if I do know that they are the same, then anything said about one does affect the other. So the subgraph would grow or shrink depending on other knowledge. (Actually the situation is even worse, as the parenthetical remark above alludes to. Depending on how one reads this, it might be the case that everything can affect everything, because there is some extension of one's knowledge where they do.)
SandroHawke replies: (Wow, I invoke the MT gods and one appears!)  I am thinking that the "definition" is defined in terms of having no other knowledge but what is explicitly stated in the original graph and this branch of the closure from that original graph.   But I'm not sure about semantic extensions; are you supposed to understand owl:sameAs triples?...   I guess the DefinitionalWebClosure needs to be parameterized by certain semantic extensions to RDF (e.g. OWL), but that makes sense.   (... thinking about how to implement this ...)

The DefinitionalWebClosure of an RDF Graph contains the original graph plus the "definitions" of all the URIs which occur in it, plus the "definitions" of the URIs which occur in those definitions, and so on recursively.
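
Here is a minimal sketch of that recursive computation, in Python with rdflib (the choice of language and library is mine, not part of the proposal). The definition_subgraph filter is hypothetical at this point; one possible reading of it is sketched after the example below.

from urllib.parse import urldefrag
from rdflib import Graph, URIRef

def definitional_web_closure(graph, definition_subgraph, max_fetches=100):
    """Original graph plus the "definitions" of every URI in it, recursively.
    definition_subgraph(fetched, uri) is the (hypothetical) filter that keeps
    only the triples constraining what uri can denote."""
    closure = Graph()
    closure += graph
    seen = set()
    pending = {t for triple in graph for t in triple if isinstance(t, URIRef)}
    while pending and len(seen) < max_fetches:
        uri = pending.pop()
        if uri in seen:
            continue
        seen.add(uri)
        fetched = Graph()
        try:
            # Dereference the URI, dropping any fragment first.  (A fuller
            # version would dedupe by document URL, not by full URI.)
            fetched.parse(urldefrag(str(uri))[0])
        except Exception:
            continue    # unreachable or unparsable; skip it
        definition = definition_subgraph(fetched, uri)
        closure += definition
        # Recurse: URIs occurring in the new definitions get fetched too.
        pending |= {t for triple in definition for t in triple
                    if isinstance(t, URIRef) and t not in seen}
    return closure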

For example:


<a> <b> <c>


From <a> we fetch:

<a> <x1> <x2>     # in DWC
<b> <x3> <x4>     # not in DWC; we followed <a> to get here, not <b>
<x1> <x5> <x6>    # in DWC: could constrain x1, and thus a


From <b> we fetch:

<b> rdf:type <C>    # in DWC
<y1> rdf:type <C>   # not in DWC; even though this triple is connected
                    # to <b>, we can eliminate it because of the semantics
                    # of rdf:type.  Knowing about a co-instance cannot
                    # eliminate a model, I don't think.  And this is 
                    # basically necessary to keep from sucking in the
                    # whole world.


That's the idea. It remains to be seen whether this captures the information which intuitively we want in a definition, without getting so much that it's impractical to gather.
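
As a rough reading of the rule the example illustrates: start from the triples whose subject is the URI we dereferenced, then follow forward through predicates and objects, but never walk backward from a shared object. That forward-only rule is what keeps <x1> <x5> <x6> in and <y1> rdf:type <C> out. A hedged sketch, again assuming rdflib; this is one guess at the filter, not a settled definition.

from rdflib import Graph, URIRef, BNode

def definition_subgraph(fetched, uri):
    """Subgraph of fetched that plausibly constrains what uri denotes."""
    definition = Graph()
    frontier = {uri}        # subjects whose triples we still want
    visited = set()
    while frontier:
        subject = frontier.pop()
        visited.add(subject)
        for s, p, o in fetched.triples((subject, None, None)):
            definition.add((s, p, o))
            # Follow forward only: the predicate and object become candidate
            # subjects, but we never walk from an object back to some other
            # subject that shares it.  Blank nodes are followed too.
            for term in (p, o):
                if isinstance(term, (URIRef, BNode)) and term not in visited:
                    frontier.add(term)
    return definition

The variation mentioned in the next paragraph (follow only predicates) would add just p to the frontier, not o.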

You can think of this as the "contiguous graph", as in the "contiguous 48 states". A variation is one in which you follow only predicates, not subjects or objects.


Hmm... is this based on any experience? Documenting PPR:DesignPatterns based on experience is great, but this feels like PPR:BigDesignUpFront.

No, it's just an idea.