From Semantic Sensor Network Incubator Group
The Skeleton of the Semantic Sensor Network Ontology
- Skeleton module
- Links between SSN and DUL (classes and properties)
- Documentation for the DUL classes and properties used by the SSN ontology (extracted from dul.owl)
- Paper: [Janowicz and Compton 2010] The Stimulus-Sensor-Observation Ontology Design Pattern and its Integration into the Semantic Sensor Network Ontology (superseded by this report).
This section outlines and discusses the skeleton first, and then describes the alignment of the skeleton to the DOLCE foundational ontology and the development of the SSN ontology based on this alignment.
The design of the Semantic Sensor Network Ontology is the result of two iterations.
- Phase 1: development of the ontology modules and examples,
- Phase 2: alignment to the DOLCE Ultra Lite (DUL) upper ontology.
During the first phase, the group used the results of the review of existing sensor and observation ontologies [Compton et al. 2009a] and the input from all the participants. Figure 5.6 provides an overview of the ontology structure at the end of this first phase: sensors may have particular properties, such as an accuracy under certain conditions, or may be deployed to observe a particular feature; the whole SSN ontology thus unfolds around this central pattern, which at its heart relates what a sensor detects to what it observes.
Before the end of the first phase, a proposal to align the SSN ontology with the DOLCE Ultra Lite (DUL) upper ontology was made, on the basis of preliminary alignment work done by one of the group participants using a core design pattern called the Stimulus-Sensor-Observation (SSO) Ontology Design Pattern. After careful consideration, this proposal was accepted by the group as a means to refine and improve the ontology skeleton and to make it easier to use in conjunction with other ontologies, especially those also based on a compatible "upper ontology" skeleton.
This means that the SSN ontology skeleton is the sum of:
- a number of (mostly "local") ontology design decisions made during the first phase,
- plus re-engineering work done to align the ontology with SSO and DUL.
This section provides:
- a description of the core skeleton and of how it relates to the Stimulus-Sensor-Observation Ontology Design Pattern that was originally aimed at,
- a description of the alignment to DOLCE Ultra Lite (DUL).
The relation between the three ontologies (the SSN ontology obtained at the end of the first phase, the core skeleton, and the DUL-aligned version) is best thought of as layers or modules. The core skeleton (also referred to as the ontology design pattern) represents the initial conceptualization as a lightweight, minimalistic, and flexible ontology with a minimum of ontological commitment. While this pattern can already be used as a vocabulary for some use cases, other application areas require a more rigid conceptualization to support semantic interoperability. Therefore, we introduce a realization of the pattern based on the classes and relations provided by DOLCE Ultra Lite. This ontology can either be used directly, e.g., for Linked Sensor Data, or be integrated into more complex ontologies as a common ground for alignment, matching, translation, or interoperability in general.
The Stimulus-Sensor-Observation Ontology Design Pattern
The Stimulus-Sensor-Observation Ontology Design Pattern, presented in Figure 5.7, is aimed at all kinds of sensor- or observation-based ontologies and vocabularies for the Semantic Sensor Web and especially Linked Data. The pattern was developed following the principle of minimal ontological commitment to make it reusable for a variety of application areas. It is not aligned to any top-level ontology and introduces a minimal set of classes and relations centered around the notions of stimuli, sensors, and observations. Based on the work of Quine, the skeleton defines stimuli as the (only) link to the physical environment. Empirical science observes these stimuli using sensors to infer information about environmental properties and construct features of interest.
Stimuli
Stimuli are detectable changes in the environment, i.e., in the physical world. They are the starting point of each measurement, as they act as triggers for sensors. Stimuli can be either directly or indirectly related to observable properties and, therefore, to features of interest. They can also be actively produced by a sensor to perform observations. The same type of stimulus can trigger different kinds of sensors and be used to reason about different properties. Nevertheless, a stimulus may only be usable as a proxy for a specific region of an observed property.
Sensors
Sensors are physical objects that perform observations, i.e., they transform an incoming stimulus into another, often digital, representation. Sensors are not restricted to technical devices but also include humans as observers. A clear distinction needs to be drawn between sensors as objects and the process of sensing. We assume that objects are sensors while they perform sensing, i.e., while they are deployed. Furthermore, we also distinguish between a sensor and a procedure, i.e., a description that defines how a sensor should be realized and deployed to measure a certain observable property. Similar to the capabilities of particular stimuli, sensors can only operate under certain conditions. These characteristics are modelled as observable properties of the sensors and include their survival range or their accuracy of measurement under defined external conditions. Finally, sensors can be combined into sensor systems and networks. Many sensors need to keep track of time and location to produce meaningful results and, hence, are combined with further sensors into sensor systems such as weather stations.
Observations
Observations act as the nexus between the incoming stimulus, the sensor, and the output of the sensor, i.e., a symbol representing a region in a dimensional space. Therefore, we regard observations as social, not physical, objects. Observations can also fix other parameters such as time and location; these can be specified as parts of the observation procedure. The same sensor can be positioned in different ways and, hence, collect data about different properties. In many cases, sensors perform additional processing steps or produce a single result based on a series of incoming stimuli. Therefore, observations are contexts for the interpretation of incoming stimuli rather than physical events.
Properties
Properties are qualities that can be observed via stimuli by a certain type of sensor. They inhere in features of interest and do not exist independently. While this does not imply that they do not exist without observations, our domain is restricted to those observations for which sensors can be implemented based on certain procedures and stimuli. To minimize the ontological commitments related to the existence of entities in the physical world, observed properties are the only connection between stimuli, sensors, and observations on the one hand, and features of interest on the other.
Features of Interest
Features of Interest are entities in the real world that are the target of sensing. As entities are reifications, the decision of how to carve out fields of sensory input to form such features is arbitrary to a certain degree and, therefore, has to be fixed by the observation (procedure).
Procedure
Procedure is a description of how a sensor works, i.e., of how a certain type of stimulus is transformed into a digital representation, perhaps a description of the scientific method behind the sensor. Consequently, sensors can be thought of as implementations of sensing methods, where different methods can be used to derive information about the same type of observed property. Sensing methods can also be used to describe how observations were made, e.g., how a sensor was positioned and used. Simplifying, one can think of sensing methods as recipes for observing.
Result (or SensorOutput)
The result is a symbol representing a value obtained as the outcome of an observation. Results can act as stimuli for other sensors and can range from counts and Booleans to images, or binary data in general.
Implementation of the Stimulus-Sensor-Observation design pattern in SSN
Figure 5.8 illustrates the changes applied to the Stimulus-Sensor-Observation Ontology Design Pattern to include the classes and relations already present in the SSN ontology. In particular, several "shortcut" properties have been added to provide users with more options for creating links between the main classes: Observation, Sensor, Stimulus, Property, and FeatureOfInterest.
Also, a few class names have been changed to match the choices previously made for the SSN ontology:
- Result has been replaced by SensorOutput,
- Procedure has been replaced by Sensing,
- and SensorInput has been kept as a class equivalent to Stimulus.
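Assuming the ssn namespace used in the code snippet later in this section, the SensorInput/Stimulus equivalence could be stated roughly as follows (an illustrative sketch, not taken verbatim from the ontology file):

```xml
<owl:Class rdf:about="http://purl.oclc.org/NET/ssnx/ssn#SensorInput">
  <rdfs:label>SensorInput</rdfs:label>
  <!-- SensorInput and Stimulus denote the same class of entities -->
  <owl:equivalentClass rdf:resource="http://purl.oclc.org/NET/ssnx/ssn#Stimulus"/>
</owl:Class>
```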
Aligning the SSN Ontology and the core SSO design pattern with DOLCE
To ease the interpretation of the primitives used, as well as to boost ontology alignment and matching, the SSO pattern has been aligned to the Ultra Lite version of the DOLCE foundational ontology and refined to match the content of the SSN ontology.
To this end, new classes and relations are introduced based on subsumption and equivalence. For instance, the first pattern uses the generic involves relation, while the DOLCE-aligned version distinguishes between events and objects and, hence, uses DUL:includesEvent and DUL:includesObject, respectively.
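As an illustrative sketch of this distinction (the instance names such as #someObservation are hypothetical, and the DUL prefix is assumed to be bound to the DUL.owl namespace used elsewhere in this section), the two specialized properties might be used as follows:

```xml
<rdf:Description rdf:about="#someObservation">
  <!-- the stimulus is an event included in the observation -->
  <DUL:includesEvent rdf:resource="#someStimulus"/>
  <!-- the sensor is an object included in the observation -->
  <DUL:includesObject rdf:resource="#someSensor"/>
</rdf:Description>
```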
Each class of the SSN ontology is then defined as a subclass of an existing DUL class and related to other SSN and DUL classes. New relations are only introduced when the domain or range has to be changed; in all other cases, the relations from DUL are reused. The aim of the resulting extension of DUL is to preserve all the ontological commitments defined before.
The class Stimulus could be defined either as a subclass of DUL:Event or of one of its immediate subclasses, Action and Process. In contrast to processes, actions require at least one agent as participant and would, therefore, be too restrictive for the design pattern. The classification of events in DUL is work in progress; for instance, nothing is said about how processes differ from other kinds of events. Therefore, the pattern defines Stimulus as a subclass of DUL:Event. As a consequence, stimuli need at least one DUL:Object as participant.
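Following the pattern of the Sensor declaration shown in the code snippet in this section, this design decision could be expressed as (a sketch, assuming the same ssn and DUL namespaces):

```xml
<owl:Class rdf:about="http://purl.oclc.org/NET/ssnx/ssn#Stimulus">
  <rdfs:label>Stimulus</rdfs:label>
  <!-- a stimulus is an event in the physical world -->
  <rdfs:subClassOf rdf:resource="http://www.loa-cnr.it/ontologies/DUL.owl#Event"/>
</owl:Class>
```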
Sensors are defined as subclasses of physical objects (DUL:PhysicalObject). Therefore, they have to participate in at least one DUL:Event such as their deployment. This is comparable to the ontological distinction between a human and a human's life. Sensors are related to their sensing method and observations using the DUL:implements and DUL:isObjectIncludedIn relations, respectively.
```xml
<owl:Ontology rdf:about="http://purl.oclc.org/NET/ssnx/ssn">
  ...
  <owl:Class rdf:about="http://purl.oclc.org/NET/ssnx/ssn#Sensor">
    <rdfs:label>Sensor</rdfs:label>
    ...
    <rdfs:subClassOf rdf:resource="http://www.loa-cnr.it/ontologies/DUL.owl#PhysicalObject"/>
    ...
  </owl:Class>
```
As an example, the code snippet shows how sensors are defined as physical objects in terms of DOLCE.
The class Observation is specified as a subclass of DUL:Situation, which in turn is a subclass of DUL:SocialObject. The required relations to stimuli, sensors, and results can be modelled using the DUL:includesEvent, ssn:observedBy, and ssn:observationResult relationships, respectively. Observation procedures can be integrated via DUL:sensingMethod.
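A hedged sketch of these statements, with hypothetical instance names (#obs1, etc.) and assuming ssn and DUL namespace prefixes bound to the URIs used elsewhere in this section:

```xml
<owl:Class rdf:about="http://purl.oclc.org/NET/ssnx/ssn#Observation">
  <!-- an observation is a situation, hence a social object -->
  <rdfs:subClassOf rdf:resource="http://www.loa-cnr.it/ontologies/DUL.owl#Situation"/>
</owl:Class>

<!-- illustrative instance linking a stimulus, a sensor, and a result -->
<rdf:Description rdf:about="#obs1">
  <rdf:type rdf:resource="http://purl.oclc.org/NET/ssnx/ssn#Observation"/>
  <DUL:includesEvent rdf:resource="#stimulus1"/>
  <ssn:observedBy rdf:resource="#sensor1"/>
  <ssn:observationResult rdf:resource="#output1"/>
</rdf:Description>
```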
The change to the Observation class should be noted and is visible in the figures presented above:
- In Figure 5.6, which shows the ontology structure prior to the DUL alignment, Observation is presented as a sub-class of Event. This corresponds to the approach preferred by the users of the Observations and Measurement standard (O&M) [OM 1 2007], [OM 2 2007].
- In Figure 5.9, a representation of the final version of the SSN ontology, Observation is presented as a sub-class of dul:Situation. It is defined as a Situation in which a Sensing method has been used to estimate or calculate a value of a Property of a FeatureOfInterest. Links to Sensing and Sensor describe what made the Observation and how; links to Property and Feature detail what was sensed; the result is the output of a Sensor; other metadata details times, etc.
This difference is documented in the ontology: Observation in this ontology and in O&M are described differently (O&M records an observation as an act/event), but they record the same thing and are essentially interchangeable. The difference lies in the ontological structure of the two, not in the data or its use. Observation here records a Situation (the estimation of the value of a Property) together with a description of the method that was used (along with the participants), while O&M interprets an Observation as the event itself; there must, however, have been an event that led to our situation, so both are records of events. The distinction is between the event itself and the record of what happened in that event.
ObservedProperty is defined as a subclass of DUL:Quality. Types of properties, such as temperature or pressure, should be added as subclasses of ObservedProperty rather than as individuals. A new relation called SSO:isPropertyOf is defined as a subrelation of DUL:isQualityOf to relate a property to a feature of interest.
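These two statements could be sketched as follows (the identifiers follow the naming pattern of the code snippet in this section; the exact names in the published ontology may differ):

```xml
<owl:Class rdf:about="http://purl.oclc.org/NET/ssnx/ssn#Property">
  <!-- observed properties are qualities inhering in features of interest -->
  <rdfs:subClassOf rdf:resource="http://www.loa-cnr.it/ontologies/DUL.owl#Quality"/>
</owl:Class>

<owl:ObjectProperty rdf:about="http://purl.oclc.org/NET/ssnx/ssn#isPropertyOf">
  <!-- specializes the DUL quality-bearer relation -->
  <rdfs:subPropertyOf rdf:resource="http://www.loa-cnr.it/ontologies/DUL.owl#isQualityOf"/>
</owl:ObjectProperty>
```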
Features of Interest
Features of interest can be events or objects, but not qualities or abstracts, in order to avoid complex questions such as whether there are qualities of qualities. The need to introduce properties for qualities is an artifact of reification and confuses qualities with features or observations. For instance, accuracy is not a property of a temperature but a property of a sensor or of an observation procedure.
Sensing (Procedure in the SSO pattern) is defined as a subclass of DUL:Method, which in turn is a subclass of DUL:Description. Consequently, procedures are expressed by some DUL:InformationObject such as a manual or a scientific paper.
The SensorOutput class (Result in the SSO pattern) is modelled as a subclass of DUL:InformationObject. The concrete data value is attached through a hasValue relationship to a DUL:Region, and then through the data property DUL:hasRegionDataValue in conjunction with an xsd datatype.
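The value chain described above (output, region, literal value) might look as follows; the instance names and the choice of xsd:float are illustrative assumptions:

```xml
<owl:Class rdf:about="http://purl.oclc.org/NET/ssnx/ssn#SensorOutput">
  <rdfs:subClassOf rdf:resource="http://www.loa-cnr.it/ontologies/DUL.owl#InformationObject"/>
</owl:Class>

<!-- hypothetical instance: output -> region -> literal value -->
<rdf:Description rdf:about="#output1">
  <rdf:type rdf:resource="http://purl.oclc.org/NET/ssnx/ssn#SensorOutput"/>
  <ssn:hasValue rdf:resource="#value1"/>
</rdf:Description>
<rdf:Description rdf:about="#value1">
  <rdf:type rdf:resource="http://www.loa-cnr.it/ontologies/DUL.owl#Region"/>
  <DUL:hasRegionDataValue rdf:datatype="http://www.w3.org/2001/XMLSchema#float">21.5</DUL:hasRegionDataValue>
</rdf:Description>
```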
Device, System, Deployment, ...
As suggested in Figure 5.4, other parts of the SSN ontology have also been aligned to DUL. Figure 5.10 provides a more complete view of this alignment: it represents the links between the classes defined in the SSN ontology and those defined in DOLCE Ultra Lite.
The alignment between the two ontologies is also defined through links between properties, which are not represented in Figure 5.10. The complete list of relationships between the two ontologies is provided as an appendix of the generated documentation.