Semantic Sensor Networks Community Group

The group continues the work of the Semantic Sensor Networks Incubator Group (the SSN-XG) in defining and using ontologies and mappings for querying, managing and understanding sensors, sensor networks and observations. It also serves as a community and access point for ontologies (such as the group's SSN ontology) and technologies developed for semantic sensor networks.

Note: Community Groups are proposed and run by the community. Although W3C hosts these conversations, the groups do not necessarily represent the views of the W3C Membership or staff.

Paris Web 2013 presentation: Designing with Sensors: Creating Adaptive Experiences, by Avi Itzkovitch

[Cross-posted on the sensorweb Community Group's blog too.]

I just watched a presentation live from Paris Web 2013, Designing with Sensors: Creating Adaptive Experiences, by Avi Itzkovitch. (A video will probably be available in the near future.)

The tagline of the talk was: “Use Smart Device technology, Sensors and User Data to design a better user experience.”

Quoting the talk's abstract:

"How do we utilize sensor and user data to create experiences in the digital world? We all know that smart devices have sensors, but how can we use this as a resource to acquire information about the user and his environment? And how can we use this information to design a better user experience that is both unobtrusive and transparent? The simple answer: we create adaptive systems.

A microphone could be used to determine if a user is in a quiet or loud environment. Analyzing the user’s search patterns or what applications he downloads can tell us about his preferences and hobbies. Using assisted GPS technology for tracking current location and location history can yield us the user’s surroundings and the physical boundaries of his life, so we can understand what subway station he takes to work, or where he likes to eat his lunch. This data allows us to learn about the user, his environment and dynamically adapt to his needs in any situation. When we understand the context of use we can design functions that are triggered in relevant situations.

Join speaker Avi Itzkovitch to discover core concepts for utilizing smart device technologies and sensor data in order to understand context, and add “adaptive thinking” to the UX professional’s toolset when designing experiences. In his presentation, Avi will demonstrate the importance of understanding context when designing adaptive experiences, give ideas on how to design adaptive systems and most important, inspire designers to think how smart devices and context aware applications can enhance the user experience with adaptivity."
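To make the microphone example in the excerpt above a bit more concrete, here is a minimal browser-side sketch of how a web page might estimate ambient noise and adapt its presentation. It relies on the standard getUserMedia and Web Audio APIs; the loudness threshold and the CSS class names are illustrative assumptions of mine, not anything from the talk or from the group's SSN ontology work.

```typescript
// Minimal sketch: estimate ambient loudness from the microphone and
// toggle a "quiet" or "loud" UI mode. The 0.1 threshold and the class
// names are illustrative assumptions, not taken from the talk.
async function adaptToAmbientNoise(): Promise<void> {
  // Requires a secure context and explicit user consent.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });

  const audioContext = new AudioContext();
  const analyser = audioContext.createAnalyser();
  analyser.fftSize = 2048;
  audioContext.createMediaStreamSource(stream).connect(analyser);

  const samples = new Uint8Array(analyser.fftSize);

  // Re-estimate loudness once per second and adapt the page styling.
  setInterval(() => {
    analyser.getByteTimeDomainData(samples);

    // Root-mean-square amplitude as a crude loudness measure.
    let sumOfSquares = 0;
    for (const value of samples) {
      const normalized = (value - 128) / 128; // map 0..255 to roughly -1..1
      sumOfSquares += normalized * normalized;
    }
    const rms = Math.sqrt(sumOfSquares / samples.length);

    const isLoud = rms > 0.1;
    document.body.classList.toggle("loud-environment", isLoud);
    document.body.classList.toggle("quiet-environment", !isLoud);
  }, 1000);
}
```

In practice such a routine would only be started from an explicit user action, since browsers gate both microphone access and audio processing on user permission and interaction.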

I found an overview page on his website, Designing with Sensors: Creating Adaptive Experiences, which embeds a slide deck from November 2012.