See also: IRC log
<PhilA> rrsagent, make logs public
<raphael> scribenick: raphael
<scribe> scribe: raphael
<scribe> scribenick: raphael
Phil Archer is doing the opening of the W3C Linking Geospatial Data Workshop
scribe: room is fully packed
First session is chaired by John Goodwin
Clemens Portele (Interactive Instruments): Using INSPIRE data on the Web
scribe: background: ELF = European Location Framework
... ELF: http://www.elfproject.eu/
... where are we with respect to INSPIRE data?
... INSPIRE Geoportal: http://inspire-geoportal.ec.europa.eu/
... click on the Discovery link, and browse the catalog of datasets for download
... problem: linking mismatch
<PhilA> INSPIRE about providing the geography, different domains of interest then add/link to it
scribe: in scope are the widely used spatial objects such as cadastral parcels, addresses, CRS
... need for an agreed common base of objects
... how to access features via HTTP?
... ELF provides access to data in more than one platform, including ArcGIS Online
... example of an administrative unit description in HTML or JSON
... example of a Gazetteer geolocator web service
... next problem is persistent identifiers
... this is a recognized problem in INSPIRE, which provides INSPIRE IDs
... on the web, those IDs should be HTTP URIs
... how to deal with copies? copies of the same feature (e.g. an administrative unit) exist, which URI should be used for the consolidation?
<PhilA> So there's a challenge - multiple URIs for same thing
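[A common pattern for consolidating multiple URIs that denote the same feature is an explicit identity link. A minimal Turtle sketch, with hypothetical URIs for one administrative unit published by two sources:]

```turtle
@prefix owl: <http://www.w3.org/2002/07/owl#> .

# Hypothetical URIs: the same administrative unit published twice
<http://data.example.org/inspire/au/DE_BY>
    owl:sameAs <http://elf.example.eu/admin-unit/bavaria> .
```

[owl:sameAs asserts full identity; a weaker link such as skos:exactMatch is often preferred when the two descriptions may differ.]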
scribe: implicit links: example of Eurostat stats that use NUTS codes
... but systems use lat/long and addresses, not NUTS codes; how can we change this?
<PhilA> Looks as if CSV on the Web work could be relevant here
<PhilA> (for NUTS codes)
Kerry Taylor: Developing Ontologies for Linked Geospatial Data
Kerry: a tale of two ontologies - ACORN-SAT (long-term climate data accessible via data.gov.au) and ad-hoc modelling of the water regulations 2008 (report of water data)
<hugh> PhilA - sameas.org? :-) http://www.sameas.org/?uri=http://viaf.org/viaf/85312226
Kerry: ACORN-SAT: 100-year climate observations, 61 million RDF triples, station metadata included, 5-star Linked Data
<PhilA> I had a feeling you'd mention that hugh :-)
Kerry: ontologies used are the W3C SSN ontology, the W3C Data Cube vocabulary and GeoNames for places
<hugh> And don't forget different is (more) important? http://differentfrom.org/os/symbols/http%3A%2F%2Fdata.ordnancesurvey.co.uk%2Fid%2F7000000000003822
<vicchi> Would be interesting to hear experiences on using GeoNames as a single place source - over coffee though
Kerry: another picture of the ontology showing how SSN and Data Cube are articulated
... work performed in 6 person-months (from CSV to 5-star linked open data)
... ELDA is used as the linked data api for browsing the data
... WDTF ontology (water ontology, adhoc project)
... WDTF is an XML format developed to capture water data, broadly adopted in Australian industry
... goal was to translate the UML into OWL
... there is WaterML 2.0 standardized by OGC, see http://www.opengeospatial.org/projects/groups/waterml2.0swg
... model driven architecture approach, is it good to translate this UML model in a set of ontologies?
... ended up developing WDTF OWL, which owl:imports harmonized OWL ontologies derived via the ISO/DIS 19150-2 rules
... WaterML2.0 arose from WDTF
... what went wrong? this took way too long and ended up with an ontology too complex to be usable
<boricles> very nice sentence Kerry ... too much semantics ...
Kerry: too much semantics in the UML which is not properly captured in OWL
Lars G. Svensson: Enriching the German National Library’s Linked Data Service with Geographic Coordinates : Approach and Lessons Learned (so far)
scribe: the DNB has started to publish geospatial data
... point coordinates and bounding boxes for maps and charts, and geographical entities (authorities)
<PhilA> You take the data that's easiest to use, says Lars
<vicchi> "you take the open data which is (most) easy to use" ... very valid point - ease of use over (potential) accuracy
scribe: the slides are at
... the links between titles and places allow finding publications dealing with a specific location
... for the RDF representation of those coordinates, we had 2 major requirements: same vocab for points and polygons, and a widely deployed vocab
... decision was to use GeoSPARQL, and the WKT representation for the polygons
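[As an illustration of that decision, a minimal sketch of a GeoSPARQL/WKT description; the record URI and coordinates here are hypothetical, not taken from the actual DNB service:]

```turtle
@prefix geo: <http://www.opengis.net/ont/geosparql#> .

# Hypothetical map record with a bounding-box polygon as a WKT literal
<http://dnb.example/map/123>
    geo:hasGeometry <http://dnb.example/map/123#geom> .

<http://dnb.example/map/123#geom> a geo:Geometry ;
    geo:asWKT "POLYGON((13.0 52.3, 13.8 52.3, 13.8 52.7, 13.0 52.7, 13.0 52.3))"^^geo:wktLiteral .
```

[The same geo:asWKT property serves points and polygons, which was one of the two stated requirements.]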
<ldodds> I wonder why they didn't link to external sources, rather than include the data directly?
@ldodds, because they have the data, which might be different from what you would link to; they want to be an authority too
<Alex_Coley> "GeoSPARQL was chosen, it may have been the wrong choice" <- will be interesting to hear the thinking behind that
<herste> Unclear whether to use longitude-latitude or latitude-longitude.
scribe: the Linked Data Service of the German National Library is available at http://www.dnb.de/EN/lds.html
<boricles> +1 to ldodds question
Question and Answer
<peterisb> lon lat or lat lon - axis orientation
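[For WKT literals GeoSPARQL resolves this ambiguity: if no CRS URI prefixes the literal, CRS84 is assumed, which is longitude-latitude order, while EPSG:4326 declares latitude-longitude. A sketch of the two spellings of the same (hypothetical) point:]

```turtle
@prefix geo: <http://www.opengis.net/ont/geosparql#> .

# Default CRS84: longitude first, latitude second
<#pointA> geo:asWKT "POINT(-3.53 50.72)"^^geo:wktLiteral .

# Explicit EPSG:4326: latitude first, longitude second
<#pointB> geo:asWKT
    "<http://www.opengis.net/def/crs/EPSG/0/4326> POINT(50.72 -3.53)"^^geo:wktLiteral .
```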
Andreas Harth: is it possible to have access to the catalog data rather than browsing the HTML interface?
Clemens: not yet, licensing issue, but soon links will be provided on ELF
Kerry: bringing WaterML 2.0 in the OWL is a mistake if you do this as a literal translation
Leigh: why do you create the geo data yourselves rather than re-use existing data sources?
<PhilA> Peter Parslow: Who has been using your LD?
Lars: we re-use Geonames
... places data is becoming increasingly important within the library
<PhilA> Stuart: Is there a geoSPARQL store endpoint or just GeoSPARQL data
Stuart: do you just use geosparql syntax or also a geosparql endpoint?
Michael and John: what about temporal evolution of administrative unit decomposition?
Phil: there is a session about
... what was the reason for the failure to capture WaterML semantics? Was it an OWL failure? Do we need OWL 3?
Kerry: no, not a problem of OWL
Lars: we don't do CIDOC-CRM, very complex, hard to consume and we want to ease the linked data consumption
Titi Roman: looking for best practices for publishing the data?
Lars: different models: BIBFRAME, FRBR, etc., so a choice needs to be made
... for authorities, we made our own ontology
<hugh> Is Pelagios relevant: for the previous Q? https://github.com/pelagios/pelagios-cookbook/wiki/Pelagios-Gazetteer-Interconnection-Format
<boricles> there are some efforts on that direction ... http://www.w3.org/TR/ld-bp/
<hugh> Ah - found the group - worrying about temporal: https://groups.google.com/forum/#!forum/lod-gc
Titi Roman: how do you manage data quality in INSPIRE?
Clemens: this is clearly important
<hugh> Nope, not right - Leif Isaksen (email@example.com) is trying to get a UK group with temporal, but can't find it - I'll get back in my box :-)
<hugh> Yeah - nice rt
<HadleyBeeman> Subtopic: Interoperable Registers and Registries in the EU: Perspectives from INSPIRE. Andrea Perego/Michael Lutz
<scribe> scribe: HadleyBeeman
<scribe> scribenick: hadleybeeman
AndreaP: Does everyone know about INSPIRE? Okay, no need for an introduction.
… We are dealing with the maintenance process, coordinated by the INSPIRE implementation group.
<Gianfra> hello is there some live streaming?
<raphael> no Gianfra, only those minutes
<PhilA> Diederik Tirry presents PIDS and RDF for INSPIRE
<PeterParslow> (late note for previous scribe: I was the one who asked what kinds of people are using the German library linked data)
<rhwarren> PeterParslow: I will be shortly. Having a geometry to search with helps a lot.
<Gianfra> thank you raphael
<PhilA> Diederik: Slides are informative of topic
<peterisb> lack of agreed rules how to create RDF vocabularies from UML models.
<PhilA> ... looking for guidance on URI construction for PIDs etc
<raphael> ARENA = A Reusable INSPIRE Reference Platform, https://joinup.ec.europa.eu/community/are3na/description
… INSPIRE meant to be using UML models. Technical guideline for how to encode in GML, but no guidelines for how to encode them in RDF.
… So we are doing that.
Related: issue of identifiers. Special databases, objects, code lists.
… Help us build bridges from INSPIRE to eGov. Webinars, strategies for persistent identifiers.
... Research Data Alliance: Research Data Sharing without barriers. Herman Stehouwer,
Herman: I'm going to talk about the Research Data Alliance in general
… We're a worldwide organisation that tries to increase research data sharing. We do this in a small way, taking small steps that allow data to be shared that could not be before.
… We have working groups with a specific task and lifetime, who must deliver something. We have interest groups, which are more broad.
… We have 15 working groups and 27 interest groups.
… Some will deliver stuff by the end of this year. We'll have our 4th official conference, we will have existed for two years.
… One group is working on a common layer on top of persistent identifiers where you can put information in a standardised way.
… There is a Data type registries group, that wants to describe data.
… There is a metadata standards directory
… There is a group on terminology of interest within the RDA.
… I wanted to highlight the citation of dynamic data.
… EPOS are dealing with data which is constantly changing, and they want to be able to cite their own data at a specific time or decision point.
… The data is gappy, it goes continuously, they may need to recalibrate a sensor, a goat may eat a cable, etc.
… You can participate in upcoming plenaries. The next one is in Dublin at the end of the month, then Amsterdam at the end of the year. Discussions are mostly online.
… You can join or start a group.
Subtopic: Oceans of Linked Data? Adam Leadbetter
AdamL: Social sharing of data, and linking data and registries: these come together in linking oceanographic data here.
… Re social aspects: if you put scientists together in a tin can for a long time, they will share data.
… Oceanography is expensive, encourages data sharing.
… You want to reuse existing data as much as possible.
<Keith> RDA: some of us are working on linking LOD (government level usually summary) to research data (usually held in database environment with detailed schema-level metadata). The EC ENGAGE project is making this bridge
… Even back to 1987, we had registers of controlled vocabularies.
… Here is a book of them, published by UNESCO
… We've grown those base vocabularies and put them on the web. Initially in CSV, now they're online in SKOS.
… Recently, W3C has published guidelines on linked data. These vocabs fit nicely with those.
… We've built up trust in the community by having these be online, persistent, version-controlled, and understanding how they're being used.
… Internet of Things: this is a robot vessel in the ocean. As we're building new standards for these sensors, we're looking at embedding semantics in them.
… Mark-up/annotation from the moment the data is captured.
… Most of our registers are of parameters in the data, but we do have a few that are geospatial. We have a gazetteer of sea areas (as mentioned in the BBC Shipping Forecast). We have bounding boxes for them.
… Heading to data management questions. Can ask the ship operators what tools were deployed, who to contact, etc.
… Marine Strategy Framework Directive in Europe, reporting on chemicals in the seas.
… We need to link these things spatially, rather than just recording what and how.
Question: Re SKOS vocabularies in the oceanography: they are in SKOS RDF?
Question: Are we looking for relationships between specifications? (In RDF vocabularies)
Reply: That's part of the study. We will see. In the end, we want to provide guidelines.
Question: Jeremy Tandy: INSPIRE implementations often talk about the document object, not the real-world thing itself. Has there been any thought on how to reconcile them?
<joeri_robbrecht> How to convert INSPIRE dataspecs into RDF. A RDF model for every dataspec or an integrated model for all INSPIRE annex thematic datasets?
Reply: INSPIRE was not meant for that.
… At this moment, I don't have an answer to that. I hope that in May we'll have more guidance.
Question: PhilA: Is it hard to come up with URIs for versions? At W3C, we have the latest version URI, and every document has its own URI. Each iteration has its own URI. You can add language tags. Why is this a problem? How does content negotiation help/hinder?
… For the technical part, you have to find a way that isn't increasing the complexity.
Question: If you have a code list in the registry, and you use that code in some value. Then the decision to update that value creates a new element in the code list in the registry.
… You have to make people using the registry aware that something has changed.
Herman: Where data has observations made on a certain date, a persistent identifier will give you just the most recent version. But it gets more stable over time.
AdamL: We iterated on this problem, settled on the persistent identifier pointing to the most current version. Painful process.
… but there is a social aspect, trust on what you're providing and how people are using it. If you deprecate - can cause problems.
Question: Have any of you used a versioning or provenance ontology to describe versioning of data?
AdamL: We use version tags to explain the concepts in the register, but we don't have full provenance info. We should look at that in the next year.
Andrea: Depends on what you have to do. Provenance would be good.
Question: to AdamL: a geodetic reference is needed in what you're doing, isn't it?
AdamL: Yes, we do store it. It's in WGS84.
… Time is as important in the ocean as space and depth. Every measurement is the one capture of that x/y/z/t of that specific parameter.
… We don't have a data lifecycle because we don't ever get rid of anything.
… Traditionally, we've been thinking of other aspects of the data.
<Stuart> scribenick: Stuart
subtopic: The MELODIES project: Exploiting Linked Open Geospatial Data Jon Blower, Reading University
Panel session: short presentations - presenters plus others, and hopefully lots of time for discussion.
Broad interest in EO in paper submissions.
scribe: MELODIES is about using/exploiting linked open data.
... acronym expansion (too much to type!)
... aiming to demonstrate the value of openly publishing data.
... EUR 6.7M budget.
... obligatory FP7 slide... partners include SMEs with a big 'S' that are looking to use linked open data.
... Project overview: Sources to the left; platform; and then services feeding back as sources.
... Oh... and users :-)
... 8 services... precision agriculture, urban planning... all anchored in some UN initiative/activity.
... EO data is mostly raster data. Processed to extract features; aggregated; integrated; and then presented in applications.
... using the Strabon spatiotemporal linked data platform.
... Challenges and question... (find the slide).
subtopic: Expressing weather observations as Linked Data; ISO 19100 geographic information meets semantic web head on Jeremy Tandy,
Jeremy Tandy: Weather observations as linked data.
scribe: WMO has been a UN agency since 1951; 190?? countries exchanging weather information.
... WMO has MOUs with ISO and OGC...
... ISO/OGC start with abstract specifications; build application schema on top of these; various topics.
... key thing is we have 'semantics' vested in the application schema.
... ISO 19150-2 is a piece of ISO work looking at the creation of OWL ontologies from ISO application schema.
... Example measured air temperature at Exeter airport as of a given time.
... But we're not in a green field, and we should be reusing widely used vocabularies, such as those listed in LOV.
... abstract specifications need to be translated into existing widely used vocabs.
subtopic: Data Discovery: Unleashing the Power of Geospatial Linked Data Dicky Allison, Woods Hole Oceanographic Institution
<PhilA> Dicky: WHOI has more geo data than anyone. Not a boast, a plea for mercy
Dicky Allison: BCO-DMO has one of the most heterogeneously diverse data collections on the planet.
scribe: that's a plea for mercy
... Matching and mapping terms for the purpose of finding data about same place, event, measurement ....
... First step is local to global vocab mapping.
... BCO-DMO then maps to broader community vocabs.
... NERC vocabs relate to other related vocabs, and we (BCO-DMO) benefit from those curated mappings.
... Use GeoSPARQL ontology to capture whole of a cruise's track.
... aiming to link to geonames and/or SISSVOC marine regions vocab.
... annotating cruise tracks with relevant POIs.
... aiming to provide GeoSPARQL support at endpoints.
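[As an illustration of why GeoSPARQL support at the endpoint matters, a hedged sketch of the kind of query that becomes possible; the property path and the polygon are hypothetical, not taken from the BCO-DMO data:]

```sparql
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>

# Hypothetical query: find cruises whose track crosses a given sea area
SELECT ?cruise WHERE {
  ?cruise geo:hasGeometry/geo:asWKT ?track .
  FILTER(geof:sfIntersects(?track,
    "POLYGON((-10 50, 0 50, 0 60, -10 60, -10 50))"^^geo:wktLiteral))
}
```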
Panel forms: presenters plus Chris Little, Massimo Zotti, Bente Lilja Bye and Tony Bush.
Massimo: An open data business model: use ontologies produced for INSPIRE data models to link EO data to ???. Data Fusion centre...
Chris Little: From the UK Met Office. Chairs ??? at OGC. Abstract is a question/call to action. The problem happens after you've found the data you want. Terabytes of it... now you want to work with it. Given a conceptual model, can you use that to generate some subsetting tools?
Tony Bush: from Ricardo-AEA, an air quality specialist and amateur INSPIRE specialist. Working with DEFRA to migrate UK air quality holdings to INSPIRE. Now have an application schema that draws on about 5 INSPIRE data specs. Regulatory frameworks pulling in different directions.
scribe: here to find out how to do things better and improve openness in air quality data.
Bente Lilja Bye: from Norway. Representing small companies using EO data to bridge the gap to societal applications.
scribe: Social aspects of data policy also come into play.
<herste> Whether we like it or not: people are generating data via social media.
<herste> There are some issues: we need to know the quality of the data, can you trust this data (filtering), need to integrate this in the systems that make decisions.
scribe: most of the data we're using is authoritative. However, consider social data, say around a crisis. How do we make that data interoperable and integrate it into decision-making systems? How can we ascertain the quality of socially sourced data and factor that into decision-making processes?
<AdamL> @Stuart - to clarify at 11:52 - Dicky is potentially linking to the SeaVoX Gazetteer, not SISSVoc (http://vocab.nerc.ac.uk/collection/C19/current/)
Bart ??? from OGC: Question to panel re Chris's point. EO data is quite large. Tendency to download it and process it locally. How about taking the processing power to the data and the results to the user?
<Gianfra> Gianfranco Gliozzo from UCL/ZSL for Bente Lilja Bye: are you also combining data with environmental observations from citizens? I am referring to citizen science.
CL: A rule of supercomputing is that computation is free. Data is so big now that you have to leave it where it is and take the application to the data.
JT: We collaborate around the world. Now looking at ways to bring up 'cloud'-based apps adjacent to the data where it is. Challenge to get levels of agreement about platform standards and how to be confident running someone else's code on your box.
BLB: noting the public efforts to build maps in support of disasters like the Beijing flooding (??)
DA: we don't deal with terabytes of data, but we have ways for users to subselect our data.
TB: The problem of moving computation to data is beginning to be recognised in infrastructure. Web service APIs are good for small amounts of data.
... Let's not forget about file formats/serialisation.
<PhilA> DanCooper Hants CC
Dan Cooper, Hampshire Council: Looking to release the council's aerial data soon... trying to figure out how they should do it. How would users want to access it? And in the context that it will incur some cost to provide.
TB: Map services can be quite challenging. Just making the images simply available is a really good first step.
JT: Yes, concur... that's the sort of thing the Met Office has been doing. Good metadata... KML wrapping.
CL: Don't view publishing INSPIRE as a pain. Look for the benefits that you can gain.
Aggh, all those 'TB:' attributions are not Tony Bush... I think it's Jon Blower.
Steve Peters, DCLG: Moving into interesting times with socially/crowd-sourced information and reporting. Challenging to official sources. How should standards evolve to put officially sourced and crowd-sourced data on a level playing field?
<AdamL> PhilA: The most important star of Linked Data is the first star - get your data out there with an open license on it.
BLB: ISO/OGC et al. should be looking at this.
... need different kinds of standards.
Bart ??: EU has funded 5 citizen data observatories.
scribe: GeoBigwa(sp?) for applying quality labelling to spatial data sets.
... OGC participates in one of these citizen observatories.
JT: Crowdsourced Weather Observation Website. This info is used alongside official data. But at present not used as input to forecasting models - data quality issues.
<Gianfra> crowdsourced environmental information also referred as one of the forms of citizenscience
Tony Bush: Data quality flags are essential. Danger of conflicting message between official and crowd sourced data.
Chris Little: We rely on the 'community' to do their own quality control. Means to express data quality, and the community may comment on each other's observations.
Bente Lilja Bye: Wanted to mention near real time aspects. Particularly important in disaster situations. Even evaluation of authoritative data quality in near real time is already challenging.
Dicky Allison: missed comment
Tony Bush: Mention of UK DEFRA open data strategy - publish up front and then improve later.
Alex: Yes... growing experience of practice. Feedback has been valuable in improving the data.
Keith May, English Heritage: We've put up some heritage data; freshly discovered that one of our terms has been used by someone else. How do we find these things out?
Phil A: Plug for the best practice work, data citation vocabularies.
???: Mention of Billions of triples challenge.
Herman Stehouwer: Please cite data so that in the future we can measure data citation.
<AndyS> ??? = Andreas Harth
Alex: call to ACTION: think about the standards we need and the gaps we need to fill.
Cue next scribe :-)
Ah.. that will be Phil!
<LarsG> PhilA: many standards organisations in the room. They are listening carefully!
<PhilA> scribe: PhilA
<scribe> scribeNick: PhilA
Christopher Baldock, The Clear project http://www.w3.org/2014/03/lgd/papers/lgd14_submission_17, paper at http://www.w3.org/2014/03/lgd/CLEAR
Similarities between CLEAR and MELODIES project
ChrisB: Linking company data and geospatial data so you can see where the companies are
... Env Agency has a data integration tool, includes data submitted by companies, permits, water abstraction etc
... main challenges were different data collection methods
... using company numbers from Dun and Bradstreet
... shows example of Tata Group
... Challenge now is getting company-level data from various countries
... want to include geospatial, things like air and water quality, land cover classification
scribe: lists the challenges
Speaker is actually Otakar Čerba
Talking about the Plan4Business project http://www.plan4business.eu/
Otaka: Shows list of data sources
... general architecture
... robust storage engines
... data processed by variety of tools
Otakar: Shows publishing of
... location evaluator provides some info about regions
PhilA: Looks like Otakar should talk to Steve Peters - work looks similar
subtopic: The GeoKnow Generator: Managing Geospatial Data in the Linked Data Web, Claus Stadler
Geoknow paper http://www.w3.org/2014/03/lgd/papers/lgd14_submission_52
GeoKnow project http://geoknow.eu/
ClausStadler: Many apps need traditional GIS as well as RDF. Make the Sem Web less special
... Introduced GeoKnow Generator
... set of services and APIs
... incl data transformation
... interlinking is most fundamental part
... everything else comes from that
... on data level, 2 partners have use cases based on their private data
... includes persistence layer
... work being done on improving performance
... e.g. sorting triples based on spatial proximity
... life cycle shows tools for each node
... quick plug for EDF 2014
Q & A session
Question: How do you secure the knowledge when the EC funding ends
ClausStadler: We get another
... part of the funding is sufficient to keep it running. We bought a server in LOD2, we're reusing it in GeoKnow
ChrisB: The cube database that the Env Agency developed, with risk assessment scores etc. That's available for free reuse (subject to data sharing contract and data licence)
Bart: Will that be available after the project
ChrisB: Ask me when it's finished
Otakar: We're preparing the association to develop our products and we hope to have enough customers for our data. Our Plan4Business and its predecessor Plan4All will survive
ClausStadler: We have put our open source tools on GitHub so they last as long as GitHub lasts
<LarsG> Claus Stadler: Commercial partners are eager to have the data up and running since they build applications on top of it
Rein: Lots of partners with lots of use cases. Projects often don't publish the use cases - will you?
ClausStadler: Our use case partners have data that is confidential. Our tools are very generic. If there is commercial interest in the tool, OK
ChrisB: Everything we do in CLEAR is informed by workshops we've done with Env Agencies around
... so lots of communication and that's all publicly available
<LarsG> PhilA: do all the companies have URIs?
<LarsG> PhilA: OSM has been mentioned several times but is not represented at lgd
<jtandy> scribe: Jeremy Tandy
<jtandy> scribenick: jtandy
First speaker ... integrating address data ...
Stijn Goedertier, PWC
2 years ago created core location vocabulary & pilot application
looking at data consumer perspective, hard to consume address / location data
data is fragmented across multiple datasets ... lack of common identifiers etc
address data governance in Belgium is fragmented; this is reflected in the available address data
Core Location vocabulary ...
scribe: conceptual model + RDF schema + XML schema
pilot application: deliver address data from Belgium as a linked data service
persistent identifiers for the entities in the data
3 important use cases:
i) disambiguation ... are these two identifiers talking about the same address?
ii) resolve information about an address using its identifier
iii) link datasets using these address identifiers
<Alex_Coley> Core Location Vocab link: https://joinup.ec.europa.eu/asset/core_location/document/core-location-pilot-interconnecting-belgian-address-data
pilot project demonstrated all three use cases
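[A minimal sketch of an address described with the Core Location vocabulary, supporting use case (ii) above; the identifier and address values are hypothetical, not taken from the pilot:]

```turtle
@prefix locn: <http://www.w3.org/ns/locn#> .

# Hypothetical Belgian address resource with a persistent identifier
<http://data.example.be/address/1000-0123> a locn:Address ;
    locn:thoroughfare "Wetstraat" ;
    locn:locatorDesignator "16" ;
    locn:postCode "1000" ;
    locn:postName "Brussel" ;
    locn:fullAddress "Wetstraat 16, 1000 Brussel" .
```

[Resolving the identifier returns this description; two datasets using the same identifier are linked by construction.]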
conclusions: ... the Core Location vocabulary works and you can use it with INSPIRE data
... the vocabulary is easily extensible
conference plug: http://semic.eu, Athens, 9-April
(semantic interoperability conference)
Next speaker: Kostis Kyzirakos
Linked Earth observation data projects
Projects: TELEIOS and LEO
Lifecycle of Linked Open Earth Observation Data ...
Developed a query language for scientific applications: SciQL
Data Vault framework - extend MonetDB to access external (?) file stores
(Remote file repositories)
Example application for DLR organisation ....
semantic annotation ... port areas for venice
using contextual information from the Linked Open Data Cloud to annotate the data
Developed extensions of RDF and SPARQL to deal with data that varies over time: stRDF and stSPARQL
<JohnGoodwin> Ok - will see how it goes :)
RDFi framework developed to query against incomplete information
Data publishing example ...
Attempt to reconcile identifiers for entities using similarity, location etc.
Example publication of temporal data: evolution of land use
Next talk: Linked Data and geoinformatics, Frans Knibbe
Miss Globe is dreaming about mr cube ... amusing slides :-)
<PhilA> I have corrected Frans's name on the agenda/attendee list
(context: how will geoinformatics "miss globe" work with semantic web / linked data "mr cube")
but mr cube is thinking about miss globe ...
scribe: critical mass (linked data is not as widely accepted as geospatial information)
... enhanced connectivity (between data objects and datasets)
... geoinformatics can add topological links to data
there was love between mr cube and miss globe ... now there's a baby too
if they can get together
but there are issues to resolve
scribe: how to encode
... topological functions
... metadata; additional metadata for spatial datasets
... software support; for spatial datatypes and functions
... geometry WKT literal descriptions are large objects; right now we're only used to simple literals
... steep learning curve for application developers to work with linked data
Next speaker: Boris Villazón-Terrazas
Ecuadorian Geospatial Linked Data
building a latin american linked data community
scribe: special focus on Ecuador
"Ecuadorian Geospatial Linked Data"
extend guidelines for publishing linked data to the geospatial linked data domain
scribe: selection of geospatial data resources, assignment of URIs etc.
data sources: geo-databases, KML, shape files etc.
identification of existing geospatial vocabularies
scribe: neogeo, geosparql, core location, schema.org etc.
the project is using geosparql and ISO 19115 geographic information metadata vocabs
issues / discussions include:
scribe: separation of the geometry from the real-world thing (feature)
... representation of geometry as RDF literal
GeoKettle ... spatially enabled version of ETL (extract transform load) tool
scribe: read the source data file, transform & generate RDF
publish the data via SPARQL endpoint - allow for rich queries using SPARQL
visualisation of query results using "Map4RDF" (?)
All presentations finished ... now getting ready for questions
<PhilA> Jon Blower question
Jon Blower: for Kostis ... what temporal information do you capture in the system?
<ocorcho> Question: what is the stability (and roadmap) of the W3C Location and Addresses group's ontology? Can we use it as is right now?
Kostis: (summary) using times for the graphs, not any underlying data model like in Observations and Measurements
Question: can we use the Core Location vocabulary in commercial / operational projects ... is it stable enough?
PhilA: it's published by W3C so it's stable ... we might deprecate it but it will never go away
... it might be amended, but those changes will be backwards compatible
Question: for Kostis - is the system production ready, how well does it scale
Kostis: we have metrics showing it scales well ...
... as a research prototype, don't understand why some older releases of the underpinning components were used
... have tested with large triple sets & would say that this is mature open source implementation
Steve Peters: excuses first "I'm new to this", but I'm in a dilemma - what standards should I use to query the data published by my organisation - WFS or GeoSPARQL?
Kostis: we've experimented with WFS ... but I think it depends on what you're looking for ... difficult to do relational joins with WFS
Boris: my colleague is working with WFS but recognises the restrictions mentioned by Kostis
Stijn: not sure (Geo)SPARQL scales well from a data consumer perspective ... is there tooling available to support local analytics of the data?
Question: for Frans - can you talk about ideas for making things easier for application developers?
Frans: people are working on these but nothing concrete yet (stay tuned for more updates)
Bill Oates: for Boris - very impressed with work of small country (!) ... is the data publication sustainable in longer term; can the tools be reused, can the datasets be maintained? Would you recommend the toolsets used in Ecuador?
Boris: our tools are open source - so you're welcome to reuse them.
... GeoKettle is a well known open source tool for working with geodata; performance is reliable
... happy to share offline
Session is now complete.
<JohnGoodwin> Scribe: JohnGoodwin
<jtandy> PhilA: gets next session of Lightning Talks underway
<jtandy> scribenick: JohnGoodwin
Next speaker: Andreas Harth
Geospatial Data Integration with Linked Data and Provenance Tracking
continuing love story from previous talk
discussing mapping between geosparql, neogeo and location core vocabularies
helps to integrate datasets on superficial level
discussing different geometry representations - many options
just publish geometry online - give it a URI as a first start
No SPARQL for querying RCC relationships on the web
RESTful API instead
claim is this scales better than arbitrary SPARQL
If you want to SPARQL do it locally
check out NeoGeo vocab http://geovocab.org
Questions for Andreas
Phil Archer: Why do NeoGeo when there is GeoSPARQL - what does it add?
Andreas: NeoGeo started in
GeoVoCamp in Washington - overlap in development with that and
... content negotiation on geometries in NeoGeo
... still some minor differences
... neogeo more nimble to make changes
Phil Archer: Should the API not SPARQL approach be extended to other domains?
Andreas: expectations raised with
a SPARQL endpoint are high - people do expensive queries
... use SPARQL locally
Rein: should more restrictive APIs be put on top of SPARQL?
Next speaker: Mano Marks
Geospatial Data and Linked Data
Perspective of Google for geospatial linked data
the knowledge graph - underpinning Google search
geospatial data and linked data both aiming to integrate data from disparate datasets around a single thing - i.e. location
many geospatial data formats and libraries are closed and hard to understand - including semantic web
in 2000s number of standards emerge (KML, GeoJSON, microdata, GeoRSS, GPX, GTFS etc. etc.)
microdata - add small bits of machine readable content to website
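To illustrate the microdata idea Mano describes - small bits of machine-readable content embedded in a web page - here is a minimal sketch that pulls `itemprop` values out of a schema.org-style Place fragment using only Python's standard library. The HTML fragment, class name and property names are invented for illustration; this is not a complete microdata parser (it ignores nesting and `itemscope` boundaries).

```python
from html.parser import HTMLParser

class ItemPropCollector(HTMLParser):
    """Naive collector of itemprop values (illustrative, not spec-complete)."""
    def __init__(self):
        super().__init__()
        self.props = {}
        self._current = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "itemprop" in attrs:
            if tag == "meta":
                # meta elements carry their value in the content attribute
                self.props[attrs["itemprop"]] = attrs.get("content", "")
            else:
                self._current = attrs["itemprop"]

    def handle_data(self, data):
        # text content belongs to the most recent itemprop-bearing element
        if self._current and data.strip():
            self.props[self._current] = data.strip()
            self._current = None

html = """
<div itemscope itemtype="http://schema.org/Place">
  <span itemprop="name">Example Place</span>
  <div itemprop="geo" itemscope itemtype="http://schema.org/GeoCoordinates">
    <meta itemprop="latitude" content="48.8506"/>
    <meta itemprop="longitude" content="2.3444"/>
  </div>
</div>
"""

collector = ItemPropCollector()
collector.feed(html)
print(collector.props)
```

A search engine crawler can read such annotations without needing any out-of-band agreement with the publisher, which is the point Mano makes about integration.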
interesting problem areas to work on: tooling and libraries
Charlton Galvarino: How does Google pick the next best thing to index?
Mano Marks: based on usefulness
scribe: interested in GeoJSON as a 'standard'
Leigh Dodds: JSON-LD solves a number of problems
<LarsG> Backing from e. g. schema.org is important, too
<AdamL> Question at 15:20 from Charlton Galvarino
Mano Marks: GeoJSON interesting new emerging format
<LarsG> Web developers allergic to XML
<AdamL> Charlton's q: Mano mentioned indexing and using indexing to promote sharing of geospatial data. This led to Charlton asking, after experimenting with GeoSPARQL indexing, how do Google choose what may be next best to index?
<LarsG> Mano: GML and KML are presentation formats, not storage formats
?: will Google push JSON-LD and GeoJSON as standards?
Mano Marks: probably not, maybe?
Phil Archer: JSON-LD is W3C standard, can do same for GeoJSON
Next speaker: Raphael Troncy
Modeling Geometry and Reference Systems on the Web of Data
work from the Datalift project done with IGN France
talking about modelling coordinate reference systems
Overview of coordinate systems, projections etc etc - so many to choose from
world coordinate converter http://twcc.free.fr - beware advertising popups
Data lift - develop REST based coordinate conversion service
mainly converts WGS84 to Lambert93 and vice versa
motivation - to link geographical datasets
this tool together with LIMES makes it possible to integrate datasets
vocabulary for CRS: http://data.ign.fr/ontologies/ignf#
ontology for geometries http://data.ign.fr/ontologies/geom#
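As a sketch of the kind of WGS84-to-Lambert-93 conversion the Datalift service performs (this is not the project's implementation, just the standard two-parallel Lambert conformal conic formulas with the published Lambert-93 parameters; the WGS84/GRS80 ellipsoid difference is ignored as negligible here):

```python
import math

# Published Lambert-93 parameters on the GRS80 ellipsoid
A = 6378137.0                  # semi-major axis
F = 1 / 298.257222101          # flattening
E = math.sqrt(2 * F - F * F)   # first eccentricity
LON0, PHI0 = math.radians(3.0), math.radians(46.5)
PHI1, PHI2 = math.radians(44.0), math.radians(49.0)  # standard parallels
X0, Y0 = 700000.0, 6600000.0   # false easting/northing

def _m(phi):
    return math.cos(phi) / math.sqrt(1 - (E * math.sin(phi)) ** 2)

def _t(phi):
    es = E * math.sin(phi)
    return math.tan(math.pi / 4 - phi / 2) / ((1 - es) / (1 + es)) ** (E / 2)

N = (math.log(_m(PHI1)) - math.log(_m(PHI2))) / (math.log(_t(PHI1)) - math.log(_t(PHI2)))
FF = _m(PHI1) / (N * _t(PHI1) ** N)
RHO0 = A * FF * _t(PHI0) ** N

def to_lambert93(lat, lon):
    """Forward projection: degrees lat/long -> Lambert-93 easting/northing."""
    rho = A * FF * _t(math.radians(lat)) ** N
    gamma = N * (math.radians(lon) - LON0)
    return X0 + rho * math.sin(gamma), Y0 + RHO0 - rho * math.cos(gamma)

def from_lambert93(x, y):
    """Inverse projection, iterating the latitude formula to convergence."""
    dx, dy = x - X0, RHO0 - (y - Y0)
    t = (math.hypot(dx, dy) / (A * FF)) ** (1 / N)
    lon = math.degrees(math.atan2(dx, dy) / N + LON0)
    phi = math.pi / 2 - 2 * math.atan(t)
    for _ in range(10):
        es = E * math.sin(phi)
        phi = math.pi / 2 - 2 * math.atan(t * ((1 - es) / (1 + es)) ** (E / 2))
    return math.degrees(phi), lon

x, y = to_lambert93(48.8566, 2.3522)   # central Paris
lat, lon = from_lambert93(x, y)        # round-trips back to the input
```

The round trip recovering the input coordinates is a useful sanity check for any such conversion service.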
Hugh Glaser: How accurate are these tools
Raphael Troncy: marginal errors
Leigh Dodds: can this vocab be used to describe coordinates on the moon?
Raphael Troncy: Yes
?: GeoSPARQL allows for specification of CRS?
Raphael Troncy: yes, but all embedded in a big literal. This is a structured representation of the CRS
Frans: could someone look up the axis order for a CRS (e.g. lat/long or long/lat)?
Raphael Troncy: No
?: Does this work with polygons?
Raphael: just points at the moment
End of talks. Discussion time.
<LarsG> But can convert a list of points, too
<ocorcho> I agree with the comment done by Raphael about this point on encoding crs in GeoSPARQL. See thread at http://lists.w3.org/Archives/Public/public-locadd/2013Dec/0009.html
Leigh Dodds: are we in a situation where we have standardised too early?
Peter Parslow: culture clashes between the top-down standards approach and the bottom-up "let's just do it" approach. Should we give up on top-down standards?
Mano: early in our history - standards bodies still have a role to play. Consider emerging practices and watch
Andreas: What Mano said
Phil Archer: standards are never done top down. Every working group is there because people in the community want it to happen.
scribe: people write
... standards development is cyclic
Peter Parslow: discusses GML vs GeoJSON and how most developers don't seem to want XML anymore
Ed Parsons: we do have standards that can be implemented in the real world. Standards have to work
<LarsG> Andreas Harth: Requirements in the corporate world are different from those on the web: on the web it doesn't matter if you miss something, corporate data often demands high precision
Leigh Dodds: easy to pull down too much data in geospatial linked data - how do we get round this?
Andreas: different predicates for different geometry types - e.g. 'centroid' for the coordinate centroid, and others for other geometries
Raphael: use cases for geometry as a big literal vs geometry stored in the 'graph' - audience, what are your use cases for geometry in the graph, i.e. so you can manipulate geometry in the graph?
Stuart: territorial disputes?
<boricles> GeoLinkedData.es also represented complex geometries as structured RDF objects
Ordnance Survey linked data used 'big literal' approach
Jeremy Tandy: the key is...what do the tools support - that will give you the answer
Session over - coffee
<PhilA> scribe: PhilA
<scribe> scribe: pduchesne
<scribe> scribeNick: pduchesne
Shift happens - Steve Peters
<PhilA> Speaker is Steve Peters (paper at http://www.w3.org/2014/03/lgd/papers/lgd14_submission_12)
<PhilA> Slides are at http://www.w3.org/2014/03/lgd/peters-roberts-temporal-change
<PhilA> Steve: There's a need to get at the statistics
two interesting dimensions : need to get at underlying sources for statistical data...
second, the need to distinguish between geo footprints and organisations
<showing data collected on house buildings>
<aharth_scribe> (Boundary) Shift Happens
<PhilA> Steve: shows boundary changes in Cornwall from multi-district to single council, all new identifiers, new departments etc.
<aharth_scribe> steve peters, dept for communities and local government
<aharth_scribe> steve: need to record provenance of data points for statistics
<aharth_scribe> bill roberts: we're addressing a versioning problem, how do real-world concepts change rather than how documents change
<aharth_scribe> bill: represent changing linked data over time, make possible for people to see how things looked in the past
<aharth_scribe> bill: also, which services are provided by which authorities over time (for example, trash collection)
<aharth_scribe> bill: current vs. historic datasets that describe previous states
<aharth_scribe> bill: use epimorphic's version:currentVersion and dct:hasVersion, dct:replaces
<aharth_scribe> bill: plan is worked out, now we need to implement the plan
<aharth_scribe> bill: we have a suggestion for a versioning pattern, but are interested in other re-usable patterns
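The versioning pattern Bill describes can be sketched with plain Python dicts: each version record says which version it replaces (`dct:replaces`), and the non-versioned identifier points at its current version (`version:currentVersion`). The identifiers and labels below are invented for illustration; only the linking pattern comes from the talk.

```python
# Hypothetical version records for a boundary that changed
# (Cornwall's move to a unitary council is the example from the talk)
versions = {
    "cornwall/v1": {"label": "Cornwall (six districts)", "replaces": None},
    "cornwall/v2": {"label": "Cornwall (unitary council)", "replaces": "cornwall/v1"},
}
current = {"cornwall": "cornwall/v2"}  # version:currentVersion, loosely

def history(concept):
    """Walk the dct:replaces links back from the current version."""
    chain, v = [], current[concept]
    while v is not None:
        chain.append(v)
        v = versions[v]["replaces"]
    return chain

print(history("cornwall"))
```

The same walk lets a client see "how things looked in the past", which is the requirement Bill states.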
<aharth_scribe> rein van 't veer, the netherlands: geospatial semantics
<aharth_scribe> rein: den foundation, digital heritage netherlands
<aharth_scribe> rein: sharing knowledge and tools
<aharth_scribe> rein: three layered architecture: 1 data & aggregation, 2 semantics, 3 applications
<aharth_scribe> rein: provide linked data services and geospatial services on layer 1
<aharth_scribe> rein: provide enrichment services on level 2 to add vocabularies to data
<aharth_scribe> rein: not provide 3 application layer; applications are supposed to come from industries that have use for the data
<aharth_scribe> rein: we need a geo-semantic marriage
<aharth_scribe> rein: we like complex spatial joins (is object of type x in vicinity of object of type y?)
<aharth_scribe> rein: and also several other features from both semantics and geospatial
<aharth_scribe> rein: really like spatial technologies on the web for mapping data
<aharth_scribe> rein: experiments with geosparql after geoknow study
<aharth_scribe> rein: what it takes to get data in, to query data, to combine it with a web mapping framework
<aharth_scribe> rein: demo, two datasets, one about monuments, one about archeological observations
<aharth_scribe> rein: shows demo of simple spatial filtering (works fast)
<aharth_scribe> rein: want only observations done inside the monuments, query that intersects monuments with observations (takes a bit longer)
<aharth_scribe> rein: 20 secs!
<aharth_scribe> rein: main questions: 1) what solution to choose, 2) is GeoJSON-LD the way to go?
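Rein's demo filters observations to those lying inside monument geometries via GeoSPARQL in a triple store; as a sketch of the underlying spatial test, here is the classic ray-casting point-in-polygon check in plain Python (coordinates and feature names are made up - a real deployment would delegate this to a spatially indexed store, as in the demo):

```python
def point_in_polygon(px, py, polygon):
    """Ray casting: cast a ray to the right of (px, py) and count how
    many polygon edges it crosses; an odd count means the point is inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > py) != (y2 > py):  # edge spans the ray's height
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

# A made-up square "monument" footprint and two "observations"
monument = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]
observations = [(1.0, 1.0), (5.0, 2.0)]
inside = [p for p in observations if point_in_polygon(p[0], p[1], monument)]
```

Without a spatial index this is O(observations x monuments x edges), which is one reason the intersection query in the demo takes noticeably longer than simple bounding-box filtering.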
<aharth_scribe> rob warren, canada
<aharth_scribe> From the trenches
<aharth_scribe> rob: plot of positions of geo-points in world war 1
<aharth_scribe> rob: map 1:100,000 in 1914, worked well enough to specify you want to meet in paris
<aharth_scribe> rob: germans and british could use maps of belgians, but france didn't have good maps
<aharth_scribe> rob: germans expanded the maps into france relative to existing belgian maps
<aharth_scribe> rob: differences of western front around 100 yards off depending on which map you use
<aharth_scribe> rob: coordinate system needed; artillery men wanted map in yards, use yard squares (500, 50 or 5 yards), but system is only precise up to 20 yards
<aharth_scribe> rob: maps were not good for artillery use
<aharth_scribe> rob: points are really circles with 20 yards diameter
<aharth_scribe> rob: muninn trench map api, translates british trench map coordinates to lat/long
<aharth_scribe> rob: bnf's rdf could be better typed
<aharth_scribe> rob: computer scientists vs. geographers have different notions of floating point numbers
<aharth_scribe> rob: trench names do not always make sense, taken over by different parties at different times, named differently
<aharth_scribe> rob: need the ability of doing ad-hoc queries
<aharth_scribe> hadley beeman: speakers back on stage, also susanne rutishauser, uni berne, ch
<aharth_scribe> question to steve and bill: versioning uris...
<aharth_scribe> ... if you link to different objects in your datasets, how does the link change when the data changes
<aharth_scribe> bill: depends, versions have fixed periods of validity...
<LarsG> I think the question at 17:08 was by Boris Villazón-Terrazas
<aharth_scribe> q: always link to a specific version?
<aharth_scribe> ... two axes of change: lineage and successor (?)
<boricles> 17:08? ... I think it was Michael Lutz
<aharth_scribe> q: application might need to figure out which version to use
<aharth_scribe> q: does the link have to have a validity period attached as well?
<aharth_scribe> hadley: introduces susanne
<aharth_scribe> susanne: project provides archeological information about a region in turkey (virtual cilicia project)
<raphael> scribenick: aharth_scribe
<raphael> scribe: aharth
orri: happy that geoknow survey
has been mentioned often, virtuoso is fast for geoqueries
... interested in providing faster response time for rein's 20 sec query
q: historic boundaries very interesting in conjunction with cultural heritage, when can we get the data?
steve: depends on ods and
... need discussion on how far back we can provide data
q: are paper versions available for older maps?
a: possibly in national archives
phil: what records/data are you using (old maps, physical measurements)?
<pauljcripps> question re historic boundaries was from me (paul cripps, university of south wales) - would form a very useful way of mediating queries for cultural heritage data
susanne: first, local level with
own field work: which objects belong to which temple, have also
photographs but don't make that detailed information available
on the web due to possible theft
... illegal excavations are a problem
... maps based on street system of 200 b.c. to get field boundaries
rob: if you review historical
work, the thing, the name of the thing, and the geometry of the thing are
... different things, need to deal with part-of, one has to be pragmatic
q: use events, to talk about something that has happened at time and place
scribe: take a look at event ontology
hadley: you'd need time spans and time points
q: do you know about the event? e.g. watching a tree growing, but only have some discrete observations
hugh: what's the idea of truth? who said that something's happened? have to embrace the idea that brussels is at different places
q: within OGC we have OWS-9 activities (check out the OWS-9 engineering report)
q: did you see the limits of rdf here? how to represent points in time in triples?
<pauljcripps> just to add when working with archaeological data using CIDOC-CRM, we often use 'relative' chronologies for events as we don't know specific start and end points. Temporal reasoning accomplished using Allen's temporal operators.
aharth: use n-ary
... you can stay inside rdf
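The n-ary relation pattern aharth mentions can be sketched with triples as tuples: instead of a direct triple (Brussels, locatedAt, point_a) that cannot carry a time, introduce an intermediate observation node holding the feature, the geometry and the validity interval. All URIs and dates here are invented placeholders.

```python
# N-ary pattern: the statement itself becomes a node carrying extra facts
triples = [
    ("ex:obs1", "rdf:type",       "ex:LocationObservation"),
    ("ex:obs1", "ex:ofFeature",   "ex:Brussels"),
    ("ex:obs1", "ex:hasGeometry", "ex:point_a"),
    ("ex:obs1", "ex:validFrom",   "1830-01-01"),
    ("ex:obs1", "ex:validTo",     "1914-01-01"),
]

def properties(graph, subject):
    """Group one n-ary node's statements back into a single record."""
    return {p: o for s, p, o in graph if s == subject}

record = properties(triples, "ex:obs1")
```

Everything stays plain triples, so no extension to the RDF data model is needed - only a modelling convention.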
q: not enough to standardise on iso-8601 (?), you need coordinate reference system, calendar, and ...
hadley: what about leaving earth?
q: i'm an astrophysicist, this planet and celestial reference frames are connected, communities work together, translation of one coordinate system to another is handled well
scribe: quasars are giving us a relative reference to define our place in the universe
john: also need different temporal reference systems
<AdamL> Speaker in this discussion before John Goodwin was Bente Lilja Bye
q: u.k. met office and nasa are forecasting e.g. aurora, sun is 8 minutes away, what time to use?
hadley: thanks to the speakers
phil: some of us will go to the
... tomorrow we'll use this floor and the third floor, keep your badge if you come back tmrw
... lots of stuff in the morning, panel in the afternoon about what are we going to do next, w3c and ogc together want to know what the community wants done
... 60 secs or less pitches for bar camps
... thanks to participants & good bye
Possibly Present: AdamL Alejandro_Llaves Alex Alex_Coley Andrea AndreaP AndreaPerego Andreas AndyS Angelica BLB Bart Bart_De_Lathouwer BillOates Boris CL ChrisB ClausStadler Clemens ClemensPortele Curly DA Dicky DickyWH Diederik Diederik_tirry Emily_C FannyL Fionalm FlyingDutchwoman FlyingDutchwoman_ Frans Frans_Knibbe Gianfra Gianfra1 Gobe__Envitia_ HadleyBeeman Herman JMEV JT Janderl JohnGoodwin Keith Kerry Kostis Lars LarsG LarsG_ Lathoub Leigh Mano ManoMarks Massimo MichaelLutz Otaka Otakar PeterParslow PeterRushforth Phil PhilA Projects RDA Raphael Related Reply SpatialRed Steijn Steve Stuart Subtopic TB TimDuffy Tim_Duffy aharth aharth_scribe aw ayymanduh bert_ bill billroberts boricles boricles_ chobbs chrishenden conclusions cperey cperey_ cyrubattino dennis dennis_ dennis__ dennis_keck edp eparsons giusepperizzo giusepperizzo1 hadley herste hugh icm icm1 isedwards jhigman joeri_robbrecht john jtandy laurent_oz ldodds ldodds1 libby lindavandenbrink martintuchyna ocorcho orri pauljcripps pduchesne peterisb pezholio pjc196 rein rhwarren rob robbejoe scribeNick stuart_ susanne svensson vcp vicchi wpoates1 You can indicate people for the Present list like this: <dbooth> Present: dbooth jonathan mary <dbooth> Present+ amy Got date from IRC log name: 05 Mar 2014 Guessing minutes URL: http://www.w3.org/2014/03/05-lgd-minutes.html People with action items:[End of scribe.perl diagnostic output]