IRC log of sdwcov on 2016-04-06

Timestamps are in UTC.

13:02:33 [RRSAgent]
RRSAgent has joined #sdwcov
13:02:33 [RRSAgent]
logging to
13:02:35 [trackbot]
RRSAgent, make logs world
13:02:35 [Zakim]
Zakim has joined #sdwcov
13:02:37 [trackbot]
Zakim, this will be SDW
13:02:37 [Zakim]
ok, trackbot
13:02:38 [trackbot]
Meeting: Spatial Data on the Web Working Group Teleconference
13:02:38 [trackbot]
Date: 06 April 2016
13:02:45 [billroberts]
ah hi Phil - we're stuck with the webex
13:03:03 [billroberts]
being asked for MIT certificate
13:03:38 [billroberts]
will try, 2 secs
13:03:49 [phila_mtg]
RRSAgent, make logs public
13:04:02 [ScottSimmons]
present+ ScottSimmons
13:04:07 [phila_mtg]
zakim, code?
13:04:07 [Zakim]
I have been told this is SDW
13:04:31 [phila_mtg]
zakim, this is SDW Weekly, Access code 642 889 345, password sdw
13:04:31 [Zakim]
got it, phila_mtg
13:04:35 [kerry]
present+ Kerry
13:04:38 [billroberts]
yes, success - many thanks Phil
13:04:39 [Maik]
present+ Maik
13:04:42 [phila_mtg]
zakim, save this description
13:04:42 [Zakim]
this conference description has been saved, phila_mtg
13:05:07 [sam]
present+ sam
13:05:11 [billroberts]
present+ billroberts
13:05:20 [billroberts]
regrets+ Lewis
13:05:24 [Duo]
present+ duo
13:05:27 [billroberts]
regrets+ phila
13:05:49 [dmitrybrizhinev]
present+ dmitrybrizhinev
13:07:03 [kerry]
regrets+ eparsons
13:07:17 [kerry]
i can do it if you like!
13:07:46 [kerry]
scribe: kerry
13:07:52 [kerry]
scribeNick: kerry
13:08:02 [kerry]
chair: Bill
13:08:28 [kerry]
topic: patent call
13:09:08 [kerry]
propose: approve minutes
13:09:23 [billroberts]
13:09:26 [kerry]
13:10:01 [dmitrybrizhinev]
13:10:29 [kerry]
resolved: approve minutes
13:11:05 [kerry]
topic: Brief recap of previous meeting
13:11:45 [kerry]
bill: reviewed requirements, talked about subsets and web-friendly formats, reviewed data on the Web view
13:11:55 [kerry]
...with some members of that group on the call
13:12:12 [kerry]
...large portions of coverage data are gridded (but not all)
13:12:29 [kerry]
... gridded dataset sections can be defined fairly easily
13:12:52 [kerry]
... will work on grid first, and also look at non-gridded for important cases
13:12:59 [kerry]
13:12:59 [billroberts]
13:13:08 [kerry]
13:13:19 [billroberts]
13:13:24 [kerry]
topic: Terminology for 'subsets' of coverage datasets (Action 152)
13:13:42 [kerry]
13:14:05 [kerry]
bill: kerry does not like 'subsetting' but I don't mind either way
13:14:27 [kerry]
... "extract" or something comes up all the time
13:14:35 [kerry]
... minimise misunderstandings
13:15:07 [billroberts]
kerry: there was a raging debate on mailing list. Most people don't mind
13:15:15 [billroberts]
suggestions: extract, filter, ...
13:15:48 [kerry]
<kerry summarises email discussion>
13:16:03 [kerry]
13:16:18 [billroberts]
ack q
13:17:50 [billroberts]
ack kerry
13:17:54 [billroberts]
ack k
13:18:45 [kerry]
13:18:57 [billroberts]
ack k
13:22:10 [kerry]
Maik: notes some reasons to prefer extract as the more general term, but not too fussed
13:22:32 [kerry]
bill: thinks extract may be less confusing
13:23:18 [kerry]
Proposed: that we use "extract" as the main word in most places (and mention subsetting as used for the same thing when introducing)
13:23:25 [kerry]
13:23:40 [billroberts]
13:23:46 [kerry]
13:23:55 [Maik]
13:24:08 [dmitrybrizhinev]
13:24:09 [ScottSimmons]
13:24:15 [Duo]
13:24:21 [sam]
13:25:09 [kerry]
resolved: to encourage the use of "extract" as the main word in most places (and mention subsetting as used for same thing when introducing)
13:25:12 [billroberts]
13:25:26 [sam]
More verbose link with examples:
13:25:33 [billroberts]
Topic: ANU work on an ontology for earth observation data
13:25:34 [kerry]
topic: Sam Toyer: ANU work on an ontology for representing earth observation data as Linked Data (see )
13:26:11 [dmitrybrizhinev]
no I can't hear
13:26:36 [sam]
sorry, not sure what's going on with phone
13:26:37 [Duo]
should I introduce things while he gets that sorted?
13:26:42 [sam]
yes please
13:26:51 [dmitrybrizhinev]
13:28:35 [kerry]
Duo: 2 key points: using dggs for data (landsat data)
13:28:45 [billroberts]
DGGS: Discrete Global Grid System
13:28:51 [kerry]
....stores geospatial data in a standardised format
13:29:17 [kerry]
...looking to put it into an rdf datacube using fuseki triple store and elda api
13:29:39 [kerry]
... dmitry is developing an ontology inspired by coveragejson
13:30:27 [kerry]
Dmitry: I have been writing the coveragejson spec in owl
13:30:37 [kerry]
...see the posted example and you can see it in rdf
13:30:57 [kerry]
...lets you define axes and link them to a crs, and link values to some other meaning
13:31:04 [kerry]
...this is the way coveragejson does it
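As a point of reference for the discussion above, a minimal sketch of a CoverageJSON-style grid domain, built here as a plain Python dict: the axes carry coordinate values and the "referencing" entries link axes to a CRS. The coordinate values and CRS URI are illustrative, not data from this meeting.

```python
import json

# Illustrative CoverageJSON-style domain (not real Landsat data):
# axes hold coordinate values; "referencing" links axes to a CRS.
domain = {
    "type": "Domain",
    "domainType": "Grid",
    "axes": {
        "x": {"values": [-10.0, -9.5, -9.0]},
        "y": {"values": [50.0, 50.5]},
        "t": {"values": ["2016-04-06T13:00:00Z"]},
    },
    "referencing": [
        {
            "coordinates": ["x", "y"],
            "system": {
                "type": "GeographicCRS",
                "id": "http://www.opengis.net/def/crs/OGC/1.3/CRS84",
            },
        },
        {
            "coordinates": ["t"],
            "system": {"type": "TemporalRS", "calendar": "Gregorian"},
        },
    ],
}

print(json.dumps(domain, indent=2))
```

This is the structure an OWL rendering of the spec would have to capture: axes as resources, each tied to a CRS, with values linked to a parameter meaning.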
13:31:35 [kerry]
13:31:47 [sam]
is this working?
13:31:59 [dmitrybrizhinev]
nope, just echoes
13:32:20 [kerry]
<problems with sound>
13:32:30 [sam]
sure, that works. I only have a little bit to say.
13:32:54 [sam]
my part of the project is to build the API which will be used by the client app to access our satellite data
13:33:07 [sam]
I think Duo explained some of that before (Fuseki + Elda)
13:33:24 [kerry]
maik: interesting to see coveragejson moving this way
13:33:31 [kerry]
...what is the main motivation?
13:33:37 [sam]
We've been trying to encode our data as RDF, but expose the service as a simple REST-ish API (at Kerry's suggestion)
13:34:08 [kerry]
dmitrybrizhinev: seemed to be a good way to organise the data -- something like rdf data cube but more efficient than rdf datacube
13:34:29 [kerry]
maik: we come from the netcdf direction and just want a little bit of linking..
13:34:36 [sam]
At the moment, I'm mostly interested in the group's feedback on (1) the suitability of SPARQL vs. REST-ish API from web developers' perspective and (2) best format for delivering data (JSON-LD, RDF/JSON, etc.)
13:34:38 [kerry]
...do you want to use it
13:34:58 [sam]
(/end comments)
13:35:03 [kerry]
dmitrybrizhinev: exactly how it would be used is not really clear
13:35:31 [kerry]
...assuming that something a bit like the datacube would be useful...
13:35:56 [Duo]
13:36:06 [kerry]
billroberts: linked data and rdf in general offers the ability to link to anything, becuase everything gets an identifier
13:36:22 [kerry]
...every observation, datapoint, has a URI, so you can say stuff about it
13:37:07 [kerry]
...other reason is you can combine data eg by sparql queries over one or several triple stores
13:37:20 [kerry]
... one aspect is http, another is standardisation
13:37:49 [kerry]
....depends on who wants to use the data and the tools they are used to
13:38:05 [kerry]
... works very well for metadata and also provenance of data processing
13:38:33 [kerry]
.... my first thought on seeing the rdf here is that the numbers may need a concise microformat....
13:38:50 [kerry]
dmitrybrizhinev: that is what coveragejson does -- or could it even be a binary file
13:39:28 [kerry]
billroberts: my first reaction is that then there is not a lot of point in using rdf -- may be the worst of both worlds
13:39:52 [kerry]
dmitrybrizhinev: do you have a suggestion? this has been discussed many times -- it's too much, the space explodes
13:40:23 [kerry]
...what if it was an rdf list, and json-ld can encode into a json array
13:40:34 [kerry]
...would this be the best of both worlds?
13:41:11 [kerry]
billroberts: even people that like rdf hate rdf lists...
13:41:28 [kerry]
...maybe an approach like this linking to data in another rep would do...
13:41:32 [kerry]
13:42:01 [kerry]
... and link to a separate url to return json in a file or something
13:42:31 [kerry]
.... could be more like coveragejson -- could add metadata in rdf while using json for the numbers
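Dmitry's suggestion above leans on the JSON-LD "@container": "@list" keyword, which lets a plain JSON array stand for an RDF list on the wire. A minimal sketch, using a made-up ex: vocabulary and illustrative values:

```python
import json

# Hypothetical JSON-LD document: with "@container": "@list" in the context,
# a JSON-LD processor interprets the plain JSON array under "values" as an
# ordered RDF list, so the serialization stays a compact array.
# The ex: vocabulary and the numbers are invented for illustration.
doc = {
    "@context": {
        "ex": "http://example.org/coverage#",
        "values": {"@id": "ex:values", "@container": "@list"},
        "parameter": "ex:parameter",
    },
    "@id": "ex:range1",
    "parameter": "air_temperature",
    "values": [280.1, 280.4, 281.0, 279.8],
}

# The numbers travel as a plain JSON array, not rdf:first/rdf:rest triples.
compact = json.dumps(doc["values"], separators=(",", ":"))
print(compact)
```

The trade-off discussed in the meeting is visible here: consumers reading this as JSON get a cheap array, while RDF tooling that expands the document materialises the list structure, which is where Bill's "even people that like rdf hate rdf lists" concern comes in.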
13:42:40 [billroberts]
ack duo
13:42:46 [kerry]
dmitrybrizhinev: yes, cannot see rdf for terabyte datasets
13:43:23 [kerry]
Duo: melodies
13:43:29 [Maik]
important: people don't fetch terabytes anyway, always just small parts, it's all about the API
13:43:40 [kerry]
billroberts: yes coveragejson is a product of the melodies project
13:44:01 [billroberts]
ack kerry
13:44:06 [kerry]
duo: looking at <tiles?..> and client applications
13:44:49 [billroberts]
kerry: is coverageJSON metadata sufficient to describe a specific data point? or is it just metadata for a whole dataset or large part of a coverage?
13:45:09 [dmitrybrizhinev]
Yes, this was a suggestion before - that there can be a clear distinction between the way the data is stored and the way it is represented in response to a query
13:47:36 [billroberts]
kerry: if coveragejson provides a way to uniquely identify any element in the data, that should be sufficient
13:48:18 [billroberts]
kerry: could make a URL pattern that allows identifying an extract using that
13:48:37 [billroberts]
kerry: do we then have a sufficiently fine-grained way of identifying 'chunks'
13:48:47 [billroberts]
Topic: criteria for assessing potential solution
13:50:06 [kerry]
Maik: if you want to identify a single datapoint that would be a combination of a parameter plus a domain index (e.g. time)
13:50:18 [kerry]
...this could be put into a url
13:50:20 [Maik]
13:50:48 [kerry]
...index based subsetting, but sometimes people want coordinates instead...
13:51:04 [kerry]
... some apis always use coordinates and not indices
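The two addressing styles Maik contrasts above can be sketched as URL templates. The endpoint, path layout, and parameter names below are hypothetical, just to show the shape of an index-based versus a coordinate-based reference to the same datapoint:

```python
from urllib.parse import urlencode

# Hypothetical coverage endpoint -- not a real ANU or WG service.
BASE = "http://example.org/coverages/landsat"


def by_index(parameter, **indices):
    """Index-based reference: a parameter name plus integer positions
    on each domain axis (Maik's parameter + domain index)."""
    return f"{BASE}/{parameter}?{urlencode(sorted(indices.items()))}"


def by_coords(parameter, **coords):
    """Coordinate-based reference: a parameter name plus coordinate
    values, for APIs that never expose grid indices."""
    return f"{BASE}/{parameter}?{urlencode(sorted(coords.items()))}"


print(by_index("air_temperature", x=10, y=4, t=0))
print(by_coords("air_temperature", lat=50.5, lon=-9.0,
                time="2016-04-06T13:00:00Z"))
```

Either style yields a stable URI for a datapoint or extract, which is what Kerry's point about uniquely identifying any element in the data requires; the later criterion that the reference not embed a query language argues for keeping these templates opaque to the storage technology behind them.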
13:51:18 [kerry] do we assess what is good an what is not
13:51:45 [kerry]
...it should not include any type of query language so even if you change underlying technology the reference does not change
13:52:34 [kerry]
billroberts: the API or method of identifying an extract should be independent of implementation
13:52:53 [kerry]
.... but this is spatial data on the web, so needs to be http and uris
13:53:01 [kerry]
13:53:31 [kerry]
...needs to be "simple" whatever that is -- needs to be easy for some community of data users
13:54:10 [kerry]
....we are agreeing some kind of exchange language between people who have a lot of coverage data and some people on the web who need it
13:54:44 [kerry]
Maik: e.g. like leaflet, always lat/long, no other projections -- so even if dataset is not stored that way it should be usable that way
13:55:19 [kerry]
... should offer an API based on lat/longs, so you don't need to know how to do british national grids
13:55:31 [kerry]
billroberts: yes, probably wgs84
13:56:00 [kerry]
billroberts: the data manager should take charge of conversion between grid space and user CRS
13:57:01 [kerry]
....we want something that will persist for a while... needs to be not too closely tied to specific things
13:57:26 [kerry]
billroberts: we want something that is not too verbose because data is large and we need to transfer it in a finite amount of time
13:57:38 [Maik]
13:57:54 [kerry]
....browsers will run out of resources (time and space)
13:58:21 [kerry]
maik: having lots of examples and tools available e.g. plugins, libraries
13:58:49 [kerry]
billroberts: ANU work is looking at data through an API plus something that is consuming it
13:59:01 [kerry]
13:59:15 [kerry]
....would like a document of what works well and what does not
13:59:47 [kerry]
billroberts: will develop a straw man set of criteria
14:00:23 [kerry]
billroberts: would like examples with real data, as well as simple illustrations
14:01:01 [kerry]
billroberts: reminder that you are encouraged to edit pages on working group wiki to share information and documents for discussion
14:01:25 [kerry]
... strengths and weaknesses
14:01:32 [kerry]
duo: yes we can do that in two weeks
14:02:15 [kerry]
action: Duo to write up what has been learned on the wiki in 2 weeks
14:02:15 [trackbot]
Error finding 'Duo'. You can review and register nicknames at <>.
14:02:51 [kerry]
rrsagent, make logs public
14:02:58 [kerry]
rrsagent, draft minutes
14:02:58 [RRSAgent]
I have made the request to generate kerry
14:04:10 [billroberts]
trackbot, end meeting
14:04:10 [trackbot]
Zakim, list attendees
14:04:11 [Zakim]
As of this point the attendees have been ScottSimmons, Kerry, Maik, sam, billroberts, duo, dmitrybrizhinev
14:04:18 [trackbot]
RRSAgent, please draft minutes
14:04:18 [RRSAgent]
I have made the request to generate trackbot
14:04:19 [trackbot]
RRSAgent, bye
14:04:19 [RRSAgent]
I see 1 open action item saved in :
14:04:19 [RRSAgent]
ACTION: Duo to write up what has been learned on the wiki in 2 weeks [1]
14:04:19 [RRSAgent]
recorded in