13:02:33 RRSAgent has joined #sdwcov
13:02:33 logging to http://www.w3.org/2016/04/06-sdwcov-irc
13:02:35 RRSAgent, make logs world
13:02:35 Zakim has joined #sdwcov
13:02:37 Zakim, this will be SDW
13:02:37 ok, trackbot
13:02:38 Meeting: Spatial Data on the Web Working Group Teleconference
13:02:38 Date: 06 April 2016
13:02:45 ah hi Phil - we're stuck with the webex
13:03:03 being asked for MIT certificate
13:03:38 will try, 2 secs
13:03:49 RRSAgent, make logs public
13:04:02 present+ ScottSimmons
13:04:07 zakim, code?
13:04:07 I have been told this is SDW
13:04:31 zakim, this is SDW Weekly https://mit.webex.com/mit/j.php?MTID=m4ecb967bb70c02dedc9eb66013281084, Access code 642 889 345, password sdw
13:04:31 got it, phila_mtg
13:04:35 present+ Kerry
13:04:38 yes, success - many thanks Phil
13:04:39 present+ Maik
13:04:42 zakim, save this description
13:04:42 this conference description has been saved, phila_mtg
13:05:07 present+ sam
13:05:11 present+ billroberts
13:05:20 regrets+ Lewis
13:05:24 present+ duo
13:05:27 regrets+ phila
13:05:49 present+ dmitrybrizhinev
13:06:56 regrets+ lewis
13:07:03 regrets+ eparsons
13:07:17 I can do it if you like!
13:07:46 scribe: kerry
13:07:52 scribeNick: kerry
13:08:02 chair: Bill
13:08:28 topic: patent call https://www.w3.org/2015/spatial/wiki/Patent_Call
13:09:08 propose: approve minutes https://www.w3.org/2016/03/23-sdwcov-minutes
13:09:23 +1
13:09:26 +1
13:10:01 +1
13:10:29 resolved: approve minutes https://www.w3.org/2016/03/23-sdwcov-minutes
13:11:05 topic: Brief recap of previous meeting
13:11:45 bill: reviewed requirements, talked about subsets and web-friendly formats, reviewed the data on the web view
13:11:55 ...with some members of that group on the call
13:12:12 ...large portions of coverage data are gridded (but not all)
13:12:29 ...so sections on gridded datasets can be defined fairly easily
13:12:52 ...will work on grids first, and also look into non-gridded coverages for important cases
13:12:59 q?
13:13:19 https://www.w3.org/2015/spatial/track/actions/152
13:13:24 topic: Terminology for 'subsets' of coverage datasets (Action 152)
13:14:05 bill: kerry does not like "subsetting" but I don't mind either way
13:14:27 ..."extract" or something similar comes up all the time
13:14:35 ...aim is to minimise misunderstandings
13:15:07 kerry: there was a raging debate on the mailing list. Most people don't mind
13:15:15 suggestions: extract, filter, ...
13:16:03 q+
13:16:18 ack q
13:17:50 ack kerry
13:17:54 ack k
13:18:45 q+
13:18:57 ack k
13:22:10 Maik: notes some reasons to prefer extract as the more general term, but not too fussed
13:22:32 bill: thinks extract may be less confusing
13:23:18 Proposed: that we use "extract" as the main word in most places (and mention subsetting as used for the same thing when introducing it)
13:23:40 +1
13:23:46 +1
13:23:55 +1
13:24:08 +1
13:24:09 +1
13:24:15 0
13:24:21 +1
13:25:09 resolved: to encourage the use of "extract" as the main word in most places (and mention subsetting as used for the same thing when introducing it)
13:25:12 https://github.com/ANU-Linked-Earth-Data/ontology
13:25:26 More verbose link with examples: https://github.com/ANU-Linked-Earth-Data/main-repo/wiki/NotesForCoveragesSubgroupApril06
13:25:34 topic: Sam Toyer: ANU work on an ontology for representing earth observation data as Linked Data (see https://github.com/ANU-Linked-Earth-Data/ontology )
13:26:11 no I can't hear
13:26:36 sorry, not sure what's going on with phone
13:26:37 should I introduce things while he gets that sorted?
13:26:42 yes please
13:26:51 yes
13:28:35 Duo: 2 key points: using DGGS for data (Landsat data)
13:28:45 DGGS: Discrete Global Grid System
13:28:51 ...stores geospatial data in a standardised format
13:29:17 ...looking to put it into an RDF Data Cube using the Fuseki triple store and the Elda API
13:29:39 ...Dmitry is developing an ontology inspired by CoverageJSON
13:30:27 Dmitry: I have been writing the CoverageJSON spec in OWL
13:30:37 ...see the posted example and you can see it in RDF
13:30:57 ...lets you define axes and link them to a CRS, and link values to some other meaning
13:31:04 ...this is the way CoverageJSON does it
13:31:47 is this working?
13:31:59 nope, just echoes
13:32:30 sure, that works. I only have a little bit to say.
13:32:54 my part of the project is to build the API which will be used by the client app to access our satellite data
13:33:07 I think Duo explained some of that before (Fuseki + Elda)
13:33:24 maik: interesting to see CoverageJSON moving this way
13:33:31 ...what is the main motivation?
13:33:37 We've been trying to encode our data as RDF, but expose the service as a simple REST-ish API (at Kerry's suggestion)
13:34:08 dmitrybrizhinev: seemed to be a good way to organise the data -- something like RDF Data Cube but more efficient than RDF Data Cube
13:34:29 maik: we come from the netCDF direction and just want a little bit of linking...
13:34:36 At the moment, I'm mostly interested in the group's feedback on (1) the suitability of SPARQL vs. a REST-ish API from web developers' perspective and (2) the best format for delivering data (JSON-LD, RDF/JSON, etc.)
13:34:38 ...how do you want to use it?
13:34:58 (/end comments)
13:35:03 dmitrybrizhinev: exactly how it would be used is not really clear
13:35:31 ...assuming that something a bit like the Data Cube would be useful...
13:35:56 q+
13:36:06 billroberts: linked data and RDF in general offer the ability to link to anything, because everything gets an identifier
13:36:22 ...every observation, every data point, has a URI, so you can say things about it
13:37:07 ...the other reason is you can combine data, e.g. by SPARQL queries over one or several triple stores
13:37:20 ...one aspect is HTTP, another is standardisation
13:37:49 ...depends on who wants to use the data and the tools they are used to
13:38:05 ...works very well for metadata and also for provenance of data processing
13:38:33 ...my first thought on seeing the RDF here is that the numbers may need a concise microformat...
13:38:50 dmitrybrizhinev: that is what CoverageJSON does -- or could it even be a binary file
13:39:28 billroberts: my first reaction is that then there is not a lot of point in using RDF; it may be the worst of both worlds
13:39:52 dmitrybrizhinev: do you have a suggestion? this has been discussed many times -- it's too much, the space explodes
13:40:23 ...what if it was an RDF list, which JSON-LD can encode as a JSON array
13:40:34 ...would this be the best of both worlds? [see the JSON-LD sketch at the end of these minutes]
13:41:11 billroberts: even people that like RDF hate RDF lists...
13:41:28 ...maybe an approach like this, linking to the data in another representation, would do...
13:41:32 q+
13:42:01 ...and link to a separate URL to return JSON in a file or something
13:42:31 ...could be more like CoverageJSON -- could add metadata in RDF while using JSON for the numbers
13:42:40 ack duo
13:42:46 dmitrybrizhinev: yes, cannot see RDF for terabyte datasets
13:43:23 Duo: MELODIES
13:43:29 important: people don't fetch terabytes anyway, always just small parts, it's all about the API
13:43:40 billroberts: yes, CoverageJSON is a product of the MELODIES project
13:44:01 ack kerry
13:44:06 duo: looking at ... and client applications
13:44:49 kerry: is CoverageJSON metadata sufficient to describe a specific data point? or is it just metadata for a whole dataset or a large part of a coverage?
13:45:09 Yes, this was a suggestion before - that there can be a clear distinction between the way the data is stored and the way it is represented in response to a query
13:47:36 kerry: if CoverageJSON provides a way to uniquely identify any element in the data, that should be sufficient
13:48:18 kerry: could make a URL pattern that allows identifying an extract using that
13:48:37 kerry: do we then have a sufficiently fine-grained way of identifying 'chunks'?
13:48:47 Topic: criteria for assessing potential solutions
13:50:06 Maik: if you want to identify a single data point, that would be a combination of a parameter plus a domain index (e.g. time)
13:50:18 ...this could be put into a URL [see the sketch below]
13:50:20 #x=1,y=2,t=2
13:50:48 ...index-based subsetting, but sometimes people want coordinates instead...
13:51:04 ...some APIs always use coordinates and not indices
13:51:18 ...how do we assess what is good and what is not?
13:51:45 ...it should not include any type of query language, so even if you change the underlying technology the reference does not change
13:52:34 billroberts: the API or method of identifying an extract should be independent of implementation
13:52:53 ...but this is spatial data on the web, so it needs to be HTTP and URIs
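[Illustrative sketch only, not an agreed convention: the kind of implementation-independent extract URI discussed here, with a hypothetical dataset URI and axis names.
    http://example.org/coverages/landsat-ndvi#x=1,y=2,t=2
        index-based: the value(s) at grid indices x=1, y=2, t=2, as in Maik's example
    http://example.org/coverages/landsat-ndvi#lat=-35.3,long=149.1,time=2016-04-06
        coordinate-based variant for clients that only work in lat/long
Because the identifier carries no query language, the same reference can keep working if the underlying storage technology changes.]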
13:53:31 ...needs to be "simple", whatever that is -- needs to be easy for some community of data users
13:54:10 ...we are agreeing some kind of exchange language between people who have a lot of coverage data and some people on the web who need it
13:54:44 Maik: e.g. like Leaflet, always lat/long, no other projections -- so even if the dataset is not stored that way it should be usable that way
13:55:19 ...you should offer an API based on lat/longs, so you don't need to know how to do British National Grid
13:55:31 billroberts: yes, probably WGS84
13:56:00 billroberts: the data manager should take charge of conversion between grid space and the user's CRS
13:57:01 ...we want something that will persist for a while... needs to be not too closely tied to specific things
13:57:26 billroberts: we want something that is not too verbose, because data is large and we need to transfer it in a finite amount of time
13:57:38 http://reading-escience-centre.github.io/covjson-playground/
13:57:54 ...browsers will run out of resources (time and space)
13:58:21 maik: having lots of examples and tools available, e.g. plugins, libraries
13:58:49 billroberts: ANU work is looking at data through an API plus something that is consuming it
13:59:15 ...would like a document of what works well and what does not
13:59:47 billroberts: will develop a straw man set of criteria
14:00:23 billroberts: would like examples with real data, as well as simple illustrations
14:01:01 billroberts: reminder that you are encouraged to edit pages on the working group wiki to share information and documents for discussion
14:01:25 ...e.g. strengths and weaknesses
14:01:32 duo: yes we can do that in two weeks
14:02:15 action: Duo to write up what has been learnt on the wiki, in 2 weeks
14:02:15 Error finding 'Duo'. You can review and register nicknames at .
14:02:51 rrsagent, make logs public
14:02:58 rrsagent, draft minutes
14:02:58 I have made the request to generate http://www.w3.org/2016/04/06-sdwcov-minutes.html kerry
14:04:10 trackbot, end meeting
14:04:10 Zakim, list attendees
14:04:11 As of this point the attendees have been ScottSimmons, Kerry, Maik, sam, billroberts, duo, dmitrybrizhinev
14:04:18 RRSAgent, please draft minutes
14:04:18 I have made the request to generate http://www.w3.org/2016/04/06-sdwcov-minutes.html trackbot
14:04:19 RRSAgent, bye
14:04:19 I see 1 open action item saved in http://www.w3.org/2016/04/06-sdwcov-actions.rdf :
14:04:19 ACTION: Duo to write up what has been learnt on the wiki, in 2 weeks [1]
14:04:19 recorded in http://www.w3.org/2016/04/06-sdwcov-irc#T14-02-15
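[A minimal JSON-LD sketch of the array idea Dmitry raises at 13:40 -- the namespace and property name are made up for illustration. A term defined with "@container": "@list" lets a plain JSON array stand in for an RDF list:
    {
      "@context": {
        "ex": "http://example.org/ns#",
        "values": { "@id": "ex:values", "@container": "@list" }
      },
      "@id": "http://example.org/coverage/1/range/temperature",
      "values": [274.1, 274.3, 273.9, 275.0]
    }
A plain JSON client just sees an array of numbers, while an RDF consumer expanding the document gets an ordered rdf:List -- the construct Bill cautioned about -- so this trades compactness in the JSON view against awkwardness in the triple-store view.]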