See also: IRC log
<scribe> scribe: phila
<scribe> scribeNick: phila
<billroberts> minutes of last meeting: https://www.w3.org/2016/04/06-sdwcov-minutes
PROPOSED: Accept last week's minutes
<billroberts> +1
<kerry> +1
<sam> +1
<Maik> +1
<jtandy> +0 - not present
<Duo> +1
RESOLUTION: Accept last week's minutes
<billroberts> patent call: https://www.w3.org/2015/spatial/wiki/Patent_Call
<billroberts> https://www.w3.org/2015/spatial/wiki/Meetings:Coverage-Telecon20160420
billroberts: First thing is to
    run through what we did last time
    ... Discussed our preferred terminology and decided that
    'extracts' was our favourite term for subsets.
    ... Heard about ANU's work on something similar to, but not the
    same as, CoverageJSON
    ... Looked at criteria for deciding how to choose solutions
<billroberts> https://www.w3.org/2015/spatial/wiki/Coverage_Solution_Criteria
billroberts: Made some rough notes on that page to open the topic
(Sorry, scribe distracted)
billroberts: Talking about what
    we have already
    ... obvious question to raise, why not just use what already
    exists.
    ... So if we think WCS is a good solution, job done
    ... Bullet point notes in that page
What makes for 'web-friendly' ? Some initial ideas for discussion:
accessible over HTTP
follows the relevant parts of the SDW Best Practices
API follows patterns familiar to web data users and web developers?
can link to it, and can link from it to other relevant things
practical to receive data using HTTP over the network - extracts, option to get data in chunks etc
play nicely with web mapping tools
practical for a data owner to implement and operate a server following the standard
only a finite list of pre-prepared extracts available? (simple, quick, not very flexible) or allow arbitrary extracts to be requested and rely on the server to generate an appropriate response (flexible, but complex to implement and there may be performance challenges) - see the sketch below
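(Scribe: a sketch of the two styles in that last point as URL patterns - the paths and parameters are invented for illustration:)

    # Pre-prepared extracts: a finite, discoverable list
    GET /coverages/sst-2016/extracts              -> list the available extracts
    GET /coverages/sst-2016/extracts/europe-jan   -> fetch one pre-built extract

    # Arbitrary extracts: the server computes the subset on demand
    GET /coverages/sst-2016/extract?bbox=-10,40,0,50&time=2016-01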
<Zakim> kerry, you wanted to comment on points when ready
kerry: First of all, thank you,
    Bill. Can't immediately see anything that's missing.
    ... Slight concern over playing nicely with Web mapping
    tools
    ... that might contradict some of what you said. Suppose there
    is no mapping tool at the moment that can consume what we think
    is best
billroberts: Fair point
<Zakim> jtandy, you wanted to note "play nicely in common user-agents"
jtandy: I wanted to suggest that
    rather than playing nicely with web mapping tools, it's data in
    a format that can work in most user agents which generally
    means a JavaScript engine
    ... So I'd replace Web mapping with common user agents.
billroberts: There's a balance
    between making it easy to use by a community familiar with a
    toolset and making it too brittle for when the toolset
    changes.
    ... XML was the solution to everything 10 years ago, now it's
    JSON. 10 years' time?
<Zakim> jtandy, you wanted to ask if we're talking about data [encoding] or APIs to access the data or both
jtandy: When we were talking in
    the CEO-LD environment, we noted that there are 2 parts.
    ... One is making coverage data available in a JS engine, and
    there's the idea of grabbing a chunk of data to process
    locally.
<Maik> just format or also API
jtandy: Are we defining an API
    that allows us to work with coverage data, request it etc,
    or...
    ... just the format that we might give the data in for the UA
    to process
billroberts: Both I think, there's the API and what you get in response.
jtandy: So in the Coverage JSON work that Maik and Jon have done, they have the Coverage JSON format and the API for requesting it.
billroberts: My feeling is that both are in scope for us.
kerry: I agree with the comment
    on playing nicely with a browser
    ... The list of reqs sounds like a RESTful API to me
billroberts: I take your point
    about browser vs. web mapping tools. I'd say user agent
    ... I'd be interested to hear from people who have been using
    it, what people think of some of the existing solutions like
    OPeNDAP and WCS
    ... Why not just use WCS?
<Zakim> jtandy, you wanted to suggest that an API should consider mechanisms to avoid overloading a user-agent and to
jtandy: I'll skip the issue about
    overloading the user agent.
    ... On WCS...
WCS does a number of things. It's a combination of an API spec, a conceptual data model for how to package the info, and a number of binary encodings associated with it.
scribe: It's hard to unpick
    those
    ... You end up with a lot of XML
jtandy: The API has become very
    complicated as it deals with the edge cases of an expert
    community.
    ... It's complicated because it tried to do everything.
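(Scribe: for context, a typical WCS 2.0 KVP GetCoverage request - the server URL and coverage id here are made up:)

    http://example.org/wcs?service=WCS&version=2.0.1
        &request=GetCoverage
        &coverageId=sea_surface_temperature
        &subset=Lat(40,50)&subset=Long(-10,0)
        &format=image/tiff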
billroberts: I'm sure we're not going to say there's anything wrong with WCS
ScottSimmons: Trying to avoid the
    conflict of interest... some of the extensions are making it
    more accessible. Transactions, for example.
    ... And there's some clever work done on an XPath extension.
    Likely to become a discussion soon.
    ... There's an evolutionary path ahead; editors are considering
    simplification.
billroberts: There's a lifecycle
    with a standard. Things start simple and get more complex as
    more edge cases are uncovered. Then it gets too complex, so
    people start again
    ... I think we should try and stick to what's simple. Optimise
    for simplicity over capability?
<jtandy> +1 for optimising simplicity
<Maik> [made it to the gate without dropping wifi/audio connection!]
<Zakim> jtandy, you wanted to note that the conceptual model for a coverage resource forms the basis of CoverageJSON
jtandy: In terms of WCS, there
    has been some really good work in OGC taking a conceptual
    object... things like domain, range and rangeset
    ... largely that's the basis of the Coverage JSON encoding that
    Jon and Maik have been working on. That work has an
    evolutionary path
    ... There seems to be a desire to allow Coverage JSON to become
    one of the encodings that WCS supports.
    ... The work that Jon and Maik have done is based on how
    developers use JSON, but if you just take the OGC model and
    encode it in JSON directly, it looks weird.
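(Scribe: for illustration, a minimal CoverageJSON-style fragment showing the domain/parameter/range split jtandy describes - the values are invented; see the CoverageJSON spec for the authoritative structure:)

    {
      "type": "Coverage",
      "domain": {
        "type": "Domain",
        "domainType": "Grid",
        "axes": {
          "x": { "values": [-10, -5, 0] },
          "y": { "values": [40, 50] },
          "t": { "values": ["2016-04-20T00:00:00Z"] }
        }
      },
      "parameters": {
        "TEMP": {
          "type": "Parameter",
          "observedProperty": { "label": { "en": "Sea surface temperature" } },
          "unit": { "symbol": "K" }
        }
      },
      "ranges": {
        "TEMP": {
          "type": "NdArray",
          "dataType": "float",
          "axisNames": ["t", "y", "x"],
          "shape": [1, 2, 3],
          "values": [285.1, 285.3, 285.6, 284.2, 284.5, 284.9]
        }
      }
    }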
<Maik> I agree to everything you said
billroberts: Jon and Maik can't
    speak here today.
    ... If we're talking about one specific community, then maybe
    Coverage JSON is a good starting point.
    ... Not sure what's involved in making CJ an official format of
    WCS.
    ... So can we develop that list I started with?
<jtandy> +1
billroberts: I'd also like to make sure we follow our own WG's Best Practices.
<Zakim> jtandy, you wanted to ask Scott if he heard this beginning in Washington TC?
<Maik> haven't heard anything from the OGC meeting
jtandy: I think the Washington TC was happening a week after we were together - was there a conversation with Peter Baumann?
ScottSimmons: I heard the
    rumours, but not of any progress.
    ... Peter's interested but apparently is triple booked.
jtandy: So there may be an opportunity to progress that discussion in Dublin (the next TC)
<billroberts> phila: if coverageJSON ends up being a WCS encoding that's great. Will it be linkable enough to count as linked data?
<jtandy> +1
<Maik> +1 :p
<ScottSimmons> +1
<Maik> we have defined "id" fields for most things where you are supposed to put in URIs
billroberts: From what I've seen... there clearly have been efforts in the CJ approach to use LD style approaches for the metadata so that you have things to link to.
<Maik> you can give a URI to.... the coverage, the parameters, the abstract measured properties, the domain, ...
billroberts: If you have a URI
    then you can say things about it
    ... So that sounds like something we could take forward as being
    sufficiently linkable.
<Maik> it is linkable, it is just not hardcore RDF
<Maik> should be enough for most cases
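(Scribe: making Maik's point concrete - a sketch of where "id" URIs can sit in the metadata; all URIs here are invented:)

    {
      "type": "Coverage",
      "id": "http://example.org/coverages/sst-2016-04",
      "parameters": {
        "TEMP": {
          "type": "Parameter",
          "id": "http://example.org/params/sst",
          "observedProperty": {
            "id": "http://vocab.example.org/def/seaSurfaceTemperature",
            "label": { "en": "Sea surface temperature" }
          }
        }
      }
    }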
billroberts: Anything more on this topic just now?
<Maik> have to board the plane
<Maik> thanks!
<Duo> https://github.com/ANU-Linked-Earth-Data/main-repo/wiki/Coverage-Recommendation-20-4-16
Duo: We weren't able to access
    the W3C wiki
    ... So it's currently on this wiki
    ... Hopefully Sam can talk about what we're going to do with
    the Data Cube
sam: After the last meeting we
    looked back over what we've done with our Coverage JSON so far.
    More or less converting it to RDF.
    ... You can't use SPARQL to just fetch the bit you want
    ... You have to fetch the whole lot and process it
    ... So we tried RDF Data Cube. It's very verbose, yes, but it's
    very flexible with a URL for each observation
    ... The huge space explosion is a problem, yes. We have a
    prototype that is looking at generating RDF triples on the fly
    that you don't have to store
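(Scribe: a sketch of the kind of subsetting query this approach enables - qb: is the RDF Data Cube vocabulary; the ex: terms are hypothetical:)

    PREFIX qb:  <http://purl.org/linked-data/cube#>
    PREFIX ex:  <http://example.org/vocab#>
    PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>

    # Fetch just the observations for one grid cell and time window,
    # rather than downloading and processing the whole coverage.
    SELECT ?time ?value WHERE {
      ?obs a qb:Observation ;
           ex:lat  -35.28 ;
           ex:lon  149.13 ;
           ex:time ?time ;
           ex:reflectance ?value .
      FILTER (?time >= "2016-01-01T00:00:00Z"^^xsd:dateTime)
    }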
billroberts: That sounds
    sensible, you can use RDF when you need to but not when you
    don't.
    ... What struck me is cases where we're not talking about
    gridded coverages
    ... Gridded is always more compact
    ... A point cloud needs to list all the coordinates
    ... One of the things you talked about last time was a client
    app that can consume this data.
<Duo> +1
billroberts: Anything to report on that front?
Jo: We have changed everything
    around, so we're starting from scratch now. Hoping to get a new
    prototype out in the next few weeks.
    ... Ontology, client App, data etc.
    ... Should be able to query a select set of arguments that you
    can provide. A time frame, a set of coordinates etc.
billroberts: That sounds very interesting
<Zakim> jtandy, you wanted to ask if there is a requirement to treat the coverage _data_ as RDF triples?
<jtandy> graph centric - just because you can encode coverage data in RDF, should you?
jtandy: Talking about encoding
    coverage data as RDF - just because we can, should we? What is
    the requirement?
    ... When I look at the use of RDF, it's to create graphs of
    info; I can stitch multiple sources together.
    ... But when we talk about geometry objects, we want the whole
    thing, not individual coordinates.
    ... Do we need to just be able to query the blob?
    ... Is there really a need for the range data as triples?
billroberts: The kind of use
    cases I think of, we want to be able to relate things like land
    cover, and look at what crops farmers are growing.
    ... An ID might be for a parcel that is receiving a grant or
    whatever.
    ... You want to see from the satellite what crop is actually
    being grown.
    ... It may be that your coverage has 100 points in that
    polygon, or some sort of derived figure
    ... Any solutions springing to mind?
jtandy: I think that's a good
    example. You have vector geometry intersecting with the
    coverage, you have statistical data...
    ... It ticks a lot of complicated boxes.
    ... And probably allows us to explore whether there are
    benefits in encoding coverage values as RDF.
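(Scribe: a sketch of Bill's parcel/crop example - geo: and geof: are standard GeoSPARQL terms; the parcel URI and ex:cropType are invented:)

    PREFIX qb:   <http://purl.org/linked-data/cube#>
    PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
    PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
    PREFIX ex:   <http://example.org/vocab#>

    # Which coverage points fall inside parcel 123, and what crop
    # class does the satellite-derived coverage give each of them?
    SELECT ?obs ?crop WHERE {
      <http://example.org/parcel/123> geo:hasGeometry/geo:asWKT ?parcelWkt .
      ?obs a qb:Observation ;
           geo:hasGeometry/geo:asWKT ?pointWkt ;
           ex:cropType ?crop .
      FILTER (geof:sfWithin(?pointWkt, ?parcelWkt))
    }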
<kerry> +q
kerry: A similar example -
    working with statisticians. Models that would look like the
    data cube model. Modulo the scale problem, modelling things
    like Earth Obs data is conceptually similar.
    ... So yes about the geometry. Remember we need to encode time
    series data. We have SSN as part of the picture.
    ... So there's a natural fit that I think is worth
    investigating.
    ... Instead of getting your head around a very different model,
    which is what I see Coverage JSON as being. I don't think it's
    a natural data model to be working with for our targets.
    ... This has been done before with Earth Obs data - some work
    from Münster. This is pre-RDF Data Cube model
    ... their EO data - not a lot of it - but I find it pretty
    convincing in terms of usability.
billroberts: A lot of my work is
    around stat data as QB
    ... the potential of that is about comparing different data
    sets about the same place.
    ... So I certainly see value in relating EO data to that
    model.
    ... The challenge is gridded data cf. data-backed
    polygons
    ... If you're picking pixels off a picture and trying to work
    out which ones are relevant for my town, that's hard.
    ... Can we pick some use cases that we think are
    representative?
billroberts talks more about orders of magnitude and sense, or not, of using RDF for the range data
<Zakim> jtandy, you wanted to ask whether we should distinguish between (non)expert users
jtandy: One thing I think we're
    striving to do is to make the data more accessible. WCS queries
    are difficult. I imagine it would be hard to build a query for
    a QB
    ... one of the BPs we have is that people should expose simple
    APIs that are easy to use as well as arbitrary query
    endpoints.
    ... Should we look at that too?
billroberts: Yes.
jtandy: The good old-fashioned bathing water data... all of the data is in QB. They have exposed it through a series of RESTful endpoints for easy browsing.
kerry: If we go towards Coverage
    JSON, we should also connect it to an ontology model, similar
    to what Dimitri was presenting a couple of weeks ago.
    ... That ontology model as JSON-LD might be worth
    exploring.
    ... I don't want to limit ourselves to JSON without JSON-LD as
    well.
<jtandy> +1 - I think that the intent was to be able to expose coverageJSON as JSONLD
<jtandy> ... or at least the metadata
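(Scribe: kerry's suggestion in miniature - a hypothetical JSON-LD context that would map CoverageJSON metadata keys onto vocabulary terms so the metadata parses as RDF; the example.org vocab is invented:)

    {
      "@context": {
        "id": "@id",
        "label": { "@id": "http://www.w3.org/2000/01/rdf-schema#label" },
        "observedProperty": { "@id": "http://example.org/vocab#observedProperty", "@type": "@id" },
        "unit": { "@id": "http://example.org/vocab#unit", "@type": "@id" }
      }
    }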
kerry: When Sam was speaking, he had a simple SPARQL query. What he's done is more like LD than just JSON. There's nothing complex about that SPARQL query.
<billroberts> https://www.w3.org/2015/spatial/wiki/CoverageJSON_/_REST_API_Reflection
<billroberts> https://www.gitbook.com/book/reading-escience-centre/coveragejson-cookbook/details
billroberts: I'll have proposals for next time
<jtandy> thanks Bill!
<billroberts> thanks everyone, sorry for abrupt end