IRC log of BioRDF on 2007-12-10

Timestamps are in UTC.

16:02:04 [RRSAgent]
RRSAgent has joined #BioRDF
16:02:04 [RRSAgent]
logging to
16:02:08 [Zakim]
Zakim has joined #BioRDF
16:02:25 [AdrianP]
16:02:26 [dbooth]
Meeting: BioRDF Telecon
16:02:37 [dbooth]
Chair: Susie
16:03:36 [dbooth]
Present: Don, Olivier, Scott, Matthias, Adrian, Kei, Susie, DBooth
16:04:29 [kei]
kei has joined #BioRDF
16:05:01 [Susie]
16:05:31 [dbooth]
Topic: Adrian's Slides on Rule Responder Demo
16:05:36 [matthiassamwald]
sorry, nothing new about the SenseLab conversion, still waiting for CVS access to the W3C site.
16:05:56 [dbooth]
(Adrian goes through slides at )
16:06:22 [ericP]
Zakim, please dial ericP-cw
16:06:22 [Zakim]
sorry, ericP, I don't know what conference this is
16:06:31 [ericP]
Zakim, this is BioRdf
16:06:32 [Zakim]
ok, ericP; that matches SW_HCLS(BioRDF)11:00AM
16:06:35 [ericP]
Zakim, please dial ericP-cw
16:06:35 [Zakim]
ok, ericP; the call is being made
16:06:36 [Zakim]
16:09:29 [dbooth]
Present+ EricP
16:13:47 [ericP]
There are a lot of layers here. I wonder which are backed by use cases.
16:20:50 [dbooth]
EricP: This treats Prova as the unifying layer. Another approach would be to put RDF adaptors on all the data sources and use SPARQL as the unifying language.
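EricP's alternative (RDF adaptors on the data sources, with SPARQL as the unifying language) can be sketched as a single query spanning two adapted sources. Every prefix, predicate, and path below is invented for illustration; real adaptors would expose real schemas.

```shell
# Hypothetical sketch: one SPARQL query as the unifying layer over
# two RDF-adapted sources. All vocabulary here is made up.
cat <<'EOF' > /tmp/unify.rq
PREFIX go:   <http://example.org/geneontology#>
PREFIX slab: <http://example.org/senselab#>
SELECT ?receptor ?region
WHERE {
  ?receptor go:participatesIn ?process .   # pattern answered by source 1
  ?receptor slab:expressedIn ?region .     # pattern answered by source 2
}
EOF
grep -c PREFIX /tmp/unify.rq   # prints 2
```

The point of the sketch is that the join across sources lives in the query, not in a rule layer.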
16:22:16 [mscottm]
Another UK project, 'ComparaGrid', also uses rules to create views of (OWL) data.
16:30:50 [dbooth]
Kei: How does your rules language compare to RuleML and OWL?
16:31:28 [dbooth]
Adrian: Different family. SWRL is a subfamily of RuleML. ReactionRuleML is intended for reactive rules.
16:31:55 [dbooth]
... It can implement workflow-like systems like BPEL.
16:32:57 [dbooth]
... RuleML is more homogeneous integration, whereas ReactionRuleML is more heterogeneous. You can use external schema vocabularies to type the variables. You can have a certain OWL ontology and give a variable the type of one of its classes.
16:33:27 [mscottm]
Zakim, +003120416aaaa is M_Scott_Marshall
16:33:27 [Zakim]
+M_Scott_Marshall; got it
16:33:36 [dbooth]
Susie: Plans to integrate with HCLS knowledge base?
16:34:34 [dbooth]
Adrian: You can define conditional decision logic, and then encode additional decisions. You could also do transformational rules: pull the data, then update the HCLS knowledge base, though that would need SPARQL update, which doesn't yet exist.
16:35:11 [dbooth]
... If there are queries that always repeat in a certain binding -- author, patents, etc., -- then you can implement it in a declarative rule language.
16:35:19 [Susie]
zakim, +049351aabb is Adrian
16:35:19 [Zakim]
+Adrian; got it
16:35:42 [dbooth]
DBooth: What are you using this for?
16:36:41 [dbooth]
Adrian: We use it for virtual orgs, where the members are defined as a set of autonomous agents with their internal decision logic. Other apps are IT service mgmt, where you define quality contracts between services and you need to monitor conformance, and that is easy to do in this rule language.
16:37:03 [dbooth]
... In HCLS it could be used as a heterogeneous approach for a HCLS infrastructure.
16:37:33 [dbooth]
Adrian: Please let me know of suggestions for applying this in W3C.
16:37:51 [dbooth]
Topic: SenseLab Conversion Documentation
16:37:55 [AdrianP]
some links:
16:38:42 [dbooth]
Matthias: Still working with Eric to get it on W3C server. Need to update my invited expert status.
16:39:04 [dbooth]
EricP: I'm ready to do the next step, now that invited expert status is done.
16:39:18 [dbooth]
... Send me your SSH public key.
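Generating the public key EricP asks for is a one-liner with OpenSSH; the key type and output paths below are just examples (PuTTY users, as mentioned later in the log, would use PuTTYgen instead).

```shell
# Example only: generate a fresh RSA keypair non-interactively and show
# the public half, which is the part to send. Paths are placeholders.
rm -f /tmp/id_rsa_demo /tmp/id_rsa_demo.pub
ssh-keygen -q -t rsa -b 2048 -N "" -f /tmp/id_rsa_demo
cat /tmp/id_rsa_demo.pub
```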
16:40:09 [dbooth]
Topic: Note describing Knowledge Base
16:40:21 [dbooth]
Susie: Status?
16:40:29 [mscottm]
16:42:50 [dbooth]
Scott: I have some new items that were requested to be added, and I'll try to fill them in. Starting with the motivation, I look at it as conveying the steps and thoughts behind the design of this knowledge base, so that people can understand and reproduce it if desired -- ambitious. Added a lengthy intro, not sure it should be quite so long. We've also got more info about the incorporated databases. Would like to add discussion of how it needs to be
16:42:51 [dbooth]
... maintained, next steps for improvement, and a detailed section on 1-2 conversions to RDF from legacy datastores.
16:43:13 [ericP]
q+ to suggest that SenseLab is the conversion detail document
16:43:14 [dbooth]
... I've mailed to Alan to get input on which conversion to cover.
16:44:23 [dbooth]
... Added to the wish list: a list of current implementations and where to find them. Also a listing of schema classes and properties used. And an additional resources section in the appendix.
16:44:48 [dbooth]
... i'd like to ask Matthias for comments on next steps.
16:45:36 [dbooth]
... Also would like input on current doc structure, or if anyone wants to provide material. One tricky piece: explanation of how evidence is done, but I may have a doc from Alan that talks about it.
16:45:57 [dbooth]
EricP: Sometimes evidence is handled by provenance in the SPARQL query, and sometimes it's in the data itself.
16:46:41 [dbooth]
Scott: Yes, in some cases it's encoded in the properties.
16:47:23 [dbooth]
... There are a number of approaches that can be used to model evidence, and we're using more than one, but we think evidence is important, so we should at least describe the approaches taken.
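The two approaches Scott and EricP describe -- provenance carried in the query versus evidence encoded as properties on the data -- can be sketched in one SPARQL query. Every URI below is an invented example, not vocabulary from the actual knowledge base.

```shell
# Sketch of both evidence-modelling approaches in a single query.
# All names here are made up for illustration.
cat <<'EOF' > /tmp/evidence.rq
PREFIX ex: <http://example.org/>
SELECT ?claim ?source ?code
WHERE {
  # approach 1: provenance via the named graph the triple came from
  GRAPH ?source { ?claim ex:about ex:NMDAReceptor . }
  # approach 2: an evidence code stored as a property on the data itself
  OPTIONAL { ?claim ex:evidenceCode ?code . }
}
EOF
grep -q GRAPH /tmp/evidence.rq && echo written
```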
16:47:42 [dbooth]
Scott: Has everyone looked at it?
16:48:20 [dbooth]
EricP: I have. I had envisioned the SenseLab conversion as being the detailed description of importing more than one knowledge base, so we wouldn't have to do that in this document -- we'd just reference it.
16:48:26 [dbooth]
Susie: That makes sense to me.
16:48:29 [ericP]
16:48:49 [dbooth]
Scott: I suppose we'll link out to a separate doc.
16:49:12 [dbooth]
EricP: It would be nifty to have the motivated query incorporate some of the SenseLab data.
16:49:24 [dbooth]
Scott: Good plan. Do we have such a query?
16:49:33 [dbooth]
Matthias: Nothing we can show.
16:50:25 [dbooth]
ACTION: Kei to provide query that makes use of SenseLab graph
16:52:04 [matthiassamwald]
I will just write an e-mail. Arghl.
16:52:19 [matthiassamwald]
We cannot query our ontologies with SPARQL.
16:52:36 [matthiassamwald]
At least not in a very intuitive manner.
16:53:21 [dbooth]
EricP: I arbitrarily picked Banff query #2 for clarifying the motivations when someone is reading the doc. Any other query would be fine also if it touches on SenseLab.
16:54:33 [dbooth]
Scott: People involved with incorporating the data should review to see if I have left out anything. Could also use help on getting good use cases.
16:54:39 [matthiassamwald]
16:54:54 [dbooth]
... Was the DERI knowledge base sent out to the public?
16:55:00 [Zakim]
16:55:12 [matthiassamwald]
I'm in charge of that. If you have questions, please contact me at The web address is
16:55:42 [dbooth]
Don: It's unlikely ours will be ready in time.
16:55:58 [AdrianP]
we might also set up a mirror for the KB if there is a need
16:56:37 [matthiassamwald]
The problem is to keep the installations in sync.
16:56:44 [dbooth]
Scott: The main point is to provide links in this section to encourage people to try it out. We could list installations, and those that are publicly accessible could have URLs.
16:57:29 [matthiassamwald]
It is already hard enough to keep the DERI and Neurocommons KB in sync. With multiple mirrors, we need to set up some automated process.
16:57:30 [dbooth]
ACTION: Susie to ping Wright State University (Amit) to see if their database is up and available
16:58:20 [dbooth]
Susie: in Sec 5, have you been able to incorporate all of the databases in Jonathan's ReadMe?
16:59:25 [dbooth]
Scott: No. Some are listed multiple times (e.g., MeSH). We should just list each once, and maybe refer out to Jonathan's more complete table. Difficult call because keeping these installations in sync is impossible, but if people want the details they should see Jonathan's table.
16:59:50 [dbooth]
... There are still some things missing and I'll be adding them. Do we list every little thing that's been put in? That makes the list long.
17:00:21 [dbooth]
Susie: My inclination: Add a small amount of info about all of the datasets, to avoid letting anyone feel unloved.
17:01:44 [dbooth]
Scott: I don't mean leaving anyone out. It's really a technical detail. E.g., the mammalian part of OBO is in another section. I'll check with Jonathan about how to handle that.
17:01:53 [dbooth]
17:02:13 [dbooth]
Susie: You could send out email to ask for feedback.
17:02:30 [dbooth]
Susie: Finish by 20th?
17:03:19 [dbooth]
Scott: I could use some help. If there are diagrams that someone likes, let me know. Last week I applied Jambalaya to your ontology, but maybe you meant something else.
17:03:39 [dbooth]
Susie: I was hoping for a screen shot of the OBO ontology.
17:04:05 [dbooth]
Susie: A screen shot of the Science Commons ontology would be interesting for people to see.
17:04:24 [Zakim]
17:04:34 [dbooth]
EricP: We have until the 21st.
17:05:07 [dbooth]
Topic: Next Call
17:05:13 [Zakim]
17:05:14 [Zakim]
17:05:16 [Zakim]
17:05:18 [Zakim]
17:05:20 [kei]
kei: sorry, got to go because of class
17:05:31 [dbooth]
Monday 17-Dec-2007
17:05:34 [dbooth]
17:05:47 [matthiassamwald]
So... I need to generate an SSH key?
17:05:52 [dbooth]
rrsagent, make logs public
17:06:07 [dbooth]
rrsagent, draft minutes
17:06:07 [RRSAgent]
I have made the request to generate dbooth
17:06:15 [dbooth]
Scribe: DBooth
17:06:17 [ericP]
17:06:17 [matthiassamwald]
17:06:23 [matthiassamwald]
17:06:35 [dbooth]
rrsagent, draft minutes
17:06:35 [RRSAgent]
I have made the request to generate dbooth
17:06:47 [matthiassamwald]
17:06:57 [matthiassamwald]
I use 'PuTTY'
17:06:58 [dbooth]
rrsagent, bye
17:06:58 [RRSAgent]
I see 2 open action items saved in :
17:06:58 [RRSAgent]
ACTION: Kei to provide query that makes use of SenseLab graph [1]
17:06:58 [RRSAgent]
recorded in
17:06:58 [RRSAgent]
ACTION: Susie to ping Wright State University (Amit) to see if their database is up and available [2]
17:06:58 [RRSAgent]
recorded in
17:07:02 [matthiassamwald]
as an SSH client.