IRC log of signage on 2012-11-02

Timestamps are in UTC.

07:31:40 [RRSAgent]
RRSAgent has joined #signage
07:31:40 [RRSAgent]
logging to
07:32:09 [Ryosuke]
Ryosuke has joined #signage
07:32:28 [hiroki]
rrsagent, make draft minutes
07:32:28 [RRSAgent]
I'm logging. I don't understand 'make draft minutes', hiroki. Try /msg RRSAgent help
07:34:25 [tokamoto]
tokamoto has joined #signage
07:35:23 [Ryosuke]
rrsagentd, draft minutes
07:35:55 [Ryosuke]
rrsagent, draft minutes
07:35:55 [RRSAgent]
I have made the request to generate Ryosuke
07:36:09 [tomoyuki]
tomoyuki has joined #signage
07:36:57 [Ryosuke]
07:37:19 [Ryosuke]
07:37:42 [Ryosuke]
rrsagent, draft minutes
07:37:42 [RRSAgent]
I have made the request to generate Ryosuke
07:39:18 [hiroki]
rrsagent, make log public
07:43:15 [kotakagi]
kotakagi has joined #signage
07:43:34 [Ryosuke]
07:43:54 [Ryosuke]
scribe: naomi
07:45:17 [naomi]
AA: hoge
07:45:33 [Ryosuke]
07:46:25 [Shinji]
Shinji has joined #signage
07:50:59 [shinichi]
shinichi has joined #signage
07:52:38 [a12u]
a12u has joined #signage
07:54:20 [Alan]
Alan has joined #signage
07:56:14 [hiroki]
hiroki has joined #signage
08:01:43 [yamaday]
yamaday has joined #signage
08:03:08 [kotakagi]
kotakagi has joined #signage
08:04:54 [naomi]
futomi: good morning, thank you for coming
08:05:16 [naomi]
... today's agenda - joint meeting with MMI WG
08:05:23 [naomi]
... thank you for coming MMI WG people
08:05:36 [kaz]
kaz has joined #signage
08:05:37 [naomi]
... we appreciate having your advice
08:05:48 [naomi]
... very happy to meet and welcome you
08:06:14 [Skim]
Skim has joined #Signage
08:06:58 [naomi]
... after that we continue our discussion, then have a meeting with the DAP WG
08:07:28 [naomi]
... 13:30 - 14:30 at their room
08:10:34 [naomi]
karen: thank you for coming
08:10:52 [naomi]
... appreciate the participants, especially those from North America
08:11:00 [Skim13]
Skim13 has joined #Signage
08:11:10 [naomi]
[ everybody nods ]
08:11:59 [naomi]
kaz: Kaz Ashimura, activity lead of MMI
08:12:25 [naomi]
daniel: chair of the Voice Browser WG and editor of a couple of Web and TV documents
08:12:38 [shige]
shige has joined #signage
08:12:46 [naomi]
james: worked and Web RTC and @@
08:13:24 [kaz]
s/and @@/MMI and Voice Browser/
08:13:59 [naomi]
[ introducing themselves - Chang, Deborah, Helena, Sebastian ]
08:14:09 [naomi]
futomi: introducing himself
08:14:34 [naomi]
s/introducing himself/[ introducing himself ]/
08:15:03 [JonathanJ1]
JonathanJ1 has joined #signage
08:16:27 [mishida]
mishida has joined #signage
08:16:27 [burn]
burn has joined #signage
08:16:34 [naomi]
deborah: we have 3 presentations
08:16:42 [noriya]
noriya has joined #signage
08:17:15 [sfeu]
sfeu has joined #signage
08:17:33 [JonathanJ1]
JonathanJ1 has joined #signage
08:17:47 [sfeu]
Present+ Sebastian_Feuerstack
08:17:57 [futomi]
futomi has joined #signage
08:18:56 [Jaejeung]
Jaejeung has joined #signage
08:18:56 [naomi]
james: will explain about multimodal architecture overview
08:19:08 [gisung]
gisung has joined #signage
08:19:42 [naomi]
[ explaining slides ]
08:19:52 [naomi]
james: @@
08:20:52 [naomi]
scribenick: Ryosuke
08:20:52 [dadahl]
dadahl has joined #signage
08:21:19 [Ryosuke]
james: HTML5, VoiceXML and TTS are all independent
08:21:43 [Ryosuke]
... explaining the architecture of MMI
08:22:25 [Ryosuke]
james: diagram of a modality component
08:22:25 [kawakami]
kawakami has joined #signage
08:22:48 [naomi]
08:23:04 [Ryosuke]
... suppose a speech recognition system
08:23:38 [Ryosuke]
... components are black boxes to each other
08:24:13 [Ryosuke]
... interoperability testing
08:24:41 [Ryosuke]
... 3 vendors: speech, image, @@
08:25:03 [kaz]
toru: could you give us any concrete examples?
08:25:15 [Ryosuke]
james: in the business model, it would depend @@
08:25:25 [kaz]
jim: one company might be an expert of voice, but not so for graphics
08:25:46 [kaz]
toru: dynamic discovery is included?
08:25:48 [Ryosuke]
toru: technical issue - is discovery included in the architecture?
08:25:58 [Ryosuke]
james: yes
08:25:59 [kaz]
jim: will be mentioned later
08:26:19 [Ryosuke]
shinji: are all codecs controlled?
08:26:27 [kaz]
shinji: multiple modalities of the same kind?
08:26:30 [kaz]
jim: possible
08:26:37 [whyun]
whyun has joined #signage
08:26:38 [Ryosuke]
james: anyone can control each component.
08:27:06 [kaz]
debbie: e.g., multiple languages
08:27:06 [naomi]
scribenick: kaz
08:27:19 [Skim13]
Skim13 has joined #Signage
08:27:28 [kaz]
jim: IM decides which modality should be used
08:27:41 [kaz]
... might choose different modalities, e.g., different languages
08:28:10 [kaz]
toru: IM knows the capability of each MC?
08:28:30 [kaz]
jim: MC is, e.g., sensors
08:28:33 [kaz]
... and the IM is the brain
08:29:16 [kaz]
toru: which part is the server?
08:29:22 [kaz]
jim: either way is possible
08:29:44 [kaz]
debbie: for example, Openstream's implementation includes all the components
08:29:50 [kaz]
... on cell phones
08:30:20 [kaz]
jim: standard piece here is "how all the components talk with each other"
08:30:49 [kaz]
... all we define here is messages between the components (=MCs and IM)
08:31:19 [Shinji]
08:31:21 [kaz]
[ that's why the name of the spec is "Multimodal Architecture and Interfaces" ]
08:31:44 [kaz]
wook: transport?
08:32:03 [kaz]
jim: we used HTTP for the first version of the interoperable testing proto
08:32:44 [kaz]
(some more questions)
08:33:09 [kaz]
jim: application logic is handled by the IM
08:33:12 [kaz]
... MVC model
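[ Editor's note: the messages jim describes are the MMI life-cycle events, XML in the MMI Architecture namespace. A sketch of a StartRequest sent from the IM to a modality component; the Source/Target/Context identifiers and content URL below are hypothetical placeholders. ]

```xml
<mmi:mmi xmlns:mmi="http://www.w3.org/2008/04/mmi-arch" version="1.0">
  <!-- The IM asks an MC to start presenting content; all identifiers
       here are illustrative, not from the meeting. -->
  <mmi:StartRequest Source="im-1" Target="tts-mc-1"
                    Context="ctx-1" RequestID="req-1">
    <mmi:ContentURL href="prompt.vxml"/>
  </mmi:StartRequest>
</mmi:mmi>
```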
08:34:02 [kaz]
topic: MMI Discovery - helena
08:34:45 [kaz]
rrsagent, make log public
08:34:49 [kaz]
rrsagent, draft minutes
08:34:49 [RRSAgent]
I have made the request to generate kaz
08:35:26 [kaz]
helena: Discovery & Registration
08:36:16 [kaz]
... MMI Architecture is an architecture for components to orchestrate
08:36:34 [kaz]
... MC layer is abstract and generic
08:36:47 [kaz]
... responsible for tasks
08:37:09 [kaz]
... can have multiple devices that provide a task
08:37:24 [kaz]
... Modality Component life-cycle
08:37:40 [kaz]
... advertisement, discovery, registration, and control
08:37:55 [Shinji]
q- sangwhan
08:38:19 [kaz]
08:38:36 [CHONG]
CHONG has joined #signage
08:38:56 [kaz]
... 1. Advertisement
08:39:18 [kaz]
... reach correctness in the MC retrieval
08:39:37 [kaz]
Meeting: Web-based Signage BG f2f Meeting at TPAC2012 - Day 2
08:39:48 [kaz]
futomi: MC?
08:40:00 [kaz]
helena: Modality Components of the MMI Architecture
08:40:19 [kaz]
... and what must be advertised?
08:40:52 [kaz]
... functional and non-functional information
08:41:03 [hiroki]
Present+ Shigeo_Okamoto, Helena_Rodriguez, Kohei_Kawakami, Jaejeung_Kim, Soobin_Lee, Gisung_Kim, Shinichi_Nakao, Noriya_Sakamoto
08:41:10 [kaz]
... e.g., two concurrent synthesizers
08:41:33 [kaz]
... compared to DLNA
08:41:49 [kaz]
s/compared to/ examples are/
08:42:08 [kaz]
s/DLNA/DLNA, Bonjour, Intent and Web Services
08:42:18 [kaz]
rrsagent, draft mintues
08:42:18 [RRSAgent]
I'm logging. I don't understand 'draft mintues', kaz. Try /msg RRSAgent help
08:42:25 [kaz]
rrsagent, draft minutes
08:42:25 [RRSAgent]
I have made the request to generate kaz
08:42:37 [RRSAgent]
I'm logging. I don't understand 'you're too strict :)', kaz. Try /msg RRSAgent help
08:42:49 [kaz]
futomi: seems like UPnP
08:43:03 [kaz]
helena: DLNA uses UPnP for discovery
08:43:14 [kaz]
... and what is needed for MMI?
08:43:33 [Shinji]
Meeting: Joint meeting MMI WG and Web-based signage BG in TPAC2012
08:44:05 [kaz]
helena: (explains the idea using a picture)
08:44:12 [hiroki]
Present+ Hiroshi_Yoshida, Shin-Gak_Kang, Kaz_Ashimura, Sung_Hei_Kim, Deborah_Dahl, Toru_Kobayashi, Dan_Burnett
08:44:35 [kaz]
helena: next Discovery
08:44:46 [kaz]
... four types of discovery criteria
08:44:58 [kaz]
... task goal, intention, behavior and capacities
08:45:07 [kaz]
08:45:20 [kaz]
... fixed, passive, active and mediated
08:45:42 [kaz]
... you use underlying technologies
08:46:12 [kaz]
... requirement is the need of a mechanism of discovery using the MMI events
08:46:21 [kaz]
... (shows another picture on discovery)
08:46:32 [hiroki]
Present+ Wook_Hyun, Karen_Myers, Chong_Gu, Hiroki_Yamada, Masayoshi_Ishida, Ryoichi_Kawada, Toshiyuki_Okamoto, Shinji_Ishii, Sebastian_Feuerstack, Naomi_Yoshizawa, Ryosuke_Aoki
08:46:50 [kaz]
helena: and Registration
08:47:20 [kaz]
... implies information storing, indexing criteria, registration state and registration distribution
08:48:19 [kaz]
... you can install MCs on some server
08:48:27 [kaz]
... and the information can be distributed
08:48:56 [kaz]
... requirement: system state handling, multimodal session and registry updates
08:49:09 [kaz]
... and then Control
08:49:39 [kaz]
... use the same MMI life-cycle events to control registration and registration updates
08:49:47 [kaz]
debbie: one use case?
08:49:54 [kaz]
... public signage could be one
08:50:05 [kaz]
helena: we have a set of use cases
08:50:16 [kaz]
-> use cases note
08:50:31 [kaz]
helena: UI on a mobile
08:50:42 [kaz]
... interacts with big screens
08:50:57 [kaz]
... that is one possibility
08:51:10 [kaz]
... connect with a public display
08:51:22 [kaz]
... communicate with MCs via IM
08:51:39 [kaz]
futomi: my understanding is ...
08:51:49 [kaz]
... there is a big screen for digital signage
08:51:52 [kaz]
... and I have a mobile
08:52:01 [kaz]
... which communicate with the big screen
08:52:27 [kaz]
... maybe there is a list of devices close by on the mobile
08:52:39 [kaz]
... and I can choose which display to be connected
08:52:43 [kaz]
helena: right
08:53:04 [kaz]
... currently we need to stop in front of the display at stations
08:53:33 [kaz]
... using this mechanism devices can interact with each other via IM
08:53:42 [kaz]
futomi: where is the controller (IM)?
08:53:46 [kaz]
... and where are the MCs?
08:53:59 [kaz]
helena: MMI Architecture allows nested structure
08:54:12 [kaz]
... so the IM could be installed on the signage display
08:54:27 [kaz]
... or a separate server is another possibility
08:54:33 [kaz]
... it depends on the application
08:54:44 [kaz]
futomi: MC is not free
08:54:54 [kaz]
... we're a signage operator
08:55:18 [kaz]
... it would be good if we could provide an IM to control the service
08:55:36 [kaz]
jim: right
08:55:56 [kaz]
helena: you can do that
08:56:09 [kaz]
... the question is rather what your own criteria is
08:57:00 [kaz]
shinji: please share the information
08:57:09 [kaz]
kaz: we'll let you know about the URIs
08:57:46 [naomi]
kaz: [ showing a demo ]
08:58:39 [naomi]
... possible system of MMI architecture
09:00:23 [naomi]
... one interaction manager, DLNA, GUI, VUI MC (VoiceXML), GUI MC (HTML5) and other services on the web, e.g., EPEG
09:00:53 [naomi]
... two windows connected by a simple socket XML
09:01:04 [naomi]
scribenick: Ryosuke
09:01:33 [Ryosuke]
kaz: demo of TV control using voice interface
09:01:53 [Ryosuke]
toru: @@
09:01:56 [Ryosuke]
kaz: no
09:02:09 [naomi]
s/@@/ is it included advertisement?/
09:02:25 [Ryosuke]
kaz: there is no discovery on this demo
09:03:12 [Ryosuke]
kaz: @@
09:04:21 [Shinji]
09:04:33 [naomi]
scribenick: kaz
09:04:35 [Alan]
Alan has joined #signage
09:05:18 [kaz]
topic: Signage use cases
09:05:27 [kaz]
futomi: Web-based Signage use cases
09:05:31 [kaz]
... we have 19 use cases
09:05:41 [kaz]
... we're now updating them
09:05:51 [kaz]
... some of them are related to MMI
09:06:26 [kaz]
s/toru: @@/toru: does this demo use discovery?
09:06:59 [kaz]
... most digital signage just shows information in one direction
09:07:07 [kaz]
... but the future ones should be interactive
09:07:20 [kaz]
... so we listed possible use cases
09:07:30 [kaz]
... R5: Discovered by personal devices
09:07:43 [hiroki]
-> Web-based Signage Use cases and Requirements
09:08:14 [kaz]
futomi: (explains the UC)
09:09:09 [kaz]
s/kaz: @@/kaz: both the components know each other's IP address and ports
09:09:43 [kaz]
futomi: I can see the big display
09:09:47 [kaz]
... but I can't touch it
09:09:55 [kaz]
... how can I interact with it?
09:10:28 [kaz]
... probably I could use my smartphone as the UI for the display
09:11:39 [kaz]
i/topic: Signage use cases/toru: plan for systems which need discovery?/
09:12:25 [kaz]
i/topic: Signage use cases/kaz: the MMI WG has been working on interoperable testing, and the next version of the interoperable testing proto system should include discovery capability. please help us :)
09:12:45 [kaz]
jim: a signage terminal as an IM uses users' devices as MCs
09:12:54 [kaz]
... discovery work is very important here
09:13:03 [kaz]
... this idea fits MMI Architecture
09:13:17 [kaz]
futomi: there are many other UCs
09:13:30 [kaz]
... maybe not related to MMI, though
09:13:49 [Skim13]
Skim13 has joined #Signage
09:13:55 [kaz]
helena: Web Intents is very much modality-oriented
09:14:08 [sangwhan]
sangwhan has joined #signage
09:14:22 [kaz]
... UPnP is used for many devices
09:14:42 [Shinji]
09:14:56 [kaz]
... the coordination capability provided by the MMI Architecture should be useful
09:15:20 [kaz]
@1: how can I select one from various signage devices?
09:15:28 [naomi]
09:15:28 [kaz]
09:15:42 [kaz]
helena: depends on application
09:16:05 [Kiyoshi_]
Kiyoshi_ has joined #signage
09:16:17 [kaz]
futomi: MMI Architecture doesn't define that part
09:16:24 [sangwhan]
rrsagent, draft minutes
09:16:24 [RRSAgent]
I have made the request to generate sangwhan
09:17:26 [kaz]
jim: terminals agree with each other
09:18:03 [Shinji]
09:18:05 [sblee]
sblee has joined #signage
09:18:28 [kaz]
kaz: that's similar to wifi connection :)
09:19:04 [kaz]
futomi: interested in the work of the MMI WG
09:19:20 [kaz]
... would like to include MMI's work in the gap analysis
09:19:41 [kaz]
debbie: if you could, please give comments to the discovery work
09:19:57 [kaz]
... it would be very helpful if you could review the note
09:21:03 [sangwhan]
sangwhan has joined #signage
09:21:43 [naomi]
kaz: [ explaining slides ]
09:21:56 [naomi]
... @2
09:24:14 [naomi]
futomi: emotion means markup language?
09:24:18 [naomi]
kaz: yes
09:25:42 [naomi]
toru: futomi might not mention not only @3
09:29:00 [naomi]
futomi: meta data
09:29:13 [naomi]
[ everybody nods ]
09:29:25 [naomi]
dadahl: could have an avatar
09:29:29 [naomi]
... shows an angry face
09:29:36 [naomi]
... detected from the speech
09:29:43 [naomi]
... two separate implementations
09:29:54 [naomi]
futomi: there should be many use cases
09:30:11 [naomi]
... a markup language represents, let's say, emotion
09:30:18 [naomi]
... this specify @3?
09:30:28 [naomi]
kaz: [ showing slides ]
09:30:55 [naomi]
daniel: we need to agree on what the emotions are
09:31:06 [naomi]
... need to research what emotions are
09:31:14 [naomi]
futomi: would be very hard to define
09:31:34 [naomi]
... emotion of each country is different
09:31:37 [naomi]
... how do you solve
09:31:46 [naomi]
dadahl: the group recognizes that
09:31:55 [naomi]
... 5-6 comments
09:32:04 [naomi]
... vocabularies
09:32:14 [naomi]
... you can define your own vocabularies
09:32:35 [naomi]
kaz: that's why we provided @4
09:32:45 [naomi]
dadahl: 5-6 recommended
09:33:07 [naomi]
ryoichi: how do you use it?
09:33:18 [naomi]
kaz: sets of emotion markup language
09:33:51 [naomi]
daniel: any emotion could be represented with this vocabulary
09:34:17 [naomi]
helena: if you want to express a question or anxiety, you can use the annotation language
09:34:26 [naomi]
... performed - has a lot of use cases
09:34:43 [Ryosuke]
09:34:44 [naomi]
toru: what is the most applicable use case of this
09:34:57 [naomi]
kaz: implementers have provided implementation reports
09:35:02 [naomi]
... avatar systems, 3D
09:35:07 [naomi]
... face recognition
09:35:21 [naomi]
dadahl: product testing
09:35:42 [naomi]
... people's spoken opinions say it's good, but their faces might show they don't like it
09:35:54 [naomi]
kaz: one of the NTT departments is working on emotion analysis
09:36:00 [naomi]
... for call centers
09:36:11 [naomi]
dadahl: that's a good use case
09:36:22 [naomi]
futomi: break!
09:36:28 [naomi]
... appreciate your attendance
09:36:34 [naomi]
[ everybody applauds ]
09:43:14 [tokamoto]
tokamoto has joined #signage
09:53:40 [hiroki]
hiroki has joined #signage
09:56:18 [kawakami]
kawakami has joined #signage
09:57:45 [skim13]
skim13 has joined #signage
10:08:04 [a12u]
a12u has joined #signage
10:08:06 [naomi]
naomi has joined #signage
10:08:09 [Ryosuke]
Ryosuke has joined #signage
10:08:54 [hiroki]
s/this specify @3/Can this specify level of emotion?
10:09:04 [hiroki]
s/[ showing slides ]/Yes, use value attribute. e.g. value="0.85"
10:09:17 [hiroki]
rrsagent, draft minutes
10:09:17 [RRSAgent]
I have made the request to generate hiroki
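[ Editor's note: a minimal EmotionML fragment illustrating the value attribute discussed above. The category-set URI and the category name "angry" come from the W3C emotion vocabularies; the specific annotation is illustrative, not from the meeting. ]

```xml
<!-- Hypothetical annotation: a detected emotion of "angry" with
     intensity 0.85, using the everyday-categories vocabulary. -->
<emotionml xmlns="http://www.w3.org/2009/10/emotionml"
           category-set="http://www.w3.org/TR/emotion-voc/xml#everyday-categories">
  <emotion>
    <category name="angry" value="0.85"/>
  </emotion>
</emotionml>
```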
10:09:54 [burn]
burn has joined #signage
10:10:02 [burn]
burn has left #signage
10:14:34 [Ryosuke]
scribenick: ryosuke
10:15:03 [Ryosuke]
futomi: yesterday we finished R6; today it starts from R7
10:15:30 [Ryosuke]
... explain the summary of R7
10:15:52 [Ryosuke]
... explain the use cases of R7.
10:16:22 [Ryosuke]
futomi: R7 includes the case of stock information in real-time communication
10:16:52 [noriya]
noriya has joined #signage
10:17:11 [Ryosuke]
... real-time communication using digital signage means live streams such as live news or notice boards.
10:17:32 [Ryosuke]
... the fourth use case is a fire disaster.
10:17:59 [Ryosuke]
... the use case assumes a shopping center where there are many displays
10:18:31 [Ryosuke]
... digital signage informs us about the fire area, etc.
10:18:50 [Ryosuke]
futomi: next use case is earthquake
10:19:42 [Ryosuke]
... charles commented on the Push API yesterday
10:20:16 [Ryosuke]
... add the Push API to the techniques related to R7
10:21:04 [hiroki]
-> Push API spec
10:21:33 [Ryosuke]
futomi: explanation of the WebSocket API, which is lower layer
10:21:59 [Ryosuke]
yamada: @@
10:22:33 [Ryosuke]
yamada: push API on SNS in the radio
10:22:35 [Shinji]
[Editors Action] add. "Push API" to R7. Gap analysis.
10:22:48 [Ryosuke]
Kang: BB
10:23:39 [Ryosuke]
Kang: Sever sends information using push API
10:23:53 [Ryosuke]
Kang: explanation of push API
10:24:05 [shinichi]
shinichi has joined #signage
10:24:15 [Ryosuke]
Kang: talking about the fire disaster
10:25:05 [Ryosuke]
Kang: CC
10:25:21 [Ryosuke]
futomi: preinstall application on digital signage
10:25:35 [whyun]
whyun has joined #signage
10:25:37 [naomi]
10:25:42 [whyun]
10:26:06 [Ryosuke]
Kang: receiving information in real time using interaction with digital signage
10:26:29 [Ryosuke]
Shinji: how do we switch from the normal case to the disaster case on digital signage?
10:27:02 [Ryosuke]
futomi: assuming automatic switching
10:27:23 [tomoyuki_]
tomoyuki_ has joined #signage
10:28:08 [Ryosuke]
futomi: we can build a system which automatically switches to disaster mode.
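[ Editor's note: a minimal sketch of the automatic switching futomi describes, assuming a pushed JSON message. The message shape ({ type, severity }) and the threshold are hypothetical, not part of the Push API or any proposal in the meeting. ]

```javascript
// Hypothetical: decide terminal mode from a pushed emergency message.
// The { type, severity } shape and the threshold of 3 are assumptions.
function selectMode(message) {
  // Treat sufficiently severe emergency notifications as a disaster.
  if (message.type === "emergency" && message.severity >= 3) {
    return "disaster";
  }
  return "normal";
}

// On a real terminal this would be driven by a Push API or WebSocket
// subscription, e.g.:
//   socket.onmessage = (e) => applyMode(selectMode(JSON.parse(e.data)));
```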
10:28:32 [Ryosuke]
Kang: DD
10:29:07 [Ryosuke]
futomi: Let's talk about terminal side
10:29:23 [Ryosuke]
... how to do sensor authentication?
10:30:27 [Ryosuke]
Shinji: talking about an example
10:30:33 [Ryosuke]
... the signage system watches the disaster situation using a camera
10:31:04 [Ryosuke]
futomi: Our scope is to discuss not multiple signage but single signage
10:31:30 [Ryosuke]
s/is to discuss/is
10:32:57 [Ryosuke]
Kang: R6 is audio measurement
10:34:01 [naomi]
10:34:57 [Ryosuke]
futomi: FF
10:36:15 [Ryosuke]
Kang: server can push emergency information to specific screen
10:37:31 [Ryosuke]
topic: R8 identifying a location of a terminal
10:38:50 [Ryosuke]
futomi: a use case is ads based on a location
10:39:47 [Ryosuke]
... an example situation for this use case is a train station
10:40:30 [Ryosuke]
... the problem is that different stations have digital signage which shows the same content
10:42:11 [Ryosuke]
futomi: APIs related to R8 are the Geolocation API Specification and Geolocation API Specification Level 2
10:42:23 [Ryosuke]
10:42:51 [Ryosuke]
okamoto: FF
10:42:58 [Ryosuke]
futomi: twitter route
10:43:08 [Ryosuke]
KK: map
10:43:22 [Ryosuke]
futomi: discussing maps or mapping
10:43:36 [Ryosuke]
futomi: next topic
10:43:48 [Ryosuke]
topic: Synchronizing contents
10:44:36 [Ryosuke]
futomi: a use case is watching course materials on a tablet
10:44:47 [naomi]
10:44:55 [Ryosuke]
... an example is the education field
10:45:15 [naomi]
10:45:24 [Ryosuke]
... next example big conference
10:45:39 [Ryosuke]
s/example big/example is big
10:46:03 [Ryosuke]
futomi: Motivation of R9
10:46:44 [Ryosuke]
... requirement of network connectivity
10:47:39 [Ryosuke]
... API related to R9 is WebRTC
10:47:59 [Ryosuke]
... and WebRTC Tab Content Capture API
10:48:12 [Ryosuke]
Kang: I don't know this API
10:48:39 [Ryosuke]
TT: sharing video stream?
10:48:44 [Ryosuke]
futomi: Both
10:49:03 [naomi]
10:49:16 [naomi]
10:49:24 [naomi]
10:49:25 [Ryosuke]
futomi: twitter video, text, image and so on
10:49:43 [Ryosuke]
Wook: Data channel
10:50:08 [naomi]
10:50:28 [naomi]
s/Wook/ /
10:50:32 [Ryosuke]
futomi: video service, multiscreen services
10:51:00 [Ryosuke]
Wook: using web intent
10:51:24 [Ryosuke]
futomi: does twitter use broadcasting?
10:51:53 [Ryosuke]
Kang: FF
10:52:09 [Shinji]
[Editors Action] add. DataChannel (WebRTC) to R9. Gap analysis.
10:52:30 [Shinji]
10:52:32 [Ryosuke]
futomi: the opinion to DAP member
10:53:30 [Ryosuke]
topic: R10 saving contents and playing saved contents
10:53:59 [Ryosuke]
futomi: a use case is playing contents in network trouble
10:54:59 [Ryosuke]
... [explain wiki page of R10 on web-based digital signage BG]
10:55:15 [naomi]
look for R10 from
10:56:17 [Ryosuke]
futomi: the important point is offline web applications in the R10 use case
10:57:05 [Ryosuke]
topic: R11 Protecting video contents
10:57:41 [Shinji]
[Editors Action] delete "HTML5 4.8.6 The video element", "HTML5 4.8.8 The source element" and "Media Source Extensions"
10:58:30 [Shinji]
s/Extensions"/Extensions" from R10 Gap analysis.]
10:58:44 [Ryosuke]
futomi: Twitter served data using Encrypted Media Extensions
10:59:09 [Ryosuke]
topic: R12 Saving log data
10:59:38 [sangwhan]
sangwhan has joined #signage
11:00:04 [Ryosuke]
futomi: examples of log data are who watches an ad, when users watch it, and where
11:00:06 [Shinji]
[Editors Action] add "File API" to R10 Gap analysis.
11:02:37 [Ryosuke]
Shinji: log data is evidence of copy
11:02:54 [Ryosuke]
... server side cannot detect evidence of content copy
11:04:16 [Ryosuke]
Shinji: signage operators want evidence
11:05:23 [Ryosuke]
Shinji: signature in the terminal
11:05:36 [Ryosuke]
... serverside cannot PP log data
11:05:38 [naomi]
11:06:21 [Ryosuke]
Wook: CC
11:08:10 [skim13]
s/CC/need to consider proof-of-play/
11:09:42 [Ryosuke]
futomi: DAP joint meeting starts 13:30
11:11:06 [Shinji]
[Editors Actions] add (temporally) with evidence to R11
11:12:09 [hiroki]
rrsagent, draft minutes
11:12:09 [RRSAgent]
I have made the request to generate hiroki
11:12:19 [Alan]
Alan has joined #signage
11:16:56 [kawakami]
kawakami has joined #signage
11:18:39 [a1zu]
a1zu has joined #signage
11:20:28 [kawakami_]
kawakami_ has joined #signage
12:47:41 [RRSAgent]
RRSAgent has joined #signage
12:47:41 [RRSAgent]
logging to
12:49:50 [yoh]
yoh has joined #signage
13:10:38 [sangwhan]
sangwhan has joined #signage
13:20:51 [a1zu]
a1zu has joined #signage
13:23:34 [hiroki]
hiroki has joined #signage
13:23:35 [tokamoto]
tokamoto has joined #signage
13:26:30 [kawada]
kawada has joined #signage
13:29:32 [sangwhan]
sangwhan has joined #signage
13:29:38 [sangwhan1]
sangwhan1 has joined #signage
13:30:05 [sangwhan1]
rrsagent, draft minutes
13:30:05 [RRSAgent]
I have made the request to generate sangwhan1
13:30:24 [sangwhan1]
Present+ Sangwhan_Moon
13:30:33 [sangwhan1]
Scribenick: naomi
13:31:42 [kotakagi]
kotakagi has joined #signage
13:32:10 [sangwhan1]
rrsagent, draft minutes
13:32:10 [RRSAgent]
I have made the request to generate sangwhan1
13:34:42 [Shinji]
Shinji has joined #signage
13:37:57 [sangwhan1]
s/Scribenick: naomi//
13:38:06 [whyun]
whyun has joined #signage
13:38:13 [sangwhan1]
rrsagent, draft minutes
13:38:13 [RRSAgent]
I have made the request to generate sangwhan1
13:38:34 [sangwhan1]
ScribeNick: sangwhan1
13:38:40 [sangwhan1]
rrsagent, draft minutes
13:38:40 [RRSAgent]
I have made the request to generate sangwhan1
13:39:02 [sangwhan1]
[ Resuming meeting after break ]
13:39:08 [sangwhan1]
rrsagent, draft minutes
13:39:08 [RRSAgent]
I have made the request to generate sangwhan1
13:39:09 [shinichi]
shinichi has joined #signage
13:39:37 [shige_]
shige_ has joined #signage
13:39:43 [JonathanJ1]
JonathanJ1 has joined #signage
13:40:39 [sangwhan1]
Futomi: We should finish reviewing our draft
13:40:53 [sangwhan1]
… further discussion should be done through the mailing list
13:40:54 [JonathanJ1]
JonathanJ1 has joined #signage
13:41:06 [sangwhan1]
… let's talk about renewing our draft document
13:41:13 [sangwhan1]
… namely item 3
13:41:17 [sangwhan1]
rrsagent, draft minutes
13:41:17 [RRSAgent]
I have made the request to generate sangwhan1
13:41:52 [kotakagi]
kotakagi has joined #signage
13:42:30 [hiroki]
-> Web-based Signage Use cases and Requirements, R14. Showing contents on time
13:43:45 [a1zu]
Present+ Hiroyuki_Aizu
13:44:54 [sangwhan1]
rrsagent, hello
13:45:00 [Kiyoshi__]
Kiyoshi__ has joined #signage
13:45:05 [sangwhan1]
Futomi: R14 defines showing contents on a given time
13:45:08 [sangwhan1]
... as of now, there is no common format for defining playlists
13:45:10 [sangwhan1]
... XML and JSON based formats are possibilities
13:45:13 [sangwhan1]
... but we need to analyze what is needed inside the metadata
13:45:15 [sangwhan1]
... defining the format does not need to be done by a working group
13:46:10 [sangwhan1]
… we can alternatively use JavaScript code as a substitute for this
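[ Editor's note: a sketch of the JSON-based playlist possibility mentioned for R14, with scheduling in script as futomi suggests. The field names (items, url, start, durationSec) are illustrative assumptions, not a proposed standard. ]

```javascript
// Hypothetical JSON playlist for "showing contents on time" (R14).
const playlist = {
  items: [
    { url: "ad1.webm", start: "2012-11-02T09:00:00Z", durationSec: 30 },
    { url: "ad2.webm", start: "2012-11-02T09:00:30Z", durationSec: 60 }
  ]
};

// Return the item scheduled at a given time, or null if nothing is due.
function itemAt(playlist, now) {
  return playlist.items.find((item) => {
    const start = Date.parse(item.start);
    return now >= start && now < start + item.durationSec * 1000;
  }) || null;
}
```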
13:46:56 [naomi]
naomi has joined #signage
13:46:57 [sangwhan]
sangwhan has joined #signage
13:47:05 [sangwhan]
rrsagent, bye
13:47:20 [sangwhan]
rrsagent, draft minutes
13:47:20 [RRSAgent]
I have made the request to generate sangwhan
13:47:44 [sangwhan]
rrsagent, make log public
13:47:48 [sangwhan]
rrsagent, make minutes
13:47:48 [RRSAgent]
I have made the request to generate sangwhan
13:48:07 [sangwhan]
rrsagent, bye
13:48:07 [RRSAgent]
I see no action items
13:48:13 [RRSAgent]
RRSAgent has joined #signage
13:48:13 [RRSAgent]
logging to
13:48:18 [Zakim]
Zakim has joined #signage
13:48:28 [sangwhan]
rrsagent, this is signage
13:48:28 [RRSAgent]
I'm logging. I don't understand 'this is signage', sangwhan. Try /msg RRSAgent help
13:48:49 [sangwhan]
rrsagent, hello
13:49:04 [sangwhan]
rrsagent, draft minutes
13:49:04 [RRSAgent]
I have made the request to generate sangwhan
13:49:04 [tomoyuki_]
tomoyuki_ has joined #signage
13:51:02 [whyun]
13:51:27 [Shinji]
13:51:58 [matt]
matt has joined #signage
13:52:06 [matt]
rrsagent, draft seconds
13:52:06 [RRSAgent]
I have made the request to generate matt
13:52:55 [sangwhan]
ScribeNick: sangwhan
13:53:23 [sangwhan]
[ Scribe missed a lot of minutes due to technical difficulties ]
13:53:38 [skim13]
skim13 has joined #signage
13:55:24 [sangwhan]
Noriya: how about WebVTT
13:55:38 [sangwhan]
futomi: WebVTT is for captioning on the web
13:56:01 [sangwhan]
... not quite similar to the Timed Text Markup Language
13:56:13 [sangwhan]
futomi: what is the difference between TTML and WebVTT?
13:56:36 [sangwhan]
futomi:  R14 should stay
13:56:44 [sangwhan]
topic: R15 Identifying an individual
13:57:02 [sangwhan]
futomi:  there is one use case - personalization
13:57:04 [matt]
rrsagent, draft minutes
13:57:04 [RRSAgent]
I have made the request to generate matt
13:59:49 [matt]
rrsagent, draft minutes
13:59:49 [RRSAgent]
I have made the request to generate matt
14:01:53 [matt]
rrsagent, draft minutes
14:01:53 [RRSAgent]
I have made the request to generate matt
14:02:38 [aizu]
aizu has joined #signage
14:02:58 [matt]
rrsagent, draft minutes
14:02:58 [RRSAgent]
I have made the request to generate matt
14:03:26 [tomoyuki_]
tomoyuki_ has joined #signage
14:04:09 [matt]
rrsagent, draft minutes
14:04:09 [RRSAgent]
I have made the request to generate matt
14:05:06 [aizu]
14:05:29 [futomi]
futomi has joined #signage
14:07:22 [Ralph]
Ralph has joined #signage
14:08:15 [matt]
rrsagent, draft minutes
14:08:15 [RRSAgent]
I have made the request to generate matt
14:11:39 [sangwhan]
futomi: Do we need to keep the requirements
14:11:44 [sangwhan]
... or should we remove the use case?
14:11:47 [sangwhan]
wook: This will have relations with web identity
14:11:50 [sangwhan]
... so this use case should be kept
14:11:52 [sangwhan]
skim1: There are potential privacy issues which need to be considered
14:12:32 [sangwhan]
???: Regarding privacy there are a lot of issues, but if there is user consent
14:12:34 [sangwhan]
... I don't see why this would be a problem
14:12:37 [sangwhan]
... but an interactive model like this should be considered
14:12:39 [sangwhan]
Futomi: OK, let's keep this and discuss later
14:12:42 [sangwhan]
... Moving on
14:12:44 [sangwhan]
Topic: R16, Capturing screenshots
14:12:46 [sangwhan]
Futomi: My personal favorite
14:12:48 [sangwhan]
... There is a need because the control centers can monitor the terminals easily
14:12:50 [sangwhan]
... If each terminal posts screenshots to the control center periodically
14:12:53 [sangwhan]
... it can be used to verify QoS
14:12:55 [sangwhan]
... Also, this mechanism can be used as evidence of whether an advertisement
14:12:57 [sangwhan]
... has been shown on the terminal at a given time or not
14:12:59 [sangwhan]
... there are some existing APIs
14:13:01 [sangwhan]
... WebRTC tab content capture API seems to be relevant
14:13:04 [sangwhan]
... and it fulfills the requirements
14:13:06 [sangwhan]
... if it is possible to fetch the screenshot, XHR or WebSockets can be used
14:13:09 [sangwhan]
... to transmit the screenshot to the server
14:13:16 [sangwhan]
aizu: The browser testing tools working group is working on screenshot API
14:13:20 [sangwhan]
rrsagent, draft minutes
14:13:20 [RRSAgent]
I have made the request to generate sangwhan
14:13:37 [sangwhan]
14:13:52 [sangwhan]
rrsagent, draft minutes
14:13:52 [RRSAgent]
I have made the request to generate sangwhan
14:14:12 [sangwhan]
... The specification is here
14:14:17 [sangwhan]
Futomi: What is the format?
14:14:20 [sangwhan]
aizu: It is a pre-encoded image
14:14:27 [sangwhan]
14:14:57 [ryosuke]
ryosuke has joined #signage
14:15:01 [Ralph]
Ralph has left #signage
14:16:30 [sangwhan]
topic:  R17 - Seamless transition of contents
14:16:34 [sangwhan]
futomi:  suggested by ETRI
14:17:00 [sangwhan]
skim1: The requirement is for signage terminals to be able to play contents
14:17:08 [sangwhan]
s/aizu: It is a pre-encoded image/aizu: It is lossless PNG images encoded using Base64/
14:17:27 [matt]
matt has left #signage
14:18:14 [sangwhan]
... in a normal web page, transition will introduce blinking
14:18:47 [sangwhan]
... which isn't quite natural to see when you want a smooth transition between content
14:19:08 [sangwhan]
… it might be possible to do this with CSS transitions/animations
14:19:18 [sangwhan]
Futomi: any comments?
14:19:38 [sangwhan]
wook: Small comment, I heard from Chaals that XHR level 2 is now just XHR
14:19:42 [sangwhan]
Futomi: I will fix that
14:19:57 [sangwhan]
ACTION: Futomi to fix XHR reference on use case wiki page
14:21:55 [shige]
shige has joined #signage
14:29:18 [sangwhan]
Topic: R18 Interactivity with the call center
14:29:20 [sangwhan]
skim13: Use case, when you are at a shopping mall
14:29:22 [sangwhan]
... when there is too much information
14:29:25 [sangwhan]
... this makes it possible to directly connect with a call center
14:29:34 [sangwhan]
... and interact with a human operator
14:29:36 [sangwhan]
... WebRTC can probably be used for this
14:29:38 [sangwhan]
... simple online assistance can be achieved with WebSockets
14:29:40 [sangwhan]
... metadata can be packaged in XML
14:29:42 [sangwhan]
Futomi: This is a very valid use case
14:29:43 [sangwhan]
... do you think there are further spec references needed?
14:29:45 [sangwhan]
... like media capture streams?
14:29:47 [sangwhan]
kotakagi: That might be overkill
14:29:49 [sangwhan]
futomi: We don't need to use IP for voice transmission
14:29:50 [sangwhan]
... a normal landline hook could work as well
14:29:52 [sangwhan]
shinji: Do we have to use Web RTC?
14:29:54 [sangwhan]
... what is the benefit from doing so?
14:29:58 [sangwhan]
futomi: Benefit is that it is free
14:30:10 [sangwhan]
Kang: I think this should be considered as a secondary feature
14:30:12 [sangwhan]
Futomi: Any further comments?
14:30:15 [sangwhan]
[ None ]
14:30:17 [sangwhan]
Futomi: Moving on
14:30:19 [sangwhan]
Topic: R19 Video streaming
14:30:21 [sangwhan]
wook: This requirement is for providing live video streaming
14:30:35 [sangwhan]
... this is important because live information is better than anything static
14:30:38 [sangwhan]
... probably the most viable option is to use the video tag
14:30:39 [sangwhan]
... since there is no transport level limitation in the spec
14:30:41 [sangwhan]
... RTSP/HLS/MPEG-DASH can be used as transport
14:30:44 [sangwhan]
Futomi: Very good point
14:30:46 [sangwhan]
... HLS is currently usable, MPEG-DASH can probably be used in the future
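The source-selection behavior discussed above (the video tag picking a transport the terminal supports) can be sketched as follows. The `canPlay` callback stands in for `HTMLMediaElement.canPlayType()`, and the variant list is illustrative.

```typescript
// Hypothetical sketch: choose the first live-stream variant the terminal
// can play, mirroring <video>/<source> selection. Variant URLs are examples.
interface StreamVariant { url: string; type: string }

function pickSource(
  variants: StreamVariant[],
  canPlay: (type: string) => boolean
): StreamVariant | undefined {
  return variants.find(v => canPlay(v.type));
}

const variants: StreamVariant[] = [
  { url: "https://example.com/live.mpd",  type: "application/dash+xml" },         // MPEG-DASH
  { url: "https://example.com/live.m3u8", type: "application/vnd.apple.mpegurl" } // HLS
];
```

Listing DASH first expresses a preference while still falling back to HLS, matching the point that HLS is usable today and MPEG-DASH later.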
14:30:49 [sangwhan]
[ 30 minute break, resuming at 16:00 CET ]
14:30:52 [sangwhan]
rrsagent, draft minutes
14:30:52 [RRSAgent]
I have made the request to generate sangwhan
14:32:42 [ryosuke]
ryosuke has joined #signage
14:33:54 [kotakagi]
Present+ Koichi_Takagi
14:34:26 [sangwhan]
s/rrsagentd, draft minutes//
14:34:37 [sangwhan]
14:35:30 [tomoyuki_]
tomoyuki_ has joined #signage
14:47:32 [tomoyuki]
tomoyuki has joined #signage
14:58:37 [kotakagi]
kotakagi has joined #signage
14:59:37 [shinichi]
shinichi has joined #signage
15:00:10 [whyun]
whyun has joined #signage
15:03:44 [ryosuke]
ryosuke has joined #signage
15:05:18 [ryosuke]
subscribenick: ryosuke
15:05:21 [Kiyoshi__]
Kiyoshi__ has joined #signage
15:05:32 [Shinji]
Next session: What's next
15:05:37 [hiroki]
scribenick: ryosuke
15:05:44 [Shinji]
15:06:15 [Shinji]
...Making a new document
15:06:22 [ryosuke]
futomi: use cases and requirements will be finished today
15:06:52 [Shinji]
… we need to continue discussion
15:06:56 [hiroki]
s/Next session: What's next/Topic: What's next step
15:07:11 [ryosuke]
futomi: KDDI discussed map and mapping using SVG yesterday
15:07:39 [ryosuke]
... today we continue discussing this topic
15:08:03 [ryosuke]
Shinji: we need a milestone
15:08:15 [ryosuke]
toru: BG is temporaly group
15:08:21 [Shinji]
s/discussion/discussion use-case and requirement/
15:08:31 [ryosuke]
futomi: our BG is expired
15:10:02 [ryosuke]
toru: we should decide a goal for the WG
15:10:32 [shige]
shige has joined #signage
15:11:18 [Kiyoshi__]
15:11:36 [ryosuke]
futomi: we should propose concretely what should be next
15:11:58 [sangwhan]
rrsagent, draft minutes
15:11:58 [RRSAgent]
I have made the request to generate sangwhan
15:12:18 [Shinji]
futomi: We need consensus
15:12:21 [sangwhan]
s/temporaly/a temporary/
15:12:22 [ryosuke]
okamoto: trying to identify a new document
15:12:48 [sangwhan]
15:13:10 [ryosuke]
toru: I just want to know the goal of making a new document
15:13:38 [ryosuke]
futomi: what is toru's goal?
15:14:37 [ryosuke]
toru: today's goal is to tell our opinion to the MMI or DAP WG
15:14:49 [sangwhan]
15:14:57 [naomi]
scribenick: naomi
15:15:10 [naomi]
shige: let's clarify the definition of "goal"
15:15:22 [naomi]
... not necessary to talk to other WG
15:15:40 [naomi]
... you can identify solutions within this BG that will work
15:15:46 [ryosuke]
15:16:00 [sangwhan]
15:16:36 [naomi]
futomi: the purpose of BG is not clarified
15:16:42 [naomi]
... we can do anything in anyway
15:16:44 [sangwhan]
scribenick: sangwhan
15:16:55 [sangwhan]
futomi: If we want to do something, it should be possible to do so
15:17:05 [sangwhan]
… Do you all agree?
15:17:29 [sangwhan]
Shinji: Since a BG cannot standardize anything, we should think about talking with WGs
15:17:41 [sangwhan]
… to clear the usecases that are needed for our BG
15:17:50 [sangwhan]
… this should probably be the main goal for our group
15:18:08 [sangwhan]
… if you would like to propose another goal, we are open for that
15:18:38 [sangwhan]
… but let's aim to make a deliverable by coming summer
15:18:50 [naomi]
ack a1zu
15:19:00 [sangwhan]
Shigeo: Is standardization the goal for us?
15:19:14 [sangwhan]
… because my impression is that is not the case
15:19:20 [sangwhan]
Aizu: I have two comments
15:19:51 [sangwhan]
… Maybe BGs and CGs can write draft documents and submit it
15:19:56 [sangwhan]
… and then we can submit it to a WG
15:20:05 [sangwhan]
… for example Core Mobile and Responsive Images did so
15:20:28 [sangwhan]
Futomi: So we'll change the main focus to "communicating with WGs"
15:20:41 [sangwhan]
… the other activities we can do as well
15:21:19 [sangwhan]
Aizu: We would like W3C members to understand what exactly web based signage is
15:21:31 [sangwhan]
… and maybe create a prototype and demo during next TPAC
15:21:53 [sangwhan]
Futomi: I agree, we should try to make a demo during next TPAC
15:22:52 [sangwhan]
[ Notes that other activities mentioned are 1. Drafting a BP document 2. Continue to upgrade use cases 3. Discussion on map and mapping with SVG ]
15:23:16 [sangwhan]
Futomi: Which timeframe should we aim for the milestones?
15:23:38 [noriya]
noriya has joined #signage
15:23:57 [sangwhan]
… my opinion is that we should aim for summer, but Shinji mentioned that might be too late
15:24:08 [sangwhan]
Shigeo: Shouldn't the schedule be up to the members to decide?
15:24:32 [sangwhan]
Futomi: Yes
15:24:53 [sangwhan]
Wook: I agree with the point that we should have a deadline
15:25:06 [sangwhan]
… but I don't think it's something that we need to decide during this meeting
15:25:16 [sangwhan]
… we can probably discuss further on the mailing list
15:25:46 [naomi]
sangwhan: you can probably use the W3C polling system
15:25:49 [sangwhan]
ACTION: Futomi to setup a poll for the deadlines
15:25:58 [sangwhan]
rrsagent, draft minutes
15:25:58 [RRSAgent]
I have made the request to generate sangwhan
15:26:30 [naomi]
sangwhan: since this group is temporary and new
15:26:38 [naomi]
... all these minutes will be public one
15:26:49 [naomi]
... people start looking @1
15:27:55 [sangwhan]
Sunghan: I have a comment regarding the activities of the BG
15:28:39 [sangwhan]
… I would like to address the fact that the wiki page needs to be up-to-date with
15:28:49 [sangwhan]
… the current status of the BG's activities
15:29:04 [sangwhan]
s/sangwhan:  since this group is temporary and new/Sangwhan: Since this group will be temporary/
15:29:29 [sangwhan]
s/... all these minutes will be public one/... I believe that there should be a deliverable document for future reference/
15:29:41 [sangwhan]
Futomi: Good point
15:30:34 [sangwhan]
Shinji: I would like to still address that we should aim for a milestone coming spring
15:30:45 [sangwhan]
… we can still continue discussion after the milestone
15:31:23 [sangwhan]
[ Shinji presenting a gantt chart of the business group's activity timeline ]
15:31:43 [sangwhan]
Shinji: We had an AC meeting in May
15:32:13 [sangwhan]
… to continue the BG activity we really need to define a milestone for the activity report
15:32:27 [sangwhan]
… we need a schedule
15:32:56 [sangwhan]
Shigeo: Since there are new people in this group, I would like to ask you if there was a consensus
15:33:07 [sangwhan]
… from my understanding there is no charter
15:33:18 [sangwhan]
… and I don't think the group agreed to such a schedule
15:35:04 [sangwhan]
… although there was a year long plan agreed in the group
15:35:26 [sangwhan]
Futomi: I think we should have a milestone, but the point is that the deadline should not be concrete
15:36:25 [sangwhan]
… we can try to finish our document by April
15:36:44 [sangwhan]
… so, for the use cases and requirements when should we aim for?
15:37:14 [sangwhan]
Sunghan: I agree with your general idea, but we should probably take this discussion online like by using the wiki
15:37:51 [sangwhan]
Naomi: This is why Sangwhan said we should use the W3C poll system
15:38:09 [sangwhan]
Futomi: Ok, let's set this up after TPAC
15:39:03 [Alan]
Alan has joined #signage
15:39:04 [sangwhan]
… as for the current goals, does everyone agree?
15:39:55 [kotakagi]
Please see
15:39:59 [sangwhan]
[ KDDI presenting a proposal for upcoming activities ]
15:41:08 [sangwhan]
Toshi: We are talking with the system applications working group, and discussing collaboration with this group
15:41:24 [sangwhan]
… presenting the SysApps WG charter
15:42:01 [sangwhan]
[ For reference: ]
15:42:57 [sangwhan]
… system applications covers a large amount of requirements needed for signage
15:43:05 [sangwhan]
… but there are some things that are not being covered
15:43:41 [sangwhan]
… we have thought about Raw Sockets API, although this idea is not concrete
15:43:53 [sangwhan]
… we should also consider the security model when interacting with other devices
15:44:26 [sangwhan]
… a trusted application model is an absolute must for a signage use case
15:44:44 [sangwhan]
… this was discussed on the plenary session from the SysApps WG
15:45:10 [sangwhan]
s/plenary session from/plenary day breakout session from/
15:46:51 [sangwhan]
… so we should think about which use cases map to which WGs
15:46:56 [sangwhan]
… opinions?
15:47:01 [sangwhan]
Futomi: Good point
15:47:17 [tokamoto]
tokamoto has joined #signage
15:47:47 [sangwhan]
… I have been talking with the relevant working group chairs
15:52:39 [tokamoto]
tokamoto has joined #signage
15:56:27 [sangwhan]
Sangwhan: I believe this discussion is not something we should decide right now, and you have to be flexible for the milestone/deadlines due to the anture of W3C
15:56:38 [sangwhan]
s/anture of/nature of/
15:57:41 [skim13]
skim13 has left #signage
15:57:45 [sangwhan]
Futomi: As a side question, does everyone agree to continue this BG next year?
15:58:08 [sangwhan]
[ Consensus made ]
15:58:23 [sangwhan]
Futomi: OK, then we should discuss further online
15:58:29 [sangwhan]
… meeting adjourned
15:58:36 [sangwhan]
[ End of meeting ]
15:58:41 [sangwhan]
rrsagent, make log public
15:58:46 [sangwhan]
rrsagent, make minutes
15:58:46 [RRSAgent]
I have made the request to generate sangwhan
15:58:55 [sangwhan]
rrsagent, publish minutes
15:58:55 [RRSAgent]
I have made the request to generate sangwhan
15:59:01 [sangwhan]
zakim, bye
15:59:01 [Zakim]
Zakim has left #signage
15:59:13 [sangwhan]
rrsagent, bye
15:59:13 [RRSAgent]
I see 2 open action items saved in :
15:59:13 [RRSAgent]
ACTION: Futomi to fix XHR reference on use case wiki page [1]
15:59:13 [RRSAgent]
recorded in
15:59:13 [RRSAgent]
ACTION: Futomi to setup a poll for the deadlines [2]
15:59:13 [RRSAgent]
recorded in