W3C

EmotionML 1.0: Last Call Working Draft Disposition of Comments

10 May 2012

Editor
Marc Schröder, DFKI GmbH

Abstract

This document details the responses made by the Multimodal Interaction Working Group to issues raised during the Last Call Working Draft period (beginning 7 April 2011 and ending 7 June 2011). Comments were provided by other W3C Working Groups and the public via the www-multimodal@w3.org mailing list (archive).

Status

This document of the W3C's Multimodal Interaction Working Group describes the disposition of comments as of 10 May 2012 on the Last Call Working Draft of the Emotion Markup Language (EmotionML) 1.0. It may be updated, replaced or rendered obsolete by other W3C documents at any time.

For background on this work, please see the Multimodal Interaction Activity Statement.

Comment summary

Legend:

ACCEPTED Comment was accepted by the working group.
REJECTED Comment was rejected by the working group.
DEFERRED Comment was deferred to a future version of the specification.

Results:

ID Title Date Opened Last Updated Disposition Acceptance Related Issues
ISSUE-175 Emotion vocabularies should not use custom mechanism 2011-04-13 2011-07-20 DEFERRED EXPLICIT NONE
ISSUE-179 Integration with SSML 2011-04-26 2011-07-20 ACCEPTED IMPLICIT NONE
ISSUE-184 Accessibility use cases for EmotionML 2011-06-21 2011-10-26 ACCEPTED IMPLICIT NONE
ISSUE-185 Easier-to-use emotion markup using attributes 2011-06-21 2011-10-26 DEFERRED IMPLICIT NONE
ISSUE-186 Make explicit the relationship between different emotion vocabularies 2011-06-21 2011-10-26 DEFERRED IMPLICIT NONE
ISSUE-191 Wrong use of timestamps in EmotionML 2011-06-22 2011-10-26 ACCEPTED IMPLICIT NONE
ISSUE-192 Suggestion to use XML Schema's DateTime instead of milliseconds in EmotionML timestamps 2011-06-22 2011-10-26 REJECTED IMPLICIT NONE

Issue detail


ISSUE-175 - Emotion vocabularies should not use custom mechanism

Tracker (W3C Member only):

ISSUE-175

Opened: 2011-04-13

Last Updated: 2011-07-20 11:10

State: closed

Description:

Regarding the Emotion Markup Language (EmotionML) 1.0 (W3C Working Draft 7 April 2011): why don't you use namespaces for vocabulary terms? That is the more common approach, using QNames or even CURIEs.

So the example (just before section 2.2.2) would be:

<emotion xmlns:big6="http://www.w3.org/TR/emotion-voc/xml#big6">
    <category name="big6:sadness" value="0.3"/>
    <category name="big6:anger" value="0.8"/>
    <category name="big6:fear" value="0.3"/>
</emotion>

And, of course, that namespace can be defined at the top-level and reused elsewhere...
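
For reference, the mechanism the comment describes as custom is the vocabulary-URI approach of the Working Draft, in which the vocabulary is declared once in a category-set attribute and the name attribute carries a plain token. A rough sketch of the equivalent markup under the draft (not a verbatim quote from the specification):

<emotion category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
    <category name="sadness" value="0.3"/>
    <category name="anger" value="0.8"/>
    <category name="fear" value="0.3"/>
</emotion>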

Related e-mails:


ISSUE-179 - Integration with SSML

Tracker (W3C Member only):

ISSUE-179

Opened: 2011-04-26

Last Updated: 2011-07-20 11:14

State: closed

Description:

Dear W3C MMI WG (and in particular the Emotion subgroup),

We reviewed the EmotionML specification during last week's Voice Browser Working Group call. There was only one comment, and it was about your recommendations on how to integrate with SSML. Specifically, the concern was that your second option, suggesting a new <style> element in SSML, might be misinterpreted by readers as being recommended or endorsed by the creators of SSML as the intended way such information can be used in the future. We believe that even clarifying that this is not recommended or endorsed for SSML would be insufficient to deter implementers who might be tempted to add a new <style> element to their SSML implementations solely because it is described in your standards document.

We strongly request, in section 5.2.2, that you completely remove the text and examples beginning with "Second, a future version of SSML". You would then likely need to adjust the preceding text in 5.2.2 to reflect that you now only describe one option.

As an aside, we happen to agree very strongly with that first option -- it is a scoped, easily understandable, and backwards-compatible mechanism for adding emotion information into SSML.

Please let us know if you have any questions. We would be happy to join you for discussion if necessary.

Dan Burnett, Chair

Voice Browser Working Group
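
The first option referred to above is not quoted in this comment. Purely as an illustration, and assuming it means embedding namespaced EmotionML elements directly into SSML content (an assumption, not a statement of what section 5.2.2 actually specifies), such markup might look roughly like:

<speak version="1.1" xmlns="http://www.w3.org/2001/10/synthesis"
       xmlns:emo="http://www.w3.org/2009/10/emotionml" xml:lang="en-US">
    <s>
        <emo:emotion category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
            <emo:category name="sadness"/>
        </emo:emotion>
        Do you want me to tell you more about it?
    </s>
</speak>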

Related e-mails:


ISSUE-184 - Accessibility use cases for EmotionML

Tracker (W3C Member only):

ISSUE-184

Opened: 2011-06-21

Last Updated: 2011-11-30 10:07

State: closed

Description:

From the feedback on the EmotionML LCWD by WAI-PF (http://lists.w3.org/Archives/Public/www-multimodal/2011Jun/0004.html):

1. Use cases.

It would help people understand the accessibility potential for Emotion ML if some specific accessibility use cases could be included. Although there might be some overlap with existing use cases, the author/user requirements are quite distinct. Some possible use cases might be:

- Emotion ML is used for media transcripts and captions. Where emotions are marked up to help deaf or hearing impaired people who cannot hear the soundtrack, more information is made available to enrich their experience of the content.

- Emotion ML is used for content that's translated into synthetic speech. This would make more information available to blind and partially sighted people, and enrich their experience of the content.

- Emotion ML is used to make the emotional intent of content explicit. This would enable people with learning disabilities (such as Asperger's Syndrome) to realise the emotional context of the content.
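
As a purely hypothetical sketch of the first use case (the host caption format and its timing attributes are invented for illustration; only the emo:* markup follows EmotionML):

<caption start="00:01:02.000" end="00:01:04.500"
         xmlns:emo="http://www.w3.org/2009/10/emotionml">
    <emo:emotion category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
        <emo:category name="fear" value="0.7"/>
    </emo:emotion>
    [door slams] Who's there?
</caption>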

Related e-mails:


ISSUE-185 - Easier-to-use emotion markup using attributes

Tracker (W3C Member only):

ISSUE-185

Opened: 2011-06-21

Last Updated: 2011-11-30 10:08

State: closed

Description:

From the feedback on the EmotionML LCWD by WAI-PF (http://lists.w3.org/Archives/Public/www-multimodal/2011Jun/0004.html):

2. Ease of use.

It's often easier to encourage people to think about accessibility if the author requirements are as minimal as possible. Adding accessibility into some of the examples provided within the specification would make the code quite verbose, and would therefore add a burden onto the developer.

One possible solution might be to change some of the Emotion ML tags into attributes that could be applied to any element. For example:

<span emotion-category="irritation">Well of course, you would do that!</span>

Could be used instead of:

<emotion>
    <category name="irritation"/>
    <span>Well of course, you would do that!</span>
</emotion>

Another example (using SSML) might be:

<s>
    <emo:emotion>
        <emo:category name="doubt"/>
        <emo:intensity value="0.4"/>
    </emo:emotion>
    Do you need help?
</s>

Could become:

<s em-category-set="http://www.example.com/emotion/category/everyday-emotions.xml"
   em-category="doubt" em-intensity="0.4">
    Do you need help?
</s>


Or alternatively could become:

<s>
    <emo:emotion em-category-set="http://www.example.com/emotion/category/everyday-emotions.xml"
                 em-category="doubt" em-intensity="0.4">
        Do you need help?
    </emo:emotion>
</s>

It might also be helpful to change the names of the attributes, to clarify the fact that they relate to emotions. For example, the category attribute could become emotion-category or even em-category (although this is slightly less clear). ARIA, for example, uses this approach to good effect.
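
For comparison, the Working Draft's own way of tying an emotion annotation to a piece of content without nesting it inside the host markup is the <reference> element, roughly as sketched below (the document URI and fragment identifier are invented for illustration):

<emotion xmlns="http://www.w3.org/2009/10/emotionml"
         category-set="http://www.example.com/emotion/category/everyday-emotions.xml">
    <category name="irritation"/>
    <reference uri="myDocument.html#utterance1"/>
</emotion>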

Related e-mails:


ISSUE-186 - Make explicit the relationship between different emotion vocabularies

Tracker (W3C Member only):

ISSUE-186

Opened: 2011-06-21

Last Updated: 2011-11-30 10:09

State: closed

Description:

From the feedback on the EmotionML LCWD by WAI-PF (http://lists.w3.org/Archives/Public/www-multimodal/2011Jun/0004.html):

3. New vocabularies and extensions.

Emotion ML indicates that user defined custom vocabularies do not need to relate to existing vocabularies (although redundancy should be avoided). To some extent this could put the interoperability of the specification at risk.

One solution might be to create a requirement that user defined custom vocabularies make their relationship with existing vocabularies explicit. For example, if you wanted to define a new term "contentment", the author of the custom vocabulary would need to say something like:

contentment is a type of happiness
contentment overlaps with satisfaction by 80%
contentment overlaps with relaxed by 70%
contentment excludes anger
contentment excludes excitement by 90%

Our definition of contentment might not be exactly right, but the aim is to make the term machine understandable (and hence interoperable).

This suggestion isn't something that would be necessary at this stage in the lifecycle of the specification. It may be something for consideration in the future though.
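
A purely illustrative sketch of how such relations might be attached to items of a custom vocabulary, roughly following the draft's vocabulary definition format (the <relates-to> element and its attributes are invented here to mirror the commenter's wording; they are not part of EmotionML or its vocabulary format):

<vocabulary type="category" id="my-emotions">
    <item name="contentment">
        <relates-to name="happiness" relation="is-a"/>
        <relates-to name="satisfaction" relation="overlaps" degree="0.8"/>
        <relates-to name="relaxed" relation="overlaps" degree="0.7"/>
        <relates-to name="anger" relation="excludes"/>
        <relates-to name="excitement" relation="excludes" degree="0.9"/>
    </item>
</vocabulary>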

Related e-mails:


ISSUE-191 - Wrong use of timestamps in EmotionML

Tracker (W3C Member only):

ISSUE-191

Opened: 2011-06-22

Last Updated: 2011-11-30 10:10

State: closed

Description:

Issue from comment in http://lists.w3.org/Archives/Public/www-multimodal/2011Jun/0007.html :

In section 2.4.2.1 (Timestamps - Absolute time) the definition says that the attributes "start" and "end" indicate the number of milliseconds since 1970-01-01 00:00:00, but the example below it appears to use a normal Unix timestamp in seconds (1268647200 = 2010-03-15 10:00:00, a moment during the definition of EmotionML). The same usage appears in the example of section 2.4.2.2 (Duration). That a Unix timestamp in seconds is meant is also shown by section 5.1.2 (Automatic recognition of emotions), which says "23 November 2001 from 14:36 onwards (absolute start time is 1006526160 milliseconds since 1 January 1970 00:00:00 GMT)"; "1006526160 seconds" would be correct there.

Related e-mails:


ISSUE-192 - Suggestion to use XML Schema's DateTime instead of milliseconds in EmotionML timestamps

Tracker (W3C Member only):

ISSUE-192

Opened: 2011-06-22

Last Updated: 2011-11-30 10:11

State: closed

Description:

Issue from comment in http://lists.w3.org/Archives/Public/www-multimodal/2011Jun/0007.html :

With a Unix timestamp, or a timestamp defined as xsd:nonNegativeInteger, no moments before 1970 can be expressed. This also means that no moments before Christ can be used, so that, for example, "emotional diaries" of poets like Friedrich Schiller or Gaius Iulius Caesar cannot be annotated with their real time.

Possible solution
-----------------
I suggest using xsd:dateTime (http://www.w3.org/TR/xmlschema-2/#dateTime) instead of xsd:nonNegativeInteger for the attributes start and end of <emotion>. This would also make it possible to annotate dates before 1970 and before Christ, as well as fractional seconds.
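
A sketch of what the commenter's proposal would look like in markup (illustrative only; the date values are invented, and this change was not adopted by the specification):

<emotion category-set="http://www.w3.org/TR/emotion-voc/xml#big6"
         start="1788-05-09T14:36:00Z" end="1788-05-09T14:36:02.5Z">
    <category name="sadness"/>
</emotion>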

Related e-mails: