W3C

EmotionML 1.0:
Implementation Report

1 Dec 2012

Contributors:
Felix Burkhardt, Deutsche Telekom AG (Editor in chief)
Marc Schröder (until 2012 while at DFKI GmbH)
Paolo Baggia (while at Loquendo, currently Nuance Communications)

Table of Contents

  1. Introduction
     1.1 Implementation report objectives
     1.2 Implementation report non-objectives
  2. Work during the Candidate Recommendation period
  3. Participating in the implementation report
  4. Entrance criteria for the Proposed Recommendation phase
  5. Implementation report requirements
     5.1 Detailed requirements for implementation report
     5.2 Notes on testing
     5.3 Out of scope
  6. Systems
  7. Test assertions
     7.1 Classification of test assertions
     7.2 EmotionML XML Schema conformance
     7.3 EmotionML Test assertions
  Appendices

1. Introduction

The EmotionML Specification entered the Candidate Recommendation period on 10 May 2012.

The planned date for entering Proposed Recommendation is XX December 2012. This document summarizes the results from the EmotionML implementation reports received and describes the process that the Multimodal Working Group followed in preparing the report.

An implementation report must indicate the outcome of evaluating the implementation with respect to each of the test assertions. Possible outcomes are "pass", "fail" or "not-impl". Criteria for Producers and Consumers of EmotionML for determining the outcome of each test assertion are defined as follows.

If a consumer:

If a producer:

1.1 Implementation report objectives

1.2 Implementation report non-objectives

2. Work during the Candidate Recommendation period

During the CR period, the Working Group will carry out the following activities:

  1. Clarification and improvement of the exposition of the specification (http://www.w3.org/TR/2012/CR-emotionml-20120510/).
  2. Disposition of comments communicated to the WG during the CR period.
  3. Preparation of this Implementation Report.

3. Participating in the implementation report

Implementers were invited to contribute to the assessment of the W3C EmotionML 1.0 Specification by submitting implementation reports describing the coverage of their EmotionML implementations with respect to the test assertions outlined in the table below.

Implementation reports, comments on this document, and requests for further information were posted to the Working Group's public mailing list www-multimodal@w3.org (archive).

4. Entrance criteria for the Proposed Recommendation phase

The Multimodal Working Group established the following entrance criteria for the Proposed Recommendation phase in the Request for CR:

  1. Sufficient reports of implementation experience have been gathered to demonstrate that EmotionML processors based on the specification are implementable and have compatible behavior.
  2. Specific Implementation Report Requirements (outlined below) have been met.
  3. The Working Group has formally addressed and responded to all public comments received by the Working Group.

All three of these criteria have been met. A total of 9 implementations were received from 9 different companies and universities. The testimonials below indicate the broad base of support for the specification. All of the required features of EmotionML had at least two implementations, many had eight or nine implementations. All of the optional features received at least two implementations.

5. Implementation report requirements

5.1 Detailed requirements for implementation report

  1. Testimonials from implementers will be included in the IR when provided to document the utility and implementability of the specification.
  2. IR must cover all specified features in the specification. For each feature the IR should indicate:
  3. Feature status is a factor in test coverage in the report:

5.2 Notes on testing

5.3 Out of scope

The EmotionML Implementation Report will not cover:

6. Systems

This section contains testimonials on EmotionML from the 9 companies and universities that submitted EmotionML implementation reports.

Gerhard Fobe, Univ. of Chemnitz

Exec Summary

Gerhard Fobe is happy to provide a library for C#, released under the FreeBSD license, that supports developers working with EmotionML. With the help of the integrated EmotionML parser, related object instances can be created automatically. Object instances can also be converted back to EmotionML (in DOM and XML mode). Besides standalone EmotionML documents, the plug-in version for the inclusion of emotions in other languages is supported.

Marc Schröder, DFKI

Exec Summary

DFKI GmbH is very happy to add support for EmotionML to its expressive speech synthesis system MARY TTS. A standard format for representing the expressivity to be expressed in speech has been missing, so it is good to have it available now. Particularly the support for both categorical and dimensional representations of emotions is important to our system.

Patrick Gebhart, DFKI

Exec Summary

The W3C EmotionML 1.0 Specification is now supported by the ALMA software. ALMA relies on EmotionML for describing the appraisal of emotions, the numeric simulation of emotion intensity decay, and mood characteristic changes within the PAD dimensions.

Felix Burkhardt, DTAG

Exec Summary

Deutsche Telekom Innovation Laboratories is happy to support EmotionML and to deploy it as a standard format in its developments concerning emotion processing services. Because we work primarily as a system integrator, standardized interfaces are extremely important for plugging together components from different providers.

Christian Becker-Asano, Univ. of Freiburg

Exec Summary

Dr. C. Becker-Asano of the Foundations of Artificial Intelligence lab at the Department of Computer Science, Freiburg University, is happy to support EmotionML and will use it as a standard format in his developments concerning emotion simulation services. To let the WASABI architecture seamlessly interface with other software modules, a standard interface such as EmotionML is indispensable.

Tim Llewellin, nViso

Exec Summary

nViso 3D Facial Imaging API is an online service for the recognition of emotions depicted through facial expressions in still images and videos. The focus of this implementation of EmotionML is on the media type and URI timing features for video, which are used in the service. Key features of the API that support EmotionML include defining our own category of emotion keywords and linking emotion measurements to the media analyzed. As the report was produced as part of a commercial implementation, some parts of the specification were not deemed appropriate for our service and so were not implemented. More information about our service can be found here: www.nviso.ch
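
For illustration only (the vocabulary items, video URL, time values, and confidence below are invented for this sketch and are not taken from the nViso service), an annotation that links a recognized emotion to a time interval of an analyzed video could look like this, using the W3C Media Fragments temporal syntax (#t=start,end) in the reference URI:

  <emotionml version="1.0" xmlns="http://www.w3.org/2009/10/emotionml">
    <vocabulary type="category" id="face-categories">
      <item name="surprise"/>
      <item name="delight"/>
    </vocabulary>
    <emotion category-set="#face-categories">
      <!-- "value" and "confidence" are illustrative numbers in the closed interval [0, 1] -->
      <category name="surprise" value="0.7" confidence="0.8"/>
      <!-- "media-type" gives the MIME type of the referenced medium; the fragment
           #t=12.5,14.0 points at the analyzed time interval of the video -->
      <reference uri="http://www.example.com/clip.mp4#t=12.5,14.0" media-type="video/mp4"/>
    </emotion>
  </emotionml>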

Edmon Begoli, Oak Ridge National Laboratory

Exec Summary

EMLPy is a Python-based library for generating EmotionML-compliant documents.

This is the second iteration of the EMLPy Python library, which is intended as a utility for generating EmotionML-compliant XML documents. EMLPy generates EmotionML documents by transforming a user-specified and populated Python object tree into an XML representation. EMLPy performs the EmotionML checks covered in the assertions while executing this object-to-XML transformation. From an API perspective, the user interacts with an object tree hierarchy that maps directly to the EmotionML hierarchy of elements and attributes. The user generates an XML document by invoking the .to_xml() function; at that point EMLPy validates the object tree and its properties against the EmotionML schema and specification rules. There are a few cases that require special consideration and handling:

The value of the "media-type", if present, MUST be a valid MIME type - we check the MIME types against the http://www.iana.org/assignments/media-types. This is a live check that performs URI resolution for types and page search for sub-types. We pursued this approach in attempt to dynamically validate MIME types against the authoritative repository.

xsd:anyURI types - anyURI checks are syntactic, not semantic. Checks are performed against common URI patterns - http, ftp, the structure of IP addresses, proper domain names, etc. We do not check the actual live URIs to verify that the content is there.

Vocabularies - we perform vocabulary checks for those vocabularies defined within the EmotionML instance being created. If a dimension, category, appraisal, or action-tendency set points to an external URL, we do not fail the generation of the document when the URL defining the set is not valid; we only issue a warning. We have chosen this approach as aligned with Python's dynamic typing philosophy.

Project URL: https://github.com/ebegoli/EMLPy
Implementation team: Edmon Begoli, begolie@ornl.gov, Oak Ridge National Laboratory / University of Tennessee-Knoxville; Chelsey Dunnivan, ckd002@morningside.edu, Morningside University

Roddy Cowie, Queens Univ. Belfast

Exec Summary

The HUMAINE centre (School of Psychology, Queen’s University Belfast) welcomes EmotionML and will deploy it as a standard format in its work. That primarily involves use case 1, manual annotation of material involving emotionality, such as annotation of videos, of speech recordings, and of faces, in naturalistic databases. As psychologists, we consider it important to have a format that we can use, which allows us to use psychologically well-founded descriptors and to know that consumers in the engineering community will be able both to access and to understand them. The system currently implements tracing for category and dimensional descriptors. We will extend that to appraisal and action tendency descriptors if research confirms that they can be traced reliably, but at present that is not clear.

Abe Kazemzadeh, USC SAIL Lab

Exec Summary

Emotion twenty questions (EMO20Q) is an experimental framework for studying how people describe emotions in language and how computers can simulate this type of verbal behavior. In EMO20Q, the familiar spoken parlor game of twenty questions is restricted to words that players feel refer to emotions.

In this implementation report, we examine the case where a server-side computer agent plays the role of the questioner and a human plays the answerer via a web browser.

The EMO20Q questioner agent can be decomposed into several components, notably a vocabulary, semantic knowledge, an episodic buffer, and a belief state. The vocabulary is a list of 110 emotion words; it is expected to grow over time as more data is collected, but remains constant during the agent's instantiation. Semantic knowledge is a large object that remains the same across different agent instantiations and states, while the episodic buffer and belief state are smaller objects that vary over time for each interactive session. Because of the size of the semantic knowledge object, serialization of the agent for each session is not possible. Rather, the episodic buffer and belief states are serialized while the semantic knowledge persists as a static object in the server memory. The belief state is represented as a probability vector indexed by items of the vocabulary.

EmotionML is used to implement the questioner agent's vocabulary and belief state. The agent's vocabulary is implemented using the EmotionML <vocabulary> element, and the agent's belief state is represented using the <emotion> element with <category> children whose "value" attributes hold numerical probability values.
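
As an illustration (the emotion words and probability values below are invented; the actual EMO20Q vocabulary contains 110 items), a belief state over three candidate emotion words could be serialized roughly as follows:

  <emotionml version="1.0" xmlns="http://www.w3.org/2009/10/emotionml">
    <!-- the agent's vocabulary, defined inline and referenced by its ID -->
    <vocabulary type="category" id="emo20q-vocabulary">
      <item name="happiness"/>
      <item name="relief"/>
      <item name="boredom"/>
    </vocabulary>
    <!-- the belief state: one probability per vocabulary item, held in "value" -->
    <emotion category-set="#emo20q-vocabulary">
      <category name="happiness" value="0.5"/>
      <category name="relief" value="0.3"/>
      <category name="boredom" value="0.2"/>
    </emotion>
  </emotionml>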

The only thing that produced a failure to validate was having a space in the name of a vocabulary item.

7. Test assertions

The aim of this section is to describe the range of test assertions developed for the EmotionML 1.0 Specification. The table lists all the assertions that were derived from the EmotionML 1.0 Specification.

The Assert ID column uniquely identifies the assertion. The Feature column indicates the specific elements or attributes which the test assertion applies to. The Spec column identifies the section of the EmotionML 1.0 Specification from which the assertion was derived. The REQ column is a Y/N value indicating whether the test assertion is for a feature which is required. The SUB column is a Y/N value indicating whether the test assertion is a subconstraint which is dependent on the implementation of the preceding non subconstraint feature test assertion. The Semantics column specifies the semantics of the feature or the constraint which must be met. The Result column will be annotated with the number of 'pass', 'fail', and 'not implemented' (P/F/NI) in the set of implementation reports.

7.1 Classification of test assertions

Test assertions are classified into two types: basic test assertions, which test for the presence of each feature, and sub constraints, which only apply if that particular feature is implemented. Generally, sub constraints encode structural constraints that could not be expressed in the EmotionML schema. Sub constraints are marked with 'SUB CONSTRAINT:' in the Semantics field.
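
For example (an invented fragment, not taken from any submitted report), the following document is schema-valid but violates sub constraint 212, because the category name is not contained in the declared category vocabulary:

  <emotionml version="1.0" xmlns="http://www.w3.org/2009/10/emotionml">
    <vocabulary type="category" id="my-categories">
      <item name="pleased"/>
    </vocabulary>
    <!-- schema-valid, but "angry" is not an <item> of #my-categories,
         so sub constraint 212 is violated -->
    <emotion category-set="#my-categories">
      <category name="angry"/>
    </emotion>
  </emotionml>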

7.2 EmotionML XML Schema conformance

The most fundamental test of a conforming EmotionML implementation is that the EmotionML documents it utilizes must successfully validate with respect to the EmotionML XML Schema.
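
A minimal sketch of a document that passes this test (assuming the "big6" category vocabulary from the W3C "Vocabularies for EmotionML" document; the chosen category and value are illustrative):

  <emotionml version="1.0" xmlns="http://www.w3.org/2009/10/emotionml"
             category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
    <!-- "happiness" is an item of the referenced big6 category vocabulary -->
    <emotion>
      <category name="happiness" value="0.9"/>
    </emotion>
  </emotionml>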

7.3 EmotionML Test assertions

Assert ID Feature Spec Req Sub Semantics Results (P/F/NI)
Document structure
100 [2.1.1] Y N All EmotionML documents must validate against the XML schema. 9/0/0
101 emotionml [2.1.1] Y N The root element of standalone EmotionML documents MUST be <emotionml>. 9/0/0
102 emotionml [2.1.1] Y N The <emotionml> element MUST define the EmotionML namespace: "http://www.w3.org/2009/10/emotionml". 9/0/0
103 emotion [2.1.1] N N The <emotionml> element MAY contain one or more <emotion> elements. 9/0/0
104 vocabulary [2.1.1] N N The <emotionml> element MAY contain one or more <vocabulary> elements. 7/0/2
105 info [2.1.1] N N The <emotionml> element MAY contain a single <info> element. 5/0/4
110 version [2.1.1] Y N The root element of a standalone EmotionML document MUST have an attribute "version". 9/0/0
111 version [2.1.1] Y N The "version" attribute of <emotionml> MUST have the value "1.0". 9/0/0
112 category-set [2.1.1] N N The <emotionml> element MAY contain an attribute "category-set". 8/0/1
113 category-set [2.1.1] Y N The "category-set" attribute of <emotionml>, if present, MUST be of type xsd:anyURI. 8/0/1
114 [2.1.1] Y Y SUB CONSTRAINT: The "category-set" attribute of <emotionml>, if present, MUST refer to the ID of a <vocabulary> element with type="category". 8/0/1
115 dimension-set [2.1.1] N N The <emotionml> element MAY contain an attribute "dimension-set". 6/0/3
116 dimension-set [2.1.1] Y N The "dimension-set" attribute of <emotionml>, if present, MUST be of type xsd:anyURI. 6/0/3
117 [2.1.1] Y Y SUB CONSTRAINT: The "dimension-set" attribute of <emotionml>, if present, MUST refer to the ID of a <vocabulary> element with type="dimension". 6/0/3
118 appraisal-set [2.1.1] N N The <emotionml> element MAY contain an attribute "appraisal-set". 4/0/5
119 appraisal-set [2.1.1] Y N The "appraisal-set" attribute of <emotionml>, if present, MUST be of type xsd:anyURI. 4/0/5
120 [2.1.1] Y Y SUB CONSTRAINT: The "appraisal-set" attribute of <emotionml>, if present, MUST refer to the ID of a <vocabulary> element with type="appraisal". 4/0/5
121 action-tendency-set [2.1.1] N N The <emotionml> element MAY contain an attribute "action-tendency-set". 4/0/5
122 action-tendency-set [2.1.1] Y N The "action-tendency-set" attribute of <emotionml>, if present, MUST be of type xsd:anyURI. 4/0/5
123 [2.1.1] Y Y SUB CONSTRAINT: The "action-tendency-set" attribute of <emotionml>, if present, MUST refer to the ID of a <vocabulary> element with type="action-tendency". 4/0/5
124 emotionml [2.1.1] N N The <emotionml> element MAY contain arbitrary plain text. 4/0/5
150 category [2.1.2] N N The <emotion> element MAY contain one or more <category> elements. 8/0/1
151 dimension [2.1.2] N N The <emotion> element MAY contain one or more <dimension> elements. 7/0/2
152 appraisal [2.1.2] N N The <emotion> element MAY contain one or more <appraisal> elements. 4/0/5
153 action-tendency [2.1.2] N N The <emotion> element MAY contain one or more <action-tendency> elements. 4/0/5
154 reference [2.1.2] N N The <emotion> element MAY contain one or more <reference> elements. 5/0/4
155 info [2.1.2] N N The <emotion> element MAY contain a single <info> element. 4/0/5
156 emotion [2.1.2] Y N The <emotion> element MUST contain at least one <category> or <dimension> or <appraisal> or <action-tendency> element. 9/0/0
157 emotion [2.1.2] N N The allowed child elements of <emotion> MAY occur in any order. 9/0/0
158 emotion [2.1.2] N N The allowed child elements of <emotion> MAY occur in any combination. 9/0/0
159 category-set [2.1.2] N N The <emotion> element MAY contain an attribute "category-set". 7/0/2
160 category-set [2.1.2] Y N The "category-set" attribute of <emotion>, if present, MUST be of type xsd:anyURI. 8/0/1
161 [2.1.2] Y Y SUB CONSTRAINT: The "category-set" attribute of <emotion>, if present, MUST refer to the ID of a <vocabulary> element with type="category". 8/0/1
162 dimension-set [2.1.2] N N The <emotion> element MAY contain an attribute "dimension-set". 7/0/2
163 dimension-set [2.1.2] Y N The "dimension-set" attribute of <emotion>, if present, MUST be of type xsd:anyURI. 7/0/2
164 [2.1.2] Y Y SUB CONSTRAINT: The "dimension-set" attribute of <emotion>, if present, MUST refer to the ID of a <vocabulary> element with type="dimension". 7/0/2
165 appraisal-set [2.1.2] N N The <emotion> element MAY contain an attribute "appraisal-set". 4/0/5
166 appraisal-set [2.1.2] Y N The "appraisal-set" attribute of <emotion>, if present, MUST be of type xsd:anyURI. 4/0/5
167 [2.1.2] Y Y SUB CONSTRAINT: The "appraisal-set" attribute of <emotion>, if present, MUST refer to the ID of a <vocabulary> element with type="appraisal". 4/0/5
168 action-tendency-set [2.1.2] N N The <emotion> element MAY contain an attribute "action-tendency-set". 4/0/5
169 action-tendency-set [2.1.2] Y N The "action-tendency-set" attribute of <emotion>, if present, MUST be of type xsd:anyURI. 4/0/5
170 [2.1.2] Y Y SUB CONSTRAINT: The "action-tendency-set" attribute of <emotion>, if present, MUST refer to the ID of a <vocabulary> element with type="action-tendency". 4/0/5
171 version [2.1.2] N N The <emotion> element MAY have an attribute "version". 6/0/3
172 version [2.1.2] Y N The "version" attribute of <emotion>, if present, MUST have the value "1.0". 6/0/3
173 id [2.1.2] N N The <emotion> element MAY contain an attribute "id". 3/0/6
174 id [2.1.2] Y N The "id" attribute of <emotion>, if present, MUST be of type xsd:ID. 2/0/6
175 start [2.1.2] N N The <emotion> element MAY have an attribute "start". 3/0/6
176 end [2.1.2] N N The <emotion> element MAY have an attribute "end". 3/0/6
177 duration [2.1.2] N N The <emotion> element MAY have an attribute "duration". 3/0/6
178 time-ref-uri [2.1.2] N N The <emotion> element MAY have an attribute "time-ref-uri". 2/0/7
179 time-ref-anchor-point [2.1.2] N N The <emotion> element MAY have an attribute "time-ref-anchor-point". 2/0/7
180 offset-to-start [2.1.2] N N The <emotion> element MAY have an attribute "offset-to-start". 2/0/7
181 expressed-through [2.1.2] N N The <emotion> element MAY have an attribute "expressed-through". 2/0/7
182 emotion [2.1.2] N N The <emotion> element MAY contain arbitrary plain text. 5/0/4
Representations of emotions and related states
210 category [2.2.1] Y N If the <category> element is used, a category vocabulary MUST be declared using a "category-set" attribute on either the enclosing <emotion> element or the root element <emotionml>. 8/0/1
211 name [2.2.1] Y N A category element MUST contain a "name" attribute. 8/0/1
212 [2.2.1] Y Y SUB CONSTRAINT: The value of the "name" attribute of the <category> element MUST be contained in the declared category vocabulary. If both the <emotionml> and the <emotion> element has a "category-set" attribute, then the <emotion> element's attribute defines the declared category vocabulary. 8/0/1
213 name [2.2.1] Y N For any given category name in the set, zero or one occurrence is allowed within an <emotion> element, i.e. a category with name "x" MUST NOT appear twice in one <emotion> element. 8/0/1
214 value [2.2.1] N N A <category> MAY contain a "value" attribute. 6/0/3
215 trace [2.2.1] N N A <category> MAY contain a <trace> element. 4/0/5
216 value / trace [2.2.1] Y N A <category> MUST NOT contain both a "value" attribute and a <trace> element. 6/0/3
217 confidence [2.2.1] N N A <category> element MAY contain a "confidence" attribute. 4/0/5
220 dimension [2.2.2] Y N If the <dimension> element is used, a dimension vocabulary MUST be declared using a "dimension-set" attribute on either the enclosing <emotion> element or the root element <emotionml>. 7/0/2
221 name [2.2.2] Y N A <dimension> element MUST contain a "name" attribute. 7/0/2
222 [2.2.2] Y Y SUB CONSTRAINT: The value of the "name" attribute of the <dimension> element MUST be contained in the declared dimension vocabulary. If both the <emotionml> and the <emotion> element has a "dimension-set" attribute, then the <emotion> element's attribute defines the declared dimension vocabulary. 7/0/2
223 name [2.2.2] Y N For any given dimension name in the set, zero or one occurrence is allowed within an <emotion> element, i.e. a dimension with name "x" MUST NOT appear twice in one <emotion> element. 7/0/2
224 value / trace [2.2.2] Y N A <dimension> MUST contain either a "value" attribute or a <trace> element. 7/0/2
225 confidence [2.2.2] N N A <dimension> element MAY contain a "confidence" attribute. 4/0/5
230 appraisal [2.2.3] Y N If the <appraisal> element is used, an appraisal vocabulary MUST be declared using an "appraisal-set" attribute on either the enclosing <emotion> element or the root element <emotionml>. 4/0/5
231 name [2.2.3] Y N An <appraisal> element MUST contain the "name" attribute. 4/0/5
232 [2.2.3] Y Y SUB CONSTRAINT: The value of the "name" attribute of the <appraisal> element MUST be contained in the declared appraisal vocabulary. If both the <emotionml> and the <emotion> element has an "appraisal-set" attribute, then the <emotion> element's attribute defines the declared appraisal vocabulary. 4/0/5
233 name [2.2.3] Y N For any given appraisal name in the set, zero or one occurrence is allowed within an <emotion> element, i.e. an appraisal with name "x" MUST NOT appear twice in one <emotion> element. 4/0/5
234 value [2.2.3] N N An <appraisal> element MAY contain a "value" attribute. 4/0/5
235 trace [2.2.3] N N An <appraisal> element MAY contain a <trace> element. 3/0/6
236 value / trace [2.2.3] Y N An <appraisal> element MUST NOT contain both a "value" attribute and a <trace> element. 3/0/6
237 confidence [2.2.3] N N An <appraisal> element MAY contain a "confidence" attribute. 4/0/5
240 action-tendency [2.2.4] Y N If the <action-tendency> element is used, an action tendency vocabulary MUST be declared using an "action-tendency-set" attribute on either the enclosing <emotion> element or the root element <emotionml>. 4/0/5
241 name [2.2.4] Y N An <action-tendency> element MUST contain the "name" attribute. 4/0/5
242 [2.2.4] Y Y SUB CONSTRAINT: The value of the "name" attribute of the <action-tendency> element MUST be contained in the declared action tendency vocabulary. If both the <emotionml> and the <emotion> element has an "action-tendency-set" attribute, then the <emotion> element's attribute defines the declared action tendency vocabulary. 4/0/5
243 name [2.2.4] Y N For any given action tendency name in the set, zero or one occurrence is allowed within an <emotion> element, i.e. an action tendency with name "x" MUST NOT appear twice in one <emotion> element. 4/0/5
244 value [2.2.4] N N An <action-tendency> element MAY contain a "value" attribute. 4/0/5
245 trace [2.2.4] N N An <action-tendency> element MAY contain a <trace> element. 3/0/6
246 value / trace [2.2.4] Y N An <action-tendency> element MUST NOT contain both a "value" attribute and a <trace> element. 3/0/6
247 confidence [2.2.4] N N An <action-tendency> element MAY contain a "confidence" attribute. 4/0/5
Meta-information
300 confidence [2.3.1] Y N The value of the "confidence" attribute MUST be a floating point number in the closed interval [0, 1]. 4/0/5
301 expressed-through [2.3.2] Y N The attribute "expressed-through" of the <emotion> element, if present, MUST be of type xsd:nmtokens. 4/0/5
302 info [2.3.3] N N The <info> element MAY contain any elements with a namespace different from the EmotionML namespace, "http://www.w3.org/2009/10/emotionml". 4/0/5
303 info [2.3.3] N N The <info> element MAY contain arbitrary plain text. 5/0/4
304 info [2.3.3] Y N The <info> element MUST NOT contain any elements in the EmotionML namespace, "http://www.w3.org/2009/10/emotionml". 4/0/5
305 id [2.3.3] N N The <info> element MAY contain an attribute "id". 4/0/5
306 id [2.3.3] Y N The "id" attribute of the <info> element, if present, MUST be of type xsd:ID. 4/0/5
References and time
410 uri [2.4.1] Y N The <reference> element MUST contain a "uri" attribute. 6/0/3
411 uri [2.4.1] Y N The "uri" attribute of <reference> MUST be of type xsd:anyURI. 6/0/3
412 [2.4.1] N Y SUB CONSTRAINT: The URI in the "uri" attribute of a <reference> element MAY be extended by a media fragment. 3/0/6
413 role [2.4.1] N N The <reference> element MAY contain a "role" attribute. 5/0/4
414 role [2.4.1] Y N The value of the "role" attribute of the <reference> element, if present, MUST be one of "expressedBy", "experiencedBy", "triggeredBy", "targetedAt". 5/0/4
415 media-type [2.4.1] N N The <reference> element MAY contain a "media-type" attribute. 2/0/7
416 media-type [2.4.1] Y N The value of the "media-type" attribute of the <reference> element, if present, MUST be of type xsd:string. 2/0/7
417 [2.4.1] Y Y SUB CONSTRAINT: The value of the "media-type" attribute of the <reference> element, if present, MUST be a valid MIME type. 2/0/7
420 start [2.4.2] Y N The value of the "start" attribute of <emotion>, if present, MUST be of type xsd:nonNegativeInteger. 3/0/6
421 end [2.4.2] Y N The value of the "end" attribute of <emotion>, if present, MUST be of type xsd:nonNegativeInteger. 2/0/7
422 duration [2.4.2] Y N The value of the "duration" attribute of <emotion>, if present, MUST be of type xsd:nonNegativeInteger. 2/0/7
423 time-ref-uri [2.4.2] Y N The value of the "time-ref-uri" attribute of <emotion>, if present, MUST be of type xsd:anyURI. 2/0/7
424 time-ref-anchor-point [2.4.2] Y N The value of the "time-ref-anchor-point" attribute of <emotion>, if present, MUST be either "start" or "end". 2/0/7
425 offset-to-start [2.4.2] Y N The value of the "offset-to-start" attribute of <emotion>, if present, MUST be of type xsd:integer. 2/0/7
Scale values
500 value [2.5.1] Y N The value of a "value" attribute, if present, MUST be a floating point value from the closed interval [0, 1]. 7/0/2
501 freq [2.5.2] Y N The <trace> element MUST have a "freq" attribute. 5/0/4
502 freq [2.5.2] Y N The value of the "freq" attribute of <trace> MUST be a positive floating point number followed by optional whitespace followed by "Hz". 5/0/4
503 samples [2.5.2] Y N The <trace> element MUST have a "samples" attribute. 5/0/4
504 samples [2.5.2] Y N The value of the "samples" attribute of <trace> MUST be a space-separated list of floating point values from the closed interval [0, 1]. 5/0/4
Defining vocabularies for representing emotions
600 item [3.1.1] Y N A <vocabulary> element MUST contain one or more <item> elements. 6/0/3
601 info [3.1.1] N N A <vocabulary> element MAY contain a single <info> element. 2/0/7
602 type [3.1.1] Y N A <vocabulary> element MUST contain a "type" attribute. 6/0/3
603 type [3.1.1] Y N The value of the "type" attribute of the <vocabulary> element MUST be one of "category", "dimension", "action-tendency" or "appraisal". 6/0/3
604 id [3.1.1] Y N A <vocabulary> element MUST contain an "id" attribute. 6/0/3
605 id [3.1.1] Y N The value of the "id" attribute of the <vocabulary> element MUST be of type xsd:ID. 6/0/3
606 info [3.1.2] N N An <item> element MAY contain a single <info> element. 3/0/6
607 name [3.1.2] Y N An <item> element MUST contain a "name" attribute. 5/0/4
608 name [3.1.2] Y N An <item> MUST NOT have the same name as any other <item> within the same <vocabulary>. 6/0/3
Conformance
700 [4.1] Y N All EmotionML elements MUST use the EmotionML namespace, "http://www.w3.org/2009/10/emotionml". 9/0/0

Appendices

Appendix A - Acknowledgements

The Multimodal Working Group would like to acknowledge the contributions of several individuals: