Important note: This Wiki page is edited by participants of the RDWG. It does not necessarily represent consensus and it may have incorrect information or information that is not supported by other Working Group participants, WAI, or W3C. It may also have some very useful information.


Cognitive, Emotion, and Affective Computing Accessibility

From Research and Development Working Group Wiki

Affective computing is an emerging discipline that aims to develop computer systems that can recognize the emotional states of their users and adapt their communication to the detected affective states [4].

Contacts

Page author(s): Yehya Mohamad (Fraunhofer FIT)

Keywords

Affective computing; sensor technology; non-verbal communication

Rationale

Affective computing could help people with disabilities express their emotions to others. For example, for users with communication disabilities such as cerebral palsy, a talker/communicator could change the colors and sounds of its user interface to convey non-verbal aspects of the user's communication. Similarly, an affective computing system based on visual sensors could help blind users perceive non-verbal emotions expressed by others.
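The first scenario above amounts to a mapping from a detected emotional state to non-verbal user-interface cues. A minimal sketch, in which the state names, colors, and sound file names are all hypothetical placeholders rather than part of any real communicator product:

```python
# Hypothetical mapping from a detected emotional state to non-verbal
# UI cues (background color and notification sound). State names,
# colors, and file names are illustrative assumptions only.
EMOTION_CUES = {
    "joy":     {"color": "#FFD700", "sound": "chime.wav"},
    "sadness": {"color": "#4169E1", "sound": "soft_tone.wav"},
    "anger":   {"color": "#B22222", "sound": "alert.wav"},
    "neutral": {"color": "#CCCCCC", "sound": None},
}

def cues_for(emotion: str) -> dict:
    """Return the UI cues for a detected emotion, falling back to neutral."""
    return EMOTION_CUES.get(emotion, EMOTION_CUES["neutral"])
```

A real communicator would load such a table from the user's configuration, since which cues convey which emotion is highly personal and culturally dependent.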


Description

Affective computing is based on several approaches and technologies:

  • Methods for eliciting emotional data from user feedback, e.g. the semantic differential method and the Self-Assessment Manikin (SAM) method [5], and tools such as the online platform TRUE (Testing Platform for Multimedia Evaluation), which is based on the SAM technique, or emotion stimulation tools compliant with IADS-2 (The International Affective Digitized Sounds) [3].
  • Sensor frameworks for sensing and measuring signals used to detect emotional states [1][2], e.g. detecting the basic emotions from the face using visual sensors, from the voice using microphones, or from psycho-physiological signals using sensors such as skin conductivity (galvanic skin response, GSR) sensors, electromyography (EMG, measuring muscle activity), electrocardiography (ECG or EKG, measuring heart activity), electrooculography (EOG, measuring eye movement), and electroencephalography (EEG, measuring brain activity).
  • Machine learning techniques used to recognize patterns in the measured signals that indicate emotional states, e.g. hidden Markov models, Bayesian networks, fuzzy logic, neural networks, or support vector machines.
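The pipeline implied by the bullets above (signal window → features → classifier) can be sketched with synthetic skin-conductance data. The feature set and the nearest-centroid rule below are deliberately simple stand-ins for the SVM- or HMM-style classifiers named in the text; the window sizes, labels, and sample values are assumptions for illustration only:

```python
import statistics

def gsr_features(window):
    """Reduce a window of skin-conductance samples to simple features:
    mean level, variability, and overall slope."""
    slope = (window[-1] - window[0]) / (len(window) - 1)
    return (statistics.mean(window), statistics.pstdev(window), slope)

def train_centroids(labelled_windows):
    """Average the feature vectors per emotion label (a minimal
    stand-in for training an SVM or HMM on real recordings)."""
    by_label = {}
    for label, window in labelled_windows:
        by_label.setdefault(label, []).append(gsr_features(window))
    return {label: tuple(statistics.mean(dim) for dim in zip(*feats))
            for label, feats in by_label.items()}

def classify(window, centroids):
    """Assign the label whose centroid is nearest in feature space."""
    f = gsr_features(window)
    return min(centroids,
               key=lambda lbl: sum((a - b) ** 2
                                   for a, b in zip(f, centroids[lbl])))

# Synthetic example: "aroused" windows rise steeply, "calm" windows stay flat.
training = [
    ("aroused", [2.0, 2.6, 3.1, 3.8, 4.5]),
    ("aroused", [1.8, 2.4, 3.0, 3.7, 4.2]),
    ("calm",    [1.0, 1.1, 1.0, 1.1, 1.0]),
    ("calm",    [0.9, 1.0, 1.0, 0.9, 1.0]),
]
centroids = train_centroids(training)
```

Real systems replace each stage with more capable components (band-pass filtering, richer feature sets, a trained SVM), but the three-stage structure is the same.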

Background

Human beings are emotional: our social interaction is based on the ability to communicate our emotions and to perceive the emotional states of others [6][7]. A wide range of physical disabilities involves deficits in the different stages of sensing, expressing, or interpreting affect-relevant signals. Consequently, people with these kinds of disabilities can be considered emotionally disabled [8]. Affective computing, a discipline that develops systems for detecting and responding to users' emotions [4], and affective mediation, computer-based technology that enables communication between two or more people by displaying their emotional states [9][4], are growing research areas that should be joined with assistive technology research to improve the neglected area of affective communication for people with disabilities [10].

Emotion detection by computers can be done through various channels such as facial expression, voice, gestures, and posture [4]. However, these channels are not reliable in every context, as a user's disability may prevent emotion detection through them. To overcome this problem, psycho-physiological measurement was introduced [4][9]. It is now actively used to acquire emotional states through different biosensors, e.g. galvanic skin response (GSR), electromyography (EMG), and electrocardiography (ECG) [11][12]. The activation of the sympathetic nerves of the autonomic nervous system (ANS) generates authentic signals that cannot be deliberately altered [13]. Signals from GSR, EMG, and ECG are processed with signal processing techniques to convert them into meaningful data, and machine learning techniques are used to recognize patterns in these psycho-physiological signals that indicate the user's affective state [1][2]. Classification is commonly performed with algorithms such as support vector machines (SVM).
Several implementations of these algorithms exist, such as the commercial package MATLAB or the open-source data-mining tool WEKA (Waikato Environment for Knowledge Analysis). The first goal of our work is to build an affective detection system based on GSR and to automatically identify correlates of emotional processing in the electroencephalogram (EEG) of individuals with cerebral palsy (CP). The second goal is to build an affective mediation system by implementing a web-based emotion management system for defining rules that map actions to the corresponding emotional state, time, and location, as well as any other relevant context parameters.
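The second goal, a rule system mapping emotional states plus context to actions, can be sketched as a small rule table with first-match semantics. The rule fields, emotion labels, and action names below are hypothetical placeholders, not the actual schema of the web-based system described above:

```python
# Minimal sketch of a rule-based affective mediation layer, assuming
# rules of the form (emotion, context constraints) -> action.
# All field and action names are illustrative assumptions.
RULES = [
    {"emotion": "frustration", "context": {"location": "classroom"},
     "action": "notify_teacher"},
    {"emotion": "frustration", "context": {},
     "action": "simplify_interface"},
    {"emotion": "joy", "context": {},
     "action": "log_positive_event"},
]

def select_action(emotion, context):
    """Return the action of the first rule whose emotion matches and
    whose context constraints are all satisfied; None if no rule fires."""
    for rule in RULES:
        if rule["emotion"] != emotion:
            continue
        if all(context.get(k) == v for k, v in rule["context"].items()):
            return rule["action"]
    return None
```

Ordering rules from most to least specific, as here, lets a general fallback rule catch states that no context-specific rule matched.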

Discussion

[Specific questions that need to be addressed]

References

  • [1] R. A. Calvo and S. D'Mello, "Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications," IEEE Trans. Affect. Comput., vol. 1, no. 1, pp. 18–37, Jan. 2010.
  • [2] R. Calvo, I. Brown, and S. Scheding, "Effect of experimental factors on the recognition of affective mental states through physiological measures," AI 2009: Advances in Artificial Intelligence, pp. 62–70, 2009.
  • [3] M. M. Bradley and P. J. Lang, "The International Affective Digitized Sounds (2nd Edition; IADS-2): Affective Ratings of Sounds and Instruction Manual," NIMH Center for the Study of Emotion and Attention, 2007.
  • [4] R. Picard, IBM Systems Journal, vol. 39, nos. 3 & 4, 2000, MIT Media Laboratory.
  • [5] M. M. Bradley and P. J. Lang, "Measuring emotion: The self-assessment manikin and the semantic differential," Journal of Behavior Therapy and Experimental Psychiatry, pp. 49–59, 1994.
  • [6] D. Goleman, Emotional Intelligence, Bantam Books, New York, 1995.
  • [7] N. Garay, I. Cearreta, J. M. López, and I. Fajardo, "Assistive technology and affective mediation," An Interdisciplinary Journal on Humans in ICT Environments, vol. 2, no. 1, pp. 55–83, 2006.
  • [8] N. A. Gershenfeld, When Things Start to Think, Owl Books, New York, 2000.
  • [9] Y. Mohamad, Integration of Emotional Intelligence in Interface Agents: The Example of a Training Software for Learning-Disabled Children, 2005. ISBN-13: 978-3832244637.
  • [10] N. Sharma and T. Gedeon, "Objective measures, sensors and computational techniques for stress recognition and classification: a survey," Computer Methods and Programs in Biomedicine, vol. 108, no. 3, pp. 1287–1301, Dec. 2012.
  • [11] Y. Shi et al., "Personalized Stress Detection from Physiological Measurements," International Symposium on Quality of Life Technology, 2010.
  • [12] M. Viqueira Villarejo, B. García Zapirain, and A. Méndez Zorrilla, "A Stress Sensor Based on Galvanic Skin Response (GSR) Controlled by ZigBee," Sensors, ISSN 1424-8220, 2012.
  • [13] C. L. Bethel, K. Salomon, R. R. Murphy, and J. L. Burke, "Survey of Psychophysiology Measurements Applied to Human-Robot Interaction," in RO-MAN 2007: The 16th IEEE International Symposium on Robot and Human Interactive Communication, 2007, pp. 732–737.

