LCS SEMINAR: Friday 15 November 2002



Speaker: Dr. Dave Raggett, W3C/Openwave

Activity Lead for Voice Browsers & Multimodal Interaction,
World Wide Web Consortium, MIT Laboratory for Computer Science

Title: Web pages you can speak to and gesture at - W3C is developing
standards for a new class of devices that support multiple modes of interaction

Abstract:

This talk will present a vision for transforming how we interact with the Web, and describe the steps W3C is taking to realize it. The Web started with a purely textual interface and evolved to add images, forms, richer layout, and scripting. More recently, W3C began developing standards that enable people to access appropriately designed services from any phone using speech and DTMF keypads. This year W3C launched a new working group to develop standards for multimodal interaction, offering users the choice of interacting via voice, keypad, keyboard, mouse, stylus, or other input devices. For output, users will be able to listen to spoken prompts and audio, and to view information on graphical displays. The Multimodal Interaction Working Group is developing markup specifications that support multiple modalities and devices with a wide range of capabilities.
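For a flavor of the phone-access standards mentioned in the abstract, here is a minimal sketch in VoiceXML 2.0, the W3C dialog markup for speech and DTMF interaction. The form name, field, and prompt wording are illustrative assumptions, not material from the talk:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal VoiceXML 2.0 dialog (illustrative example): the voice browser
     speaks the prompt, then accepts four digits spoken aloud or keyed
     in via the phone's DTMF keypad. -->
<vxml version="2.0" xmlns="http://www.w3.org/2001/vxml">
  <form id="login">
    <!-- Built-in "digits" grammar; length=4 constrains the input. -->
    <field name="pin" type="digits?length=4">
      <prompt>Please say or key in your four digit PIN.</prompt>
      <filled>
        <prompt>You entered <value expr="pin"/>.</prompt>
      </filled>
    </field>
  </form>
</vxml>
```

The same form works over either modality because the voice browser, not the document author, handles the mapping from speech recognition or keypad tones onto the field's value.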


Location:
MIT Laboratory for Computer Science
Friday, 15 November 2002
3:15 pm Refreshments
3:30 pm Talk
Room NE43-518, 200 Technology Square



World Wide Web Consortium (W3C)
MIT Laboratory for Computer Science
200 Technology Square
Cambridge, MA, USA 02139
tel:+1.617.253.2613
http://www.w3.org

Last revised 5 November 2002 by Carol Nicolora.