Important note: This Wiki page is edited by participants of the RDWG. It does not necessarily represent consensus and it may have incorrect information or information that is not supported by other Working Group participants, WAI, or W3C. It may also have some very useful information.
"Accessible Maps" aims at discussing and outlining the need for detailed guidelines and standards for how maps and geographical data should be made accessible, and how maps and geographical data could serve as a valuable source of assistive functionality for people with disabilities.
- 1 Contacts
- 2 Keywords
- 3 Description
- 4 Background
- 5 Discussion of Topics and Target Groups
- 6 Guidelines and Standards
- 7 Techniques to make Visual Maps Accessible to People with Visual Disabilities
- 8 Techniques to make Visual Maps Accessible to Deaf or Hard-of-Hearing People, People with Cognitive Disabilities or Elderly People
- 9 References
Keywords

Accessible Maps, Geographical Data, Navigation, Mobility
Description

This topic aims to discuss the following aspects:
- Target groups and their requirements: Which requirements do the different target groups of accessible maps have?
- Guidelines and standards: Which guidelines and standards already exist, and which extensions are required?
- Which concepts and techniques exist for map data storage, processing and presentation, and how do they respect accessibility requirements?
- Which concepts and techniques exist to make visual maps accessible to people with visual disabilities?
- Which techniques exist to make visual maps accessible to deaf or hard-of-hearing people, people with cognitive disabilities or elderly people?
- Location-based assistive technology: Which kinds of AT exist that use geographic data, maps or positioning?
- What support does a developer need in order to make location-based applications accessible?
- Accessibility map annotations: Annotations describing the composition of the environment in terms of accessibility are crucial for the development of many location-based ATs. Which types of annotation exist, and which guidelines are available or needed?
- Which approaches exist that let the community create and edit annotations in an accessible way?
Background

Since car and pedestrian navigation support and location-based information services are standard on almost any smartphone, the potential of maps and geographical data is well understood and accepted in society. The location of the user, of other people and of objects is a key piece of information for implementing targeted location-based systems and services. Advances in outdoor tracking (e.g. using GPS and GPRS), indoor tracking (e.g. using Wi-Fi or Bluetooth) and near-field communication allow the implementation of sophisticated systems and services that support the user with information and functionality of particular importance in a given geographical context. Web services like Google Maps or OpenStreetMap offer map data for free. Mobile applications use this kind of data to enhance search requests for nearby stores or restaurants and show the results on a map, which makes it easy for the user to get an overview and to orient him- or herself.

In general, visual maps provide a quick and effective way to communicate the composition of the surrounding area and offer useful information for orientation and navigation. Users who cannot use standard visual maps are cut off from this information. For this reason, accessible maps are a pressing need for many people. There is a strong relation to other fields which should take accessibility into account, like
- Location based systems
- Pervasive Computing
- Internet of Things
- Ubiquitous Computing
- Sensor and Tracking Technology
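The kind of point-of-interest search mentioned above can be sketched against OpenStreetMap's public Overpass API. The endpoint and the OSM tags (amenity=restaurant, wheelchair=yes) are real; the coordinates, radius and function name are illustrative.

```python
# Sketch: build an Overpass QL query for wheelchair-accessible
# restaurants near a given point. POSTing the query to
# https://overpass-api.de/api/interpreter returns matching OSM
# nodes as JSON.

def build_overpass_query(lat: float, lon: float, radius_m: int = 500) -> str:
    """Return an Overpass QL query for accessible restaurants near a point."""
    return (
        "[out:json];"
        'node["amenity"="restaurant"]["wheelchair"="yes"]'
        f"(around:{radius_m},{lat},{lon});"
        "out body;"
    )

query = build_overpass_query(48.2082, 16.3738)  # illustrative coordinates
print(query)
```

Each result node carries its coordinates and tags, so an application can place the hits on a map or read them out to the user.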
Discussion of Topics and Target Groups
Users of accessible maps who are blind or visually impaired need more detailed information about their surroundings [Haptimap2012]. An accuracy of less than 0.1 meters is desirable, including the shapes of buildings along the road that may partly block the sidewalk. Moreover, the maps need to be suited to pedestrians: information about sidewalks, crosswalks, pedestrian areas and public transport is necessary to provide helpful guidance to the user. People who are deaf or hard of hearing have quite specific requirements too. Since auditory perception is unavailable as a communication channel, information that would otherwise be communicated via audio needs to be rendered in a different manner. Navigation systems usually try to relieve the user's visual sense as far as possible to provide eyes-free navigation, and deliver output via spoken commands. Possible solutions considering both requirements are see-through glasses or systems which communicate with the user through the haptic-tactile channel. Furthermore, people with cognitive disabilities and deaf people usually prefer simple language or, even better, intuitive symbols. Some approaches also focus on sign language to communicate with deaf users. Information specific to users with different kinds of disabilities is required as well, for example information about the accessibility of trails and buildings, such as the availability of accessible crosswalks or the accessibility of trails to wheelchair users.

Sighted users who choose an accessible map because they do not want to be distracted by looking at the screen have different needs depending on the use case. For pedestrian navigation, their requirements are quite similar to those of the first target group. If they use it as a replacement for, or an extension of, a traditional car navigation system, the required level of detail and the type of relevant data varies. Generally speaking, such maps do not need to reach the same level of detail: a car driver does not need to know the exact dimensions of a building next to the road in order to pass it, while a visually impaired pedestrian would need this kind of information. To sum up, different use cases require different levels of detail and accuracy and, in addition to basic data like streets and junctions, specific data depending on the purpose of use.
Guidelines and Standards
To date, no standard exists that describes how to make a visual map accessible to people with visual disabilities. One of the reasons is that the information on a visual map is usually very dense, and compared to the visual sense, the tactile sense of the fingertips provides only a rather limited channel of perception and information throughput [Zeng2011]. It is therefore very hard to convey the same detailed information to the user without using the visual sense. Researchers have carried out various user studies and experiments to identify a set of distinguishable lines and symbols [Jehoel2006, McCallum2006, Lobben2012]. Moreover, a number of guidelines exist concerning the design and layout of tactile maps [Amick1997, Gardiner2012, Edman1992]. Besides people with visual disabilities, the other target groups need an adapted representation of visual maps as well, because the level of detail is usually too high. Paladugu et al. [Paladugu2010] used an audio-tactile system as a platform to expose common HCI flaws in modern location- or map-based ATs. In general, the other target groups benefit from text which can be easily understood; "easy-to-read" is an important term in this context. An overview is provided on the Easy to Read page.
Techniques to make Visual Maps Accessible to People with Visual Disabilities
An international survey found that swell paper and thermoform diagrams are the most frequently used techniques for making maps accessible to people with visual disabilities [Rowell2003]. This kind of map usually contains major items like roads or buildings represented tactilely. They are labeled in Braille, either directly beside the object or, to prevent overlapping, in a legend. Normally, tactile maps are generated manually, which is cost- and time-consuming. For this reason, some approaches aim at simplifying the generation process and converting geographic data to tactile maps automatically. For example, TMAP [Miele2004] is a project that obtains geographic data from a GIS (geographic information system) and uses a special labeling algorithm [Kulyukin2010] to calculate the best positions for street and place names on large printed street maps of towns and cities in the USA.
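The core difficulty of automated labeling is placing names so they do not collide. The following toy sketch shows a greedy placement strategy in that spirit; it is not the actual TMAP/Kulyukin algorithm, and all names and dimensions are illustrative.

```python
# Toy sketch of greedy map-label placement: each label tries a few
# candidate offsets around its anchor point and takes the first
# position that does not overlap an already placed label.

def overlaps(a, b):
    """Axis-aligned rectangle overlap test; rectangles are (x, y, w, h)."""
    return not (a[0] + a[2] <= b[0] or b[0] + b[2] <= a[0] or
                a[1] + a[3] <= b[1] or b[1] + b[3] <= a[1])

def place_labels(anchors, w=10, h=4):
    """anchors: list of (x, y) label anchor points. Returns placed rects."""
    offsets = [(2, 2), (2, -h - 2), (-w - 2, 2), (-w - 2, -h - 2)]
    placed = []
    for (x, y) in anchors:
        for (dx, dy) in offsets:
            rect = (x + dx, y + dy, w, h)
            if not any(overlaps(rect, p) for p in placed if p):
                placed.append(rect)
                break
        else:
            placed.append(None)  # no conflict-free position found
    return placed

rects = place_labels([(0, 0), (3, 1), (50, 50)])
```

Real systems add refinements such as leader lines or moving conflicting labels into a legend, which is exactly the fallback traditional tactile maps use.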
Virtual Acoustic Map
Another type of approach is the virtual map. Instead of printing geographic data on a piece of paper, virtual maps use a digital representation of the environment to communicate the information directly to the user. Virtual acoustic maps convert map items into different sounds. Heuten [Heuten2007] proposed the use of so-called auditory torches to limit the level of detail and enhance usability: the user directly controls the visibility range of a virtual city walk-through and explores the objects of the city map, which are represented by spatial non-speech sounds. Another implementation of this approach is "Navigation for the blind through audio-based virtual environments" [Sanchez2010]. AccessibleMap is based on speech output describing both the layout of the streets, including their directions, and more detailed information like names [Hoeckner2012].
Virtual Tactile Map
Virtual tactile maps use tactile displays [Schneider1999, Schneider2000] or other haptic devices like joysticks or haptic mice [Parente2003, Moustakas2007] as input and output channels to communicate the geographic data to the user, often in combination with acoustic output for more detailed information. Another project focused on "Understanding environment structure with tactile map" [Maingreaud2004].
Accessible Maps on Touch Devices
Recent developments in the field of mobile devices have led to increasing numbers of touch devices; nowadays, smartphones and tablets are capable of displaying visual maps. Researchers have conducted several studies on making visual maps accessible on touch devices. Kane et al. [Kane2011] compared several techniques:
- Edge projection: Map items are projected onto two edges of the touch screen. The user can locate the x- and y-projections with both hands and then move his or her hands toward the interior of the screen and the desired target.
- Neighborhood browsing: The system expands the touchable area of each target item into the surrounding empty space. Touching anywhere in an item's area speaks the name of the target and provides guided directions after a finger gesture.
- Touch-and-speak: A combination of touch gestures and voice commands to overcome the issue of tedious touch input for blind users.

The outcome of this study was that all of these approaches provide better and faster interaction than the standard VoiceOver method provided by Apple on iOS.
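The edge-projection idea from the list above can be sketched in a few lines: each on-screen item is mirrored as a marker on the bottom and left screen edges, so a blind user can scan the edges to find a target's x and y coordinates before moving inward. This is a minimal sketch of the concept, not the Access Overlays implementation; all names and data are illustrative.

```python
# Sketch of edge projection: mirror each map item onto the bottom
# edge (keeping its x coordinate) and the left edge (keeping its y
# coordinate) of the screen.

def edge_projection(items, screen_h):
    """items: dict name -> (x, y). Returns edge markers as (name, position)."""
    bottom = [(name, (x, screen_h)) for name, (x, _) in items.items()]
    left = [(name, (0, y)) for name, (_, y) in items.items()]
    return bottom, left

items = {"library": (120, 300), "station": (400, 80)}
bottom, left = edge_projection(items, screen_h=800)
```

Scanning a one-dimensional edge is much faster than searching a two-dimensional surface, which is why this technique outperformed unassisted screen exploration in the study.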
TouchOver Map [Poppinga2011] is another approach, combining vibration and speech feedback to make visual maps on touch screens accessible. The implementation is based on OpenStreetMap and allows the exploration of the surrounding city, including the shapes of buildings and streets, on the conventional touch screen of a smartphone. Another popular application is Ariadne GPS; among other features, it allows a visual map to be explored by touch [Ciaffoni2012].
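An approach like TouchOver Map needs a hit test to decide when the finger is "on" a street and vibration should fire. The geometry below (point-to-segment distance against a street polyline) is standard; the tolerance and data are illustrative and not taken from the paper.

```python
# Sketch of a touch hit test: trigger vibration feedback when the
# finger is within a tolerance of a street polyline.
import math

def point_segment_dist(p, a, b):
    """Euclidean distance from point p to line segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamping to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def finger_on_street(finger, polyline, tolerance=12.0):
    """True if the touch point lies within `tolerance` px of the street."""
    return any(point_segment_dist(finger, a, b) <= tolerance
               for a, b in zip(polyline, polyline[1:]))

street = [(0, 0), (100, 0), (100, 100)]  # an L-shaped street
print(finger_on_street((50, 5), street))   # near the first segment
print(finger_on_street((50, 50), street))  # far from both segments
```

In a real application the positive case would start the vibration motor and queue speech output with the street name, while leaving the street would stop both.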
Augmented Paper-Based Tactile Map
This is a combination of traditional paper-based tactile maps and a touch sensitive pad [Wang2009]. The map is printed by a tactile printer and this piece of paper is placed on the pad. The user can explore the composition of the city in the same way he or she would with a normal tactile map, but he or she can obtain more detailed information about the currently touched item acoustically.
Braille Tactile Map
Novel touch-sensitive Braille displays allow the representation of geographic data in a tactile way and offer touch input features like panning, zooming and search at the same time. One implementation [Zeng2010] of an audio-haptic map system is based on the HyperBraille [HyperBraille2012] display.
Techniques to make Visual Maps Accessible to Deaf or Hard-of-Hearing People, People with Cognitive Disabilities or Elderly People
Though this target group is quite diverse, it shares a common set of requirements and needs concerning the user interface, especially when it comes to visual maps. Individuals in this target group often prefer simpler language or clear, easy-to-understand symbols. Adaptive user interfaces can also help meet the specific requirements of an individual and balance the trade-off between functionality and simplicity. In addition, the user interface needs to be specially adapted for deaf people, since this group often has problems with ordinary written language: they are used to sign language, which has a completely different grammar and structure [Boulares2012].
Projects considering these issues are:
- Socio-Technical Environments Supporting People with Cognitive Disabilities Using Public Transportation [Carmien2005]
- Indoor Wayfinding: Developing a Functional Interface for Individuals with Cognitive Impairments [Liu2006]
- A Context Aware Handheld Wayfinding System for Individuals with Cognitive Impairments [Chang2008]
- A Route Planner Interpretation Service for Hard of Hearing People [Boulares2012]
References

- [amick1997]Amick, Nancy and Corcoran, Jane, Guidelines for the Design of Tactile Graphics, http://www.aph.org/edresearch/guides.htm, 1997
- [boulares2012]Boulares, Mehrez and Jemni, Mohamed, A route planner interpretation service for hard of hearing people, Proceedings of the 13th international conference on Computers Helping People with Special Needs - Volume Part II, Springer-Verlag, http://dx.doi.org/10.1007/978-3-642-31534-3_9, 2012
- [carmien2005]Carmien, Stefan and Dawe, Melissa and Fischer, Gerhard and Gorman, Andrew and Kintsch, Anja and Sullivan,JR., James F., Socio-technical environments supporting people with cognitive disabilities using public transportation, ACM Trans. Comput.-Hum. Interact., ACM, http://doi.acm.org/10.1145/1067860.1067865, 2005
- [chang2008]Chang, Yao-Jen and Tsai, Shih-Kai and Wang, Tsen-Yung, A context aware handheld wayfinding system for individuals with cognitive impairments, Proceedings of the 10th international ACM SIGACCESS conference on Computers and accessibility, ACM, http://doi.acm.org/10.1145/1414471.1414479, 2008
- [ciaffoni2012]Ciaffoni, Giovanni Luca, Ariadne - GPS, http://www.ariadnegps.eu/, 2012
- [edman1992]Edman, Polly, Tactile Graphics, New York: American Foundation for the Blind, 1992
- [haptimap2012]Haptimap Consortium, HaptiMap, 2012
- [gardiner2012]Gardiner, Ann and Perkins, Chris, Best practice guidelines for the design, production and presentation of vacuum formed tactile maps, 2012
- [heuten2007]Heuten, Wilko and Henze, Niels and Boll, Susanne, Interactive exploration of city maps with auditory torches, CHI '07 extended abstracts on Human factors in computing systems, ACM, http://doi.acm.org/10.1145/1240866.1240932, 2007
- [hoeckner2012]Höckner, Klaus and Marano, Daniele and Neuschmid, Julia and Schrenk, Manfred and Wasserburger, Wolfgang, AccessibleMap: web-based city maps for blind and visually impaired, Proceedings of the 13th international conference on Computers Helping People with Special Needs - Volume Part II, Springer-Verlag, http://dx.doi.org/10.1007/978-3-642-31534-3_79, 2012
- [hyperbraille2012]HyperBraille Consortium, HyperBraille, 2012
- [jehoel2006]Jehoel, S. and McCallum, D. and Rowell, J. and Ungar, S., An empirical approach on the design of tactile maps and diagrams: The cognitive tactualization approach, British Journal of Visual Impairment, 2006
- [kane2011]Kane, Shaun K. and Morris, Meredith Ringel and Perkins, Annuska Z. and Wigdor, Daniel and Ladner, Richard E. and Wobbrock, Jacob O., Access overlays: improving non-visual access to large touch screens for blind users, Proceedings of the 24th annual ACM symposium on User interface software and technology, ACM, http://doi.acm.org/10.1145/2047196.2047232, 2011
- [kulyukin2010]Kulyukin, V. and Marston, J. and Miele, J. and Kutiyanawala, A., Automated SVG Map Labeling for Customizable Large Print Maps for Low Vision Individuals, RESNA Annual Conference, 2010
- [liu2006]Liu, Alan L. and Hile, Harlan and Kautz, Henry and Borriello, Gaetano and Brown, Pat A. and Harniss, Mark and Johnson, Kurt, Indoor wayfinding: developing a functional interface for individuals with cognitive impairments, Proceedings of the 8th international ACM SIGACCESS conference on Computers and accessibility, ACM, http://doi.acm.org/10.1145/1168987.1169005, 2006
- [lobben2012]Lobben, A. and Lawrence, M., The Use of Environmental Features on Tactile Maps by Navigators Who Are Blind, The Professional Geographer, 2012
- [maingreaud2004]Maingreaud, F. and Pissaloux, E. and Velasquez, R., Understanding environment structure with tactile map, Information Visualisation, 2004. IV 2004. Proceedings. Eighth International Conference on, 2004
- [mccallum2006]McCallum, D. and Ungar, S. and Jehoel, S., An evaluation of tactile directional symbols, British Journal of Visual Impairment, 2006
- [Miele2004]Miele, J., Tactile Map Automated Production (TMAP): using GIS data to generate Braille maps, CSUN International Conference on Technology and Persons with Disabilities, 2004
- [moustakas2007]Moustakas, K. and Nikolakis, G. and Kostopoulos, K. and Tzovaras, D. and Strintzis, M.G., Haptic Rendering of Visual Data for the Visually Impaired, MultiMedia, IEEE, 2007
- [paladugu2010]Paladugu, Devi Archana and Wang, Zheshen and Li, Baoxin, On presenting audio-tactile maps to visually impaired users for getting directions, Proceedings of the 28th of the international conference extended abstracts on Human factors in computing systems, ACM, http://doi.acm.org/10.1145/1753846.1754085, 2010
- [parente2003]Peter Parente and Gary Bishop, BATS: The Blind Audio Tactile Mapping System, Proceedings of ACM South Eastern Conference, 2003
- [poppinga2011]Poppinga, Benjamin and Magnusson, Charlotte and Pielot, Martin and Rassmus-Gröhn, Kirsten, TouchOver map: audio-tactile exploration of interactive maps, Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services, ACM, http://doi.acm.org/10.1145/2037373.2037458, 2011
- [rowell2003]Rowell, J. and Ungar, S., The world of touch: an international survey of tactile maps, British Journal of Visual Impairment, 2003
- [sanchez2010]Sánchez, Jaime and Sáenz, Mauricio and Pascual-Leone, Alvaro and Merabet, Lotfi, Navigation for the blind through audio-based virtual environments, Proceedings of the 28th of the international conference extended abstracts on Human factors in computing systems, ACM, http://doi.acm.org/10.1145/1753846.1753993, 2010
- [schneider1999]Jochen Schneider and Thomas Strothotte, Virtual Tactile Maps, Human-Computer Interaction: Ergonomics and User Interfaces, Proc. HCI Int’l. '99 Vol. 1, Mahwah, NJ & London, 1999
- [schneider2000]Schneider, Jochen and Strothotte, Thomas, Constructive exploration of spatial information by blind users, Proceedings of the fourth international ACM conference on Assistive technologies, ACM, http://doi.acm.org/10.1145/354324.354375, 2000
- [wang2009]Wang, Z. and Li, B. and Hedgpeth, T. and Haven, T, Instant tactile-audio map: enabling access to digital maps for people with visual impairment, Eleventh International ACM SIGACCESS Conference on Computers and Accessibility ASSETS09, 2009
- [zeng2010]Zeng, L. and Weber, G., Audio-Haptic Browser for a Geographical Information System. the International Conference on Computers Helping People With Special Needs, ICCHP, 2010
- [zeng2011]Zeng, Limin and Weber, Gerhard, Accessible Maps for the Visually Impaired, 2011