SMIL Techniques for WCAG 2.0

This Web page lists SMIL Techniques from Techniques for WCAG 2.0: Techniques and Failures for Web Content Accessibility Guidelines 2.0. For information about the techniques, see Introduction to Techniques for WCAG 2.0. For a list of techniques for other technologies, see the Table of Contents.


Table of Contents

  • SM1: Adding extended audio description in SMIL 1.0
  • SM2: Adding extended audio description in SMIL 2.0
  • SM6: Providing audio description in SMIL 1.0
  • SM7: Providing audio description in SMIL 2.0
  • SM11: Providing captions through synchronized text streams in SMIL 1.0
  • SM12: Providing captions through synchronized text streams in SMIL 2.0
  • SM13: Providing sign language interpretation through synchronized video streams in SMIL 1.0
  • SM14: Providing sign language interpretation through synchronized video streams in SMIL 2.0

SM1: Adding extended audio description in SMIL 1.0

Applicability

Applies whenever a SMIL 1.0 player is available

This technique relates to:

Description

The purpose of this technique is to allow more audio description to be provided than will fit into the gaps in the dialogue of the audio-visual material.

With SMIL 1.0 there is no easy way to do this but it can be done by breaking the audio and video files up into a series of files that are played sequentially. Additional audio description is then played while the audio-visual program is frozen. The last frame of the video file is frozen so that it remains on screen while the audio file plays out.

The effect is that the video appears to play through from end to end but freezes in places while a longer audio description is provided. It then continues automatically when the audio description is complete.

To turn the extended audio description on and off, one could use script to switch back and forth between two SMIL files, one with and one without the extended audio description lines. Alternatively, script could be used to add or remove the extended audio description lines from the SMIL file so that the film clips would just play sequentially.

If scripting is not available then two versions of the video could be provided, one with and one without extended audio descriptions.
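
As a minimal sketch of the version without extended audio descriptions, the <body> could simply play the whole video straight through, with no clipping, no freeze and no description audio (the src, region and alt values reuse those from the example below):

<body>
  <seq>
    <!-- no clip-begin/clip-end, dur or fill: the film just plays from start to end -->
    <video src="video.rm" region="videoregion" alt="video alt"/>
  </seq>
</body>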

Examples

Example 1: SMIL 1.0 Video with audio descriptions that pause the main media in 4 locations to allow extended audio description

Example Code:

   
<?xml version="1.0" encoding="UTF-8"?>
<smil xmlns:qt="http://www.apple.com/quicktime/resources/smilextensions" 
xmlns="http://www.w3.org/TR/REC-smil" qt:time-slider="true">
  <head>
    <layout>
      <root-layout background-color="black" height="266" width="320"/>
      <region id="videoregion" background-color="black" top="26" left="0" 
      height="144" width="320"/>
    </layout>
  </head>
  <body>
  <par>
   <seq>
     <par>
       <video src="video.rm" region="videoregion" clip-begin="0s" clip-end="5.4" 
       dur="8.7" fill="freeze" alt="videoalt"/>   
       <audio src="no1.wav" begin="5.4" alt="audio alt"/>
     </par>
     <par>
       <video src="video.rm" region="videoregion" clip-begin="5.4" clip-end="24.1" 
       dur="20.3" fill="freeze" alt="videoalt"/>
       <audio src="no2.wav" begin="18.7" alt="audio alt"/>
     </par>
     <par>
       <video src="video.rm" region="videoregion" clip-begin="24.1" clip-end="29.6" 
       dur="7.7" fill="freeze" alt="videoalt"/>
       <audio src="no3.wav" begin="5.5" alt="audio alt"/>
     </par>
     <par>
       <video src="video.rm" region="videoregion" clip-begin="29.6" clip-end="34.5" 
       dur="5.7" fill="freeze" alt="videoalt"/>
       <audio src="no4.wav" begin="4.9" alt="audio alt"/>
     </par>
     <par>
       <video src="video.rm" region="videoregion" clip-begin="77.4" alt="video alt"/>
     </par>
   </seq>
  </par>
</body>
</smil>

The markup above is broken into five <par> segments. Each contains a <video> and an <audio> tag (the last <par> intentionally has no <audio> tag). The convention with extended audio descriptions is that the main media pauses during the description. The way to make this happen in SMIL 1.0 is to set "clip-begin" and "clip-end" attributes, which dictate the start and end of the video clip, and to set a "dur" for the video that is longer than the span defined by "clip-begin" and "clip-end". The fill="freeze" attribute holds the last frame of the video on screen during the extended description. The <audio> tag has a "begin" value equal to the length of its video clip ("clip-end" minus "clip-begin"), so the description starts as soon as that portion of the video has played.

The way to determine the values for "clip-begin", "clip-end", and "dur" is to find the times at which the portion of video preceding the audio description starts and ends, and the total length of the extended audio description. The "clip-begin" and "clip-end" values are simply those start and end times, while the "dur" value is the sum of the length of the clip and the length of the extended description. In the first <par>, the video clip starts at 0 seconds and ends at 5.4 seconds, and the description is 3.3 seconds long, so the "dur" value is 5.4s + 3.3s = 8.7s.
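
Applying the same arithmetic to the remaining segments of the example gives:

  • second <par>: clip from 5.4s to 24.1s (18.7s) plus a 1.6s description, so dur="20.3" and the description audio begins at 18.7s

  • third <par>: clip from 24.1s to 29.6s (5.5s) plus a 2.2s description, so dur="7.7" and the description audio begins at 5.5s

  • fourth <par>: clip from 29.6s to 34.5s (4.9s) plus a 0.8s description, so dur="5.7" and the description audio begins at 4.9s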

Resources

Tests

Procedure

  1. Play file with extended audio descriptions

  2. Play file with audio description

  3. Check whether video freezes in places and plays extended audio description

Expected Results

  • #3 is true


SM2: Adding extended audio description in SMIL 2.0

Applicability

Applies whenever a SMIL 2.0 player is available

This technique relates to:

Description

The purpose of this technique is to allow more audio description to be provided than will fit into the gaps in the dialogue of the audio-visual material.

With SMIL 2.0 it is possible to specify that particular audio files be played at particular times, and that the program be frozen (paused) while the audio file is being played.

The effect is that the video appears to play through from end to end but freezes in places while a longer audio description is provided. It then continues automatically when the audio description is complete.

To turn the extended audio description on and off, one could use script to switch back and forth between two SMIL files, one with and one without the extended audio description lines. Alternatively, script could be used to add or remove the extended audio description lines from the SMIL file so that the film clips would just play uninterrupted.

If scripting is not available then two versions of the SMIL file could be provided, one with and one without extended audio description.

Examples

Example 1: Video with extended audio description.

Example Code:


<smil xmlns="http://www.w3.org/2001/SMIL20/Language">
  <head>
    <layout>
      <root-layout backgroundColor="black" height="266" width="320"/>
      <region id="video" backgroundColor="black" top="26" left="0" 
      height="144" width="320"/>
    </layout>
  </head>
  <body>
    <excl>
      <!-- peers="pause": when a description begins, the video (its peer) is 
           paused and resumes automatically once the description ends -->
      <priorityClass peers="pause">
        <video src="movie.rm" region="video" title="video" alt="video" />
        <audio src="desc1.rm" begin="12.85s" alt="Description 1" />
        <audio src="desc2.rm" begin="33.71s" alt="Description 2" />
        <audio src="desc3.rm" begin="42.65s" alt="Description 3" />
        <audio src="desc4.rm" begin="59.80s" alt="Description 4" />
      </priorityClass>
    </excl>
  </body>
</smil>

Resources

Tests

Procedure

  1. Play file with extended audio description

  2. Check whether video freezes in places and plays extended audio description

Expected Results

  • #2 is true


SM6: Providing audio description in SMIL 1.0

Applicability

Applies whenever a SMIL 1.0 player is available

This technique relates to:

Description

The objective of this technique is to provide a way for people who are blind or otherwise have trouble seeing the video in audio-visual material to be able to access the material. With this technique a description of the video is provided via audio description that will fit into the gaps in the dialogue in the audio-visual material.

Examples

Example 1: SMIL 1.0 audio description sample for QuickTime player

Example Code:

   
<?xml version="1.0" encoding="UTF-8"?>
<smil xmlns:qt="http://www.apple.com/quicktime/resources/smilextensions" 
  xmlns="http://www.w3.org/TR/REC-smil" qt:time-slider="true">
  <head>
    <layout>
      <root-layout background-color="black" height="266" width="320"/>
      <region id="videoregion" background-color="black" top="26" left="0" 
      height="144" width="320"/>
    </layout>
  </head>
  <body>
    <par>
      <video dur="0:01:20.00" region="videoregion" src="salesdemo.mov" 
      alt="Sales Demo"/>
      <audio dur="0:01:20.00" src="salesdemo_ad.mp3" 
      alt="Sales Demo Audio Description"/>
    </par>
  </body>
</smil> 

Example 2: SMIL 1.0 audio description sample for RealMedia player

Example Code:

 
<?xml version="1.0" encoding="UTF-8"?>
<smil xmlns="http://www.w3.org/TR/REC-smil">
  <head>
    <layout>
      <root-layout background-color="black" height="266" width="320"/>
      <region id="videoregion" background-color="black" top="26" left="0" 
      height="144" width="320"/>
    </layout>
  </head>
  <body>
    <par>
      <video src="salesdemo.mov" region="videoregion" title="Sales Demo" 
      alt="Sales Demo"/>
      <audio src="salesdemo_ad.mp3" title="audio description" 
      alt="Sales Demo Audio Description"/>
    </par>
  </body>
</smil>

Resources

Tests

Procedure

  1. Find method for turning on audio description from content/player (unless it is always played by default)

  2. Play file with audio description

  3. Check whether audio description is played

Expected Results

  • #3 is true


SM7: Providing audio description in SMIL 2.0

Applicability

Applies whenever a SMIL 2.0 player is available

This technique relates to:

Description

The objective of this technique is to provide a way for people who are blind or otherwise have trouble seeing the video in audio-visual material to be able to access the material. With this technique a description of the video is provided via audio description that will fit into the gaps in the dialogue in the audio-visual material.

Examples

Example 1: SMIL 2.0 audio description sample for RealMedia player

Example Code:


<smil xmlns="http://www.w3.org/2001/SMIL20/Language">
  <head>
    <layout>
      <root-layout backgroundColor="black" height="266" width="320"/>
      <region id="video" backgroundColor="black" top="26" left="0" 
      height="144" width="320"/>
    </layout>
  </head>
  <body>
    <par>
      <video src="salesdemo.mpg" region="video" title="Sales Demo" 
      alt="Sales Demo"/>
      <audio src="salesdemo_ad.mp3" begin="33.71s" title="audio description" 
      alt="Sales Demo Audio Description"/>
    </par>
  </body>
</smil>

The example shows a <par> segment containing a <video> and an <audio> tag. The audio description stream does not start immediately; its begin attribute delays it until 33.71 seconds into the presentation so that it plays during a gap in the dialogue.

Resources

Tests

Procedure

  1. Find method for turning on audio description from content/player (unless it is always played by default)

  2. Play file with audio description

  3. Check whether audio description is played

Expected Results

  • #3 is true

SM11: Providing captions through synchronized text streams in SMIL 1.0

Applicability

Applies to SMIL 1.0

This technique relates to:

User Agent and Assistive Technology Support Notes

There is no universal standard format for representing captions in SMIL 1.0. Different user agents support different caption formats, and a file in a format that the target player supports must be provided as the src of the caption <textstream>.

QuickTime supports QTText caption files. Real-based players, such as RealPlayer and GRiNS, support RealText caption files. Windows Media supports SAMI files but does not support SMIL. Flash does not support a specific caption file type but can parse XML-based caption files; the FLVPlayback component's SMIL support is limited to detecting parameters such as the movie/server URL or multi-bandwidth alternatives specified in a <switch> tag.

Description

The objective of this technique is to provide a way for people who are deaf or otherwise have trouble hearing the dialogue in audio visual material to be able to view the material. With this technique all of the dialogue and important sounds are available in a text stream that is displayed in a caption area.

With SMIL 1.0, separate regions can be defined for the video and the captions. The captions and video playback are synchronized, with the caption text displayed in one region of the screen, while the corresponding video is displayed in another region.

Examples

Example 1: SMIL 1.0 caption sample for QuickTime player

Example Code:

 
<?xml version="1.0" encoding="UTF-8"?>
<smil xmlns:qt="http://www.apple.com/quicktime/resources/smilextensions" 
  xmlns="http://www.w3.org/TR/REC-smil" qt:time-slider="true">
  <head>
    <layout>
      <root-layout width="320" height="300" background-color="black"/>
      <region top="0" width="320" height="240" left="0" background-color="black" 
      id="videoregion"/>
      <region top="240" width="320" height="60" left="0" background-color="black" 
      id="textregion"/>
    </layout>
  </head>
  <body>
    <par>
      <video dur="0:01:20.00" region="videoregion" src="salesdemo.mov" 
      alt="Sales Demo"/>
      <textstream dur="0:01:20.00" region="textregion" src="salesdemo_cc.txt" 
      alt="Sales Demo Captions"/>
    </par>
  </body>
</smil> 
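
The caption text for this example lives in the separate QTText file referenced by the <textstream> element (salesdemo_cc.txt). As a rough, hypothetical sketch (the descriptors, timings and caption lines below are illustrative only, not taken from the sample movie), a QTText file generally looks like this, with a final time stamp and blank line clearing the last caption:

{QTtext}{font:Arial}{plain}{size:12}
{textColor: 65535, 65535, 65535}{backColor: 0, 0, 0}
{justify:center}{timeScale:30}{width:320}{height:60}
{timeStamps:absolute}{language:0}{textEncoding:0}
[00:00:02.00]
Welcome to the sales demo.
[00:00:05.15]
In this demo we walk through the new product line.
[00:00:09.00]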

Example 2: SMIL 1.0 caption sample for RealMedia player

Example Code:

 
<?xml version="1.0" encoding="UTF-8"?>
<smil xmlns="http://www.w3.org/TR/REC-smil">
  <head>
    <layout>
      <root-layout background-color="black" height="310" width="330"/>
      <region id="video" background-color="black" top="5" left="5" 
      height="240" width="320"/>
      <region id="captions" background-color="black" top="250" 
      height="60" left="5" width="320"/>
    </layout>
  </head>
  <body>
    <par>
      <video src="salesdemo.mpg" region="video" title="Sales Demo" 
      alt="Sales Demo"/>
      <textstream src="salesdemo_cc.rt" region="captions" 
      system-captions="on" title="captions" 
      alt="Sales Demo Captions"/>
    </par>
  </body>
</smil>

The example shows a <par> segment containing a <video> and a <textstream> tag. The system-captions attribute indicates that the textstream should be displayed when the user's player setting for captions indicates the preference for captions to be displayed. The <layout> section defines the regions used for the video and the captions.
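
The caption text itself is stored in the separate RealText file referenced above (salesdemo_cc.rt). The following is a hypothetical sketch of such a file; the window attributes, timings and caption lines are illustrative only:

<window type="generic" duration="0:01:20.0" width="320" height="60" 
  bgcolor="black" wordwrap="true">
<font color="white">
<time begin="0:00:02.0"/><clear/>Welcome to the sales demo.
<time begin="0:00:05.5"/><clear/>In this demo we walk through the new product line.
</font>
</window>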

Example 3: SMIL 1.0 caption sample with internal text streams

Example Code:


<?xml version="1.0" encoding="UTF-8"?>
<smil xmlns="http://www.w3.org/TR/REC-smil">
  <head>
    <layout>
      <root-layout background-color="black" height="310" width="330"/>
      <region id="video" background-color="black" top="5" left="5" 
      height="240" width="320"/>
      <region id="captions" background-color="black" top="250" 
      height="60" left="5" width="320"/>
    </layout>
  </head>
  <body>
    <par>
      <video src="salesdemo.mpg" region="video" title="Sales Demo" 
      alt="Sales Demo"/>
      <text src="data:,This%20is%20inline%20text." region="captions" begin="0s" 
      dur="3" alt="Sales Demo Captions">
        <param name="charset" value="iso-8859-1"/>
        <param name="fontFace" value="System"/>
        <param name="fontColor" value="yellow"/>
        <param name="backgroundColor" value="blue"/>
      </text>
    </par>
  </body>
</smil>

This example uses a <text> element with a data: URI to include the synchronized caption text directly within the SMIL file, rather than referencing an external caption file.

Resources

Tests

Procedure

  1. Enable caption preference in player, if present

  2. Play file with captions

  3. Check whether captions are displayed

Expected Results

  • #3 is true


SM12: Providing captions through synchronized text streams in SMIL 2.0

Applicability

Applies to SMIL 2.0

This technique relates to:

User Agent and Assistive Technology Support Notes

Only RealPlayer supports SMIL 2.0.

Description

The objective of this technique is to provide a way for people who are deaf or otherwise have trouble hearing the dialogue in audio visual material to be able to view the material. With this technique all of the dialogue and important sounds are available in a text stream that is displayed in a caption area.

With SMIL 2.0, separate regions can be defined for the video and the captions. The captions and video playback are synchronized, with the caption text displayed in one region of the screen, and the corresponding video displayed in another region.

Examples

Example 1: SMIL 2.0 caption sample for RealMedia player

Example Code:


<?xml version="1.0" encoding="UTF-8"?>
<smil xmlns="http://www.w3.org/2001/SMIL20/Language">
  <head>
    <layout>
      <root-layout backgroundColor="black" height="310" width="330"/>
      <region id="video" backgroundColor="black" top="5" left="5" 
      height="240" width="320"/>
      <region id="captions" backgroundColor="black" top="250" 
      height="60" left="5" width="320"/>
    </layout>
  </head>
  <body>
    <par>
      <video src="salesdemo.mpg" region="video" title="Sales Demo"
      alt="Sales Demo"/>
      <textstream src="salesdemo_cc.rt" region="captions" systemCaptions="on" 
      title="captions" alt="Sales Demo Captions"/>
    </par>
  </body>
</smil>

The example shows a <par> segment containing a <video> and a <textstream> tag. The systemCaptions attribute indicates that the textstream should be displayed when the user's player setting for captions indicates the preference for captions to be displayed. The <layout> section defines the regions used for the video and the captions.

Example 2: SMIL 2.0 caption sample with internal text streams for RealMedia player

Example Code:

 
<?xml version="1.0" encoding="UTF-8"?>
<smil xmlns="http://www.w3.org/2001/SMIL20/Language">
  <head>
    <layout>
      <root-layout backgroundColor="black" height="310" width="330"/>
      <region id="video" backgroundColor="black" top="5" left="5" 
      height="240" width="320"/>
      <region id="captions" backgroundColor="black" top="250" 
      height="60" left="5" width="320"/>
    </layout>
  </head>
  <body>
    <par>
      <video src="salesdemo.mpg" region="video" title="Sales Demo" 
      alt="Sales Demo"/>
      <text src="data:,This%20is%20inline%20text." region="captions" 
      begin="0s" dur="3">
        <param name="charset" value="iso-8859-1"/>
        <param name="fontFace" value="System"/>
        <param name="fontColor" value="yellow"/>
        <param name="backgroundColor" value="blue"/>
      </text>
      <text src="data:,This%20is%20a%20second%20text." 
      region="captions" begin="3s" dur="3">
        <param name="charset" value="iso-8859-1"/>
        <param name="fontFace" value="System"/>
        <param name="fontColor" value="yellow"/>
        <param name="backgroundColor" value="blue"/>
      </text>
    </par>
  </body>
</smil>

This example uses <text> elements with data: URIs to include the synchronized caption text directly within the SMIL file, rather than referencing an external caption file.

Resources

Tests

Procedure

  1. Enable caption preference in player, if present

  2. Play file with captions

  3. Check whether captions are displayed

Expected Results

  • #3 is true


SM13: Providing sign language interpretation through synchronized video streams in SMIL 1.0

Applicability

Applies whenever a SMIL 1.0 player is available

This technique relates to:

Description

The objective of this technique is to provide a way for people who are deaf or otherwise have trouble hearing the dialogue in audio visual material to be able to view the material. With this technique all of the dialogue and important sounds are available in a sign-language interpretation video that is displayed in a caption area.

With SMIL 1.0, separate regions can be defined for the two videos. The two video playbacks are synchronized, with the content video displayed in one region of the screen, while the corresponding sign-language interpretation video is displayed in another region.

Examples

Example 1: SMIL 1.0 sign-language interpretation sample for QuickTime player

Example Code:

 
<?xml version="1.0" encoding="UTF-8"?>
<smil xmlns:qt="http://www.apple.com/quicktime/resources/smilextensions" 
  xmlns="http://www.w3.org/TR/REC-smil" qt:time-slider="true">
  <head>
    <layout>
      <root-layout width="320" height="300" background-color="black"/>
      <region top="0" width="320" height="240" left="0" background-color="black" 
      id="videoregion"/>
      <region top="240" width="320" height="60" left="0" background-color="black" 
      id="signingregion"/>
    </layout>
  </head>
  <body>
    <par>
      <video dur="0:01:20.00" region="videoregion" src="salesdemo.mov" 
      alt="Sales Demo"/>
      <video dur="0:01:20.00" region="signingregion" system-captions="on" 
      src="salesdemo_si.mov" alt="Sales Demo Sign Language Interpretation"/>
    </par>
  </body>
</smil>

Example 2: SMIL 1.0 sign-language sample for RealMedia player:

Example Code:

 
<?xml version="1.0" encoding="UTF-8"?>
<smil xmlns="http://www.w3.org/TR/REC-smil">
  <head>
    <layout>
      <root-layout background-color="black" height="310" width="330"/>
      <region top="0" width="320" height="240" left="0" background-color="black"
       id="videoregion"/>
      <region top="240" width="320" height="60" left="0" background-color="black" 
      id="signingregion"/>
    </layout>
  </head>
  <body>
    <par>
      <video dur="0:01:20.00" region="videoregion" src="salesdemo.mov" 
      alt="Sales Demo"/>
      <video dur="0:01:20.00" region="signingregion" system-captions="on" 
      src="salesdemo_si.mov" alt="Sales Demo Sign Language Interpretation"/>
    </par>
  </body>
</smil>

The example shows a <par> segment containing two <video> tags. The system-captions attribute indicates that the sign language video should be displayed when the user's player setting for captions indicates the preference for captions to be displayed. The <layout> section defines the regions used for the main video and the sign language interpretation video.

Resources

Tests

Procedure

  1. Enable control in content or player to turn on sign language interpretation (unless it is always shown)

  2. Play file with sign-language interpretation

  3. Check whether sign language interpretation is displayed

Expected Results

  • #3 is true


SM14: Providing sign language interpretation through synchronized video streams in SMIL 2.0

Applicability

SMIL 2.0

This technique relates to:

Description

The objective of this technique is to provide a way for people who are deaf or otherwise have trouble hearing the dialogue in audio visual material to be able to view the material. With this technique all of the dialogue and important sounds are available in a sign-language interpretation video that is displayed in a caption area.

With SMIL 2.0, separate regions can be defined for the two videos. The two video playbacks are synchronized, with the content video displayed in one region of the screen, while the corresponding sign-language interpretation video is displayed in another region.

Examples

Example 1: SMIL 2.0 sign-language video sample for RealMedia player

Example Code:


<smil xmlns="http://www.w3.org/2001/SMIL20/Language">
  <head>
    <layout>
      <root-layout backgroundColor="black" height="310" width="330"/>
      <region id="video" backgroundColor="black" top="5" left="5" 
      height="240" width="320"/>
      <region id="signing" backgroundColor="black" top="250" 
      height="60" left="5" width="320"/>
    </layout>
  </head>
  <body>
    <par>
      <video src="salesdemo.mpg" region="video" title="Sales Demo" 
      alt="Sales Demo"/>
      <video src="salesdemo_signing.mpg" 
      region="signing" systemCaptions="on" 
      title="sign language interpretation" 
      alt="Sales Demo Sign Language Interpretation"/>
    </par>
  </body>
</smil>

The example shows a <par> segment containing two <video> tags. The systemCaptions attribute indicates that the sign language video should be displayed when the user's player setting for captions indicates the preference for captions to be displayed. The <layout> section defines the regions used for the main video and the sign language interpretation video.

Resources

Tests

Procedure

  1. Enable control in content or player to turn on sign language interpretation (unless it is always shown)

  2. Play file with sign-language interpretation

  3. Check whether sign language interpretation is displayed

Expected Results

  • #3 is true