
Technique SM12: Providing captions through synchronized text streams in SMIL 2.0

Applicability

Applies to SMIL 2.0

This technique relates to:

Description

The objective of this technique is to enable people who are deaf or who otherwise have trouble hearing the dialogue in audiovisual material to view the material. With this technique, all of the dialogue and important sounds are made available in a text stream that is displayed in a caption area.

With SMIL 2.0, separate regions can be defined for the video and the captions. The captions and video playback are synchronized, with the caption text displayed in one region of the screen, and the corresponding video displayed in another region.

Examples

Example 1: SMIL 2.0 caption sample for RealMedia player

<?xml version="1.0" encoding="UTF-8"?>
<smil xmlns="http://www.w3.org/2001/SMIL20/Language">
  <head>
    <layout>
      <root-layout backgroundColor="black" height="310" width="330"/>
      <region id="video" backgroundColor="black" top="5" left="5" 
      height="240" width="320"/>
      <region id="captions" backgroundColor="black" top="250" 
      height="60" left="5" width="320"/>
    </layout>
  </head>
  <body>
    <par>
      <video src="salesdemo.mpg" region="video" title="Sales Demo"
      alt="Sales Demo"/>
      <textstream src="salesdemo_cc.rt" region="captions" systemCaptions="on" 
      title="captions" alt="Sales Demo Captions"/>
    </par>
  </body>
</smil>

The example shows a <par> element containing a <video> and a <textstream> element, so the two play in parallel. The systemCaptions attribute indicates that the text stream should be displayed only when the user's player preference for captions is turned on. The <layout> section defines the regions used for the video and the captions.
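The text stream referenced by src="salesdemo_cc.rt" is an external RealText file. A minimal sketch of what such a file might contain is shown below; the caption wording, timing values, and window attributes are illustrative assumptions, and the exact markup supported depends on the player.

<window type="generic" duration="0:30" width="320" height="60"
    bgcolor="black">
  <font color="white">
    <time begin="1"/><clear/>Narrator: Welcome to the sales demo.
    <time begin="4"/><clear/>[upbeat music]
    <time begin="7"/><clear/>Narrator: First, open the product catalog.
  </font>
</window>

Each <time> tag marks when the following caption text appears, matching the timeline of the video it accompanies.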

Example 2: SMIL 2.0 caption sample with internal text streams for RealMedia player

 
<?xml version="1.0" encoding="UTF-8"?>
<smil xmlns="http://www.w3.org/2001/SMIL20/Language">
  <head>
    <layout>
      <root-layout backgroundColor="black" height="310" width="330"/>
      <region id="video" backgroundColor="black" top="5" left="5" 
      height="240" width="320"/>
      <region id="captions" backgroundColor="black" top="250" 
      height="60" left="5" width="320"/>
    </layout>
  </head>
  <body>
    <par>
      <video src="salesdemo.mpg" region="video" title="Sales Demo" 
      alt="Sales Demo"/>
      <text src="data:,This%20is%20inline%20text." region="captions" 
      begin="0s" dur="3">
        <param name="charset" value="iso-8859-1"/>
        <param name="fontFace" value="System"/>
        <param name="fontColor" value="yellow"/>
        <param name="backgroundColor" value="blue"/>
      </text>
      <text src="data:,This%20is%20a%20second%20text." 
      region="captions" begin="3s" dur="3">
        <param name="charset" value="iso-8859-1"/>
        <param name="fontFace" value="System"/>
        <param name="fontColor" value="yellow"/>
        <param name="backgroundColor" value="blue"/>
      </text>
    </par>
  </body>
</smil>

This example uses <text> elements with data: URLs to include the caption text directly within the SMIL file, rather than in an external text stream. The begin and dur attributes synchronize each caption with the corresponding portion of the video.
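As in Example 1, the systemCaptions test attribute can be added so that the inline captions are shown only when the viewer's player preference for captions is enabled. A minimal sketch of one such <text> element follows; the attribute is the only change from Example 2, and whether a given player honors it for inline text is an assumption.

<text src="data:,This%20is%20inline%20text." region="captions"
    begin="0s" dur="3" systemCaptions="on">
  <param name="charset" value="iso-8859-1"/>
  <param name="fontFace" value="System"/>
  <param name="fontColor" value="yellow"/>
  <param name="backgroundColor" value="blue"/>
</text>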


Tests

Procedure

  1. Enable the caption preference in the player, if present
  2. Play the file with captions
  3. Check whether captions are displayed

Expected Results

  • #3 is true