Realtime Multimedia and the Web
Authors: JP Henot, J. Douget
CCETT, 4 rue du Clos Courtel, BP 59, 35512 Cesson-Sévigné Cedex
What is realtime multimedia?
Today the Web can be considered the most popular system for the transmission of multimedia documents, which include text, audio, video (either natural or synthetic) and other types of data.
However, transmitting the large documents required for audio, and even more so for video, can today imply a very long retrieval delay. Many applications, such as videophony or online video, need realtime behaviour: the retrieval delay (including access to the information, transmission, decoding and display) must be limited so that the service remains acceptable to the end user.
This limitation on delay can be very strict for communication applications, where interactivity between end users has to be maintained, but can be looser for consultation applications, with a delay of the order of a second (or even more).
In addition to delay, there is the synchronism of the realtime components: the different media included in the multimedia information should be synchronized with sufficient accuracy from the source (for instance when sound accompanies still pictures, or when audio and video are parts of the same multimedia content), and displayed with the best possible regularity on the receiving terminal.
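The receiving side of this requirement can be sketched as a playout scheduler that maps each media unit's source timestamp onto the local clock, then presents, delays, or drops the unit. The function names, the clock model and the 40 ms lateness threshold below are illustrative assumptions, not tied to any particular protocol:

```python
# Minimal sketch of timestamp-driven media synchronization (hypothetical
# names, not a specific protocol). Each media unit carries a presentation
# timestamp (PTS) assigned at the source; the receiver maps the PTS to
# local wall-clock time and presents, waits, or drops accordingly.

def playout_delay(pts, clock_offset, now):
    """Seconds to wait before presenting a unit stamped `pts`.

    `clock_offset` maps source time to receiver time:
    target local time = pts + clock_offset.
    A negative result means the unit is already late.
    """
    return (pts + clock_offset) - now

def schedule(units, clock_offset, now, max_lateness=0.040):
    """Classify each unit as 'present', 'wait', or 'drop'."""
    decisions = []
    for pts in units:
        delay = playout_delay(pts, clock_offset, now)
        if delay >= 0:
            decisions.append(('wait', delay))
        elif -delay <= max_lateness:
            decisions.append(('present', 0.0))   # slightly late: show now
        else:
            decisions.append(('drop', delay))    # too late to be useful
    return decisions

# Example: units stamped at 20 ms intervals, receiver clock 1.000 s ahead
# of source time, evaluated at local time 101.010.
decisions = schedule([100.000, 100.020, 100.040], clock_offset=1.0, now=101.010)
```

A slightly late unit is still shown (regularity matters more than exactness), while a badly late one is dropped rather than allowed to stall the following units.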
Another aspect that needs to be considered is network capacity. As realtime multimedia is very demanding in terms of bandwidth (and in continuity of the required capacity), two concurrent problems need to be tackled:
- how the network can face this increasing demand for bitrate;
- how a quality of service can be guaranteed to an application for a period of time (for instance by granting a minimum rate).
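Granting a minimum rate is often expressed as a traffic contract that the network can check packet by packet. A common way to model such a grant is a token bucket; the sketch below is an illustration of that general idea (the class name and figures are examples, not taken from any specific standard):

```python
# Token-bucket sketch of a "granted minimum rate" (illustrative): a flow
# granted `rate` bytes/s with burst allowance `burst` bytes may send a
# packet only when enough tokens have accumulated since the last send.

class TokenBucket:
    def __init__(self, rate, burst):
        self.rate = rate          # granted rate, bytes per second
        self.burst = burst        # maximum burst, bytes
        self.tokens = burst       # bucket starts full
        self.last = 0.0           # time of last update, seconds

    def allow(self, size, now):
        """True if a packet of `size` bytes conforms to the grant at `now`."""
        elapsed = now - self.last
        self.tokens = min(self.burst, self.tokens + elapsed * self.rate)
        self.last = now
        if size <= self.tokens:
            self.tokens -= size
            return True
        return False

# Example: a flow granted 10 kB/s with a 2 kB burst allowance.
tb = TokenBucket(rate=10_000, burst=2_000)
```

A conforming flow is carried at its granted rate; bursts beyond the contract are smoothed or rejected rather than allowed to disturb other realtime flows.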
In some cases, multicasting could be a solution to reduce the load on the network (for broadcast-type services), but preserving strong interactivity will still require that most services remain unicast.
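On IP networks, the broadcast-type case maps onto IP multicast, where many receivers join a shared group address instead of each opening a unicast connection. A minimal receiver using the standard socket API might look as follows (the group address and port are arbitrary examples):

```python
# Sketch of receiving a broadcast-type stream via IP multicast, using the
# standard socket API. The group address and port are arbitrary examples.
import socket
import struct

GROUP = '224.1.1.1'   # example multicast group address
PORT = 5004           # example port

def open_multicast_receiver(group, port):
    """Return a UDP socket bound to `port` and joined to `group`."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(('', port))
    # Membership request: group address + local interface (INADDR_ANY).
    mreq = struct.pack('4s4s', socket.inet_aton(group),
                       socket.inet_aton('0.0.0.0'))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock
```

The point of the group join is that the network replicates packets toward interested receivers, so the sender's load does not grow with the audience; per-user interactivity, by contrast, still needs a unicast path.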
Last but not least, there is a need to provide seamless services from narrowband to broadband. Digital TV and multimedia decoders, which are today in a commercial deployment phase, could be adapted to access Internet services, but some interoperability problems have to be solved, especially regarding the protocol stack used to transport audio and video. DAVIC and the IETF are working to solve these problems.
Evolution and interoperability
For realtime multimedia to become general on the Web, a minimum set of standards is needed, in order to cope with most of the problems quoted above. The ultimate goal is to offer Web users a simple solution that enables interoperability of servers and applications and works on any kind of network.
The basic principles to be defined include:
- the description of the individual media, including audio and video, with scalability in terms of bitrate to fit different networks (POTS, ADSL...) and different terminals;
- the description of the multimedia documents, including the synchronisation of, and the links between, the individual contents;
- the protocols for the transmission of the multimedia documents, with the goals of limiting delay, maintaining display synchronism and providing seamless services.
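The transport point above can be illustrated by the kind of header the IETF's RTP protocol attaches to each media packet: a sequence number to detect loss and reordering, and a media timestamp to restore display synchronism at the receiver. The sketch below is a simplified illustration of that idea, not the actual RTP wire format:

```python
# Simplified sketch of an RTP-like packet header: a sequence number for
# ordering/loss detection and a timestamp for display synchronism.
# Illustrative only -- this is NOT the exact RTP wire format.
import struct

# Network byte order: 16-bit sequence number, 32-bit timestamp,
# 32-bit source identifier.
HEADER = struct.Struct('!HII')

def pack_packet(seq, timestamp, ssrc, payload):
    """Prefix `payload` with a minimal realtime transport header."""
    return HEADER.pack(seq & 0xFFFF, timestamp, ssrc) + payload

def unpack_packet(data):
    """Split a packet back into (seq, timestamp, ssrc, payload)."""
    seq, timestamp, ssrc = HEADER.unpack_from(data)
    return seq, timestamp, ssrc, data[HEADER.size:]
```

With such a header the receiver can reorder or discard late packets using the sequence number and schedule display from the timestamp, without the transport itself having to guarantee in-order delivery.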
Furthermore, this minimum set of descriptions needs to remain open, so as to fit different physical platforms and to accommodate further extensions, such as the synthetic/natural hybrid coding currently in preparation.
Previous experience at CCETT
CCETT is involved in a number of standardization bodies and international forums which deal with multimedia services:
- the ISO MHEG and MPEG (and DSM-CC) groups, on the representation of multimedia information;
- DAVIC (Digital Audio-Visual Council), for the interoperability of multimedia and audiovisual services;
- DAB (Digital Audio Broadcasting) and DVB (Digital Video Broadcasting), for audio and TV broadcasting.
JP Henot has been involved in the standardization of video encoding (MPEG in particular) and is responsible for the MPEG video developments at CCETT.