
Global Time-based Synchronization of Real-time Multimedia Streaming

Moon Hae Kim


Konkuk University
Seoul 143-170, Korea
mhkim@konkuk.ac.kr
Eun Hwan Jo
Konkuk University
Seoul 143-170, Korea
ehjo@konkuk.ac.kr
Doo-Hyun Kim
ETRI
Tae-Jon 305-350, Korea
doohyun@etri.re.kr
Abstract
The goal of distributed multimedia applications is to
provide reliable high-quality multimedia services to users
on a network. Typical distributed multimedia applications
use multimedia streaming that continuously processes and
transports time-based media such as audio and video on a
network. Inter-media synchronization and intra-media
synchronization should be supported to facilitate high-quality,
reliable multimedia services. In this paper, we
present an approach to developing distributed multimedia
applications by using a real-time multimedia object model
named MMStream TMO and a synchronization scheme for
real-time streaming based on MMStream TMO and global
time.
1. Introduction
Real-time technology has long been applied to diverse fields of computing, including telemetry, industrial automation, process control, and military systems. More recently, the advent of high-speed networks and improvements in computer performance at affordable cost have made it possible to incorporate real-time features into distributed multimedia systems. The goal of a distributed multimedia application is to provide reliable, high-quality multimedia services to users on a network. Common distributed multimedia systems use streaming techniques that process and transport digitized media over a network. However, current multimedia streaming technology is not well suited to the development of complex real-time multimedia applications such as video conferencing systems, for the following reasons. Firstly, most multimedia streaming technologies cannot guarantee reliable service to users because they do not provide timely-service capabilities. Secondly, it is difficult to develop a distributed multimedia application that runs on heterogeneous hardware platforms and networks when it is based on a client/server architecture. Thirdly, owing to the lack of a real-time multimedia streaming API, it is not easy to develop complex multimedia applications.
Therefore, we present a real-time multimedia streaming
technology to facilitate the development of reliable
multimedia applications.
The basic structure of the proposed technology is
composed of MMStream TMO (MultiMedia Stream
TMO) and MMTMOSM (MMStream TMO Support
Middleware). MMStream TMO is a specialized form of
the TMO model for distributed multimedia applications.
MMTMOSM is a distributed middleware for real-time
multimedia streaming services. The functions of an MMTMOSM include the timely acquisition, transformation, and transportation of multimedia data.
The rest of this paper is structured as follows. In Section 2, we briefly describe the requirements of multimedia streaming and the TMO model, which is the basis of our approach. Section 3 presents the framework for the development of multimedia applications based on TMO and describes the structure and functions of MMTMOSM in detail. In Section 4, we show the characteristics of the MMTMOSL API. Section 5 presents a synchronization scheme for real-time streaming based on global time. Finally, Section 6 summarizes the paper and outlines future work.
2. Background
In this section, we briefly describe requirements in
current multimedia streaming and the TMO structuring
scheme.
2.1 Basic requirements in multimedia streaming
Generally, the basic multimedia streaming functions are
as follows [3, 6].
• Transportation of a sequential multimedia stream from a source (such as a file, device, or network) to a sink (such as a file, device, or network).
• Transformation and processing of multimedia data, including filtering, encoding, decoding, multiplexing, de-multiplexing, mixing, and synchronization.
• Synchronization of audio and video, so that they can be started and stopped at the same time and played at the same rate.
(This research was supported by the University IT Research Center Project and by ETRI.)
In addition, in order to provide high quality multimedia
streaming services, the following factors should be
considered:
• A multimedia stream contains large amounts of data, which must be processed in a timely fashion.
• Data can come from many sources, including local files, computer networks, and video cameras.
• A developer does not know in advance which hardware devices will be used on the end-user's system.
2.2 TMO Structuring Scheme
TMO is a natural, syntactically minor, and semantically powerful extension of the conventional object model [5, 6, 7]. As depicted in Figure 1, the basic TMO structure consists of four parts (a minimal skeleton illustrating them is sketched after this list):
• Object Data Store (ODS): a list of object-data-store segments (ODSSs), which store real-time data.
• Environment Access Capability (EAC): a list of entry points to remote object methods, logical communication channels, and I/O device interfaces.
• Spontaneous Method (SpM): a new type of method, also known as a time-triggered (TT) method. SpM executions are triggered when the real-time clock reaches specific values determined at design time.
• Service Method (SvM): a conventional service method. SvM executions are triggered by service request messages from clients.
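For readers unfamiliar with the model, the following self-contained C++ sketch mimics the ODS/SpM/SvM parts described above (the EAC is omitted). The class and member names are illustrative only and are not the TMOSL API; in a real system the TMO engine, not a loop in main(), releases the SpM according to its AAC.

    #include <chrono>
    #include <cstdio>
    #include <mutex>
    #include <thread>

    struct SensorOdss {                 // ODS: one object-data-store segment (ODSS)
        std::mutex guard;               // concurrency control for SpM/SvM access
        int latest_sample = 0;
    };

    class ExampleTMO {
        SensorOdss odss_;               // Object Data Store
    public:
        // SpM: time-triggered method, released periodically by the engine's real-time clock.
        void sampleSpM() {
            std::lock_guard<std::mutex> lock(odss_.guard);
            odss_.latest_sample++;      // e.g., acquire one media sample
        }
        // SvM: message-triggered method, invoked on a client's service request.
        int readSvM() {
            std::lock_guard<std::mutex> lock(odss_.guard);
            return odss_.latest_sample;
        }
    };

    int main() {
        ExampleTMO tmo;
        for (int period = 0; period < 5; ++period) {        // stand-in for time-triggered releases
            tmo.sampleSpM();
            std::this_thread::sleep_for(std::chrono::milliseconds(100));
        }
        std::printf("samples so far: %d\n", tmo.readSvM());  // stand-in for a client request
        return 0;
    }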
The reasons for using TMO for our design framework
are as follows:
• The TMO model provides an abstract mechanism for designing and implementing complex distributed multimedia systems.
• The TMO model supports accuracy analysis and the specification of stream synchronization times.
• The TMO execution engine, TMOSM (TMO Support Middleware), can support precise stream synchronization.
• TMO provides a high-level API for real-time application developers.
3. Real-time Multimedia Streaming
Architecture based on TMO
Currently, most multimedia applications are
client/server systems that send and present time-based
media (such as audio, video, animation, etc.) on a network.
However, a client/server architecture is unsuitable for the development of reliable distributed multimedia applications, such as a real-time video conferencing system, because it is difficult to satisfy the inter-media and intra-media synchronization needed for high-quality multimedia streaming. Therefore, we present a framework
for the development of multimedia applications based on
the TMO structuring scheme. The characteristics of this
framework are as follows:
• The framework supports an environment for the design and implementation of distributed multimedia applications using the MMStream TMO.
• The framework provides MMTMOSM (MMStream TMO Support Middleware) and MMTMOSL (MMTMOSM Support Library) to facilitate real-time multimedia stream processing.
3.1 Real-time Multimedia Streaming Framework
Based on TMO
The purpose of this framework is twofold. Firstly, the framework helps developers design complex distributed real-time multimedia applications. Secondly, it provides the functions described in Section 2 together with real-time streaming facilities. In order to support timely streaming services, the framework middleware must guarantee multimedia stream synchronization. The distributed middleware instances synchronize with each other through global time information.
Figure 2 shows the relationship between the application and the framework components. In the following, we describe the role of each framework component:
• MMStream TMO: MMStream TMO is a specialized form of TMO that processes multimedia streams at the application level. The use of MMStream TMO for the development of real-time multimedia applications is supported by MMTMOSL and TMOSL. MMStream TMOs distributed over multiple nodes manage multimedia streams cooperatively.

[Figure 1. Structure of TMO [5, 6, 7]: the ODS (ODSS 1, ODSS 2, ... under concurrency control), the EAC (capabilities for accessing other TMOs and the network environment, including logical multicast channels and I/O devices), time-triggered (TT) spontaneous methods (SpMs) with autonomous activation conditions (AACs) in the absolute and relative time domains, and message-triggered (MT) service methods (SvMs) with deadlines, a service request queue, and a reservation queue serving client TMOs.]
• MMTMOSL API: This API contains high-level functions for real-time multimedia applications based on MMStream TMO. The MMTMOSL provides the multimedia streaming support middleware services to MMStream TMOs.
• TMOSL API: This API contains high-level functions for real-time applications based on TMO. The TMOSL provides the basic classes for TMO.
• TMO Execution Engine: This engine supports the TMO programming model with precise action timings.
3.2 Architecture of Real-time Multimedia
Streaming Support Middleware
In this section, we discuss the architecture of
MMTMOSM that supports basic real-time multimedia
streaming facilities. Figure 3 shows a multimedia-
streaming environment based on MMTMOSM. The
purposes of MMTMOSM are as follows:
• MMTMOSM provides a real-time multimedia streaming interface that facilitates the development of multimedia applications.
• MMTMOSM provides an environment in which distributed multimedia applications (MMStream TMO1, MMStream TMO2, and MMStream TMO3 in Figure 3) can be executed irrespective of the heterogeneity of the underlying platforms (TMO execution engine and OS).
• MMTMOSM enables timely multimedia streaming services by using the real-time service functionalities provided by TMOSM, TMO-Linux, or KELIX/RT.
Execution of an MMTMOSM results in the automatic processing of multimedia streams. This proceeds as follows. The MMTMOSM gets stream data from a local file, capture card, or network. It then performs, on the stream data, the specific type of processing or transformation designated at design time. In the next phase, it sends the transformed stream data to another MMTMOSM or presents it to users on their output devices. The most important issue here is time management; accordingly, the MMTMOSM processes stream data in a timely fashion to maintain its real-time properties.
MMTMOSM consists of three sections: SpM section,
SvM section, and an ODS section. The basic features of
MMTMOSM are shown in Figure 4.

[Figure 2. Overview of our framework: a distributed multimedia application (MMStream TMO) uses the MMTMOSL API and the TMOSL API; below them sit the MMStream TMO Support Middleware and the TMO execution engine, running on Windows XP, Linux, or an embedded OS and accessing devices such as a digital camera, microphone, monitor, speaker, disk, and network.]

[Figure 3. Multimedia streaming environment based on MMTMOSM: MMStream TMO1 runs on the MMStream TMO Support Middleware over TMOSM/Windows, MMStream TMO2 over TMO-Linux/Linux, and MMStream TMO3 over KELIX/RT/embedded Linux.]

[Figure 4. The basic structure of MMTMOSM: within an application (MMStream TMO1), a Source_SpM, Trans_SpM, and Sink_SpM exchange stream data through an A/V ODSS and a transformed-A/V ODSS under AAC1-AAC3; an SvM and the Middleware Communication Service Manager exchange communication service messages with a peer application (MMStream TMO2), while the underlying TMOSM, KELIX/RT, or TMO-Linux supplies the global time.]
• ODS: The ODS stores stream data for buffering so as to synchronize intra-media stream data. It is also used as the connection point for the SpM filters that take part in streaming. Access to the ODS is performed through periodic read and write operations of the SpMs.
• Source_SpM: It reads multimedia data from a media source such as a capture device, disk, or network, automatically and according to its AAC. The SpMs start and stop streaming as specified by their AACs, in a timely fashion, to provide inter-media and intra-media synchronization.
• Proc_SpM: It performs the transformation of the multimedia data received from a source. The functions performed here include encoding, decoding, weaving, and splitting.
• Sink_SpM: It sends the transformed multimedia data to a media destination such as a monitor, speaker, disk, or network. In particular, Sink_SpM sends or receives multimedia data through a UDP socket.
• SvM: MMTMOSM provides SvMs that handle communication services and session service events. An SvM converts a service message from another MMTMOSM into a service event and delivers that event to the application; conversely, it converts application events into network messages and transmits them to the appropriate middleware.
MMTMOSM provides these basic functions to support both inter-media and intra-media synchronization; a simplified sketch of one streaming period is given below.
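To make the cooperation of these SpMs through ODSSs concrete, the following self-contained C++ sketch runs one Source/Proc/Sink step per period over two shared buffers. The types and function names are placeholders for illustration only and are not part of the MMTMOSM implementation; in the real middleware each step would be an SpM released by its AAC rather than a plain function call.

    #include <cstdio>
    #include <deque>
    #include <vector>

    using Frame = std::vector<unsigned char>;

    std::deque<Frame> rawOdss;        // stands in for the raw A/V ODSS written by Source_SpM
    std::deque<Frame> encodedOdss;    // stands in for the transformed A/V ODSS written by Proc_SpM

    void sourceSpMStep() {            // Source_SpM: read one captured frame from a device
        rawOdss.push_back(Frame(640 * 480, 0));
    }

    void procSpMStep() {              // Proc_SpM: transform (e.g., encode) one buffered frame
        if (!rawOdss.empty()) {
            Frame raw = rawOdss.front();
            rawOdss.pop_front();
            encodedOdss.push_back(Frame(raw.size() / 10, 0));   // stand-in for encoding
        }
    }

    void sinkSpMStep() {              // Sink_SpM: send or present one transformed frame
        if (!encodedOdss.empty()) {
            std::printf("presenting/sending %zu bytes\n", encodedOdss.front().size());
            encodedOdss.pop_front();
        }
    }

    int main() {
        for (int period = 0; period < 3; ++period) {   // each period would be driven by the SpMs' AACs
            sourceSpMStep();
            procSpMStep();
            sinkSpMStep();
        }
        return 0;
    }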
Figure 5 shows an example of the MMTMOSM processing structure and stream flow. Source_SpM1 and Source_SpM2 are activated to capture raw media data from a USB camera and a microphone. The media data received from the sources is encoded by Encode_SpM. The encoded media data is then sent to the network by Sink_SpM1. Simultaneously, Sink_SpM2 and Sink_SpM3 present the video and audio locally.
4. Structure of MMTMOSL API
In this section, we describe MMTMOSL, the high-level API supported by MMTMOSM. Application developers can develop distributed multimedia applications based on TMO using the MMTMOSL API. The purposes of MMTMOSL are as follows:
• support for the acquisition, transformation, and transportation of raw media;
• provision of the MMTMOSM service interface.
The MMTMOSL consists of classes such as the MMStreamTMO class, MMTMOSM class, MediaSource class, MediaSink class, MediaProcessor class, MediaTime class, and MediaPlayer class.
Table 1 illustrates the main MMTMOSL API.
Figure 6 illustrates a local video streaming application using three SpMs: a MediaSource SpM, a MediaProcessor SpM, and a MediaSink SpM. While the MediaSource SpM reads raw video data from a USB camera every 50 ms, the MediaSink SpM writes the data produced by the MediaProcessor SpM to the display device. The MediaProcessor SpM performs video processing such as chroma-keying, alpha-blending, and graphics overlay. The MediaProcessor SpM is also in charge of compressing the raw video data when the video is to be sent to a remote node over the network.
[Figure 5. Example of MMTMOSM processing: Source_SpM1 (AAC1) captures raw video from a camera into a raw video ODSS and Source_SpM2 (AAC1) captures raw audio from a microphone into a raw audio ODSS; Encode_SpM (AAC2) writes an encoded media frame ODSS; Sink_SpM1 (AAC3) sends the encoded frames to the network, while Sink_SpM2 and Sink_SpM3 (AAC4) present the video and audio on the monitor and speaker; an SvM and the Middleware Communication Service Manager handle middleware communication.]

Table 1. Main API of MMTMOSL
• MMStreamTMO class: a template for MMStream TMOs. MMStream TMO programmers can define their own TMO classes by inheriting from the MMStreamTMO class.
• MMTMOSM class: an abstraction of the basic TMOSM services related to time-based media capturing, processing, and presentation. MMStream TMO programmers can register their own MMStream TMOs by defining an MMTMOSM class.
• MediaSource class: an abstraction of any object that receives data from a stream source. Once constructed, it functions automatically.
• MediaSink class: an abstraction of any object that provides data to a stream destination. Once constructed, it functions automatically.
• MediaProcessor class: an abstraction of any object that transforms media data. This processing includes encoding, decoding, weaving, and splitting.
• MediaOdss class: an abstraction of any object that stores media data for buffering.
• MediaTime class: derived from the tms class of TMOSL. It represents a time condition for the presentation of time-based media and specifies streaming time values such as the start time, stop time, and current time.
• MediaPlayer class: a user-friendly interface for streaming flow control.

[Figure 6. An illustration of a local streaming application using MMTMOSM: at the application level, MMStreamTMOApp registers and activates, at the middleware level, a MediaSource SpM (AAC1, 50 ms) that reads media data from a USB camera, a MediaProcessor SpM (AAC2, 50 ms) that processes it, and a MediaSink SpM (AAC3, 50 ms) that presents it on the monitor; the SpMs exchange data through MediaOdss1 and MediaOdss2 (push/get) and are started together via synchStart under a common MediaTime, realizing the chain of USB camera capture, compress/decompress, send across the network, and monitor presentation.]

To facilitate the development of the above application based on TMO, the MMTMOSL API can be used as follows:
• Define an MMStreamTMOApp class by inheriting MMStreamTMO:
- declare source and sink objects as instances of the MediaSource and MediaSink classes, respectively;
- declare a streaming time object as an instance of the MediaTime class;
- define a player object as an instance of the MediaPlayer class.
• Implement the member functions of the MMStreamTMOApp class:
- make the MMStreamTMOApp constructor register the source and sink objects to MMTMOSM by using a member function of the MMStreamTMO class, namely int RegisterSS(MediaSource1, MediaSource2, MediaSink1, MediaSink2);
- make the MMStreamTMOApp constructor activate MMTMOSM by using a member function of the MMStreamTMO class, namely void StartMMTMOSM(MediaTime, MMTMOSM_start_time);
- implement the SpMs and SvMs defined in the MMStreamTMOApp class.
Figure 7 shows parts of the implementation of the above example application.
Figure 7. Sample code using MMTMOSL (excerpts):

    // MMTMOSM class
    class MMTMOSM : public TMOBaseClass
    {
    private:
        char TMO_Name[MAX_TMONAME];
        int id;
        tms TMO_start_time;                    // MMTMOSM start time
        MediaBuffer buffer[MAX_BUFF];          // ODSS
        MediaSource source[MAX_SRC];           // SpM
        MediaSink sink[MAX_SINK];              // SpM
        MediaProcessor processor[MAX_PROC];    // SpM
        MediaPlayer player;                    // SvM
    public:
        void activate(char* tmo_name, tms TMO_start_time);  // register and activate MMTMOSM
        int getTMOID();                                      // return TMO id
    };

    // MMStreamTMO class
    #include "TMOSL.h"
    #include "MMTMOSL.h"
    class MMStreamTMO : public TMOBaseClass
    {
    private:
        char TMO_Name[MAX_TMONAME];
        int id;
        tms TMO_start_time;
        MMTMOSM mmTMOsm;
        MediaTime streamingTime1;              // streaming AAC specification
    public:
        void activate(char* tmo_name, tms TMO_start_time);  // register and activate this TMO
        int getTMOID();                                      // return TMO id
        void startMMTMOSM(MediaSource& source1, MediaSink& sink1,
                          MediaSource& source2, MediaSink& sink2,
                          MediaTime& mediaTime1, tms mmTMOsm_start_time);
    };

    // Application class
    #include "MMTMOSL.h"
    class MMStreamApp : public MMStreamTMO
    {
    private:
        MediaSource source1;   // source creation
        MediaSink sink1;       // sink creation
        PlaySvM1 TSvM1;        // stream player
        // ...
    public:
        MMStreamApp(char*, char*, char*, tms, tms);
    };

    // Application constructor and main function
    MMStreamApp::MMStreamApp(char* TMO_name) {
        tms mmTMOsm_start_time = tm4_DOS_age(5*1000*1000);
        tms gate_start_time = tm4_DOS_age(7*1000*1000);
        RegisterSS(mediaTime1, mmTMOsm_start_time);
        StartMMTMOSM(source1, sink1, NULL, NULL);
        activate(TMO_name, TMO_start_time1);
    }

    void main()
    {
        StartTMOengine();
        // ...
        MainThrSleep();
    }

5. Multimedia synchronization scheme based on TMO

A multimedia data stream transferred from a source node to one or more sink nodes consists of consecutive logical data units (LDUs). In the case of an audio stream, the LDUs are individual samples, or blocks of such samples, that are observed periodically and transferred together from the source to the sink(s). Similarly, in the case of a video data stream, one LDU typically corresponds to a single video frame. The data elements in a stream must be presented at the sink node in a manner that exhibits the same temporal relationships that existed among the data elements when they were captured at the source node. Timing the movement or play of each data element so as to maintain the desired temporal relationships among the data elements of a single data stream is called intra-stream synchronization.
Such temporal relationships must also be maintained across related data streams. One of the most common situations where this requirement is evident is the simultaneous playback of audio and video with so-called "lip synchronization". If the two media are not played in close synchronization, the result will not be perceived as high quality. In general, inter-stream synchronization involves relationships among the data of various kinds of media, including pointers, graphics, images, animations, text, audio, and video. In cases where multiple senders and multiple receivers work simultaneously, such as in multi-party non-moderated video-conferencing applications, the inter-stream synchronization requirements are highly challenging [8, 9].
A fundamental approach that holds great promise in this area, but has been exploited only minimally by the research community, is global-time-based coordination of distributed actions [9, 10, 11]. The main reason for the slow adoption of this approach has been the lack of economically viable facilities for providing a good-quality global time base. Recent advances in GPS receiver technology, as well as the emergence of hardware and software mechanisms for high-quality synchronization of the real-time clocks in networked computing platforms, have removed that obstacle [11].
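As a simple illustration of what global-time-based coordination buys, the following self-contained C++ sketch applies one common playout rule: every LDU carries the global time at which it was captured, and every sink presents it at that timestamp plus a fixed offset larger than the worst-case delivery delay, so related streams stay aligned without any negotiation among the sinks. The structure and the offset value are illustrative assumptions, not the exact mechanism of MMTMOSM.

    #include <cstdint>
    #include <cstdio>

    struct Ldu {                        // one logical data unit (e.g., a video frame or audio block)
        int64_t capture_time_us;        // global timestamp taken at the source node
    };

    // All sinks apply the same fixed playout offset, chosen to exceed the worst-case
    // end-to-end delay; 300 ms here is only an example value.
    const int64_t kPlayoutOffsetUs = 300 * 1000;

    int64_t presentationTimeUs(const Ldu& ldu) {
        return ldu.capture_time_us + kPlayoutOffsetUs;
    }

    int main() {
        Ldu videoFrame{1000000};        // video frame and audio block captured at the same global time
        Ldu audioBlock{1000000};
        std::printf("present video at %lld us, audio at %lld us\n",
                    (long long)presentationTimeUs(videoFrame),
                    (long long)presentationTimeUs(audioBlock));
        return 0;
    }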
In order to achieve inter-stream synchronization, the global timing feature provided by TMOSM can be used relatively easily by specifying temporal conditions in the AACs of Source_SpM and Sink_SpM at design time, as follows:

    for T = from TMO_START + S + F to TMO_START + P every SY
        start-during (T, T + OS_DELAY_FOR_STREAMING)
        finish-by T + DEADLINE_FOR_STREAMING
Here S and P denote the Start time and the StoP time, and SY and F denote the intra-SYnchronization period and the pre-Fetch time for initial buffering before streaming starts. An SpM with the above AAC starts at time TMO_START + S + F and is executed with period SY until time TMO_START + P. Each activation must start between T and T + OS_DELAY_FOR_STREAMING and complete its task by T + DEADLINE_FOR_STREAMING. T is a time variable, and TMO_START refers to the start time of the TMO execution engine. OS_DELAY_FOR_STREAMING is the time the OS spends activating an SpM. Each stream may have a different start time S; the start times of the individual streams therefore specify the inter-stream synchronization, while SY specifies the intra-stream synchronization. By giving appropriate values to these times, the flow of streaming can be controlled.
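A concrete reading of this specification, with invented example values, is sketched below; it simply enumerates the activation windows that the AAC above would impose on a Source_SpM.

    #include <cstdio>

    int main() {
        const long TMO_START = 0;       // start time of the TMO execution engine (ms)
        const long S  = 2000;           // stream start time
        const long F  = 500;            // pre-fetch time for initial buffering
        const long P  = 10000;          // stream stop time
        const long SY = 40;             // intra-synchronization period (e.g., 25 frames/s)
        const long OS_DELAY_FOR_STREAMING = 5;
        const long DEADLINE_FOR_STREAMING = 30;

        // Print the first five activation windows implied by the AAC above.
        long T = TMO_START + S + F;
        for (int k = 0; k < 5 && T <= TMO_START + P; ++k, T += SY) {
            std::printf("start during [%ld, %ld] ms, finish by %ld ms\n",
                        T, T + OS_DELAY_FOR_STREAMING, T + DEADLINE_FOR_STREAMING);
        }
        return 0;
    }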
A well-synchronized global time base is essential for proper and timely operation of the distributed real-time multimedia system. Local clocks in distributed computing nodes diverge because of differences in their clock drift rates. Each local clock should therefore be adjusted periodically so that, at any given time, the difference among the local clocks of all participating streaming nodes is bounded by a specified deviation. Each node also has to keep its clock within a bounded accuracy with respect to an external source of standard time such as GPS. A TMO execution engine such as TMOSM obtains its time information from a high-resolution performance counter provided by the hardware.
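The required adjustment period follows from standard reasoning about drift (the numbers below are illustrative, not measurements from our platform): if each local clock drifts from real time by at most rho, two clocks can drift apart at up to 2*rho, so keeping their mutual deviation below epsilon requires resynchronizing at least every epsilon / (2*rho).

    #include <cstdio>

    int main() {
        const double rho = 50e-6;        // maximum clock drift rate, 50 ppm (illustrative)
        const double epsilon = 0.005;    // allowed deviation between any two nodes: 5 ms
        const double period = epsilon / (2.0 * rho);   // two clocks diverge at up to 2*rho
        std::printf("resynchronize local clocks at least every %.1f s\n", period);  // 50.0 s
        return 0;
    }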
6. Summary
We proposed a framework for the development of real-
time multimedia applications based on the TMO
structuring scheme. The framework consists of MMStream
TMO, MMTMOSL, TMOSL, MMTMOSM, and the TMO
execution engine. The MMTMOSM performs reliable
stream I/O and transformation so that it can guarantee both
inter-media synchronization and intra-media
synchronization. Consequently, MMStream TMO provides
a way to model a distributed multimedia system easily and
to specify real-time characteristics of streaming precisely.
The quality of the multimedia services is largely dependent on the accuracy of the inter-media and intra-media synchronization. Currently, we are applying our framework to the development of more complex multimedia applications.
REFERENCES
[1] Kim, D.H., "An Extended Object Composition Model for Distributed Multimedia Services," Proc. WORDS 2002 (IEEE Workshop on Object-oriented Real-time Dependable Systems), San Diego, Jan. 2002.
[2] Kim, D.H., "A TMO-based Software Architecture for Distributed Real-time Multimedia Processing," Proc. WORDS 2003 (IEEE Workshop on Object-oriented Real-time Dependable Systems), Guadalajara, Mexico, Jan. 2003.
[3] Sitaram, D. and Dan, A., Multimedia Servers: Applications, Environments, and Design, Morgan Kaufmann Publishers, San Francisco, 2000.
[4] IONA Technologies, Plc., Siemens-Nixdorf AG, and Lucent Technologies, Inc., "Control and Management of Audio/Video Streams," OMG RFP Submission, supported by Fore Systems, Inc., May 7, 1997.
[5] Kim, K. and Kopetz, H., "A Real-Time Object Model RTO.k and an Experimental Investigation of Its Potentials," Proc. COMPSAC '94, Taipei, Nov. 1994, pp. 392-402.
[6] Kim, K., "Toward New-Generation Object-Oriented Real-Time Software and System Engineering," SERI Journal, Vol. 1, No. 1, Korea, January 1997, pp. 1-13.
[7] Kim, K., "Object-Oriented Real-Time Distributed Programming and Support Middleware," keynote paper, Proc. ICPADS 2000, Japan, July 2000, pp. 10-20.
[8] Little, T.D.C., Ghafoor, A., Chen, C.Y.R., Chang, C.S., and Berra, P.B., "Multimedia Synchronization," IEEE Data Engineering Bulletin, Vol. 14, No. 3, September 1991, pp. 26-35.
[9] Akyildiz, I.F. and Yen, W., "Multimedia Group Synchronization Protocols for Integrated Services Networks," IEEE Journal on Selected Areas in Communications, Vol. 14, No. 1, Jan. 1996, pp. 162-173.
[10] Kopetz, H. and Kim, K.H., "Temporal Uncertainties in Interaction among Real-Time Objects," Proc. IEEE CS 9th Symp. on Reliable Distributed Systems, Huntsville, AL, Oct. 1990, pp. 165-174.
[11] Kopetz, H., Real-Time Systems: Design Principles for Distributed Embedded Applications, Kluwer Academic Publishers, Boston, 1997, ISBN 0-7923-9894-7.