Figure 4. The basic structure of MMTMOSM
Proceedings of the Ninth IEEE International Workshop on Object-Oriented Real-Time Dependable Systems (WORDS03)
0-7695-2054-5/04 $20.00 2004 IEEE
The components of MMTMOSM are shown in Figure 4.
• ODS: stores stream data for buffering so as to synchronize intra-media stream data. The ODS also serves as a connection point for the SpM filters that take part in streaming; SpMs access it through periodic read and write operations.
• Source_SpM: reads multimedia data from a media source such as a capture device, disk, or network, as directed by its AAC. SpMs start and stop streaming according to their AAC specifications in a timely fashion to provide inter-media and intra-media synchronization.
• Proc_SpM: transforms the multimedia data received from the source. The functions performed here include encoding, decoding, weaving, and splitting.
• Sink_SpM: sends transformed multimedia data to a media destination such as a monitor, speaker, disk, or network. In particular, Sink_SpM sends or receives multimedia data through a UDP socket.
• SvM: MMTMOSM provides SvMs that handle communication services and session service events. An SvM converts a service message from another MMTMOSM into a service event and transmits this event to the application; conversely, it converts application events into network messages and transmits them to the appropriate middleware.
MMTMOSM provides these basic functions to support
inter-media synchronization and intra-media
synchronization.
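As a rough illustration of how these components cooperate, the following sketch models a single-node Source_SpM, ODS, Proc_SpM, ODS, Sink_SpM chain in plain C++. The class and function names here are ours for illustration only, not part of the MMTMOSL API, and the periodic AAC-driven triggering performed by a real TMO engine is reduced to direct calls:

```cpp
#include <cstddef>
#include <queue>
#include <utility>
#include <vector>

// Illustrative sketch only: a minimal single-node pipeline in the spirit of
// Source_SpM -> ODS -> Proc_SpM -> ODS -> Sink_SpM. All names hypothetical.

using Frame = std::vector<unsigned char>;

// ODS stand-in: a buffer that decouples producer and consumer SpMs.
class Ods {
    std::queue<Frame> buf_;
public:
    void write(Frame f) { buf_.push(std::move(f)); }
    bool read(Frame& out) {
        if (buf_.empty()) return false;
        out = std::move(buf_.front());
        buf_.pop();
        return true;
    }
    std::size_t depth() const { return buf_.size(); }
};

// One activation of each SpM; a real engine would trigger these
// periodically according to their AAC time specifications.
void source_spm(Ods& out) {           // capture one raw frame
    out.write(Frame(8, 0xFF));
}
void proc_spm(Ods& in, Ods& out) {    // "encode": here, just halve the size
    Frame f;
    if (in.read(f)) {
        f.resize(f.size() / 2);
        out.write(std::move(f));
    }
}
std::size_t sink_spm(Ods& in) {       // present/send one frame; return bytes
    Frame f;
    return in.read(f) ? f.size() : 0;
}
```

Each activation moves at most one frame through a stage, so the ODS depth reflects how far the producer is ahead of the consumer; this is the buffering that intra-media synchronization relies on.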
Figure 5 shows an example of MMTMOSM processing
structure and stream flow. Source_SpM1 and Source_SpM2 are activated to capture raw media data from the USB camera and the microphone, respectively. The media data received from the sources is encoded by Encode_SpM, and the encoded media data is then sent to the network by Sink_SpM1. Simultaneously, Sink_SpM2 and Sink_SpM3 present the video and audio locally.
4. Structure of MMTMOSL API
In this section, we describe MMTMOSL as a high level
API supported by MMTMOSM. Application developers
can develop distributed multimedia applications based on
TMO using the MMTMOSL API. The purposes of designing MMTMOSL are as follows:
• Support the acquisition, transformation, and transportation of raw media
• Provide the MMTMOSM service interface
The MMTMOSL consists of classes such as
MMStreamTMO class, MMTMOSM class, MediaSource
class, MediaSink class, MediaProcessor class, MediaTime
class, and MediaPlayer class.
Table 1 illustrates the main MMTMOSL API.
Figure 6 illustrates a local video streaming application using three SpMs: a MediaSource SpM, a MediaProcessor SpM, and a MediaSink SpM. While the MediaSource SpM reads raw video data from the USB camera every 50 ms, the MediaSink SpM writes the data received from the MediaProcessor SpM to the display device. The MediaProcessor SpM performs video processing such as chroma-keying, alpha-blending, and graphics overlay. The MediaProcessor SpM is also in charge of compressing the raw video data when the video is sent to a remote node over the network.
To help a developer implement the above application based on TMO, the MMTMOSL API can be used as follows:
Figure 5. Example of MMTMOSM processing
MMStreamTMO class: A template of MMStreamTMO. MMStreamTMO programmers can define their own TMO classes by inheriting the MMStreamTMO class.
MMTMOSM class: An abstraction of the basic TMOSM services related to time-based media capturing, processing, and presentation. MMStreamTMO programmers can register their own MMStreamTMO by defining an MMTMOSM class.
MediaSource class: An abstraction for any object that receives data from a stream source. Once constructed, it functions automatically.
MediaSink class: An abstraction for any object that provides data to a stream destination. Once constructed, it functions automatically.
MediaProcessor class: An abstraction for any object that transforms media data. This processing includes encoding, decoding, weaving, and splitting.
MediaOdss class: An abstraction for any object that stores media data for buffering.
MediaTime class: Derived from the tms class of TMOSL, it represents a time condition for the presentation of time-based media and specifies streaming time values such as the start time, stop time, and current time.
MediaPlayer class: A user-friendly interface for streaming flow control.

Table 1. Main API of MMTMOSL
Figure 6. An illustration of local streaming application using MMTMOSM
• Define the MMStreamTMOApp class by inheriting MMStreamTMO:
- declare source and sink objects as instances of the MediaSource and MediaSink classes, respectively
- declare a streaming time object as an instance of the MediaTime class
- define a player object as an instance of the MediaPlayer class
• Implement the member functions of the MMStreamTMOApp class:
- make the MMStreamTMOApp constructor register the source and sink objects with MMTMOSM by using a member function of the MMStreamTMO class, that is, int RegisterSS(MediaSource1, MediaSource2, MediaSink1, MediaSink2)
- make the MMStreamTMOApp constructor activate MMTMOSM by using a member function of the MMStreamTMO class, i.e. void StartMMTMOSM(MediaTime, MMTMOSM_start_time)
- implement the SpMs and SvMs defined in the MMStreamTMOApp class
Figure 7 shows parts of the implementation of the above example application.

#include "TMOSL.h"
#include "MMTMOSL.h"

class MMTMOSM : public TMOBaseClass
{
private:
    char TMO_Name[MAX_TMONAME];
    int id;
    tms TMO_start_time;                  // MMTMOSM start time
    MediaBuffer buffer[MAX_BUFF];        // ODSS
    MediaSource source[MAX_SRC];         // SpM
    MediaSink sink[MAX_SINK];            // SpM
    MediaProcessor processor[MAX_PROC];  // SpM
    MediaPlayer player;                  // SvM
public:
    void activate(char *tmo_name, tms TMO_start_time);
        // Register and activate MMTMOSM
    int getTMOID();
        // Return TMO id
};

class MMStreamTMO : public TMOBaseClass
{
private:
    char TMO_Name[MAX_TMONAME];
    int id;
    tms TMO_start_time;
    MMTMOSM mmTMOsm;
    MediaTime streamingTime1;            // streaming AAC specification
public:
    void activate(char *tmo_name, tms TMO_start_time);
        // Register and activate this TMO
    int getTMOID();                      // Return TMO id
    void startMMTMOSM(MediaSource &source1, MediaSink &sink1,
                      MediaSource &source2, MediaSink &sink2,
                      MediaTime &mediaTime1, tms mmTMOsm_start_time);
};

class MMStreamApp : public MMStreamTMO
{
private:
    MediaSource source1;                 // source creation
    MediaSink sink1;                     // sink creation
    PlaySvM1 TSvM1;                      // stream player
public:
    MMStreamApp(char *, char *, char *, tms, tms);
};

MMStreamApp::MMStreamApp(char *TMO_name) {
    tms mmTMOsm_start_time = tm4_DOS_age(5, 1000, 1000);
    tms gate_start_time = tm4_DOS_age(7, 1000, 1000);
    RegisterSS(mediaTime1, mmTMOsm_start_time);
    StartMMTMOSM(source1, sink1, NULL, NULL);
    activate(TMO_name, TMO_start_time1);
}

void main()
{
    StartTMOengine();
    MainThrSleep();
}

Figure 7. Sample code using MMTMOSL

5. Multimedia synchronization scheme based on TMO

A multimedia data stream transferred from a source node to one or more sink nodes consists of consecutive logical data units (LDUs). In the case of an audio stream, an LDU is an individual sample, or a block of samples transferred together from the source to the sink(s), and LDUs are observed periodically. Similarly, in the case of a video data stream, one LDU typically corresponds to a single video frame. The data elements in a stream must be presented at the sink node in a manner exhibiting the same temporal relationship that existed among them when they were captured at the source node. Timing the movement or play of each data element so as to maintain the desired temporal relationship among the data elements of a single data stream is called intra-stream synchronization.
The maintenance of temporal relationships must also be applied to related data streams. One of the most commonly encountered situations where such a requirement is evident is the simultaneous playback of audio and video, which must be done with the so-called "lip synchronization". If the two media are not played in good synchronization, the result will not be accepted as being of high quality. In general, inter-stream synchronization involves relationships among the data of various kinds of media, including pointers, graphics, images, animations, text, audio, and video. In cases where multiple senders and multiple receivers work simultaneously, such as in multi-party non-moderated video-conferencing applications, the inter-stream synchronization requirements are highly challenging [8, 9].
A fundamental approach that holds great promise in this area, but has been minimally exploited by the research community, is the global-time-based coordination of distributed actions [9, 10, 11]. The slow exploitation of this promising approach has been due to the lack of economically viable facilities for providing a good-quality global time base. Recent advances in GPS receiver technology, as well as the emergence of hardware and software mechanisms for high-quality synchronization of the real-time clocks in computing platforms networked to support cooperative distributed computing, have eliminated that part of the reason [11].
In order to achieve inter-stream synchronization, the
global timing feature provided by TMOSM can be used
relatively easily by specifying temporal conditions in AAC
for Source_SpM and Sink_SpM at design time as follows:
for T = from TMO_START + S + F to TMO_START + P every SY
    start-during (T, T + OS_DELAY_FOR_STREAMING)
    finish-by (T + DEADLINE_FOR_STREAMING)
S and P denote the Start time and StoP time, and SY and F denote the intra-SYnchronization time and the pre-Fetch time for initial buffering before streaming starts. An SpM with the above AAC starts at time TMO_START + S + F and is executed with period SY until time TMO_START + P. Each activation of the SpM must start between T and T + OS_DELAY_FOR_STREAMING and complete its task by T + DEADLINE_FOR_STREAMING, where T is a time variable and TMO_START refers to the start time of the TMO execution engine. OS_DELAY_FOR_STREAMING is the time spent by the OS to activate an SpM. Each stream may have a different start time S; by choosing the start time of each stream, inter-stream synchronization can be specified, while SY governs intra-stream synchronization. Therefore, by giving appropriate values to these times, the flow of streaming can be controlled.
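Under an AAC of this form, the planned activation instants of an SpM are TMO_START + S + F, then every SY thereafter until TMO_START + P, and each activation must begin within OS_DELAY_FOR_STREAMING of its planned instant and finish by its deadline. A minimal sketch of this timing rule follows (all times in milliseconds relative to TMO_START; the helper functions are illustrative, not part of TMOSL):

```cpp
#include <vector>

// Hypothetical helpers illustrating the AAC timing rule above.
// All times are in milliseconds relative to TMO_START.

struct StreamAAC {
    long S;   // per-stream start offset (inter-stream synchronization)
    long F;   // pre-fetch time for initial buffering
    long P;   // stop offset
    long SY;  // period (intra-stream synchronization)
};

// Planned activation instants: S + F, then every SY, up to P.
std::vector<long> activation_instants(const StreamAAC& a) {
    std::vector<long> t;
    for (long x = a.S + a.F; x <= a.P; x += a.SY) t.push_back(x);
    return t;
}

// An activation planned at instant T is valid if it starts within
// [T, T + os_delay] and finishes by T + deadline.
bool activation_ok(long T, long start, long finish,
                   long os_delay, long deadline) {
    return start >= T && start <= T + os_delay && finish <= T + deadline;
}
```

Shifting S of one stream relative to another shifts every instant of that stream by the same amount, which is exactly how the start times express inter-stream synchronization.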
A well-synchronized global time base is essential for
proper and timely operation of the distributed real-time
multimedia system. Local clocks in distributed computing
nodes will diverge due to their difference in clock drift
rates. Each local clock should be adjusted periodically so
that at any given time, the difference among local clocks
of all participating distributed streaming nodes may be
bounded within a specific deviation. Local nodes have to
keep a clock within a bounded accuracy with respect to an
external source of standard time such as GPS. A TMO execution engine such as TMOSM uses the time information from a high-resolution performance counter provided by the hardware.
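A toy model of this periodic correction, assuming a constant drift rate and an idealized reset to the reference time (a simplification for illustration only, not the actual TMOSM clock synchronization algorithm), shows why the offset stays bounded by drift rate times resynchronization interval:

```cpp
// Toy model: a local clock with a constant drift rate is resynchronized
// to a reference every 'interval' ms, so its offset from the reference
// stays bounded by drift * interval. Names are illustrative only.

struct DriftingClock {
    double offset = 0.0;  // current offset from reference time (ms)
    double drift;         // drift rate (ms of error per ms of real time)
};

// Simulate 'total' ms of reference time in 1 ms steps, resynchronizing
// every 'interval' ms; returns the worst offset observed.
double run_with_resync(DriftingClock c, double total, double interval) {
    double worst = 0.0;
    double since_sync = 0.0;
    for (double t = 0.0; t < total; t += 1.0) {
        c.offset += c.drift;          // 1 ms elapses; error accumulates
        since_sync += 1.0;
        if (c.offset > worst) worst = c.offset;
        if (since_sync >= interval) { // periodic correction to reference
            c.offset = 0.0;
            since_sync = 0.0;
        }
    }
    return worst;
}
```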
6. Summary
We proposed a framework for the development of real-
time multimedia applications based on the TMO
structuring scheme. The framework consists of MMStream
TMO, MMTMOSL, TMOSL, MMTMOSM, and the TMO
execution engine. The MMTMOSM performs reliable
stream I/O and transformation so that it can guarantee both
inter-media synchronization and intra-media
synchronization. Consequently, MMStream TMO provides
a way to model a distributed multimedia system easily and
to specify real-time characteristics of streaming precisely.
The quality of multimedia services is largely dependent on the accuracy of the inter-media and intra-media synchronization. Currently, we are applying our framework to the development of more complex multimedia applications.
REFERENCES
[1] Kim, D.H., "An Extended Object Composition Model for Distributed Multimedia Services," Proc. WORDS 2002 (IEEE Workshop on Object-oriented Real-time Dependable Systems), San Diego, Jan. 2002.
[2] Kim, D.H., "A TMO-based Software Architecture for Distributed Real-time Multimedia Processing," Proc. WORDS 2003 (IEEE Workshop on Object-oriented Real-time Dependable Systems), Guadalajara, Mexico, Jan. 2003.
[3] Sitaram, D. and Dan, A., Multimedia Servers: Application,
Environments, and Design, Morgan Kaufmann Publishers, San
Francisco, 2000.
[4] IONA Technologies, Plc., Siemens-Nixdorf AG, and Lucent Technologies, Inc., "Control and Management of Audio/Video Streams", OMG RFP Submission, supported by Fore Systems, Inc., May 7, 1997.
[5] Kim, K. and Kopetz, H., "A Real-Time Object Model RTO.k and an Experimental Investigation of Its Potentials", Proc. COMPSAC '94, Taipei, Nov. 1994, pp. 392-402.
[6] Kim, K., "Toward New-Generation Object-Oriented Real-
Time Software and System Engineering", SERI Journal, Vol. 1,
No. 1, Korea, January 1997, pp.1-13.
[7] Kim, K., "Object-Oriented Real-Time Distributed Programming and Support Middleware", Keynote paper, Proc. ICPADS 2000, Japan, July 2000, pp. 10-20.
[8] Little, T.D.C., Ghafoor, A., Chen, C.Y.R., Chang, C.S., and Berra, P.B., "Multimedia Synchronization", IEEE Data Engineering Bulletin, Vol. 14, No. 3, September 1991, pp. 26-35.
[9] Akyildiz, I.F. and Yen, W., "Multimedia Group Synchronization Protocols for Integrated Services Networks", IEEE Journal on Selected Areas in Communications, Vol. 14, No. 1, Jan. 1996, pp. 162-173.
[10] Kopetz, H. and Kim, K.H., "Temporal Uncertainties in
Interaction among Real-Time Objects", Proc. IEEE CS 9th Symp.
on Reliable Distributed Systems, Huntsville, AL, Oct. 1990,
pp.165-174.
[11] Kopetz, H., Real-Time Systems: Design Principles for Distributed Embedded Applications, Kluwer Academic Publishers, Boston, 1997, ISBN 0-7923-9894-7.