
4

Audio and Video in Depth


It could certainly be said that streaming media makes up the majority of traffic on the Internet. From your favorite video portal to Internet radio, the things we watch or listen to are almost all streamed using a media server of some sort. Although many of these providers use a commercial product like FMS or Apple's Darwin Streaming Server, a few of them stream with Red5. As an aside, the Red5 development team is aware that Facebook and Yahoo labs have used Red5 and may have it deployed in their datacenters. In this chapter we will cover:

Managing streams
Stream listeners
Supported codecs
Supported file types
Bypassing firewalls

Managing streams
The act of managing streams on a media server covers publishing, playing, pausing, stopping, and deleting them in most cases. Since Red5 is completely open to whatever we can create, management of our streams may take on a multitude of variations based on these concepts. Stream management is handled via different classes depending upon where the management function is performed. On the client-side when using Flash, management of streams is handled in ActionScript via the NetStream class. The NetStream class is very well documented across the Internet and in Adobe literature, so herein we will only cover the features applicable to Red5 streaming. Red5 provides several classes for stream management on the server-side; two very important classes are ClientBroadcastStream and PlaylistSubscriberStream, both of which descend from the IStream interface.

The diagram above shows the relationship with the IStream interface for publishers. As a point of reference, most classes with an "I" prefix in the Red5 codebase are interfaces.

This diagram shows the relationship with the IStream interface for subscribers.

What's in a name?
A stream name is used as a locator for data either currently being streamed or which will be streamed at a future time. Data streaming in this instance means any digital information, whether it be audio, video, text, images, or anything else which can be encoded into bytes and sent over the network. Stream names must be unique per scope; we cannot have two streams with the name "mystream", for instance. The name selected for our live stream can be pretty much anything which has a string representation and would be a valid file name on the server's operating system. This essentially means letters, numbers, and a small set of punctuation characters; the underscore "_" and dash "-" are usually safe in terms of cross-platform support. For prerecorded streams, which are often referred to as video on demand (VOD), the stream name would be the name used for the recorded file (i.e. mystream.flv). Naming collisions may be prevented in several ways, such as querying the application for the current list of stream names, adding a time value suffix, or using a server-controlled sequence number. To return a list of stream names for a given scope name, we can utilize this method.
public ArrayCollection<String> getScopeStreamList(String scopeName) {
    ArrayCollection<String> streams = new ArrayCollection<String>();
    IScope target = null;
    if (scopeName == null) {
        target = Red5.getConnectionLocal().getScope();
        scopeName = target.getName();
    } else {
        target = ScopeUtils.resolveScope(scope, scopeName);
    }
    List<String> streamNames = getBroadcastStreamNames(target);
    for (String name : streamNames) {
        streams.add(name);
    }
    return streams;
}
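The collision-avoidance strategies mentioned above, a time value suffix and a server-controlled sequence number, can be sketched as a small helper. This is illustrative code only; the class and method names are our own invention, not part of the Red5 API.

```java
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical helper illustrating the two collision-avoidance
// strategies described in the text; not part of the Red5 API.
public class StreamNames {
    private static final AtomicLong SEQUENCE = new AtomicLong();

    // Strategy 1: append a millisecond timestamp to the base name.
    public static String withTimestamp(String base) {
        return base + "_" + System.currentTimeMillis();
    }

    // Strategy 2: append a server-controlled sequence number.
    public static String withSequence(String base) {
        return base + "_" + SEQUENCE.incrementAndGet();
    }
}
```

Either scheme produces names that remain valid file names, so recorded streams can still be written to disk without renaming.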

The method returns a collection of all the stream names for a scope, and it also contains a little bit of error prevention. The getScopeStreamList method would be best placed within our scope handler or application adapter class. Client-side ActionScript used to request the stream list may look like this:
nc.call("getScopeStreamList", new Responder(onStreamList), "streams");

This NetConnection call uses a Responder to handle the callback from the server, which will contain the list of streams currently broadcasting live in the "streams" scope. The client handler method may be implemented as shown below.
public function onStreamList(list:ArrayCollection):void {
    trace("Streams: " + list.length);
    if (list.length > 0) {
        var stream:String = list.removeItemAt(0) as String;
        trace("Getting " + stream + " from list");
    }
}

When working with streams in Red5, it is important to remember that some methods do not return what we might expect. The ClientBroadcastStream class is the default stream implementation and it exposes two methods for working with stream names.
1. getPublishedName: Returns what would normally be expected as the stream name, a name such as "live" or "mystream".
2. getName: This method, on the other hand, returns an internal name which resembles an identifier more than it does a regular stream name. An example return from this method looks like this: 15ccb8f2-2a07-467a-a02a-ff49789fba85.

Publishing in detail
Publishing is the act of sending our content to a server for subscribers to consume, or to record to a file. Of course, by content we mean pretty much anything that can be converted into bytes using codecs, as covered later on in this chapter. Red5 supports several different types of publishing by default:

Live: Analogous to experiencing something in real-time; this stream is not pre-recorded. An example of a live stream would be a video teleconference between two office locations, or a video phone call.

Video on demand: Similar to watching a television show or movie from a DVD. This media occurred in the past and is stored in a compatible digital format which may be played back upon request. VOD content may be fast-forwarded, rewound, or played from any available position in the file.

Playlist: A list of streams or video files that have been grouped together as one stream. The next item in the group is played after the previous one is complete. This may be set to repeat without client input.

Publishing a live stream


Live stream publishing is a very important capability for a media server to support. This feature allows us to broadcast anything in near real-time to as many viewers as we can support with our installation. Since Flash Player version 6, it has been incredibly simple to publish content. Following these five steps gives us the ability to stream to the world.

1. Select our audio and video sources

var mic:Microphone = Microphone.getMicrophone();
var camera:Camera = Camera.getCamera();

2. Connect to a media server using NetConnection


var nc:NetConnection = new NetConnection();
nc.connect("rtmp://localhost/myapp");

3. Create a NetStream from the NetConnection


var ns:NetStream = new NetStream(nc);

4. Attach the audio and video sources to the NetStream


ns$attac-Au"io(mic); ns$attac-Camera(camera);

5. Publish the stream


ns.publish("mystream", "live");

To stop publishing our stream, we simply issue this command.


ns.publish(false);

Providing the best quality stream possible for our subscribers is a matter of our available bandwidth and the encoder settings that we select; however, in most cases the default values that Flash Player selects are sufficient. If we have plenty of bandwidth available and would prefer high quality at standard dimensions, this is how we would configure our camera.
camera$set0o"e(45.( 56.( 54) camera$set7uality(.( 8.);

The default values for setMode are a width of 160 pixels, a height of 120 pixels, and 15 frames per second. Be aware that Flash Player will not exceed the maximum dimensions allowed by the camera hardware; if our values are too high, they will be automatically reset to the highest available value. For setQuality the default values are 0 for quality and 16384 for bandwidth. The valid range for quality is from 0 to 100, in which 1 represents the lowest quality with maximum compression and 100 is the highest quality with no compression. If quality is set to 0, picture quality will be varied to prevent exceeding the available bandwidth. The bandwidth property is the amount of bandwidth that the video may utilize, in bytes per second. Setting bandwidth to zero will cause Flash Player to consume as much bandwidth as possible to maintain the quality selected.
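Because the bandwidth property is expressed in bytes per second, the default cap of 16384 bytes/s works out to roughly 131 kbit/s per viewer, and the total outbound requirement grows linearly with the audience. A quick back-of-the-envelope calculation (plain arithmetic, not a Red5 or Flash API):

```java
// Rough outbound-bandwidth estimate for a broadcast: plain arithmetic,
// not part of any Red5 or Flash API.
public class BandwidthEstimate {
    // Convert a per-stream cap in bytes/second to total kilobits/second
    // for a given number of subscribers.
    public static long totalKbps(int bytesPerSecond, int subscribers) {
        return (long) bytesPerSecond * 8L * subscribers / 1000L;
    }

    public static void main(String[] args) {
        // Default Camera.setQuality bandwidth cap of 16384 bytes/s,
        // served to 100 subscribers.
        System.out.println(totalKbps(16384, 100) + " kbit/s");
    }
}
```

Numbers like these make it clear why server hosting costs are dominated by viewer count rather than by the publisher's upload.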

A third option when publishing live is the key frame interval, the number of frames which are sent without being manipulated based on previous frames by the compression algorithm. Valid values range from 1, meaning every frame is a key frame, to 48, meaning one full key frame for every 48 frames of video. The lower the interval value, the higher our bandwidth requirements will be, since we will be sending a smaller number of compressed frames. In our example code we specified an fps value of 23, so a key frame interval of 7 should provide good quality for our situation.
camera.setKeyFrameInterval(7);

When publishing a live stream using a Flash-based client, we only have the ability to select from two of the seven available audio codecs. The selection must be performed prior to attaching the microphone to the NetStream. If we want to use Speex, we must specify it as shown in the example; if we do not specify Speex, then the default, NellyMoser, will be selected for us.
22get t-e au"io source 'ar mic*0icrop-one = 0icrop-one$get0icrop-one(); 22select Spee< au"io enco"ing mic$co"ec = Soun"Co"ec$S=>>?; 22set to t-e -ig-est @uality mic$enco"e7uality = A.; 22attac- t-e microp-one to our NetStream ns$attac-Au"io(mic);

"emember that we are not limited only to audio and video when publishing, we could just as easily publish events via a shared object or a data frame call instead. An e/cellent e/ample of this kind of publishing is the -shared ball. demo, which could be considered a multiple publisher application.

Publishing a VOD stream


Even simpler than publishing a live broadcast stream is providing prerecorded media files for subscribers. The default way to accomplish this is to create a "streams" directory within our application directory and place our media files within it. The second way is to create the media file during our live publish, by indicating to the server that we want to record our stream.
nc$publis-(1mystream3( 1recor"3);

This request will create a file named "mystream.flv" in the application's "streams" directory using the default Red5 streaming classes. While the stream is being published, it may also be requested as a live stream by subscribers without interfering with the recording process. There are times when we may want to append to a previously recorded media file for our stream; this is accomplished by sending the append type in the NetStream publish.
nc$publis-(1mystream3( 1appen"3);

It is important to understand that the type of publish selected will determine the handling of any existing media files for the stream. The following publish rules apply:

If "live" is specified, no file is created.
If "record" is selected, the file will be created if it does not exist.
If "live" or "record" is specified and a file already exists for the stream, the file will be deleted.
If "append" is requested and the file exists, the file will be appended to.
If "append" is selected and no file exists, the type will be switched internally on the server-side to "record".
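These rules amount to a small decision table. The following sketch is a hypothetical helper, not part of Red5, written only to make the table explicit:

```java
// Hypothetical helper encoding the publish rules above.
// Not a Red5 API; for illustration of the decision table only.
public class PublishRules {
    public static String fileAction(String publishType, boolean fileExists) {
        switch (publishType) {
            case "live":
                // Live publishes create no file, but an existing
                // file for the stream name is deleted.
                return fileExists ? "delete" : "none";
            case "record":
                // Record replaces any existing file.
                return fileExists ? "delete-then-create" : "create";
            case "append":
                // Append falls back to record when there is
                // nothing to append to.
                return fileExists ? "append" : "create (switched to record)";
            default:
                throw new IllegalArgumentException(publishType);
        }
    }
}
```

Spelling the rules out this way makes it obvious that only "append" preserves existing data.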

"ecording media on a "ed# can only be in two forms by default, but this does not prevent us from creating our own media writer implementations for other media types like M,B. &oth of the recorded media types are written to an F74 container file and may consist of stream data encoded with any of the supported codecs which are compatible with this container format. *ormally the media files will contain Sorenson video with *ellyMoser audio, but other codecs are available such as h.;@? or (n; 4,@ video using e/isting media encoders. 'he Flash Media 7ive Dncoder 3FM7D5 is one such encoder that will work with "ed#, but the Adobe s license disallows its use with anything other than FMS.

Playlists
When we have a set of media files to play back as a group, we should use a playlist. Playlists may be created on the client-side or the server-side. To create a server-side playlist in our application, we need to collect the names of the target media files and then decide upon the conditions for our playlist's creation. Playlists are usually created when a particular "room" or scope is created, so that the content related to its name or description is played back to subscribers in a preset order. Playlist streams that are created on the server-side are treated as live streams, in that they do not have a known

fixed duration, and seek operations would return unexpected results beyond the first play item. Using a method which creates a playlist stream in a room, we can reuse the code for any or all of our rooms. This method could be placed in our ApplicationAdapter to create the playlist.
pri'ate 'oi" create=layList(IScope room) { 22get a list o all t-e currently publis-ing streams List<String> streamNames = get)roa"castStreamNames(room); 22looB or our playlist stream in t-e list i (CstreamNames$contains(+myplaylist+)) { ISer'erStream ser'erStream = Stream&tils$createSer'erStream(room( +myplaylist+); Simple=layItem item = new Simple=layItem(); item$setName(+streamA+); Simple=layItem item5 = new Simple=layItem(); item5$setName(+stream5+); ser'erStream$a""Item(item); ser'erStream$a""Item(item5); ser'erStream$start(); ser'erStream$set!epeat(true); % streamNames$clear(); }

If we call our createPlayList method from the application scope ("myapp" in this example), we could play the stream on the client-side with this method.
ns.play("myplaylist");

Since we set repeat to true, the videos will loop over and over until we stop the stream or restart the server. The play items specified in the example would exist within our application at the following locations on the hard drive.
C*Dre"#DwebappsDmyappDstreamsDstreamA$ l' C*Dre"#DwebappsDmyappDstreamsDstream5$ l'

Subscribing to a stream


Subscribing to a stream is the act of consuming or playing a stream. While there are a multitude of options we can configure when playing a stream, most of the defaults are good enough for generalized usage. We will again be utilizing the NetStream class to accomplish our tasks, but in this section we will be a consumer of streaming media instead of a producer. To publish a stream, we had to attach our audio and video sources to a NetStream. Subscribing is a bit simpler, in that we need only attach the NetStream to a Video object and then place that object in the display list. As with much else in Flash, this is not the whole story, because we can certainly do a lot more with our incoming data than we will cover here, such as volume control.
'ar 'i"eo*Ei"eo = new Ei"eo(); 'i"eo$attac-Ei"eo(ns); ns$play(1mystream3); a""C-il"('i"eo);

In the block of code, we created a Video object and attached our NetStream to it. We then requested playback of our stream and added the Video object to the display list, which allows it to be rendered in the Flash Player. Now that our stream is playing, we have additional functions available on the NetStream to control the streamed data.

pause: Pauses playback until we indicate that we want playback to proceed.
resume: Causes the pause to be canceled and resumes playback.
togglePause: A shortcut method to switch between pause and resume.
seek: Moves the play head to the key frame closest to the specified time. The time value supplied to seek is offset from the start of the stream.
receiveAudio: Indicates whether or not we will play the audio data in the stream. Setting this to false basically mutes the audio.
receiveVideo: Indicates whether or not we will render the video images contained in the stream.
send: Sends an AMF data object to all the stream subscribers. This is intended to be used only by a publisher, but in Red5 it has some functionality available to subscribers.
close: Closes all communication over the NetStream, effectively stopping playback.

One last thing to consider when attempting to obtain the best experience for our subscribers involves tweaking two properties of the NetStream.

bufferLength: The amount, in seconds, of data that we will buffer for playback. This value should be higher for slow connections and lower for consumers with more available bandwidth. A good starting value is around 5 seconds.
bufferTime: The time in seconds to delay handling of buffered stream data. For high-bandwidth networks playing back live streams, a value of 0.1 is often possible and provides outstanding responsiveness. In most environments this should be set at 1 to 2 seconds.
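The guidance above can be condensed into a simple heuristic. The kbps thresholds below are our own illustrative choices, not values prescribed by Red5 or Adobe:

```java
// Illustrative buffer-time heuristic based on the guidance above.
// The kbps thresholds are hypothetical choices, not Red5 defaults.
public class BufferTuning {
    // Suggest a NetStream bufferTime (seconds) from a rough
    // measurement of the client's downstream bandwidth.
    public static double suggestBufferTime(int downstreamKbps, boolean liveStream) {
        if (liveStream && downstreamKbps >= 1000) {
            // High-bandwidth live playback: a very small buffer
            // gives the best responsiveness.
            return 0.1;
        }
        if (downstreamKbps >= 500) {
            return 1.0;
        }
        // Slow connections get a larger safety buffer.
        return 2.0;
    }
}
```

In practice, the measurement could come from a short bandwidth-detection exchange with the server before playback begins.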

Stream listeners
Stream listeners provide low-level access to a stream and its individual data packets via notifications. Having this level of access to a stream's data packets provides a means to manipulate the data in any way we can imagine: anything from creating image snapshots of the live stream to transcoding the audio into another format. The only limitation is that we must have a codec capable of decoding the stream's raw byte data. To create a stream listener we need only implement one interface and one method, quite easy when compared to other media server solutions. The following listener simply prints the type of packet received to the console.
public class MyStreamListener implements IStreamListener {
    public void packetReceived(IBroadcastStream stream, IStreamPacket packet) {
        byte dataType = packet.getDataType();
        if (dataType == Constants.TYPE_AUDIO_DATA) {
            System.out.println("Audio packet received");
        } else if (dataType == Constants.TYPE_VIDEO_DATA) {
            System.out.println("Video packet received");
        }
    }
}

While this code example may not seem useful, it would help during debugging to confirm that audio and video data is being streamed. For our class to receive the packet notifications on a stream, it is advised that a listener be added when the streamBroadcastStart method of our application is called, as shown below.
KJ'erri"e public 'oi" stream)roa"castStart(I)roa"castStream stream) { 22a""s a listener to t-e stream t-at Lust starte" stream$a""StreamListener(new 0yStreamListener()); %


A best practice in a standard application would be to keep track of our listeners and then remove them when the broadcasting stream is finished. This is how it could be done.
private IStreamListener myListener = null;

@Override
public void streamBroadcastStart(IBroadcastStream stream) {
    // create our listener
    myListener = new MyStreamListener();
    // add the listener to the stream that just started
    stream.addStreamListener(myListener);
}

@Override
public void streamBroadcastClose(IBroadcastStream stream) {
    // remove our listener
    stream.removeStreamListener(myListener);
}

Next we create a more specialized stream listener, which writes a single-frame snapshot of our stream's video to disk. We could then process the snapshot into a PNG or JPG image with another process or a tool like Xuggler or FFmpeg. The following example creates an FLV which contains only one frame, a "key frame" of video, for our conversion.
public 'oi" pacBet!ecei'e"(I)roa"castStream stream( IStream=acBet pacBet) { 22we are only intereste" in 'i"eo "ata i (pacBet instanceo Ei"eoGata) { 22cast t-e pacBet into a 'i"eo obLect Ei"eoGata 'i"eoGata = (Ei"eoGata) pacBet; 22we are only intereste" in t-e irst Bey rame i ('"$get:rameFype() == Ei"eoGata$:rameFype$9>H:!A0>) { 22remo'e t-is listener since we only want one snaps-ot stream$remo'eStreamListener(t-is); Io)u er bu er = pacBet$getGata(); 22Beep tracB o t-e 'i"eo "ata siMe int siMe = bu er$limit(); 22create a byte array to -ol" t-e 'i"eo bytes byteNO blocBGata = new byteNsiMeO; 22rea" t-e 'i"eo image "ata into our byte array bu er$get(blocBGata( .( siMe); try { 22create ile output -an"ler :ileJutputStream os = new :ileJutputStream(stream$get=ublis-e"Name() , +$ l'+); 22write :LE -ea"er
%%

byteNO -"r = {P:P(PLP(PEP(.<.A(.<.A(.<..(.<..(.<..(.<.8(.<..(.<..(.<..(.<. .(.<.8(.<..(.<..(.<..(.<..(.<..(.<..(.<..(.<..(.<..(.<..%; 22write t-e siMe o our 'i"eo image -"rNA6O = (byte) ((siMe Q .< ....) >> 5); -"rNA#O = (byte) ((siMe Q .< ..) >> A); -"rNARO = (byte) ((siMe Q .< )); 22write t-e l' -ea"er to t-e ile os$write(-"r( .( -"r$lengt-); 22write t-e Bey rame os$write(blocBGata( .( siMe); os$close(); % catc- (><ception e) { e$printStacBFrace(); % % % %

A good point to make about the previous example is that it will work with video data encoded in any of the Flash-supported video codecs that are permitted within an FLV container file. Even more complicated code could decode the video data and manipulate it, such as adding a company watermark.
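The header-patching step above stores the video payload size as a 24-bit big-endian value in bytes 14 through 16 of the FLV tag header. In isolation, the packing looks like this:

```java
// Demonstrates the 24-bit big-endian length encoding used in the
// FLV tag header; plain arithmetic, no Red5 classes involved.
public class Uint24 {
    // Pack an int (0..16777215) into three big-endian bytes.
    public static byte[] pack(int size) {
        return new byte[] {
            (byte) ((size & 0xFF0000) >> 16),
            (byte) ((size & 0x00FF00) >> 8),
            (byte) (size & 0x0000FF)
        };
    }

    // Reverse operation, useful for checking the round trip.
    public static int unpack(byte[] b) {
        return ((b[0] & 0xFF) << 16) | ((b[1] & 0xFF) << 8) | (b[2] & 0xFF);
    }
}
```

The same three-byte layout is used for FLV tag data sizes and timestamps, so this tiny helper is handy whenever we poke at FLV files by hand.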

u''"rted !"de!s
A codec, which is short for "coder-decoder", is a computer program that can accept a signal or digital data and encode and/or decode it into a usable format. An example of this would be speaking into your microphone and having an application use an MP3 codec to convert your analog speech into bits and bytes. The recorded voice would then be playable on any device which supports the MP3 codec. Codecs come in two primary forms, named "lossy" and "lossless". Lossy codecs achieve compression by reducing the quality of the content. It is possible for content to be compressed with a lossy codec at such a high level that it no longer resembles the source material, so a balance must be maintained to produce acceptable quality. On the other hand, using a lossless codec we retain all the source data collected, and thus we do not lose any of the original quality. The main reason to use lossy versus lossless is to reduce the storage requirements for the encoded source material. All of the codecs detailed herein may be streamed from a Red5 server, but some of them may not be rendered by the Flash Player; this is outside the control of the server. Red5 does not provide encoding or decoding of the media by default, but this does not prevent us from creating such services.
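To make the storage argument concrete: one minute of uncompressed CD-quality stereo PCM (44100 samples per second, 2 channels, 16 bits per sample) takes about 10 MB, while the same minute encoded with a lossy codec at 128 kbit/s takes under 1 MB. The numbers are ordinary arithmetic, not Red5 functionality:

```java
// Storage comparison between uncompressed PCM and a 128 kbit/s
// lossy encoding; straightforward arithmetic for illustration.
public class StorageExample {
    // Bytes for uncompressed PCM audio.
    public static long pcmBytes(int sampleRate, int channels, int bitsPerSample, int seconds) {
        return (long) sampleRate * channels * (bitsPerSample / 8) * seconds;
    }

    // Bytes for a constant-bitrate lossy encoding.
    public static long lossyBytes(int kbps, int seconds) {
        return (long) kbps * 1000 / 8 * seconds;
    }

    public static void main(String[] args) {
        long pcm = pcmBytes(44100, 2, 16, 60); // one minute of CD audio
        long mp3 = lossyBytes(128, 60);        // one minute at 128 kbit/s
        System.out.println(pcm + " bytes vs " + mp3 + " bytes");
    }
}
```

A ratio of roughly 11:1 before any quality tuning is why lossy codecs dominate streaming.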


Vide"
The latest versions of the Flash Player support a decent number of video codecs, and some of these codecs are available as open source. Since some of the codecs are available for free, we could theoretically write our own publishing and viewing applications; this has been done by others in the community.

Jpeg: Very little information is available to the public about the support of this codec for video in Flash. The name associated with the codec identifier alludes to a possible similarity to the image format of the same name.

Screen Video: There are two versions of this codec, originally created by the Macromedia corporation for sharing screen shots. The first version, "ScreenVideo", uses a simple BGR image difference routine and is available in several programming languages on the Internet, including ActionScript. The "ScreenVideo2" codec was used in the commercial Breeze product, and very little is known about its details.

Sorenson: A variant of the International Telecommunications Union Telecommunication Standardization Sector (ITU-T) h.263 video codec. This codec is also often referred to by the FLV acronym, or FLV1 in the ffmpeg tool. Sorenson adds the following features beyond the h.263 standard: difference frames, greater arbitrary dimensions (65535 pixels), unrestricted motion vectors turned on by default, and a deblocking flag. This is the most commonly used codec in Flash video on the Internet today.

VP6: The On2 corporation's proprietary high-quality video codec. There are various Flash Player supported versions of this low-bitrate codec, such as VP6, VP6A, VP6E, VP6F, and VP6S. This codec provides higher quality video than Sorenson at a lower bitrate. It was chosen over h.264 by Macromedia for several reasons back in 2005, such as quality at a given bandwidth, portability to non-Intel architectures, code size in Flash Player, stability, and overall performance.

AVC / h.264: Otherwise known as MPEG-4 AVC, this is the standard codec for next-generation video on the Internet. Currently the AVC codec provides better quality video at a lower bitrate than any other codec supported by Flash Player. This codec is specified in the ISO 14496-10 documentation. Flash Player supports the following profiles: Base (BP), Main (MP), and High (HiP). One downside to using this superior codec is that the license is difficult for the average user to comprehend. The license terms should be understood before we implement this codec in our applications.

("de! )d

("de! *ame

+irst 'layer

%3

versi"n su''"rt 1 2 3 4 Jpeg Sorenson Spark Screen Video VP6, VP6E, VP6F VP6A, VP6S 6 $ Screen Video 2 AV% & h"264 6 6 8 !"#"11 "# 8 !"#"11 "#

,udi"
Standard voice audio is usually encoded at 8000 samples per second (8 kHz); music or any other audio will normally be sampled at 11 kHz, 22 kHz, or 44 kHz. CD quality is commonly stated as 44.1 kHz, and we all know this is very good quality sound, further proved by the popularity of compact discs. Keep in mind that the higher our sample rate, the greater our bandwidth requirements will be, multiplied by the number of consumers of our stream; this can get very expensive. The selection of audio codec should be made based on our target bandwidth and quality.

PCM: This codec represents the magnitude of an analog audio waveform with a finite range of values. The audio wave is sampled at fixed regular intervals. Digital telephone systems, compact discs, and computers have been using this format since the 1980s. PCM audio data is, by default, not compressed.

ADPCM: The adaptive delta pulse-code modulation codec is based on differential pulse-code modulation (DPCM), which itself is based on PCM. Adaptive, in the case of this codec, indicates that the quantization size is varied, which reduces the bandwidth needed for streaming.

NellyMoser Asao: Proprietary lossy codec produced by the Nellymoser corporation. This codec is good for speech and general audio content. Flash supports three sample rates for this codec, roughly equivalent in quality to telephone, radio, or compact disc.

MP3: The very common audio format used in all our audio devices and players. This patented codec was created by the Moving Picture Experts Group (MPEG) in the early 1990s and was approved as an ISO standard in 1991. This codec handles two-channel audio and can sample from 16 kHz to 48 kHz. As with many of the codecs outlined here, this one is also lossy.

G.711: Uses logarithmic PCM to sample voice frequencies at 8 kHz. This codec is available in two flavors: mu-law and a-law. Whether we use mu-law or a-law is based on the region of the world where it will be used: mu-law is used in North America and Japan, and a-law is used everywhere else. This codec is reserved for internal use by Adobe at the time of this writing.

AAC: The advanced audio coding codec is a lossy codec at the heart of the MPEG-4 specifications; it was designed to replace MP3. This codec can handle up to 48-channel audio and sample from 8 kHz to 96 kHz. The output produced by this codec is of similar quality to MP3 but at half the bitrate; an audio clip encoded in AAC at 64 Kbit/s will sound the same as an MP3 encoded at 128 Kbit/s. Currently the Flash Player supports the following AAC profiles: AAC Main, AAC LC, and SBR. Many more details about this codec are specified in the ISO 14496-3 specification document. One last thing to note is that Flash Player is capable of playing back multichannel AAC audio, but it is re-sampled down to two channels at a sample rate of 44.1 kHz.

Speex: A patent-free lossy codec designed specifically for speech. This codec is available as open source and supports features that are not present in proprietary offerings, such as variable bitrate (VBR), intensity stereo encoding, and integration of different sample rates in the same stream.
("de! *ame 'nco(pressed P%) *+ig endian, ADP%) )P3 'nco(pressed P%) *-itt-e endian, .e--/ )oser 16k01 .e--/ )oser 8k01 +irst 'layer versi"n su''"rt 6 6 6 6 6 6 6 6nterna- use on-/
%5

("de! )d # 1 2 3 4

6 $

.e--/ )oser " k01 2"$11 a34a5

8 ! 1# 11 1

2"$11 734a5 8eser9ed +/ Ado+e AA% Spee: )P3 8k01

6nterna- use on-/ .A !"#"11 "# 1#

The easiest way to create an audio publisher is to use ActionScript. When selecting between the two audio codecs available to ActionScript, we should know the bandwidth requirements for NellyMoser and Speex. Speex bandwidth requirements are controlled with the encodeQuality property of the Microphone class; values from 0 to 10 are allowed and are shown with their bandwidth values below.
Quality    Bandwidth (kb/s)
0          3.95
1          5.75
2          7.75
3          9.80
4          12.8
5          16.8
6          20.6
7          23.8
8          27.8
9          34.2
10         42.2
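Packaged as a lookup, the table reads as follows. The array simply restates the values above; the helper class itself is our own illustration, not a Flash API:

```java
// Speex bandwidth per Microphone.encodeQuality value, restating the
// table above (kb/s). Index = quality 0..10. Illustrative lookup only.
public class SpeexBandwidth {
    private static final double[] KBPS = {
        3.95, 5.75, 7.75, 9.80, 12.8, 16.8, 20.6, 23.8, 27.8, 34.2, 42.2
    };

    public static double kbps(int encodeQuality) {
        if (encodeQuality < 0 || encodeQuality > 10) {
            throw new IllegalArgumentException("encodeQuality must be 0..10");
        }
        return KBPS[encodeQuality];
    }
}
```

A server-side capacity planner could multiply these figures by the expected subscriber count, as discussed earlier in the chapter.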


NellyMoser bandwidth is controlled via the rate property of the Microphone class. The rate property signifies the sample rate at which the audio is captured. The following table shows the NellyMoser bandwidth requirements for each supported sample rate.
Sample rate    Bandwidth (kb/s)
5              11.03
8 (default)    16
11             22.0
22             44.1
44             88.2

There is also an additional rate of 16 kHz, which was made available with the release of Flash Player version 10 and Adobe AIR version 1.5.
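As with Speex, the rate table can be expressed as a simple map. The figures restate the table above; the class is an illustrative lookup, not a Flash API:

```java
import java.util.HashMap;
import java.util.Map;

// NellyMoser bandwidth (kb/s) per Microphone.rate value, restating
// the table above. Illustrative lookup only, not a Flash API.
public class NellyMoserBandwidth {
    private static final Map<Integer, Double> KBPS = new HashMap<>();
    static {
        KBPS.put(5, 11.03);
        KBPS.put(8, 16.0);   // default rate
        KBPS.put(11, 22.0);
        KBPS.put(22, 44.1);
        KBPS.put(44, 88.2);
    }

    public static double kbps(int rate) {
        Double v = KBPS.get(rate);
        if (v == null) {
            throw new IllegalArgumentException("unsupported rate: " + rate);
        }
        return v;
    }
}
```

Note that the bandwidth figures scale linearly with the sample rate, which matches the doubling pattern visible in the table.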

u''"rted 3ile ty'es


"ed# server may be used to stream a plethora of media file types right out)of)the)bo/. 'he server originally only supported Flash 4ideo 3F745 and M,DG 7ayer B 3M,B5 files, but has since been e/tended to stream other popular formats. 'he current list of media file formats is listed below% F74 : +ontainer file for Flash media files, created by Macromedia in ;==; and supported by all versions of Flash ,layer starting with version @. !ata is stored in this container in a manner similar to an S8F file. +urrently this is the only container that "ed# will save to by default. M,B : 'he de facto standard for audio)only files on the Internet. 'his format is well known and supported by a great number of applications and hardware devices available today. M,? : D/tension for general M,DG)? files, containing any of the content encoding formats supported by this IS( container type. Audio, video, and te/t content is stored in the file as -tracks. and has no fi/ed limit. *ormally the content contained within this file would be A4+ video with AA+ audio, but there are many more types that may be in a file simultaneously. M?A : 'his e/tension is used to signify an audio)only M,DG)? file. +onsider this similar to M,B but of higher 0uality.

AAC: Container for audio-only files encoded with AAC; essentially the same as an M4A file.

F4V: The latest container format available for Flash media files, which is not based in any way on the FLV container type. This container is based on the MPEG-4 standard and was first supported by Flash Player in version 9 update 3. The older codecs supported by FLV (ScreenVideo, Sorenson, VP6, ADPCM, and NellyMoser) are not available for this container.

MOV: The Quicktime container format is the basis of the MPEG-4 file format. This format was created in 1991 by the Apple corporation and is still utilized by their commercial software products. Not all of the codecs used in this container are supported by Flash Player.

3GP / 3G2: File container type used for mobile media, created by the 3rd Generation Partnership Project (3GPP). RFC 3839 contains details for this format. This format is an extension of the MPEG-4 standard and is targeted at mobile devices. Keep in mind that the majority of 3GP files recorded on mobile phones are not supported by Flash Player, due to their use of the AMR codec for audio.

Other file types and formats may also be streamed via plug-ins, such as Nullsoft Video (NSV).

45tensi"n ";-9 ";49 "(p4 "(p3 "(4a "aac "(o9 "3gp "3g2

Mime ty'e 9ideo&:3;-9 9ideo&(p4 9ideo&(p4 audio&(peg audio&(p4 audio&:3aac 9ideo&<uickti(e 9ideo&3gpp or audio&3gpp 9ideo&3gpp2 or audio&3gpp2
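A content-type lookup for these extensions, restating the table above, might look like the following sketch. A real deployment would typically defer to the servlet container's own mime mappings instead:

```java
import java.util.HashMap;
import java.util.Map;

// Extension-to-MIME lookup restating the table above. Illustrative
// only; production code would usually defer to the web container.
public class MediaMimeTypes {
    private static final Map<String, String> TYPES = new HashMap<>();
    static {
        TYPES.put("flv", "video/x-flv");
        TYPES.put("f4v", "video/mp4");
        TYPES.put("mp4", "video/mp4");
        TYPES.put("mp3", "audio/mpeg");
        TYPES.put("m4a", "audio/mp4");
        TYPES.put("aac", "audio/x-aac");
        TYPES.put("mov", "video/quicktime");
        TYPES.put("3gp", "video/3gpp");
        TYPES.put("3g2", "video/3gpp2");
    }

    public static String lookup(String fileName) {
        int dot = fileName.lastIndexOf('.');
        String ext = dot < 0 ? "" : fileName.substring(dot + 1).toLowerCase();
        return TYPES.getOrDefault(ext, "application/octet-stream");
    }
}
```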


Bypassing firewalls
Most of us are familiar with the use of firewalls on the Internet; in the simplest terms, they are used to prevent unauthorized network access. There are many different types of firewalls available today, but we will focus on how they affect our Red5 application. The passage of our streams through firewalls is very important for viewership. A great many institutions and corporations block well-known applications based on the port that they utilize, but they normally leave the web traffic ports wide open, and this is our way in. The method used by Red5 to traverse these restrictions is to encapsulate the RTMP binary data in a text-based format; in the broadest sense this is known as tunneling.

RTMPT
"eal)time Media ,rotocol 'unneling or "'M,' for short, is the means with which "'M, data is transmitted via F'',. Esing F'', as the -transport. for our streams data, we will be able to pass though the majority of firewalls due to its popularity. Since -surfing. the web is e/tremely common and e/pected, the F'', port 3<=5 is usually open by default. 8hen using the "'M,' protocol in Flash ,layer, the player will act in a similar manner to a browser by making re0uests and accepting responses. Action Message Format 3AMF5 data originating from the player will be encapsulated in IM7 and then be submitted via a ,(S' re0uest. 'he servers response will also be constructed in IM7 and sent back to the player. 8hile the content of these re0uests and responses is technically te/t data, we may not be able to decode it as we would a document.


This diagram shows a network connection being blocked by a firewall for "Viewer B" and unblocked for "Viewer A". To initiate a request using RTMPT, we simply replace "rtmp" with "rtmpt" in the protocol position of the resource locator in our NetConnection. Everything else, from creating NetStreams to attaching sources or playing media, is exactly the same syntax-wise as when we use the RTMP protocol.
nc.connect("rtmpt://localhost/myapp");

To handle RTMPT requests on the server, our application will need either a dedicated JEE server instance with the RTMPT servlet attached, or just the RTMPT servlet configured in our web.xml file.
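A minimal web.xml fragment for the servlet approach might look like the following sketch. The servlet class name shown is the one used by the Red5 builds we are familiar with, and the URL patterns follow the open/idle/send/close polling scheme of the RTMPT protocol; both should be verified against the Red5 version actually in use.

```xml
<!-- Hypothetical sketch: map the Red5 RTMPT servlet at the paths the
     Flash Player polls (open/idle/send/close). Verify the class name
     against your Red5 version before use. -->
<servlet>
    <servlet-name>rtmpt</servlet-name>
    <servlet-class>org.red5.server.net.rtmpt.RTMPTServlet</servlet-class>
</servlet>
<servlet-mapping>
    <servlet-name>rtmpt</servlet-name>
    <url-pattern>/open/*</url-pattern>
</servlet-mapping>
<servlet-mapping>
    <servlet-name>rtmpt</servlet-name>
    <url-pattern>/idle/*</url-pattern>
</servlet-mapping>
<servlet-mapping>
    <servlet-name>rtmpt</servlet-name>
    <url-pattern>/send/*</url-pattern>
</servlet-mapping>
<servlet-mapping>
    <servlet-name>rtmpt</servlet-name>
    <url-pattern>/close/*</url-pattern>
</servlet-mapping>
```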

Summary
In this chapter, we covered a lot of details about streaming media using the Red5 server. Using Red5 to stream enables us to do many things with live and pre-recorded media, such as conducting video conferences, playing slide shows, audio chatting, and playing music files. We are limited only by our ability to write applications and produce content. We also learned about:

Naming our streams
Publishing a live stream
Providing VOD streaming
Creating playlists
Streaming audio files
Subscribing to streams
Creating our own stream listeners
The codecs supported by Red5
Supported media files
How to bypass firewalls

In Chapter 5, our focus will be on Shared Objects. We will learn the difference between client-side and server-side shared objects, in addition to their persistence options. Shared object listeners will also be covered in detail.

