


IPTV over Wimax: Overview on the video path from the server to the Wimax end-user.

Rabih Badih MOAWAD
Invited Professor at Saint Joseph University, Faculty of Engineering, ESIB, Electrical Department.
Mar Roukoz, Mkalles, Lebanon
Email: [email protected]

Abstract: In this paper we give an overview of television and video streams: how they are generated and how they are transmitted over an Internet Protocol (IP) network. This technique, known as IPTV, requires fulfillment of Quality of Service (QoS) parameters, as well as Quality of Experience (QoE) parameters, to deliver application-dependent video quality. We focus mainly on transmission over the Wimax access network, the "last mile hop" that wirelessly delivers the stream to the end-user.

I. INTRODUCTION

Television and video streams are gaining an ever more important share of the total traffic transmitted over data networks, and the number of IPTV (Internet Protocol TV) end-users is continuously increasing. The IPTV Bandwidth Study from Bell Labs Research (March 2006) reports that the service mix in the network is changing dramatically: thanks to the rapid growth of video services, overall traffic bandwidth was expected to increase tenfold from 2005 to 2009, with more than 90% of the traffic being video by 2009 [1].

II. TELEVISION AND VIDEO STREAMS

In standard digital television (SDTV), two main systems exist. The first, 625/50, has an active frame resolution of 720 pixels per line by 576 lines, with a frame rate of 25 frames per second, and is related to the European analog systems PAL and Secam. The second, 525/60, has 720 or 704 pixels by 480 lines, a frame rate of 30 fps, and is related to the American analog system NTSC.

High Definition Television (HDTV) gives better image quality, relying on a maximum resolution of 1920 pixels by 1080 lines, with 25 or 30 frames per second, using interlaced or progressive scan.

By using a 4:2:2 sampling pattern (the sampling frequency for each of the chroma, or color-difference, signals CR and CB is half the sampling frequency of the luminance, or black-and-white, signal) and 10 bits per sample quantization, the video rate reaches 270 Mbps for SDTV and 1.485 Gbps for professional HDTV. This rate can be handled in television studios to preserve maximum image quality, but for efficient transmission over telecom or packet-switched networks, compression is needed to reduce the rate. Video compression uses spatial and temporal coding to reduce the amount of information. Spatial or intra-frame coding exploits the repetition of the same pattern within a frame (e.g. a blue sky) to reduce the data amount, while temporal or inter-frame coding uses the difference between two consecutive frames (e.g. a car moving in front of a still background: the background is transmitted once, and for the next frame, the car's motion vector and the minor differences between the two frames, after motion compensation, are spatially coded and transmitted).

Two compression formats are used for television, MPEG-2 and MPEG-4. MPEG-2 compresses the frames considered as sets of pixels. MPEG-4, which is backward compatible with MPEG-2, is a content-dependent and object-oriented format that uses video objects, natural or synthesized (2D or 3D), and audio objects, natural or synthesized. SDTV can be compressed with both MPEG-2 and MPEG-4, while HDTV is compressed only with MPEG-4. MPEG-4 can deliver the same image quality as MPEG-2 at a reduced bit rate. To obtain PAL-equivalent image quality in SDTV, a minimum bit rate of 4 to 4.5 Mbps is needed in MPEG-2, versus 2 Mbps in MPEG-4 (see Table I) [2].

978-1-4244-1754-4/08/$25.00 ©2008 IEEE
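The uncompressed rates quoted above follow directly from the 4:2:2 sampling structure. A quick sanity check, assuming the standard serial-interface luminance sampling frequencies of 13.5 MHz for SDTV (ITU-R BT.601) and 74.25 MHz for HDTV:

```python
# 4:2:2 sampling: each chroma signal (CR, CB) is sampled at half the
# luminance frequency; every sample is quantized on 10 bits.
BITS_PER_SAMPLE = 10

def serial_rate(luma_hz):
    total_samples = luma_hz + 2 * (luma_hz / 2)   # Y + CR + CB samples/s
    return total_samples * BITS_PER_SAMPLE        # bits per second

sd_rate = serial_rate(13.5e6)    # SDTV (BT.601)
hd_rate = serial_rate(74.25e6)   # professional HDTV

print(sd_rate / 1e6, "Mbps")   # 270.0 Mbps
print(hd_rate / 1e9, "Gbps")   # 1.485 Gbps
```

This reproduces the 270 Mbps and 1.485 Gbps figures, and makes clear why compression is unavoidable before transmission over packet networks.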

III. VIDEO QoS AND QoE REQUIREMENTS

A. Quality of Service (QoS)

QoS requirements are the minimal conditions that the network must fulfill to deliver the corresponding stream with the required quality to the end-user. QoS comprises several parameters: bandwidth, total network delay, jitter, packet drop rate, etc.

TABLE I
VIDEO TECHNOLOGIES

Name                   Resolution      fps     Rate         Compression
CCIR 601               720x576         25      270 Mbps     uncompressed
SDTV (E)               720x576         25      4.5 Mbps     MPEG-2
SDTV (E)               704x480         30      4 Mbps       MPEG-2
SD-VoD (E)             704x480         30      3.75 Mbps    MPEG-2
SDTV (E)               704x480         30      2 Mbps       MPEG-4 AVC
HDTV (E)               1920x1080 (i)   30      19.1 Mbps    MPEG-2
HDTV (E)               1920x1080 (i)   30      9 Mbps       MPEG-4 AVC
Pro HD                 1920x1080       30      1.485 Gbps   uncompressed
Digital cinema         4096x2160 (p)   24      250 Mbps     JPEG 2000
QCIF (M)               176x144         12      >100 Kbps    MPEG-4 AVC
QVGA Apple iPod (M)    320x240         15-30   878 Kbps     MPEG-4 AVC

(E) entertainment; (M) mobile; (i) interlaced scanning: each frame is divided into two fields; (p) progressive scanning.



Video and voice have competing QoS requirements. While the allowed packet drop rate is about 10^-6 and the allowed jitter about 200 ms for video, these values are 10^-2 and 60 ms for voice. Voice therefore requires minimal jitter, hence the need for priority queues that prioritize the buffering of voice traffic over other streams. Video requires an extremely low drop rate, and this imposes large buffers, or longer queues, to absorb the transmitted video stream's packet bursts [3].
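The trade-off above is exactly what a strict-priority scheduler expresses: voice (tight delay budget) always dequeues before video (deep buffer, loss-sensitive). A minimal sketch; the class names and priority values are illustrative, not from the paper:

```python
import heapq

class StrictPriorityQueue:
    """Toy strict-priority scheduler: voice packets always leave the
    queue before video packets; ties keep arrival (FIFO) order."""
    PRIO = {"voice": 0, "video": 1}   # lower value = served first

    def __init__(self):
        self._heap = []
        self._arrival = 0             # tie-breaker preserving FIFO order

    def push(self, kind, pkt):
        heapq.heappush(self._heap, (self.PRIO[kind], self._arrival, pkt))
        self._arrival += 1

    def pop(self):
        return heapq.heappop(self._heap)[2]

q = StrictPriorityQueue()
q.push("video", "v1"); q.push("voice", "a1"); q.push("video", "v2")
# a1 is served first even though v1 arrived earlier
```

Real routers temper strict priority with weighted fair queuing so that video is never starved, but the ordering principle is the same.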

B. Quality of Experience (QoE)

The main objective is to satisfy subscribers and give them what they expect from the service. So even when the objective QoS parameters are fulfilled, other subjective attributes depicting the perceived service quality should also be respected. These attributes, called QoE or Quality of Experience for video content services, reflect how the user perceives the quality of the transmitted stream, which is the final target of the service provider. Three major quality categories exist for QoE:

1) Session quality is concerned with the user's overall experience in viewing the video, especially from a connection perspective. The initial buffering time (i.e. the time it takes the video to load and begin playing) and re-buffering during playback are important factors of session quality. Synchronization between audio and video, which are transmitted as separate streams, is also an important component. A frequent problem with digital video is desynchronization between the video and audio signals, called lip-sync error. Lip-sync error can become intolerable when greater than about 185 ms [4]. Loss of synchronization is caused by narrowing bandwidth, packet losses, buffer over- and underflows, etc.

2) The audio attributes of QoE can be categorized into the fidelity and mono/stereo characteristics of the audio components of the content. Note that fidelity means near-perfect reconstitution of the original sound and is not the same as audio quality. Users may perceive what they are seeing differently based on what they are hearing. The required audio quality can vary greatly with the stream's content, and audio quality and fidelity can also affect perceptions of video quality: high-quality audio can compensate for poor video quality, particularly for some content types, for example a music video.

3) Video quality can be subdivided into frame quality, fidelity of motion, and stalling. Frame quality is inherent to each frame taken separately, considered as a still image. Smoothness of motion usually depends on the frame rate, i.e. the number of frames per second. However, smoothness is also related to the content of the frames: it can be achieved with a lower frame rate for a newscaster or weather-forecast video, while an action clip or sports event needs a higher frame rate. Image stalling is the drop of the frame rate to zero for a perceptible duration; it negatively affects quality perception and QoE.

The nature of the content contributes to the weight of each factor in shaping QoE. A sports video may have a higher QoE if video smoothness is favored over picture clarity, while a news presenter clip may deliver a higher QoE by focusing on audio and picture clarity.

An index useful in evaluating QoE is the Media Delivery Index (MDI), comprising two significant parameters: the Delay Factor (DF), representing the cumulative IP jitter, and the Media Loss Rate (MLR), which tracks the amount of media lost per second. The MDI is expressed as DF:MLR. The Delay Factor is the amount of time the media stream would need to buffer to compensate for the difference between the delivery rate and the payload decode rate. For example, if the DF of a video stream is measured to be 150 ms at some point in the network, a decoder placed at that point would need a 150 ms buffer to accommodate the difference between the maximum and minimum delivery rates.
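The DF computation can be pictured as a virtual buffer filled by arriving packets and drained at the nominal media rate; the span between the buffer's extremes, converted to time, is the buffering a decoder would need at that point. A simplified sketch (the actual MDI metric is specified in RFC 4445; this omits details such as measurement intervals):

```python
def delay_factor(arrivals, media_rate_bps):
    """arrivals: list of (timestamp_seconds, payload_bytes) per packet.
    Returns the Delay Factor in milliseconds: the virtual-buffer excursion
    divided by the nominal media drain rate."""
    drain = media_rate_bps / 8.0          # bytes per second leaving the buffer
    buf, lo, hi = 0.0, 0.0, 0.0
    prev_t = arrivals[0][0]
    for t, nbytes in arrivals:
        buf -= drain * (t - prev_t)       # continuous drain since last packet
        lo = min(lo, buf)                 # deepest underrun
        buf += nbytes                     # packet enters the virtual buffer
        hi = max(hi, buf)                 # highest fill level
        prev_t = t
    return (hi - lo) / drain * 1000.0     # ms of buffering needed

# perfectly paced 100-byte packets at a 1000 B/s media rate
smooth = delay_factor([(0.0, 100), (0.1, 100), (0.2, 100)], 8000)
# same packets with jittery arrival times
jitter = delay_factor([(0.0, 100), (0.05, 100), (0.25, 100), (0.3, 100)], 8000)
```

With these inputs the jittery trace needs twice the buffer of the smooth one, which is exactly what a larger DF reports.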

It is worth mentioning that the Motion Picture QoS Metrics (MPQM) and the V-Factor, objective models based on specific properties of human vision that can accurately reproduce the subjective experience of a user, give a better evaluation of the QoE parameters for a video signal than the MDI or the Peak Signal-to-Noise Ratio (PSNR) [4].

IV. INTERNET PROTOCOL TELEVISION IPTV

Internet Protocol Television (IPTV) is the transmission of TV and video content over IP networks. It is gaining more and more importance with the increasing number of users interested in video content applications and services. The emergence of new video applications, in addition to the old ones that use the traditional broadcast networks (cable, terrestrial transmitters, satellite), opens new horizons for the extensive use of IP networks to satisfy new service demands. Video and television applications include:

* Entertainment TV services (e.g., IPTV and video-on-demand);
* Security (e.g., surveillance systems);
* Real-time communications (e.g., video telephony, teleconferencing);
* Interactive applications (e.g., interactive TV, gaming, and telepresence);
* Internet sharing and streaming (e.g., user-created video content and web-based streaming to a desktop PC or mobile device);
* Corporate training and marketing videos.

Each of the above applications has different QoS requirements to be delivered in the best quality to the end-user. In this section we give an overview of how entertainment broadcast television over IP and video on demand are transmitted from the server through the IP network before reaching the wireless access network, Wimax.


A. Broadcast IPTV

For broadcast IPTV, a video head-end server delivers the video stream compressed with an MPEG-2 or MPEG-4 (H.264) codec, sent in an MPEG Transport Stream (MPEG-TS) formed of fixed-length 188-byte packets carrying the Packetized Elementary Streams (PES). These packets are transported by the Real-time Transport Protocol (RTP), used on top of the User Datagram Protocol (UDP). RTP defines a standardized packet format for delivering real-time audio and video over the Internet. The services provided by RTP include payload-type identification, sequence numbering, time stamping (allowing synchronization and jitter calculations) and delivery monitoring. The real-time constraint leads the choice towards UDP rather than the Transmission Control Protocol (TCP), because UDP can deliver packets faster than TCP, but with a probable increase in packet drop ratio (there is no packet retransmission in UDP), a problem that must be dealt with elsewhere in the network. RTP is used in conjunction with the RTP Control Protocol (RTCP), which provides out-of-band control information for an RTP flow (i.e. sent bytes, sent packets, lost packets, jitter, feedback and round-trip delay). The main function of RTCP is to provide periodic feedback to the participants in a streaming multimedia session on the Quality of Service being provided by RTP, so the application can change parameters to increase the QoS (e.g. reducing the rate in case of network congestion). In some applications MPEG2-TS and/or RTP are not necessarily used; other mechanisms can do their work.
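The 12-byte fixed RTP header mentioned above is simple enough to assemble by hand; payload type 33 is the IANA-registered value for an MPEG-2 Transport Stream, and the field layout follows RFC 3550:

```python
import struct

def rtp_header(seq, timestamp, ssrc, payload_type=33, marker=0):
    """Build the 12-byte fixed RTP header (RFC 3550). PT 33 = MP2T."""
    b0 = 2 << 6                                     # version 2, P=X=0, CC=0
    b1 = (marker & 1) << 7 | (payload_type & 0x7F)  # marker bit + payload type
    return struct.pack('!BBHII', b0, b1,
                       seq & 0xFFFF,                # 16-bit sequence number
                       timestamp & 0xFFFFFFFF,      # 90 kHz clock for video
                       ssrc & 0xFFFFFFFF)           # stream source identifier

hdr = rtp_header(seq=1, timestamp=90000, ssrc=0x12345678)
# Seven 188-byte TS packets (1316 bytes) plus this header fit under a
# typical 1500-byte MTU, which is why IPTV streams commonly carry
# 7 TS packets per RTP datagram.
```

The sequence number and timestamp are what the receiver uses for the loss detection and jitter calculations described in the text.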

IP multicast, a point-to-multipoint technique for forwarding IP packets over an IP infrastructure, is used for transmitting television programs to multiple users. IP multicast utilizes the network infrastructure efficiently and saves network bandwidth, because packets are transmitted as one stream over the backbone and only replicated at the edge router before the multiple flows are delivered to the end-users. These users can join or leave a multicast group by using the Internet Group Management Protocol (IGMPv2) with the local multicast router. Protocol Independent Multicast (PIM) is a family of routing protocols used to ensure multicast transmission among the core network routers.
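From the receiver's side, joining such a group is a socket-level operation; the operating system then emits the IGMP membership report toward the local multicast router on the application's behalf. A sketch, where the group address and port are illustrative placeholders:

```python
import socket
import struct

def join_group(group='239.1.1.1', port=5004):
    """Open a UDP socket and join an IPv4 multicast group; the kernel
    sends the IGMP membership report to the local multicast router."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(('', port))
    # ip_mreq structure: multicast group address + local interface address
    mreq = struct.pack('4s4s', socket.inet_aton(group),
                       socket.inet_aton('0.0.0.0'))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock
```

Leaving the group (IP_DROP_MEMBERSHIP, or simply closing the socket) triggers the IGMP leave, which is what lets the edge router stop replicating the channel toward that user.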

Multi-Protocol Label Switching (MPLS) can be used between the network layer and the data link layer to provide faster switching and forwarding of the IP packets, using the packet's label instead of the IP address; these labels are added to each packet at the ingress of the MPLS network. Tables containing information about the levels of quality of service (QoS) that the network can support are used to provide traffic engineering. The tables and the labels are used together to establish an end-to-end path called a Label Switched Path (LSP). MPLS provides bandwidth guarantees, priority-based bandwidth allocation, and preemption services for a specific user application (or flow). Generalized MPLS (GMPLS) extends MPLS to provide the control plane (signaling and routing) for devices that switch in any of these domains: packet, time, wavelength, and fiber.

At the data link layer, protocols such as Gigabit Ethernet, Synchronous Optical Networking / Synchronous Digital Hierarchy (SONET/SDH), Asynchronous Transfer Mode (ATM), etc. are used on the backbone. One of many solutions proposed to resolve the bottlenecks in IPTV networks [1] consists of deploying an optical fiber network, using wavelength multiplexing on each fiber and Time Division Multiplexing (TDM) on each wavelength; packets are then forwarded during the allocated time slots.

B. VOD

For Video on Demand, other QoS requirements must be fulfilled. In this case the user can access the video server whenever he likes, to choose his favorite program or movie. The Real-Time Streaming Protocol (RTSP) is used in conjunction with RTP to let the user access the video server and functionalities such as Play, Pause, Stop, Rewind and Fast Forward, as if he were using his home video player. Unicast must also be used, because this is a point-to-point transmission between the video server and the end-user.
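These trick-play commands map onto RTSP methods, which are short text messages over a control connection. A PLAY request with a time range, for example, is how seek and resume are expressed (the URL and session identifier below are placeholders):

```python
def rtsp_play(url, session, cseq, start_s=0.0):
    """Compose an RTSP/1.0 PLAY request (RFC 2326). The Range header
    selects the normal-play-time position to start or resume from."""
    return (f"PLAY {url} RTSP/1.0\r\n"
            f"CSeq: {cseq}\r\n"
            f"Session: {session}\r\n"
            f"Range: npt={start_s:.3f}-\r\n"
            f"\r\n")

req = rtsp_play("rtsp://example.com/movie", session="12345678", cseq=4)
```

PAUSE and TEARDOWN have the same shape; the media itself still flows separately over RTP/UDP, with RTSP only steering it.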

V. IPTV OVER WIMAX

The last-mile hop is the access network, in our case Worldwide Interoperability for Microwave Access, or Wimax. Currently two standards lay down regulations for this wireless network: IEEE 802.16d (published in 2004) for fixed or nomadic users, and IEEE 802.16e (published in 2005) for mobile, nomadic, portable or fixed users. While 802.16d can be received by fixed devices such as computers, set-top boxes, or laptops using fixed antennas or slowly moving terminals, 802.16e also targets mobile devices such as PDAs, laptops, and fourth-generation (4G) cellular phones moving at speeds up to 120 km/h. Due to the lack of backwards compatibility between 802.16e and 802.16d, and the better performance of 802.16e, the latter standard may prevail in the near future [5]. We will focus on the 802.16e standard, with occasional hints at the differences from 802.16d.

802.16e supports Non-Line-of-Sight (NLOS) transmission in the 2.3 GHz, 2.5 GHz, 3.3 GHz and 3.5 GHz bands. Mobile Wimax supports peak DL data rates up to 63 Mbps per sector and peak UL data rates up to 28 Mbps per sector in a 10 MHz channel [6]. From the beginning, Wimax was designed to support QoS and to guarantee the best possible quality for multimedia flows such as video or voice. QoS is implemented on two levels: the physical (PHY) layer, and the data link or MAC layer.


A. Physical Layer Description

A.1 OFDMA Basics

Orthogonal Frequency Division Multiplexing (OFDM) is a multiplexing technique that subdivides the bandwidth into multiple orthogonal frequency sub-carriers. Each sub-carrier is modulated by a sub-stream, obtained by dividing the main stream into several parallel sub-streams of reduced data rate (thus increased symbol duration). Due to the increased symbol duration, a symbol delayed by multipath propagation cannot spread over multiple following symbols. Furthermore, the cyclic prefix (CP), the appending of the last portion of the data block at the beginning of that same block, can completely eliminate Inter-Symbol Interference (ISI) as long as the channel delay spread falls within the CP duration. This allows low-complexity frequency-domain equalization.

The model of terrestrial wireless transmission is a multipath, frequency-selective channel (due to phenomena such as reflection, refraction, scattering, etc.). Prior to transmission, the data is coded and interleaved to overcome burst errors or deep fading on one or several sub-carriers; the data can then be recovered using powerful error-correction codes, such as convolutional and Reed-Solomon coding. OFDM modulation can be realized with an efficient Inverse Fast Fourier Transform (IFFT), using up to 2048 sub-carriers. In an OFDM system, resources are divided in frequency (sub-carriers) and time (OFDM symbols). While each sub-carrier is modulated (in 16QAM, 64QAM or QPSK) by a low-rate signal, the OFDM symbol is the time-domain signal obtained from all the modulated sub-carriers during a specified time called the active symbol time. The total OFDM symbol duration is the sum of the CP duration and the active symbol time.
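The cyclic prefix is easy to see in code: the last samples of the IFFT output are copied in front of the block, so any echo shorter than the CP only mixes samples belonging to the same symbol. A toy 8-sub-carrier example, using a naive pure-Python inverse DFT in place of a real IFFT for brevity:

```python
import cmath

def idft(X):
    """Naive inverse DFT: maps sub-carrier symbols to time-domain samples."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def add_cyclic_prefix(x, cp_len):
    """Prepend a copy of the last cp_len time samples (the cyclic prefix)."""
    return x[-cp_len:] + x

# 8 sub-carriers carrying QPSK-like symbols, CP of 2 samples
X = [1+1j, 1-1j, -1+1j, -1-1j, 1+1j, 1-1j, -1+1j, -1-1j]
symbol = add_cyclic_prefix(idft(X), cp_len=2)
# total symbol duration = CP duration (2 samples) + active symbol time (8)
```

The receiver discards the CP samples before taking the FFT; because the prefix made the block look circularly periodic, the multipath channel acts as a circular convolution, which is what permits the one-tap-per-sub-carrier frequency-domain equalization mentioned above.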

A.2 OFDMA

In Orthogonal Frequency Division Multiple Access (OFDMA), the time and frequency resources are organized into sub-channels for allocation to individual users. OFDMA is a multiple-access/multiplexing scheme that multiplexes data streams from multiple users onto the downlink sub-channels and provides uplink multiple access by means of uplink sub-channels.

A.3 OFDMA Symbol Structure and Sub-Channelization

The OFDMA symbol structure consists of three types of sub-carriers:
1) Data sub-carriers, for data transmission.
2) Pilot sub-carriers, for estimation and synchronization purposes.
3) Null sub-carriers, with no transmission, used for guard bands and the DC carrier.

Active (data and pilot) sub-carriers are grouped into subsets called subchannels. Subchannelization is used in both DownLink (DL) and UpLink (UL) transmissions. Each subchannel contains 48 data subcarriers, which can be allocated in one of three modes depending on the usage scenario, as follows:

1) Fully Used Sub-Channelization (FUSC): sub-carriers are selected pseudo-randomly throughout the frequency channel range. This mode is used for DL.

2) Partially Used Sub-Channelization (PUSC): sub-carriers selected pseudo-randomly from several scattered clusters of sub-carriers can be used to form a subchannel. This mode is used in DL and UL.

3) Adjacent Subcarrier Permutation (ASM): this method uses adjacent subcarriers to form subchannels. It is optional in DL and UL, and is used in conjunction with Advanced Modulation and Coding (AMC). When used with fast feedback channels, it can rapidly assign a modulation and coding combination per subchannel. The AMC subchannels enable the use of "water-pouring" types of algorithms for multi-user resource allocation [7].

FUSC and PUSC provide frequency diversity and inter-cell interference averaging; they are the best alternatives for mobile applications. AMC, which enables multi-user diversity by choosing the sub-channel with the best frequency response for each user, is well suited for stationary, portable, or low-mobility applications. These options enable the system designer to trade off mobility for throughput.
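A toy picture of distributed (FUSC/PUSC-style) subchannelization: shuffle the usable data sub-carriers with a seeded permutation shared by transmitter and receiver, then cut them into groups of 48. This is illustrative only; the real 802.16e permutation formulas are considerably more involved:

```python
import random

def distributed_subchannels(n_data=720, per_subchannel=48, seed=0):
    """Pseudo-randomly spread sub-carriers across subchannels; the
    scattering is what gives FUSC/PUSC their frequency diversity and
    inter-cell interference averaging."""
    rng = random.Random(seed)      # both ends derive the same permutation
    carriers = list(range(n_data))
    rng.shuffle(carriers)
    return [carriers[i:i + per_subchannel]
            for i in range(0, n_data, per_subchannel)]

subchannels = distributed_subchannels()   # 720 / 48 = 15 subchannels
```

Each subchannel now holds 48 sub-carriers spread over the whole channel, so a deep fade in one part of the band damages only a few sub-carriers of any given user; an AMC-style allocation would instead take 48 adjacent sub-carriers.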

A.4 Scalable OFDMA

The IEEE 802.16e-2005 WirelessMAN OFDMA mode is based on the concept of scalable OFDMA (S-OFDMA). S-OFDMA supports a wide range of bandwidths to meet the varying bandwidth requirements of a wide variety of applications. The bandwidths of 1.25 MHz, 2.5 MHz, 5 MHz, 10 MHz and 20 MHz, using Fast Fourier Transform (FFT) sizes of 128, 256, 512, 1024 and 2048 respectively, all have a fixed sub-carrier frequency spacing of 10.94 kHz, and a useful symbol time (Tb = 1/Δf) of 91.4 microseconds. This feature allows the system to keep the same Doppler-spread compensation for all bandwidths; if instead a fixed FFT size of 1024, for example, had been chosen, the terminal would have to retune for each change of bandwidth. For further details on OFDMA and S-OFDMA, refer to [7], [8].
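The fixed spacing can be verified from the scaling rule: the sampling frequency is the channel bandwidth times a sampling factor (28/25 for these profiles, an assumption stated here rather than in the text), and the spacing is that frequency divided by the FFT size:

```python
SAMPLING_FACTOR = 28 / 25   # n for the 1.25/2.5/5/10/20 MHz profiles

for bw_mhz, nfft in [(1.25, 128), (2.5, 256), (5, 512), (10, 1024), (20, 2048)]:
    fs = bw_mhz * 1e6 * SAMPLING_FACTOR   # sampling frequency, Hz
    delta_f = fs / nfft                   # sub-carrier spacing, Hz
    tb_us = 1e6 / delta_f                 # useful symbol time Tb = 1/delta_f
    print(f"{bw_mhz:5} MHz: {delta_f/1e3:.2f} kHz spacing, Tb = {tb_us:.1f} us")
# every profile yields ~10.94 kHz spacing and Tb ~ 91.4 us
```

Because bandwidth and FFT size scale together, every profile lands on the same 10.94 kHz spacing and 91.4 microsecond useful symbol time, which is precisely the scalability property described above.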

A.5 Time Division Duplex (TDD) Frame Structure

Mobile WiMAX initially chose to implement Time Division Duplex (TDD), even though the 802.16e standard supports both TDD and full- and half-duplex Frequency Division Duplex (FDD) operation. 802.16d also supports FDD. TDD was chosen because it has many advantages over FDD. Unlike FDD, which uses two channels of the same bandwidth for UL and DL, TDD uses a single channel for both UL and DL transmission, and can dynamically adjust the downlink/uplink ratio to efficiently manage asymmetric demand for bandwidth.

The OFDM TDD frame structure is divided into DL and UL sub-frames separated by Transmit/Receive and Receive/Transmit Transition Gaps (TTG and RTG, respectively) to prevent DL and UL transmission collisions.


The total frame duration ranges from 2 ms to 20 ms. In the DL sub-frame, the preamble is used for synchronization; the Frame Control Header (FCH) provides frame configuration information such as the MAP message length, the coding scheme and the usable sub-channels. The DL-MAP and UL-MAP provide the sub-channel allocation and the type of modulation and coding used by the bursts containing the data, in DL and UL respectively, as well as other control information.

In the UL sub-frame, the UL Ranging sub-channel is allocated for mobile stations (MS) to perform closed-loop time, frequency, and power adjustment, as well as bandwidth requests. Bandwidth requests can be granted by the BS per connection (GPC) or per subscriber station (GPSS), the latter giving the SS freedom to allocate the bandwidth among its multiple streams.

A.6 Other Advanced PHY Layer Features

1) The Multiple Input Multiple Output (MIMO) antenna configuration uses two antennas at the Base Station (BS) and two antennas at the Subscriber Station (SS). With Space-Time Coding (STC), also known as MIMO Matrix A, identical downlink data streams are sent from each transmit antenna, providing space and time diversity, which enhances DL capacity as well as DL range by increasing the signal-to-noise ratio (SNR) of the received signal at the mobile station, thus enabling the use of bursts with higher modulation efficiency. MIMO is generally used with the FUSC and PUSC modes. With Spatial Multiplexing (SM), also known as MIMO Matrix B, each of the base station's transmit antennas sends a different downlink data stream. This technique has the potential to double the DL capacity under favorable channel conditions. Adaptive MIMO Switching enables dynamic switching between Matrix A and Matrix B depending on the existing channel conditions. Up to 4x4 MIMO is covered in the Mobile Wimax standard.
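Matrix A is the Alamouti scheme: two symbols are transmitted over two antennas in two symbol times, arranged so the per-antenna transmit sequences are orthogonal, which is what lets the receiver separate them and collect the diversity gain. A minimal sketch:

```python
def alamouti_encode(s1, s2):
    """Alamouti space-time block code (MIMO Matrix A).
    Each row is one symbol time; each column is one transmit antenna.
    Time t:   antenna 1 sends s1,          antenna 2 sends s2
    Time t+1: antenna 1 sends -conj(s2),   antenna 2 sends conj(s1)"""
    return [(s1, s2),
            (-s2.conjugate(), s1.conjugate())]

grid = alamouti_encode(1 + 2j, 3 - 1j)
```

The two antenna columns have zero inner product for any pair of symbols, so a single-antenna receiver with channel estimates can recover s1 and s2 with simple linear combining; Matrix B drops this redundancy and instead sends independent streams for capacity.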

2) With the Active Antenna System (AAS), also known as beamforming, an array of four antenna elements fed with signals of specific different phases produces a narrower beam directed towards the user, with much higher gain, while reducing the gain in the interference direction, thus increasing the carrier-to-interference ratio. This technique requires channel-condition feedback and is used with stationary or pedestrian-speed subscribers. AAS allows an increase of up to 40% in throughput and a reduction of more than half in the number of sites [9]. It uses the AMC mode in S-OFDMA.

Combining 4x4 MIMO and beamforming gives the "best of both worlds": it allows robust, high-capacity operation whatever the channel conditions, and it also permits a frequency reuse factor of 1.

B. MAC Layer Description

The 802.16 MAC layer can support bursty data traffic with high peak-rate demand, while simultaneously supporting streaming video and latency-sensitive voice traffic over the same channel. All this is achieved by supporting multiple QoS classes, an efficient resource allocation to one terminal varying from a single time-frequency slot to the entire frame, and a modulation and coding profile that changes dynamically on a frame-by-frame basis to adapt to the bursty nature of the traffic and the channel propagation conditions. The resource allocation is conveyed in the MAP messages at the beginning of each frame.

B.1 Quality of Service (QoS) Support

With a fast air link, asymmetric downlink/uplink capability, fine resource granularity and a flexible resource allocation mechanism, Mobile WiMAX can meet QoS requirements for a wide range of data services and applications. QoS is provided via service flows. A service flow is a unidirectional flow of packets provided with a particular set of QoS parameters. In order to provide end-to-end QoS over the air interface, Wimax is connection-oriented. The data connection, a unidirectional logical link between the two peer MACs at the BS and SS, is first established between the base station and the user terminal. The service flow, with a service flow identifier (SFID), is then assigned to a connection with a 14-bit connection identifier (CID). The SFID is fixed across base stations and can be used for mapping to MPLS. Each base station maps an SFID to a new CID.
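The SFID-to-CID relationship can be pictured as a small per-base-station table: the SFID is stable network-wide, while each BS hands out its own local 14-bit CID. The class and method names below are illustrative, not from the standard:

```python
class BaseStationMapper:
    """Toy per-BS table mapping a network-wide SFID to a local 14-bit CID."""
    CID_BITS = 14

    def __init__(self):
        self._next_cid = 0
        self._by_sfid = {}

    def admit(self, sfid):
        if sfid in self._by_sfid:          # service flow already has a connection
            return self._by_sfid[sfid]
        if self._next_cid >= 1 << self.CID_BITS:
            raise RuntimeError("CID space exhausted")   # at most 2^14 CIDs
        cid = self._next_cid
        self._next_cid += 1
        self._by_sfid[sfid] = cid
        return cid

bs = BaseStationMapper()
cid_a = bs.admit(1001)   # first service flow gets the first free CID
cid_b = bs.admit(1002)
```

When a subscriber moves to another BS, the same SFID is admitted there and simply receives that station's own CID, which is why the SFID, not the CID, is the stable handle for mapping onto MPLS paths.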

Specific connections are used for management and control purposes, while others are created to transmit multicast and broadcast messages to the SS.

Service flows can be created on demand or pre-provisioned, depending on the application. The service-flow parameters can be dynamically managed through MAC messages to accommodate the dynamic service demand, in both the downlink and the uplink. Mobile WiMAX supports a wide range of data services and applications with varied QoS requirements; these are summarized in Table II. Streaming video uses the Real-Time Polling Service (rtPS), in which the BS periodically sends polling messages to the SSs, allowing them to request their currently needed bandwidth.

TABLE II
MOBILE WiMAX APPLICATIONS AND QUALITY OF SERVICE


C. Networking and Mobility

The network reference model developed by the WiMAX Forum Network Working Group defines a number of functional entities and the interfaces between them. Fig. 1 shows some of the more important functional entities.

1) Base station (BS): The BS is responsible for providing the air interface to the MS. Additional functions that may be part of the BS are micromobility management functions, QoS policy enforcement, traffic classification, and multicast group management.

2) Access service network gateway (ASN-GW): The ASN gateway typically acts as a layer-2 traffic aggregation point within an ASN. Additional functions that may be part of the ASN gateway include intra-ASN location management and paging, establishment and management of mobility tunnels with base stations, QoS and policy enforcement, foreign-agent functionality for Mobile IP, and routing to the selected CSN.

3) Connectivity service network (CSN): The CSN provides connectivity to the Internet, the Application Service Provider (ASP), or other public or corporate networks. The CSN is owned by the Network Service Provider (NSP) and includes AAA servers that support authentication and authorization for the devices, users, and specific services, as well as accounting. The CSN also provides per-user policy management of QoS and security, and is responsible for IP address management, support for roaming between different NSPs, location management between ASNs, and mobility and roaming between ASNs.

To access the WiMAX network, the SS scans for a channel, synchronizes, acquires the network parameters, and sends a ranging-request message. The BS then assigns a management connection and negotiates capabilities with the SS (i.e. bandwidth allocation, transmitted power, supported modulations, etc.). The SS sends its certificates in the Authorization Request and Authentication Information messages and receives an authorization key if it is authorized to access the network. The SS then requests registration to the network and, after receiving the response indicating the accepted subset of parameters, obtains an IP address from a DHCP server in the network [10].
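The network-entry procedure above is strictly ordered: each step must succeed before the next can begin. A minimal sketch of that ordering (step names are illustrative labels, not protocol identifiers):

```python
# Hypothetical sketch of the SS network-entry sequence described above,
# modelled as an ordered list of steps that must succeed in order.

NETWORK_ENTRY_STEPS = [
    "scan_for_channel",
    "synchronize",
    "acquire_network_parameters",
    "ranging_request",
    "capability_negotiation",   # bandwidth, power, supported modulations...
    "authorization",            # certificates -> authorization key
    "registration",
    "dhcp_ip_address",
]

def enter_network(step_results):
    """Run the entry steps in order; stop at the first failure.

    step_results maps a step name to True/False (its outcome in this sketch).
    Returns (succeeded, completed_steps).
    """
    completed = []
    for step in NETWORK_ENTRY_STEPS:
        if not step_results.get(step, False):
            return False, completed
        completed.append(step)
    return True, completed

ok, done = enter_network({s: True for s in NETWORK_ENTRY_STEPS})
print(ok)  # True: the SS is on the network with an IP address

# An SS whose certificates are rejected never reaches registration:
bad = {s: True for s in NETWORK_ENTRY_STEPS}
bad["authorization"] = False
ok, done = enter_network(bad)
print(ok, done[-1])  # False capability_negotiation
```

The important design point is that authorization gates registration: an SS that cannot produce valid certificates never obtains a management connection to the IP layer.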

Fig. 1. WiMAX network reference model.

D. Multicast Broadcast Services (MBS)

In order to have successful mobile video delivery, an architecture implementation specifically optimized for WiMAX-based mobile video is needed. An optimized WiMAX TV architecture is based on the Multicast-Broadcast Services (MBS) specification, which is part of the Mobile WiMAX (802.16e) standard.

MBS uses a Single Frequency Network (SFN), also used with Digital Video Broadcast-Handheld (DVB-H), where several contiguous BSs transmit at the same time on the same frequency but need to be synchronized (e.g. by a GPS signal). These groups of BSs are called MBS zones, where the synchronized BSs use the same CID and encryption keys for a particular SS, allowing it to receive the signal from any BS in the zone while remaining registered with only one BS.

The SFN architecture brings an additional gain of several dBs in the radio channel, thus improving reception quality, allowing higher rates and extending coverage. SFN also makes the user's transition from one sector to another virtually seamless.

"Time-slicing" technology, successfully implemented in DVB-H and other broadcast standards, increases the lifetime of the terminal's battery by receiving the content in short bursts rather than continuously. MBS can be accessed while the SS is in Idle mode (i.e. not registered, but paged if any traffic is addressed to it while inactive), allowing low SS power consumption. MBS also allows flexible allocation of radio resources, datacasting (e.g. weather, stocks, etc.) and a low channel switching time (2 s).

Macro diversity enhances MBS delivery. With Macro Diversity Handover (MDHO), an SS having active connections with several BSs combines the multiple received signals with a diversity algorithm to retrieve a single signal. On the uplink, the SS sends packets to all BSs and the best copy is selected by the anchor BS. This technique supports seamless, or soft, handover between cells. Mobile IP is used to ensure IP connectivity and continuity of flow delivery, by keeping the same IP address for an SS moving between two BSs, or even between two ASN-GW zones. The Home Agent (HA) is located in the CSN as an anchor point, while the Foreign Agent (FA) is located at the ASN-GW connected to the BS delivering the traffic to the SS. Tunneling is used to deliver the traffic to its final destination.

Mobile WiMAX supports integrated MBS and unicast services transmitted in the same DL sub-frame, or a standalone broadcast service using a whole DL frame.

The MBS architecture is very useful when transmitting broadcast

or multicast information, as for broadcast TV. After accessing and registering to the network, the TV application client in the SS requests a bandwidth allocation (if not pre-provisioned) depending on the resolution and quality of the desired TV signal. When the bandwidth allocation is granted, the SS sends an IGMP message to the BS to join the multicast group comprising the subscribers watching the requested channel. The BS forwards this message via the ASN-GW to an edge router supporting multicast in the connectivity network. If the requested channel is already delivered by this router, a new video stream is forwarded to the SS. Otherwise a new multicast path is created in the core network to deliver the requested stream.

Statistical multiplexing is also used to distribute the total bandwidth dynamically, in near real time, among multiple streams, by assigning more bandwidth to a bandwidth-intensive video stream than to a less bandwidth-intensive one. Because this can be done in real time, the quality of the audio and video can be guaranteed, and the content per channel increased by 30-40%.

To further enhance network performance in the QoS domain, an admission control procedure as well as the DiffServ architecture could be used.
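One simple way to picture statistical multiplexing is as a periodic proportional re-division of a fixed channel pool among the active streams, so that a momentarily complex scene borrows capacity from simpler ones. The sketch below is illustrative only; the function name and numbers are assumptions, not values from the paper.

```python
# Illustrative sketch of statistical multiplexing: a fixed channel capacity is
# redistributed every interval in proportion to each stream's current demand.

def stat_mux(capacity_kbps, demands_kbps):
    """Share capacity among streams proportionally to their instantaneous demand."""
    total = sum(demands_kbps.values())
    if total <= capacity_kbps:
        return dict(demands_kbps)            # everyone fits; no scaling needed
    scale = capacity_kbps / total
    return {ch: d * scale for ch, d in demands_kbps.items()}

# Three SDTV channels sharing a 10 000 kb/s pool at one instant
# (demands total 12 000 kb/s, so every stream is scaled by 10/12):
demands = {"sports": 6000, "news": 2000, "cartoon": 4000}
alloc = stat_mux(10000, demands)
print({ch: round(kbps) for ch, kbps in alloc.items()})
print(round(sum(alloc.values())))   # the pool is never exceeded
```

A real multiplexer would weight the scaling by per-stream quality targets rather than scale uniformly, but the principle is the same: unused headroom on quiet channels is what funds the 30-40% capacity gain cited above.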

While deploying a WiMAX TV architecture may not involve a heavy infrastructure upgrade, software must be integrated to manage the network's operations. A first software module, the WiMAX TV Manager, implemented in the Access Service Network, ensures the management of the entire WiMAX TV network as well as the integration of the broadcast/multicast system with the content sources. A second module in the ASN implements the core functions of the WiMAX Multicast-Broadcast Service Controller. Finally, the BS module enforces SFN broadcasting and time-slicing, and executes the procedures for local content adaptation and insertion.

To give an idea of the performance of MBS and other Mobile WiMAX TV features, some figures have been published: up to 45 users per transmitter can be served in a 10 MHz channel, with a video resolution of 320x240 pixels (Quarter Video Graphics Array, QVGA) and a frame rate of 15-30 fps at a 300 kb/s bitrate. The SS battery can last up to 4 hours [11].
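A quick back-of-envelope check shows these figures are internally consistent (overheads, unicast traffic and coding margins are ignored here):

```python
# Sanity check of the quoted MBS figures: 45 QVGA users at 300 kb/s each
# in a 10 MHz channel. Overheads are deliberately ignored in this sketch.
users = 45
bitrate_kbps = 300            # per QVGA stream
channel_mhz = 10

aggregate_mbps = users * bitrate_kbps / 1000
efficiency = aggregate_mbps * 1e6 / (channel_mhz * 1e6)   # bits/s per Hz

print(aggregate_mbps)   # 13.5 Mb/s of video carried in the channel
print(efficiency)       # 1.35 b/s/Hz of spectral efficiency
```

An aggregate of 13.5 Mb/s in 10 MHz corresponds to roughly 1.35 b/s/Hz devoted to video, a plausible figure for a Mobile WiMAX downlink once MAC overhead and robust SFN modulation are accounted for.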

E. Video on Demand

In Video on Demand, a client application installed in the subscriber terminal communicates with the head-end video server application to receive the video stream of a chosen program or movie. This is done in unicast mode: the network must deliver the stream only to the requesting subscriber. As the number of users increases, the constraints on the network also increase, making the QoS and QoE requirements more difficult to fulfill. To reduce the strain on the network, a distributed architecture is implemented: many local servers, provisioned with the most popular programs, are distributed over different geographical sites, so that when a subscriber asks to watch a specific program the request is directed to the local server at the local Central Office rather than to the head server at the Video Head-end Office (VHO). With this architecture the real-time QoS requirements are fulfilled in a more cost-effective way, especially when interactivity is required.
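The routing decision behind this distributed VoD architecture can be sketched in a few lines (the office names, cache contents and returned strings are purely hypothetical examples):

```python
# Hypothetical sketch of distributed VoD: a request is served by the local
# Central Office if it caches the title, and only falls back to the video
# head-end (VHO) on a cache miss. All names and data are illustrative.

LOCAL_CACHE = {
    "CO-Beirut": {"popular_movie_1", "popular_series_2"},
    "CO-Tripoli": {"popular_movie_1"},
}

def route_vod_request(office, title):
    """Return which server should stream this unicast VoD request."""
    if title in LOCAL_CACHE.get(office, set()):
        return f"local server at {office}"   # short path: easier real-time QoS
    return "head-end server at VHO"          # cache miss: fetch from the VHO

print(route_vod_request("CO-Beirut", "popular_movie_1"))
print(route_vod_request("CO-Beirut", "rare_documentary"))
```

Keeping the popular titles close to the subscriber shortens the unicast path, which is exactly where the latency-sensitive interactivity (pause, seek, trick modes) benefits most.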

VI. CONCLUSION AND FUTURE EVOLUTION

End-to-end TV and VoD transmissions need more than the excellent performance of the WiMAX last-mile access network to fulfill the QoE requirements of the end-user. Enhancements must be undertaken in the IP network to allow the cost-effective delivery of very high quality video content, such as HDTV, to the subscriber, who can compare what he actually receives with what he expects, taking as references what is provided by other media such as DVD, satellite, cable and terrestrial broadcasting. With the announcement by CERN (the European nuclear research center that created the web) of the superfast internet called "the grid", 10,000 times faster than a typical broadband connection, we are entering a new era of video content streams. Holographic and 3D video streams, interactive gaming with thousands of players, flawless high-definition video telephony and 3D telepresence will become a reality.

Concerning WiMAX's evolution, the new standard 802.16m will provide full mobility for users, without reduction of QoS during handovers, with a maximum bit rate of 100 Mbps at speeds up to 350 km/h, and 1 Gbps for nomadic users [12].

REFERENCES

[1] C. Bohm, "Solving the QoS bottleneck in video and triple play networks", www.embedded.com/design/207100744

[2] F. Horsfall, T. Rahrer, L. Thorpe, and C. York, "Video services: Addressing the quality of experience and deployment challenges", Nortel Technical Journal, Issue 5.

[3] S. Beldona, "Alternative Architectures for Broadcast TV/Video Distribution in Metro Ethernet Networks", Cisco Systems, 2005, www.sanog.org/resources/sanog7/beldona-alternative-arch-broadcast.pdf

[4] Y. Cognet, "Measuring QoS parameters at the box", www.digitaltvdesignline.com/howto/advancedvideodisplay/177100489

[5] Motorola white paper, "WiMAX: E vs. D. The Advantages of 802.16e over 802.16d", www.motorola.com/networkoperators/pdfs/new/WIMAX_E_vs_D.pdf

[6] "Mobile WiMAX - Part I: A Technical Overview and Performance Evaluation", WiMAX Forum, Aug. 2006, www.wimaxforum.org/news/downloads/Mobile_WiMAX_Part1_Overview_and_Performance.pdf

[7] H. Yaghoobi, "Scalable OFDMA Physical Layer in IEEE 802.16 WirelessMAN", Intel Technology Journal, vol. 8, issue 3, p. 201-212, Aug. 2004.

[8] S. Srikanth, V. Kumaran, C. Manikandan, "Orthogonal Frequency Division Multiple Access: Is it the Multiple Access System of the Future", http://comm.au-kbc.org/Docs/Tutorils/OFDMA_BCW_cv6.pdf

[9] C.-E. Joys, "WiMAX", Alcatel-Lucent, 2007, wirelesscenter.dk/arrangementer/07/wimax-mobile-access/presentations/WiMAX-CarlEdwardsJoys.pdf, p. 48

[10] C. Eklund, R. B. Marks, K. L. Stanwood, S. Wang, "A Technical Overview of the WirelessMAN Air Interface for Broadband Wireless Access", IEEE Communications Magazine, p. 98-107, June 2002.

[11] MxTV white paper, "The Innovative New Mobile Multimedia and Advertising Platform for WiMAX Operators", NextWave Wireless, 2008.

[12] J. Smith, "Broadband WiMAX Update & Solution", Cisco Systems, 2007, www.cisco.com/web/YU/events/expo_07/pdfs/Wimax-John-Smith.pdf
