INTERNAL
LTE History:
LTE started as an idea in 2004, when NTT DoCoMo of Japan proposed LTE as the
international standard. The first presentation of an LTE demonstrator, offering HDTV
streaming (>30 Mbit/s) and Mobile IP-based handover between the LTE radio test bed and a
commercially available HSDPA RAT, was shown at the ITU trade fair in Hong Kong in
December 2006 by Siemens (today Nokia Siemens Networks).
In February 2007, Ericsson gave the world's first demonstration of LTE, with a maximum
throughput of 144 Mbit/s. Later, in September of the same year, NTT DoCoMo demonstrated LTE
data rates of 200 Mbit/s with a power level below 100 mW during tests. LTE technology was
eventually introduced commercially in December 2009 by TeliaSonera in Norway and Sweden.
LTE Throughput:
Upon the introduction of LTE, many people saw or heard wild figures, mainly pushed by
system vendors and repeated by operators, journalists and writers who like to wow their readers
with promises of 1 or 2 Gbit/s throughput.
On the network operator (customer) side, capacity expectations have serious
consequences, since capacity directly drives the cost of the network on both the access side and the
backhaul side. Exaggerated capacity figures lead to under-dimensioning on the access side
and over-dimensioning on the backhaul side. For example, if we assume an LTE cell will provide
100 Mbps of throughput while in reality it can only offer 50 Mbps, the operator will be short by
50% of capacity in the access network, resulting in a "below expectations" user experience, and
will have provisioned 50% more backhaul capacity than required, an investment that sits idle.
This is why it is important for vendors to set capacity expectations right and for the
customer to be realistic about their demands.
While many have heard a "standard", realistic figure for LTE's peak throughput, e.g. 300
Mbps, it is not a standard by any means and can vary with many dependencies, from the
hardware equipment all the way to the conditions of the transmission medium. In this paper, we
explain the calculation of theoretical throughput for both LTE FDD and TDD systems and
how to use this knowledge in the design and acceptance phases of wireless LTE rollout projects.
Overview of LTE Physical Layer:
The LTE physical layer deals with parameters such as frequency, bandwidth, modulation
scheme, cyclic prefix and coding rate, which play important roles in the calculation of the
peak throughput of LTE networks.
The LTE system uses OFDMA as the downlink access technology to increase spectral
efficiency, and SC-FDMA in the uplink for its low peak-to-average power ratio.
LTE supports both TDD and FDD duplexing, flexible bandwidths (1.4, 3, 5, 10, 15 and
20 MHz) and the modulation schemes QPSK, 16QAM and 64QAM. Below we discuss the
significance of each parameter.
(A) OFDM: OFDM is used in LTE because users can be allocated different resources in BOTH the
time and the frequency domains, unlike TDMA or FDMA, where a user was allocated
different resources in one of the two domains and fixed in the other.
OFDM's main advantage is spectral efficiency: it eliminates the use of guard
bands in the frequency domain, which allows the maximum and most efficient
transmission of data.
Figure (2): Orthogonal Frequency Subcarriers eliminate the usage of guard bands
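The orthogonality that lets subcarriers overlap without guard bands can be checked numerically: over one symbol period, the sampled product of two different subcarriers averages to zero. A minimal sketch in Python (the FFT size is the value commonly used for a 20 MHz carrier; the helper name is ours):

```python
import cmath

N = 2048  # FFT size commonly used for a 20 MHz LTE carrier

def correlate(k: int, m: int) -> complex:
    """Normalized inner product of subcarriers k and m over one OFDM symbol."""
    return sum(
        cmath.exp(2j * cmath.pi * k * n / N) * cmath.exp(-2j * cmath.pi * m * n / N)
        for n in range(N)
    ) / N

# Same subcarrier: full correlation; different subcarriers: zero (orthogonal)
print(abs(correlate(5, 5)))  # -> 1.0
print(abs(correlate(5, 6)))  # ~0 (orthogonal despite spectral overlap)
```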
(B) Resource Block: Combining the previous information about OFDM, we can now define the
resource block: a unit of transmission resource consisting of 12 subcarriers in the
frequency domain and 1 time slot (0.5 ms) in the time domain.
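Putting numbers to the definition: with the normal cyclic prefix there are 7 OFDM symbols per 0.5 ms slot, so one resource block spans 12 × 7 = 84 resource elements. A quick sketch (constant and function names are ours):

```python
SUBCARRIERS_PER_RB = 12           # frequency domain
SYMBOLS_PER_SLOT_NORMAL_CP = 7    # time domain, normal cyclic prefix
SYMBOLS_PER_SLOT_EXTENDED_CP = 6  # time domain, extended cyclic prefix

def resource_elements_per_rb(extended_cp: bool = False) -> int:
    """Resource elements in one resource block over one 0.5 ms slot."""
    symbols = SYMBOLS_PER_SLOT_EXTENDED_CP if extended_cp else SYMBOLS_PER_SLOT_NORMAL_CP
    return SUBCARRIERS_PER_RB * symbols

print(resource_elements_per_rb())      # -> 84 (normal CP)
print(resource_elements_per_rb(True))  # -> 72 (extended CP)
```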
(C) LTE Frame: An LTE frame is 10 ms long and consists of 10 subframes, i.e. 20 time slots. An
LTE subframe, or TTI, is the smallest resource that carries data in LTE and consists of
two slots, i.e. 1 millisecond in time.
Figure (6): LTE Time Frame / SubFrame System with CP in Time Domain
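The frame arithmetic above can be sketched directly (variable names are ours):

```python
FRAME_MS = 10     # one LTE radio frame
SUBFRAME_MS = 1   # one subframe = one TTI = two slots
SLOT_MS = 0.5     # one time slot

subframes_per_frame = FRAME_MS // SUBFRAME_MS    # 10 subframes per frame
slots_per_frame = int(FRAME_MS / SLOT_MS)        # 20 slots per frame
slots_per_subframe = int(SUBFRAME_MS / SLOT_MS)  # 2 slots per TTI

print(subframes_per_frame, slots_per_frame, slots_per_subframe)  # -> 10 20 2
```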
Each channel bandwidth yields a different number of resource blocks and hence a different
throughput. In this case, let us take the 20 MHz channel and remove 10% as total guard bands
(used to protect against interference from adjacent channels).
Figure (9): Different LTE Channel Frequencies related to subcarriers after guard bands
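The standard resource-block counts per channel bandwidth make the ~10% guard-band overhead concrete: each resource block occupies 12 × 15 kHz = 180 kHz, so the 100 RBs of a 20 MHz channel fill only 18 MHz. A sketch (the mapping holds the 3GPP-defined RB counts; names are ours):

```python
# Resource blocks available per LTE channel bandwidth (3GPP-defined values)
RB_PER_BANDWIDTH_MHZ = {1.4: 6, 3: 15, 5: 25, 10: 50, 15: 75, 20: 100}
RB_WIDTH_KHZ = 12 * 15  # 12 subcarriers x 15 kHz spacing = 180 kHz per RB

def occupied_mhz(channel_mhz: float) -> float:
    """Bandwidth actually carrying subcarriers, after guard bands."""
    return RB_PER_BANDWIDTH_MHZ[channel_mhz] * RB_WIDTH_KHZ / 1000

print(occupied_mhz(20))  # -> 18.0 MHz used out of 20 MHz (~10% guard bands)
```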
While speaking of modulation, we must also mention a parameter called the coding rate. The
coding rate defines the efficiency of a particular modulation scheme. For example, 16QAM with a
coding rate of 0.5 has 50% efficiency: 16QAM can carry 4 bits per symbol, but with a coding rate
of 0.5 only 2 of them are information bits, the other 2 being redundancy bits protecting that
information.
Together these form the Modulation and Coding Scheme (MCS). Below is the table for it. In R8,
LTE supports MCS 0 to 28 in the DL and MCS 0 to 22 in the UL.
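The information rate per modulation symbol follows directly from the modulation order and the coding rate, as in the 16QAM example above. A sketch (names are ours):

```python
# Raw bits carried per modulation symbol
BITS_PER_SYMBOL = {"QPSK": 2, "16QAM": 4, "64QAM": 6}

def info_bits_per_symbol(modulation: str, coding_rate: float) -> float:
    """Information bits per symbol after channel-coding overhead."""
    return BITS_PER_SYMBOL[modulation] * coding_rate

print(info_bits_per_symbol("16QAM", 0.5))  # -> 2.0 information bits (+2 redundancy)
```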
One more factor in the calculation is the UE category, which is reported in the UE capability
report exchanged in LTE signaling between the UE and the E-UTRAN.
UE categories 1-5 belong to Releases 8 and 9, while UE categories 6-8 were added in Release 10
(LTE-Advanced).
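For quick reference, the approximate peak downlink rates per UE category can be kept in a simple lookup. The values below are rounded from 3GPP TS 36.306 and should be verified against the exact release in use before any planning work:

```python
# Approximate peak DL throughput (Mbit/s) per UE category, rounded from
# 3GPP TS 36.306 -- verify exact values against the release in use.
UE_CATEGORY_DL_MBPS = {
    1: 10, 2: 51, 3: 102, 4: 150, 5: 300,  # Release 8/9
    6: 300, 7: 300, 8: 3000,               # Release 10 (LTE-Advanced)
}

print(UE_CATEGORY_DL_MBPS[4])  # -> 150, a common smartphone category
```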
For the peak-throughput calculation, assume the best case:
20 MHz channel
64QAM
Normal (Type 1) cyclic prefix
No MCS limitation (coding rate of 1)
4x4 MIMO
But note that this is shared throughput for an eNodeB, meaning this peak throughput will
be divided among the users according to the QoS framework of LTE.
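The assumptions above combine into the often-quoted peak figure: counting resource elements per second and multiplying by bits per symbol and MIMO layers gives roughly 403 Mbit/s before any control-channel or reference-signal overhead. A sketch (names are ours):

```python
RBS = 100             # 20 MHz channel -> 100 resource blocks
SUBCARRIERS = 12      # per resource block
SYMBOLS = 7           # per slot, normal cyclic prefix
SLOTS_PER_SEC = 2000  # 0.5 ms slots
BITS_64QAM = 6        # bits per symbol; coding rate of 1 assumed (no MCS limit)
MIMO_LAYERS = 4       # 4x4 MIMO

re_per_sec = RBS * SUBCARRIERS * SYMBOLS * SLOTS_PER_SEC  # 16,800,000 REs/s
peak_bps = re_per_sec * BITS_64QAM * MIMO_LAYERS

print(peak_bps / 1e6)  # -> 403.2 Mbit/s raw peak, before L1/L2 overhead
```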
ULCYCLICPREFIX > Uplink Cyclic Prefix Length; choose between
Normal and Extended
DLCYCLICPREFIX > Downlink Cyclic Prefix Length; choose between
Normal and Extended
ULBANDWIDTH > Uplink Bandwidth: CELL_BW_N6 (1.4 MHz),
CELL_BW_N15 (3 MHz), CELL_BW_N25 (5 MHz), CELL_BW_N50 (10 MHz),
CELL_BW_N75 (15 MHz), CELL_BW_N100 (20 MHz)
DLBANDWIDTH > Downlink Bandwidth: CELL_BW_N6 (1.4 MHz),
CELL_BW_N15 (3 MHz), CELL_BW_N25 (5 MHz), CELL_BW_N50 (10 MHz),
CELL_BW_N75 (15 MHz), CELL_BW_N100 (20 MHz)
TXRXMODE > MIMO Mode: 1T1R (no MIMO), 1T2R (no MIMO), 2T2R
(throughput x2), 2T4R (throughput x2), 4T4R (throughput x4), 8T8R (throughput x8)
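Note that the CELL_BW_Nx values encode the resource-block count directly (N6 = 6 RBs, N100 = 100 RBs), so the parameter name itself yields the bandwidth. A small sketch (parameter strings as listed above; the mapping and helper are ours):

```python
# CELL_BW_Nx encodes x resource blocks; map each value to its channel bandwidth.
BW_PARAM_TO_MHZ = {
    "CELL_BW_N6": 1.4, "CELL_BW_N15": 3, "CELL_BW_N25": 5,
    "CELL_BW_N50": 10, "CELL_BW_N75": 15, "CELL_BW_N100": 20,
}

def rbs_from_param(param: str) -> int:
    """Extract the resource-block count from a CELL_BW_Nx parameter name."""
    return int(param.rsplit("N", 1)[1])

print(rbs_from_param("CELL_BW_N100"), BW_PARAM_TO_MHZ["CELL_BW_N100"])  # -> 100 20
```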