KPI Optimization Test Plan for LTE


Subject: Alcatel-Lucent KPI Optimization Test Plan for LTE

Date: April 22, 2010

Version 1.0

ABSTRACT

This document provides a detailed test plan for KPI optimization of the LTE field technology deployment, as detailed in the Verizon High Level Test Plan. The tests will be executed in Boston, MA and surrounding areas. The primary objectives of the optimization are to validate basic KPI functionality and to evaluate the performance of LTE air-interface functionalities. The scope of the test cases included in this KPI test plan spans several areas, including access, latency, coverage and capacity.

Alcatel-Lucent – Proprietary
This document contains proprietary information of Alcatel-Lucent and is not to be disclosed or used except in accordance with applicable agreements. Copyright © 2006 Alcatel-Lucent Technologies. Unpublished and Not for Publication. All rights reserved.

Revision History

Version  Date      Description
1.0      04/22/10  Initial version of test plan that will be executed for KPI Optimization

Table 1: Revision History

References

[1] LTE KPI Optimization High Level Test Plan, Version 1, April 22, 2010, Verizon Wireless


Glossary of Terms

DLLS    Down-Link Load Simulator
EAT     Enhanced Analysis Tool
ePC     Enhanced Packet Core
GbE     Gigabit Ethernet
LLDM    LGE Logging and Diagnostic Module
LMT     Local Maintenance Tool
MME     Mobility Management Entity
MPLS    Multi Protocol Label Switching
MS      Management Server
NPO     Network Performance Optimization
OAM     Operation, Administration and Management
OLSM    Open Loop Spatial Multiplexing
RAMSES  Role-based Access Management Security System
RRH     Remote Radio Head
SAE     System Architecture Evolution
TLS     Transparent LAN Service
VLAN    Virtual LAN


Table of Contents

Abstract
Revision History
References
1  Introduction
2  KPI Optimization Target
3  Optimization System
   3.1  Deployment System Architecture
   3.2  Air Interface Overview
   3.3  Access Terminals
   3.4  LTE eNodeB Functions
   3.5  ALU 7750 Service Router Functions
   3.6  ALU 7705 Service Aggregation Router Functions
   3.7  LTE MME Functions
   3.8  LTE SAE Functions
   3.9  Data Laptop Configuration
   3.10 Application Servers
   3.11 Test Tools
        3.11.1  PDM Tool
        3.11.2  Enhanced Analysis Tool (EAT)
        3.11.3  Agilent Tool
        3.11.4  Wireshark
        3.11.5  SyncroTest
        3.11.6  Data Analysis Tool (eDAT)
4  Process Overview
   4.1  Deployment Site Locations
   4.2  Site Readiness
        4.2.1  Spectrum Clearance Verification
        4.2.2  Antenna Audit
        4.2.3  Sector Verification
        4.2.4  Baseline Existing System
   4.3  RF Optimization Planning
        4.3.1  Perform RF Parameter Audit
        4.3.2  Validate Initial Neighbor Lists
        4.3.3  Tool Readiness
        4.3.4  Define Clusters
        4.3.5  Drive Route Planning
   4.4  RF Optimization Execution
        4.4.1  Cluster Optimization
        4.4.2  System Verification
5  Test Cases
   5.1  Single User Throughput Tests, Peak
        5.1.1  Single User Downlink Physical Layer Throughput Test, Peak
        5.1.2  Single User Uplink Physical Layer Throughput Test, Peak
   5.2  RLC Throughput Tests, Peak
        5.2.1  Downlink RLC Throughput Peak Test
        5.2.2  Uplink RLC Throughput Peak Test
   5.3  Physical Layer Throughput Tests, Median
        5.3.1  Downlink Physical Layer Throughput Median Test
        5.3.2  Uplink Physical Layer Throughput Median Test
   5.4  RRC Setup Failure Rate
   5.5  Attach Failure Rate
   5.6  Attach Delay
   5.7  Service Request Failure Rate
   5.8  Service Request Delay
   5.9  Dedicated Bearer Activation Failure Rate
   5.10 Dedicated Bearer Activation Delay
   5.11 Dedicated Bearer Drop Rate
   5.12 Context Drop
   5.13 RRC Drop
   5.14 Access RACH Latency
   5.15 DL Physical Throughput 5th %-ile
   5.16 UL Physical Throughput 5th %-ile
   5.17 RLC ARQ/HARQ Retransmission Rate
   5.18 Packet Latency (Round-Trip Delay)
   5.19 S1/X2 Handover Failure Rate
   5.20 S1/X2 Handover Interruption Time, Intra-eNB
   5.21 S1/X2 Handover Interruption Time, Inter-eNB
   5.22 Intra/Inter-MME TAU Failure Rate
   5.23 Paging Performance
   5.24 Attach Delay
   5.25 IRAT Handover Failure Rate
   5.26 RF-SINR
   5.27 RF-RSRP

List of Tables

Table 1: Revision History
Table 2: Deployment Cell Locations
Table 3: SNR of Different Cell Locations

List of Figures

Figure 1: LTE Deployment Network Architecture
Figure 2: EAT Configuration in LTE Trial
Figure 3: SyncroTest Architecture
Figure 4: eDAT LTE Trial Configuration

1 Introduction

2 KPI Optimization Target

Category                    Sub-Category                                  Scope  Target Value
Performance-Accessibility   RRC Setup Failure Rate                        C      0.70%
                            Attach Failure Rate                           L      2.50%
                            Attach Delay                                  L      2 seconds
                            Service Request Failure Rate                  C      2.50%
                            Service Request Delay                         L      0.5 seconds
                            Dedicated Bearer Activation Failure Rate      N      1.50%
                            Dedicated Bearer Activation Delay             N      0.5 seconds
Performance-Retainability   Dedicated Bearer Drop Rate                    N      1.20%
                            Context Drop                                  N      1.20%
                            RRC Drop                                      C      1.20%
Performance-Integrity       Access RACH Latency                           L      0.5 seconds
                            DL/UL Physical Layer Throughput, peak         L      60/20 Mbps
                            DL/UL RLC Throughput, peak                    L      55/18 Mbps
                            DL/UL Physical Layer Throughput, median       C      7/3 Mbps
                            DL/UL Physical Layer Throughput, 5th %-ile    R      1/0.5 Mbps
                            RLC ARQ/HARQ Retransmission Rate              N      1%
                            Packet Latency (round-trip delay)             L      30 msec
Performance-Mobility        S1/X2 Handover Failure Rate                   C      1.20%
                            S1/X2 Handover Interruption Time, intra-eNB   L      100 msec
                            S1/X2 Handover Interruption Time, inter-eNB   L      100 msec
                            Intra/Inter-MME TAU Failure Rate              R      2%
                            Paging Performance                            R      95%
                            IRAT Handover Failure Rate                    R
RF-SINR                     Percent Included Area > 13 dB SINR            R      10%
                            Percent Included Area > -5 dB SINR            R      90%
RF-RSRP                     Percent Included Area < 143 dB RL OPL         R      90%
                            (referenced to full-power signal)
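The pass/fail logic implied by this table can be captured in a short script. This is an illustrative sketch, not a project tool: the KPI key names are invented here, and the assumption that failure rates and delays are upper bounds while throughputs are lower bounds is a reading of the table, not stated in it.

```python
# Illustrative sketch: check measured KPIs against the optimization targets
# from the table above. "max" targets are upper bounds (failure rates,
# delays); "min" targets are lower bounds (throughputs). The direction
# flags are an assumption for illustration.

KPI_TARGETS = {
    "rrc_setup_failure_rate_pct":  (0.70, "max"),
    "attach_failure_rate_pct":     (2.50, "max"),
    "attach_delay_s":              (2.0,  "max"),
    "dl_phy_throughput_peak_mbps": (60.0, "min"),
    "packet_latency_rtt_ms":       (30.0, "max"),
}

def kpi_pass(name, measured):
    """Return True if the measured value meets its optimization target."""
    target, direction = KPI_TARGETS[name]
    return measured <= target if direction == "max" else measured >= target
```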

3 Optimization System

3.1 Deployment System Architecture

This section provides a high-level description of the LTE system architecture and a description of all involved entities and the interfaces between them. As shown in Figure 1, the LTE network architecture is composed of:

- Multiple eNodeBs
- An ePC encompassing two functions: MME and SAE Gateway
- Metro Ethernet backhaul
- Application servers

As shown in the figure, the cluster of eNodeBs will be connected via the 7705 and 7750 routers to the other network elements such as the ePC and application servers. Each eNodeB will consist of a D2U and three TRDUs. Each D2U will consist of a uCCM (controller with interface to the backhaul) and three eCEMs (modems; one for each sector). Each eCEM will be connected via fiber optic cable to a TRDU.


Figure 1: LTE Deployment Network Architecture


The information below details the hardware description and the associated OAM equipment list for the trial network:

- eNodeB
  - D2U V2 (1 uCCM + 3 eCEM)
  - TRDU (remote radio heads comprising amplifiers and filters), 40 W Tx power
- Backhaul
  - 7750 SR (service router)
  - 7705 SAR (service aggregation router)
- Transport services
  - Cisco TLS EVC
- ePC
  - PDN/MME/SAE GW – IPD ATCA
- Security
  - Cisco PIX
- LAN switching
  - Module from 7750
- Remote support
  - RAMSES

In order to manage the eNodeBs, a complete OAM system has to be designed to host the following functions:

- Configuration management
- Fault management
- Performance management
- Traces

Additional OAM systems include:

- LMT to configure and set up the D2U platform to commission IP addresses, DHCP server, and default gateway (connects locally to the console port)
- LMT to configure the ePC complex (MME, S-GW and PDN GW) (connects locally to the console port)
- Management Server (MS) to configure the eNB and display eNB status and fault information
- Network Performance Optimization (NPO) to collect performance counters and measurements; the MS and the NPO are together referred to as the LTE Management System Server
- MS and NPO clients to interface to the MS and NPO servers
- Netscreen firewall to protect the LTE network elements from intrusion
- Netscreen Gate firewall to filter access to the RAMSES Mediation system
- RAMSES Remote Access and RAMSES Mediation PC to provide access control and authentication for remote access to the LTE network elements
- 5620 S/W product managing the monitoring aspects on the 7750


3.2 Air Interface Overview

The main inputs to set up the LTE air interface during the optimization are:

- 10 MHz spectrum bandwidth in the Upper Band C (700 – 770 MHz)
- Number of frequency carriers: 1
- 3 sectors per eNB site
- 3 TRDUs per eNB site
- SFBC/MIMO in DL and SIMO in UL
- Cross-pole and vertical-pole antennas

3.3 Access Terminals

LGE G Series UEs will be used during the optimization.

3.4 LTE eNodeB Functions

The eNodeB hosts the following functions:

- Radio Resource Management: Radio Bearer Control, Radio Admission Control, Connection Mobility Control, and dynamic allocation of resources to UEs in both uplink and downlink (scheduling)
- Routing of user plane data towards the SAE gateway
- Scheduling and transmission of paging messages (originated from the MME)
- Scheduling and transmission of broadcast information (originated from the MME or OAM)
- Measurement and measurement reporting configuration for mobility and scheduling functions

3.5 ALU 7750 Service Router Functions

The SR 7750 is an edge router that will host the following functions:

- Link aggregation
- DSCP mapping
- VLAN and LAN switching
- IP routing to reach the different application servers

3.6 ALU 7705 Service Aggregation Router Functions

The ALU 7705 is a Service Aggregation Router (SAR) that offers:

- A service-oriented capability to the RAN
- An IP/MPLS RAN transport solution

3.7 LTE MME Functions

The MME hosts the following functions:

- Idle mode mobility
  - S1 connection establishment
  - Idle to active mode transition
  - Active to idle mode transition
- Session management
  - QoS control
- S1 handling during handover

3.8 LTE SAE Functions

The SAE Gateway hosts the following functions:

- Multiple bearer support (one default and one dedicated)
- S1 GTP-U bearer endpoint
- Idle mode handling: bearer suspension with paging request
- S1 path switch during handover

3.9 Data Laptop Configuration

The access terminal will interface with a data laptop to support ping, FTP, UDP, and HTTP data transfers. The laptop should be configured per the recommended parameters to optimize performance and provide an appropriate comparison to existing data. These recommendations include:

- Windows 2000 Professional or Windows XP edition
- IP header compression (VJ compression) turned off
- PPP software compression off, except for HTTP data transfer
- 128 Kbyte TCP window size
- MTU of 1500 bytes
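The TCP window recommendation can be related to the bandwidth-delay product of the link under test. The sketch below is illustrative arithmetic, not part of the test plan; the helper names are invented here.

```python
def bdp_bytes(throughput_mbps, rtt_ms):
    """Bandwidth-delay product: bytes that must be in flight to sustain
    the given rate over the given round-trip time."""
    return throughput_mbps * 1e6 / 8 * (rtt_ms / 1e3)

def max_rate_mbps(window_bytes, rtt_ms):
    """Highest TCP rate a fixed receive window can sustain at this RTT."""
    return window_bytes * 8 / (rtt_ms / 1e3) / 1e6

# For example, sustaining a 60 Mbps downlink at a 30 ms round-trip delay
# requires bdp_bytes(60, 30) = 225000 bytes in flight.
```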

3.10 Application Servers

Application servers will be used during the trial to provide the data content for the various tests. These servers will be provided by Alcatel-Lucent, and they need to be easily accessible and not blocked or restricted by low-bandwidth pipes. Windows 2000 Server Edition will be used as the data server, which should reside as close to the PDN gateway as possible; this eliminates performance uncertainty due to external network delay. Data applications available through this server will be:

- UDP
- DOS FTP – TCP/IP
- Ping

3.11 Test Tools

In order to validate functionality and quantify performance, a variety of test tools will be used.

Traffic generation tools:

- DOS-FTP, WINDS, Ping – to generate TCP/IP and UDP based traffic for measuring data capacity and network latency
- Ping scripts for access tests

Logging tools:

- Agilent: a diagnostic monitor for logging and analyzing over-the-air network system performance and parameters
- Enhanced Analysis Tool (EAT): collects internal traces generated by the eNB

Analysis tools:

- Agilent protocol analyzer
- KPI collector and generator
- Packet Data Monitoring (PDM) tool: a distributed data performance, analysis and troubleshooting service

3.11.1 PDM Tool

PDM is a distributed data performance analysis and troubleshooting service for packet data, providing end-to-end and per-link data quality analysis. It offers the following capabilities:

- Vendor-independent packet data network monitoring
- Consistent and automated testing and analysis of packet networks
- Characterization of end-user perceived performance in terms of throughput, latency, dropped packets, etc.
- Ability to monitor and test the packet network to support time-sensitive applications, such as VoIP
- Identification of links/components requiring maintenance and optimization, and monitoring of link and end-to-end performance
- Precise data correlation across the entire network
- Reduced resource requirements through an automated, remotely controlled sniffer system

3.11.2 Enhanced Analysis Tool (EAT)

The EAT stores internal traces of up to 15 eNodeBs. EAT runs on a Linux PC connected to one or several eNodeBs via Ethernet (Figure 2). It connects to each eNodeB and configures the trace service by providing a destination IP address (i.e., its own IP address) and the traces to activate. EAT does not know the exact moment the traces start, so it has to listen on a socket server. Traces are received via UDP.
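The bind-first behavior described above can be sketched with a plain UDP socket: the receiver opens its socket in advance, so trace datagrams that arrive whenever the eNB starts sending are buffered by the kernel until they are read. The port, payload handling, and function names are hypothetical, not EAT's actual implementation.

```python
import socket

def open_trace_socket(port=0):
    """Bind the UDP trace socket up front (port 0 lets the OS pick one),
    so datagrams are buffered even before we start reading."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    sock.settimeout(1.0)
    return sock

def collect_traces(sock, max_packets):
    """Drain up to max_packets trace datagrams from the socket; stop
    early if the sender goes quiet for longer than the socket timeout."""
    packets = []
    try:
        while len(packets) < max_packets:
            data, _addr = sock.recvfrom(65535)
            packets.append(data)
    except socket.timeout:
        pass  # eNB stopped sending (or never started)
    return packets
```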


Engineering rules for EAT (from Figure 2):

- To get EAT traces at an eNB site, a local access point to the OAM VLAN is required, either from an existing connection point or from an external hub/switch brought to the eNB site.
- Traces are transported inside UDP payloads – trace transport is UDP/IP/Eth (standard Ethernet frames).
- Trace content officially released (5.3 Mbps) by the System Design team.
- Traces are forwarded inside the OAM VLAN and routed via the LAN switch to the EAT server (destination IP address used).
- Internal eNB traces are activated by default; export to the outside (up to the EAT server) is limited (TBC).
- Traces come from the standard Ethernet port of the eNB; the debug port is not used.

Figure 2: EAT Configuration in LTE Trial

3.11.3 Agilent Tool

The Agilent DNA is a protocol analyzer that can be used for user-plane analysis. It offers a scalable, distributed probing architecture, with raw data collected at the S1 level. Owing to its true client-server architecture, each user client is able to test independently. To enable detailed protocol analysis, Hardware Intelligent Packet Processing (HI-PI2) at line rate will be required. At the user plane, the analysis will also require separating the signalling and payload packets in order to correlate and analyze events at both levels.

3.11.4 Wireshark

Wireshark is a network protocol analyzer. It offers the capability to capture network data elements and provide metrics on the captured data.

3.11.5 SyncroTest

SyncroTest is an automated test tool that controls test mobiles (called "probes") remotely using a central "Master Controller" console (see Figure 3: SyncroTest Architecture below). SyncroTest probes control the functionality of LLDM/Agilent, WINDS and FTP to generate traffic and monitor the connections from the probe's point of view. The log data is then sent back to the master controller for analysis. Probes have also been developed to support remote control of EAT so that testing and data collection can be synchronized. A single SyncroTest master controller can control at least 8 Agilent/LLDM/WINDS probes and an EAT probe simultaneously, eliminating the need for an RF engineer in each drive test vehicle.


SyncroTest uses self-healing TCP connections with each probe to direct the probe, and it receives periodic heartbeats from each probe to update the probe's status.

Figure 3: SyncroTest Architecture
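The heartbeat supervision described above can be sketched as a simple staleness check that a master controller might run per probe. The interval and tolerance constants are assumptions for illustration, not SyncroTest's actual values.

```python
import time

HEARTBEAT_INTERVAL_S = 5          # assumed probe heartbeat period
MISSED_BEATS_BEFORE_STALE = 3     # assumed tolerance before reconnecting

def probe_is_stale(last_heartbeat, now=None):
    """Flag a probe whose heartbeats have stopped, so the master
    controller can tear down and re-establish ('self-heal') its
    TCP connection to that probe."""
    now = time.time() if now is None else now
    return (now - last_heartbeat) > HEARTBEAT_INTERVAL_S * MISSED_BEATS_BEFORE_STALE
```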

3.11.6 Data Analysis Tool (eDAT)

eDAT is a post-processing tool that allows a user to analyze RF performance KPIs. It takes input from both UE logs and eNodeB logs, and provides a standardized approach to analyzing metrics in the Radio Access Network. eDAT uses the UE logs generated by Agilent/LLDM as input from the UE perspective. The eNodeB logs are generated by EAT and provide an additional input for analyzing KPIs from the eNodeB perspective. Figure 4 illustrates the eDAT configuration for the LTE deployment.


Figure 4: eDAT LTE Trial Configuration

Once the data has been collected, eDAT post-processes the UE and eNodeB log files to generate the KPI analyses. eDAT is a standalone tool that does not need to be connected to the fixed infrastructure. The output includes maps, graphs, plots, reports and message decoding, all of which can be used to evaluate the RF performance of the LTE trial network. eDAT uses event timestamps and UE locations to plot the data geographically.
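As a toy illustration of this post-processing step, the function below derives an average throughput KPI from timestamped log samples. The (seconds, bytes-received-since-last-sample) log format is hypothetical, standing in for real Agilent/LLDM or EAT output.

```python
def physical_throughput_mbps(log_entries):
    """Average throughput in Mbps from a list of (timestamp_s, bytes)
    samples, where each entry reports bytes received since the previous
    sample. The first entry only anchors the start time."""
    if len(log_entries) < 2:
        return 0.0
    duration_s = log_entries[-1][0] - log_entries[0][0]
    total_bytes = sum(b for _, b in log_entries[1:])
    return total_bytes * 8 / duration_s / 1e6
```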


4 Process Overview

4.1 Deployment Site Locations

Maps

4.2 Site Readiness

Site Readiness consists of health checks that ensure all cells are operating as required. These procedures are usually performed after deploying a new network or when introducing new cell sites required for a professional service. Once these health checks have been performed and satisfactory performance of all cells can be guaranteed, they are no longer a prerequisite for the RF Optimization.

4.2.1 Spectrum Clearance Verification

The spectrum clearance assures that no external interference is present and that sufficient guard bands are obeyed. Detecting interference can be a very time-consuming and difficult task once the LTE system is up and running, so it is desirable to have a very high degree of confidence that the spectrum is cleared prior to any testing.

4.2.2 Antenna Audit

This phase involves a series of quality checks to ensure proper installation of the antenna system. The number of audited cell sites will depend on the customer contract; the recommended minimum is 25% of the cell sites in a cluster. The selection of cell sites must be done with input from the customer. If more than 50% of the audited antennas uncover installation errors, the remaining antennas in the cluster must also be audited. Based on the results and the confidence level of the antenna installations, the percentage of cell sites to be audited may vary for successive clusters. The audit process consists of various inspections of antenna height, antenna azimuth, antenna type, antenna mechanical down-tilt, cable length, etc.

4.2.3 Sector Verification

The sector tests include verification of basic call processing functions, including origination, termination and handover tests. Measurements are made of LTE signal levels to verify that each sector is transmitting with the appropriate power levels and Cell ID. These basic function tests are intended to detect hardware, software, configuration and parameter errors for each cell site in the cluster prior to further drive testing. Sector drives should be executed for each sector in the system or according to contractual obligations. Due to the simple nature of the drives, sector drives do not require customer approval.

4.2.4 Baseline Existing System

The objective of the Baseline Existing System is to collect the RF performance metrics of the existing LTE system equipment. Baseline driving should be performed prior to any RF Optimization activity and consists of measuring the Key Performance Indicators.
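The antenna audit rules described earlier (a 25% initial sample, with a full audit of the remaining antennas if more than 50% of those audited show installation errors) can be expressed as two small helpers. The function names are illustrative, not tooling from the plan.

```python
import math

def initial_audit_count(sites_in_cluster):
    """Recommended minimum number of cell sites to audit: 25% of the
    cluster, rounded up to a whole site."""
    return math.ceil(0.25 * sites_in_cluster)

def must_audit_remaining(audited, with_errors):
    """True if more than 50% of the audited antennas uncovered
    installation errors, triggering an audit of the rest of the cluster."""
    return with_errors / audited > 0.5
```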


Drive routes and Key Performance Indicators will be the same as the ones used later for System Verification. It is important to keep the drive routes and KPIs identical for performance validation and comparison purposes. Drive routes and KPIs must be agreed upon with the customer.

4.3 RF Optimization Planning

The optimization planning phase ensures system and tool readiness for RF Optimization before beginning the actual drive testing.

4.3.1 Perform RF Parameter Audit

RF parameters must be inspected for consistency with the LTE parameter catalogue. The RF parameter settings used in the network can be obtained from the NDP project database. These settings are then audited against the LTE parameter catalogue in WPS.

4.3.2 Validate Initial Neighbor Lists

An important step within the RF Optimization preparation phase is neighbor list verification. The complete neighbor lists in the LTE network are required in order to compare the neighbor relations with network design plots. Neighbor relations need to be verified for recent updates, validity and appropriateness. The recommended strategy is to have a minimum number of neighbor relations in the neighbor lists. The neighbor lists used in the network can also be obtained from the WPS project database.

4.3.3 Tool Readiness

Appropriate drive test tools and post-processing tools need to be prepared for optimization.

4.3.4 Define Clusters

Approximately 15-19 eNodeBs should be combined into one cluster. The actual number used is based on the network expansion as well as on the topographical environment. The clusters are selected to provide a center eNodeB with two rings of surrounding eNodeBs.

4.3.5 Drive Route Planning

Drive routes need to be defined for Sector Verification, Cluster Optimization and System Verification. Coverage prediction plots, morphology and clusters can define all drive test routes. For sector verification, the drive route should maintain a distance equal to one half of the cell site radius. The routes for Cluster Optimization shall consist of major roads, highways and hotspots. The total time to drive all routes in a typical cluster should be approximately 6 to 8 hours. An additional border route is chosen so that it crosses the cluster borders without going into the cluster areas. The System Verification drive routes are used to collect the metrics for the Exit Criteria; these routes are a combination of the individual clusters.


1.15 RF Optimization Execution The RF Optimization Execution consists of drive tests, problem area identification, verification drives, and final drives to ensure completion of Exit Criteria. The core activity is to provide system tuning, as well as data collection and reporting. LTE network optimization would be performed under loaded network conditions. 1.15.1 Cluster Optimization The Cluster Optimization consists of three phrases:  Unloaded Cluster Optimization  Loaded Cluster Optimization  Cluster Performance Verification During the first Cluster Optimization phase, a measurement drive is performed under unloaded network conditions using the optimization route. Once the data from the first phase are collected, problem spots are identified and optimized. The unloaded drive test identifies coverage holes, handover regions and multiple pilot coverage areas. It also spots eventual overshooting sites (as interferences is minimal) from areas belonging to neighbor clusters. The first pass might lead to correction of neighbor lists and adjustments of the fundamental RF parameters such as transmit powers and/or antenna azimuths and antenna tilts. The drive test information highlights fundamental flaws in the RF design under best-case conditions The second Cluster Optimization phase is performed under loaded conditions. The drive routes for the loaded Cluster Optimization will be exactly the same routes as those used for the unloaded measurements drives. Loading the cell will cause an increase of negative SNR valuses, identify potential coverage holes, result in higher BLER, result in lower mobility throughput, and more dropped calls. The objective is to fix the problems observed by the field teams. This involves the fine-tuning of RF parameters such as the transmit power or handover parameters. Antenna re-adjustments (e.g. down-tilts, azimuths, patterns/types or heights) are also occasional performed. 
The cluster performance is measured against the cluster Exit Criteria. The purpose of the exit drive is to verify and confirm the specific Exit Criteria required by the customer. The final statistics from the cluster exit drive, containing plots as well as data in tabular form, are presented to the customer for approval.

1.15.2 System Verification

System Verification is the final phase of the RF Drive Test Based Optimization activity and focuses specifically on collecting overall performance statistics. It is performed under loaded conditions with all cells activated. System Verification involves the fusion of the previously optimized clusters and once again is required to demonstrate that the Exit Criteria are met system-wide. The final statistics from System Verification are presented to the customer for approval.
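The Exit Criteria check described above can be sketched as a simple pass/fail computation over drive-test samples. This is an illustrative sketch only; the threshold, target and metric names are assumptions, not values taken from this plan.

```python
# Hypothetical exit-criteria check over drive-test samples.
# rsrp_min_dbm and coverage_target are illustrative assumptions.

def exit_criteria_pass(rsrp_samples_dbm, rsrp_min_dbm=-105.0, coverage_target=0.95):
    """Return (passed, coverage): coverage is the fraction of samples
    whose RSRP meets the minimum threshold."""
    if not rsrp_samples_dbm:
        return False, 0.0
    covered = sum(1 for rsrp in rsrp_samples_dbm if rsrp >= rsrp_min_dbm)
    coverage = covered / len(rsrp_samples_dbm)
    return coverage >= coverage_target, coverage
```

In practice the same aggregation would be run per cluster for the exit drive and system-wide for System Verification.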


1.16 Test Cases

The default test mode in the DL will be Open Loop Spatial Multiplexing (OLSM), and for the UL it will be SIMO. The DLLS (down-link load simulator) will be used to generate interference on the DL of the neighboring cells for loading purposes.
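The DLLS loading used throughout the test cases maps a cell-loading percentage onto a number of occupied downlink Resource Blocks. A minimal sketch of that mapping follows; the 50-RB default assumes a 10 MHz LTE carrier, which is an assumption since the plan does not state the channel bandwidth.

```python
import math

# Standard LTE downlink RB counts per channel bandwidth (MHz) -- general
# LTE figures, not values from this plan.
RBS_PER_BANDWIDTH = {1.4: 6, 3: 15, 5: 25, 10: 50, 15: 75, 20: 100}

def rbs_to_occupy(loading_pct, bandwidth_mhz=10):
    """Number of DL RBs the DLLS must occupy for a given cell loading."""
    total = RBS_PER_BANDWIDTH[bandwidth_mhz]
    return math.ceil(total * loading_pct / 100)
```

For example, a 50% loading on an assumed 10 MHz carrier corresponds to occupying 25 of the 50 RBs.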

1.17 Single User Throughput Test Peak

1.17.1 Single User Downlink Airlink Throughput Peak Test

Test Objectives: Validate the performance by conducting single-user stationary and limited mobility tests at pre-selected locations in an embedded sector and on a limited drive route within the same sector, respectively. The tests shall be performed using UDP applications for performance comparison, under 50% cell loading conditions.

Test Description: Tests will be executed in Closed-Loop Spatial Multiplexing (CLSM). For the stationary tests, the test UE will be located at selected locations in an embedded sector corresponding to the appropriate SNR ranges for Near Cell (NC) (see SNR ranges). For the mobility tests, the test UE will be driven according to the predefined drive route. DLLS will be used to load the DL of the cells neighboring the target cell. Loading will be generated by occupying portions of the Resource Blocks (RB); for example, to generate a cell loading of 50%, 50% of the total DL RBs will be occupied.

Test Tool: Agilent. Backhaul Bandwidth: 100 Mbps. SINR: 17 to 21 dB. Loaded Conditions: 50%.
Test Routes: Near Cell, SNR 17 dB – 21 dB
Terminal Speeds: Stationary, Limited Mobility (25-30 km/hr)

Test Set Up:

A drive test van will be used with rooftop mounted antennas.
1. Ensure that a 500MB file is available at the servers for downloads
2. Ensure that Tx/Rx Antenna Correction is below 20%
3. Ensure that the UE is reporting Rank of 2
4. Ensure that the antennas on the test van are cross polarized
5. Ensure that the TCP Window size of the client laptop is set to 512 kbytes


Procedure:
1. Park the test van at a predetermined location on the NC route.
2. Open Agilent and connect the UE to the Agilent tool.
3. Power the UE and ensure that the right port is assigned to it. Check that GPS is working on Agilent and Winds.
4. Open LGE LTE CM and click on “Connect”. (Response: the UE starts the attach process.)
5. Ping the Application Server to make sure the UE has acquired an IP address. (Response: IP address verified.)
6. Open Winds UDP and configure the right adapter. Populate the fields with the right values.
7. Start logging on Agilent and click “Request” on Winds.
8. Log data for 3 mins.
9. Stop the Winds UDP sessions. Stop logging on Agilent.
10. Repeat steps 7-9 for two more runs.
11. Save the UE and Winds files. (Response: UE log files collected.)

Key Metrics:
1. Physical Layer Downlink Airlink Throughput Peak
2. Application Layer Throughput
3. Initial Block Error Rates
4. Residual Block Error Rates
5. Scheduled Transport Format distribution

Expected Result: Peak throughput should be 60 Mbps.
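As a back-of-the-envelope check of the 60 Mbps expectation, the peak physical-layer rate can be estimated from the resource grid. All inputs below are assumptions for illustration (a 10 MHz carrier with 50 RBs, 2x2 spatial multiplexing, 64QAM, and rough overhead and code-rate factors); none of these figures are stated in this plan.

```python
def peak_dl_phy_mbps(n_rb=50, layers=2, bits_per_symbol=6,
                     overhead=0.25, code_rate=0.8):
    """Rough peak DL PHY rate: RBs x 12 subcarriers x 14 symbols/subframe
    x 1000 subframes/s, scaled by layers, modulation order, an assumed
    control/reference-signal overhead and an assumed code rate."""
    symbols_per_sec = n_rb * 12 * 14 * 1000
    raw_bps = symbols_per_sec * bits_per_symbol * layers
    return raw_bps * (1 - overhead) * code_rate / 1e6

# With these assumed inputs the estimate lands at about 60 Mbps,
# consistent with the expected result above.
```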

Expected Test Duration: 0.5 day

1.17.2 Single User Uplink Airlink Throughput Peak Test

Test Objectives: Validate the performance by conducting single-user stationary and limited mobility tests at pre-selected locations in an embedded sector and on a limited drive route within the same sector, respectively. The tests shall be performed using UDP applications for performance comparison, under no loading.

Test Description: Tests will be executed in Closed-Loop Spatial Multiplexing (CLSM). For the stationary tests, the test UE will be located at selected locations in an embedded sector corresponding to the appropriate SNR ranges for Near Cell (NC) (see SNR ranges). For the mobility tests, the test UE will be driven according to the predefined drive route.

Test Tool: Agilent. Backhaul Bandwidth: 100 Mbps.
Test Routes: Near Cell, SNR 17 dB – 21 dB
Terminal Speeds: Stationary, Limited Mobility (25-30 km/hr)

Test Set Up:

A drive test van will be used with rooftop mounted antennas.
1. Ensure that a 500MB file is available at the client laptop for uploads
2. Ensure that the SIR target is set to 18dB at the eNodeB

Procedure:
1. Park the test van at a predetermined location on the NC route.
2. Open Agilent and connect the UE to the Agilent tool.
3. Power the UE and ensure that the right port is assigned to it. Check that GPS is working on Agilent and Winds.
4. Open LGE LTE CM and click on “Connect”. (Response: the UE starts the attach process.)
5. Ping the Application Server to make sure the UE has acquired an IP address. (Response: IP address verified.)
6. Open Winds UDP and configure the right adapter. Populate the fields with the right values.
7. Start logging on Agilent and click “Send” on Winds.
8. Log data for 3 mins.
9. Stop the Winds UDP sessions. Stop logging on Agilent.
10. Repeat steps 7-9 for two more runs.
11. Save the UE and Winds files. (Response: UE log files collected.)


Key Metrics:
1. Physical Layer Uplink Airlink Throughput Peak
2. Application Layer Throughput
3. SIR target
4. Initial Block Error Rates
5. Residual Block Error Rates
6. Scheduled Transport Format distribution

Output: Physical Layer Uplink Airlink Throughput Peak should be recorded.

Expected Result: Peak throughput should be 20 Mbps.

Expected Test Duration: 0.5 day
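The initial and residual block error rates listed among the key metrics can be derived from HARQ counters. The sketch below is illustrative; the counter names are assumptions, not fields from any tool in this plan.

```python
def bler(first_tx, first_tx_nack, max_harq_fail):
    """Compute (initial BLER, residual BLER) from HARQ counters:
    first_tx      -- transport blocks sent (first transmission)
    first_tx_nack -- blocks NACKed on the first transmission
    max_harq_fail -- blocks still failed after the last HARQ retransmission
    """
    if first_tx == 0:
        return 0.0, 0.0
    initial = first_tx_nack / first_tx
    residual = max_harq_fail / first_tx
    return initial, residual

# e.g. 1000 blocks with 100 first-transmission NACKs and 2 unrecovered
# blocks gives an initial BLER of 10% and a residual BLER of 0.2%.
```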

1.17.3 Single User Downlink RLC Throughput Peak Test

Test Objectives: Validate the performance by conducting single-user stationary and limited mobility tests at pre-selected locations in an embedded sector and on a limited drive route within the same sector, respectively. The tests shall be performed using UDP applications for performance comparison, under 50% cell loading conditions.

Test Description: Tests will be executed in Closed-Loop Spatial Multiplexing (CLSM). For the stationary tests, the test UE will be located at selected locations in an embedded sector corresponding to the appropriate SNR ranges for Near Cell (NC) (see SNR ranges). For the mobility tests, the test UE will be driven according to the predefined drive route. DLLS will be used to load the DL of the cells neighboring the target cell. Loading will be generated by occupying portions of the Resource Blocks (RB); for example, to generate a cell loading of 50%, 50% of the total DL RBs will be occupied.

Test Tool: Agilent. Backhaul Bandwidth: 100 Mbps. SINR: 17 to 21 dB. Loaded Conditions: 50%.
Test Routes: Near Cell, SNR 17 dB – 21 dB
Terminal Speeds: Stationary, Limited Mobility (25-30) km/hr


Test Set Up:

A drive test van will be used with rooftop mounted antennas.
1. Ensure that a 500MB file is available at the servers for downloads
2. Ensure that Tx/Rx Antenna Correction is below 20%
3. Ensure that the UE is reporting Rank of 2
4. Ensure that the antennas on the test van are cross polarized
5. Ensure that the TCP Window size of the client laptop is set to 512 kbytes

Procedure:
1. Park the test van at a predetermined location on the NC route.
2. Open Agilent and connect the UE to the Agilent tool.
3. Power the UE and ensure that the right port is assigned to it. Check that GPS is working on Agilent and Winds.
4. Open LGE LTE CM and click on “Connect”. (Response: the UE starts the attach process.)
5. Ping the Application Server to make sure the UE has acquired an IP address. (Response: IP address verified.)
6. Open Winds UDP and configure the right adapter. Populate the fields with the right values.
7. Start logging on Agilent and click “Request” on Winds.
8. Log data for 3 mins.
9. Stop the Winds UDP sessions. Stop logging on Agilent.
10. Repeat steps 7-9 for two more runs.
11. Save the UE and Winds files. (Response: UE log files collected.)

Key Metrics:
1. Physical Layer Downlink RLC Throughput Peak
2. Application Layer Throughput
3. Initial Block Error Rates
4. Residual Block Error Rates
5. Scheduled Transport Format distribution

Output: Peak Downlink RLC layer throughput should be recorded.

Expected Result: Peak throughput should be 55 Mbps.

Expected Test Duration: 0.5 day


1.17.4 Single User Uplink RLC Throughput Peak Test

Test Objectives: Validate the performance by conducting single-user stationary and limited mobility tests at pre-selected locations in an embedded sector and on a limited drive route within the same sector, respectively. The tests shall be performed using UDP applications for performance comparison, under no loading.

Test Description: Tests will be executed in Closed-Loop Spatial Multiplexing (CLSM). For the stationary tests, the test UE will be located at selected locations in an embedded sector corresponding to the appropriate SNR ranges for Near Cell (NC) (see SNR ranges). For the mobility tests, the test UE will be driven according to the predefined drive route.

Test Tool: Agilent. Backhaul Bandwidth: 100 Mbps.
Test Routes: Near Cell, SNR 17 dB – 21 dB
Terminal Speeds: Stationary, Limited Mobility (25-30) km/hr

Test Set Up:

A drive test van will be used with rooftop mounted antennas.
1. Ensure that a 500MB file is available at the client laptop for uploads
2. Ensure that the SIR target is set to 18dB at the eNodeB

Procedure:
1. Park the test van at a predetermined location on the NC route.
2. Open Agilent and connect the UE to the Agilent tool.
3. Power the UE and ensure that the right port is assigned to it. Check that GPS is working on Agilent and Winds.
4. Open LGE LTE CM and click on “Connect”. (Response: the UE starts the attach process.)
5. Ping the Application Server to make sure the UE has acquired an IP address. (Response: IP address verified.)
6. Open Winds UDP and configure the right adapter. Populate the fields with the right values.
7. Start logging on Agilent and click “Send” on Winds.
8. Log data for 3 mins.
9. Stop the Winds UDP sessions. Stop logging on Agilent.
10. Repeat steps 7-9 for two more runs.
11. Save the UE and Winds files. (Response: UE log files collected.)

Key Metrics:
1. Physical Layer Uplink RLC Throughput Peak
2. Application Layer Throughput
3. SIR target
4. Initial Block Error Rates
5. Residual Block Error Rates
6. Scheduled Transport Format distribution

Output: Peak Uplink RLC layer throughput should be recorded.

Expected Result: Peak throughput should be 18 Mbps.

Expected Test Duration: 0.5 day


1.17.5 Single User Downlink Physical Layer Throughput Mean

Test Objectives: Evaluate the average Downlink Physical Layer Throughput in a cluster of 15 – 20 eNodeBs.

Test Description: Tests will be executed in Closed-Loop Spatial Multiplexing (CLSM). For the cluster test, the test UE will be driven according to the predefined drive route. The route definition covers an SNR distribution that reflects NC, MC and CE conditions. DLLS will be used to load the DL of the cells neighboring the target cell. Loading will be generated by occupying portions of the Resource Blocks (RB); for example, to generate a cell loading of 50%, 50% of the total DL RBs will be occupied.

Test Tool: Agilent. Backhaul Bandwidth: 50 Mbps.

Test Setup:

A drive test van will be used with rooftop mounted antennas.
1. Ensure that a 500MB file is available at the servers for downloads


Procedure:
1. Park the test van at a predetermined location on the NC route.
2. Open Agilent and connect the UE to the Agilent tool.
3. Power the UE and ensure that the right port is assigned to it. Check that GPS is working on Agilent and Winds.
4. Open LGE LTE CM and click on “Connect”. (Response: the UE starts the attach process.)
5. Ping the Application Server to make sure the UE has acquired an IP address. (Response: IP address verified.)
6. Open Winds UDP and configure the right adapter. Populate the fields with the right values.
7. Start logging on Agilent and click “Send” on Winds.
8. Log data for 3 mins.
9. Stop the Winds UDP sessions. Stop logging on Agilent.
10. Repeat steps 7-9 for two more runs.
11. Save the UE and Winds files. (Response: UE log files collected.)

Key Metrics:
1. Physical Layer Downlink Throughput Mean
2. Application Layer Throughput
3. Initial Block Error Rates
4. Residual Block Error Rates
5. Scheduled Transport Format distribution

Expected Test Duration: 0.5 day


1.18 Best Effort Sector Throughput Tests

1.18.1 Downlink Best Effort Sector Throughput

Test Objectives: Evaluate the sector throughput for multiple UEs at stationary locations and for limited mobility drive tests within the same sector. Tests will be conducted under unloaded and loaded conditions, and for UDP and FTP applications.

Test Description: Three different scenarios will be tested: Open-Loop Spatial Multiplexing (OLSM), SFBC and SIMO.

For the stationary tests, 8 test UEs will be placed at selected locations corresponding to the appropriate SNR ranges for Near Cell (NC), Mid Cell (MC), and Cell Edge (CE) locations. The 8 UEs will be placed in a (2, 4, 2) configuration: 2 UEs at NC, 4 UEs at MC and 2 UEs at CE locations (shown as 242 in the tables below). The 8 UEs will be placed in four different vans (V1-V4 in the tables below) with 2 UEs in each van (see SNR ranges). Four different sets of (2, 4, 2) configurations will be tested (Loc1-4 in the tables below). For the mobility tests, the test UEs will be driven according to a predefined limited mobility (single sector) drive route.

DLLS will be used to load the DL of the cells neighboring the target cell. Loading will be generated by occupying portions of the Resource Blocks (RB); for example, to generate a cell loading of X% (CLX in the tables below), X% of the total DL RBs will be occupied. Three different loading conditions will be used: 0%, 50% and 100%. Three different scenarios of active sectors will be tested: only the target sector active (1S in the tables below), all three sectors of the target cell active (3S), and all cells in the cluster active.

All tests will be conducted using the default scheduler setting.

Test Setup:
1. One or more drive test vans will be used with rooftop mounted antennas

Key Metrics:
1. Physical Layer Throughput
2. Application Layer Throughput (UDP/FTP)
3. Initial Block Error Rates
4. Residual Block Error Rates
5. Scheduled Transport Format distribution


1.18.2 Uplink Best Effort Sector Throughput

Test Objectives: Evaluate the sector throughput for multiple UEs at stationary locations in an embedded sector and for limited mobility drive tests within the same sector. Tests will be conducted under unloaded and loaded conditions, and for UDP and FTP applications.

Test Description: For the stationary tests, the test UEs will be located at selected locations corresponding to the appropriate SNR ranges for Near Cell (NC), Mid Cell (MC), and Cell Edge (CE) (see SNR ranges). For the mobility tests, the test UEs will be driven according to a predefined limited mobility (single sector) drive route.

Loading is generated by placing loading UEs in neighboring cells at pre-selected locations. Loading of 100% will result in an IoT of TBD dB in the target cell, while a loading of 50% will result in an IoT of TBD dB in the target cell. All tests will be conducted with the default scheduler setting.

Test Setup:
1. One or more drive test vans will be used with rooftop mounted antennas
2. Each stationary location consists of a unique set of (NC, MC, CE) locations

Key Metrics:
1. Physical Layer Throughput
2. Application Layer Throughput (UDP/FTP)
3. Initial Block Error Rates
4. Residual Block Error Rates
5. Scheduled Transport Format distribution


1.18.3 Uplink MU-MIMO Sector Throughput

Test Objectives: Evaluate the sector throughput with MU-MIMO at stationary locations and for limited mobility drive tests. Tests will be conducted under unloaded and loaded conditions, and for UDP and FTP applications.

Test Description: For the stationary tests, the test UEs will be located at selected locations corresponding to the appropriate SNR ranges for Near Cell (NC), Mid Cell (MC), and Cell Edge (CE) (see SNR ranges). For the mobility tests, the test UEs will be driven according to a predefined limited mobility (single sector) drive route.

Loading is generated by placing loading UEs in neighboring cells at pre-selected locations. Loading of 100% will result in an IoT of TBD dB in the target cell, while a loading of 50% will result in an IoT of TBD dB in the target cell. All tests will be conducted using the default scheduler setting. The MU-MIMO implementation allows for up to 4 paired UEs (2 pairs).

Test Setup:
1. One or more drive test vans will be used with rooftop mounted antennas
2. Each stationary location consists of a unique set of (NC, MC, CE) locations

Key Metrics:
1. Physical Layer Throughput
2. Application Layer Throughput (UDP/FTP)
3. Initial Block Error Rates
4. Residual Block Error Rates
5. Scheduled Transport Format distribution


1.19 Downlink Scheduler

Test Objectives: Evaluate the scheduler performance for multiple UEs at stationary locations and for limited mobility drive tests within the same sector. Tests will be conducted under loaded conditions with UDP and FTP applications. Three scheduler settings will be tested: proportional-fair (PF), conservative (CO), and aggressive (AG).

Test Description: Settings common to all tests:
- Open-Loop Spatial Multiplexing (OLSM) mode
- 100% loading on DL of neighbor cells

For the stationary tests, 8 test UEs will be placed at selected locations corresponding to the appropriate SNR ranges for Near Cell (NC), Mid Cell (MC), and Cell Edge (CE) locations. The 8 UEs will be placed in four different vans with 2 UEs in each van. The number of active UEs will be increased incrementally to illustrate the scheduling gain. For the stationary cases, each combination of a subset of the 8 UEs will be depicted as a triplet (x, y, z) in the tables below to represent the number of active UEs at each of the three locations. For cases with a mix of stationary and mobility UEs, each combination will be depicted as a quartet (x, y, z, m), where m denotes the number of mobility UEs (see SNR ranges). For the mobility tests, the test UEs will be driven according to a predefined limited mobility (single sector) drive route. DLLS will be used to load the DL of the cells neighboring the target cell. Loading of 100% will be used for the tests.
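For reference, the proportional-fair setting exercised by these tests can be sketched with the textbook PF rule: each TTI the scheduler serves the UE with the highest ratio of instantaneous rate to average throughput, then updates the averages with an exponential filter. This is a generic PF sketch under assumed parameters, not Alcatel-Lucent's scheduler implementation.

```python
def pf_schedule(inst_rates, avg_tput, alpha=0.05):
    """One TTI of a textbook proportional-fair scheduler.
    Picks the UE maximizing r_i / R_i, then EWMA-updates every R_i
    (served UEs count their rate, others count zero). Returns winner index."""
    winner = max(range(len(inst_rates)),
                 key=lambda i: inst_rates[i] / max(avg_tput[i], 1e-9))
    for i in range(len(avg_tput)):
        served = inst_rates[i] if i == winner else 0.0
        avg_tput[i] = (1 - alpha) * avg_tput[i] + alpha * served
    return winner
```

With equal histories the best-channel UE wins, while a starved cell-edge UE with a low average eventually takes priority, which is exactly the NC/MC/CE fairness trade-off these tests measure.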

Test Setup: 1. One or more drive test vans will be used with rooftop mounted antennas

Key Metrics:
1. Physical Layer Throughput
2. Application Layer Throughput
3. Initial Block Error Rates
4. Residual Block Error Rates
5. Scheduled Transport Format distribution


1.20 Uplink Scheduler

Test Objectives: Evaluate the uplink scheduler performance for multiple UEs at stationary locations and for limited mobility drive tests within the same sector. Tests will be conducted under loaded conditions with UDP and FTP applications. Three scheduler settings will be tested: proportional-fair (PF), conservative (CO), and aggressive (AG).

Test Description: For the stationary tests, 8 test UEs will be placed at selected locations corresponding to the appropriate SNR ranges for Near Cell (NC), Mid Cell (MC), and Cell Edge (CE) locations. The 8 UEs will be placed in four different vans with 2 UEs in each van. The number of active UEs will be increased incrementally to illustrate the scheduling gain. For the stationary cases, each combination of a subset of the 8 UEs will be depicted as a triplet (x, y, z) in the tables below to represent the number of active UEs at each of the three locations. For cases with a mix of stationary and mobility UEs, each combination will be depicted as a quartet (x, y, z, m), where m denotes the number of mobility UEs (see SNR ranges). For the mobility tests, the test UEs will be driven according to a predefined limited mobility (single sector) drive route. All tests will be executed with 100% UL loading. The uplink loading will be generated by placing loading UEs in the neighboring cells to generate an IoT corresponding to 100% loading in the target sector.

Test Setup: 1. One or more drive test vans will be used with rooftop mounted antennas

Key Metrics:
1. Physical Layer Throughput
2. Application Layer Throughput
3. Initial Block Error Rates
4. Residual Block Error Rates
5. Scheduled Transport Format distribution


1.21 Latency C-plane

Test Objectives: Assess the control-plane latency associated with call setup events.

Test Description: This test will determine the call setup time. The delay will be measured from the first RACH attempt to the time the UE completes traffic channel setup. This test will be executed with the UE in BE and GBR modes. Tests will also be executed to measure mobile-terminated connection setup time, again with the UE in BE and GBR modes.

5.5.1 C-Plane Latency

Test Case   Priority   Test Case Description   Call Setup      QoS of the Test UE   Number of UEs
5.5.1.1     H          UE_Init_NC_BE           UE Initiated    BE                   1
5.5.1.2     H          UE_Init_NC_QoS          UE Initiated    GBR                  1
5.5.1.3     H          UE_Term_NC_BE           UE Terminated   BE                   1
5.5.1.4     H          UE_Term_NC_QoS          UE Terminated   GBR                  1

Procedure:

BE C-Plane Latency
1. Set the log mask for the DM tool to include the debug messages.
2. Initiate a call from the test UE with the UE in BE mode.
3. Initiate a call from the network side to the UE with the UE in BE mode.
4. Repeat 100 times each.

QoS C-Plane Latency
5. Initiate a call from the test UE with the UE in GBR QoS mode.
6. Initiate a call from the network side to the UE with the UE in GBR QoS mode.
7. Repeat 100 times each.

Key Metrics (per UE):
1. Call Setup Time

1.22 Latency U-plane

Test Objectives: Assess the end-user experienced latency by measuring the round-trip delay from the time a packet is generated at the IP level to the time a response is received.

Test Description: This test will be conducted with a total of 8 UEs placed at different sector locations (NC, MC, CE). Ping tests with 32-byte/1462-byte payloads will be executed on the test UE (BE/GBR QoS modes) while bi-directional IP traffic is run on the other UEs to generate DL and UL loading. The number of loading UEs will be varied in the tests.

Procedure:

U-Plane Latency
1. Execute 32-byte ping tests on the 8 UEs, one at a time, for 30 seconds each.
2. Execute 32-byte ping tests on test UE1 with bi-directional IP traffic running on the other loading UEs (3 UEs, 5 UEs and 7 UEs).
3. Repeat steps 1 and 2 with a ping payload size of 1462 bytes.
4. Repeat steps 1-3 with the UE in GBR mode.
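The ping runs above each yield a series of round-trip times that are reduced to summary figures. A minimal sketch of that reduction follows; the use of a nearest-rank 95th percentile is an assumption, since the plan does not specify which percentile, if any, is reported.

```python
import math
import statistics

def rtt_summary(rtts_ms):
    """Summarize ping round-trip times (ms) as min/avg/max and a
    nearest-rank 95th percentile."""
    rtts = sorted(rtts_ms)
    idx = max(0, math.ceil(0.95 * len(rtts)) - 1)
    return {"min": rtts[0],
            "avg": statistics.mean(rtts),
            "max": rtts[-1],
            "p95": rtts[idx]}
```

The same summary would be produced per UE and per loading condition (3, 5 and 7 loading UEs) so the latency-vs-load trend can be tabulated.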

Key Metrics (per UE):
1. Resource Utilization
2. Transport Format Distribution
3. Latency
4. Physical Layer Throughput
5. Application Layer Throughput (UDP)
6. Initial Block Error Rates
7. Residual Block Error Rates


1.23 Quality of Service

Test Objectives: Assess the QoS performance of an LTE UE with VoIP and HTTP applications in various multi-UE loading scenarios.

Test Description:

VoIP QoS Test: This test will be executed with the test UE in VoIP mode. Tests will be executed under different loading conditions. The loading UEs will be executing BE traffic; their number will be varied from 3 to 7, and they will be placed at a Near Cell location.

HTTP QoS Test: This test will be executed with the test UE in GBR mode running an HTTP application. Tests will be executed under different loading conditions. The loading UEs will be executing BE traffic; their number will be varied from 3 to 7, and they will be placed at a Near Cell location.

Procedure:

GBR VoIP QoS Test
1. Configure the test UE’s MAC Downlink Scheduler with the following settings:
   a. VoIP Flag = True
   b. Initial MCS: 5 (QPSK, Code Rate 0.438)
   c. HARQ Max Number of Transmissions: 1
2. Configure the test UE with the following UL/DL TFT information:
   a. Remote IP address/subnet mask
   b. Port Range for RTP/RTCP: 10000 – 10010
   c. Protocol: UDP
3. Configure the test UE with the following QoS information:
   a. QoS Class Identifier (QCI): 1
   b. UL/DL MBR: (not used)
   c. UL/DL GBR: (not used)
4. Initiate a call from UE1 to IxChariot to simulate voice traffic. Incrementally add UEs with best effort IP transfers until 7 UEs are active.
5. Repeat steps 1 – 4 for the Mid Cell and Cell Edge geometries.

GBR HTTP QoS Test
6. Repeat steps 1 – 3 and activate IxChariot. Initiate an HTTP session at the Near Cell location. Incrementally add UEs with best effort IP transfers until 7 UEs are active. Repeat the tests for both the Mid Cell and Cell Edge locations.

Key Metrics (per UE):
1. Resource Utilization
2. Transport Format Distribution
3. Latency
4. Mean Opinion Score
5. Physical Layer Throughput
6. Application Layer Throughput (UDP)
7. Initial Block Error Rates
8. Residual Block Error Rates


1.24 Coverage Testing

Test Objectives: Validate the coverage with single-UE tests on the pre-selected drive route. Tests will be conducted under interfered and non-interfered conditions for a UDP application.

Test Description: The test UE will be driven according to the pre-selected drive route from Near Cell (NC) to Cell Edge (CE) until the call drops. Uplink (UL) interference is generated by placing loading UEs in neighboring cells at pre-selected locations. DLLS will be used to generate DL interference in the neighboring cells. Interference of 100% will result in an Interference over Thermal (IoT) of TBD dB at the target device (i.e., the cell for UL and the UE for DL), while interference of 50% will result in an IoT of TBD dB at the target device. Both UL and DL physical-layer data rates and Signal to Interference plus Noise Ratio (SINR) will be measured, and signaling will be recorded during the tests.

Procedure:

UL Tests
1. Set the SINR target in neighboring cells to control the power of the loading UEs.
2. Place loading UEs in neighboring cells at pre-selected locations to generate the desired IoT in the target cell.
3. Make a UDP Best Effort (BE) call to the target cell on the test UE; measure both UL and DL physical-layer data rates and SINR, and record the signaling messages.
4. Place the test UE in a van and drive a pre-selected route from NC to CE of the target cell until the call drops.
5. Repeat steps 1 to 4 for each interference condition.

DL Tests
1. Set DLLS in neighboring cells to generate the desired DL interference levels.
2. Make a UDP BE call on the test UE from the target cell; measure both UL and DL physical-layer data rates and SINR, and record the signaling messages.
3. Place the test UE in a van and drive a pre-selected route from NC to CE of the target cell until the call drops.
4. Repeat steps 1 to 3 for each interference condition.
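The IoT figure these tests target is the standard rise over the thermal noise floor, IoT(dB) = 10·log10((I + N) / N) with powers in linear units. A minimal sketch, kept generic since the plan leaves the target IoT values as TBD:

```python
import math

def iot_db(interference_mw, thermal_noise_mw):
    """Interference over Thermal in dB, from linear powers (mW)."""
    return 10 * math.log10((interference_mw + thermal_noise_mw)
                           / thermal_noise_mw)

# Interference equal to the thermal floor gives roughly a 3 dB rise;
# zero interference gives 0 dB.
```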

Key Metrics:
1. Physical Layer Throughput
2. SINR
3. PDCCH error rate


1.25 Handover

Test Objectives: Evaluate handover performance in the following scenarios:
- Intra-Site (different sectors within one eNodeB)
- Inter-Site (different eNodeBs)
- Loaded and unloaded destination eNodeBs

Test Description: The test UE will be driven along two routes:
 Handover Route, comprising intra- and inter-eNodeB handovers between 3-4 sectors. On this route, additional UEs will be stationed in each sector along the drive route. These UEs will load both the downlink and uplink of their respective sectors with BE traffic.
 Cluster Route, comprising intra- and inter-eNodeB handovers across the entire 10-eNodeB cluster. Only DL loading will be generated (via DLLS) on this route.

Non-guaranteed and guaranteed Quality of Service (QoS) tests will be conducted via the Best Effort (BE) and Guaranteed Bit Rate (GBR) QoS classes, respectively. Application performance will be measured quantitatively and subjectively. The quantitative measurements are throughput (physical-layer data rate) and latency, while subjective performance will be based on the user’s perception of the application; for example, the quality of a Voice over IP (VoIP) call can be “clear,” “choppy but audible,” or “not audible.” Both UL and DL SINR (Signal to Interference plus Noise Ratio) will be measured and signaling messages will be recorded during the tests.

Key Metrics:
1. Physical Layer Throughput
2. SINR
3. Latency


1.26 V-Pol vs. Cross-Pol

Test Objectives: Compare the performance of vertically polarized and cross-polarized antenna configurations.

Test Description: Settings common to all tests:
- Open-Loop Spatial Multiplexing (OLSM) mode
- 100% loading on DL of neighbor cells
- Stationary

A single UE will be used for all the tests. Tests will be executed at NC, MC and CE locations with UDP and FTP applications on both DL and UL. For the V-pol (vertically polarized) tests, both the eNodeB and UE antennas will be set to the V-pol configuration. Similarly, for the X-pol (cross-polarized) tests, both the eNodeB and UE antennas will be set to the X-pol configuration. DLLS will be used to load the DL of the cells neighboring the target cell; loading of 100% will be used for all the tests.

Test Setup: 1. One test van will be used with rooftop mounted antennas

Procedure: 1. Set the eNB and UE to V-pol configuration 2. Run DL/UL UDP/FTP tests in each of NC, MC and EC locations 3. Repeat for X-pol configuration setting

Key Metrics:
1. Channel Correlation Statistics at the UE for DL tests
2. Physical Layer Throughput
3. Application Layer Throughput
4. Initial Block Error Rates
5. Residual Block Error Rates
6. Scheduled Transport Format distribution


Appendix A:

Performance Metrics

The following metrics will be collected during the trial execution phase. The list shall include (but is not necessarily limited to):

 Air Interface
o UE Tx power
o RSSI
o SINR
o BLER
o Retransmission statistics (HARQ and RLC)
o Transport Format
o Number of resource blocks (DL/UL)
o Channel rank statistics
o MIMO mode (Tx diversity or Spatial Multiplexing)
o Serving sector
o Location (GPS)
o UE Velocity

 Throughput
o Individual user throughput and aggregated sector throughput
o UDP individual user throughput and aggregated sector throughput
o TCP individual user throughput and aggregated sector throughput
o User statistics (peak rates, average rates, standard deviations)

 Latency
o U-plane latency
o Connection set up times
o Handover interruption time within the same site and across different sites
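The per-user statistics called out under Throughput (peak rates, average rates, standard deviations) reduce to a straightforward aggregation over logged samples. A minimal sketch, assuming throughput samples are already extracted from the drive logs in Mbps:

```python
import statistics

def user_stats(samples_mbps):
    """Per-user throughput summary from logged samples (Mbps):
    peak, average, and population standard deviation."""
    return {"peak": max(samples_mbps),
            "avg": statistics.mean(samples_mbps),
            "std": statistics.pstdev(samples_mbps)}
```

Sector aggregates follow the same pattern, applied to the sum of per-user samples in each logging interval.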

