3g kpi guidelines

Page 1

Appendix U12 UMTS Network KPI

NOTES: This document can be used as an example of defining KPIs in a contract. It is in Ericsson's interest that KPIs are well defined, including:

a. KPI target values (keep as few with targets as possible)
b. Formula and measurement definitions (for drives and stats)
c. Where and how each KPI will be measured (locations, setup, tools, drive, statistics)
d. Conditions – prerequisites, exclusions
e. Statistical considerations

As much of this as practical should be defined in the contract, particularly anything that will limit Ericsson's risks and costs. Some of the material in this document may best be included in other parts of the contract, including the Acceptance Procedure and associated documents. KPI target values have very little meaning until the conditions of measurement are defined as above; committing to a target without defining how, when and where it will be measured leaves Ericsson at great risk. N.B. The KPIs and their targets included in this document may not be the ones recommended by Ericsson. Refer to the KPI Technical Guidelines and the KPI database for current recommendations.


Table of Contents

UMTS NETWORK KPI ..... 1
1 INTRODUCTION ..... 3
  SUMMARY OF LEVEL-1 KPI AND TARGETS ..... 3
2 PERFORMANCE COMMITMENTS AND REQUIREMENTS ..... 6
  MILESTONE DEFINITION ..... 6
  SCOPE ..... 6
  GENERAL REQUIREMENTS ..... 6
  CONTRACTUAL KPI COMMITMENTS ..... 7
3 UTRAN LEVEL-1 KEY PERFORMANCE INDICATORS ..... 7
  CIRCUIT SWITCHED LEVEL-1 KEY PERFORMANCE INDICATORS ..... 7
    3.1.1 Circuit Switched Call Setup ..... 8
    3.1.2 RRC Establishment Causes ..... 9
    3.1.3 CSV Access Failure ..... 9
    3.1.4 CSV Drop ..... 11
    3.1.5 CSV Quality ..... 12
    3.1.6 CSV Cell Capacity ..... 14
    3.1.7 CSV Soft/Softer Handover Overhead ..... 15
    3.1.8 CSV Inter-Radio Access Technology Handover Failure ..... 16
    3.1.9 CSV Call Setup Time ..... 18
    3.1.10 CSD Access Failure ..... 19
    3.1.11 CSD Drop ..... 20
    3.1.12 CSD Quality ..... 21
    3.1.13 CSD Call Setup Time ..... 22
  PACKET SWITCHED LEVEL-1 KEY PERFORMANCE INDICATORS ..... 23
    3.1.14 Packet Switched Call Setup ..... 23
    3.1.15 PSD Access Failure ..... 24
    3.1.16 PSD Drop ..... 25
    3.1.17 PSD Latency ..... 26
    3.1.18 PSD Throughput ..... 26
    3.1.19 PSD Call Setup Time ..... 28
    3.1.20 PSD Inter-Radio Access Technology Handover Failure ..... 29
    3.1.21 PSD IRAT Interruption Time ..... 30
    3.1.22 HSDPA Access Failure ..... 31
    3.1.23 HSDPA Drop ..... 32
    3.1.24 HSDPA Latency ..... 33
    3.1.25 HSDPA Throughput ..... 34
    3.1.26 HSUPA Throughput ..... 36
    3.1.27 HSDPA Data Session Setup Time ..... 37
  SYSTEM AVAILABILITY ..... 37
    3.1.28 Average Cell Availability ..... 37
    3.1.29 Average Cell Non-Maintenance Availability ..... 38
  COMPARISON OF UMTS AND GSM NETWORK PERFORMANCE ..... 39
  IMPACT OF UMTS ON GSM NETWORK PERFORMANCE ..... 39
4 UTRAN LEVEL-2 KEY PERFORMANCE INDICATORS ..... 39

2 of 40


1 Introduction

The purpose of this Appendix U12 is to define the Network Key Performance Indicators (KPI) for UMTS drive tests and UTRAN counter-based measurements, together with performance targets and formulas for each KPI. All capitalized terms, acronyms, and definitions used in this Appendix U12 are listed either in the UMTS Acronyms and Definitions (Appendix U18) or in the Original Agreement.

The UTRAN KPI definition and development effort is performed at two levels, as defined below:

Level-1 KPI: the high-level metrics used for overall service quality measurement and for monitoring the health of Purchaser's UTRAN Network. These metric definitions shall be agreed to by Seller, and Seller's implementation of the Level-1 metrics shall be in compliance with Purchaser's definitions in this Appendix U12.

Level-2 KPI: the detailed engineering-level metrics that shall be used for engineering and dimensioning the Network, and for the investigation and troubleshooting of problem areas in the UMTS Network.

Summary of Level-1 KPI and Targets

NOTE: The selection of KPIs and their target values should use the KPI Technical Guidelines and the KPI Database. The following is an example from a real contract; it should NOT be seen as an Ericsson recommendation and must not be used in any contract in this form.

KPI Parameter | Table Reference | Pre-Launch (Milestone 1) | Launch (Milestone 2) | Post-Launch (Milestone 3)
CSV Access Failure Rate | Table 3 | ≤ 2.0 % | ≤ 2.0 % | ≤ 2.0 %
CSV Drop Rate | Table 5 | ≤ 2.0 % | ≤ 2.0 % | ≤ 2.0 %
CSV Quality (DL) | Table 8 | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER
CSV Quality (UL) | Table 8 | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER
Cell Capacity | Table 10 | ≥ 40 UE AMR 12.2k (provided it is achieved in RF design) | ≥ 40 UE AMR 12.2k (provided it is achieved in RF design) | Measure and report
Soft/Softer Handover Overhead | Table 12 | ≤ 1.6 | ≤ 1.6 | ≤ 1.7
CSV IRAT Failure Rate | Table 14 | N/A | ≤ 5.0 % | ≤ 5.0 %
*Voice Call Setup time (Mobile to PSTN) | Table 15 | 95th percentile ≤ 6 seconds | 95th percentile ≤ 6 seconds | 95th percentile ≤ 6 seconds
*Voice Call Setup time (Mobile to Mobile) | Table 15 | 95th percentile ≤ 9 seconds | 95th percentile ≤ 9 seconds | 95th percentile ≤ 9 seconds
CSD Access Failure Rate | Table 17 | ≤ 2.0 % | ≤ 2.0 % |
CSD Drop Rate | Table 19 | ≤ 2.5 % | ≤ 3.0 % (Counter), ≤ 2.5 % (Drive Test) | ≤ 3.0 % (Counter), ≤ 2.5 % (Drive Test)
CSD Quality (DL) | Table 21 | 95th percentile of samples ≤ 1.0 % BLER | 95th percentile of samples ≤ 1.0 % BLER | 95th percentile of samples ≤ 1.0 % BLER
CSD Quality (UL) | Table 21 | 95th percentile of samples ≤ 1.0 % BLER | 95th percentile of samples ≤ 1.0 % BLER | 95th percentile of samples ≤ 1.0 % BLER
*CSD Call Setup time | Table 22 | N.A | 95th percentile ≤ 9 seconds | N.A
PSD Access Failure Rate | Table 24 | ≤ 2.0 % | ≤ 2.0 % | ≤ 2.0 %
PSD Drop Rate | Table 26 | ≤ 2.5 % | ≤ 2.0 % | ≤ 2.0 %
*PSD Latency (any R99 RAB) | Table 28 | N.A | 95th percentile ≤ 200 ms | N.A
PSD R99 Average DL throughput (kbps) (Unloaded) | Table 31 | ≥ 240 | ≥ 240 | ≥ 150 (Counter), ≥ 240 (Drive Test)
PSD R99 Average DL throughput (kbps) (Loaded) | Table 31 | ≥ 210 | ≥ 210 | ≥ 150 (Counter), ≥ 210 (Drive Test)
PSD R99 Average UL throughput (kbps) (Unloaded) | Table 32 | ≥ 200 | ≥ 200 | ≥ 150 (Counter), ≥ 200 (Drive Test)
PSD R99 Average UL throughput (kbps) (Loaded) | Table 32 | ≥ 180 | ≥ 180 | ≥ 150 (Counter), ≥ 180 (Drive Test)
*PSD Call Setup time | Table 34 | N.A | 95th percentile sessions ≤ 5 seconds | N.A
PSD IRAT Failure Rate | Table 36 | N/A | ≤ 5.0 % | ≤ 5.0 %
PSD IRAT Interruption time | Table 39 | N.A | 95th percentile ≤ 12 seconds | N.A
PSD IRAT User Data Interruption time | Table 39 | N.A | Measure and Report | N.A
HSDPA Access Failure Rate | Table 41 | ≤ 2.0 % | ≤ 2.0 % | ≤ 2.0 %
HSDPA Drop Rate | Table 43 | ≤ 3.0 % | ≤ 2.0 % | ≤ 2.0 %
*HSDPA Latency | Table 45 | N.A | 95th percentile sessions ≤ 100 ms | N.A
Stationary Maximum DL HSDPA Bit Rate (kbps) under no load (UE Category 12) | Table 47 | ≥ 1300 | ≥ 1300 | ≥ 1300
Stationary Maximum DL HSDPA Bit Rate (kbps) with 50% DL loading (UE Category 12) | Table 47 | ≥ 1100 | ≥ 1100 | N.A
HSDPA Average DL Throughput (kbps) for UE Category 12 under no load | Table 48 | ≥ 700 | ≥ 700 | ≥ 700
HSDPA Average DL Throughput (kbps) for UE Category 12 under 50% DL loading | Table 48 | ≥ 600 | ≥ 600 | N.A
HSUPA Stationary Peak throughput (kbps) for UE Category 3 (1.45 Mbps) | Table 50 | ≥ 1100 | ≥ 1100 | Measure and Report
HSUPA Stationary Average throughput (kbps) for UE Category 3 (1.45 Mbps) | Table 50 | ≥ 500 | ≥ 500 | Measure and Report
*HSDPA Session Activation time (at RLC Layer using counter) (at off peak) | Table 51 | N.A | 95th percentile sessions ≤ 5 seconds | N.A
*Average Cell Availability KPI | Table 53 | N.A | N.A | N.A**
*Average Cell Non-Maintenance Availability KPI | Table 55 | N.A | N.A | N.A**
GSM Performance Degradation due to Introduction of UMTS | Page No: 39 | Cell ≤ 10 %, Cluster ≤ 7.5 %, Market ≤ 5 % | Cell ≤ 10 %, Cluster ≤ 7.5 %, Market ≤ 5 % | Cell ≤ 10 %, Cluster ≤ 7.5 %, Market ≤ 5 %

* Seller shall commit to these KPI during the life of the Network as described in Section of this document.
** KPI applicable one (1) year after UMTS Network commercial Launch.

2 Performance Commitments and Requirements

Milestone Definition

Milestone descriptions are provided in the UMTS System Acceptance (Appendix U16). Three milestones are defined in this Appendix U12 as follows:

1. Milestone-1 (MS1): Pre-Launch. Pre-Launch is defined as the completion of Cluster Acceptance testing. The KPI target levels shall be based on drive tests and not OSS statistics.

2. Milestone-2 (MS2): Launch. Launch is defined as the completion of Market Acceptance testing of the initial Network configuration. The KPI target levels shall be based on drive tests and not OSS statistics.

3. Milestone-3 (MS3): Post-Launch. Post-Launch is defined as six (6) months after Launch. The KPI target levels shall be based on OSS statistics at Market Level, or on Drive Test data if there is not sufficient traffic in the UMTS Network, as defined in the UMTS System Acceptance (Appendix U16).

Scope

The scope of this Appendix U12 is to define the applicable UMTS Network KPIs for Systems and Equipment supplied by Seller.

General Requirements

The Performance Management Measurement Entity shall be available at Cell, Node-B, Cluster, RNC, Market, Region and Network level for every KPI unless otherwise stated in this Appendix U12. RNC Measurement Granularity shall be a minimum period of fifteen (15) minutes unless otherwise stated in this Appendix U12.

All tests for KPI purposes shall be performed in a Mobile (vehicle drive) environment unless otherwise specified. The IRAT Hard Handover Feature shall be switched off during KPI verification within the UMTS Service Area at Pre-Launch. At the Launch milestone, the IRAT HHO failure rate at the 3G-2G RF Service boundary shall be verified with one dedicated drive per Market unit.

The tests shall be performed in a multi-supplier environment, in which nodes from different suppliers may be used (e.g. SGSN, GGSN, MSC, MGW, etc.). In the event that a particular test is not passed and Seller claims the problem is with another supplier, Seller shall clearly demonstrate that the element in question is the source of the problem. If the problem is confirmed to be in a third-party supplier's Equipment, then Seller shall re-schedule the individual test cases only after the problem has been resolved by the third-party supplier.

The initial UTRAN parameter set shall be mutually agreed. No UTRAN parameter shall be changed during Cluster/Market Acceptance tests without the prior approval of Purchaser, as defined in the UMTS System Acceptance (Appendix U16). As terminal performance may also have an impact on the measured performance, the brand and model of terminal to be used for the tests shall be mutually agreed to by Seller and Purchaser. Counter-based statistics for Level-1 KPI shall cover all traffic classes and include all causes of failures/successes, to indicate end-to-end performance as viewed by the UTRAN.

Contractual KPI Commitments

The following KPIs and targets shall be valid for the lifetime of the Network, provided that there is a Support Agreement in place for the Seller-provided Systems.

Table 2: KPI Parameter Commitment for the lifetime of the UMTS Network

SN | KPI Parameter | Target
1 | CSV Call Setup Time (Mobile to PSTN) | ≤ 6 seconds
2 | CSV Call Setup Time (Mobile to Mobile) | ≤ 9 seconds
3 | CSD Call Setup Time (Mobile to Mobile) | ≤ 9 seconds
4 | PSD Latency - any R99 RAB | ≤ 200 ms
5 | PSD Call Setup Time | ≤ 5 seconds
6 | HSDPA Latency | ≤ 100 ms
7 | HSDPA Data Session Setup Time | ≤ 5 seconds
8 | Cell Availability | As per Table 53
9 | Cell Availability (Non-Maintenance) | As per Table 55

The UMTS Network is expected to meet or exceed the KPI targets specified in this Appendix U12 within the UMTS Service Area, with or without loading of the UMTS Network. Seller's commitment applies to achieving the Level-1 KPI target values at the milestone applicable to each KPI. KPI counter metrics shall be based on the UTRAN view of Network Performance. Seller shall provide all Level-1 KPI counters used in the formulas defined in this Appendix U12 at Milestone-2, with the exception of the SRB counters for Access Failure. Seller shall provide all Level-1 KPI reports for KPI Acceptance. Purchaser shall define KPI Level-2 requirements as outlined in Section 4.

3 UTRAN Level-1 Key Performance Indicators

This Section describes the UTRAN Level-1 KPIs, or high-level service metrics. The Level-1 KPIs are subdivided into two basic service domains: the Circuit Switched (CS) domain, which includes Voice and Video calls, and the Packet Switched (PS) domain, which includes R99 Packet Data and HSDPA/HSUPA Data Sessions.

Circuit Switched Level-1 Key Performance Indicators

The two (2) types of services in the CS domain are Circuit Switched Voice (CSV) and Circuit Switched Data (CSD).



3.1.1 Circuit Switched Call Setup

The setup of a CS service comprises three basic steps (see Figure 1). First, the UE must access the UTRAN and establish an RRC connection. Once this connection is complete, Non Access Stratum (NAS) messages are exchanged between the UE and the Core Network (CN), e.g. CM Service Request, Authentication, Security, etc. The last step of the call setup is the establishment of a Radio Access Bearer (RAB) between the CN and the UE.

Figure 1: Call Setup Block Diagram

Establish RRC Connection -> Non Access Stratum Messaging -> Radio Access Bearer (RAB) Setup

Figure 2 illustrates the call setup flow for a CS mobile-originated call. If a problem occurs between the RRC Connection Request (step 1) and the RANAP: RAB Assignment Response (step 27), or the Alerting/Connect message (step 29) in the case of a drive test, it is considered an Access Failure. Any abnormal RAB release after the RAB Assignment Response (in the case of the Counter Formula) or the Alerting/Connect message (in the case of the Drive Test Formula) is considered a dropped call.
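The access-failure versus dropped-call distinction above can be expressed as a small classifier. This is an illustrative sketch only: the function and flag names below are assumptions for the example, not counters or tool APIs from this Appendix.

```python
# Minimal sketch of the classification described above. "setup_confirmed"
# stands for RANAP: RAB Assignment Response seen (counter view) or
# Alerting/Connect received (drive-test view); "released_normally" stands
# for a normal RAB release at the end of the call.
def classify_cs_call(setup_confirmed: bool, released_normally: bool) -> str:
    if not setup_confirmed:
        return "access_failure"   # failed between step 1 and step 27/29
    if released_normally:
        return "completed"        # normal RAB disconnect from the MSC
    return "dropped"              # abnormal RAB release after setup
```

A call therefore counts against exactly one KPI: either the access-failure rate or the drop rate, never both.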



Figure 2: Circuit Switched Call Setup Flow Diagram

3.1.2 RRC Establishment Causes

Table 1 provides a list of RRC establishment causes. When the UE attempts to set up an RRC Connection with the UTRAN, it sends an RRC Connection Request message. Embedded within this message is an information element that communicates the establishment cause for the connection request. This establishment cause shall be utilized to differentiate CS and PS call KPIs related to Access Failure. Separate counters for each traffic class, for every RRC Connection Request and RRC Connection Complete (as per 3GPP), shall be made available in the Software release at the time of commercial launch (Milestone 2).

Table 1: RRC Establishment Causes

RRC Establishment Cause | Description
0 | Originating Conversational Call
1 | Originating Streaming Call
2 | Originating Interactive Call
3 | Originating Background Call
4 | Originating Subscribed traffic Call
5 | Terminating Conversational Call
6 | Terminating Streaming Call
7 | Terminating Interactive Call
8 | Terminating Background Call
9 | Emergency Call
10 | Inter-RAT Cell re-selection
11 | Inter-RAT Cell change order
12 | Registration
13 | Detach
14 | Originating High Priority Signaling
15 | Originating Low Priority Signaling
16 | Call re-establishment
17 | Terminating High Priority Signaling
18 | Terminating Low Priority Signaling
19 | Terminating – cause unknown

3.1.3 CSV Access Failure

Access Failure combines RRC Connection failure, NAS setup failure and RAB establishment failure: the access success rate is the product of the three corresponding success rates. RRC Connection Success is counted when the RNC receives an RRC Setup Complete from the UE. NAS setup is considered successful when the signaling messages in the call setup flow are successfully completed by the relevant Network Elements. A RAB is considered successfully established when the RAB Assignment Response is sent by the RNC to the CN.


Equation 1: CSV Access Failure Rate (Counter Formula)

100 * ( 1 - (#RRC Connection Success / #RRC Connection Attempt) * (#SRB Success / #SRB Attempt) * (#RAB Assignment Response (CSV) / #RAB Assignment Request (CSV)) )

The above formula definition shall be met with the following Seller counters:

100 * ( 1 - (pmTotNoRrcConnectReqCsSucc(UtranCell) / pmTotNoRrcConnectReqCs(UtranCell)) * 0.9993 * (pmNoRabEstablishSuccessSpeech(UtranCell) / pmNoRabEstablishAttemptSpeech(UtranCell)) )

Seller shall develop SRB counters by RAN Release P8. Seller and Purchaser agreed to use a 99.93% SRB success rate until SRB counters per CS and PS domain are made available at P8. As an interim assumption, given a drop rate of 2% over an average call length of 90 seconds, the probability of dropping the SRB during the direct transfer phase (approximately 3 seconds) is 3s/90s * 2% = 0.07%. If, during deployment, fault rates higher than 0.1% are observed, Seller shall develop the counters in P7.

Measurement Condition: In the case of multiple RRC connection requests, only the first RRC connection request shall be taken into consideration for the KPI calculation.

Table 2: CSV Access Term Definitions

Key Performance Indicator Term | Definition
#RRC Connection Success (CSV) | The number of successful RRC Connection Completes with any establishment cause that leads to a CSV RAB.
#RRC Connection Attempt (CSV) | The number of RRC Connection Request messages received with any establishment cause that leads to a CSV RAB.
#RAB Assignment Response (CSV) | The number of RANAP: RAB Assignment Response messages sent from the RNC to the MSC for a voice call.
#RAB Assignment Request (CSV) | The number of RANAP: RAB Assignment Request messages sent from the MSC to the RNC to establish a voice call.
#SRB Attempt | The number of RRC Connection Completes that will lead to an Initial Direct Transfer to the CS domain and will also lead to a CSV RAB Assignment Request.
#SRB Success | The number of CSV RAB Assignment Requests.
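As an illustration, Equation 1 with the interim 99.93% SRB success rate can be computed as follows. This is a hedged sketch: the function name is not part of this Appendix, and the sample counter values at the bottom are invented for the example, with the arguments standing in for pmTotNoRrcConnectReqCsSucc/pmTotNoRrcConnectReqCs and pmNoRabEstablishSuccessSpeech/pmNoRabEstablishAttemptSpeech.

```python
# Sketch of Equation 1 (counter formula). The 0.9993 default is the
# agreed interim SRB success rate used until per-domain SRB counters
# become available at P8.
def csv_access_failure_rate(rrc_succ, rrc_att, rab_succ, rab_att,
                            srb_success_rate=0.9993):
    if rrc_att == 0 or rab_att == 0:
        return 0.0  # no attempts in the measurement period
    success = (rrc_succ / rrc_att) * srb_success_rate * (rab_succ / rab_att)
    return 100.0 * (1.0 - success)

# Illustrative values only (not from this Appendix):
rate = csv_access_failure_rate(rrc_succ=9900, rrc_att=10000,
                               rab_succ=9850, rab_att=9900)
```

Note that even with perfect RRC and RAB success, the interim SRB factor alone contributes a floor of 100 * (1 - 0.9993) = 0.07% to the computed failure rate.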

Equation 2: CSV Access Failure Rate (Drive Test Formula)

CSVAccessFailureRate = 100 * ( 1 - Σ(CC_Alerting/Connect) / Σ(RRC_ConnectionRequest) ), with both sums taken over CSV_Calls


Measurement Condition: In the event that there are several consecutive RRC Connection Requests, only the first RRC connection request shall be taken into account for the KPI calculation. In drive test measurement, a call is considered to have failed during access when a connection is attempted and the UE does not receive the Alerting/Connect message. For OSS, the RANAP: RAB Assignment Response message from the RNC to the Core Network is considered a successful call setup.

Table 3: CSV Access Failure Rate KPI

Pre-Launch | Launch | Post-Launch
≤ 2.0 % | ≤ 2.0 % | ≤ 2.0 %

3.1.4 CSV Drop

Circuit Switched Voice Drop measures the Network's inability to maintain a call. CSV Drop is defined as the ratio of abnormal speech disconnects to all speech disconnects (both normal and abnormal). A normal disconnect is initiated by a RAB Disconnect RANAP message from the MSC at the completion of the call. An abnormal RAB disconnect includes Radio Link Failure, UL or DL interference, or other causes, and can be initiated by either the UTRAN or the CN.

Equation 3: CSV Drop Rate (Counter Formula)

100 * ( 1 - #RAB Normal Release (CSV) / (#RAB Normal Release (CSV) + #RAB Abnormal Release (CSV)) )

100 * ( 1 - pmNoNormalRabReleaseSpeech(UtranCell) / (pmNoNormalRabReleaseSpeech(UtranCell) + pmNoSystemRabReleaseSpeech(UtranCell)) )

Counter | Description
pmNoSystemRabReleaseSpeech | Number of system RAB releases (Speech 12.2k and AMR Narrow Band) for the best cell in the Active Set.
pmNoNormalRabReleaseSpeech | Number of normal RAB releases (Speech 12.2k and AMR Narrow Band) for the best cell in the Active Set.

Table 4: CSV Drop Term Definitions

Key Performance Indicator Term | Definition
#RABNormalRelease(CSV) | Number of voice RABs normally released
#RABAbnormalRelease(CSV) | Number of voice RABs abnormally released

Equation 4: CSV Drop Rate (Drive Test Formula)

CSVDropRate = 100 * Σ(CallDropped) / Σ(CallSetupSuccess), with both sums taken over CSV_Calls

Measurement Condition: Any CSV call successfully handed over from the 3G to the 2G network within the CSV Service Area shall be considered a dropped call for Milestone 2.

Table 5: CSV Drop Rate KPI

Pre-Launch | Launch | Post-Launch
≤ 2.0 % | ≤ 2.0 % | ≤ 2.0 %

3.1.5 CSV Quality

Voice quality shall be measured by BLER.

Equation 5: CSV UL Quality (Counter Formula)

100 * ( #Faulty Transport Blocks in Uplink after Selection Combining (Speech) / #Total Transport Blocks in Uplink after Selection Combining (Speech) )

UL BLER is also available as an average over the whole reporting period (15 minutes), described by the following formula using RNC-based counters:

100 * ( Σ_i pmFaultyTransportBlocksAcUl[UeRc = i] / Σ_i pmTransportBlocksAcUl[UeRc = i] )

i (value of UeRc) | Radio Connection Configuration
2 | Conventional CS Speech (12.2/12.2 kbps)
33 | Conventional CS Speech AMR (7.95/7.95 kbps)
34 | Conventional CS Speech AMR (5.9/5.9 kbps)
35 | Conventional CS Speech AMR (4.75/4.75 kbps)

Measurement results are available every 15 minutes.

Measurement Condition: UL quality shall be measured at all three (3) Milestones (Pre-Launch, Launch and Post-Launch). This KPI shall be measured at the 1% BLER operating point set in the UTRAN for voice calls. UL BLER is collected and presented in the MRR-W (Measurement Results Recording WCDMA) feature in OSS-RC. The reporting period can be set between 2 and 64 seconds according to the standard. Counters are presented in a PDF format in MRR-W. In 2 seconds there are 100 blocks. The resolution of the PDF is 0.5%. Note that using very short sampling intervals for BLER measurements will result in low accuracy for each sample.

Table 6: CSV UL Quality Term Definitions

Key Performance Indicator Term | Definition
#Faulty Transport Blocks in Uplink after Selection Combining (Speech) | Number of faulty Uplink DCH transport blocks for speech after selection and combining. Sampling period shall be every ten (10) seconds.
#Total Transport Blocks in Uplink after Selection Combining (Speech) | Total number of Uplink DCH transport blocks for speech after selection and combining. Sampling period shall be every ten (10) seconds.
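The RNC-counter average in Equation 5 sums faulty and total transport blocks over the CS-speech UeRc values before forming the percentage. The sketch below assumes the counter arrays are available as dicts keyed by UeRc value; the dict inputs and function name are illustrative stand-ins for pmFaultyTransportBlocksAcUl and pmTransportBlocksAcUl, not an OSS API.

```python
# UeRc values carrying CS speech, per the table above.
CS_SPEECH_UERC = (2, 33, 34, 35)

def ul_bler_percent(faulty_by_uerc, total_by_uerc):
    """Equation 5, RNC-counter form: aggregate UL BLER (%) over the
    CS-speech radio connection configurations for one reporting period."""
    faulty = sum(faulty_by_uerc.get(i, 0) for i in CS_SPEECH_UERC)
    total = sum(total_by_uerc.get(i, 0) for i in CS_SPEECH_UERC)
    return 100.0 * faulty / total if total else 0.0
```

Summing before dividing weights each configuration by its traffic volume, rather than averaging per-configuration BLER values equally.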

Equation 6: CSV DL Quality (Counter Formula)

100 * ( #Faulty Transport Blocks in Downlink after Combining (Speech) / #Total Transport Blocks in Downlink after Combining (Speech) )

The CPI for RES is found under RAN Performance Management, Radio Environment Statistics: USER DESCRIPTION 106/1553-HSD 101 02/5 Uen B.

Measurement Condition: DL quality shall be measured at all three (3) Milestones (Pre-Launch, Launch and Post-Launch). This KPI shall be measured at the 1% BLER operating point set in the UTRAN for voice calls. BLER shall be available using counters with Cell Level granularity. The KPI for MS3 shall be calculated at Market Level; for MS3, DL BLER shall be measured using RES counters or drive test as required. DL BLER is reported by the mobile according to the 3GPP specification. Measurement reports are collected and presented in the MRR-W feature in OSS-RC. The reporting period can be set between 2 and 64 seconds according to the standard. Counters are presented in a PDF format in MRR-W. In 2 seconds there are 100 blocks. The resolution of the PDF is 0.5%. Note that using very short sampling intervals for BLER measurements will result in low accuracy for each sample.

Table 7: CSV DL Quality Term Definitions

Key Performance Indicator Term | Definition
#Faulty Transport Blocks in Downlink after Combining (Speech) | Number of faulty Downlink DCH transport blocks for speech after combining. Sampling period shall be every two (2) seconds or less.
#Total Transport Blocks in Downlink after Combining (Speech) | Total number of Downlink DCH transport blocks for speech after combining. Sampling period shall be every two (2) seconds or less.

Equation 7: CSV DL Quality (Drive Test Formula)

CSVDLQuality = 100 * Σ(SamplesBelowBLERTarget) / Σ(AllBLERSamples), with both sums taken over CSV_Calls

Measurement Condition: DL quality shall be measured at all three (3) Milestones (Pre-Launch, Launch and Post-Launch). This KPI shall be measured at the 1% BLER operating point set in the UTRAN for voice calls. The downlink BLER value shall be collected from the UE with a reporting period of 2 seconds and aggregated over a period of 10 seconds in the Post Processing Tool for calculation of the KPI. Any fraction of 10 seconds shall also be included in this KPI calculation. The BLER calculation shall be made from the time when the Connect message is received.
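The aggregation rule and Equation 7 together can be sketched as below: 2-second UE BLER reports are averaged into 10-second samples (a trailing partial window is kept, per the condition above), and the KPI is the percentage of samples at or below the BLER target. The function names are illustrative, and the 2% default target is an assumption taken from the Table 8 criterion, not a parameter defined by any tool.

```python
# Average each run of five 2-second BLER reports into one 10-second
# sample; any trailing fraction of a window is also included.
def aggregate_10s(bler_2s_reports):
    return [sum(w) / len(w)
            for w in (bler_2s_reports[i:i + 5]
                      for i in range(0, len(bler_2s_reports), 5))]

# Equation 7: percentage of samples at or below the BLER target.
def csv_dl_quality(samples, bler_target=0.02):
    if not samples:
        return 0.0
    below = sum(1 for s in samples if s <= bler_target)
    return 100.0 * below / len(samples)
```

With the Table 8 criterion, the KPI passes when csv_dl_quality(...) is at least 95.0, i.e. the 95th percentile of samples is at or below 2.0% BLER.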

Table 8: CSV Quality KPI

KPI | Pre-Launch | Launch | Post-Launch
CSV Quality (DL) | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER
CSV Quality (UL) | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER

3.1.6 CSV Cell Capacity

Equation 8: CSV Cell Capacity (Counter Formula)

CSVCellCapacity = CellTrafficPower / (AvgPowerPerRL * AvgRLPerCall)

CellTrafficPower = maxDlPowerCapability - CommonChannelPower

CommonChannelPower ≈ 2.5 * primaryCpichPower

AvgPowerPerRL = Σ_{i=0..74} (6 + 0.5i) * pmDpchCodePowerSf128[i] / Σ_{i=0..74} pmDpchCodePowerSf128[i]

AvgRLPerCall ≈ SoftSofterHandoverOverhead

Note that the above is an approximation, as the Soft/Softer Handover Overhead includes both CS and PS connections.

Measurement Condition: CSV Cell Capacity shall be calculated based on the average DL power allocated to the AMR 12.2 RB for this KPI. The above formula is for the purpose of calculating Cell Capacity; it is not a guideline for RF Planning/Design purposes.

Table 9: Cell Capacity Term Definitions

Key Performance Indicator Term | Definition
CSVCellCapacity | Average number of Voice Calls (AMR 12.2) that can be supported by a cell
CellTrafficPower | Net DL power available in a cell to carry user traffic; equal to the total cell carrier RF power (at the system reference point) minus the common/signaling channel power, as per the RF Planning Guideline/Link Budget
AvgRLPerCall | Average number of Radio Links used to support a Voice Call (AMR 12.2)
AvgPowerPerRL | Average RF power used for a Radio Link to support a Voice Call (AMR 12.2)

Equation 9: Cell Capacity (Drive Test Formula)

CSVCellCapacity = CellTrafficPower / ( AvgPowerPerRL * AvgRLPerCall )

Table 10: Cell Capacity KPI

Pre-Launch | Launch | Post-Launch | Units
≥ 40 (provided it is achieved in RF design) | ≥ 40 (provided it is achieved in RF design) | Measure and report | # of AMR 12.2 UEs

Measurement Condition:

CSV Cell Capacity shall be calculated based on average DL power allocated to AMR 12.2 RB for this KPI. The whole of Cluster/Market drive data shall be used to calculate this KPI.

3.1.7 CSV Soft/Softer Handover Overhead

The Soft/Softer Handover Overhead KPI indicates how many Cells or Sectors, on average, were in the Active Set during the call.

Equation 10: Soft/Softer Handover Overhead (Counter Formula)

SoftSofterHandoverOverhead = ( Σ_{AS=1..M} AS * Duration(AS) ) / ( Σ_{AS=1..M} Duration(AS) )

SoftSofterHandoverOverhead =
( pmSumUesWith1Rls1RlInActSet
+ 2 * (pmSumUesWith1Rls2RlInActSet + pmSumUesWith2Rls2RlInActSet)
+ 3 * (pmSumUesWith1Rls3RlInActSet + pmSumUesWith2Rls3RlInActSet + pmSumUesWith3Rls3RlInActSet)
+ 4 * (pmSumUesWith2Rls4RlInActSet + pmSumUesWith3Rls4RlInActSet + pmSumUesWith4Rls4RlInActSet) )
/ ( pmSumUesWith1Rls1RlInActSet + pmSumUesWith1Rls2RlInActSet + pmSumUesWith1Rls3RlInActSet
+ pmSumUesWith2Rls2RlInActSet + pmSumUesWith2Rls3RlInActSet + pmSumUesWith2Rls4RlInActSet
+ pmSumUesWith3Rls3RlInActSet + pmSumUesWith3Rls4RlInActSet + pmSumUesWith4Rls4RlInActSet )

The counters for this KPI measure Soft/Softer Handover Overhead not only for all CS connections but also for PS connections.

Measurement Condition:

The counters have a sampling rate of 1 minute and the counters should be aggregated over a sufficiently long period for statistical validity. •
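The Equation 10 counter formula is a duration-weighted average of the Active Set size. A minimal sketch, with hypothetical counter values keyed by (number of RLSs, number of RLs in the Active Set):

```python
def sho_overhead(counters):
    """Duration-weighted average Active Set size. `counters` maps
    (n_rls, n_rl_in_active_set) -> a pmSumUesWith<x>Rls<y>RlInActSet value;
    the weight is the Active Set size (second element of the key)."""
    num = sum(as_size * v for (_, as_size), v in counters.items())
    den = sum(counters.values())
    return num / den

# Hypothetical sample sums: most UEs on one radio link, some in 2- and 3-way SHO.
counters = {
    (1, 1): 600,   # pmSumUesWith1Rls1RlInActSet
    (1, 2): 100,   # pmSumUesWith1Rls2RlInActSet
    (2, 2): 100,   # pmSumUesWith2Rls2RlInActSet
    (2, 3): 100,   # pmSumUesWith2Rls3RlInActSet
    (3, 3): 100,   # pmSumUesWith3Rls3RlInActSet
}
overhead = sho_overhead(counters)  # (600 + 2*200 + 3*200) / 1000 = 1.6
```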

Table 11: Soft/Softer Handover Overhead Definition

Key Performance Indicator Term | Definition
M | Maximum Active Set size, as defined in the RNC
AS | Active Set size
Duration(AS) | Duration of the call with an Active Set size of AS (AS = 1 to M)

Equation 11: Soft/Softer Handover Overhead (Drive Test Formula)

SoftSofterHandoverOverhead = ( Σ_{AS=1..M} AS * Duration(AS) ) / ( Σ_{AS=1..M} Duration(AS) )

Table 12: Soft/Softer Handover Overhead KPI

Pre-Launch | Launch | Post-Launch
≤ 1.6 | ≤ 1.6 | ≤ 1.7

3.1.8 CSV Inter-Radio Access Technology Handover Failure

Inter-Radio Access Technology (IRAT) Hard Handover failure rate for voice calls handing over from the UMTS System to the GSM System.

Equation 12: CSV IRAT Failure Rate (Counter Formula)

CSV3G2GHandoverFailureRate = 100 * ( #HandoverFromUtranFailure / #HandoverFromUtranCommand )

Figure 3: IRAT HO Call Flow – CSV Call



pmNoSuccessOutIratHoSpeech(GsmRelation): the trigger is when an IU RELEASE COMMAND is received with cause 'Normal release' or 'Successful relocation', based on the CS RAB state.

pmNoAttOutIratHoSpeech(GsmRelation): this counter is increased when the RNC sends a HANDOVER FROM UTRAN COMMAND.

CSV3G2GHandoverFailureRate = 100 * ( 1 − pmNoSuccessOutIratHoSpeech(GsmRelation) / pmNoAttOutIratHoSpeech(GsmRelation) )

The above formula captures speech 12.2k and AMR narrowband.

Measurement Condition:

In order to verify CSV IRAT HO performance, some prerequisites have to be fulfilled, such as definition of the IRAT strategy and setting of service priorities. Both the 3G and 2G Networks shall have IRAT neighbors defined. The GSM Network shall have available resources without showing congestion. A mutually agreed test UE (with the latest available Software, Firmware and Equipment) shall be used for the IRAT KPI verification. The 2G and 3G Networks shall be properly configured in accordance with Purchaser's Network design, including the definition of the routing tables through the Core Network. PURCHASER shall inform Seller about major changes in the configuration of the GSM Network (frequency re-plan, Cell parameter changes, etc.) that will degrade 3G IRAT performance. Seller may review Purchaser's GSM Network changes/planned activities before, during or after the IRAT KPI verification drive. The CSV IRAT Hard Handover failure rate shall be measured by a specific drive per Market Unit, around the UMTS and GSM service boundary. The CSV IRAT HHO failure KPI does not include HHO Preparation failure; HHO Preparation failure accounts for configuration mismatch and availability of resources in the GSM System.

Table 13: CSV IRAT Term Definition

Key Performance Indicator Term | Definition
CSV3G2GHandoverFailureRate | Hard Handover failure rate when a Voice Call fails to hand over from the UMTS network to the GSM network
#HandoverFromUtranFailure | The number of Handover from UTRAN Failure messages sent from the UE to the RNC
#HandoverFromUtranCommand | The number of Handover from UTRAN Command messages sent by the RNC to the UE

Equation 13: CSV IRAT Failure Rate (Drive Test Formula)

CSV3G2GHandoverFailureRate = 100 * ( Σ_CS_Calls HandoverFromUTRANFailure ) / ( Σ_CS_Calls HandoverFromUTRANCommand )

The trigger for Handover from UTRAN Command is when the Command is sent to the UE. The trigger for Handover from UTRAN Failure is when it is received from the UE.

Measurement Condition:

CSV IRAT HHO failure KPI does not include HHO Preparation failure. Specific drive route shall be identified at the UMTS RF Service Area boundary to verify this KPI. •

Table 14: CSV IRAT Failure Rate KPI

Pre-Launch | Launch | Post-Launch
N/A | ≤ 5.0 % | ≤ 5.0 %

3.1.9 CSV Call Setup Time

Voice call setup time indicates the Network response time to a user request for a voice service. This KPI is applicable only for drive tests.

Equation 14: CSV Call Setup Time (Drive Test Formula)

Voice Call setup time = CC_Alerting(MOC) – RRC Connection Request(MOC) •

Measurement Condition:

For mobile to mobile call, calling UE shall be mobile and called UE shall be stationary. •

Table 15: CSV Call Setup Time KPI

KPI | Pre-Launch | Launch | Post-Launch
Voice Call Setup time (Mobile to Mobile) | 95th percentile ≤ 9 seconds | 95th percentile ≤ 9 seconds | 95th percentile ≤ 9 seconds

Seller agrees to the 95th percentile provided the impact of non-UTRAN Equipment is within the industry-typical value range.

3.1.10 CSD Access Failure

The access failure rate is one minus the product of the RRC Connection, NAS (SRB) setup and RAB Establishment success rates. An RRC Connection is counted as successful when the RNC receives an RRC Setup Complete from the UE. NAS setup is considered successful when the signaling message flow during call setup is successfully completed by the relevant Network elements. A RAB is considered successfully established when the RAB Assignment Response is sent by the RNC to the CN.

Equation 15: CSD Access Failure Rate (Counter Formula)

100 * ( 1 − ( #RRCConnectionSuccess / #RRCConnectionAttempt ) * ( #SRBSuccess / #SRBAttempt ) * ( #RABAssignmentResponse(CSD) / #RABAssignmentRequest(CSD) ) )

The following formula includes both Conventional CS 64/64 kbps and Streaming CS 57.6/57.6 kbps:

100 * ( 1 − ( pmTotNoRrcConnectReqCsSucc(UtranCell) / pmTotNoRrcConnectReqCs(UtranCell) ) * 0.9993 * ( ( pmNoRabEstablishSuccessCs64(UtranCell) + pmNoRabEstablishSuccessCs57(UtranCell) ) / ( pmNoRabEstablishAttemptCs64(UtranCell) + pmNoRabEstablishAttemptCs57(UtranCell) ) ) )

Seller shall develop SRB counters by RAN Release P8. Seller and Purchaser agreed to use a 99.93% SRB success rate until SRB counters per CS and PS domain are made available at P8. This interim value is derived as follows: assuming a drop rate of 2% over an average call length of 90 seconds, the probability of dropping the SRB during the approximately 3-second direct transfer phase is (3 s / 90 s) * 2% ≈ 0.07%. If fault rates higher than 0.1% are observed during deployment, Seller shall develop the counters in P7.
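The product structure of Equation 15 can be sketched as below, with the agreed 99.93% SRB success placeholder. All counter values are hypothetical.

```python
def csd_access_failure_rate(rrc_succ, rrc_att, rab_succ_cs64, rab_succ_cs57,
                            rab_att_cs64, rab_att_cs57, srb_success=0.9993):
    """Failure rate = 100 * (1 - RRC success * SRB success * RAB success)."""
    rrc = rrc_succ / rrc_att
    rab = (rab_succ_cs64 + rab_succ_cs57) / (rab_att_cs64 + rab_att_cs57)
    return 100.0 * (1.0 - rrc * srb_success * rab)

# Hypothetical counters: 99% RRC success and 99% combined CS64/CS57 RAB success.
rate = csd_access_failure_rate(990, 1000, 950, 40, 960, 40)
```

Because the three stages multiply, even individually small failure contributions accumulate toward the 2-3% targets in Table 17.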

Table 16: CSD Access Term Definition

Key Performance Indicator Term | Definition
#RRCConnectionSuccess | The number of successful RRC Connection Setups with Conversational and Streaming Call Establishment Causes (both originating and terminating)
#RRCConnectionAttempt | The number of RRC Connection Request messages received with Conversational and Streaming Call Establishment Causes (both originating and terminating)
#RABAssignmentResponse(CSD) | The number of RANAP: RAB Assignment Response messages sent from the RNC to the MSC for CSD
#RABAssignmentRequest(CSD) | The number of RANAP: RAB Assignment Request messages sent from the MSC to the RNC to establish a CSD call
#SRBAttempt | The number of RRC Connection Completes that will lead to an Initial Direct Transfer to the CS domain and will also lead to a CSD RAB Assignment Request
#SRBSuccess | The number of CSD RAB Assignment Requests

Equation 16: CSD Access Failure Rate (Drive Test Formula)

CSDAccessFailureRate = 100 * ( 1 − ( Σ_CSD_Calls CC_Alerting/Connect ) / ( Σ_CSD_Calls RRCConnectionRequest ) )

Measurement Condition: In case there are several consecutive RRC Connection Requests only the first RRC connection request will be taken into account for the KPI calculation.

Table 17: CSD Access Failure Rate KPI

Pre-Launch | Launch | Post-Launch
≤ 3.0 % | ≤ 2.0 % | ≤ 2.0 %

3.1.11 CSD Drop

Equation 17: CSD Drop (Counter Formula)

100 * ( 1 − #RABNormalRelease(CSD) / ( #RABNormalRelease(CSD) + #RABAbnormalRelease(CSD) ) )

The following formula includes both Conventional CS 64/64 kbps and Streaming CS 57.6/57.6 kbps:

100 * ( 1 − ( pmNoNormalRabReleaseCs64(UtranCell) + pmNoNormalRabReleaseCsStream(UtranCell) ) / ( pmNoNormalRabReleaseCs64(UtranCell) + pmNoSystemRabReleaseCs64(UtranCell) + pmNoNormalRabReleaseCsStream(UtranCell) + pmNoSystemRabReleaseCsStream(UtranCell) ) )
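A sketch of the Equation 17 counter formula over both CSD RAB types (Conversational CS 64 plus Streaming), with hypothetical counter values:

```python
def csd_drop_rate(normal_cs64, system_cs64, normal_stream, system_stream):
    """100 * (1 - normal releases / all releases) over both CSD RAB types."""
    normal = normal_cs64 + normal_stream
    total = normal_cs64 + system_cs64 + normal_stream + system_stream
    return 100.0 * (1.0 - normal / total)

# Hypothetical counters: 22 abnormal (system) releases out of 1000 in total.
rate = csd_drop_rate(normal_cs64=960, system_cs64=20,
                     normal_stream=18, system_stream=2)  # -> 2.2
```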

Table 18: CSD Drop Term Definition

Key Performance Indicator Term | Definition
#RABNormalRelease(CSD) | Number of video RABs (Conversational and Streaming) normally released
#RABAbnormalRelease(CSD) | Number of video RABs (Conversational and Streaming) abnormally released


Equation 18: CSD Drop (Drive Test Formula)

CSD_DropRate = 100 * ( Σ_CSD_Calls CallsDrop ) / ( Σ_CSD_Calls CallsSetupSuccess )

Table 19: CSD Drop Rate KPI

Pre-Launch | Launch | Post-Launch
≤ 2.5 % | ≤ 2.5 % | ≤ 3.0 % (Counter method) / ≤ 2.5 % (Drive Test method)

3.1.12 CSD Quality

This equation is similar to that of CSV; however, it is specific to the CSD RAB.

Equation 19: CSD Quality (UL)

100 * ( #FaultyTransportBlocksAfterSelectionCombiningInUplink(CSD) / #TotalTransportBlocksAfterSelectionCombiningInUplink(CSD) )

100 * ( Σ_i pmFaultyTransportBlocksAcUl[UeRc = i] ) / ( Σ_i pmTransportBlocksAcUl[UeRc = i] )

i (value of UeRc) | Radio Connection Configuration
3 | Conversational CS 64/64 kbps
8 | Streaming CS 57.6 kbps

Measurement Condition:

UL quality shall be measured in all three (3) Milestones (Pre-Launch, Launch and Post-Launch). Measurement results are available every 15 minutes. BLER shall be available using counters with Cell-level granularity. UL BLER is collected and presented via the MRR-W (Measurement Results Recording WCDMA) feature in OSS-RC. The reporting period can be set to between 2 and 64 seconds according to the standard. Counters are presented in a PDF format in MRR-W. In 2 seconds there are 100 blocks. The resolution of the PDF is 0.5%. Note that using very short sampling intervals for BLER measurements will result in low accuracy for each sample.

Equation 20: CSD DL Quality (Counter Formula)

100 * ( #FaultyTransportBlocksInDownlinkAfterCombining(CSD) / #TotalTransportBlocksInDownlinkAfterCombining(CSD) )

DL BLER is reported by the mobile according to the 3GPP specification. Measurement reports are collected and presented via the MRR-W feature in OSS-RC. The reporting period can be set to between 2 and 64 seconds according to the standard. Counters are presented in a PDF format in MRR-W. In 2 seconds there are 100 blocks. The resolution of the PDF is 0.5%. Note that using very short sampling intervals for BLER measurements will result in low accuracy for each sample. The CPI for RES is found under RAN Performance Management, Radio Environment Statistics.


Measurement Condition:

DL quality shall be measured in all three (3) Milestones (Pre-Launch, Launch and Post-Launch), using the optional feature for the UE to report DL BLER measurements. BLER shall be available using counters with Cell-level granularity. KPIs at MS3 shall be calculated at Market level.

Table 20: CSD Quality Terms Definition

Key Performance Indicator Term | Definition
#FaultyTransportBlocksInUplinkAfterSelectionCombining(CSD) | Number of faulty Uplink DCH transport blocks for CSD after selection and combining. Sampling period shall be every ten (10) seconds
#TotalTransportBlocksInUplinkAfterSelectionCombining(CSD) | Total number of Uplink DCH transport blocks for CSD after selection and combining. Sampling period shall be every ten (10) seconds
#FaultyTransportBlocksInDownlinkAfterCombining(CSD) | Number of faulty Downlink DCH transport blocks for CSD after combining. Sampling period shall be every ten (10) seconds
#TotalTransportBlocksInDownlinkAfterCombining(CSD) | Total number of Downlink DCH transport blocks for CSD after combining. Sampling period shall be every ten (10) seconds

Equation 21: CSD Quality DL (Drive Test Formula)

CSDQualityDL = 100 * ( Σ_CSD_Calls SamplesBelowBLERTarget ) / ( Σ_CSD_Calls AllBLERSamples )

Measurement Condition:

DL quality shall be measured in all three (3) Milestones (Pre-Launch, Launch and Post-Launch).

Table 21: CSD Quality KPI

KPI | Pre-Launch | Launch | Post-Launch
CSD Quality (DL) | 95th percentile of samples ≤ 1% BLER | 95th percentile of samples ≤ 1% BLER | 95th percentile of samples ≤ 1% BLER
CSD Quality (UL) | 95th percentile of samples ≤ 1% BLER | 95th percentile of samples ≤ 1% BLER | 95th percentile of samples ≤ 1% BLER

This KPI will be measured at the 0.3% BLER operating point set in UTRAN for CS Video Calls.

3.1.13 CSD Call Setup Time

Circuit Switched Data call setup time indicates the Network response time to a user request for a video service. The test shall be initiated by making a video call from one UMTS UE to another UMTS UE. In case of multiple RRC Connection Requests, the first RRC Connection Request will be taken into account for the KPI calculation.

Equation 22: Mobile to Mobile Video call (Drive Test Formula)

Video Call setup time = CC_Alerting (MOC) – RRC Connection Request (MOC) •

Table 22: CSD Call Setup Time KPI

Parameter | Pre-Launch | Launch
Video Call Setup time | N.A. | 95th percentile ≤ 9 seconds

Seller agrees to the 95th percentile and 9 seconds provided the impact of non-UTRAN Equipment is within the industry-typical value range.

Packet Switched Level-1 Key Performance Indicators

R99 DL RAB allowed: 384 kbps, 128 kbps and 64 kbps only
R99 UL RAB allowed: 384 kbps and 64 kbps only
HSDPA DL HS-DSCH allowed: UE Cat 12 (1800 kbps)
HSUPA UL allowed: UE Cat 3 (1.45 Mbps)

3.1.14 Packet Switched Call Setup

Figure 4 is a flow diagram for a PS call. If a problem occurs between the RRC Connection Request (step 1) and the RAB Assignment Response (step 27), the result is considered an Access Failure. Any abnormal RAB release after the RAB Assignment Response is considered a call Drop.



Figure 4: Packet Switched Setup Flow Diagram

3.1.15 PSD Access Failure

The aim is to measure Packet Switched Data access failure from a user perspective.

Equation 23: PSD Access Failure Rate (Counter Formula)

100 * ( 1 − ( #RRCConnectionComplete / #RRCConnectionRequest ) * ( #SRBSuccess / #SRBAttempt ) * ( #RABAssignmentResponse(PSD) / #RABAssignmentRequest(PSD) ) )

The following formula includes Packet Interactive and Packet Background. Please refer to Section 6 of the CPI document User Description Connection Handling 4/1553-HSD 101 02/5 Uen.

100 * ( 1 − ( pmTotNoRrcConnectReqPsSucc(UtranCell) / pmTotNoRrcConnectReqPs(UtranCell) ) * 0.9993 * R99InteractiveRABEstablishSuccessRate )

where R99InteractiveRABEstablishSuccessRate equals:

( pmNoRabEstablishSuccessPacketInteractive(UtranCell) − pmNoRabEstablishSuccessPacketInteractiveHs(UtranCell) ) / ( pmNoRabEstablishAttemptPacketInteractive(UtranCell) − pmNoRabEstablishAttemptPacketInteractiveHs(UtranCell) − HS2_HardHO_Flow(UtranCell) )

and:

HS2_HardHO_Flow(UtranCell) = pmOutgoingHsHardHoAttempt(UtranCell) − pmNoHsHardHoReturnOldChSource(UtranCell) − pmIncomingHsHardHoAttempt(UtranCell)

Seller shall develop SRB counters by RAN Release P8. Seller and Purchaser agreed to use a 99.93% SRB success rate until SRB counters per CS and PS domain are made available at P8. This interim value is derived as follows: assuming a drop rate of 2% over an average call length of 90 seconds, the probability of dropping the SRB during the approximately 3-second direct transfer phase is (3 s / 90 s) * 2% ≈ 0.07%. If fault rates higher than 0.1% are observed during deployment, Seller shall develop the counters in P7.

Measurement Condition:

Both Interactive and Background traffic are included in the counter based formula above. •
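The Equation 23 counter formula, including the HS2_HardHO_Flow adjustment that removes HSDPA establishments and HS serving-cell changes from the R99 Interactive RAB success rate, can be sketched as below. All counter values are hypothetical, and pmIncomingHsHardHoAttempt is assumed as the full spelling of the counter name abbreviated in the source.

```python
def hs2_hard_ho_flow(out_att, return_old_ch, in_att):
    return out_att - return_old_ch - in_att

def psd_access_failure_rate(c, srb_success=0.9993):
    rrc = c["pmTotNoRrcConnectReqPsSucc"] / c["pmTotNoRrcConnectReqPs"]
    hs_flow = hs2_hard_ho_flow(c["pmOutgoingHsHardHoAttempt"],
                               c["pmNoHsHardHoReturnOldChSource"],
                               c["pmIncomingHsHardHoAttempt"])
    # R99 Interactive RAB success rate, HSDPA establishments excluded.
    r99_rab = ((c["pmNoRabEstablishSuccessPacketInteractive"]
                - c["pmNoRabEstablishSuccessPacketInteractiveHs"])
               / (c["pmNoRabEstablishAttemptPacketInteractive"]
                  - c["pmNoRabEstablishAttemptPacketInteractiveHs"]
                  - hs_flow))
    return 100.0 * (1.0 - rrc * srb_success * r99_rab)

counters = {  # hypothetical values
    "pmTotNoRrcConnectReqPsSucc": 990, "pmTotNoRrcConnectReqPs": 1000,
    "pmNoRabEstablishSuccessPacketInteractive": 900,
    "pmNoRabEstablishSuccessPacketInteractiveHs": 400,
    "pmNoRabEstablishAttemptPacketInteractive": 930,
    "pmNoRabEstablishAttemptPacketInteractiveHs": 410,
    "pmOutgoingHsHardHoAttempt": 20,
    "pmNoHsHardHoReturnOldChSource": 5,
    "pmIncomingHsHardHoAttempt": 5,
}
rate = psd_access_failure_rate(counters)
```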

Table 23: PSD Access Failure Term Definition

Key Performance Indicator Term | Definition
#RRCConnectionComplete | The number of successful RRC Connection Setups with PSD Establishment Causes (both originating and terminating)
#RRCConnectionRequest | The number of RRC Connection Request messages received with PSD Establishment Causes (both originating and terminating)
#RABAssignmentResponse(PSD) | The number of RANAP: RAB Assignment Response messages sent from the RNC to the SGSN for PS service
#RABAssignmentRequest(PSD) | The number of RANAP: RAB Assignment Request messages sent from the SGSN to the RNC to establish PS Service
#SRBAttempt | The number of RRC Connection Completes that will lead to an Initial Direct Transfer to the PS domain and will also lead to a PS (R99) RAB Assignment Request
#SRBSuccess | The number of PS R99 RAB Assignment Requests

Equation 24: PSD Access Failure Rate (Drive Test Formula)

PSDAccessFailureRate = 100 * ( 1 − ( Σ_PSD_Calls ActivatePDPContextAccept ) / ( Σ_PSD_Calls RRCConnectionRequest ) )

Measurement Condition:

Access Failure Rate shall be measured by data session activation (PS) followed by download. In case of multiple RRC connection requests the first RRC connection request will be considered for KPI calculation. •

Table 24: PSD Access Failure Rate KPI

Pre-Launch | Launch | Post-Launch
≤ 2.0 % | ≤ 2.0 % | ≤ 2.0 %

3.1.16 PSD Drop

A packet session is considered dropped when the associated RAB has been released abnormally by either the UTRAN or the CN. Any drop after the RANAP: RAB Assignment Response is considered a PS call drop.

Equation 25: PSD Drop Rate (Counter Formula)

100 * ( #RABDrop(PSD) / #RABSetupComplete(PSD) )

100 * ( pmNoSystemRabReleasePacket / pmNoRabEstablishSuccessPacketInteractive )

The formula above calculates the drop rate for Packet Interactive and Packet Background (including HSDPA).

Measurement Condition:

Traffic classes Background, and Interactive (both originating and terminating) shall be considered in counter-based metrics. •

Table 25: PSD Drop Term Definition

Key Performance Indicator Term | Definition
#RABSetupComplete | The number of completed RAB setup phases for PS Data, counted when the RNC sends a RANAP: RAB Assignment Response to the Core Network after a successful RAB establishment


Equation 26: PSD Drop Rate (Drive Test Formula)

PSDCallDropRate = 100 * ( 1 − ( Σ_ftpSession ftpDownloadSuccess ) / ( Σ_ftpSession CallSetupSuccess ) )

Measurement Condition:

The PSD Drop Rate shall be measured after the start of the FTP download. A PSD call is considered dropped when the FTP session is manually/abnormally disconnected, for any reason, without completing the file transfer.

Table 26: PSD Drop Rate KPI

Pre-Launch | Launch | Post-Launch
≤ 2.5 % | ≤ 2.0 % | ≤ 2.0 %

3.1.17 PSD Latency

The following is extracted from 3GPP Specification 23.107, Section 6.4.3.1, for reference:

Transfer delay (ms). Definition: Indicates the maximum delay for the 95th percentile of the distribution of delay for all delivered SDUs during the lifetime of a bearer service, where delay for an SDU is defined as the time from a request to transfer an SDU at one SAP to its delivery at the other SAP. NOTE 3: Transfer delay of an arbitrary SDU is not meaningful for a bursty source, since the last SDUs of a burst may have long delay due to queuing, whereas the meaningful response delay perceived by the user is the delay of the first SDU of the burst.

Table 27: PSD Latency Term Definition

Key Performance Indicator Term | Definition
PSD Latency | Round-trip time for a 32-byte ping over any R99 PS RAB

Measurement Condition:

Latency will be measured with the destination ping server connected directly to the GGSN (i.e. the server is in the same Intranet domain as the GGSN). RTT will be measured in a stationary position. This test will be done with a single UE on an R99 PS RAB call using the Ping application.
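As an illustration, the latency target could be checked against drive-test ping results by computing the 95th percentile of the RTT samples (nearest-rank method, one of several common percentile definitions) and comparing it to 200 ms. The RTT values below are hypothetical.

```python
def percentile_95(samples):
    """Nearest-rank 95th percentile: the value at rank ceil(0.95 * n)."""
    ordered = sorted(samples)
    rank = max(1, -(-95 * len(ordered) // 100))   # ceil without math.ceil
    return ordered[rank - 1]

# Hypothetical ping RTTs in milliseconds from a stationary R99 PS session.
rtts_ms = [120, 130, 140, 150, 150, 155, 160, 165, 170, 180,
           110, 125, 135, 145, 150, 158, 162, 168, 175, 250]
p95 = percentile_95(rtts_ms)   # 19th of 20 ordered samples -> 180
meets_kpi = p95 <= 200
```

Note that with a 95th-percentile target, a small number of outliers (such as the 250 ms sample above) does not by itself fail the KPI.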

Table 28: PSD Latency KPI

RAB | Latency at Launch
R99 RAB (64 kbps, 128 kbps or 384 kbps) | 95th percentile ≤ 200 ms

Seller agrees to the 95th percentile and the 200 ms provided the impact of non-UTRAN Equipment is within the industry-typical value range.

3.1.18 PSD Throughput

Total number of RLC blocks sent over the observation window, including re-transmissions, per transport type.

Equation 27: PSD Average Throughput (Counter Formula)


UserThroughput = ( Σ_PSD_Call AM_RLC_SDU_Data(kb) ) / ( Σ_PSD_Call AM_RLC_SDU_Duration )

The above formula is applicable for both Downlink and Uplink.

DlUserThroughput = pmSumActDlRlcUserPacketThp(UtranCell) / pmSamplesActDlRlcUserPacketThp(UtranCell)

UlUserThroughput = pmSumActUlRlcUserPacketThp(UtranCell) / pmSamplesActUlRlcUserPacketThp(UtranCell)

Measurement Condition:

This metric shall be measured only for the Packet Switched Traffic Classes (Interactive and Background). The KPI measurement shall be based on RLC (SDU) layer throughput with a five percent (5%) target DL/UL BLER for the R99 RAB.
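The counter quotient in Equation 27 can be sketched as below, assuming (as an interpretation, not a contract definition) that pmSumAct*RlcUserPacketThp accumulates the sampled throughput values and pmSamplesAct*RlcUserPacketThp the number of samples, so that the quotient is the average sampled user throughput in kbps. Counter readings are hypothetical.

```python
def user_throughput(pm_sum_thp, pm_samples_thp):
    """Average RLC user throughput: accumulated sampled throughput / samples."""
    return pm_sum_thp / pm_samples_thp

# Hypothetical counter readings for one cell over an aggregation period.
dl_kbps = user_throughput(pm_sum_thp=480_000, pm_samples_thp=2_000)  # 240.0
ul_kbps = user_throughput(pm_sum_thp=360_000, pm_samples_thp=2_000)  # 180.0
```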

Table 29: PSD Average Throughput Term Definition

Key Performance Indicator Term | Definition
AM_RLC_SDU_Data(kb) | Total AM RLC SDUs (kb) transferred, excluding retransmissions, in the downlink or uplink
DlUserThroughput | Downlink average packet data throughput (kbps)
AM_RLC_SDU_Duration | The total RLC SDU transmission duration in seconds. For DL/UL, this excludes the periods when the DL/UL transmission buffer for the RLC entity is empty

Equation 28: PSD Average Throughput (Drive Test Formula)

PSDAvgThroughput(kbps) = ( Σ_PSD_Call UserDataTransferred(kb) ) / ( Σ_PSD_Call SessionDuration(sec) )

The formula is valid for both uplink and downlink.

Measurement Condition:

The KPI measurement shall be for application layer throughput with a five percent (5%) target DL/UL BLER operating point for the R99 RAB. Throughput shall be calculated using data from the entire Cluster/Market drive test for Milestone 1 and Milestone 2. The download/upload file shall be of a compressed type.

Table 30: PSD Average Throughput Definition

Key Performance Indicator Term | Definition
UserDataTransferred(kb) | FTP download in kilobits during one session
SessionDuration(sec) | Time duration (seconds) to download/upload a file
PSD_Throughput(kbps) | Packet data throughput in kbps using the R99 RAB, measured at the application layer

Table 31: PSD Average DL Throughput KPI

KPI | Pre-Launch | Launch | Post-Launch
Average DL throughput (Unloaded) kbps | ≥ 240 | ≥ 240 | ≥ 150 (Counter) / ≥ 240 (Drive Test)
Average DL Throughput (Loaded) kbps | ≥ 210 | ≥ 210 | ≥ 150 (Counter) / ≥ 210 (Drive Test)

Table 32: PSD Average UL Throughput KPI

KPI | Pre-Launch | Launch | Post-Launch
Average UL throughput (Unloaded) kbps | ≥ 200 | ≥ 200 | ≥ 150 (Counter) / ≥ 200 (Drive Test)
Average UL Throughput (Loaded) kbps | ≥ 180 | ≥ 180 | ≥ 150 (Counter) / ≥ 180 (Drive Test)

Measurement Condition:

The KPI measurement shall be for application layer throughput with a five percent (5%) BLER operating point for both UL and DL. Throughput shall be calculated using data from the entire Cluster/Market drive test for Milestone 1 and Milestone 2, using all available R99 Radio Bearers.

Table 33: PSD Average Throughput Definition

Key Performance Indicator Term | Definition
UserDataTransferred(kb) | Total FTP download size in kilobits in one session
SessionDuration(sec) | Total time duration (seconds) to download a single file: SessionDuration(sec) = (time stamp of Session End or Session Error − time stamp of Session Start)
PSDAvgThroughput | Packet Switched Data average throughput using the R99 RAB, measured at the application layer

3.1.19 PSD Call Setup Time

PSD call setup time indicates the Network response time to a user request for a packet data service. In case of multiple RRC Connection Requests, the first RRC Connection Request will be considered for the KPI calculation.

Equation 29: PSD Call Setup Time (Drive Test Formula)

PS Call setup time = PDP Context Activation accept (MOC) - RRC Connection Request (MOC) •

Measurement Condition:

The UE is already attached to the UTRAN network. •

Table 34: PSD Call Setup Time KPI

KPI | Pre-Launch | At Launch | Post-Launch
PSD Session Activation time | N.A. | 95th percentile of sessions ≤ 5 seconds | N.A.

Seller agrees to the 95th percentile and the 5 seconds provided the impact of non-UTRAN Equipment is within the industry-typical value range.


3.1.20 PSD Inter-Radio Access Technology Handover Failure

Inter-Radio Access Technology (IRAT) handover is a hard handover between UMTS and GSM. For a Packet Switched handover, the RNC sends a Cell Change Order from UTRAN command to the UE. A successful handover can be monitored via the Iu Release message received by the RNC from the CN.

Equation 30: PSD IRAT Failure Rate (Counter Formula)

3G2GPSHandoverFailureRate = 100 * ( #CellChangeOrderFromUTRANFailure / #CellChangeOrderFromUTRANCommand )

pmNoOutIratCcReturnOldCh is increased when the CELL CHANGE ORDER FROM UTRAN FAILURE (RRC) message is received from the UE. pmNoOutIratCcAtt is increased when the CELL CHANGE ORDER FROM UTRAN (RRC) message has been sent to the UE.

3G2GPSHandoverFailureRate = 100 * ( pmNoOutIratCcReturnOldCh(GsmRelation) / pmNoOutIratCcAtt(GsmRelation) )

Measurement Condition:

The PSD IRAT HHO failure KPI does not include HHO Preparation failure. In order to verify PSD IRAT HO performance, some prerequisites have to be fulfilled, such as definition of the IRAT strategy and setting of service priorities. Both the 3G and 2G Networks shall have IRAT neighbors defined. The GSM Network shall have available resources without showing congestion. A mutually agreed test UE (with the latest available Software, Firmware and Equipment) shall be used for the IRAT KPI verification. The 2G and 3G Networks shall be properly configured in accordance with Purchaser's Network design, including the definition of the routing tables through the Core Network. PURCHASER shall inform Seller about major changes in the configuration of the GSM Network (frequency re-plan, Cell parameter changes, etc.) that will degrade 3G IRAT performance. Seller may review Purchaser's GSM Network changes/planned activities before, during or after the IRAT KPI verification drive.

Table 35: PSD IRAT Failure Term Definition

Key Performance Indicator Term | Definition
3G2GPSHandoverFailureRate | Hard Handover failure rate when a PS Data call (R99) fails to hand over from the UMTS network to the GSM network
#CellChangeOrderFromUTRANFailure(PS) | The number of RRC Cell Change Order from UTRAN Failure messages from the UE to the RNC
#CellChangeOrderFromUTRANCommand(PS) | The number of RRC Cell Change Order messages from the RNC to the UE

Equation 31: PSD IRAT Failure Rate (Drive Test Formula)

3G2GPSHHOFailureRate = 100 * ( Σ_PSD_Call CellChangeOrderFromUTRANFailure_UE ) / ( Σ_PSD_Call CellChangeOrderFromUTRANCommand_RNC )

The RRC Cell Change Order from UTRAN Failure message is sent by the UE. The RRC Cell Change Order from UTRAN Command message is sent by the RNC.

Measurement Condition:

The PSD IRAT HHO failure KPI does not include HHO Preparation failure. A specific drive route shall be identified at the UMTS RF Service Area boundary to verify this KPI.

Table 36: PSD IRAT Failure Rate KPI

Pre-Launch | Launch | Post-Launch
N/A | ≤ 5.0 % | ≤ 5.0 %

3.1.21 PSD IRAT Interruption Time

This KPI indicates the interruption time for packet switched data during an Inter-Radio Access Technology hard handover.

Equation 32: PSD IRAT Interruption time (Drive Test Formula)

PSD_IRATInterruptionTime = (TimeRAUpdateComplete_UE – TimeCellChangeOrder_RNC) •

Equation 33: PSD IRAT User Data Interruption time (Drive Test Formula)

PSD_IRATUserDataInterruptionTime = (TimeFirstPacketDataReceivedIn2G − TimeLastPacketDataReceivedIn3G)

Measurement Condition:

This KPI is applicable for a dedicated drive test route in the 3G–2G RF service border area during Market Acceptance. The interruption time shall be measured during a 3G-to-2G PSD HHO.
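Both interruption-time KPIs (Equations 32 and 33) are simple differences between timestamps taken from the drive-test tool log, as sketched below with hypothetical timestamps (seconds from the start of the log):

```python
def interruption_time(t_end, t_start):
    """Interruption duration as a timestamp difference, in seconds."""
    return t_end - t_start

# Signalling interruption: Cell Change Order received -> RA Update Complete sent.
sig_gap = interruption_time(t_end=105.4, t_start=98.2)    # ~7.2 s

# User data interruption: last packet seen in 3G -> first packet seen in 2G.
data_gap = interruption_time(t_end=106.1, t_start=97.5)   # ~8.6 s
```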

Table 37: PSD IRAT Interruption Definition

Key Performance Indicator Term | Definition
PSD_IRATInterruptionTime | Duration of the interruption to the Packet data service during Hard Handover
TimeRAUpdateComplete_UE | The timestamp in the drive test Tool when the UE sends the Routing Area Update Complete message in the uplink to the 2G SGSN
TimeCellChangeOrder_RNC | The timestamp in the drive test Tool when the UE receives the Cell Change Order message from the RNC in the downlink

Table 38: PSD IRAT User Data Interruption Definition

Key Performance Indicator Term | Definition
PSD_IRATUserDataInterruptionTime | Duration in seconds of the interruption to the Packet data service, from the user perspective, during an IRAT Hard Handover from 3G to 2G
TimeFirstPacketDataReceivedIn2G | Timestamp in the drive test Tool when the first data is received in the 2G System after a successful IRAT Handover
TimeLastPacketDataReceivedIn3G | Timestamp in the drive test Tool when the UE receives the last Packet Data in the 3G System before the IRAT Handover

Table 39: PSD IRAT Interruption Time KPI

KPI Term | At Launch
PSD IRAT Interruption time | 95th percentile ≤ 12 seconds
PSD IRAT User Data Interruption time | Measure and Report

Seller agrees to the 95th percentile and the 12 seconds provided the impact of non-UTRAN Equipment is within the industry-typical value range.

3.1.22 HSDPA Access Failure

This test shall be done with a single UE performing an HSDPA call in the Cell under test, with a 384/64 kbps associated DCH on the uplink.

Equation 34: HSDPA Access Failure Rate (Counter Formula)

100 * ( 1 − ( #RRCConnectionComplete / #RRCConnectionRequest ) * ( #SRBSuccess / #SRBAttempt ) * ( #RABAssignmentSuccess(HS-DSCH) / #RABAssignmentRequest(HS-DSCH) ) )

100 * ( 1 − ( pmTotNoRrcConnectReqPsSucc(UtranCell) / pmTotNoRrcConnectReqPs(UtranCell) ) * 0.9993 * ( pmNoRabEstablishSuccessPacketInteractiveHs(UtranCell) / pmNoRabEstablishAttemptPacketInteractiveHs(UtranCell) ) )

Seller shall develop SRB counters by RAN Release P8. Seller and Purchaser agreed to use a 99.93% SRB success rate until SRB counters per CS and PS domain are made available at P8. This interim value is derived as follows: assuming a drop rate of 2% over an average call length of 90 seconds, the probability of dropping the SRB during the approximately 3-second direct transfer phase is (3 s / 90 s) * 2% ≈ 0.07%. If fault rates higher than 0.1% are observed during deployment, Seller shall develop the counters in P7.

Table 40: HSDPA Access Failure Term Definition

Key Performance Indicator Term | Definition
#RRCConnectionComplete | The number of successful RRC Connection Setups with Packet Establishment Causes (both originating and terminating)
#RRCConnectionRequest | The number of RRC Connection Request messages received with Packet Establishment Causes (both originating and terminating)
#RABAssignmentResponse(HS-DSCH) | The number of RANAP: RAB Assignment Response messages sent from the RNC to the SGSN for PS service
#RABAssignmentRequest(HS-DSCH) | The number of RANAP: RAB Assignment Request messages sent from the SGSN to the RNC to establish HSDPA Service
#SRBAttempt | The number of RRC Connection Completes that will lead to an Initial Direct Transfer to the PS domain and will also lead to an HSDPA RAB Assignment Request
#SRBSuccess | The number of HSDPA RAB Assignment Requests

Equation 35: HSDPA Access Failure Rate (Drive Test Formula)

HSDPA_AccessFailureRate = 100 * [1 − Σ(Activate PDP Context Accept) / Σ(RRC Connection Request)], with both sums taken over HSDPA sessions.

Measurement Condition:

In case of multiple RRC Connection Requests, only the first RRC Connection Request in a session shall be considered for the KPI calculation. The Access Failure Rate shall be measured by data session activation (HSDPA) followed by a download.
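A sketch of how a drive-test post-processor might apply the "first RRC Connection Request only" rule (the per-session message-list log format is an assumption):

```python
def drive_test_access_failure_rate(sessions):
    """Access Failure Rate (%) from per-session drive-test message lists.

    Per the measurement condition, only the first RRC Connection Request
    in each session counts as the attempt; the session succeeds if an
    Activate PDP Context Accept was also logged.
    """
    attempts, successes = 0, 0
    for messages in sessions:
        if "RRC Connection Request" in messages:
            attempts += 1  # first request only, repeats ignored
            if "Activate PDP Context Accept" in messages:
                successes += 1
    return 100.0 * (1.0 - successes / attempts) if attempts else 0.0
```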

Table 41: HSDPA Access Failure Rate KPI

Pre-Launch | Launch | Post-Launch
≤ 2.0 % | ≤ 2.0 % | ≤ 2.0 %

3.1.23 HSDPA Drop

An HSDPA session is considered dropped when the associated HS-DSCH has been released abnormally by either the UTRAN or the CN. This test shall be done with a single UE performing an HSDPA call in the Cell under test, with a 384/64 kbps associated DCH on the uplink and HS-DSCH in the DL, with downgrade/upgrade to/from an R99 RAB.

Equation 36: HSDPA Drop Rate (Counter Formula)

100 * #HS-DSCH_ReleaseDueToFailure / #HS-DSCH_AllocationSuccess

100 * pmNoSystemRbReleaseHs / (pmNoRabEstablishSuccessPacketInteractiveHs + pmDlUpswitchSuccessHs + pmNoRabEstablishSuccessPacketInteractiveEul + pmUlUpswitchSuccessEul)

pmNoSystemRbReleaseHs: Number of successful system releases of packet RABs mapped on HS-DSCH in the Serving HS-DSCH cell. The counter is stepped for the Serving HS-DSCH cell at a RAB/RB combination transition from PS Interactive 64/HS - HS-DSCH to SRB-DCH or to Idle mode, for the same reasons as the existing counter pmNoSystemRabReleasePacket. pmNoSystemRabReleasePacket is only incremented due to a RANAP: Iu Release Command or RAB Assignment Request message with "release cause" set to anything except 'Normal Release', 'Successful Relocation', 'Resource Optimisation Relocation', 'User Inactivity' or 'release-due-to-UE-generated-signalling-connection-release'. The counter is incremented for the best cell in the Active Set in the SRNC; when releasing an HS RAB, it is stepped for the Serving HS-DSCH cell.

pmNoRabEstablishSuccessPacketInteractiveHs: The number of successful RAB establishments for a PS Interactive RAB mapped on HS-DSCH. The counter is stepped for the selected Serving HS-DSCH cell at RAB establishment, after the successful transition from SRB-DCH to PS Interactive 64/HS - HS-DSCH.

pmDlUpswitchSuccessHs: Number of successful DL upswitches to any HS state. The counter is stepped for a successful DL upswitch to an RB combination containing HS, and is incremented in all cells of the Active Set.

pmNoRabEstablishSuccessPacketInteractiveEul: The number of successful RAB establishments for a PS Interactive RAB mapped on E-DCH/HSDPA. The counter is stepped for the Serving E-DCH cell at a successful RAB/RB combination transition to PS Interactive E-DCH/HS - HS-DSCH due to RAB establishment, and is triggered after sending of a (successful) RAB Assignment Response.

pmUlUpswitchSuccessEul: Number of successful upswitches, triggered by UL user activity, to a target RB combination E-DCH/HSDPA. The counter is stepped in the target cell at a successful upswitch triggered by UL user activity.

Measurement Condition:

The Seller's counter-based metrics formula shall be updated as new functionality is introduced in the System.

Table 42: HSDPA Drop Term Definition

HS-DSCH_ReleaseDueToFailure: Number of HS-DSCH allocation releases due to radio link and other failures.
HS-DSCH_AllocationSuccess: Number of allocations for which the RNC has received the RRC: Radio Bearer Reconfiguration Complete message from the UE after successful HS-DSCH MAC-d flow setup.

Equation 36: HSDPA Drop Rate (Drive Test Formula)

HSDPA_DropRate = 100 * Σ(HS-DSCH_Drop) / Σ(HS-DSCH_AllocationSuccess), with both sums taken over HSDPA calls.

Measurement Condition:

The HSDPA Drop Rate shall be measured by data session activation (HSDPA) followed by an FTP download.

Table 43: HSDPA Drop Rate KPI

Pre-Launch | Launch | Post-Launch
≤ 3.0 % | ≤ 2.0 % | ≤ 2.0 %

3.1.24 HSDPA Latency


Table 44: HSDPA Latency Term Definition

HSDPA Latency: Round-trip time for a 32-byte ping over an HSDPA NRT RAB.

Measurement Condition:

Latency shall be measured with the destination ping server connected directly to the GGSN (i.e., a server in the same Intranet domain as the GGSN). RTT shall be measured in a stationary test. This test shall be done with a single UE making an HSDPA call in the Cell under test.
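Evaluating a "95th percentile ≤ target" KPI from a set of ping RTT samples might look like this (nearest-rank percentile is an assumed convention, since the contract does not specify an interpolation method):

```python
import math

def percentile_95(samples_ms):
    """Nearest-rank 95th percentile of RTT samples (milliseconds)."""
    ordered = sorted(samples_ms)
    rank = math.ceil(0.95 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

def meets_latency_kpi(samples_ms, target_ms=100):
    """True if the 95th-percentile RTT is within the launch target."""
    return percentile_95(samples_ms) <= target_ms
```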

Table 45: HSDPA Latency KPI

KPI | Latency at Launch
HSDPA Latency | 95th percentile of sessions ≤ 100 ms

Seller agrees to the 95th percentile and the 100 ms target provided the impact of non-UTRAN Equipment is within the industry-typical value range.

3.1.25 HSDPA Throughput

Throughput is based on the total number of RLC blocks sent over the observation window, including re-transmissions.

Equation 39: HSDPA Throughput (Counter Formula)

HSDPA_Throughput = Σ(AM_RLC_PDU_Data(kb)) / Σ(AM_RLC_PDU_Duration), with both sums taken over HSDPA calls.

HSDPACellThroughput = pmSumAckedBits / (2 * (pmNoActiveSubFrames + pmNoInactiveRequiredSubFrames))
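The cell-throughput counter form converts directly to kbps because each HS-DSCH subframe (TTI) is 2 ms, so bits divided by milliseconds gives kbit/s. A sketch, with argument names mirroring the pm counters:

```python
def hsdpa_cell_throughput_kbps(sum_acked_bits, active_subframes,
                               inactive_required_subframes):
    """HSDPA cell throughput (kbps) per the counter formula.

    Each subframe is a 2 ms TTI, so the denominator is elapsed
    scheduling time in milliseconds; bits per ms equals kbps.
    """
    time_ms = 2.0 * (active_subframes + inactive_required_subframes)
    return sum_acked_bits / time_ms
```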


Table 46: HSDPA Throughput Term Definition

AM_RLC_PDU_Data(kb): Total AM RLC PDU data (in kilobits) transferred, excluding re-transmissions, in the downlink or uplink.
AM_RLC_PDU_Duration: The total RLC PDU transmission duration (in seconds). For the downlink, this duration excludes periods when the downlink transmission buffer for the RLC entity is empty.

Equation 40: HSDPA Throughput (Drive Test Formula)

HSDPA_Throughput = Σ(UserDataTransferred(kb)) / Σ(SessionDuration(sec)), with both sums taken over HSDPA calls.

Measurement Condition:

Measurement shall be of the Application Layer bit rate. The HSDPA throughput is only applicable when the UE is on HS-DSCH.

Table 47: Stationary Maximum DL HSDPA Bit Rate (kbps) KPI

Pre-Launch | Launch | Post-Launch
≥ 1300 | ≥ 1300 | ≥ 1300 (at RLC layer, using counters)
≥ 1100 | ≥ 1100 | N.A.

Conditions: 1800 kbps supported (using UE Category 12), under no-load condition, with 50% of the available power at the System Reference Point allocated to HSDPA.

Measurement Condition:

The UE shall be in a stationary location under excellent RF conditions (CPICH RSCP ≥ -80 dBm and CPICH Ec/No ≥ -8 dB). The download file shall be of a compressed type. Measurement shall be based on FTP download of a file of minimum ten (10) MB to reduce the impact of TCP slow start. Measurement shall be at the application layer throughout the HSDPA Service Area.

Table 48: Average DL Throughput (kbps) KPI

Pre-Launch | Launch | Post-Launch
≥ 700 | ≥ 700 | ≥ 700 (at off-peak)
≥ 600 | ≥ 600 | N.A.

Conditions: 1.8 Mbps supported (using UE Category 12), under no-load condition, with 50% of the available power at the System Reference Point allocated to HSDPA.

Measurement Condition:

The UE shall be in a mobile environment within the designed HSDPA Service Area. The supported DL HS-DSCH rate during the drive test is 1800 kbps for a Category 12 UE. The Cell shall be loaded such that HSDPA is allocated fifty percent (50%) of the available power at the System Reference Point. The remaining 50% of available DL power at the System Reference Point shall include common channel power, the load due to one drive-test UE (Voice), and the load simulated by OCNS in speech mode. There shall be one UE making short Voice Calls (AMR 12.2k) in the same cell as the HSDPA call to monitor the impact of HSDPA on the voice user. Seller shall provide a measurement report on the short voice calls for Setup Failure and Call Drop Rate as defined in Section 3.1.3 and Section 3.1.4 respectively. The average DL throughput measured during the entire drive-test route within the Cluster/Market Area shall be based on HS-DSCH Application Layer throughput. Measurement shall be based on a drive test with FTP download of a file of minimum ten (10) MB to reduce the impact of TCP slow start. The HSDPA Service Area shall be the same as the PSD 64k Service Area.

3.1.26 HSUPA Throughput

Throughput is based on the total number of RLC blocks sent over the observation window, excluding re-transmissions.

Equation 39: HSUPA Throughput (Counter Formula)

HSUPA_Throughput = Σ(AM_RLC_PDU_Data(kb)) / Σ(AM_RLC_PDU_Duration), with both sums taken over HSUPA sessions.

HSUPACellThroughput = pmSumAckedBitsCellEul / (pmNoActive10msFramesEul * 10)

pmSumAckedBitsCellEul: The number of Media Access Control Enhanced Uplink (EUL) bits received and acknowledged by the RBS.

pmNoActive10msFramesEul: The number of 10 ms frames containing enhanced uplink data transmitted by the UE.
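The HSUPA counter form works the same way, with 10 ms EUL frames in the denominator (a sketch, with argument names mirroring the counters above):

```python
def hsupa_cell_throughput_kbps(sum_acked_bits_eul, active_10ms_frames_eul):
    """HSUPA cell throughput (kbps): acknowledged EUL bits divided by
    active EUL time in milliseconds (each active frame is 10 ms)."""
    return sum_acked_bits_eul / (active_10ms_frames_eul * 10.0)
```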

Table 49: HSUPA Throughput Term Definition

AM_RLC_PDU_Data(kb): Total AM RLC PDU data (in kilobits) transferred, excluding re-transmissions, in the uplink.
AM_RLC_PDU_Duration: The total RLC PDU transmission duration (in seconds).

Equation 40: HSUPA Throughput (Drive Test Formula)

HSUPA_Throughput = Σ(UserDataTransferred(kb)) / Σ(SessionDuration(sec)), with both sums taken over HSUPA sessions.

Measurement Condition:

Measurement shall be of the Application Layer bit rate. The upload file shall be of a compressed type. The HSUPA Service Area shall be the same as the PSD 64k Service Area. The UE shall be in a stationary location under excellent RF conditions (CPICH RSCP ≥ -80 dBm and CPICH Ec/No ≥ -8 dB).

Table 50: HSUPA Stationary UL Bit Rate (kbps) KPI (UE Category 3, 1.45 Mbps)

KPI | Pre-Launch | Launch | Post-Launch
Stationary Peak Throughput (kbps) under no load | ≥ 1100 | ≥ 1100 | Measure and Report
Average Throughput (kbps) under no load | ≥ 500 | ≥ 500 | Measure and Report



3.1.27 HSDPA Data Session Setup Time

The HSDPA Data Session setup time indicates the Network response time to a user request for an HSDPA data service.

Equation 41: HSDPA Data Session Setup Time (Drive Test Formula)

HSDPA Data Session setup time = timestamp of Activate PDP Context Accept (MOC) − timestamp of RRC Connection Request (MOC)

Measurement Condition:

This test shall be done with a single UE performing an HSDPA call in the Cell under test. The UE shall already be attached to the UTRAN Network.
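Per-session setup times per Equation 41 could be extracted from a drive-test log like this (the (session_id, message, timestamp) record shape is an assumption):

```python
def session_setup_times(events):
    """Map session id -> setup time (s) per Equation 41.

    events: iterable of (session_id, message, timestamp_s) records.
    Only the first RRC Connection Request per session is used.
    """
    first_request, accept = {}, {}
    for sid, message, ts in events:
        if message == "RRC Connection Request":
            first_request.setdefault(sid, ts)  # keep earliest only
        elif message == "Activate PDP Context Accept":
            accept[sid] = ts
    return {sid: accept[sid] - first_request[sid]
            for sid in accept if sid in first_request}
```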

Table 51: HSDPA Data Session Setup Time KPI

KPI | Pre-Launch | Launch
HSDPA Session Activation time | N.A. | 95th percentile of sessions ≤ 5 seconds

Seller agrees to the 95th percentile and the 5-second target provided the impact of non-UTRAN Equipment is within the industry-typical value range.

System Availability

System Availability is defined as the percentage of time the Network can handle one hundred percent (100%) of the traffic it is designed for, as measured at Cell level. The purpose of this metric is to calculate the percentage of total operating time during which the RNC and Node-B are available to carry commercial traffic. The minimum granularity for KPI purposes is total loss of traffic at Cell level.

Seller is responsible only for the Sub-Systems supplied to Purchaser: RNC and Node-B. This excludes Antenna Systems, Transport Systems, Power/Battery Backup and non-Seller Core Network. The loss of traffic at Cell level can be due to one or more of the following reasons:

1. Equipment failures
2. Software failures
3. Seller-originated events (accidental, misuse, reset, etc.)
4. Planned events authorized by Seller (Software upgrade, Equipment upgrade, Parameter change, etc.)

For Average Cell Availability, Seller shall be responsible for service degradation due to 1, 2, 3 (caused by Seller's personnel or sub-contractors of Seller) and 4. For Average Cell Non-Maintenance Availability, Seller shall be responsible for service degradation due to 1, 2 and 3 (caused by Seller's personnel or sub-contractors of Seller).

The level of aggregation for this metric is Purchaser's entire UMTS Network for which Seller has supplied the Equipment and Software (RNC and Node-B). The alarm aggregation for this metric shall be performed on a daily basis. The System Availability KPI for Seller shall be calculated on a yearly basis for the purpose of achieving the KPI target, starting from the Network commercial Launch for RNC and Node-B Equipment.

Seller shall provide all necessary alarm details to assist Purchaser in realizing the System Availability KPI metrics for the Sub-Systems supplied by Seller (RNC and Node-B). Seller shall recommend to Purchaser which alarms to use to measure these KPIs. Daily System Availability shall be measured as follows:

3.1.28 Average Cell Availability

Equation 37: Average Cell Availability formula

Using Alarms and System notifications, the following formula shall apply:

100 * [1 − Σ_RNC TotalCellDowntime(sec) / Σ_RNC (24(hrs) * 60(min) * 60(sec) * TotalSectorCount)]
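The availability formula reduces to downtime over total sector-time. A sketch covering both variants (24-hour window for Average Cell Availability; the Non-Maintenance variant of Equation 38 uses an 18-hour window):

```python
def average_cell_availability(total_cell_downtime_s, total_sector_count,
                              window_hours=24):
    """Average Cell Availability (%) per Equation 37.

    window_hours: 24 for Average Cell Availability; 18 for the
    Non-Maintenance variant (Equation 38).
    """
    total_sector_time_s = window_hours * 3600 * total_sector_count
    return 100.0 * (1.0 - total_cell_downtime_s / total_sector_time_s)
```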

Table 52: Average Cell Availability Term Definition

TotalCellDowntime(sec): Total duration (in seconds) for which Node-B Cells within the RNC are unable to carry traffic due to planned or non-planned events attributable to the RNC or Node-B. The duration shall be calculated as the time when the Alarm/System notification clears minus the time when the Alarm/System notification triggers.
TotalSectorCount: Total number of active Node-B Cells within the RNC.
AvgCellAvailability: Average availability of Node-B Cells to carry user traffic, calculated as a percentage of total time.

Table 53: Average Cell Availability KPI

Average Cell Availability KPI | Sub-System | RAN Release
≥ 99.95 (99.7) % | RNC + Node-B | P5
≥ 99.95 (99.8) % | RNC + Node-B | P6
≥ 99.95 (99.9) % | RNC + Node-B | P7
≥ 99.95 (>99.9) % | RNC + Node-B | ≥P8

3.1.29 Average Cell Non-Maintenance Availability

Equation 38: Average Cell Non-Maintenance Availability Formula

100 * [1 − Σ_RNC TotalCellNonMaintenanceDowntime(sec) / Σ_RNC (18(hrs) * 60(min) * 60(sec) * TotalSectorCount)]

Table 54: Average Cell Non-Maintenance Availability Term Definition

TotalCellNonMaintenanceDowntime(sec): Total duration (in seconds) for which Node-B Cells within an RNC are unable to carry any traffic during normal operation (excluding planned events) due to RNC or Node-B Equipment reasons. The duration shall be calculated as the time when the Alarm/System notification clears minus the time when the Alarm/System notification triggers.
TotalSectorCount: Total number of active Node-B Cells within the RNC.
AvgCellNonMaintenanceAvailability: Average Node-B Cell availability, calculated as a percentage of available time at Cell level.

Table 55: Average Cell Non-Maintenance Availability KPI


Average Cell Non-Maintenance Availability KPI | Sub-System | RAN Release
≥ 99.995 (99.7) % | RNC + Node-B | P5
≥ 99.995 (99.8) % | RNC + Node-B | P6
≥ 99.995 (99.9) % | RNC + Node-B | P7
≥ 99.995 (>99.9) % | RNC + Node-B | ≥P8

Comparison of UMTS and GSM Network Performance

Seller shall collect certain 2G measurements, defined below, at the same time as 3G measurements for all Cluster Acceptance and Market Acceptance drives. The 2G measurements, collected by Seller from a 2G terminal in the vehicle executing a pre-defined call sequence, are:

a. Voice Access Failure Rate
b. Voice Drop Rate
c. Voice Quality (MOS)

The 2G terminal used shall be specified by Purchaser and shall be a commercially available terminal used by Purchaser's 2G Customers. Call sequences used by the 2G terminals shall be identical to the call sequences used by the 3G terminals. To the extent possible, the number of 2G calls made shall be the same as the number of 3G calls made. Test cases for 2G terminals executed to derive the above KPIs must be executed at the same time, within the same test vehicle, as the test cases used to derive the same KPIs for the 3G terminals.

It is a requirement for Cluster Acceptance and Market Acceptance that the 3G KPIs are met. If the collected 2G KPIs do not meet the 3G KPI targets, then the 3G results shall be better than or equal to the obtained 2G results, using the same call profiles and call sequences. If the measured 2G KPIs exceed the measured 3G KPIs, Seller shall only have to meet the 3G targets.

Impact of UMTS on GSM Network Performance

The deployment of the UMTS Network shall not degrade Purchaser's existing GSM Network performance. Excluding the IRAT Hard Handover feature, the UMTS and GSM Networks are considered independent of each other. This section addresses any concern related to UMTS Equipment installation on GSM Sites, and related faults for which Seller has responsibility, that could degrade GSM performance. The following KPIs shall be collected for the entire underlying 2G Site(s) relevant to a 3G Node-B:

• MOU/drop for Voice Calls
• Access Failure Rate for Voice Calls
• Daily (24-hour) Voice Erlang Traffic

Seller shall ensure that the introduction of the UMTS Network degrades Purchaser's existing GSM Network performance by no more than ten percent (10%) at Cell level, no more than seven and a half percent (7.5%) at Cluster level, and no more than five percent (5%) at Market level. The method of calculation shall be as described in the UMTS Systems Acceptance (Appendix U16).
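A sketch of the degradation check (assumes the compared KPI is a failure-type metric where a higher value is worse; the exact comparison method is defined in Appendix U16):

```python
def gsm_degradation_within_limit(baseline, measured, level):
    """True if relative GSM KPI degradation after UMTS deployment is
    within the allowed limit: 10% at Cell, 7.5% at Cluster, 5% at
    Market level.

    baseline/measured: failure-type KPI values (higher is worse).
    """
    limits = {"cell": 0.10, "cluster": 0.075, "market": 0.05}
    return (measured - baseline) / baseline <= limits[level]
```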

4 UTRAN Level-2 Key Performance Indicators

Purchaser shall provide Seller with a document describing the counter-based Level-2 KPI requirements. Seller shall provide Purchaser with all available counters down to the lowest cause level for RNC and Node-B. Seller shall provide (with a collaborative effort by Purchaser) a document that describes the Level-2 KPIs (formulas and counters) within ten (10) weeks after receipt of the Level-2 requirements from Purchaser. Any deviations from the above-mentioned time schedule shall be mutually agreed.



For counters that are not likely to be available at the time of commercial launch (Milestone 2), Seller shall mutually agree with Purchaser (within three (3) months of execution of the Fifth (5th) Amendment) on a reasonable roadmap to develop the required counters. The Performance Management Measurement Entity shall be available at Cell, Node-B, Cluster, RNC, Market, Region and Network level for every KPI unless otherwise stated in the Level-2 KPIs. RNC Measurement Granularity shall be a minimum period of fifteen (15) minutes unless otherwise stated in the Level-2 KPIs.


