World Applied Sciences Journal 3 (1): 000-000, 2008 ISSN 1818-4952 © IDOSI Publications, 2008
Multivariate Estimators for Two Phase Sampling

1Nadeem Shafique Butt, 1Shahid Kamal and 2Muhammad Qaiser Shahbaz

1 College of Statistical and Actuarial Sciences, University of the Punjab, Lahore, Pakistan
2 Department of Mathematics, COMSATS Institute of IT, Lahore, Pakistan
Abstract: In this paper some new multivariate estimators for two phase sampling have been proposed. The proposed estimators use information on multiple quantitative variables as well as multiple qualitative variables. An empirical study has been carried out to examine the performance of the proposed estimator over the existing estimator.

Key words: Multivariate estimator • two phase sampling • multiple auxiliary variables • minimum variance
INTRODUCTION

Auxiliary information has always been a source of improvement in the estimation of population characteristics. Several estimators have been developed in single and two phase sampling which utilize information on auxiliary variables as well as auxiliary attributes. The classical estimators which use information on auxiliary variables are the ratio and regression estimators. The classical regression estimator of the population mean is given as:

$$\bar{y}_{lr} = \bar{y} + \beta\left(\bar{X} - \bar{x}\right) \qquad (1.1)$$

The value of $\beta$ for which the variance of (1.1) is minimum is $\beta = S_{xy}/S_x^2$. The minimum variance of (1.1) is given as:

$$Var\left(\bar{y}_{lr}\right) = \theta S_y^2\left(1 - \rho_{yx}^2\right) \qquad (1.2)$$

where $\theta = n^{-1} - N^{-1}$ and $\rho_{yx}$ is the correlation coefficient between X and Y. The estimator (1.1) in the case of several auxiliary variables has been discussed by a number of statisticians and is given as:

$$\bar{y}_{mlr} = \bar{y} + \boldsymbol{\beta}'\left(\bar{\mathbf{X}} - \bar{\mathbf{x}}\right) \qquad (1.3)$$

where $\bar{\mathbf{x}}$ is the vector of sample means of the auxiliary variables. The variance of (1.3) is given as:

$$Var\left(\bar{y}_{mlr}\right) = \theta S_y^2\left(1 - \rho_{y.x}^2\right) \qquad (1.4)$$

where $\rho_{y.x}^2$ is the squared multiple correlation coefficient between Y and $\mathbf{x}$. The classical regression estimator for two phase sampling is given as:

$$\bar{y}_{lr(2)} = \bar{y}_2 + \beta\left(\bar{x}_1 - \bar{x}_2\right) \qquad (1.5)$$

where $\bar{x}_1$ and $\bar{x}_2$ are the first phase and second phase means of the auxiliary variable X and $\bar{y}_2$ is the second phase mean of Y. The variance of (1.5) is given as:

$$Var\left(\bar{y}_{lr(2)}\right) = S_y^2\left\{\theta_2\left(1 - \rho_{yx}^2\right) + \theta_1\rho_{yx}^2\right\} \qquad (1.6)$$

where $\theta_h = n_h^{-1} - N^{-1}$ and $n_h$ is the sample size at the h-th phase. Ahmed and Hanif [3] have extended (1.6) to the case of several variables. A regression type estimator using information on two auxiliary variables has been proposed as:

$$\bar{y}_{ssm} = \bar{y}_2 + \beta_1\left(\bar{x}_1 - \bar{x}_2\right) + \beta_2\left(\bar{Z} - \bar{z}\right) \qquad (1.7)$$

The variance of (1.7) is:

$$Var\left(\bar{y}_{ssm}\right) = S_y^2\left\{\theta_2\left(1 - \rho_{yx}^2\right) + \theta_1\left(\rho_{yx}^2 - \rho_{yz}^2\right)\right\} \qquad (1.8)$$

where $\rho_{yz}^2$ is the squared correlation coefficient between Y and Z.
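The behaviour of the two phase estimator (1.5) is easy to see in a small simulation. The population, sample sizes and seed below are illustrative assumptions, not taken from the paper; a sketch in Python:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical population of N units with correlated X and Y.
N, n1, n2 = 10_000, 500, 100
X = rng.normal(50, 10, N)
Y = 2.0 * X + rng.normal(0, 8, N)

# First-phase sample: only the auxiliary variable X is observed.
s1 = rng.choice(N, n1, replace=False)
# Second-phase subsample: both X and Y are observed.
s2 = rng.choice(s1, n2, replace=False)

x1bar = X[s1].mean()                        # first-phase mean of X
x2bar, y2bar = X[s2].mean(), Y[s2].mean()   # second-phase means

# Sample regression coefficient from the second-phase data.
beta = np.cov(X[s2], Y[s2])[0, 1] / X[s2].var(ddof=1)

# Two-phase regression estimator (1.5): y_lr(2) = y2 + beta*(x1 - x2)
y_lr2 = y2bar + beta * (x1bar - x2bar)
print(f"plain second-phase mean: {y2bar:.2f}")
print(f"regression estimate    : {y_lr2:.2f}")
print(f"true population mean   : {Y.mean():.2f}")
```

With $\rho_{yx}$ high, as in this population, (1.6) predicts a variance well below $\theta_2 S_y^2$, the variance of the plain second-phase mean.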
Corresponding Author: Nadeem Shafique Butt, College of Statistical and Actuarial Sciences, University of the Punjab, Lahore, Pakistan
A family of estimators in single and two phase sampling using information on auxiliary attributes has been proposed; the variance of this family depends upon the point-biserial correlation coefficient. Several further estimators in single and two phase sampling have also been proposed, among them the regression-in-ratio estimator:

$$\bar{y}_{sh(2)} = \left\{\bar{y}_2 + \beta_{yz}\left(\bar{z}_1 - \bar{z}_2\right)\right\}\frac{\bar{X}}{\bar{x}} \qquad (1.9)$$

The mean square error of (1.9) is:

$$MSE\left(\bar{y}_{sh(2)}\right) \approx \bar{Y}^2\left[\theta_2\left\{C_y^2\left(1 - \rho_{xy}^2\right) + \left(C_x - C_y\rho_{xy}\right)^2\right\} + \left(\theta_2 - \theta_1\right)\left\{C_x^2\rho_{xz}^2 - \left(C_y\rho_{yz} - C_x\rho_{xz}\right)^2\right\}\right] \qquad (1.10)$$

A generalized family of estimators based on the information of "k" auxiliary attributes has been proposed and discussed for the full, partial and no information cases; this family has been shown to have smaller mean square error than earlier proposals. Some ratio estimators for single phase and two phase sampling using information on multiple auxiliary attributes have been proposed as generalizations of earlier estimators, together with shrinkage versions of these estimators. A number of generalized multivariate ratio estimators for two-phase and multi-phase sampling in the presence of multi-auxiliary variables have been proposed for estimating the population mean of a single variable and of a vector of variables of interest. A more general ratio estimator for the case where information on the auxiliary variables is not available for the population (the no information situation) is:

$$T_{hk(1\times p)} = \left[\bar{y}_{(k)1}\prod_{i=1}^{q}\left(\frac{\bar{x}_{(h)i}}{\bar{x}_{(k)i}}\right)^{\alpha_{i1}},\; \bar{y}_{(k)2}\prod_{i=1}^{q}\left(\frac{\bar{x}_{(h)i}}{\bar{x}_{(k)i}}\right)^{\alpha_{i2}},\; \ldots,\; \bar{y}_{(k)p}\prod_{i=1}^{q}\left(\frac{\bar{x}_{(h)i}}{\bar{x}_{(k)i}}\right)^{\alpha_{ip}}\right] \qquad (1.11)$$

The variance-covariance matrix of this estimator is of the following form:

$$\Sigma_{T_{hk}(p\times p)} = \theta_k\Sigma_{y(p\times p)} - \left(\theta_k - \theta_h\right)\Sigma_{yx(p\times q)}\Sigma_{x(q\times q)}^{-1}\Sigma_{yx(q\times p)}' \qquad (1.12)$$

A number of generalized multivariate regression estimators for two-phase and multi-phase sampling in the presence of multi-auxiliary variables have also been proposed for estimating the population mean of a single variable and of a vector of variables of interest. The proposed estimator is of the following form:

$$T_{hk(1\times p)} = \left[\bar{y}_{(k)1},\,\bar{y}_{(k)2},\,\ldots,\,\bar{y}_{(k)p}\right] + \left[\sum_{i=1}^{q}\alpha_{i1}\left(\bar{x}_{(h)i} - \bar{x}_{(k)i}\right),\; \sum_{i=1}^{q}\alpha_{i2}\left(\bar{x}_{(h)i} - \bar{x}_{(k)i}\right),\; \ldots,\; \sum_{i=1}^{q}\alpha_{ip}\left(\bar{x}_{(h)i} - \bar{x}_{(k)i}\right)\right] \qquad (1.13)$$

The variance-covariance matrix of this estimator is of the following form:

$$\Sigma_{T_{hk}(p\times p)} = \theta_k\Sigma_{y(p\times p)} - \left(\theta_k - \theta_h\right)\Sigma_{yx(p\times q)}\Sigma_{x(q\times q)}^{-1}\Sigma_{yx(q\times p)}' \qquad (1.14)$$
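The matrix expression (1.14) is a direct computation once the population moment matrices are available. The moment matrices and sample sizes below are hypothetical values chosen only to illustrate the formula; a sketch:

```python
import numpy as np

# Hypothetical population moments: p = 2 response variables, q = 3 auxiliaries.
Sigma_y  = np.array([[4.0, 1.2],
                     [1.2, 3.0]])            # p x p
Sigma_yx = np.array([[1.5, 0.8, 0.6],
                     [0.7, 1.1, 0.4]])       # p x q
Sigma_x  = np.array([[2.0, 0.5, 0.3],
                     [0.5, 1.5, 0.2],
                     [0.3, 0.2, 1.0]])       # q x q

# theta_h = n_h^-1 - N^-1 for illustrative n_h = 500, n_k = 100, N = 10000.
theta_h, theta_k = 1/500 - 1/10_000, 1/100 - 1/10_000

# Variance-covariance matrix (1.14):
# Sigma_T = theta_k*Sigma_y - (theta_k - theta_h)*Sigma_yx Sigma_x^-1 Sigma_yx'
Sigma_T = theta_k * Sigma_y - (theta_k - theta_h) * (
    Sigma_yx @ np.linalg.solve(Sigma_x, Sigma_yx.T))
print(Sigma_T)
```

Each diagonal entry is smaller than the corresponding entry of $\theta_k\Sigma_y$, reflecting the gain from the auxiliary variables.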
In this paper we have proposed some multivariate regression estimators using information on several auxiliary variables as well as auxiliary attributes.

NOTATIONS

In this section we define the notations used in the development of the multivariate estimators and their variance-covariance matrices. Let w and x be auxiliary variables and $Y_i$ be the i-th variable of interest. Let $S_{xw}$ be the covariance between x and w and $S_{y_iw}$ be the covariance between $Y_i$ and w. Using these notations we define

$$\beta_{y_ix.w} = S_{xy_i.w}\big/S_{x.w}^2$$

as the partial regression coefficient between $Y_i$ and x keeping w at a constant level. Here $S_{xy_i.w}$ is the partial covariance between $Y_i$ and x after removing the effect of w, $S_{y_i.w}^2$ is the partial variance of $Y_i$ and $S_{x.w}^2$ is the partial variance of x. We also define

$$\rho_{y_ix.w}^2 = S_{y_ix.w}^2\big/\left(S_{x.w}^2S_{y_i.w}^2\right)$$

as the squared partial correlation coefficient between $Y_i$ and x after removing the effect of w, $\rho_{y_i.xw}^2$ as the squared multiple correlation coefficient between $Y_i$ and the combined effects of x and w, and $\rho_{y_i.w}^2$ as the squared correlation coefficient between $Y_i$ and w.

MULTIVARIATE ESTIMATOR WITH QUANTITATIVE PREDICTORS

In this section the multivariate extension of the two phase regression estimator has been proposed. The extension uses information on two auxiliary variables and can be used for simultaneous estimation of several variables. Suppose a first phase random sample of size $n_1$ is available and information on the auxiliary variables X and W is recorded. Suppose further that a second phase random sample of size $n_2$ is available and information on X and W has been collected alongside information on the multiple response variables $Y_1, Y_2, \ldots, Y_p$. Let $\bar{\mathbf{y}}_2$ be the vector of means based upon the second phase sample, $\mathbf{k}$ be a vector of constants and A and B be diagonal matrices with diagonal entries $\alpha_i$ and $\beta_i$ respectively. Based upon this information, the multivariate estimator is defined as:

$$\mathbf{t} = \bar{\mathbf{y}}_2 + \left(\bar{x}_1 - \bar{x}_2\right)\mathbf{k} + \left(\bar{W} - \bar{w}_1\right)A\mathbf{k} - \left(\bar{W} - \bar{w}_2\right)B\mathbf{k} \qquad (3.1)$$

The i-th component of (3.1) is given as:

$$t_i = \bar{y}_{i2} + k_i\left[\left\{\bar{x}_1 + \alpha_i\left(\bar{W} - \bar{w}_1\right)\right\} - \left\{\bar{x}_2 + \beta_i\left(\bar{W} - \bar{w}_2\right)\right\}\right] \qquad (3.2)$$

Using the conventional transformation

$$\bar{w}_1 = \bar{W} - e_{w_1};\quad \bar{w}_2 = \bar{W} - e_{w_2};\quad \bar{y}_{i2} = \bar{Y}_i + e_{y_{i2}};\quad \bar{x}_1 = \bar{X} - e_{x_1};\quad \bar{x}_2 = \bar{X} - e_{x_2}$$

the estimator (3.2) can be written in the following form:

$$t_i - \bar{Y}_i = e_{y_{i2}} + k_i\left(e_{x_1} - e_{x_2}\right) - k_i\alpha_ie_{w_1} + k_i\beta_ie_{w_2}$$

Squaring the above equation:

$$\left(t_i - \bar{Y}_i\right)^2 = e_{y_{i2}}^2 + k_i^2\left(e_{x_1} - e_{x_2}\right)^2 + k_i^2\alpha_i^2e_{w_1}^2 + k_i^2\beta_i^2e_{w_2}^2 + 2k_ie_{y_{i2}}\left(e_{x_1} - e_{x_2}\right) - 2k_i\alpha_ie_{y_{i2}}e_{w_1} + 2k_i\beta_ie_{y_{i2}}e_{w_2} - 2k_i^2\alpha_ie_{w_1}\left(e_{x_1} - e_{x_2}\right) + 2k_i^2\beta_ie_{w_2}\left(e_{x_1} - e_{x_2}\right) - 2k_i^2\alpha_i\beta_ie_{w_1}e_{w_2}$$

Applying expectation, the mean square error of $t_i$ is $S_i = MSE\left(t_i\right) = E\left(t_i - \bar{Y}_i\right)^2$, or:

$$S_i = \theta_2S_{y_i}^2 + \left(\theta_2 - \theta_1\right)k_i^2S_x^2 + \theta_1k_i^2\alpha_i^2S_w^2 + \theta_2k_i^2\beta_i^2S_w^2 + 2\left(\theta_1 - \theta_2\right)k_iS_{xy_i} - 2\theta_1k_i\alpha_iS_{wy_i} + 2\theta_2k_i\beta_iS_{wy_i} + 2\left(\theta_1 - \theta_2\right)k_i^2\beta_iS_{wx} - 2\theta_1k_i^2\alpha_i\beta_iS_w^2 \qquad (3.3)$$
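Expression (3.3) can be evaluated directly for trial values of the constants. The population moments below are hypothetical; a minimal sketch:

```python
# Hypothetical population moments for one response Y_i and auxiliaries x, w.
S_y2, S_x2, S_w2 = 25.0, 16.0, 9.0     # variances
S_xy, S_wy, S_wx = 14.0, 7.0, 5.0      # covariances
th1, th2 = 1/500 - 1/10_000, 1/100 - 1/10_000   # theta_1, theta_2

def S_i(k, a, b):
    """Mean square error (3.3) of t_i for constants k = k_i, a = alpha_i, b = beta_i."""
    return (th2*S_y2 + (th2 - th1)*k**2*S_x2 + th1*k**2*a**2*S_w2
            + th2*k**2*b**2*S_w2 + 2*(th1 - th2)*k*S_xy
            - 2*th1*k*a*S_wy + 2*th2*k*b*S_wy
            + 2*(th1 - th2)*k**2*b*S_wx - 2*th1*k**2*a*b*S_w2)

print(S_i(k=0.9, a=0.3, b=0.1))
```

Setting $k_i = 0$ recovers $\theta_2S_{y_i}^2$, the MSE of the plain second-phase mean, which is a quick consistency check on the implementation.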
Optimum values of $\alpha_i$, $\beta_i$ and $k_i$ which minimize $S_i$ can be obtained by differentiating (3.3) with respect to these unknown quantities:

$$\alpha_i = S_{wx}\big/S_w^2 = \beta_{xw} \qquad (3.4)$$

$$\beta_i = \frac{S_{wx}}{S_w^2} - \frac{1}{k_i}\frac{S_{wy_i}}{S_w^2} = \beta_{xw} - \frac{1}{k_i}\beta_{y_iw} \qquad (3.5)$$

$$k_i = \left(\frac{\rho_{xy_i} - \rho_{wx}\rho_{wy_i}}{1 - \rho_{wx}^2}\right)\frac{S_{y_i}}{S_x} = \beta_{y_ix.w} \qquad (3.6)$$

Using the values (3.4), (3.5) and (3.6) in (3.3), the minimum mean square error becomes:

$$S_i = S_{y_i}^2\left\{\theta_2\left(1 - \rho_{y_i.wx}^2\right) + \theta_1\rho_{xy_i.w}^2\left(1 - \rho_{wy_i}^2\right)\right\} \qquad (3.7)$$
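The optimum constants (3.4)-(3.6) and the minimum MSE (3.7) depend only on a handful of population moments. The correlations and standard deviations below are hypothetical values chosen to illustrate the computation; a sketch:

```python
# Hypothetical population correlations and standard deviations for one
# response Y_i and the auxiliaries x, w.
r_xy, r_wy, r_wx = 0.70, 0.50, 0.40
S_y, S_x, S_w = 5.0, 4.0, 3.0
th1, th2 = 1/500 - 1/10_000, 1/100 - 1/10_000   # theta_1, theta_2

# Optimum constant (3.6): k_i = beta_{y_i x.w}
k = (r_xy - r_wx*r_wy) / (1 - r_wx**2) * (S_y / S_x)

# Optimum alpha_i (3.4) and beta_i (3.5)
S_wx, S_wy = r_wx*S_w*S_x, r_wy*S_w*S_y
alpha = S_wx / S_w**2                       # = beta_xw
beta  = alpha - (1/k) * (S_wy / S_w**2)     # = beta_xw - beta_{y_i w}/k_i

# Minimum MSE (3.7), via the squared multiple and partial correlations
r2_y_wx = (r_xy**2 + r_wy**2 - 2*r_xy*r_wy*r_wx) / (1 - r_wx**2)
r2_xy_w = (r_xy - r_wx*r_wy)**2 / ((1 - r_wx**2)*(1 - r_wy**2))
S_min = S_y**2 * (th2*(1 - r2_y_wx) + th1*r2_xy_w*(1 - r_wy**2))
print(k, S_min)
```

For these values the minimum MSE is well below $\theta_2S_{y_i}^2$, the MSE of the plain second-phase mean.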
The covariance between any two components of (3.1) is derived as under:

$$t_i = \bar{y}_{i2} + k_i\left[\left\{\bar{x}_1 + \alpha_i\left(\bar{W} - \bar{w}_1\right)\right\} - \left\{\bar{x}_2 + \beta_i\left(\bar{W} - \bar{w}_2\right)\right\}\right]$$

$$t_j = \bar{y}_{j2} + k_j\left[\left\{\bar{x}_1 + \alpha_j\left(\bar{W} - \bar{w}_1\right)\right\} - \left\{\bar{x}_2 + \beta_j\left(\bar{W} - \bar{w}_2\right)\right\}\right]$$

Using the conventional transformations:

$$t_i - \bar{Y}_i = e_{y_{i2}} + k_i\left(e_{x_1} - e_{x_2}\right) - k_i\alpha_ie_{w_1} + k_i\beta_ie_{w_2}$$

Similarly:

$$t_j - \bar{Y}_j = e_{y_{j2}} + k_j\left(e_{x_1} - e_{x_2}\right) - k_j\alpha_je_{w_1} + k_j\beta_je_{w_2}$$

Now:

$$\left(t_i - \bar{Y}_i\right)\left(t_j - \bar{Y}_j\right) = e_{y_{i2}}e_{y_{j2}} + k_ie_{y_{j2}}\left(e_{x_1} - e_{x_2}\right) - \alpha_ik_ie_{y_{j2}}e_{w_1} + \beta_ik_ie_{y_{j2}}e_{w_2} + k_je_{y_{i2}}\left(e_{x_1} - e_{x_2}\right) + k_ik_j\left(e_{x_1} - e_{x_2}\right)^2 - \alpha_ik_ik_je_{w_1}\left(e_{x_1} - e_{x_2}\right) + \beta_ik_ik_je_{w_2}\left(e_{x_1} - e_{x_2}\right) - \alpha_jk_je_{w_1}e_{y_{i2}} - \alpha_jk_ik_je_{w_1}\left(e_{x_1} - e_{x_2}\right) + \alpha_i\alpha_jk_ik_je_{w_1}^2 + \beta_i\alpha_jk_ik_je_{w_1}e_{w_2} + \beta_jk_je_{w_2}e_{y_{i2}} + \beta_jk_ik_je_{w_2}\left(e_{x_1} - e_{x_2}\right) - \alpha_i\beta_jk_ik_je_{w_1}e_{w_2} + \beta_i\beta_jk_ik_je_{w_2}^2$$

Applying expectation to the above equation, the covariance is $S_{ij} = Cov\left(t_i, t_j\right) = E\left(t_i - \bar{Y}_i\right)\left(t_j - \bar{Y}_j\right)$:

$$S_{ij} = \theta_2S_{y_iy_j} + k_i\left(\theta_1 - \theta_2\right)S_{xy_j} - \theta_1\alpha_ik_iS_{wy_j} + \theta_2\beta_ik_iS_{wy_j} + k_j\left(\theta_1 - \theta_2\right)S_{xy_i} + k_ik_j\left(\theta_2 - \theta_1\right)S_x^2 + \left(\theta_1 - \theta_2\right)\beta_ik_ik_jS_{wx} - \theta_1\alpha_jk_jS_{wy_i} + \theta_1\alpha_i\alpha_jk_ik_jS_w^2 + \theta_1\alpha_j\beta_ik_ik_jS_w^2 + \theta_2\beta_jk_jS_{wy_i} + \left(\theta_1 - \theta_2\right)\beta_jk_ik_jS_{wx} - \theta_1\alpha_i\beta_jk_ik_jS_w^2 + \theta_2\beta_i\beta_jk_ik_jS_w^2 \qquad (3.8)$$

Using (3.4), (3.5) and (3.6) in (3.8) we have:

$$S_{ij} = S_{y_i}S_{y_j}\left[\theta_2\left\{\rho_{y_iy_j} - \frac{\rho_{xy_i}\rho_{xy_j} + \rho_{wy_i}\rho_{wy_j} - \rho_{xy_i}\rho_{wy_j}\rho_{wx} - \rho_{xy_j}\rho_{wy_i}\rho_{wx}}{1 - \rho_{wx}^2}\right\} + \theta_1\rho_{xy_i.w}\rho_{xy_j.w}\sqrt{\left(1 - \rho_{wy_i}^2\right)\left(1 - \rho_{wy_j}^2\right)}\right] \qquad (3.9)$$
The covariance matrix of (3.1) can be written by using (3.7) and (3.9).

MULTIVARIATE ESTIMATOR WITH QUALITATIVE PREDICTORS
In this section the multivariate extension has been proposed by using information on two auxiliary attributes and can be used for simultaneous estimation of several variables. Suppose a first phase random sample of size $n_1$ is available and information on the auxiliary attributes $\tau$ and $\delta$ is recorded. Further, a second phase random sample of size $n_2$ is available and information on $\tau$ and $\delta$ has been collected alongside information on the multiple response variables $Y_1, Y_2, \ldots, Y_p$. Let $\bar{\mathbf{y}}_2$ be the vector of means based upon the second phase sample, $\mathbf{k}$ be a vector of constants and A and B be diagonal matrices with diagonal entries $\gamma_i$ and $\eta_i$ respectively. Based upon this information, the multivariate estimator is defined as:

$$\mathbf{t} = \bar{\mathbf{y}}_2 + \left(\bar{\tau}_1 - \bar{\tau}_2\right)\mathbf{k} + \left(p_\delta - p_{\delta_1}\right)A\mathbf{k} - \left(p_\delta - p_{\delta_2}\right)B\mathbf{k} \qquad (4.1)$$

The i-th component of (4.1) is given as:

$$t_i = \bar{y}_{i2} + k_i\left[\left\{\bar{\tau}_1 + \gamma_i\left(p_\delta - p_{\delta_1}\right)\right\} - \left\{\bar{\tau}_2 + \eta_i\left(p_\delta - p_{\delta_2}\right)\right\}\right] \qquad (4.2)$$

Using the conventional transformation

$$\bar{y}_{i2} = \bar{Y}_i + e_{y_{i2}};\quad \bar{\tau}_1 = \bar{\tau} + e_{\tau_1};\quad \bar{\tau}_2 = \bar{\tau} + e_{\tau_2};\quad p_{\delta_1} = p_\delta - e_{\delta_1};\quad p_{\delta_2} = p_\delta - e_{\delta_2}$$

the estimator (4.2) can be put in the following form:

$$t_i = \bar{Y}_i + e_{y_{i2}} + k_i\left[\left\{\bar{\tau} + e_{\tau_1} + \gamma_i\left(p_\delta - p_\delta + e_{\delta_1}\right)\right\} - \left\{\bar{\tau} + e_{\tau_2} + \eta_i\left(p_\delta - p_\delta + e_{\delta_2}\right)\right\}\right]$$

or

$$t_i - \bar{Y}_i = e_{y_{i2}} + k_i\left(e_{\tau_1} - e_{\tau_2}\right) - k_i\gamma_ie_{\delta_1} + k_i\eta_ie_{\delta_2}$$

Squaring the above equation:

$$\left(t_i - \bar{Y}_i\right)^2 = e_{y_{i2}}^2 + k_i^2\left(e_{\tau_1} - e_{\tau_2}\right)^2 + k_i^2\gamma_i^2e_{\delta_1}^2 + k_i^2\eta_i^2e_{\delta_2}^2 + 2k_ie_{y_{i2}}\left(e_{\tau_1} - e_{\tau_2}\right) - 2k_i\gamma_ie_{y_{i2}}e_{\delta_1} + 2k_i\eta_ie_{y_{i2}}e_{\delta_2} - 2k_i^2\gamma_ie_{\delta_1}\left(e_{\tau_1} - e_{\tau_2}\right) + 2k_i^2\eta_ie_{\delta_2}\left(e_{\tau_1} - e_{\tau_2}\right) - 2k_i^2\gamma_i\eta_ie_{\delta_1}e_{\delta_2}$$

Applying expectation, the mean square error of $t_i$ is $S_i = MSE\left(t_i\right) = E\left(t_i - \bar{Y}_i\right)^2$, or:

$$S_i = \theta_2S_{y_i}^2 + \left(\theta_2 - \theta_1\right)k_i^2S_\tau^2 + \theta_1k_i^2\gamma_i^2S_\delta^2 + \theta_2k_i^2\eta_i^2S_\delta^2 + 2\left(\theta_1 - \theta_2\right)k_iS_{\tau y_i} - 2\theta_1k_i\gamma_iS_{\delta y_i} + 2\theta_2k_i\eta_iS_{\delta y_i} + 2\left(\theta_1 - \theta_2\right)k_i^2\eta_iS_{\delta\tau} - 2\theta_1k_i^2\gamma_i\eta_iS_\delta^2 \qquad (4.3)$$

Optimum values of $\gamma_i$, $\eta_i$ and $k_i$ which minimize $S_i$ can be obtained by differentiating (4.3) with respect to these unknown quantities:

$$\gamma_i = S_{\delta\tau}\big/S_\delta^2 = \beta_{\tau\delta} \qquad (4.4)$$

$$\eta_i = \frac{S_{\delta\tau}}{S_\delta^2} - \frac{1}{k_i}\frac{S_{\delta y_i}}{S_\delta^2} = \beta_{\tau\delta} - \frac{1}{k_i}\beta_{y_i\delta} \qquad (4.5)$$

$$k_i = \left(\frac{\rho_{\tau y_i} - \rho_{\delta\tau}\rho_{\delta y_i}}{1 - \rho_{\delta\tau}^2}\right)\frac{S_{y_i}}{S_\tau} = \beta_{y_i\tau.\delta} \qquad (4.6)$$

Using the values (4.4), (4.5) and (4.6) in (4.3), the minimum mean square error becomes:

$$S_i = S_{y_i}^2\left\{\theta_2\left(1 - \rho_{y_i.\delta\tau}^2\right) + \theta_1\rho_{\tau y_i.\delta}^2\left(1 - \rho_{\delta y_i}^2\right)\right\} \qquad (4.7)$$
The covariance between any two components of (4.1) is derived as under:

$$t_i = \bar{y}_{i2} + k_i\left[\left\{\bar{\tau}_1 + \gamma_i\left(p_\delta - p_{\delta_1}\right)\right\} - \left\{\bar{\tau}_2 + \eta_i\left(p_\delta - p_{\delta_2}\right)\right\}\right]$$

$$t_j = \bar{y}_{j2} + k_j\left[\left\{\bar{\tau}_1 + \gamma_j\left(p_\delta - p_{\delta_1}\right)\right\} - \left\{\bar{\tau}_2 + \eta_j\left(p_\delta - p_{\delta_2}\right)\right\}\right]$$

Using the conventional transformations:

$$t_i - \bar{Y}_i = e_{y_{i2}} + k_i\left(e_{\tau_1} - e_{\tau_2}\right) - k_i\gamma_ie_{\delta_1} + k_i\eta_ie_{\delta_2}$$

Similarly:

$$t_j - \bar{Y}_j = e_{y_{j2}} + k_j\left(e_{\tau_1} - e_{\tau_2}\right) - k_j\gamma_je_{\delta_1} + k_j\eta_je_{\delta_2}$$

Table 1: Eigenvalues of the variance-covariance matrices of the proposed estimator

λ1
θ2\θ1    0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8
0.2      1.76
0.3      2.64    2.64
0.4      3.51    3.52    3.53
0.5      4.39    4.40    4.40    4.41
0.6      5.27    5.27    5.28    5.29    5.29
0.7      6.14    6.15    6.16    6.16    6.17    6.18
0.8      7.02    7.03    7.03    7.04    7.05    7.05    7.06
0.9      7.90    7.90    7.91    7.92    7.92    7.93    7.94    7.94

λ2
θ2\θ1    0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8
0.2      0.94
0.3      1.40    1.42
0.4      1.85    1.88    1.91
0.5      2.31    2.34    2.36    2.39
0.6      2.77    2.79    2.82    2.85    2.87
0.7      3.22    3.25    3.28    3.30    3.33    3.36
0.8      3.68    3.71    3.73    3.76    3.79    3.81    3.84
0.9      4.14    4.16    4.19    4.22    4.24    4.27    4.30    4.32

λ3
θ2\θ1    0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8
0.2      0.08
0.3      0.12    0.12
0.4      0.16    0.16    0.16
0.5      0.20    0.20    0.20    0.20
0.6      0.24    0.24    0.24    0.24    0.24
0.7      0.28    0.28    0.28    0.28    0.28    0.28
0.8      0.32    0.32    0.32    0.32    0.32    0.32    0.32
0.9      0.35    0.35    0.35    0.35    0.35    0.35    0.35    0.35
Now:

$$\left(t_i - \bar{Y}_i\right)\left(t_j - \bar{Y}_j\right) = e_{y_{i2}}e_{y_{j2}} + k_ie_{y_{j2}}\left(e_{\tau_1} - e_{\tau_2}\right) - \gamma_ik_ie_{y_{j2}}e_{\delta_1} + \eta_ik_ie_{y_{j2}}e_{\delta_2} + k_je_{y_{i2}}\left(e_{\tau_1} - e_{\tau_2}\right) + k_ik_j\left(e_{\tau_1} - e_{\tau_2}\right)^2 - \gamma_ik_ik_je_{\delta_1}\left(e_{\tau_1} - e_{\tau_2}\right) + \eta_ik_ik_je_{\delta_2}\left(e_{\tau_1} - e_{\tau_2}\right) - \gamma_jk_je_{\delta_1}e_{y_{i2}} - \gamma_jk_ik_je_{\delta_1}\left(e_{\tau_1} - e_{\tau_2}\right) + \gamma_i\gamma_jk_ik_je_{\delta_1}^2 + \eta_i\gamma_jk_ik_je_{\delta_1}e_{\delta_2} + \eta_jk_je_{\delta_2}e_{y_{i2}} + \eta_jk_ik_je_{\delta_2}\left(e_{\tau_1} - e_{\tau_2}\right) - \gamma_i\eta_jk_ik_je_{\delta_1}e_{\delta_2} + \eta_i\eta_jk_ik_je_{\delta_2}^2$$
Applying expectation to the above equation, the covariance is $S_{ij} = Cov\left(t_i, t_j\right) = E\left(t_i - \bar{Y}_i\right)\left(t_j - \bar{Y}_j\right)$:

$$S_{ij} = \theta_2S_{y_iy_j} + k_i\left(\theta_1 - \theta_2\right)S_{\tau y_j} - \theta_1\gamma_ik_iS_{\delta y_j} + \theta_2\eta_ik_iS_{\delta y_j} + k_j\left(\theta_1 - \theta_2\right)S_{\tau y_i} + k_ik_j\left(\theta_2 - \theta_1\right)S_\tau^2 + \left(\theta_1 - \theta_2\right)\eta_ik_ik_jS_{\delta\tau} - \theta_1\gamma_jk_jS_{\delta y_i} + \theta_1\gamma_i\gamma_jk_ik_jS_\delta^2 + \theta_1\gamma_j\eta_ik_ik_jS_\delta^2 + \theta_2\eta_jk_jS_{\delta y_i} + \left(\theta_1 - \theta_2\right)\eta_jk_ik_jS_{\delta\tau} - \theta_1\gamma_i\eta_jk_ik_jS_\delta^2 + \theta_2\eta_i\eta_jk_ik_jS_\delta^2 \qquad (4.8)$$

Using (4.4), (4.5) and (4.6) in (4.8) we have:

$$S_{ij} = S_{y_i}S_{y_j}\left[\theta_2\left\{\rho_{y_iy_j} - \frac{\rho_{\tau y_i}\rho_{\tau y_j} + \rho_{\delta y_i}\rho_{\delta y_j} - \rho_{\tau y_i}\rho_{\delta y_j}\rho_{\delta\tau} - \rho_{\tau y_j}\rho_{\delta y_i}\rho_{\delta\tau}}{1 - \rho_{\delta\tau}^2}\right\} + \theta_1\rho_{\tau y_i.\delta}\rho_{\tau y_j.\delta}\sqrt{\left(1 - \rho_{\delta y_i}^2\right)\left(1 - \rho_{\delta y_j}^2\right)}\right] \qquad (4.9)$$

The covariance matrix can be written by using (4.7) and (4.9).

Table 2: Eigenvalues of the variance-covariance matrices of the competing estimator

μ1
θ2\θ1    0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8
0.2      2.79
0.3      3.68    5.31
0.4      4.85    5.58    7.88
0.5      6.03    6.28    8.07   10.45
0.6      7.22    7.36    8.37   10.63   13.03
0.7      8.42    8.51    8.97   10.84   13.19   15.60
0.8      9.61    9.69    9.93   11.16   13.38   15.76   18.17
0.9     10.80   10.88   11.03   11.70   13.62   15.94   18.33   20.75

μ2
θ2\θ1    0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8
0.2      2.31
0.3      2.74    3.56
0.4      2.90    4.62    4.78
0.5      3.05    5.24    5.90    5.98
0.6      3.19    5.49    6.93    7.13    7.19
0.7      3.33    5.66    7.65    8.23    8.34    8.40
0.8      3.48    5.81    8.01    9.24    9.47    9.55    9.60
0.9      3.62    5.95    8.23   10.01   10.55   10.69   10.76   10.81

μ3
θ2\θ1    0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8
0.2      0.11
0.3      0.16    0.17
0.4      0.21    0.22    0.23
0.5      0.25    0.27    0.28    0.28
0.6      0.29    0.32    0.33    0.34    0.34
0.7      0.32    0.37    0.38    0.39    0.40    0.40
0.8      0.36    0.41    0.43    0.44    0.45    0.45    0.46
0.9      0.39    0.45    0.48    0.49    0.50    0.51    0.51    0.51

Table 3: Relative efficiency (Σλi/Σμi) of the proposed estimator over the competing estimator

θ2\θ1    0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8
0.2      0.53
0.3      0.63    0.46
0.4      0.69    0.53    0.43
0.5      0.74    0.59    0.49    0.42
0.6      0.77    0.63    0.53    0.46    0.41
0.7      0.80    0.67    0.57    0.50    0.45    0.40
0.8      0.82    0.69    0.60    0.53    0.48    0.43    0.40
0.9      0.84    0.72    0.63    0.56    0.51    0.46    0.43    0.39
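The relative efficiencies of Table 3 are ratios of sums of eigenvalues of the two variance-covariance matrices. A sketch with hypothetical 3×3 matrices (for a symmetric matrix the sum of eigenvalues equals its trace, which gives a quick check):

```python
import numpy as np

# Hypothetical variance-covariance matrices of the proposed and the
# competing estimator (3 response variables); values are illustrative only.
S_proposed  = np.array([[4.4, 1.1, 0.4],
                        [1.1, 2.4, 0.3],
                        [0.4, 0.3, 0.2]])
S_competing = np.array([[6.0, 1.5, 0.5],
                        [1.5, 6.0, 0.4],
                        [0.5, 0.4, 0.3]])

lam = np.linalg.eigvalsh(S_proposed)    # eigenvalues lambda_i
mu  = np.linalg.eigvalsh(S_competing)   # eigenvalues mu_i

# Relative efficiency as used in Table 3: sum(lambda_i)/sum(mu_i);
# values below 1 favour the proposed estimator.
rel_eff = lam.sum() / mu.sum()
print(round(rel_eff, 2))  # → 0.57
```

Since the eigenvalue sums equal the traces, the same ratio can be read directly from the diagonal entries of the two matrices.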
NUMERICAL STUDY

In this section an empirical study is conducted to compare the performance of the proposed estimator with the existing estimator. The ratio of the sums of eigenvalues of the variance-covariance matrices is used to calculate the relative efficiency of the proposed estimator for various values of θ1 and θ2. Table 1 contains the eigenvalues computed from the variance-covariance matrix of the proposed estimator and Table 2 contains the eigenvalues computed from the variance-covariance matrix of the competing estimator. Table 3 shows the efficiency comparison of the proposed multivariate estimator. The entries of Table 3 clearly indicate that the proposed estimator is more efficient than the competing estimator for all combinations of θ1 and θ2.

REFERENCES

1. Ahmed, Z., A.G. Hussin and M. Hanif, 2010. Generalized multivariate regression estimators for multi-phase sampling using multi-auxiliary variables. Pak. J. Statist., 26 (4): 569-583.
2. Hansen, M.H., W.N. Hurwitz and W.G. Madow, 1953. Sample Survey Methods and Theory. New York: Wiley.
3. Ahmed, Z., M. Hanif and M. Ahmad, 2009. Generalized regression-cum-ratio estimators for two-phase sampling using multi-auxiliary variables. Pak. J. Statist., 25 (2): 93-106.
4. Sahoo, J., L.N. Sahoo and S. Mohanty, 1993. A regression approach to estimation in two-phase sampling using two auxiliary variables. Current Science, 65 (1): 73-75.
5. Jhajj, H.S., M.K. Sharma and L.K. Grover, 2006. A family of estimators of population mean using information on auxiliary attribute. Pak. J. Statist., 22 (1): 43-50.
6. Samiuddin, M. and M. Hanif, 2007. Estimation of population mean in single and two phase sampling with or without additional information. Pak. J. Statist., 23 (2): 99.
7. Hanif, M., I. Haq and M.Q. Shahbaz, 2009. On a new family of estimators using multiple auxiliary attributes. World Applied Sciences Journal, 7 (11): 1419-1422.
8. Hanif, M., I. Haq and M.Q. Shahbaz, 2010. Ratio estimators using multiple auxiliary attributes. World Applied Sciences Journal, 8 (1): 133-136.
9. Naik, V.D. and P.C. Gupta, 1996. A note on estimation of mean with known population proportion of an auxiliary character. Jour. Ind. Soc. Agr. Stat., 48 (2): 151-158.
10. Shahbaz, M.Q. and M. Hanif, 2009. A general shrinkage estimator in survey sampling. World Applied Sciences Journal, 7 (5): 593-596.
11. Hanif, M., Z. Ahmed and M. Ahmad, 2009. Generalized multivariate ratio estimator using multi-auxiliary variables for multi-phase sampling. Pak. J. Statist., 25 (4): 615-629.
12. Roy, D.C., 2003. A regression type estimator in two phase sampling using two auxiliary variables. Pak. J. Statist., 19 (3): 281-290.