Bayesian approach to hypothesis testing 1e


A SIMULATION STUDY ILLUSTRATING A BAYESIAN APPROACH TO THE JOHNSON-NEYMAN TECHNIQUE

1. I considered two groups, with one response variable and one covariate for each group.

2. I generated the covariates x1 and x2 for group 1 and group 2, respectively, from a discrete uniform density. Then I generated y1 and y2 for groups 1 and 2, respectively, from a t distribution with 2 degrees of freedom, such that the mean of y1 is α1 + β1 X1 and the mean of y2 is α2 + β2 X2. (I chose a nonnormal distribution, the t, to illustrate that our method also works under nonnormality; admittedly the t is still close to the normal, so next we can generate from another distribution.)

3. I chose α1, β1, α2, and β2 so that the lines 'E(y1) vs. X1' and 'E(y2) vs. X2' intersect at some X0, where E(yj) = αj + βj Xj, j = 1, 2 (see the Figure below; the true crossing point is worked out just after this list). I generated 50 random samples for each group (that is, n1 = n2 = 50); see the MATLAB code below.

4. The aim is to construct a Bayesian algorithm such that the JN technique can determine the intersection point irrespective of the underlying true distribution (or, more generally, determine the X values at which the two lines are close).

5. The distance is defined as d(X0) = |(α1 + β1 X0) - (α2 + β2 X0)|. This is a function of the parameters, so it is also treated as a parameter in the MCMC, and a Markov chain is constructed for it (see the WinBUGS code implementing the Gibbs sampling).

6. In this code I modeled y1 and y2 as t with dof = 5 (chosen arbitrarily, just for illustration). There are two ways the y's could be modeled here: 1) conduct an exploratory data analysis on y before the Bayesian part, try to fit a distribution, and use that distribution to model y in WinBUGS; or 2) leave it nonparametric, that is, leave the distribution unspecified and use a Dirichlet process as a prior for the unknown distribution (I will try this soon).

7. For each d(X0) (where the X0 are the unique X values) we construct a 100(1 - α*)% posterior interval, where α* is the significance level adjusted for multiple comparisons. (It is a multiple comparison problem because, for each X0, we test whether the posterior interval for d(X0) includes 0.) A post-processing sketch of such an adjusted check is given after the WinBUGS data below.

8. The results are given below under 'Bayesian Posterior Inference'. As seen there, the posterior interval for d[2] almost covers 0. This interval corresponds to the second smallest unique value in X0, which is 2, and that is exactly where the two lines intersect, as seen in the figure above. That means our version of the JN technique could determine the truth about the intersection point. (The posterior intervals here are unadjusted, though; I will adjust them by applying one of the multiple hypothesis testing procedures, such as Benjamini-Hochberg or Šidák, to control the FDR or FWER.)
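
As a quick check on item 3, the crossing point of the two mean lines follows from setting α1 + β1 X = α2 + β2 X. With the coefficient values used in the data-generation code below (α1 = 1, β1 = 2, α2 = 9, β2 = -2) this gives X0 = (α2 - α1)/(β1 - β2) = 8/4 = 2, which is where item 8 finds the intersection. In MATLAB:

% True crossing point of the two mean lines
% (coefficients as in the data-generation code below)
alpha1 = 1;  beta1 = 2;
alpha2 = 9;  beta2 = -2;
x_cross = (alpha2 - alpha1) / (beta1 - beta2)   % returns 2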

MATLAB CODE FOR DATA GENERATION

k = 2;              % number of groups
nk = [50; 50];      % size of each group
alpha1 = 1; alpha2 = 9;
beta1 = 2;  beta2 = -2;
error1 = trnd(2,nk(1),1);
error2 = trnd(2,nk(2),1);
x1 = unidrnd(10,nk(1),1);    % generating from discrete uniform(N=10)
x2 = unidrnd(10,nk(2),1);
x = [x1 x2];
y1 = alpha1 + beta1 * x1 + error1;
y2 = alpha2 + beta2 * x2 + error2;
y = [y1 y2];
plot(x1,alpha1 + beta1 * x1,x2,alpha2 + beta2 * x2);   % to make sure that the lines intersect somewhere
x0 = unique([unique(x1) ; unique(x2)]);   % unique X values for which the two lines will be compared
nx0 = length(x0);


WinBUGS CODE

model{
  for (j in 1:k){
    for (i in 1:nk[j]) {
      y[i,j] ~ dt(mu[i,j], invsigma2, dof)
      mu[i,j] <- alpha[j] + beta[j]*x[i,j]
    }
  }
  for (z in 1:nx0) {
    d[z] <- abs((alpha[1] + beta[1]*x0[z]) - (alpha[2] + beta[2]*x0[z]))
  }
  dof <- 5
  # Priors
  for (j in 1:k) { alpha[j] ~ dnorm(0, 0.01) }
  for (j in 1:k) { beta[j] ~ dnorm(0, 0.01) }
  invsigma2 ~ dgamma(0.01, 0.01)
}

# Inits
list(alpha=c(0, 1),   beta=c(0, 0),   invsigma2=10)
list(alpha=c(-3, -5), beta=c(3, 5),   invsigma2=5)
list(alpha=c(3, 6),   beta=c(-5, -4), invsigma2=1)

# Data
list(k=2, nk=c(50, 50),
  x=structure(
    .Data=c(2, 10, 5, 7, 7, 8, 7, 7, 7, 8, 1, 6, 2, 6, 8, 9, 9, 4,
            8, 3, 10, 3, 10, 10, 3, 8, 6, 9, 8, 4, 6, 7, 7, 1, 6, 10, 8, 5,
            1, 6, 5, 8, 3, 4, 4, 7, 6, 4, 8, 4, 5, 7, 6, 5, 2, 2, 3, 3,
            3, 10, 5, 7, 3, 7, 3, 4, 9, 1, 6, 6, 10, 1, 7, 3, 3, 2, 2, 5,
            6, 8, 5, 7, 7, 4, 8, 3, 4, 1, 6, 2, 3, 8, 10, 1, 6, 10, 3, 5,
            3, 1),
    .Dim=c(50,2)),
  y=structure(
    .Data=c(4.86056, -10.0096, 12.9951, -4.43806, 16.0147, -6.95658, 14.878, -4.64662, 15.7725, -6.36457,
            1.48779, -3.50043, 6.65265, -1.29018, 18.233, -8.7462, 18.7775, -3.02693, 17.364, 2.18176,
            19.8658, 4.73033, 19.8483, -11.6233, 7.0501, -9.44441, 14.0254, -5.45578, 17.8658, 1.44843,
            18.0153, 0.88879, 14.7538, 6.60737, 12.217, -10.0678, 14.8622, -1.01037, 2.66838, -2.48,
            11.3159, -8.85095, 7.13759, 1.32288, 8.53348, -4.01297, 11.637, 0.990126, 16.4408, 0.635557,
            12.2181, -4.95455, 11.1832, -7.52979, 5.00206, 8.25417, 6.46516, 3.85048, 8.94709, -10.3284,
            14.0052, -3.82095, 7.28534, -4.77397, 6.9602, 4.02857, 20.1227, 6.3772, 13.6937, -3.07142,
            20.9754, 7.92936, 15.0769, 2.96029, 8.1049, 5.42175, 5.84389, -4.93805, 13.1886, -5.76609,
            11.2663, -3.90779, 14.8105, -0.834381, 16.095, 2.85836, 9.25082, 5.16656, 8.56778, 3.7477,
            3.26576, -5.97064, 21.9457, 8.82467, 11.2662, -11.1404, 7.199, -1.31676, 6.14174, 15.9727),
    .Dim=c(50,2)),
  x0=c(1, 2, 3, 4, 5, 6, 7, 8, 9, 10), nx0=10)
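
The adjusted intervals described in item 7 are not produced by WinBUGS directly; below is a minimal post-processing sketch, assuming the CODA draws of d[1], ..., d[nx0] have been exported from WinBUGS and read into a MATLAB matrix dSamples (one row per MCMC draw, one column per d[z]). The variable names are illustrative, and the Šidák adjustment is used here only as one simple way to control the familywise error rate.

% Sketch: Sidak-adjusted simultaneous posterior intervals for d(x0)
% dSamples: (number of draws) x nx0 matrix of posterior draws of d, read
% from the WinBUGS CODA output (the name dSamples is illustrative)
alphaFW   = 0.05;                            % desired familywise level
m         = size(dSamples, 2);               % number of comparisons (unique x0 values)
alphaStar = 1 - (1 - alphaFW)^(1/m);         % Sidak-adjusted per-interval level
bounds    = prctile(dSamples, 100*[alphaStar/2, 1 - alphaStar/2]);   % 2 x m
% d(x0) = |...| is nonnegative, so an interval can only "include 0" when
% its lower endpoint is essentially 0
disp([x0(:) bounds'])    % columns: x0 (from the data-generation code), lower bound, upper bound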

Bayesian Posterior Inference

node    mean     sd       MC error   2.5%     median   97.5%    start   sample
d[1]    3.78     0.64     0.004031   2.531    3.778    5.045    1       30000
d[2]    0.4488   0.3403   0.002112   0.0182   0.3775   1.268    1       30000
d[3]    4.072    0.4606   0.003061   3.161    4.073    4.974    1       30000
d[4]    7.998    0.3979   0.00278    7.218    7.999    8.775    1       30000
d[5]    11.92    0.3664   0.002698   11.21    11.93    12.64    1       30000
d[6]    15.85    0.3742   0.002833   15.12    15.85    16.57    1       30000
d[7]    19.78    0.4191   0.003157   18.97    19.78    20.58    1       30000
d[8]    23.7     0.4911   0.003619   22.75    23.7     24.65    1       30000
d[9]    27.63    0.5801   0.004175   26.51    27.63    28.75    1       30000
d[10]   31.55    0.6795   0.004791   30.24    31.55    32.86    1       30000
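
The check in item 7 can also be read directly off the 2.5% and 97.5% columns above. In the short MATLAB snippet below the bounds are simply copied from the table; it flags the x0 whose (unadjusted) interval comes closest to zero, which is x0 = 2, in line with item 8.

% Unadjusted 95% posterior bounds for d[1],...,d[10], copied from the table above
lo = [2.531 0.0182 3.161 7.218 11.21 15.12 18.97 22.75 26.51 30.24];
hi = [5.045 1.268  4.974 8.775 12.64 16.57 20.58 24.65 28.75 32.86];
x0 = 1:10;
x0(lo <= 0)                 % x0 values whose interval actually reaches 0 (none here)
[~, idx] = min(lo);         % interval whose lower bound is closest to 0
fprintf('Closest to zero: d[%d], i.e. x0 = %d (lower bound = %.4f)\n', idx, x0(idx), lo(idx))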

