A SIMULATION STUDY ILLUSTRATING A BAYESIAN APPROACH TO THE JOHNSON-NEYMAN TECHNIQUE

1. I considered two groups, with one response variable and one covariate for each group.

2. I generated the covariates X1 and X2, for group 1 and group 2 respectively, from a discrete uniform density. Then I generated y1 and y2, for groups 1 and 2 respectively, from a t distribution with 2 degrees of freedom, such that the mean of y1 is α1 + β1 X1 and the mean of y2 is α2 + β2 X2. (I chose a nonnormal distribution such as the t to illustrate that our method works well under nonnormality as well; admittedly the t is still close to the normal, so next we can generate from another distribution.)

3. I chose α1, β1, α2, and β2 so that the plots of ‘E(y1) vs. X1’ and ‘E(y2) vs. X2’ intersect at some X0. (Note that E(yj) = αj + βj Xj, j = 1, 2; see the Figure below.) I generated 50 random samples for each group (that is, n1 = n2 = 50). (See the Matlab code below; a Python sketch of this data-generating step is given right after this item.)
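The Matlab code itself is not reproduced here; the following is a minimal Python sketch of the same data-generating step. The specific values of α1, β1, α2, β2 and the covariate support (and hence the intersection point X0 = 5) are illustrative assumptions, not the values used in the original code.

    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative parameter choices (assumptions): the mean lines
    # E(y1) = a1 + b1*X and E(y2) = a2 + b2*X intersect where
    # (a1 - a2) + (b1 - b2)*X = 0, i.e. at X0 = (a2 - a1)/(b1 - b2) = 5 here.
    a1, b1 = 1.0, 2.0
    a2, b2 = 6.0, 1.0
    n1 = n2 = 50

    # Covariates from a discrete uniform density on {1, ..., 10}.
    x1 = rng.integers(1, 11, size=n1).astype(float)
    x2 = rng.integers(1, 11, size=n2).astype(float)

    # Responses from a t distribution with 2 degrees of freedom, shifted so
    # that E(y1) = a1 + b1*X1 and E(y2) = a2 + b2*X2.
    y1 = a1 + b1 * x1 + rng.standard_t(df=2, size=n1)
    y2 = a2 + b2 * x2 + rng.standard_t(df=2, size=n2)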
4. The aim is to construct a Bayesian algorithm such that the JN technique is able to determine the intersection point irrespective of the underlying true distribution (or, in a more general sense, to determine the X points at which the two lines are close).

5. The distance is defined as d(X0) = (α1 + β1 X0) - (α2 + β2 X0), the difference between the two group means at X0, so that d(X0) = 0 exactly at the intersection point. Since d(X0) is a function of the parameters, it is itself treated as a parameter in the MCMC, and a Markov chain is constructed for it. (See the WinBUGS code implementing the Gibbs sampling.)

6. In this code I modeled y1 and y2 as t with dof = 5 (I chose 5 arbitrarily, for the sake of illustration). There may be two ways to model the y's here: 1) conduct an exploratory data analysis on y before the Bayesian part, try to fit a distribution, and use this distribution to model y in WinBUGS; or 2) leave it nonparametric, that is, leave the distribution unspecified and use a Dirichlet process as a prior for the unknown distribution (I will try this soon). (A Python sketch of the MCMC and interval steps is given after item 7.)

7. For each d(X0) (where X0 represents the unique X values) we construct a 100(1-α*)% posterior interval, where α* is the significance level adjusted for multiple comparisons. (It
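The WinBUGS code is likewise not reproduced here. As a minimal sketch of items 5-7 in Python, continuing from the simulated data above: the likelihood is t with dof = 5 as in item 6, but a random-walk Metropolis sampler stands in for WinBUGS's Gibbs sampling, and the flat priors, proposal scale, burn-in, and the Bonferroni-style adjustment α* = α/K over the K unique X values are all illustrative assumptions (the note does not specify which multiplicity adjustment is intended).

    import numpy as np
    from scipy import stats

    def log_post(theta, x1, y1, x2, y2, df=5):
        # t likelihood with dof = 5 for each group, as in item 6; flat priors
        # on the coefficients and log-scales (an illustrative assumption).
        a1, b1, a2, b2, ls1, ls2 = theta
        ll1 = stats.t.logpdf(y1, df, loc=a1 + b1 * x1, scale=np.exp(ls1)).sum()
        ll2 = stats.t.logpdf(y2, df, loc=a2 + b2 * x2, scale=np.exp(ls2)).sum()
        return ll1 + ll2

    def metropolis(x1, y1, x2, y2, n_iter=20000, step=0.05, seed=2):
        rng = np.random.default_rng(seed)
        # Start at the least-squares fits so the chain begins near the mode.
        b1_hat, a1_hat = np.polyfit(x1, y1, 1)
        b2_hat, a2_hat = np.polyfit(x2, y2, 1)
        theta = np.array([a1_hat, b1_hat, a2_hat, b2_hat, 0.0, 0.0])
        lp = log_post(theta, x1, y1, x2, y2)
        draws = np.empty((n_iter, 6))
        for i in range(n_iter):
            prop = theta + step * rng.standard_normal(6)
            lp_prop = log_post(prop, x1, y1, x2, y2)
            if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
                theta, lp = prop, lp_prop
            draws[i] = theta
        return draws[n_iter // 2:]  # discard the first half as burn-in

    draws = metropolis(x1, y1, x2, y2)  # x1, y1, x2, y2 from the sketch above

    # Item 5: d(X0) is a function of the parameters, so its Markov chain is
    # just the difference of the two mean lines evaluated at each draw.
    x_grid = np.unique(np.concatenate([x1, x2]))
    d_chain = (draws[:, [0]] + draws[:, [1]] * x_grid) \
              - (draws[:, [2]] + draws[:, [3]] * x_grid)

    # Item 7: 100(1 - alpha*)% posterior intervals for d(X0) at each unique
    # X value, with alpha* = alpha / K (a Bonferroni-style choice, assumed).
    alpha = 0.05
    alpha_star = alpha / x_grid.size
    lo = np.quantile(d_chain, alpha_star / 2, axis=0)
    hi = np.quantile(d_chain, 1 - alpha_star / 2, axis=0)

    # X values whose adjusted interval covers 0: the two lines are not
    # distinguishable there (the JN-style region sought in item 4).
    print("lines 'close' at X =", x_grid[(lo <= 0) & (0 <= hi)])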