It’s Not Where You Start, It’s How You Finish: Predicting Law School and Bar Success - Appendixes


It’s Not Where You Start, It’s How You Finish: Predicting Law School and Bar Success
National Report of Findings for the AccessLex/LSSSE Bar Exam Success Initiative

APPENDIXES FOR WORKING PAPER (Last Updated June 8, 2021)

Aaron N. Taylor*, Jason M. Scott*, and Josh Jackson

* These authors contributed equally to this work.

CONTENTS

A. Supplemental Appendix
   A.I. Regression Outputs
      Table A.I.1 Full Table of Effects of LSAT, UGPA, and Control Variables on Three LGPA Measures
      Table A.I.2 Full Table of Effects of Academic Performance and Control Variables on Bar Passage
      Table A.I.3 Full Table of Effects of Student Engagement Factors and Control Variables on 3L LGPA
      Table A.I.4 Full Table of Effects of Student Engagement Factors and Control Variables on Odds of Bar Passage
   A.II Sample Characteristics
      Table A.II.1 School Participation in the Study
      Table A.II.2 Summary Statistics
      Table A.II.5 Correlation Matrix of LGPA Variables
      Figure A.II.3 First-Time Bar Passage (Percent) by Race and Gender
      Figure A.II.4 Median LSAT and UGPA Since 2011
      Figure A.II.5 First-Time Bar Passage (Percent)
   A.3 Description of All Student Engagement Variables
      Table A.3.1 List and Description of All Student Engagement Variables and Their Component Survey Questions

B. Technical Appendix
   B.I. LSSSE Survey
   B.II. Schools with Missing Data
   B.III. Selection of Engagement Factors
   B.IV. Addressing Concerns About Using LGPA and LGPA Growth


A. SUPPLEMENTAL APPENDIX

A.I. Regression Outputs

Table A.I.1 Full Table of Effects of LSAT, UGPA, and Control Variables on Three LGPA Measures
Coefficients and 95 Percent Confidence Intervals

Law School GPA: First-Semester (n = 3,938)
LSAT: 0.380*** (0.350, 0.410); UGPA: 0.268*** (0.238, 0.297); Male: -0.071** (-0.128, -0.014); Asian: -0.311*** (-0.418, -0.205); Black: -0.175*** (-0.290, -0.060); Hispanic: -0.149*** (-0.241, -0.056); All Other Races: -0.360*** (-0.566, -0.155); Age 35+: -0.201*** (-0.316, -0.087); Graduation Year 2019: 0.011 (-0.046, 0.068)

Law School GPA: First-Year (n = 3,941)
LSAT: 0.393*** (0.363, 0.422); UGPA: 0.291*** (0.262, 0.320); Male: -0.079*** (-0.135, -0.023); Asian: -0.370*** (-0.475, -0.266); Black: -0.250*** (-0.363, -0.138); Hispanic: -0.187*** (-0.278, -0.097); All Other Races: -0.368*** (-0.568, -0.168); Age 35+: -0.146** (-0.258, -0.034); Graduation Year 2019: 0.045 (-0.011, 0.101)

Law School GPA: Final (n = 4,223)
LSAT: 0.340*** (0.312, 0.368); UGPA: 0.309*** (0.282, 0.337); Male: -0.074*** (-0.128, -0.019); Asian: -0.355*** (-0.454, -0.256); Black: -0.350*** (-0.455, -0.245); Hispanic: -0.203*** (-0.290, -0.116); All Other Races: -0.324*** (-0.512, -0.136); Age 35+: -0.020 (-0.125, 0.084); Graduation Year 2019: 0.053* (-0.001, 0.108)

Law School GPA: Growth (n = 3,938)
LSAT: 0.055*** (0.036, 0.074); UGPA: 0.112*** (0.094, 0.130); Male: -0.042** (-0.076, -0.008); Asian: -0.131*** (-0.194, -0.068); Black: -0.200*** (-0.268, -0.132); Hispanic: -0.101*** (-0.156, -0.047); All Other Races: -0.085 (-0.207, 0.037); Age 35+: 0.091*** (0.023, 0.159); Graduation Year 2019: 0.058*** (0.024, 0.092); First-Semester LGPA: -0.224*** (-0.243, -0.205)

Transfer Student: 0.175 (-0.347, 0.696); 0.284*** (0.172, 0.396)
Log Likelihood: -4,989.333; -5,351.400; -5,063.274; -3,001.225
Akaike Inf. Crit.: 10,000.670; 10,724.800; 10,146.550; 6,024.451
Bayesian Inf. Crit.: 10,069.740; 10,794.630; 10,209.330; 6,093.513

Note: *p<0.1; **p<0.05; ***p<0.01; law school GPAs have been standardized within each school.



Table A.I.2 Full Table of Effects of Academic Performance and Control Variables on Bar Passage
Odds Ratios and 95 Percent Confidence Intervals

Primary Predictor: LSAT and UGPA (n = 3,975)
LSAT Score: 1.708*** (1.562, 1.868); UGPA: 1.442*** (1.328, 1.567); Male: 1.089 (0.922, 1.287); Asian: 0.628*** (0.475, 0.829); Black: 0.645*** (0.484, 0.860); Hispanic: 0.804* (0.629, 1.029); All Other Races: 0.392*** (0.228, 0.673); Age 35 or Older: 0.756* (0.560, 1.021); California Bar Taker: 0.214*** (0.106, 0.431)

Primary Predictor: 1S (First-Semester) LGPA (n = 3,721)
First Semester LGPA: 5.971*** (5.131, 6.950); LSAT Score: 1.257*** (1.126, 1.404); UGPA: 1.202*** (1.088, 1.327); Male: 1.172 (0.968, 1.419); Asian: 0.834 (0.605, 1.149); Black: 0.772 (0.552, 1.081); Hispanic: 0.940 (0.710, 1.244); All Other Races: 0.590 (0.301, 1.155); Age 35 or Older: 0.901 (0.631, 1.287); California Bar Taker: 0.194*** (0.086, 0.437)

Primary Predictor: 1L (First-Year) LGPA (n = 3,725)
First Year LGPA: 5.557*** (4.840, 6.381); LSAT Score: 1.189*** (1.061, 1.333); UGPA: 1.140** (1.029, 1.263); Male: 1.195* (0.983, 1.453); Asian: 0.896 (0.644, 1.245); Black: 0.912 (0.644, 1.291); Hispanic: 1.005 (0.754, 1.340); All Other Races: 0.579 (0.292, 1.146); Age 35 or Older: 0.882 (0.612, 1.272); California Bar Taker: 0.221*** (0.098, 0.500)

Primary Predictor: Final LGPA (n = 3,975)
Final LGPA: 3.385*** (2.968, 3.755); LSAT Score: 1.207*** (1.084, 1.344); UGPA: 1.018 (0.921, 1.125); Male: 1.190* (0.981, 1.445); Asian: 0.860 (0.622, 1.188); Black: 0.960 (0.689, 1.338); Hispanic: 1.039 (0.783, 1.381); All Other Races: 0.403*** (0.210, 0.773); Age 35 or Older: 0.662** (0.469, 0.935); California Bar Taker: 0.146*** (0.066, 0.322)

Primary Predictor: LGPA Growth (n = 3,721)
LGPA Growth: 5.435*** (4.454, 6.632); First Semester LGPA: 4.235*** (3.718, 4.823); LSAT Score: 1.166** (1.035, 1.313); UGPA: 1.015 (0.911, 1.131); Male: 1.291** (1.052, 1.583); Asian: 0.962 (0.684, 1.354); Black: 1.050 (0.734, 1.504); Hispanic: 1.082 (0.804, 1.456); All Other Races: 0.500* (0.243, 1.026); Age 35 or Older: 0.750 (0.515, 1.093); California Bar Taker: 0.155*** (0.065, 0.373)

Log Likelihood: -1,784.700; -1,382.665; -1,315.750; -1,336.424; -1,213.998
Akaike Inf. Crit.: 3,587.399; 2,785.33; 2,651.499; 2,692.847; 2,449.997
Bayesian Inf. Crit.: 3,641.481; 2,844.835; 2,711.015; 2,752.938; 2,515.452

Note: *p<0.1; **p<0.05; ***p<0.01; LGPAs are standardized within each school.
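For readers less familiar with how entries of this kind are produced, the hedged sketch below fits a logistic regression on simulated data and exponentiates the coefficients and confidence bounds to obtain odds ratios like those reported in Table A.I.2. The variable names and data are illustrative placeholders, not the study's actual specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the bar-passage file (placeholder column names).
rng = np.random.default_rng(2)
df = pd.DataFrame({"final_lgpa_z": rng.normal(size=1000),
                   "lsat_z": rng.normal(size=1000)})
p = 1 / (1 + np.exp(-(1.0 + 1.2 * df["final_lgpa_z"] + 0.2 * df["lsat_z"])))
df["bar_passed"] = rng.binomial(1, p)

# Fit the logit model, then exponentiate to report odds ratios per 1 SD.
fit = smf.logit("bar_passed ~ final_lgpa_z + lsat_z", data=df).fit(disp=False)
odds_ratios = np.exp(fit.params)
or_ci = np.exp(fit.conf_int())          # 95 percent confidence intervals
print(odds_ratios.round(2), or_ci.round(2), sep="\n")
```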



Table A.I.3 Full Table of Effects of Student Engagement Factors and Control Variables on 3L LGPA
Coefficients and 95 Percent Confidence Intervals

Student Engagement Variables Included: LSSSE Engagement Indicators Only (n = 1,461)

Learning to think like a lawyer: Quite a bit; Learning to think like a lawyer: Very much; Student advising: Satisfied; Student advising: Very satisfied; Student-faculty interaction: Often; Student-faculty interaction: Very often; Law school environment: Often; Law school environment: Very often; Emphasis on academics: Quite a bit; Emphasis on academics: Very much; Supportive relationships; Acquired broad legal education: Quite a bit; Acquired broad legal education: Very much; Developed practical skills: Quite a bit; Developed practical skills: Very much; Diverse knowledge displayed: Often; Diverse knowledge displayed: Very often

School-Related Factors (n = 1,459)

Student-Centered Factors (n = 1,413)

0.139* (-0.004, 0.282) 0.221*** (0.075, 0.367) -0.030 (-0.139, 0.079) 0.078 (-0.085, 0.240) 0.101 * (-0.001, 0.203) 0.235*** (0.082, 0.386) 0.068 (-0.042, 0.178) -0.064 (-0.238, 0.109) -0.113* (-0.246, 0.019) -0.179* (-0.360, -0.001) 0.018** (0.003, 0.034) 0.164** (0.015, 0.314) 0.105 (-0.059, 0.270) 0.017 (-0.113, 0.147) 0.096 (-0.065, 0.257) 0.094* (-0.016, 0.205) 0.068 (-0.073, 0.209)


All Student Engagement Factors (n = 1,288)
0.016 (-0.140, 0.172) 0.041 (-0.126, 0.209) -0.096 (-0.214, 0.021) -0.125 (-0.301, 0.052) -0.025 (-0.137, 0.088) 0.025 (-0.147, 0.198) -0.023 (-0.151, 0.105) -0.123 (-0.341, 0.095) -0.083 (-0.227, 0.061) -0.103 (-0.317, 0.111) 0.014 (-0.003, 0.031) 0.104 (-0.056, 0.263) 0.014 (-0.161, 0.188) -0.018 (-0.158, 0.122) 0.030 (-0.145, 0.205) 0.002 (-0.121, 0.118) -0.179** (-0.338, -0.020)


Table A.I.3 Cont.

School satisfaction: Satisfied

0.170** (0.021, 0.318) 0.388*** (0.209, 0.566) -0.097 (-0.219, 0.025) -0.174*** (-0.298, -0.049)

School satisfaction: Very satisfied Amount of law school debt:20-100k Amount of law school debt:100k+ Coursework difficulty: Challenging Coursework difficulty: Very Challenging Class participation: Often Class participation: Very Often Preparation for class (hrs/wk): 21-30 hrs Preparation for class: 31+ hrs Came to class unprepared: Sometimes Came to class unprepared: Never Collaboration: Often/Very often Extracurricular legal experience (hrs/wk): 1-10 hrs Extracurricular legal experience: 11-20 hrs Extracurricular legal experience: 21+ hrs Self-Care (hrs/wk): 11-25 hrs Self-Care: 26+ hrs Other responsibilities (hrs/wk): 6-20 hrs Other responsibilities: 21+ hrs LSAT score

0.293*** (0.244, 0.342)

0.283*** (0.234, 0.333)


0.215*** (0.109, 0.320) 0.315*** (0.156, 0.475) 0.187*** (0.080, 0.295) 0.445*** (0.325, 0.564) -0.119** (-0.233, -0.006) -0.132** (-0.246, -0.019) 0.200*** (0.076, 0.324) 0.432*** (0.276, 0.588) 0.065 (-0.032, 0.162) -0.032 (-0.1450, 0.085) -0.085 (-0.208, 0.038) -0.107 (-0.243, 0.030) -0.081 (-0.184, 0.021) -0.072 (-0.222, 0.078) 0.004 (-0.103, 0.095) -0.203*** (-0.338, -0.069) 0.288*** (0.239, 0.338)

0.218*** (0.062, 0.375) 0.427*** (0.236, 0.618) -0.057 (-0.183, 0.069) -0.110* (-0.240, 0.020) 0.167*** (0.050, 0.283) 0.288*** (0.111, 0.466) 0.174*** (0.058, 0.290) 0.421*** (0.290, 0.551) -0.119* (-0.237, -0.000) -0.128** (-0.249, -0.007) 0.180*** (0.050, 0.310) 0.435*** (0.272, 0.599) 0.079 (-0.030, 0.187) -0.040 (-0.163, 0.084) -0.105 (-0.232, 0.023) -0.094 (-0.238, 0.050) -0.077 (-0.186, 0.033) -0.047 (-0.207, 0.113) -0.003 (-0.107, 0.100) -0.188** (-0.331, -0.044) 0.286*** (0.234, 0.337)


Table A.I.3 Cont.

UGPA

0.269*** (0.222, 0.316)

0.285*** (0.238, 0.331)

0.257*** (0.210, 0.303)

-0.280*** (-0.463, -0.097) -0.311*** (-0.514, -0.109) -0.215** (-0.379, -0.051) -0.235 (-0.546, 0.077)

-0.244** (-0.433, -0.055) -0.267** (-0.471, -0.063) -0.219** (-0.385, -0.052) -0.201 (-0.520, 0.118)

-0.068 (-0.173, 0.038)

-0.065 (-0.170, 0.040)

-0.231** (-0.419, -0.043) -0.293*** (-0.489, -0.096) -0.171** (-0.333, -0.008) -0.180 (-0.502, 0.143) -0.214** (-0.387, -0.042) -0.050 (-0.156, 0.056)

Male Asian Black Hispanic All Other Races Age: 35 or older First generation law student Graduation Year: 2019 Amount of Law School Debt (control) Log Likelihood Akaike Inf. Crit. Bayesian Inf. Crit.

-0.028*** (-0.043, -0.014) -1,843.531 3,721.061 3,810.938

0.254*** (0.204, 0.304) -0.102** (-0.201, -0.002) -0.184* (-0.380, 0.012) -0.331*** (-0.543, -0.119) -0.175** (-0.344, -0.007) -0.149 (-0.479, 0.181) -0.233** (-0.420, -0.047) -0.063 (-0.175, 0.048) 0.028 (-0.072, 0.128)

-0.019*** (-0.033, -0.005) -1,832.297 3,706.594 3,817.589

-1,730.593 3,513.185 3,649.775

Note: *p<0.10; **p<0.05; ***p<0.01; 3L LGPA is standardized within each school.


-1,549.844 3,195.689 3,443.409


Table A.I.4 Full Table of Effects of Student Engagement Factors and Control Variables on Odds of Bar Passage
Odds Ratios and 95 Percent Confidence Intervals

Student engagement variables included: LSSSE Engagement Indicators Only (n = 1,451)

Learning to think like a lawyer: Quite a bit; Learning to think like a lawyer: Very much; Student advising: Satisfied; Student advising: Very satisfied; Student-faculty interaction: Often; Student-faculty interaction: Very often; Law school environment: Often; Law school environment: Very often

School-Related Factors (n = 1,408)

1.061 (0.683, 1.647) 0.939 (0.598, 1.474) 1.024 (0.726, 1.445) 0.910 (0.553, 1.498) 0.967 (0.705, 1.327) 0.954 (0.595, 1.531) 1.060 (0.750, 1.498) 1.062 (0.621, 1.815)

Emphasis on academics: Quite a bit

0.866 (0.567, 1.322) 0.610* (0.343, 1.086) 1.012 (0.963, 1.062) 1.519* (0.964, 2.393) 1.238 (0.747, 2.051) 1.405* (0.945, 2.090) 1.560* (0.941, 2.588) 1.166 (0.825, 1.647) 1.120 (0.709, 1.769)

Emphasis on academics: Very much Supportive relationships Acquired broad legal education: Quite a bit Acquired broad legal education: Very much Practical skills development: Quite a bit Practical skills development: Very much Diverse knowledge displayed: Often Diverse knowledge displayed: Very often


Student-Centered Factors (n = 1,366)


Table A.I.4 Cont.

School satisfaction: Satisfied

0.922 (0.584, 1.456) 1.310 (0.746, 2.303) 0.972 (0.632, 1.495) 0.890 (0.598, 1.325)

School satisfaction: Very satisfied Amount of law school debt:20-100k Amount of law school debt:100k+ Coursework difficulty: Challenging Coursework difficulty: Very Challenging Class participation: Often Class participation: Very Often Preparation for class (hrs/wk): 21-30 hrs Preparation for class: 31+ hrs Came to class unprepared: Sometimes Came to class unprepared: Never Collaboration: Often/Very often Extracurricular legal experience (hrs/wk): 1-10 hrs Extracurricular legal experience: 11-20 hrs Extracurricular legal experience: 21+ hrs Self-Care (hrs/wk): 11-25 hrs Self-Care: 26+ hrs Other responsibilities (hrs/wk): 6-20 hrs Other responsibilities: 21+ hrs LSAT score

1.817*** (1.547, 2.135)


1.808*** (1.529, 2.137)

0.981 (0.690, 1.396) 1.268 (0.717, 2.245) 1.088 (0.771, 1.536) 2.053*** (1.355, 3.111) 0.785 (0.535, 1.151) 0.637** (0.436, 0.930) 1.265 (0.836, 1.915) 1.622* (0.950, 2.769) 1.269 (0.922, 1.748) 1.404* (0.940, 2.096) 1.484* (0.973, 2.263) 1.415 (0.911, 2.198) 1.057 (0.753, 1.484) 1.369 (0.817, 2.292) 0.899 (0.637, 1.268) 0.630** (0.414, 0.958) 1.796*** (1.507, 2.141)


Table A.I.4 Cont.

UGPA Asian Black Hispanic All Other Races

1.510*** (1.307, 1.745) 0.472*** (0.285, 0.780) 0.564** (0.329, 0.966) 0.560** (0.354, 0.887) 0.245*** (0.107, 0.561)

0.974 (0.934, 1.016) 0.857 (0.188, 3.908)

0.667 (0.149, 2.997)

1.503*** (1.291, 1.751) 0.604* (0.347, 1.053) 0.554** (0.314, 0.976) 0.611* (0.372, 1.004) 0.330** (0.130, 0.841) 1.179 (0.826, 1.680) 0.980 (0.937, 1.025) 0.891 (0.188, 4.220)

-557.560 1,147.121 1,227.486

-532.530 1,107.06 1,211.758

-497.985 1,045.971 1,170.192

First Generation Law Student; Amount of Law School Debt (control); California Bar Takers; Log Likelihood; Akaike Inf. Crit.; Bayesian Inf. Crit.

Note: *p<0.10; **p<0.05; ***p<0.01.


1.511*** (1.302, 1.755) 0.465*** (0.273, 0.791) 0.503** (0.286, 0.887) 0.510*** (0.317, 0.819) 0.250*** (0.104, 0.601) 1.069 (0.766, 1.493)


A.II Sample Characteristics

Table A.II.1 School Participation in the Study

School ID   Total Obs.   LSSSE Responses   Response Rate (%)   Admin. Data Year(s) (AY2017/AY2018/Both)   LSSSE Survey Year(s) (AY2017/AY2019/Both)
1        448     203     45.31   Both     Both
2        770     216     28.05   Both     Both
3        352     64      18.18   Both     AY2017
4        272     90      33.09   Both     Both
5        80      54      67.50   Both     Both
6        201     103     51.24   Both     Both
7        106     53      50.00   Both     Both
8        110     66      60.00   Both     Both
9        416     183     43.99   Both     Both
10       128     81      63.28   Both     Both
11       92      45      48.91   Both     Both
12       118     60      50.85   Both     Both
13       507     214     42.21   Both     Both
14       287     193     67.25   Both     Both
15       46      27      58.70   Both     Both
16       104     41      39.42   AY2017   AY2017
17       164     98      59.76   Both     Both
18       95      52      54.74   Both     Both
19       262     99      37.79   Both     Both
21       164     83      50.61   AY2018   AY2018
Overall  4,722   2,025   42.88   –        –

Note: School 20 was excluded from the sample; School 21 delayed its second year of participation until 2021.



Table A.II.2 Summary Statistics

                       Obs.    Mean     Median   St. Dev.   Min.     Max.
Full Sample (n = 4,722)
LSAT                   4,479   154.39   154.00   6.04       130.00   174.00
UGPA                   4,602   3.30     3.36     0.40       1.82     4.17
First-Semester LGPA    4,313   3.10     3.13     0.47       1.16     4.29
First-Year LGPA        4,316   3.11     3.14     0.44       1.67     4.24
Second-Year LGPA       4,514   3.21     3.23     0.38       2.00     4.25
Third-Year LGPA        4,552   3.32     3.34     0.36       1.65     4.47
Final LGPA             4,684   3.27     3.28     0.36       2.04     4.23
LGPA Growth            4,313   0.17     0.15     0.26       -0.86    1.79
Bar Exam Result        4,413   0.75     –        –          –        –
LSSSE Respondents (n = 2,025)
LSAT                   1,977   154.53   155.00   6.21       130.00   172.00
UGPA                   1,921   3.32     3.36     0.41       1.82     4.17
First-Semester LGPA    1,890   3.11     3.14     0.49       1.54     4.29
First-Year LGPA        1,892   3.12     3.15     0.46       1.67     4.24
Second-Year LGPA       1,966   3.23     3.24     0.39       2.00     4.25
Third-Year LGPA        1,979   3.33     3.35     0.36       2.06     4.38
Final LGPA             2,012   3.28     3.30     0.36       2.13     4.23
LGPA Growth            1,890   0.16     0.14     0.25       -0.69    1.25
Bar Exam Result        1,922   0.75     –        –          –        –

Note: Differences in means between the full sample and the LSSSE subsample are not statistically significant at the 0.05 level; for bar passage, the difference in success rate was not statistically significant.

Table A.II.5 Correlation Matrix of LGPA Variables

                 First-Semester   First-Year   Third-Year   Final   Growth
First-Semester   1.00             0.93         0.76         0.84    -0.29
First-Year                        1.00         0.82         0.91    -0.03
Third-Year                                     1.00         0.97    0.37
Final                                                       1.00    0.28
Growth                                                              1.00


Figure A.II.3 First-Time Bar Passage (Percent) by Race and Gender



Figure A.II.4 Median LSAT and UGPA Since 2011

Respondent Schools and All ABA Approved Schools

Source: AccessLex Institute (2020), Admissions [Data set], available from http://analytix.accesslex.org/DataSet. Note: Figures represent the median of the medians for each individual ABA law school for each admitted class
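A small sketch of the "median of the medians" calculation described in the note is shown below, using invented school-level figures and placeholder column names rather than the actual Analytix data layout.

```python
import pandas as pd

# Hypothetical school-level admissions data: one row per school per entering
# class, holding that school's median LSAT and UGPA (placeholder names).
admissions = pd.DataFrame({
    "year":        [2011, 2011, 2011, 2012, 2012, 2012],
    "school":      ["A", "B", "C", "A", "B", "C"],
    "median_lsat": [155, 151, 160, 154, 150, 159],
    "median_ugpa": [3.4, 3.2, 3.6, 3.4, 3.1, 3.6],
})

# "Median of the medians": the median across schools of each school's median,
# computed separately for every admitted class.
print(admissions.groupby("year")[["median_lsat", "median_ugpa"]].median())
```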

Figure A.II.5 First-Time Bar Passage (Percent)

Respondent Schools and All ABA Approved Schools

Source: AccessLex Institute (2020), First-Time and Ultimate Bar Passage (School-Level) [Data set], available from http://analytix.accesslex.org/DataSet. Note: first-time bar passage rates represent both July and February administrations



A.3 Description of All Student Engagement Variables

Table A.3.1 List and Description of All Student Engagement Variables and Their Component Survey Questions

LSSSE Variable Description

Learning to Think Like a Lawyer measures students' ability to think critically, think analytically, and effectively process information from different contexts and frameworks (LSSSE, 2013). These skills (coded as LTTLL) are deemed essential for practicing as an effective attorney.

Law School Environment captures students' perceptions of how much their law school emphasizes the support they need to thrive, including contact among students from different backgrounds, help coping with non-academic responsibilities, academic and financial support, and campus events and activities. Students who report high scores on this measure (coded as LSE) perceive an environment that (in theory) should make them more likely to excel in law school and pass the bar exam (LSSSE, 2013).

Student Advising assesses the quality and quantity of advisory services such as academic counseling and career advising offered by law schools. Combined, these services (coded as SA) may affect students’ satisfaction as well as engagement in their legal education generally (LSSSE, 2013).

LSSSE Survey Questions During the current school year, how much has your coursework emphasized analyzing the basic elements of an idea, experience, or theory? During the current school year, how much has your coursework emphasized synthesizing and organizing ideas, information, or experiences? During the current school year, how much has your coursework emphasized making judgments about the value of information, arguments, or methods? During the current school year, how much has your coursework emphasized applying theories or concepts to practical problems or in new situations? To what extent does your law school emphasize encouraging contact among students from different economic, social, sexual orientation, and racial or ethnic backgrounds? To what extent does your law school emphasize providing support you need to thrive socially? To what extent does your law school emphasize helping you cope with non-academic responsibilities? To what extent does your law school emphasize providing the support students need to succeed academically? To what extent does your law school emphasize attending campus events and activities? To what extent does your law school emphasize providing the financial counseling students need to afford their education? In your experience at your law school, how satisfied are you with academic advising and planning? In your experience at your law school, how satisfied are you with career counseling? In your experience at your law school, how satisfied are you with personal counseling? In your experience at your law school, how satisfied are you with job search help? To what extent does your school emphasize providing the support you need to succeed in your employment search?



Table A.3.1 Cont. Student-Faculty Interaction provides insight into how students communicate with faculty (e.g., receiving prompt feedback or assisting on projects) and what type of advice they receive (e.g., job search advice). Students who report high scores within these categories (coded as SFI) typically have extensive interaction with faculty across a range of subjects that (in theory) should make them more likely to excel in law school and pass the bar exam (LSSSE, 2013).

Broad Legal Education refers to a scaled version (coded as gnleged) of a survey item about students' perceptions that their experience at law school contributed to their acquiring a broad legal education.

Challenging Coursework measures the extent to which students put forth extra effort in their academic lives ("going the extra mile"). Students with higher EFFORT scores went beyond the minimum effort required to succeed in courses, and instead thought about and worked on their ideas and projects extensively, making them more well-informed and increasing their retention of what they were taught in class.

Class Participation measures how often students are active course participants. Students who report higher values of this variable generally engage more with instructors and peers in the classroom.

Collaboration measures the extent of students' cooperation with other law students. This features questions asking about collaboration in a variety of contexts, including in-class projects, out-of-class discussions, and out-of-class assignments. This variable is coded as PEERS.

In your experience at your law school during the current school year, about how often have you used email to communicate with a faculty member? In your experience at your law school during the current school year, about how often have you discussed assignments with a faculty member? In your experience at your law school during the current school year, about how often have you talked about career plans/job search activities with faculty member or advisor? In your experience at your law school during the current school year, about how often have you discussed ideas from readings/classes with faculty members outside of class? In your experience at your law school during the current school year, about how often have you received prompt feedback (written or oral) from faculty on academic performance? In your experience at your law school during the current school year, about how often have you worked with faculty members on activities other than coursework? To what extent has your experience at your law school contributed to your personal development in acquiring a broad legal education? In your experience at your law school during the current school year, how often have you worked with classmates outside of class to prepare for class assignments? In your experience at your law school during the current school year, how often have you discussed ideas from their readings or classes with faculty members outside of class? In your experience at your law school during the current school year, how often have you discussed ideas from their readings or classes with others outside of class? (students, family members, coworkers, etc.) Rate the frequency at which you asked questions in class or contributed to class discussions.

Rate the frequency at which you worked with other students on projects DURING CLASS. Rate the frequency at which you discussed ideas from your readings or classes with others outside of class (students, family members, coworkers, etc.). Rate the frequency at which you worked with classmates OUTSIDE OF CLASS to prepare class assignments.



Table A.3.1 Cont.

Coming to Class Unprepared measures the frequency at which students reported coming to class without having prepared via reading or assignments. This variable is coded UNPREP.

Emphasis on Academics measures the extent to which a law school encourages students to take part in an academically holistic law school experience. This includes actively encouraging students to spend more time studying and working on class projects, but also includes emphasis on campus events, campus activities, and other academic support. Thus, the "academic climate" (coded as ACADEMIC) we seek to capture involves in- and out-of-class scholastics as well as involvement in a collegiate community.

Extracurricular Legal Experience measures how much students worked in law-related jobs while in law school, either pro bono or for pay. This composite, coded as EXTRA, is intended to shed light on whether working in legal jobs during law school is associated with positive education outcomes. It may be the case that working in law-related jobs helps academic and bar skills development.

Other Responsibilities measures the extent to which students must spend time and resources on activities not directly related to their education. This category, coded as NONLEGAL, does not include voluntary activities such as fitness or socializing. Rather, it includes activities such as working in a non-legal job (for money for bills, food, tuition, etc.), taking care of children, and commuting to law school.

Rate the frequency at which you [came] to class WITHOUT completing readings or assignments.

Practical Skills Development measures the extent to which students have gathered skills in effective speaking, research, and writing, three tangible skills that are important for success as an attorney. This variable, coded as HARD, is intended to allow researchers to examine whether developing these skills is associated with more positive education outcomes.

To what extent has your experience at your law school contributed to your personal development in writing clearly and effectively? To what extent has your experience at your law school contributed to your personal development in legal research skills? To what extent has your experience at your law school contributed to your personal development in speaking clearly and effectively? To what extent does your law school emphasize providing the support you need to thrive socially? To what extent does your law school emphasize helping you cope with your non-academic responsibilities? (work, family, etc.)

Supportive Environment captures the extent to which a law school emphasized helping to cope with non-academic responsibilities and supporting students’ social lives. Schools with higher HELP ratings do a better job of helping students manage the non-academic aspects of their law school careers, including their social lives and their domestic responsibilities.

To what extent does your law school emphasize attending campus events and activities? (special speakers, cultural events, symposia, etc.) To what extent does your law school emphasize spending significant amounts of time studying and on academic work? To what extent does your law school emphasize providing the support you need to help you succeed academically? During the current school year, about how many hours do you spend in a typical 7-day week doing legal pro bono work not required for a class or clinical course? During the current school year, about how many hours do you spend in a typical 7-day week working for pay in a law-related job? During the current school year, about how many hours do you spend in a typical 7-day week working for pay in a nonlegal job? During the current school year, about how many hours do you spend in a typical 7-day week providing care for dependents living you (parents, children, spouse, etc.)? During the current school year, about how many hours do you spend in a typical 7-day week commuting to class (driving, walking, etc.)?



Table A.3.1 Cont.

Preparation for Class measures the extent to which students came to class fully prepared, having done all assigned reading or other required work. This is distinct from EFFORT, which measures the extent to which students put in work that is not required of them. A student could have a very high PREP score if they always prepared for class properly yet have a very low EFFORT score if they rarely went beyond doing what they were required to do.

Self-Care measures the quality of students' non-academic life in law school. Students who report higher LIFE scores have a more robust work-life balance and spend more time on fitness activities, socializing, and community activities (e.g., church). LIFE measures the extent to which students are actively engaging in these activities, as opposed to measuring how much schools emphasize and encourage these activities.

Legal Skills Development measures professional identity formation, or the extent to which students' legal education included training in various aspects of a legal career. Developing analytical, critical thinking, research, and communication skills is important for students to be able to excel in law school, and these skills are vital to passing the bar exam as well.

Supportive Relationships measures the quality of students' relationships with faculty, administrative staff, and other students. Schools with higher scores for nonacademic support (or CLIMATE) will typically do a better job of supporting their students on a professional level. This is a different type of help than, for example, help offered by the school with coping with mental health or other non-academic burdens.

Diverse Knowledge Displayed measures the frequency with which diverse perspectives (e.g., ethnic or religious backgrounds) and ideas or concepts from other courses were included in class discussions and writing assignments (coded as CLASS).

School Satisfaction measures the level of satisfaction (or RATE) that students reported with their education experience, and whether they would choose the same law school if they started over.

Expected Loan Debt measures the amount of educational debt students expect to have (DEBT).

During the current school year, about how many hours do you spend in a typical 7-day week reading assigned textbooks, online class reading, and other course materials? During the current school year, about how many hours do you spend in a typical 7-day week preparing for class and clinical courses other than reading? (studying, writing, doing homework, trial preparation, and other academic activities) During the current school year, about how many hours do you spend in a typical 7-day week exercising or participating in fitness activities? During the current school year, about how many hours do you spend in a typical 7-day week relaxing and socializing (watching TV, partying, etc.) During the current school year, about how many hours do you spend in a typical 7-day week participating in community organizations? (politics, religious groups, etc.) To what extent has your experience at your law school contributed to your personal development in legal research skills? To what extent has your experience at your law school contributed to your personal development in writing clearly and efficiently? To what extent has your experience at your law school contributed to your personal development in thinking critically and analytically? [Rate] ...the quality of your relationships with administrative staff and offices. [Rate] ...the quality of your relationships with faculty and staff. [Rate] ...the quality of your relationships with other students.

How often were diverse perspectives (different races, religions, sexual orientations, genders, political beliefs, etc.) included in class discussions or writing assignments? How often did you put together ideas or concepts from different courses when completing assignments or during class discussions? How would you evaluate your entire educational experience at your law school? If you could start over again, would you attend the same law school you are now attending? How much educational debt from attending law school do you expect to have upon your graduation?



B. TECHNICAL APPENDIX

B.I. LSSSE Survey

As with many survey-based studies, the LSSSE responses come from individuals who voluntarily chose to complete the survey, whereas an ideal data-generating process would sample respondents at random. Relying both on schools to provide contact lists and on students to voluntarily participate presents a risk of differential participation; that is, some subgroups may be more (or less) likely to participate than others. If the decision to participate is based on some nonrandom factor (for example, students with low LGPAs responding at significantly lower rates) and the resulting groups of respondents and nonrespondents differ systematically on one or more key characteristics, the survey responses could be biased, either positively or negatively. We are confident that this risk is low because we are able to supplement the survey data with student-level administrative data and therefore estimate response rates and compare measures of demographic representativeness, as well as outcomes of interest, between respondents and nonrespondents.

Although of less concern than systematic differences between survey respondents and nonrespondents, bias may also be introduced through the use of Likert scales, although this risk is usually the tradeoff for gaining greater descriptive understanding. According to Jin and Chen (2020) and Friedman et al. (1994), the responses could be biased if: (1) respondents overwhelmingly make extreme ordinal selections that may not accurately capture their true experiences or perceptions (for example, selecting a 1 or 5 on a scale of 1-5; this is referred to as "extreme responses"); (2) respondents depend excessively on the neutral response (for example, a 3 on a scale of 1-5; this is referred to as "neutral responding"); and/or (3) respondents, regardless of the prompt, exhibit a bias either toward the left side of the Likert scale or toward affirmative responses. (Note: The LSSSE survey consistently presents the affirmative choice on the left side of the scale.) The general consensus among educational and social science researchers, however, is that the benefits of responsibly phrased and analyzed ordinal prompts, compared to simple binary answers, substantially outweigh the risks in many contexts (particularly in attitude measurement), in part because such prompts allow for more "descriptive richness" and better capture the reality that many of life's experiences and perceptions exist somewhere in the grayscale and do not fit neatly into a binary, yes/no, black/white framework (Cohen & Lea, 2004; King et al., 1994; and Likert, 1931). A Likert scale is thus a critical component for pinpointing nuance in our assessment of student engagement.

Moreover, our tests of internal consistency for our variables of interest are consistent with our expectations, which are based either on previous testing (for the EIs, by LSSSE itself; LSSSE, 2013) or on the theoretical considerations underpinning the composite variables we create (see below for a description of these variables).[1] For example, the component questions of our composite variable HELP are, in theory at least, aligned. We would therefore expect, and indeed find, a high level of consistency in the responses to the component questions (α = 0.84). On the other hand, we would expect low internal consistency for our composite EXTRA because its component questions ask about activities that at the least represent a tradeoff (more time spent performing pro bono work means less time for working a legal job) and at the most may be mutually exclusive. Indeed, the internal consistency of the component responses is quite low (α = 0.06). Although each of these potential sources of bias poses a degree of risk to the accuracy and reliability of the responses, we are confident that the threat is minimal, given our large sample size, the response distributions, the conceptual and methodological rigor of the survey, and our treatment (and testing) of the variables within the models.

[1] "Internal consistency" (or sometimes "internal reliability") refers to the level of agreement in responses to questions asking about similar perceptions, skills, experiences, etc. It is used to test for the presence of possible response biases, as we outline above. In theory, prompts based on the same construct should receive similar answers; if they do not, it may indicate the presence of bias. The level of consistency can be estimated using a statistic called Cronbach's alpha, indicated as α. The general consensus is that for these types of variables (as opposed to composites that are intentionally designed to return distinct or mutually exclusive responses, so-called "lumpy tests," such as our variable EXTRA), an α greater than 0.9 represents "excellent" internal reliability, while anything greater than 0.7 is deemed "acceptable" (Cronbach, 1951; George & Mallery, 2003; and Gliem & Gliem, 2003).
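The internal-consistency checks described above can be reproduced with a few lines of code. The minimal sketch below computes Cronbach's alpha for the component questions of a composite; the column names and toy responses are placeholders rather than the actual LSSSE item codes.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert-type items (rows = respondents)."""
    items = items.dropna()                           # complete cases only
    k = items.shape[1]                               # number of component questions
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Illustrative use with hypothetical item columns for a HELP-style composite.
survey = pd.DataFrame({
    "support_social":      [3, 4, 2, 4, 3],
    "support_nonacademic": [3, 4, 2, 3, 3],
    "support_academic":    [4, 4, 3, 4, 3],
})
print(round(cronbach_alpha(survey), 2))
```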

B.II. Schools with Missing Data

Some schools failed to provide complete data with respect to either administrative data (e.g., LSAT, UGPA, race) or LSSSE data (survey responses). With respect to administrative data, Schools 3, 16, and 21 are each missing a year; taken as a whole, these three schools do not appear to differ systematically from the others. When Schools 16 and 21 are examined individually, however, School 16 does differ systematically with regard to bar passage, final LGPA, and LGPA growth (our outcomes of interest) and racial composition. We consider the differences in the outcomes to be largely driven by the larger-than-average proportion of minority students in the AY 2017 cohort.[2] Although different in these respects, it is unclear why School 16 did not participate in the second year of the study. To ensure the reason for exclusion was not one that would introduce bias into our data, we examined several plausible reasons why a school might not submit data it had previously agreed to provide. We found that: (1) School 16 saw an increase in bar passage rates from 2017 to 2018, so it likely did not withhold participation out of concern about reporting poor performance; (2) the same dean was in place during both years of the survey, so it is likely not a case of turnover-driven policy change; (3) no public scandal occurred at this school just prior to or during the study period; and (4) it is compliant with the ABA's revised Standard 316 and is not out of compliance with any other ABA accreditation requirement.[3] Thus, without an apparent and concerning explanation for its nonparticipation in the study in AY 2018, we have no serious reason to suspect that this decision was driven by a factor likely to introduce bias, and we therefore include School 16 in our sample. In sum, we conclude that the schools providing one year of administrative data do not differ systematically from those providing two.

With respect to LSSSE survey data, a similar story emerges. Three schools supplied only one year of LSSSE data, and taken as a whole, the three do not appear to differ systematically from the larger sample. Examining these three schools individually, however, reveals that School 3 and School 16 may be systematically different. As described above, we do not consider the differences for School 16 to be concerning and therefore include its students' responses in the analysis. The differences between School 3 and the full sample, particularly in bar passage (10 percentage points), are, in fact, concerning. These differences cannot be readily explained by the demographic composition of School 3's AY 2017 graduation cohort, but, as with School 16, there is no apparent reason why it elected not to administer the survey. We decided not to remove School 3 after reviewing survey response rates and determining that these differences do not appear to be attributable to systematic differences between survey respondents and nonrespondents (see below). In addition, the use of school fixed effects allows us to capture this variation in our model.

Response rates varied widely across schools, ranging from 18 percent (School 3) to more than 67 percent (Schools 5 and 14; see Table A.II.1). Low response rates are concerning for several reasons; namely, (1) fewer observations mean less statistical power for the analysis, and (2) low participation may increase the risk of response bias if those participating in the survey differ systematically from those who do not. For all schools with a response rate below 20 percent, a commonly accepted threshold for responses to external and digital surveys,[4] we compare survey takers for a given school to the full sample from that school on several factors: bar passage, first-semester LGPA, first-year LGPA, final LGPA, and demographic characteristics (i.e., race, gender, and age). We do not find any evidence of systematic differences; the differences between the full sample and LSSSE respondents are sufficiently small and none are statistically significant. As a whole, the LSSSE subsample is therefore reasonably representative of the full sample.

[2] The relationship between race and bar passage/law school performance has been quite robustly supported by evidence (see, e.g., Clydesdale 2002 and Klein 1990). Thus, a school with a higher minority population may experience lower bar and law school performance, all else equal.

[3] ABA Standard 316 states: "At least 75 percent of a law school's graduates who sat for a bar examination must have passed a bar examination administered within two years of their date of graduation" (ABA, 2019: 1).

[4] Although a response rate of 20 percent is normal, and even larger than average, in contemporary survey research (see, e.g., Dey 1997), our response rates for most schools are considerably higher, and 20 percent is relatively low within our sample. We investigate these schools because of the wide variation in response rates, not because of the absolute response rates.
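A minimal sketch of the respondent-versus-cohort comparison described above is given below. The data, column names, and list of low-response schools are invented placeholders, and the Welch t-test is only one reasonable way to run such a check, not necessarily the study's exact procedure.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Simulated stand-in for the graduate-level file (placeholder column names).
rng = np.random.default_rng(3)
df = pd.DataFrame({
    "school_id":        np.repeat([3, 5], 200),
    "lssse_respondent": rng.binomial(1, 0.3, 400),
    "final_lgpa":       rng.normal(3.3, 0.35, 400),
    "bar_passed":       rng.binomial(1, 0.75, 400),
})

# For a low-response school, compare respondents with the school's full cohort
# on each outcome (a proportion test could be used for bar passage instead).
school = df[df["school_id"] == 3]
respondents = school[school["lssse_respondent"] == 1]
for outcome in ["final_lgpa", "bar_passed"]:
    t, p = stats.ttest_ind(respondents[outcome], school[outcome], equal_var=False)
    print(f"{outcome}: t = {t:.2f}, p = {p:.3f}")
```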



B.III. Selection of Engagement Factors

The LSSSE Survey is a comprehensive survey covering a wide range of factors related to students' law school experiences, with a total of more than 100 survey questions, each of which appears as a variable in our survey data. Including all, or even most, of these questions would pose several threats to the validity (both internal and external) of our findings. Too many questions or variables risks overfitting the model so that, while it explains the variation in our sample well, it fails to generalize to other samples (the singularity problem). Moreover, testing each of the variables separately dramatically increases the risk of type I error, which would require a p-value adjustment for multiple comparisons (and even then, many might still claim that the results are artifacts). Even with such an adjustment, adding variables blindly could result in reporting hundreds of results with, presumably, only a small number of meaningful and statistically significant findings. And even if those results were truly meaningful and the issues related to type I (and, subsequently, type II) error could be successfully overcome, they would be lost in a sea of null effects, diminishing the usefulness of this report and of those particular findings. Hence, we decided early in the process that a critical precursory step would be to pare down the variables in the analysis, making decisions about which survey items warranted inclusion (and, conversely, exclusion) and, subsequently, how best they should be analyzed.

By design, many of the survey items speak to similar elements of the campus environment (e.g., a question about students' communication with faculty via email, and separate questions asking about students' communication with faculty in office hours and other settings). For this reason, LSSSE groups together questions that speak to a common theme; these are referred to as "engagement indicators." LSSSE uses exploratory factor analysis to determine which questions explain common, unobserved dimensions of law school environments, and this process returns each engagement indicator as one variable comprising several component survey questions.[5] LSSSE uses this process to derive a total of four engagement indicators, equating to four variables to be used in statistical analysis.

We use a similar process to derive our own composite indicators. However, rather than use exploratory factor analysis to sort through which survey questions should be combined, each member of our research team individually reviewed the set of survey questions and used theoretical expectations to suggest groupings. We then engaged in a collaborative process of deciding which derived composite variables made theoretical sense, as well as adjusting proposed composites by removing and/or adding survey questions. At the end of this process, we used confirmatory factor analysis[6] to verify that each composite's items explained a common, unobserved dimension and that the composite should therefore be considered valid. This process resulted in 14 composite variables. Thus, while LSSSE's engagement indicators are formed entirely by exploratory factor analysis, our composite variables are formed from theoretical expectations and confirmed analytically.[7]

As a result of these decisions, we use two sets of student engagement factors, treating them separately in the analyses. The first set comprises the four composite variables that LSSSE itself creates and includes in its own reporting (the LSSSE engagement indicators). The second set comprises our 15 composite variables. Using the approach described above, we examine a total of 53 LSSSE questions: 21 captured in LSSSE's engagement indicators, 33 captured with our own composite variables, and 1 present in both an engagement indicator and one of our composites.

[5] For example, the LSSSE engagement indicator Learning to Think Like a Lawyer comprises four survey questions asking students how much their coursework emphasized critical thinking, analytical writing, and synthesizing information.

[6] Confirmatory factor analysis (CFA) was performed on all engagement indicators (not including the four EIs provided by LSSSE) containing three or more survey questions. All sets of variables were confirmed to describe only one latent variable (the associated engagement indicator). Only one of three components explained a significant amount of variance within Extracurricular Legal Experience, and only two of three components explained significant variation within Self-Care. We adjust Extracurricular Legal Experience per the results of the CFA, but do not change the composition of Self-Care because, despite one component not accounting for significant variation, we have no reason to suspect the measurement will adversely affect the models' accuracy.

[7] There is one additional difference: while LSSSE scales its four engagement indicators to always take a value between 0 and 50, we did not take this step with our composite variables and instead used simple linear combinations that we then transformed into categorical variables.
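As a rough illustration of this kind of single-factor check (a simplified stand-in, not the authors' exact CFA procedure), the sketch below examines whether a set of component items plausibly reflects one underlying dimension. The item names and simulated responses are invented for the example.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Simulated responses to three items driven by one latent trait
# (item names are placeholders, not actual LSSSE question codes).
rng = np.random.default_rng(0)
latent = rng.normal(size=500)
items = pd.DataFrame({
    "item_a": latent + rng.normal(scale=0.6, size=500),
    "item_b": latent + rng.normal(scale=0.6, size=500),
    "item_c": latent + rng.normal(scale=0.6, size=500),
})

# Quick one-factor adequacy check: share of variance on the first
# principal axis of the item correlation matrix.
eigvals = np.linalg.eigvalsh(items.corr().to_numpy())[::-1]
print("first-factor share of variance:", round(eigvals[0] / eigvals.sum(), 2))

# Single-factor model; loadings of similar sign and magnitude support
# treating the items as one composite.
fa = FactorAnalysis(n_components=1).fit(items)
print("loadings:", fa.components_.round(2))
```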

B.IV. Addressing Concerns About Using LGPA and LGPA Growth

Three concerns regarding our use of LGPA and LGPA Growth have been raised over the course of the study. Below, we list each of these concerns, describe them, and then explain the steps we have taken to address them. We are confident that our analytical approach accounts for each of these concerns and that our results are valid and reliable.

Concern 1: Law schools may not grade their students consistently over the course of their academic careers.

Description: Students are generally graded more strictly in their first-year courses than they are in smaller courses in their third year, when grading is generally more lenient. This implies that students will be more likely to show positive LGPA growth simply because grading leniency increases over time. If this is the case, one might question whether LGPA growth is a useful measure.

Approach and rationale: In theory, grading leniency should affect all students equally, and thus all students should be subject to a bump in their GPAs over time. Notwithstanding, in order to calculate growth, we difference each student's standardized first-semester GPA from his/her standardized final GPA. In this manner, growth measures the relative difference in students' GPAs: essentially, the difference between how a student performed in the first semester relative to his/her school's average and how he/she performed at the end of his/her law school career relative to the school's average. Given that students progress through their coursework at the same time, they are subject to the same changes in grading strictness/leniency, and these changes are therefore unlikely to have a meaningful effect on any one student or small group of students. Moreover, if schools do tend to become more lenient in grading as students progress through law school, this only means that a baseline level of growth is likely built into each student's LGPA. Our models capture variation in LGPA growth, so even for schools with progressive leniency, our models will still explain variation from that baseline among students.
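A minimal sketch of this within-school standardization and growth calculation appears below, using invented records and placeholder column names rather than the study's actual data.

```python
import pandas as pd

# Hypothetical records: one row per graduate, with raw first-semester and
# final LGPAs plus a school identifier (column names are placeholders).
df = pd.DataFrame({
    "school_id":      [1, 1, 1, 2, 2, 2],
    "first_sem_lgpa": [2.9, 3.1, 3.4, 3.0, 3.3, 3.6],
    "final_lgpa":     [3.1, 3.2, 3.5, 3.2, 3.4, 3.6],
})

# Standardize each LGPA within its school (z-scores relative to school mean/SD).
for col in ["first_sem_lgpa", "final_lgpa"]:
    grouped = df.groupby("school_id")[col]
    df[col + "_z"] = (df[col] - grouped.transform("mean")) / grouped.transform("std")

# Growth = final standing relative to the school minus first-semester standing.
df["lgpa_growth"] = df["final_lgpa_z"] - df["first_sem_lgpa_z"]
print(df[["school_id", "first_sem_lgpa_z", "final_lgpa_z", "lgpa_growth"]].round(2))
```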



Concern 2: Grading varies by school, both in terms of strictness/leniency and in terms of scale.

Description: An "A" at School X is not necessarily an "A" at School Y (what is generally referred to as the "gentleman's C"). Moreover, some schools award an "A+" while others do not. As a result, some schools have a maximum LGPA of 4.0 while for others it is 4.33.

Approach and rationale: We rescale each LGPA variable within each school so that each student's LGPA is expressed relative to his/her school's mean for that particular LGPA (a process called standardization). This approach enables us to draw conclusions about LGPA across schools because we are comparing students' relative LGPAs rather than their raw LGPAs. In addition, our choice of a fixed effects model accounts for the many time-constant differences, observed and unobserved, between law schools. The results from a fixed effects model are derived by limiting comparisons to students within their respective schools, meaning that differences between schools, as long as they are constant over time, are conditioned out of the results. Combined with within-school standardization of LGPAs, this amounts to a robust treatment of these concerns and allows us to produce reliable results.

Concern 3: Some law school courses are "easier" than others.

Description: Some law school courses are graded more leniently (e.g., clinical courses). Law students are presumably well aware that their LGPA is one of the biggest factors in determining the type of job they get, with the most desirable positions requiring the highest grades. Students are therefore incentivized to take courses in which it is relatively easy to earn an "A." This incentive, paired with greater latitude to choose coursework later in their matriculation, may produce a dynamic in which students experience LGPA growth as a function of their decision to strategically take easier courses rather than of their development and learning over time.

Approach and rationale: We account for this in several ways. First, as with the concerns about grading leniency and school-level variation in grading policies, standardizing the LGPA variables allows us to compare students' LGPAs to the average LGPA at their school. Assuming that all students are equally incentivized and have equal opportunity to take "easier" courses, standardizing LGPAs addresses this concern. We also examined whether adding a control variable for the number of clinic credit hours had any meaningful bearing on the results, finding that it did not; we found no evidence of a relationship between the number of these credit hours and LGPA.
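The school fixed effects mentioned under the second concern can be implemented in several ways. One hedged sketch, using simulated data and school indicator variables in an OLS formula (not necessarily the study's exact estimator or variable names), is shown below.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis frame (placeholder names): one row per graduate with a
# within-school standardized final LGPA and standardized admission credentials.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "school_id": rng.integers(1, 20, 500),
    "lsat_z":    rng.normal(size=500),
    "ugpa_z":    rng.normal(size=500),
})
df["final_lgpa_z"] = 0.3 * df["lsat_z"] + 0.3 * df["ugpa_z"] + rng.normal(size=500)

# School fixed effects via C(school_id): coefficients are identified only from
# variation among students within the same school, so time-constant school
# differences are conditioned out.
fe_model = smf.ols("final_lgpa_z ~ lsat_z + ugpa_z + C(school_id)", data=df).fit()
print(fe_model.params[["lsat_z", "ugpa_z"]].round(3))
```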

