RESEARCH REPORT

An Assessment of Quality Standards for Florida’s Department of Children and Families Licensed Residential Group Homes: Results from the Statewide Pilot

September 30, 2019

Project Team:
Shamra Boel-Studt, Ph.D., MSW, Assistant Professor, College of Social Work, Florida State University
Hui Huang, Ph.D., MSW, Assistant Professor, Florida International University
Zandra Odum, M.Ed., Project Management Consultant, Office of Child Welfare, Florida Department of Children and Families
Neil Abell, Ph.D., MSW, Professor, College of Social Work, Florida State University
Tanisha Lee, MSW, State Foster Care Licensing Specialist, Office of Child Welfare, Florida Department of Children and Families
Rachel Harris, MSW, Research Assistant, College of Social Work, Florida State University

Project Contributors:
The Group Care Quality Standards Workgroup
Florida Coalition for Children

Funded through a contract with the Florida Institute for Child Welfare

CONTENTS
Executive Summary
    Next Steps
Background
    Description of the Literature
    Group Care Quality Standards
    Scale Conceptualization and Development
Methodology
    Design
    Data Collected
    Data Analysis and Organization of Results
Results
    Program and Sample Descriptives
        Completion Rates
        Sample Characteristics
        Residential Care Programs
        Trauma-informed Residential Care
        Evidence-Informed Model of Care
        Evidence-based (Validated) Assessments
    Implementation Evaluation
        Theme 1: No Lead Agency
        Theme 2: Incomplete Youth Forms
        Theme 3: Program does not Take Dependency Youth, Only has Short-Term Placements, or is Dually-Licensed
        Theme 4: Missing Forms and Follow-Up
        Theme 5: Clarification on How to Assess Certain Items
    Group Care Quality Standards Assessment Psychometrics
        Confirmatory Factor Analysis
        Summary of Results and Re-specifications
        Reliability analysis
    Correlations between the GCQSA Ratings and Program Indicators
        Youth Form
    GCQSA Baseline Ratings & Supporting Themes from Open-Ended Responses
        Qualitative Themes Related to Quality Ratings
            Theme 1: Family Involvement
            Theme 2: Behavior Management
            Theme 3: Facilitating Quality Improvement
            Theme 4: Documentation
            Theme 5: Program Model
        Youths’ Reflections on Experience in Residential Care
            Theme 1: Positive Comments on Group Home
            Theme 2: Negative Comments on Group Home
            Theme 3: Positive Comments on the GCQSA
            Theme 4: Negative Views on the GCQSA
    Dissemination
Discussion
Recommendations
Next Steps
References
Appendix A. Florida Quality Standard Initiative Framework
Appendix B. Correlations with Quality Ratings and Program Indicators
Executive Summary
Effective July 1, 2017, Section 409.996(22) of the Florida Statutes was amended to require the Department of Children and Families (DCF or Department) to develop a statewide accountability system for residential group care providers based on measurable quality standards. The accountability system must be implemented by July 1, 2022. In collaboration with the Florida Coalition for Children (FCC) and the Florida Institute for Child Welfare (FICW), the Department established a core set of quality standards for licensed group homes. The Department engaged the FICW to develop and validate a comprehensive assessment tool, the Group Care Quality Standards Assessment (GCQSA), designed to operationalize the quality standards. The GCQSA will serve as the core measure for the statewide accountability system. The quality accountability system initiative draws upon research and empirically-driven frameworks to transform residential services through the integration of research-informed practice standards, ongoing assessment, and continuous quality improvement.

The purpose of the statewide pilot was to begin implementing the GCQSA in all six regions, giving stakeholders in each region an opportunity to become familiar with the assessment while providing ongoing monitoring and technical support. Assessment data were collected for one year from the full population of licensed residential care programs throughout Florida. Analyses of these data included: 1) program and sample descriptives; 2) GCQSA psychometrics; 3) GCQSA domain/item correlations with program indicators; and 4) GCQSA baseline results and supporting themes from open-ended responses.

Program and Sample Descriptives
The adjusted sample included 160 programs and 222 licensed facilities. A total of 1,516 assessment forms were completed by youth (450 forms, 29.7%), direct care workers (450 forms, 29.7%), group home directors/supervisors (272 forms, 17.9%), lead agencies (183 forms, 12.1%), and licensing specialists (161 forms, 10.6%). Sixty-one percent of residential programs in the sample are traditional group homes, followed by shelters (nearly 20%) and programs designed to serve specialized populations (e.g., CSEC, maternity) or other (19%). The majority of residential programs (63.6%) use a shift care model while 36.4 percent reported using a family-style model. Less than half of the residential programs in the sample are nationally accredited. Of the 65 accredited programs across the state, the majority use shift-care models. Services most often provided include educational training and supports, recreation, life skills development/independent living, and discharge planning. With the exception of recreation, most of these services are provided both on and off-site. Mean scores on a five-point scale (1 = ‘not at all’, 3 = ‘somewhat’, 5 = ‘completely’) indicated that, overall, programs are ‘somewhat’ trauma-informed (M = 3.06, SD = 1.51), with most reporting that staff are trained in trauma-informed care and adhere to principles of trauma-informed care (i.e., promote psychological and physical safety, trust, choice, and empowerment). Less than half reported routinely screening youth for trauma and traumatic stress or providing trauma education to youth and families as part of their approach.

Implementation Evaluation
Documented calls (N = 61) with the regional licensing teams were used to evaluate the implementation process. Overall, the evaluation indicated few challenges with the implementation of the GCQSA. Thematic analyses revealed five recurring themes related to implementation: 1) how to respond when programs do not have an identifiable lead agency; 2) circumstances related to incomplete youth forms; 3) assessing programs that are dually licensed or that do not take dependency cases; 4) missing forms and follow-up; and 5) clarification on how to best assess certain items.

GCQSA Psychometrics
Confirmatory factor analyses were used to examine the fit of the data with the GCQSA factor structure for all five forms (youth, licensing, lead agency, director, direct care worker). After model re-specifications were made, all five forms demonstrated factorial validity. Cronbach’s alpha coefficients for domain scores across all re-specified forms were good to excellent (alphas ranged from .794 to .961).

GCQSA Domain/Item Correlations with Program Indicators
Higher domain and item ratings on the youth form were correlated with fewer critical incidents (i.e., youth hospitalizations, staff injury, youth injury, law enforcement calls, runaway episodes, total incidents). Results of correlational analyses of the licensing form showed mean scores on Domain 7 (Education, Skills, and Positive Outcomes) were associated with fewer law enforcement calls. No other domain scores on the licensing form were statistically associated with critical incidents; however, several item-level correlations were observed. Taken together, these results lend emerging support for the construct validity of the GCQSA.

GCQSA Baseline Scores and Supporting Themes from Open-Ended Responses
Relatively lower scores were observed for Domains 1 (Assessment, Admission, & Service Planning) and 8 (Pre/Post Discharge Processes). Scores were more consistent and higher for Domains 2 (Positive, Safe Living Environment) through 6 (Program Elements) across respondents. Mean scores of directors and direct care workers were notably more consistent with one another, higher, and less varied relative to other respondent types. Mean scores of the licensing specialists were lower relative to others and more consistent with those of the lead agencies. Themes from open-ended responses from licensing specialists provided further insights into the ratings and current practice in group homes related to family involvement, documentation, program models, and behavioral management. Emerging evidence suggests the GCQSA is prompting positive changes in practice and may promote a process of continuous quality improvement. A content analysis of youths’ open-ended responses indicated that, overall, youth viewed their placement positively and felt connected with the program staff and house parents. The most frequent types of negative comments focused on feeling the environment was too restrictive and that staff were not adequately responding to their needs.

Recommendations
• Provide assistance to help providers become trauma-informed.
• Provide guidance and recommendations to providers to support the consistent use of validated, comprehensive assessment tools.
• Use the GCQSA annually to monitor trends over time and progress toward FFPSA requirements and implementation of quality care based on the Group Care Quality Standards.
• Refine the GCQSA forms based on psychometric results.
• Update the GCQSA training and manual based on findings from the implementation evaluation.
NEXT STEPS
The GCQSA tool and training will be updated based on results from the statewide pilot in preparation for the statewide validation study scheduled to begin January 2020. Booster trainings including the regional licensing teams, providers, and lead agencies will be scheduled for late November through December 2019. In addition, plans for two supplemental project components, the inter-rater reliability study and the outcomes development pilot, are underway. These studies will comprise the final phases in the validation process. The results of these combined studies will yield a highly rigorous validation of the GCQSA with implications for informing quality residential services in Florida and beyond.
Background
Effective July 1, 2017, Section 409.996(22) of the Florida Statutes was amended to require the Department of Children and Families (DCF or Department) to develop a statewide accountability system for residential group care providers based on measurable quality standards. The accountability system must be implemented by July 1, 2022. In collaboration with the Florida Coalition for Children (FCC) and the Florida Institute for Child Welfare (FICW), the Department established a core set of quality standards for licensed group homes. The Department engaged the FICW to develop and validate a comprehensive assessment tool, the Group Care Quality Standards Assessment (GCQSA), designed to operationalize the quality standards. The GCQSA will serve as the core measure for the statewide accountability system. The quality accountability system initiative draws upon research and empirically-driven frameworks to transform residential services through the integration of research-informed practice standards, ongoing assessment, and continuous quality improvement.
DESCRIPTION OF THE LITERATURE
Quality social services have been defined as “the degree to which interventions influence client outcomes in desired ways in applicable domains while being delivered in a sensitive manner consistent with ethical standards of practice and the best available practice knowledge”.1 The debate surrounding quality residential care is longstanding. To address this, researchers, providers, and policy-makers have proposed establishing quality standards for residential care for children and adolescents.2,3,4 Federal guidelines, such as the Adoption and Safe Families Act of 1997 and the Family First Prevention Services Act (FFPSA) of 2018, place child well-being at the center of the quality debate.5 For example, FFPSA requires that children are cared for in “a setting providing high-quality residential care” (section 472(k)(2)(D)). In an effort to identify the elements of quality residential programming, Huefner (2018) reviewed seven published sources promoting quality standards specifically for residential treatment.6 The results of this review supported that "quality" encompasses a diverse set of criteria, including assessment, treatment planning, safety, family engagement, cultural competence, effective treatment, competent staff, positive outcomes, and aftercare. The quality standards generated from Huefner's review represent the culmination of the best available evidence and provide a starting framework to guide further development and the eventual validation of practice standards for residential care.

Three quality measures for children’s residential programs have been developed, including the Child Welfare League of America Quality Indicators (CWLA QI),7 Boys Town Performance Standards for Residential Care (BT PS),8 and the Building Bridges Initiative Self-Assessment Tool (BBI SAT).9 Each is a self-assessment survey comprised of domains within which practices and conditions relevant to service delivery are assessed; providers can use the results to identify strengths and weaknesses and guide service improvement. Although these measures contribute useful examples and guidance for structuring and scaling quality indicators, to date, none have been validated. In their review of two of the quality measures, the CWLA QI and BT PS, Lee and McMillen (2008) note that neither measure provides clear guidance for scoring and interpretation and that the items appear to be equally weighted (i.e., given equal priority), despite some items measuring practices related to ensuring youth’s safety while others are geared toward issues of well-being or the integration of best practices.4 In addition, the measures were developed with minimal input from different stakeholders, which can lead to privileging certain perspectives and questions of validity (note the BBI SAT is an exception). These previously developed assessments can be used to provide guidance in the development of quality assessments for residential care that draw upon their strengths while addressing the noted limitations.
GROUP CARE QUALITY STANDARDS
Florida’s Group Care Quality Standards Initiative is a collaboration between the Florida Department of Children and Families, the Florida Institute for Child Welfare, the Florida Coalition for Children, academic researchers, child advocates, and service providers and consumers aimed at improving the quality and effectiveness of residential care. The initiative draws upon research and empirically-driven frameworks to transform residential services for children and adolescents through the integration of research-informed practice standards, ongoing assessment, and continuous quality improvement. Appendix A summarizes the eight-phased implementation plan guiding development and the process of scaling up the GCQSA and the proximal alignment with implementation science and practice. The DCF, in partnership with the FCC, convened the Group Care Quality Standards Workgroup, comprised of 26 members that include group care providers and child advocates throughout Florida with research support provided by the FICW and Boys Town National Research Institute. The workgroup was tasked with developing a set of research-informed quality standards for licensed residential group homes. Huefner’s (2018) consensus of proposed practice standards provided the workgroup with a working list of standards grounded in empirical research and best practice guidelines. Led by FCC Residential Committee leadership, members of the workgroup divided into task groups assigned to discuss the proposed standards within a specific practice domain and to select and adapt standards for Florida’s group homes. The standards identified by the task groups were reviewed and compiled into one document, resulting in the published guide, Quality Standards for Group Care (Group Care Quality Standards Workgroup, 2015).10 The guide outlines a set of 59 quality practice standards in the following eight domains:
• Assessment, Admission, and Service Planning
• Professional and Competent Staff
• Positive, Safe Living Environment
• Program Elements
• Monitor and Report Problems
• Education, Skills, and Positive Outcomes
• Family, Culture, and Spirituality
• Pre-Discharge/Post Discharge Processes
Scale Conceptualization and Development
Following the Department’s approval, the FICW was engaged to lead efforts to develop and validate an assessment tool designed to operationalize the standards. The research team began with establishing a conceptual framework (Figure 1) to guide the process and ensure the approach and resulting assessment were consistent with the aims and vision of the Department and Workgroup.

Figure 1. Group Care Quality Standard Assessment (GCQSA) Conceptual Framework

Following the completion of the initial draft of the Group Care Quality Standards Assessment, efforts toward validation began with establishing content validity (i.e., Do the items reflect the constructs they were designed to measure?) assessed by a panel of 16 experts.11 Elements of ecological validity (i.e., Do the concepts being measured have ‘real world’ applicability and practicability?) were evaluated during the feasibility study and implementation pilot. Preliminary estimates of internal consistency (i.e., Are scale items that are designed to measure the same constructs correlated across repeated uses?) were examined during these early phases to provide initial evidence of one form of reliability based on a small preliminary sample.12 Taken together, the findings from these earlier phases were used to refine the assessment tool and implementation process. The statewide pilot study represents a major step toward validation and full implementation. The purpose of the statewide pilot was to begin implementing the GCQSA in all six regions, giving stakeholders in each region an opportunity to become familiar with the assessment while providing ongoing monitoring and technical support.

Methodology

DESIGN
The pilot began with a daylong orientation and training held in each of Florida’s six regions. Training and orientation sessions were attended by group care providers, lead agency personnel, and the regional licensing teams and were facilitated by the principal investigator and DCF project leads. Applying similar methods as in the two previous pilots and a mixed methods population-based design, GCQSA data were collected for all Department licensed group homes and shelters throughout Florida. Data collection was carried out over one year (April 2, 2018 – April 30, 2019) and was coordinated with the annual re-licensure inspections conducted by the regional licensing teams. A live statewide debriefing webinar including providers and the licensing teams was held May 23, 2019. During the webinar, preliminary results from the pilot were presented and participants were given opportunities to provide feedback and ask questions.

DATA COLLECTED
The primary measure for this study was the GCQSA. The licensing teams facilitated completion of the GCQSA for each group home. For each group home or shelter, a licensing specialist, lead agency, group home director, and a minimum of two youth and two direct care workers completed GCQSA forms.

The GCQSA is comprised of two sections. Section I is designed to gather information about the respondent and the residential program being assessed. The assessment of the quality standards begins in Section II, which is comprised of eight scales measuring the eight quality domains. The extent to which practices or conditions in the group homes are consistent with each standard is rated on a five-point scale (1 = not at all, 3 = somewhat, 5 = completely). Domain scores are computed from the mean of the item ratings. Qualitative process data were also collected via technical support calls and open-ended GCQSA items where respondents could provide additional information about their ratings or the standards.
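To make the scoring rule concrete, the minimal sketch below computes a domain score as the mean of its 1–5 item ratings, treating skipped or not-applicable items as missing. The domain labels, item counts, and ratings are illustrative placeholders, not actual GCQSA items or data.

```python
from statistics import mean

# Hypothetical ratings for one respondent, keyed by domain; None marks a
# skipped or not-applicable item (e.g., items skipped for short-term shelters).
ratings = {
    "D1 Assessment, Admission, and Service Planning": [4, 5, 3, None, 4],
    "D2 Positive, Safe Living Environment": [5, 5, 4, 4],
}

def domain_score(item_ratings):
    """Mean of the completed 1-5 item ratings; None if no items were rated."""
    answered = [r for r in item_ratings if r is not None]
    return round(mean(answered), 2) if answered else None

domain_scores = {domain: domain_score(items) for domain, items in ratings.items()}
print(domain_scores)  # e.g., {'D1 ...': 4.0, 'D2 ...': 4.5}
```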
DATA ANALYSIS AND ORGANIZATION OF RESULTS
Quantitative data were analyzed using SPSS version 23. Mplus 7 was used to conduct confirmatory factor analyses. Different methods were used to analyze qualitative data due to different data structures. Documented triage call notes were manually reviewed and coded to identify recurring themes using thematic analysis.13 Open-ended responses from the GCQSA were analyzed in Nvivo 11 using an approach consistent with qualitative content analysis.14,15 The results are presented in five sections: 1) Program and Sample Descriptives; 2) Implementation Evaluation; 3) GCQSA Psychometrics; 4) GCQSA Correlations with Program Indicators; and 5) GCQSA Baseline Ratings and Supporting Themes from Open-Ended Responses.
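The correlational analyses relate GCQSA domain and item ratings to program indicators such as critical incident counts. The report does not spell out the specific correlation statistic here, so the sketch below simply illustrates one reasonable way such a program-level association could be computed; the variable names and values are invented for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical program-level data: mean Domain 7 rating and the number of
# law enforcement calls recorded for each program during the pilot year.
domain7_means = np.array([4.2, 3.8, 2.9, 4.6, 3.1, 4.9, 2.5, 3.7])
law_enforcement_calls = np.array([1, 3, 6, 0, 5, 1, 8, 2])

# Spearman's rho is a reasonable default for ordinal ratings and skewed counts.
rho, p_value = stats.spearmanr(domain7_means, law_enforcement_calls)
print(f"rho = {rho:.2f}, p = {p_value:.3f}")
```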
Results
PROGRAM AND SAMPLE DESCRIPTIVES
Residential care programs were defined as group homes and/or shelters within an agency using the same model, located within the same region. Based on this definition, the sample included 169 programs for which at least one survey was completed. This encompassed 237 licensed facilities. Florida Safe Families Network (FSFN) data from January 2018 showed a count of 309 licensed facilities throughout the state. Comparing the sample counts with FSFN indicated that the unadjusted statewide pilot sample included approximately 76.7 percent of licensed residential facilities throughout Florida.

Table 1. Regional Comparison of Licensed Residential Facilities from January 2018 FSFN Data and the Statewide Pilot (unadjusted) Sample

Region       FSFN Jan. 2018    Statewide Pilot    %
Central      64                63                 98.4%
Northeast    46                27                 58.7%
Northwest    29                24                 82.8%
Southeast    50                41                 82.0%
Southern     35                16                 45.7%
Suncoast     85                66                 77.6%
Total        309               237                76.7%
During the one-year pilot study, nine residential programs including 15 licensed facilities discontinued operations. This included Florida Baptist Children’s Homes, resulting in the closure of 10 licensed facilities (Central = 2, NE = 2, NW = 3, Southern = 2, Suncoast = 1). Other closed programs included Ikare Youth and Family Services, Inc (facility = 1) in the Northeast, and the Peak Group Homes in the Suncoast region (facilities = 2). In the Northwest region the Susanna Wesley Emergency Shelter (facility = 1) and the Travis Tringas facility of CIC (facility = 1) were also closed and/or non-operational. The adjusted sample, accounting for closures, included 160 programs and 222 licensed facilities. Within programs, the number of facilities ranged from one to six (median = 1, mean = 1.37, standard deviation = .88). Figure 2 shows the distribution of programs and licensed facilities across Florida’s six regions.

Figure 2. Regional Distribution of Residential Programs and Licensed Facilities (N = 160)
[Figure: bar chart of the number of programs and licensed facilities in each of the six regions.]
A total of 1,516 assessment forms were completed during the statewide pilot. Of the total, 450 (29.7%) were youth forms and 450 (29.7%) were direct care worker forms, followed by 272 (17.9%) director/supervisor forms, 183 (12.1%) lead agency forms, and 161 (10.6%) licensing specialist forms. From the total, 433 (28.6%) of the assessment forms were completed for programs in the Suncoast region, followed by 412 (27.2%) Central, 319 (21.0%) Southeast, 145 (9.6%) Northeast, 114 (7.5%) Northwest, and 93 (6.1%) Southern. Figure 3 displays the distribution of form completion by respondent type across regions.

Figure 3. Distribution of Quality Standards Assessment Forms Completed by Type and Region (N = 1,516)
[Figure: grouped bar chart of completed forms by respondent type (youth, lead agency, direct care worker, director, licensing) within each region.]
Completion Rates
One purpose of the statewide pilot was to evaluate a revised sampling procedure based on results from the previous smaller pilots. For each assessed program, a minimum of two youth from each licensed facility, one representative from the lead placement agency, two direct care workers from each licensed facility, one director, and one licensing specialist were asked to complete surveys. Providers had the option of inviting more than the minimum of two youths and direct care workers per facility to participate in the assessment. Providers chose this option in several cases. Completion rates were calculated based on the minimum sampling criteria. Seven programs were excluded in the calculation of completion rates due to having only 1-2 forms completed. This occurred most often for programs with relicensing inspections scheduled near the beginning or end of the statewide pilot which may have resulted in data collection being discontinued prior to programs having sufficient time to complete all of the assessments. In four instances, assessment forms were incomplete due to program closures, resulting in the decision to exclude the program in calculating completion rates. Completion rates for some included programs were also adjusted for circumstances that were beyond the control of the provider or licensing team that prevented forms from being completed. These included the following: the program had no identifiable lead agency (n = 5), no youth or only one youth meeting criteria to participate in assessment (n = 6), programs had only one staff and director (n = 3), no youth were placed at the time of the assessment (n = 5), no staff were available at the time of the assessment (n = 1), and youth declined to participate (n = 4).
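As an illustration of how these criteria translate into the completion rates reported below, the sketch that follows checks whether a program meets the minimum sampling criteria described above and computes an overall completion rate. The form tallies are invented for illustration and are not pilot data.

```python
# Minimum forms required per program under the pilot sampling criteria:
# two youth and two direct care workers per licensed facility, plus one
# lead agency, one director, and one licensing specialist form.
def minimum_required(n_facilities):
    return {"youth": 2 * n_facilities, "direct_care_worker": 2 * n_facilities,
            "lead_agency": 1, "director": 1, "licensing_specialist": 1}

def meets_minimum(forms_completed, n_facilities):
    required = minimum_required(n_facilities)
    return all(forms_completed.get(role, 0) >= n for role, n in required.items())

# Hypothetical tallies: (forms completed by role, number of licensed facilities)
programs = [
    ({"youth": 4, "direct_care_worker": 4, "lead_agency": 1, "director": 1, "licensing_specialist": 1}, 2),
    ({"youth": 1, "direct_care_worker": 2, "lead_agency": 1, "director": 1, "licensing_specialist": 1}, 1),
    ({"youth": 2, "direct_care_worker": 2, "lead_agency": 0, "director": 1, "licensing_specialist": 1}, 1),
]

complete = sum(meets_minimum(forms, n) for forms, n in programs)
print(f"Overall completion rate: {complete / len(programs):.1%}")  # 33.3% here
```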
Table 2 shows the completion rates for the aggregated sample adjusted for excluded programs and other factors impeding form completion. Completion rates were high for overall programs and at the level of the respondents. Overall, 107 (70.9%) programs had all forms completed based on the minimum sampling criteria. Of the 151 programs included in the calculation of completion rates, the large majority had forms completed by directors and licensing specialists and the majority also had forms completed by lead agencies, direct care workers, and youth, supporting a high level of participation across the state. Completion rates by region are shown in Figure 4.

Table 2. Completion Rates Across Programs and Respondents Statewide (N = 151)

Respondent              # Programs with Minimum Forms Completed    Completion Rate
Overall                 107                                        70.9%
Youth                   121                                        80.1%
Lead Agency             133                                        88.1%
Direct Care Worker      125                                        82.7%
Director/Supervisor     143                                        94.7%
Licensing Specialist    140                                        92.7%
Figure 4. Completion Rates by Program and Respondents by Region
[Figure: bar chart of completion rates, overall and by respondent type, for each region.]
Sample Characteristics
Characteristics of youth in the sample are presented in Table 3. In summary, the majority of youth were in their early to mid-teens with a nearly equal percentage of males and females. Most youth in the sample identified as white or black, with a small percentage (slightly above 6%) who were born outside of the U.S. The majority of youth completing the assessment were currently living in a group home (versus a shelter). Nearly 50 percent of youth indicated that they had only experienced one prior group home placement, with less than 16 percent of the sample having experienced more than three prior group home placements. The largest proportion of youth reported having been in their current group home between 1-2 years. Of the nearly one-third of youth who had never been placed in a foster home, 122 (82.4%) were currently in a group home and 26 (17.6%) were in a shelter. Although most youth identified having contact with a supportive adult (parent, relatives, non-relative) outside of their group home or shelter, approximately 10 percent reported they had no supportive adults with whom they are currently connected.

Table 3. Youth Characteristics

Age (range 9 – 19): Mean = 14.77 (SD = 2.28)

                                        Number    %
Gender
  Male                                  218       48.6
  Female                                229       51.0
  Other                                 2         0.4
Facility Type
  Group home                            391       86.9
  Shelter                               59        13.1
Grade
  4th                                   15        3.3
  5th                                   25        5.6
  6th                                   24        5.4
  7th                                   45        10.0
  8th                                   55        12.3
  9th                                   67        15.0
  10th                                  79        17.6
  11th                                  70        15.6
  12th                                  43        9.6
  High school graduate                  12        2.7
  Not enrolled in school                3         0.7
  Post-secondary education              8         1.8
  Don’t know                            2         0.4
Race/Ethnicity
  American Indian/Alaska Native         3         0.7
  Asian                                 5         1.1
  Black                                 162       36.0
  Native Hawaiian/Pacific Islander      2         0.4
  White                                 163       36.2
  Hispanic/Latino                       72        16.0
  Other                                 43        9.6
Born in the US*
  Yes                                   421       93.6
  No                                    29        6.4
Months in Group Home
  Less than 1 month                     59        13.1
  1-2 months                            59        13.1
  3-4 months                            39        8.7
  5-6 months                            48        10.7
  7-8 months                            33        7.3
  9-10 months                           32        7.1
  11-12 months                          37        8.2
  1-2 years                             107       23.8
  2 or more years                       36        8.0
Discharge Plan
  Return to biological parents          146       32.5
  Adoption                              49        10.9
  Independent living                    121       26.9
  Foster home                           12        2.7
  Relative home                         46        10.2
  Non-relative home                     10        2.2
  Don’t know                            65        14.5
Number of Prior Group Homes
  1                                     219       49.1
  2                                     115       25.8
  3                                     41        9.2
  4                                     27        6.1
  5                                     19        4.3
  6                                     6         1.3
  7                                     5         1.1
  8                                     1         0.2
  9                                     1         0.2
  10 or more                            12        2.7
Number of Prior Foster Homes
  1                                     123       27.9
  2                                     64        14.5
  3                                     35        7.9
  4                                     29        6.6
  5                                     14        3.2
  6                                     3         0.7
  7                                     4         0.9
  8                                     4         0.9
  9                                     0         --
  10 or more                            17        3.9
  I have never lived in a foster home   148       33.6
Supportive Adults Outside of Group Home
  None                                  46        10.2
  Biological parents                    214       47.6
  Relatives                             271       60.2
  Non-relative                          157       34.9

*Countries outside of U.S. = Brazil (n = 1), Colombia (n = 2), Ethiopia (n = 1), Germany (n = 1), Guatemala (n = 6), Haiti (n = 8), Jamaica (n = 1), Mexico (n = 1), Philippines (n = 1), Puerto Rico (n = 1), Romania (n = 1), Russia (n = 3), Taiwan (n = 1), United Kingdom (n = 1).

Table 4 shows that the majority of respondents from the lead contract agencies were contract managers, followed by case managers and placement specialists. Slightly more than 40 percent of lead agency respondents indicated having been assigned to or overseen the placement of 10 or more youth in the group home being assessed in the past year.

Table 4. Lead Agency Personnel Characteristics

                                                                      Number    (%)
Professional Title
  Case Manager                                                        45        (24.7)
  Placement Specialist                                                38        (20.9)
  Contract Manager                                                    71        (39.0)
  Other                                                               16        (8.8)
  Form Completed Collaboratively by Multiple Individuals within Agency    12    (6.6)
Number of Cases Placed in Home Being Assessed in Past 12 Months
  0                                                                   22        (12.3)
  1                                                                   20        (11.2)
  2                                                                   13        (7.3)
  3                                                                   9         (5.0)
  4                                                                   8         (4.5)
  5                                                                   10        (5.6)
  6                                                                   13        (7.3)
  7                                                                   5         (2.8)
  8                                                                   5         (2.8)
  9                                                                   1         (0.6)
  10 or more                                                          73        (40.8)
The results presented in Table 5 show that the largest proportion of the group home directors were between the ages of 36-55 years and direct care workers were between 26-35 years, with a higher percentage of females in both positions. Most directors identified as white followed by black; though among direct care workers the reverse was observed, with the largest percent identifying as black followed by white. Most directors/supervisors held bachelor’s or master’s degrees while the highest level of education among most direct care workers was a high school diploma. The largest percentage of both directors and direct care workers reported having been in their current position for one to five years. However, a higher percentage of directors reported having been in their position for longer (6-21 or more years) compared to direct care workers.

Table 5. Group Home Director and Direct Care Worker Characteristics

                                    Director/Supervisor (N = 272)    Direct Care Worker (N = 450)
                                    Number    (%)                    Number    (%)
Age
  18-20                             1         (0.4)                  4         (0.9)
  21-25                             5         (1.9)                  45        (10.0)
  26-30                             19        (7.1)                  67        (14.9)
  31-35                             26        (9.7)                  58        (12.9)
  36-40                             47        (17.5)                 50        (11.1)
  41-45                             45        (16.8)                 52        (11.6)
  46-50                             39        (14.6)                 50        (11.1)
  51-55                             41        (15.3)                 46        (10.2)
  56-60                             23        (8.6)                  50        (11.1)
  61 or older                       22        (8.2)                  27        (6.0)
Gender
  Male                              107       (39.3)                 123       (27.6)
  Female                            163       (59.9)                 322       (72.4)
Race/ethnicity
  American Indian/Native American   0         (--)                   7         (1.6)
  Asian                             4         (1.5)                  1         (0.2)
  Black                             110       (40.4)                 227       (50.6)
  White                             118       (43.4)                 141       (31.4)
  Hispanic/Latino                   25        (9.2)                  40        (8.9)
  Multiracial                       8         (2.9)                  18        (4.0)
  Other                             7         (2.6)                  15        (3.3)
Highest Level of Education
  High school/GED                   36        (13.2)                 210       (46.7)
  Associate’s degree                25        (9.2)                  88        (19.6)
  Bachelor’s degree                 101       (37.1)                 120       (26.7)
  Master’s degree                   98        (36.0)                 31        (6.9)
  Doctorate degree                  12        (4.4)                  --        (--)
  Other                             --        (--)                   1         (0.2)
Years in Current Position
  Less than 1 year                  35        (12.9)                 103       (23.0)
  1-2                               61        (22.4)                 141       (31.5)
  3-5                               75        (27.6)                 115       (25.7)
  6-10                              46        (16.9)                 56        (12.5)
  11-15                             20        (7.4)                  21        (4.7)
  16-20                             17        (6.3)                  7         (1.6)
  21 or more                        18        (6.6)                  4         (0.9)

Note. Missing responses on some items were observed, resulting in the counts not equaling the total number of directors or direct care workers for some items. Valid percentages, adjusted for missing responses, were calculated in SPSS.

Residential Care Programs

Examining Table 6, the majority (58.1%) of the 148 programs with available data on program types were traditional group homes (non-therapeutic or not designed to provide services to a specified specialized population). This was followed by nearly 20 percent shelters, with the remaining 22 percent designed to serve a specialized population, therapeutic group homes, or other. The other category contained hybrid programs that combined different models (e.g., shelter + traditional group home) or those that did not fit into one of the pre-existing categories (e.g., wilderness camp).

Table 6. Type of Group Care Programs (N = 148)A

                                    Number    %
Traditional Group Home              86        58.1%
Therapeutic Group Home              7         4.7%
Shelter                             30        20.3%
Maternity Home                      6         4.1%
Crossover/DJJ                       2         1.4%
Sexually Exploited (CSEC)           4         2.7%
Special Needs/Medically Fragile     3         2.0%
Other                               10        6.8%
Total                               148       100%

Note. Other includes: Maternity + Traditional (1), Emergency Shelter (runaway, CINS/FINS) (2), DJJ/Severe Behavioral Issues (1), Shelter + Group Home (1), Brain Injury (1), Traditional + CSEC (1), Wilderness Camp (1), Wilderness + Transitional Group Home (1), Traditional Group Home + Emergency Shelter (1).
A Count is based on program data from licensing and director forms, and includes programs excluded from the calculation of completion rates due to only one form being completed.

Figure 5 shows that the regions largely follow the aggregate trends in types of residential care programs.

Figure 5. Regional Distribution of Residential Program Types (N = 148)

Program Type                        Central    Northeast    Northwest    Southeast    Southern    Suncoast
Traditional                         23         9            9            11           8           26
Therapeutic                         1          0            0            0            0           6
Shelter                             7          2            4            9            4           4
Maternity                           0          1            0            3            0           2
Crossover/DJJ                       1          0            0            1            0           0
CSEC                                1          0            0            1            0           2
Special Needs/Medically Fragile     0          1            0            1            0           1
Other                               6          0            0            2            0           2
The majority of residential programs (63.6%) use a shift care model while 36.4 percent reported using a family-style model. The distribution of residential models across the state and regionally is shown in Figure 6.

Figure 6. Shift-Care and Family-Style Residential Programs (N = 151)

Statewide: Shift Care 96 (64%); Family Style 55 (36%)

Region       Shift Care     Family Style    Total
Central      20 (55.6%)     16 (44.4%)      36 (100%)
Northeast    8 (61.5%)      5 (38.5%)       13 (100%)
Northwest    8 (66.7%)      4 (33.3%)       12 (100%)
Southeast    24 (80%)       6 (20%)         30 (100%)
Southern     8 (57.1%)      6 (42.9%)       14 (100%)
Suncoast     28 (60.9%)     18 (39.1%)      46 (100%)
Table 7 shows that fewer than half of the residential programs in the sample are nationally accredited. Of the 65 accredited programs across the state, the majority use shift-care models. This trend was observed across nearly all regions with the exception of the Northeast region, where one of the two accredited programs was a shift-care model and the other a family-style home.

Table 7. National Accreditation Status of Residential Programs by Region and Model

                       Total Accredited    Shift Care     Family Style
                       Number (%)          Number (%)     Number (%)
Statewide (n = 150)    65 (43.3%)          49 (75.4%)     16 (24.6%)
Central (n = 36)       19 (52.8%)          13 (68.4%)     6 (31.6%)
Northeast (n = 13)     2 (15.4%)           1 (50%)        1 (50%)
Northwest (n = 12)     9 (75%)             6 (66.7%)      3 (33.3%)
Southeast (n = 30)     16 (53.3%)          15 (93.8%)     1 (6.3%)
Southern (n = 13)      6 (46.2%)           4 (57.1%)      2 (33.3%)
Suncoast (n = 46)      13 (28.3%)          10 (76.9%)     3 (23.1%)

As seen in Table 8, the services most often provided included educational training and supports, recreation, life skills development/independent living, and discharge planning. Educational services, life skills development/independent living, and discharge planning were most often provided both on and off-site while recreation was most often provided off-site. For example, of the 146 programs with service data, 139 (95%) reported providing education services/supports, and among those that provide educational services, 85 (61.2%) provide those services both on and off-site. Of the 139 programs that provide educational services/supports, 83 programs (60.6%) reported those services are provided both directly by staff and by professionals who are contracted or work outside of the group home.

Table 8. Residential Program Services (N = 146)

Service                         Provided       On-site        Off-site       Both (on/off)    Directly       Externally     Both (direct/external)
Educational                     139 (95%)      34 (24.5%)     20 (14.4%)     85 (61.2%)       16 (11.7%)     38 (27.7%)     83 (60.6%)
Vocational                      89 (64%)       20 (18.3%)     48 (44%)       41 (37.6%)       13 (12%)       43 (39.8%)     52 (48.1%)
Recreation                      136 (96.5%)    21 (15.3%)     8 (5.8%)       108 (78.8%)      26 (19.1%)     3 (2.1%)       107 (78.7%)
Family Support                  113 (79%)      28 (22.2%)     29 (23%)       69 (54.8%)       30 (25%)       22 (18.3%)     68 (56.7%)
Medical/Nursing                 97 (68.8%)     20 (16.9%)     58 (49.2%)     40 (33.9%)       12 (10.1%)     59 (49.6%)     48 (40.3%)
Mental/Behavioral Health        121 (85.8%)    29 (21.8%)     30 (22.6%)     74 (55.6%)       29 (19.9%)     41 (31.3%)     61 (46.6%)
Case management                 105 (75%)      38 (29.9%)     29 (22.8%)     60 (47.2%)       37 (29.6%)     37 (29.6%)     51 (40.8%)
Life skills/Independent living  137 (97.9%)    57 (42.2%)     3 (2.2%)       75 (55.6%)       53 (39%)       4 (2.9%)       79 (58.1%)
Parent training/education       67 (48.6%)     21 (21.9%)     32 (33.3%)     43 (44.8%)       21 (30.7%)     42 (36.8%)     37 (32.5%)
Family counseling               94 (67.6%)     31 (27.2%)     42 (36.8%)     41 (36%)         35 (30.7%)     42 (36.8%)     37 (32.5%)
Discharge planning              128 (92.1%)    69 (52.3%)     10 (7.6%)      53 (40.2%)       62 (47.3%)     7 (5.3%)       62 (47.3%)
Aftercare                       77 (55.8%)     24 (22.6%)     38 (35.8%)     44 (41.5%)       33 (30.8%)     33 (30.8%)     41 (38.3%)
Other                           29 (43.3%)     --             --             --               --             --             --

Note. Other includes the following services: 30 and 60 day check-ins with parents, civic/volunteerism, conflict resolution, driver’s license permit, equine therapy, extended foster care, family vacations, family visitation, follow-up/check-in, kickboxing, mentoring, neuropsychological evaluation, nutrition training, psychiatric services, psychoeducation groups, service referrals, spiritual guidance, substance abuse counseling, therapy, volunteer training, anger management/A&D groups, assessment and service plans, career/college planning, dance, discovery science club, game and movie nights, hair care, I/L skills, neuropsychiatric monitoring, referral, research, transportation.
Trauma-informed Residential Care
Programs’ use of trauma-informed approaches was rated on a scale of 1-5, with higher scores indicating the program meets all assessed criteria for trauma-informed care. Mean scores indicated that, overall, programs are ‘somewhat’ trauma-informed (M = 3.06, SD = 1.51). Program ratings are presented in Figure 7 and Figure 8; Figure 7 shows the counts of group homes at each level of using trauma-informed care and Figure 8 shows the percentage of programs meeting elements of a trauma-informed approach.

Figure 7. Ratings of Trauma-Informed Approach in Residential Programs (N = 146)
[Figure: counts of programs rated ‘not at all’ through ‘completely’ on use of a trauma-informed approach.]

Figure 8. Elements of Trauma-Informed Approaches Among Residential Programs (N = 152)
[Figure: percentage of programs reporting each element: staff trained in TIC; trauma screening/assessment; trauma-focused therapy; trauma education for youth and family; applies principles of TIC.]
Note. Categories are not mutually exclusive.

Evidence-Informed Model of Care
Programs’ use of an evidence-informed model was rated on a scale of 1-5, with higher scores indicating the program meets more of the assessed criteria for an evidence-informed model. Mean scores indicated that, overall, program models are between ‘a little’ and ‘somewhat’ evidence-informed (M = 2.44, SD = 1.48). Program ratings are presented in Figure 9. Figure 10 shows the percentage of programs meeting different criteria for an evidence-informed model.

Figure 9. Ratings of Programs’ Use of Evidence-Informed Models (N = 140)
[Figure: counts of programs rated ‘not at all’ through ‘completely’ on use of an evidence-informed model.]

Figure 10. Elements of Evidence-Informed Models (N = 147)
[Figure: percentage of programs meeting each criterion: clearly defined model documented in handbook or manual; model/services informed by published research; staff trained in model of care with fidelity monitoring; internal evaluation in past 5 years; external evaluation in past 5 years.]
Note. Categories are not mutually exclusive.
Evidence-based (Validated) Assessments
Programs’ use of evidence-based (validated) assessments to inform service planning was rated on a scale of 1-5, with higher scores indicating greater consistency in the use of validated assessments. Mean scores indicated that, overall, programs’ use of validated assessments is somewhat consistent (M = 2.75, SD = 1.73). Figure 11 shows that 38 (25.3%) of programs were rated by licensing specialists as consistently using validated assessments to inform service plans, 68 (45.3%) were rated as not using assessments at all, while the remaining 44 (29.3%) ranged from ‘a little’ to ‘mostly’.

Figure 11. Programs’ Use of Evidence-based Assessments (N = 150)
[Figure: counts of programs rated ‘not at all’ through ‘completely’ on consistent use of validated assessments.]

A wide variety was observed in the types and methods of assessment used by programs, ranging from verbal interviews and information collected from the Child Behavioral Health Assessment provided by the case managers, to assessment forms created by the programs, to a wide range of existing assessment tools. Table 9 displays the top six validated assessments being used by residential providers. The percentages in the table are calculated from the full sample of 152 group homes with complete data on this item. The most frequently used validated assessment was the Child Assessment of Needs and Strengths (CANS) followed by the Child Functional Assessment Rating Scale (CFARS), the Child Behavior Checklist (CBCL), and the Trauma Symptom Checklist for Children (TSCC).

Table 9. Top Six Most Frequently Reported Validated Assessments Used by Providers (N = 152)

   Assessment                                                      Number    %
1  Child Assessment of Needs & Strengths (CANS)                    22        14.5%
2  Child Functional Assessment Ratings Scale (CFARS)               11        7.2%
3  Child Behavior Checklist (CBCL)                                 3         1.9%
4  Trauma Symptom Checklist for Children (TSCC)                    3         1.9%
5  Adverse Childhood Experiences Scale (ACES)                      2         1.3%
6  Preschool & Early Childhood Functional Assessment (PECFAS)      2         1.3%

Note. Other validated assessments reported as being used by one program included: Child and Adolescent Functional Scale (CAFAS), Child PTSD Symptom Scale (CPSS-V5), Child PTSD Signature Scale for DS-5, Strengths and Difficulties Questionnaire, Modified Overt Aggression Scale (MOAS).

IMPLEMENTATION EVALUATION
The implementation evaluation results reported here are based primarily on results from documented triage calls with the regional licensing teams. The purpose of the triage calls was to provide progress monitoring and technical support to facilitate completion of the GCQSA. A total of 61 calls were held with the licensing teams. The call dates are listed in Table 10.

Table 10. Summary of Technical Support Calls with Regional Licensing Teams
[Table: dates of the recurring technical support calls scheduled with each regional licensing team (Central, Northwest, Northeast, Southern, Suncoast, Southeast) between April 30, 2018 and late April 2019.]
Note. * Indicates call was cancelled due to scheduling conflicts or holiday. Φ Indicates the project team traveled to the site for re-licensing observation.

The structure of the calls typically began with the research team providing updates on form completion since the previous call; time was then spent summarizing missing forms for programs, verifying that forms were matched with the correct programs, and discussing any observed trends or issues within the data. The licensing teams provided progress updates for their regions and discussed any questions or issues they had encountered. The calls also served as a way for the research team to learn about questions or concerns from providers, lead agencies, or youth that were shared with licensing specialists. Recurring themes and frequently asked questions extracted from the aggregated call notes are summarized in the following section. Generally, the licensing specialists consistently reported the process was ‘going well’. They reported that this sentiment was echoed among providers and lead agencies. The most consistent complaint was related to the lengthiness of the assessment. However, both licensing specialists and providers (as reported by licensing specialists) indicated that as they became more familiar with the assessment, it began to take less time.
Theme 1: No Lead Agency
A frequently recurring theme discussed during the calls surrounded how to respond when there was no identifiable lead contract agency for a program. This occurred in cases of programs that primarily served children from the community (i.e., youth not formally involved in the dependency system) or for programs that received mostly out-of-county placements. These instances were processed on a case-by-case basis to determine whether a lead agency form could or should be completed. In some cases, the licensing specialist noted that the local community-based care lead agency, although not contracting with the group home to coordinate placements within the region, still engaged in some monitoring and could likely complete the assessment form. In other instances where this was not the case, it was determined that the lead agency form was not applicable and, therefore, could not be required, and the program could ‘opt-out’ of the requirement to have an assessment form completed by a lead agency with the reason provided listed in the assessment.
Theme 2: Incomplete Youth Forms
Providers and licensing specialists also posed questions concerning the completion of youth forms. Reasons for incomplete youth forms included youth declining to participate, no youth meeting eligibility criteria being available during the assessment, the program serving youth with cognitive or developmental disabilities, or, in a few instances, no youth being placed at the time of the assessment. In each of these cases, or similar circumstances during future assessments, the team decided that a youth form could not be required or would be considered as meeting criteria for opting out of completing the youth form.
Theme 3: Program does not Take Dependency Youth, Only has Short-Term Placements, or is Dually-Licensed
Another recurring question concerned whether programs that do not serve children in the dependency system should be required to participate in the assessment. This question was also posed for short-term shelters and dually-licensed facilities. The GCQSA was designed to assess core elements of quality thought to be applicable to a broad range of settings. Therefore, any facility licensed by the Department is eligible to participate in the GCQSA. However, some items are automatically not applicable to short-term placements (e.g., Domain 7, Item 3: For youth who stay in this program for a full school year, the majority (over 60%) progressed into a higher grade). These items were identified in the early pilots and are automatically skipped in the assessments of shelters.
Theme 4: Missing Forms and Follow-Up
The call discussions often focused on missing forms and follow-up efforts by the licensing specialists to prompt form completion. Some incomplete data was expected due to introducing a new process. Missing forms occurred for all respondent types. However, in the early phases of the statewide pilot, it was observed that missing forms were occurring most frequently among the lead agencies. Reasons underlying this issue included some programs having no identifiable lead agency and, for others, a lag in determining who at the lead agency should be responsible for completing the form (i.e., directing the form to the right person). A list of programs with missing forms for the various respondent types was provided during each call to guide where licensing specialists should direct follow-up emails or calls. Follow-up calls and/or emails were found to be highly effective and were often followed by observable spikes in form completion in the data. The early trend of missing forms among lead agencies was reversed, with overall high levels of completion, including 100 percent completion in the Central region. Some licensing specialists reported having to engage in multiple follow-up attempts, which could be time consuming. The ‘Tracking Report’, a link to a Qualtrics-generated report consisting of a table displaying current data on form completion for each program, was created to aid in tracking form completion. When asked, the licensing specialists who used the report said they found it helpful. In a few instances, forms that were considered missing were found to be partially completed but not submitted. Another issue contributing to missing forms was respondents completing forms under the wrong professional title (e.g., a lead agency completed the licensing specialist form) or the wrong program. Working with the licensing specialists, several of these forms were located in the data and were matched with the correct program. In a few instances, the licensing specialists asked providers to complete the forms again.
Theme 5: Clarification on How to Assess Certain Items
The fifth theme included a compilation of questions all aimed at how to best assess a given practice. In some cases, the questions were related to understanding language, how to generate the best available evidence (i.e., what criteria to assess or look for), or how to evaluate an item when limited evidence was available. These questions (along with the issues described above) will be presented with responses in a “Frequently Asked Questions” component of the revised training and manual. Examples included the following:
1. What should be considered as a service/treatment plan for group homes? Can it include the plan created by the Department or a contracted therapist?
2. If a comprehensive assessment is provided to the group home by the CBC, does this count?
3. If the program uses contracted providers, does this count as the program providing trauma treatment? What if the type of therapy is not specified?
4. If the group home is in the process of having an evaluation done but it is not completed, should they be rated as if they are not using an evidence-based model?
5. Is there a difference between evidence-based and evidence-supported? The Sanctuary Model states that it is theory-based and evidence-supported; should this be considered evidence-supported?
GROUP CARE QUALITY STANDARDS ASSESSMENT PSYCHOMETRICS
Confirmatory Factor Analysis
Figure 11 shows the multidimensional measure of quality in residential care based on the Group Care Quality Standards defined by the workgroup. Each domain represents a key area of practice contributing to quality of care. The figure depicts the hypothesized latent structure underlying the GCQSA as measured by each of the different forms (Youth, Licensing, Lead Agency, Director, Direct Care Worker), with the items capturing the observable practice elements within each domain. The arrows indicate a correlated-factor structure (e.g., D1, D2, D3, etc.). Note, the Youth Form does not contain a scale for Domain 5 (Professional, Competent Staff).
Figure 12. Domains and Items Comprising the GCQSA Form
Psychometric analyses included a confirmatory factor analysis using weighted least squares estimation to test the hypothesized eight-factor model (seven factors for the Youth Form) and internal consistency reliability. Preliminary item-level analyses were performed to assess missing data and skewness, resulting in dropping item 5.1 ("The clinical team meets weekly to discuss treatment progress") from Domain 5 due to more than 90 percent missing data on that item. Because each of the five GCQSA forms must be separately validated, the psychometric results are abbreviated and combined across forms for conciseness. The results are presented as follows: 1) results from the confirmatory factor analysis, including model fit statistics for the original and re-specified models; 2) a summary of model re-specifications (e.g., items deleted) for each form; and 3) Cronbach's alpha coefficients for the domains of all five forms.

Table 11 shows the results comparing model fit for the original and re-specified models. A guide for interpreting cut-off values for each fit statistic is shown at the bottom of the table.

Table 11. Confirmatory Factor Analysis Model Fit for Original and Re-specified Models

Form                 Model (items)       χ2/df   TLI   CFI   RMSEA   WRMR
Youth                Original (68)       2.24    .90   .90   .05     1.60
Licensing            Original (84)       2.71    .87   .87   .11     2.77
Licensing            Re-specified (76)   2.15    .92   .93   .08     1.87
Lead Agency          Original (81)       2.70    .88   .88   .10     2.33
Lead Agency          Re-specified (73)   2.18    .93   .93   .08     1.70
Director             Original (82)       2.60    .85   .86   .08     2.94
Director             Re-specified (74)   2.17    .90   .91   .07     2.15
Direct Care Worker   Original (82)       4.07    .83   .84   .08     3.15
Direct Care Worker   Re-specified (77)   2.33    .92   .93   .05     2.07

Note. χ2/df = chi-square/degrees of freedom (cut-off < 3); TLI = Tucker-Lewis Index (cut-off > .90); CFI = Comparative Fit Index (cut-off > .90); RMSEA = root mean square error of approximation (cut-off < .08); WRMR = weighted root mean square residual (cut-off < 1.0).
Note. D1 = Assessment, Admission, and Service Planning; D2 = Positive, Safe Living Environment; D3 = Monitor and Report Problems; D4 = Family, Culture, and Spirituality; D5 = Professional, Competent Staff; D6 = Program Elements; D7 = Education, Skills, and Positive Outcomes; D8 = Pre-Discharge/Post-Discharge Processes.
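The report specifies weighted least squares estimation but not the software used for the confirmatory factor analyses. As a rough illustration only, the following is a minimal sketch assuming the Python semopy package and a hypothetical two-domain slice of the model; the item names (d1_1, ...), the simulated ratings, and the objective name are placeholders, not project code or GCQSA data.

```python
# Minimal sketch (not the project's code) of a correlated-factors CFA of the kind
# described above, using the semopy package. Item names, the two-domain model, and
# the simulated data are hypothetical stand-ins for the full eight-domain GCQSA model.
import numpy as np
import pandas as pd
from semopy import Model, calc_stats

# Simulate ordinal-looking 1-5 item ratings driven by two correlated latent factors.
rng = np.random.default_rng(0)
n = 450
latent = rng.multivariate_normal([0, 0], [[1, .5], [.5, 1]], size=n)
items = {}
for d in (0, 1):
    for i in range(4):
        y = .8 * latent[:, d] + rng.normal(scale=.6, size=n)
        items[f"d{d + 1}_{i + 1}"] = np.clip(np.round(3 + y), 1, 5)
df = pd.DataFrame(items)

desc = """
D1 =~ d1_1 + d1_2 + d1_3 + d1_4
D2 =~ d2_1 + d2_2 + d2_3 + d2_4
D1 ~~ D2
"""

model = Model(desc)
model.fit(df, obj="WLS")      # WLS-type objective; the exact name may vary by semopy version

stats = calc_stats(model)     # one-row DataFrame of fit statistics
print(stats[["DoF", "chi2", "CFI", "TLI", "RMSEA"]])
print("chi2/df =", round(stats["chi2"].iloc[0] / stats["DoF"].iloc[0], 2))
```

The correlated-factors specification (D1 ~~ D2) mirrors the structure described for Figure 11, and the printed statistics can be compared against the cut-offs listed beneath Table 11.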
Summary of Results and Re-specifications Model fit was achieved for all forms following re-specifications. Models were re-specified based on item factor loadings, item cross-loadings, and modification indices. Items with factor loadings < .35 were considered to be poorly correlated with other items in the scale and were excluded. Item cross-loading occurs when an item loads highly with items on a different scale (e.g., item 1 of domain 1 is also strongly correlated with items in domain 2). Cross-loading can contribute to poorly delineated scales and reduce model fit. However, some cross-loading was allowed, provided the item clearly loaded more highly on the scale it was intended for and this did not negatively impact model fit. Modification indices were also used to determine whether to drop an item that was highly correlated with items that were not part of the same subscale. If dropping the item would improve the model chi-square value by .25 or greater, the item was excluded from the re-specified model. In addition, the model was adjusted by adding statements allowing for correlated error variance among some of the items belonging to the same domain (i.e., scale).
Detailed results for all forms are available upon request.
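As a concrete illustration of the screening rules just described, the sketch below applies the .35 loading threshold and a cross-loading check to a hypothetical long-format table of standardized loadings; the table layout, item labels, and loading values are assumptions for illustration, not an export format or results from the project.

```python
# Sketch of the item-screening rules described above, applied to a hypothetical
# long-format table of standardized loadings. Columns: item, domain, loading,
# intended (True when 'domain' is the item's intended scale). Values are made up.
import pandas as pd

loadings = pd.DataFrame([
    ("item_a", "D1", 0.31, True),  ("item_a", "D2", 0.22, False),
    ("item_b", "D1", 0.78, True),  ("item_b", "D2", 0.24, False),
    ("item_c", "D6", 0.41, True),  ("item_c", "D2", 0.52, False),
], columns=["item", "domain", "loading", "intended"])

# Rule 1: items loading < .35 on their intended domain are candidates for exclusion.
primary = loadings[loadings["intended"]].set_index("item")["loading"]
weak = primary[primary < 0.35].index.tolist()

# Rule 2: items loading at least as highly on another domain as on their intended
# domain are flagged as cross-loading (retained only if this does not harm fit).
max_other = loadings[~loadings["intended"]].groupby("item")["loading"].max()
flags = primary.to_frame("primary").join(max_other.rename("max_other"))
cross = flags[flags["max_other"] >= flags["primary"]].index.tolist()

print("Low-loading candidates:", weak)       # ['item_a']
print("Cross-loading candidates:", cross)    # ['item_c']
```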
If the modification indices showed that allowing a pair of items to correlate would improve the model chi-square by .25 or greater, a statement was added to the model. The results for each of the forms are summarized below. The five items that were excluded from two or more forms were 1.6, 5.7, 6.3, 7.6, and 7.8.

Youth Form (N = 450)
All cut-off criteria except the Weighted Root Mean Square Residual were met when fitting the original model including all 68 items for the Youth Form. The WRMR was the only fit index that did not meet its cut-off; this index is given less weight than other fit statistics, and some statisticians caution against its use as a sole measure of fit due to unreliability. Therefore, the Youth Form required no re-specification, with the results supporting its factorial validity.
Licensing Form (N = 152)
Eight items were excluded from the re-specified model (see Table 12). One item (7.1) was dropped due to a low factor loading (loading = .115), indicating that the item was poorly correlated with the other items in its scale. The remaining seven items were dropped due to item cross-loadings and/or modification indices indicating that dropping the item would result in a .25 or greater improvement in model fit. Nine 'with' statements were added allowing for correlated error variance between the following items: 1.10 with 1.15; 1.1 with 1.2; 1.14 with 1.12; 1.12 with 1.6; 1.14 with 1.6; 1.13 with 1.7; 6.7 with 6.6; 7.9 with 7.4; and 7.6 with 7.5.

Table 12. Items Excluded from the Re-Specified Licensing Form
4.5  Youth have opportunities to participate in community activities.
5.7  Staff receive training and demonstrate competency in teaching positive social skills.*
6.3  Youth are allowed to have personal belongings, and personalize their rooms.*
7.1  The program ensures that youth receive on-going comprehensive educational assessments needed to determine their educational needs.
7.8  The program focuses on reducing behavioral issues and symptoms in youth.*
8.4  Before a youth is discharged to a new placement, he or she is given a period of transitional time to become familiar and comfortable with the new placement.
8.6  Within 30 days after discharge, the program follows up with youth and their caregivers to check whether they are connected with aftercare services and other supports.
8.7  The program follows up with youth and their caregivers to monitor post-discharge outcomes.
Note. *Asterisks indicate items that were excluded across multiple forms.

Lead Agency (N = 183)
Eight items were excluded from the re-specified lead agency form (see Table 13). Items 1.7 and 1.8 demonstrated an extremely high inter-item correlation (r = .912), indicating possible duplication; the model was re-run after dropping item 1.7, resulting in improved model fit. Similar results were found for items 5.7 and 5.4 (r = .916) and for items 8.1 and 8.2 (r = .98), resulting in the decision to drop items 5.7 and 8.2. The remaining items (1.6, 6.3, 7.2, 7.6, 7.8) were excluded due to cross-loading on other scales and modification indices indicating that dropping the item would result in a model fit improvement greater than .25. Four statements allowing correlated error variance between items within the same scales were added to the re-specified model (3.2 with 3.1; 1.10 with 1.5; 1.2 with 1.1; and 5.3 with Domain 5).

Table 13. Items Excluded from the Re-Specified Lead Agency Form
1.6  As much as possible, the views of other providers are considered when making admissions decisions.*
1.7  Documented service plans focus on individual strengths and needs.
5.7  Staff receive training and demonstrate competency in teaching positive social skills.*
6.3  Youth are allowed to have personal belongings, and personalize their rooms.*
7.2  Youth receive the educational supports they need to succeed.
7.6  Staff teach positive social skills, values, and behaviors to youth in the program.*
7.8  The program focuses on reducing behavioral issues and symptoms in youth.*
8.2  Transition plans include a focus on the continuity of family relationships.
Note. *Asterisks indicate items that were excluded across multiple forms.
Director (N = 272)
Eight items were excluded from the re-specified model (see Table 14). Items 6.15 (loading = .137) and 3.3 (loading = .221) were excluded due to low factor loadings. The remaining items were excluded due to cross-loading on other scales and modification indices indicating that dropping the item would result in a model fit improvement of .25 or greater. One statement allowing correlated error variance between items 2.4 and 2.5 was added to the final, best-fitting model.

Table 14. Items Excluded from the Re-Specified Director Form
1.3   Evidence-based assessments are used to inform service planning.
3.3   Youth may contact an outside advocate (e.g., GAL, case manager, child advocate) to share concerns about their care.
5.7   Staff receive training and demonstrate competency in teaching positive social skills.*
6.3   Youth are allowed to have personal belongings, and personalize their rooms.*
6.4   Youth are allowed to have a choice in the clothes they wear.
6.15  Psychiatrists monitor youths’ psychotropic medication regimens at least once a month.
7.6   Staff teach positive social skills, values, and behaviors to youth in the program.*
7.8   The program focuses on reducing behavioral issues and symptoms in youth.*
Note. *Asterisks indicate items that were excluded across multiple forms.

Direct Care Worker (N = 450)
Five items (see Table 15) were excluded from the re-specified model based on item cross-loading with items in other scales and modification indices indicating that dropping the item would result in improved model fit (>.25).

Table 15. Items Excluded from the Re-Specified Direct Care Worker Form
1.6   As much as possible, the views of other providers are considered when making admission decisions.*
1.12  Service plans include a focus on increasing children’s family and natural supports.
1.14  Service plans are reviewed and updated with youth and their family at least every 90 days.
5.7   Staff receive training and demonstrate competency in teaching positive social skills.*
6.3   Youth are allowed to have personal belongings, and personalize their rooms.*
6.4   Youth are allowed to have a choice in the clothes they wear.
Note. *Asterisks indicate items that were excluded across multiple forms.

Reliability analysis
Cronbach’s alpha coefficients were computed using SPSS for each domain of the final, re-specified scales. Reliability coefficients of .70 or higher are considered evidence of acceptable (or better) reliability. As shown in Table 16, all domains exceeded the minimum criterion, demonstrating good to excellent reliability. The standard error of measurement (SEM) reflects the average amount of error around an observed score and thus the extent to which the scale provides an accurate measure; smaller SEM values indicate a more accurate measure.

Table 16. Reliability Coefficients by Domain and Form

        Youth          Licensing      Lead Agency    Director       Direct Care Worker
        α (SEM)        α (SEM)        α (SEM)        α (SEM)        α (SEM)
D1      .877 (.307)    .942 (.248)    .935 (.228)    .944 (.153)    .907 (.205)
D2      .887 (.236)    .919 (.187)    .937 (.167)    .937 (.088)    .816 (.165)
D3      .830 (.367)    .883 (.259)    .931 (.192)    .928 (.085)    .805 (.200)
D4      .875 (.245)    .884 (.231)    .931 (.183)    .931 (.083)    .871 (.153)
D5      ---            .794 (.382)    .875 (.280)    .875 (.137)    .825 (.190)
D6      .920 (.189)    .932 (.174)    .961 (.134)    .959 (.062)    .873 (.121)
D7      .872 (.308)    .813 (.317)    .929 (.214)    .929 (.116)    .869 (.147)
D8      .862 (.366)    .862 (.446)    .930 (.323)    .943 (.218)    .886 (.255)
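To make the reliability quantities concrete: Cronbach's alpha is a function of the number of items, the item variances, and the variance of the scale total, and the SEM is conventionally computed as the scale's standard deviation multiplied by the square root of one minus its reliability. The sketch below computes both for one hypothetical domain scale in Python (the report computed alpha in SPSS); the item columns and simulated ratings are illustrative only.

```python
# Sketch: Cronbach's alpha and the standard error of measurement (SEM) for one
# domain scale, from an item-by-respondent matrix of 1-5 ratings. Simulated
# data stand in for GCQSA responses; the report computed alpha in SPSS.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """items: rows = respondents, columns = the items of one domain scale."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

def sem(scale_scores: pd.Series, reliability: float) -> float:
    """Standard error of measurement: SD * sqrt(1 - reliability)."""
    return scale_scores.std(ddof=1) * np.sqrt(1 - reliability)

# Hypothetical five-item domain scale for 450 respondents.
rng = np.random.default_rng(1)
true_score = rng.normal(4.2, 0.6, size=450)
items = pd.DataFrame({f"item_{i}": np.clip(np.round(true_score + rng.normal(0, 0.5, 450)), 1, 5)
                      for i in range(1, 6)})

alpha = cronbach_alpha(items)
scale = items.mean(axis=1)   # domain score = mean of the items
print(f"alpha = {alpha:.3f}, SEM = {sem(scale, alpha):.3f}")
```

Computed this way on the domain-score metric, the SEM values in Table 16 appear to line up with the domain-score standard deviations reported in Table 19 scaled by the square root of one minus alpha.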
CORRELATIONS BETWEEN THE GCQSA RATINGS AND PROGRAM INDICATORS
Correlations between mean ratings from the Youth and Licensing Forms and select program indicators were examined. The program indicators included six items measuring counts of various incidents (i.e., youth hospitalizations, staff injuries, youth injuries, calls for law enforcement response, runaway episodes, and the total number of combined incidents) that occurred in the group home within the past six months. Program indicators were treated as possible proxies for program quality, with the expectation that quality ratings would be inversely related to incidents; that is, higher quality ratings would be correlated with fewer incidents. However, it is important to acknowledge the exploratory nature of these analyses. Few, if any, prior studies have examined associations between elements of quality and program characteristics, and many other factors could contribute to the occurrence of the types of incidents measured. For instance, a group home known to take youth with severe externalizing behaviors might be expected to experience more incidents, regardless of quality. These results should therefore be viewed with caution, as preliminary analyses aimed at exploring associations between quality ratings and various program indicators. Correlations between domain scores, item ratings, and total scores were examined, with full results presented in Appendix B. The criteria used to identify practically and statistically relevant correlations were a probability value < .10 and an effect size > .1. Correlation coefficients presented in Appendix B between .1 and .2 are considered small, those between .2 and .5 medium, and those greater than .5 large.
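The screening just described (retain correlations with p < .10 and |r| > .1) can be expressed compactly. The sketch below shows the general pattern using scipy's Pearson correlation; the domain and indicator column names and the simulated program-level data are hypothetical, not GCQSA results.

```python
# Sketch of the correlation screening described above: Pearson correlations between
# domain scores and program indicators, retained when p < .10 and |r| > .10.
# The simulated data and column names are hypothetical, not GCQSA results.
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n_programs = 120
quality = rng.normal(4.2, 0.5, n_programs)
df = pd.DataFrame({
    "D2": quality,
    "D6": quality + rng.normal(0, 0.3, n_programs),
    "police_calls": np.round(np.clip(rng.normal(6 - quality, 2.0), 0, None)),
    "runaways": rng.poisson(2, n_programs),
})

domains = ["D2", "D6"]
indicators = ["police_calls", "runaways"]

kept = []
for d in domains:
    for ind in indicators:
        pair = df[[d, ind]].dropna()
        r, p = pearsonr(pair[d], pair[ind])
        if p < .10 and abs(r) > .10:   # the report's screening criteria
            kept.append({"domain": d, "indicator": ind, "r": round(r, 3), "p": round(p, 3)})

print(pd.DataFrame(kept))
```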
Youth Form
As shown in Appendix B, consistent with expectations, negative correlations were observed between several domain scores and four of the six indicators, along with several item-level correlations. These expected results indicate that higher domain scores and/or item ratings were associated with fewer incidents. Table 17 provides a succinct summary of the domains and the number of items demonstrating negative correlations with each type of incident that met the specified criteria for effect size and probability value. Higher scores on the domains (higher quality) most strongly predicted fewer youth injuries and police calls, followed by total incidents and runaway episodes. Hospitalizations and staff injuries were not correlated with any of the domains; however, some item-level correlations were observed. Domains 2 (Positive, Safe Living Environment) and 6 (Program Elements) were correlated with the most program indicators (4 out of 6), while Domain 5 (Professional & Competent Staff) was not statistically correlated with any program indicator.

Table 17. Summary of Domain and Item Correlations with Program Indicators – Youth Form

                            D1   D2   D3   D4   D5   D6   D7   D8
Hospitalization   Dom.      -    -    -    -    -    -    -    -
                  # Items   1    2    1    -    -    3    2    -
Staff injury      Dom.      -    -    -    -    -    -    -    -
                  # Items   -    -    -    1    -    -    -    -
Youth injury      Dom.      ✓    ✓    ✓    ✓    -    ✓    ✓    -
                  # Items   5    10   2    3    -    8    4    1
Police calls      Dom.      ✓    ✓    ✓    ✓    -    ✓    ✓    ✓
                  # Items   5    6    2    3    -    10   5    3
Runaway           Dom.      -    ✓    -    -    -    ✓    -    -
                  # Items   2    4    1    -    -    7    3    2
Total incidents   Dom.      ✓    ✓    -    ✓    -    ✓    ✓    ✓
                  # Items   2    6    2    1    -    7    5    2

Note. Dom. (✓) = domain score correlation meeting the criteria; # Items = number of items within the domain meeting the criteria (see Appendix B for coefficients).
Correlations were also examined between mean domain scores and item ratings on the Licensing Form and the six program indicators. In the initial analyses, several unexpected positive (and statistically non-significant) correlations were observed, suggesting in some cases that higher quality ratings were associated with more incidents. Subsequent analyses added control variables to examine whether extraneous variables might be influencing the relation between quality ratings and program indicators. The results showed that the number of youth served in a program was positively associated with all six indicators (p < .05); as might be expected, as the number of youth served in the program increased, the number of incidents also increased. Correlations between quality ratings and program indicators were therefore re-examined controlling for the number of youth served. After accounting for the variance in program indicators associated with the number of youth served, several negative and statistically relevant correlations between quality ratings and the program indicators emerged. These results are summarized in Table 18. Domain 7 (Education, Skills, and Positive Outcomes) was correlated with fewer police calls; otherwise, all correlations were observed at the individual item level. Table 18. Summary of Domain and Item Correlations with Program Indicators – Licensing Form
      Hospitalization    Staff injury       Youth injury       Police calls       Runaway            Total incidents
      Dom.   # Items     Dom.   # Items     Dom.   # Items     Dom.   # Items     Dom.   # Items     Dom.   # Items
D1    -      -           -      -           -      -           -      -           -      -           -      -
D2    -      5           -      4           -      1           -      4           -      2           -      2
D3    -      1           -      1           -      -           -      -           -      -           -      -
D4    -      -           -      1           -      -           -      1           -      1           -      1
D5    -      1           -      -           -      -           -      1           -      -           -      -
D6    -      1           -      2           -      -           -      8           -      4           -      4
D7    -      1           -      1           -      -           ✓      1           -      -           -      -
D8    -      -           -      -           -      -           -      3           -      -           -      -

Note. Dom. (✓) = domain score correlation meeting the criteria after controlling for the number of youth served; # Items = number of items within the domain meeting the criteria (see Appendix B for coefficients).
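The control analysis summarized in Table 18 amounts to a partial correlation: the association between a quality rating and an incident count after removing the variance each shares with the number of youth served. One common way to compute it is to residualize both variables on the covariate and correlate the residuals, as in the hedged sketch below; the column names and simulated data are hypothetical, not the project's analysis code.

```python
# Sketch of a partial correlation between a quality rating and an incident count,
# controlling for the number of youth served, via residualization. Simulated data
# and column names are hypothetical.
import numpy as np
import pandas as pd
from scipy.stats import linregress, pearsonr

def residualize(y, x):
    """Return y with the linear effect of x removed."""
    fit = linregress(x, y)
    return y - (fit.intercept + fit.slope * x)

rng = np.random.default_rng(3)
n = 150
youth_served = rng.integers(4, 30, n).astype(float)
d7 = rng.normal(4.2, 0.6, n)                                   # Domain 7 mean rating
police_calls = np.clip(0.4 * youth_served - 1.5 * d7 + rng.normal(0, 2, n), 0, None)

df = pd.DataFrame({"D7": d7, "police_calls": police_calls, "youth_served": youth_served})

r, p = pearsonr(residualize(df["police_calls"], df["youth_served"]),
                residualize(df["D7"], df["youth_served"]))
# Note: p here uses n - 2 degrees of freedom; a dedicated partial-correlation
# routine would use n - 3, so treat the p-value as approximate.
print(f"partial r = {r:.3f}, p = {p:.3f}")
```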
GCQSA BASELINE RATINGS & SUPPORTING THEMES FROM OPEN-ENDED RESPONSES
GCQSA ratings reflect the extent to which a standard, or the combined standards within a domain, is being met (on average), with scores ranging from 1 = ‘not at all’, 2 = ‘a little’, 3 = ‘somewhat’, 4 = ‘mostly’, to 5 = ‘completely’. Figure 13 displays mean domain scores by respondent type, with error bars representing 95 percent confidence intervals (CI) around the mean. Relatively lower scores were observed for Domains 1 and 8. Scores were higher and more consistent across respondents for Domains 2 through 6. Scores of directors and direct care workers were notably consistent with one another and were higher and less varied relative to other respondent types. Scores of the licensing specialists were often lower than those of other respondents, more varied (as indicated by the width of the CIs), and, overall, most consistent with those of the lead agencies. Means, standard deviations, and the minimum and maximum scores by domain and respondent are presented in Table 19.

Figure 13. GCQSA Domain Means and 95% Confidence Intervals by Respondent Type
Note. Error bars = 95% confidence interval. Figure is re-scaled to the data.

Table 19. GCQSA Descriptive Statistics by Domain and Respondent Type

Domain                                           Respondent    N     Mean   SD     Min    Max
D1: Assessment, Admission, & Service Planning    Youth         450   4.04   .876   1.00   5.00
                                                 Lead Agency   182   3.90   .892   1.00   5.00
                                                 DCW           447   4.48   .671   1.00   5.00
                                                 Director      272   4.45   .646   1.00   5.00
                                                 Licensing     159   3.65   1.03   1.00   5.00
D2: Positive, Safe Living Environment            Youth         450   4.54   .702   1.00   5.00
                                                 Lead Agency   177   4.42   .668   1.00   5.00
                                                 DCW           430   4.77   .384   1.33   5.00
                                                 Director      265   4.78   .352   1.00   5.00
                                                 Licensing     159   4.41   .656   2.00   5.00
D3: Monitor & Report Problems                    Youth         449   4.46   .809   1.00   5.00
                                                 Lead Agency   179   4.42   .731   1.00   5.00
                                                 DCW           450   4.72   .447   2.29   5.00
                                                 Director      272   4.83   .319   2.14   5.00
                                                 Licensing     159   4.16   .759   2.00   5.00
D4: Family, Culture, & Spirituality              Youth         450   4.53   .693   1.00   5.00
                                                 Lead Agency   178   4.45   .695   1.00   5.00
                                                 DCW           450   4.77   .427   2.33   5.00
                                                 Director      272   4.83   .316   2.00   5.00
                                                 Licensing     158   4.30   .677   2.00   5.00
D5: Professional & Competent Staff               Youth         0     -      -      -      -
                                                 Lead Agency   176   4.31   .790   1.00   5.00
                                                 DCW           450   4.77   .447   1.80   5.00
                                                 Director      271   4.80   .387   2.00   5.00
                                                 Licensing     158   4.16   .843   2.00   5.00
D6: Program Elements                             Youth         450   4.51   .667   1.00   5.00
                                                 Lead Agency   179   4.52   .628   1.00   5.00
                                                 DCW           450   4.78   .340   2.00   5.00
                                                 Director      272   4.84   .307   1.87   5.00
                                                 Licensing     159   4.14   .666   2.00   5.00
D7: Education, Skills, & Positive Outcomes       Youth         449   4.26   .861   1.00   5.00
                                                 Lead Agency   177   4.37   .803   1.00   5.00
                                                 DCW           448   4.78   .405   2.86   5.00
                                                 Director      209   4.77   .434   1.40   5.00
                                                 Licensing     159   4.24   .734   1.71   5.00
D8: Pre-Discharge/Post-Discharge Processes       Youth         389   4.25   .985   1.00   5.00
                                                 Lead Agency   146   3.58   1.22   1.00   5.00
                                                 DCW           370   4.47   .756   1.00   5.00
                                                 Director      208   4.20   .910   1.00   5.00
                                                 Licensing     127   3.64   1.21   1.00   5.00

Note. DCW = Direct Care Worker.
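For completeness, the sketch below shows one way to compute the per-respondent domain means and t-based 95 percent confidence intervals of the kind plotted in Figure 13. The long-format layout, column names, and simulated scores are assumptions for illustration only.

```python
# Sketch of the Figure 13 computation: mean domain score and a t-based 95%
# confidence interval per domain and respondent type. The long-format data and
# column names are hypothetical.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(4)
rows = []
for resp in ["Youth", "Lead Agency", "DCW", "Director", "Licensing"]:
    for domain in ["D1", "D2"]:
        for score in np.clip(rng.normal(4.3, 0.8, 150), 1, 5):
            rows.append({"respondent": resp, "domain": domain, "score": score})
scores = pd.DataFrame(rows)

def mean_ci(x: pd.Series, level: float = 0.95) -> pd.Series:
    x = x.dropna()
    m = x.mean()
    half = stats.t.ppf(1 - (1 - level) / 2, df=len(x) - 1) * x.sem()
    return pd.Series({"n": len(x), "mean": m, "ci_low": m - half, "ci_high": m + half})

summary = scores.groupby(["domain", "respondent"])["score"].apply(mean_ci).unstack()
print(summary.round(2))
```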
QUALITATIVE THEMES RELATED TO QUALITY RATINGS
In addition to domain scores, open-ended comments from the licensing specialists were used to facilitate an increased understanding of the ratings and of current practice in residential care related to the quality standards. Open-ended responses from 152 licensing forms were independently reviewed by two members of the research team and then discussed until agreement was reached on the common themes. Comments were organized into the following themes, listed from most to least frequent: Family Involvement (27 comments), Behavior Management (21 comments), Facilitating Quality Improvement (20 comments), Documentation (17 comments), and Program Models (15 comments). Each of these themes is described below along with supporting comments and direct quotes from the survey.
Theme 1: Family Involvement Licensing specialist comments reflected their views that the group homes were generally supportive of family involvement and reunification efforts. The most frequently mentioned way programs support family involvement was by facilitating family visits, through scheduling visits or, in some programs, providing transportation. Other examples included a program that assists with paying for hotel rooms or allows out-of-town families to stay in a cabin on the campus. The most common reasons for limited family involvement across programs were summarized in the following comment: “Parental/familial involvement is limited due to TPR [terminated parental rights] or lack of involvement by the parent with case management; distance is also a barrier as all youth placed are from out of the area…” However, some programs were praised for their efforts to include families, as reflected in the following: “[Group home] does a fantastic job assisting with parent visitations and including bio-parents when possible.”
Theme 2: Behavior Management Although several specialists commented on behavior management, the comments were highly consistent and most often simply stated that the program does not use physical restraints or seclusion room placements as a method of behavioral management. In a few instances, it was noted that the program had used physical restraints but “only as needed”. In addition, licensing specialists frequently commented that programs used methods of “verbal de-escalation” to manage behavioral incidents.
Theme 3: Facilitating Quality Improvement The GCQSA provided an opportunity for providers to assess current practices and for licensing specialists to offer recommendations on how to more closely align practices with the standards, as reflected in comments such as “Recommendations were made to improve the detail in their incident reports.” and the notes below: “Service plan is an area that was focused on during relicensure. The program ensures moving forward that it includes and meets the quality group home standards. Also, the CANS assessment was provided to the agency.” “Service plans are more focus[ed] on clinical goals. Provider is now ensuring that service plans are focus[ed] on safety, permanency and well-being.” As reflected in these comments, providers focused on specific areas and set goals to address quality improvements based on the standards, such as “The provider[’s] goal for next licensing year is to have all parties involved with the child participate in the service plan.” In at least one instance, the licensing specialist was able to address a misconception about certain practices, as reflected in the following comment: “The program was told by [name redacted] that they could not practice or encourage spiritual beliefs; this was corrected during licensing visit.” These comments, along with documented discussions during the triage calls, provide emerging evidence that the GCQSA is having positive impacts on group home practice consistent with its intended goal of facilitating continuous quality improvement.
Theme 4: Documentation The licensing specialists frequently commented on limited documentation, or a lack of documentation, in several areas including training, supervision, service plans, and discharge. This created difficulty in assessing these practices, resulting in recommendations for providers to improve their documentation, as reflected in the following comments: “A good amount of discharges are from disruptions; however, the program informally follows up with youth, they do lack documentation of this and were encouraged to put something in place.” “The agency is providing appropriate services, however it recommended that the agency include and improve documentation to include all that is required in the rule such as goals and progress information on vocational, physical and behavioral health needs.” “Agency had more than 40 hours on file for staff! That was great to see! Agency now knows that frequent staff supervision should be documented and that is one of the goals for next licensing year.” In some cases, comments reflected that recommendations had been put into effect and resulted in notable improvements: “The agency has shown a huge improvement in regard to documentation where the training is now captured and clear.”
Theme 5: Program Model Several comments were related to programs’ use of a model. Most comments indicated that the program had no specified model, documented manual, and/or handbook (e.g., “The program has not developed an evidence-informed model of care”, “The agency staff are training in trauma-informed care and oversight is provided by the CBC however they do no[t] have a manual.”). Among programs using existing evidence-supported models, the Sanctuary and Teaching Family Model were both mentioned. In other cases, comments reflected that while the program did not currently have a model, they were working toward developing one (e.g., “The program is in the process of identifying a model that will best fit their facility.”).
YOUTHS’ REFLECTIONS ON EXPERIENCE IN RESIDENTIAL CARE
Of the 450 youth assessments, 119 youth participants provided open-ended responses related to the standards and their experiences in residential care. Two members of the research team conducted initial reviews of the open-ended response data; subsequent analyses focused on coding text into the following four themes: 1) Positive Comments on Group Home; 2) Negative Comments on Group Home; 3) Positive Comments on the GCQSA; and 4) Negative Comments on the GCQSA. Summary descriptions of content related to each broad theme, along with direct quotes pulled from youth comments, are presented below.
Theme 1: Positive Comments on Group Home In total there were 70 comments within this category (the largest proportion); that is, the majority of the 119 youth who provided comments on their assessments expressed feeling positively about their group homes. Youth reflected on feeling well treated, supported, and cared for (25 quotes), and others described their current placement as “the best I’ve ever been” (9 quotes). “I did not know I was coming to [group home] and at first I did not want to be here. But over the last year or so I like it. My house parents are always supportive of me and at times I don’t like to show it but I appreciate them and all of the staff at [group home]. My case coordinator is easy to talk to and always makes me feel like i am a priority. They feel like family to me. They are annoyingly right at times and are pushing me to be a better person.” “Born into foster care and I must say this is the hands down best group home I've ever been in. The staff and [staff member] have changed my whole mindset and life, they've really taught me how important my life is and my child. And they've helped me tremendously in school and getting my GED. I could go on and on about this home and how much love they have given me even after all the disrespect I've put them through. It's been nothing but love and forgiveness which I've never felt in the [age] I've been alive. I can't even explain how much of an amazing place this is.” Youth comments also reflected on the personal growth and self-improvement (i.e., life skills development) they experienced since their placement in the group home (8 quotes). Oftentimes, youth reflected on a special connection with a particular staff member and how that relationship facilitated positive change (7 quotes). A number of youths stated that their group home felt like “a family” or “home”. In addition, youth expressed that they felt safe in their group homes (8 quotes) and that there was good communication, both in expectations from staff and in feeling their concerns were heard (5 quotes). The following quotes exemplify these themes: “We are like a family go through struggles and we pray and we love to eat.” “this program was vary difficult it definitely pushed me to my limit. staff always new how to keep me and everyone around me safe and made me and the others feel like they could talk to them, without judgement.” “Overall, I feel very confident in this program improving my social skills, anger management, and my home situation. The staff keep me safe, and help me learn to do the right thing.”
Theme 2: Negative Comments on Group Home In total there were 25 comments reflecting negative views of group home experiences or specific circumstances or situations youth would like to see changed or improved. The most frequent comments related to feeling that staff do not listen or respond to their needs adequately (7 quotes) and to various ways in which the environment was viewed as restrictive (6 quotes). “I feel like my opinion is not valued by my fellow staffs. I am constantly ignored and is not treated with respect. My needs are not being met at [group home]. We are isolated from the community most of the time. This is an ok placement with lots of improvements needed.” Comments on environmental restrictions were more varied and included remarks such as “yes phone calls but no phone until I’m 15”, “I want doors on the shower. I want wifi. I am somewhat satisfied with my care at [group home]” and “Would like more community time, more allowance, and more activities”. Four comments related to access to church, either reflecting not enough access, as in the following quote: “If staff does not feel like taking us to church, they won’t take us…”, or feeling forced to attend: “We get force to go to church every Sunday”. Some youth commented on having limited or no knowledge of their service plan (4 quotes). Other comments reflected views of staff as bullying them (“Sometimes staff are the bullies”) or feeling too closely monitored (“They are constantly looking over our shoulders”).
Theme 3: Positive Comments on the GCQSA There were five comments expressing positive views about the GCQSA—each related to the assessment as providing an opportunity for the youth to express their point of view, as reflected in the following: “this survey was very helpful and I hope my voice is heard so to the person reading this please acknowledge the fact that [group home] have helped me to become a much better person. Even though i understand my situation may not be the best I know that at [group home] i have a family here and they provide a and environment stimulating normality and stability” “Thank You for letting me fill this survey out because I really think that it helps when you guys really know what's going on at these places!”
Theme 4: Negative Views on the GCQSA This category comprised ten comments in which youth expressed dissatisfaction with the survey or commented on what could be improved. Comments within this category were generally more varied and less consistent relative to the other categories and included negative views about the structure or format of the survey (“too long”, “no good choices”, and “I would prefer to have it read to me”). In other cases, the comments reflected that youth needed more instruction on how to complete the survey (“I didn’t know how to complete this”). Some youth commented on items not being applicable due to their age (6 quotes).
DISSEMINATION Preliminary results from the statewide pilot were presented at the Association of Children’s Residential Centers Annual Conference in April 2019, the Florida Coalition for Children Conference in July 2019, and at the Children’s Bureau Evaluation Summit in August 2019.
Discussion
The purpose of the statewide pilot was to begin implementing the GCQSA in all six regions, giving stakeholders in each region an opportunity to become familiar with the assessment while providing careful monitoring and ongoing technical support. Assessment data were collected for one year from the full population of licensed residential care programs throughout Florida. Completion rates for the GCQSA were high overall across programs and the different respondent groups. The sample reflects a heterogeneous population of residential programs, with the majority falling into the category of ‘traditional group home’.

Based on requirements of the Family First Prevention Services Act, programs’ use of trauma-informed approaches and validated assessments was examined. Results of the combined analyses suggest trauma-informed approaches and validated assessments are being utilized in residential care; however, there is substantial room for growth in both areas. Themes within the qualitative analyses suggest providers are working toward finding ways (e.g., training, models, trauma assessment) to adapt their current practices and program models.

Results of the evaluation of the implementation process support the effectiveness of the current process. Themes from documented technical support calls revealed relevant implementation considerations. The calls provided an opportunity for the project team to discuss and determine ways to best address these issues during the statewide pilot and for the continued implementation of the GCQSA. These results will also be incorporated into an updated training curriculum and manual.

Substantial progress was achieved in the validation of the GCQSA. Specifically, evidence of factorial validity was established for all five forms, resulting in the identification of items to consider excluding from the assessment. In addition, all domain scales demonstrated good to excellent reliability. These are critical steps in the validation process, resulting in increased confidence in the GCQSA as a valid and reliable measure of quality residential care and in its readiness to undergo the final stages of the validation process. Most consistently, domain and item ratings on the youth form were negatively correlated with critical incidents (youth hospitalizations, staff injuries, youth injuries, law enforcement calls, runaway episodes, and total incidents). Results of the correlational analyses of the licensing form showed that mean scores on Domain 7 (Education, Skills, and Positive Outcomes) were negatively associated with calls for law enforcement; no other domains were statistically associated with critical incidents on the licensing form, although several item-level correlations were observed. Taken together, these results lend preliminary support for the construct validity of the GCQSA. Future analyses should be conducted to provide a more rigorous test of construct validity.

Analyses of open-ended responses provided additional insight into the item ratings, current practices in residential care, and, importantly, ways in which the GCQSA is prompting providers to assess their current practice and to begin aligning practices with the standards. The combined baseline scores, along with the open-ended responses, can guide efforts to support providers in adapting their programs to improve quality care. Finally, youth comments suggest that for many, their experiences in residential care have resulted in personal growth and meaningful connections, while other comments identify areas where youth are less satisfied with their care. It is important that views of residential care remain balanced, considering both positive and negative aspects; these findings shed light on both. Recommendations for supporting quality residential care and for preparing for the next phases of the GCQSA and the statewide accountability system, based on the results of the statewide pilot, are presented below.
Recommendations
• Provide assistance to help providers become trauma-informed. This could entail compiling a list of certified/qualified trauma-informed care trainers or experts who can provide technical support to aid providers in adapting their services and models.
• Provide guidance and recommendations to providers to support the consistent use of validated comprehensive assessment tools. Emphasis should be placed on finding assessments that can inform service planning and child-level progress monitoring and that provide aggregated program data that can be used to evaluate program outcomes and effectiveness.
• Use the GCQSA annually to monitor trends over time, progress toward FFPSA requirements, and the implementation of quality care based on the Group Care Quality Standards. Emerging evidence suggests the GCQSA is prompting positive changes in practice and may promote a process of continuous quality improvement.
• Refine the GCQSA forms based on the psychometric results. Decisions on which items to exclude will need to be made in preparation for the statewide validation study.
• Update the GCQSA training curriculum and manual based on findings from the implementation evaluation. It is recommended that the implementation protocol utilized during the pilot be maintained. Updates to the training may include (but are not limited to):
  ° Adding content to the training addressing the themes from the implementation evaluation, including a FAQs resource guide or link.
  ° Adding ‘opt-out’ selections to the Licensing Specialist Form for circumstances in which there is no lead agency or no youth to complete the form. In addition, to limit the amount of effort spent on follow-up, an option of ‘three follow-up attempts made with no response’ could be added.
  ° Adding content that would aid in matching forms to the correct program to minimize issues with missing assessment forms. This could include adding items requesting facility addresses, reviewing and refining the process currently in place for assigning unique identification numbers to match facilities to programs, and emphasizing the importance of consistent follow-up.
Next Steps
Based on the findings from the statewide pilot, updates will be made to the GCQSA tool and training in preparation for the statewide validation study scheduled to begin in January 2020. Booster trainings that will include the regional licensing teams, providers, and lead agencies will be scheduled for late November through December 2019. In addition, plans for two supplemental project components, the inter-rater reliability study and the outcomes development pilot, are underway. These studies represent the final phases in the validation process. The results of these combined studies will yield a highly rigorous validation of the GCQSA, with implications for informing quality residential services in Florida and beyond.
Appendix A. Florida Quality Standard Initiative Framework

Phase 1: Advocacy & engagement
Phase 2: Convene workgroup; develop standards
Phase 3: Develop GCQSA & implementation plan
Phase 4: Feasibility study; revise GCQSA
Phase 5: Field study; refine GCQSA
Phase 6: Statewide training; statewide pilot; partial validation
Phase 7: Full validation; evaluation
Phase 8: Full implementation; ongoing evaluation

Framework stages: Exploration & Development; Installation; Initial Implementation.
Appendix B. Correlations with Quality Ratings and Program Indicators

Correlations between GCQSA Youth Form Ratings & Program Indicators
Hospital
Staff Injury
Youth Injury
Police Calls
Runaway
Total Incidents
D1
Assessment, Admission, & Service Planning
-.144
-.045
-.250**
-.294**
-.141
-.208*
1.1
When I first got here, someone from this group home asked how I felt about coming here.
-.109
.062
-.292**
-.155
-.099
-.125
1.2
When I first got here, someone from this group home talked with me so that I knew what to expect.
-.278**
-.083
-.377**
-.447**
-.289*
-.353**
1.3
My service plan includes goals that help me stay safe, healthy, and doing well.
-.069
-.058
-.177
-.236*
-.061
-.161
1.4
My service plan includes goals that help prepare me to live with family or on my own.
-.160
.059
-.239*
-.160
-.186
-.151
1.5
I helped set the goal(s) in my current service plan.
-.094
.049
-.132
-.130
-.012
-.083
1.6
My family or others I am close with can help set my service plan goals if they want to.
-.060
-.069
-.082
-.111
-.109
-.091
1.7
Adults who have helped me like my case manager, GAL, or counselors help set the goals in my current service plan.
-.142
-.011
-.174
-.209*
-.057
-.111
1.8
My service plan can also include goals for my family or others I am close to so they can be involved in my progress.
-.114
-.006
-.150
-.336**
-.212*
-.221*
1.9
My service goals help me understand what is expected of me.
.096
-.074
.024
-.206*
.028
-.083
1.10
Staff talk with me about how I am doing on my goals and ask my opinion when making updates to my plan.
-.168
-.086
-.256*
-.161
-.030
-.130
D2
Positive, Safe Living Environment
-.106
.050
-.305*
-.222*
-.231*
-.203*
2.1
I have not been harmed or abused in this group home.
-.053
.106
-.095
-.014
-.072
-.024
2.2
I feel safe in the group home.
-.117
-.120
-.296*
-.340**
-.157
-.254*
2.3
Staff do not bully, threaten, or cuss at us.
-.022
.058
-.123
.039
-.128
-.007
2.4
Staff do not use physical punishment such as spanking, hitting, or pushing.
.004
.110
-.154
.004
-.077
-.012
2.5
Staff have talked to me about what my rights are while I am in this program.
-.101
-.020
-.339**
-.068
-.139
-.116
2.6
Staff respect my personal rights.
-.066
-.027
-.165
-.189*
-.119
-.156
2.7
My food, clothing, and personal hygiene needs are met in this program.
-.077
-.037
-.232*
-.133
-.115
-.126
2.8
Staff help me calm down when I get angry or upset.
-.028
.069
-.244*
-.079
-.077
-.061
2.9
If someone in the group threatens the safety of others, the staff know it and deal with it.
-.025
-.069
-.109
-.120
-.073
-.084
2.10
I have never been physically hurt by another kid in this program.
-.188*
-.058
-.248*
-.160
-.146
-.208*
2.11
In this program, kids don’t bully or threaten each other.
-.172
.057
-.163
-.264*
-.258*
-.266*
2.12
Staff teach us to respect and support our peers.
-.093
.019
-.213*
-.177
-.139
-.131
2.13
My peers in this group home are respectful and supportive toward each other.
-.118
-.046
-.238*
-.222*
-.274*
-.220*
2.14
Staff keep a close watch on us and jump in right away if things start to get out-of-hand.
.002
.068
-.205*
-.176
-.114
-.107
2.15
Staff use restraints or time out rooms only when there is no other way to keep us from getting hurt.
.040
.092
-.170
-.175
-.096
-.096
2.16
If I tried to hurt myself, staff would keep me from doing it.
-.142
.089
-.329**
-.256*
-.275*
-.220*
2.17
Staff take any talk of suicide or self-harm very seriously.
-.286*
.010
-.228*
-.429**
-.320*
-.361**
D3
Monitor & Report Problems
-.115
.028
-.291*
-.197*
-.110
-.163
3.1
If I’m concerned about my treatment here, I’m allowed to talk to someone like a guardian or counselor from outside.
-.127
.006
-.235*
-.083
-.094
-.090
3.2
Staff told me how to file a private complaint (grievance) about problems I might see or have here.
.027
-.001
-.147
-.131
-.002
-.086
3.3
I am comfortable with telling staff what I need.
.034
.055
-.100
-.020
.010
.014
3.4
Staff respond when I talk to them about things that I feel concerned about.
-.166
-.021
-.362**
-.291*
-.219*
-.264*
3.5
I feel like my concerns are taken seriously.
-.223*
.052
-.265*
-.247*
-.139
-.221*
D4
Family, Culture, & Spirituality
-.070
-.035
-.231*
-.276**
-.113
-.215*
4.1
While in this group home, I have been allowed to have phone calls or visit with my family or other important people in my life if I wanted to.
.026
-.017
-.066
-.110
.007
-.095
4.2
Staff talk to us about the importance of family and relationships.
-.011
.062
-.148
-.167
-.054
-.160
4.3
Staff are supportive of my relationship with my family and other important people in my life.
-.036
.032
-.269**
-.198
-.174
-.167
4.4
Staff try to find ways to involve my family or other important people in my life in my care.
-.040
.001
-.250**
-.217*
-.126
-.169
4.5
In this group home, we do things in the community like volunteer, attend community events, or go places like church, the swimming pool, or the movies.
-.049
-.035
-.104
-.246**
-.145
-.178
4.6
Staff here respect my culture and things that are important to me as part of my racial and ethnic identity.
-.147
-.098
-.271**
-.389**
-.074
-.239*
4.7
Staff here respect my sexual orientation and gender identity.
-.094
-.184*
-.207*
-.047
-.015
-.069
4.8
Staff here respect my religious or spiritual beliefs.
-.054
-.006
-.012
-.089
-.042
-.049
4.9
I am allowed to go places where I can practice my beliefs.
-.046
-.035
-.071
-.132
-.024
-.114
D6
Program Elements
-.148
.036
-.224*
-.401**
-.230*
-.309**
6.1
We do things kind of like a family here like eat meals together, play games, and have household chores.
.026
-.094
-.147
-.105
-.015
-.081
6.2
I am allowed to have some of my own things and fix my room the way I like.
-.084
.102
-.060
-.334**
-.331**
-.285**
6.3
I pick out the clothes I wear.
-.097
.046
-.197*
-.289**
.004
-.182*
6.4
I have privacy in my bedroom or bathroom.
-.096
-.020
-.006
-.245**
-.136
-.171
6.5
If I meet with my therapist, family, or other non-family supportive adults, we can meet in a private space.
.001
.044
-.126
-.201*
-.071
-.123
6.6
Staff interact with us a lot during our daily routine.
-.089
.061
-.025
-.078
.034
-.072
6.7
We have a daily schedule that we follow here that includes free time.
-.021
.050
-.237**
-.174
-.071
-.132
6.8
There are at least two staff on duty at all times except when we are sleeping.
-.144
.048
-.052
-.151
-.027
-.128
6.9
There are enough staff around to handle anything that happens.
-.207*
-.031
-.239*
-.380**
-.258**
-.310**
6.10
Staff know what is going on around here.
-.220**
.011
-.259**
-.434**
-.265**
-.356**
6.11
I am allowed to do regular things that most kids do like play sports, spend time with friends, or go to school events.
-.120
.033
-.190*
-.344**
-.286**
-.328**
6.12
Staff or my therapist talk to me about trauma I have been through and how to deal with it better.
-.087
.067
-.199*
-.340**
-.212*
-.232*
6.13
The staff here care about me.
-.087
.023
-.204*
-.128
-.116
-.119
6.14
I have a good relationship with the staff.
-.079
.006
-.117
-.274**
-.196*
-.181
6.15
I get a lot of help from the staff.
-.200*
-.042
-.227*
-.414**
-.258**
-.327**
D7
Education, Skills, & Positive Outcomes
-.163
.011
-.268**
-.287**
-.116
-.220*
7.1
Being in this program has helped me to do well in school.
-.161
-.009
-.222*
-.259**
-.069
-.200*
7.2
I get the help I need to pass my classes at school.
-.046
.010
-.177
-.026
-.022
-.030
7.3
I was told by staff that I can receive job training for things like welding or cooking or other types of jobs if I want it.
-.021
.104
-.067
-.015
.077
-.005
7.4
Staff teach me how to get along with others and to treat people right.
-.162
-.044
-.339**
-.338**
-.171
-.234*
7.5
Staff teach us about doing the right thing.
-.248**
-.056
-.316**
-.409**
-.239*
-.280**
7.6
Staff teach us to be aware of how our actions affect others.
-.167
-.056
-.295**
-.416**
-.263**
-.338**
7.7
Being in this program is helping me to feel and behave better than I did before I came here.
-.229*
-.015
-.221*
-.358**
-.245*
-.317**
7.8
In this program I have been taught independent living skills such as how to buy and make healthy meals, balance a checkbook, or apply for a job.
.035
.097
-.065
-.061
-.012
-.034
D8
Pre-Discharge/Post-Discharge Processes
-.099
.102
-.168
-.266*
-.192
-.210*
8.1
Staff started talking with me soon after I got here about how to handle things like school, jobs, and different support I’ll need when I leave here.
-.157
.006
-.241*
-.344**
-.273**
-.306**
8.2
Staff have talked with me about ways I can stay in touch with my family and other important people in my life after I leave here.
-.056
.075
-.080
-.259*
-.198*
-.218*
8.3
I have a discharge plan that focuses on helping me find a long-term place to live, either on my own or with others.
-.122
.108
-.101
-.217*
-.145
-.161
8.4
Staff are helping me and/or my family find other programs and services we need to help me be successful after I leave here.
-.022
.125
-.170
-.135
-.082
-.081
Correlations between GCQSA Licensing Form Ratings & Program Indicators
Hospital
Staff Injury
Youth Injury
Police Calls
Runaway
Total Incidents
D1
Assessment, Admission, & Service Planning
.224
.088
.209
-.012
.078
.097
1.1
A comprehensive assessment is completed for each youth within 30 days of admission.
.192
.133
.215
.095
.139
.162
1.2
Assessment items ask about youth safety threats, strengths, needs, and prior trauma.
.251+
.099
.238
.122
.251
.227
1.3
Evidence-based assessments are used to inform service planning. Please identify the name(s) of assessments that are used in the space below.
.148
.019
.051
.065
.151
.111
1.4
Item | Description | Hospital | Staff Injury | Youth Injury | Police Calls | Runaway | Total Incidents
1.4 | As much as possible, youth are informed about the decision-making process regarding their admission. | .110 | -.020 | .177 | .037 | .004 | .064
1.5 | As much as possible, family or non-family supportive adults are involved in the admissions process. | .115 | .112 | .238 | .021 | .091 | .111
1.6 | As much as possible, the views of other providers (e.g., case manager, behavioral health provider, GAL) are considered when making admissions decisions. | .192 | .129 | .081 | -.118 | .014 | .042
1.7 | Documented service plans focus on individual strengths and needs. Service plan may also be referred to as a care plan or treatment plan. | .207 | .072 | .164 | -.040 | .029 | .057
1.8 | Documented service plans focus on helping youth achieve safety, permanency, and/or well-being. | .172 | .070 | .166 | -.088 | -.043 | .004
1.9 | Youth are involved in creating their service plans (e.g., help determine the goals, are present or consulted with as part of service planning meetings). | .151 | -.135 | -.117 | -.083 | -.071 | -.105
1.10 | As much as possible, families are involved in creating service plans. | .239 | .111 | .233 | -.017 | .115 | .097
1.11 | Effort is made to collaborate with other professionals who are involved to create service plans. | -.040 | -.003 | .031 | -.181 | -.059 | -.079
1.12 | Service plans include a focus on increasing children’s family and natural supports. | .088 | .056 | .108 | -.094 | -.117 | -.047
1.13 | Service plans include clearly defined, measurable goals. | .101 | .030 | .109 | -.084 | -.061 | -.027
1.14 | Service plans are reviewed and updated with youth and their family at least every 90 days. | .191 | .120 | .245 | .050 | .068 | .134
D2 | Positive, Safe Living Environment | -.160 | -.218 | -.095 | -.179 | -.120 | -.164
2.1 | Youth are protected from exposure to harmful or traumatizing incidents. | -.234+ | -.327* | -.184 | -.145 | -.146 | -.218
2.2 | Staff follow policies prohibiting the use of corporal punishment or coercive practices that could constitute verbal or physical abuse or bullying. | -.047 | -.371** | -.179 | .074 | .063 | -.018
2.3 | Documented policies on youths’ rights while in the program are reviewed with the youth and family. | -.062 | -.210 | -.004 | .001 | -.058 | -.025
2.4 | Staff protect the rights of youth according to program policies. | -.098 | -.357** | -.032 | -.054 | .003 | -.063
2.5 | The program provides for youths’ needs (e.g., shelter, food, clothing, personal hygiene, etc.). | -.205 | -.286* | -.060 | -.101 | -.059 | -.127
2.6 | Staff effectively de-escalate crisis and behavioral incidents according to training and program policies, including documenting serious incidents. | -.178 | -.161 | -.035 | -.161 | -.040 | -.131
2.7 | Staff prevent incidents of youth-to-youth harm according to training and program policies. | -.273* | -.189 | -.234 | -.344* | -.212* | -.332*
2.8 | The program creates a respectful and supportive peer culture. | -.314* | -.222 | -.129+ | -.301* | -.214+ | -.269*
2.9 | Staff closely supervise youth according to program policies. | -.264* | -.247 | -.067 | -.179 | -.219 | -.241
2.10 | Staff respond quickly when a youth’s actions threaten the safety of others according to program policies. | -.305* | -.188 | -.086 | -.252* | -.218 | -.232
2.11 | Physical restraints and seclusion are used only in emergencies involving imminent safety risks. | .061 | .095 | -.013 | -.194+ | -.108 | -.063
2.12 | Policies including the use of risk assessments and safety plans are followed to prevent youth from self-harm. | .156 | .090 | .074 | .051 | .081 | .103
D3 | Monitor & Report Problems | -.078 | -.146 | -.041 | -.050 | -.041 | -.057
3.1 | Staff report all serious problems to supervisors and file incident reports as needed (e.g., crisis management, abusive practices, youth-to-youth incidents, suicidal behavior). | -.238 | -.128 | -.061 | -.088 | -.102 | -.127
3.2 | All allegations of unsafe, inappropriate, or abusive practices are reported to external oversight agencies. | -.099 | -.062 | -.009 | -.088 | -.069 | -.068
3.3 | Youth may contact an outside advocate (e.g., GAL, case manager, child advocate) to share concerns about their care. | -.102 | -.122 | -.078 | -.182 | -.094 | -.133
3.4 | The program routinely uses surveys to assess consumer satisfaction (e.g., youth, parent/guardian, partner agencies). | .129 | -.053 | .136 | -.042 | .032 | .034
3.5 | Staff inform youth and families of methods for confidentially filing grievances or concerns. | -.171 | -.221+ | -.083 | -.100 | -.175 | -.142
3.6 | Staff document program responses to grievances filed by youth and families. | .030 | -.123 | -.173 | .093 | .063 | .016
3.7 | The program makes an effort to respond to the needs and concerns of youth and families. | -.145 | -.088 | -.035 | -.037 | -.030 | -.049
3.8 | The program has specific mechanisms in place to ensure that youth and families can communicate their needs. | -.029 | -.041 | .042 | .042 | .047 | .043
D4 | Family, Culture, & Spirituality | -.055 | -.144 | .044 | -.192 | -.178 | -.125
4.1 | The program provides opportunities for youth to call or visit with family or other non-family supportive adults. | -.197 | -.123 | -.020 | -.139 | -.192 | -.146
4.2 | The program supports contact with family through things such as: scheduling visits/call times, providing transportation, obtaining travel reimbursements, or supervising visits when needed. | -.023 | -.057 | .067 | -.208 | -.218 | -.124
4.3 | The program demonstrates an understanding of family connections by supporting family preservation and reunification efforts. | -.204 | -.145 | -.036 | -.279* | -.259+ | -.223+
4.4 | The program puts a lot of effort into engaging families in their child’s care whenever possible. | .001 | -.091 | .063 | -.172 | -.145 | -.110
4.5 | Youth have opportunities to participate in community activities (e.g., volunteering, going to the movies, public swimming pools, community events, etc.). | -.160 | -.418** | -.016 | -.181 | -.133 | -.161
4.6 | The program supports youths’ connection with their culture of origin (e.g., allowing participation in cultural practices, ceremonies, events and regular contact). | -.094 | -.201 | -.018 | -.113 | -.052 | -.084
4.7 | Staff respect youth’s cultural identity (e.g., race/ethnicity, sexual orientation, gender identity). | -.064 | -.207 | .016 | -.019 | -.001 | -.022
4.8 | Youth and families receive culturally competent and linguistically appropriate services. | .190 | -.013 | .109 | .044 | .057 | .099
4.9 | Youths’ spiritual beliefs and values are respected and supported through providing access to places where they can practice their beliefs (e.g., churches, temples, mosques). | -.059 | -.058 | -.008 | -.179 | -.180 | -.123
D5 | Professional & Competent Staff | .020 | .021 | .101 | -.037 | .050 | .057
5.2 | Staff are trained in established/best practice crisis management methods. | .059 | .212 | .130 | .070 | .126 | .139
5.3 | Staff behave in a professional manner when interacting with other staff, professionals, youth, and families. | -.274* | -.196 | -.082 | -.242+ | -.116 | -.204
5.4 | Staff receive in-service and ongoing training to build knowledge and skills necessary for quality services. | .001 | -.048 | .122 | -.072 | .032 | .033
5.5 | Staff receive training and demonstrate cultural competency in their work with youth and families. | .032 | -.138 | .100 | -.007 | .058 | .064
5.6 | Staff receive regular, documented supervision to ensure compliance with training and program policies and procedures. | .111 | .061 | .066 | -.022 | .014 | .060
5.7 | Staff receive training and demonstrate competency in teaching positive social skills to youth. Training can include in-service, supervision, or attending workshops or formal training sessions. | .058 | .018 | .140 | -.001 | .075 | .100
D6 | Program Elements | -.022 | -.067 | .049 | -.090 | -.002 | -.020
6.1 | Level of care is matched to youths’ needs with as few restrictions as possible. | -.083 | -.111 | .020 | -.223+ | -.026 | -.125
6.2 | The program provides a family-like environment to the extent possible based on the youths’ needs (e.g., eating meals together, sharing in household chores and responsibilities, doing recreational activities together). | -.290* | -.341** | -.129 | -.285* | -.104 | -.258+
6.3 | Youth are allowed to have personal belongings and personalize their rooms. | -.025 | -.100 | .007 | -.168 | -.148 | -.104
6.4 | Youth are allowed to have a choice in the clothes they wear. | -.168 | -.135 | .076 | -.091 | -.116 | -.105
6.5 | Youths’ rights to privacy are respected by providing them with private living spaces (e.g., bedroom, bathroom) and places for therapy and family visits, and by keeping records confidential. | -.137 | -.123 | -.057 | -.230+ | -.233+ | -.196
6.6 | Staff are actively involved with youth during the daily routine. | -.277* | -.209 | -.043 | -.193 | -.184 | -.220+
6.7 | Youth activities of daily living (ADL) are monitored by staff according to the needs of each youth. | -.127 | -.120 | .043 | -.097 | -.144 | -.106
6.8 | Youth have access to the full range of needed services either on-site or through external providers (e.g., therapist, psychiatrist, physician, eye doctor). | .078 | -.042 | .075 | -.066 | .003 | .014
6.9 | Youth follow a structured daily routine that includes planned free time. | -.123 | -.222+ | -.015 | -.155 | -.095 | -.133
6.10 | The program uses continuous quality improvement (CQI) in an ongoing effort to evaluate and improve services. | .158 | .072 | .159 | .098 | .155 | .166
6.11 | Regular staff meeting documentation includes attention to youths’ progress, teamwork, and addressing program issues. | .081 | .117 | .047 | -.002 | .128 | .109
6.12 | There are a minimum of two staff on duty at all times, except during sleeping hours. | -.061 | .026 | .071 | .010 | .089 | .035
6.13 | There are enough staff on duty to maintain adequate supervision (e.g., aware of what is going on, well prepared to handle any incidents that occur). | -.285* | -.217 | -.065 | -.318* | -.250* | -.309*
6.14 | Staff are aware of any adjustments to youths’ medication, closely monitor dosage and any side effects, and report any concerns. | -.190 | -.118 | -.032 | -.278* | -.209 | -.204
6.15 | Psychiatrists monitor youths’ psychotropic medication regimens at least once a month. | .080 | .066 | .133 | -.025 | -.073 | .042
6.16 | Staff provide ongoing supervision of youth according to program policies. | -.233+ | -.213 | -.051 | -.227+ | -.221+ | -.231+
6.17 | Youth are provided with opportunities to take part in normal activities (e.g., school functions, spending time with friends, participating in sports, band, or other extracurricular activities and clubs). | -.157 | -.205 | -.054 | -.222+ | -.147 | -.179
6.18 | Staff establish relationships with youth characterized by trust, caring, and support. | -.265* | -.168 | -.024 | -.182 | -.238+ | -.211
6.19 | The program uses a trauma-informed approach. | .181+ | .084 | .027 | .193+ | .126 | .206+
6.20 | The program uses an evidence-informed model of care. | .246+ | .141 | .126 | .138 | .221+ | .227+
D7 | Education, Skills, & Positive Outcomes | -.097 | -.118 | .034 | -.228+ | -.186 | -.146
7.1 | The program ensures that youth receive ongoing comprehensive educational assessments needed to determine their educational needs. | -.053 | -.029 | .053 | -.121 | -.094 | -.051
7.2 | Youth receive the educational supports they need to succeed (e.g., tutors, quiet time to complete homework, access to computers). | -.048 | -.085 | .070 | -.174 | -.150 | -.095
7.3 | For youth who stayed in this program for a full school year, the majority (e.g., over 60%) progressed into a higher grade. | -.078 | -.108 | -.174 | -.146 | -.134 | -.161
7.4 | The program ensures that qualified youth have a current 504 Plan or Individualized Educational Plan (IEP). | -.048 | -.025 | .034 | -.157 | -.118 | -.069
7.5 | Vocational assessments and training are made available for youth who are interested. | -.149 | -.225+ | .120 | -.308* | -.162 | -.182
7.6 | Staff teach positive social skills, values, and behaviors to the youth in the program. | -.099 | -.071 | .057 | -.178 | -.213 | -.126
7.7 | The program focuses on reducing behavioral issues and symptoms in youth. | .032 | -.033 | .097 | -.040 | -.112 | -.019
7.8 | Staff teach youth developmentally appropriate life skills and competencies needed to transition to life after group care. | -.121 | -.026 | .010 | -.210 | -.169 | -.118
7.9 | Staff talk to us about the importance of family and relationships. | -.011 | .062 | -.148 | -.167 | -.054 | -.160
D8 | Pre-Discharge/Post-Discharge Processes | .195 | .125+ | .177 | .053 | -.008 | .109+
8.1 | Transition planning starts soon after admission and includes a focus on education and/or employment and other supportive services to help youth successfully transition from care. | .109 | .083 | .187 | .035 | .004 | .078
8.2 | Transition plans include a focus on the continuity of family relationships. | .108 | .080 | .162 | -.077 | -.087 | .016
8.3 | Discharge plans are designed to support youths’ long-term permanency (e.g., independent living, reunification, adoption). | .160 | .128 | .183 | .104 | .024 | .135
8.4 | Before a youth is discharged to a new placement, he or she is given a period of transitional time to become familiar and comfortable with the new placement (Note: this item excludes unplanned discharges). | .233+ | .188 | .040 | .146 | .036 | .140
8.5 | Prior to discharge, the program helps connect youth and their caregivers with community resources and aftercare services. | .281* | .135 | .092 | .097 | .018 | .135
8.6 | Within 30 days after discharge, the program follows up with youth and their caregivers to check whether they are connected with aftercare services and other supports. | .076 | .021 | -.127 | -.055 | .020 | -.014
8.7 | The program follows up with youth and their caregivers to monitor post-discharge outcomes (e.g., permanency, educational, family, and functional outcomes). | .078 | -.018 | -.052 | .068 | .112 | .053
GCQSA Total |  | .032 | -.044 | .094 | -.097 | -.050 | -.017
Note. Values displayed in the table are partial correlation coefficients controlling for the number of youth served in a program.
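For readers less familiar with the statistic, the expression below is a minimal illustration of the first-order partial correlation in its standard textbook form, where x is an item rating, y is an incident indicator, and z is the single covariate being controlled (here, the number of youth served); it is provided only as a general reference and is not drawn from the project's analysis code.

$$
r_{xy \cdot z} = \frac{r_{xy} - r_{xz}\, r_{yz}}{\sqrt{\left(1 - r_{xz}^{2}\right)\left(1 - r_{yz}^{2}\right)}}
$$

Equivalently, this is the correlation between the residuals of x and y after each has been regressed on z, so each coefficient in the table reflects an item–indicator association net of program size.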