
Development and Refinement of a Wellbeing Measurement Tool in the Community Services Sector

Prepared by Associate Professor Martyn Jones, Associate Professor John Douglass Whyte and Matthew Coote Project supported by The William Buckland and Alfred Edments Estate, former members of the ANZ Trustees and currently managed by Equity Trustees


ACKNOWLEDGEMENTS

We acknowledge the traditional custodians of the lands on which Windermere Child and Family Services and RMIT University stand—the Peoples of the Kulin Nation, including Community members, Elders and Respected Persons, both past and present. We acknowledge and thank the staff and clients of Windermere and the students of RMIT University who participated in this project. Without their time and interest, this study would not have been possible. The significance for this project of prior work by Community Indicators Victoria on community wellbeing and by RAND Health in their medical outcomes study is also acknowledged. This project was funded by the William Buckland Foundation and Alfred Edments Estate, originally managed by ANZ Trustees and currently managed by Equity Trustees. We would also like to acknowledge the Helen MacPherson Smith Trust, who supported the initial development work of the Wellbeing Measurement Tool, enabling us to progress to the refinement stage, the major focus of this report.

RESEARCH TEAM
Associate Professor Martyn Jones
Associate Professor John Douglass Whyte
Research Officer Matthew Coote
School of Global, Urban and Social Studies, RMIT University

PROJECT REFERENCE GROUP
Dr Lynette Buoy, CEO, Windermere
Professor Catherine McDonald, Emeritus Professor, RMIT University
Formerly:
Ms Cheryl de Zilwa, past CEO, Windermere
Ms Serap Ozedemir, past Business Development and Special Projects Manager, Windermere

Development and Refinement of a Wellbeing Measurement Tool in the Community Services Sector November 2014 2


CONTENTS

1. Introduction
2. Background and Rationale
   2.1 Introduction to Windermere
   2.2 Striving for ‘accountability’ and ‘effectiveness’
       2.2.1 Growing valuing of ‘evidence’ of program effectiveness
       2.2.2 Looking at more than activity outputs
   2.3 The pilot project: review
       2.3.1 The pilot project: administration and findings
       2.3.2 Pragmatics of evaluating service programs
       2.3.3 Key themes from the pilot project
   2.4 Emergent aims for current project
3. Conceptualisation: Wellbeing and Measurement
   3.1 Measuring wellbeing: Conceptually broad, yet specifically sensitive
   3.2 Various approaches/measures of conceptualising subjective wellbeing
   3.3 Devising a single ‘aggregate’ instrument
   3.4 Wellbeing and outcomes: The policy context
   3.5 Summary
4. Research Approach: Methodology and Process
   4.1 Principles behind the approach
   4.2 Project design
   4.3 Project methods
       4.3.1 Setting the context and inquiry
       4.3.2 Development of materials and analysis
   4.4 Summary
5. Findings: Survey Development and Refinement
   5.1 Considerations for statistical analyses
   5.2 Addressing and incorporating stakeholder and participant feedback
   5.3 Sequence of measurement tool administrations and statistical analyses
       5.3.1 Measurement tool 2
       5.3.2 Measurement tool 3
       5.3.3 Measurement tool 4
       5.3.4 Measurement tool 5
       5.3.5 Descriptive statistics of aggregated sample by survey iteration and socio-demographic variables
   5.4 Results of analyses
       5.4.1 Aggregated survey results
       5.4.2 Survey results by variables
       5.4.3 Results of the correlation process for the reduction of questions for measurement tool 5
   5.5 Summary
6. Discussion and Recommendations
   6.1 The value of a wellbeing measurement tool
   6.2 Confidence in the wellbeing measurement tool
   6.3 Technical considerations in using the wellbeing measurement tool
   6.4 Implementing the wellbeing measurement tool
   6.5 Dissemination
   6.6 Concluding comments

Appendices
A Bibliography
B Windermere Wellbeing Pilot Survey
C Summary of Wellbeing Measurement Initiatives
D Ethics Approval: Plain Language Statement
E Basic questions for participants: RMIT students
F Qualitative feedback from Windermere staff and clients
G Qualitative feedback from RMIT students
H Measurement Tool 2
I Measurement Tool 3
J Measurement Tool 4
K Results of T-Tests: Example – Gender: Q1 to Q8, Tools 2, 3 & 4
L Results of correlation process for reduction of questions for tool 5
M Measurement Tool 5
N Complete list of questions: Tools 2, 3, 4 & 5
P Final Wellbeing Measurement Survey



1. INTRODUCTION

The development of a Wellbeing Measurement Tool for use in the community services sector was an initiative conceived jointly between Windermere Child and Family Services (Windermere) and the social work discipline of RMIT University (RMIT). At the time the initiative was first discussed, in 2008, the idea that community services providers might focus more concertedly on the outcomes reported by their clients was relatively novel in Victoria, Australia. In the intervening years, this has become much more commonplace, to the extent of now being expected or even required.

From the earliest stages, the intent was to develop a tool that would measure the consumers’1 wellbeing, as reported by them. Wellbeing was chosen as the significant outcome variable and the tool was to capture this in its most rounded sense. Yet it was recognised that ultimately it would be the client who would bring their own meaning to this term and its presence or otherwise in their lives. To that extent, wellbeing was perceived to be an ‘empty signifier’ – a category for measurement that would be open for the client to interpret in their own unique way.

Developing a credible Wellbeing Measurement Tool that is valid and reliable but also useable by community service organisations has proven to be a lengthy and challenging process. A pilot project conducted in 2009-2010 provided the platform for the present project. Funding provided to Windermere in 2012 by the ANZ Trustees (former managers of the William Buckland Foundation) enabled detailed and rigorous work to be conducted to develop and refine the pilot wellbeing tool. This Report outlines the research carried out by RMIT from July 2012 to September 2014 on behalf of Windermere to produce a tool – the Wellbeing Survey – that can be used with confidence to gauge the reported wellbeing of its consumers1. The Report begins by providing some background to the project. Then, having reviewed relevant literature on the topic of wellbeing and outcome evaluation, the research methodology is reported and the findings are presented and discussed, with recommendations. The Report is intended to be accessible to the informed community service practitioner and manager. It is written primarily to that audience.

1 The term ‘consumer’ used generically can include current or potential users of health and/or community services, carers or family members, along with interested members of the community.



2. BACKGROUND AND RATIONALE

2.1 Introduction to Windermere

Windermere, a welfare agency based in Melbourne’s southern growth corridor, caters primarily to children, families, and individuals who experience disadvantage and vulnerability. The agency comprises over 140 full- and part-time staff and provides services to over 14,000 people each year (Windermere, 2014). Despite the varied nature of Windermere’s work, its programs have the common purpose of contributing to improving lives. Windermere’s purpose is to get in early to make a difference in the lives of individuals, families and communities (Windermere, 2014).

Windermere is a well-established, wholly independent agency situated in Melbourne’s south-eastern suburbs, with a heritage spanning over 150 years. Together with related agencies, it serves a population of approximately 1.5 million people. The agency provides its services from 9 locations, with major centres in Narre Warren, Cranbourne, Pakenham, Berwick, Morwell and Bairnsdale. The main locales for Windermere services are the City of Casey, the City of Greater Dandenong, Shire of Cardinia, Gippsland, and City of Bayside. The demand for Windermere’s services is expected to increase, as the area it services lies in one of Melbourne’s fastest developing growth corridors. For instance, the City of Casey is currently, in terms of population, the third fastest-growing municipality in Victoria behind Wyndham and Whittlesea. The population of Casey (June 2014) is approximately 281,000, with a projected population of 459,000 in 2036 (City of Casey 2014:1).

Windermere offers services in the areas of physical, emotional and sexual abuse, family violence, homelessness, disability support, violent crime and caring for a child with developmental delay (Windermere, 2012). Programs include emotional and practical support, counselling, advice, education, advocacy, in home care for children and families, and housing support (Windermere, 2012).

The agency has a number of collaborative relationships with educational, government and welfare agencies including the Victorian Department of Human Services, Department of Justice, Cardinia Shire Council, Victoria Police, Monash University, and RMIT University.



Windermere, like many welfare agencies, is funded by both the Australian federal and state governments, as well as by private and corporate donations; the majority of its services are funded by government. In 2012, Windermere received just over $13 million in government funding, accounting for 82% of Windermere’s total revenue of almost $16 million (Windermere, 2012).

2.2 Striving for ‘accountability’ and ‘effectiveness’

Notions of accountability and effectiveness related to the provision of public sector human services have seen important shifts over the past decades. Moving from a reliance on quantifiable measures of program activity, efficiency and targets, recent trends have seen a movement towards addressing the subjective impact of services on clients. This movement has been informed and underpinned, in part, by two propositions.

The first concerns the presumption, built into the formation of any program or service, that the effort is well-intentioned and that it is therefore sufficient to focus on program efficiency (for example, cost-benefit ratio or client throughput) as a primary measure of accountability and effectiveness. By default, such empirical evidence has been considered more valued and reliable for identifying explicit, identifiable factors by which intra- and inter-program comparisons can be made. But what of the actual impact on the client? That question strays into a more problematic realm and is the focus of this work. On one hand, it is possible to measure shifts in particular conditions (for example, income, specific health symptoms, and numbers of involvements with social services), but it is much more difficult to measure subjective experiences, including emotional, social and existential conditions or attitudes. The challenge of measuring those attitudes and experiences, which may themselves be more accurate, if elusive, indicators of a program’s impact, leads to the second proposition.

If we are to attempt to understand and consider those more ephemeral qualities of the human condition, how can we do so in a way that is not seen simply as an attempt to generalise an individually subjective response to broader client and program considerations, but instead as a genuine measure of effectiveness?



Attempts to address the qualitative and quantitative aspects of both program and client conditions have seen the increased development and application of what has become known as ‘evidence-based’ practice and evaluation. As is described in greater detail next, this has involved the reconciliation of sometimes competing priorities and emphases.

2.2.1 Growing valuing of ‘evidence’ of program effectiveness

As noted by Bloom, Fischer and Orme (2009), the evolution of evidence-based practice (EBP) across health and human service professions – including medicine, nursing, social work, psychology and public health – is one of the most important developments in addressing those professions’ practice and program evaluation. Both an ideology and a method, EBP is philosophically underpinned by the principle that clients deserve to be provided with the most effective programs/interventions possible. Pragmatically, it reflects the practitioners’ commitment to make use of all possible means to identify and apply the most effective evidence related to any particular issue or problem – at all points of planning, conceptualising, operationalising and client contact.

The challenges facing the practitioner who attempts to incorporate EBP are varied and interwoven, in part because the methods of identifying/locating the most effective practices/interventions extend beyond simply those incorporated in empirically-based practice (Bloom, Fischer & Orme, 2009). This is no more evident than in the application of EBP principles to broadly interdisciplinary and multi-programmatic human service endeavours.



2.2.2 Looking at more than activity outputs

Among the challenges are, first of all, to locate and identify as many studies of effectiveness related to a particular problem as possible, and then to critically assess the studies identified for validity, utility for the problem at hand and the concluding presentation of evidence. These efforts are made more daunting in that they require the practitioner to search for studies embodying the most rigorous protocols while remaining sensitive to socially and culturally relevant considerations.

The second great challenge is that of conceptual integration. The success of efforts informed by these principles of EBP is very much dependent upon the sensitive integration of two differing types of research: the use of single system designs as a core of evaluation-informed practice; and the effective utilisation of experimental and/or quasiexperimental designs that serve to inform practitioners’ decisions about the most effective procedure for a particular situation.

The second great challenge is that of conceptual integration. The success of efforts informed by these principles of EBP is very much dependent upon the sensitive integration of two differing types of research: the use of single system designs as a core of evaluation-informed practice; and the effective utilisation of experimental and/or quasi-experimental designs that serve to inform practitioners’ decisions about the most effective procedure for a particular situation.

But the reliance on interventions informed by the ‘highest’ EBP level of multiple randomised, controlled studies is no guarantee of effective service delivery with every client. What must also be considered are the characteristics of both clients and practitioners - either or both of which can have profound effects on outcomes. Cultural, socioeconomic, ethnic and various other demographic and interpersonal variables can impact intervention results – regardless of the rigour of the EBP protocols employed.

To minimise such occurrences, the practitioner is encouraged to incorporate regular feedback on the progress of the services being provided and to make changes accordingly. Such feedback has been shown to be a characteristic of evaluation-informed practice that can reduce deterioration and enhance overall outcome (Bloom, Fischer & Orme, 2009).

Figure 2.1 The PRAISES model: Integrating Evaluation-Informed and Evidence-Based Practice (Bloom, Fischer & Orme, 2009)

Despite the challenges, several successful protocols to integrate such factors have been developed. One such model is the PRAISES model, shown in Figure 2.1 above. While at first glance daunting, the flowchart is nothing more than a compilation of the multiple steps in which many practitioners already engage. Presented as a systematised flowchart intended to give a structured approach to those steps, it illustrates characteristics and protocols that enhance evidence-based practice through the flow of evaluation-informed practice. These include:

o Empirically-based, since it incorporates both meanings of evidence-based practice: the use of results of classical evaluation research to inform the selection of demonstrably effective interventions, and the systematic evaluation of the effects of those interventions.
o Integrative, insofar as it integrates all practice and evaluation activities, with no distinctions made between evaluation and practice.
o Eclectic, since it is a framework based on the perspective that the knowledge base of practice in the helping professions is [1] pluralistic, in the sense that knowledge is derived from multiple sources, and [2] eclectic, in that it uses clear, precise and systematic criteria to select knowledge.
o Systematic, in that it clearly identifies the various phases of practice, organising them in a logical sequence.
o Accountable, in that it reveals the entire practice process, inviting scrutiny by others.
o A way of thinking, since it illustrates and encourages an approach that is grounded in the ethics and values that underlie the practices of helping professions.

2.3 The pilot project: review

The present project arises out of previous work undertaken between Windermere and RMIT in 2009-10, as reported in ‘Development of a Wellbeing Measurement Tool in the Community Services Sector’ (Whyte & Jones, 2011). The rationale behind that originating project is provided here to explain the broader context to the current work.



The pilot project that examined the Windermere programs in the original research attempted to utilise all of the evidence-based practice and evaluation features, protocols and processes described above - incorporating, integrating and coordinating evaluation and practice aspects in the course of its pursuit of evidence-based practice and evaluation.

In 2009-10, a team of RMIT researchers – in consultation with Windermere Disability, Family Services and Counselling teams – developed and trialled a survey tool (hereafter pilot survey). Analysis of the results was undertaken and published in the report noted above, ‘Development of a Wellbeing Measurement Tool in the Community Service Sector’ (Whyte & Jones 2011).

The development of the pilot survey, aimed at better understanding both evidence-informed and evaluation-informed practice as a form of effectiveness, was driven by the multitude of programs offered by Windermere: there was a need for a measure that was conceptually broad, but still specifically sensitive. In addition, the measure had to encompass the physiological, psychological, social, cultural and existential aspects of the human condition. With this in mind, a longitudinal wellbeing measure was proposed across the domains of emotional wellbeing, physiological wellbeing, community engagement, cultural engagement, and spiritual worth.

When designing the pilot survey, the following constraints were considered:

o that it provided a reliable, sensitive and valid measure over time,
o that it incorporated the diversity of Windermere’s programs,
o that it not interfere unduly with program running, service user experience or worker experience, and
o that it not require extensive training to administer.

2.3.1 Pilot project: administration and findings

Throughout the administration of the pilot tool, a total of 29 clients consented to be part of the pilot project. Of these, 25 completed at least one survey, which was administered at two time points. Participants included nine from the Counselling Program, eight from the Disability Program and nine from the Family Services Program. The sample comprised 17 female and nine male respondents, with ages ranging from 28 to 61 years and a mean age of 42 years. Nine different postcodes were represented in the respondent sample.

Aside from socio-demographic and Windermere-related information, the balance of the survey consisted of eight questions related to wellbeing. Credit was due to Community Indicators Victoria and RAND Health (Medical Outcomes Study) whose prior work was particularly influential in the development of this framework.

As shown in the Windermere Wellbeing Pilot Survey (Appendix B), these questions were:

o Question 1: Impact of physical health problems on social activities.
o Question 2: Impact of emotional issues on social activities.
o Question 3: Impact of physical health problems on work or other activities.
o Question 4: Impact of emotional issues on work or other activities.
o Question 5: General attitude towards life over the past month.
o Question 6: Attitudes/experiences of living in their neighbourhood.
o Question 7: How they feel about their community.
o Question 8: General sense of wellbeing now and compared to three months ago.

The response options for each of the above were given on a five-point semantic differential Likert scale. This was the case for each of the individual questions, together with a series of subset questions. The limitations regarding sample size and levels of measurement severely restricted the statistical analyses that could be conducted. However, it was possible to examine the responses more broadly, looking at response trends for each question across both first- and second-iteration surveys.

A critical consideration in interpreting the results of the surveys is the ‘starting point’ of the participants. Are they in crisis? Are they facing unique situations? Has life been stable or fractured? All can impact the results and their interpretation.

As indicated, in the course of this pilot project, surveys of 40 different questions (eight general questions plus subsets) were given to participants of three different programs over two iterations. Of the 160 different mean responses (including those of the aggregated programs) for the first survey, none fell at the first or fifth points of the five-point Likert scale, and only 15 fell outside the second to fourth points. Further, of these 15, none reflected a semantic identifier of an exacerbated condition or perception.



This was an important consideration in viewing changes in means between first and second surveys, since when dealing with clients experiencing crises one might expect response outliers indicating highly skewed perceptions/experiences - perceptions/experiences that regress to a central mean over time, regardless of intervention. With this understanding, the shifts between first- and follow-up-survey means for each of the program’s participants were examined.
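The mean-shift comparison described above can be sketched as a small computation. The response values and group sizes below are hypothetical illustrations only, not the pilot's actual data; the program names follow the report's three pilot programs.

```python
# Sketch of the first-to-follow-up mean-shift analysis described above.
# All response values are hypothetical, not pilot data. Responses are on a
# five-point Likert scale (here coded 1 = most negative, 5 = most positive).

def mean(values):
    return sum(values) / len(values)

# Hypothetical responses to one question, per program, at two time points.
responses = {
    "Counselling":     {"first": [2, 3, 2, 4, 3], "follow_up": [3, 4, 3, 4, 4]},
    "Family Services": {"first": [3, 2, 3, 3, 2], "follow_up": [3, 3, 4, 3, 3]},
    "Disability":      {"first": [3, 3, 4, 2, 3], "follow_up": [3, 2, 4, 3, 2]},
}

shifts = {}
for program, waves in responses.items():
    first_mean = mean(waves["first"])
    follow_mean = mean(waves["follow_up"])
    # A positive shift indicates movement towards the positive end of the scale.
    # Note: with clients in crisis, part of any shift may reflect regression
    # to the mean rather than program impact, as discussed above.
    shifts[program] = round(follow_mean - first_mean, 2)

for program, shift in shifts.items():
    direction = "positive" if shift > 0 else "negative" if shift < 0 else "none"
    print(f"{program}: shift = {shift:+.2f} ({direction})")
```

With these illustrative numbers, the Counselling and Family Services shifts come out positive and the Disability shift slightly negative, mirroring the mixed pattern the pilot reported across programs.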

In the pilot survey instrument, questions 1 – 4 dealt with how the individual’s physical or emotional issues might have impacted their social or work lives/activities. Consistent across these questions, Counselling and Family Services program response shifts indicated decreased impacts of physical or emotional issues, on both social and work activities. Also consistent across the four questions, Disability program responses indicated either mixed or increased impacts.

The respondents’ attitudes about life were the focus of Question 5. The shifts in response means were consistent across all programs, including aggregated programs, all indicating increases in positive attitudes/feelings across three of the four subset questions.

Questions 6 and 7 focused on the individual’s attitudes, experiences and feelings about their communities. While all program responses showed mixed differences in the individual’s perceptions of their community, all program responses showed shifts towards agreement related to feeling accepted, respected and being able to have a say in community matters.

The most existentially broad of all the questions was Question 8, asking about the individual’s general sense of wellbeing, both in the present and compared to three months previous. All program means showed positive shifts in individual perceptions of wellbeing.

In summary, in matters concerning the individual’s perceptions of impact of physical or emotional issues on their social or work lives, virtually all of the changes in responses from first to follow up surveys were positive - indicating enhanced/improved senses of coping ability and empowerment. The same can be said of the individual’s sense of personal place in their community and, more broadly, of their general sense of wellbeing.



The only domain where first to follow up survey shifts were mixed was in the respondents’ perceptions of the characteristics of their community.

In addition, feedback from Windermere staff and clients regarding administration of the instrument from each of the programs was solicited. The pilot indicated changes were required to the length and wording of the survey if it was to be more user-friendly and administratively viable. It was this feedback that underpinned the need for a comprehensive revision and trialling process as pursued in this current project.

2.3.2 Pragmatics of evaluating service programs

A number of considerations related to constraints placed on evaluation processes in human services contexts have already been discussed - the impact on the research design that can be implemented, recruitment of participants and their willingness to participate in longer-term projects. But in addition to these factors it was evident from the pilot project there were others related to the positioning of such human service organisations in the larger State human services arena.

One such pragmatic consideration was the fact that any program in the human service arena selected to participate in such surveys would likely be funded by the Department of Human Services. With this funding come operational parameters, including target numbers of clients and numbers of hours of service delivery. With many programs within this funded model stretched to capacity just in delivering the service, it becomes very difficult to add a significant additional evaluation task to an already heavy workload. Further, small to medium sized organisations often lack the capacity to employ additional resources for such a task.

Another consideration was the alignment of any such evaluation with relevant informing policies, Acts and Frameworks. For instance, for any survey examining the quality of services to people with disabilities, the survey tool’s alignment with the Disability Act and the accompanying Quality Framework is essential.

A further factor was one of compatibility of the organisation’s data collection/entry protocols with those of the larger funding body. It was considered that, in Victoria, compatibility with the Department of Human Services Client Relationship Information System for Service Providers (CRISSP) might help to avoid inefficiencies in data entry, reporting and retrieval.

2.3.3 Key themes from the pilot project

The themes which emerged from the pilot project related to (1) the outcomes of the surveys and (2) the feasibility and operational considerations encountered in its administration.

Regarding survey outcomes:
• The outcomes gave evidence that the programs involved did have an impact on the client’s sense of agency, empowerment and wellbeing.
• The outcomes supported the feasibility of exploring a wide range of factors related to wellbeing.

Regarding feasibility and operational considerations:
• Any attempt to use such a survey tool would require consideration of applicability to a wide range of applications. For instance, attention would have to be paid to wording, with an eye to simplification and to the use of multiple questions with subtle distinctions.
• Considerations would also have to be made for applications with clients who might experience challenges in completing the survey. Such clients might include:
  o Those from culturally and linguistically diverse backgrounds.
  o Those with intellectual or cognitive challenges.
• The most successful approach to the administration of such surveys would be one that is incorporated into the organisation’s operational routine and not perceived as an additional task.



2.4 Emergent aims for current project

The findings and themes from the pilot project were fundamental to determining the purpose and design of the current project. The pilot had demonstrated the potential of the pilot survey for measuring client wellbeing in a meaningful way. Yet, there were issues regarding its feasibility as a useable tool for the organisation. Primarily, these issues related to the intrinsic qualities of the survey itself, namely its length, accessibility, language and layout; and to the practicability of administering the survey in a community service organisation subject to time and resource pressures.

Therefore, the aims that have emerged for the current project are to develop and refine the wellbeing measurement tool in such a way that it would retain its meaningfulness in capturing a holistic sense of subjective wellbeing; while employing more accessible language and layout; at the same time reducing the administrative burden through making the survey much shorter – and in all this, ensuring the survey retains its legitimacy as a measurement instrument. The approach adopted to meet these aims is discussed in chapter four. The next chapter reviews the conceptualisation of wellbeing and its measurement with reference to relevant literatures.



3. CONCEPTUALISATION: WELLBEING AND MEASUREMENT

The first three sections of this chapter elaborate on the conceptualisation that guided the initial pilot project (Whyte & Jones, 2011) and that has continued to inform the current work. They consider the ways wellbeing has been understood, and approaches to its measurement. This is followed by a consideration of contingent literatures on the place of wellbeing in an outcome-oriented community services environment.

3.1 Measuring wellbeing: Conceptually broad, yet specifically sensitive

Wellbeing is a multi-faceted concept concerned with people’s quality of life and their ability to live a life that has meaning and fulfilment for them. A more precise definition of its meaning has remained elusive, shaped by a variety of disciplines dating back to the writings of the ancient Greek philosophers. The commonality for most experts on the topic is that wellbeing is something positive worth striving for, with the central theme of wellbeing research being the desire to somehow improve the world for those living in it.

As just described, wellbeing is not a new concept. It has been studied for centuries and from a variety of disciplines including philosophy, psychology, medicine, economics, and the social and physical sciences. The notion of wellbeing is used interchangeably with terms such as happiness, quality of life, standard of living, and life satisfaction. For the sake of brevity, the following section provides an abridged version of a more extensive literature review carried out on subjective wellbeing.

In ancient Greek philosophy, Aristotle defined wellbeing as happiness. He stated that there are many good things in life, such as health, pleasure and family; however, the ‘highest good’ human beings can attain is happiness (Annas, 1993). He believed that wellbeing could best be achieved by carrying out worthwhile activities, such as intellectual pursuits. At the same time, Aristotle acknowledged that in order to have wellbeing one must possess certain resources, in particular positive relationships with significant others and a source of income. Similarly, he proposed that wellbeing was more difficult to achieve if a person experienced negative events, for example the loss of children or good friends (Annas, 1993).

Aristotle’s writings form the basis of the modern-day capability approach, made famous by economist Amartya Sen and enhanced by philosopher Martha Nussbaum (Nussbaum & Sen, 1993). The approach focuses on what people are effectively able to do and to be, that is, on their capabilities. Wellbeing for Sen and Nussbaum is achieved by allowing people the freedom to choose the kind of life they would like to lead. Both argue that the focus of policy decisions should be on achieving wellbeing and on removing the barriers that restrict wellbeing (Nussbaum & Sen, 1993). The theory rejects an equal distribution of resources or goods on the grounds that this is too simplistic. Such thinking is evident in the influential 2009 report of the Commission on the Measurement of Economic Performance and Social Progress: “the time is right for our measurement system to shift from measuring economic production to measuring people’s wellbeing” (Stiglitz, Sen & Fitoussi, 2009:12).

The capabilities approach presumes that different people have different needs in order to achieve a good life. For example, a person with a disability would require more resources than his/her peers to achieve a life of the same standard (Nussbaum, 1999; 2000). Wellbeing is therefore a matter of what a person is capable of doing, their access to resources and their opportunities to use their abilities. Nussbaum (2000) in particular argues that wellbeing is a universal concept, and much of the subsequent critique of her work concerns an alleged failure to acknowledge sufficiently the role of local context (Clark & Gough, 2005). It needs to be acknowledged that the religious, spiritual, and cultural beliefs of a community can have a vital role to play in an individual’s access to resources and thereby their general sense of wellbeing.

The difficulties around defining wellbeing are recognised by the Australian Bureau of Statistics (ABS), Australia’s national statistics agency. The ABS (2001:6) broadly defines wellbeing as “a state of health or sufficiency in all aspects of life” and notes that “measuring wellbeing therefore involves mapping the whole of life”. According to the ABS (2001:6), “there can be no single measure of wellbeing that satisfies all parties interested in helping people improve their lives. Rather, a range of methods needs to be available …”. Interestingly, it is remarked, “In some ways, wellbeing might best be assessed subjectively, as it is strongly associated with notions of happiness and life satisfaction” (ABS, 2001:6).

Wellbeing is commonly seen to contain objective and subjective aspects, and many studies of wellbeing divide the concept along these lines. Objective wellbeing (OWB), as the label indicates, incorporates objective conditions. Subjective wellbeing (SWB), on the other hand, relates to an individual’s self-assessment of wellbeing. Eminent scholars in the field have proposed that subjective wellbeing “is a particularly democratic scalar. It is based on the idea that how each person thinks and feels about his or her life is important” (Diener & Suh, 2000:4).

Explicit definitions of wellbeing, and differentiations between objective and subjective understandings of the concept, are not always immediately forthcoming. In these cases, the types of measurements and questions in use can give clues as to the particular meaning of wellbeing. For example, self-rating questions about an individual’s ‘happiness’ or ‘life satisfaction’ suggest measurement of SWB.

There have been attempts to develop universal standards, or indicators, for wellbeing at various scales, including at the level of the individual as well as at community and national levels (see Appendix C). It has been found possible to measure shifts in particular objective conditions (such as income, specific health symptoms and numbers of involvements in social services), but this is much more difficult with subjective experiences (such as emotional, social and existential conditions or attitudes). Overt measures such as outwardly observable behaviour patterns thought to be associated with wellbeing (Strack, Argyle & Schwartz, 1991), or peer/expert ratings of others’ wellbeing (Pavot, 2008), have thus far proved unsuccessful in the measurement of SWB. It has therefore been argued that subjective self-reports are the best way to elicit information about an individual’s perceived sense of wellbeing, or SWB (Diener et al, 1999). Indeed, some commentators would go further: “Because subjective well-being refers to affective experiences and cognitive judgements, self-report measures of subjective well-being are indispensable” (Larsen & Eid, 2008: 4, italics in original).

The differences in physical, cultural, social, and economic environments across communities and countries make a global wellbeing measure especially challenging. Nonetheless, the social indicator movement of the 1970s emerged with the aim of developing universal standards, or indicators, for wellbeing. The movement was founded in the belief that an increase in standard of living would contribute to an improved quality of life. Rather than measuring wellbeing at the individual level, indicators are recorded across communities and countries (OECD, 2009). These indicators are objective measures of wellbeing and include such measures as the number of people in active employment, life expectancy, literacy rates, and reported violent crime (OECD, 2009). The research group ‘Community Indicators Victoria’, based at the McCaughey Centre, has added that wellbeing indicators should include the importance of our relationships not only with each other but also with our external environment, such as our perception of safety and our ability to participate in community (Cox et al, 2010; Wiseman & Brasher, 2008).

In medicine and psychiatry, wellbeing is more often defined in terms of quality of life or, more specifically, ‘health-related quality of life’ (HRQoL). Quality of life (QoL) is a subjective measure by which the individual is asked to rate their perceived quality of life on Likert-type scales from poor to good. The field developed as a means of incorporating the patient’s view in medical decisions and is concerned with measuring a person’s physical and emotional functioning (Katschnig, 2006). Quality of life instruments are now predominantly used as outcome measures in clinical trials and health research, as well as to assist with ethical dilemmas as pharmaceutical options for diseases continue to progress.

The main limitation of HRQoL measures is that they are restricted to health status and do not incorporate contextual elements of life, such as income, education, freedom, and social inclusion. Physical and mental health are only one part of the wellbeing equation, and it would be remiss not to incorporate the spiritual and societal factors that contribute to experiencing a good and meaningful life (Katschnig, 2006).

In summary, wellbeing is a loose term that means different things to different people. Overall, it encompasses the values and experiences we hold dear, our access to resources, a feeling of purpose, the freedom to pursue goals, bodily health and integrity, and the ability to build positive relationships with others and with the world in which we live.

3.2 Various approaches/measures of subjective wellbeing

Wilson’s (1967, p. 294) description of the happy person was one of the first attempts to categorise the factors that contribute to experiencing SWB. He stated that the happiest people are “young, healthy, well-educated, well-paid, extroverted, optimistic, worry-free, religious, married with high self-esteem, job morale, modest aspirations, of either sex and of a wide range of intelligence.” Just over four decades later, more recent research corroborates many, but not all, of the factors Wilson associated with SWB.

The main theories behind why some people experience a high level of SWB whilst others do not are complementary to describing what factors impact on SWB. In general, theories of SWB can be divided into: (i) ‘top-down’ approaches, and (ii) ‘bottom-up’ approaches.

Top-down approaches view SWB as the product of individual internal traits and psychological processes, and maintain that we have a pre-determined ability to experience SWB that is largely independent of our environment. Top-down theorists emphasise the individual factors that influence SWB, including:

o genetic disposition, which has been shown to account for as much as 50% of the individual variance in SWB scores (Healey, 2008);

o cognitive processes under our voluntary control, such as conscious lifestyle and activity changes, positive thinking, and creating and pursuing meaningful goals (Sheldon & Lyubomirsky, 2004);

o personality type, with individuals reporting high scores on SWB ratings corresponding to the personality characteristics of extroversion, optimism, the ability to set reasonable goals and work towards them, and the ability not to dwell on past negative events (Diener et al., 1999);

o the ability to adapt our perceptions of wellbeing over time as we experience changes in our life circumstances (Cummins 2001; 2002); and

o the ability to compare ourselves with others and adjust our expectations of wellbeing accordingly (Calman, 1984).

In contrast, bottom-up approaches give weight to the external social and situational forces shaping our perception of SWB. Bottom-up theories view SWB as the accumulation of a series of happy moments. The happy person is one who has experienced many pleasurable events. The theory also puts forward that our achievements are dependent on the opportunities in our environment. Hence, bottom-up theories hold that wellbeing is the combination of experiencing satisfaction in a number of particular domains, such as family life, intimate relationships, lifestyle, health, financial and work situation, spirituality and housing. Bottom-up theories would emphasise the social factors shaping our abilities to achieve wellbeing.

The following bottom-up factors have been shown to have positive influences on subjective wellbeing:

o monetary income, which is strongly correlated with SWB for those who are unemployed or earning below the poverty line (Healey, 2008). For those living on moderate-to-high incomes sufficient to meet their needs, income becomes less important in life satisfaction ratings when compared to other factors (Healey, 2008; Helliwell, 2003);

o perceived good health (Diener et al., 1999; Helliwell 2003);

o marriage (Diener et al., 1999; Helliwell 2003) and positive relationships with extended family, friends, neighbours, as well as connections with the wider community (Argyle, 1999; Helliwell & Putnam, 2005; Jordan, 2007);

o partaking in a religious faith (Diener et al., 1999; Helliwell 2003);

o living in a community with perceived trust, freedom from violence and security (Diener et al., 1999; Helliwell 2003);

o age, as identified by Helliwell (2003), who found that those in the middle years of their lives (ages 35–54 years) reported lower levels of SWB than the young and those over 65 years of age, assuming participants were equally healthy; and

o access to nature (Burns, 2005).

Researchers in the field now believe that top-down and bottom-up theories are not mutually exclusive (Diener, 2008). Rather, SWB is a complex combination of both these approaches, incorporating physical, mental, emotional, social and environmental factors (Sheldon & Lyubomirsky, 2004). Whilst research often reports general population trends, we need to be aware that individual values, interpretations and experiences create unique ideas of what a happy and satisfied life would look like. Further, it can be said that the person with the greatest chance of achieving wellbeing has a positive temperament, is optimistic, doesn’t dwell on negative events, is living in an economically developed society, has access to positive social connections, sufficient resources to progress towards self-determined goals, and access to nature, and perceives their external environment to be safe.

With an understanding of the approaches to conceptualising SWB, it can be further understood that while social indicators, such as median income levels, crime rates, and so on, are one way of quantifying SWB, they present only a portion of the complete picture and are inadequate in defining wellbeing. As discussed above, outwardly observable behaviours thought to be associated with wellbeing, such as unconscious body language or an ‘energetic, friendly’ appearance, have not been found to be adequate correlates with self-reported wellbeing (Strack, Argyle & Schwartz, 1991). Ratings by peers and experts on other’s wellbeing have been found to be only weakly correlated with self-reports (Strack, Argyle & Schwartz, 1991; Pavot, 2008). Thus it has been proposed that only the individual is able to provide the information needed to determine what they feel about their own life (Diener et al., 1999). As more overt measures have thus far proved unsuccessful, subjective self-reports are currently the only solution in providing information about a person’s perceived sense of wellbeing.

3.3 Devising a single ‘aggregate’ instrument

SWB is typically measured by means of a single-measure survey questionnaire. Respondents are asked to rate their feelings and perceptions about themselves and their life on a rating scale from very poor to very good. The measurement of SWB is an umbrella term which incorporates affective responses (such as mood and emotions) as well as cognitive evaluations of life circumstances (life satisfaction). Affective responses are transitory and ongoing states of subjective experience.

In contrast, life satisfaction is concerned with how content people are with various aspects of their lives, such as their health, income and so on, and as such is usually more stable than affective responses. Although mood is pertinent to SWB, researchers are more interested in longer-term effects impacting on SWB. Life satisfaction ratings are therefore generally more popular in research, as life satisfaction is viewed as the more stable component of SWB (Diener et al., 1999). Ultimately though, an individual’s affective state will impact on their responses, and it is for this reason that the measurement of SWB is concerned with both affective and cognitive responses.
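To make the idea of an aggregate rating concrete, the sketch below averages Likert-type item responses into domain scores and a single overall SWB score. The item values and domain names are invented for illustration and do not reproduce the Windermere tool.

```python
# Hypothetical sketch: aggregating 5-point Likert responses into domain
# scores and a single SWB score. Domains and values are invented for
# illustration only.
from statistics import mean

# One respondent's answers (1 = very poor ... 5 = very good)
responses = {
    "emotional": [4, 3, 5],   # e.g. mood, coping, outlook items
    "physical":  [3, 4],      # e.g. health, energy items
    "community": [5, 4, 4],   # e.g. belonging, participation items
}

# Mean of the items within each domain
domain_scores = {d: mean(items) for d, items in responses.items()}

# Unweighted mean of the domain scores gives one aggregate SWB rating
swb_score = mean(domain_scores.values())

print(domain_scores)
print(round(swb_score, 2))  # 3.94
```

Whether domains should be weighted equally is itself a design decision; the pilot tool’s actual scoring may well differ.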

Overall, the evaluation of SWB is concerned with the “positive mood and emotional states within an individual’s subjective experience and a cognitive evaluation of the conditions and circumstances of his or her life in positive and satisfying terms” (Pavot, 2008 p. 125).

In further considering the applicability of SWB measures to Windermere programs: despite SWB’s broad applications, whether it can be used to measure the effectiveness of a service such as Windermere remains largely unknown. The closest-related research bearing on this question is the use of Quality of Life (QoL) measures in the mental health field. These have become increasingly popular for measuring treatment outcomes as well as for assessing the quality of mental health services (Curella, 1999; Lasalvia & Ruggeri, 2006). Many studies have found correlations between subjective QoL scores and service satisfaction, with those individuals reporting high service satisfaction also reporting high perceived QoL (Holcomb et al., 1998; Ruggeri et al., 2001; 2005).

Oliver et al. (1996) summarise the appeal of selecting a QoL measure when evaluating a service:

o it places the service user at “the centre of service planning” (Oliver et al., 1996, p. 242);

o it serves as a useful way of monitoring and reviewing the user’s life circumstances;

o it can be used in clinical practice to determine individual user goals and reflect upon personal achievements; and

o it is an intuitively well-received concept understood by a variety of audiences.

Overall though, within the scope of this literature review, there is surprisingly little information on the use of SWB in measuring service effectiveness. It is, however, encouraging that subjective measures such as QoL questionnaires, which contain elements of SWB, have been used effectively in service evaluation. The correlation between service satisfaction and QoL is particularly reassuring. The quality of life studies described above share some common features: each evaluates a single program within a service, with a subject group that shares a common characteristic, such as a diagnosed mental illness.

Windermere differs from these reports in that its service spans a heterogeneous group of service users, as well as multiple programs and interventions. Finding an SWB measure that adequately assesses change across all these domains would prove a challenging task. Considering the multitude of programs offered by Windermere, there was a need for a measure of effectiveness that was conceptually broad, but still specifically sensitive. In addition, the measure had to encompass the physiological, psychological, social, cultural and existential aspects of the human condition. These concerns guided the work on development of the pilot survey tool, and a longitudinal SWB rating was proposed across the domains of emotional wellbeing, physiological wellbeing, community engagement, cultural engagement, and spiritual worth. It was this survey tool that was to be further developed and refined in the current project.

3.4 Wellbeing and outcomes: The policy context

Wellbeing is a concept that has rapidly grown in use since its association with both the capabilities approach advanced by Sen and Nussbaum, which re-worked notions of human welfare, and the study of happiness and life satisfaction, which gave added importance to an individual’s self-assessment of their state of being. It has also gained traction from its proximity to the idea of wellness, and the view that health should be as much a matter of promoting wellness as treating illness.

The appropriation of the concept is evident in formal policy contexts. Many community service organisations in Victoria that provide services directly to clients are subject to accreditation by the Victorian government. Of particular note are the standards of the Department of Human Services (DHS), where wellbeing now features as Standard 3: “People’s right to wellbeing and safety is promoted and upheld” (DHS, 2011:4). Criteria related to this standard include reference, for example, to the use of strengths-based approaches and goal-oriented plans in service provision (DHS, 2011:10-12).

The local government of one of the areas served by Windermere, Bayside City, provides an example of the potency of wellbeing in policy formation. The city council has developed a “comprehensive new strategy that will promote wellbeing for people of all ages and abilities” (Bayside City Council, 2012:4). The strategy and action plans “look at the environments that are the most important for wellbeing: the built, natural, social and economic environments” (Bayside City Council, 2012:9). Accompanying goals target an engaged and supportive community; a healthy and active community; and safe and sustainable environments.

The question of knowing whether or not a desired outcome is achieved is clearly one that exercises Windermere itself – but also the wider community services sector. The significance of attending to outcomes in setting policy directions is an area that has increasingly, and often controversially, engaged the sector. In Victoria, it has been one of the key elements of a proposed sector reform.

In late 2012, Professor Peter Shergold was appointed by the Victorian Government to lead the Service Sector Reform project. His final report was launched a year later (Shergold, 2013). During the five-month consultation period, an enhanced focus on outcomes quickly emerged as one of the key themes for debate and reform. “There was a broad consensus that service delivery should focus on results… Shifting the focus of performance measurement from the allocation of inputs to the achievement of outcomes was strongly supported by service users…” (Shergold, 2013: 24).

Shergold notes that many community service organisations were equivocal about the adoption of an outcomes focus. For some, the concern was that outcomes could “sometimes focus too much on the immediate alleviation of disadvantage rather than its longer term prevention” (Shergold, 2013:24). Such misgivings about an approach that underestimated the impact of structural factors in achieving positive outcomes were indicative of an underlying concern about “the manner in which outcomes were set and defined” (Shergold, 2013:24). In addition, some community service organisations were wary about linking funding directly to the achievement of outcomes, fearing that “outcomes-based funding models could have unanticipated consequences if they were inappropriately designed” (Shergold, 2013:24).

Shergold concluded that “the challenge is the need to ensure that outcomes are longer term and wide-ranging. They should be based on a holistic approach to individual disadvantage and take account of the goals service users set for themselves” (2013:25). The recommendation of the Service Sector Reform is that “an outcomes framework should be developed through a partnership between the government and community service organisations… Policy development and program design should be based on the collection of data, research analysis and evaluation of outcome performance” (2013:25). Shergold commented that while all community service organisations “recognize that establishing an outcomes framework would not be easy… [most] believe that, for all the manifold challenges, this is a goal worth pursuing” (2013:25).

3.5 Summary

It was evident from the literature that informed the pilot project’s initial development of a wellbeing measurement tool that this would be a challenging but worthwhile task. While subject to a multitude of interpretations, wellbeing had become a concept of significance, a marker (perhaps an ‘empty signifier’) for a valued state of being. Focusing on subjective wellbeing can be a means to enable the clients of community service organisations to communicate their own sense of wellbeing. Yet to reflect a more holistic understanding of wellbeing, this has to prompt self-assessment across a broad range of life domains, from physical wellbeing outwards to places of belonging and inwards to more existential aspects.

The findings from the pilot project and further engagement with the literature have provided direction for the current project. The wellbeing measurement tool as originally developed was found to have the potential to capture meaningful information across a range of life domains. It was consistent with a ‘bottom up’ approach to measuring wellbeing, enquiring directly of clients themselves. Furthermore, it showed the potential to contribute to a more outcomes-focused orientation in the community services sector, which has gained increasing prominence since the original pilot project was undertaken. The original tool had proven too cumbersome for ready administration within a time and resource pressured community service organisation, and the nature of its language, style and overall presentation had also dampened enthusiasm. Clearly, more work was needed on the instrument itself if it was to be adopted and put to use.

The current research project took up the challenge of developing and refining the wellbeing measurement tool into an instrument that would measure meaningful aspects of clients’ lives in a form that would be useable within the community service sector. The intent was to design an instrument that would be shorter and more accessible whilst retaining its legitimacy as a measurement tool. This gave rise to a research approach that would comprise a number of stages and processes, as presented in the next chapter.

4. RESEARCH APPROACH: METHODOLOGY AND PROCESS

This chapter outlines the project design and methods, structured around the different phases of the project. An explanation is given for the particular methods and methodological approach taken.

4.1 Principles behind the approach

Several principles inform the project approach. These are:

o It seeks to investigate whether a quantitative measurement of Subjective Wellbeing (SWB) can be developed as a valuable outcome measure in service delivery

o It embraces collaboration and participation as necessary conditions for developing shared understanding of outcomes measures

o It places importance on the rigorous development of a statistically reliable and valid instrument

o It seeks to mutually engage the community service sector, the community and academia in the production of a worthwhile and useable tool

o It uses both quantitative and qualitative data collection methods
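On the principle of statistical reliability, one standard check for a multi-item scale is internal consistency, usually reported as Cronbach’s alpha. The formula below is the conventional one; the response matrix is invented purely to illustrate the computation and is not project data.

```python
# Cronbach's alpha for the internal consistency of a set of Likert items.
# The 6 x 4 response matrix below is invented for illustration.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

data = np.array([
    [4, 4, 5, 4],
    [2, 3, 2, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
    [4, 5, 4, 4],
], dtype=float)

alpha = cronbach_alpha(data)
print(f"Cronbach's alpha = {alpha:.2f}")
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, though the appropriate threshold depends on the use to which the scale is put.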

4.2 Project design

The project was undertaken in different stages that were designed to address the project’s core objective of designing and refining a statistically reliable and valid wellbeing measurement tool, while maintaining a stakeholder-centric approach. These stages were:

o Setting the context and inquiry: a review of relevant literature and consultation with stakeholder groups

o Development of materials and analysis: the design, piloting, testing and revision of research instruments and processes to generate further knowledge about wellbeing outcomes in the community service sector

As described in detail below, this developmental aspect was operationalised through a number of successive, iterative stages of designing, piloting and revising, incorporating relevant stakeholder feedback. The contextual and inquiry work overlapped with, and supported, the development of materials and subsequent analysis.

Ethics approval to conduct the research was obtained from the Human Research Ethics Committee at RMIT University. The Plain Language Statement used when obtaining consent for participation from students of the university is provided in Appendix D.

4.3 Project Methods

This section details the rationale for the choice of particular methods and means for each stage/aspect of the project.

4.3.1 Setting the context and inquiry

Literature Review

The literature review formed an important step in scoping and contextualising important issues around the development and refinement of a wellbeing measurement tool for use in the community services sector. The literature review has guided the project in three main ways:

o Informing the theoretical and practical contexts for developing and refining a valid wellbeing measurement tool

o Developing and refining important research questions, objectives and research design

o Determining themes to guide both qualitative and quantitative data analysis

To serve these purposes, three main aspects of the literature were under review during the pilot and current projects:

o Conceptualisation of wellbeing, in particular subjective wellbeing

o Subjective wellbeing and outcome measurement

o Designing a single ‘aggregate’ measurement tool

The literature review has shown how the project is located within a broader series of debates around shifting emphases on accountability and effectiveness, increasing recognition of the importance of a personal (subjective) sense of wellbeing, approaches to measuring aspects of the human condition, and issues around instrument design and implementation.

Consultation with stakeholder groups

This project embraces collaboration and participation as necessary conditions for developing a useful and implementable wellbeing measurement tool. As such, different communication mechanisms for stakeholders connected with the project have been in use at different points along the way. This involved actively providing opportunities for stakeholders to participate in the development of the project’s outcomes. The main target stakeholders include Windermere managers, team leaders and associated staff, Windermere clients (former and current), RMIT students in Social Work and associated programs, and members of the Project Team. Other target stakeholders include the Funding Body (William Buckland Foundation and ANZ Trustees), and Social Work educators and academics.

A series of communication activities were employed throughout the project with targeted stakeholders to gain feedback on project findings, provisional models and materials. Communication activities included:

o Project team meetings

o Reference group meetings

o Briefing sessions with Windermere managers, team leaders and associated staff

o Consultation sessions with Windermere clients (former and current)

o Presentations to local, regional and national forums of relevant professional and industry groups

o Multiple surveys of RMIT University students in Social Work and associated programs (asking for feedback on the layout, wording and sequence of measurement tool questions – as discussed further below)

Summary of Project Team Outputs – Setting the context and inquiry
Relating to this phase, the project team produced:

o A literature review, updated from the pilot project
o A course of inquiry, incorporating pathways for consultation, that fed into the next main stage of the project – the development of materials and analysis



4.3.2 Development of materials and analysis

Designing and testing abridged versions of the wellbeing measurement tool
This phase involved testing the scope of the pilot measurement tool, modifying it, and testing the trustworthiness of the abridged version. It proceeded in four stages.

o Stage 1: Survey of RMIT students enrolled in Social Work and associated programs, applying: (a) the existing pilot measurement tool with improved layout, and then modified versions; (b) additional socio-demographic questions; and (c) additional feedback questions
o Stage 2: Analysis of the results: (a) correlational analysis – to identify the 3 to 4 significant factors represented in the survey responses; (b) an analysis of variance within the RMIT data corpus based on socio-demographic variables (age, gender, first language)
o Stage 3: From the above analysis, revision of the existing wellbeing measurement tool into an abridged version to be trialled
o Stage 4: Trialling the revised and abridged measurement tool (Tool 5) by survey of RMIT students enrolled in Social Work and associated programs

Additional steps between Stage 1 and Stage 2 involved testing four versions of the survey, allowing for analysis of variance:

o Measurement Tool 2: the initial pilot survey (40 questions) with improved layout
o Measurement Tool 3: re-ordering the sequence of questions, re-wording the questions, and modifying the layout (40 questions)
o Measurement Tool 4: further re-ordering the sequence of questions (40 questions)



Note: Measurement Tool 1 is the initial pilot survey from 2009-10.

Figure 4.1 illustrates the process of revising the different versions of the measurement tool, involving changes to the wording, layout and sequencing of the questions, and leading to the final abridged wellbeing measurement tool (Measurement Tool 5).

Figure 4.1: Different versions of measurement tool

o Measurement Tool 1: the pilot survey
o Measurement Tool 2: same questions as the pilot survey; improved layout
o Measurement Tool 3: similar layout to Tool 2; rewording of questions; reordering of questions
o Measurement Tool 4: same layout and same questions as Tool 3; reordering of questions (as compared to Tools 2 & 3)
o Measurement Tool 5: reduced number of questions

Determining the sites for the survey rounds
The questionnaire survey of measurement tool versions 2, 3, 4 and 5 was administered to students enrolled in Social Work and associated social science and humanities programs, leading to a statistical analysis. While this cohort of RMIT students is something of a convenience sample – composed of persons most easily accessed to complete the survey – the group represents well the diversity of the general Victorian population, with significant representation of people from CALD as well as refugee backgrounds, people with disability, and people eligible for other forms of income security. The survey asked participants a series of additional questions around age group, sex, field of study, student status, country of birth, language, place of residence and parental status, allowing an assessment to be made of the socio-demographic status of survey participants. This enabled an analysis of variance within the RMIT data corpus based on demographic variables (age, gender, country of birth, language, and parental status).
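The analysis of variance referred to above compares variation between demographic groups with variation within them. As an illustration only (invented 1-5 Likert responses, not project data or the project's analysis code), a minimal one-way ANOVA F statistic can be computed as follows:

```python
from statistics import mean

def one_way_f(groups):
    """One-way ANOVA F statistic: between-group variance over within-group variance."""
    all_vals = [v for g in groups for v in g]
    grand = mean(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical 1-5 Likert responses grouped by age band (invented data)
by_age_band = [[4, 5, 4, 5], [3, 3, 2, 3], [4, 4, 5, 4]]
f_stat = one_way_f(by_age_band)  # a large F suggests the group means differ
```

A large F relative to the critical value for (k-1, n-k) degrees of freedom indicates that at least one group mean differs from the others.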

In total, 543 RMIT students completed the questionnaire survey. Figure 4.2 shows the breakdown of respondent numbers as per the different measurement tool versions, along with the respective survey sites.

Figure 4.2: Measurement tool respondent numbers and sites

o Measurement Tool 1: <25 Windermere clients
o Measurement Tool 2: 246 RMIT students
o Measurement Tool 3: 108 RMIT students
o Measurement Tool 4: 107 RMIT students
o Measurement Tool 5: 82 RMIT students

Note on potential for statistical analyses
The statistical analysis from the pilot project (2009-10) had been constrained by two main factors: (a) sample size and (b) levels of measurement of the variables. The sample size was <25 (30 initial respondents, reduced to <25 through attrition in the availability of the intended follow-up survey). This overall number allowed for t-test comparisons but did not allow for further differentiation of responses (for example, across three differing programs), as it falls at the minimum range for most means-analysis protocols.



Regarding levels of measurement of the survey variables, wherever possible the highest level of measurement – ranked from nominal through ordinal and interval to ratio – was utilised. Even applying this technique to the semantic differentiation response variables, most of the client socioeconomic variables are nominal-level measures – with the exception of client age – and the response variables are ordinal.

Summary of Project Team Outputs – Measurement Tool Development and Analysis
Relating to this phase, the project team produced:

o A survey of 543 RMIT students enrolled in Social Work and associated programs, applying: (a) the existing pilot measurement tool; (b) additional socio-demographic questions; and (c) additional feedback questions
o Analysis of the results: (a) correlational analysis – to identify the 3 to 4 significant factors represented in the survey responses; (b) an analysis of variance within the RMIT data corpus based on socio-demographic variables (age, gender, first language); (c) analysis of feedback on survey design from RMIT students
o Revision of the existing wellbeing measurement tool into an abridged version
o Trialling of the revised and abridged measurement tool by application to RMIT students enrolled in Social Work programs

The contextual and inquiry work overlapped with the development of materials and subsequent analysis. Figure 4.3 illustrates the research approach for this project. It shows the different phases, stages and approaches taken in testing the scope of the initial pilot tool and the trustworthiness of the modified versions, leading to the final abridged measurement tool.



Figure 4.3: Outline of Research Approach

1. Survey Round One with Measurement Tool 2 (respondents: 246 RMIT students)
2. Results from Survey Round One: (a) begin correlational analysis; (b) redesign and reword aspects of the tool
3. Additional consultation at Windermere with (a) managers and other staff on the wording, phrasing and design of the measurement tool, and (b) a small group of Windermere clients; revise measurement tool
4. Survey Round Two with Measurement Tool 3 and Measurement Tool 4 (respondents: 215 RMIT students)
5. Results from Survey Round Two: (a) continue correlational analysis; (b) redesign and reword aspects of the tool
6. Additional consultation with the Windermere Manager Action Group on the wording, phrasing and design of the measurement tool; revise measurement tool down to groupings of questions (physical health, emotional health, neighbourhood, community, temporal) reflecting key domains of the human condition (bio-psycho-socio-structural-spiritual)
7. Survey Round Three with Measurement Tool 5 (respondents: 82 RMIT students)


4.4 Summary
This chapter has described the rationale and significance of the project. It shows how the project makes an important contribution to advancing understanding of the use of quantitative measurement of SWB as an outcome measure for service delivery. The two phases of the project design describe the logic with which the project was undertaken, from initial review and inquiry through to analysis and development. The methods followed in each of these phases show how the collaborative and participative approach of the project was translated into project activities. The next chapter presents the quantitative and qualitative analysis and the findings of the study.



5. FINDINGS: SURVEY DEVELOPMENT AND REFINEMENT

This chapter describes the various phases of the survey development, including the features of the iterative process of survey refinement undertaken, the incorporation of feedback at each phase, the statistical analyses undertaken at each phase, and the overall findings of the process.

5.1 Considerations for statistical analyses

Regarding statistical analyses, it is worth reiterating that in this research the individual’s ‘wellbeing’ is a dependent variable, as measured/indicated by the survey instrument. As an outcome, it varies according to the independent variable(s), for example the ‘effectiveness’ of Windermere’s activities/interventions. While the pilot survey indicated the potential to create a tool that would be effective in capturing pre- and post-intervention differences in clients’ senses of wellbeing, the aims of the phases of instrument refinement undertaken here were:

o to more fully develop statistical power related to the construct validity of the overall instrument
o to demonstrate reliability in responses across a larger sample of participants; and
o to undertake a norming of responses across various participant gender, age and language groupings.

The use of semantic differentiation Likert scales for each of the questions meant that descriptive statistical analyses in the form of means, t-tests and analysis of variance were used to identify changes in responses across the survey iterations. Further, the administration of socio-demographic questions to RMIT University students (Appendix E), alongside the survey instrument itself, facilitated t-tests and analysis of variance between participant socio-demographic variables, e.g. gender, age group, language. Cronbach’s Alpha analysis was undertaken to examine internal consistency at both the initial and final phases, and correlational analysis was utilised in the process of reducing the number of questions.
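Cronbach's Alpha, the internal-consistency check mentioned here, is straightforward to compute from item responses. The following sketch uses invented 1-5 Likert data purely for illustration; it is a minimal version of the standard formula, not the project's analysis code:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's Alpha from a list of item columns (one list of scores per item)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # each respondent's total score
    return (k / (k - 1)) * (1 - sum(variance(col) for col in items) / variance(totals))

# Invented 1-5 Likert responses: three items, six respondents
items = [
    [4, 5, 3, 4, 5, 4],
    [4, 4, 3, 5, 5, 4],
    [5, 4, 2, 4, 5, 3],
]
alpha = cronbach_alpha(items)
```

Values approaching 1 indicate that the items respond consistently; values around 0.7-0.8 or above are conventionally read as acceptable internal consistency.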



5.2 Addressing and incorporating stakeholder and participant feedback

Client consultation afforded important feedback and, since ease of administration and accessibility for service users were factors emphasised by Windermere staff throughout the pilot survey and subsequent development process, provisions for client comments were incorporated at each of the subsequent survey administration iterations (Appendix F). Similarly, feedback from the RMIT University student participants informed iterations of the refinement process (Appendix G).

5.3 Sequence of measurement tool administrations and statistical analyses

5.3.1 Measurement tool 2
As noted in chapter 4 above, measurement tool 2 (Appendix H) was administered to 246 RMIT University students. In addition to the 40-question measurement tool, participants were also given surveys asking for socio-demographic details and for feedback regarding the features of the survey instrument.

This was the first larger-scale administration of the instrument which was changed only in colour and font size from the pilot instrument.

The analyses undertaken for this instrument included identification of aggregate sample mean responses for each of the questions. In addition, the response for each question was further filtered to examine means by age groups, gender, country of birth and language. Further, internal consistency of the questions within the tool was confirmed via Cronbach’s Alpha analysis.
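Filtering mean responses by socio-demographic group, as described above, amounts to a simple group-by computation. The sketch below uses hypothetical records whose field layout is illustrative only, not the project's data structure:

```python
from statistics import mean

# Hypothetical per-respondent records: (age_group, gender, score on one question).
records = [
    ("20-24", "F", 4), ("20-24", "M", 5), ("25-34", "F", 4),
    ("25-34", "F", 3), ("18-19", "M", 5), ("20-24", "F", 4),
]

def mean_by(rows, key_index):
    """Mean of the final field, grouped by the field at key_index."""
    groups = {}
    for row in rows:
        groups.setdefault(row[key_index], []).append(row[-1])
    return {key: mean(vals) for key, vals in groups.items()}

by_age = mean_by(records, 0)
by_gender = mean_by(records, 1)
```

The same grouping can be repeated for country of birth and language to produce the per-variable means reported for each tool.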

Given the still limited number of responses, no attempt was made to conduct correlational analyses of individual questions with the purpose of reducing the number of questions. This would be undertaken after administration and analyses of subsequent tool administrations.

5.3.2 Measurement tool 3
Measurement tool 3 (Appendix I) was one of two survey iterations (along with measurement tool 4) that responded to feedback from measurement tool 2 participants and from Windermere staff. Two different tools were created and administered in response because of the complex nature of the feedback suggestions.

So as not to conflate changes/variables from one survey instrument to another, and thus compromise statistical analyses of variance, this tool was similar in layout to tool 2, with some rewording and reordering of questions.

This measurement tool was administered to 108 RMIT University students. As with measurement tool 2, these participants were also given surveys requesting socio-demographic details and feedback regarding the desirable/undesirable features of the survey instrument.

The analyses of this instrument included identification of aggregate sample mean responses for each of the questions and analysis of variance for each question across tools 2 and 3. The response for each question was further filtered to examine variance across age groups, gender, country of birth and language. Also, correlational analysis comparing the means of related questions was conducted.

5.3.3 Measurement tool 4
Measurement tool 4 (Appendix J) was the companion to survey tool 3 in responding to tool 2 feedback, and used the same layout and the same questions. The changes were in the reordering of questions compared to tools 2 and 3.

As with tool 3, the analyses undertaken for this instrument included identification of aggregate sample mean responses for each of the questions and analysis of variance for each question across survey tools 2, 3 and 4.

As with tool 3, the analyses undertaken for this instrument included identification of aggregate sample mean responses for each of the questions and analysis of variance for each question across survey tools 2, 3 and 4.

An example of the t-test results for one socio-demographic variable, gender, is provided in Appendix K. Further, the mean response for each question was filtered to examine variance across age groups, gender, country of birth and language.
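The gender t-tests follow the standard two-independent-samples form. A minimal sketch using Welch's unequal-variance statistic and invented scores (the choice of Welch's variant here is an assumption for illustration; the project's actual results are in Appendix K):

```python
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples of unequal size and variance."""
    return (mean(a) - mean(b)) / sqrt(variance(a) / len(a) + variance(b) / len(b))

# Invented mean-scale scores by gender (illustration only, not project data)
male = [4.2, 3.8, 4.5, 4.0, 3.9]
female = [4.4, 4.1, 4.6, 4.3, 4.5, 4.0, 4.2]
t_stat = welch_t(male, female)
```

The sign of the statistic shows the direction of the difference; its magnitude, judged against the t distribution, indicates whether the group means plausibly differ.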

At the conclusion of the analysis of this individual tool, correlational analysis across similar domain questions was conducted. This was to identify those individual questions with the highest measures of response correlation with others, so as to reduce the number of questions from the original 40 to a smaller number (Appendix L refers).
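The correlational reduction described here flags pairs of questions whose responses track one another closely, making one of the pair a candidate for removal. A sketch with hypothetical item responses, computing Pearson's r by hand (illustrative only, not the project's procedure):

```python
from statistics import mean, stdev

def pearson_r(x, y):
    """Pearson correlation between two lists of item responses."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

# Invented responses to two similarly worded items (hypothetical data)
q_social = [4, 5, 3, 4, 2, 5]
q_social_time = [4, 5, 3, 5, 2, 5]
r = pearson_r(q_social, q_social_time)  # a high r flags the pair as candidates to merge
```

Pairs correlating above a chosen cut-off can be treated as measuring much the same thing, so only one of each pair need be retained in the abridged tool.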



5.3.4 Measurement tool 5
Measurement tool 5 (Appendix M) is a 20-question instrument administered to 82 RMIT University students. The reduction in the number of questions is the result of correlational analysis of response means for individual questions across tools 2, 3 and 4, which is described in greater detail below. The complete list of questions across tools 2, 3, 4 and 5 is at Appendix N.

5.3.5 Descriptive statistics of aggregated sample by survey iteration and socio-demographic variables

Descriptive statistics of aggregated sample by survey iteration and socio-demographic variables – Number (%)

| Socio-demographic variable | Tool 2 | Tool 3 | Tool 4 | Tool 5 | Total |
| --- | --- | --- | --- | --- | --- |
| Gender: Male | 36 (14.6) | 25 (23.1) | 16 (15.0) | 12 (14.6) | 89 (16.4) |
| Gender: Female | 209 (85.0) | 81 (75.0) | 90 (84.1) | 70 (85.4) | 450 (82.9) |
| Gender: Unidentified | 1 (0.4) | 2 (1.9) | 1 (0.9) | 0 (-) | 4 (0.7) |
| Language spoken: English only | 201 (81.7) | 85 (78.7) | 81 (75.7) | 65 (79.3) | 432 (79.6) |
| Language spoken: LOTE | 42 (17.1) | 21 (19.4) | 24 (22.4) | 16 (19.5) | 103 (19.0) |
| Language spoken: No answer | 3 (1.2) | 2 (1.9) | 2 (1.9) | 1 (1.2) | 8 (1.5) |
| Birthplace: Australia | 197 (80.1) | 88 (81.5) | 86 (80.4) | 69 (84.1) | 440 (81.0) |
| Birthplace: Outside Australia | 48 (19.5) | 18 (16.7) | 19 (17.8) | 12 (14.6) | 97 (17.9) |
| Birthplace: No answer | 1 (0.4) | 2 (1.9) | 2 (1.9) | 1 (1.2) | 6 (1.1) |
| Age group: 18–19 | 20 (8.1) | 21 (19.4) | 15 (14.0) | 15 (18.3) | 71 (13.1) |
| Age group: 20–24 | 116 (47.2) | 50 (46.3) | 44 (41.1) | 40 (48.8) | 250 (46.0) |
| Age group: 25–34 | 76 (30.9) | 31 (28.7) | 35 (32.7) | 20 (24.4) | 162 (29.9) |
| Age group: 35–44 | 22 (8.9) | 4 (3.7) | 9 (8.4) | 4 (4.9) | 39 (7.2) |
| Age group: 45–54 | 8 (3.3) | 2 (1.9) | 4 (3.7) | 3 (3.7) | 17 (3.1) |
| Age group: 55–64 | 3 (1.2) | 0 (-) | 0 (-) | 0 (-) | 3 (0.6) |
| Age group: No answer | 1 (0.4) | 0 (-) | 0 (-) | 0 (-) | 1 (0.2) |


Throughout the three rounds of survey administration, a total of 543 students completed the wellbeing measurement tool. The overwhelming majority of respondents were female: 450 (82.9%), compared with 89 (16.4%) male respondents; four respondents did not identify or answer the question relating to gender. On language spoken, a large majority of respondents spoke English only: 432 (79.6%), compared with 103 (19%) who spoke a LOTE (Language Other Than English); eight respondents did not answer the question on language spoken. On birthplace, the overwhelming majority of respondents were born in Australia: 440 (81%), compared with 97 (17.9%) born outside Australia; six did not answer the question on place of birth.

On age groups, the largest number of respondents fell in the 20-24 year old age group: 250 (46%), followed by 162 (29.9%) in the 25-34 age group, 71 (13.1%) in the 18-19 age group, 39 (7.2%) in the 35-44 age group, 17 (3.1%) in the 45-54 age group and 3 (0.6%) in the 55-64 age group. One respondent did not answer the question on age group.

5.4 Results of analyses
Key to the entire survey development and refinement process described earlier was the conduct of various analyses of responses. These findings include the aggregated survey results, the results by variables and the results of the correlational process for the reduction of questions for measurement tool 5.

5.4.1 Aggregated survey results
Aside from socio-demographic and additional feedback information, the balance of the survey consisted of questions related to wellbeing. Measurement Tools 2, 3 and 4 consisted of 40 questions (8 main questions, with the balance subset questions). Measurement Tool 5 consisted of 20 questions (5 main questions, with the balance subset questions).

As shown in Appendix N, these questions, including their respective subset questions, were:

o Question 1: Impact of physical health problems on social activities.
o Question 2: Impact of emotional issues on social activities.
o Question 3: Impact of physical health problems on work or other activities.
o Question 4: Impact of emotional issues on work or other activities.
o Question 5: General attitude towards life over the past month.
o Question 6: Attitudes/experiences of living in their neighbourhood.
o Question 7: How they feel about their community.
o Question 8: General sense of wellbeing now and compared to three months ago.

The response options for each of the above were via a semantic differentiation five-point Likert scale. This was the case for each of the individual questions, together with all subset questions. The tabulated findings below are presented in the same question sequence initiated in Measurement Tool 2, the first survey round undertaken in this project.
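The ‘(reversed stats)’ notation used with these scales indicates that responses were re-coded so that a higher score always denotes greater wellbeing. On a five-point scale this re-coding is a simple transformation (an illustrative sketch, not the project's code):

```python
def reverse_score(response, scale_min=1, scale_max=5):
    """Re-code a Likert response so a higher score always means greater wellbeing."""
    return scale_max + scale_min - response

raw = [1, 2, 3, 4, 5]
recoded = [reverse_score(r) for r in raw]  # 1 becomes 5, 2 becomes 4, and so on
```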

Q1 Physical Health & Social Activities

| Measurement Tool 2 | Mean | Measurement Tool 3 | Mean | Measurement Tool 4 | Mean | Measurement Tool 5 | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Q1 | 4.40 | Q3a | 4.50 | Q1a | 4.24 | Q1 | 4.38 |
| Q1a | 4.38 | Q3b | 4.25 | Q1b | 4.27 | | |
| Q1b | 4.31 | Q3c | 4.12 | Q1c | 4.03 | | |
| Q1c | 4.39 | Q3d | 4.16 | Q1d | 4.27 | | |
| Q1d | 4.44 | Q3e | 4.54 | Q1e | 4.22 | | |

Examining the aggregated responses on questions relating to physical health and social activities, the initial survey responses (tool 2) were very similar in their distribution, with a range from a low of 4.31 to a high of 4.44. In addition, for all survey rounds there is consistency in responses between the first and follow-up surveys. On a 1-5 Likert scale ranging from ‘not at all’ to ‘extremely’ (reversed stats), the means across all questions [Q1, 1a, 1b, 1c, 1d] range from a low of 4.03 to a high of 4.54.



Q2 Emotional Issues & Social Activities

| Measurement Tool 2 | Mean | Measurement Tool 3 | Mean | Measurement Tool 4 | Mean | Measurement Tool 5 | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Q2 | 3.81 | Q5a | 3.80 | Q3a | 3.78 | Q3 | 3.84 |
| Q2a | 3.96 | Q5b | 3.93 | Q3b | 3.93 | | |
| Q2b | 3.87 | Q5c | 3.87 | Q3c | 3.75 | | |
| Q2c | 4.02 | Q5d | 3.93 | Q3d | 4.08 | | |
| Q2d | 3.55 | Q5e | 3.59 | Q3e | 3.65 | | |
| Q2e | 3.71 | Q5f | 3.80 | Q3f | 3.68 | | |

Examining the aggregated responses on questions relating to emotional issues and social activities, the initial survey responses (tool 2) revealed slight variations in their distribution, with a range from a low of 3.55 to a high of 4.02. For all survey rounds, however, there is consistency in responses between the first and follow-up surveys. On a 1-5 Likert scale ranging from ‘not at all’ to ‘extremely’ (reversed stats), the means across all questions [Q2, 2a, 2b, 2c, 2d] range from a low of 3.55 to a high of 4.08.



Q3 Physical Health & Work Activities

| Measurement Tool 2 | Mean | Measurement Tool 3 | Mean | Measurement Tool 4 | Mean | Measurement Tool 5 | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Q3 | 4.46 | Q4a | 4.67 | Q2a | 4.24 | Q2 | 4.29 |
| Q3a | 4.57 | Q4b | 4.45 | Q2b | 4.33 | | |
| Q3b | 4.46 | Q4c | 4.35 | Q2c | 4.21 | | |
| Q3c | 4.48 | Q4d | 4.49 | Q2d | 4.33 | | |
| Q3d | 4.57 | Q4e | 4.65 | Q2e | 4.32 | | |

Examining the aggregated responses on questions relating to physical health and work activities, the initial survey responses (tool 2) were very similar in their distribution, with a range from a low of 4.46 (two instances) to a high of 4.57 (also two instances). For all survey rounds there is consistency in responses between the first and follow-up surveys. On a 1-5 Likert scale ranging from ‘not at all’ to ‘extremely’ (reversed stats), the means across all questions [Q3, 3a, 3b, 3c, 3d] range from a low of 4.21 to a high of 4.67.



Q4 Emotional Issues & Work Activities

| Measurement Tool 2 | Mean | Measurement Tool 3 | Mean | Measurement Tool 4 | Mean | Measurement Tool 5 | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Q4 | 4.06 | Q6a | 4.01 | Q4a | 3.95 | Q4 | 3.96 |
| Q4a | 4.24 | Q6b | 4.29 | Q4b | 4.24 | | |
| Q4b | 4.14 | Q6c | 4.25 | Q4c | 4.00 | | |
| Q4c | 4.14 | Q6d | 4.30 | Q4d | 3.97 | | |
| Q4d | 3.84 | Q6e | 4.04 | Q4e | 3.87 | | |
| Q4e | 4.00 | Q6f | 4.12 | Q4f | 4.04 | | |

Examining the aggregated responses on questions relating to emotional issues and work activities, the initial survey responses (tool 2) revealed slight variations in their distribution, with a range from a low of 3.84 to a high of 4.24. For all survey rounds, however, there is consistency in responses between the first and follow-up surveys. On a 1-5 Likert scale ranging from ‘not at all’ to ‘extremely’ (reversed stats), the means across all questions [Q4, 4a, 4b, 4c, 4d] range from a low of 3.84 to a high of 4.29.



Q5 Wellbeing Over the Last Month

| Measurement Tool 2 | Mean | Measurement Tool 3 | Mean | Measurement Tool 4 | Mean | Measurement Tool 5 | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Q5a | 3.51 | Q7a | 3.47 | Q7a | 3.58 | Q15 | 3.70 |
| Q5b | 3.52 | Q7b | 3.39 | Q7b | 3.58 | Q16 | 3.62 |
| Q5c | 4.25 | Q7c | 3.96 | Q7c | 3.91 | Q17 | 4.00 |
| Q5d | 3.09 | Q7d | 3.08 | Q7d | 3.11 | Q18 | 3.22 |

Examining the aggregated responses on questions relating to wellbeing over the last month, the initial survey responses (tool 2) revealed variations in their distribution, with a range from a low of 3.09 to a high of 4.25. For all survey rounds there is some variation in responses between the first and follow-up surveys. On a 1-5 Likert scale ranging from ‘not at all’ to ‘extremely’ (reversed stats), the means across all questions [Q5a, 5b, 5c, 5d] range from a low of 3.08 to a high of 4.25.



Q6 Neighbourhood

| Measurement Tool 2 | Mean | Measurement Tool 3 | Mean | Measurement Tool 4 | Mean | Measurement Tool 5 | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Q6a | 3.93 | Q1a | 4.04 | Q5a | 4.03 | Q5 | 4.05 |
| Q6b | 4.03 | Q1b | 3.86 | Q5b | 3.91 | Q6 | 3.93 |
| Q6c | 4.11 | Q1c | 4.04 | Q5c | 3.93 | Q7 | 4.04 |
| Q6d | 3.44 | Q1d | 3.36 | Q5d | 3.45 | Q8 | 3.46 |
| Q6e | 3.45 | Q1e | 3.32 | Q5e | 3.48 | Q9 | 4.20 |
| Q6f | 3.13 | Q1f | 4.10 | Q5f | 4.14 | Q10 | 3.65 |
| Q6g | 3.78 | Q1g | 3.68 | Q5g | 3.83 | | |
| Q6h | 3.68 | Q1h | 3.63 | Q5h | 3.70 | | |

Examining the aggregated responses on questions relating to living in the respective neighbourhood, the initial survey responses (tool 2) revealed variations in their distribution, with a range from a low of 3.13 to a high of 4.11. For all survey rounds there is some variation in responses between the first and follow-up surveys. On a 1-5 Likert scale ranging from ‘not at all’ to ‘extremely’ (reversed stats), the means across all questions [Q6a, 6b, 6c, 6d] range from a low of 3.13 to a high of 4.14.



Q7 Community

| Measurement Tool 2 | Mean | Measurement Tool 3 | Mean | Measurement Tool 4 | Mean | Measurement Tool 5 | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Q7a | 3.79 | Q2a | 3.74 | Q6a | 3.68 | Q11 | 3.74 |
| Q7b | 3.96 | Q2b | 3.85 | Q6b | 3.99 | Q12 | 4.04 |
| Q7c | 3.18 | Q2c | 3.19 | Q6c | 3.30 | Q13 | 3.26 |
| Q7d | 3.33 | Q2d | 3.37 | Q6d | 3.48 | Q14 | 3.54 |

Examining the aggregated responses on questions relating to the community, the initial survey responses (tool 2) revealed variations in their distribution, with a range from a low of 3.18 to a high of 3.96. For all survey rounds there is some variation in responses between the first and follow-up surveys. On a 1-5 Likert scale ranging from ‘not at all’ to ‘extremely’ (reversed stats), the means across all questions [Q7a, 7b, 7c, 7d] range from a low of 3.18 to a high of 3.99.



Q8 General Sense of Wellbeing

| Measurement Tool 2 | Mean | Measurement Tool 3 | Mean | Measurement Tool 4 | Mean | Measurement Tool 5 | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Q8a | 3.33 | Q8a | 3.41 | Q8a | 3.40 | Q19 | 3.39 |
| Q8b | 3.07 | Q8b | 3.21 | Q8b | 3.27 | Q20 | 3.39 |

Examining the aggregated responses on questions relating to the general sense of wellbeing, the initial survey responses (tool 2) revealed variations in their distribution, with a range from a low of 3.07 (3 months ago) to a high of 3.33 (now). For all survey rounds there is some slight variation in responses between the first and follow-up surveys, but a consistent pattern. On a 1-5 Likert scale ranging from ‘poor’ to ‘excellent’ (reversed stats), the means across questions [Q8a, 8b] range from a low of 3.07 to a high of 3.41.



5.4.2 Survey results by variables
This section outlines findings based on key variables, in particular gender, language spoken, birthplace and age group.

QUESTION 1: IMPACT OF PHYSICAL HEALTH ON SOCIAL ACTIVITIES

[Table: mean responses for Tools 2, 3, 4 and 5, broken down by gender (M/F), language spoken (English only/LOTE), birthplace (Australia/outside Australia) and age group (18-19, 20-24, 25-34, 35-44, 45-54, 55-64), for each Question 1 item: ‘Have physical health problems interfered with my usual social activities over the last month?’; ‘Reduced the overall amount of time spent?’; ‘Achieved less than I would have liked?’; ‘Didn’t do them as well or carefully as usual?’; ‘Had difficulty because of physical pain?’]

* Note: Wording of questions from Tool 2. Some amendments to wording of these questions in Tools 3, 4 & 5.


QUESTION 2: IMPACT OF EMOTIONAL ISSUES ON SOCIAL ACTIVITIES

[Table: mean responses for Tools 2, 3, 4 and 5, broken down by gender (M/F), language spoken (English only/LOTE), birthplace (Australia/outside Australia) and age group (18-19 to 55-64), for each Question 2 item: ‘Have emotional issues interfered with my usual social activities over the last month?’; ‘Reduced the overall amount of time spent?’; ‘Achieved less than I would have liked?’; ‘Didn’t do them as well or carefully as usual?’; ‘Found myself feeling anxious?’; ‘Found myself feeling sad?’]

* Note: Wording of questions from Tool 2. Some amendments to wording of these questions in Tools 3, 4 & 5.

QUESTION 3: IMPACT OF PHYSICAL HEALTH ON WORK

[Table: mean responses for Tools 2, 3, 4 and 5, broken down by gender (M/F), language spoken (English only/LOTE), birthplace (Australia/outside Australia) and age group (18-19 to 55-64), for each Question 3 item: ‘Have physical health problems interfered with my usual work or other activities over the last month?’; ‘Reduced the overall amount of time spent?’; ‘Achieved less than I would have liked?’; ‘Didn’t do them as well or carefully as usual?’; ‘Had difficulty because of physical pain?’]

* Note: Wording of questions from Tool 2. Some amendments to wording of these questions in Tools 3, 4 & 5.


QUESTION 4 IMPACT OF EMOTIONAL ISSUES ON WORK

[Table: response means for Tools 2–5, broken down by gender, language, birthplace and age group, for the items 'Have emotional issues interfered with my usual work or other activities over the last month?', 'Reduced the overall amount of time spent?', 'Achieved less than I would have liked?', 'Didn't do them as well or carefully as usual?', 'Found myself feeling anxious?' and 'Found myself feeling sad?'. The per-cell means are not recoverable from this extraction. Note: wording of questions is from Tool 2, with some amendments in Tools 3, 4 & 5.]

QUESTION 5 GENERAL ATTITUDE TOWARDS LIFE OVER PAST MONTH

[Table: response means for Tools 2–5, broken down by gender, language, birthplace and age group, for the items 'Generally content with life', 'Feeling positive about the future', 'Feeling that bad things keep happening to me' and 'Feeling calm and peaceful'. The per-cell means are not recoverable from this extraction.]


QUESTION 6 FEELINGS ABOUT LIVING IN THEIR NEIGHBOURHOOD

[Table: response means for Tools 2–5, broken down by gender, language, birthplace and age group, for the items 'I generally like living here', 'I generally feel safe here', 'It is easy to get around', 'There are enough community services', 'Community services are accessible', 'I have access to further education', 'Public places are pleasant and appealing' and 'I am able to participate in arts, sports & cultural activities'. The per-cell means are not recoverable from this extraction.]


QUESTION 7 FEELINGS ABOUT THEIR COMMUNITY

[Table: response means for Tools 2–5, broken down by gender, language, birthplace and age group, for the items 'I feel accepted', 'I feel culturally respected', 'I am able to have a say about community matters' and 'I feel pride in my community'. The per-cell means are not recoverable from this extraction.]

QUESTION 8 GENERAL SENSE OF WELLBEING

[Table: response means for Tools 2–5, broken down by gender, language, birthplace and age group, for the items 'How do I rate my general sense of wellbeing now?' and 'How does this compare to 3 months ago?'. The per-cell means are not recoverable from this extraction.]

These tables present the response means for each of the tools across each of the socio-demographic variables (gender, language spoken, birthplace and age) for each of the questions. The response means allow Windermere client responses to be compared with the equivalent tool socio-demographic matrix responses, both laterally within a tool administration and longitudinally across tool iterations.



5.4.3 Results of the correlational process for the reduction of questions for measurement tool 5

A fundamental element to be addressed in the refinement of the wellbeing instrument was the reduction in the number of questions from the 40 that featured in all iterations of the tool, from the pilot through to measurement tool 4. A key consideration in the reduction process was that the holistic understanding of subjective wellbeing (physical health, emotional health, social activities, work activities, neighbourhood, community and general sense of wellbeing) that marked the 40-question instrument should remain in the shortened version.

The process of deleting items was undertaken by systematically employing four methods/stages based on SPSS statistical analysis. These were: • Stage 1, the deletion of items based on Cronbach's Alpha, a measure of internal consistency (inter-item reliability). Q5c and Q8b were identified as reducing the reliability of their question subtype, and excluding both items changed the overall reliability. Although the improvement was very small and not significant, it indicated that the questionnaire could be improved without these items. With Q5c and Q8b excluded, the explained variance also increased, with a larger cumulative percentage (75.63% as opposed to the original 72.85%). The factor loading structure did not change, providing a strong rationale for excluding these two items.
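The Stage 1 reliability check can be sketched in code. The following is a minimal, self-contained illustration of Cronbach's alpha and the "alpha if item deleted" diagnostic, using hypothetical 1–5 responses rather than the study's SPSS output; the flagged item here is illustrative only.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a scale given one list of responses per item."""
    k = len(items)
    item_var_sum = sum(pvariance(col) for col in items)
    totals = [sum(resp) for resp in zip(*items)]  # per-respondent totals
    return (k / (k - 1)) * (1 - item_var_sum / pvariance(totals))

def alpha_if_deleted(items):
    """Recompute alpha with each item removed in turn; items whose removal
    raises alpha are candidates for deletion (as Q5c and Q8b were here)."""
    return [cronbach_alpha(items[:i] + items[i + 1:]) for i in range(len(items))]

# Hypothetical 1-5 responses for three items from five respondents;
# the third item is reverse-keyed noise, so removing it should raise alpha.
responses = [
    [1, 2, 3, 4, 5],
    [1, 2, 3, 4, 5],
    [5, 4, 2, 2, 1],
]
```

Running `alpha_if_deleted(responses)` shows a markedly higher alpha with the third column removed than `cronbach_alpha(responses)` gives for the full set, which is the deletion signal used in this stage.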

• Stage 2, the deletion of items based on their smaller factor loadings. The lowest-loading item for each factor was identified (Q1d, Q2d, Q5c, Q6b, Q8b), and the second-lowest-loading items were also noted (Q1b, Q2, Q6a, Q7a, Q8a). Items were excluded that had a very high (>.9) inter-item correlation (multicollinearity) or a substantially low inter-item correlation (suggesting the item was not truly valid), together with a smaller factor loading on the component. For example, Q1d and Q6b were identified as problematic on the basis of their inter-item correlations and smaller factor loadings in comparison to other items. Q3b and Q3c had a shared correlation of r = .937, so Q3b, having the smaller factor loading on component two, was deleted. Q4a and Q4b had a shared correlation of r = .91, and Q4b and Q4c of r = .906; the same method was used, checking the factor loadings and eliminating the items with the smaller loadings (Q4a and Q4b). Cronbach's alpha was re-checked after each deletion. The factor analysis was rerun and showed that, with Q1d, Q6b, Q3b, Q4a and Q4b excluded, the explained variance increased and the structure of the factor analysis did not change.
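The inter-item correlation screen in this stage can be sketched as follows. This is an illustrative plain-Python version, not the SPSS procedure used in the study, with hypothetical response columns standing in for items such as Q3b and Q3c:

```python
from itertools import combinations
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two response columns."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def redundant_pairs(named_items, cutoff=0.9):
    """Flag item pairs correlated above `cutoff` (possible multicollinearity);
    of each flagged pair, the item with the smaller factor loading is dropped."""
    flagged = []
    for (name_a, xs), (name_b, ys) in combinations(named_items, 2):
        r = pearson(xs, ys)
        if abs(r) > cutoff:
            flagged.append((name_a, name_b, r))
    return flagged

# Hypothetical columns; the first two mimic a highly correlated pair like Q3b/Q3c
named_items = [("Q3b", [1, 2, 3, 4, 5]),
               ("Q3c", [1, 2, 3, 4, 4]),
               ("Q6b", [3, 1, 4, 2, 5])]
flagged = redundant_pairs(named_items)  # flags only the Q3b/Q3c pair
```

The factor-loading comparison used to choose which member of a flagged pair to delete is not reproduced here, as it depends on the factor analysis output.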

• Stage 3, the deletion of items based on inter-item correlations using clinical judgment. This identified overly related items that might indicate multicollinearity (>.9), or potentially very weak items that did not relate well to more than two other items (i.e., <.08 for several items: Q1b, Q1d, Q6c, Q6d, Q6f, Q6g, Q7c, Q8b). At this stage, items were deleted primarily on the basis of their inter-item correlations. Within question two, the inter-item correlations were substantially higher than for other questions, so the deletion of two of these items was warranted, the items being interpreted as conveying the same information (potential slight multicollinearity). The items were deleted on the basis of their lower factor loadings on the factor: for component 1, Q2 and Q2d. The same method was used for component 2 (physical health), as this factor had 10 questions addressing the topic. Deletion here was based on the poorest factor loadings and high inter-item correlations with items in the same factor, and included Q3d (a smaller factor loading than all other Q3 items). The results showed that this did not affect the reliability or the explained variance/factor structure of the factor analysis.



• Stage 4, the deletion of items based on a larger cut-off point for inter-item correlations (.80). This was done manually and identified a total of 25 pairs of items with a correlation of over .80. Of these, 11 pairs were from question 4 and 5 were from question 2 (these two groups of questions combined make up component 1), so removing further items from this component could shorten the questionnaire while maintaining reliability and explained variance. Items were removed according to how commonly they related to multiple other items within that question group. This resulted in Q7a being deleted: it correlated above .8 with Q7b and had the lower factor loading, and was therefore eliminated.
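The Stage 4 pass, enumerating over-cutoff pairs and removing the items most entangled with others, can be sketched as a simple ranking. The pair list below reuses the correlations reported in the text, except the Q7a/Q7b value, which is illustrative:

```python
from collections import Counter

def rank_by_entanglement(high_pairs):
    """Count how many over-cutoff partners each item has; the most entangled
    items are the first candidates for removal (ties would be broken by
    factor loading, not shown here)."""
    counts = Counter()
    for item_a, item_b, _r in high_pairs:
        counts[item_a] += 1
        counts[item_b] += 1
    return counts.most_common()

# Correlations reported in the text (the Q7a/Q7b figure is illustrative)
high_pairs = [("Q3b", "Q3c", 0.937),
              ("Q4a", "Q4b", 0.910),
              ("Q4b", "Q4c", 0.906),
              ("Q7a", "Q7b", 0.820)]
ranking = rank_by_entanglement(high_pairs)  # Q4b appears in two pairs
```

In this toy ranking Q4b tops the list because it participates in two over-cutoff pairs, mirroring how items related to multiple others were prioritised for removal.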

Detailed matrices of the processes and findings are at Appendix L.

5.5 Summary

This chapter has described the various phases of the survey development, including the iterative process undertaken to refine the survey, the incorporation of feedback at each phase, the statistical analyses undertaken at each phase and the overall findings.

Descriptive statistical analyses in the form of means, t-tests and analysis of variance were used to identify changes in responses across the survey iterations. In addition, the administration of socio-demographic questions alongside the survey instrument itself facilitated t-tests and analyses of variance across participant socio-demographic variables, for example gender, language spoken, birthplace and age group. Cronbach's Alpha analysis was employed to examine internal consistency at both the initial and final phases, and correlational analysis was applied in the process of reducing the number of questions, resulting in a final, abridged version of the wellbeing measurement tool.

The next chapter provides discussion and recommendations concerning the wellbeing measurement tool, in particular around aspects of value and confidence in the tool, technical considerations of use, implementation and dissemination.



6. DISCUSSION AND RECOMMENDATIONS

6.1 The value of a wellbeing measurement tool

Implicit in the development of a wellbeing measurement tool are the assumptions that wellbeing is worth measuring, that it can be measured, and that this can be beneficial. These assumptions are, each in their own way, contentious. Investing in measuring wellbeing requires justification. While the research project does not set out to provide that justification, it has proceeded on the basis – forged through the collaborative nature of the project – that improving the wellbeing of clients is in accord with the missions and values of the community service sector. Adopting a holistic approach to wellbeing, which has been the intent of the project, helps focus attention on a range of life domains and is consequently associated with person-centered service provision that recognises the impact of contextual factors.

Designing a tool for measuring wellbeing has required producing a framework that encourages consideration of the range of life domains while still leaving it open to the client themselves to bring their own interpretations and perceptions to their assessment of their state of being. There is an inevitable subjectivity in the measurement of subjective wellbeing. These considerations relate to the question of outcomes. There is a necessary debate about the advantages and risks of the community service sector becoming more outcomes driven. Critiques become more strident the more that outcome measurement is linked to performance management and payment by results.

The development of a wellbeing measurement tool sits at the heart of a paradox, as articulated by a fierce critic of the increasing use of outcomes in public policy: “Information about outcomes can either be simple, comparable and efficient to collect, or it can be a meaningful picture of how outcomes are experienced by people. It cannot be both.” (Lowe, 2013:214). As a consequence of this inbuilt flaw, it is argued, “an outcomes based approach distorts the priorities and practices of those tasked with delivering such intervention and results in worse outcomes for the people who are supposed to benefit” (Lowe, 2013:216).

The wellbeing measurement tool developed in this project does aim to produce information that is both meaningful and efficient to collect. In that sense, it represents a cross-over in traditional understandings of what complex states, wellbeing in this instance, can be measured and how. It remains to be seen whether, through its implementation, the predicted distortions in organisational priorities and practices necessarily follow, or whether there may be a way out of the paradox.

6.2 Confidence in the wellbeing measurement tool

The final, abridged version of the wellbeing survey for use by Windermere is included as Appendix P. The research processes that have been employed in the development and refinement of the tool offer confidence in its legitimacy. Systematic administration of the tool and its revised versions, analysis of quantitative and qualitative data, and timely consultation with key stakeholder groups (clients and staff) represent a robust procedure for producing the abridged and more accessible survey instrument.

The resulting tool is one which, although drawing from a number of wellbeing measures across different disciplines, has been shown to possess a high degree of internal consistency across its measures. It is a tool which draws upon larger-scale trialling across broadly diverse samples, with enhanced representativeness of the intended Windermere client base. It is an instrument that has been tested for variance across relevant socio-demographic factors. And it is an instrument that is more easily administered alongside normal client intake/review procedures.

6.3 Technical considerations in using the wellbeing measurement tool

There are two main methodological considerations to bear in mind when using the wellbeing measurement tool. The first is that the relationship recorded in different administrations of the tool involves both an independent variable (the provision of Windermere services) and a dependent variable (the client's wellbeing responses). As such, the results of different iterations of the survey are interpreted through observed change in individual question and aggregate response means. This change can be tested for statistical significance through a number of analyses, including t-tests and ANOVA. These measures, even where statistically significant, must be understood as measures of association, not causality.
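As an illustration of how such change in response means can be assessed, the sketch below computes a Welch two-sample t statistic between hypothetical intake and review ratings; it stands in for the SPSS analyses described here (degrees of freedom and p-value omitted) and, as noted, indicates association only.

```python
from math import sqrt
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic comparing mean responses across two administrations
    (unequal variances allowed; df and p-value omitted for brevity)."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    return (mean(sample_a) - mean(sample_b)) / sqrt(va / na + vb / nb)

# Hypothetical 1-5 wellbeing ratings at intake and at a later review
intake = [3, 3, 4, 2, 3, 4]
review = [4, 3, 5, 4, 4, 5]
t = welch_t(review, intake)  # positive t: review means are higher
```

Even a clearly positive t here would show only that later responses differ from earlier ones, not that Windermere services caused the difference.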



The second main consideration is differences in response means across socio-demographic variables, including gender, spoken language, birthplace and age. While not statistically robust enough to stand as a formal norming of the tool (for example, the small numbers of male respondents in the 45-54 and 55-64 age groups preclude this), the response means identified in chapter 5 facilitate the identification and comparison of Windermere client responses with the equivalent tool socio-demographic matrix responses.

The final wellbeing measurement tool provided with this report is the outcome of a research design involving quantitative and qualitative methods to develop and refine a Wellbeing Survey for Windermere. In the course of its adoption, Windermere may choose to make stylistic changes to the Survey according to ongoing feedback and to suit different forms of media. The internal rigour of the Survey can be maintained if no changes are made to the wording and ordering of questions, or to the measuring scales employed.

6.4 Implementation of the wellbeing measurement tool

Implementation by a community services provider of an instrument such as the wellbeing measurement tool is assisted greatly when it has a clear and explicit organisational mandate. In its most recent Strategic Plan, Windermere has set five priorities for 2013–16. One of these is to “improve lives” and an element of the plan here is to “implement the Wellbeing Measurement Tool to ensure we make a difference” (Windermere, 2013). Such commitment by the senior executive of the organisation, written into its formal statement of strategic direction, is an important step in securing future use of the tool.

It is evident from studies of the implementation of research and evaluation into organisational policy, practice and decision-making that challenges exist at a number of levels. Locally, there are illustrations of this in respect of the Community Indicators project (Cox et al, 2010) and the Outcomes Star™ project (Harris & Andrews, 2013). Two sets of challenges are highlighted for present purposes: one concerns infrastructure capacity; the other is more cultural, relating to engagement and staff capacity.

A central aim of the project has been to anticipate and respond to these two sets of challenges in the development and refinement of the instrument itself. To a great extent, demands on infrastructure capacity have to do with ease of administration. The abridged



version of the wellbeing survey offers a form of data collection that is more readily integrated into organisational processes. Consultation with the information technology officer for Windermere has enabled early consideration of the survey being produced in electronic as well as hard copy format. It is anticipated that the form of the survey will also be suitable for conversion for use on mobile devices and ideally with touch screen functionality. A vital next step will be instituting processes whereby the generated data can be securely stored and efficiently analysed.

The important lesson from the project has been to engage with relevant staff within the organisation as the instrument was being developed. As well as the information technology section, assistance was also forthcoming from the marketing section of the organisation on style and design. While the instrument was being developed by researchers at RMIT University, a lot of attention was paid to ensuring it would be ‘owned’ by Windermere once produced. In the early stages, members of the research team presented the project to Windermere’s Board of Management. Ongoing communication with not only the strategic management group and organisational service areas but also the front-line program managers was crucial. The amount of time involved in refining the instrument, with lengthy periods where nothing tangible was evident for the organisation, did make it hard to maintain interest and enthusiasm. In efforts to counter this, the project research officer would spend time at the main office of the organisation and meet informally with staff to sustain familiarity with the project. The research officer also attended formal manager meetings to provide briefings and updates, and to consult on various aspects of the project development work. At the early stage of the project, the research officer also met with client representatives to gain feedback on the tool’s development but also to provide a face for this research to the ‘end users’.

The strategic mandate, collaboration with organisational service areas and engagement with staff at differing levels of the organisation have all contributed to establishing a conducive environment for the future implementation of the wellbeing measurement tool. Discussions will need to ensue within the organisation’s executive as to the next phases of implementation. Windermere has invested in this tool as a vehicle to assist the organisation evaluate the extent to which it is ‘making a difference’ in improving lives. As with most measurement instruments, the tool can be employed in a variety of ways and there will need to be careful consideration of its introduction to meet desired purposes.



It is recommended that the survey be administered in the first instance to program areas within Windermere that have demonstrated the keenest interest in using the tool and a positive attitude to its potential to generate a new kind of information to guide policy and practice. Logistical and technical issues can be identified and resolved through such trialling. The holistic, generalised approach to wellbeing embodied in the survey is at some remove from the more concrete, programmatic outcomes with which front-line managers and practitioners would be more acquainted. Furthermore, care would need to be taken not to over-simplify attribution of ‘impact’, or lack of impact, on wellbeing. While the tool can be used within targeted and rigorously designed effectiveness studies, it seems unlikely that this will become its most common form of use. Perhaps more likely is that the tool will be used to provide information to the organisation on self-reported levels of client wellbeing across a range of life domains. Such information can be considered alongside other modes of reporting, accountability and evaluation available to the organisation, and thus contribute to Windermere’s narrative on the difference it is making and to its decision-making on future strategic objectives.

6.5 Dissemination

The wellbeing measurement tool has been the product of several years of collaboration between researchers at RMIT University and Windermere. There has been considerable turnover of staff in both organisations during that time, including a change of CEO at Windermere and the retirement of a lead professor at RMIT. Yet the project has been brought to successful fruition and merits wider dissemination. There have also been changes in the policy context for the community sector, which has seen a growing focus on outcomes and their measurement, consequently creating an incipient interest in the work of this project.

The wellbeing measurement tool produced through this project is novel in significant respects. It has been designed specifically for a community service organisation that is committed to enhancing its accountability in its mission to improve lives, and to generating information that will help guide its plans for doing so. While the tool has been informed by prior work conducted by Community Indicators Victoria and the RAND Health medical outcomes study, the final abridged version represents a new addition to the field of wellbeing measurement. And this is a tool that aims to resolve a paradox in that field by measuring meaningful outcomes through simple, efficient and feasible data collection in the community sector.



To date, Windermere and RMIT University have presented the work of this project at the international conference ‘Social Work, Education and Social Development’, held in Melbourne, July 2014. Further opportunities to present the work to academic and professional audiences, national and international, will be sought. Interest will be gauged in the production of shorter versions of the research report – for use with practitioner staff or with service clients, for example – and such documentation will be generated collaboratively between Windermere and RMIT as required. It is also intended that the project work shall be disseminated through scholarly publishing outlets. Local dissemination will continue to occur through membership of community sector networks. Particular attention will be given to opportunities to contribute to the Victorian Government Service Sector Reform process that is seeking to enhance an outcome orientation within the sector.

6.6 Concluding comments

The current project, supported through funding from the William Buckland Foundation (ANZ Trustees), has built upon the earlier pilot project, funded by the Helen Macpherson Smith Trust. The purpose of the project was to develop and refine a wellbeing measurement tool, produced and trialled in the pilot project, which would enable Windermere to collate meaningful data on client subjective wellbeing and which would be easy to administer. A measurement tool has been produced that can capture measures on a range of life domains to afford a holistic sense of subjective wellbeing, while being a tool that is relatively short and accessible in language and layout. The development of the tool involved a series of tests for reliability and validity. A robust process was also adopted to shorten the survey whilst maintaining its trustworthiness.

There have been challenges over a period of several years to arrive at this point. The later stages in the development of the tool proved particularly time consuming. A key feature of the project has been a conscious intent to sustain a collaborative approach with the host community service organisation, Windermere, both staff and clients. Given the period that has elapsed from the initial idea for such a research instrument to its final delivery as a wellbeing measurement tool, some six years overall, there has inevitably been significant turnover in the personnel immediately engaged in the project. It is noteworthy that the commitment by Windermere has remained strong, to the extent



that the implementation of the wellbeing measurement tool is written into their most recent strategic plan.

As the wellbeing measurement tool is handed over to Windermere, an exciting and equally challenging new phase arises. It is hoped that the tool will assist the organisation, and perhaps others in the community services sector, in matters of accountability, evaluation and organisational policy and practice. A considered approach to a stronger focus on meaningful outcomes, guiding engagement with populations that are vulnerable and disadvantaged, will contribute to Windermere’s vision to “get in early to make a difference in the lives of individuals, families and communities” (Windermere, 2013).



APPENDICES

APPENDIX A

BIBLIOGRAPHY

ABS. 2001. Measuring wellbeing: Frameworks for Australian social statistics. Catalogue No. 4160.0. Canberra: Australian Bureau of Statistics.
ANNAS, J. 1993. The morality of happiness. Oxford: Oxford University Press.
ARGYLE, M. 1999. Causes and correlates of happiness. In: KAHNEMAN, D. (ed.) Well-being: The foundations of hedonic psychology. New York: Russell Sage Foundation.
BAYSIDE CITY COUNCIL. 2012. Wellbeing for all ages and abilities strategy: 2013-17. Sandringham, Vic: Bayside City Council.
BLOOM, M., FISCHER, J. & ORME, J. 2009. Evaluating practice: Guidelines for the accountable professional (6th edn.). Boston: Allyn and Bacon.
BRICKMAN, P. & CAMPBELL, D. T. 1971. Hedonic relativism and planning the good society. In: APPLEY, M. H. (ed.) Adaptation level theory: A symposium (pp. 287-302). New York: Academic Press.
BURNS, G. W. 2005. Naturally happy, naturally healthy: the role of the natural environment in well-being. In: HUPPERT, F. A., BAYLIS, N. & KEVERNE, B. (eds.) The science of well-being. Oxford: Oxford University Press.
CALMAN, K. C. 1984. Quality of life in cancer patients - an hypothesis. Journal of Medical Ethics, 10, 124-127.
CASEY CITY COUNCIL. 2014. Demographics, http://www.casey.vic.gov.au/council/aboutcasey/demographics, accessed 27/10/2014.
CLARK, D. & GOUGH, I. 2005. Capabilities, needs, and wellbeing: Relating the universal and the local. In: MANDERSON, L. (ed.) Rethinking wellbeing. Netley: Griffen Press.
COX, C., FRERE, M., WEST, S. & WISEMAN, J. 2010. Developing and using local community wellbeing indicators: Learning from the experience of Community Indicators Victoria. Australian Journal of Social Issues, 45, 71-88.
CUMMINS, R. A. 2001. Self-rated quality of life scales for people with an intellectual disability: A reply to Ager and Hatton. Journal of Applied Research in Intellectual Disabilities, 14, 1-11.
CUMMINS, R. A. 2002. The validity and utility of subjective well-being: A reply to Hatton and Ager. Journal of Applied Research in Intellectual Disabilities, 15, 261-268.
CURELLA, L. S. 1999. Routine outcome assessment in mental health research. Current Opinion in Psychiatry, 12, 207-210.
DHS. 2011. Department of Human Services standards. Melbourne, Vic: Victorian Government Department of Human Services.
DIENER, E. 2008. Myths in the science of happiness, and directions for future research. In: EID, M. & LARSEN, R. J. (eds.) The science of subjective well-being. New York: Guilford Publications.
DIENER, E., LUCAS, R. E. & SCOLLON, C. N. 2006. Beyond the hedonic treadmill: Revising the adaptation theory of well-being. American Psychologist, 61, 305-314.



DIENER, E., LUCAS, R. E. & SMITH, H. L. 1999. Subjective well-being: Three decades of progress. Psychological Bulletin, 125(2), 276-302.
DIENER, E. & SUH, E. M. 2000. Measuring subjective well-being to compare the quality of life of cultures. In: DIENER, E. & SUH, E. M. (eds.) Culture and subjective well-being. Cambridge, MA: The MIT Press.

HARRIS, L. & ANDREWS, S. 2013. Implementing the Outcomes Star well in a multi-disciplinary environment. St Kilda, Vic: The Salvation Army, Crisis Services Network.
HEALEY, J. (ed.) 2008. Happiness and life satisfaction: Issues in society. Thirroul, NSW: The Spinney Press.
HELLIWELL, J. F. 2003. How's life? Combining individual and national variables to explain subjective well-being. Economic Modelling, 20, 331-360.
HELLIWELL, J. F. & PUTNAM, R. D. 2005. The social context of well-being. In: HUPPERT, F. A., BAYLIS, N. & KEVERNE, B. (eds.) The science of well-being. Oxford: Oxford University Press.
HOLCOMB, W. R., PARKER, J. C., LEONG, G. B., THIELE, J. & HIGDON, J. 1998. Customer satisfaction and self-reported treatment outcomes among psychiatric patients. Psychiatric Services, 49, 929-934.
INTERNATIONAL WELLBEING GROUP. 2006. Personal Wellbeing Index. Melbourne: Australian Centre on Quality of Life, Deakin University (http://www.deakin.edu.au/research/acqol/instruments/wellbeing_index.htm).
JORDAN, B. 2007. Social work and well-being. Lyme Regis, Dorset: Russell House Publishing.
KATSCHNIG, H. 2006. How useful is the concept of quality of life in psychiatry? In: KATSCHNIG, H., FREEMAN, H. & SARTORIUS, N. (eds.) Quality of life in mental disorders. Chichester: John Wiley & Sons Ltd.
LARSEN, R. J. & EID, M. 2008. Ed Diener and the science of subjective well-being. In: EID, M. & LARSEN, R. J. (eds.) The science of subjective well-being. New York: Guilford Publications.
LASALVIA, A. & RUGGERI, M. 2006. Quality of life in mental health service research. In: KATSCHNIG, H., FREEMAN, H. & SARTORIUS, N. (eds.) Quality of life in mental disorders. Chichester: John Wiley & Sons Ltd.
LOWE, T. 2013. New development: The paradox of outcomes - the more we measure, the less we understand. Public Money and Management, 33, 213-216.
NUSSBAUM, M. & SEN, A. 1993. The quality of life. Oxford: Clarendon Press.
NUSSBAUM, M. 1999. Sex and social justice. New York: Oxford University Press.
NUSSBAUM, M. 2000. Women and human development. New York: Cambridge University Press.
OLIVER, J., HUXLEY, P., BRIDGES, K. & MOHAMAD, H. 1996. Quality of life and mental health services. London: Routledge.
OECD. 2009. Society at a glance 2009: OECD social indicators. OECD.
PAVOT. 2008. The assessment of subjective well-being: Successes and shortfalls. In: EID, M. & LARSEN, R. J. (eds.) The science of subjective well-being. New York: Guilford Publications.



RUGGERI, M., BISOFFI, G., FONTECEDRO, L. & WARNER, R. 2001. Subjective and objective dimensions of quality of life in psychiatric patients: A factorial approach. The South Verona Outcome Project 4. British Journal of Psychiatry, 178, 268-275.
RUGGERI, M., NOSE, M., BONETTO, C., CRISTOFALO, D., LASALVIA, A., SALVI, G., STEFANI, B., MALCHIODI, F. & TANSELLA, M. 2005. The impact of mental disorders on objective and subjective quality of life: A multiwave follow-up study. British Journal of Psychiatry, 187, 121-130.
SHELDON, K. M. & LYUBOMIRSKY, S. 2004. Achieving sustainable new happiness: Prospects, practices, and prescriptions. In: LINLEY, P. A. & JOSEPH, S. (eds.) Positive psychology in practice. Hoboken, NJ: Wiley.
SHERGOLD, P. 2013. Service sector reform: A roadmap for community and human services reform. Melbourne, Vic: Victorian Government.
STIGLITZ, J., SEN, A. & FITOUSSI, J. 2009. Report by the Commission on the Measurement of Economic Performance and Social Progress. www.stiglitz-sen-fitoussi.fr/
STRACK, F., ARGYLE, M. & SCHWARZ, N. (eds.) 1991. Subjective well-being: An interdisciplinary perspective. Oxford, UK: Pergamon Press.
WARE, J. E., KOSINSKI, M., GANDEK, B. G., AARONSON, N., ALONSO, J., APOLONE, G., BECH, P., BRAZIER, J. et al. 1998. The factor structure of the SF-36 health survey in 10 countries: Results from the International Quality of Life Assessment (IQOLA) project. Journal of Clinical Epidemiology, 51, 1159-1165.
WHYTE, J. D. & JONES, M. 2011. Development of a wellbeing measurement tool in the community services sector. Final report to the Helen MacPherson Smith Trust, RMIT University, Melbourne, Australia.
WILSON, W. 1967. Correlates of avowed happiness. Psychological Bulletin, 67, 294-306.
WINDERMERE. 2012. Windermere annual report 2011/12. Narre Warren, Victoria.
WINDERMERE. 2013. Windermere: Our way forward - 2013-2016 strategic plan. Narre Warren, Victoria.
WINDERMERE. 2014. Narre Warren, Victoria, accessed 24 October 2014, http://www.windermere.org.au/
WISEMAN, J. & BRASHER, K. 2008. Community wellbeing in an unwell world: Trends, challenges and possibilities. Journal of Public Health Policy, 29, 353-366.



APPENDIX B

WINDERMERE WELLBEING PILOT SURVEY





APPENDIX C

SUMMARY OF WELLBEING MEASUREMENT INITIATIVES

A number of initiatives measure different aspects of well-being, varying across scales and contexts. For simplicity, the focus of this section is on those identified within Australia and other predominantly ‘English-speaking countries’ such as Canada, the United Kingdom, and the United States. That said, it should be acknowledged that considerable work on well-being occurs in other places, both in isolation and in conjunction with work in ‘English-speaking countries’. For example, one notable attempt at research synthesis is the ‘World Database of Happiness’, originating in 1984 from Erasmus University in Rotterdam, Netherlands. Below is a list of some of these initiatives at different geographic scales.

2.1 Australia – National
a. Australian Unity Well-being Index
b. Australian National Development Index (ANDI) - in development
c. Measures of Australia’s Progress
d. Measuring Progress for Children
e. State of the Environment (SoE)
Other:
f. The Herald/Age Lateral Economics Index of Australia’s Wellbeing

2.2 Australia – Victoria
a. Growing Victoria Together
b. Indicators of Community Strength
c. Victorian Community Indicators Project

2.3 New Zealand – National and others
a. Happiness in New Zealand
b. Progress Initiatives in New Zealand
c. Quality of Life ‘07 in Twelve of New Zealand’s Cities
d. Social Indicators for New Zealand

2.4 Canada – National and others
a. The Canadian Index of Wellbeing
b. Happiness in Canada
c. The Index of Economic Wellbeing
d. The British Columbia Atlas of Wellness



2.5 United Kingdom – National and others
a. National Well-Being Project
b. The Well-being Institute
c. The New Economics Foundation Centre for Well-being
d. The Gallup-Healthways Well-Being Index (also in US)
e. Child Well-being Index
f. Measuring Progress in Child Well-being
g. Outcomes Star*
*Note: Being adopted elsewhere, including Victoria, Australia

2.6 United States – National and others
a. American Human Development Project
b. State of the USA (possibly more economic-based)
c. Economic Opportunity Index
d. Gallup Social and Economic Analysis
e. Happiness in the USA (possibly more economic-based)
f. The Gallup-Healthways Well-Being Index (also in UK)
g. The Child and Youth Well-being Index
h. Geography Matters: Child Well-being in the States



APPENDIX D

ETHICS APPROVAL: PLAIN LANGUAGE STATEMENT
School of Global, Urban and Social Studies

GPO Box 2476, Melbourne VIC 3001, Australia

PLAIN LANGUAGE STATEMENT

Tel. +61 3 9925 2328
Fax +61 3 9925 8266
www.rmit.edu.au

Project Title:

Development and Refinement of a Wellbeing Measurement Tool in the Community Services Sector

Investigators:

Associate Professor Martyn Jones Emeritus Professor Catherine McDonald Associate Professor John Whyte Mr Matthew Coote

Dear Student,

You are invited to participate in a research project being conducted by RMIT University. The project title is Development and Refinement of a Wellbeing Measurement Tool in the Community Services Sector. Please read this sheet carefully and be confident that you understand its contents before deciding whether to participate. If you have any questions about the project, please ask one of the investigators.

About the project
This is a two-year collaborative research project between RMIT University and Windermere Child and Family Services, funded through the ANZ Trustees. Researchers from the Social Work program at RMIT University have been working with Windermere to design a measurement tool which asks about a client’s sense of wellbeing. The purpose of this measurement tool is to help Community Service Organisations evaluate the impact of community service interventions (whether and how their work assists clients).

Why have you been approached?
We are asking students from the School of Global, Urban and Social Studies at RMIT University to participate in the research project. We anticipate that around 750 RMIT students will participate in the questionnaire survey stage of the project. For participants this involves:
(1) Filling out a copy of the actual wellbeing measurement tool (answering the normal questions that would be asked of clients when they first come into Windermere);
(2) Giving your view on the wellbeing measurement tool by answering some feedback questions; and
(3) Answering some basic socio-demographic questions.
We expect that the three tasks will take around twenty minutes in total.



Possible Risks and Benefits
It is unlikely, but possible, that the questions about wellbeing on the measurement tool might distress you. If that is the case, please tell us quickly and we will stop and assess whether you need any assistance. If you are unduly concerned about your responses to any of the questionnaire items or if you find participation in the project distressing, you should contact the RMIT Counselling Service by telephone on (03) 9925 4365 (business hours) or (03) 9925 3999 (after hours - 5.30pm to 8.30am) or email counselling@rmit.edu.au. Alternatively, you can contact Associate Professor Martyn Jones as soon as convenient. Martyn Jones will discuss your concerns with you confidentially and suggest appropriate follow-up, if necessary.

The benefit of this project is that researchers from RMIT University have the opportunity and the capacity to work on refining a measurement tool for Windermere and other Community Service Organisations. This will help Community Service Organisations evaluate the impact of their community service interventions. The anonymous and non-identifiable results from the project will also form the basis for publication of articles in academic and professional journals and for the subsequent production of resources for CSOs. In addition, a plain language report of the project outcomes will be available at the end of the project. This will carry the project title.

Privacy and Disclosure of Information
Completion of the survey questionnaire (three tasks) is anonymous and non-identifiable, and we ask that you ensure you do not put your name or signature anywhere. RMIT University has provisions in place to manage privacy and disclosure issues. Participation is voluntary. You are free to withdraw from the project at any time and to withdraw any unprocessed data previously supplied. Completion and return of the questionnaire is taken as consent to participate.
Further information for any questions/problems: Associate Professor Martyn Jones, School of Global, Urban and Social Studies, RMIT University, Phone No. 03 9925 3788

If you have any complaints about your participation in this project please see the complaints procedure on the Complaints with respect to participation in research at RMIT page at www.rmit.edu.au/browse;ID=2jqrnb7hnpyo

Yours sincerely

Associate Professor Martyn Jones Emeritus Professor Catherine McDonald Associate Professor John Whyte Mr Matthew Coote



APPENDIX E

BASIC QUESTIONS FOR PARTICIPANTS: RMIT STUDENTS

Project Title: Development and Refinement of a Wellbeing Measurement Tool in the Community Services Sector

1 What is your age group?
18-19 years of age
20-24
25-34
35-44
45-54
55-64
65+

2 Are you male or female?
Male
Female

3 What is your main field of university study? Please specify below
(for example, social work or youth work or urban planning and so on)

4 Are you a domestic or international student?
Domestic student
International student

5 In what country were you born? Please specify below

6 Do you speak a language other than English at home?
No, English only
Yes – please specify below

7 What is the postcode of where you currently live? Postcode:

8 Do you have children?
Yes
No

Thank you for your time



APPENDIX F

QUALITATIVE FEEDBACK FROM WINDERMERE STAFF AND CLIENTS

Windermere clients (Consumer Consultative Committee – two sessions June 2013)

The outline:
- Research officer (MC) gave a brief PowerPoint presentation (an introduction, then a basic overview of the project, and finally an invitation for feedback on aspects of the measurement tool)
- MC invited feedback on the measurement tool around the layout, content, ease of reading, words etc.
- MC handed around three different options on the layout of the tool and two options on the wording and numbering of a set of questions
- This resulted in debate and feedback

Notes of the discussion points and issues: [Red indicates initial thoughts or updated responses to feedback.]

(1) On the layout, it was agreed that blue was best (calming). However, some members also liked another option with a one-for-one alternation of colour and non-colour (white) for the questions (one on, then one off, one on, and so on), as this makes it much easier to scan across the tool and distinguish each question.
Note 1 – maybe we could use a blend of the two options

(2) On the layout, it was also agreed that the circles-within-the-box option is easier to use. This makes the tool easier to read and helps give direction as to where to tick.
Note 2 – possible blend again

(3) On the wording, it was agreed that the tool was too complicated for participants and too wordy.
Note 3 – this confirms our knowledge that we have to simplify the tool. We have been able to address this by reducing the number of questions (down from 40)

(4) On the wording and layout, one member felt that we should cluster the ‘emotional health’ questions together and likewise the ‘physical health’ questions. Each series of questions should have a title at the top to show what the series is about.
Note 4 – this is similar to the feedback that we are getting from some of the RMIT students. We have done this with the final tool



(5) On the wording and layout, it was felt that the ‘blocking’ of questions was too overwhelming and hard to read.
Note 5 – most of this has been rectified in our refined measurement tool as we reduced the number of questions

(6) On the scale, it was felt that this was confusing. Some options were suggested, including (1) numbers instead of words; (2) a slider scale; or (3) a simple tri-option of yes – maybe – no.
Note 6 – This links in with our discussion on the scale at the research catch-up. We have found that 1 to 5 is better. If we give three options, most respondents will take the middle position. Similarly, the RMIT students and social work research methods textbooks suggest a scale with numbers

(7) On the questions, some members who looked at the questions on the second page of our measurement tool (those around community and neighbourhood) felt that these were much easier to read and, if required, easier to fill out.
Note 7 – maybe because community and neighbourhood relate to something more concrete, objective and impersonal compared to feelings and emotions, which can be abstract, subjective and personal. We tested a re-ordering of the tool but found that opening with the social and emotional questions is more statistically rigorous.

(8) On the wording, scale and layout, it was agreed that people with disability, and others who have trouble with literacy, might prefer pictures and diagrams instead of words. One option put forward was a series of faces where the participant could draw a happy face, sad face and so on.
Note 8 – visualisation is something that pops up from time to time. We will consider this additional option to complement our survey or, at least, have a systematic framework in place so that we can incorporate it without losing rigour. Having said that, we have decided to go with the current tool (words and numbers) and then look at visual substitutes at a later date. Visual/pictures have not been tested

(9) On the actual application of the measurement tool, it was felt that the two points of implementation (pre and post) could be problematic and open to error due to outside influences on the day of the survey. As such, the member suggested that we should have three points (thus reducing possible error): (1) pre-program/intervention; (2) middle; then (3) upon exit.
Note 9 – not sure whether this would reduce error, as over-determinants always influence a survey as part of the push and pull. Having said that, the tool can be implemented at different points (for example, every month along the way). All surveys are impacted by outside influences
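To make the pre/post logic concrete, here is a minimal hypothetical sketch (not the project's actual scoring procedure) of how change between two administration points could be summarised for one life domain, given 1-to-5 item scores:

```python
# Illustrative sketch only: summarising pre/post wellbeing scores for one
# life domain. The item responses below are hypothetical.

def domain_mean(responses):
    """Mean of a respondent's 1-to-5 item scores within one life domain."""
    return sum(responses) / len(responses)

def change_score(pre, post):
    """Positive values indicate reported wellbeing rose between the
    two administration points (e.g. intake and exit)."""
    return domain_mean(post) - domain_mean(pre)

pre_items = [2, 3, 2, 3]   # e.g. emotional-health items at intake
post_items = [3, 4, 3, 4]  # same items at exit
print(change_score(pre_items, post_items))  # 1.0
```

A third, mid-program administration as suggested in the feedback would simply add another set of item scores to compare, without changing the calculation.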



(10) On the physical constraints of completing the measurement tool, it was agreed that some people with a disability would have trouble with the paper survey (ticking the boxes with a pen). At first glance, the logical response is that carers or relevant staff at the respective CSO could, upon instructions, assist in filling out the measurement tool. A subsequent question was asked about electronic and computer options and also ‘survey monkey’. Our initial response was that paper copies are better because (1) not everyone is able to or chooses to use a computer; and (2) computer surveys have a much lower response rate. This view is debatable. Two issues raised by both members and Windermere staff are: (1) disability clients are increasingly using computer and electronic technologies; and (2) these give clients more independence to do such a task, meaning there is less concern about privacy and bias with responses (as would be the case with a staff member). In the case of relying on a staff member filling out the survey, it was felt that (1) participants would be less likely to be negative or critical if they felt they had received poor service; and (2) staff might be tempted to manipulate results to favour their careers or the standing of the respective CSO. At the same time, we have found that some people still prefer paper surveys. We need to find a way of reconciling these two modes.
Note 10 – We have decided to go with a paper copy in the first instance but we will liaise with IT specialists at RMIT, Windermere or outside consultants if required. Note April 2014: Adrian & Ryan at Windermere have been working on technologies. The paper copy can be the entry point/a base to work from but we see a move to online

Windermere Managers, Team Leaders & other staff (2012-2014)

Feedback from the Windermere Managers, Team Leaders & other staff. This included:
- Research Officer conducting one-on-one meetings with some Managers and Team Leaders (late 2012 and early to mid-2013)
- Research Officer conducting one-on-one and small group meetings with some staff (late 2012-2014), such as the data entry person and IT & tech people
- Research Officer presenting at Windermere MAG (Management Action Group) on two occasions (March 2013 and April 2014)



Feedback broadly included:
- The survey is too long
- Some of the questions are confusing – will be too confusing for clients
- Font of pilot (measurement tool 1) was too small
- Colour of pilot (measurement tool 1) is wrong
- There seems to be repetition of some questions
- The main questions and sub-questions are asking the same thing (tools 1, 2)
- Lack of definitions. For example, wellbeing is not defined. What does work mean? Paid? Unpaid? At home? You should have a separate page with definitions
- How are you going to administer this? Paper, electronic? Some clients, such as those receiving disability services, prefer electronic – they cannot use paper
- Who fills it out? The client or case worker or both?
- Is it anonymous? Will it have a cover sheet?
- How do we know that Windermere is making a difference? What about other things going on in people’s lives?
- The survey looks boring (measurement tools 2, 3, 4)
- Some clients will find these questions intrusive and disturbing (for example, victims of crime)
- What database? Crisp?
- Have you thought about visualisation or shapes (such as faces to indicate rating)?

RMIT Marketing (Late 2013 and again early 2014)
- Surveys undertaken through RMIT (NOT our measurement tool) are constrained by RMIT regulations and guidelines. As our surveys are an official RMIT product, they must adhere to the branding templates.
- As a general rule they avoid red and other traffic-light colours (e.g. red can be emotive – it has connotations of stop, no, danger etc. and conversely green means go, amber is caution etc.). That said, it depends on whether you want to 'lead' the respondents in a particular direction. Answer: We don’t
- Plain language: RMIT consults with the study and learning centre, who provide useful advice around the appropriateness of the language for the diverse range of students. Answer: we are testing language
- Layout options – RMIT survey tools have default settings for fonts and layout options, so it is difficult to gain specific advice on this except to keep it simple
- Keep in mind that if you are running a paper-based survey and are scanning the responses as opposed to manually punching in the data, your formatting options will be limited by the scanning software



Summary Notes re Feedback on Different Versions of the Wellbeing Measurement Tool

NB: Excludes feedback from RMIT students as this is a separate document.

Aim
a) Tool needs to be easy to read and easy to fill out
b) Tool needs to be statistically valid and rigorous (we have now done this through testing)

With various versions of the measurement tool we have tested:
a) Different wording of questions
b) Different order of questions
c) Different number of questions (down from 40 on the original tool)
d) Different layout and colours (this finding more qualitative than quantitative)

Issues with layout:
a) Ideally we have all the questions on one page (psychologically it makes the tool shorter)
b) However, there are limits to keeping to a one-pager, as it can make the tool look crammed and difficult to read

As of 2014:
a) The wording of the questions has been tested
b) The order of the questions has been tested
c) We have been able to reduce the number of questions
d) We have found a preference for a scale with numbers 1 to 5
e) We have found a preference for calming colours (this finding more qualitative than quantitative)



APPENDIX G

QUALITATIVE FEEDBACK FROM RMIT STUDENTS

1. INTRODUCTION
This is an outline of additional feedback on different versions of the wellbeing measurement tool that we received from three survey rounds of RMIT students for the project entitled Development and Refinement of a Wellbeing Measurement Tool in the Community Services Sector.
- Survey Round One (Measurement Tool 2) was conducted throughout March 2013 and April 2013.
- Survey Round Two (Measurement Tool 3 and Measurement Tool 4) was conducted throughout July 2013 to October 2013.
- Survey Round Three (Measurement Tool 5) was conducted throughout September 2014 and October 2014.

In Survey Round One, Round Two and Round Three, students were asked to complete:
(i) Wellbeing Measurement Tool
(ii) Basic Socio-Demographic Questions for Participants
(iii) Feedback (forms) on the Wellbeing Measurement Tool
In total we received 543 survey responses. From the 543 participants, we also received additional feedback on aspects of the Wellbeing Survey (layout, length, wording, phrases used etc.). This feedback consists of:
(1) Feedback (forms) on the Wellbeing Measurement Tool: 183 returned
(2) Verbal feedback from students at the completion of the survey
(3) Notes and comments on measurement tools
Note 1: The feedback forms were not distributed to all participants and all classes due to time constraints in some classes.

The number of returned feedback (forms) for each version of the Wellbeing Measurement Tool is as follows:
- Survey Round One (Measurement Tool 2): 164 returned
- Survey Round Two (Measurement Tool 3): 25 returned
- Survey Round Two (Measurement Tool 4): 22 returned
- Survey Round Three (Measurement Tool 5): 19 returned



Some thought was given to quantifying the feedback from the students, but this was deemed problematic for a number of reasons:
(a) Feedback forms were not distributed to all participants and all classes due to time constraints in some classes
(b) Many of the feedback forms returned were partially filled out
(c) Some students gave feedback in a combination of multiple forms: feedback forms along with verbal comments as well as notes/comments on the measurement tool
(d) It was difficult to track respondents of feedback forms because the feedback forms were often given out at completion of the measurement tool
As such, it was felt that this student feedback could complement the qualitative feedback from Windermere managers, other Windermere staff, and Windermere clients.

2. FEEDBACK FORM FOR RMIT STUDENTS The form used to invite feedback from RMIT students is appended after Section 3.

3. FEEDBACK ON WELLBEING MEASUREMENT TOOL FROM RMIT STUDENTS This section outlines feedback on the different versions of the wellbeing measurement tool. This is structured around main themes that emerged from the feedback based on (a) the wording and phrases in use in the measurement tool; and (b) suggestions to improve the measurement tool. In support of this, select comments from the RMIT students underpin these themes.

Survey Round One (Measurement Tool 2)
Themes: Rewording/Phrasing; Layout; Miscellaneous; Repetitive

Words or phrases in the wellbeing measurement tool unclear or confusing:
Generally had to read a couple of times. The use of "social activities" needs defining (i.e. where does study fit: not work, not a social activity as I would interpret the term. The Layout was confusing i.e. the questions and then following questions related to the first question. Q1-4 A. Q6 Difficult to answer - seems irrelevant to neighbourhood (otherwise what is the definition of neighbourhood). Too wordy. The 1a, b, c etc. doesn’t flow well or make sense. Questions 1, 2, 3 and 4 could be worded simpler. Hard to determine between "Social Activities" and "other Activities" to "Reduced Overall time Spent" Should be reworded. 1a) Need to specify that it is time for usual activities. 5a) Use Happy not content.



Quite a bit. I had to keep rereading to clarify what is being asked. I found the question below the main questions too wordy and it may be overwhelming for a client. I’m just too tired to do this I find it a bit confusing and overwhelming as an exhausted person. Had to read each question a few times as they were confusing. Wording of the first four questions is very confusing. Should write out each question in full to make it clear I included "work" as a "social activity" perhaps more distinction is needed. I think that some people may need clarification on what physical health and emotional issues mean. Was a bit confusing because of the way it was worded. The way the questions are structured is weird - the wording of the a, b, c and d to relate to the question is hard to follow. Some of the phrases "didn’t do them as well or carefully as usual" didn’t match the question well. The first question and the wording of part a, b and c didn’t make a lot of sense. I worked out what was meant but it could have been worded better. First Question the a, b, c, d wording is confusing. The a, b, c, d sections as well as the heading questions made the question boxes quite dense. I actually was a little bit confused at the start whether I had to select a, b, c, or d, or tick a box for each. Just could have been clearer. "Reduced the amount of time spent" - not very clear. Didn’t do them as well or as carefully as usual? "reduced amount of time spent" The relation of the question to options a, b, c, d etc. is not very clear - seems it should be phrased on more of a Likert scale response. Question 1-4 didn’t seem to follow naturally from the original question. I wasn’t sure if the first part of the question required answering before the letter in the subsections. Questions 1-4 Reduced the overall amount of time spent. A little unclear. Question 7c Am I culturally respected - hard to define. Questions 1-4 difficult to understand a-e could be worded better. Wasn’t sure. 
The readability of 1, 2, 3, 4 parts is confusing. Was a tad repetitive. Physical and Emotional Health need to be defined. Items A and B are ambiguous. There are not many "flip" Questions. Couldn’t get what the questions were getting at & felt that my answers couldn’t reflect what I wanted to say. Questions response options were limiting. I found it confusing moving from "not at all" being positive to being a negative as you move through questions The Sub Headings & individualised questions didn’t work for me. In terms of the way they are set at the question - were a little similar and overlapped. Sentences were unclear.



- Because I have just moved, your questions are confusing because they make it sound like I don’t enjoy it; to begin with I was overwhelmed.
- Ideas of community were confusing due to being an international student with different views.
- Not enough possible answers.
- Might be helpful to define the key terms as they could differ between people. E.g. what is a social activity?
- Layout is very unclear.
- I understand the question, but it may not be worded clearly to all.
- "Usual"; a-d confusing.
- The a-d options are quite unclear for Questions 1-4. Was a little unsure what the a, b, c, d were referring to, e.g. a) reduced the amount of time spent? Time spent doing what?
- I found Questions 1-4 quite difficult to comprehend; the wording wasn’t very good. Could be put better to reduce confusion.
- Some awkward phrasing: "reduced the overall amount of time spent" and "achieved less than I would have liked".
- Heading and sub-question structure may be confusing and give unclear results.
- I found some questions unclear about what they were asking me.
- "Achieved less than I would have liked"; "didn’t do them as well or carefully as usual".
- Questions 1-4 too long; simplify.
- "Reduced the overall amount of time spent" doesn’t really make sense.
- Distinction between neighbourhood and community.
- The phrase "About how I feel in my community" etc. would, I think, be clearer and more informal phrased as a question.
- I did not find words or phrases unclear or confusing, but it was a bit odd being asked versions of the same questions multiple times.
- Felt like Questions 1 to 4 had to be read quite carefully. There were questions within the main questions, and it was unclear what the statement meant and its relevance to the question.
- What defines "work" in Q1-4? "Work and other activities?"
- In Q1-4, having the first questions a to b, it was confusing what was being asked. Maybe better saying "1.a reduced the amount of time spent on social activity" etc.
- Most of the questions and choices of answers were quite repetitive and don’t appear to help issues that have to do with housing. It took me a while to understand the context of the questions.
- Questions 1-4 are very repetitive. It feels as though a to d repeat what the questions say, and it is sometimes unclear what they relate to.



- Q1: social activities and achievement.
- Questions 1, 2, 3, 4 are not clear; overwhelming. Question A is particularly confusing.
- But for clients you might need to specify a little more, e.g. emotional issues? Culturally respected?
- Neighbourhood in relation to further education needs clarification. E.g. there are no universities in my neighbourhood, but there is a library; does that count?
- "Reduced the overall amount of time spent": this is a little hard to follow in relation to the opening question, applicable for 1.a, 2.a, 3.a, 4.a. Could be better worded.
- In the tick-box questions, the answer options were not clear and did not flow on well from the questions. I didn’t feel that the scale of measurement used often fit the questions asked.
- The language would be difficult for people with English as a second language.
- First question should not be phrased as a question.
- I did read it a little too quickly; when I read it again I understood it better. However, this is just me, not the tool.
- Purpose of survey unclear; layout confusing; options provided are not indicative and do not appear specific or relevant. Options vague.
- Maybe make it clearer that the sub-questions in each section relate to the overall question.
- Phrases such as "emotional issues" seem a bit too broad.
- It’s a little difficult to get the hang of a "main question" followed by "sub-questions", as it wasn’t specified. I found the overarching question a little confusing as I continued to address it. "Didn’t do them as well or carefully as usual."

Suggestions for improving the wellbeing measurement tool
- The general question at the top of the questions below is quite confusing when I’m just sitting here not stressed, so I don’t know how clear it is for someone who is stressed or has just been approached for service.
- Perhaps include: How do I interact with my local community? Who do I have to support me if I need it?
- I don't like "quite a bit"; it doesn’t seem to be the right phrase.
- The questions and sub-questions need to be rephrased to be more clear and specific.
- Open-ended questions.
- Too repetitive.
- It was too cramped together. Also it is unclear whether to answer the question on top.
- The layout is really boring!
- Questions 1 and 3 seem similar. Questions 2 and 4 seem similar.
- Needs a summary at the beginning explaining what it’s for, as a paragraph or dot points. Better language layout.
- Using a Likert scale, i.e. 1 to 10.
- See above.
- Q1 & 3: I got confused as they are very similar. I thought I was reading the same question. I think if my physical health was poor it would affect social and work activities.
- As mentioned above, possibly define the main terms in the questions.
- It might be clearer to put physical health questions together, and emotional health questions together.
- I have found that clients prefer 1-10 answers. They have said that it is easier to rate a number.
- Categories are too broad and don’t consider parents, carers etc., or mental illness and time of life.
- Asking age group, program and postcode is too identifying.
- Make it clearer; look at other psych assessment tools to see how they word it. Don’t be scared to repeat to make it clearer. Clearer options as in a, b, c, and d. Word the first question better, with more clarity, e.g. "Reduced the amount of time spent WITH FRIENDS".
- I thought the use of blue colours was calming and deviated from the standard B&W survey. Very easy and efficient.
- Change wording, particularly Q1-4, option "didn’t do their...". Set out a bit clearer.
- I believe individuals who have an intellectual disability or use English as their second language would find this quite confusing.
- The layout is not that appealing to the eye. May need to lay out questions differently; e.g. Q5-7 were much clearer and better set out than Qs 1-4.
- Layout quite plain and boring.
- Felt quite long; could have taken a simpler form.
- Questions 1-4 are set out poorly. The grid is fine; however, the layout of the questions is confusing. I would imagine that somebody whose second language is English would struggle. Questions 5-8 are much easier to answer.
- This seems like a pretty general wellbeing test, and doesn’t seem like a good indicator of wellbeing. Maybe allow for some blank lines for people to elaborate, as the questions are vague and tailored.
- Opportunity for people to highlight their own careers above wellbeing. More clarity in the phrasing. May be too long.
- Be clear on how components affect one another, e.g. work and emotional state.
- As above.
- I’d like clearer subsections to flow on from the question. I couldn’t follow it easily in my head. Otherwise the content is specific and measurable. Seems an effective tool.
- Perhaps change the colour of the light blue where you rate each question.
- Layout: maybe include spaces between each question.
- I didn’t really like the colours and the layout.
- Reword questions 1, 2, 3, 4 to be more clear.
- As above.
- Simplify it. Just make it clear and define the difference between questions.
- Make sentences clear; reword questions. Space boxes. Clearer.
- What do you want from this survey?
- More possible answers. Broader, more international.



- General layout.
- Layout: separate main questions and the a, b, c questions.
- Much better layout.
- A clear explanation at the beginning of the survey.
- Just clear up the phrasing and words.
- More clearly worded questions.
- Change phrasing. Change wording.
- Maybe a question or two about the individual’s health/community history to get a better sense of their identity.
- Improve wording; condense questions.
- Fewer options, maybe 3 instead of 5.
- Some aspects caused a little confusion, and hence frustration: when you supply a choice of a, b, c, d etc. it seems like a choice between them. "Reduced the amount of time spent"? It’s not a question. This immediately raises the question "on what?" for the survey taker. You need to make it obvious that the second part of the question is linked to the first.
- Integrate a and b, c and d in Q1 and 3.
- Clearer layout (more text spacing).
- I don’t feel it accurately asks enough questions to get an understanding of what’s important to someone and how they...
- Layout: the need to read across the template from left to right from the initial question.
- Q1-4 could probably start with a yes/no, so you don’t have to go through them.
- I think the content itself is good, just very wordy.
- I think an N/A option should be available for cultural questions, e.g. "I am able to participate in cultural or sporting activities".
- It is possible to merge Questions 1 & 2, merge 3 & 4, and merge 7 & 8.
- Provide the option "don’t know".
- The question "do you have children" should ask if one has a child or children. A person with one child may fail to answer the question.
- A bit brief if you're getting to the crux of definitions of how people conceptualise wellbeing. Another page or two wouldn’t have placed a strain on participants and could elicit much more. PL5 should always be distributed before participants complete a survey.
- The dot points initially confused me.
- Clearer questions.
- Most issues could be resolved with the right outcome.
- Clarity in the questions.
- Make it clear; separate individual questions rather than having the a-d components.
- I think questions needed to be more specific.
- More specific questions about how I felt in the past. I don’t really remember, so this might help.
- Those who are not working should not do
- Could be a little more specific about input of community factors outside our postcode area? But maybe this is beyond the scope.
- Better explanation of the context of the questions, what you’re looking for, and how this applies to the participant filling in the forms.
- More questions about the detail.
- I’m an international student. I am a bit confused whether or not I should answer the a, b, c and d questions under the big questions.
- There should be a clear division of the space given for the answer to the main question as well as the sub-questions.
- Create clearer answer options that are coherent with the questions.
- Leave an N/A option.

Survey Round Two (Measurement Tool 3)
- Rewording / Phrasing
- Layout
- Miscellaneous
- Repetitive

Words or phrases in the wellbeing measurement tool unclear or confusing
- Sort of not enough options, but also too specific. Slightly messy.
- Q3 & 4 a bit unclear on what physical health is referring to / what social activities are being referred to.
- Questions 3, 4, 5 & 6 part c is sort of confusing; might need rewording.
- Parts b-e on Questions 3-5 were a bit confusing; I had to read them a few times to understand what was being asked.
- "I achieved less than I would have liked" is hard to respond to with options.
- Q3d, Q4d & Q5d: "them" should be clearly stated again - what is "them"?
- The layout could be clearer in that it felt a bit cumbersome.

Suggestions for improving the wellbeing measurement tool
- Survey needs to be broader; however, it also needs to be condensed.
- Needed a "Don't know" option.
- Linking location of living with educational opportunities was a bit of a stretch.
- I felt sick for 5 days, which meant I didn't go out or do school work, meaning my results were quite poor for the physical section; something to think about.
- Be more specific in wording - a bit unclear.



- Could have specified "stress" as an emotional factor - for students who experience stress from school.
- Make it shorter. Some of the questions could be summed up into one instead of two separate similar questions.
- There was a focus on working; as I am not working, it wasn't relevant.
- Questions 3, 5 & 6 were repetitive.
- Phrase the questions more simply.
- Themes/colours/grouping of questions (visually): the physical health and emotional issues questions should be visually clearer.
- There is no definition of "community" and "neighbourhood"; people might get confused.
- Q3d, Q4d & Q5d: "them" should be clearly stated again - what is "them"?
- The questions were quite repetitive, which was effective for getting it done quickly but caused a bit of loss of interest.
- Shorter.
- The repetitiveness of the question wording is slightly annoying and may cause people to start answering randomly because of this.
- Shorter questions or statements.
- Some questions appeared to have been doubled up.

Survey Round Two (Measurement Tool 4)
- Rewording / Phrasing
- Layout
- Miscellaneous
- Repetitive

Words or phrases in the wellbeing measurement tool unclear or confusing
- Social activities, "carefully" - confusing.
- For certain backgrounds with English not as a first language, I feel lots of words are too complicated. I understood it, but I'm not sure if it would confuse clients.
- "Moderately" - this could be an unclear term for people with English as a second language.
- "I didn't do them as well" or "carefully as usual" re: social activities - this is not clear.
- I didn't understand that they were statements initially; it was hard to understand.

Suggestions for improving the wellbeing measurement tool
- Layout was a little difficult (cluttered). This made it difficult to follow/understand.



- Reduce length.
- I find it a little confusing that you are asking about my health, then all of a sudden my neighbourhood, in the same survey.
- Reduce the length of the survey. Possibly more about community engagement and involvement.
- Maybe no need for two gender boxes, or add another box.
- Specify what work activity means; I didn't know if that meant a job or uni etc.
- Need "Don't know / Not applicable" for some questions.
- Simpler language and shorter.
- A lot of questions, so reduce; otherwise I found it easy to follow.
- It felt repetitive, so maybe use different wording or examples so it isn't repetitive.
- I think I get what you’re trying to achieve here with the structure of the survey. Felt some questions were a bit too wordy.
- Add a "not specified" option for gender.

Survey Round Three (Measurement Tool 5)
- Rewording / Phrasing
- Layout
- Miscellaneous
- Repetitive

Words or phrases in the wellbeing measurement tool unclear or confusing
- I like how simple it is to read
- Does emotional mean mental health? If so, you should use mental health
- I found this survey simple to read and fill out

Suggestions for improving the wellbeing measurement tool
- Having extra spaces for clients to either specify or elaborate
- Questions are a bit general too
- It’s a little bit broad; the colour is a little bit bland, it doesn’t look interesting. The issues are not bad
- Order: why not start with questions about neighbourhood
- Order: put the questions about ‘over the last month’ higher
- Lighten the boxes or remove them altogether
- Clean page up, lighten it up
- Layout: try to remove boxes
- Questions are cluttered on one page
- The font is small; enlarge it
- With the orange logo, frame it out so it goes darker to lighter at the edge of the orange
- Why don’t you have the option for ‘no answer’?
- I don’t like the italics
- Get rid of the bars at the bottom

FEEDBACK FORM

FEEDBACK ON THE WELLBEING MEASUREMENT TOOL
Project Title: Development and Refinement of a Wellbeing Measurement Tool in the Community Services Sector
Please tick the box that best represents your answer to each question.
(Response options for Questions 1 and 2: Very Good / Quite Good / Neither Good nor Bad / Quite Bad / Very Bad)

Question 1
What do you think of this wellbeing measurement tool?

Question 2
What do you think of the following features of the wellbeing measurement tool?
a. Length
b. Content
c. Layout/design/colours
d. Ease of reading
e. Phrases used
f. Words used

Question 3 Did you find any words or phrases in the wellbeing measurement tool unclear or confusing?



☐ No  ☐ Yes - please specify below
______________________________________________________________________________ ______________________________________________________________________________ ______________________________________________________________________________

Question 4
Do you have any suggestions for improving the wellbeing measurement tool?
☐ No  ☐ Yes - please specify below
_______________________________________________________________________________ ______________________________________________________________________________ ______________________________________________________________________________



APPENDIX H

MEASUREMENT TOOL 2 WINDERMERE WELLBEING SURVEY

Please tick the box that best represents your answer to each question.
(Response options for Questions 1-7: Not at all / A little bit / Moderately / Quite a bit / Extremely)

Question 1
Have physical health problems interfered with my usual social activities over the last month?
a. Reduced the overall amount of time spent?
b. Achieved less than I would have liked?
c. Didn’t do them as well or carefully as usual?
d. Had difficulty because of physical pain?

Question 2
Have emotional issues interfered with my usual social activities over the last month?
a. Reduced the overall amount of time spent?
b. Achieved less than I would have liked?
c. Didn’t do them as well or carefully as usual?
d. Found myself feeling anxious?
e. Found myself feeling sad?

Question 3
Have physical health problems interfered with my usual work or other activities over the last month?
a. Reduced the overall amount of time spent?
b. Achieved less than I would have liked?
c. Didn’t do them as well or carefully as usual?
d. Had difficulty because of physical pain?

Question 4
Have emotional issues interfered with my usual work or other activities over the last month?
a. Reduced the overall amount of time spent?
b. Achieved less than I would have liked?
c. Didn’t do them as well or carefully as usual?
d. Found myself feeling anxious?
e. Found myself feeling sad?

Question 5
Over the last month, have I been:
a. Generally content with life
b. Feeling positive about the future
c. Feeling that bad things keep happening to me
d. Feeling calm and peaceful

Question 6
About living in my neighbourhood:
a. I generally like living here
b. I generally feel safe here
c. It is easy to get around
d. There are enough community services
e. Community services are accessible
f. I have access to further education
g. Public places are pleasant and appealing
h. I am able to participate in arts, sports & cultural activities

Question 7
About how I feel in my community:
a. I feel accepted
b. I feel culturally respected
c. I am able to have a say about community matters
d. I feel pride in my community

Question 8
a. How do I rate my general sense of wellbeing now? (Poor / Fair / Good / Very Good / Excellent)
b. How does this compare to 3 months ago? (Much worse / Somewhat worse / About the same / Somewhat better / Much better)



APPENDIX I

MEASUREMENT TOOL 3

WELLBEING SURVEY (OPTION 3) Please circle the number that best represents your answer to each question

Question 1
About living in my neighbourhood:
a. I generally like living here
b. I generally feel safe here
c. It is easy to get around
d. There are enough community services
e. Community services are accessible
f. I have access to further education
g. Public places are pleasant and appealing
h. I am able to participate in arts, sports and cultural activities

(Each item below is rated 1-5: 1 = Not at all, 2 = A little bit, 3 = Moderately, 4 = Quite a bit, 5 = Extremely.)

Question 2
About how I feel in my community:
a. I feel accepted
b. I feel culturally respected
c. I am able to have a say about community matters
d. I feel pride in my community

Question 3
Physical health and social activities
a. Physical health problems have impacted upon my usual social activities over the last month
More specifically:
b. I reduced the time spent
c. I achieved less than I would have liked
d. I didn’t do them as well or carefully as usual
e. I had difficulty because of physical pain


Question 4
Physical health and work activities
a. Physical health problems have impacted upon my usual work activities over the last month
More specifically about physical health and work activities:
b. I reduced the time spent
c. I achieved less than I would have liked
d. I didn’t do them as well or carefully as usual
e. I had difficulty because of physical pain

Question 5
Emotional issues and social activities
a. Emotional issues have impacted upon my usual social activities over the last month
More specifically:
b. I reduced the time spent
c. I achieved less than I would have liked
d. I didn’t do them as well or carefully as usual
e. I found myself feeling anxious
f. I found myself feeling sad

Question 6
Emotional issues and work activities
a. Emotional issues have impacted upon my usual work activities over the last month
More specifically:
b. I reduced the time spent
c. I achieved less than I would have liked
d. I didn’t do them as well or carefully as usual
e. I found myself feeling anxious
f. I found myself feeling sad


Question 7
Over the last month, I have been:
a. Generally content with life
b. Feeling positive about the future
c. Feeling that bad things keep happening to me
d. Feeling calm and peaceful

Question 8
a. How do I rate my general sense of wellbeing now? (1 = Poor, 2 = Fair, 3 = Good, 4 = Very Good, 5 = Excellent)
b. How does this compare to 3 months ago? (1 = Much worse, 2 = Somewhat worse, 3 = About the same, 4 = Somewhat better, 5 = Much better)



APPENDIX J

MEASUREMENT TOOL 4 WELLBEING SURVEY (OPTION 4)

Please circle the number that best represents your answer to each question

Question 1
Physical health and social activities
a. Physical health problems have impacted upon my usual social activities over the last month
More specifically:
b. I reduced the time spent
c. I achieved less than I would have liked
d. I didn’t do them as well or carefully as usual
e. I had difficulty because of physical pain

(Each item below is rated 1-5: 1 = Not at all, 2 = A little bit, 3 = Moderately, 4 = Quite a bit, 5 = Extremely.)

Question 2
Physical health and work activities
a. Physical health problems have impacted upon my usual work activities over the last month
More specifically:
b. I reduced the time spent
c. I achieved less than I would have liked
d. I didn’t do them as well or carefully as usual
e. I had difficulty because of physical pain

Question 3
Emotional issues and social activities
a. Emotional issues have impacted upon my usual social activities over the last month
More specifically:
b. I reduced the time spent
c. I achieved less than I would have liked
d. I didn’t do them as well or carefully as usual
e. I found myself feeling anxious
f. I found myself feeling sad


Question 4
Emotional issues and work activities
a. Emotional issues have impacted upon my usual work activities over the last month
More specifically about emotional issues and work activities:
b. I reduced the time spent
c. I achieved less than I would have liked
d. I didn’t do them as well or carefully as usual
e. I found myself feeling anxious
f. I found myself feeling sad

Question 5
About living in my neighbourhood:
a. I generally like living here
b. I generally feel safe here
c. It is easy to get around
d. There are enough community services
e. Community services are accessible
f. I have access to further education
g. Public places are pleasant and appealing
h. I am able to participate in arts, sports & cultural activities

Question 6
About how I feel in my community:
a. I feel accepted
b. I feel culturally respected
c. I am able to have a say about community matters
d. I feel pride in my community


Question 7
Over the last month, I have been:
a. Generally content with life
b. Feeling positive about the future
c. Feeling that bad things keep happening to me
d. Feeling calm and peaceful

Question 8
a. How do I rate my general sense of wellbeing now? (1 = Poor, 2 = Fair, 3 = Good, 4 = Very Good, 5 = Excellent)
b. How does this compare to 3 months ago? (1 = Much worse, 2 = Somewhat worse, 3 = About the same, 4 = Somewhat better, 5 = Much better)


APPENDIX K

RESULTS OF T-TESTS EXAMPLE: GENDER – Q1 to Q8, Tool 2, 3 & 4
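The comparisons in this appendix are independent-samples t-tests on group means. As a purely illustrative sketch of that computation (a Welch-style t-test in plain Python with invented 1-5 scores, not the SPSS procedure or the project data):

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's independent-samples t statistic and degrees of freedom,
    which does not assume equal variances in the two groups."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)   # sample variances (n-1)
    se2 = va / na + vb / nb                           # squared standard error of the difference
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical 1-5 wellbeing scores for two groups (invented for illustration)
females = [3, 4, 4, 5, 3, 4, 2, 4]
males = [3, 3, 2, 4, 3, 2, 3, 4]
t, df = welch_t(females, males)
print(round(t, 2), round(df, 1))
```

Welch's variant is shown because it does not assume equal variances in the two gender groups; the pooled-variance form differs only in how the standard error and degrees of freedom are computed.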

1. Is there a significant difference in the mean physical health and social activities results for females and males? Tool 2

Tool 3

Tool 4



2. Is there a significant difference in the mean emotional issues and social activities results for females and males? Tool 2

Tool 3

Tool 4



3. Is there a significant difference in the mean physical health and work results for females and males? Tool 2

Tool 3

Tool 4



4. Is there a significant difference in the mean emotional issues and work results for females and males? Tool 2

Tool 3

Tool 4



5. Is there a significant difference in the mean general wellbeing results for females and males? Tool 2

Tool 3

Tool 4



6. Is there a significant difference in the mean neighbourhood results for females and males? Tool 2

Tool 3



Tool 4

7. Is there a significant difference in the mean community results for females and males? Tool 2

Tool 3



Tool 4

8. Is there a significant difference in the mean temporal sense of general wellbeing results for females and males? Tool 2

Tool 3

Tool 4



APPENDIX L

RESULTS OF CORRELATIONAL PROCESS FOR REDUCTION OF QUESTIONS FOR TOOL 5

BACKGROUND INFORMATION: Data analysis technique employed
• Extraction used a Principal Component method with direct oblimin (oblique) rotation performed during the analysis, not post hoc.
• Component retention was based on Eigenvalues greater than one, and on scree plots.
• Suppressed coefficients were changed from the default option of .10 to .40, as indicated by Field (2009).
At this stage, the factor analysis indicated that 72.85% of the variance was explained with the extraction of 6 components (using Eigenvalue > 1), 62.33% using 3 components (using the 5% minimum variance explained), or 66.35% using 4 components (using the scree plot measure, viewing where the elbow is present in the graph) (see output table below).
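The retention rules above (Kaiser's eigenvalue-greater-than-one criterion versus cumulative explained variance) can be illustrated with a small numpy sketch. The data here are synthetic, generated from two latent factors; none of this reproduces the project's actual SPSS output:

```python
import numpy as np

def extraction_summary(data):
    """Eigenvalues of the item correlation matrix, with the cumulative
    proportion of variance explained and the Kaiser component count."""
    corr = np.corrcoef(data, rowvar=False)             # item-by-item correlations
    eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]  # largest first
    explained = eigvals / eigvals.sum()                # proportion of variance per component
    cumulative = np.cumsum(explained)
    n_kaiser = int(np.sum(eigvals > 1.0))              # eigenvalue > 1 rule
    return eigvals, cumulative, n_kaiser

# Synthetic data: 200 respondents, 8 items built from 2 latent factors,
# so roughly two eigenvalues should exceed one.
rng = np.random.default_rng(0)
factors = rng.normal(size=(200, 2))
loadings = np.array([[0.8, 0.0]] * 4 + [[0.0, 0.8]] * 4)
items = factors @ loadings.T + 0.4 * rng.normal(size=(200, 8))

eigvals, cumulative, n_kaiser = extraction_summary(items)
print(n_kaiser, round(float(cumulative[n_kaiser - 1]), 3))
```

On data like this, the eigenvalue-greater-than-one rule recovers the two latent factors, and the cumulative proportions show how much variance each candidate solution would explain.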





Factor analysis with oblique rotation purified the factors within the questionnaire. Although these factors were partially created (e.g. emotional impact on work), they were merged to create a broader factor (e.g. emotional impact). From here, an eyeballing technique was used to identify weaknesses within each factor, identified as the items with the smallest factor loadings. For example (as can be seen in the table below):
• Component 1: made up of all of the items from Questions 2 and 4 (emotional issues: social and work, respectively).
• Component 2: made up of all of the items from Questions 1 and 3 (physical issues: social and work, respectively).
• Component 3: made up of all of the items from Question 6 (neighbourhood).
• Component 4: made up of all of the items from Question 7 and Q8b (community and "how does this compare to three months ago?").
• Component 5: made up of all of the items from Question 5 and Q8b (temporal perspective of one month prior and three months prior).
In short: Component 1 = social; Component 2 = physical; Component 3 = neighbourhood; Component 4 = community; Component 5 = past feeling.
This was as expected, with the exception of Q8a, which potentially should not belong to the community component; however, this was after the rotation (which purified the factors within the data).
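The component memberships above are read off the pattern matrix by assigning each item to the component on which it loads most strongly, with loadings below the .40 suppression threshold ignored. A toy sketch of that reading, using an invented loadings matrix and hypothetical item labels:

```python
import numpy as np

def assign_items(loadings, labels, cutoff=0.40):
    """Group items by their largest absolute loading, suppressing
    loadings below the cutoff (as in the .40 suppression used above)."""
    groups = {}
    for item, row in zip(labels, loadings):
        strongest = int(np.argmax(np.abs(row)))
        if abs(row[strongest]) >= cutoff:        # item is suppressed otherwise
            groups.setdefault(strongest, []).append(item)
    return groups

# Invented pattern matrix: rows = items, columns = components
pattern = np.array([
    [0.82, 0.10],   # loads on component 0
    [0.75, 0.05],   # loads on component 0
    [0.12, 0.71],   # loads on component 1
    [0.08, 0.66],   # loads on component 1
    [0.30, 0.25],   # weak item: suppressed everywhere
])
items = ["Q2a", "Q2b", "Q1a", "Q1b", "Q5c"]
print(assign_items(pattern, items))
```

With the suppression threshold applied, the weak item drops out of every component, which is exactly how low-loading items were flagged as candidates for deletion.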



Deletion of items was done systematically, employing three methods:
A. Deletion based on Cronbach's alpha as the measure of internal consistency (inter-item reliability), using the 'Cronbach's alpha if item deleted' function.
B. Deletion based on small factor loadings (see the pattern matrix table).
• The lowest-loading item for each factor: Q2d, Q1d, 6b, 8b, 5c.
• The second-lowest-loading items for each factor were also noted: Q2, 1b, 6a, 7a, 8a.
C. Deletion based on inter-item correlations (see the very large 'correlation matrix' table in the output).
• Inspection was used to identify overly related items that may indicate multicollinearity (r > .9) and potentially very weak items that did not relate well to more than two other items (r < .08 with several items): Q1b, 1d, 6c, 6d, 6f, 6g, 7c, 8b.
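Method A above, Cronbach's alpha together with the "alpha if item deleted" diagnostic, can be sketched as follows. The data here are simulated for illustration; the study computed these values in SPSS.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

def alpha_if_deleted(items: np.ndarray) -> list[float]:
    """Alpha recomputed with each item removed in turn."""
    k = items.shape[1]
    return [cronbach_alpha(np.delete(items, j, axis=1)) for j in range(k)]

rng = np.random.default_rng(1)
common = rng.normal(size=(200, 1))                  # shared underlying trait
scores = common + 0.5 * rng.normal(size=(200, 5))   # five correlated items
alpha = cronbach_alpha(scores)
# An item whose removal *raises* alpha is a candidate for deletion.
candidates = [j for j, a in enumerate(alpha_if_deleted(scores)) if a > alpha]
```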

Data Set 1: Survey Round One (Semester 1, 2013), Survey Tool Option Two
STAGE 1 DELETION OF ITEMS: based on an item reducing Cronbach's alpha, having low inter-item correlations, and loading poorly onto its factor.
• Excluding Q5c and Q8b (each identified as reducing the reliability within its question subtype) also changed the overall reliability when both items were excluded. The improvement was only very small, but it indicates that the questionnaire could be improved without these items.

Excluding Q5c and Q8b also increased the explained variance, as indicated by the larger cumulative percentage (75.63% as opposed to the original 72.85%), while the pattern of factor loadings did not change. This provides a strong rationale for excluding these two items.



STAGE 2: Deletion of items based on inter-item correlations and poor factor loadings. Items were excluded that had a very high (> .9) inter-item correlation (multicollinearity) or a substantially low inter-item correlation (suggesting the item is not truly valid), together with a smaller factor loading on the component. For example, questions 1d and 6b were identified as problematic based on their inter-item correlations and their smaller factor loadings in comparison with other items. Q3b and Q3c had a shared correlation of r = .937, so Q3b, the item with the smaller factor loading onto component two, was deleted. Q4a and Q4b had a shared correlation of r = .91, and Q4b and Q4c of r = .906, so the same method was used: checking the factor loadings and eliminating the items with the smaller loadings (Q4a and Q4b). Cronbach's alpha was rechecked after each deletion. The factor analysis was rerun and showed that, with the exclusion of Q1d, Q6b, Q3b, Q4a and Q4b, the explained variance increased and the factor structure did not change.
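The multicollinearity screen used in Stage 2, flagging item pairs whose inter-item correlation exceeds .9, can be sketched as below. The data and the use of the item names Q3b/Q3c/Q6h are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(2)
base = rng.normal(size=150)
items = {
    "Q3b": base + 0.1 * rng.normal(size=150),   # near-duplicate wording
    "Q3c": base + 0.1 * rng.normal(size=150),
    "Q6h": rng.normal(size=150),                # unrelated item
}
names = list(items)
X = np.column_stack([items[n] for n in names])
R = np.corrcoef(X, rowvar=False)

# Flag every pair of items correlating above the .9 cut-off.
flagged = [(names[i], names[j], R[i, j])
           for i in range(len(names)) for j in range(i + 1, len(names))
           if abs(R[i, j]) > 0.9]
# Of each flagged pair, the report deletes the item with the smaller factor loading.
```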



STAGE 3: DELETION OF ITEMS BASED ON INTER-ITEM CORRELATIONS, USING CLINICAL JUDGEMENT. At this stage, items were deleted primarily on the basis of their inter-item correlations. Within question two, the inter-item correlations were substantially higher than for other questions, so deleting two of these items was warranted: the items were interpreted as capturing the same information (slight multicollinearity). The items deleted were those with the lower factor loadings: for component 1, Q2 and Q2d. The same method was used for component 2 (physical health), as this factor had 10 questions addressing the topic. Deletion was based on the poorest factor loadings and high inter-item correlations with items in the same factor; this included Q3d (smaller factor loading than all other Q3 items). The results below show that this did not affect the reliability, the explained variance or the factor structure.



STAGE FOUR: Deletion of items based on a larger cut-off point for inter-item correlations (.80):
• This was done manually and identified a total of 25 pairs of items with a correlation over .80. Of these, 11 pairs were from question 4 and 5 pairs were from question 2 (these two groups of questions combined make up component 1). Removing further items from this component should therefore shorten the questionnaire while maintaining reliability and explained variance.
• Items were removed according to how commonly they related to multiple other items within that question group.
• This resulted in Q7a being deleted: it correlated highly (> .8) with Q7b and had the lower factor loading.

Cronbach's alpha for Q1–8:
Q1: overall = .931. All 'alpha if item deleted' values > 0.9, so no basis for deleting any item.
Q2: overall = .949. All values > 0.9, so no basis for deletion.
Q3: overall = .952. All values > 0.9, so no basis for deletion.
Q4: overall = .963. All values > 0.9, so no basis for deletion.
Q5: overall = .779.
Q6: overall = .889. All values > 0.8 and very similar (within .01–.04 of each other).
Q7: overall = .886. All values > 0.8 and very similar.
Q8: overall = .529. Consistency between the two items is poor (inter-item correlation only .36); with only two items, 'Cronbach's alpha if item deleted' cannot be computed.

Using Cronbach's alpha to detect items that decreased the internal consistency (reliability) of the questionnaire, the results showed no particularly problematic items in Q1–4. Deleting item 5c would raise Cronbach's alpha from .779 to .863. Question 8 was also judged problematic because its Cronbach's alpha was very poor (.529) and its two items did not correlate well together (inter-item correlation = .36). CONCLUSION: Q5c and Q8b were deleted because they decreased the internal consistency of the questionnaire. On the basis of the three techniques employed to remove items from the wellbeing questionnaire, the findings suggest the following items require further analysis:
• Lowest-loading item for each factor (removed): Q1d, Q2d, 6b, 8b, 5c.
• Second-lowest-loading items for each factor (noted): Q2, 1b, 6a, 7a, 8a.
• Potentially weak items, identified as having several very weak correlations (r < .08) with more than two other items: Q1b, 1d, 6c, 6d, 6f, 6g, 7c, 8b.
• Cronbach's alpha: deleting 5c and 8b would improve the internal consistency of the questionnaire.



FINAL RESULTS WITH EXCLUDED ITEMS: 10 items deleted using the three techniques systematically. Data Set 2: Survey Round Two (Semester 2, 2013), Survey Tool Option Three. Analysis was completed on this data set: the KMO and Bartlett's test of sphericity assumptions were met, both for the overall data set and for the individual questions. These checks confirmed that a factor analysis was possible, as the items were related and the data differed significantly from the identity matrix (which would imply that each variable relates only to itself and is independent of the others). Normality was checked using histograms, which showed a slight positive skew; the sample size permits continued use of the data, and some positive skew was expected given the nature of the measure. Normality was also checked via the skew and kurtosis for each question subtype, and no items breached this assumption. To view the output for the assumption checking for each item, go to the folder "assumption testing round two option three". A factor analysis was conducted on this round of questionnaires to confirm that, with the same deletion of problematic or weaker items, the questionnaire's reliability and basic factor structure were maintained. This was done by running a reliability analysis on the original data and computing an overall Cronbach's alpha, assessing inter-item reliability, and examining the factor loadings (with oblimin rotation) in the pattern matrix and the total variance explained table (see below and next page). Original data:
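Bartlett's test of sphericity, used above as an assumption check, tests whether the correlation matrix differs from an identity matrix. A minimal sketch on illustrative data follows (the study itself used SPSS's built-in KMO and Bartlett output).

```python
import numpy as np

def bartlett_sphericity(X: np.ndarray) -> tuple[float, float]:
    """Return the chi-square statistic and its degrees of freedom.

    H0: the items are uncorrelated (correlation matrix = identity).
    """
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return chi2, df

rng = np.random.default_rng(3)
trait = rng.normal(size=(200, 1))
X = trait + rng.normal(size=(200, 6))   # six items sharing a common trait
chi2, df = bartlett_sphericity(X)
# A chi-square far above its degrees of freedom is significant, meaning the
# items are related enough for a factor analysis to be meaningful.
```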



As can be seen in the previous tables, the items load quite well onto components, the Cronbach's alpha is high (indicating strong reliability), and the Kaiser eigenvalue-greater-than-one rule (K1) extracts nine factors explaining 80.97% of the total variance. • An error then occurred indicating that no rotation was possible, so the factor loadings and structure could not be computed. It is unwise to run rotations as post-hoc testing, as this changes the structure and interpretation of the data; SPSS also raised this warning when running the reliability testing. This suggests a possible problem with missing data.

Because of this, the planned stages could not be computed; instead, the internal consistency of the data was assessed in similar stages of item deletion.



Stage One: Deletion of 5c and 8B:

Stage two: Additional deletion of Q1D, 3B, 4A, 4B, 6A:

Stage three: Additional deletion of Q1B, 2, 2D, 3D:

As can be seen above, the alpha decreased only very slightly (.01), which does not warrant concern. The measure remains very reliable with 29 items rather than the original 40. This data set was problematic because rotation was not possible; consequently, no deletions were based on this data set alone. Instead, the problematic or weak items previously identified in round one, option two were applied to this data set. Checking Cronbach's alpha showed that, as the questionnaire was reduced, the reliability was affected only very slightly (< .02). The reduction of 11 items therefore appears valid for this data set on the basis of this technique.



Data Set 3: Survey Round Two (Semester 2, 2013), Survey Tool Option Four. Analysis was completed on this data set: the KMO and Bartlett's test of sphericity assumptions were met, both for the overall data set and for the individual questions. These checks confirmed that a factor analysis was possible, as the items were related and the data differed significantly from the identity matrix (which would imply that each variable relates only to itself and is independent of the others). Normality was checked using histograms, which showed a slight positive skew; the sample size permits continued use of the data, and some positive skew was expected given the nature of the measure. Normality was also checked via the skew and kurtosis for each question subtype, and no items breached this assumption. To view the output for the assumption checking for each item, go to the folder "assumption testing round two option four". A factor analysis was conducted on this round of questionnaires to confirm that, with the same deletion of problematic or weaker items, the questionnaire's reliability and basic factor structure were maintained. This was done by running a reliability analysis on the original data and computing an overall Cronbach's alpha, assessing inter-item reliability, and examining the factor loadings (with oblimin rotation) in the pattern matrix and the total variance explained table (see below and next page). Original data:
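The skew and kurtosis check mentioned above can be sketched as follows. The data and the |skew| < 2, |excess kurtosis| < 7 cut-offs are illustrative rules of thumb, not the report's own thresholds.

```python
import numpy as np

def skew_and_kurtosis(x: np.ndarray) -> tuple[float, float]:
    """Return sample skewness and excess kurtosis of a score vector."""
    z = (x - x.mean()) / x.std()
    return float((z ** 3).mean()), float((z ** 4).mean() - 3.0)

rng = np.random.default_rng(4)
# Wellbeing ratings often bunch toward one end of the scale; an exponential
# sample mimics that positive skew.
item_scores = rng.exponential(scale=1.0, size=500)
skew, kurt = skew_and_kurtosis(item_scores)
within_limits = abs(skew) < 2 and abs(kurt) < 7   # rule-of-thumb cut-offs
```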



As can be seen in the previous tables, the items load quite well onto components, the Cronbach's alpha is high (indicating strong reliability), and the Kaiser eigenvalue-greater-than-one rule (K1) extracts seven factors explaining 78.38% of the total variance. Stage One: Deletion of Q5c and Q8b, based on the earlier finding that these two items lowered Cronbach's alpha, had poorer factor loadings, and had inter-item correlations that were either too high or too low. The same three tables are displayed below.



As can be seen above and in the previous tables, the reliability improved with the exclusion of Q5c and Q8b. The total explained variance was almost the same (78.03%) and the factor structure looked the same, with the exception of Q1b, which loaded slightly differently.



Stage Two: Items were excluded that had a very high (> .9) inter-item correlation (multicollinearity) or a substantially low inter-item correlation (suggesting the item is not truly valid), together with a smaller factor loading on the component (indicating a smaller contribution to that factor). The same items were identified as in round one, option two: items 1d, 3b, 4a, 4b and 6b were deleted.



As can be seen above and on the previous page, the reliability is still very high, with a strong alpha, and the explained variance and general factor structure remain very similar. However, the pattern matrix shows Q2d and Q6h loading onto a second, different component. On further analysis, Q1b correlated very poorly with several items (in fact, with all Q7 items; see the inter-item correlation matrix). On this basis, Q1b was eliminated and Q6h was retained (its inter-item correlations were sufficient). Q2d was scheduled for deletion at stage three (in line with the round one, option two factor analysis).



Stage Three: Deletion based on inter-item correlations and factor loadings, using more clinical judgement: deletion of Q1b, Q2, Q2d and Q3d. Factor purification continued by examining the semantic content of items that were statistically very similar. For example, if two items had problematic inter-item correlations and the lower factor loadings for that component, the items were compared according to what each question asked; if the two were very similar, the weaker one was deleted.



As can be seen above and on the previous page, although the alpha has decreased, the decrease was only very small (.01) and does not warrant concern. The measure remains very reliable with 29 items rather than the original 40.



OVERALL SUMMARY: Overall, the three-stage deletion method, applying three techniques simultaneously and systematically, allowed the questionnaire to be shortened to 29 items. The problems encountered in round two, option three are noteworthy and should be investigated further.
• EXCLUDED ITEMS: Q1b, 1d, 2, 2d, 3b, 3d, 4a, 4b, 5c, 6b, 8b.
With time and further analysis, further reductions in the measurement tool may be possible; however, practical and logical considerations play an important role in this research, so reducing the questionnaire purely on the statistical importance of items would be invalid. See also the introduction regarding our aim for a 'holistic' measurement tool. All files are organised by round and option, including assumption testing, histograms and factor analyses.



APPENDIX M

MEASUREMENT TOOL 5

Items 1–18 are rated on a five-point scale: Not at all (1), A little bit (2), Moderately (3), Quite a bit (4), Extremely (5).

Physical health
1. Physical health problems have impacted upon my usual social activities over the last month
2. Physical health problems have impacted upon my usual work activities over the last month

Emotional health
3. Emotional issues have impacted upon my usual social activities over the last month
4. Emotional issues have impacted upon my usual work activities over the last month

About living in my neighbourhood:
5. I generally like living here
6. I generally feel safe here
7. It is easy to get around
8. There are enough community services
9. I have access to further education
10. I am able to participate in arts, sports & cultural activities

About how I feel in my community:
11. I feel accepted
12. I feel culturally respected
13. I am able to have a say about community matters
14. I feel pride in my community

Over the last month, I have been:
15. Generally content with life
16. Feeling positive about the future
17. Feeling that bad things keep happening to me
18. Feeling calm and peaceful

Finally:
a. How do I rate my general sense of wellbeing now? Rated: Poor (1), Fair (2), Good (3), Very Good (4), Excellent (5)
b. How does this compare to 3 months ago? Rated: Much worse (1), Somewhat worse (2), About the same (3), Somewhat better (4), Much better (5)

Development and Refinement of a Wellbeing Measurement Tool in the Community Services Sector November 2014 142


APPENDIX N

COMPLETE LIST OF QUESTIONS: TOOLS 2, 3, 4 & 5

Tool 2

Question 1. Have physical health problems interfered with my usual social activities over the last month?
a. Reduced the overall amount of time spent?
b. Achieved less than I would have liked?
c. Didn't do them as well or carefully as usual?
d. Had difficulty because of physical pain?

Question 2. Have emotional issues interfered with my usual social activities over the last month?
a. Reduced the overall amount of time spent?
b. Achieved less than I would have liked?
c. Didn't do them as well or carefully as usual?
d. Found myself feeling anxious?
e. Found myself feeling sad?

Question 3. Have physical health problems interfered with my usual work or other activities over the last month?
a. Reduced the overall amount of time spent?
b. Achieved less than I would have liked?
c. Didn't do them as well or carefully as usual?
d. Had difficulty because of physical pain?

Question 4. Have emotional issues interfered with my usual work or other activities over the last month?
a. Reduced the overall amount of time spent?
b. Achieved less than I would have liked?
c. Didn't do them as well or carefully as usual?
d. Found myself feeling anxious?
e. Found myself feeling sad?



Question 5. Over the last month, have I been:
a. Generally content with life
b. Feeling positive about the future
c. Feeling that bad things keep happening to me
d. Feeling calm and peaceful

Question 6. About living in my neighbourhood:
a. I generally like living here
b. I generally feel safe here
c. It is easy to get around
d. There are enough community services
e. Community services are accessible
f. I have access to further education
g. Public places are pleasant and appealing
h. I am able to participate in arts, sports & cultural activities

Question 7. About how I feel in my community:
a. I feel accepted
b. I feel culturally respected
c. I am able to have a say about community matters
d. I feel pride in my community

Question 8.
a. How do I rate my general sense of wellbeing now?
b. How does this compare to 3 months ago?



Tool 3: Changes to wording and order of questions

Question 1. About living in my neighbourhood:
a. I generally like living here
b. I generally feel safe here
c. It is easy to get around
d. There are enough community services
e. Community services are accessible
f. I have access to further education
g. Public places are pleasant and appealing
h. I am able to participate in arts, sports & cultural activities

Question 2. About how I feel in my community:
a. I feel accepted
b. I feel culturally respected
c. I am able to have a say about community matters
d. I feel pride in my community

Question 3. Physical health and social activities
a. Physical health problems have impacted upon my usual social activities over the last month
More specifically:
b. I reduced the time spent on social activities
c. I achieved less than I would have liked
d. I didn't do them as well or carefully as usual
e. I had difficulty because of physical pain

Question 4. Physical health and work activities
a. Physical health problems have impacted upon my usual work activities over the last month
More specifically about physical health and work activities:
b. I reduced the time spent on work activities
c. I achieved less than I would have liked
d. I didn't do them as well or carefully as usual
e. I had difficulty because of physical pain



Question 5. Emotional issues and social activities
a. Emotional issues have impacted upon my usual social activities over the last month
More specifically:
b. I reduced the time spent on work activities
c. I achieved less than I would have liked
d. I didn't do them as well or carefully as usual
e. I found myself feeling anxious
f. I found myself feeling sad

Question 6. Emotional issues and work activities
a. Emotional issues have impacted upon my usual work activities over the last month
More specifically:
b. I reduced the time spent on work activities
c. I achieved less than I would have liked
d. I didn't do them as well or carefully as usual
e. I found myself feeling anxious
f. I found myself feeling sad

Question 7. Over the last month, I have been:
a. Generally content with life
b. Feeling positive about the future
c. Feeling that bad things keep happening to me
d. Feeling calm and peaceful

Question 8.
a. How do I rate my general sense of wellbeing now?
b. How does this compare to 3 months ago?



Tool 4: Changes to wording and order of questions

Question 1. Physical health and social activities
a. Physical health problems have impacted upon my usual social activities over the last month
More specifically:
b. I reduced the time spent on social activities
c. I achieved less than I would have liked
d. I didn't do them as well or carefully as usual
e. I had difficulty because of physical pain

Question 2. Physical health and work activities
a. Physical health problems have impacted upon my usual work activities over the last month
More specifically:
b. I reduced the time spent on work activities
c. I achieved less than I would have liked
d. I didn't do them as well or carefully as usual
e. I had difficulty because of physical pain

Question 3. Emotional issues and social activities
a. Emotional issues have impacted upon my usual social activities over the last month
More specifically:
b. I reduced the time spent on work activities
c. I achieved less than I would have liked
d. I didn't do them as well or carefully as usual
e. I found myself feeling anxious
f. I found myself feeling sad



Question 4. Emotional issues and work activities
a. Emotional issues have impacted upon my usual work activities over the last month
More specifically about emotional issues and work activities:
b. I reduced the time spent on work activities
c. I achieved less than I would have liked
d. I didn't do them as well or carefully as usual
e. I found myself feeling anxious
f. I found myself feeling sad

Question 5. About living in my neighbourhood:
a. I generally like living here
b. I generally feel safe here
c. It is easy to get around
d. There are enough community services
e. Community services are accessible
f. I have access to further education
g. Public places are pleasant and appealing
h. I am able to participate in arts, sports & cultural activities

Question 6. About how I feel in my community:
a. I feel accepted
b. I feel culturally respected
c. I am able to have a say about community matters
d. I feel pride in my community

Question 7. Over the last month, I have been:
a. Generally content with life
b. Feeling positive about the future
c. Feeling that bad things keep happening to me
d. Feeling calm and peaceful

Question 8.
a. How do I rate my general sense of wellbeing now?
b. How does this compare to 3 months ago?



Tool 5: Changes to wording, number of questions (reduced) and numbering of questions

Physical health
1. Physical health problems have impacted upon my usual social activities over the last month
2. Physical health problems have impacted upon my usual work activities over the last month

Emotional health
3. Emotional issues have impacted upon my usual social activities over the last month
4. Emotional issues have impacted upon my usual work activities over the last month

About living in my neighbourhood:
5. I generally like living here
6. I generally feel safe here
7. It is easy to get around
8. There are enough community services
9. I have access to further education
10. I am able to participate in arts, sports & cultural activities

About how I feel in my community:
11. I feel accepted
12. I feel culturally respected
13. I am able to have a say about community matters
14. I feel pride in my community

Over the last month, I have been:
15. Generally content with life
16. Feeling positive about the future
17. Feeling that bad things keep happening to me
18. Feeling calm and peaceful

Finally:
19. How do I rate my general sense of wellbeing now?
20. How does this compare to 3 months ago?



APPENDIX P

FINAL WELLBEING MEASUREMENT TOOL

Items 1–18 are rated on a five-point scale: Not at all (1), A little bit (2), Moderately (3), Quite a bit (4), Extremely (5).

Physical health
1. Physical health problems have impacted upon my usual social activities over the last month
2. Physical health problems have impacted upon my usual work activities over the last month

Emotional health
3. Emotional issues have impacted upon my usual social activities over the last month
4. Emotional issues have impacted upon my usual work activities over the last month

About living in my neighbourhood:
5. I generally like living here
6. I generally feel safe here
7. It is easy to get around
8. There are enough community services
9. I have access to further education
10. I am able to participate in arts, sports & cultural activities

About how I feel in my community:
11. I feel accepted
12. I feel culturally respected
13. I am able to have a say about community matters
14. I feel pride in my community

Over the last month, I have been:
15. Generally content with life
16. Feeling positive about the future
17. Feeling that bad things keep happening to me
18. Feeling calm and peaceful

Finally:
a. How do I rate my general sense of wellbeing now? Rated: Poor (1), Fair (2), Good (3), Very Good (4), Excellent (5)
b. How does this compare to 3 months ago? Rated: Much worse (1), Somewhat worse (2), About the same (3), Somewhat better (4), Much better (5)

Prior work by Community Indicators Victoria on community wellbeing and by RAND Health in their medical outcomes study is acknowledged in the development of this Wellbeing Survey.


