Missing the mark: why resident satisfaction isn't measuring up

April 2014


Family Mosaic: an introduction

Family Mosaic is one of the largest housing providers in London and the South East. We provide affordable homes to rent and buy, as well as care and support services to thousands of people who need extra support. We have around 23,000 homes for rent and serve more than 45,000 people. We provide a range of opportunities for our customers, such as training, employment and access to learning. We partner with local communities to make our neighbourhoods better places to live.

www.familymosaic.co.uk

Contents

Summary  3
1 The rise of satisfaction  4
2 Methodology  6
3 Three things matter  8
4 Delving deeper  12
5 The future of satisfaction  15
Appendix  18



Summary

In April 2012, we launched "Health, Wealth and Wellbeing", our manifesto for the future of social housing. It sets out a fundamental change to the relationship we have with our residents: we want to help people move on from social housing, or to support them if they can't. In this context, is resident satisfaction still a meaningful measure of how successful we are?

Over the last 15 years, resident satisfaction has been one of the key ways for housing associations to measure their operational performance, and to benchmark themselves against others. Yet there have always been questions about the validity of the measure: ours was never a traditional supplier-consumer relationship, and the answer to a question often depends on how, where and when you ask it. As consumers, residents never had much power or choice. As suppliers, we had little control over pricing or differentiation. The relationship was simple: we provided our residents with a home; they paid us rent.

The introduction of fixed term tenancies and affordable rent means this relationship is now changing. As well as providing residents with a home, we want to enable them to move on from social housing. This entails supporting them back into work, helping them lead healthier lives and giving them opportunities to own their home, so we can help the next generation. What relevance does resident satisfaction have in measuring this new relationship?

To find out, we analysed in depth the data on 3,000 of our tenants to discover what drove satisfaction. The findings weren't particularly surprising:
• repairs have the biggest impact;
• poor responsiveness drives dissatisfaction;
• areas with a higher concentration of our properties have higher satisfaction rates;
• residents only engage with us when they need something, like a repair.

Our new relationship is predicated on residents having to engage with us in a different way. We know from working with support clients that when we first interact with them there is usually a decline in their wellbeing, as they become conscious of their problems. We also know that leaseholders tend to be less satisfied. So as our relationship with our residents evolves, we want them to be more demanding and more aspirational. At times this might mean they are less happy with us.

The old approach to measuring resident satisfaction won't be sufficient. We need a new approach and a new set of tools. This isn't just about tweaking performance indicators or asking a different set of questions. It's about a completely new mindset. We need to develop a more sophisticated index incorporating a range of measures around health, wealth and wellbeing. Some might be purely numerical, like the number of residents who have moved in, or the number who have moved on. We might be able to express others in monetary terms: we estimate, for example, that our social and financial inclusion activities produce a return of over £7 million every year. And the index should still include satisfaction levels with our basic services.

This new index will take time to evolve, so that it can measure not just how good we are operationally, but how well we are enabling our residents to be more demanding and more aspirational. Only then will we be able to determine how successful we really are.



1

The rise of satisfaction

The use of resident satisfaction as a measure within social housing can largely be traced back to the promotion of the theory of consumerism in the public sector in the 1980s. It was intended to remove the paternalistic, top-down delivery of services: as residents had no real choice over their landlord, the theory was that resident satisfaction would enable them to hold their housing association to account. Some commentators, for example Furbey and Goodchild, have argued that the measurement of resident satisfaction actually started in the 1960s, in the context of optimising the architectural design of housing estates.

Most agree, however, that it wasn't until the 1990s, and the introduction of Charter Marks, that resident satisfaction surveys became embedded within the social housing sector. The Best Value regime in 1997 further entrenched the central importance of resident satisfaction. Two years later, a standard survey model – known as STATUS – was commissioned by the Government and the National Housing Federation. The survey was designed to create a level playing field, to be relevant to every housing association, regardless of size or type. It had to be completed every three years by English housing associations with more than 1,000 homes.

A year after its introduction, housing associations were required to collect their survey scores and submit them to government. These showed the percentage of residents who were satisfied or very satisfied with overall landlord services, and with the opportunities they had to participate in decision-making regarding service delivery.

In 2011, after a refocusing of regulation away from consumer services, STAR (Survey of Tenants and Residents) was launched by Housemark. While this retained the principles of STATUS, it could be argued that it is less representative than its predecessor.

Commentators within the social housing sector have highlighted a number of limitations and frustrations with both the STATUS and STAR surveys. First, the approach is impressionistic: it provides a snapshot of a limited number of tenants at a specific moment in time, and shows no trend beyond a single reading every three years. Secondly, it can be affected by a range of other factors, many of which might be completely unconnected with the performance of the landlord. One study showed a strong inverse relationship between overall satisfaction with landlord services and the incidence of local deprivation, ethnic fractionalisation and council housing as a proportion of local housing stock.

Thirdly, the survey questionnaires are often too long, which tends to produce what statisticians call the central tendency effect: by the time tenants reach the 30th question, they tend to give the same score again and again, without really considering the question.

Fourthly, there is the use of postal surveys: people tend to respond to postal surveys when they have something to complain about, or when they have the time to complete the questionnaire. A further methodological issue is the scales used: one person's "fairly satisfied" might be the same as another person's "very satisfied". And, finally, the surveys are expensive: some landlords spend up to £50,000 per survey on postage, staff time and external assistance.

Despite these concerns, resident satisfaction surveys are embedded within the social housing sector. They are used by some to determine bonus payments to staff, while others use them as a means of benchmarking against other housing associations. The importance attached by housing associations to resident satisfaction can also be seen in the values they espouse publicly. At Family Mosaic, one of our values is being dedicated to our customers. We are not alone amongst the G15 group of leading London housing associations in professing such a dedication to customer service (see figure 1).

The nature of our relationship with our residents is, however, changing. We will still be dedicated to them. We will still provide them with a home. But we are now also supporting people into work, encouraging them to lead healthier lives and enabling them to buy their own homes. With the dawn of this new era of social housing, does resident satisfaction still work as a reliable metric on its own: does it tell us how satisfied our residents are? Or do we need a new approach to measure this new relationship?

Figure 1: A selection of G15 value statements
• Maximising resident satisfaction with our homes, services and neighbourhoods
• We will... deliver what we promise: excellent customer services
• We believe in putting the needs of our residents first
• Services our customers value
• We are committed to putting our customers first
• Circle Housing puts our customers and our people at the heart of everything we do
• We put our customers at the heart of everything we do
• We aim to deliver outstanding customer service
• Customer focus: putting the customer first
• Our residents are at the heart of everything we do
• Passion for customer services

Value statements taken from the relevant organisations' websites.



2

Methodology

We wanted to discover how satisfied tenants were with us, and what was driving both satisfaction and dissatisfaction. So every month for a year, we phoned 250 tenants and asked them approximately 40 questions each, before collating the data for use in this report.

The age, ethnicity, gender, location and type of property lived in by these 3,000 respondents are shown on the next two pages. They are representative of the age, ethnicity, gender, location and type of property of all Family Mosaic tenants.
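As a rough sense-check of what samples of this size can and cannot show, the margin of error for a simple random sample can be approximated from the sample size alone. The short Python sketch below illustrates the calculation; the 3,000 and 250 figures come from this methodology, while the 50% proportion and 95% confidence level are conventional worst-case assumptions rather than anything specified in our survey design.

import math

def margin_of_error(n, p=0.5, z=1.96):
    # Approximate margin of error for a proportion from a simple random sample.
    # p = 0.5 is the worst case (widest interval); z = 1.96 gives 95% confidence.
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(3000) * 100, 1))  # full year, 3,000 respondents: about +/- 1.8 points
print(round(margin_of_error(250) * 100, 1))   # a single month of 250 respondents: about +/- 6.2 points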

Figure 2: Age range of respondents (bands shown: under 35, 35-54, 55-74, over 80, refused; percentages not reproduced here)

Figure 3: Ethnicity of respondents (categories: White British, Black / Black British, Asian / Asian British, White Irish, Other White, Mixed, Chinese, Other, Refused; percentages not reproduced here)


Figure 4: Gender of respondents (a 68% / 32% split)

Figure 5: Where the respondents lived: North London 46%, South London 45%, Essex 9%.

Figure 6: Type of property respondents lived in: 60% were living in a flat, 25% in a house, 7% in a maisonette and 8% in other accommodation (a shared flat or house, bedsit, bungalow or basement).


3

Three things matter

We started by asking respondents how satisfied or dissatisfied they were with our overall service. Over the 12 months, just over 81% were satisfied, under 12% were dissatisfied, and 7% gave a neutral response: they were neither satisfied nor dissatisfied.

To find out what was driving this dissatisfaction, we asked each respondent who had said they were dissatisfied to tell us why. Repairs was consistently cited as the reason for this dissatisfaction: over 60% of respondents said repairs were the reason they were dissatisfied.

Figure 7: How satisfied overall are you with Family Mosaic? Satisfied 81%, neutral 7%, dissatisfied 12%.

Figure 8: Why were you dissatisfied? Repairs 60%, other 18%, ASB 9%, cleaning & gardening 7%, service charges 2%, rent 2%, transfers & moving home 2%.


To confirm this was the case, we asked tenants what issues had made them get in contact with us in the previous three months. Of those respondents who had contacted us, 46% cited repairs as the reason, followed by 17% who had been in contact with us about their rent, and 11% about ASB issues.

As people could choose more than one answer to this question, we then asked which of these issues was the main or most important issue to them. Of those who cited a reason, 55% said repairs. It’s clear repairs is the main issue driving dissatisfaction, and the main reason our residents contact us.

Figure 9: What did you contact us about? Repairs 46%, rent 17%, ASB 11%, cleaning & gardening 10%, other 8%, transfers & moving home 5%, service charges 2%.

NB: only those respondents who had been in contact with us are included (41% had not been in contact with us). Respondents could choose more than one reason for being in contact with us.

Figure 10: Which of these is the main or most important issue to you? Repairs 55%, ASB 14%, rent 11%, transfers & moving home 8%, cleaning & gardening 6%, other 3%, service charges 3%.

NB: only those respondents who gave an answer are included in this graph (42% of the original respondents said that none of the issues were important to them). Respondents could only choose one answer.



Having established that repairs caused the biggest variance in resident satisfaction, our second discovery concerned the impact of how well we respond to enquiries. Using data over a two year period, we looked at the answers given by dissatisfied respondents to an open-ended question about why they were dissatisfied.

On average, there were about 60 responses each month. There were nine recurring issues, including transfers, rent, estates, staff attitude and ASB. The reason cited most often, in nearly every month, was responsiveness. Regression analysis showed that each time responsiveness was cited as a problem, dissatisfaction increased by 1.2%.
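The 1.2% figure comes from regressing dissatisfaction against how often responsiveness was cited each month. The Python sketch below shows one way such a fit could be run; the monthly figures are invented for illustration and are not our survey data.

import numpy as np

# hypothetical monthly data: citations of responsiveness and the percentage dissatisfied
responsiveness_citations = np.array([12, 25, 43, 30, 18, 22])
pct_dissatisfied = np.array([10.1, 11.5, 13.8, 12.2, 10.8, 11.3])

# least squares line: the slope estimates the change in dissatisfaction
# for each additional citation of responsiveness
slope, intercept = np.polyfit(responsiveness_citations, pct_dissatisfied, 1)
print(round(slope, 2), round(intercept, 2))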

Figure 11: Factors driving dissatisfaction (monthly counts of reasons cited, February 2012 to January 2014, covering responsiveness, contact, transfers, miscellaneous, reliability, attitude, estates, rent and ASB). The number of people citing responsiveness peaked at 43 in January 2013; December 2013 was the only month when responsiveness was not the most cited reason.


The third area we examined was locality: did size, or proximity to the resident, matter? To find out, we surveyed 200 residents from one of our subsidiaries, Charlton Triangle Housing, and compared the results with those from our general needs residents only. We found that overall satisfaction levels were 14% higher amongst Charlton Triangle residents.

When we examined the reasons behind these residents' dissatisfaction, the findings were similar: repairs and responsiveness were the main reasons cited. So why was there such a difference in overall satisfaction levels? Examining the data, one difference emerged: Charlton Triangle residents tended to be either satisfied or dissatisfied; fewer said they felt neutral. When we looked at the ratings for our repairs service, however, the satisfaction levels were similar, and the number saying their enquiry had been dealt with first time was comparable.

The figures differed most substantially when it came to contact: more Charlton Triangle residents had either been visited by a housing officer or had visited their local office. Similarly, more Charlton Triangle residents said their issue had been resolved. And, perhaps unsurprisingly, more felt that their landlord listened to them and acted accordingly, and was able to solve their problems.

This shows the importance of a visible repairs service, as well as a local office with staff who are easily accessible. It also demonstrates the benefits of providing a repairs service to an estate of similar homes, each with standardised parts, compared with delivering a repairs service to a wide variety of properties across London and Essex.
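Because the Charlton Triangle sample was only 200 residents, it is worth asking whether a 14 percentage point gap could simply be sampling noise. The sketch below applies a standard two-proportion z-test, taking the 81% overall satisfaction figure as the baseline and 95% for Charlton Triangle; the exact general needs baseline may differ slightly, so treat the numbers as illustrative. A gap of this size, on these sample sizes, sits well beyond the 1.96 threshold for 95% confidence.

import math

def two_proportion_z(p1, n1, p2, n2):
    # z statistic for the difference between two independent sample proportions
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

print(round(two_proportion_z(0.95, 200, 0.81, 3000), 1))  # roughly 5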

Figure 12: Family Mosaic vs Charlton Triangle (percentage point differences between Family Mosaic customers and Charlton Triangle customers)
• Satisfied: +14%
• Dissatisfied: +6%
• Neutral: +8%
• Satisfied with repairs: +4%
• The first person was able to deal with my enquiry: +3%
• Had a visit from a housing officer in the last three months: +10%
• Have visited a housing office in the last three months: +24%
• My issue has been resolved: +20%
• My landlord listens to and acts upon my views: +11%
• Satisfied with the way my landlord solved my problems: +15%



4

Delving deeper

Our analysis was telling us that repairs had the most impact on resident satisfaction. We also discovered that poor responsiveness caused dissatisfaction, and that where we had a higher concentration of properties, and closer proximity of local offices, there were higher satisfaction rates. Having established these reasonably self-evident truths, we wanted to delve deeper into the data to see if we could discover more about what drove resident satisfaction.

Our first step was to use regression analysis. This is a statistical tool that measures the relationship between two variables, producing a correlation score between +1 and -1: a score close to +1 indicates a strong positive relationship, a score close to -1 indicates a strong negative relationship, and a score of 0 shows there is no relationship between the two variables.

For the purpose of this research, we wanted to measure the relationship between satisfaction as one variable and age, ethnicity, gender and length of tenure as the other variables. When we ran the regression analysis, none of these variables stood out as having a positive or negative impact on satisfaction (see figure 13). When we look at the relationship between satisfaction and more than one factor, however, correlations do emerge. Take BME female residents: we know that as a group they have an above average level of dissatisfaction with our repairs service. When we mine the data, we find most female BME tenants are in full or part-time employment. Further analysis shows their dissatisfaction was not with the repairs service itself, but with the times work was scheduled. The solution is to offer repairs appointments in evening and weekend slots.
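The scores plotted in figure 13 are correlation coefficients of exactly this kind. The Python sketch below shows how they can be produced from a survey extract; the column names and values are hypothetical stand-ins for our 3,000 responses, not the actual dataset.

import pandas as pd

# hypothetical survey extract: one row per respondent
df = pd.DataFrame({
    "satisfaction": [5, 4, 2, 3, 5, 1],   # e.g. 1 (very dissatisfied) to 5 (very satisfied)
    "age": [34, 52, 47, 68, 29, 55],
    "tenure_years": [2, 11, 6, 20, 1, 9],
    "female": [1, 0, 1, 1, 0, 1],          # gender coded 0/1
    "bme": [0, 1, 1, 0, 0, 1],             # ethnicity coded 0/1
})

# Pearson correlation of satisfaction with each characteristic;
# values near zero, as in figure 13, indicate little linear relationship
print(df.corr()["satisfaction"].drop("satisfaction").round(3))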

Figure 13: Regression analysis: how age, gender, ethnicity or length of tenure correlates to satisfaction (scale from -1 to +1). Correlation between satisfaction and: gender -0.028; length of tenure as a tenant -0.005; age -0.048; ethnic origin -0.094.


The second statistical tool we used was trend analysis: this approach helps us understand trends over time, rather than reacting to peaks and troughs in satisfaction levels. One key discovery from trend analysis concerned the so-called neutrals.

When we first reviewed the data, we identified a group of hard-core dissatisfied residents. Respond to them, the argument went, and your overall satisfaction score would improve. Trend analysis, however, showed this was not necessarily the case: the satisfaction trend for neutrals mirrored that of the satisfied group (see figure 14).

So if we wanted higher satisfaction levels, we should focus on this group of neutrals. Focusing on the dissatisfied residents would not change the overall score: there would always be a core group of residents who were dissatisfied.

So what do neutrals want? In the main, they reported shorter-term, transient problems, like estate issues, cleaning and low level neighbour disputes. Get these basic services right, and satisfaction levels might improve.

This led us to think about the nature of the relationship residents want with us: is it solely about us being responsive and fixing problems as and when they arise? Or do they want more?

A useful analogy here is Disneyland. When you visit Disneyland, you won’t be satisfied if the rollercoaster rides aren’t working or your child’s favourite Disney character makes them cry. During your stay at the hotel, if the heating is broken or your bedroom isn’t cleaned, you’ll also be disappointed. As a social housing landlord, this is the equivalent of what we do: we maintain the heating and clean the rooms.

Figure 14: Trend analysis: how neutrals mirror satisfied residents (satisfaction score over time for the satisfied, neutral and dissatisfied groups).
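Trend analysis of the kind behind figure 14 can be as simple as smoothing each group's monthly share of responses and comparing the resulting lines. The sketch below uses a three month rolling average; the monthly percentages are invented for illustration.

import pandas as pd

# hypothetical monthly shares of respondents in each group (per cent)
monthly = pd.DataFrame({
    "satisfied":    [80, 82, 79, 83, 81, 84, 80, 82],
    "neutral":      [7, 8, 6, 8, 7, 9, 7, 8],
    "dissatisfied": [13, 10, 15, 9, 12, 7, 13, 10],
})

# rolling averages smooth out single month peaks and troughs, making it easier
# to see whether the neutral line tracks the satisfied line over time
print(monthly.rolling(window=3).mean().round(1))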



For many years, we’ve invested money in the equivalent of the Disneyland rides. We’ve promoted them with glossy brochures. Our research, however, continually highlighted how residents only contact us when the basics go wrong – when the heating doesn’t work or the rooms haven’t been cleaned.

A lot of research around customer satisfaction has loyalty at its core: the assumption has been that there is a relationship between customer loyalty and satisfaction. A paper published in the Harvard Business Review in 2010, however, argued that the relationship is actually about the level of customer effort required to resolve their enquiry.

Our customers have no choice. They cannot choose another supplier. So how does this issue of customer effort, or access to services, apply to us? Do we need a metric that measures basic provision with minimal effort?

To demonstrate the importance of access to services, we ran some further regression analysis. This time we used satisfaction as the dependent variable and a number of current issues as the other variables, such as whether the respondent was working, the number of spare bedrooms and access to services.

One correlation stood out: access to services. Defined as access from initial contact with us until final resolution of the issue, this variable was strongly correlated with satisfaction. This confirmed our previous findings: when tenants contact us, they want us to respond quickly to their enquiry. And, from our previous analysis, the reason for that contact is most likely to be about repairs.
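One way to run that kind of analysis with several explanatory variables at once is an ordinary least squares fit, sketched below with numpy. The variable names mirror those mentioned above, but the data are invented and the approach is a plausible reading of the analysis rather than a record of exactly how it was run.

import numpy as np

# hypothetical respondent-level data
satisfaction       = np.array([5, 2, 4, 1, 5, 3, 4, 2], dtype=float)
working            = np.array([1, 0, 1, 0, 1, 1, 0, 0], dtype=float)
spare_bedrooms     = np.array([0, 1, 0, 2, 0, 1, 0, 1], dtype=float)
access_to_services = np.array([5, 2, 4, 2, 5, 3, 4, 1], dtype=float)  # ease of access, 1-5

# design matrix with an intercept column
X = np.column_stack([np.ones_like(working), working, spare_bedrooms, access_to_services])
coefs, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)
print(dict(zip(["intercept", "working", "spare_bedrooms", "access_to_services"], coefs.round(2))))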

In addition, our relationship with our residents is now changing. To continue the Disneyland analogy, we are no longer just fixing their heating or cleaning their rooms. We are supporting them so they can get to Disneyland in the first place. And the current approach towards measuring resident satisfaction levels won’t tell us whether or not we’re successful in helping them to get there.

Figure 15: Regression analysis: how work, bedrooms, spare rooms or access to services correlates to satisfaction (scale from -1 to +1). Correlation between satisfaction and: whether the respondent is working 0.101; access to services 0.618; number of bedrooms 0.015; spare rooms in respondent's property 0.059.


5

The future of satisfaction

The current approach we use to measure satisfaction tends to reflect whether our residents are satisfied with how quickly we respond when they have a problem – and usually this problem involves repairs. In essence, it measures our operational performance only. Can this, though, suffice for the future?

Our relationship with our tenants is no longer predicated on our just providing them with a landlord-based service. The introduction of five year fixed term tenancies has changed this relationship fundamentally. Most tenants on fixed tenancies will have to engage with us, not because of a fault that needs repairing or a noisy neighbour, but as part of their tenancy agreement.

Following an initial assessment with us at the start of their tenancy, we will develop a tailored support plan for them, so we can support them into work and off benefits. This plan might include one-to-one employment support, specialist training courses or signposting to other services.

When we first engage with people in supported housing there tends to be a dip in their wellbeing, as they become aware of their problems. We expect this might also happen with tenants on fixed term tenancies if they have to engage with our employment service. If we are more demanding about our tenants finding work, this might increase their dissatisfaction with us.

Figure 16: How the new five year fixed tenancy works. The five year fixed tenancy starts; at the year 1 assessment, an initial assessment is made and a tailored support plan is developed to support the tenant into work and off benefits. The plan is assessed regularly and might include: employment support and training courses; specialist training with other agencies; a volunteering placement; and signposting to other services. A year 4 review follows. When the five year fixed tenancy finishes, the tenant may move into home ownership, move on to other accommodation, have the fixed term tenancy renewed for five years, be granted a lifetime tenancy if circumstances have changed, or have the tenancy not renewed if there has been a breach of tenancy.



At the end of the five year fixed tenancy, there are a number of potential outcomes for our new tenants. One involves them having the means to move on from social housing and into shared ownership. They could then continue to have a relationship with Family Mosaic as leaseholders. We know leaseholders tend to be less satisfied than tenants by as much as 20%. Yet this is what we hope many of our tenants will aspire to, and become, once their fixed term tenancy has finished.

The reality is that as our relationship with our customers evolves over this five year period, we will want them to be more demanding of us, and more aspirational about their own lives. Inevitably, however, this might mean they will be less happy with us. The old approach towards measuring resident satisfaction won't tell us how successful we are being. Being a successful housing association won't just be about having an efficient repairs service.

Of course, if we just wanted a good satisfaction score, there are ways to achieve one. One approach, as reported by academics from Heriot-Watt University in a research paper from 2010, is to interpret the survey guidance very flexibly. This might, for example, involve using a sampling methodology that isn't representative of your tenants.

To illustrate how the headline result can be massaged, during this research we decided to reduce the number of questions we asked our survey respondents. Instead of the usual 35 questions, we asked them just three. The results showed an immediate impact: our overall satisfaction score rose by 10%.

So one approach we could take would be to contact 200 people every three months – a survey sample that provides us with a sufficient confidence level – ask them these three questions, and then report these probably inflated scores publicly. This might work on a PR level, but it wouldn't actually measure what we do, and it wouldn't provide us with the depth of data we can use to improve our services. We need a new approach, and a new set of tools, to measure customer satisfaction and customer dissatisfaction. This isn't just about tweaking the performance indicators or asking a different set of questions. It's about a new mindset. That doesn't, however, mean we no longer need the old measure: we need to retain it, if only as a corrective to the tendency towards paternalism that can creep into any support service. What we need is to use it as part of a suite of measures: in essence, what we're proposing is a more sophisticated index.

To reflect our manifesto pledges within this satisfaction index, our starting point might be to look at measures around:
• health;
• wealth;
• wellbeing;
• basic services.

On pages 18-19, we have illustrated what these measures might include. For some, we have indicated monetary values, using some of the headline wellbeing factors outlined in HACT's report, Measuring the Social Impact of Community Investment (March 2014). As this approach is relatively new, we have had to use data from our social and financial inclusion activities to estimate the overall value of this work, rather than adopting the approach recommended by the report's authors. Nonetheless, the results are impressive: for the first nine months of the financial year, the figures we have used equate to a total of just over £5.3 million. Extrapolated, this represents an annual value of over £7 million.

For some of the other proposed measurements in this new index, we do not yet have a monetary value: for example, the number of people reporting improved health, the number participating in our community health initiatives, and the number who have moved on from our homes. We would also suggest that the new index includes measures such as the number of people who have moved into home ownership, as well as the number who have not had to return to social housing.

The index should also include measures relating to our basic services: how satisfied our residents are with our repairs and our overall services. As well as this headline number – which is currently 81% – we might also look at developing a value added score for residents on fixed term tenancies. This might, for example, involve measuring the improvement in their satisfaction levels over the five year period.

There is much to think about, and much to debate: this new resident satisfaction index will take time to evolve. In time, though, it should be able to show not just how good we are operationally, but how demanding, aspirational and, at times, dissatisfied our customers are with us. Only then will we be able to measure how successful we really are.
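The step from the nine month total to the annual figure is a straight pro-rating, as the short calculation below shows.

nine_month_total = 5_300_000            # just over £5.3 million for the first nine months
annual_estimate = nine_month_total / 9 * 12
print(f"£{annual_estimate:,.0f}")       # a little over £7 million a year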

Figure 17: How a satisfaction index might look: an overall health score, an overall wealth score, an overall wellbeing score and an overall basic services score.
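One possible way to combine the four overall scores in figure 17 into a single headline number is a weighted composite, sketched below. The sub-scores and equal weights are placeholders for illustration only; this report does not prescribe how the components should be weighted or combined.

# hypothetical sub-scores, each normalised to a 0-100 scale
scores = {"health": 62, "wealth": 58, "wellbeing": 70, "basic_services": 81}

# placeholder weights; in practice these would need to be agreed and kept stable over time
weights = {"health": 0.25, "wealth": 0.25, "wellbeing": 0.25, "basic_services": 0.25}

index = sum(scores[k] * weights[k] for k in scores)
print(round(index, 1))   # the headline satisfaction index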



Figure 18: Possible measures, and monetary values, in the satisfaction index

Health measures
• 292: number of people attending health initiatives
• 575: number of health interventions
• 6,990: number of people receiving care and support services
• 3,011: number of people benefitting from the home improvement service
• £862,000: annual savings made for the NHS (estimate)

Wealth measures
• £2,648,682: value of helping 246 people into full time employment
• £642,082: value of 726 residents into training, plus 50 into vocational training or work placements
• £569,154: value of 318 people into volunteering
• £62,055: value of 35 young people into volunteering
• £85,603: value of 49 apprenticeships

Wellbeing measures
• £317,475: value of 225 people engaged in gardening activities
• £61,145: value of 35 improved green spaces
• £101,200: value of 44 young people into citizenship projects
• £105,800: value of 46 young people onto the Duke of Edinburgh award scheme
• £170,000: value of NOSP avoidance savings
• £57,627: value of supporting 9 ASB cases
• £159,477: value of lump sum gains for residents from welfare rights advice
• £263,984: value of 28 residents being relieved from being heavily burdened with debt

Basic services measures
• 81%: resident satisfaction
• +9.8%: average value added score for fixed term tenants
• £241,875: value of training 129 people in an improved digital skills course
• 2,104: number of new residents
• 882: number of new homes built
• 2,351: number of residents who have moved on
• 5,412: number of residents who moved on in the previous three years and have not returned to social housing (estimate)

All monetary values, excluding NOSP avoidance savings, lump sum gains for residents and annual savings for the NHS, have been calculated using values contained in HACT's Measuring the Social Impact of Community Investment.



For further information contact Joanna Birch: T 020 7089 1046 M 07960 821 007 E Joanna.Birch@familymosaic.co.uk

Credits Original research by Dr Rob Wray Edited and designed by Matthew Grenier


