New Look Coming soon… The Quality Observatory, NHS South East Coast quality.observatory@southeastcoast.nhs.uk nww.sec.nhs.uk/qualityobservatory
INSIDE THIS ISSUE
2 Annual Health Check Prediction Tool
3 Trim-points
4 Sample sizes
5 Sussex Commissioning Data Warehouse
6 Making Links – the National Innovation Centre
8 SPC in practice
9 Improving GP Data Quality in Surrey
10 Finalising your Dashboard (3)
11 A : Ask an Analyst
12 Importance of NHS Number
13 Indicators for Quality Improvement
14 News & quick quiz
15 SUI Dashboard
16 Hellos and goodbyes
Fascinating Fact Remember to get your factor 50 out in this hot weather - last year in South East Coast 10 people were hospitalised due to sunburn!
June 2009 Volume 3 Issue 2
Welcome to Knowledge Matters By Samantha Riley
Over the past few weeks we have for the first time undertaken some comparative analysis to evidence the innovative practice taking place in one of our Trusts and demonstrate the variation that exists across Kent, Surrey and Sussex. Having read Candy's 'Start the Week' bulletin at the beginning of June when she highlighted the work of Mr Hugh Apthorp and his team at the Conquest Hospital, we thought it would be interesting to see what the data said.

For some time, the team at the Conquest Hospital have been pioneering short stay hip replacement surgery for suitable patients (approximately 50% of cases). Using innovative surgical and anaesthetic techniques, superb pre- and post-hospital planning, and with the close cooperation of patients, the whole process can shorten hospital stays to just an overnight stay compared to the normal five to seven days.

Having used a range of techniques to analyse length of stay data (including statistical process control) to a variety of levels (Trust, hospital site and individual consultant), we have been able to evidence the extent of the variation in individual practice which is considerable. Mr Apthorp has the lowest average length of stay of orthopaedic surgeons across South East Coast. At the other end of the spectrum, one consultant has an average length of stay which is five times that of Mr Apthorp (which may of course be related to a significantly different case-mix). There is of course a wide spectrum between these two – the challenge is how this variation can be reduced and productivity increased. By the way, we have also analysed information on mortality, readmissions and complications which has shown nothing untoward.

In the next edition, we will cover this example in more detail. I think though it is safe to say that we believe that well presented, comparative analysis will play a major role in spreading innovative practice across Kent, Surrey and Sussex.
We are going to apply the same approach to a number of the NHS Institute's High Volume Care Series – keep reading Knowledge Matters for updates on our progress. The clinical leads for the Healthier People, Excellent Care pathways have now been appointed, so a key priority for the Quality Observatory going forward will be working with the clinical leads to ensure that they have a meaningful set of indicators and analyses. We have already provided some input to the Clinical Leaders Network in terms of measurement and explaining the added value that a Quality Observatory can provide. In addition to this, we are currently developing a 'De-mystifying Data' course specifically aimed at a broad range of clinicians which we will deliver on a regular basis from late summer. Details and dates will be available via our website and also of course advertised in Knowledge Matters.

Enjoy the sunshine! Samantha Riley
Knowledge Matters
Page 2
Annual Health Check Prediction Tool Rebecca Owen, Performance and Planning Analyst
The Annual Health Check scoring tool has been developed to help organisations predict the Quality of Services component of the 2008/09 Annual Health Check. Currently the tool is only set up for use by PCTs and Acute Trusts, as so many of the thresholds relating to Ambulance and Mental Health Trusts are still to be confirmed. To use the tool, the organisation type (PCT or Acute Trust) will first need to be selected. This will ensure that the relevant data and summary sheets are visible. Data for the Core Standards, Existing Commitments and National Priorities will then need to be entered into the 'Core Standards' and 'Data Input' sheets. Data should be entered where cells are shaded light green (as shown in the screen shots). Information regarding the indicator thresholds is contained within the tool, so once all of the data sheets have been populated the score for each threshold will be calculated. The final Quality of Services score will then be shown on the 'Final Score' sheet.
There are still a number of indicators for PCTs and Acute Trusts where the thresholds are yet to be finalised. In these cases assumptions have had to be made to allow a final score to be predicted. All of the assumptions made are detailed on a separate sheet within the tool. In general, where the indicator was part of last year’s Annual Health Check these thresholds have been used; where the construction of the indicator is similar to another the methodology applied to that indicator has been used; in other cases, thresholds and targets detailed in the Vital Signs technical guidance or VSMR guidance have also been applied to the Annual Health Check indicators where appropriate. For a small number of indicators it has not been possible to determine a sensible threshold to use. In these cases, organisations are asked to enter an expected score directly onto the summary sheet. This estimated score will then be taken into account in the overall calculations.
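As an illustration of the kind of scoring the tool automates, the sketch below scores a single indicator against a pair of thresholds, returning the three-way result used in the Annual Health Check. The function, the threshold values and the example figures are all hypothetical; the real thresholds are built into the spreadsheet (or, where unconfirmed, covered by the assumptions sheet described above).

```python
# Hypothetical sketch of per-indicator threshold scoring.
# The real thresholds live inside the spreadsheet; these values are invented.

def score_indicator(value, underachieved, achieved, higher_is_better=True):
    """Score one indicator against a pair of thresholds.

    Returns 'achieved', 'underachieved' or 'failed' - the three-way
    scoring used for Annual Health Check indicators.
    """
    if not higher_is_better:
        # Flip the scale so the same comparisons work for
        # indicators where a lower value is better.
        value, underachieved, achieved = -value, -underachieved, -achieved
    if value >= achieved:
        return "achieved"
    if value >= underachieved:
        return "underachieved"
    return "failed"

# e.g. an indicator where 90%+ counts as achieved, 80-90% as underachieved
print(score_indicator(92.5, underachieved=80, achieved=90))  # achieved
print(score_indicator(85.0, underachieved=80, achieved=90))  # underachieved
```

The tool then rolls the individual indicator scores up into the overall Quality of Services score.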
The Annual Health Check tool is now available to download from our website and can be found in the downloads section under 'Annual Health Check'. Please do have a look at it and let me have comments and suggestions to make this as useful as possible.
If you have any comments or queries regarding this, please contact Rebecca Owen: rebecca.owen@southeastcoast.nhs.uk or log on to nww.sec.nhs.uk/qualityobservatory Do you have something you would like to contribute to Knowledge Matters? Please contact us!
HRG trim-points: what are they, how are they calculated, and how are they used?
Simon Berry, Specialist Information Analyst

Trim-points define the length of stay in hospital beyond which a spell is considered an "outlier" for its Healthcare Resource Group (HRG) and not representative of the clinical resources normally required to treat that condition. Typically, patients staying in hospital longer than the trim-point will include patients with delayed discharge or multiple comorbidities. Under Payment by Results, trusts are entitled to be reimbursed for days beyond the trim-point at a nationally determined daily rate – in the case shown below, £198 per day. Trim-points are calculated and published each year along with the national tariff. They are calculated individually for each of the one thousand or so HRGs in the national tariff and vary from, for example, just four days for normal maternity to over a hundred days in the case of a surgical amputation. They are calculated separately for elective and non-elective methods of admission, and change from year to year, significantly in the case of low-volume HRGs. The formula used for calculating each trim-point is a relatively simple piece of arithmetic, drawing on the distribution of length of stay, a worked example of which is shown below.
[Figure: Trim-points, an illustration. HRG SA01F (aplastic anaemia without complications), elective tariff £2,203; HRG H80 (Elective Primary Hip Replacement Cemented), tariff £5,176. The chart plots number of spells against length of stay in days: lower quartile stay 8 days, mean length of stay 10 days, upper quartile stay 12 days, giving an inter-quartile range of 4 days. HRG trim-point = upper quartile (12 days) plus 1.5 x inter-quartile range (1.5 * 4 = 6 days), so the trim-point is 12 + 6 = 18 days. Days beyond the trim-point are excess bed days.]

Reducing length of stay is a key part of managing and reducing clinical costs, hence the current interest in length of stay and how it should be calculated. HRG trim-points are one of a number of tools which are helpful in understanding and tracking changes in length of stay.
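The worked example above is simple enough to express directly. The sketch below (illustrative only; the published national tariff is the authoritative source for actual trim-points) reproduces the calculation, along with the trimming and excess bed day logic described in this article.

```python
# Trim-point arithmetic from the worked example:
# trim-point = upper quartile + 1.5 x inter-quartile range.

def hrg_trimpoint(lower_quartile, upper_quartile):
    """Trim-point in days, from the length-of-stay quartiles."""
    iqr = upper_quartile - lower_quartile
    return upper_quartile + 1.5 * iqr

def trimmed_los(spell_los, trimpoint):
    """'Trim' a spell's length of stay back to the trim-point,
    reducing (but not eliminating) the influence of outliers."""
    return min(spell_los, trimpoint)

def excess_bed_days(spell_los, trimpoint):
    """Days beyond the trim-point, reimbursed at the daily rate."""
    return max(0, spell_los - trimpoint)

# Worked example: lower quartile 8 days, upper quartile 12 days
tp = hrg_trimpoint(8, 12)                    # 12 + 1.5 * 4 = 18 days
print(tp)                                    # 18.0
# A 25-day spell: 7 excess bed days at the £198/day rate shown above
print(excess_bed_days(25, tp) * 198)         # 1386.0
```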
Length of stay is strongly influenced by the inclusion of long stays, which occur irregularly and can therefore mask underlying improvements in length of stay. Thus various methods of treating long stays have been adopted. The most common is the use of the HRG trim-point to reduce, but not necessarily eliminate, the influence of outliers by 'trimming' stays in excess of the trim-point back to the trim-point. There is a whole variety of ways in which length of stay can be calculated – some are better than others, and some measures and methods can be misleading (more on this next time). However, compared with more traditional methods of trimming, such as 'excluding all stays over 28 days', the HRG trim-point is sensitive to the clinical needs of patients and maps to Payment by Results. It is therefore a preferred methodology. As ever, if you have any queries, please do not hesitate to contact one of the team – we're here to help! Is there something that you wish you knew more about? To suggest future topics for Knowledge Matters contact the team
Sample sizes
Katherine Cheema, Specialist Information Analyst

Ever needed to carry out a patient, public or staff survey? Ever been worried about how many people you need to survey? In an ideal world we'd like to canvass everyone's opinions and experiences of the services the NHS provides, but with an average population of a little over 500,000 people per PCT it clearly isn't feasible! So, you have to take a sample; but how big should this sample be? Contrary to popular belief, a sample size does not necessarily need to be related to population size. The size of a sample needs to be determined on the basis of how accurately you want to be able to make statements about your whole population based on the results from your sample. So, if you want a high level of confidence that what your sample says about a particular service is applicable to the whole population, then you will need a larger sample size than if you are happy to accept a lower level of confidence.

The parameters you need to set to ensure your sample size meets your expectations are the confidence interval and the confidence level. Both sound similar but are in fact slightly different. When combined they describe very accurately the level of confidence you have in how well your sample represents the views of the whole population of interest. A confidence interval is the margin of error you are allowing; your sample is never likely to reflect the whole population 100%, but how far away from that 100% are you willing to go? The closer agreement you need, the smaller your confidence interval must be. If 100% agreement is what you need, then a census must be undertaken. A confidence level is the level at which you can be confident your results are within the specified margin of error. If you set the confidence level at 99%, you can be 99% sure that the whole population would return the same answers to a survey, within the confidence limits you have specified.

[Screenshot: Sample Size Calculator, with radio buttons for confidence levels of 90%, 95%, 98% and 99%, and a slider to set the confidence interval from smaller to larger.]
The confidence level tells you how sure you can be. It is expressed as a percentage and represents how often the true percentage of the population who would pick an answer lies within the confidence interval. The 95% confidence level means you can be 95% certain; the 99% confidence level means you can be 99% certain. Use the radio buttons to select the level of confidence you require; the lower the confidence level, the lower the recommended sample size. The confidence interval is the plus-or-minus figure usually reported in newspaper or television opinion poll results. For example, if you use a confidence interval of 5 and 50% of your sample says they are unhappy with a service, you can be "sure" that if you had asked the question of the entire relevant population, between 45% (50-5) and 55% (50+5) would have picked that answer. Use the slide bar to select the required confidence interval. Convention is usually 5 and the maximum value available in this calculator is 15. The higher the confidence interval, the lower the recommended sample size. When you put the confidence level and the confidence interval together, you can say, for example, that you are 95% sure that if you surveyed the whole population, between 45% and 55% would answer that they are unhappy with the service.
So, using a patient survey example, if you specify a confidence level of 95% and a confidence interval of 5%, you are 95% sure that the opinion of the whole patient population will be within 5% either side of the actual results, gained from the sample you have taken.
OK, sounds reasonable, but how do we calculate all that? Fear not, an Excel-based tool is available that will do all that for you. All you have to do is select your chosen confidence level in the green section, use the slidey bar to set a confidence interval in the yellow section, and enter your total population size in the pink section. Then, as if by magic, a recommended sample size is displayed in the blue section.

Population: enter your whole population size here (e.g. PCT population, number of inpatients etc.). Often you may not know the exact population size. This is not a problem: the mathematics of probability shows that the size of the population is irrelevant unless the size of the sample exceeds a few per cent of the total population you are examining. This means that a sample of 500 people is equally useful in examining the opinions of a population of 15,000 as it would be of 100,000. Population size is only likely to be a factor when you work with a relatively small and known group of people (e.g. the members of a specific patient group).

Recommended sample size: this is the sample size that will meet the confidence level and interval you have specified, relevant to the whole population size. When reporting results from a survey you may need to not only justify the choice of sample size but also give readers an idea of the range of numbers of the population who are likely to have answered in a certain way, and your level of confidence.

In the screenshot above, the tool is recommending a sample of 5,855 from a total population of 15,000 (that's 39%), which is quite high. Change the confidence interval to 5 and the sample size is calculated at 374 (a bit more do-able!).
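For the curious, the maths behind a calculator like this is compact enough to sketch. The snippet below uses the standard sample-size formula for a proportion, with a finite population correction; the spreadsheet's exact rounding conventions are an assumption (which may be why it reports 374 for the 95%/±5%/15,000 case, where rounding up gives 375).

```python
import math

def sample_size(confidence_level, margin, population=None, p=0.5):
    """Recommended sample size for estimating a proportion.

    margin is the confidence interval as a fraction (5% -> 0.05);
    p = 0.5 is the worst case and gives the largest (safest) answer.
    A sketch of the standard formula, not the spreadsheet's exact code.
    """
    # z-scores for the four confidence levels the tool offers
    z = {0.90: 1.645, 0.95: 1.96, 0.98: 2.326, 0.99: 2.576}[confidence_level]
    n = (z ** 2) * p * (1 - p) / margin ** 2
    if population:
        # Finite population correction: only matters when the sample
        # is more than a few per cent of the whole population.
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

# 95% confidence, +/-5% interval, population of 15,000
print(sample_size(0.95, 0.05, population=15000))  # 375
```

Note how little the population matters: with no population correction at all, the same settings give 385, barely different from 375, which is the point made in the Population note above.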
As ever, this tool is available on the Quality Observatory website (nww.sec.nhs.uk/qualityobservatory) and includes short on-screen guidance. If you have any queries, questions or requests for other tools, please do let me know (Katherine.cheema@southeastcoast.nhs.uk)
The Sussex Commissioning Data Warehouse Project
Colin Styles, Information Architect, Sussex HIS

The national 'World Class Commissioning' agenda requires that PCTs gain an understanding of their population's entire healthcare pathway, not just within but also across services. Although we already extract information from patient activity systems (PAS systems, for example), to date we have not attempted to link different activity datasets to form a complete (or 'holistic') picture of the patient care journey. The Commissioning Data Warehouse (CDW) has been developed to do just this, plus provide tools to understand this data by comparing patterns and experiences across areas. Phase one of development was completed in April. This phase built the framework, or 'skeleton', of the system, which consists of:
• Patient demographics (anonymised, see below)
• Patient key indicators – eg diabetes status, risk of admission
• Hospital events – inpatient, outpatient, A&E attendances
• Community clinics – eg physiotherapy
• Community contacts – eg District Nursing visits
We have also successfully piloted the inclusion of clinical system datasets, plus GP data from a cluster of Hastings practices (using the automated 'Apollo' data extract software), and it is planned to proceed with a wider roll-out of this data extraction. Information Governance has been a key aspect of the development. Data accessed within the system is 'pseudonymised' – this means that personal details about individuals are not accessible (although events for an individual can be linked using an internally generated identifier). We are now making this data available to PCT users, both via direct access and via a web-based 'event viewer', which allows the user to select a cohort of patients (eg by practice, age band, or illness) and view the care pathways for these patients. Stage two of the project will focus on adding value to this raw data for our users, and will prototype five areas:
• Frequent flyers – patients who exceed a level of service selected by the user (eg cost, number of attendances)
• Benchmarking – comparison of illness rates and admission risk by practice/locality
• Patient Pathways – analysis of patient events by care pathway (eg diabetes, COPD)
• Commissioner needs – costing of current and future healthcare provision
• QOF (Quality and Outcomes Framework for GPs) reconciliation – comparison of primary care, secondary care and nationally predicted prevalence of disease
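The pseudonymisation described above, where personal details are hidden but events for one individual can still be linked, is commonly achieved with a keyed hash. The sketch below is purely illustrative (the CDW's actual mechanism, key handling and identifier format are not described in this article): a secret key held by the warehouse turns an NHS number into a stable internal identifier.

```python
import hashlib
import hmac

# Illustrative only: the CDW's real pseudonymisation mechanism is not
# published here. A keyed hash (HMAC) maps an NHS number to a stable
# pseudonym, so the same patient always gets the same identifier, but
# the mapping cannot be reversed without the secret key.

SECRET_KEY = b"held-securely-by-the-warehouse"  # hypothetical; never released

def pseudonymise(nhs_number: str) -> str:
    digest = hmac.new(SECRET_KEY, nhs_number.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]  # shortened for readability

# The same (made-up) number always yields the same pseudonym,
# so a patient's events can be linked across datasets...
assert pseudonymise("9434765919") == pseudonymise("9434765919")
# ...while different patients get different pseudonyms.
assert pseudonymise("9434765919") != pseudonymise("9434765870")
```

This is also why the article distinguishes 'pseudonymised' from 'anonymised': the linkage is deliberately preserved, just not the identity.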
In addition to the above a number of wider uses of this dataset have been identified and are also being considered, for example to give GPs access to this complete view of their patients’ event history, as part of Practice Based Commissioning. In conclusion, CDW adds significant value to patient data by providing a holistic view of the patient journey, enabling comparison of current service usage and future needs. If you would like further information, please do contact me! (colin.styles@sussexhis.nhs.uk)
Innovate to Accumulate
Brian Winn, Head of the National Innovation Centre

Is there anyone in 2009 who doesn't have 'innovation' or 'quality' or 'productivity' somewhere in their job description? If not, they should! But before you dismiss these as the new buzz words of the moment, stop and consider - if only because everyone seems to be in agreement for once! NHS managers are emphasising that these elements are the only way to transform the NHS, Government ministers are hailing them as the solution to our present economic difficulties, and commerce is saying that it will support the Healthcare 'industry' in becoming a key source of future wealth for the UK.

But is this mantra new to the NHS? Although our world standing may have ebbed and flowed, the NHS has always been a by-word for quality, productivity and innovation, particularly in the eyes of other countries. Perhaps what we need to do is to bring these aims back into sharp focus. Although each of us will have a different contribution, the three aims are inseparable. To increase productivity, we need innovation that must produce quality products and services that, in turn, will increase productivity.

At the NHS National Innovation Centre (NIC), we embrace all three in equal measure. Set up in 2006 as a result of a joint DH/DTI policy, the NIC was charged with increasing the speed at which new technology reached the patient. Working in the field of medical devices and diagnostics to deliver innovations, we take our lead from the DH and work in equal part with industry and the NHS. The NIC supports innovators with a set of web-based 'tools' and other services to help them bring their ideas to market. We have two approaches to innovation – proactive and reactive:

Proactive approach
Because the NHS doesn't always articulate its priority 'needs', innovators are compelled to take a leap into the unknown and can't be sure their solutions address a problem or, at least, not the main problem.
Since innovation has most value when products and services meet a defined clinical need, the NIC developed a “Wouldn’t It Be Great If…” (WIBGI) approach that challenges teams of clinicians, academics and NHS staff, working in specific areas, to identify the priority ‘needs’ ie problems or deficiencies in their jobs. Once the clinical needs are agreed, the NIC scans the market to see if a product exists that will meet those needs. If not, it embarks on a tightly managed system to develop the solutions. The first step calls for users, designers and manufacturers to produce outline designs for potential devices or diagnostics. This proactive approach – getting out there and talking to the practitioners – has already yielded huge efficiencies in speeding up the time it takes to get products to the bedside or to people’s homes.
At the recent DH EXPO in London, the NIC ran a WIBGI session for the Ambulance service. The group, comprising clinical experts and people from industry, produced three key 'needs', together with potential solutions, that will now be taken forward by the NIC with the aim of turning them into market-ready products.
Reactive approach The NIC does, of course, receive hundreds of ideas that arrive, unprompted, from innovators. But the most successful are always from innovators who have a clear sense of need. One example is Cambridge-based company TwistDX, who had developed a method of amplifying DNA so that patients could be tested for MRSA in under 15 minutes before being admitted to hospital. The result was that anyone found to be carrying the infection could be isolated immediately, according to the hospital’s infection control procedures. TwistDX received business advice and funding from the NIC to develop their product. At the same time, companies approach the NIC with ideas that have potential but might need moderating or strengthening. The NIC offers advice to them via its online tools www.nic.nhs.uk and offline services. Lein Applied Diagnostics, a Reading-based SME, approached the NIC in 2007 with a prototype for a non-invasive blood glucose meter. A hand-held device would replace the finger stick methods currently used by people with diabetes to test blood glucose levels and, instead, would take a painless reading from the eye. Dan Daly, director of Lein, said: “I put the product through the NIC’s web-based ‘Scorecard’ to begin with, and carried out a confidential assessment myself of the device using the NIC’s rigorous tools. I then contacted the NIC and asked for further support. Their advice on markets, business strategy, legal requirements and arranging NHS introductions for trials was invaluable.”
[Photo: The National Innovation Centre demonstrates its unique innovator support tools via a 'Minority Report'-style wall at the EXPO conference.]
The NIC is currently developing further on-line refinements and off-line services that will support not only innovators but also the NHS Strategic Health Authorities who are in the front line to fulfil the quality, innovation and productivity agenda for the benefit of patients and UK plc. For further information, visit our website www.nic.nhs.uk
Our approach in South East Coast

Hello everyone. My name is Peter Houghton and I joined NHS South East Coast at the end of March as Director of Innovation. This means that I am responsible for the areas of innovation, service improvement and the Quality Observatory. I also chair the Innovation Leads Group at national level.

So, what's our approach to innovation in South East Coast? We are eager to support the successful exploitation of new ideas (whether these are technological or related to new ways of providing services) and are keen to identify the innovations which will have the biggest impact on quality and productivity within each of the Healthier People, Excellent Care pathways. There is some funding available to support innovation (the Innovation Fund) and we are currently finalising arrangements for accessing this fund. Needless to say, we will be focusing our service improvement resources to support the spread of innovation, and a key area of work for the Quality Observatory will be evidencing innovative practice and the variation in current practice across South East Coast. If you would like to learn more, please do contact me – peter.houghton@southeastcoast.nhs.uk
SPC in practice: development of a CDI monitoring tool
Kate Cheema, Specialist Information Analyst

Statistical Process Control (SPC) is widely considered a 'good thing', and there are plenty of guides and literature about that can help you understand why it is so useful and how to construct the right kind of SPC chart. But any complexities surrounding SPC should not take away from its fundamental message: to understand and improve a process you must first look at the variation within it. As the SHA moves into its role of promoting innovation, improvement and leadership, the use of information purely for judgement is becoming less appropriate. Instead we seek to assess where new or innovative practice can improve outcomes for patients and performance for organisations. With these ideas in mind, the development of an SPC-based tool for monitoring Clostridium difficile infections (CDI) seemed like a logical step. The SHA collects CDI information from acute providers weekly, so there is lots of scope for seeing how changes in practice can impact on CDI numbers in a relatively short space of time. The data doesn't have to be perfect (the weekly CDI returns are considered 'unvalidated') as we are not using the information to hold providers to account.

[Screenshot: CDI cases by week, August 2008 to January 2010, plotted as an XmR chart with the annotation "Risk assessment tool introduced"; below it, a moving range chart with its UCL, and a RAG status of AMBER for the selected options.]

The SPC-based tool developed is an example of an XmR chart, showing a standard individuals (X) SPC chart with the usual upper and lower control limits and centre line. The lower chart plots the moving range, which also has an upper limit and can be used to assess the magnitude of changes between points and whether these are unusual. In the context of CDI, the rules that are normally applied to SPC charts have been made more sensitive to reflect clinical priorities, highlighting the message that SPC is a tool to be used according to need.
SPC need not be inflexible and can be adapted to requirements. Other functionality of the tool includes:
• A RAG status to indicate the status of the chosen trust for the chosen week
• The ability to change the basis for the centre line, upper and lower control limit calculation, meaning that as more data points are available, more can be included in the maths bit!
• The ability to 'freeze' centre lines, upper and lower control limits and recalculate them based on subsequent points
• Flexibility to change the chosen metric
• A shaded 'warning' area between the upper 2 and 3 sigma limits

There are two main aims in using this methodology to look at CDI over time. Firstly, given the necessary criteria, it can give 'early warning' of possible issues in providers, which can then be followed up. Secondly, it can help assess whether new or changed practice or interventions have had an effect on CDI numbers. In figure 2, for example, let's say a provider has introduced a new bedside risk assessment tool from w/e 22nd February 2009; can it be shown that this has decreased (improved) the number of cases? Or does it reduce the variation? This kind of SPC tool cannot give the answers, but it might point you in the direction of the right questions. As always, a blank version of the tool is available on the website (nww.sec.nhs.uk/qualityobservatory) and if you have any questions just e-mail quality.observatory@southeastcoast.nhs.uk
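The sums behind an XmR chart are simple enough to sketch. The snippet below uses the standard XmR constants (2.66 for the individuals limits, 3.267 for the moving range upper limit); the weekly CDI counts are invented for illustration, and the Excel tool's exact calculation basis (which points are included, frozen limits and so on) is configurable as described above.

```python
import statistics

def xmr_limits(values):
    """Centre line and control limits for an XmR chart.

    Uses the standard constants: individuals limits at the mean
    +/- 2.66 x average moving range, and the moving range chart's
    upper limit at 3.267 x average moving range.
    """
    # Moving range: absolute difference between consecutive points
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = statistics.mean(moving_ranges)
    centre = statistics.mean(values)
    return {
        "centre": centre,
        "ucl": centre + 2.66 * mr_bar,
        "lcl": max(0.0, centre - 2.66 * mr_bar),  # case counts can't go below 0
        "mr_ucl": 3.267 * mr_bar,
    }

# Invented weekly CDI counts, for illustration only
weekly_cdi = [6, 8, 5, 9, 7, 12, 6, 8, 7, 5]
limits = xmr_limits(weekly_cdi)
print(limits["centre"], limits["ucl"])
```

A point above the UCL, or a run of points on one side of the centre line (the tool's tightened rules), is what prompts a closer look, not an automatic judgement.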
Improving GP Data Quality – the Surrey Data Quality Service
Tracy Weller, Primary Care Data Quality Facilitator, Surrey PCT

The Data Quality Service originated as a short-term project in October 2000, established by the forward-thinking East Surrey Health Authority as part of its Local Implementation Strategy to support "Information for Health" (DoH 1998). It soon developed into a permanent service, which is now provided by a team of five highly experienced Data Quality Facilitators (DQFs) based in the Directorate of Knowledge Management of NHS Surrey.

Why is there a Data Quality Service? The Primary Care Data Quality Service aims to:
• Improve the quality of data held within General Practice/Primary Care
• Help practices make better use of their clinical systems by raising awareness of information management techniques

What is the Data Quality Service? An unbiased, confidential, flexible, responsive, professional specialist service.

Who do the team work with and where? The Primary Care Data Quality Facilitators work with groups of general practices across Surrey. There is close liaison with, and support to, staff within NHS Surrey. For example:
• Primary Care Managers
• Public health specialists
• IM&T Directed Enhanced Service Leads
• Commissioning Managers
• Information Governance Lead
The Facilitators have a professional relationship with the national PRIMIS+ (Primary Care Information Services) team, which provides:
• Training and development opportunities for Facilitators
• Tools for use locally, e.g. CHART
The team also engage with other organisations, e.g. the SHA, on topics such as the Summary Care Record (SCR).
“All healthcare is information driven, so the threat associated with poor information is a direct risk to the quality of healthcare service and governance in the NHS”
Examples of the core services and areas of expertise offered by the Primary Care Data Quality Facilitators:
• Data quality audit, carried out on a regular cycle, which includes: searches; feedback of results; facilitation of practice action plans; on-going advice and support
• Read code training – carried out on or off site for groups of any size
• Data quality training – carried out on or off site for groups of any size
• MIQUEST (a search tool for clinical computer systems) interpreter training and support
• CHART – practice search tool
• Support for clinical computer system Local User Groups
• Information and updates where data quality may impact on potential patient safety issues
• GMS contract – QOF, QMAS
• New Read codes
• Consultancy/advisory service
• Paperless practices
• IM&T DES
Examples of the specialist skills and support offered by individuals or groups within the Surrey DQF team:
Responsive to local and/or national needs and initiatives, e.g.: the paperless accreditation process; the IM&T Directed Enhanced Service/LES; Practice Based Commissioning; the Summary Care Record
Awareness of potential risk areas to patient safety
Data quality specialists in primary care for Surrey
Bespoke MIQUEST query writing service
Support for independent projects where appropriate
We are more than happy to share our leaflets and documents and have already posted a number on the new Data Quality espace community which we would encourage you to join: http://www.espace.connectingforhealth.nhs.uk/community/data-quality-guild You may also be interested in having a look at the PRIMIS+ website where you can discover lots about primary care data quality and also find your local facilitator: http://www.primis.nhs.uk/ Look forward to meeting you in espace! Is there something that you wish you knew more about? To suggest future topics for knowledge matters contact the team
Knowledge Matters
Page 10
Skills Builder – Finalising your Dashboard Charlene Atcherley-Steers and Nia Naibheman, Performance Analysts Welcome to the final edition of ‘How to make a dashboard’. To re-cap, the first edition covered the initial design stage, i.e. graph types, colours and fonts; the second edition covered the data set behind the dashboard, where we looked at v-lookups, raw data and chart data. This final edition will show you how to combine the two. 1) Layout of your chart sheet At this stage you will have identified how many graphs you need on your chart sheet and the type of graph(s) you will be creating; if you are unsure, just take a look at your chart data section. To create the chart page you can either insert a blank chart sheet or use a previously created one. If you create a new sheet you will need to create each graph from the chart data and select the location as your new chart sheet. 2) Combo boxes For dashboards which require a drop-down to enable the user to select an option, e.g. Commissioner or Provider, you will need to: 2.1) First create the drop-down box (combo box). This can be found in the forms toolbar (View, Toolbars, Forms). 2.2) Position your combo box appropriately, i.e. on top of the actual graph that will change according to what is selected, or across the whole sheet if all graphs are to change. 2.3) To link the combo box to the graph and data, right click on your combo box and select ‘Format Control’. The ‘input range’ is the list of data you want the combo box to show (e.g. provider names) and the ‘cell link’ is where the combo box’s selection is output (i.e. the position of the selection in the list). This is linked to the look-up data mentioned in article two. If you have a long list in your drop-down box you can change the number of options visible by changing the ‘drop down lines’.
2.4) Test your dashboard Now you have created your dashboard, you need to check that your drop-downs are working and are picking up the correct data. This should be clearly visible on your chart sheet. If you are experiencing difficulties with your dashboard you can always contact a member of the Quality Observatory. 3) Finishing touches
Once all the graphs and, if required, drop-down boxes are completed, you will need to ensure that the colours of the data plotted, the graph size and the axes are consistent. Finally, add the corporate image to the header and your e-mail address and file link in the footer. If you do not want your drop-down box to be visible when the dashboard is printed, simply click on the ‘Properties’ tab in Format Control and unselect the ‘Print object’ option.
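The ‘cell link’ mechanism can be easier to grasp outside Excel: the combo box simply writes the 1-based position of the user’s choice into the linked cell, and an INDEX-style lookup turns that position into the chart data. A minimal Python sketch of that logic (provider names and figures are invented for illustration):

```python
# Illustrative stand-ins for the combo box 'input range' and the chart data.
providers = ["Provider A", "Provider B", "Provider C"]   # input range
chart_data = {
    "Provider A": [120, 135, 128],
    "Provider B": [98, 102, 110],
    "Provider C": [143, 150, 139],
}

def select_chart_data(cell_link: int) -> list:
    """Mimic =INDEX(input_range, cell_link): the combo box writes the
    1-based position of the user's choice into the linked cell."""
    name = providers[cell_link - 1]
    return chart_data[name]

print(select_chart_data(2))  # the second provider's monthly series
```

In the workbook the same chain is built from formulas rather than code, but the moving part is identical: one integer in the linked cell drives everything the graphs display.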
If you’ve followed all of our advice, your dashboard should look something like this. If you experience problems you can book onto one of our Quality Observatory drop-in sessions (forthcoming dates appear on page 14). Happy Dashboarding! Do you have something you would like to contribute to Knowledge Matters? Please contact us!
Page 11
A3: ASK AN ANALYST – If you have a question for the team please e-mail: quality.observatory@southeastcoast.nhs.uk Q: I've been looking at the SHA version of the SPC tool and feel confused about what data type to choose. Can you explain what the different types are? A: If you have a set of data which are all measured on a continuous scale, such as height, weight or blood pressure, then you have continuous data. The key attribute of continuous data is that each observation value has a mathematical relationship with all the others. An example might be blood pressure; a reading of 145/80 is mathematically related to a reading of 170/110 because both are measured in the same units (mmHg). You can also ‘do’ arithmetic on continuous data, such as addition, subtraction, squaring or calculating a product. If you have data that is expressed in percentages then you have proportional data. This means that you are describing a subset of a defined number of ‘opportunities’, generally of the same measure. For example, the A&E standard is a subset (the number of patients leaving A&E within 4 hours) of the total number of A&E attendances, which can also be seen as the number of opportunities for meeting the 4 hour standard. This is different from data expressed as a rate (such as CDiff cases per 100,000 population, or medication errors per 1,000 occupied bed days), where what you are measuring is expressed in relation to another measure which is generally constant. Count data is a little more difficult to explain as it can often seem similar to continuous data, but generally, if you have sets of observations that are not measured on a scale or as a subset of a whole (like a percentage) then you have count data. So, for example, a count of people attending a clinic on each day, or a count of MRSA positive swabs per month; these data aren’t given a context.
The final option on the SPC tool is rare event data which is a bit more specialised; if you have very small numbers of a particular incident (e.g. MRSA bacteraemias, complaints etc.) then the statistical validity of the control chart approach is called into question, so a slightly different methodology has to be used. We usually look at ‘days between’ rare events and then plot these against observation numbers rather than time. This is specifically related to the data types that have been included in the SPC tool. If you want to know more about data types in general and what you can and can’t do with them, the University of Nottingham have an excellent series of online resources (you’ll need headphones) which explain the differences really well. Check out ‘Levels of measurement’ at http://www.nottingham.ac.uk/nursing/sonet/rlos/rlolist.php. Please do contact one of the team if you need further guidance.
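The ‘days between’ transformation for rare events is simple enough to sketch: convert the event dates into gaps, then plot the gaps against observation number rather than calendar time. The dates below are invented for illustration:

```python
from datetime import date

def days_between(event_dates):
    """Gaps between consecutive rare events, in days."""
    ordered = sorted(event_dates)
    return [(b - a).days for a, b in zip(ordered, ordered[1:])]

# e.g. dates of MRSA bacteraemias (made-up examples)
bacteraemias = [date(2009, 1, 5), date(2009, 2, 20),
                date(2009, 2, 27), date(2009, 5, 1)]
print(days_between(bacteraemias))  # [46, 7, 63]
```

A shrinking gap suggests events are becoming more frequent; a growing gap is the improvement signal you are hoping to see.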
Quick Quiz
Answer to April’s Quick Quiz: What does VODIM stand for?
Valid – a valid, correct code
Other – a valid code, generally used as a bucket code for those things that don’t fit any of the categories
Default – a code that is automatically used when nothing else is specified
Invalid – a code that isn’t allowable for the field
Missing – the absence of any data from the field
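VODIM lends itself to a simple programmatic audit. A minimal sketch, assuming hypothetical code lists – in practice the valid, other and default codes come from the data dictionary for the field being audited:

```python
# Hypothetical code lists for one field; real lists come from the data dictionary.
VALID_CODES = {"1", "2", "3", "8", "9"}
OTHER_CODES = {"8"}      # 'bucket' code for things fitting no category
DEFAULT_CODES = {"9"}    # auto-filled when nothing else is specified

def vodim(value):
    """Classify a field value under the VODIM scheme."""
    if value is None or value == "":
        return "Missing"
    if value not in VALID_CODES:
        return "Invalid"
    if value in DEFAULT_CODES:
        return "Default"
    if value in OTHER_CODES:
        return "Other"
    return "Valid"

print([vodim(v) for v in ["1", "8", "9", "X", ""]])
# ['Valid', 'Other', 'Default', 'Invalid', 'Missing']
```

Counting how records fall into each of the five buckets gives a quick profile of a field’s data quality.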
Page 12
The Importance of NHS Number Guy Roberts, South East Coast NHS Number Lead Hi everyone. I’m one of the Programme Managers in the Health Informatics Transformation Team and I am also the NHS Number Lead for NHS South East Coast. The NHS Number is the only national unique patient identifier in operation in the NHS. Using the NHS Number makes it possible to share patient information safely, efficiently and accurately across NHS organisations. The complete adoption of the NHS Number was included as a key priority in the NHS Operating Framework 2008/9 as a requirement to "underpin the delivery of world class patient care" by mandating the use of the NHS Number in all relevant administrative and clinical systems. It was reinforced in the NHS Operating Framework 2009/10, which states: "Making consistent and effective use of the NHS Numbers and the Personal Demographics Service will reduce the number of mis-associated records and will support the appropriate sharing of patient information with partners in the delivery of patient care". How many of you are aware that in September 2008 the National Patient Safety Agency (NPSA) issued the following Safer Practice Notice (18 September 2008, Ref: NPSA/2008/SPN001)? By 18 September 2009, all NHS organisations in England and Wales that provide primary, secondary and all other types of care, such as community pharmacy, should take the following action: 1. Use the NHS Number as the national patient identifier, OR use the NHS Number as the national patient identifier in conjunction with a local hospital numbering system (NB where local hospital numbers are used they must be used alongside, and not instead of, the NHS Number); 2. Use the NHS Number (and its barcoded equivalent) in/on all correspondence, notes, patient wristbands and patient care systems to support accuracy in identifying patients and linking records; 3. 
Put processes in place to ensure that patients can know their own NHS Number and are encouraged to make a note of it (for example through patient literature that explains the NHS Number, its uses and advantages, and how patients can use it to increase safety); 4. Primary care organisations that have stopped issuing medical record cards should reinstate this practice and use it as a means of informing patients about their NHS Number and encouraging them to use it where appropriate. In December 2008, Professor Sir Bruce Keogh wrote to all NHS Chief Executives and Medical Directors asking for their assistance with implementing the NHS Number Programme. The use of NHS Number is critical in facilitating the safe and effective transfer of information across the system and fundamentally underpins the provision of safe, effective, high quality care. Between June 2006 and the end of August 2008, the NPSA received over 1,300 reports of incidents resulting from confusion and errors about patients’ identifying numbers. Many of these involved duplication in local numbering systems, for example, two patients having the same number, or one patient having more than one number. While no deaths or cases of serious harm to patients have been reported so far, healthcare staff have commented that this is causing significant risk to patient safety. Using the NHS Number as the national identifier for patients will significantly improve safety by ensuring patients are identified correctly. I will be coming along to the next Data Quality Workshop on 28th July 10.30 – 2pm to talk about the NHS Number Programme (for further details of this workshop please contact samantha.riley@southeastcoast.nhs.uk). I hope to meet lots of you there!
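One practical strength of the NHS Number as an identifier is its built-in check digit: the tenth digit is derived from the first nine using the standard Modulus 11 algorithm, so most keying and transcription errors are detectable. A sketch of the validation (the formatted number below is a made-up example for illustration):

```python
def nhs_number_valid(number: str) -> bool:
    """Modulus 11 check: weight the first nine digits 10 down to 2,
    take 11 minus (sum mod 11); a result of 11 counts as 0, and 10
    is never a valid check digit."""
    digits = [c for c in number if c.isdigit()]
    if len(digits) != 10:
        return False
    total = sum(int(d) * w for d, w in zip(digits[:9], range(10, 1, -1)))
    check = 11 - (total % 11)
    if check == 11:
        check = 0
    if check == 10:
        return False
    return check == int(digits[9])

print(nhs_number_valid("401 023 2137"))  # True
```

A local hospital number typically has no such self-checking property, which is one reason duplication and mis-association errors are harder to catch in local numbering systems.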
Page 13
Indicators for Quality Improvement Samantha Riley, Head of the Quality Observatory High Quality Care for All defined quality in the NHS as care that is safe and effective, and in which the patient's whole experience is positive. Lord Darzi set out ambitious commitments for making quality the organising principle of the NHS. His vision is that all NHS staff will measure what they do as a basis for improving quality. In partnership with professionals right across the NHS, the Department of Health and The NHS Information Centre have identified an initial, but evolving, set of indicators to describe the quality of a broad range of services – the Indicators for Quality Improvement (IQI). The initial menu is made up of existing indicators that are used by NHS organisations across the country. However, the long-term vision is to build an extensive menu of indicators that will help every tier of the NHS understand and improve the quality of services it provides to patients. Many of you will have read about some of the work that the Quality Observatory has undertaken in South East Coast to develop new clinical indicators (stroke and dementia are two areas that we have focused on). No doubt other teams across the country have also been busy developing new indicators. The idea is that, over time, locally developed metrics will find their way into the IQI. Clearly there will need to be a process of testing new indicators in terms of robustness of methodology and data sources before a new indicator makes it into the IQI, and the process associated with the approval of new indicators is currently being worked through. In terms of what is available now, the initial release (which numbers over 200 indicators) has been published. For each indicator there is a set of 21 meta data items available. These explain the detailed methodology and data sources used, and provide a link to where the calculated indicators can be obtained.
Where the meta data is currently not available it will be added in later releases. It is possible to search the menu of indicators to find indicators based on key words (e.g. diabetes, mental health, safety) or you can browse through a tree structure based on the 3 quality domains (effectiveness, safety and experience) and the Next Stage Review Pathways. The search facility is basic for this first release and simply searches all the meta data fields (prioritising the title field) for the search text. It is possible to make suggestions via the website and find out about what is happening regarding the development of indicators for primary care, mental health, community services and Allied Health Professionals. There are also links to useful documents. I’d strongly encourage you to have a good look at the website as it is a really rich resource of information on quality improvement. The website address for the Indicators for Quality Improvement is https://mqi.ic.nhs.uk/ I do think that it is important to stress that the IQI is a menu of indicators from which clinicians can select. A whole variety of quality indicators are already reviewed by clinical teams and Boards on a regular basis. The intention is not that everyone has to monitor all indicators within the IQI, and as the menu grows clearly this would be impossible. I thought that you might be interested to know about a couple of pieces of work that we have underway which will provide comparative information for quite a number of the IQI. Firstly, we are developing a web-based QOF tool which uses national data, not just South East Coast data. This tool is being designed to use the QOF data tables to provide benchmarking on QOF disease areas at Practice and PCT level. The tool is being developed to allow the display of indicators and ranking of recorded achievement against PCT/SHA/Country and ONS categories.
In addition, we are developing an inpatient survey tool (again using national data) which will allow all England NHS acute Trusts to benchmark their scored results on the 2006, 2007 and 2008 inpatient surveys against national, regional and similar organisational benchmarks. Performance on key themes such as privacy and dignity and infection control can be examined separately against the previous years to enable trusts to assess their progress in the areas that matter most to their patients. The tool will also provide some inbuilt flexibility to allow trusts to choose a shortlist of the questions that matter most to them and display them in a scorecard, with the opportunity to populate in-year performance and locally agreed target fields to help monitor progress in the short term. For further information on either of these developments, please e-mail quality.observatory@southeastcoast.nhs.uk
Page 14
News Unify2 Enhancement As you may be aware, there is currently a Unify2 Enhancement Project underway to improve the functionality and performance of the Unify2 website. Phase 1 of the project is due to be completed by the end of the year. The Unify2 team from the Department of Health will be visiting SHAs to demonstrate the new website later in the year – we have 2 training sessions booked for 3rd November. If you would like to attend one of these please email rebecca.owen@southeastcoast.nhs.uk with your name and contact details, stating if you would prefer the morning or afternoon session. As spaces are limited there is currently only space for one person per organisation, but we will keep additional names on a reserve list and let you know if space is available. New Cancer Waiting Times dashboard As the Cancer Waiting Times database has now been upgraded, we are developing a new dashboard to show performance against the new cancer waiting times targets. This is still draft at the moment but will shortly be available to download from the Quality Observatory website. More details will follow in the next newsletter, but please contact Rebecca Owen if you need more details in the meantime. E-Learning for Improvement The NHS Institute has recently completed the development of three short e-learning objects in collaboration with the University of Nottingham. These are among a number of e-learning modules available on clinical, research, statistical and information themes. You can access the modules via the NLH repository, or visit http://www.nottingham.ac.uk/nursing/sonet/rlos/rlolist.php where you will find a table of all their e-learning objects. Opportunity to feedback to the NHS Institute Our area is taking part in a national programme with the NHS Institute for Innovation and Improvement. They wish to understand how managers and frontline staff use data, tools and strategies to make improvements. We would like your feedback using a short anonymous online survey.
It will take just 5-10 minutes to complete, and will help the Institute plan work and demonstrate success over time: www.evidencecentre.com/survey They’re interested in feedback from executives, managers, analysts, frontline staff and everyone else who might be interested in improving care. The deadline for feedback is 7 July 2009.
New Data in the SUS Data Quality Dashboards The Information Centre’s Data Quality dashboards have been updated to include 2009/10 data submitted to SUS by the Month 1 inclusion date (21/05/2009):
• Coverage - the coverage of trusts' A&E, Outpatient (OP), Adult, Paediatric and Neonatal Critical Care and Maternity data in SUS compared to aggregate returns submitted to the Department of Health
• Coding - the take-up of OPCS 4.3, 4.4 and 4.5 codes, and Paediatric TFCs in the APC CDS
• Submission Patterns - the timeliness, completeness and method of submissions to SUS, including submission of monthly comprehensively coded APC data (revised definition for 2009/10), sending via Net protocol and the population of key 18 week wait fields
• Duplicates - the number of identified duplicates in Trusts' APC, OP and A&E data in SUS
For registered users of the Dashboards, the updated file is available at: http://nww.connectingforhealth.nhs.uk/reportingservices/data-quality/kpi.swf/ Knowledge Management Event Two more events are scheduled for 22nd September and 9th December in Tunbridge Wells. Each one-day event will provide an overview of Knowledge Management and the people, tools and technologies that are available to make it work. The day will be highly interactive. The course is free and open to all NHS staff working within Kent, Surrey and Sussex. Further info is available from louise.goswami@nhs.net
Do you need help ???
Having trouble with a dashboard? Got an analytical challenge that you can’t solve? If so, why not book a slot at one of the Quality Observatory’s drop-in sessions? Sessions run all day and are being held on the following dates at York House, Horley: 15th July, 19th August, 16th September. Simply e-mail your request to the team at quality.observatory@southeastcoast.nhs.uk or call Nia on 01293 778886
Page 15
Data Quality Programme update Serious Untoward Incident Dashboard David Harries, Health Analyst The first version of the Serious Untoward Incident (SUI) dashboard will shortly be available to registered users of the website. All data used in the dashboard has been sourced from the SUI system accessed via the Strategic Executive Information System (STEIS) or from the National Reporting and Learning Service (NRLS) Organisation Patient Safety Incident Reports published by the National Patient Safety Agency (NPSA). The SUI module within STEIS enables electronic logging, tracking and reporting of Serious Untoward Incidents. Once an incident is raised by a Trust and entered into the SUI module, an e-mail alert is sent to nominated officers at the SHA/Host PCT. The SHA/Host PCT is responsible for closing an incident, although the information will still be available to view as a ‘read-only’ file. Recent improvements implemented on 9th June to the STEIS SUI system have simplified the process for extracting data to track and monitor SUIs. The dashboard has initially been set up to monitor and compare SUIs reported by Acute Trusts; however, there are plans to develop the dashboard to include Primary Care Trusts, Mental Health Trusts and other Service Providers.
[Example: SUI Acute Trust dashboard for Ashford & St Peter's Hospitals, showing incidents reported in the latest available 6-month period (Dec-08 to May-09).]
[Example dashboard panels: SUIs reported compared with SUIs closed per month; ongoing SUIs by timeframe (under 60 days, 60-90 days, 90+ days); all SUI incident types by quarter; quarterly reporting rate per 10,000 admissions (Trust vs average of all Trusts); time from incident to SUI reported and time from incident to incident closed (SUIs ordered sequentially, with all-Trusts medians); NRLS1: consistent reporting of patient safety events reported to the RLS (required level 6/6), covering incidents reported to the NRLS between Apr-08 and Sep-08.]
The dashboard provides an overview of the numbers of SUIs opened and closed by month, and includes charts showing quarterly numbers of SUIs reported and rates per 10,000 admissions (both of which can be selected by SUI incident group). Other SUI analysis includes the timeliness of reporting both from date of incident to SUI reported and date of incident to closure.
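The quarterly reporting rate shown in the dashboard is a straightforward normalisation; something like the following (figures invented for illustration):

```python
def rate_per_10000(incidents: int, admissions: int) -> float:
    """SUIs per 10,000 admissions for a quarter."""
    return incidents / admissions * 10_000

# e.g. 12 SUIs against 18,500 admissions in the quarter (made-up figures)
print(round(rate_per_10000(12, 18_500), 1))  # 6.5
```

Normalising by admissions is what makes quarterly comparisons between Trusts of different sizes meaningful.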
[Panel: NRLS1 consistent reporting (6/6) and degree of harm (percentage breakdown).]
As well as providing a general overview of SUI reporting, the dashboard also includes patient safety incident data sourced from the NRLS Organisation Patient Safety Incident Reports. The NRLS reports include three of the Patient Safety indicators listed in the Indicators for Quality Improvement (see page 13 for information on how to find out more about Indicators for Quality Improvement).
[Panel data: NRLS2 timely reporting of patient safety events; NRLS3 rate per 100 admissions (Trust 4.99 vs similar trusts 4.78); degree of harm breakdown (no harm, low harm, moderate harm, severe harm, death).]
The latest data covers the six-month period from 1 April 2008 to 30 September 2008, for incidents reported to the Reporting and Learning System (RLS) by 28 November 2008. For further information about the NRLS: http://www.npsa.nhs.uk/nrls/patient-safety-incident-data/organisation-reports/ I will shortly be leaving the team (see back page for further information) and have therefore handed this work over to Rebecca. We really do want this dashboard to be a useful tool for Trusts and PCTs to improve the safety of care received by patients. We intend to have a draft dashboard available for comment by mid-July and will circulate this to Trusts and PCTs. As ever we are keen to hear your views and suggestions for enhancement, so please do provide Rebecca with feedback (rebecca.owen@southeastcoast.nhs.uk)
Page 16
Goodbye to…
David Harries, who will be leaving the SHA on 3rd July to take up a 12 month secondment at East Sussex County Council working on the National Indicator Set. As well as offering an opportunity to work on indicators covering the wider determinants of health and improve partnership working, the secondment (based at Lewes) will also provide David with a welcome break from his daily commute to Horley. Rebecca, Nia and Charlene will be offering analytical support to Public Health until a replacement is found. Interviews for David’s replacement will take place on 7th July. We wish you all the best David!

Congratulations to Suzanne Gregg, who was recently appointed as Assistant to the Quality Observatory. Suzanne has been providing support to the team for some time, but we are really pleased for her to formally join us. Much of Suzanne’s time has been (and no doubt will continue to be) spent on organising Samantha’s diary and making sure she turns up at the right time and place.

Data Quality
The quality of data is not strain'd,
It is an attribute of quality clinical outcomes;
It doth impact upon the nature of health
And that of social care: it is of high import;
It blesseth them that analyse and them that implement:
'Tis highest of the highest: it becomes
The Chief Executive better than the annual healthcheck;
Financial balance and target hitting shine
When called to account by the DoH,
Wherein doth sit the dread and fear of CEOs;
But data quality is above this number counting:
It is enthroned in the hearts of analysts,
And dashboards doth then show, like SUS,
When data quality seasons outcomes.
Therefore, though hitting the target be thy plea, consider this,
That, in the course of achieving targets, none of us
Should expect Amber to turn Green: we do pray for data quality;
And that same prayer doth teach us all to render
The quality of data. I have spoke thus much
To impress upon thee the gravity of data quality;
Which if thou follow, this strict court of NHS
Must needs give consideration to make this a priority.
Welcome to… Hi. I’m Professor David Parkin and I have just joined South East Coast as Chief Economist, based in the Commissioning and System Development directorate. I’m joining from City University London, where I was Professor of Economics. Before that I spent most of my career in the academic Public Health department at Newcastle University, but with strong links to NHS organisations in the North East. Throughout my career in research and teaching I have been an applied economist, with an emphasis on the practical use of economics in health care. My role at South East Coast is to develop, promote and support economics thinking and analysis throughout the region. I will no doubt be working very closely with the Quality Observatory. There are some obvious areas that are interesting to economists, such as markets, performance management, patient choice and priority setting – but economists tend to believe that economics can be applied to anything, so I may meet you in unexpected places …

Knowledge Matters is the newsletter of NHS South East Coast’s Quality Observatory. To discuss any items raised in this publication, for further information or to be added to our distribution list, please contact: Knowledge Matters, C/O The Quality Observatory, NHS South East Coast, York House, 18-20 Massetts Road, Horley, Surrey, RH6 7DE. Phone: 01293 778899. E-mail: quality.observatory@southeastcoast.nhs.uk To contact a team member: firstname.surname@southeastcoast.nhs.uk