Strategy Pilot Report
Cambridge Regional College
May 2010

1 The Issue
1.1 Context: In 2008/09 the College's out-turn against two key performance measures (forecasting income, and student success on courses which supplement main learning programmes) was poorer than predicted. Exploratory work to identify the cause of this underperformance concluded that the College was not working effectively across internal institutional "boundaries"; that insufficient attention was given to local intelligence when making decisions about budgets and targets; that negotiation and consultation with managers were perceived to be underdeveloped; and that deadlines were routinely missed because managers were "too busy". As a consequence, in autumn 2009/10, the Senior Management Team (SMT) commissioned an internal improvement project, "Improving Practices", to seek further detail from the College-wide Management Group ("CMG"), and from selected others with distinct cross-College roles, about the specific areas they would like to see improved. Improving Practices ran in parallel with a culture change programme aiming to empower managers more fully, and was intended to provide managers with the tools, processes and support to deliver their responsibilities effectively.
1.2 Strategic Planning Issues to Address:
i) SMT committed to:
a) Reviewing communication mechanisms, particularly around consultation and negotiation;
b) Actively reviewing decisions and priorities and identifying when not to take projects, initiatives or ideas forward;
c) Clarifying priorities, allowing for low-priority work to be deferred;
d) Providing people with an environment in which they can succeed in their roles.
ii) The methods through which these changes were to be effected were discussed with SMT in early January. Primarily, SMT agreed to:
a) Take a more structured and deliberate approach to assessing their decisions where these impact upon CMG for implementation;
b) Act more selectively in determining which activities fit the mission and should therefore be taken forward, accepting the consequences of those decisions;
c) Clarify jointly the absolute priorities for the short term and the longer term, and commit to communicating them clearly;
d) Where decisions have a material effect on CMG, involve them in decision making through CMG meetings, rather than leaving them with the problem of how to deliver;
e) Use the CMG meeting more actively to plan for the future of the College.
Originally it was envisaged that the "Managing Strategic Activity" section of the Strategy Infokit could support the College in addressing the issues above. The initial areas for review were:
- "Change variables", which may help SMT to determine whether or not change is necessary at all;
- "CLARISCOPE", which may help to support decision making when priorities change;
- "Prioritisation matrix", which may help to do both.
Initially it was considered likely that other parts of the toolkit would also be relevant, and that managers might prefer to try different approaches. The first phase of the project was therefore to test which tools might work in which circumstances.

1.3 Metrics: It was envisaged that the success (or otherwise) of this activity would be measured, just after Easter, by:
- Re-testing managers' perceptions of progress made, as per the original Improving Practices brief, to measure distance travelled. Accepting that this is a relatively short period in which to effect significant change, a satisfaction measure of "X% of managers believing that the change process has begun, and is likely to have an impact in the medium term" is planned, with an anticipated out-turn of 80%;
- A target of a further 10 managers (out of 28, 12 of whom already agree) agreeing that prioritisation has improved and that they are clear about what is expected of them in the short term (i.e. 22 of 28, roughly 79%);
- A target of a further 10 managers (out of 28, 10 of whom already agree) agreeing that they are actively involved in the key decision-making processes in College (i.e. 20 of 28, roughly 71%).
2 The Approach
2.1 Scoping the Project: At the initial review SMT opted to target the project on the 14-19 part of the College's business and to focus on delegating some decision making to curriculum managers, giving them the tools with which to make informed decisions. Aspects 1.2 i) d), ii) d) and iii) a) and d) above were intended to be addressed by this decision, with the remaining issues tackled via either the culture change programme or the Improving Practices project. As a consequence of this decision the actions which followed differed slightly from those originally planned; compounding this change were the loss of a key senior manager with curriculum leadership responsibilities and the restructuring which followed. The approach was therefore redefined into four phases:
a) Phase 1: re-scoping
o Reviewing the decisions which should be delegated from the Director of Curriculum to the curriculum managers;
o Reviewing, with the Director of Curriculum, the tools available for supporting local decision making;
o Selecting appropriate tool(s) to support delegated decision making.
b) Phase 2: testing and consultation
o Testing the tool(s) with one curriculum manager;
o Getting feedback from that curriculum manager;
o Adapting the tool, then consulting more widely with all curriculum managers.
c) Phase 3: roll-out
o Rolling out the tool to all curriculum managers to support decision making.
d) Phase 4: evaluation
o The impact of the tool to be evaluated one year after introduction.
2.2 Phase 1: The outcomes from phase 1 of the process were that:
- The Boston matrix should be the primary decision-making tool, but would need to be adapted so that it could show the performance of courses by two aspects: financial viability (measured by income per guided learning hour) and educational viability (measured by success rates, i.e. the number achieving their qualification as a proportion of the number starting that qualification). Annex 1 illustrates an example adapted matrix.
- For educational viability, the judgement about performance would be made by comparing success rates to national averages.
- For financial viability, the judgement about performance would be made, under guidance from the Director of Finance, against hourly income targets.
- The end product would therefore provide decision makers with a snapshot of their area's performance against the two critical KPIs in 2008/09 and would clearly demonstrate the inter-dependencies and inter-relationships between course income and course quality.
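By way of illustration only (the report specifies the two measures but not an implementation), the viability inputs might be computed as follows. The record fields, figures and benchmark parameters are assumptions for the sketch, not the College's actual data model:

from dataclasses import dataclass

@dataclass
class Course:
    # Illustrative record; field names are assumed for this sketch
    name: str
    starters: int      # number starting the qualification
    achievers: int     # number achieving the qualification
    income: float      # total income for the course (GBP)
    glh: float         # guided learning hours delivered

def success_rate(course: Course) -> float:
    """Educational viability input: achievers as a proportion of starters."""
    return course.achievers / course.starters

def income_per_glh(course: Course) -> float:
    """Financial viability input: income per guided learning hour."""
    return course.income / course.glh

# Judgements are then made against benchmarks: success rates against the
# national average, and income per GLH against an hourly income target set
# under guidance from the Director of Finance.
course = Course("BTEC National Diploma in Engineering",
                starters=40, achievers=34, income=90_000.0, glh=720.0)
print(success_rate(course))    # 0.85
print(income_per_glh(course))  # 125.0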
2.3 Phase 2: The outcomes from phase 2 of the process were that:
- The viability input for the statistical analysis needed to include infill day-release apprenticeships, funded from a different source than the main "FE" qualification, so that all income and GLH were included in the calculation (apprenticeship payment rates differ from those for FE main programmes).
- Financial viability was to be calculated both at individual qualification level (e.g. BTEC National Diploma in Engineering) and at programme level (e.g. a full-time BTEC National Diploma engineering student would also study AutoCAD, literacy and numeracy, and personal development qualifications such as CV writing). This would allow the whole learner experience to be costed and would inform decisions about the financial and educational value of the whole programme as well as the financial viability of its individual component parts.
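The programme-level calculation described in the second point might be sketched as below; again this is illustrative only, with invented component figures rather than the College's actual method or data:

def programme_income_per_glh(components: list[tuple[float, float]]) -> float:
    """Pool (income, GLH) pairs across every component qualification so the
    whole learner experience is costed, not just the main qualification."""
    total_income = sum(income for income, _ in components)
    total_glh = sum(glh for _, glh in components)
    return total_income / total_glh

# e.g. a full-time BTEC National Diploma programme: the main diploma plus
# AutoCAD, literacy/numeracy and personal development components
programme = [
    (90_000.0, 720.0),  # main diploma
    (6_000.0, 60.0),    # AutoCAD
    (4_500.0, 90.0),    # literacy and numeracy
    (1_500.0, 30.0),    # personal development (e.g. CV writing)
]
print(f"{programme_income_per_glh(programme):.2f}")  # 113.33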
2.4 Phases 3 and 4: The outcomes from phases 3 and 4 have been materially affected by major change in the FE qualifications and funding framework. At the time of writing, the national programme of Vocational Qualifications Reform (VQR) has yet to conclude, and not every programme of study:
- Has been confirmed as eligible for funding;
- Has published funding rates, although most are anticipated to be funded at a lower rate;
- Has expected guided learning hours attached to it.
This position clearly affects the usefulness of the tool as it stands, but this reflects the currency of the information available to feed it rather than the tool itself. Despite this, 100% of curriculum managers agreed that the tool would be useful to them and would significantly assist them in targeting attention on "high risk" programmes when planning the curriculum in future and when monitoring curriculum performance in year. They have therefore asked for the tool to be produced:
- For the 08/09 results, and for the 09/10 results once available;
- For the 10/11 curriculum plan, with updates on funding rates and guided learning hours dropped into the tool as the information is published.
3 Success Factors
3.1 Intended Metrics: Paragraph 1.3 above outlines the measures by which the project was originally to be evaluated. However, because of restructuring, delays in the VQR programme and planning for reduced Train to Gain budgets, the "Improving Practices" follow-up interviews have been delayed; they will now commence on 10th June, conclude on 5th July and be reported to SMT on 28th July.
3.2 Re-scoped Metrics: The re-scoping of project activity has resulted in the need to re-scope the metrics for success. The long-term aim remains as originally defined for the combined project activity (culture change, Improving Practices and the strategy infokit): this supports the overall aim for the College to become outstanding by 2011/12, with an operating surplus of >£1.2 million in the same period. Clearly, course viability and success rate monitoring will be critical to enabling managers to effectively monitor (and therefore change and improve) their contribution to this overall target. In the short term, metrics have been devised as follows:
- Develop a tool which effectively enables the monitoring of course viability, on the grounds of both success and financial performance – achieved;
- Use the tool as a key performance measure in termly monitoring of curriculum performance in 10/11 – planned;
- 90% of curriculum managers to report that the tool is helpful when making decisions about the future of the curriculum – achieved in part this year, to be achieved in full next year;
- 90% of curriculum managers to report that the tool is a helpful aid to their performance management arrangements – to be reviewed at the end of 10/11.

4 Lessons Learned
4.1 Primary lessons learned: Testing the strategy tool as part of this project has resulted in four material learning points which may be of benefit to future users:
• Adapting the tool or technique to make it explicitly relevant to the issue in hand has been a significant factor in the success of the pilot project;
• Consulting with key users of the tool to make sure it works for them has generated interest in, and acceptance of, the tool;
• Being specific and comprehensive when defining inputs for the Boston matrix has maximised its usefulness;
• The tool is, however, only as useful as the context it operates in.
These points are discussed further below, in order of significance.

4.2 Making the tools relevant to the issue: When re-scoping the project, defining the issue to address was critical in selecting the appropriate tool from within the suite offered by the strategy infokit. None of the standard tools precisely met the College's needs, but it was clear that a version of the Boston matrix or the "urgent/important" matrix could be adapted to provide a clear visual aid to help managers inform decision making and monitoring. The tool was therefore adapted to analyse College success rates (the number of students achieving a qualification as a proportion of the number starting it) against national averages, and course viability (defined by income per guided learning hour) against College targets. Results were plotted onto a Boston matrix-type graph, the four quadrants of which were defined as:
1 Success rates high and viability good, categorised as qualifications we should "maintain and grow";
2 Success rates high but viability only satisfactory, categorised as "in need of review on the grounds of efficiency" before confirming the curriculum offer;
3 Success rates satisfactory but viability good, categorised as "in need of review on the grounds of quality" before confirming the curriculum offer;
4 Poor success and poor viability, categorised as "courses to be closed".
Annex 1 provides an example of the adapted matrix showing qualifications in each of the quadrants (for the purpose of this report these are for illustration only and not based on actual results); a sketch of the classification logic follows below.
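Purely as an illustration (the report defines the quadrants but no implementation, and the exact thresholds for "high" versus "satisfactory" are not stated), the classification might be expressed as:

def quadrant(success_rate: float, national_average: float,
             income_per_glh: float, hourly_target: float) -> str:
    """Assign a course to one of the four quadrants described above.
    At-or-above a benchmark is read here as 'high'/'good' and below it as
    'satisfactory' - an assumption, since the report does not define cut-offs."""
    success_ok = success_rate >= national_average
    viable = income_per_glh >= hourly_target
    if success_ok and viable:
        return "maintain and grow"                    # quadrant 1
    if success_ok:
        return "review on the grounds of efficiency"  # quadrant 2
    if viable:
        return "review on the grounds of quality"     # quadrant 3
    return "close"                                    # quadrant 4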
4.3 Consult as you develop and use the infokit tools: Consulting with key users as the tool was developed was critical to the success of the process, on three grounds:
- gaining "buy-in" to the tool and the process from the intended user group;
- using their intelligence to modify and improve the tool;
- using their intelligence to identify different applications for the tool.
For this project, consultation resulted in improvements to the tool, for example the inclusion of income generated from day-release infill students in the viability calculation. Thereafter, new applications for the tool were requested so that it could serve both its intended purpose (using past performance data to influence future curriculum planning decisions) and in-year monitoring, an unanticipated output from the project.
4.4 When using the infokit tools, be specific and comprehensive about the information which will feed the process, and allow sufficient time to ensure that the tool is fit for purpose: This is a version of the traditional "garbage in, garbage out" philosophy. It took several attempts, and several College departments, to provide and then analyse the information needed to feed the adapted Boston matrix. Without this cross-College effort, however, the adapted Boston matrix would have failed in its intention to provide accurate information to support delegated decision making.
4.5 The tools are only as useful as the context they operate in: This point is two-fold. Firstly, the tool was only one of a number of strategies employed to deal with the complex issues around organisational culture which generated the three strands of project work (culture change, process improvement and this planning/decision-making tool). Secondly, although this is not a reflection on the tool itself, external factors such as vocational qualification reform and material changes in funding rates have limited the impact of the tool this year. Nevertheless, the model will be used in future once the uncertainties affecting the FE sector are resolved.

5 Recommendations for JISC infoNet
5.1 JISC infoNet is invited to consider how it could generate further examples of how the tools can be applied to specific educational contexts.
6 Outputs

6.1 Annex 1 contains the primary output from this project: an adapted Boston matrix which maps qualifications by success rates and viability, and generates default recommendations for action as a result.
Andrea Chilton
Vice Principal, Quality, Staff and Learner Services
26.05.10
ANNEX 1: SAMPLE REPORT: BOSTON MATRIX FOR VIABILITY AND SUCCESS: PROGRAMME AREA

[Figure: adapted Boston matrix, illustrative data only. Vertical axis: financial viability (income per GLH, sample values £81-£115). Horizontal axis: educational viability (course success v national averages, -10% to +10%). Plotted qualifications include NVQ1, NVQ2, NVQ3, 1st Diploma, 1st National and additionality.]

Key: Green = priorities for growth; Amber = review for efficiency or quality; Red = stop or replace.
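The annex in the source is a chart. Purely as an illustrative sketch (the report contains no code, and the plotted values and income target below are invented, since the original pairings cannot be recovered), a matrix of this kind could be rendered with matplotlib:

import matplotlib.pyplot as plt

# Invented example points: (success margin v national average, income per GLH)
courses = {
    "NVQ1": (-0.01, 115.0),
    "NVQ2": (0.05, 82.0),
    "NVQ3": (-0.10, 81.0),
    "1st Diploma": (0.10, 81.0),
}
HOURLY_TARGET = 100.0  # assumed income-per-GLH target (GBP)

fig, ax = plt.subplots()
for name, (success_margin, income_glh) in courses.items():
    ax.scatter(success_margin, income_glh)
    ax.annotate(name, (success_margin, income_glh))

# Quadrant boundaries: national average (x = 0) and the income target
ax.axvline(0, color="grey", linestyle="--")
ax.axhline(HOURLY_TARGET, color="grey", linestyle="--")
ax.set_xlabel("Educational viability: course success v national averages")
ax.set_ylabel("Financial viability: income per GLH (GBP)")
ax.set_title("Adapted Boston matrix (illustrative)")
plt.show()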