Huddersfield Impact Calculator lessons learned report


Impact Calculator Pilot Project report

Project description

The University has a wider, long-term project to implement an electronic document and records management system (EDRMS), Wisdom. During the academic year 2009-10 the focus for the EDRMS implementation was planned to be on records relating to the quality of modules, including examination papers. This work has been carried out as planned, with additional focus on records relating to the quality of programmes (courses) as required by senior management.

This additional work on programmes was encompassed by the JISC-funded project, which implemented a change to the publication and dissemination of programme specification documents (PSDs) following the completion of course validation processes. The intention of this change was to:

• provide “a single point of truth” about the University’s programmes, in line with the University’s Information Systems Strategy Governance work strand;
• enable validated changes to programme specifications to be disseminated automatically;
• retain information about programmes consistently across the University.

The project, which ran from January to June 2010 in line with the requirements from JISC infoNet, had four main phases of work outside project initiation and closure:

1. developing the EDRMS fileplan and populating it with the programme specifications (a sketch of this kind of automated fileplan generation appears below);
2. developing a web service to publish documents from the EDRMS to the web and automating the publication of the individual specifications;
3. supporting staff to embed the change;
4. measuring benefits using the impact calculator tool.

Phases 1, 2 and 4 have been completed and further work will be undertaken on phase 3 (as this is a longer-term package of work). The main project tasks and expected/actual timescales are shown in the Gantt charts in Appendix 1. Financial information is given in Appendix 2.

During the course of the project it became apparent that only a third of the original budget (of £12,000) for the work on the EDRMS would actually be required. Permission was therefore sought from JISC infoNet to engage consultants to provide an independent view of both the process change undertaken by the project and of the impact calculator tool and its use in measuring the impact of information management changes.

A delay in recruiting and appointing a suitable temporary staff member (as originally planned) constrained the time available for researching and establishing metrics, data gathering and analysis: in an ideal world more quantitative data would have been gathered. Also during the project, the University underwent an institutional audit by the QAA. In order to avoid issues with staff capacity, both in the records management team (of 1.8 FTE) and among colleagues in the University’s Schools, slack time of 3 weeks was deliberately allowed around the long Easter weekend and University vacation. This effectively reduced the project duration from 6 months to 5 months.
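Phase 1’s fileplan generation was automated (see the “auto-build of edrms fileplan” task in Appendix 1). The following is a minimal sketch of what such a step might look like, assuming a hypothetical folder-per-programme layout; the path structure, programme records and folder-creation call are illustrative, since Wisdom’s actual interfaces are not documented in this report.

```python
# Illustrative sketch only: derives one EDRMS folder path per validated
# programme from a simple list. The fileplan layout and the folder-creation
# call are hypothetical; Wisdom's real API is not shown in this report.

programmes = [
    {"school": "Computing and Engineering", "code": "BSC-CS", "title": "BSc Computer Science"},
    {"school": "Human and Health Sciences", "code": "BSC-NUR", "title": "BSc Nursing"},
]

def fileplan_path(programme: dict) -> str:
    """Build a folder path following an assumed School/Programme hierarchy."""
    return "/Quality/Programme Specifications/{school}/{code} {title}".format(**programme)

for p in programmes:
    path = fileplan_path(p)
    # create_folder(path)  # placeholder for the EDRMS folder-creation call
    print(path)
```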

What was achieved and learnt

The project achieved its desired outputs through the process change, in that all programme specification documents are now available on the web direct from the EDRMS, with the master document providing an authoritative “single point of truth”. Furthermore, when specifications are changed through formal University validation processes, amendment to the master document in the EDRMS means that the amended master is also available online without further intervention.
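A minimal sketch of the publication pattern described above, assuming a generic HTTP service that serves the current master document from a repository on request; the endpoint name, repository lookup and framework choice are illustrative, not a description of the web service the project actually built.

```python
# Illustrative sketch, not the project's actual web service. It shows the
# pattern described above: the web layer always serves whatever master
# document currently sits in the repository, so an amended master is
# published with no further intervention.
import pathlib
from flask import Flask, abort, send_file

app = Flask(__name__)

def current_master(programme_code: str):
    """Hypothetical stand-in for the EDRMS: locate the current master
    specification document for a programme code."""
    path = pathlib.Path("/edrms/programme-specifications") / f"{programme_code}.pdf"
    return path if path.exists() else None

@app.route("/ProgrammeGuides/<programme_code>")
def programme_guide(programme_code: str):
    master = current_master(programme_code)
    if master is None:
        abort(404)
    return send_file(str(master))  # always the latest validated master
```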



The project was funded to gather real-life experience of using the impact calculator tool, and the project outcomes in the use of the tool have wider significance beyond the University. The full consultancy report (available separately) outlines the findings of the research and makes recommendations for the University in the areas of:

• use of the impact calculator;
• the programme specifications web dissemination service;
• programme validation processes and specification documents; and
• wider information strategies.

Partly as a result of the work on the impact calculator and through the consultancy funded by the project, a working group to review the validation process in its entirety is being established. Led by Registry and with the involvement of the University Records Manager and other colleagues, it will draw on the project findings. Further analysis of the data gathered by the project’s timed tests and focus group will also be used to plan and implement future development of records management initiatives, including the ongoing EDRMS implementation project.

Two factors meant that it was not possible to use the impact calculator tool fully in the way envisaged by JISC infoNet: the restricted scope of the process change for which the project had a mandate, and the relatively small size of the specific IT system within the wider systems to which it belongs. It was only feasible to develop and gather data for a limited number of metrics, which showed an overall cost in monetary terms with very little in the way of tangible benefits (see Appendix 3 for information about data gathering). However, in the case of this change project, senior management had decided that the intangible benefits outweighed the few tangible benefits shown.

The University is currently using the impact calculator for another similar project: the dissemination of joining information to applicants, by drawing the joining information document for a course from the EDRMS and displaying the relevant document in the pre-enrolment portal. Despite the similarly restricted scope of this process change, it has a slightly wider range of tangible outcomes which can thus be measured more easily using the impact calculator tool. In addition, this project was planned in advance, and publishes 862 documents from 7 EDRMS folders (rather than 525 documents in 525 EDRMS folders), enabling the cost of changes to EDRMS permissions - the greatest cost in the PSD dissemination change - to be avoided entirely.

Despite the limited applicability of the impact calculator to the pilot project, the experience suggests the following lessons for information and records managers responsible for implementing and advocating change to processes in their organisations:

1. deconstructing a process as a whole, whilst potentially more time-consuming and requiring greater effort up-front, may better justify change in the long run;
2. the impact calculator is a useful tool for considering the tangible benefits of a process change where metrics are readily identifiable and can be expressed effectively (a sketch of the underlying arithmetic appears at the end of this section);
3. potential detailed metrics should be considered before change implementation, and used as part of the change requirements discussion;
4. there will be occasions when intangible benefits are accepted as the driver for a process change, although having a measure of the tangible benefits assists transparency and evidence-based decision-making;
5. the timing and planning of required systems development can have a significant impact on the cost; where there are few tangible benefits or only a small monetary benefit, the intangible benefits should be considered carefully before proceeding.

Finally, as with all projects, this project reinforces the need to plan and monitor the progress of the project itself, and to be realistic about scope and capacity in the context of ongoing service delivery.
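As a sketch of the cost/benefit arithmetic that underlies a tool like the impact calculator (which is itself a spreadsheet tool, not code): all figures, frequencies and the hourly rate below are invented for illustration, not taken from the project’s data.

```python
# Illustrative only: reproduces the basic arithmetic behind an impact
# calculation - annualised staff time saved, valued at an hourly rate,
# set against the one-off cost of the change. All numbers are invented.

HOURLY_STAFF_COST = 20.00  # assumed fully-loaded cost per hour (GBP)

def annual_saving(minutes_before: float, minutes_after: float,
                  occurrences_per_year: int) -> float:
    """Value of the time saved per year for one measured activity."""
    saved_hours = (minutes_before - minutes_after) / 60 * occurrences_per_year
    return saved_hours * HOURLY_STAFF_COST

# Hypothetical metrics in the spirit of Appendix 3:
savings = (
    annual_saving(15, 0, 40)    # manual web publication eliminated
    + annual_saving(6, 2, 500)  # faster access to an authoritative PSD
)
one_off_cost = 4000.00  # assumed systems development cost

print(f"Annual tangible benefit: £{savings:,.2f}")
print(f"Net position in year 1:  £{savings - one_off_cost:,.2f}")
```

With figures of this shape the calculation shows a net cost in the first year, which mirrors the project’s own finding that the tangible benefits alone did not justify the change.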



Appendix 1: baseline Gantt and final Gantt charts



[Gantt charts not reproducible in this text version. The baseline and final charts ran from late December 2009 to early July 2010 and covered: project initiation and set-up; defining the As Is process(es) in each School; agreeing a common To Be process; testing the auto-build of the EDRMS fileplan; collating and verifying information about all programmes validated and running/to run; automatically generating the fileplan in the EDRMS to cover each programme; uploading authoritative specification documents from current locations to the EDRMS; testing and development of the web SDK for the EDRMS; Morse permissions work; web development, testing and delivery; programme specifications available in the EDRMS and via the web CMS; support and embedding, with other support mechanisms for the revised process as necessary; impact calculator work (appointing a temporary staff member, researching and establishing metrics, data collection at Queensgate (22/04) and University Campus Barnsley (23/04), appointing consultants, data analysis, focus group, progress meeting, research, conclusion meeting, final consultants’ report); and project closure (evaluation, compiling and consulting on the lessons learned report, signing off the project, and returning documentation and data to JISC).]


Appendix 2: financial information

Item                                           Original budget   Actual expenditure
Technical consultancy for EDRMS modification   £12,000.00        £3,995.00
Temporary staff                                £2,000.00         £514.47
Consultancy                                    -                 £5,625.00
Participation incentives                       £0.00             £104.85
Total                                          £14,000.00        £10,239.32

Net contribution to the research infrastructure fund by the Service is not included.


Appendix 3: data gathering methodology

Metric 1: Increase in the number of authoritative PSDs for currently-offered courses available online.
How measured - Baseline: PSDs available from the webpage of the only School which previously published them. Actual: manual count of the web service, cross-checked against EDRMS search and audit.

Metric 2: Eliminate the time taken to add PSDs to webpages manually in order to make them available online.
How measured - Baseline: timed test by 1 staff member (as only 1 School previously did this). Actual: activity no longer performed.

Metric 3: Eliminate the time taken to remove PSDs from webpages manually when programmes are no longer offered.
How measured - Baseline: timed test by 1 staff member (as metric 2). Actual: activity no longer performed.

Metric 4: Decrease the time taken by those outside a Department to access an authoritative PSD.
How measured - Baseline: timed test; the figure in the impact calculator was weighted for a 74% success rate (compared to a 100% success rate for the “actual” figure following the change). Actual: timed test.
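The report does not state the exact weighting formula used for metric 4. One plausible reading, shown below purely as an assumption, is that the baseline time is scaled up to reflect the attempts that failed to find the document at all; the timings themselves are invented.

```python
# Assumed interpretation of the success-rate weighting for metric 4; the
# report does not give the exact formula, so treat this as illustrative.
# Idea: if only 74% of baseline attempts succeeded, the expected time to
# actually obtain the document is higher than the raw timed average.

baseline_avg_minutes = 5.0    # hypothetical raw average from the timed test
baseline_success_rate = 0.74  # observed in the baseline test
actual_avg_minutes = 1.5      # hypothetical post-change average
actual_success_rate = 1.00    # observed after the change

weighted_baseline = baseline_avg_minutes / baseline_success_rate  # ~6.76 min
weighted_actual = actual_avg_minutes / actual_success_rate        # 1.50 min

print(f"Weighted baseline figure: {weighted_baseline:.2f} minutes")
print(f"Time saved per access:    {weighted_baseline - weighted_actual:.2f} minutes")
```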

Timed test for metric 4

A general invitation to assist with a 15-20 minute test of Wisdom was sent to all School-based staff via the main internal communication channels, followed up by targeted invitations via School Record Administrators (local contacts). An incentive (a Marks & Spencer voucher) was offered to participating staff, with one participant selected at random as the winner. In addition, an invitation was sent to those School staff specifically based at University Campus Barnsley, and lunch was provided for the participants. 14 and 5 staff attended the tests at the University’s main Queensgate site and University Campus Barnsley respectively (of a total possible ~1,220 and ~50, or 1.1% and 10%, respectively).

Both groups of staff were asked to complete three tasks individually, and to time themselves using a stopwatch recording minutes/seconds/milliseconds. A member of staff observed the timed tests, which were conducted in groups of about 5; no assistance or advice was given other than what was provided on the questionnaires. The questionnaire read as follows.

Your first task is to find a programme specification from another department in your school using pre-Wisdom methods, i.e. finding what you need on your school’s network drive. Throughout these tests please search for the same programme specification. Once you have read the instructions and are ready to begin, start your stopwatch and record how many minutes/seconds it took to complete the task. At the end of the task we would like you to answer the following questions:

• How long did it take to find what you were looking for? __ minutes __ seconds
• Can you briefly describe the steps you took to complete this task?
• What date is the document? When was it last amended and by whom?
• How certain are you that you have found the most current and up-to-date version, on a scale of 1 to 5 (1 being confident, 5 being not confident)?
• How long (estimate in minutes/seconds) did it take you to check that this was the most recent authoritative master document? __ minutes __ seconds
• Please rank the ease of finding your programme specification on a scale of 1 to 5 (1 being easy, 5 being very difficult).


• Please rank how intuitive and simple you found the process of locating a programme specification belonging to another department on your school network drive, on a scale of 1 to 5 (1 being logical and intuitive, 5 being not intuitive).

Your second task is to repeat this exercise (i.e. search for the same programme specification), but this time, instead of your school’s network drive, search for what you need in Wisdom. Again, when you are ready to begin, please start your stopwatch and record how many minutes/seconds it took to complete the task. At the end of the task we would like you to answer the same questions. [same questions]

Your third task is to repeat the same exercise, but this time you will access programme specifications via the web. Again, when you are ready to begin, start your stopwatch and record how many minutes/seconds it took to complete the task. If you are struggling to find the programme specification this way, look at the bottom of this page, where there is a hint. Please make a note of how long it took you before you used the hint. At the end of the task we would like you to answer the same questions. [same questions]

Hint: (Wisdom Programme guides can be accessed at http://halo.hud.ac.uk/ProgrammeGuides/)

Raw data from the timed tests was entered into a spreadsheet for analysis, with aggregated data included for the relevant metrics in the impact calculator (a sketch of this aggregation step appears below). Participants were also asked the following qualitative questions:

• Are you a regular user of Wisdom?
• Are you more or less likely to use Wisdom in future?
• How long have you been in your current role at the University?
• What is your job role/title?
• Have you any further comments on Wisdom and this evaluation project?

Of the 19 total participants, 12 staff identified themselves as “regular” users of the EDRMS, 4 as “occasional/moderate”, and 3 as non-users. The ratio of administrative staff to academics was 14:5.
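As an illustration of the aggregation step, the following sketch computes per-task averages from timed-test records; the record layout and figures are invented, since the project’s actual spreadsheet is not reproduced here.

```python
# Illustrative aggregation of timed-test results; the records below are
# invented, not the project's data. Each tuple is (task, participant id,
# seconds taken). Per-task means would feed the impact calculator metrics.
from collections import defaultdict
from statistics import mean

records = [
    ("network drive", "P01", 312), ("network drive", "P02", 455),
    ("Wisdom",        "P01", 198), ("Wisdom",        "P02", 240),
    ("web",           "P01",  65), ("web",           "P02",  82),
]

by_task = defaultdict(list)
for task, _participant, seconds in records:
    by_task[task].append(seconds)

for task, times in by_task.items():
    print(f"{task}: mean {mean(times):.0f}s over {len(times)} attempts")
```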

Additional research

A range of staff thought most likely to be affected by the process change in the dissemination of programme specification documents were invited to a focus group facilitated by the consultants to the project. The findings were compared with the ease and confidence rankings and free-text comments made by participants in the timed tests. The consultancy report includes the agreed notes of the focus group meeting. The consultants also spoke informally with contacts among the academic staff, who were underrepresented in the timed tests and the focus group; one possible reason is that the tests and focus group were poorly timed in the academic year, falling during the last week of teaching and the second week of examinations.


