JISC Impact Calculator

1. Lessons learnt
i. Description of the project
The email project attempted to improve the effectiveness of storage by training staff in better methods for handling emails. The process was very simple: a series of training sessions in which staff were given practical guidance on how to reduce the number of emails they held in storage.
The business process we were looking at was the use of email as a storage method. The University, like most modern organisations, holds a vast amount of information in the form of emails sent and received by staff. The intention was to see whether, by training staff in simple methods of managing their emails, it was possible to reduce the amount of email that was stored. Currently, staff are given a limited amount of storage and are warned, when that allowance is almost full, that they will soon be unable to send or receive emails. The aim was to reduce storage using a less crisis-driven approach.
The project was accomplished in two phases: a research phase, and a training phase.
In the research phase, an electronic questionnaire was sent to staff in two departments of the University – one academic (the Business School) and one non-academic (Finance). The questionnaire asked a number of questions about their email usage and experiences.
The results of the questionnaire were used to design the next phase of the project. The intention was that a group of staff would be interviewed with more detailed questions about their email use. Some of them would then be given training on email management methods; the others would function as a control group against which the impact of the training could be judged.
In practice, the process did not go quite as planned. Although the response rate to the questionnaire was fairly high, only a small number of respondents agreed to take part in the training sessions, and it was necessary to interview a number of others for comparison, which meant the ‘control’ group was not genuinely random.
Following the training sessions, those who took part were asked to nominate which of the techniques they had been shown they were likely to try. They were then followed up at a couple of intervals over the succeeding weeks to see whether they were, in fact, using them. Finally, all participants – those who had had training and the members of the ‘control’ group – were asked how many emails they now had in their accounts, and how much storage space they were using.
ii. Experiences of using the Impact Calculator
Fitting this project to the Impact Calculator was worthwhile, although difficult. The impact and savings it showed were fairly low, but in many ways that was not the point. On reflection, the Impact Calculator as presently constituted is probably unsuitable for a project of this type. Although the basic idea of calculating the impact of records management initiatives is an excellent principle, there is a degree of mismatch between the information requested and the information available: much of the useful data from the project was not captured by the Calculator, and much of what was asked for was irrelevant to this project.
This is not to say that the Impact Calculator was not useful; only that a different kind of Impact Calculator could be devised which would fit an evidence-based approach to records management without necessarily focusing on projected cost savings. Since records management is not, in practice, a cost-efficiency based service, its outcomes rarely translate into direct savings. Sometimes they do – and the ability to show an outcome in this manner can be useful – but in reality the benefit of the evidence-based approach is to support records management as a tool for business improvement, leaving senior management the option to decide how it is to be used. Many records managers with long experience of the profession would note that good records management does not always translate into cost savings (e.g. an extra square metre of space in 20 rooms does not equate to an extra 20 square metre room), and the danger of being seen as nothing more than a cost-cutter is one which can undermine the profession.
In this particular instance, the data produced by the project were varied; only one piece of information was needed for the Impact Calculator.
Before the training sessions were carried out, each participant was asked how much storage space their email account occupied. The average was 372MB. Two months later, they were asked again. This time the average was 335MB. It was concluded, therefore, that the direct impact of the training was an average reduction of 37MB or 10% of the total.
The same information was requested from the control group. Here the average at the beginning of the period was 369MB, and at the end it was 454MB: an average increase of 85MB or 23% of the total.
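For clarity, the arithmetic behind these percentages is simple enough to restate; the short Python sketch below merely reproduces the averages reported above.

    # Average email storage (MB) reported by participants, from the text above.
    trained_before, trained_after = 372, 335
    control_before, control_after = 369, 454

    # Change for each group, expressed against the starting average.
    trained_change = trained_after - trained_before   # -37 MB
    control_change = control_after - control_before   # +85 MB
    print(f"Trained: {trained_change} MB ({trained_change / trained_before:.0%})")
    print(f"Control: {control_change} MB ({control_change / control_before:.0%})")

Run as written, this prints a 10% reduction for the trained group and a 23% increase for the control group, matching the figures above.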
It was clear that the training sessions had a definite, measurable impact on the amount of email being stored by participants. The actual benefit of the project in practice was small, since it was a pilot and there were only a few participants. The figures used in the Impact Calculator, therefore, are projected from these results.
It was assumed that, to achieve the same effect for the University as a whole, around 3000 staff members would need to be trained; that this would involve 150 one-hour training sessions; that the staff cost of delivering each session would be £25, with an extra £5 in other costs; and that 15 sessions for new staff would be required in each succeeding year.
Calculating the benefits was a good deal more difficult. A strict comparison between the 37MB decrease of those trained and the 85MB increase of the control group suggests that a saving of 124MB of storage could be made for each staff member every two months; projected over a year this would be 744MB per person. With 3002 FTE staff as a baseline, this suggests that, in principle, a fully trained workforce could reduce its requirement for email storage by 2,181GB. At around £5.50 per gigabyte, this would translate into £11,996 per year; even without the sophisticated calculations available through the Impact Calculator, it can be seen that this would be more than the cost of training all staff in better email management.
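The projection can be reproduced directly from the stated assumptions. The sketch below takes the report's per-person figure of 124MB per two months as its input (which presumably derives from unrounded averages), alongside the staffing and cost assumptions given above.

    # Training cost assumptions stated above.
    sessions_year_one = 150          # one-hour sessions to train ~3000 staff
    sessions_per_later_year = 15     # refresher sessions for new staff each year
    cost_per_session = 25 + 5        # £25 staff cost plus £5 other costs
    print(f"Year-one training cost: £{sessions_year_one * cost_per_session:,}")
    print(f"Ongoing annual cost: £{sessions_per_later_year * cost_per_session:,}")

    # Projected storage saving, using the report's per-person figure.
    saving_mb_per_two_months = 124   # trained decrease plus control increase
    fte_staff = 3002
    saving_gb_per_year = saving_mb_per_two_months * 6 * fte_staff / 1024
    cost_per_gb = 5.50               # £ per gigabyte of storage
    print(f"Projected saving: {saving_gb_per_year:,.0f}GB, "
          f"about £{saving_gb_per_year * cost_per_gb:,.0f} per year")

This yields £4,500 to train the workforce, £450 per year thereafter, and a projected saving of 2,181GB or £11,996 per year, as above.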
However, in practice, savings would not be made in quite this way. The University’s current arrangement for email management is the 500MB limit, and the real outcome would simply be that staff would reach that limit more often. The trained staff would not continue to reduce their stored email indefinitely, and the remainder would not increase theirs indefinitely either. For the purpose of showing a saving in the Impact Calculator, it was necessary to assume that email-trained staff would hold their storage at around 335MB and that the others would average 500MB. It was also assumed that the cost of storage would remain the same over five years.
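Under these assumptions, the steady-state saving is simply the gap between the two storage levels. The figures below merely recompute the assumptions stated above for illustration; the actual Impact Calculator entries are in the attached data.

    # Steady-state assumption used for the Impact Calculator: trained staff
    # hold their storage at around 335MB, untrained staff sit at the 500MB cap.
    trained_level_mb = 335
    cap_mb = 500
    fte_staff = 3002
    cost_per_gb = 5.50               # £ per gigabyte, assumed flat over 5 years

    saving_gb = (cap_mb - trained_level_mb) * fte_staff / 1024
    print(f"Steady-state saving: {saving_gb:,.0f}GB, "
          f"about £{saving_gb * cost_per_gb:,.0f} per year")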
In reality, the benefit of the exercise is not to show clearly quantifiable savings in disk storage so much as to show that the issue was one which could be addressed by a training-based, records-management approach rather than the Procrustean solution of simply limiting storage to 500MB. Given that the current IT approach tends to respond to increased demand with increased supply, the possibility of using a different method represents a shift from quantity to quality.
iii. Hints and tips
Some things, such as office storage space, can be measured easily. Email, despite appearances, is not one of them. Since it is largely invisible and controlled by individual staff members, extracting even simple information, such as how much storage space a person uses, is not as easy as it may seem. Outlook, the email software package used by almost all staff, has no obvious way of showing how many emails a person has in their account; there is a trick to doing it, but it has to be discovered. Other questions, such as how many emails are in the Inbox, are easy to answer, but most people have never done so and did not know how.
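Where a scripted approach is acceptable, the counts can be pulled programmatically instead of through the Outlook interface. The sketch below is one such route, using Outlook's COM automation interface via the pywin32 package; it assumes a Windows machine with a configured Outlook profile.

    # Count and size the default Inbox via Outlook's COM automation interface.
    # Assumes Windows, a configured Outlook profile, and the pywin32 package.
    import win32com.client

    OL_FOLDER_INBOX = 6  # Outlook's constant for the default Inbox folder
    namespace = win32com.client.Dispatch("Outlook.Application").GetNamespace("MAPI")
    inbox = namespace.GetDefaultFolder(OL_FOLDER_INBOX)

    print(f"Inbox items: {inbox.Items.Count}")
    # Summing item sizes (bytes) gives a rough storage figure for the folder;
    # this can be slow on large mailboxes.
    total_bytes = sum(item.Size for item in inbox.Items)
    print(f"Inbox size: {total_bytes / 1024 / 1024:.0f}MB")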
The fact that email accounts are personal meant that a degree of diplomacy was needed when asking people about their use of email. Questions were interpreted in different ways: some respondents gave exact figures, others just an estimate. The amount of time they were prepared to spend was limited, and so, as a result, was the degree of detail that could be obtained. Some kinds of information had to be inferred rather than asked for directly. For instance, instead of asking staff how many email attachments they had in their stored email – which would have taken them considerable effort to calculate – we simply divided the total amount of storage by the number of emails; the average result was so much bigger than the average size of a plain-text email that it clearly suggests a very high proportion of storage space is taken up not by email content but by attachments.
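The inference works as follows; the email count and plain-text size in this sketch are purely illustrative, since the report does not publish the underlying figures.

    # Illustrative only: the report does not give per-person email counts,
    # so email_count and plain_text_kb here are hypothetical placeholders.
    storage_mb = 372        # average storage per account, from the text above
    email_count = 5000      # hypothetical number of stored emails
    plain_text_kb = 20      # assumed rough size of a plain-text email

    avg_email_kb = storage_mb * 1024 / email_count
    print(f"Average stored email: {avg_email_kb:.0f}KB "
          f"vs roughly {plain_text_kb}KB for plain text")
    # A large gap implies most storage is attachments, not message text.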
2. Suggested changes
As currently envisaged, the Impact Calculator seems to be built around projecting future savings from an actual exercise; but as this project has shown, it can work almost as well around projecting savings from a ‘scaled up’ version of a current exercise.
One element which made an important difference was the introduction of the ‘control’ group, which allowed a comparison to be made and highlighted the value of the work done, because it showed a measurable alternative to the process used.
3. Data produced by pilot project
The following are attached:
– Impact Calculator (University of Aberdeen)
– Impact Calculator email project results (June)
– Email questionnaire results