DRIVING IMPROVEMENT WITH OPERATIONAL RESEARCH AND DECISION ANALYTICS
AUTUMN 2016
OPTIMISATION TECHNIQUES CONTRIBUTE TO SUCCESS OF ROSETTA MISSION
Making the best use of very limited resources in scheduling the experiments carried out on comet 67P
SAVING LIVES IN SOUTH AFRICA Making an HIV counselling and testing service more efficient
SIMULATING CROWD MOVEMENT Helping planners reduce congestion during major events
Why do over 70% of Fortune 50 companies use SIMUL8 simulation software?
Powerful. Flexible. Fast.
Some of the world’s most successful organizations rely on SIMUL8 simulation software because it helps them to make and communicate their most important decisions. Don’t rely on spreadsheets to be your process improvement toolkit. Make bold, confident decisions because you have the evidence to be sure you are making the right choice. SIMUL8 has helped Industrial Engineers for over 20 years – saving money, reducing waste and improving efficiency.
Simulation software for innovative Industrial Engineers www.SIMUL8.com
EDITORIAL

Welcome to the fourth issue of Impact. If you missed the first three issues, electronic copies are available at https://issuu.com/orsimpact. For future issues of this free magazine, please subscribe at http://www.getimpactmagazine.co.uk/.

This fourth issue contains the usual eclectic mix of stories of analytical work making an impact in a variety of organisations. As I began thinking about this editorial, the Rosetta spacecraft was being crashed on Comet 67P, a 4km-wide ball of ice and dust, at the end of its mission after 12 years and a nearly 8 billion-kilometre journey. It was the first spacecraft to orbit a comet, and, in November 2014, the first to deploy a lander, Philae. The series of experiments carried out on Comet 67P by Philae, under extreme constraints of time and resources, were scheduled using sophisticated analytics. You can read about this in our lead article.

Four years on from London 2012, we have another example of how analytics helped in the planning of the Olympic and Paralympic Games. Following the story in the last issue concerning security at the Games, in this issue we can read how analysis of crowd movements was used to alleviate congestion.

I am pleased to bring you two stories of Operational Research being used to help not-for-profit organisations. In South Africa, Shout-it-Now, which is involved in HIV counselling and testing, used simulation to help them contribute significantly to reducing the rate of new HIV infections. In the UK, Bloodwise, a specialist blood cancer charity, took advantage of the OR Society's Pro Bono scheme, which provided a volunteer analyst to help it understand better the various audiences with which it seeks to engage. Details of the scheme are found in the article.

There are other good stories: how improved shipping schedules save the Noble Group some $150m p.a., how Andrew Jardine's lifetime work in maintenance and asset management has been deployed around the world and how the O.R. team at HMRC are working to collect tax revenue more effectively. Truly an eclectic mix. I hope you enjoy reading them all, including the last mentioned!
The OR Society is the trading name of the Operational Research Society, which is a registered charity and a company limited by guarantee.
Seymour House, 12 Edward Street, Birmingham, B1 2RX, UK Tel: + 44 (0)121 233 9300, Fax: + 44 (0)121 233 0321 Email: email@theorsociety.com Secretary and General Manager: Gavin Blackett President: Ruth Kaufmann FORS, OBE (Independent Consultant) Editor: Graham Rand g.rand@lancaster.ac.uk
Print ISSN: 2058-802X Online ISSN: 2058-8038 Copyright © 2016 Operational Research Society Ltd Published by Palgrave Macmillan Printed by Latimer Trend This issue is now available at: www.issuu.com/orsimpact
Graham Rand
OPERATIONAL RESEARCH AND DECISION ANALYTICS
Operational Research (O.R.) is the discipline of applying appropriate analytical methods to help those who run organisations make better decisions. It's a 'real world' discipline with a focus on improving the complex systems and processes that underpin everyone's daily life - O.R. is an improvement science.

For over 70 years, O.R. has focussed on supporting decision making in a wide range of organisations. It is a major contributor to the development of decision analytics, which has come to prominence because of the availability of big data. Work under the O.R. label continues, though some prefer names such as business analysis, decision analysis, analytics or management science. Whatever the name, O.R. analysts seek to work in partnership with managers and decision makers to achieve desirable outcomes that are informed and evidence-based.

As the world has become more complex, problems tougher to solve using gut-feel alone, and computers become increasingly powerful, O.R. continues to develop new techniques to guide decision making. The methods used are typically quantitative, tempered with problem structuring methods to resolve problems that have multiple stakeholders and conflicting objectives.

Impact aims to encourage further use of O.R. by demonstrating the value of these techniques in every kind of organisation – large and small, private and public, for-profit and not-for-profit. To find out more about how decision analytics could help your organisation make more informed decisions see www.scienceofbetter.co.uk. O.R. is the 'science of better'.
Beale Lecture 2017: Thursday 2 March 2017 – Entry free
The OR Society’s Beale Medal is awarded each year in memory of the late Martin Beale. It gives formal recognition to a sustained contribution over many years to the theory, practice, or philosophy of O.R. in the UK.
Lecture: “Linking Public Policy Worlds: Working Together to Shape Public Policy Choices”
Mr John Friend (Beale Medal Winner 2015)
John is best known in the O.R. world for his pioneering role in developing the Strategic Choice Approach, now viewed as a leading member of the softer O.R. family of problem structuring methods. He has also attracted international attention among policy professionals and academics as a source of fresh insights into the inter-organisational dynamics of public policy choice. He will argue that UK universities are now strategically placed to develop links with international partners in promoting the further development of a useful science of public policy design.
Opening talk: “Randomized Coordinate Descent Methods for Big Data Optimization” Dr Martin Takác (PhD Winner 2014) There is an ever increasing demand for solving “big data” problems, each described by gigabytes or terabytes of data from uncountable sources. Often, the problem is formulated as an optimization problem, and in this talk we analyse iteration complexity of coordinate descent methods for various loss functions. Moreover, in order to make use of the modern high-performance computers, the parallel version of CDM is proposed and analysed.
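For readers unfamiliar with the technique, a coordinate descent method updates one coordinate of the decision vector at a time rather than computing a full gradient step. The fragment below is a minimal illustrative sketch, not code from the talk: randomized coordinate descent on a least-squares objective, with the data, problem size and iteration count invented purely for the example.

```python
# Illustrative sketch only: randomized coordinate descent on the
# least-squares objective 0.5*||Ax - b||^2 (data and sizes are invented).
import numpy as np

def randomized_cd_least_squares(A, b, iters=20_000, seed=0):
    """Update one randomly chosen coordinate at a time, minimising the
    objective exactly along that coordinate."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    residual = A @ x - b                   # maintained incrementally
    col_sq_norms = (A ** 2).sum(axis=0)
    for _ in range(iters):
        j = rng.integers(n)                # coordinate chosen uniformly at random
        if col_sq_norms[j] == 0:
            continue
        step = (A[:, j] @ residual) / col_sq_norms[j]  # exact 1-D minimiser
        x[j] -= step
        residual -= step * A[:, j]         # cheap O(m) residual update
    return x

# Tiny synthetic example
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 50))
b = A @ rng.standard_normal(50)
x_hat = randomized_cd_least_squares(A, b)
print("final residual norm:", float(np.linalg.norm(A @ x_hat - b)))
```

Because each update touches only one column of the data, steps of this kind are cheap and easy to parallelise, which is the point of the parallel CDM variants discussed in the talk.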
Thursday 2 March 2017, The Royal Society, 6-9 Carlton House Terrace, London SW1Y 5AG. Timings: 14:30 Tea and biscuits; 15:00 Lectures start; 16:30 Approximate finish. Entry free.
Register your place at: www.theorsociety.com/beale Please contact Hilary Wilkes hilary.wilkes@theorsociety.com with any queries.
CONTENTS

6 SCHEDULES IN SPACE
Brian Clegg reports how a French team used constraint programming to ensure that the experiments on Comet 67P were scheduled to make the best use of very limited resources

13 SHIPPING FORECAST
Neil Robinson shows how minimising transportation costs produced savings for the Noble Group of some $1.3m a month within 18 months of implementation

17 USING OPERATIONAL RESEARCH TO GUIDE INNOVATION AND SAVE LIVES
Stephen Stafford tells how a South African organization, specializing in HIV counselling and testing, used simulation to make their service more effective and ultimately save more lives

20 IMPROVING PHYSICAL ASSET MANAGEMENT
Elizabeth Thompson shows how the work of Andrew Jardine has influenced asset management in a wide variety of industries

26 THE THIRD WAY
Neil Robinson reports how a volunteer with the OR Society's Pro Bono scheme helped a cancer charity improve its promotional activities
30 O.R. IN HM REVENUE AND CUSTOMS
John Lord and Andrew Culling explain how HMRC's large group of analysts have policy and operational roles, aiming for effective use of resources and improved taxpayer experiences

34 MODELLING PEDESTRIAN MOVEMENT WITH SENSE™: OLD SCIENCE, NEW APPROACH
Daniel Marin explains how a simulation tool combined with a Geographical Information System helped to model large crowds, from mass evacuation plans to optimized visitor experience at major events

4 Seen elsewhere
Analytics making an impact

10 Making an impact: too important to ignore
Mike Pidd reflects on Brexit, and the advantages of story-telling

25 Universities making an impact
A brief report of a postgraduate student project

39 Finding the best way with Dynamic Programming
David Smith demonstrates that dynamic programming can help you find the way forward

43 Life Rithms
Geoff Royston argues that insights gained from algorithms can help us tackle many everyday problems more effectively
SEEN ELSEWHERE

TOP CHALLENGES IN DATA AND ANALYTICS
Mu Sigma's recently released 'State of Analytics and Decision Science' report found that:
• 34% of respondents noted data quality, consistency and availability are the most important issues plaguing their analytics initiatives.
• Issues related to a dearth in available skills, whether due to talent shortages or lack of training, were the second-highest challenge (30%) faced by companies.
• Underperforming companies are twice as likely to identify skill set deficiencies as their most pressing challenge in analytics.
• Business acumen and communication skills are two of the top three skill set domains where businesses see the need for improvement.
See more at: https://www.mu-sigma.com/our-musings/blog/uncoveringgaps-and-trends-in-data-and-analytics
FAIR ACCESS TO HEALTHCARE ON SUB-SAHARAN TRUCK ROUTES

Long distance truck drivers in Sub-Saharan Africa are extremely vulnerable to HIV and other infectious diseases. The NGO North Star Alliance tries to reduce this problem by placing so-called Roadside Wellness Centres (RWCs) at busy truck stops along major truck routes. Locations for new RWCs are chosen so as to maximize the expected patient volume and to ensure continuity of access along the routes. As North Star's network grows larger, providing equal access to healthcare along the different truck routes gains in importance. In "A column generation approach for locating roadside clinics in Africa based on effectiveness and equity" (European Journal of Operational Research 254(3) 1002–1016), José Núñez Ares, Harwin de Vries, and Dennis Huisman use the North South Corridor Network to analyse the impact of including the equity criterion in the location problem. The results show that significant improvements in terms of equity can be achieved at marginal loss in terms of North Star's current objectives.

NOT ONLY HOT AIR

Dr Ahmed Kheiri from Cardiff University's OR Group has won the prestigious ROADEF/EURO 2016 challenge (http://challenge.roadef.org/2016/en/index.php), beating off competition from 41 teams across 16 different countries. The competition used a real-world problem of inventory routing with a focus on healthcare services delivering large volumes of liquid oxygen to large numbers of hospitals worldwide subject to a variety of constraints being met. A number of problem instances were provided by Air Liquide. His winning approach combined the machine learning method of Hidden Markov Models and a hyperheuristic technique with the ability to exploit the computing power of multi-core machines. Dr Kheiri said: "The potential impact of this work is significant and has the possibility to be applied to other real-world applications."
NOT ONLY TRAVELLING SALESMAN

Two unusual applications of the Travelling Salesman Problem (TSP) have been spotted. Who can see the most Olympic events in one day was the challenge of one New York Times reporter to another. One turned for help to Michael Trick, from the business school at Carnegie Mellon University, and President of the International Federation of Operational Research Societies. Guess who won! You can see what happens when you use the TSP to plan a day at the Olympics at http://www.nytimes.com/2016/08/19/sports/olympics/reporter-race-to-events-rio-games-mather.html and what happens when you don't at http://www.nytimes.com/2016/08/19/sports/olympics/reporter-race-to-events-rio-games-lyall.html. For Pokémon hunter-gatherers the challenge is to capture them all, as quickly as possible. If your Pokémon area has hundreds, or even thousands, of stops, the TSP can find you the best route. See http://www.math.uwaterloo.ca/tsp/poke/index.html

LOADING CAR PARTS

Jean Respen and Nicolas Zufferey from the Geneva School of Economics and Management have investigated a complex truck loading problem faced daily by the French car manufacturer Renault, in which items, typically car parts (wheels, brakes, chassis, etc.), need to be placed in a truck such that they fulfil different constraints: see International Transactions in Operational Research 24 (2017) 277–301. They developed several heuristic algorithms which do not significantly outperform the values obtained by the Renault algorithms, but they solved some instances for which Renault was not able to generate any feasible solution. Renault possibly uses one additional truck for such instances, which significantly increases costs.

VAN ANALYTICS

Driving in India is at best hazardous. A start-up company, Vahanalytics, has produced "Driven", a software app that works through a mobile phone and records whether drivers are rash, reckless or safe. This data is then analysed and recommendations are fed back to the drivers in order to improve their standards and, hopefully, reduce the number of accidents. More at: http://bit.ly/2amUzGp

TWITTER V GOOGLE

Do ephemeral tweets forecast future TV ratings better than the more deliberate searches recorded in Google Trends? In "A Structured Analysis of Unstructured Big Data by Leveraging Cloud Computing" (Marketing Science 35(3) 363–388), Xiao Liu, Param Vir Singh and Kannan Srinivasan argue that they do, following a massive data analysis. Though that conclusion isn't immediately obvious from the paper's title, is it? (Details of using the Twitter logo can be found at https://brand.twitter.com/)

CLINICAL TRIALS FOR CANCER

Research arising out of personal tragedy has been published by Dimitris Bertsimas (Sloan School and Operations Research Center, MIT) and his co-authors (see Management Science 62(5), 1511–1531). When his father was diagnosed with stage IV gastric cancer, Dimitris discovered that no two hospitals were using the same drugs and/or treatment regimens. Using simple statistics, he decided to send his father to Massachusetts General Hospital in Boston. The result was that his father survived for 24 months, 14 months longer than the average. After the death of his father, he and his colleagues extracted relevant information from more than 400 papers published between 1980 and 2013. From this they were able to create a unique database that compares toxicity and survivability across a range of drugs and drug combinations for those battling gastric cancer. The database was used to build statistical models to predict a trial's median overall survival and whether a trial has unacceptably high toxicity. They have proposed models that use machine learning and optimization to suggest regimens to be tested in phase II and phase III trials. The indication is that the models might improve the efficacy of the regimens selected for testing in phase III clinical trials without changing toxicity outcomes.
SCHEDULES IN SPACE
BRIAN CLEGG
THE EUROPEAN SPACE AGENCY (ESA) made the news in 2014/15 when its Rosetta spacecraft came to the end of a ten-year voyage to rendezvous with comet Churyumov-Gerasimenko, better known as 67P. Crucial to the mission – and to the drama of the unfolding story – was the small Philae probe, named after the Philae obelisk which aided those decoding hieroglyphs using the Rosetta stone. Philae landed on 67P on 12 November 2014, only to have a partial failure of its anchoring mechanism. This resulted in a series of bounces that carried the probe into the shade of a cliff, meaning its solar panels could never get sufficient sunlight.
That anything was retrieved at all from Philae reflected careful planning work by the team of Gilles Simonin, Christian Artigues, Emmanuel Hebrard and Pierre Lopez from the French Laboratoire d'Analyse et d'Architecture des Systemes (LAAS), working for the Centre National d'Etudes Spatiales (CNES) in Toulouse. They had employed a versatile O.R. technique, constraint programming, to ensure that the experiments on Philae were scheduled to make the best use of available resources – and the flexibility of this model proved crucial after the landing. From the team, Emmanuel Hebrard took a Masters in computer science at the University of Montpellier and went on to gain a doctorate on uncertainty and robustness in constraint satisfaction and optimisation problems. He became involved in Operational Research in 2010 when appointed a CNES researcher at LAAS in Toulouse.

Rosetta/Philae proved to be one of the greatest challenges that Hebrard and his colleagues have faced. After its launch on an Ariane 5G+ rocket from French Guiana in 2004, the probe travelled more than 6 billion kilometres before the rendezvous with comet 67P. There it dispatched the Philae lander, carrying ten experiments, each with unique demands on energy and memory. Energy came from a battery that could only power a 12-watt light bulb for three days, while storage for data from all the experiments was restricted to 6 megabytes. In simulations this was kept down to 4MB to make operations more robust. The memory capacity seems ludicrously limited, thousands of times smaller than the storage of a smartphone. Yet, as Hebrard points out, "The CDMS (Command and Data Management System) was developed in the late 90s, as was most of the programme [and] the components must be of a very high grade to make sure they survive deep space radiation and extreme temperatures." Although 128MB flash memory cards were available by 2003, the long lead time and intensive technical requirements meant that a leading edge project like Philae ended up behind the times in its technology.

Because of the limitations of the system, good scheduling proved essential to make the best use of available time. Emmanuel Hebrard: "For each instrument there was a research team somewhere in Europe who designed it which was waiting on the results for about 20 years. Clearly, they all wanted enough resources for their experiment to run correctly, multiple times if possible. By carefully optimizing the activity plan, we wanted to help them do as much science as possible. For instance, one of the main objectives was to maximize the life span of the main batteries. Since Philae's solar panels are not sufficient to sustain normal activity, everything had to run on the power delivered by those batteries. In a nutshell, an ill-defined plan might result in damaging the system, losing data or sacrificing experiments because of the lack of resource." Moreover, because of the distance to the comet, it was impossible to control the operations live. Signals took about half an hour to reach Rosetta, and frequently Philae was hidden from the orbiter by the comet and unable to communicate. Plans had to be computed and uploaded onto the Philae CPU long in advance: an initial plan for the full three days of experiments was uploaded prior to the separation from Rosetta, and only a handful of updated plans could be uploaded later.

To schedulers, Philae provided a resource-constrained, multi-project scheduling problem. Here the projects were the different experiments, while the resource constraints were the energy supply and the available memory. There were also secondary scheduling requirements – for example Philae carried a drill to collect samples from 67P's surface. This had to be deployed first, then an oven moved into place to receive the sample, before moving again to link to Ptolemy, an instrument developed in the UK to analyse the gases given off as the sample was heated. This kind of problem is an ideal application of constraint programming.
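The team's own models and software were of course far richer, but the general style of model that constraint programming supports can be sketched in a few lines. The toy below uses the open-source OR-tools CP-SAT solver rather than the mission software, and the experiment names, durations, power demands and battery budget are invented purely for illustration.

```python
# Toy constraint-programming schedule in the spirit of the Philae problem:
# a handful of experiments compete for a shared power budget, one chain of
# tasks (drill -> oven -> analyser) must run in order, and we maximise the
# science value of the experiments that fit within the horizon.
# Illustrative only: all names and numbers are invented, and this is
# OR-tools CP-SAT, not the CNES/LAAS software.
from ortools.sat.python import cp_model

HORIZON = 60          # scheduling horizon in arbitrary time units
POWER_CAPACITY = 10   # units of power available at any instant

# (name, duration, power demand, science value)
experiments = [
    ("drill",    8,  6, 5),
    ("oven",     6,  5, 5),
    ("analyser", 10, 4, 8),
    ("camera",   5,  3, 3),
    ("radar",    12, 7, 6),
]

model = cp_model.CpModel()
starts, ends, presences, intervals = {}, {}, {}, {}
for name, dur, power, value in experiments:
    presences[name] = model.NewBoolVar(f"run_{name}")
    starts[name] = model.NewIntVar(0, HORIZON, f"start_{name}")
    ends[name] = model.NewIntVar(0, HORIZON, f"end_{name}")
    intervals[name] = model.NewOptionalIntervalVar(
        starts[name], dur, ends[name], presences[name], f"iv_{name}")

# Experiments running at the same time may not exceed the power budget.
model.AddCumulative([intervals[n] for n, *_ in experiments],
                    [p for _, _, p, _ in experiments], POWER_CAPACITY)

# The sampling chain must run in order, and only as a whole.
for a, b in [("drill", "oven"), ("oven", "analyser")]:
    model.Add(ends[a] <= starts[b]).OnlyEnforceIf([presences[a], presences[b]])
    model.Add(presences[a] == presences[b])

# Maximise the total science value of the experiments we manage to run.
model.Maximize(sum(presences[n] * v for n, _, _, v in experiments))

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for name, dur, *_ in experiments:
        if solver.BooleanValue(presences[name]):
            print(f"{name:8s} starts at t={solver.Value(starts[name])}")
```

Running the sketch prints a start time for every experiment that fits; the real models also had to handle memory, data transfer windows, exclusions and battery discharge profiles.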
Hebrard again: “Constraint programming is a paradigm to tackle combinatorial problems, that is, a problem where
we want to find, among a discrete set of candidates, a feasible solution maximizing some objective, subject to a set of constraints.” The model is given a set of restrictions that limit the available solutions, and then uses an algorithm to work towards a preferred solution, usually by a repetitive process. Many of these problems are complex enough that it isn’t possible to find the best solution in any reasonable time, so the algorithm tries to find a solution that comes close to the optimum. Hebrard: “The list of such problems is endless: finding the shortest tour for a traveling salesman, computing university timetables, allocating resources to optimize the response to wildfires, optimizing the treatment of cancer patients by radiotherapy, designing wildlife corridors, computing carpooling itineraries, making pricing policies, computing the most likely fold of a protein… [all] can be seen as finding an item optimizing
some criterion, among a discrete set. In the case of Philae, among the huge set of possible plans we needed to find the one that satisfied all the constraints (i.e., which does not draw too much power at any given time, does not lose data, achieves all scientific goals, respects all precedences and exclusions, whilst maximizing the life span of the batteries). "The principle of constraint programming is to try to beat a theoretical worst case by being smart through a divide and conquer strategy. The main idea is to cut the problem into parts that are individually easier to reason about. Then we try to make as many (logical) deductions as possible and 'propagate' this reasoning across the different parts. This simple scheme can be very powerful. However, it is not sufficient in general, and in order to solve the problem we still need a search component, just like a program for playing Go needs to explore the possible moves. In practice, it is often possible to dramatically reduce this brute-force search component down to a practical size. A constraint solver can be seen as exploring an immense tree whose leaves are the 10^1000 possible schedules, which corresponds to the number of permutations of about 450 tasks - about the problem size for Philae. However, through deductions, it can prune branches of this tree early enough
and effectively explore the whole tree without going through each leaf." Each of the major constraints broke down into subunits. Energy supply, for example, could suffer a bottleneck from the power limit on the battery, the converter that turned the battery voltage into an appropriate one for the experiment and the power line that carried the electricity. Similarly, the memory constraints involved the tiny 6 MB central store (or "mass-memory"), the local memory on the experiments and the ability to transfer data from the mass-memory to Rosetta in the brief windows when there was radio contact.
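The "deduce, then branch" loop Hebrard describes can be made concrete with a bare-bones sketch. The skeleton below is a generic illustration, not the LAAS/CNES solver: variable domains are pruned by propagation rules until a fixed point, and the solver only branches when propagation alone cannot decide a value. The toy problem, pruning rules and numbers are invented for the example.

```python
# Generic "propagate then branch" skeleton of a constraint solver
# (illustrative sketch only, not the mission code).
# Domains are sets of candidate start times; constraints are pruning rules.

def propagate(domains, constraints):
    """Apply every pruning rule until nothing changes; False means dead end."""
    changed = True
    while changed:
        changed = False
        for prune in constraints:
            if prune(domains):
                changed = True
            if any(len(d) == 0 for d in domains.values()):
                return False          # a domain emptied: this branch is infeasible
    return True

def search(domains, constraints):
    """Depth-first search over the pruned tree; returns one solution or None."""
    domains = {v: set(d) for v, d in domains.items()}   # private copy per branch
    if not propagate(domains, constraints):
        return None                                      # pruned without branching
    if all(len(d) == 1 for d in domains.values()):
        return {v: next(iter(d)) for v, d in domains.items()}
    var = min((v for v in domains if len(domains[v]) > 1),
              key=lambda v: len(domains[v]))             # branch on smallest domain
    for value in sorted(domains[var]):
        child = dict(domains)
        child[var] = {value}
        solution = search(child, constraints)
        if solution is not None:
            return solution
    return None

# Toy instance: three tasks of length 2 share one resource (no overlaps),
# start times 0..5, and task "a" must finish before task "c" starts.
def no_overlap(doms):
    removed = False
    for t, d in doms.items():
        if len(d) == 1:                  # a fixed task blocks its time slot
            (s,) = d
            for u, du in doms.items():
                if u != t:
                    clash = {x for x in du if s <= x < s + 2 or x <= s < x + 2}
                    if clash:
                        du -= clash
                        removed = True
    return removed

def a_before_c(doms):
    latest_c, earliest_a = max(doms["c"]), min(doms["a"])
    bad_a = {x for x in doms["a"] if x + 2 > latest_c}
    bad_c = {x for x in doms["c"] if x < earliest_a + 2}
    doms["a"] -= bad_a
    doms["c"] -= bad_c
    return bool(bad_a or bad_c)

print(search({t: set(range(6)) for t in "abc"}, [no_overlap, a_before_c]))
```

Even on this toy instance, propagation removes impossible start times before any branching happens; the mission schedulers applied the same principle to a vastly larger tree.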
An initial plan for the full three days of experiments was uploaded prior to the separation from Rosetta
The chief constraint on memory involved simple rules. Data was transferred in 256 byte blocks (one byte is the equivalent of a single ASCII character corresponding to, say, a single letter in this article). The scheduler was given a priority order for the experiments and would work through them until it found one with at least a block of data waiting, at which point a block would be transferred. If Rosetta was within radio contact, a block would then be transferred from the mass-memory to the spacecraft. However, if Rosetta was hidden by the comet, data would be accumulated repeatedly this way. When the mass-memory and the local experiment memories were full, data would start to be lost. Because the least certain aspect was the availability of data transfer to Rosetta, the early version of the model called MOST
(Mission Operations Scheduling Tool), based on IBM’s ILOG scheduling package, was turned on its head to see how practical a schedule of experiments was as data transfer options were varied. The original version of the model allowed data to build up in the experiments until it was lost – and it frequently was. As restructured, the model attempted to avoid data loss, which not only produced more robust solutions but drastically reduced the run times. The change in approach was important, as there were too many individual data transfers for detailed modelling – each experiment might require up to 700 transfers to Rosetta in a day. The effect of this change was to reduce the time taken to produce a schedule from hours to seconds. This would become a vital contribution to responsiveness when it was realised that due to the location of Philae, it would be necessary to undertake all activity in short bursts before the batteries gave out. Hebrard noted: “By making stronger deductions faster, our algorithms were able to considerably reduce the search tree and thus speed up the response time of the whole system, to the extent that scenarios to which no solution had been found prior to their use could now be solved in seconds.” Once it was clear that Philae was unable to charge its secondary solar batteries, a different prioritisation of instruments had to be applied to ensure that those not requiring physical movement, and hence having the least drain on the batteries, were activated first. This left enough resources to be able to deploy the drill, oven and Ptolemy to collect some surface data. Hebrard: “Because of the malfunction of the harpoons and boosters that were supposed
to maintain Philae on the ground after its landing, and the rebounds that followed, a period of intense activity, and re-planning, began at SONC [Scientific Operations and Navigation Control of the CNES]. During that period, it would have been very damaging to be limited by the system's ability to compute new plans quickly." With the quick response available, Philae managed to return 80 per cent of the science data expected during the first sequence making use of the primary battery, before any solar energy could be harvested. The system then shut itself down in mid-November 2014. The most significant discovery was an unexpectedly high presence of water ice, plus a range of organic compounds, including acetone and methyl isocyanate, many of them precursors of essential molecules for life.
Our algorithms were able to considerably reduce the search tree and thus speed up the response time of the whole system
Philae made new contact in June 2015 and sent back a small amount of additional data, including some from a surface drill. Overall, the data from Philae showed that the comet’s structure was more complex than previously thought; the unexpectedly hard surface of the comet suggested it could have undergone significant changes since its formation, where previously it had been assumed that such comets remained largely unchanged since the solar system formed. Perhaps the most reported result, that 67P contained a deuterium
to hydrogen ratio different from Earth water, suggesting our water did not come from comets of this type, was produced by the ROSINA experiment on Rosetta, rather than from Philae. When the Philae landing went wrong, it demonstrated the importance of planning when running a project where equipment must operate after a decade in a hostile environment. The preparation undertaken by the team at CNES paid off when it proved possible to undertake as many experiments as were achieved before the battery ran out. As Philae Project Manager, Philippe Gaudon commented: "Scheduling of 10 instruments (18 subinstruments) is not easy… the results were very satisfactory: instead of 24 hours, we got a result in a few minutes. During the real operation, the MOST tool was used before each upload of a new scenario. As you know we went on the Abydos landing site without knowing, for example, the lander-orbiter visibility and Sun illumination. So we adapted the scenario of the science operations several times, using pieces of the previous optimized one." The O.R. contribution may have been a small part of the overall mission, yet it proved crucial to make Philae a success. Space truly is the final frontier – and one where O.R. has proved its worth.
Brian Clegg is a science journalist and author who runs the www.popularscience.co.uk and his own www.brianclegg.net websites. After graduating with a Lancaster University MA in Operational Research in 1977, Brian joined the O.R. Department at British Airways, where his work was focussed on computing, as information technology became central to all the O.R. work he did. He left BA in 1994 to set up a creativity training business.
MAKING AN IMPACT: TOO IMPORTANT TO IGNORE

Mike Pidd
A PUZZLING PARADOX
One of my daughters is a journalist who occasionally reviews bikes for fun, and even managed to wangle a trip to Brazil to cover the Rio Olympics. Alongside such pleasures, she also writes nitty-gritty pieces about life here in the UK. Like most of us, she spends much of her life with like-minded people and so was surprised by what she learned on a trip to the North East of England. For those who don't know this region, it has some wonderful scenery, produces at least one world-famous beer and its people are lovely, warm and friendly. It has, though, suffered very badly from de-industrialisation. Historically it's been a stronghold of the Labour Party and one of its constituencies returned Tony Blair as its MP during his time as Prime Minister. I'm writing this a few weeks after the EU referendum in which a narrow majority voted to leave the EU. During the run-up to the vote, the rational, UK-based readers of this piece probably varied between boredom and annoyance at the various claims and counter claims they heard. I'll keep my own views to myself and concentrate on the ways that the two sides conducted themselves. Some writers claim that the EU referendum campaign and also the tactics used by the Republican Party in the USA Presidential campaign are signs of a new era: post-truth politics. I think it's worth unpicking this a little, since there may even be lessons for O.R. in what happened. Some weeks before the referendum vote, my daughter interviewed Labour Party members and activists in the North
East and was surprised to find how negative they were about the EU. The majority were determined to vote for Brexit and indeed they did so. When the actual vote came, only the city of Newcastle itself voted to stay in; the rest of the North East wanted out. Even in Newcastle, the majority to remain was small, under 2,000 votes. According to the Daily Mirror, since 2007 the North East has benefitted from EU development funds more than any other English region. So, north-easterners have greatly benefited from EU membership, but they voted to leave. What caused this puzzling paradox? Was the North East being overwhelmed by EU migrants? Are its people more easily taken in by lies and half-truths than the sophisticates elsewhere? There were similar voting patterns in other regions that have benefitted greatly from EU funding. So, north-easterners were not out on a limb in voting as they did, odd though it may seem to some. It couldn't have been a rational vote about actual migration either. According to Government migration statistics, of all the nations and regions of the UK, the North East has the lowest proportion of its population born abroad. I also doubt that north-easterners are more gullible than people who live elsewhere in the UK; the ones I know are pretty canny.
North-easterners have greatly benefited from EU membership, but they voted to leave. What caused this puzzling paradox?
SOME RESEARCH FROM LONG, LONG AGO
I struggle to find advantages for getting older, but one benefit, before my memory fails, is that I can recall some interesting research conducted about 50 years ago which changed the way I think about securing change. I've written about this in at least one book, but it bears repetition here. Henry Mintzberg, newly graduated from University, was employed in the Canadian railway industry. He worked in a department that conducted analysis to improve the operation of the railway. As he worked there he became more and more puzzled: soundly argued analytical work was ignored by senior managers. Why was this? Was it that the analysts were casting pearls before swine? Mintzberg returned to university to conduct research in which he followed senior managers in Canadian industry to try to understand what made them tick. Nowadays, we'd call this ethnography. His findings made him famous and led to a career in which he wrote much about planning and strategy. He wrote about his initial findings in a widely read book 'The nature of managerial work', published in 1973, but still worth a read today.

Of course, there have been many changes to the workplace since then, not least that senior management is no longer a wholly male affair, even though a glass ceiling still exists. Also, no one could dream of the digital communications we now take for granted. However, many of his findings still ring true. He reported that managers work long hours at an unrelenting pace, prefer to be active, engage in brief activities, are often in meetings and in groups; finally, they prefer personal and soft data.

One item of folklore is that 'The manager is a reflective, systematic planner'

In 1975, he wrote a seminal piece in the Harvard Business Review, extending the messages of his book. This article includes the following: 'If there is a single theme that runs through this article, it is that the pressures of the job drive the manager to take on too much work, encourage interruption, respond quickly to every stimulus, seek the tangible and avoid the abstract, make decisions in small increments, and do everything abruptly.' Does this still ring true 40 years on? In 1990, Harvard Business Review published a later piece about folklore and fact in management work. One such item of folklore is that 'The manager is a reflective, systematic planner'. Says Mintzberg, 'The evidence on this issue is overwhelming, but not a shred of it supports this statement.' This, of course, helps him understand why high quality analytical work was not implemented when he worked in the railways. They are intelligent people, well able to understand complex arguments, but they don't have time to do so. In some cases they have no inclination to do so.
BACK TO BREXIT

There may be many reasons why the majority voted for Brexit. For some it was a chance to rebel against what they saw as a self-perpetuating Westminster-based elite. For others it was a chance to control what they thought of as too high a level of immigration. Some genuinely believed that the EU brought little of value to UK life. Most of the popular press was also in favour of Brexit. During the referendum campaign, both sides produced statistics, often scary, sometimes very dodgy. I don't believe that these made any difference to the result. What I do believe made a difference was the Brexiteers' coherent narrative about control; that the UK would once again be free to plough its own furrow in the world. Just as Donald Trump promises to make America Great again, the Brexiteers painted a picture in which Britain, freed from the shackles of the EU, would be great again. Whether or not this narrative was realistic is almost irrelevant. It seems that people bought into it and rational arguments about economic and other statistics carried very little weight.

WHAT DOES ALL THIS HAVE TO DO WITH O.R.?

Many of us love getting our hands dirty using clever mathematics, sophisticated computing, or churning through huge data sets. However, though these are core parts of our toolset, I don't believe they are enough to make real, sustainable impact. Making organisational change is almost always painful. Choose your messy cliché: you have to break eggs to scramble them; there will be blood when sacred cows are slain. Why should people bother, especially if they may be changing jobs before too long?

Our analysis forms part of a programme of change involving many other people and groups, which creates a narrative that makes the inevitable pain worthwhile
I really do believe that making a real impact depends on developing a narrative that does more than just support our analysis. Narratives, stories if you will, form soft and personal data of the types preferred by Mintzberg's Canadian managers. They're easy to digest and don't require reading through detailed reports. They can even be given titles, slogans and straplines. If the changes that our analysis suggests fit within a coherent story, a coherent narrative for change, then they are much more likely to be adopted. Advertisers know this very well. I've written before that I am usually sceptical when I read claims that a particular O.R. study has created £X Million new business or saved £Y Million (choose X and Y to be any value you wish). The reality, albeit in my limited experience, is that our analysis forms part of a programme of change involving many other people and groups, which creates a narrative that makes the inevitable pain worthwhile. Perhaps we should send new recruits on story-telling courses, to learn how to spin a yarn that enables people to see the value of their excellent analytical work? I am serious, by the way.

Mike Pidd is Professor Emeritus of Management Science at Lancaster University.
SHIPPING FORECAST

NEIL ROBINSON

GLOBALISATION AS WE KNOW IT would be impossible without the cost-effective transport of goods overseas. The influence of O.R. is proving increasingly significant in helping this vital element of the worldwide trade nexus keep pace with the demands of the 21st century. For more than half a millennium the world economy has relied heavily on maritime transport. For a long time there was almost nothing else. Even today, according to the United Nations Conference on Trade and Development, around 80% of global trade by volume and more than 70% of global trade by value is carried by sea and handled by ports. Cost-effectiveness has been a near-constant consideration.

Notwithstanding our tendency to view it through a prism of romanticism, even the Age of Exploration was in large part a response to the near-insatiable exigencies of mercantilism: new goods, new trading partners and new trade routes. Generally speaking, the Plimsoll line and the bottom line have always run parallel to each other. Perhaps the earliest evidence of this enduring relationship can be found in the story of Zheng He, the Ming Dynasty mariner who led several huge expeditions to Arabia and East Africa a century before the adventures of his more renowned European counterparts defined the era. Zheng may well have been the first maritime trader driven out of business by the harsh lessons of profit and loss.
TROUBLED WATERS
Although the Europeans' modest flotillas would have looked puny next to his vast fleets of warships and trading vessels, Zheng's voyages were notably short-lived. China's legacy of exploration died with them. The likeliest explanation for their curtailment is that they were tremendously expensive and resulted in scant reward. Whereas Europeans sought silks and spices that could fetch several times their weight in gold, there was little of worth for Chinese traders: the West in particular had nothing to offer. Fast-forward 600 years or so and we find the complications of present-day maritime transport extend far beyond wondering whether an epic journey will reap sufficient riches to justify the effort of setting sail. It is a complex and competitive sector – one in which fortunes can still be won or lost, just as in Zheng's day, but in which guesswork and unpleasant surprises are increasingly giving way to sophistication and optimisation.
The O.R. community is playing a significant role in shaping this ongoing shift. With supply chains both ever more integrated and ever more vulnerable to disruption, the need to enhance performance and guarantee seamless operations is arguably unprecedented. Professor Bert de Reyck's work with the market-leading Noble Group is a classic example of how O.R. is helping maritime transport stay afloat – literally and figuratively – amid the myriad pressures of 21st-century commerce.

Noble desperately needed a procedure that would enable it to strike an economically satisfactory balance between hiring barges on a leased or spot basis and avoiding the financial penalties arising from late deliveries
The Noble Group specialises in transporting industrial and energy products. Its mission statement is to be "the best company in the world at moving a physical commodity from producer to consumer and managing the market, credit and operational risk associated with that". The direction of travel is usually from a country with low production costs to a country with high demand. This is certainly the case with the group's operations in Borneo, Indonesia, where maritime logistics are vital to one of Noble's biggest and potentially most profitable activities: the transportation of coal. Indonesia is the world's top coal exporter, serving as a supplier to high-growth and developed nations such as China, India, Japan and Korea. Much of its output departs from Borneo's two major trading ports, Taboneo and Muara Kaman, on enormous ocean-going vessels capable of carrying up to 120,000 tonnes. In part due to inaccessibility and in part due to cost and environmental concerns, the journey from mine to port is almost invariably made by barge. It might sound leisurely, even serene, but the reality can be chaotic. Up to 16 barges may be required to load a single ship. Noble owns its own fleet of barges, has long-term contracts for the lease of others and must occasionally hire more on a spot basis. Supplier schedules can turn out to be imprecise, if not downright fanciful. Busy jetties and coal shortages alike tend to translate into delays. When things go wrong – when extra barges suddenly need to be hired or timetables are not adhered to – it costs Noble money.

In early 2012, when Professor de Reyck and one of his PhD students, Ionnis Fragkos, were first asked to help unravel these problems, the group was accustomed to losing tens of millions of dollars every year in what are known as demurrage and detention penalties – the supplementary charges incurred by hiring additional barges and failing to meet schedules. "The fact is that making barge-hiring and scheduling decisions is a multifaceted affair," says Professor de Reyck, director of the UCL School of Management at University College London. "At the time Noble used a manual scheduling procedure that logistics managers had to perform many times a day. It was cumbersome and shortsighted and didn't take into account the sheer complexity of the various interrelated and competing factors involved." One substantial limitation of the established approach was its lack of flexibility. A common response when unexpected events threatened to upset schedules was to resort to rough calculations and rules of thumb. Some strategies might have appeared reassuringly intuitive, but often they were not just myopic but fundamentally incorrect. "It wasn't unusual for managers to view the existing guidance as confusing and contradictory," says Professor de Reyck. "Trying to make calculations in such circumstances, especially in the absence of a proper decision-support system, demands a lot of mental effort and can result in wild inaccuracies. In many instances, somewhat inevitably, the ultimate outcome was inefficiency."

What Noble desperately needed was a procedure that would finally enable it to strike an economically satisfactory balance between hiring barges on a leased or spot basis and avoiding the financial penalties arising from late deliveries. Professor de Reyck reasoned that to achieve this elusive goal – and to realise the savings it would bring – it would be essential to equip managers with the ability to assess with confidence three key considerations:
• How many owned, leased and spot barges to allocate to each vessel
• When to dispatch each barge
• Whether to hire a floating crane to speed up barges' unloading.
The solution, known as the Barge Rotation System, would be a long way removed from the damaging rigidity of its predecessor.

PLAIN SAILING

The Barge Rotation System was devised to integrate large quantities of information in a way that would at last assist logistics managers rather than confound them. "What really hampered the old approach was a combination of hard-to-find or incorrect data and the need for repeated and time-consuming updates," says Professor de Reyck. "Our objective was to give Noble something altogether less unwieldy." The first step was to arrange all the relevant data in a single spreadsheet model, separating factors on the basis of update frequency. For example, supplier locations, which change comparatively rarely, and available jetty slots, which change regularly, were allocated different sheets. This improved ergonomics and allowed users to complete data entry in a matter of minutes – as opposed to the many hours they previously had to devote to the job. "Inputting and scheduling could take up to half a day with the old system, and there was also no means of cross-validating data consistency – a potentially sizeable waste of time and energy," says Dr Fragkos, now an assistant professor of technology and operations management at Erasmus
University’s Rotterdam School of Management. “We resolved these issues with an Excel-based graphical user interface that encouraged much quicker data entry and validated inputs with pop-up notification messages. This also satisfied senior management’s request for a spreadsheet environment that could run without specialist software and be easily circulated via internal email.” Of course, although imperative, ensuring user-friendliness was only one element of the remit. A system that is inherently straightforward to work with counts for little if its decision-making framework and underpinning algorithms do not address the task at hand. “Ideally, our algorithms would incorporate uncertainties that could affect schedules,” says Dr Fragkos. “These might include vessel arrival dates, loading times and supplier availability. However, data about these uncertainties weren’t readily available, and managers were uncomfortable about assigning probabilities to uncertain events. Bearing all of this in mind, we opted to build a reactive deterministic model to incorporate unforeseen changes and new information.” Crucially, the model was designed to distil the overall problem into two distinct sub-problems. The first algorithm, governing voyage allocation, was formulated to ensure that the quantity of barges sourced from each supplier and the number of voyages allocated to each vessel would not exceed the maximum number of available barges. “The aim at this stage is simply to minimise transportation costs,” says Professor de Reyck. “An important point is that using leased or spot barges might sometimes be cheaper than using
owned barges, because the former are often bigger and can combine shipments that would otherwise require several owned barges.” Professor de Reyck describes the second algorithm, which deals with voyage scheduling, as the Barge Rotation System’s “backbone”. “It creates a feasible schedule while adhering to the allocation decisions made by the first algorithm,” he explains. “Vessels are scheduled in order of non-increasing demurrage penalties, taking into account restrictions that vessels already scheduled have imposed on the availability of barges, floating cranes and jetties. Once a feasible schedule has been determined, a further algorithm then decides whether substituting owned barges with leased or spot barges would result in a lower total cost.”
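The production system is proprietary and considerably richer, but the flavour of the scheduling step can be conveyed with a toy sketch: vessels are taken in order of non-increasing demurrage penalty and served greedily, with owned barges used first and leased or spot barges hired to cover any shortfall. The sketch below collapses the two algorithms into one simple rule, and all vessel names, costs and quantities are invented for illustration.

```python
# Toy sketch of the voyage-scheduling idea described above.  Vessels are
# handled in order of non-increasing demurrage penalty, and each is served
# with owned barges first, topping up with leased/spot barges when the owned
# fleet would run out.  All names and numbers are invented; the real Barge
# Rotation System also handles jetties, floating cranes, time windows and
# substitution checks.
from dataclasses import dataclass

@dataclass
class Vessel:
    name: str
    barges_needed: int        # barge-loads required to fill the vessel
    demurrage_per_day: float  # penalty rate if the vessel is kept waiting

def schedule(vessels, owned_available, leased_cost, spot_cost, owned_cost):
    """Greedy allocation: serve the most expensive-to-delay vessels first."""
    plan, total_cost = [], 0.0
    owned_left = owned_available
    for v in sorted(vessels, key=lambda v: v.demurrage_per_day, reverse=True):
        use_owned = min(v.barges_needed, owned_left)
        owned_left -= use_owned
        extra = v.barges_needed - use_owned        # hired on a leased/spot basis
        cost = use_owned * owned_cost + extra * min(leased_cost, spot_cost)
        plan.append((v.name, use_owned, extra, cost))
        total_cost += cost
    return plan, total_cost

vessels = [
    Vessel("MV Kalimantan", barges_needed=12, demurrage_per_day=25_000),
    Vessel("MV Taboneo",    barges_needed=16, demurrage_per_day=40_000),
    Vessel("MV Muara",      barges_needed=8,  demurrage_per_day=15_000),
]
plan, cost = schedule(vessels, owned_available=20,
                      leased_cost=9_000, spot_cost=11_000, owned_cost=6_000)
for name, owned, hired, c in plan:
    print(f"{name}: {owned} owned + {hired} hired barges, cost ${c:,.0f}")
print(f"Total barge cost ${cost:,.0f}")
```

In the real system, the priority rule is combined with the allocation model and the substitution check described above, so that a cheaper mix of owned, leased and spot barges can be chosen once a feasible rotation exists.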
Within a year and a half the application of state-of-the-art O.R. was saving Noble approximately $1.3m a month
THE TIDE IS TURNING
The transformation that has taken place since the Barge Rotation System’s introduction certainly reinforces Professor de Reyck’s optimism. Within a year and a half the application of state-of-the-art O.R. to the company’s operations at Taboneo and Muara Kaman was saving Noble approximately $1.3m a month. As well as augmenting efficiency and reducing costs, the system has yielded some surprising qualitative findings – including the revelation that batch hiring can represent an optimal course of action during busy periods.
“The implemented system has improved the financial performance of my division” confirms Noble executive director Tim Gazzard, global head of the group’s iron ore and special ores business. “It has also helped planners to improve their understanding of operations and other departments to appreciate the complexity of logistics and understand better how their decisions influence the logistics operations.” Professor de Reyck now hopes Noble’s experience will serve as a stimulus for further collaboration between the maritime industry and the O.R. community. Given globalisation’s unrelenting reliance on the shipping industry, it seems almost extraordinary that many practices are still so outdated and profligate. “Many large maritime businesses continue to make complex operational decisions manually by using intuition and limited data,” says Professor de Reyck. “This needn’t be the case. Although some degree of customisation is required, advances in optimisation mean we can now address a much broader class of difficulties in this sector and provide a range of tangible benefits.” All the emerging evidence suggests he is right. They might not have been able to save Zheng He’s flotilla from the ancient Chinese equivalent of the breakers’ yard, but O.R.’s insights are of undeniable consequence and value to today’s ocean-going traders. As the case of the Noble Group demonstrates, a long-overdue sea-change is finally under way. Neil Robinson is the managing editor of Bulletin Academic, a communications consultancy that specialises in helping academic research have the greatest economic, cultural or social impact.
USING OPERATIONAL RESEARCH TO GUIDE INNOVATION AND SAVE LIVES

STEPHEN STAFFORD
SHOUT-IT-NOW (S-N) is a South African not-for-profit organization that likes to break moulds. Specializing in mobile community-based HIV counselling and testing (CBHCT) and linkage to clinical care services, S-N has contributed significantly to the South African health ministry’s goal of stemming the rate of new HIV infections among youth and other key populations at high risk of acquiring HIV. S-N’s team of founders are a mix of technology, marketing and public
health professionals who began the organization with a dream of designing a new approach to HIV testing. After careful analysis of conventional CBHCT services, they designed a radically different system that could feasibly be scaled up to test all South Africans. The result is a high volume, high quality service that is technology-rich, extremely engaging to patients, and cost effective for funders. Since 2007, S-N has provided its innovative services to more than 650,000 clients in a variety of
community settings including schools, prisons and a range of community venues throughout Gauteng, Limpopo, Northwest and Western Cape provinces. S-N’s highly scalable, modular HCT service is delivered by 15-person teams that bring a new mix of technology and human support. Teams go out into communities most affected by HIV and set up mobile computer labs, register and track each client using biometrics (fingerprint), educate people about HIV/STIs/TB with on-line, MTV
style videos featuring an interactive risk profile system, and provide high quality HIV counselling and testing. A toll-free Call Centre operates out of S-N's headquarters in Cape Town to facilitate all referrals of clients identified as HIV positive or needing other care or support services. The Call Centre allows for improved client care through targeted messaging and continuous follow-up, data centralization and quality assurance through audio recordings and audits.
DEDICATED TO INNOVATION
One founding principle of Shout-it-Now is its commitment to innovation. The organization constantly reevaluates each step of its operations to create the most efficient and effective HIV testing service for the benefit of its clients and its funders. Much of this monitoring and evaluation—especially as it relates to operations—was for years extremely time and labour intensive. Then, in 2014, Shout-it-Now's founder, Bruce Forgrieve, learned how other public health colleagues at the University of Cape Town (UCT) were using SIMUL8's simulation software. The scenarios they were able to experiment with saved time and money and greatly enriched the outcomes of their operations decisions.
Since 2007, S-N has provided its innovative services to more than 650,000 clients
Impressed with the results achieved by colleagues at UCT, Forgrieve contacted SIMUL8 and explained how expansion opportunities were challenging the organization to further improve efficiencies. When SIMUL8 founder, Mark Elder, learned about Shout-it-Now and how the organization could use its software to improve its innovative HIV testing service in South Africa, he donated the software to the organization. Since then, SIMUL8 has been instrumental in eliminating bottlenecks and identifying task-shifting opportunities among S-N's employees, which has resulted in many more South Africans being HIV tested by the Shout-it-Now teams each day. Dr. Kathryn Pahl, Managing Director: "Our operations quality improvement
initiative started out using spreadsheets and it was a nightmare, so to have SIMUL8 as a way to model what we were doing was a God-send.”
HOW O.R. HAS MADE SHOUT-IT-NOW MORE EFFECTIVE
To meet the goal of testing large numbers of people with a high quality service delivered in a cost-effective manner, Shout-It-Now needed a solution to model their processes and identify bottlenecks, enabling them to create a highly efficient service. In the early days of the project, Excel was the only operations tool S-N had but it proved to be extremely difficult, time-consuming and ineffective for communicating plans and getting buy-in from staff. Prior to using SIMUL8, S-N had a system that collected a range of data including the arrival times of each patient and the appointment duration. While this system allowed running reports and calculating averages, it did not help improve processes. Realising that what was needed was a way to use this data to drive decisions and improvements led to the use of O.R. Through O.R., the first bottleneck identified was the shortage of personal computers for the volume of patients seeking HIV education and testing services. In addition to video-based education about HIV, computers are used to deliver an interactive risk assessment, as S-N found that people are more inclined to give honest answers to questions about their sex life via a computer, rather than in person. Therefore, it was critical to have enough computers to serve large groups of clients who accessed services at the same time. The shortage of computers slowed down the counselling stage and therefore impacted the flow of
each team’s entire day. O.R. modeling also found that having additional computers prior to patients being seen by a counsellor could speed up the entire process, as computers could be used to ensure each client’s personal data were captured beforehand, freeing the counsellor’s time to address the patient’s risk behaviours identified by their risk profile.
The first bottleneck identified was the shortage of personal computers for the volume of patients seeking HIV education and testing services
A bottleneck was also identified at the registration desks where there was limited staff to cope with service demand at peak times. To solve this, a second registration desk was opened after the site had been open for two hours to prevent lines from forming at the desks. Use of SIMUL8 enabled operations and management teams to fine-tune S-N’s CBHCT process: to identify exactly how it had to be configured, and how it had to evolve throughout the day to meet performance objectives. A culmination of small process changes created a big impact. A more robust, reliable and efficient testing process has enabled the organization to exceed the targets set by its funders. Shout-It-Now’s operations team is continuing to work on exciting new developments to increase the number of patients being tested and treated for HIV. Together with SIMUL8 consultants, they are using SIMUL8 to refine a new SMART (Safe, Mobile, Accessible, Rapid Testing) process that is more than doubling the number of patients served without adding resources.
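Shout-it-Now’s modelling was done in SIMUL8, which is a visual tool rather than code. Purely as an illustration of the same idea, the sketch below uses the open-source SimPy library to mimic a clinic day in which a second registration desk opens after two hours. Every parameter (arrival rate, service times, numbers of desks and computers) is invented for the example rather than taken from S-N’s data.

```python
import random
import simpy

OPEN_MINUTES = 8 * 60        # illustrative 8-hour site day
SECOND_DESK_OPENS = 120      # second registration desk opens after two hours

def client(env, registration, computers, times_in_system):
    """One client: register, then watch the video and complete the risk profile on a PC."""
    arrived = env.now
    with registration.request() as req:
        yield req
        yield env.timeout(random.expovariate(1 / 3.0))    # ~3 min at the desk (invented)
    with computers.request() as req:
        yield req
        yield env.timeout(random.expovariate(1 / 12.0))   # ~12 min video + risk profile (invented)
    times_in_system.append(env.now - arrived)

def arrivals(env, registration, computers, times_in_system):
    while True:
        yield env.timeout(random.expovariate(1 / 2.0))    # a client roughly every 2 minutes
        env.process(client(env, registration, computers, times_in_system))

def hold_second_desk(env, registration):
    """Keep the second desk out of service for the first two hours by occupying it."""
    with registration.request() as req:
        yield req
        yield env.timeout(SECOND_DESK_OPENS)

random.seed(42)
env = simpy.Environment()
registration = simpy.Resource(env, capacity=2)   # two desks, one held back initially
computers = simpy.Resource(env, capacity=10)     # illustrative number of PCs
times = []
env.process(hold_second_desk(env, registration))
env.process(arrivals(env, registration, computers, times))
env.run(until=OPEN_MINUTES)
print(f"clients served: {len(times)}, mean time in system: {sum(times) / len(times):.1f} min")
```

Rerunning such a model with and without the second desk, or with different numbers of computers, is the kind of scenario testing the operations team describes above.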
This is a significant milestone and highlights the importance simulation brings to process improvement. Shout-it-Now’s achievements were recognized in July 2014 by the World Health Organization (WHO) for best practices in HIV counselling, testing and care for adolescents when it published “Adolescent HIV Testing Counseling and Care: Implementation Guidance for Health Providers and Planners”, an interactive, web-based tool kit that highlights best practices around the world. The tool cites Shout-it-Now twice, both for its innovative approach to community-based HIV Counseling and Testing, as well as for its Call Centre approach to linkage to care.

Bruce Forgrieve, Chairman and Founder, Shout-it-Now: “I’m now a huge proponent of O.R. The tools we have learned have not only made our service more efficient but we are ultimately saving more lives and that is an invaluable innovation.”

Stephen Stafford is the Partnerships Manager at Shout-it-Now. He can be contacted at stephen.stafford@shoutitnow.org. A version of this article appeared in the June 2016 edition of the IFORS Newsletter, and has been used with kind permission. Shout-it-Now is supported by the U.S. President’s Emergency Plan for AIDS Relief (PEPFAR).
IMPROVING PHYSICAL ASSET MANAGEMENT
ELIZABETH THOMPSON
IN 2005, TIM JEFFERIS, Supportability Specialist at the UK’s Ministry of Defence Science and Technology Laboratory, spoke to an international conference of maintenance experts in Toronto. “The reality of changing maintenance practice,” he said, is an opportunity to “implement new techniques and associated maintenance policy” to reduce costs and increase the deployability of assets. As Jefferis quite rightly pointed out, physical asset management is a hightech field with software able to analyze condition monitoring data and suggest
to companies when to intervene with a maintenance action such as repair or replacement. In his presentation, Jefferis focused on his ongoing collaborations with Andrew K.S. Jardine, then Director of the Centre for Maintenance Optimization and Reliability Engineering (C-MORE) at the University of Toronto, a research centre dedicated to the study of engineering asset management, funded by a consortium of companies. A few years earlier, Jefferis had started to wonder if it was possible to further improve how
the Ministry of Defence was handling its maintenance. Could it be more cost effective, for example? A colleague in the Canadian Navy mentioned Jardine’s work to him at a NATO meeting, and Jefferis went to Toronto in October 2001 to discuss possible research. By the time MOD UK joined C-MORE as one of the consortium’s collaborating research partners in 2003, Jefferis and Jardine had formulated a specific project involving the use of software created by Jardine’s lab. Simply stated, the initial project involved a particular type of diesel engine installed in British Army military vehicles. There were approximately 800 of these engines in service at various locations worldwide. Reliability was obviously a key concern, as catastrophic failure in the field incurred economic costs and could endanger personnel. When Jefferis approached Jardine, the policy was to repair/replace these engines upon failure, but he wanted to know if failures could be predicted, and therefore prevented, specifically by analyzing the engine oil. Although MOD UK had been collecting oil analysis results from these diesel engines, as well as details of failures recorded, maintenance was largely performed as needed. Oil was changed only when engines had major repairs or when the oil was contaminated or degraded. At Jefferis’ request, C-MORE researchers began to examine the existing oil analysis and failure data. If an appropriate risk of failure model could be created, the potential cost savings of implementing a condition-based maintenance (CBM) policy for the diesel engines could also be calculated using the EXAKT software developed in Jardine’s lab specifically to optimize CBM decisions (see Figure 1). The project was a success: C-MORE researchers first found the levels of oil viscosity and molybdenum predicted engine failure (see Figure 2); their subsequent cost analysis suggested a cost savings of approximately 30% by using the EXAKT optimal policy instead of replacement only on failure.

The project was a success: the levels of oil viscosity and molybdenum predicted engine failure; cost analysis suggested a cost savings of approximately 30%

Looking back, Andrew Jardine says, “This was a classic collaboration – and it was the foundation for many of our other EXAKT collaborations with mining companies, utilities, pulp and paper etc.” For over 30 years, Professor Jardine has been working on optimizing condition-based maintenance (CBM) decisions by using software able to determine both the risk of failure and the economic consequences of failure. His software, he says, is simply a “tool to interrogate information already existing in company databases.” That may be true, but a considerable amount of work has gone into the creation of this tool. In the 1980s, when Jardine was Professor of Engineering Management at the Royal Military College of Canada, a bio-statistics professor mentioned the use of the proportional hazards model (PHM) in the medical field to predict the survival of patients. Jardine quickly realized this principle could be applied to predict the probability of failure in the management of physical assets. The first step was to develop the theory. Accordingly, Jardine and several RMC colleagues worked with the Department of National Defence (DND) using the level of metal particles in the oil of aircraft engines to develop a proportional hazards model to predict the risk of engine failure. To this point, DND had been monitoring the rate of change of metal deposit in oil samples to establish if an engine should be changed-out before its design life. However, the prematurely removed engines were often fine, so DND wanted a better method to estimate engine health and risk of failure. The PHM approach worked, and the preliminary study was followed by work on diesel engines on ships.
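EXAKT itself is proprietary, but the underlying idea, fitting a proportional hazards model to working age and condition-monitoring covariates and then scoring in-service units, can be illustrated with the open-source lifelines package. The column names and the tiny dataset below are invented; they merely echo the viscosity and molybdenum covariates mentioned above.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Invented engine histories: working age at removal (hours), whether the removal
# was a failure (1) or a suspension (0), and two oil-analysis covariates.
histories = pd.DataFrame({
    "age_hours":  [1200, 1500, 800, 2000, 950, 1700, 1100, 1850, 1400, 1650, 900, 2100],
    "failed":     [1,    0,    1,   0,    1,   0,    1,    0,    1,    0,    1,   0],
    "viscosity":  [14.2, 11.8, 13.1, 12.5, 12.8, 11.2, 13.9, 12.9, 13.4, 10.5, 14.6, 12.2],
    "molybdenum": [32,   12,   21,   25,   35,   11,   18,   15,   27,   9,    30,   14],
})

# A small penalizer keeps the fit stable on a tiny illustrative dataset.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(histories, duration_col="age_hours", event_col="failed")
cph.print_summary()   # which covariates drive the hazard, and by how much

# Relative risk for an engine still in service, given its latest oil sample.
in_service = pd.DataFrame({"viscosity": [14.0], "molybdenum": [28]})
print(cph.predict_partial_hazard(in_service))
```

In practice the fitted hazard model is combined with replacement costs to produce an optimal intervention decision, which is the part EXAKT adds on top of the statistical model.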
FIGURE 1: FLOWCHART SHOWING HOW EXAKT WORKS (Source: EXAKT)
FIGURE 2: SCREENSHOT OF OIL ANALYSIS (Source: EXAKT)
The result was the creation of EXAKT software, the software later used to good effect by Tim Jefferis at MOD UK. “EXAKT and our membership in the Consortium has already paid for itself many times over ... by showing us the type and form of information that we need to collect and pay attention to,” said Commander Jonathan L. Paterson (Rtd.), Canadian Department of National Defence. Jardine’s software is now used around the world; the following cases are illustrative of its enormous success.
HONG KONG MASS TRANSIT

The Hong Kong Mass Transit Railway (MTR) Corporation had excessive traction motor ball-bearing failures in its trains. The current inspection interval was 3.5 years. When MTR carried out a CBM program based on the EXAKT software to monitor bearing grease colour, researchers discovered the optimal CBM inspection interval dropped to one year. The new policy reduced the expected number of failures per year from nine to one (in the year following analysis, the actual number of failures was reduced to two; in subsequent years, it was sometimes zero) with an associated total costs reduction of 55%. Importantly, given the nature of the industry, this substantial economic benefit included quantification of the value of fewer passenger disruptions.

The new policy reduced the expected number of failures per year from nine to one with an associated total costs reduction of 55%

IRVING PULP AND PAPER

Irving Pulp and Paper, located in Saint John, New Brunswick, implemented a CBM program for Goulds 3175L pumps. When vibration measurements on pump bearings were collected by accelerometers, EXAKT software found only two vibration measurements out of 56 were significantly related to the probability of pump bearing failure. By applying the optimal CBM policy, C-MORE estimated a saving of 33% to the company. While the exact cost benefit of this improvement was not declared by the company, bearing replacement after failure was, on average, 3.2 times more expensive than a preventive replacement. Irving Pulp and Paper installed EXAKT for everyday use and developed a tool to link the relevant databases. They also took steps to make small modifications to the pumps to improve reliability. Over a period of almost two years, there were no pump failures, as compared to 15 failures in five years for 12 pumps. Peter Sundin, then Reliability Engineer with Irving, sent a quick thank-you note to C-MORE: “Thanks for running the numbers on (pump) 46-3001. Another ‘success,’ in that it confirmed the vibe guys’ call to replace. I think this is four operational calls based on EXAKT.”
CARDINAL RIVER COALS
Cardinal River Coal mines (CRC) is a major producer of coal located in northern Alberta. Oil analysis data and work-order records were gathered for 50 wheel motors on 25 haul trucks. A C-MORE study found two out of 12 possible measurements, iron and sediments, were significantly related to the wheel motor failure. The savings accrued by applying the optimal CBM policy determined by EXAKT were estimated at 22%, with the average times between replacements now 8% shorter than the original practice. In this case, the cost of a failure replacement was, on average, three times more expensive than a preventive replacement.

TABLE 1: SUMMARY OF COST ANALYSIS (Adapted from P.J. Vlok et al.; all costs in Rand/day)

| POLICY | PREVENTIVE REPLACEMENT COST | FAILURE REPLACEMENT COST | TOTAL COST PER DAY | MEAN TIME BETWEEN REPLACEMENTS |
| Existing SASOL policy | 63.21 (18.3%) | 281.95 (81.7%) | 345.16 | 214.60 days |
| EXAKT policy applied | 100.56 (47.0%) | 113.47 (53.0%) | 214.03 | 263.60 days |
| Optimal age policy | 107.99 (42.3%) | 147.59 (57.7%) | 255.58 | 191.22 days |
| Age policy applied | 250.95 (65.0%) | 135.38 (35.0%) | 386.33 | 185.76 days |
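The policies compared in Table 1 trade cheaper preventive replacements against more expensive failure replacements. The classic age-based replacement model behind such comparisons (described in textbooks such as Jardine and Tsang, see Further Reading) chooses the replacement age that minimises expected cost per day: expected cycle cost divided by expected cycle length. The sketch below uses invented Weibull parameters and costs, not the SASOL or Cardinal River data.

```python
import numpy as np

# Invented parameters: Weibull time-to-failure and replacement costs
# (failure replacement ~3x the cost of a preventive one, as in the mining case).
BETA, ETA = 2.5, 300.0                      # Weibull shape and characteristic life (days)
C_PREVENTIVE, C_FAILURE = 10_000.0, 30_000.0

def reliability(t):
    """Probability the item survives beyond age t."""
    return np.exp(-(t / ETA) ** BETA)

def cost_per_day(T, n=4000):
    """Expected cost per day when replacing preventively at age T (or on failure, if earlier).

    cost(T) = [Cp * R(T) + Cf * (1 - R(T))] / integral_0^T R(t) dt
    """
    t = np.linspace(0.0, T, n)
    r = reliability(t)
    expected_cycle_length = np.sum((r[1:] + r[:-1]) / 2.0 * np.diff(t))   # trapezoid rule
    expected_cycle_cost = C_PREVENTIVE * reliability(T) + C_FAILURE * (1.0 - reliability(T))
    return expected_cycle_cost / expected_cycle_length

candidate_ages = np.arange(50, 1001, 10)
costs = np.array([cost_per_day(T) for T in candidate_ages])
best = candidate_ages[costs.argmin()]
print(f"run-to-failure (very large T): {cost_per_day(5000):.2f} per day")
print(f"best preventive age: {best} days at {costs.min():.2f} per day")
```

Condition-based policies such as EXAKT’s improve on this further by letting the covariates, rather than age alone, trigger the preventive replacement.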
TWISTRAAI PLANT, SASOL, SECUNDA, SOUTH AFRICA

SASOL is a large petrochemical company producing most of South Africa’s fuels and oils. C-MORE conducted a study at one of the company’s coal wash plants about 200km east of Johannesburg. This plant was using a total of eight axial-in, radial-out Warman pumps driven by 220kW motors to circulate a water and magnetite solution. They were important in the production process, with serious production losses if a pump were out of circulation. Each pump had two SKF bearings; although their vibration levels were monitored, there had been numerous failures. C-MORE proposed using EXAKT to model the vibration covariates and calculate an optimal decision policy. The resulting model of the cost analysis appears in Table 1. As there is a close relationship between the optimal EXAKT policy (theoretical costs) and the applied EXAKT policy (real costs), the optimal policy is clearly a realistic one. As a failure analysis tool, then, EXAKT helped these particular maintenance engineers make better replacement decisions.

CAMPBELL SOUP COMPANY

Campbell Soup Company is a food processing company using condition based monitoring techniques such as oil analysis and vibration analysis to monitor equipment health. The company’s existing condition based maintenance policy was established according to the warning and alarm limits set for each type of equipment by the manufacturer. The experience of maintenance engineers and technicians was also a major factor in its existing policy. For this particular use of EXAKT, vibration data were collected on shear pump bearings, with 21 vibration measurements provided by accelerometers. EXAKT found three of the 21 measurements significantly related to the probability of pump bearing failure. The savings by applying the optimal CBM policy were estimated as 55%. In this case, the cost of a failure replacement was, on average, 9.5 times more expensive than a preventive replacement. Note: 9.5 is quite high and results from the high cost associated with production disruption due to bearing failure.

The savings by applying the optimal CBM policy were estimated as 55%

The researchers’ conclusion was that Campbell’s current practice of vibration monitoring was a better option than using a simple age-based replacement policy. However, with the use of EXAKT, they were able to take a good policy and make it optimal. If Campbell’s had not had a CBM policy in place, researchers commented, the savings would have been even greater - up to 70%.

ÉLECTRICITÉ DE FRANCE (EDF)

In 2004, EDF, then a collaborating member of the C-MORE consortium, asked researchers to study pads in a type of rotary turbine used in its
electricity generation. The turbine, approximately 100m long, is supported by bearings. The maintenance practice involved a planned shutdown every 18 months. The turbine had to be stopped in a controlled manner so as not to damage the bearings, and the only way to confirm the bearings had not been damaged was to undertake costly and time-consuming inspections. The company wanted C-MORE’s help to find a way to avoid these inspections using the condition monitoring data they had collected during past shutdowns, which included oil pressures measured at four locations and the distance (“height”) from the rotor to bearing. Of the 37 turbine shutdown histories, 15 had resulted in bearing failure. When C-MORE researcher Neil Montgomery applied the EXAKT
model to the data, he found the data could have been used to identify which bearings should have been inspected and which ones remained in good health. “In this case study the whole idea of failure was turned on its head,” stated Montgomery in a presentation at the Surveillance 5 Conference in 2004. “It wasn’t a matter of preventing the bearing from being damaged – it was a matter of avoiding unnecessary work. But the EXAKT model is general enough to handle even these novel situations.” Before applying the EXAKT optimal policy, the company had 15 failures, with an average cost/day of €95.36. With the EXAKT optimal policy, they were likely to have only three failures, with an average cost/day of €75.63, representing an overall savings of 20%.

These calculations can be applied to any industry where asset replacement cost or equipment failures represent a significant part of the operation

FOR FURTHER READING
Jardine, A.K.S. and A.H.C. Tsang (2013). Inspection Decisions. In Maintenance, Replacement, and Reliability: Theory and Applications, 2nd ed., 101-134. New York: CRC Press.
Vlok, P.J., J.L. Coetzee, D. Banjevic, A.K.S. Jardine and V. Makis (2002). An Application of Vibration Monitoring in Proportional Hazards Models for Optimal Component Replacement Decisions. Journal of the Operational Research Society 53(2): 193-202.
Wong, E.L., T. Jefferis and N. Montgomery (2010). Proportional Hazards Modeling of Engine Failures in Military Vehicles. Journal of Quality in Maintenance Engineering 16(2): 144-155.
CONCLUSION
Tim Jefferis said, in a 2006 presentation at the International Maintenance Excellence Conference, that data analysis using EXAKT is “better than an expert.” The experiences of MOD UK and the other companies cited here show the ability of the C-MORE group to use the PHM-based software EXAKT to predict equipment failure, estimate the remaining useful life of equipment, and define the best mix of preventive replacement and run to failure to minimize costs, optimize reliability, maximize availability and achieve the best possible risk/cost/ reliability balance.
Since its first use with DND Canada, the software has been constantly evolving. Suitable input data now include: equipment and component parameters, event data from work orders (data relating to events affecting equipment, such as failures, suspensions, frequency, working age), condition data (vibration, oil sampling, temperature etc.), failure modes, and preventive and failure replacement costs. Sample output data range from the optimum percentage balance of preventive replacement and run to failure maintenance, to the cost impact of current practices, the statistical validity of alternative models, the remaining useful life of an asset, and the expected time between replacements. These calculations can be applied to any industry where asset replacement cost or equipment failures represent a significant part of the operation.

Speaking of Irving Pulp & Paper’s experience with EXAKT software at the International Conference of Maintenance Societies in Melbourne, May 2007, Peter Sundin said: “The pulp mill industry is a highly competitive, capital intensive, low margin industry in which any cost saving is a significant benefit. The implementation of complex mathematical models can provide such a potential cost saving, as well as providing useful information for maintenance planning.” As the preceding examples suggest, Sundin’s comment could be applied to a wide range of industries.

Dr. Elizabeth Thompson is a writer and freelance academic editor based in Toronto, Ontario. She also works as Researcher/Administrative Assistant for Professor Emeritus Andrew K.S. Jardine, University of Toronto.
UNIVERSITIES MAKING AN IMPACT

EACH YEAR STUDENTS on MSc programmes in analytical subjects at several UK universities spend their last few months undertaking a project, often for an organisation. These projects can make a significant impact. This issue features a report of a project recently carried out at one of our universities: LSE. If you are interested in availing yourself of such an opportunity, please contact the Operational Research Society at email@theorsociety.com
ENABLING FINANCIAL LOAN DECISIONS TO BE MADE MORE QUICKLY (Shanmugam Palaniappan, MSc Decision Sciences, London School of Economics)
Shanmugam built a Multi Criteria Decision Analysis (MCDA) model for Term Finance (a Caribbean-based lender), with the aim of improving the turnaround time of SME loan applications from 4 days to 24 hours. The MCDA model evaluates the credit-worthiness of small and medium-sized businesses (SMEs) and is used in decisions about loans involving a substantial amount of Term Finance’s capital. Shanmugam delivered the model to the Directors of Term Finance at the end of August 2015, and by October 2015 the completely web-based lender had lent out over one million Trinidad & Tobago dollars (the equivalent of £100,000), depending solely on the scores produced by the model. Oliver Sabga, the CEO of Term Finance, described the MCDA methodology as “more innovative, flexible and fit-for-purpose than credit assessment techniques used by legacy financial institutions”. A year later Oliver explained that in 2016 Trinidad & Tobago entered a recession, leaving SMEs struggling to keep their footing in the market. Revenue for SMEs took a serious hit and the Directors of Term Finance
became concerned that their customers might not be able to meet their loan obligations. Oliver, who also studied Decision Sciences at LSE, gave the Directors his assurance that the MCDA model evaluated SMEs on both quantitative criteria such as ‘capacity to repay’ and qualitative criteria such as ‘reputation of the business proprietor’, which meant that where an SME might face financial constraints, the weight given to the proprietor’s reputation and other ‘softer’ attributes would mean that their customers (the SME borrowers) would do all they could to honour their commitments. In the worst case, Oliver explained, Term Finance would restructure their customers’ loans to bring their monthly commitments down and so become more manageable for the SMEs; “our goal is to keep our customers in business and work with them to overcome this economic downturn.” By April 2016, 90% of Term Finance’s SME customers had repaid their first loans in full and one in every two had taken a second loan. 10% of SME customers required restructuring
and are continuing to meet their (now smaller) monthly loan commitments. Since then, Term Finance has lent out an additional $2.5M TTD (£250K) to SMEs using the MCDA model. Oliver commented that “It’s really amazing to see that giving SMEs the benefit of depending on ‘softer’ criteria has proven to be fit-for-purpose and necessary, especially given the current economic environment. SMEs contribute to over 30% of our country’s GDP and it is not commercially sensible (from a Macro-economic perspective) to turn them away.” Term Finance continues to use the MCDA model to make credit decisions with confidence that the multi-criteria feature of the model will play a compensatory role; where one criterion suffers, the other prevails. Oliver explained that because the project was so successful for Term Finance, he had full support from his Directors in hiring another LSE Decision Science student in 2016. He has full confidence that the results of this second project will further help develop their Lending business.
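The precise structure of Term Finance’s model has not been published. The sketch below shows only the generic form of a weighted additive MCDA score, with invented criteria, weights and threshold, to make the compensatory behaviour described above concrete: a weak score on one criterion can be offset by strength elsewhere.

```python
# Minimal additive MCDA scoring sketch (invented criteria, weights and scales).
CRITERIA = {
    # criterion: (weight, higher_is_better); all criteria assumed pre-scaled to 0-100
    "capacity_to_repay":     (0.35, True),   # quantitative
    "years_trading":         (0.15, True),
    "existing_debt_ratio":   (0.20, False),  # lower is better
    "proprietor_reputation": (0.20, True),   # qualitative, scored by an analyst
    "quality_of_records":    (0.10, True),
}

def mcda_score(applicant):
    """Weighted additive value score in [0, 100]."""
    total = 0.0
    for name, (weight, higher_is_better) in CRITERIA.items():
        value = applicant[name]
        if not higher_is_better:
            value = 100.0 - value
        total += weight * value
    return total

applicant = {
    "capacity_to_repay": 45,       # weak on the 'hard' financials...
    "years_trading": 70,
    "existing_debt_ratio": 30,
    "proprietor_reputation": 90,   # ...but strong on the 'softer' criteria
    "quality_of_records": 80,
}
score = mcda_score(applicant)
print(f"score = {score:.1f}  ->  {'approve' if score >= 60 else 'refer for review'}")
```

With these invented numbers the applicant still clears an illustrative threshold, which is exactly the compensatory role Oliver describes the model playing during the downturn.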
THE THIRD WAY
NEIL ROBINSON
THE IMPROVED EFFICIENCY AND ENHANCED DECISION-MAKING that O.R. can bring are as vital in the third sector as they are anywhere else – if not more so. As the following case study illustrates, a pioneering OR Society initiative is making these advantages freely available to organisations whose resource constraints might otherwise preclude the use of such skills. Organisations in the third sector face decisions every bit as complex and crucial as those found in the public and private spheres. Some are under pressure to do more with less; some struggle to achieve objectivity in emotionally charged situations; some know they are
doing a good job but have no obvious means of proving as much. O.R. can play a substantial role in bringing clarity to these and other issues, but the resources required for inhouse expertise are frequently lacking. This is why the OR Society launched its Pro Bono O.R. initiative in 2013. The scheme provides UK-based third-sector organisations with free consultancy, helping them to reduce costs, improve efficiency and plan strategy. O.R. practitioners oversee projects on a voluntary basis, often in their own time, producing tangible gains in areas such as data analysis, options appraisal, scheduling, process enhancement and impact measurement.
“We can bring real benefits by looking at problems in novel ways and demonstrating approaches that the organisations we work with may not have thought of,” says Felicity McLeister, Pro Bono O.R.’s project manager. “It’s also good for our volunteers to meet new people, understand new organisations, learn about new issues and get a new set of perspectives. It’s a win-win.” Pro Bono O.R. has already assisted dozens of registered charities, associations and self-help and community groups.
Its successes have ranged from developing a sustainable set of priorities for the Disability Law Service to aiding the Dachshund Breed Council in predicting the probability of puppies being born with a form of epilepsy. In some instances the ultimate consequences of these interventions can be life-changing and maybe even life-saving. The Society’s work with Bloodwise, the UK’s specialist blood cancer charity, offers a compelling illustration.
PRO BONO O.R. The OR Society launched its Pro Bono O.R. initiative in September 2013. The Society recognises third-sector organisations can face an especially pressing need for efficiency and wants more of them to take advantage of the benefits O.R. can bring. Organisations may seek a range of operational improvements, including:
• Strategic planning/review
• Impact measurement
• Process improvement
• Data analysis
• Business planning
• Efficiency improvement
• Options appraisal
• Decision-making

Since launch, the scheme has completed 60 projects with organisations such as the RSPCA, Diabetes UK, Work for Us, The Care Forum and Parentskool. Twenty-eight projects are currently in full progress, with a further 15 in their initial stages. Each project has three overarching aims:

• To help third-sector organisations do a better job and build capacity by using the skills of volunteer O.R. analysts and consultants
• To promote awareness and understanding of the benefits of O.R. across the third sector and to wider audiences
• To give O.R. analysts an opportunity to practise in a wider arena and develop their knowledge and skills

For more information, contact Pro Bono O.R. project manager Felicity McLeister at felicity.mcleister@theorsociety.com. Alternatively, please visit www.theorsociety.com/probono.

The Pro Bono O.R. scheme provides UK-based third-sector organisations with free consultancy, helping them to reduce costs, improve efficiency and plan strategy

THE BLOODWISE STORY

Susan Eastwood was just six years old when she lost her life to leukaemia, a cancer that starts in blood-forming tissue – usually bone marrow – and leads to the over-production of abnormal white blood cells. Believing
there should be hope for others with the same condition, the family established what was then known as the Leukaemia Research Fund. That was in 1960. Since then the charity the Eastwoods founded has grown from a single branch into a national network. Now known as Bloodwise, the organisation has the support of researchers, health professionals, volunteers and, most importantly, patients and people affected by blood cancer. Like any charity, Bloodwise relies on donations. Currently it finances in excess of a thousand researchers and clinicians across more than 220 projects. Over the years it has raised more than half a billion pounds for studies into not just leukaemia but lymphoma, myeloma and other related disorders. To maintain this research commitment, Bloodwise needs to fundraise – largely from the public.
Even the very biggest charities know how hard it is to recruit committed donors. More broadly, every charity needs a deep and detailed understanding of the various audiences with which it seeks to engage. A solid basis for communicating with those audiences to best effect is imperative. It was here, Bloodwise’s staff reasoned, that Pro Bono O.R. could really aid the cause.
Our volunteers can make a huge difference in a small amount of time
The task fell to Mark Montanana, a senior modelling analyst with the Clydesdale and Yorkshire Banking Group. “I’ve got experience of working with marketing databases to extract key customer insights and support strategic decisions in
the financial sector,” says Mark. “That kind of specialism – customer profiling, cluster analysis and data mining – was very much in line with what Bloodwise needed from us.”
FROM LITTLE ACORNS...
As with any Pro Bono O.R. project, the process began in earnest with a meeting between volunteer and client. Mark quickly discovered a familiar scenario: an organisation with a lot of useful data but lacking the time and some of the expertise to get the most from it. “We often find a mass of data whose significance has gone unrecognised or untapped,” says Mark. “We have to explore that wealth of data, find out what it can tell us and work out how it might be used to bring genuine added value.” In this case there was an abundance of transactional data. This presented an opportunity to enhance Bloodwise’s knowledge of its supporters – their variety, their distinctive traits and, critically, their tendencies in terms of donations. Making sense of this information, Mark realised, would allow the charity to fundraise more effectively. Since the data contained a mix of categorical and continuous variables, it was first necessary to group all characteristics into a smaller set of components. This was achieved with a clustering algorithm. The principal advantage of such an approach was to shrink the dimensionality of the data, so making the set more manageable. Mark then applied a hierarchical clustering algorithm, revealing eight distinct segments among Bloodwise’s donor base. Each of these was defined by features including the following: • Acorn categories, as derived from the eponymous segmentation tool
that analyses social dynamics and population behaviour to divide the UK populace into demographic types
• Activities of choice – including, for example, cycling, running, rambling, retail and volunteering
• Attributes based around factors such as number and frequency of donations.

The identification of these clearly defined segments was itself of considerable benefit. In tandem with a decision tree coded into an Excel macro, it meant Bloodwise could determine the most relevant characteristics of each group and build specific engagement strategies. But there were further revelations lurking within the data. One of the most arresting was that the members of just two of the eight groups, although accounting for only 20% of the overall active base of supporters, were responsible for more than 80% of all donations. According to Owen Bowden, Bloodwise’s insight and analysis manager, these findings provided the charity with “valuable insight” in terms of optimising its promotional activities.
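Bloodwise’s data and the exact algorithms used are not public. The sketch below shows only the general shape of such a pipeline in scikit-learn: encode and scale the mixed categorical and continuous variables, reduce dimensionality, then cut a hierarchical (Ward) clustering into segments. The column names, values and number of clusters are invented (the real analysis found eight segments in a far larger donor base).

```python
import pandas as pd
from sklearn.cluster import AgglomerativeClustering
from sklearn.compose import ColumnTransformer
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Invented supporter-level data: a mix of categorical and continuous variables.
donors = pd.DataFrame({
    "acorn_category": ["A", "C", "B", "A", "E", "C", "B", "D", "A", "E"],
    "activity":       ["running", "retail", "cycling", "volunteering", "retail",
                       "running", "rambling", "retail", "cycling", "volunteering"],
    "donation_count": [12, 1, 5, 20, 2, 8, 3, 1, 15, 4],
    "donation_value": [240.0, 15.0, 75.0, 600.0, 20.0, 180.0, 40.0, 10.0, 420.0, 55.0],
})

# One-hot encode the categorical fields, standardise the continuous ones,
# and force a dense output so PCA can be applied afterwards.
encode = ColumnTransformer(
    [("cat", OneHotEncoder(handle_unknown="ignore"), ["acorn_category", "activity"]),
     ("num", StandardScaler(), ["donation_count", "donation_value"])],
    sparse_threshold=0.0,
)

# Reduce dimensionality, then cut a hierarchical (Ward) clustering into segments.
components = make_pipeline(encode, PCA(n_components=3)).fit_transform(donors)
donors["segment"] = AgglomerativeClustering(n_clusters=3, linkage="ward").fit_predict(components)

print(donors.groupby("segment")[["donation_count", "donation_value"]].mean())
```

Profiling each segment’s average donation behaviour, as in the final line, is the step that surfaced the two small segments responsible for most of Bloodwise’s donations.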
REFLECTIONS ON A WIN-WIN
Like other Pro Bono O.R. projects, Mark’s collaboration with Bloodwise was – to borrow Felicity McLeister’s phrase – “a win-win”. First and foremost, Bloodwise now has a strategic framework that it can use to shape and maximise its fundraising for years to come. “Pro Bono O.R. has given us access to skills we wouldn’t be able to have in-house or which would have taken us a very long time to acquire,” says Owen. “I look at fundraising and marketing – how we raise the money that supports our cause – but there are lots of different aspects within our organisation that can
A QUESTION OF PRIORITIES The Pro Bono O.R. initiative has also helped Bloodwise identify patient priorities. This was achieved by analysing a survey in which more than a thousand respondents answered questions about their experiences during diagnosis, treatment, recovery and other stages of the patient journey. Statistical analysis of the survey responses first identified the key drivers of patient satisfaction. Regression trees were then used to construct a hierarchy of importance and to establish the areas of support considered absolutely essential. This made it possible to determine the best means of delivering the most valued support at every stage of the patient journey – for example, through members of the medical profession, through written communication, through family and friends or through a combination of any of these. The use of data visualisation to represent and understand variations in patient satisfaction scores was crucial to conveying the benefits of this process to Bloodwise.
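As a hedged illustration of the approach described in the panel, the sketch below fits a regression tree to an invented satisfaction survey and reads off feature importances as a simple hierarchy of drivers. The field names, and the relationship between them, are made up for the example.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 500

# Invented survey responses: support ratings (1-5) at stages of the patient journey.
survey = pd.DataFrame({
    "info_at_diagnosis": rng.integers(1, 6, n),
    "written_materials": rng.integers(1, 6, n),
    "family_support":    rng.integers(1, 6, n),
    "clinician_contact": rng.integers(1, 6, n),
})
# Invented ground truth: overall satisfaction driven mostly by two of the factors.
survey["satisfaction"] = (
    0.5 * survey["info_at_diagnosis"] + 0.3 * survey["clinician_contact"]
    + 0.1 * survey["written_materials"] + 0.1 * survey["family_support"]
    + rng.normal(0, 0.3, n)
)

features = survey.drop(columns="satisfaction")
tree = DecisionTreeRegressor(max_depth=3, random_state=0)
tree.fit(features, survey["satisfaction"])

# Feature importances give a rough hierarchy of which kinds of support matter most.
for name, importance in sorted(zip(features.columns, tree.feature_importances_),
                               key=lambda pair: -pair[1]):
    print(f"{name:20s} {importance:.2f}")
```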
use O.R., which can really go across the whole spectrum of the business. We’ve also used it to look at how we provide information to patients, for instance [see A question of priorities]. And I’ve been amazed by the quality of people who volunteer for us.” The experience was equally rewarding for Mark. “It was very fulfilling for me,” he says. “It gave me a chance to meet a wonderful bunch of people and to contribute to a fantastic cause. It also allowed me to experiment with different modelling techniques, and it was great to see that the outputs were so well received. It’s nice to think the analysis should go on helping Bloodwise for a long time.” Mark is one of over five hundred volunteers whose services are now available through the Pro Bono O.R. initiative. Their expertise is being brought to bear in organisations ranging from large national concerns to small community set-ups, many of which are struck by how much can be achieved in just a short period.
“Without the tools to model different scenarios and understand their consequences, it isn’t surprising that many organisations tend to rely on gut feelings,” says Felicity. “An O.R. practitioner comes armed with an array of analytical tools, plus the skills and experience to identify the critical factors and issues, explore the different options and explain the impact of them in real terms. That’s why our volunteers can make a huge difference in a small amount of time.” In the end, as Bloodwise has found, the fundamental goal is simply to help the third sector do a better job. “O.R. won’t make the decisions for you,” says Felicity, “but it provides some of the head to your organisation’s heart. And when you combine the two you’re more likely to act in the interests of your organisation and its beneficiaries.”

Neil Robinson is the managing editor of Bulletin Academic, a communications consultancy that specialises in helping academic research have the greatest economic, cultural or social impact.
O.R. IN HM REVENUE & CUSTOMS
JOHN LORD AND ANDREW CULLING
HM Revenue & Customs (HMRC) is the UK’s tax, payments and customs authority, collecting the money that pays for public services and helping families and individuals with targeted financial support. We do this by being impartial and increasingly effective and efficient in our administration. We help the honest majority to get their tax right and make it hard for the dishonest minority to cheat the system; playing a major role in helping us to achieve these goals are Operational Research analysts.
A GOVERNMENT-WIDE COMMUNITY
Operational Researchers in HMRC are part of the Government Operational
Research Service (GORS) which supports policy-making, strategy and operations in 25 government departments and agencies, employing more than 600 analysts, ranging from sandwich students to members of the Senior Civil Service. HMRC has over 100 of those analysts – based in London and the North West – with a few working across government departments and many in multi-disciplinary teams with other analytical professionals such as economists, statisticians and social researchers. Examples of work with other Departments include working with the Department for Work and Pensions on Universal Credit and PAYE’s Real Time Information (RTI).
PAYE RTI was introduced in April 2013 and requires all UK employers to tell HMRC of their liability to PAYE at or before the point they make payments to their employees, i.e. around each payroll run. Previously employers paid on account monthly and reported actual liabilities annually. Elsewhere, Operational Researchers join up across government on specific methodology areas: the Government Predictive Analytics Network being one example, sharing best analytical practice from industry experience and joining up thinking on the use of new tools and open source technologies.
HMRC ROLES
HMRC’s analysts typically work in either policy-facing or operational/ compliance roles. Those with a policy focus work on fast-paced analyses, answering policy and ministerial questions to sometimes short timeframes, alongside work preparing for the Chancellor’s budgets and autumn statements. These analysts also support Treasury and the Office for Budget Responsibility, while maintaining taxpayer confidentiality. Operationally, analysts deliver insights and taxpayer-level modelling to make effective use of resources and test new techniques for improving taxpayer experiences, revenue collection and efficiency of HMRC’s services.
DATA, DATA EVERYWHERE
As HMRC oversees the tax affairs of all UK taxpayers, we have plenty of data for analysts to explore, explain and inform business decisions. Two million VAT-registered businesses for example, submitting eight million tax returns a year; millions of Income
Tax Self-Assessment returns received annually; and PAYE’s RTI – which provides HMRC with pay details of every business’ employee each time the payroll is run – has surpassed the three billion payslip records mark. New digital services meanwhile, are delivering more new data on the behaviour of taxpayers, which we’re mining for insights to improve and target services better.
The Government Operational Research Service (GORS) supports policy-making, strategy and operations in 25 government departments and agencies
EXPLOITING DATA FOR PREDICTION
There is a variety of O.R. applied in HMRC, with one classic technique being Predictive Analytics. HMRC has to demonstrate effective use of resources; with millions of taxpayers and constrained resource to ensure tax is paid correctly, improved targeting to ensure effective resource deployment is key to our operations. O.R. analysts have been targeting the non-compliant – those not paying the right amount of tax at the right time – for many years. Taxpayer interventions provide outcomes that enable use of credit scoring type techniques to rank taxpayers according to risk (of non-compliance).
Techniques have evolved as computing power has improved: from early stepwise regressions to sophisticated logistic and linear regression scorecards nowadays. Analysts are applying similar techniques to assist taxpayers that will potentially require extra assistance in complying with their tax obligations. We try to predict those likely to fall into that category and offer help proactively: by adopting predictive modelling techniques we significantly increase the probability of finding those vulnerable taxpayers that need help. This kind of modelling can have other welcome knock-on effects too. Where a taxpayer is struggling and gets into debt, our data analysis can tell us if they’re likely to correct things pretty soon. We can then take a more appropriate action, assessing risk accordingly. Similarly, in the debt management space, we look to predict who is likely to get into a substantial amount of debt over a sustained period of time, even if they aren’t currently in debt. This helps operational staff to direct sufficient resources towards long-term debtors. We can further use information held on taxpayers to forecast the occurrence of debt. The use of decision trees and scorecards enables prediction of risk for each individual, meaning we can adopt different strategies depending on whether we feel they fall into a low or very high risk category. The model used has been adapted from its original purpose but is effective nonetheless. It can be used to predict
those likely to miss key tax return submission deadlines and give a gentle nudge in the right direction, as we want them to avoid getting caught up in our processes. This approach has led to a reduction in debt and an increase in on-time filing.
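HMRC’s scorecards and taxpayer data are, of course, not public. As a purely illustrative sketch of the general technique, a logistic-regression scorecard used to rank cases by predicted risk, the code below generates an invented population, fits a model and sorts the hold-out cases so the highest-risk ones would be worked first.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 5_000

# Invented case features and outcomes from past interventions.
cases = pd.DataFrame({
    "late_returns_3yrs":   rng.integers(0, 4, n),
    "turnover_volatility": rng.gamma(2.0, 0.2, n),
    "sector_risk_score":   rng.uniform(0, 1, n),
})
# Invented ground truth linking the features to non-compliance.
logit = (-2.0 + 0.8 * cases["late_returns_3yrs"]
         + 1.5 * cases["turnover_volatility"] + 1.0 * cases["sector_risk_score"])
cases["non_compliant"] = rng.random(n) < 1 / (1 + np.exp(-logit))

train, test = train_test_split(cases, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000)
model.fit(train.drop(columns="non_compliant"), train["non_compliant"])

# Rank the remaining population by predicted risk and work the highest-risk cases first.
test = test.copy()
test["risk"] = model.predict_proba(test.drop(columns="non_compliant"))[:, 1]
print(test.sort_values("risk", ascending=False).head())
```

The same ranking idea, with different outcome definitions, underlies the debt and filing-deadline models described above.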
EVALUATION OF NEW APPROACHES TO IMPROVE EFFECTIVENESS
As HMRC has large and complete (taxpayer) population information, we can legitimately use the gold standard of evaluation techniques – randomised controlled trials, or RCTs – to test if new methods should be applied in improving tax administration. For many years, central analytical teams in HMRC have established themselves as units of expertise in areas such as impact and process evaluation: using RCTs to drive decision making and measure success. However, as the government digital agenda builds momentum, we see decision making cycles spinning at an ever increasing
rate. For traditional evaluation teams, this poses substantial challenges: in extreme cases, commitment and accountability for future delivery have been set before project designs are finalised and, in turn, this undermines the influence of any post-hoc evaluation design. Despite the inherent challenges, design and implementation of robust RCTs, A/B testing (randomised experiments with two variants, A and B, acting as the control and the variation in the controlled experiment) and post-hoc evaluation have never been more important for HMRC operational decision making and performance measurement. HMRC analysts use a combination of O.R., Statistics and Social Research approaches to overcome these issues. Analysts can influence design and evaluation of the trials using logic models to understand links between taxpayer behaviour and performance measures, plus the use of Monte Carlo methods to measure the costs and benefits of new approaches.
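A minimal sketch of the arithmetic behind such a trial: randomly assigned control and variant groups are compared on a binary outcome (here, filing on time) with a two-proportion z-test. The group sizes and success counts below are invented.

```python
from math import erf, sqrt

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference between two proportions (normal approximation)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Invented trial: existing reminder letter vs. redesigned letter, outcome = filed on time.
p_a, p_b, z, p = two_proportion_z_test(success_a=4_310, n_a=5_000,
                                        success_b=4_455, n_b=5_000)
print(f"control {p_a:.1%}, variant {p_b:.1%}, z = {z:.2f}, p = {p:.4f}")
```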
WORKFORCE MODELLING

Operational Researchers provide support to those planning both workforce and workplaces of the future. HMRC have increasing numbers of staff nearing retirement age - recruitment plans and talent pipelines need to be designed with this in mind. To help anticipate the business areas most affected, we use logistic regression analysis to derive the probability of staff leaving HMRC in future years - and use these probabilities in a simulation model to show the impact on future staff numbers. Results are calculated with confidence intervals and we can drill down to show the impact on lines of business, grades or professions. HMRC is reducing its office estate from 170 offices across the UK to 13 regional centres - to create modern, adaptable work spaces. Our O.R. analysts have worked with our HR people to estimate the impact of this change on our workforce, and to use that analysis to develop fully costed plans for phasing the migration, minimising adverse performance impacts during the transition.
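A hedged sketch of that kind of workforce projection: per-person leaving probabilities (which in practice would come from the logistic regression) drive a simple Monte Carlo simulation of headcount, reported with percentile intervals. All numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
N_STAFF, YEARS, RUNS = 1_000, 5, 2_000

# In practice these would be per-person probabilities from a logistic regression;
# here they are invented, with a higher leaving probability for staff near retirement.
p_leave = np.where(rng.random(N_STAFF) < 0.15, 0.35, 0.06)

headcount = np.empty((RUNS, YEARS + 1), dtype=int)
for run in range(RUNS):
    still_here = np.ones(N_STAFF, dtype=bool)
    headcount[run, 0] = N_STAFF
    for year in range(1, YEARS + 1):
        still_here &= rng.random(N_STAFF) >= p_leave   # each person leaves with their own probability
        headcount[run, year] = still_here.sum()

# Report the median projection with a 90% interval for each year.
for year in range(YEARS + 1):
    lo, med, hi = np.percentile(headcount[:, year], [5, 50, 95])
    print(f"year {year}: {med:.0f} staff (90% interval {lo:.0f}-{hi:.0f})")
```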
DEVELOPING PERFORMANCE MEASURES
O.R. was at the heart of developing a new performance framework to reflect HMRC’s diverse and complex range of work. Problem solving techniques were used to build a picture of our key objectives, work themes that contribute and measures required to determine our success. O.R. analysts were responsible for development of key new measures of our strategy to become digital by default, to work efficiently and effectively and putting the customer at the heart of everything we do. Additionally, O.R. analysts
were instrumental in the design and development of clear and simple, yet insightful, performance visuals, presented within a digital performance hub for easy access and interrogation, which are used to drive decision making across the organisation.
THE FUTURE FOR HMRC
Making Tax Digital is at the heart of our plans to make HMRC one of the most digitally advanced tax administrations in the world. It will mean the end of the annual tax return as we know it – replaced by simple, secure and personalised digital accounts, closer to real-time updates to HMRC (for businesses, based on digital records), and a more modern tax framework. How will customers benefit:
• Customers will no longer have to provide information that HMRC already holds and we will make better use of the information we already receive from third parties to reduce under and over payments;
• By collecting information as close to real time as possible, playing this back to customers and showing them what this means in terms of estimated tax due, they can plan in advance how much tax they will owe and budget accordingly;
• By 2020 customers will be able to see all their tax affairs in near real-time, in one place, with overpayments offset against liabilities.
All businesses now have access to the Business Tax Account and if you haven’t tried the Personal Tax Account you can register at https://www.gov.uk/personal-tax-account. For those who genuinely cannot get online due to their individual circumstances such
as disability, geographical, or other reasons, we will exempt them from obligations of Making Tax Digital. Alternatives, such as telephone filing and home visits, will be provided. Digital record keeping, as well as built in prompts, will help reduce common errors, giving customers greater certainty that they’ve got their tax right first time. Targeted guidance and tailored alerts will also help customers be more aware of their relevant obligations, entitlements and reliefs. Sophisticated tools are already in place to analyse data; combined with this more real-time approach, we can take action earlier, improving compliance and reducing the need for a costlier and more disruptive intervention later for customers.
Making Tax Digital is at the heart of our plans to make HMRC one of the most digitally advanced tax administrations in the world

O.R. is playing a key role in shaping deployment of these services through modelling customer behaviour around take up of digital services and the impact on traditional contact channels like phone and post. To deal with large and increasing volumes of data being collected, HMRC is currently building a centralised, unified data source, an Enterprise Data Hub: a scalable and flexible Hadoop platform that should be able to cope with expansion in future years. Much effort is needed in the coming months and years to see how we can make best use of all of the data and new technology. In the growing area of Data Science we are considering the new skills we think are required to push the Department forward: joining up across government and academia is feeding this thinking, with trials of new data visualisation and collaborative tools currently ongoing.

Search ‘HMRC Making Tax Digital’ to find material on consultations HMRC have set out: https://www.gov.uk/government/collections/making-tax-digital-consultations and an overview of what this means for taxpayers: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/484668/making-tax-digital.pdf

John Lord is leading an Analytics team in HMRC and has twenty years’ experience of O.R. in HMRC and previously in Customs & Excise. Andrew Culling is leading an Operational Strategy team and has worked across several government departments in his ten year career. John and Andrew are the joint Heads of Profession for O.R. in HMRC.
JOB OPPORTUNITIES
As HMRC widens its analytical capabilities we have attracted investment; funding should see us grow our capability by recruiting up to 300 more analysts over the next few years. Those interested in Operational Research roles should search the Civil Service Jobs (https://civilservicejobs.service.gov.uk/) or GORS (http://www.operational-research.gov.uk/recruitment) websites.
MODELLING PEDESTRIAN MOVEMENT WITH SENSE™: OLD SCIENCE, NEW APPROACH
DANIEL MARIN
MOVEMENT STRATEGIES, a company that has specialised in the analysis of crowd movement for over 12 years, have created their own simulation tool to apply their knowledge of crowd movement, human behaviour and fluid dynamic principles. In the aftermath of 9/11, the question of how to evacuate multiple buildings
safely and quickly was first raised. When planning for this scenario, there are a number of questions that need to be considered, such as the approach to develop, the criteria for defining safe evacuation, which tools to use and so on. Simon Ancliffe, Chairman and founder of Movement Strategies, is a recognised crowd movement specialist,
with expertise spanning observation, data collection, analysis, modelling and consulting. This knowledge of people’s movement and behaviour, from both a qualitative and quantitative perspective, is something unique and one that lends itself to a number of other applications, in addition to large scale mass evacuation plans. Music festivals, stadiums (through football and rugby clubs or architects), major events, cultural sites, amongst others, became key areas of focus. In the last decade, there have been many opportunities in the UK and worldwide to apply and further develop these specific skills. The real catalyst for the business was the London 2012 Olympic and Paralympic Games. Movement Strategies helped the London Organising Committee of the Olympic and Paralympic Games (LOCOG) deliver a safe, enjoyable and welcoming games, accommodating hundreds of thousands of spectators and staff. This work started in 2006 with the analysis of the design of the Olympic venues and continued throughout the games, analysing the movement of people to and from the Olympic venues in real time. Throughout this project, many questions were answered, e.g.: “There are X people in A, willing to go to B. What routes will they use? How long does it take to walk from A to B? Is the route wide enough to accommodate this demand? Are there any pinch points along the route? Are queues formed? How long are the queues? What Level of Service are these pedestrians experiencing?” In small scale operations, these questions could be answered with basic assumptions, including average walk speed and distance and a simple division. In more complex cases, modelling tools need to be used to understand the various permutations and scenarios. See the panel for discussion of different potential modelling tools.
WHAT MODELLING TOOLS ARE AVAILABLE?
In layman’s terms, a model is a simplified and idealised understanding of a physical system. A model typically is built using a series of equations which translate the physical world into mathematical terms, and the underlying equations are then solved. Models need to be simple enough that they can be understood, manipulated and solved, but also complex enough to accurately reflect the most important factors affecting the real life scenario. A model is an approach to solve complex systems where intuition and common sense are not sufficient (although always necessary). A tool, on the other hand, is a piece of software that enables the modeller to develop the model.

Flow calculations have been studied for many decades and are well understood. There are a variety of approaches that can be used when modelling the situations previously described. There is a choice between a static and a dynamic approach. A static approach does not take time into account, whereas the dynamic one does. One must also consider whether to employ aggregate and disaggregate (or macroscopic and microscopic) approaches. In a disaggregate (microscopic) approach, the flow being modelled is considered as a series of individual agents, with individual properties (i.e. speed or density). In an aggregate (macroscopic) approach, the flow and its properties are, as the name suggests, aggregated. Whilst microscopic models are usually stochastic (using random parameters to capture the uncertainty of the model), macroscopic models tend to be deterministic (when running the same model several times, one always gets the exact same result).

Many commercial tools are available to model vehicle flows. These tools again include both microscopic and macroscopic aggregation, static and dynamic assignments. They enable the development of dynamic microscopic and static or dynamic macroscopic models. Some software tools, such as AIMSUN (TSS Systems) for example, also employ the so-called mesoscopic approach, in which flow characteristics such as the speed of vehicles across a section are aggregated, but vehicles are modelled in a microscopic way.

Pedestrian simulation tools are usually either static macroscopic (normally through Excel spreadsheets) or dynamic microscopic (through commercial tools such as LEGION or Mass Motion), although dynamic macroscopic tools such as PEDROUTE and PAXPORT were extensively developed and used in the 1990s to test station and airport terminal layouts. The static macroscopic approach is satisfactory when deriving quick results with high level assumptions. However, when the modelled scenario becomes more complex, with route choices, cross-roads, merging flows, input flows changing with time etc., the static macroscopic approach becomes almost impossible to update, sustain and communicate. Using complex spreadsheets comes with its own issues; some studies claim that 88% of spreadsheets issued in the world have errors, which can sometimes lead to misinterpretations, resulting in real changes in our daily lives.

The dynamic microscopic approach theoretically enables modelling of any kind of situation, behaviour and interaction with the environment. However, when the number of agents gets increasingly high (hundreds of thousands) and the scale of the modelled area very wide (several square miles), both modelling and computing time become a real issue, which usually discourages the modeller from running enough scenarios to have a full picture of the outcomes that could occur.
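To make the static macroscopic end of that spectrum concrete, the following back-of-envelope sketch estimates how long a crowd takes to clear an exit of a given width under an assumed flow rate. All figures are illustrative rather than taken from any real venue or planning guidance.

```python
# Static macroscopic back-of-envelope: time for a crowd to clear an exit gate.
# All parameters are assumed for illustration only.
CROWD = 60_000                 # people
GATE_WIDTH_M = 12.0            # total usable exit width in metres
FLOW_RATE = 80                 # assumed people per metre of width per minute
WALK_DISTANCE_M = 600.0        # distance from the gate to the station
WALK_SPEED_M_PER_MIN = 80.0    # roughly 1.3 m/s

throughput = GATE_WIDTH_M * FLOW_RATE      # people per minute through the gate
queue_time = CROWD / throughput            # minutes until the last person passes the gate
walk_time = WALK_DISTANCE_M / WALK_SPEED_M_PER_MIN
print(f"last person clears the gate after ~{queue_time:.0f} min, "
      f"plus ~{walk_time:.0f} min walking")
```

As the panel notes, this kind of arithmetic breaks down once route choices, merging flows and time-varying demand enter the picture, which is what motivates the dynamic approach described next.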
When a client challenged us to model the mass evacuation of a wide tourist area in London with many potential scenarios and the ability to detail the evacuation per building, the question of which tool to employ needed to be addressed. A dynamic model had been developed previously at Movement Strategies called SENSE™, built using VBA for Excel. It was a smart way of computing pedestrian movement across a network and applying specific pedestrian parameters. The weakness of this approach was the calculation time for big networks and the limited editing tools provided by Excel. This was the first option. Other established agent based modelling tools such as LEGION were assessed as being unsuitable for this large modelled area. Movement Strategies investigated the use of simulation software designed for vehicle modelling and applying pedestrian characteristics to the modelled network and flow. Whilst this was considered an option, there are significant differences between
roads modelled for cars and footpaths modelled for pedestrians. In the latter case, crossing flows share the same capacity (the width of the pavement) and the network capacity in terms of throughput depends on the uni- or bi-directional characteristic of the flow
(when density reaches a critical level, people don’t walk at the same speed when following the flow or facing it). This could be modelled with simulation tools for which an Application Programming Interface (API) was available, enabling the modeller to dynamically program the characteristics of the network so it fits pedestrian properties. We chose not to pursue this option.

When challenged to model the mass evacuation of a wide tourist area in London the question of which tool to employ needed to be addressed
FIGURE 1: STATIC ASSIGNMENT (ROUTE CHOICE MODULE OUTPUT). Map data © 2015 Google
After internal discussion, and considering the skillsets available in our team - urban planners, crowd modellers, software programmers, mathematicians… - we decided to build our own tool, designed for our client’s specific needs. Whilst most of the techniques are not new, by combining our knowledge from different fields we felt we were able to create an in-house tool that we could customise according to our needs. And more important than customisation, we would be able to explain the outputs at any level of analysis, without having to manage a ‘black box’ such as in some commercial tools. The requirements were as follows: the tool needed to be able to run a dynamic model that could handle millions of people, across areas of several square miles, with a time resolution lower than one minute. It also needed to be suitable for a wide range of environments and the model outputs should be of high quality. It was identified that a Geographical Information System (GIS) framework was an intuitive way to model a network of routes with properties defining the way people can move through those routes. A GIS tool enables the modelling of a network configuration and demand scenario and communication of the results graphically. This way, both the modeller and the client can associate, for example, a pinchpoint of the model to a real physical location they recognise. QGIS (http://www.qgis.org/ en/site/), a free, open source, stable GIS Package that provided an API allowing customised plugins was selected as the core package. The core algorithm that computes the evolution across the network had to be fast and be able to handle big tables in memory and C++ was selected.
Map data © 2015 Google
FIGURE 2: DYNAMIC EVOLUTION OF THE DENSITY ON THE NETWORK (TIME DISPLAYED AT THE TOP LEFT OF EACH PICTURE).
The resulting tool, also called SENSE™ like its predecessor, is a network-based simulation tool. Each network link (footpath, road, pavement, stairs, escalators, etc.) is modelled as a line with two main properties: a throughput capacity and a storage capacity. To understand this, one can imagine people as a fluid, and the network they walk through as an upside-down bottle. The throughput capacity is the diameter of the neck, whereas the storage capacity is the volume of the bottle itself. Taking this simple example, it is possible to understand the model behaviour as follows: if water is poured into a pipe which has a varying
diameter, what is the throughput at the end of the pipe? Will the water spill out of its container? These are exactly the questions the SENSE™ tool answers when water is replaced by pedestrians and the bottle by a network of streets and spaces. The SENSE™ tool is split into two main modules:
• a route choice module, based on Dijkstra's algorithm (first conceived in 1956);
• a dynamic assignment module, based on basic flow conservation principles, which have been applied in fluid mechanics since the 19th century.
The route choice module enables a user to find, for any Origin–Destination demand, a set of routes that minimise the cost (usually the walk time). The dynamic assignment module computes the evolution over time of the demand across the routes provided by the route choice module, based on the simple principle that, in any section of the network:
Population at time t = Population at time (t−1) + section inflow at time t − section outflow at time t
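To make the update rule concrete, the following is a minimal, illustrative sketch (not the production SENSE™ code; the link sizes and demand figures are assumed) that steps a chain of links forward in time, each link having a throughput capacity and a storage capacity:

```python
from dataclasses import dataclass

@dataclass
class Link:
    """One network link (e.g. a stretch of pavement or a gate)."""
    storage: float     # maximum number of people the link can hold
    throughput: float  # maximum number of people that can leave per time step
    population: float = 0.0

def step(links, inflow):
    """Advance a chain of links by one time step.

    `inflow` is the demand arriving at the first link. Each link admits people
    up to its spare storage and passes people on up to its throughput, limited
    by the spare storage of the link downstream, so that in every section
    population(t) = population(t-1) + inflow - outflow.
    """
    arriving = inflow
    for i, link in enumerate(links):
        # People who cannot enter the first link are simply dropped in this sketch.
        entering = min(arriving, link.storage - link.population)
        link.population += entering
        leaving = min(link.throughput, link.population)
        if i + 1 < len(links):
            leaving = min(leaving, links[i + 1].storage - links[i + 1].population)
        link.population -= leaving
        arriving = leaving
    return arriving  # people reaching the end of the chain this time step

# Assumed example: a wide footpath feeding a narrow gate.
network = [Link(storage=500, throughput=120), Link(storage=80, throughput=40)]
for t in range(10):
    exited = step(network, inflow=100)
    print(t, [round(link.population) for link in network], exited)
```

Even this toy version shows the behaviour described above: when a wide footpath feeds a narrow gate, the gate's throughput caps the outflow and the crowd accumulates upstream.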
The core algorithm is based on the Finite Difference Method (the Finite Element Method in one dimension), which gives a framework for transforming a continuous problem into a discrete problem in both time and space (and one that is therefore computer friendly). Examples of outputs provided by the tool can be seen in Figures 1-3, where we analyse the egress phase of an event in London's Olympic Stadium. Figure 1 indicates that many more people use the northern entrance than the southern entrance. Figure 2 indicates that the time taken for the northern entrance of Stratford Station to 'clear' is much longer than for Stratford International and the southern entrance. Knowledge of the network, local constraints and so on then enables us to understand the reasons for this imbalanced demand and to work out with our client the best solutions to optimise the system. Figure 3 shows profiles of population versus time at the northern and southern stadium gates as well as at the three station entrances. One can see that the profile for Stratford Station's northern entrance lasts more than 90 minutes, whereas the vicinity of the stadium is cleared in less than 30 minutes and the other station entrances are cleared in less than 50 minutes.
FIGURE 3: COMPARISON BETWEEN EXIT PROFILE FROM THE STADIUM AND ARRIVAL PROFILE AT THE PUBLIC TRANSPORTATION GATELINES.
To conclude, we have built on the innovations and work of Taylor (mathematics), Navier and Stokes (fluid dynamics), Lighthill and Whitham (traffic models) and Dijkstra (route choice), and combined this with our knowledge of people movement, physics, maths and open-source development frameworks to design a tool capable of answering most of our client's questions with regard to the movement of crowds. SENSE™ is now a mature software tool, with further developments carried out in partnership with the University of Southampton in the area of route choice optimisation. We are also focusing on the modelling of dynamic events that occur within the network
(e.g. stop-and-go situations), and we are undertaking further research on flow-density curves (including the analysis of mobile phone data to better understand how crowd density affects the average walk speed) and implementing these findings as enhancements in the tool.
Daniel Marin is Managing Consultant at Movement Strategies. Before joining Movement Strategies in April 2015, he worked for seven years in Paris in public and private transportation modelling. His background is in applied mathematics, modelling and simulation.
FINDING THE BEST WAY WITH DYNAMIC PROGRAMMING
© The AA
DAVID K. SMITH
EVEN THOUGH THEY SELDOM NEEDED IT in the 1960s, my parents always carried a large book in the car. Produced by the Automobile Association, it had maps of England and Wales, street plans for some of the larger cities, and a brief description
of the sights of hundreds of villages, towns and cities. In the front of the book, there were recommended routes between some of the larger places, listing the roads to take and intermediate localities. There were also pages with outline maps showing such routes from five different starting points. Although my parents believed that reading in the car would make children sick, we were allowed to look at maps. One of these five outline maps is shown here. The spidery lines show the recommended routes from Birmingham to a host of other locations. And naturally, these are also the recommended routes TO Birmingham from these locations as starting points. As such they show an interesting pattern. There is clearly only one recommended route for each starting point. That route joins with another, and another, and so on, until all the hundred routes converge to a small number of roads leading into the city. Nowhere do the routes cross. (These days, you can see the same pattern for yourself on any tablet with an app for route finding. Choose two places, A and B, about a hundred miles apart, and find the best route from A to B. Then select a third place, C, close to that route, and find the best route from C to B. If it is reasonably close, you will find that the two routes converge – but if C is far enough off the route, then the best route from C to B may be completely different.)
All this route finding is a simple, pictorial demonstration of some of the principles behind the problem-solving approach known as dynamic programming, DP for short. One principle is that if you want to solve a big problem, it is often possible to solve several smaller problems instead. Here the big problem is to find the best route from point A to point B. The smaller problems are to find the best routes to B from several points (C, D, E, F, say) which are closer to B and might be on the route from A, and then to find the routes from A to each of these points. Then find the best route from A to B via one of these points. Finding a lot of short routes is much easier than finding one long route. A second principle of DP is that once you are on one of the recommended routes to Birmingham, it doesn't matter whether you started the journey a hundred miles earlier or just ten. The history of how you got there is immaterial. There is no sense of "memory" associated with progress along the route. This principle is attributed to Richard Bellman, the American pioneer of DP. He went on to describe problems like this as sequential decision problems, because, on the
route, you are making a sequence of choices – where do I go next? In the jargon of DP, the sequence of decisions is known as the set of stages of the problem, which may be physical progress on a journey, or successive days, or they may be created by breaking up the problem
into something more artificial. To help make decisions at each stage, you need to describe what state you are in for the decision. So for the routing problem, the state might be “Outside Bristol Parkway station”. The app on my tablet will also ask what transport I want to use, so I need to extend the description of my state to include whether my journey is by car, public transport, bicycle or foot. The description might be extended to specify the time and date to allow for the timetables of trains and buses. Dynamic Programming models start with states close to the end of the
sequence of decisions, find the best decision for all of these, and then work backwards, stage by stage, looking at the cost and benefit of making choices which lead to new states at the subsequent stage (from which the best decision is already known). There isn't a unique form for DP problems; in Operational Research, it is considered an approach to problem solving rather than a mathematical straitjacket. Introductions to DP usually start with problems of finding routes, because these are generally simple. The cost of a decision is the distance to the next state, and the cost of the best policy – or set of decisions – is the sum of the distances. (Or it could be the time to the next state, or the cost of public transport fares.) The same introductory lectures or books often move on to a problem where the stages are artificial: that of selecting which of a set of things to pack into a container which can't take everything. As you pack items, the space available for the remainder is reduced, but the value of the contents is increased. The stages imagine that you have only one kind of item to pack, and determine what is best for packing that item, with different amounts of space available. Then you decide whether to pack a second kind of item, knowing the best way to fill any space that is left after that decision; then a third, knowing how best to pack with the first two. It doesn't take much to extend this sort of packing problem to one of investment – which of a portfolio of possible assets should be bought with a given sum of money?
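A rough sketch of that packing recursion, with a hypothetical container size and item values (not taken from the article), might look like this:

```python
def pack(capacity, items):
    """items is a list of (size, value) pairs, one entry per kind of item,
    each of which may be packed repeatedly. Returns best[c], the maximum
    value achievable with c units of space, built up one item kind
    (one DP stage) at a time."""
    best = [0] * (capacity + 1)
    for size, value in items:          # each kind of item is one stage
        for c in range(size, capacity + 1):
            # either ignore this item kind at capacity c, or pack one more
            # of it and reuse the best answer for the space that is left
            best[c] = max(best[c], best[c - size] + value)
    return best

# Hypothetical container of 10 units and three kinds of item (size, value)
print(pack(10, [(3, 4), (4, 6), (7, 11)]))
```

Each pass over an item kind is one stage, and the table of best values for every amount of remaining space is the set of states.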
These two introductory problems have no randomness. However, the Dynamic Programming approach is also well suited to problems where the result of a decision is uncertain. Instead of calculating the cost or benefit of the decisions you make at a stage and in a state, you model the expected value of the decision. So who uses this approach to solving problems? The versatility of the approach makes it useful in numerous practical problems, not just in the programming of your sat-nav. The DP approach is at the heart of the analysis of chess and other board games, where it goes under alternative names such as "tree search". The white player looks at the board and chooses a move to make, which advances the game one stage and changes the state; the state is further changed by the black player's response, assuming rational play. So for each move that the white player can make, there is a resulting state after black's response, advancing the stages towards the end. In another type of problem, dynamic programming is renamed search theory, and this is one recent use of the DP approach. The search for the missing Air France flight 447 after it disappeared over the Atlantic in 2009 involved a sequence of decisions about which of several sectors of the ocean to search next. Searching a sector had a particular chance of finding the plane – if it was in that sector. Models of the plane's flight path, and the evidence of wreckage, provided estimates of the chance of the wreck being in each sector; these estimates were updated after each unsuccessful search of a sector, reducing the chance of the wreck being in that sector and increasing the chance of it being in the others. The "state" in the dynamic programming model did not depend on the order in which previous searches had been made – only on the estimates of the chance of a successful search. For routing problems, DP overlaps with the mathematics of graphs and networks; when the problem is deploying limited resources, as in the packing problem earlier, DP enters the world of financial mathematics and mathematical programming. A further area of application is in engineering. How should you adjust the controls of some device or system so that it performs optimally? What is the correct setting for a water valve in a network of pipes? How much power should an electrical generator produce, and should it be changed to cope with forecast demand? Even the thermostat on the oven in your kitchen uses a rudimentary form of DP: when the oven is too hot, the elements are turned off; when it cools, they are turned back on again. The oven has, in effect, six states – too cold, about right and too hot, for each of which the elements may be on or off – and in each state there is a recommended decision. Dynamic programming has also found its way into sporting decisions. A long jumper has to choose where to leap from – it cannot be in front of the take-off board, but even the most consistent jumpers can overstep this line. So the aim is to select a target point behind the line for each qualifying jump. A foul jump on the first attempt should encourage the jumper to aim further back on the next attempt, while a good jump may be an incentive to risk moving a little forward to boost the recorded distance. In tennis, the server has two attempts and often hopes to serve an ace, which runs the risk of hitting the top of the net. So a net call on the first service should encourage the server to aim a little higher on the second service. The Journal of the OR Society once published guidelines for which part of a dartboard to aim for when playing 301-up. Each dart thrown required a decision, and the state was the current score; randomness meant that a poor throw might be disqualified or change the score to an awkward total. It is in cricket that DP has had world-wide success. The Duckworth-Lewis method uses the DP approach, and is a way of treating two cricket teams fairly when a limited-over match is interrupted by rain. The state of a batting team is described as the resources available to it, in
terms of wickets remaining and overs to be bowled. With those resources, it is assumed that a rational captain will plan a rate of scoring with the intention, roughly, of exhausting both those resources at the end of the match. Fewer wickets remaining reduces the planned run rate because of the risk of a wicket falling; fewer overs to be bowled increases the planned rate. Then each ball bowled and the outcome for the batting team affects
the resources. Analysis of thousands of matches has calibrated the method, so that a target score for the second team to bat can be established for any rain-interrupted match. So, in conclusion, who should use dynamic programming? The answer
is – everyone! We all make decisions in sequence, often without thinking about the assumptions we make. (It's breakfast time, the weather forecast is for sunshine, so you don't take an umbrella. You assumed the forecast was accurate, and that there was only a slight chance that it wasn't.) But, in business practice, the approach of dividing a substantial problem into smaller, easier problems is one with widespread application. So is the concept of identifying real, or artificial, sequential stages in that substantial problem. The approach of DP clarifies and rationalises a natural process of "one thing at a time". And that is its great value.
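To make the stages-and-states machinery concrete, here is a small sketch of the route-finding recursion on an invented network (all places and distances are assumed): start at the destination and work backwards, keeping the best onward distance and the recommended next place for every location.

```python
# Hypothetical road network: each place lists (next place, miles).
roads = {
    "A": [("C", 40), ("D", 55)],
    "C": [("E", 30), ("F", 45)],
    "D": [("E", 35), ("F", 25)],
    "E": [("B", 50)],
    "F": [("B", 60)],
    "B": [],
}

def best_routes(roads, destination):
    """Return, for every place, the shortest distance to the destination and
    the recommended next place, found by working backwards from the end."""
    best = {destination: (0, None)}
    changed = True
    # Keep sweeping until no distance improves; for a small staged network
    # like this, a handful of backward passes is all that is needed.
    while changed:
        changed = False
        for place, links in roads.items():
            for nxt, miles in links:
                if nxt in best:
                    candidate = miles + best[nxt][0]
                    if place not in best or candidate < best[place][0]:
                        best[place] = (candidate, nxt)
                        changed = True
    return best

print(best_routes(roads, "B"))
# Best distance from A to B is 120 miles, going via C and then E.
```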
DAVID SMITH
David Smith studied mathematics as his first degree and decided that operational research offered an excellent way of using his mathematics doing something useful and varied. After postgraduate study at Lancaster, he joined the staff at the University of Exeter, where he made his career – with a brief diversion to teach at the University of Jordan. In the course of time, he wrote research papers and three textbooks. In retirement he is on the staff of an Exeter church, writes about operational research, philately and local history, and tries to keep fit.
ADVERTISE IN IMPACT MAGAZINE
The OR Society is delighted to offer the opportunity to advertise in the pages of Impact. The magazine is freely available at www.theorsociety.com and reaches a large audience of practitioners and researchers across the O.R. field, and potential users of O.R. If you would like further information, please email: advertising@palgrave.com.
Rates: Inside full page: £1000; Inside half page: £700; Outside back cover, full page: £1250; Inside back cover, full page: £1200; Inside front cover, full page: £1200; Inside full page 2, opposite contents: £1150; Inside full page, designated/preferred placing: £1100.
Full page adverts must be 216x286mm to account for bleed, with 300dpi minimum.
LIFE RITHMS Geoff Royston
WORLD DOMINATORS?
They are intelligent, invisible, and they are taking over the world. No, not aliens, but algorithms, defined as "a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer". Anyone working in operational research or data science will be very familiar with the key role of algorithms, whether in, say, methods for finding the shortest route for a delivery firm, the most efficient schedule for a travel company, or the best mix of products for a manufacturer to make from a limited supply of parts. Indeed, a while ago the front cover of the New Scientist proclaimed one such method (known as the simplex algorithm) as "the algorithm that runs the world – called upon thousands of times a second to ensure the world's business runs smoothly". On the face of it this is a highly technical area, and one becoming ever more sophisticated, with algorithms underpinning artificial intelligence-based systems that are now increasingly capable of tasks ranging from diagnosing illness to driving cars. But while algorithms are enabling machines to deploy what have traditionally been human skills, perhaps we humans can learn a trick or two from them, and use a few new algorithmic skills ourselves. After all, we already use simple algorithms by, say, making a cake from a recipe or knitting a sweater from a pattern, so maybe the approach can be developed for use in other areas of our daily life. That at least is the message of a new book, "Algorithms to Live By: The computer science of human decisions" by Brian Christian and Tom Griffiths. They make a link between some common human problems and algorithmic solutions.
FROM APHORISMS TO ALGORITHMS
We are all familiar with sayings such as "look before you leap", but how much looking should we do before leaping? Can we turn an aphorism into an algorithm? Christian and Griffiths argue that we can. Let's stay with the looking and leaping problem. Suppose you are house hunting, and you have up to a month to find somewhere. How much time should you spend looking for the best one before you choose? Let's assume it is a sellers' market, so you won't be able to go back to houses that you have turned down. The starting point for an algorithmic approach is the observation that the "best-yet" houses will become increasingly impressive as the search continues, but also decreasingly frequent; so there must be a balance point between looking and leaping. Further analysis shows that your best chance of getting the best house is to spend 37% of the time available, i.e. 11 days, looking, to gauge standards, and then to commit to the first place you see that is better than anything you have already seen. This is not just an estimate; it is provably the best approach, at least in the simple situation described above. (Read the book to find out why, or to see how this point shifts under different scenarios, e.g. to 25% if offers to buy will get rejected half the time, but to 61% if you can go back later and make an offer, with a 50/50 chance of success, on houses that you had previously passed over.) Nor is this algorithmic finding limited to house hunting; the book shows how it is relevant to searching for a parking space, to looking for staff, or for a spouse!
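A quick simulation (a hypothetical sketch, not taken from the book; the house scores and trial counts are invented) makes it easy to check that the 37% cut-off really is the sweet spot for a month of viewings:

```python
import random

def chance_of_best(look_fraction, n_houses=30, trials=100_000):
    """Estimate the probability that the look-then-leap rule picks the
    single best of n_houses viewed in random order."""
    wins = 0
    cutoff = int(look_fraction * n_houses)
    for _ in range(trials):
        houses = [random.random() for _ in range(n_houses)]
        # Look without committing, remembering the best seen so far...
        benchmark = max(houses[:cutoff], default=float("-inf"))
        # ...then leap at the first house that beats it (or take the last one).
        chosen = next((h for h in houses[cutoff:] if h > benchmark), houses[-1])
        wins += chosen == max(houses)
    return wins / trials

for fraction in (0.25, 0.37, 0.50):
    print(fraction, round(chance_of_best(fraction), 3))
# The 37% cut-off comes out best, succeeding roughly 37% of the time.
```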
EXPLORE OR EXPLOIT?
A related problem arises when considering, in choosing, say, a restaurant, whether to try something novel or to stick with an old favourite. Should you explore the unfamiliar or exploit the familiar? This, oddly, is known as the "multi-armed bandit" problem. Its origin has nothing to do with strange-looking highwaymen but is about the best strategy for playing a row of slot machines in a casino (apart, maybe, from "don't"). Suppose there are just two machines; you have a few goes on them and one pays out more than the other. Do you stick with the one that has paid out most, or try the other one for a bit longer, or what? You may have had an unrepresentative lucky streak with the first machine, and playing the other longer might reveal its true colours; on the other hand (arm?), the initial results could be true indicators of future pay-offs, so switching may cost you. A simple algorithm is "stay on a winner, shift from a loser", and that
can be shown to be better than choosing at random. That is far from the whole story, however, and the book provides a fascinating account of how attacks on the bandit problem have developed. Such complications notwithstanding, the "multi-armed bandit" problem proves to have at least one very important and perhaps surprising application area: clinical trials. Establishing whether treatment A works better than treatment B involves carefully designed comparison experiments in which patients who have agreed to take part in a trial are allocated, typically at random, to one or other treatment. One crucial issue is how long to continue to allocate patients this way in a trial in which one treatment (say A) is beginning to show better results than the other. Stop allocating to B early and you may get a misleading result from the trial, and so risk giving an inferior treatment to future patients; stop late and you may have unnecessarily mistreated a number of trial patients. This looks like our "bandit" problem in a different guise. After much argument – lasting over a quarter of a century – the merits of looking at clinical trials in this way have gained ground, and variants of a "stay on a winner" strategy (rather than the conventional "stick with the original planned allocation" approach) are increasingly adopted in them.
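A minimal simulation (again a hypothetical sketch, with invented pay-out probabilities) shows the "stay on a winner, shift from a loser" rule edging out random choice over two machines:

```python
import random

def compare(prob_a=0.3, prob_b=0.6, pulls=1000, trials=500):
    """Compare 'stay on a winner, shift from a loser' with purely random
    machine choice, on two machines whose pay-out probabilities are
    unknown to the player."""
    def stay_on_winner():
        machine, wins = 0, 0
        probs = (prob_a, prob_b)
        for _ in range(pulls):
            won = random.random() < probs[machine]
            wins += won
            if not won:                # shift from a loser,
                machine = 1 - machine  # otherwise stay on the winner
        return wins

    def choose_at_random():
        return sum(random.random() < random.choice((prob_a, prob_b))
                   for _ in range(pulls))

    print("stay on a winner:", sum(stay_on_winner() for _ in range(trials)) / trials)
    print("choose at random:", sum(choose_at_random() for _ in range(trials)) / trials)

compare()
```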
So far we have looked at some problems and associated algorithms related to managing time pressures. What about problems related to managing a shortage of space? Christian and Griffiths offer some algorithmic insights into those too.
NOTICES OF EVICTION
Take, for instance, how to store your paperwork. Conventional thinking is to have some sort of alphabetical or subject classification. But this is not how computers store files. Computers have to juggle two limited resources: memory size and access speed. A large memory is generally a slow one, while a small memory (called a cache) can be accessed quickly. Obviously, computers should keep frequently needed items in small, fast-access memory stores and use large, slow-access stores for infrequently needed items. To decide how to do this they use a caching algorithm. For example, it could be "random eviction": adding new data to the fast memory, making room by moving other data to the slow store at random (surprisingly, not that bad an approach); or "first in, first out", moving out the oldest data from the cache; or "evict the least recently used", based on the principle that the thing least likely to be needed next is the one that has not been used for the longest time (and the thing most likely to be needed next is the last thing used). That last algorithm turns out often – and indeed provably – to be the best. Which suggests that it could usefully be applied to our own lives: "evicting the least recently used items" when decluttering our house, and having a "caching pile" for our documents, with the most recently used file simply being placed after use at the top of the pile, rather than in any alphabetical or other taxonomic arrangement. The most recently used files will therefore be the easiest to access when, as is most likely, they are soon needed again. A self-organising filing system – and no more guilt over that small mountain of paper on your desk!
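For the curious, a "least recently used" pile is only a few lines of code (an illustrative sketch, not from the book; the documents named below are invented):

```python
from collections import OrderedDict

class LRUCache:
    """A tiny cache that evicts the least recently used item when full."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()

    def use(self, key):
        """Record that `key` was just used; return the evicted key, if any."""
        if key in self.items:
            self.items.move_to_end(key)      # most recently used goes last
            return None
        self.items[key] = True
        if len(self.items) > self.capacity:
            evicted, _ = self.items.popitem(last=False)  # least recently used
            return evicted
        return None

pile = LRUCache(capacity=3)
for doc in ["tax", "insurance", "tax", "recipes", "letters"]:
    out = pile.use(doc)
    print(doc, "->", list(pile.items), "evicted:", out)
```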
LIMITS OF SPACE AND TIME
Christian and Griffiths cover many more issues – scheduling tasks, making predictions, handling communications and so on. A short piece like this cannot hope to cover the range of issues addressed in a 350-page book. But underlying them all is a fundamental fact: our lives are carried out with finite resources of time and space. So, for example, we always have to choose how much time to devote to any task. Computers face similar constraints, and much analytical work has been done on designing algorithms that allow computers to deal with them efficiently. As time has gone on, computer algorithms have got better at coping with real-world problems where information is missing or approximate answers will suffice – increasingly opening up possibilities for borrowing some of them in human decision-making. "Algorithms to Live By" shows that, even though life can be too messy to allow exact numerical solutions to many everyday problems, the insights gained from algorithmic approaches can help us understand them better and tackle them more effectively.
Dr Geoff Royston is a former president of the O.R. Society and a former chair of the UK Government Operational Research Service. He was head of strategic analysis and operational research in the Department of Health for England, where for almost two decades he was the professional lead for a large group of health analysts.
OR ESSENTIALS Series Editor: Simon J E Taylor, Reader in the Department of Computer Science at Brunel University, UK The OR Essentials series presents a unique cross-section of high quality research work fundamental to understanding contemporary issues and research across a range of operational research (OR) topics. It brings together some of the best research papers from the highly respected journals of The OR Society.
ACCESS THESE TITLES AT: http://www.palgrave.com/series/14725
Conference on Mathematics of Operational Research Innovating Mathematics of New Industrial Challenges
The OR Society and the IMA invite you to their new joint conference, exploring new mathematics underpinning operational research (OR) applications. This conference will be an exploration of that work and a celebration of how it is changing lives for the better.
Thursday 20 – Friday 21 April 2017, Aston University, Birmingham. The conference will host plenaries from leading international experts, talks, workshops, posters and other interactive sessions to draw together the considerable community of researchers and practitioners who develop new mathematics that is relevant to, and underpins, Operational Research applications.
To join the conference, submit papers or find out more:
www.theorsociety.com/MathsofOR and http://tinyurl.com/MathsofOR #MathsofOR
Plenary Speakers: Professor Jacek Gondzio (University of Edinburgh); Professor Edward Kaplan (Yale School of Management); Guillermo Ortega (European Space Agency); Professor Bert Zwart (CWI Amsterdam)
Lizzi Lake, Conference Officer, conferences@ima.org.uk, +44 (0) 1702 354 020, Institute of Mathematics and its Applications, Catherine Richards House, 16 Nelson Street, Southend-on-Sea, Essex, SS1 1EF