Impact Magazine Spring 2020


DRIVING IMPROVEMENT WITH OPERATIONAL RESEARCH AND DECISION ANALYTICS

SPRING 2020

IMPROVING BRITISH AIRWAYS’ BUSINESS WITH DATA SCIENCE AND AI The Data Science team drives smarter decisions across British Airways

SIMULATION BENEFITS EUROSTAR’S PASSENGERS

© British Airways

Smoothing the flow through stations of record numbers of travellers

O.R. HELPS MAKE PRISONS SAFER Analysis supports prison staff to make data-driven decisions




EDITORIAL

A feature of the annual conference of the OR Society is a good mix of academics and practitioners, enabling mutual sharing of insights and experiences. This issue contains three articles that arise out of presentations at the last conference, OR61, held at the University of Kent in September 2019. Another article from the conference, reporting work to support the Maritime Warfare Centre during the Exercise Joint Warrior 19-1 which took place in April last year, will appear in the Autumn issue.

The lead article in this issue tells us about work undertaken as part of a Knowledge Transfer Partnership between the University of Kent and Eurostar, to add new simulation and analytical capabilities to Eurostar’s planning framework and use them to help tackle key strategic and operational business challenges.

In each issue I include an article focussing on the work of an analytical group. Our cover photo is of a British Airways Boeing 787 undergoing planned maintenance at the airline’s Cardiff engineering base, relating to the article concerning the Data Science Team at British Airways.

The third article from OR61 is, in fact, the winner of the President’s Medal, awarded for the best submitted practical application of OR. The Ministry of Justice analysis supports those making a difference in the lives of people in custody, enabling them to make more data-driven decisions to rehabilitate and support offenders and keep the public safe.

I hope you enjoy reading all the reports of how O.R. and analytics continue to make an impact. Electronic copies of all issues are available at https://issuu.com/orsimpact. For future issues of this free magazine, please subscribe at http://www.getimpactmagazine.co.uk/.

This issue contains the last of Louise Maynard-Atem’s Data Series, and, indeed, the last of her columns. Thank you, Louise, for giving us the benefits of your perspectives on various aspects of data - how we use it, how we derive insights from it and how it is shaping our society. I am very grateful.

The OR Society is the trading name of the Operational Research Society, which is a registered charity and a company limited by guarantee.

Seymour House, 12 Edward Street, Birmingham, B1 2RX, UK Tel: + 44 (0)121 233 9300, Fax: + 44 (0)121 233 0321 Email: email@theorsociety.com Secretary and General Manager: Gavin Blackett President: Edmund Burke Editor: Graham Rand g.rand@lancaster.ac.uk Print ISSN: 2058-802X Online ISSN: 2058-8038 www.tandfonline.com/timp Published by Taylor & Francis, an Informa business All Taylor and Francis Group journals are printed on paper from renewable sources by accredited partners.

Graham Rand

OPERATIONAL RESEARCH AND DECISION ANALYTICS

Operational Research (O.R.) is the discipline of applying appropriate analytical methods to help those who run organisations make better decisions. It’s a ‘real world’ discipline with a focus on improving the complex systems and processes that underpin everyone’s daily life – O.R. is an improvement science.

For over 70 years, O.R. has focussed on supporting decision making in a wide range of organisations. It is a major contributor to the development of decision analytics, which has come to prominence because of the availability of big data. Work under the O.R. label continues, though some prefer names such as business analysis, decision analysis, analytics or management science. Whatever the name, O.R. analysts seek to work in partnership with managers and decision makers to achieve desirable outcomes that are informed and evidence-based.

As the world has become more complex, problems tougher to solve using gut-feel alone, and computers increasingly powerful, O.R. continues to develop new techniques to guide decision-making. The methods used are typically quantitative, tempered with problem structuring methods to resolve problems that have multiple stakeholders and conflicting objectives.

Impact aims to encourage further use of O.R. by demonstrating the value of these techniques in every kind of organisation – large and small, private and public, for-profit and not-for-profit. To find out more about how decision analytics could help your organisation make more informed decisions see www.scienceofbetter.co.uk. O.R. is the ‘science of better’.



CONTENTS

7

USING SIMULATION TO IMPROVE THE CUSTOMER EXPERIENCE AT EUROSTAR William Jones, Kathy Kotiadis, Maria Paola Scaparra and Jesse O’Hanley explain how simulation modelling of Eurostar’s London and Paris terminals has helped meet growth objectives and benefitted passengers

15

A VISION FOR DATA SCIENCE AT BRITISH AIRWAYS Stefan Jackson and John Tozer tell us about the many improvements British Airways’ Data Science team has brought to the business

21

OPERATIONAL RESEARCH IMPROVES BIOMANUFACTURING EFFICIENCY Tugce Martagan explains how the use of complex analytical models has significantly improved production outcomes at Merck’s Dutch Boxmeer facility, without an increase in investment

30

IN THE PIPELINE Brian Clegg tells of the work of Black and Veatch’s Paul Hart and colleagues in determining routes of water pipelines

35

UNDERSTANDING STRATEGIC PRIORITIES AT RNIB Stewart Williams explains how a Pro Bono project helped RNIB amend their strategy by understanding better the barriers blind and partially sighted people face to participating fully in society

43

HARNESSING OPERATIONAL DATA TO MAKE PRISONS SAFER Martine Wauben, Phil Macdent, Adam Booker and Ben Marshall report how their analysis supports those making a difference in the lives of people in custody by enabling them to make more data-driven decisions

4 Seen Elsewhere

Analytics making an impact

12 The Rise of the Data Translator

Louise Maynard-Atem interviews Ben Ludford to understand what the phrase Data Translator might mean for the world of data science

25 Discovering Heuristics

Jonathan Thompson explains how heuristics can give good results for scheduling, routing and packing problems

41 Universities making an impact

Brief report of a postgraduate student project

47 Think like a ….?

Geoff Royston considers whether analysts should think more like engineers and give greater emphasis to their own creative thinking and system-building skills

DISCLAIMER The Operational Research Society and our publisher Informa UK Limited, trading as Taylor & Francis Group, make every effort to ensure the accuracy of all the information (the “Content”) contained in our publications. However, the Operational Research Society and our publisher Informa UK Limited, trading as Taylor & Francis Group, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by the Operational Research Society or our publisher Informa UK Limited, trading as Taylor & Francis Group. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. The Operational Research Society and our publisher Informa UK Limited, trading as Taylor & Francis Group, shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions​

Reusing Articles in this Magazine

All content is published under a Creative Commons Attribution-NonCommercial-NoDerivatives License which permits noncommercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited, and is not altered, transformed, or built upon in any way.


SEEN ELSEWHERE

© Qlik

2020 VISION

Biologist Peter Turchin, after a career using sophisticated mathematics to show how the interactions of predators and prey produce oscillations in animal populations in the wild, found himself drawn to history: could the rise and fall of human societies be captured by variables and equations? Could calculating the patterns and cycles of the past lead to an objective version of history – and help prevent a looming crisis? Human societies go through predictable periods of growth, during which the population increases and prosperity rises. Then come equally predictable periods of decline. These “secular cycles” last two or three centuries and culminate in widespread unrest – from worker uprisings to revolution. In 2010 he wrote a letter to the scientific journal Nature warning that a coming decade of dazzling technological progress risked being unravelled by mounting global political instability. This turmoil, he predicted, is due to peak in the US and western Europe around 2020.
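Turchin’s ecological starting point, predator and prey interactions producing population oscillations, is classically captured by the Lotka–Volterra equations. The sketch below uses illustrative parameters chosen for this example, not Turchin’s actual models:

```python
# Lotka-Volterra predator-prey model, integrated with a simple Euler step.
# All parameters are illustrative defaults, not fitted to any real population.

def simulate(prey=10.0, pred=5.0, a=1.1, b=0.4, c=0.1, d=0.4,
             dt=0.001, steps=60000):
    """Return prey and predator trajectories over steps*dt time units."""
    xs, ys = [prey], [pred]
    for _ in range(steps):
        x, y = xs[-1], ys[-1]
        xs.append(x + dt * (a * x - b * x * y))   # prey: growth minus predation
        ys.append(y + dt * (c * x * y - d * y))   # predator: food minus mortality
    return xs, ys

prey, pred = simulate()
# Local maxima in the prey series indicate sustained oscillation.
peaks = sum(1 for i in range(1, len(prey) - 1)
            if prey[i - 1] < prey[i] > prey[i + 1])
print(f"prey peaks over the run: {peaks}")
```

Run long enough, the two series repeatedly rise and fall out of phase: the qualitative behaviour Turchin later went looking for in historical data.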

FREE E-BOOK ON BIG DATA

A new, and free, e-book provides many useful pointers and tips to “getting more out of your business intelligence solution by super charging it with big data”. A list of “10 simple keys” includes using an agile analytics environment that can meet the needs of every user, providing access to analytics solutions anywhere, on any device and implementing a scalable solution that grows with your organization’s changing needs. It shows how to choose the right method to access data and discover hidden connections in data. It can be downloaded at: bit.ly/downloadQLIK

4

IMPACT © THE OR SOCIETY

SPORTS GIANT BUYS ANALYTICS FIRM

Nike has acquired a company specialising in predictive analytics to improve its inventory management across different channels. The giant sports brand paid an undisclosed sum for MIT start-up Celect, which has a range of technologies across data science and software engineering environments. According to Eric Sprunk, Nike COO, the acquisition of Celect, with its consultancy connection to MIT scientists, will provide a predictive capability that is insight-driven, data-optimised and highly focused on customer behaviour. Read more at: bit.ly/CELECT

THE THIRD WAVE OF ANALYTICS

In a recent article published on diginomica, Barbara Darrow, a senior director of communications for Oracle, asserted that business analytics has entered its “third wave”. She believes the analytics industry is entering a stage which combines the advantages of the IT model with the convenience and immediacy of user self-service to satisfy both constituencies. This third stage offers central control to ensure compliance and accuracy while letting users run the reports they need when they need them. Analytics today is faster than ever before, especially when it is based on the right data, and this can help unlock new business insight quickly and thus new business opportunities faster as well.

She argues that this third wave makes full use of AI, Machine Learning (ML) and natural language processing (NLP). ML enables software to track user interactions and automate repetitive tasks, thus freeing users to focus on the jobs at hand. But AI is only as good as the data that is fed to it – the more [good data] you feed to it, the better it gets. NLP also makes it easy for employees who are not data scientists or, for that matter, technologists to query their data and generate reports.

Before this third wave of analytics, it was common within companies for analytical groups to use a “patchwork of one-off analytic tools” to derive some forms of insight. There was also no standardisation of data synchronisation. If datasets cannot be synchronised companywide, there will always be a risk of insight repetition and even misuse. What is needed to avoid such eventualities is centralisation of AI systems across all enterprise datasets. Oracle has produced an Interactive Business Analytics Assessment to enable organisations to carry out their own peer comparisons. As a starting point to understand more about this type of technology, see the following link: bit.ly/InteractiveBAA

MACHINE LEARNING IN SUPPLY CHAINS

Polly Mitchell-Guthrie, VP of Industry Outreach and Thought Leadership at Kinaxis, in her article, “How machine learning can heal a supply chain” in the September/October issue of Analytics magazine, said: “The trends shaping supply chain today are global trade and tariffs, automation and labour shortages, the digital supply chain, rising consumer expectations of speed, and yes, machine learning. Machine learning opportunities in supply chain are abundant – improving forecast accuracy, inspection of physical assets, improved modelling for new product introductions, predictive asset maintenance, and great visibility across the collaborative supply chain network.”

IS AI O.R.?

AI is O.R. was a claim made by not one but two plenary speakers at the OR Society conference in September 2019. John Hopes, President of the Society, drew attention to this in Inside OR (November 2019). He commented that the importance of AI to O.R. and vice versa has been a major theme of all Society events in 2019. For example, the Maths of O.R. conference showcased how leading O.R. research is right at the heart of machine learning, providing the advanced optimisation at the core of the technique. John said he had lost count of how many examples of the application of machine learning have been presented at Society events. When the technique is being applied to a management decision making problem, it very clearly falls within the remit of O.R. AI and its associated challenges are now a major area of focus for the OR Society.

LIVE CHATS CAN INCREASE SALES

Research published in the INFORMS journal Information Systems Research says live chats can actually increase sales and boost profits. The study looked at data from Alibaba on consumers’ purchase decisions of Apple and Samsung tablets. “We found live chat can increase purchase probability of tablets by 15.9%,” said Xue (Jane) Tan of Indiana University. “We see that human interaction results in better sales performance.” Jane Tan of Indiana University, Youwei Wang of Fudan University and Yong Tan of the University of Washington say the fact that sellers and buyers cannot speak in person, as in brick-and-mortar stores, leads to uncertainty about product quality and seller credibility. Live chat tools allow e-vendors to communicate with customers in real time. “Sellers with limited feedback benefit more from live chat conversations than sellers with a lot of feedback. And products with high past sales volume sell better after live chat, indicating a reinforcement effect,” continued Tan. “It is interesting that a seller can sell multiple products with varying levels of sales performance, and the seller feedback is measured based on all products.” The article can be seen at bit.ly/ChatPurchase

THE GUARDIAN VIEW ON MACHINE LEARNING

In a Guardian editorial in September 2019 (bit.ly/CleverML) concern was expressed that “it is in the nature of AI that makers do not, and often cannot, predict what their creations do”. The editorial concluded that “programmers do not understand completely the knowledge that intelligent computing acquires. If we did, we wouldn’t need computers to learn to learn. We’d know what they did and program the machines directly ourselves. They can recognise a face, a voice, be trained to judge people’s motivations or beat a computer game. But we cannot say exactly how. This is the genius and madness behind the technology. The promise of AI is that it will imbue machines with the ability to spot patterns from data, and make decisions faster and better than humans do. What happens if they make worse decisions faster? Governments need to pause and take stock of the societal repercussions of allowing machines over a few decades to replicate human skills that have been evolving for millions of years. But individuals and companies must take responsibility too”.

© wutzkohphoto/Shutterstock

BUSINESS ANALYTICS MATURITY

A study undertaken by Deloitte has found that most companies are not mature when it comes to business analytics. While 76% of survey respondents reported that their analytical maturity has increased over the past year, most are still using the traditional tools of spreadsheets (62%), such as Microsoft Excel, and business intelligence programs (58%), such as Microsoft Power BI or IBM Cognos. Analytics and Artificial Intelligence (AI) were not held in high esteem by business executives, although some 46% of executives saw AI as an important initiative over the coming years. Thomas Davenport and his study co-authors think that data analytics is something everyone needs to build into their jobs. They argue that it is time to eliminate the idea that highly skilled mathematicians or data scientists are the only ones responsible for business analytics, time to spread accountability broadly and train all employees about the role of analytics and AI in their respective jobs, and not just rely upon spreadsheets for gaining insight. More at: bit.ly/SpreadsheetInsights

IMPACT | SPRING 2020

5

70 YEARS OF THE JOURNAL OF THE OPERATIONAL RESEARCH SOCIETY

In 1950, the first volume of the Journal of the Operational Research Society, called at the time the OR Quarterly, was published by the Operational Research Club, as the Society was then known. The issues were very slim compared to today’s journal, with only one, short, article in each issue. In the second issue the activities of the Shirley Institute (The British Cotton Industry Research Association) were reported. The author, L. H. C. Tippett, said that the activities formed “a continuous spectrum from pure research at one extreme, through applied research to something that, in the form of special services to member firms, verges on operations. What part of this spectrum is to be termed operational research, or indeed if any part of it comes directly enough into contact with operations to be so termed, is a moot point”. The difficulty of defining O.R. is still a live issue!

He continues: “The distinguishing feature of these activities is that they are largely concerned with collecting, analysing and interpreting data of actual operations rather than of experiments made at the Institute”. In his summary, Tippett comments that “the work does not make very much use of elaborate statistical or mathematical methods. Tests of statistical significance and simple analyses of variance are occasionally used, especially in the methodological studies, but mostly simple tabulations and averagings provide the results. The mainstay of the work is the analytical detail in which it is done, and the background of technical knowledge with which it is guided”. That was typical of O.R. at the time, but O.R. has come a long way during the last 70 years.

AI, MACHINE LEARNING AND DATA ANALYTICS IN 2020

In an article in Analytics Magazine (bit.ly/5AIpredictions) Genpact Chief Digital Officer Sanjay Srivastava offered his top five predictions for AI, machine learning and data analytics for 2020. These included “Human in the loop”, the increasing value of judgment and reskilling. Sanjay believes that humans will play a critical role in the last mile of AI and data analytics. While machines predict and analyze, humans are needed for their judgment, empathy and creative problem-solving. In 2020, he sees the value of data decreasing while the value of human judgment increases.

AI DISCOVERS NEW ANTIBIOTIC

In a Guardian article, see bit.ly/antibioticML, Ian Sample reported that, for the first time, a powerful antibiotic has been discovered using machine learning. The research team at MIT says that halicin kills some of the world’s most dangerous drug-resistant bacteria. “In terms of antibiotic discovery, this is absolutely a first,” said Regina Barzilay, a senior researcher on the project and specialist in machine learning at MIT. “I think this is one of the more powerful antibiotics that has been discovered to date,” added James Collins, a bioengineer on the team at MIT. “It has remarkable activity against a broad range of antibiotic-resistant pathogens.”

The researchers first trained a “deep learning” algorithm to identify the sorts of molecules that kill bacteria. To do this, they created a database of the atomic and molecular features of nearly 2,500 drugs and natural compounds, and how well or not the substance blocked the growth of the bug E coli. This enabled the algorithm to learn what molecular features made for good antibiotics. Then it was set to work on a library of more than 6,000 compounds under investigation for treating various human diseases. The algorithm focused on compounds that looked effective but unlike existing antibiotics. This boosted the chances that the drugs would work in ways to which bacteria have yet to develop resistance.

The team then moved on to a massive digital database of about 1.5bn compounds and set the algorithm working on 107m of these. Three days later, a shortlist of 23 potential antibiotics was available, of which two appear to be particularly potent.
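The screening pipeline described, train a model on compounds with known antibacterial activity and then rank a large unscreened library by predicted activity, can be caricatured in a few dozen lines. Everything below is a synthetic stand-in: toy features rather than real molecular fingerprints, and plain logistic regression rather than the MIT team’s deep-learning model:

```python
# Caricature of the screening pipeline: train a classifier on compounds with
# known antibacterial activity, then rank an unscreened library by predicted
# activity. Features and data here are synthetic toys, not real chemistry.
import math
import random

def train_logistic(X, y, lr=0.5, epochs=200):
    """Plain stochastic-gradient logistic regression (no libraries)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1 / (1 + math.exp(-z))          # predicted activity probability
            g = p - yi                          # gradient of the log loss
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def score(w, b, x):
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1 / (1 + math.exp(-z))

random.seed(0)
# Synthetic "training set": activity depends on the first feature only.
train_X = [[random.random(), random.random()] for _ in range(200)]
train_y = [1 if x[0] > 0.6 else 0 for x in train_X]
w, b = train_logistic(train_X, train_y)

# "Screening library": rank unseen compounds and keep a shortlist of top scorers.
library = [[random.random(), random.random()] for _ in range(1000)]
shortlist = sorted(library, key=lambda x: score(w, b, x), reverse=True)[:23]
```

The real work lies in the feature representation and model class; the ranking step itself, score everything and keep the top of the list, is exactly this simple.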


USING SIMULATION TO IMPROVE THE CUSTOMER EXPERIENCE AT EUROSTAR

WILLIAM JONES, KATHY KOTIADIS, MARIA PAOLA SCAPARRA AND JESSE O’HANLEY

EUROSTAR HAS LED THE WAY IN CROSS CHANNEL TRAVEL for the last 25 years. It is the only high-speed railway company operating international train services between London and continental Europe via the Channel Tunnel. One of its main appeals is that passengers can board trains in central London’s St Pancras International (SPI) station and reach either Paris Gare du Nord (GdN) or Brussels Midi, in the heart of the French and Belgian capitals, respectively, in just over two hours.

IMPACT © 2020 THE AUTHORS

7


The popularity of the most recently launched service between London and Amsterdam shows the growing appetite among customers for international high-speed rail travel as a sustainable alternative to air travel. In addition to Paris, Brussels and Amsterdam, Eurostar further runs services to Lille, Disneyland Paris and seasonally to the south of France and French Alps.

The popularity of Eurostar’s services is, of course, an enormous boon to the company. However, this success has brought its own set of challenges. As an international service, passengers are required to pass through security screening and border controls (similar to airports) before boarding trains. Due to the growth in demand since services began in 1994, these checks became, at times, a bottleneck for passengers, occasionally resulting in queues. Terminal throughput, in turn, has constrained the number of services that can be operated and the number of tickets that can be sold.

What is more, scheduling of rolling stock to carry upwards of 11 million passengers per year is no easy task. Travelling at 300km per hour back and forth to Europe each day invariably causes wear and tear on trains, which necessitates that they undergo regular maintenance. Operating across several countries and sharing infrastructure with multiple other operators while managing planned and unforeseen maintenance and coping with daily events that cause disruption to the schedule, Eurostar faces the highly complex challenge of ensuring trains are in the stations and ready to depart on time. Needless to say, this requires extensive and holistic forward planning on Eurostar’s part and the design of flexible operating procedures that are able to adapt to ever-changing conditions.


scheduling of rolling stock to carry upwards of 11 million passengers per year is no easy task

DEVELOPMENT OF STATION AND FLEET-LEVEL DIGITAL TWINS

Starting in 2018, Eurostar and University of Kent Business School began a 2-year Knowledge Transfer Partnership (KTP) funded by Innovate UK. The aim of the KTP was to add new simulation and analytical capabilities to Eurostar’s planning framework and use them to help tackle key strategic and operational business challenges.

Eurostar can best be thought of as a complex system of systems. The cumulative efforts of several systems operating seemingly autonomously, including each station across Europe and the UK, individual trains, maintenance depots and the control room, must interact together seamlessly in order for the service to run smoothly. Inefficiency or disruption within any one entity has ripple effects throughout the wider system.

Initial modelling efforts focussed on the London and Paris terminals, where passenger throughput is highest and improving customer experience is a top priority. Each terminal is a very complex system in its own right and the causes of queues are not immediately obvious. Speak to any customer service team member who welcomes passengers as they arrive and he/she will tell you it is not just the number of travellers but the profile of a traveller that causes throughput to slow. The team in London will point to the increased time required to process a family group going on a skiing holiday compared to that of an individual business traveller off to a morning meeting in Brussels as evidence.

Working closely with frontline staff and terminal managers, detailed simulation models of SPI and GdN were developed using AnyLogic. Figures 1 and 2 show the resulting 3D visualisations of SPI and GdN respectively. In SPI passengers check in on the ground floor and proceed to the 1st floor via the escalators when their train is ready to board. In GdN, Eurostar’s border controls are on a raised mezzanine layer. Passengers proceed to the departure area and then return to the ground floor via escalators to board the trains. The simulation models combined both a discrete event type framework to represent the processing of passengers through a series of checkpoints as well as agent-based features to describe passenger profiles, their movements around the terminal, and interactions with each other.

FIGURE 1 3D VISUALISATION IN ANYLOGIC OF ST PANCRAS INTERNATIONAL DEPARTURE AREA

FIGURE 2 3D VISUALISATION IN ANYLOGIC OF PARIS GARE DU NORD

Ahead of any day, Eurostar can use the simulation, driven by data from their passenger bookings, to anticipate where and when bottlenecks are likely to occur in a station. That information can, in turn, be used to plan the availability and positioning of staff in the station and further adjust tensator (the retractable belt barriers on movable stanchions often seen in airports, busy department stores at check-out or anywhere there are queues) queue positions to ensure the processing of travellers can keep up with the speed of arrival. It can also be used to plan how to manage passengers in the event of foreseen or unforeseen disruption and to inform passengers how much time in advance of their service departure they need to arrive in order to check-in, clear screening and border control, and board their train. The use of simulation has been particularly useful to Eurostar by giving station staff a much better understanding of the key triggers that create queues and, in turn, helping them adopt a much more proactive and less reactive approach to managing queues in response to varying arrival and passenger profile patterns.

Besides day-to-day management, one of the more important benefits of the station simulation models has been to inform strategic level planning for addressing Eurostar’s longer-term growth objectives. A case in point was the in-depth testing of a range of proposed changes to the border control layout at GdN. Simulation helped to identify how best to position new border control stations and reorganise queue lanes in order to increase processing rates and smooth passenger flows through the station. The design ultimately adopted in May 2019 through iterative use of simulation and consultation with various companies, governments, and third-party stakeholders resulted in significantly improved throughput (c. 20%) and customer satisfaction throughout the summer peak period compared to the previous year.
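The hybrid idea behind the station models, discrete-event processing of passengers through checkpoints combined with agent-like passenger profiles that change service times, can be illustrated with a minimal sketch. All figures here (arrival rate, desk count, per-profile service times) are invented for illustration and have no relation to Eurostar’s data or the AnyLogic models:

```python
# Minimal discrete-event sketch of a single checkpoint with passenger
# "profiles" (an agent-like attribute that changes processing time).
# Arrival and service figures are invented for illustration, not Eurostar data.
import heapq
import random

def simulate_checkpoint(n_passengers=500, n_desks=4, seed=1):
    rng = random.Random(seed)
    service = {"business": 0.5, "family": 2.0}   # minutes per passenger (assumed)
    # Pre-generate arrivals: (arrival time, profile).
    t = 0.0
    arrivals = []
    for _ in range(n_passengers):
        t += rng.expovariate(2.0)                # roughly two arrivals per minute
        arrivals.append((t, rng.choice(["business", "family"])))
    desk_free = [0.0] * n_desks                  # when each desk next becomes idle
    heapq.heapify(desk_free)
    waits = []
    for arrive, profile in arrivals:
        free = heapq.heappop(desk_free)          # earliest-available desk
        start = max(arrive, free)
        waits.append(start - arrive)
        heapq.heappush(desk_free, start + service[profile])
    return waits

waits = simulate_checkpoint()
print(f"mean wait: {sum(waits) / len(waits):.2f} min, max wait: {max(waits):.2f} min")
```

Re-running with a different profile mix or desk count is how a model like this answers the staff-positioning questions the article describes: the queue statistics, not the individual passengers, are the output of interest.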

Eurostar can use the simulation to anticipate where and when bottlenecks are likely to occur in a station

At SPI, simulation has helped Eurostar to evaluate the merits of proposed plans to repurpose a portion of the space used for arrivals into a new departure area. Simulation experiments provided a firm evidence base that this proposal could indeed increase departure throughput at peak times over the week and the year, without adversely affecting arrivals. Eurostar has since implemented the new departure area and, as a result, seen its highest ever throughput and customer satisfaction scores over the peak summer travel period. Despite the improvements, sporadic disruption on the rail network always risks spoiling the station teams' best efforts. To mitigate this, a second phase of the KTP project explored how simulation might be used to improve the robustness of timetables by modelling the movement and control of trains. Given the distributed nature of Eurostar's operation, an agent-based modelling perspective was deemed the most suitable way to represent the different interacting components of the business involved in train movement and control, including individual stations, maintenance depots, the main control room and the trains themselves. Here, an agent-based architecture was implemented with individual agents defined for each sub-system being modelled (i.e., station, depot, control room, train). The interactions of the different agents, in turn, produce emergent behaviours that effectively mimic how the overall system behaves in practice. As with the station-level simulation, the fleet-level simulation model adopts hybrid features, in which discrete-event process models, contained within some agents, capture the individual actions and processes of that agent. Other agents simply record data. Figure 3 shows a highly simplified illustration of the developed fleet-level simulation model structure.

FIGURE 3 A HIGHLY SIMPLIFIED ILLUSTRATION OF THE DEVELOPED FLEET-LEVEL SIMULATION MODEL STRUCTURE

The fleet-level model allows Eurostar to simulate a whole day's schedule based on the set of trains available. Simulation runs can indicate which trains are most at risk of delay and identify the source of those delays, which subsequently allows managers to propose preventative actions. This may be as simple as notifying a particular station that it is likely to be stretched during a certain period, with several fast 'turnarounds' required to get incoming trains heading out again on time. In a more extreme case, such as when bad weather has led to significant unforeseen maintenance, resulting in a shortage of trains, preventive action involves notifying passengers that their train has been cancelled and re-allocating them to other services. Although very much a last resort, this is much better done a day in advance, rather than after passengers have arrived and are waiting in the station. For longer-term planning, the model can even be used to compare and rank proposed timetables.
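To make the agent-based idea concrete, here is a toy sketch in the same spirit: station and train objects interact through stochastic turnaround times, and repeated replications of a day's schedule estimate which departures are most at risk of delay. All names, times and distributions are invented for illustration; this is not Eurostar's model.

```python
import random
from dataclasses import dataclass, field

# Toy agent-based sketch: stations and trains are the agents; one day's
# schedule is replayed many times with random turnaround delays to estimate
# which departures are most at risk of leaving late.

random.seed(42)

@dataclass
class Station:
    name: str
    mean_turnaround: float  # minutes needed to turn an incoming train around

    def turnaround_time(self) -> float:
        # Stochastic turnaround: exponential noise on top of a fixed minimum.
        return self.mean_turnaround + random.expovariate(1 / 10)

@dataclass
class Train:
    name: str
    legs: list  # (station, scheduled_departure_in_minutes) pairs
    delays: list = field(default_factory=list)

def simulate_day(trains):
    """Replay one day's schedule, propagating turnaround delay leg to leg."""
    for train in trains:
        clock = 0.0
        for station, sched_dep in train.legs:
            # Turnaround starts when the train arrives, at the earliest
            # 30 minutes before the scheduled departure.
            ready = max(clock, sched_dep - 30) + station.turnaround_time()
            actual_dep = max(sched_dep, ready)
            train.delays.append(actual_dep - sched_dep)
            clock = actual_dep + 150  # illustrative journey time to next leg

paris = Station("Paris Nord", 25)
london = Station("London St Pancras", 20)

trains = [
    Train("Train A", [(london, 9 * 60), (paris, 13 * 60)]),
    Train("Train B", [(paris, 10 * 60), (london, 14 * 60)]),
]

for _ in range(1000):  # many replications expose the at-risk departures
    simulate_day(trains)

for t in trains:
    avg = sum(t.delays) / len(t.delays)
    print(f"{t.name}: average departure delay {avg:.1f} min")
```

Replication is the key design choice: a single simulated day says little, but averaging over many days with different random turnarounds ranks departures by risk, much as the article describes.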

EVIDENCE-BASED DECISION MAKING

As well as a model with which to run and test out different scenarios, undertaking a simulation study brings several ancillary benefits. Working with key stakeholders to develop a model requires the group to agree and articulate their understanding of the impact of certain actions on the system. This logic can then be tested in the simulation and received wisdom challenged, perhaps leading to revised understanding. Further, the process requires a thorough exploration of the available data. This can highlight gaps that need to be addressed by new data collection programmes, and form the basis for improved monitoring of an organisation's performance. Above all, for Eurostar, the simulation study has provided a sound and objective evidence base for decision making in situations where the best choice is not obvious. As the organisation continues to grow, simulation and other analytic techniques will play a key role in ensuring decisions are based on the best available information. This will help ensure the smooth flow of passengers through stations, give confidence that they will cross the Channel as scheduled, and keep the Eurostar brand strong and the enthusiasm for fast, efficient, low-carbon travel growing.

for Eurostar, the simulation study has provided a sound and objective evidence base for decision making in situations where the best choice is not obvious

Philippe Mouly, Chief Operations Officer: “We have been able to take full advantage of the improvements to the station layout that we made during the spring. This is the outcome of intensive work, mixing design thinking, staff involvement, and simulations. The result is very positive. In June, July and August we managed to accommodate through Paris Nord a record number of customers, with a record low number of delays and congestion. Consequently, customer satisfaction is at a high level, the Net Promoter Score being more than 10 points higher than last year on average for the last 3 months.”

William Jones was a Knowledge Transfer Partnership (KTP) Associate at Kent Business School, University of Kent. He worked with Eurostar International Limited to develop simulation and optimisation techniques to tackle the organisation's key challenges and support long-term strategic aims. He is now a postdoc at the University of Sydney, working in the area of robotics and automation.

Kathy Kotiadis is a Reader (Associate Professor) in Management Science/Operational Research at Kent Business School, University of Kent. Her main research interests include discrete-event simulation modelling applied to health care, and simulation model development using problem structuring methods.

Maria Paola Scaparra is the head of the Management Science group at Kent Business School and a member of the Centre for Logistics and Heuristic Optimisation.

Jesse O'Hanley is a Professor of Environmental Systems Management at Kent Business School, University of Kent, and a member of the Centre for Logistics and Heuristic Optimisation. He is the winner of the 2015 EURO Excellence in Practice Award for work on optimising river connectivity restoration.



THE RISE OF THE DATA TRANSLATOR
Louise Maynard-Atem

In this fourth and final part of the Data Series, I wanted to look to the future and try and get a view on what the data landscape is going to look like over the course of this new decade. For that, I posed some questions to my former colleague, Ben Ludford at Efficio Digital, who describes himself as a Data Translator, to understand what this new buzz phrase might mean for the world of data and data science.

WHAT EXACTLY IS A DATA TRANSLATOR? AND WHAT ARE THE KEY SKILLS REQUIRED TO BE ONE?

A data translator is someone who can bridge the gap in expertise between technical teams, made up of data scientists, data engineers and software developers, and business stakeholders. Their primary objective is to unlock the value of investment in technical resource and teams, by maximising the impact of their projects over the entire project lifecycle. This could involve (but isn't necessarily limited to) the following steps:

1. Engaging the business functions to identify and prioritise projects in which data science could be beneficial
2. Defining the detailed business question/problem
3. Identifying the data that could be used within the project
4. Stakeholder management
5. Ensuring the model output will be understood and useable by users
6. Validating the model results against real life
7. Developing the story and recommendations based on insights
8. Driving ongoing adoption within the business

They are ultimately responsible for the success of data science and analytics projects. To deliver on this objective, they must be highly skilled in the softer, non-technical areas like communication, domain expertise, interviewing/questioning, facilitation and project/stakeholder management. There are also the less teachable skills, such as an entrepreneurial mindset, creativity and adaptability. They also require a working understanding of what quantitative models exist, their applicability and what the output really means.

IMPACT © THE OR SOCIETY

IF DATA TRANSLATORS HAVE BOTH THE BUSINESS AND TECHNICAL UNDERSTANDING, HOW ARE THEY DIFFERENT TO DATA SCIENTISTS?

A typical explanation of what a data scientist is would be something like 'an interdisciplinary individual with expertise in statistics, computer science and an understanding of the applications and domain they are working in' (see Figure 1). Under that kind of definition, I would say there are two big differences between the roles:

1. A translator does not need deep expertise in programming or modelling (but it certainly wouldn't hurt)
2. Data scientists are often not defined as needing the translator's soft skills

Translators do not do the hardcore model development and data handling that data scientists most enjoy. Some would argue that data scientists also have these soft skills. I do not disagree, but I would say that the activities associated with these soft skills are usually in areas that are not the core focus for data scientists. Therefore, the two can complement each other nicely, in that a translator can take away the work that data scientists do not revel in, allowing them to focus on the things they really enjoy. To summarise, I would describe a translator as a soft skills specialist for data science.


FIGURE 1 THE INTERDISCIPLINARY NATURE OF THE MODERN-DAY DATA SCIENTIST

WE’VE ALL HEARD THE STATS ABOUT HOW MANY DATA SCIENCE PROJECTS FAIL. HOW CAN THE DATA TRANSLATOR ROLE IMPACT AND IMPROVE THOSE STATS?

When I hear those statistics, it always makes me question what the definition of failure actually is. Data science projects, by their very nature, apply cutting-edge analytical techniques to untapped and, most likely, previously unexplored data. This could result in the finding that the data does not contain sufficient predictive qualities to address the business question or need. This is not necessarily a failure; it has helped you learn and probably throws up secondary questions. This is one example of how a translator could help. In a brand-new area, the first objective for a data science project should not be to enable the business to predict an event using a dataset or system. It should be to answer the question: does this dataset contain the right information to attempt to predict an event? The next question will then be whether the event can be predicted. Answering each question will result in further questions to answer. And answering a question with 'no' is not, in my opinion, a failure.

That being said, projects do fail. Reasons vary, so to summarise, here are ten (https://caserta.com/data-blog/reasons-why-data-projects-fail/):

1. Insufficient, incorrect or conflicting data
2. Failure to understand the real business problem
3. Misapplication of the model
4. Solving a problem no-one cares about
5. Poor communication of results
6. Change is not handled well
7. Unrealistic expectations from stakeholders
8. Poor project management
9. Excessive focus on the model over the problem
10. Lack of empathy

Now, aside from number 3, none of these issues is down to a lack of core data science capability. They are down to a lack of focus on everything else that can make a project a success in the real world. It is the soft skills I have mentioned that are behind this. So, I absolutely believe that having translators on data science teams will ultimately have a positive impact, by changing the way projects are delivered and by closely managing those areas and interactions that existing teams do not invest in enough. Translators plug the gap that exists between data science teams and the rest of the business. They enable data scientists to do their best work, manage the expectations of wider stakeholders and drive better outcomes for the business overall.

WHAT DOES A PERFECT DATA SCIENCE TEAM LOOK LIKE IN YOUR MIND’S EYE?

I don't think there's a perfect team, as it completely depends on the environment in which they operate. But in any given context, two things help describe what the optimal data science team looks like. Firstly, they have a clear definition of what their capability needs to be and what will be filled from other parts of the organisation. For example, if an organisation has a Chief Data Officer with a team dedicated to unlocking access to data, and documentation about data sources, then you do not need to replicate this capability. Secondly, the team should attempt to cover the full spectrum of skills missing elsewhere. The team needs to have the full set of capabilities; however, this can be made up of a few subsets of different skillsets.

WHAT ARE THE BIGGEST OPPORTUNITIES AND THREATS THAT YOU THINK THE INDUSTRY WILL FACE OVER THE NEXT DECADE?

As a threat, it is lack of skills that always comes to mind. However, that is partly driven by the high expectations placed on the range of capabilities that those in data should have. For me, I think that as the industry grows, there will be more specialisation in roles, and that will help to draw in more people from the edges of the data space, to help meet demand for data science. Related to this threat is the proliferation of 'data science-as-a-service' tools. These tools will likely be the biggest contributor to meeting the demand for data science skills. But the thing that concerns me is that I can see a future where anyone can create, build and deploy a model where the underlying assumptions have not been considered and the input data not really tested. And that brings us to the topic on everyone's lips at the moment: ethics. Data science is so open to ethical issues from people not truly thinking through the implications of the models they build and how the output might be used in unintended ways. This needs to be handled properly for every model, or the field as a whole risks irreparable reputational damage.

As an opportunity, I think that, while specialist skills will continue to be hard to find, there will be an overall increase in data literacy within the workforce. This closes the communication gap between data science and stakeholders and will only have a positive effect on outcomes. Data Science is not going anywhere anytime soon. It will continue to grow. Those who work within data science, or are customers of it, need to understand the risks and make sure that what they're delivering is fit for purpose.

So there we have it: the future (or at least one version of it) is translation. I'm really keen to get your thoughts on this, and I'm sure Ben would be too, so please feel free to reach out to either of us directly on LinkedIn.


Ben Ludford is a Manager at Efficio Digital, a fast-growing specialist consultancy. He focusses on aligning new products and services in the data space with clients' needs. His background is in Operational Research, having studied it at MSc level at Lancaster University Management School and then worked in the field for several years.

Louise Maynard-Atem is an innovation specialist in the Data Exchange team at Experian. She is the co-founder of the Corporate Innovation Forum and is an advocate for STEM activities, volunteering with the STEMettes and The Access Project.


© British Airways

A VISION FOR DATA SCIENCE AT BRITISH AIRWAYS
STEFAN JACKSON AND JOHN TOZER

BRITISH AIRWAYS HAS A HUGE AND COMPLEX OPERATION, centred at Heathrow, connecting hundreds of destinations worldwide. Although capacity is constrained at our main hub, we are gradually expanding our network to keep up with demand. We are proud of our punctuality, the main driver of customer satisfaction. In 2019 we celebrated British Airways' centenary – 100 years of connecting Britain with the world and the world with Britain – and collected the CAPA award for Airline of the Year, as well as two prestigious Superbrand titles as the top British brand for consumers and business. But none of this allows us to stand still as a business. The way our customers choose and consume their air travel is changing. They can no longer be stereotyped as 'business up front, leisure down the back', as they mix and match premium and economy travel for many different reasons. In this market we must provide quality, choice and value for everyone.

IMPACT © 2020 THE AUTHORS



Customers expect more when things go wrong – a quick resolution and compensation. British Airways is very aware of its environmental responsibilities, having recently committed to a target of Net Zero greenhouse gas emissions by 2050. This multi-faceted problem requires a multi-faceted response that will mean changes to our aircraft fleet, and making more and more choices where sustainability is the overriding consideration. From the start of 2020 we are offsetting all our domestic flights, and customers travelling further afield can choose to offset their individual emissions with our carbon calculator. We are also investing in the development of sustainable aviation fuel with our colleagues at Velocys and Shell, and are the first airline in Europe to build, with our partners, a waste-to-fuels plant. Our ability to respond to change depends entirely on the decisions made by our colleagues and management in the air, on the ground and at Head Office.

DATA SCIENCE AND ANALYTICS IN BRITISH AIRWAYS

In the Central Analytics department, we like to say that the company’s success is equal to the sum of its decisions. For the 70 years since the original Operational Research Team was formed in 1949, we have delivered data and analysis based on that equation; and our department has gone from strength to strength. The current Central Analytics department comprises around 80 data scientists, analysts and performance specialists, whose remit is to drive smarter decisions across British Airways.


The analytics teams see it as their responsibility to adapt new techniques and technologies to the airline's benefit. Two years ago, we set up our Data Science and Innovation team to pioneer new techniques and Artificial Intelligence. We believe AI will transform the airline industry, and data is key to AI. The team is a core group of data scientists and data engineers who drive innovations, proactively looking for new ways to help our business. We have a mandate and funding to innovate, and we invest in our people and in the tools we use. Part of our work is pure innovation or exploration of data and methods, but an equally important part is finding ways to deploy the new methods within British Airways' many long-established processes and systems.

We have looked at the opportunities for Data Science in each part of British Airways. We have thought about the key business challenges, but also looked at other companies that successfully apply Data Science to their business processes and customer propositions. From this research we have created several themes and prioritised them. At the same time, we have reviewed our analytics architecture of IT and techniques, and identified where we can improve it. We summarise the vision as "Bringing contemporary and edge analytics into the business". Bringing a new approach means we are defining the requirements for infrastructure, tools, and working practices that will more easily manage data, get models into test-and-learn frameworks, and deploy the finished products into operational processes for the long term. There are some key components to this. First, an appropriate cloud infrastructure to process data at scale and aid deployment. Second, a DevOps way of working: this is a set of practices that combines software development (Dev) and IT operations (Ops) to shorten the product development life cycle while delivering features, fixes, and updates frequently, in close alignment with business objectives. And third, the technology building blocks that will make analysis and analytics reporting slicker. To deliver all this we work in close partnership with British Airways' Data team, which manages our databases and data tools.

Data Science and AI are bringing, or may soon bring, improvements to many parts of our business. In the following paragraphs we look at some of the areas that we are working in and investigating.

Analytics and forecasting have been fundamental to improving levels of customer satisfaction for our Buy on Board catering

Planning the operation

A successful airline operation depends on successful planning. Understanding and forecasting changes to plan is a critical analytics role that supports making the right day-to-day decisions. The decisions in turn lead to actions that minimise disruption and inconvenience to customers, aircraft and crew, and ensure we meet our commercial goals. We want to apply more sophisticated methods as inputs to decisions, complementing or upgrading the existing models. And before we can exploit the latest techniques we need to test and review their performance. As well as Data Science methods, we are investing in new computer simulation software to experiment with operational and planning decisions. The aim is always to improve the service we offer our customers and ensure their flights depart on time.

AN ELECTRIC MOTOTOK TUG PUSHES BACK AN AIRCRAFT AT HEATHROW T5 (© Stuart Bailey)

Analytics and forecasting have been fundamental to improving levels of customer satisfaction for our Buy on Board catering. We have designed and improved machine learning algorithms to adjust the loading of fresh food, and integrated these into the automated supply chain systems. We have also introduced anomaly detection, to identify and rectify potential loading errors. We have processes to constantly monitor customer satisfaction and regularly review our product range, including the use of text analytics to analyse customer feedback.

With tens of thousands of flying and ground colleagues, we can encounter complex and dynamic people scheduling issues. There is a need to design and implement shift patterns (rotas) that efficiently match rostered hours to the workload or business demand, subject to operational, contractual and social constraints. One particular problem is to decide how many flying crew to have on standby. The requirement changes depending on a variety of factors – time of year, sickness rates, number of flights, and others. Here we have opportunities to test, learn and deploy analytics, especially forecasting and optimisation techniques.

Flying sustainably

British Airways is committed to achieving Net Zero greenhouse gas emissions by 2050, with a number of milestones to be reached over the next thirty years. A large percentage of the target will be met by replacing older aircraft with newer, more fuel-efficient ones.

A Data Science project looked at optimising the charging regime of electric tug batteries using telematics data

In making decisions on our future aircraft fleet, Data Science, forecasting and optimisation techniques could bring underlying improvements to the existing models, for example improvements to the prediction of demand for individual flights and total market demand. The company is also committed to reducing its carbon footprint through the development of new, greener aviation fuels, and through introducing more electric vehicles in airport operations. A recent Data Science project looked at optimising the charging regime of electric tug batteries using telematics data.

Staying safe

Safety is always the airline's top priority, and British Airways has a programme of scheduled maintenance for every aircraft, with a highly skilled 24-hour Engineering team to handle any issues. In addition to this programme, the Data Science team has introduced predictive maintenance techniques to help determine the condition of in-service equipment and provide smarter data to predict the optimum window for minor maintenance to be performed. This is known as condition-based maintenance, carried out on the basis of very high-quality estimates of the lifespan of a piece of machinery. This approach to part of our maintenance programme promises increased efficiency over routine or time-based preventive maintenance, because tasks are performed only when they are warranted. One approach we have taken is to use readings from airframe sensors to identify anomalous patterns in the data, either during the flight or in average readings from flight to flight.

In the wider business, risk management is the identification, evaluation, and prioritisation of risks, followed by the right application of resources to minimise, monitor, and control the probability or impact of adverse events. The opportunities for Data Science and AI include using new data to identify risks (Internet of Things, telematics data) and improving the intake and storage of existing data in our data warehouses and data lakes.
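As an illustration of the flight-to-flight idea, the sketch below flags flights whose average sensor reading is a statistical outlier against the rest of the fleet. The data, the z-score method and the threshold are all invented for the example; British Airways' actual techniques are not described in this article.

```python
import statistics

# Illustrative anomaly detection: flag flights whose average sensor reading
# drifts from the norm. Data and threshold are made up for the sketch.

def flag_anomalies(readings, z_threshold=2.5):
    """Return indices of flights whose mean reading is a statistical outlier."""
    mean = statistics.fmean(readings)
    sd = statistics.stdev(readings)
    return [i for i, r in enumerate(readings)
            if sd > 0 and abs(r - mean) / sd > z_threshold]

# Average vibration reading per flight for one component (made-up numbers).
flights = [0.52, 0.49, 0.51, 0.50, 0.53, 0.48, 0.50, 0.95, 0.51, 0.49]

print(flag_anomalies(flights))  # → [7], the flight with the 0.95 reading
```

A real system would compare against a rolling per-aircraft baseline rather than a single batch, but the principle — score each flight against the population and investigate the outliers — is the same.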

Offering the right product at the right price

British Airways has a corporate programme to bring a step change in the way we price add-ons to the basic flight product, such as extra baggage or a selected seat. One part of the programme is introducing new mechanisms and systems to price these ancillary products appropriately for our customers. As we better understand our customers' needs, we are building models that allow seats to be dynamically priced based on the features of the booking and flight. We are also looking to make an ambitious step change in the capability and structure of our flight revenue management system. We aim to develop new models and capability that exploit Machine Learning within any new system developed. We want to understand and relate to our customers as individuals. One important element of Customer Relationship Management and one-to-one personalisation is creating and using new customer models. These models often make use of external data sources or deploy data science techniques. More specifically, we need to employ new methods like machine learning to refine established models. Two examples are: the algorithm that estimates whether a booking is for business or leisure travel; and the recommender engines that recommend destinations to customers based on their travel and browsing histories.

We use text analytics to automatically generate insights from crew feedback, and to process online customer feedback

Listening to customers and colleagues (and talking back)

We regularly ask our customers and colleagues for feedback – in a highly competitive environment we recognise this is what makes us better. We are setting up strategies to process, analyse and propose actions based on the raw verbatim feedback we receive from respondents. We use text analytics to automatically generate insights from crew feedback, and to process online customer feedback, for example filtering out personal comments about individuals. We have methods for analysing social media, and for finding themes in feedback from different customer surveys. Another application is tracking colleague sentiment, as a way of monitoring satisfaction. We intend to shift the work away from ad-hoc projects to automated self-service reporting. Chatbots, or the 'conversational virtual assistant platform', enable customers, businesses and colleagues to self-serve with information and simple transactions, linking in to back-end systems. We are helping define Business to Business, Business to Customer and Business to Employee use cases. The more advanced third-party virtual assistant uses natural language processing and machine learning, so there is an opportunity to collaborate with the vendor in its development. We also recognise that these tools generate new data in large quantities – presenting both opportunities for analysts and challenges for data engineers.

BRITISH AIRWAYS CABIN CREW (© British Airways)

A VISION FOR DATA SCIENCE

We said earlier that we view British Airways' success as the sum of its decisions. When we look at new data analytical methods and new technologies, our focus is to identify where we can most effectively employ them to improve decisions in an organisation with many established practices and constraints, but one that strives to evolve in a changing business environment. This means that sometimes we need to explore, speculate, innovate, and even fail; at other times we need to adapt our methods to existing systems and processes. The Data Science and Innovation team is open to both approaches. The team's make-up deliberately combines newly-recruited postgraduates with a grounding in the latest techniques, and established experts with great business, systems and data knowledge. We believe that a rich and flexible learning programme is key to the success of the department. We offer a range of learning opportunities, from short in-house knowledge-sharing sessions, external training and attendance at conferences, to accreditation, MSc apprenticeship courses, and mentoring.

Data Science and Innovation is a small team with a big remit in a large organisation, and we depend on our relationships across the business – fellow analysts, IT and data professionals, and colleagues across head office and out into the business – to deliver. But we are also developing ways to deploy our products faster and with less dependency on functions such as IT to implement them. We are exploring DevOps (an industry standard methodology for Data Science projects), big data infrastructure, APIs (web interfaces for data) and software containerisation to this end.

We have endorsement of our vision at the highest level of our business. Alex Cruz, British Airways' Chairman and CEO, talking to the Central Analytics teams recently, described the opportunities for Analytics and Data Science as 'limitless'. With all this goodwill we still need to produce a constant flow of success stories, and we are adopting agile project methodologies and communications to keep our central place in the business.

Stefan Jackson is British Airways' Data Science and Innovation Manager, with a background in Operational Research. John Tozer is a Data Exploitation Specialist with a background in Operational Research and Data Management in British Airways. This article was written in January 2020.




OPERATIONAL RESEARCH IMPROVES BIOMANUFACTURING EFFICIENCY
TUGCE MARTAGAN

MILLIONS OF PATIENTS HAVE BENEFITTED FROM NEXT GENERATION DRUGS to recover from cancer, autoimmune disorders, and many other diseases. These drugs are produced using biomanufacturing technologies. The biomanufacturing industry is growing rapidly and becoming one of the key drivers of personalised medicine and life sciences. As such, the global biomanufacturing market is projected to reach $388 billion in 2024. Despite its success, biomanufacturing is a challenging business. It is cost intensive, with high risks of failure. In addition, biomanufacturing methods use live systems (e.g., bacteria, viruses, insect cells) during the production process. This enables highly complex and unique active ingredients to be generated compared with conventional drugs. However, the use of live systems leads to unique production challenges related to predictability, stability, and batch-to-batch variability.

IMPACT © 2020 THE AUTHOR

the project used techniques such as machine learning and simulation-optimisation to predict and control biological systems


To date, several industries have benefited from Operational Research (O.R.) methodologies to improve operational efficiency and reduce costs and lead times. However, applications of O.R. methodologies in the biomanufacturing industry are still immature. One of the main reasons is that competitive advantage in biomanufacturing used to be driven mainly by scientific advances in the underlying biological and chemical dynamics. With growing market competition, however, the industry faces an increasing need for a data-driven, O.R.-based approach to improve business practices. A multi-disciplinary team of researchers from Eindhoven University of Technology (TU/e) and Merck Sharp & Dohme Animal Health (MSD AH) has been collaborating for three years to improve biomanufacturing efficiency. The team consists of Dr. Tugce Martagan, Prof. Dr. Ivo Adan, and Ph.D. candidate Yesim Koca from TU/e, and Oscar Repping, Bram van Ravenstein and Marc Baaijens from MSD AH. The collaboration resulted in a portfolio of optimisation models and decision support tools to reduce biomanufacturing costs and lead times. The project combines operations research and life sciences, and is one of the first examples showing how operational research can improve biomanufacturing practice.

Overall, the project used techniques such as machine learning and simulation-optimisation to predict and control biological systems. A variety of optimisation models and decision support tools were developed to translate the underlying biological dynamics into business metrics, such as lead times and costs. Although the project was conducted and implemented by MSD AH, its true impact extends to other biomanufacturing companies, including human health applications, mainly because it addresses common industry challenges related to predictability, batch-to-batch variability and planning under uncertainty. In general, the project consists of three parts: reducing bioreactor changeovers, increasing fermentation yield, and creating better production plans.

FULL-SCALE PHARMACEUTICAL FERMENTER LINE USED IN BACTERIOLOGICAL ANTIGEN PRODUCTION (© MSD AH)

HOW TO REDUCE CHANGEOVER TIMES IN BIOMANUFACTURING?

A typical biomanufacturing process can be broadly classified into two main steps, fermentation and purification. Fermentation is often carried out in bioreactors (highly controlled stainless steel vessels in which a cell culture grows). After fermentation, the batch proceeds through a series of purification operations (e.g., centrifugation, filtration, chromatography) to achieve the required quality standards. The main focus of the first project is fermentation. During fermentation, the cell culture follows a specific growth pattern: the fermentation starts with the lag phase, with no cell growth. The cells start to grow slowly in the acceleration phase, and continue into the exponential growth phase, where the growth rate reaches its maximum. Then the cell growth slows down through the deceleration and stationary phases, followed by the death phase. The cells produce the final product (e.g., proteins, antibodies) as they grow. Typically, the batch culture is harvested during the deceleration or stationary phase. After the batch is harvested, the bioreactor needs to be cleaned and sterilised in preparation for a new batch. These changeovers are costly and time consuming; for example, it might take one full working day to clean and sterilise a bioreactor. Therefore, there is a significant business case for reducing changeover times in biomanufacturing.

To reduce bioreactor changeover times, MSD AH developed a new replenishment technique called bleed-feed. With this technique, a predefined fraction of the culture is extracted (to be sent downstream for further processing) and a special fresh medium is added to the remaining culture. The remaining cell culture acts as a new seed for the next bioreactor run, which means the changeover activities for the subsequent batch can be skipped. However, the technique can be performed during the exponential cell growth phase only. Otherwise, it does not work, and the culture needs to be harvested in full. In this setting, identifying the best bleed-feed time is challenging because the duration of each cell growth phase is stochastic: we do not know the exact moment when the exponential growth phase stops. If the bleed-feed technique is carried out too early, then we might not achieve the maximum yield from that batch. In contrast, if it is conducted too late, then the batch needs to be harvested and the bioreactor needs to be set up for the next batch.

The proposed configuration has enabled an increase in the bioreactor yield by 50% while reducing the yield variability by 25%

To address this problem, a stochastic optimisation model was developed using the theory of Markov Decision Processes. The objective of the model is to identify the optimal bleed-feed time that maximises the expected profit. The optimisation model captures the complex dynamics and uncertainties related to the underlying biological dynamics. Then, the model links the underlying biological dynamics with business metrics, such as cost and throughput. Prior to the optimisation model, bleed-feed decisions were made based on domain knowledge alone. The optimisation model provided a data-driven, quantitative approach to decision-making, which improved process efficiency.
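The optimal-stopping flavour of this model can be sketched with a deliberately simplified backward-induction toy. All parameters below (growth factor, phase-end probability, changeover cost, horizon) are hypothetical and chosen for illustration only; this is not the MSD AH model.

```python
# Illustrative optimal-stopping sketch for bleed-feed timing. All numbers
# (growth factor G, phase-end probability P, changeover cost C, horizon T)
# are hypothetical: this sketches the idea, not the MSD AH model.
def value(t, b, G=1.5, P=0.3, C=2.0, T=6):
    """Expected profit from hour t with biomass b, acting optimally."""
    if t == T:                        # bleed-feed window closed:
        return b - C                  # harvest and pay the changeover cost
    # Wait one hour: with probability 1-P the culture keeps growing; with
    # probability P the exponential phase ends and the batch is harvested.
    wait = (1 - P) * value(t + 1, G * b, G, P, C, T) + P * (b - C)
    bleed_now = b                     # bleed-feed now: changeover skipped
    return max(bleed_now, wait)

def best_action(t, b, G=1.5, P=0.3, C=2.0, T=6):
    wait = (1 - P) * value(t + 1, G * b, G, P, C, T) + P * (b - C)
    return "bleed-feed" if b >= wait else "wait"
```

With these toy numbers, compounding growth outweighs the risk of losing the exponential phase, so the policy waits at the start of the horizon; raising the phase-end probability (say, to 0.6) flips the decision to bleed-feed immediately.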

HOW TO INCREASE FERMENTATION YIELD?

In this project, the main objective was to maximise the production yield obtained from fermentation. This was especially critical for the Boxmeer facility in the Netherlands, seen in the photograph at the start of the article, as one of their bioreactors consistently produced a lower amount

of yield compared with others in the facility. Interestingly, the bioreactor with the lower output was new, using state-of-the-art technology. Further investigation showed that this new technology used a different bioreactor mixing mechanism, leading to a different type of airflow inside the bioreactor. This implied that some of the controllable input parameters needed to be adjusted for that specific bioprocess (the specific parameters are not disclosed for confidentiality). However, there was no available information in the literature to help decision-makers define the best configuration for these controllable input parameters. This implied that several experiments needed to be conducted at industry scale to collect data and optimise the controllable input parameters. However, industry-scale experiments involved very high risks due to process uncertainties, limited resources, and high operating costs. Therefore, a smart decision-making mechanism was needed to identify the best parameter configuration through a minimum number of industry-scale experiments.

Although this problem was motivated by a case study in the Boxmeer facility, it addresses a common industry problem in the context of optimal learning: industry- or laboratory-scale experiments need to be designed in a smart way to collect the required information in the least possible number of experiments. This is especially critical when the resources for conducting these experiments are limited because of budget restrictions or operational constraints. To address this problem, predictive models were built, based on the theory of Bayesian design of experiments. These models used a limited amount of industry data to predict how bioreactor yield would change as a function

IMPACT | SPRING 2020

23


of the critical process parameters. Then, a stochastic optimisation model that belongs to the class of optimal stopping problems was designed to control these critical process parameters. As a result, the model suggested an optimum process configuration based on the results of eight industry-scale experiments. The proposed configuration has been implemented for more than one year at MSD AH and enabled an increase in the bioreactor yield by 50% while reducing the yield variability by 25%. This also helped to improve the environmental sustainability of these processes through higher production outcomes per bioreactor run.
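A minimal sketch of the sequential-experimentation idea, assuming a normal-normal conjugate belief per candidate setting and a simple uncertainty-sampling rule. The candidate settings, the response function and the noise level are all invented for illustration; the article's Bayesian models are far richer.

```python
import random

# Toy sequential design of experiments. Each candidate setting of a
# (hypothetical) controllable input parameter carries a normal belief
# about its mean yield; observations update that belief.
candidates = [0.0, 0.25, 0.5, 0.75, 1.0]
prior_mean, prior_var, noise_var = 1.0, 1.0, 0.05
belief = {x: (prior_mean, prior_var) for x in candidates}

def update(mean, var, y):
    """Posterior mean/variance after observing yield y (known noise)."""
    post_var = 1.0 / (1.0 / var + 1.0 / noise_var)
    post_mean = post_var * (mean / var + y / noise_var)
    return post_mean, post_var

def run_experiment(x):
    """Stand-in for an industry-scale experiment (invented response)."""
    return 1.0 + 0.8 * x - 0.9 * (x - 0.7) ** 2 + random.gauss(0.0, noise_var ** 0.5)

random.seed(0)
for _ in range(8):  # budget of eight experiments, as in the article
    # Uncertainty sampling: experiment where current knowledge is weakest.
    x = max(candidates, key=lambda c: belief[c][1])
    belief[x] = update(*belief[x], run_experiment(x))

best = max(candidates, key=lambda c: belief[c][0])
```

After the eight-experiment budget, every setting's posterior variance has shrunk and the setting with the highest posterior mean is recommended.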

HOW TO DEAL WITH PLANNING AND SCHEDULING?

Biomanufacturing operations are performed by highly skilled scientists using specialised equipment. Capacity planning for these limited resources is critical for successful and timely completion of orders. Failing to satisfy delivery dates results in loss of credibility and reputation. In addition, biomanufacturers face unique challenges in operational scheduling. Each client order requires several tasks to be completed. The tasks and their durations differ between orders depending on the final product requirements and the quality of the starting material. The use of live cells often introduces ‘no-wait’ constraints between steps. The engineered nature of these products adds uncertainty at each step and imposes simultaneous requirements on highly skilled labour resources and specialised equipment to guarantee the best outcome. Creating a good schedule that can quickly react to these dynamics

24

IMPACT | SPRING 2020

is a challenge. Despite these challenges, there is a need for a specific rhythm in the production system.

The project created novel tools to complement life sciences with operational research

To address these planning and scheduling challenges, we first developed a simulation model of the biomanufacturing operations. The simulation model was developed with the Arena software, and contained 8000 connections representing 48 different products with their unique routings on 25 pieces of equipment and more than 50 process steps. The simulation model was validated using two years of historical production data on lead times, utilisations, bottlenecks, inventory levels, and throughput. Then, simulation-optimisation was used to generate a portfolio of flexible production schedules for each week (namely, rhythm wheels). The optimisation module used the tabu search algorithm to maximise throughput. The tool was designed to enable MSD AH to quickly react to changes in their production system by dynamically adjusting their production schedules. In addition, the tool was also used to evaluate and justify capacity expansion decisions.
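The tabu search idea behind the optimisation module can be sketched on a toy sequencing problem. The five products and their sequence-dependent setup times below are invented; the real tool evaluates candidate schedules against the Arena simulation rather than a simple cost function.

```python
# Toy tabu search: order five products to minimise total sequence-dependent
# setup time (a stand-in for maximising throughput). All data invented.
setup = [
    [0, 4, 7, 3, 6],
    [4, 0, 2, 5, 8],
    [7, 2, 0, 6, 3],
    [3, 5, 6, 0, 4],
    [6, 8, 3, 4, 0],
]

def total_setup(seq):
    return sum(setup[a][b] for a, b in zip(seq, seq[1:]))

def tabu_search(seq, iters=50, tenure=5):
    best, current = list(seq), list(seq)
    tabu = []  # recently used swap moves (position pairs)
    for _ in range(iters):
        best_move, best_cost = None, float("inf")
        for i in range(len(current)):
            for j in range(i + 1, len(current)):
                cand = current[:]
                cand[i], cand[j] = cand[j], cand[i]
                cost = total_setup(cand)
                # Aspiration: a tabu move is allowed only if it beats the best.
                if (i, j) in tabu and cost >= total_setup(best):
                    continue
                if cost < best_cost:
                    best_move, best_cost = (i, j), cost
        if best_move is None:
            break
        i, j = best_move
        current[i], current[j] = current[j], current[i]
        tabu.append((i, j))
        if len(tabu) > tenure:
            tabu.pop(0)           # forget the oldest tabu move
        if best_cost < total_setup(best):
            best = current[:]
    return best

order = tabu_search(range(5))
```

The tabu list lets the search accept non-improving swaps without immediately undoing them, which is how it escapes the local optima that trap plain local search.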

INVENT. IMPACT. INSPIRE

‘The project greatly serves towards our motto: Invent. Impact. Inspire.’ says Bram van Ravenstein, Associate Director at MSD AH. More specifically, it created novel tools to complement life sciences with operational research. The developed tools have been in use at MSD for

almost two years. As a result, the production outcomes of certain batches from the Boxmeer facility increased by 97% without investment in additional resources, such as equipment and workforce. Recently, several follow-up projects were initiated nationally and internationally. For example, the Boxmeer facility is currently collaborating with several other facilities to help them use the O.R. tools. Upper management has recently encouraged new initiatives for knowledge transfer to the human health department. As more companies embrace the applications of O.R. methodologies, the impact on society will be significant – cheaper and faster access to life-saving treatments. As Oscar Repping, Executive Director at MSD AH, states, ‘We [the industry] will benefit from O.R., as such, we will avoid investments, we will become more predictive, leading to cost reduction, leading to more capacity on our production lines, meaning that we can make this world a better place.’

the production outcomes of certain batches from the Boxmeer facility increased by 97% without investment in additional resources

Tugce Martagan is an Assistant Professor in the Department of Industrial Engineering at Eindhoven University of Technology. She received her Ph.D. in Industrial and Systems Engineering from the University of Wisconsin-Madison in 2015. Her research focusses on stochastic modelling and optimisation, especially in the context of the pharmaceutical industry and healthcare operations management.


D I S C OV E R I N G H E U R I S T I C S JONATHAN THOMPSON

WHAT ARE HEURISTICS AND WHEN SHOULD THEY BE USED?

IN A WORLD OF LIMITED RESOURCES, there is growing pressure on businesses, organisations and individuals to use these resources effectively and prudently. This leads to many optimisation-type questions – for example, how should we route our delivery vehicles to minimise CO2 emissions? How should we allocate our nurses in order to meet the medical needs of the ward? How should we travel to a given destination in an environmentally friendly, yet timely manner? Such decisions are being made on a daily basis and the quality of these decisions has a huge impact on the usage of scarce resources. In academia, a considerable amount of research focuses on making the optimal decision using methods such as linear programming, integer programming and dynamic programming. However, for many business situations, time is an extremely scarce commodity and there is a need to produce solutions quickly. Therefore, a huge number of these decisions are made using heuristic methods.

The word heuristic, from the Greek word heuriskein meaning find or discover, refers to approaches to problem solving that do not guarantee finding an optimal solution, but produce solutions of sufficient quality to meet the needs of the problem solver. In electing to use a heuristic, the problem solver is accepting a corresponding reduction in solution quality. However, there are many scenarios in which exact solution methods are not appropriate. These situations include:

The word heuristic, from the Greek word heuriskein meaning find or discover, refers to approaches to problem solving that do not guarantee finding an optimal solution, but produce solutions of sufficient quality to meet the needs of the problem solver

• When no efficient solution methods are known to exist. For the set of problems for which no efficient solution methods are known, even powerful software for finding exact solutions will struggle to solve to optimality in a reasonable amount of time other than for relatively small problems. Many real-life problems fall into this category.

• When time is limited. Even if efficient methods are known, it may be that the time required to find an optimal solution is too long and the problem solver requires a solution in a shorter amount of time.

• When the problem solver is willing to accept a decrease in solution quality. For some problems, the value of the objective function is crucial and finding the optimal solution is absolutely vital. In some other contexts finding the precise optimal solution is less important and a reduction in solution quality is acceptable.

• When the problem contains uncertainty. In many practical situations, the precise problem definition is uncertain or includes uncertain information. There is little value in using considerable resources to find an optimal solution to a vaguely defined problem instead of using a heuristic method.

• When the problem lacks information or includes contradictory data.

In these situations, a heuristic approach can be considered as the solution method, as exact methods are not appropriate. In designing a heuristic approach, it is important that the problem is fully defined and understood. Similar problems can then be identified, and the literature can help to pinpoint attributes of heuristics that work well on related problems. Different types of heuristics exist and one challenge is knowing which type of heuristic to use in a particular situation. Heuristics may be classified as constructive, improving and metaheuristics.

IMPACT © 2020 THE AUTHOR

© Kursat Unsal/Shutterstock

FIGURE 1 MAP OF LONDON ATTRACTIONS

AN EXAMPLE – THE TRAVELLING SALESPERSON PROBLEM

Consider a London tourist who wishes to leave their hotel near Hyde Park and visit six famous landmarks in the shortest distance before returning to their hotel. This is an example of the travelling salesperson problem (TSP) which is an extremely well researched problem and for

which no efficient solution methods are known, so heuristic methods are typically used in practice. Consider the distance matrix in Table 1 showing the distance between the six landmarks and the hotel. The spatial arrangement of the hotel and landmarks can be seen in Figure 1. The optimal route is Hotel -> London Zoo -> Nelson’s Column -> Tower Bridge -> Big Ben -> Buckingham

TABLE 1 DISTANCE MATRIX (km) CALCULATED USING GOOGLE MAPS

                     HOTEL   BIG BEN   BUCKINGHAM   CONCERT   LONDON   NELSON'S   TOWER
                                       PALACE       HALL      ZOO      COLUMN     BRIDGE
Hotel                 0       3.7       2.4          2.4       3.9      3.5        8.9
Big Ben               3.7     0         1.3          3.9       5.6      0.8        4.5
Buckingham Palace     2.4     1.3       0            2.7       5.6      1.1        6.4
Concert Hall          2.4     3.9       2.7          0         6.1      3.9        9.2
London Zoo            3.9     5.6       5.6          6.1       0        4.8        9.0
Nelson's Column       3.5     0.8       1.1          3.9       4.8      0          4.8
Tower Bridge          8.9     4.5       6.4          9.2       9.0      4.8        0


Palace -> Concert Hall -> Hotel, which has a distance of 24.4km. A constructive heuristic generates a solution from scratch and repeatedly adds to the solution until it is complete. The simplest example of a constructive heuristic for the TSP consists of going to the nearest unvisited landmark until all have been visited, at which point the salesperson returns to the hotel. In this example, the resulting route would be Hotel -> Buckingham Palace -> Nelson’s Column -> Big Ben -> Concert Hall -> London Zoo -> Tower Bridge -> Hotel, which has a distance of 32.2km, significantly further than the optimal solution. This approach is also called greedy because it makes locally optimal decisions at each step, but may incur large costs towards the end of the process.

However, other constructive heuristics can be used. Another simple heuristic involves choosing the cheapest link in turn and adding it to the tour as long as this does not create subtours. We start by selecting Nelson’s Column -> Big Ben (distance 0.8) and add Buckingham Palace -> Nelson’s Column (1.1). The next shortest link is between Big Ben and Buckingham Palace (1.3), but including this link would create a subtour, so it is ignored. When completed, this heuristic gives a solution of Hotel -> Concert Hall -> London Zoo -> Tower Bridge -> Big Ben -> Nelson’s Column -> Buckingham Palace -> Hotel, which has length 26.3km; although not optimal, this is a significant improvement over the previous heuristic. This illustrates that different heuristics can give very different solution quality, and spending a small amount of time to

consider slightly cleverer heuristics may give significantly improved results. A further issue for heuristics is that the problem solver does not know whether the resultant solution is close to optimal, or of very poor quality. Lower (or upper) bounds may be used to give some context to the results. An improvement heuristic takes some starting solution and gradually improves it. A neighbourhood relationship defines those solutions that can be reached in one step from the current solution. A simple improvement heuristic, also known as local search, will accept a neighbouring solution as the current solution if its solution quality is superior.
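Both constructive heuristics, and a single 2-opt improvement move of the kind shown in Figure 2, can be checked directly against the Table 1 distances in a few lines of Python:

```python
# Table 1 distances (km); index order: 0 Hotel, 1 Big Ben,
# 2 Buckingham Palace, 3 Concert Hall, 4 London Zoo,
# 5 Nelson's Column, 6 Tower Bridge.
D = [
    [0.0, 3.7, 2.4, 2.4, 3.9, 3.5, 8.9],
    [3.7, 0.0, 1.3, 3.9, 5.6, 0.8, 4.5],
    [2.4, 1.3, 0.0, 2.7, 5.6, 1.1, 6.4],
    [2.4, 3.9, 2.7, 0.0, 6.1, 3.9, 9.2],
    [3.9, 5.6, 5.6, 6.1, 0.0, 4.8, 9.0],
    [3.5, 0.8, 1.1, 3.9, 4.8, 0.0, 4.8],
    [8.9, 4.5, 6.4, 9.2, 9.0, 4.8, 0.0],
]

def tour_length(tour):
    return round(sum(D[a][b] for a, b in zip(tour, tour[1:])), 1)

def nearest_neighbour(start=0):
    """Constructive heuristic: always visit the closest unvisited landmark."""
    unvisited = [i for i in range(len(D)) if i != start]
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda j: D[tour[-1]][j])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour + [start]

nn = nearest_neighbour()            # the greedy 32.2 km tour from the text

# Improvement: a 2-opt move on the tour that visits landmarks in the order
# given. Removing two links and reconnecting reverses the segment between.
start = [0, 1, 2, 3, 4, 5, 6, 0]
moved = start[:4] + start[4:7][::-1] + [0]   # reconnect via reversed segment
```

Running this reproduces the nearest-neighbour tour and shows the 2-opt move shortening the initial tour, with the middle segment now traversed in the opposite direction.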

Different types of heuristics exist and one challenge is knowing which type of heuristic to use in a particular situation

In the above example, we start from a simple starting point corresponding to visiting the cities

in the order given, which leads to the solution Hotel -> Big Ben -> Buckingham Palace -> Concert Hall -> London Zoo -> Nelson’s Column -> Tower Bridge -> Hotel, which has a distance of 32.3km. Our neighbourhood is defined as the set of solutions created by removing two edges from the current solution and forming a new feasible solution. In all cases there is only one way of doing this. An example is shown in Figure 2, where the links between Concert Hall and London Zoo, and Tower Bridge and Hotel, have been identified for removal. To form a new valid route, the new links must be Concert Hall to Tower Bridge and London Zoo to Hotel. The links that are removed have a combined distance of 15km, whereas the new links have a combined length of 13.1km. As the neighbouring solution has lower cost than the current solution, the move is accepted and the new tour is formed, which has a distance of 30.4km. Note that the links between London Zoo and Nelson’s Column, and Nelson’s Column and Tower Bridge, are now being traversed in the opposite

FIGURE 2   ILLUSTRATING AN IMPROVEMENT HEURISTIC

The diagram on the left shows the initial route. The links in orange have been selected for removal. The diagram on the right shows the route after new links are added.



direction. Neighbouring solutions will continue to be sampled until some stopping criterion is satisfied. Other neighbourhoods will provide different solutions, for example removing three links before reforming the tour. Improvement heuristics give locally optimal solutions, but once the neighbourhood contains no improving solutions the search is trapped and cannot escape. Many metaheuristic methods have been proposed to overcome this problem, including simulated annealing, tabu search and genetic algorithms. These are well-researched methods that have proved effective on many practical problems. We now consider examples of heuristics being used to solve classic optimisation problems: scheduling, routing and packing.

HEURISTICS FOR SCHEDULING PROBLEMS

© Andrey_Popov/Shutterstock.com

Many scheduling problems have been solved using heuristic methods. When tasks need to be sequenced, known as the job shop scheduling problem, ordering jobs according to some priority rule such as minimising slack time or earliest due date can prove effective. To produce examination timetables, exams can be ordered in terms of difficulty to schedule, e.g. in order of most clashing exams, and then allocated to the time period that ensures a feasible timetable while spreading exams out for students. The scheduling problems to which heuristic methods have been applied are wide-ranging. For instance, Karapetyan et al. (2015) schedule image acquisition from a satellite using several heuristic methods which all outperformed human methods. They identified simulated annealing as the most

efficient method. Vali-Siar et al. (2018) compared a constructive heuristic to a genetic algorithm for scheduling hospital operations and found that both improved on manual methods, with the constructive heuristic performing particularly well, reducing overtime by over 80%.

HEURISTICS FOR ROUTING PROBLEMS

Several construction heuristics for vehicle routing problems have been widely used in practice. The Clarke and Wright savings heuristic starts with a set of n routes where n is the number of customers and each route consists of a journey from the depot to a customer and back again. The heuristic then merges two routes into a single route as long as it causes a decrease in the overall distance. There are several cluster first, route second heuristics that group the customers into feasible sets, and then solve a TSP to form the route for each cluster. Many improvement heuristics have been proposed, using a variety of neighbourhood definitions such as exchanging n items in one route

with m items in another. Tabu search has proved particularly effective for solving a number of different routing problems including problems with time-windows and dynamic travel times.
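The Clarke and Wright merge rule can be sketched on an invented example: a depot, four unit-demand customers and vehicles of capacity two. For brevity this sketch only merges a route ending at i with one starting at j; a full implementation would also try reversed route orientations.

```python
import math

# Invented instance: depot 0 plus four customers with unit demand.
pts = {0: (0, 0), 1: (0, 2), 2: (1, 2), 3: (4, 0), 4: (4, 1)}
demand = {1: 1, 2: 1, 3: 1, 4: 1}
CAPACITY = 2

def d(a, b):
    (x1, y1), (x2, y2) = pts[a], pts[b]
    return math.hypot(x1 - x2, y1 - y2)

# Start with one out-and-back route per customer.
routes = [[c] for c in demand]

# Saving from serving i and j on one route instead of two separate ones.
savings = sorted(((d(0, i) + d(0, j) - d(i, j), i, j)
                  for i in demand for j in demand if i < j), reverse=True)

for s, i, j in savings:
    if s <= 0:
        break                                   # no saving left to exploit
    ri = next((r for r in routes if r[-1] == i), None)  # route ending at i
    rj = next((r for r in routes if r[0] == j), None)   # route starting at j
    if ri is None or rj is None or ri is rj:
        continue
    if sum(demand[c] for c in ri + rj) <= CAPACITY:
        ri.extend(rj)                           # merge the two routes
        routes.remove(rj)
```

On this instance the two nearby customer pairs are merged and the capacity constraint stops any further merging, leaving two two-customer routes.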

Compared to the previous manual methods, the company report savings in labour usage, a reduction in product damage and improved loading quality

Delgado-Antequera et al. (2020) use a greedy algorithm hybridised with a variable neighbourhood search method to determine routes for waste collection vehicles in Antequera, Spain. They identify routes that reduce the distance travelled by more than 50%. Louati et al. (2019) use a heuristic to route waste collection vehicles and find solutions that reduce the operational time by 23% and distance by 7.3% compared to manual methods.


HEURISTICS FOR PACKING PROBLEMS

Packing problems involve packing items into containers. Constructive heuristics such as first fit decreasing and best fit decreasing have long been used to produce solutions to packing problems. The items to be packed are sorted from largest volume to smallest. First fit places each item in turn into the first bin in which it fits, whereas best fit places the item in the bin that leaves the smallest empty space. Improvement heuristics often move items to different positions or boxes, and implementations of metaheuristic methods are commonplace.
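First fit decreasing is only a few lines; the item sizes and bin capacity below are illustrative.

```python
# First-fit decreasing: sort items from largest to smallest, then place
# each in the first bin with enough room. Data below is illustrative.
def first_fit_decreasing(items, capacity):
    bins = []                        # each bin is a list of item sizes
    for item in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + item <= capacity:
                b.append(item)       # first bin with room wins
                break
        else:                        # no existing bin fits:
            bins.append([item])      # open a new one
    return bins

bins = first_fit_decreasing([2, 5, 4, 7, 1, 3, 8], 10)  # -> 3 full bins
```

The best fit variant differs only in the inner loop: instead of stopping at the first bin with room, it chooses the candidate bin with the least space left over.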

Heuristics are used in a large number of practical scenarios where run time is limited and a compromise in solution quality is acceptable

A real-life example comes from Atlas Copco, a Swedish company that supplies parts to mining companies around the world, shipping packages in sea containers. These containers have to be packed in a way that minimises damage to items, satisfying constraints on weight limits, weight distribution, stability, and ease of loading while fitting as many items as possible into the container. Olsson et al. (2020) use a greedy construction heuristic to generate load plans and a genetic algorithm to fine-tune the objective function for the construction heuristic. The method has been successfully implemented and generates suitable plans in a timely fashion. Compared to the previous manual methods, the company report savings in labour usage, a reduction in product damage and improved loading quality.

CONCLUSIONS

Heuristics are used in a large number of practical scenarios where run time is limited and a compromise in solution quality is acceptable. The right heuristic to use depends on several factors, including the available time and resources, the importance of solution quality and the certainty of the problem definition; the “correct” heuristic will therefore differ from situation to situation. When a reasonable amount of time is available, a promising research area is matheuristics, which combine heuristic and exact methods to improve solution quality. This includes such methods as using exact methods to define neighbourhoods in improvement heuristics, using heuristics to improve bounds in branch and bound, and dividing the problem into smaller components that can be solved exactly and then merging the solutions. Work in quantum computing may yet lead to improvements in the size of

problems that can be solved exactly but currently heuristics are the right and only option in many practical situations.

Jonathan Thompson is a senior lecturer in the School of Mathematics at Cardiff University. He has implemented heuristics to solve many real-life problems including manpower planning problems, scheduling hospital operations and timetabling problems.

FOR FURTHER READING

Delgado-Antequera, L., R. Caballero, J. Sanchez-Oro, J. M. Colmenar and R. Marti (2020). Iterated greedy with variable neighbourhood search for a multi-objective waste collection problem. Expert Systems with Applications 145: 113101.

Karapetyan, D., S. Mitrovic-Minic, K. T. Malladi and A. P. Punnen (2015). Satellite downlink scheduling problem: A case study. Omega 53: 115–123.

Louati, A., L. H. Son and H. Chabchoub (2019). Smart routing for municipal waste collection: a heuristic approach. Journal of Ambient Intelligence and Humanized Computing 10: 1865–1884.

Olsson, J., T. Larsson and N.-H. Quttineh (2020). Automating the planning of container loading for Atlas Copco: Coping with real life stacking and stability constraints. European Journal of Operational Research 280: 1018–1034.

Vali-Siar, M. M., S. Gholami and R. Ramezanian (2018). Multi-period and multi-resource operating room scheduling under uncertainty: A case study. Computers & Industrial Engineering 126: 549–568.



© wsf-s/Shutterstock

IN THE PIPELINE BRIAN CLEGG

MODERN SANITATION AND PLUMBING have had a huge impact on living conditions – but our expectation that we can turn on a tap and have water on demand presents a serious challenge to the water companies. Getting water from A to B involves a constant battle with obstacles and the physics of keeping a mass of fluid on the move.

A GEOGRAPHICAL PUZZLE

The task faced by Paul Hart and his colleagues at Black and Veatch, an international engineering firm specialising in energy, water and telecommunications, was to find a way to route water pipes from place to place, finding the best course given a host of often conflicting constraints.

IMPACT © THE OR SOCIETY

The geology of the ground that the pipes are to be laid in, water’s reluctance to flow uphill, the impact of bends and turns, environmental considerations and the complex interference of other human activity, from pipes and cables to roads and housing, all make finding a good route a three-dimensional puzzle of immense complexity. From the start, it seemed likely that the best tool would be a geographical information system (GIS), Hart’s speciality, which he first discovered on his geography degree course: ‘In year 2, there was a module called GIS. I’ve always been quite techie and here were my two interests brought together. I was hooked, using it on several other modules during my degree. I’d never


really known what I wanted to do, but here it was. After leaving uni, I joined my current company Black & Veatch as part of a national mapping programme called the Right to Roam Access Land project for the Countryside Agency. I’ve worked my way up through the company since. The thing that’s kept me interested is every day is different. I’ve worked on over 100 projects across the world, challenging me to innovate and problem solve.’

The idea of the GIS can be traced back to medical doctor John Snow’s detection of the source of a cholera outbreak in London’s Soho in 1854. Snow plotted the locations of cholera cases on a detailed map of the area. At the time, cholera was thought to be spread by ‘bad air’, but Snow suspected that the source was a sewage-contaminated water pump in Broad Street. He was able to show on his map how clusters of outbreaks centred on that spot. A modern GIS combines a database of potentially dozens of data sources with mapping to enable an analyst to study anything from crime outbreaks to crop planting.

For the challenges faced in routing water pipelines, the GIS approach provided an opportunity to take in far more factors simultaneously than was possible with the traditional method of human experts deciding on the best routing for a pipe. The package of data and tools was named PROM – Pipeline Routing Optimisation Method.

the GIS approach provided an opportunity to take in far more factors simultaneously than was possible with the traditional method of human experts deciding on the best routing for a pipe

FIGURE 1 EXAMPLE OF PROM OUTPUT (Brown = least favourable area, yellow = most favourable area)

CRUNCHING THE DATA

The PROM system pulls together more than 70 collections of data into a specialist database (see Figure 1). When PROM was first envisaged, around twelve years ago, this proved quite a challenge. Hart: ‘Data was hard to get hold of, licence restricted and internet speed restricted. We used to spend weeks gathering what we needed for

PROM. Now, with open data, data standards and things like WMS and WFS [internet standards for mapping] data is fairly easy to get, and in a fairly ready to use format. It has certainly taken some of the hassle out of doing this type of exercise.’ In any particular decision, the factors that influence the potential route might be more or less important,



so experts debate the weighting to give each contributory input – what relative importance, for example, should be given to the physical issues of getting through a particular piece of ground compared with the need to minimise disturbance of a rare species’ habitat. The system then crunches the numbers to discover the optimal route via the best combined weighted score.

Using traditional methods there is limited ability to do ‘What if?’ variations, but because of the speed of testing out options using PROM, it has proved possible not only to find the optimal route from A to B but also to query where to locate A and B, which might not be fixed by the requirements of the project. Sometimes making a small change to the starting point of a route, for example, could have significant cost benefits. Similarly, the system can be re-run giving more influence to, say, environmental or domestic disruption contributions – perhaps avoiding a specific street – exploring sensitivity to these changes.

Although PROM makes use of vast quantities of factual data, perhaps the most important set of variables are the experts’ weightings for the different factors. Getting these right makes a crucial difference, and inevitably there is a degree of subjectivity to the decision. Although the model could have used any scale of weightings, Hart and his colleagues decided to limit the range to 1 to 5. Guidelines direct the decision makers, for example, to only use a 5 where a constraint is ‘almost certain to be refused consent.’ Although this makes the weightings broad, it results in more practical decision-making rather than ‘too much time spent deliberating over whether decision


criteria should be a 7 or 8 weighting when 4 is suitable.’
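The routing step that these weighted scores feed can be illustrated with Dijkstra's algorithm over a small grid of combined cell scores. The grid values below are invented and the grid is tiny; PROM's real datasets, cell scoring and routing are far richer.

```python
import heapq

# Invented 3x3 grid of combined weighted scores (low = favourable terrain).
grid = [
    [1, 1, 5],
    [5, 1, 5],
    [5, 1, 1],
]

def least_cost_path(grid, start, goal):
    """Dijkstra over 4-connected cells; path cost = sum of cell scores."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: grid[start[0]][start[1]]}
    queue = [(dist[start], start)]
    while queue:
        cost, (r, c) = heapq.heappop(queue)
        if (r, c) == goal:
            return cost
        if cost > dist[(r, c)]:
            continue                      # stale queue entry, skip it
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = cost + grid[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(queue, (nd, (nr, nc)))
    return None

cost = least_cost_path(grid, (0, 0), (2, 2))
```

The recommended route threads the low-score cells even though a straighter path exists, which is exactly the behaviour that makes the shortest route not always the best.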

SHORTEST IS NOT ALWAYS BEST

The PROM system was first used to contribute to a water resources management plan for Anglian Water Services, a UK water company with more than six million customers in the East of England, responsible for 112,833 kilometres of water and sewage pipes and around a quarter of the sewage works in England and Wales. Black & Veatch were tasked with reviewing the routing of around 500 kilometres of water pipelines, ranging in diameter from 0.5 to 1.8 metres.

in a complex environment it can often be the case that travelling a longer distance will result in lower operating costs

Without a system like PROM it might have been easy to resort to adopting the shortest feasible route, but in a complex environment it can often be the case that travelling a longer distance will result in lower operating costs. It’s a little like the Baywatch effect. When rescuing a drowning person, a lifeguard does not take the shortest distance, straight towards the victim, but rather runs further on the land to reduce the distance in the sea, where movement is much slower. (Light similarly bends as it travels into a medium with a different transmission speed, minimising the transit time.) With PROM it’s not speed that is being optimised, but the combined cost of installation and operation.

In the first deployment of PROM, one of the key considerations was energy use. By modifying weightings, it was possible to boost the scoring of routes that reduced energy use (and hence operational costs). This was done by making it prohibitive to reach too high an elevation, which would have involved excessive cost in the pumping stations needed to get water to travel uphill. Once the weightings have been settled on, recommended routes are selected by generating combined scores from the various datasets for five-metre-square cells covering the area where the pipeline might run. The analyst then enters start and finish points and produces routings, if required also fixing waypoints where it is necessary for the pipeline to encounter, say, a treatment works along the way.

Using the system is more of a process than an off-the-shelf package. As Hart points out: ‘We’ve not developed PROM as a user tool; it’s still run by GIS experts guided by engineers. The “M” stands for method, and the reason for that is that it is an iterative process. The first run will often use “the database”, but subsequent runs are based on issues highlighted by the first run. These iterations tend to require some GIS intelligence to gather, manipulate, or create an appropriate dataset to address the local requirements. We also output the data to meet requirements, so the output can change per project and client. To an extent, we’ve tooled some of the steps to make our lives easier, but in a way an end-to-end tool would limit the value users get from PROM.’

planning for a pipeline route could take as long as three months – with PROM it takes less than a day


© Ordnance Survey

The output from the system is far more than a simple route. One of the most valuable reports has proved to be detailed schedules, showing the run lengths of pipe, roads and rivers crossed, and so forth – all information that would have required extra work using a traditional approach. It used to be the case that the planning for a pipeline route could take as long as three months – with PROM it takes less than a day. Mark Chandler of Anglian Water commented that ‘PROM has saved an estimated £40K during the water resources project’.

SPOTTING THE UNEXPECTED

A good example of how PROM has had benefits over and above simply speeding up the traditional process was in a scheme running through the small town of Costessey, located about four miles west of Norwich (see Figure 2). PROM was introduced after planners had already spent around three months developing a route. The expectation was that the run would verify PROM’s effectiveness by generating a similar route more quickly – instead it defied apparent common sense by taking a totally different course which crossed the river Wensum twice (an expensive undertaking), where the original route only crossed it once. However, when the analysts looked at why PROM was recommending what appeared to be a more difficult routing, they discovered that the original path would have interfered with a number of major planning applications and would have taken the pipeline under earthworks. When representatives of Anglian Water walked the original proposed route, they found that it was totally impractical, while PROM’s recommended route was far more suitable and was selected with only minor modifications.

FIGURE 2 ROUTES AROUND COSTESSEY (Proposed route in purple, PROM route in red)

the original proposed route was totally impractical, while PROM’s recommended route was far more suitable and was selected with only minor modifications

Since its first use, PROM has continued to support pipeline projects at Anglian Water and has been taken up by Bristol Water and South East Water. By 2010, PROM had been used on over 100 schemes, dealing with over 5000 kilometres of water pipelines – this is likely to have more than doubled by now. As Hart points out, the applications aren’t always new

IMPACT | SPRING 2020

33


connections: ‘Just because an old pipe follows a route doesn’t mean it’s the right route to construct now, so we will be given start and end locations, maybe intermediate connection points, maybe an option of connection points, maybe just a length of pipe. PROM can quickly generate routes between any locations, giving the optimal route between them. The speed a route can be generated and the outputs it gives has meant engineers testing scenarios such as “What if we included this treatment works?”.’ The system is being expanded to provide other reports, such as detailed cost assessment and carbon footprint analysis. Hart also sees a significant future for PROM in tackling other utility routing issues: ‘We have used it for a gas pipeline in the UK. My work in Australia used


PROM to determine the route of roads, transmission lines – where going from peak to peak is key – and high-pressure penstocks – these are above-ground pipelines between upper and lower hydroelectric reservoirs where steepness and straightness are key. You change the weighting; you change the horizontal and vertical directional change rules – but otherwise the principles are the same.’ PROM is a great example of O.R. analytical work which doesn’t involve complex mathematical techniques, but rather uses a powerful combination of large sets of data with geographical information to give speed of analysis, flexibility and ‘what-ifs’ that go far beyond anything possible without it. Not only does PROM provide the water companies with financial savings, it produces routing options that might never otherwise have been envisaged. Pipeline routing has never been so smart.

Brian Clegg is a science journalist and author who runs the www.popularscience.co.uk and his own www.brianclegg.net websites. After graduating with a Lancaster University MA in Operational Research in 1977, Brian joined the O.R. Department at British Airways. He left BA in 1994 to set up a creativity training business. He is now primarily a science writer: his latest title Conundrum features 200 challenges in the form of puzzles and ciphers, requiring a combination of analytical and lateral thinking plus general knowledge.


UNDERSTANDING STRATEGIC PRIORITIES AT RNIB STEWART WILLIAMS

THE ROYAL NATIONAL INSTITUTE OF BLIND PEOPLE (RNIB) is one of the UK’s leading sight loss charities and forms the largest community of blind and partially sighted people. In mid-2018 the organisation was looking to relaunch its brand and put in place a new strategy focussed on achieving a world with no barriers to people with sight loss. There were many areas at that time where RNIB was actively involved in pressing for social change and many others where they had previously acted. Within the management team there

was a recognition that the new strategy should focus on a smaller number of priority areas, enabling them to focus their resources on initiatives where they could maximise their impact. The difficulty the organisation faced was in deciding what those areas should be.

IMPACT © 2020 THE AUTHOR

35


RNIB approached the OR Society and asked if it could provide consultancy support (via the Pro Bono scheme) to facilitate a prioritisation exercise to address this. A Pro Bono project profile (Figure 1) was subsequently issued, and it was at this point that I became involved.

DETERMINING PROJECT REQUIREMENTS

Having applied to provide the support, I was fortunate enough to be selected by RNIB to help them develop and implement their prioritisation approach and headed into their offices in London to discuss their requirements in a little more detail. From the Project Details, I had in mind that this would be a relatively straightforward, workshop-based, multicriteria decision analysis (MCDA) exercise and I would need to:
• Help RNIB determine a multicriteria hierarchical model of barriers faced;
• Run a workshop or two to establish the weights (relative importance) of those barriers, considering the views of a representative group of stakeholders;
• Assist RNIB (again via workshops) in rating how well each of their potential strategic actions contributed to removing each of the barriers in the model, resulting in a prioritised list of actions.
It became clear during the meeting that my expectations were a bit simplistic and that there were two main strands to the prioritisation exercise:
1. To determine what priorities blind and partially sighted people placed on barriers (established


FIGURE 1 THE PRO BONO PROJECT PROFILE

from previous RNIB work) to their full participation in society by surveying as large and representative a sample as possible (in the hundreds);
2. To provide an initial indication of how well those barriers were a ‘fit’ with RNIB strategy and capability, making use of the more usual workshop-based approach. This would feed into the planned Theory of Change exercise later in RNIB’s strategic planning process, where options for action were to be developed further.

DECIDING ON THE PROJECT APPROACH

Achieving these objectives would enable us to present the overall results as (quote) ‘one of those quadrant plots much loved by consultants’, in this case showing Barrier Priority vs Fit.

Achieving these objectives would enable us to present the overall results as (quote) ‘one of those quadrant plots much loved by consultants’

During email and telephone conversations over the next week we discussed further the best ways of achieving these objectives and completed a Pro Bono Project Proposal form that effectively acted as the ‘contract’ for the engagement. It became clear while completing the proposal that the amount of support required from me was likely to exceed the 5-7 day estimate in the original Project Details.


Our discussions focused initially on Barrier Prioritisation, which RNIB felt to be the more important of the two areas. It became evident that RNIB really liked the prospect of using a pairwise comparison approach (as used to establish priorities within the Analytic Hierarchy Process (AHP) method) to determine not only the rankings of the perceived barriers but how much more of a priority one barrier was over another. For those unfamiliar with the approach, the box gives an idea of how pairwise comparisons work. It was decided to survey blind and partially sighted people using an online questionnaire that collected various demographic information and that solicited views on priorities using a series of pairwise comparison questions. To reduce the number of questions asked of respondents, and to ensure that problems with inconsistent judgements did not materialise, only the minimum number of comparisons required to calculate priorities was included. Barrier 1 was compared to Barrier 2, Barrier 2 to Barrier 3, Barrier 3 to Barrier 4, etc. Identifying and dealing with inconsistent judgements requires the use of specialist software such as Transparent Choice and, often, facilitation, neither of which was available via the survey. To ensure accessibility, the survey questionnaire had to be implemented by RNIB in their preferred software (SurveyMonkey). This meant that I needed to write software to store the survey results in a database, enable all calculations and analyses to be performed, and the results presented. In addition:
• A help line was to be set up to enable people to ring in and provide their survey responses over the telephone;
• YouGov were asked to provide results for a similar survey (hosted by

OVERVIEW OF THE PAIRWISE COMPARISON APPROACH

Suppose that we have three options A, B and C. One approach to determining the relative priorities of A, B and C is to give each participant 100 points to divide between the three options, based on their perception of the relative priorities of those options. The total number of points allocated to each option then becomes a measure of the relative priorities of the options. In practice, particularly as the number of options increases, this becomes very difficult for the participants to do. People find it much easier to break down their assessments in a pairwise fashion, comparing only two options at any one time. For instance, which of options A and B is the highest priority and by what factor? Which of options B and C is the highest priority and by what factor? If we know that A is twice as important as B, and B is three times as important as C, then we can deduce that A is six times as important as C. The pairwise comparison method in the AHP goes a step further by allowing direct comparison of A and C and (using eigenvector calculations) calculating the relative priorities based on all three judgements, taking account of any inconsistency between them and providing a measure of the level of inconsistency present. Verbal scales are used to help the participants make those comparisons, as shown below.

The Fundamental Scale of Pairwise Comparisons

Intensity of Importance | Definition              | Explanation
1                       | Equal Importance        | Two elements contribute equally to the objective
3                       | Moderate Importance     | Experience and judgement slightly favour one element over another
5                       | Strong Importance       | Experience and judgement strongly favour one element over another
7                       | Very Strong Importance  | One element is favoured very strongly over another; its dominance is demonstrated in practice
9                       | Extreme Importance      | The evidence favouring one element over another is of the highest possible order of affirmation

Intensities 2, 4, 6, 8 can be used to express intermediate values.
Source of table: Saaty, T.L., ‘Decision Making for Leaders’, RWS Publications, 1995.

them) from a panel of respondents diagnosed as visually impaired. It was decided that assessing the ‘Fit to RNIB’ of the barriers would be performed using a relatively simple MCDA approach.
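The eigenvector calculation mentioned in the box can be illustrated with a toy, perfectly consistent judgement matrix (the values are invented, and simple power iteration stands in for a full AHP implementation):

```python
# Pairwise judgement matrix for options A, B, C (invented values):
# A is 2x as important as B, B is 3x as important as C, and A is judged
# 6x as important as C, so this matrix is perfectly consistent.
M = [[1.0, 2.0, 6.0],
     [0.5, 1.0, 3.0],
     [1/6, 1/3, 1.0]]

def ahp_priorities(matrix, iterations=100):
    """Approximate the principal eigenvector by power iteration,
    normalised so the priorities sum to 1."""
    n = len(matrix)
    v = [1.0] * n
    for _ in range(iterations):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        v = [x / s for x in w]
    return v

priorities = ahp_priorities(M)
# For this consistent matrix the priorities are 0.6, 0.3 and 0.1.
```

With an inconsistent matrix the same calculation still yields priorities, and the gap between the principal eigenvalue and the matrix size gives the inconsistency measure the box refers to.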

ESTABLISHING BARRIER PRIORITIES

405 survey responses were received via SurveyMonkey and a further 251 from the YouGov panel. Their pairwise comparisons were aggregated and relative priorities for the barriers calculated. It is



FIGURE 2 THE RESULTS OF THE BARRIER PRIORITIES SURVEY

FIGURE 3 THE RATINGS MODEL

worth highlighting that those priorities were relative; we did not assess how serious a barrier was in absolute terms as part of the survey. Previous RNIB work had established that the barriers were significant and worthy of further attention. The highest priority barrier thus ‘scored’ 100 and the other barriers were allocated values that reflected their relative priorities. The results are shown in Figure 2. Dan Fisher, Head of Strategy and Performance at RNIB, commented on these results: “Contrary to expectations, by far the most significant barrier to blind and partially sighted people’s lives was deemed to be public attitudes (which, when correlated with other ongoing RNIB research, were confirmed as being due to lack of awareness rather than prejudice). This insight led to a shift in RNIB’s strategy towards increasing investment in ‘social


FIGURE 4 FIT OF BARRIERS WITH RNIB

change’ and brand activities – i.e., the things which tackle the root causes of barriers in mainstream society, rather than continuing to focus on ameliorating the symptoms through services.” As we had collected a range of demographic data in the survey we were also able to present results for different segments of the sample, though in all cases ‘Facing low awareness of, and negative attitudes to, sight loss from the general public’ was found to be the highest priority.
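The survey’s minimal chain of comparisons, and the rescaling that gives the top barrier a score of 100, can be sketched as follows (the ratio values are invented, and only four barriers are shown where the real survey covered twelve):

```python
# Chained pairwise judgements (invented): ratios[i] says how many times
# more of a priority barrier i is than barrier i+1. The minimal chain
# (B1 vs B2, B2 vs B3, ...) is enough to derive all relative priorities.
ratios = [2.0, 1.5, 3.0]  # B1:B2, B2:B3, B3:B4

def scores_from_chain(ratios):
    """Chain the ratios into relative weights, then rescale so the
    highest-priority item scores 100."""
    weights = [1.0]
    for r in ratios:
        weights.append(weights[-1] / r)  # each item is 1/r of the previous
    top = max(weights)
    return [round(100 * w / top, 1) for w in weights]

scores = scores_from_chain(ratios)
# B1 scores 100; B2, B3 and B4 are scored relative to it.
```

With only this minimum set of comparisons there is nothing for a judgement to be inconsistent with, which is why the survey could dispense with the specialist software otherwise needed to reconcile conflicting judgements.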


ASSESSING FIT TO RNIB

The ‘Fit’ of the 12 barriers with RNIB’s strategy and capabilities was assessed using a very simple AHP ratings model (Figure 3) implemented in the Transparent Choice software, with the ratings assessments determined at a workshop with members of the RNIB Executive Leadership Team, Trustees and other senior managers.
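A ratings model of this kind boils down to a weighted average of scale values. Here is a minimal sketch – the numeric scale values, criteria weights and group assessments below are all invented for illustration, not taken from the Transparent Choice model:

```python
# Invented numeric values for the verbal rating scale, and invented
# (equal) criteria weights.
SCALE = {"High": 1.0, "Medium": 0.66, "Low": 0.33, "None": 0.0}
WEIGHTS = {"fit_with_strategy": 0.5, "capability": 0.5}

def overall_rating(assessments):
    """assessments maps each criterion to one scale label per group;
    returns the weight-combined average as a score in [0, 1]."""
    score = 0.0
    for criterion, labels in assessments.items():
        group_avg = sum(SCALE[label] for label in labels) / len(labels)
        score += WEIGHTS[criterion] * group_avg
    return score

# One hypothetical barrier, assessed by three groups on two criteria.
rating = overall_rating({"fit_with_strategy": ["High", "High", "Medium"],
                         "capability": ["Medium", "Low", "Medium"]})
```

Repeating this for each barrier produces the overall ratings of the kind shown in Figure 4.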

by far the most significant barrier to blind and partially sighted people’s lives was deemed to be public attitudes

The workshop attendees were split into three groups and each group came to a consensus assessment of:
• How well addressing each barrier fitted with RNIB’s strategy;
• RNIB’s capability to make a difference in addressing each barrier.
Each of these assessments was made on a scale of High through to None as illustrated in Figure 3. Average assessment scores across the groups were then combined with the criteria weights to arrive at the overall ratings shown in Figure 4.

DID WE SATISFY THE PROJECT OBJECTIVES?

Of course, the ultimate test of success was whether we ended up with ‘one of those quadrant plots much loved by consultants’! Figure 5 demonstrates that we did. But seriously, the ultimate test of success was whether the process that was followed and the results that were obtained were of value to RNIB. In the words of Dan Fisher:

“The pro bono scheme gives charities like RNIB much-needed access to professional analytical skills which we struggle to be able to afford on the open market. More than that, though, it enables us to engage in partnerships with volunteers wanting to ‘give something back,’ which is inspiring for us, and hopefully useful and interesting for the volunteers who learn about a new topic or a new way of looking at the world. In our case, the piece of work that Stewart Williams from Hartley McMaster undertook for us has become central to RNIB’s strategy and definition of itself. The results of the exercise were particularly interesting in two ways:
i. As I highlighted earlier, by far the most significant barrier to blind and partially sighted people’s lives was deemed to be public attitudes, an insight that led to a shift in RNIB’s strategy.
ii. Discussions with customers and staff during the process led to the crystallisation of a belief in an ‘asset model’ of sight loss, whereby we believe that people and society benefit from understanding how people with a visual impairment ‘see’ a world which is fundamentally designed for people with sight. Achieving a world which is fully accessible and inclusive for blind and partially sighted people is, we now believe, good for everyone – i.e. society benefits both practically and psychologically.
Both insights are now core to RNIB’s approach, and the whole exercise kick-started further review and development of our service portfolio, shifting it service by service towards concentration on interventions that will achieve the greatest societal impact for the best social returns. The work that Stewart undertook for us was absolutely critical to this, and we are extremely grateful to the OR Society for enabling this level of support.”

FIGURE 5 BARRIER PRIORITY VS FIT TO RNIB

The pro bono scheme gives charities like RNIB much-needed access to professional analytical skills which we struggle to be able to afford

ACKNOWLEDGEMENTS

It was a pleasure to work with the RNIB Research team during this

project. They put in all the hard work necessary to (among many other things) implement the surveys, provide me with data and organise meetings and workshops. Many thanks to Stuart Easton at Transparent Choice for providing free access to their web-based AHP decision support software during this work. Stewart Williams is a consultant with Hartley McMaster and has provided analytical consultancy to organisations in both the public and private sectors for more years than he cares to admit. This was his first Pro Bono OR project, but he is looking forward to carrying on with this kind of activity when he eventually retires.


UNIVERSITIES MAKING AN IMPACT

EACH YEAR, STUDENTS ON MSC PROGRAMMES in analytical subjects at several UK universities spend their last few months undertaking a project, often for an organisation. These projects can make a significant impact. This issue features a report of a project recently carried out at one of our universities: London School of Economics. If you are interested in availing yourself of such an opportunity, please contact the Operational Research Society at email@theorsociety.com

CUSTOMER FEEDBACK TEXT ANALYSIS AND ANALYTICS (Kaiyi Chen, LSE, MSc Operational Research and Analytics)

BT is the UK’s largest provider of fixed-line, broadband and mobile services, and also provides subscription television and IT services. As a consumer-facing organisation, providing excellent customer experience is a key strategic goal and numerical measures of customer loyalty are used for feedback. This is complemented by analysis to create a clearer picture and to help focus service improvements where they’ll have the most benefit. Kaiyi’s project explored the potential for using advanced text mining techniques and associated analytics to generate recommendations to the business based on a large pilot set of anonymised customer survey data. The core was the application of methods taught on the course but it was clear from the start that she would have to research additional tools and would need considerable consultancy and interpersonal skills to work across teams to deliver a meaningful outcome against tight deadlines. Kaiyi was selected by BT from the shortlist forwarded by LSE in part

because of her exceptional academic record but also because of the strong ‘systems focussed’ thinking she demonstrated. She came to the LSE with a first class Materials Science degree from Oxford University and experience of leading a solar cell technology research team as well as consultancy with Accenture. Many students on the course have a mathematical background but some of the projects with most impact have actually been delivered by numerate engineers like Kaiyi with a broader background, and the LSE is looking for ways to encourage more engineers to join the course. Jonathan Malpass, a Research Manager in BT’s Applied Research department, originated and supervised the project for BT. Jon said: “When we signed up to the LSE programme we knew that success is never guaranteed, but Kaiyi exceeded our expectations on a challenging project and delivered results that were genuinely useful to the business. She demonstrated both her technical and ‘soft’ skills on this for BT

and her Distinction for this part of her master’s degree was fully deserved.” Kaiyi agreed that it had been something of a leap into the unknown, but she was well supported by the BT team. “The project allowed me to apply the research and analytical skills I accumulated during my studies and previous work experience, but it also helped me develop a wide range of valuable consultancy skills that have already proved very useful.” The aim of the projects on the MSc programme is to develop and test the consultancy and project management skills taught on the course as well as the students’ technical ability. This is demanding on the students and on the LSE’s supervision team, who have to coach and mentor them through the project. However, the LSE believes the result is a more rounded and more employable graduate who has demonstrated more than technical competence. Kaiyi is now working in London with British engineering consultancy Atkins as a transport consultant.

IMPACT © THE OR SOCIETY

41



HARNESSING OPERATIONAL DATA TO MAKE PRISONS SAFER MARTINE WAUBEN, PHIL MACDENT, ADAM BOOKER, AND BEN MARSHALL

STORIES ABOUT THE SUFFERING caused by violence, self-harm, and deaths in custody can make for difficult reading. At any one time, there are roughly 83,000 people in prison custody. However, annually we see more than 34,000 assaults, 57,000 self-harm incidents, and just over 300 deaths in custody. These figures are taken from

https://www.gov.uk/government/collections/safety-in-custody-statistics. The Ministry of Justice are responsible for safeguarding everybody in the justice system. Keeping people living and working in prisons safe is the top priority for the prison service, and we are exploring all possible approaches to reduce these levels.
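As a rough sense-check, the headline figures above can be turned into annual rates per 1,000 prisoners (indicative only, since the population figure is a point-in-time snapshot while the incident counts are annual totals):

```python
# Headline figures quoted in the article (rounded numbers).
POPULATION = 83_000            # people in custody at any one time
ANNUAL_INCIDENTS = {
    "assaults": 34_000,
    "self-harm incidents": 57_000,
    "deaths": 300,
}

rates_per_1000 = {kind: round(count / POPULATION * 1000)
                  for kind, count in ANNUAL_INCIDENTS.items()}
# Roughly 410 assaults and 687 self-harm incidents per 1,000 prisoners
# per year, and about 4 deaths.
```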

IMPACT © 2020 CROWN COPYRIGHT

43


The Ministry of Justice’s Data Science Hub was set a challenge: how can Operational Research (O.R.) help reduce the rising levels of violence in our prisons? We needed to harness operational data, deliver top-notch O.R., and manage its delivery. It was vital that our insights would be embedded inside the prison service. It is all well and good having fabulous figures and shiny sums, but if operational researchers can’t embed their work inside business delivery, it has no impact, no benefit, and no longevity. Our job was not just to be accurate; it was to be useful and novel in our delivery to stand the test of time.

if operational researchers can’t embed their work inside business delivery, it has no impact, no benefit, and no longevity

THREE CHALLENGES

We used problem structuring methods to find opportunities for analytical approaches to tackle this challenge. We found that front-line staff were not in a position to make best use of data for three reasons:
1. The data wasn’t timely enough;
2. Systematic approaches to identifying and understanding the individuals driving violence in custody were limited at best;
3. Data wasn’t delivering the right insights.
As a result, data was not used optimally in decision-making and O.R. wasn’t embedded in front-line processes. Therefore, we set out to not only deliver top-notch analysis, but also resolve some of these organisational problems that would prevent our insights from being embedded and used.


UNSUNG HEROES

Our first obstacle was that operational data took too long to get from the database to the reports used on the front line. In some cases, the process took weeks. We would have to wait for static extracts from our legacy systems, make copies for different team members, manually clean them all, and somehow discern which spreadsheet was the true final version. No doubt this lengthy, error-prone, human process is familiar to many working in large organisations with legacy systems. To ensure that staff could use the information to make operational changes, we needed to deliver insights from this data before it became outdated. Our data engineering team has created a central database that automatically refreshes every night. Each morning, this provides us with all incidents from 2010 to yesterday. At the end of the pipeline sit open-source tools for analysis, pointed at millions of rows of data about individuals, sentences, and incidents. These tables may be massive, but we are able to explore the data, test hypotheses, and visualise our

results in a matter of seconds using tools like JupyterLab and RStudio. This improves our analysis and its reproducibility. Engineering teams like this are the unsung heroes of high-impact data science. We have used technologies like AWS, Docker, Spark and Parquet. Our analytical platform is fully coded in the open. It is of such high quality that other government departments are now using our codebase to learn lessons for their own development. This is a step change for government O.R. and underlies a lot of the impact we were able to deliver.
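The load-clean-rebuild idea behind such a nightly refresh can be sketched in miniature. Everything here is invented for illustration – the real pipeline runs on AWS with Spark and Parquet, while `csv` and `sqlite3` stand in so the sketch is runnable anywhere:

```python
import csv
import io
import sqlite3

# An invented raw extract, with the kinds of defects legacy systems
# produce: stray whitespace in headers and values, and a duplicate row.
RAW_EXTRACT = io.StringIO(
    "incident_id,prison, type ,date\n"
    "1,HMP Alpha,assault,2019-03-01\n"
    "2,HMP Beta, self-harm ,2019-03-01\n"
    "1,HMP Alpha,assault,2019-03-01\n"
)

def refresh(conn, raw):
    """Nightly job: clean and de-duplicate the extract, then rebuild
    the query-ready table from scratch so the run is idempotent."""
    rows = {}
    for rec in csv.DictReader(raw):
        rec = {k.strip(): v.strip() for k, v in rec.items()}  # clean
        rows[rec["incident_id"]] = rec                        # de-duplicate
    conn.execute("DROP TABLE IF EXISTS incidents")            # full rebuild
    conn.execute("CREATE TABLE incidents (incident_id, prison, type, date)")
    conn.executemany("INSERT INTO incidents VALUES (?, ?, ?, ?)",
                     [(r["incident_id"], r["prison"], r["type"], r["date"])
                      for r in rows.values()])
    conn.commit()

conn = sqlite3.connect(":memory:")
refresh(conn, RAW_EXTRACT)
count = conn.execute("SELECT COUNT(*) FROM incidents").fetchone()[0]
```

Because each run rebuilds the table in full, re-running the job after a failure needs no manual clean-up – one property that removes the “which spreadsheet is the true final version?” problem.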

VIOLENCE VARIABILITY

The second obstacle we tackled was the fact that systematic approaches to identifying and understanding the individuals driving violence in custody were limited. We set out to provide staff with individual-level information on prison violence. This boils down to a textbook O.R. question: for any given person in custody, how violent should we expect them to be? This process resulted in the Violence in Prison Estimator (ViPer). It is a


linear mixed effects model to estimate how violent an individual may be during their time in custody, based on that person’s behaviour during previous time in custody. We adjust for the background level of violence in each prison and the historic upward trend each year by including them as random population effects in the model. Crucially, the model also includes a random effect that allows the model to vary its outputs based on an individual’s unique ID. This makes its estimates tailored to each individual. The model may seem deceptively simple, but this individual random effect makes the model hugely computationally intensive! ViPer does not produce a prediction, but a description of a person’s current rate of violence. This is one piece of information staff can use to inform their approach to offender management. To make such a tool acceptable to prison staff, we needed to balance ease of use and transparency. This modelling approach is fully transparent, and allows us to separate out population-wide trends from individual effects. This standardisation is important to ensure ViPer doesn’t change when individuals are transferred to a different prison, which would be difficult to explain to staff. We also needed to communicate uncertainty. ViPer needs some information about individual custodial history to be accurate. If a person is completely new to custody, it will be unreliable. We used the model’s standard error to approximate this uncertainty, and communicated that through a one- to three-star rating system. As a result, we have an easy-to-use tool that can inform population management. This has

had a transformative effect on the conversation around violence in custody, and allows staff at all levels to have common terminology to describe individuals across the estate.
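The model structure and the star-rating step can be sketched as follows. The formula is a paraphrase of the description above in lme4-style notation, not ViPer’s actual specification, and the standard-error thresholds are invented since the real cut-offs are not given:

```python
# ViPer-style structure, paraphrased in lme4-style notation:
#   violence_rate ~ prior_custodial_behaviour
#                   + (1 | prison) + (1 | year) + (1 | individual)
# The prison and year random effects absorb population-wide trends;
# the individual random effect tailors the estimate to each person.

def star_rating(std_error, thresholds=(0.5, 0.25)):
    """Map an estimate's standard error to a 1-3 star reliability
    rating. The thresholds here are invented for illustration."""
    if std_error >= thresholds[0]:
        return 1  # e.g. someone new to custody: little history, high SE
    if std_error >= thresholds[1]:
        return 2
    return 3      # long custodial history: estimate is well supported

ratings = [star_rating(se) for se in (0.8, 0.3, 0.1)]
```

Collapsing a standard error to a coarse star scale loses precision, but it gives front-line staff an immediately readable signal of how much to trust each estimate.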

An easy-to-use tool that can inform population management has had a transformative effect on the conversation around violence in custody

DELIVERY REVOLUTION

The final obstacle was that prisons are a low-tech environment. The IT in the prison service is, by necessity, very locked down and secure. However, as a side-effect the systems were not suited to delivering our insights to the front line. We achieved a breakthrough with the Safety Diagnostic Tool (SDT). The SDT integrates R Shiny and bespoke HTML/CSS/Javascript/D3.js with existing IT infrastructure. The basic premise is that simply using the data properly, applying innovative analytics to it, and maintaining a strong focus on user needs combine into a very powerful tool. The SDT covers the number of assaults that have happened, and what sort they were. It covers how many people are self-harming, and how often. It lists the most recent incidents, so staff who weren’t on shift yesterday can catch up. And it lists the individuals most in need of management, as evidenced by ViPer. Besides individual-level information, it also includes dashboards that show trends over time: what are the most common reasons for assault? What self-harm methods are on the rise? When and where are incidents happening?

it also includes dashboards that show trends over time: what are the most common reasons for assault? What self-harm methods are on the rise? When and where are incidents happening?

All this can inform regime changes: are scuffles breaking out when queueing for food? Maybe unlock fewer wings



at a time. Are people using medication to self-harm? Maybe officers should keep a closer eye on whether everyone is taking their own drugs. This is a revolution in how data is used to make these crucial decisions, potentially saving lives.

DATA-DRIVEN DECISIONS

The SDT has been a massive success. We have thousands of registered users and get more than 200 unique daily log-ins across 120 prisons. The SDT is now part of most safety officers’ standard daily routine and they have come to rely on it for crucial tasks like preparing assessments. Feedback has reflected how integral the tool is. It is now so embedded that it will continue to be used for many years. Front-line staff have said that “this will be the ‘go to’ tool in regard to incident management and planning of prisoners involved in most incidents”, and “I think it’s a superb tool. Now that I have it I would not want to let it go”. The benefits are tangible. We have seen teams use ViPer to better spread the risk of violence across prisons and wings, so they can better manage it. SDT data is used in assessing prisoners for security and cell-sharing risk. It allows for more bespoke interventions for individuals because, most crucially, it saves staff huge amounts of time previously spent behind desktops in offices collating all this data. Staff now also recognise the value of data. One wonderful result has been that officers are better at recording information, which we can then use to improve our analysis. For example, after we rolled out, staff were much more likely to record the reasons for


assaults. We can now use this to spot patterns and anomalies, which then get fed back to improve services. Thus, the benefits all focus on delivering better front line services.

The SDT has created a lasting culture shift, with demonstrable benefits for the front line.

THREE SOLUTIONS

We tackled each of the three significant obstacles to using data on the front line:
1. We made data available by refreshing the data overnight and automating cleaning and pre-processing, thereby improving timeliness.
2. We used innovative, advanced O.R. methods to provide a way to systematically identify and understand individuals driving violence in custody, and adapted them so they could be accepted as business-as-usual.
3. We revolutionised the delivery of these insights through an interactive user interface.
All steps were crucial for setting the scene to fundamentally change how data and O.R. are used by staff all the way down the management chain, and how it is delivered to those brave people trying to change people’s lives every day. The SDT has created a lasting culture shift, with demonstrable benefits for the front line. Our analysis supports those making a difference in the lives of people in custody. By

meeting staff where they are, we can embed insights into existing processes. We help them make more data-driven decisions to rehabilitate and support offenders and keep the public safe. Fundamentally, that is what we are here to do.

Martine Wauben is a Data Scientist and Government Statistician at the Ministry of Justice, focusing on self-harm in custody. She has a background in Psychology and Epidemiology, through a BA from the University of Oxford and an MSc from Utrecht University.

Phil Macdent is the Head of Prison Data Science at the Ministry of Justice, working on improving outcomes for offenders and reducing work pressures for staff through innovative data science solutions. He holds an MSc in Mathematics from the University of Warwick.

Adam Booker is the Head of Prison Data Engineering at the Ministry of Justice. He holds an MSc in Operational Research from the University of Southampton and has a background in data science.

Ben Marshall joined the Civil Service after studying philosophy and criminology at the University of Sheffield. He is currently pursuing a PhD in statistics. His research focuses on developing probability models for machine perception.

This work was awarded the OR Society’s President’s Medal in 2019.

This article is published under an Open Government Licence: http://www.nationalarchives.gov.uk/doc/open-government-licence/version/3/


THINK LIKE A ….? Geoff Royston

You can’t judge a book by its cover, nor sometimes even by its pages. If you saw a book with chapter headings such as “optimising” and “solutions under constraints,” you might well think you were looking at a textbook on operational research. If you then flipped through the pages and saw phrases like “modelling techniques” and “parameter variation,” you might feel your assumption was confirmed. But, for the publication to which I am referring here, you would be wrong. The book is Think Like an Engineer by Guru Madhavan. As well as chapters with the headings mentioned, there are also chapters on “prototyping” and “learning from others,” which one might not be so likely to see in an O.R. textbook. Which raises the question: how far do – or should – analysts think like engineers? How relevant is the engineering mindset to tackling managerial and policy issues? Anyway, how do engineers think?

HOW ENGINEERS THINK

In his book, praised by readers such as Alvin Roth, economics Nobel prize winner, and Tim Harford, of BBC’s More or Less, Madhavan describes engineering thinking as having three key characteristics: the ability to see structure (especially where there currently isn’t any); the ability to deal with constraints (e.g. physical, financial or behavioural); and the ability to make trade-offs between conflicting objectives such as performance and cost, finding the intersection of feasibility, desirability and affordability. This begins to look a lot like an O.R. mindset, a view that is reinforced by some of the examples that Madhavan gives of engineering thought and action – reducing traffic congestion, introducing just-in-time production, or even the development of London’s Victorian sewer network. So maybe a better question is: are there ways in which engineers think differently from analysts (apart, of course, from those analysts who have an engineering background!)? And if so, are there lessons for improving the way analysts work with clients?

THE BENEFITS OF FAILURE

Madhavan’s book gives some hints about this. He explains that engineers are very interested in failure, or, to be more exact, in the long journey of exploration and evolution that moves a primitive prototype to a polished end product (a journey which often goes unrecognised, as it is only the final product that most people see). In explaining the importance of failure and step-wise refinement in engineering, Think Like an Engineer gives the example of the opening ceremony of the 2012 London Olympics, where part of the brief was for a scene depicting the industrial revolution, with factory chimneys, steam engines and so on, all to be delivered live and at full scale, immediately following an act featuring the English countryside complete with meadows and grazing animals. How to achieve a seamless transition from pastoral to urban? How could large props like chimneys be erected and dismantled quickly, and be concealed before and after the scene? Use digitally projected images perhaps? Or chimneys painted on fabric, which could be unfurled and rolled up rapidly? No, the client said that nothing less than real 3-D chimneys would do. The technical team experimented with numerous models, none of which proved satisfactory. Then they came up with the idea of inflatables. After much prototyping they produced an inflatable, full-height, realistic chimney. At the opening ceremony no fewer than seven “brick” chimneys rose majestically thirty metres into the sky; they even had smoke coming from the top. We could argue, of course, that O.R. and analytics professionals also engage in iterative design, for example by using simulation modelling to explore different possible solutions to a problem. But are analysts quite as devoted as the engineer to spending time searching for a wide variety of potential solutions, to testing and evaluating prototypes in the real world, and to extracting every bit of learning from failure?

IMPACT © THE OR SOCIETY

47


DECISIONS OR DESIGNS?

For that matter, do O.R. and analytics treat the whole business of design as seriously as engineers do? Madhavan describes the core of the engineering mindset as being about “modular systems thinking – combining systems thinking with systems building” and notes that “if the core of science is discovery, then the essence of engineering is creation.” Design lies at the very heart of engineering – whereas, in the way it is typically presented, design appears to be somewhat peripheral to O.R. and analytics. O.R. analysts generally talk first about helping clients make better decisions, and may not even consciously consider their work as also being about design. Yet appearances can be deceptive: improving any system entails both analysis and synthesis, and so O.R. and analytics often involve building better systems for clients – undoubtedly a design task. It matters if analysts – or clients – do not recognise this; for instance, analysts operating only “downstream,” assessing and helping clients decide between a set of given options, risk overlooking other, better options that might have been identified had analysts also been involved at the “upstream” conceptual design stage. (Two of the past giants of O.R. – Russ Ackoff and Herbert Simon – did stress the importance of design, as their respective books, Idealized Design and The Sciences of the Artificial, demonstrate, but their message seems not to have been sufficiently heard. Yet managers have increasingly been realising the relevance of design principles and thought, as evidenced for example by Roger Martin’s book The Design of Business: Why Design Thinking is the Next Competitive Advantage.)

HOW ENGINEERS MAY NOT THINK

Engineers have to be good with objects and ideas but are maybe not always so good with people – as caricatured by Dilbert, the engineering nerd created by the cartoonist Scott Adams. The importance of engineers balancing rationality with empathy is stressed by Madhavan: “engineers should rise above the comforts of cold, mechanistic, isolated problem solving.” This involves gaining insight into what clients want, making sure they are involved right from the start in a process of co-design and co-production, to ensure that complex technologies are approachable and accessible. That must go for analysts too.

LEARNING FROM OTHERS

At its worst, a failure to listen to customers, to respect their existing knowledge and expertise, can lead to catastrophe.


Think Like an Engineer gives the example of the water temples of Bali. For millennia, Bali’s rice terraces used an integrated system of organic farming and shared water management. A network of irrigation tunnels allowed monsoon rainwater to be shared between upstream and downstream farmers, synchronising harvests and ensuring both reduced pest attack and water conservation. This wisdom, born of centuries of experience, was ignored when an attempt was made to increase rice production by introducing high-yield varieties, pesticides and chemical fertilisers, hoping to emulate the green revolution that had transformed grain production in India. But here “miracle rice produced miracle pests”: millions of tons of rice were lost, and soil erosion and the collapse of the water system followed. Subsequent computer modelling confirmed the superiority of the traditional approach in the local physical and cultural environment.

THINKING, HARD, SOFT AND CREATIVE

So, maybe the message for analysts from Think Like an Engineer is that the mindset of engineers is not so different. Both are concerned with structuring problems, with dealing with constraints and with clarifying trade-offs. Both need to ensure that they pay attention not only to the “hard” technical facets of problems but also to the “soft” social and behavioural aspects. But engineers do seem to differ in placing creative design at their professional centre; analysts could usefully give more emphasis to their own creative thinking and system-building skills – and do more to develop and deploy these further.

Dr Geoff Royston is a former president of the OR Society and a former chair of the UK Government Operational Research Service. He was head of strategic analysis and operational research in the Department of Health for England, where for almost two decades he was the professional lead for a large group of health analysts.


JOURNAL OF THE OPERATIONAL RESEARCH SOCIETY


JORS is published 12 times a year and is the flagship journal of the Operational Research Society. It is the aim of JORS to present papers which cover the theory, practice, history or methodology of OR. However, since OR is primarily an applied science, it is a major objective of the journal to attract and publish accounts of good, practical case studies. Consequently, papers illustrating applications of OR to real problems are especially welcome.


Real applications of OR – forecasting, inventory, investment, location, logistics, maintenance, marketing, packing, purchasing, production, project management, reliability and scheduling

A wide variety of environments – community OR, education, energy, finance, government, health services, manufacturing industries, mining, sports, and transportation

Technical approaches – decision support systems, expert systems, heuristics, networks, mathematical programming, multicriteria decision methods, problem structuring methods, queues, and simulation


Editors-in-Chief: Thomas Archibald, University of Edinburgh Jonathan Crook, University of Edinburgh

T&F STEM: @tandfSTEM @tandfengineering

Explore more today… bit.ly/2ClmiTY


#OR62

Learn Share Connect

The OR Society’s annual conference OR62 puts you in the centre of our vibrant OR and analytics community:

• Share knowledge and raise your profile by presenting

• Network with big names, peers and early-careers people

• Stay current with the latest developments in OR and analytics

Operational research (OR) is the science of better decision-making.

Find out more at

www.theorsociety.com/OR62
www.theorsociety.com

@theorsociety

